When I say a typedef "allows client code to know less"
what I mean is that code can be written that doesn't depend
on the type's representation.
But the same would be true if you used, say, a #define to try to "hide"
the definition as well.
[...] Not that it _can't_ be
written to depend on the representation, but that
it _can_ be written so that it _doesn't_ depend.
But that's total nonsense. If you actively avoid such direct access,
then it doesn't matter whether you've hidden the definition or not.
The point of using real language support for opacity is to have the
compiler work with you in enforcing such opacity. Some might consider
this a part of what it means to have strong typing.
That's an important property, and using typedef
enables that.
Except that this doesn't distinguish itself from #defines which can
accomplish the same thing.
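A minimal sketch of the point being argued here (the names `id_typedef`, `ID_DEFINE`, and the helper functions are hypothetical): a #define alias lets client code be written without mentioning the representation, exactly as a typedef does, and neither form enforces anything.

```c
/* Two ways to alias the same representation; all names are made up
   for illustration. */
typedef unsigned long id_typedef;   /* typedef alias */
#define ID_DEFINE unsigned long     /* #define alias */

/* Client code written against either name never spells out
   'unsigned long', so the representation could be changed by
   editing one line. But neither form *enforces* opacity: both
   names remain fully compatible with unsigned long. */
id_typedef next_id(id_typedef cur)  { return cur + 1; }
ID_DEFINE  next_id2(ID_DEFINE cur) { return cur + 1; }
```

In both cases the compiler sees the same underlying type, which is exactly why neither mechanism by itself gives real opacity.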
That's true but irrelevant to the point I'm making.
I'm sure it is -- but it seems to me that it's your point which is less
relevant to the topic of real opacity.
[...] Thus it is
relevant to how opaque the type is. Using a
typedef might be desirable or it might not
be, but it is relevant.
If you were able to somehow *hide* this typedef, and have the typedef
just magically be there, and somehow turn off its compatibility with
other pointers and sizeof(), then I would agree with you. But you
don't have this.
In fact, if you want to truly "opaquify" a pointer, the way to do that
is to put it into an opaque struct (or union).
To be technically accurate, C doesn't have opaque types.
The standard doesn't mention them, [...]
Ok, now you are slipping into Keith Thompson mode. Just because the
standard does not explicitly use the language doesn't mean the thing
isn't there. C *DOES* contain the following: variables, a stack and a
heap (or dynamic memory pool), regardless of what the standard says
(even if the implementations of the latter two are not specified).
That's because all those are general concepts that the C language has
support for, even if not spelled out in the specification. Similarly,
my computer has the ability to factor large integers even though none
of its specifications explicitly say that it can do that.
Same is true of opacity. C isn't missing opacity because it doesn't
discuss it in the standard. Opacity is merely a *property* of certain
declarations (structs and unions that are not fully specified and void
* pointers). This is a real concept because of how it dictates the
interaction between *two* programmers. If one programmer makes an
opaque type, then another cannot normally gain access to its explicit
definition, and this is enforced (more like 'supported' for void *'s)
by the compiler/language. So the two programmers don't even have to
contact each other to understand how things are supposed to be used.
On the other hand, if one is simply relying on typedef, the other
programmer can just read the header file, see that it's a typedef, and
go ahead and do things like handle->privateEntry (unless real
opacity via a struct or union is in there).
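The two scenarios being contrasted can be sketched as follows (the names `handle_t`, `privateEntry`, `opaque_t`, and `peek` are hypothetical, matching the example field name used above):

```c
/* Library A: typedef alone, with the definition visible in the header. */
struct handle_impl { int privateEntry; };
typedef struct handle_impl *handle_t;

/* Nothing stops the second programmer from doing this -- it compiles: */
int peek(handle_t h) { return h->privateEntry; }

/* Library B: the same typedef backed by an incomplete struct. */
struct opaque_impl;
typedef struct opaque_impl *opaque_t;

/* The same trick now fails at compile time:
   int peek2(opaque_t h) { return h->privateEntry; }
   error: dereferencing pointer to incomplete type */
```

With library A the second programmer never has to ask how the handle works; with library B the compiler forces every access to go through the published functions.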
[...] and there is no construct
in C that produces a type that is 'opaque' in the sense
that the term is used in languages that have opaque types
incorporated into the language.
Obviously every language puts a slightly different spin on things
(compare tables in Lua with tables in TCL with Dictionaries in Python
with hashes in Perl; or coroutines in Lua versus generators in Python).
C's main weakness is that an "opaque type" can be *falsely* redefined
anyway -- but doing so is an explicit subversion of intent, and
should not be a problem in real practice (assuming you are doing things
in good faith -- i.e., avoid the void * method, and don't ever define
any struct/union in more than one file). Other languages are likely to
have tougher enforcement, but the main concepts are the same.
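A sketch of the kind of subversion meant here, compressed into one file for illustration (in real code the client would redeclare the struct in its own translation unit; all names are hypothetical, and reading through a guessed layout is formally undefined behavior even where it happens to work):

```c
/* The library side: only the incomplete tag is published. */
struct secret;                     /* opaque to clients */
struct secret_impl { int key; };   /* real layout, normally hidden in a .c */

struct secret *secret_get(void)
{
    static struct secret_impl s = { 1234 };
    return (struct secret *)&s;    /* hand out an opaque pointer */
}

/* The bad-faith client: guess the layout and cast. Nothing in the
   language stops this; it keeps working only for as long as the
   guess matches the library's private definition -- which is
   exactly the fragility the text describes. */
struct guessed { int key; };

int subvert(struct secret *p)
{
    return ((struct guessed *)p)->key;
}
```

This is the "explicit subversion of intent": the compiler enforces opacity against accidental access, not against a programmer determined to defeat it.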
Rather, what C has is ways of making types that we
can _think of_ as opaque.
If such mechanisms exist primarily in your mind, then they apply
equally to all other languages, and to exposed mechanisms from any data
type anywhere. There is nothing special about typedef which
corresponds to one's "thinking". If you are just one programmer, it's
not a problem at all to "think" about certain data types as being
opaque in certain contexts regardless of what the real state is --
typedef has nothing to do with this.
There is a big difference between opacity enforced by a compiler and
opacity conceptualized through convention. And this is most critically
seen as the number of developers on a project increases.