That's another one for the Red Book, Dickie.
"comp.programming is not about programmers"
"Nilges isn't in comp.risks"
"I am smarter than Peter Neumann"
"chars aren't characters"
You see, the referents of your language are fantasies and the words
themselves, which you have foolishly learned to fit together without
troubling to see whether they bear any relationship to reality.
Adler's and GCC's usage reflects a possibility of which you are
unaware: while on most computers the smallest addressable piece of
memory is a character (a Western letter or an ideogram), this wasn't
the case in the first and second generations of computers, and given
the reality that "retro" computing is practised at places like the
Computer History Museum in Mountain View, we need to program
physically reconstructed or simulated machines. And
not only for shits and giggles: especially in military applications,
the baby-killers might want to reprogram older baby-killing machines
(I don't like it but it's a "need", and they might be beating swords
into plowshares or reprogramming older land mines so they don't go off
when touched by a kid).
For example, the IBM 7094 had a 36-bit word with six 6-bit BCD
characters packed inside each word. The smallest addressable unit of
memory was the word, not the character. Implementing C on the 7094
means that sizeof for a character is 1/6, a double or single precision
value.
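Here is a minimal sketch of the kind of packing involved, assuming a
modern host where uint64_t stands in for the 36-bit word and using
made-up 6-bit codes rather than the real BCD character set; on a
word-addressed machine the C implementation would have to do this
shifting and masking in software behind every char access:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical sketch only: six 6-bit character fields packed into a
   36-bit word.  A uint64_t stands in for the 7094 word, and the 6-bit
   codes are made up rather than real BCD. */

#define CHARS_PER_WORD 6
#define CHAR_BITS      6
#define CHAR_MASK      0x3Fu    /* low six bits */

/* Read the i-th 6-bit character field (0 = leftmost) out of the word. */
static unsigned get_char6(uint64_t word, int i)
{
    int shift = (CHARS_PER_WORD - 1 - i) * CHAR_BITS;
    return (unsigned)((word >> shift) & CHAR_MASK);
}

/* Write a 6-bit code into the i-th character field of the word. */
static uint64_t put_char6(uint64_t word, int i, unsigned c)
{
    int shift = (CHARS_PER_WORD - 1 - i) * CHAR_BITS;
    word &= ~((uint64_t)CHAR_MASK << shift);        /* clear the field */
    word |= (uint64_t)(c & CHAR_MASK) << shift;     /* insert the code */
    return word;
}

int main(void)
{
    uint64_t word = 0;
    int i;
    /* Pack the made-up codes 1..6 into one word, then read them back. */
    for (i = 0; i < CHARS_PER_WORD; i++)
        word = put_char6(word, i, (unsigned)(i + 1));
    for (i = 0; i < CHARS_PER_WORD; i++)
        printf("field %d holds %u\n", i, get_char6(word, i));
    return 0;
}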
On the 7094, then, it is foolish to conflate char* and void*: Adler's
default (void points to an object of sizeof 1) is not the same as
char, and sizeof has to return a float. You foolishly confuse C with a
language like Lisp or Java, not seeing how much (bad) history is
incorporated in it. And like Seebach you try to destroy the
reputations of people far more qualified than you based on your
sophistry.
This is foolish because you think that C is "portable" whereas it is
tied to a machine model in which the smallest addressable unit of
memory is the character.
Of course, this is a "retro" issue, but going forward, multibyte
characters mean that "the smallest addressable unit of memory" is no
longer the character.
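One minimal illustration, assuming a UTF-8 execution environment (my
assumption, not anything the C standard requires): a single character
as the reader sees it spans several of C's char-sized units.

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* The bytes below are the UTF-8 encoding of U+20AC, the euro sign:
       one character to the reader, three char-sized bytes to C. */
    const char *euro = "\xE2\x82\xAC";
    printf("bytes in one character: %zu\n", strlen(euro));  /* prints 3 */
    return 0;
}

Compiled and run on such a host, it prints 3: three addressable bytes
for what any reader would call one character.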
C can of course be "hacked" to accommodate both retro and multibyte
computing, but you foolishly think that this shows the power of C.
What it shows is the power of Turing machines, no matter how silly
they are. C is Turing-complete...in a formulation that sucks and that
obscures what goes on.
But hey don't go changin'. Your world is self-referential and you are
hopeless when it comes to the possibility of retraining. This was
clear to me in 2003 when you made a big whoop tee doo about an
invariant in a for loop. That is, when you can't dominate the agenda,
you do ANYTHING to invent agenda items, you lie about people, and you
deny your own lies.