PH> The problem is you take all that as a conclusion, instead of
PH> asking the simple question of *WHY* those events happen. The
PH> compiler vendors didn't bother because the programming
PH> community doesn't care for any of C99's features. If
PH> programmers *demanded* certain features, then compiler writers
PH> would implement them whether they wanted to or not.
But this brings me right back to the original question. Why is it so
important that this new improved language be called C, and why is it
so important that it have ANSI/ISO's imprimatur?
You're still walking backwards through this argument. Go ask the ANSI
C committee why they continue to refer to the ANSI C 1999 standard as
if it were *the* C language!
[...] We've seen, in the
case of things like C99, that the imprimatur is insufficient to get
people to use the thing, and we've seen, in the case of things like
BSD sockets, that usefulness does not require official imprimatur to
proliferate.
Right, and people are using various libraries that I have created, and
so on. But what you are missing is that people are also picking up
things from the various interim releases of the C standard (like 1994
when they added vsnprintf()) without *too* much controversy (some
older compilers like Turbo C don't have a vsnprintf() because they
were never updated; oh well, we live with it).
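The vsnprintf() situation is exactly the "wrap it once, live with it"
case the post describes. A hedged sketch of such a shim; the wrapper
name and the OLD_COMPILER_NO_VSNPRINTF guard are my own inventions,
not anything from the standard:

```c
/* Hypothetical portability shim: use vsnprintf() where the library
   has it, fall back on old compilers (e.g. Turbo C) that never got
   one.  The fallback trades the size check for buildability. */
#include <stdarg.h>
#include <stddef.h>
#include <stdio.h>

int my_snprintf(char *buf, size_t size, const char *fmt, ...)
{
    va_list ap;
    int n;
    va_start(ap, fmt);
#if defined(OLD_COMPILER_NO_VSNPRINTF)
    n = vsprintf(buf, fmt, ap);   /* unsafe: ignores 'size' */
    (void)size;
#else
    n = vsnprintf(buf, size, fmt, ap);
#endif
    va_end(ap);
    return n;
}
```

The point being: thousands of people have written substantially this
same shim, each slightly differently, which is the duplication a
standardized library function is supposed to eliminate.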
If the standard endorses good ideas, *LOTS* of people will pick them
up. One of the things you are missing with BSD sockets is the fact
that Microsoft implemented them with an interface that is very
different under Windows than it is under the Unixes (I wonder how
they implemented it under MacOS 9). And various embedded systems
don't even bother with any kind of sockets. So there's no "standard
sockets" that anyone can assume -- it's just a platform-specific idea.
The capability of sockets is good -- the current state of
fragmentation is not so good. With effort you can write reasonable
wrappers that let you write socket code that is portable -- but
there's no single centralized standard for this.
I wonder if the C Standards Committee is aware of this newfangled
thing called the "intarweb". With a standardized interface they could
hide the fact that it's implemented on top of tubes instead of a big
truck. I've heard that sockets can help with that.
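That "write a wrapper" approach can be sketched in a few lines. The
names here (net_init, net_sock_t, net_close) are hypothetical, not
from any standard -- which is the complaint:

```c
/* Hypothetical portability layer over the Winsock vs. BSD-sockets
   split.  Every project that needs portable socket code ends up
   writing some variant of this by hand. */
#ifdef _WIN32
#  include <winsock2.h>
   typedef SOCKET net_sock_t;
#  define net_close(s) closesocket(s)
#else
#  include <sys/socket.h>
#  include <netinet/in.h>
#  include <unistd.h>
   typedef int net_sock_t;
#  define net_close(s) close(s)
#endif

/* Winsock requires WSAStartup before any socket call; BSD sockets
   need no global initialization at all. */
int net_init(void)
{
#ifdef _WIN32
    WSADATA wsa;
    return WSAStartup(MAKEWORD(2, 2), &wsa);
#else
    return 0;
#endif
}
```

And that only papers over initialization, handle types, and close();
error reporting (errno vs. WSAGetLastError()) diverges too.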
PH> Well I wonder why the people who designed STL for C++ didn't
PH> decide to go off and make their own language. Same for the
PH> boost people.
Those aren't different languages, though: those are libraries. They
don't require semantic changes in the language, and they don't require
the same stringency in the standardization process.
Things like garbage collecting and operator overloading *do* require
semantic changes in the language.
What is your problem? When you do strcmp("Paul Hsieh", "Jacob
Navia"), do you get 0 for some reason? I think there's a bug in your
brain. Please go to the following link:
http://en.wikipedia.org/wiki/Straw_man
In defense of Jacob, it appears as though he wants to *add* to the
functionality of the language by adding features that are almost
entirely neutral to existing code. Compare his extensions, for
example, to hijacking the word "restrict" and making it a keyword in
C99.
PH> My proposals tend to be limited to the library. Perhaps you
PH> can enlighten me with examples where my proposals would lead
PH> to actually giving something up (to be a trade off you have to
PH> gain something in balance with giving something up)? When
PH> people try to challenge me on this, the best I get is the
PH> typical circular reasoning: "that's not in the standard,
PH> therefore it's not portable" or some crap like that, when
PH> clearly I am trying to suggest something to be standardized.
So why not just create your new improved library?
So who says I have not? The problem is that my various efforts are in
pieces, not well documented, sporadically tested, very context
dependent, not portable, not peer reviewed, etc. (the main exception
to this being Bstrlib, which by now is at compiler-level quality). And
I know there are thousands of different developers who have written
substantially the same thing with different degrees of quality
(probably at least some are better).
[...] POSIX is not C99;
neither is BSD sockets, or X11. Why does it need to have the ANSI
imprimatur for you to be happy?
Because I want to write C. Not Microsoft C as opposed to Gnu C as
opposed to WATCOM C, etc. I mean -- what do I care? Why do I have to
learn sockets as many times as there are compilers that I end up
using?
[...] C is good enough in a lot of ways; more importantly, the
places where it's not good and the edge cases where it gets
pathological are well known.
PH> You can't make a sentence like that and not have sympathy for
PH> the position that the C standard needs some changes.
I have sympathy for the position, but I see the costs, and I don't
understand why you consider the ANSI C imprimatur so important.
ANSI accomplished one thing -- and they did it back in 1989. They got
all the compiler vendors to agree to a common standard. This meant
that people could write common code. It meant that people did not
duplicate effort in redesigning and reimplementing common things like
FILE I/O, math functions, etc. If you don't see the value of this,
then
this discussion is over.
If you do, then I would point out that the *reason* that we *needed*
the first ANSI standard is that everyone's C compiler was doing their
own random thing with their own extensions that rendered them
incompatible with every other C compiler out there. I.e., the purpose
of standardization was to *decrease* the disparities between platform
and compiler implementations, so that learning the language once was
immediately translatable to other platforms.
So take for example some of the things I am proposing: 1) Heap
extensions and heap-debugging APIs. These things obviously exist in
various products and compiler extensions -- no two of which are
compatible, of course. 2) "mulhi" and SIMD intrinsics -- many
modern compilers come with assembly-language extensions, and that's
how they are implemented; each one incompatible with the next
(actually Intel's is somewhat compatible with MSVC's, but that's a
weird special case). 3) Some kind of fseek/ftell on intmax_t instead
of long: every modern compiler I know implements some sort of 64-bit
fseek and ftell (again, each incompatible with the others), because
"64 bits ought to be enough for everyone".
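Point (3) is easy to illustrate: every shop ends up carrying a shim
like the one below. The xfseek/xftell names are my own; the
vendor-specific calls are, to the best of my knowledge, what each
platform actually ships:

```c
/* Hypothetical 64-bit seek shim.  _fseeki64/_ftelli64 are MSVC's
   spelling, fseeko/ftello are POSIX's, and plain fseek/ftell (long)
   is the C-standard fallback -- exactly the fragmentation at issue. */
#define _POSIX_C_SOURCE 200112L   /* expose fseeko/ftello on POSIX */
#include <stdio.h>
#include <stdint.h>

#if defined(_MSC_VER)
#  define xfseek(f, off, whence) _fseeki64((f), (off), (whence))
#  define xftell(f)              _ftelli64(f)
#elif defined(__unix__) || defined(__APPLE__)
#  include <sys/types.h>          /* off_t */
#  define xfseek(f, off, whence) fseeko((f), (off_t)(off), (whence))
#  define xftell(f)              ((int64_t)ftello(f))
#else
#  define xfseek(f, off, whence) fseek((f), (long)(off), (whence))
#  define xftell(f)              ((int64_t)ftell(f))
#endif
```

One standardized intmax_t-based pair in <stdio.h> would make all three
branches of that #if disappear.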
People clearly want and need those things. The way it happens today
is kind of pathetic. Why do I have to learn a different way to deal
with those things as a Solaris, a VxWorks, a QNX, or even a Linux
programmer?