This is just so some implementors could claim their code was compliant
and still win some esoteric benchmark. It also allowed implementors
to shift responsibility for aliasing issues onto the developers
rather than the system vendor.
I don't buy this analysis. Do you have some kind of basis for it?
The impression I got was that, through the magic of separate compilation,
it could be just plain impossible to determine while compiling a given
module whether or not aliasing of arguments was possible, but that, if you
had the qualifier, you could then optimize some code better.
That is ridiculous. This *ONLY* applies to C++ code. If I see mixed
code and declarations, I always assume it's either wrong or it's C++,
with the expected differences in semantics. Mixing declarations
and code in C never serves any useful purpose.
Not true:
* VLAs actually care (see the sketch below).
* Other stuff could actually care, although it probably usually doesn't.
(And actually, the for-loop-declarator was done pretty much at the same
time, and for the same reasons.)
So, while you may not think the benefit is particularly *significant* outside
of C++, it's still useful to some developers.
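To make that concrete, here's a minimal sketch (the function and the
numbers are invented for illustration) where the VLA's bound is
computed by ordinary statements, so the declaration has to follow
code -- exactly the case the mixed-declarations rule exists for. In
C89 you'd have needed a nested block or malloc:

#include <stdio.h>

/* Sketch: the array bound comes from a loop, so the VLA declaration
   necessarily appears after statements. */
static int sum_first_squares(int n)
{
    int count = 0;
    while (count * count < n)   /* statements computing the bound */
        count++;

    int squares[count];         /* VLA declared after those statements */
    for (int i = 0; i < count; i++)
        squares[i] = i * i;

    int sum = 0;
    for (int i = 0; i < count; i++)
        sum += squares[i];
    return sum;
}

int main(void)
{
    printf("%d\n", sum_first_squares(50));
    return 0;
}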
[...] and variable-length arrays;
In which you broke compatibility with gcc, which is the most widely
used compiler that implements variable-length arrays. Good job.
Yeah, that was definitely a clusterfsck. Back then, there was some
general distrust of standardization in GNU land. (This was back when
the gcc docs claimed that some exceptionally bogus behavior was
mandated by ANSI, so far as I can tell simply out of spite; it wasn't,
and never had been, mandated by ANSI.) As a result, there were no gcc
implementors active in the standards effort. I do agree that it woulda
been better to be more aggressive about researching that, but to be
fair, I believe we did have other implementations to look at... Which
had, of course, decided things differently.
Which explains why oh so many people were clamoring for these features,
as evidenced by the massive volume of technical articles and
books describing them and the need for the transition and... oh
wait. None of that happened.
I didn't say these features were demanded by the vast majority of users, only
that there were users who pushed for them.
Most real-world developers don't even
have a clue about C99 or what its substantive differences are, nor could
they imagine why they would need or want its features.
Probably true.
In the exact same period of time, security (the various viruses, worms,
buffer overflow exploits, etc.), portability (as evidenced by Java,
HTML, Perl, Samba, and other portability solutions), and Unicode (now
that it's clear that the Chinese and the rest of the world will adopt
it) were becoming extremely hot topics that system-level programmers and
application developers were really focusing on. Furthermore, there was
a general transition from 32-bit systems to 64-bit systems looming.
Take a look at the C99 standard in light of that.
Okay. Let's see. We adopted UCNs in identifiers and string literals,
gave reasonable support for UTF-8, 16, and 32, and dramatically improved the
quality of support for character sets other than 7-bit ASCII. C99 not only
added a 64-bit type, but added a massive rework of the integral promotion
system to make it scalable and durable in the face of possibly-larger types,
clarified and standardized how to store pointers in an integer of large
enough size if possible, reserved a suitable namespace with suitable patterns
for expressing concepts such as "at least 32 bits" or "exactly 32 bits" or
"fastest thing you have with at least 16 bits". Security, and buffer
overruns? snprintf solves a HUGE chunk of that problem space, if used
competently. (VLAs actually help a lot with some common use cases, too.)
In short, a whole lot of these were issues that got specific attention and
improvements. The Unicode support and better support for larger bit sizes
and systems with a broader range of types both looked like serious work
that addressed those problems, apparently well enough to be of use to
people.
Also, some of the limit-raising has made life a lot easier for people writing
portable code.
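To put that in concrete terms, here's a small sketch of what the
reserved namespace and the pointer-in-integer support look like in
practice (values invented; not an exhaustive tour):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint32_t      exact = 42;  /* exactly 32 bits, where the target has them */
    int_least32_t least = 42;  /* at least 32 bits, always available */
    int_fast16_t  fast  = 42;  /* fastest type with at least 16 bits */

    int x = 7;
    intptr_t p = (intptr_t)&x; /* pointer-in-integer, where it exists */

    const char *s = "caf\u00e9"; /* a UCN in a string literal */

    /* <inttypes.h> supplies the matching printf formats. */
    printf("%" PRIu32 " %" PRIdLEAST32 " %" PRIdFAST16 " %" PRIdPTR " %s\n",
           exact, least, fast, p, s);
    return 0;
}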
You don't get out much do you?
Heh.
Microsoft's latest compiler actually
issues a warning if you use those functions. gcc issues a linker
warning if you use gets(). The implementors are going beyond the
standard, because the issue is too important.
That's fine, and that's what I'd expect. It's a quality-of-implementation
issue. (That said, you're arguably wrong; gcc doesn't issue that warning in
and of itself, that's managed by, or not managed by, the system C library
and additional linker magic.)
You can't take those things out without at the very least deprecating them
first. You can, however, encourage implementors to issue diagnostics.
Say, by deprecating gets, and calling it "obsolescent", warning about it
in future language directions, and so on.
None of the printf class of functions should ever be put in the same
sentence as the word "efficient". If you want efficiency, you need to
do compile-time or code-generation-time expansion of such code directly. If you put
printf-like semantics in the *PRE-PROCESSOR* then we could discuss
efficiency.
Compare snprintf to what you had to do before it existed. It's substantially
more efficient, even if it's not efficient "enough".
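For instance (buffer size and strings invented for illustration),
the whole idiom is:

#include <stdio.h>

int main(void)
{
    char buf[16];
    int needed = snprintf(buf, sizeof buf, "user=%s id=%d", "somebody", 12345);

    if (needed < 0) {
        /* encoding error */
    } else if ((size_t)needed >= sizeof buf) {
        /* result was truncated, but buf is still NUL-terminated, and
           'needed' tells you how big a buffer would have sufficed */
    }
    printf("%s (needed %d)\n", buf, needed);
    return 0;
}

Before C99, the common substitutes were sprintf plus manual length
bookkeeping, or strncpy/strncat gymnastics, both easy to get wrong.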
Yeah, I've seen it. You use memory patterns in the contents to detect
the countedness. It's like you don't even understand why \0
termination is such a bad idea in the first place.
It's not a perfect design, certainly, but on the other hand, it does
noticeably reduce the *likelihood* of collisions. (In fact, the
"memory patterns" thing has no impact unless you want to try to take
advantage of the magic implicit conversions; if you don't use those,
it isn't an issue.)
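For anyone following along who hasn't seen a counted-string design,
the core idea is just this (a generic sketch with invented names,
not that library's actual API):

#include <stddef.h>
#include <string.h>

/* The length travels with the data, so nothing ever scans for '\0',
   and embedded NUL bytes are perfectly legal content. */
struct cstr {
    size_t len;
    char  *data;
};

static int cstr_eq(const struct cstr *a, const struct cstr *b)
{
    return a->len == b->len &&
           memcmp(a->data, b->data, a->len) == 0;
}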
*Particular* vendors with an agenda. Not surprisingly, gcc was slow
with the uptake. Does the standards committee even take feedback from
gcc?
If they offer it, absolutely! At the time, the people working on gcc
weren't that interested -- this was pre-egcs, remember.
Lol! Do you buy all the products you see in all the TV commercials
too?
Nope.
Those vendors' representatives are paid to *RIG* the language so that
they can win a benchmark against their competitors, and that is all.
Your cynicism fascinates me, but it does not persuade me. I spent a
while talking to these people and seeing what they thought was important.
I am pretty sure they were not trying to partake in some huge crazy
conspiracy.
Do you have people on your committee that are from Metrowerks,
Borland, WATCOM or the Free Software Foundation? No, they are from
IBM, Intel and Sun.
Anyone can participate. I did it out of my own pocket for about a decade
just because I thought it was fun -- and I got just as much of a vote,
and I got listened to. If someone from the FSF wanted to go, I don't
think anyone would complain at all.
While both classes of entities make compilers,
one class has a blatantly obvious conflict of interest (i.e., they are
willing to harm the language if it harms their competitors more than
it harms themselves).
You say this, but again, I see no evidence. The people I saw participating
were, consistently, devotees of C who really wanted it to succeed.
What? That's an understatement. In fact, there is nothing you can do in
C99 that you couldn't do in C89.
Well, strictly speaking, there's nothing I can do in any language that I
can't do in any other.
But there's a ton of stuff that works enough better in C99 than C89 that
I've mostly switched to specifying that and using those features.
The C99 committee didn't even try to make the language better. There
are obvious candidates, like a ranged memory expansion (similar to realloc,
but resizing to a size within a given range, to improve the odds of a
non-moving realloc) and a widening multiply (multiplying two 32-bit inputs
to get a 64-bit result, or two 64-bit inputs to get a 128-bit one, etc.),
which you didn't bother to consider (and which would have truly made the
language more powerful).
Fascinating to hear from someone who wasn't there what we did or didn't
consider.
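For what it's worth, the 32-by-32-to-64 case already has a portable
C99 spelling, since long long is guaranteed at least 64 bits; it's
the 64-by-64-to-128 case that has no portable expression. A sketch:

#include <stdint.h>

/* Portable widening multiply; a decent compiler can lower this to a
   single widening-multiply instruction where the hardware has one. */
static uint64_t mul_wide32(uint32_t a, uint32_t b)
{
    return (uint64_t)a * b;
}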
No coroutines (that would have been nice -- alas, I see them
show up in languages like Lua and Python first).
I've seen them implemented successfully often enough that I'm not sure this
is as fatal as it might seem.
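Here's the flavor of one such implementation: a stackless coroutine
built on the switch statement, Duff's-device style (macro and
function names invented for this sketch):

#include <stdio.h>

/* The coroutine resumes where it last yielded by switching on a
   saved line number. */
#define CO_BEGIN(state)     switch (*(state)) { case 0:
#define CO_YIELD(state, v)  do { *(state) = __LINE__; return (v); \
                                 case __LINE__:; } while (0)
#define CO_END()            } return -1

/* Yields 1, 2, 3, then -1 forever. */
static int counter(int *state)
{
    CO_BEGIN(state);
    CO_YIELD(state, 1);
    CO_YIELD(state, 2);
    CO_YIELD(state, 3);
    CO_END();
}

int main(void)
{
    int state = 0;
    for (int i = 0; i < 5; i++)
        printf("%d\n", counter(&state));   /* prints 1 2 3 -1 -1 */
    return 0;
}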
No attempt at type
safety (see gcc) or lambda capabilities in the pre-processor.
There was some discussion of the pre-processor thing, which I personally
liked, but there was a pretty strong consensus that the pre-processor was
bad enough already.
.... Aren't you the guy who made it to Chapter 7 in the IAQ suggesting
"corrections"? (chromatic.com?) If so, I remain stunned; it's not just
the failure to recognize the jokes, it's that in several cases the answers
were technically correct (albeit carefully phrased poorly), and my
correspondent "corrected" them with errors.
I bring this up partially because I still wonder to this day whether the
epic set of corrections was intended as some kind of joke, and also because
it would be relevant to the question of whether I should trust your statements
about other people's intent...
-s