What's the deal with C99?


Bartc

[from 'The problems in comp.lang.c']
Malcolm McLean said:
We're currently in the undesirable situation of having a rejected
standard,
C99, which means that "standard C" is no longer the precise thing it once
was. That doesn't mean it is impossible to hold a coherent discussion. I
don't think we can legitimately hold C99 to be off-topic, but we should
point out that non-block top declarations are not, de facto, portable.

I've looked at the differences in C99 according to the Wikipedia C article.

It doesn't look like that much to add to C90.

So why the problem with creating compilers for it? Is it motivation?

What is the most difficult part of C99 to implement?

Some of the extensions, like Complex support (I've never used complex
numbers and never will, and I'm sure many others will say the same) are
really not very interesting; perhaps that should have been optional if it
made producing the compilers easier.

Or is the real problem that there will always be C compilers that are not
C99 compatible, effectively breaking the standard because any supposedly
portable C99 code (does anyone else keep typing it as C((?) will not be
portable to those compilers?
 

jacob navia

Bartc said:
[from 'The problems in comp.lang.c']
Malcolm McLean said:
We're currently in the undesirable situation of having a rejected
standard,

This is wrong. gcc implements almost all of C99.
IBM's xlc compiler line implements everything.
Intel compilers offer some C99 support.

If we apply the same standards to C++ we would say that there is no
C++ now, since no compilers implement the full C++ standard
issued 10 years ago!

This is not true. Standard C means the current C standard: C99

Even Microsoft now implements "long long" and will improve C99 support
in the future.

I've looked at the differences in C99 according to the Wikipedia C article.

It doesn't look like that much to add to C90.

So why the problem with creating compilers for it? Is it motivation?

What is the most difficult part of C99 to implement?

The problem is not C99 but the fact that C is perceived as a dead
language by GNU and Microsoft.

Every C++ book starts with a page explaining how awful C is. This
starts making many people believe that C is as awful as everybody
says.

The attitude of the C community (and exemplified in this group) doesn't
help at all. Obvious flaws of the language will be dismissed, any
security discussion will be rejected, etc.

It is instructive to see the reaction to Microsoft's safer C proposal,
where everybody criticized Microsoft but nobody even started to
acknowledge the problems Microsoft was trying to solve.

Some of the extensions, like Complex support (I've never used complex
numbers and never will, and I'm sure many others will say the same) are
really not very interesting; perhaps that should have been optional if it
made producing the compilers easier.


lcc-win implements complex numbers using operator overloading. This way
I will be able to incorporate all kinds of new number systems (decimal
number systems proposed by IBM, fixed-point numbers proposed by
embedded-systems people) without changing ANYTHING in the compiler.

My opinion is that this is the only true solution to this problem.
Or is the real problem that there will always be C compilers that are not
C99 compatible, effectively breaking the standard because any supposedly
portable C99 code (does anyone else keep typing it as C((?) will not be
portable to those compilers?

Most of C99 is in gcc now. The differences are minimal.
 

vippstar

Bartc said:
[from 'The problems in comp.lang.c']
Malcolm McLean said:
We're currently in the undesirable situation of having a rejected
standard,

This is wrong. gcc implements almost all of C99.
IBM's xlc compiler line implements everything.
Intel compilers offer some C99 support.
That is correct.
I've never heard before that C99 is a rejected standard.
If we apply the same standards to C++ we would say that there is no
C++ now, since no compilers implement the full C++ standard
issued 10 years ago!


This is not true. Standard C means the current C standard: C99


Even Microsoft now implements "long long" and will improve C99 support
in the future.
I never understood why Microsoft chose to implement __int64 but *not*
long long.
Surely, the semantics aren't the same, but the MS operating systems
don't run on exotic architectures.
Don't trust Wikipedia; it's not a valid source.
Try "C: A Reference Manual" for the differences.
Most likely yes; the features that are not implemented are rarely (if
ever) used.
The problem is not C99 but the fact that C is perceived as a dead
language by GNU and Microsoft.
If it's considered dead by MS, then why have I lately seen so many
recommendations from MS in C?
As for GNU, that's not true. Sure, RMS said the language he prefers is
LISP, but that's even "deader".
Moreover, RMS does not speak for GNU, and most GNU projects are in C.
You've said this multiple times; please explain what you mean by
saying that GNU considers C dead.
Every C++ book starts with a page explaining how awful C is. This
starts making many people believe that C is as awful as everybody
says.
If you don't doubt what you read.. don't read :).
The attitude of the C community (and exemplified in this group) doesn't
help at all.
I agree on this point. That's because in clc ISO C is discussed. If
you have a suggestion for the C language, there is comp.std.c.
Obvious flaws of the language will be dismissed,
No, that is not true. If there is a mistake in the standard (and there
might be) and it's pointed out, only a fool would argue otherwise.
any security discussion will be rejected, etc.
Because that is beyond the scope of this group. As long as your
program is conforming you are on the safe side.
It is instructive to see the reaction to Microsoft's safer C proposal,
where everybody criticized Microsoft but nobody even started to
acknowledge the problems Microsoft was trying to solve.
Which problem? The proposals of MS are *pure* BS; moreover, you said
just before that MS considers C to be dead. Why would MS bother then?
I don't want to see MS having any impact on the next C standard. Next
thing that happens, you'll have to pay to do anything remotely
associated with the new standard.
That does not, however, mean that I put the BS label on MS' proposals
before considering them. It's just that I've yet to see a reasonable
proposal from MS.
It doesn't matter whether it's optional or not. If the implementation
skips _Complex (not Complex) and documents that UB is invoked when the
programmer uses it, it's in my opinion fine. Not a conforming
implementation, but usable.
lcc-win implements complex numbers using operator overloading. This way
I will be able to incorporate all kinds of new number systems (decimal
number systems proposed by IBM, fixed-point numbers proposed by
embedded-systems people) without changing ANYTHING in the compiler.

My opinion is that this is the only true solution to this problem.
Software is not perfect. Live with it :) (or improve it? or make your
own)
 

Ioannis Vranos

jacob said:
If we apply the same standards to C++ we would say that there is no
C++ now, since no compilers implement the full C++ standard
issued 10 years ago!


I am sorry, you are completely wrong on this. Most mainstream C++
compilers support 100% of C++98 except for one feature, the export
template (but one compiler supports this too).

By "most mainstream C++ compilers" I mean Visual C++, GCC, Borland
C++ products, Intel C++, and others.
 

Ben Bacarisse

jacob navia said:
lcc-win implements complex numbers using operator overloading.

It is still an open question whether the way lcc-win32 does this can
conform to C99. My evidence suggests that complex.h is needed to
pre-load the overloading, so compilation units that don't include
complex.h can't conform:

void f(double _Complex c);
void g(double _Complex a) { f(a * a); }

for example. Aside: is this allowed? I can't see any reason why not,
but it might have made more sense for C99 to insist that feature X
requires one to include X.h. That way, implementors could rely on
the header to trigger the mechanism. It would have allowed _Complex
to be a macro expanding to internal magic, and would thus have allowed
testing for complex support. Just a passing thought.

Of course, such programs are rare, and you can say you will ditch
conformance in these cases because your way is "better", but just bear
in mind that that is what the gcc people did for a while with VLAs --
and you considered that odd in another thread.
Most of C99 is in gcc now. The differences are minimal.

It would help your case if you had a C99 conformance page like gcc
has.
 

santosh

Bartc said:
[from 'The problems in comp.lang.c']
We're currently in the undesirable situation of having a rejected
standard,

This is wrong. gcc implements almost all of C99.
IBM's xlc compiler line implements everything.
Intel compilers offer some C99 support.
That is correct.
I've never heard before that C99 is a rejected standard.

Well, I don't know about "rejected", but it's true that it has not been
received with anywhere near the enthusiasm that C90 was.
I never understood why Microsoft chose to implement __int64 but *not*
long long.
Surely, the semantics aren't the same but the MS operating systems
don't run on exotic architectures.

Anyway, the workaround is trivial. The problem is that MS does not
implement a *lot* of C99. If it were *only* long long, there would be
little complaint.
Don't trust Wikipedia; it's not a valid source.
Try "C: A Reference Manual" for the differences.

Most likely yes; the features that are not implemented are rarely (if
ever) used.

If it's considered dead by MS, then why have I lately seen so many
recommendations from MS in C?

Where? I haven't seen anything. Have you any links?
As for GNU, that's not true. Sure, RMS said the language he prefers is
LISP, but that's even "deader".
Moreover, RMS does not speak for GNU, and most GNU projects are in C.
You've said this multiple times; please explain what you mean by
saying that GNU considers C dead.

Jacob, I believe, is saying that the *gcc* developers have shifted most
of their efforts to g++ and not gcc. To an extent this is probably
true, but it's true for all mainstream compilers. They have *all* shown
more interest in C++ than in C, and C99 in particular. It's only "C
only" compilers like Jacob's and PellesC that have, of necessity,
focused on C99 exclusively.
If you don't doubt what you read.. don't read :).

I agree on this point. That's because in clc ISO C is discussed. If
you have a suggestion for the C language, there is comp.std.c.
No, that is not true. If there is a mistake in the standard (and there
might be) and it's pointed out, only a fool would argue otherwise.

You are not getting what he is saying. He is not talking about errors in
ISO 9899:1999, but about "language flaws", i.e., things like
null-terminated strings and their functions in string.h, *alloc/free
instead of GC, lack of operator overloading, etc. *He* perceives these
as flaws in the language.
Because that is beyond the scope of this group. As long as your
program is conforming you are on the safe side.

He is saying that the *wider* C programming community more often than
not rejects proposals for adding things like GC, counted strings,
operator overloading etc., which *he* sees as a negative response and a
failure to keep up with the state of the art.
Which problem? The proposals of MS are *pure* BS; moreover, you said
just before that MS considers C to be dead. Why would MS bother then?
I don't want to see MS having any impact on the next C standard. Next
thing that happens, you'll have to pay to do anything remotely
associated with the new standard.
That does not, however, mean that I put the BS label on MS' proposals
before considering them. It's just that I've yet to see a reasonable
proposal from MS.

I agree here. The "safe C" proposal by MS is next to useless, and if
standardised, will only add more bloat to the language and help to
further sunder implementations.
It doesn't matter whether it's optional or not. If the implementation
skips _Complex (not Complex) and documents that UB is invoked when the
programmer uses it, it's in my opinion fine. Not a conforming
implementation, but usable.

Things like complex support are seldom used outside specialised areas.
So they *could* have been optional like IEC 60559 support. This would
probably have motivated many implementors to make more efforts towards
conformance.

<snip>
 

santosh

Ben said:
It is still an open question whether the way lcc-win32 does this can
conform to C99. My evidence suggests that complex.h is needed to
pre-load the overloading, so compilation units that don't include
complex.h can't conform:

void f(double _Complex c);
void g(double _Complex a) { f(a * a); }

for example. Aside: is this allowed? I can't see any reason why not,
but it might have made more sense for C99 to insist that feature X
requires one to include X.h. That way, implementors could rely on
the header to trigger the mechanism.

Couldn't Jacob still include complex.h behind the scenes when he
encounters _Complex?

[ ... ]
It would help your case if you had a C99 conformance page like gcc
has.

In the case of gcc's page, it's rather disappointing that while it
mentions which features are implemented and which are not, no rationale
(apart from a few footnotes) is given for the failure to implement
feature X. That would at least give us an idea of *why* gcc has not
implemented certain C99 facilities, and the prognosis for their future
inclusion.
 

Harald van Dijk

santosh said:
Couldn't Jacob still include complex.h behind the scenes when he
encounters _Complex?

No, when complex numbers get used, it is wrong to automatically include
the macros and functions that <complex.h> provides. The names are not
reserved for the implementation; the user is free to use them for other
purposes.

It would be perfectly valid to automatically include some other header,
as long as this only defines the operators, and no more. Some other
compilers automatically include <startup.h> or similar which can be
easily extended, and I expect this approach would work very well on lcc
too.
 

santosh

Harald said:
No, when complex numbers get used, it is wrong to automatically
include the macros and functions that <complex.h> provides. The names
are not reserved for the implementation; the user is free to use them
for other purposes.

It would be perfectly valid to automatically include some other
header, as long as this only defines the operators, and no more. Some
other compilers automatically include <startup.h> or similar which can
be easily extended, and I expect this approach would work very well on
lcc too.

Yes. I should have been more precise in what I said. I meant that
lcc-win would include only those portions of complex.h which are needed
for compiling code involving _Complex, unless of course complex.h was
explicitly included by the programmer.

But as you say, this might be better placed in a private header or even
within the compiler.
 

jacob navia

santosh said:
Yes. I should have been more precise in what I said. I meant that
lcc-win would include only those portions of complex.h which are needed
for compiling code involving _Complex, unless of course complex.h was
explicitly included by the programmer.

But as you say, this might be better placed in a private header or even
within the compiler.

Yes, that's a good idea.

I think that in the lexer, the first time I see _Complex, I will
include <complexoperators.h>.

The standard file <complex.h> will include <complexoperators.h> but
define more things, like I and other stuff.
 

Keith Thompson

Bartc said:
[from 'The problems in comp.lang.c']
Malcolm McLean said:
We're currently in the undesirable situation of having a rejected
standard,
C99, which means that "standard C" is no longer the precise thing it once
was. That doesn't mean it is impossible to hold a coherent discussion. I
don't think we can legitimately hold C99 to be off-topic, but we should
point out that non-block top declarations are not, de facto, portable.

I've looked at the differences in C99 according to the Wikipedia C article.

It doesn't look like that much to add to C90.

So why the problem with creating compilers for it? Is it motivation?

I think motivation is a big part of it.

In the 1980s, there was no C standard other than K&R1. Different
compilers did different things for fundamental features of the
language; for example, I think there was real inconsistency in the
rules for an operator with one signed and one unsigned operand, and
for the semantics of shifts and division for negative operands.
Runtime libraries had differences, subtle and not so subtle. And in
most cases there was no basis for saying that one implementation was
right and another was wrong. Programmers had to use huge nests of
"#ifdef"s to get things to work.

The ANSI standard, released in 1989, changed all this (well, some of
it). It was quickly adopted by vendors. It largely formalized
existing practice, in some cases making specific decisions where
existing practice was inconsistent. The biggest new feature it
introduced (the function prototype) was a clear improvement over what
had gone before.

When ISO introduced the C99 standard, the situation was different.
The C programming community *already* had a standardized language, and
it worked well enough for most purposes.

Compiler vendors aren't in the business of conforming to standards, as
much as we might like that to be the case. They're in the business of
meeting the demands of their customers (and that applies to freeware
compilers such as gcc as well as to commercial compilers). There
wasn't nearly as great a demand for C99 conformance as there had been
for C89/C90 conformance.

[snip]
 

santosh

Keith Thompson said:
When ISO introduced the C99 standard, the situation was different.
The C programming community *already* had a standardized language, and
it worked well enough for most purposes.

Compiler vendors aren't in the business of conforming to standards, as
much as we might like that to be the case. They're in the business of
meeting the demands of their customers (and that applies to freeware
compilers such as gcc as well as to commercial compilers). There
wasn't nearly as great a demand for C99 conformance as there had been
for C89/C90 conformance.

If, as you say, most of the C programming community had a standard with
which they were well pleased, and the compiler vendors were pleased that
most of their user base was pleased, why was there another standard at
all? Who was the main driving force behind C99?

I was informed that a section of compiler vendors, users, and other
organisations pressed WG14 for the inclusion of a greater range of
mathematical features, so that they might replace more of their Fortran
code with C, and that this was one of the chief reasons for WG14 to
come out with C99. Is this your take on the matter too?

And what of the next, C1x standard in discussion? Can the community and
WG14 ensure that, this time at least, only features that have the
support of a broad swathe of users and compiler vendors would be added?
Should C continue to be a minimalist general-purpose language, or
should it include specialised support for those domains that WG14
thinks would constitute the mainstay of C in the future, at the risk of
the language falling further into disuse on desktops?
 

Keith Thompson

jacob navia said:
Yes, that's a good idea.

I think that in the lexer, the first time I see _Complex, I will
include <complexoperators.h>.

The standard file <complex.h> will include <complexoperators.h> but
define more things, like I and other stuff.

As long as <complexoperators.h> doesn't intrude on user code, that
should be fine. You might run into problems with arithmetic
promotions (are the rules different for functions / overloaded
operators vs. built-in operators?).

Of course you can't include <complexoperators.h> literally the first
time you see _Complex; that could be in the middle of a declaration.
You'd have to retroactively go back and add the header at an
appropriate point, at or near the top of the translation unit, and
re-do the lexical analysis from there. And without a bit more
research, I don't know whether it's possible to encounter a complex
operator without having seen the _Complex keyword.

A simpler solution would be to implicitly include <complexoperators.h>
unconditionally at the top of every translation unit. Since it
declares things that are, as far as the standard is concerned, built
into the language, that might be the best approach.
 

Malcolm McLean

Bartc said:
Some of the extensions, like Complex support (I've never used complex
numbers and never will, and I'm sure many others will say the same) are
You've never coded a Mandelbrot? You haven't lived.
really not very interesting; perhaps that should have been optional if it
made producing the compilers easier.
A complex number library is neither here nor there. There's one in my book
Basic Algorithms. The problem with C99 was that the extensions were neither
so trivial as to be doable in a moment, such as adding #defines for true and
false, nor so extensive as to represent a radically improved language that
could be sold for extra money.
 

Keith Thompson

santosh said:
If, as you say, most of the C programming community had a standard with
which they were well pleased, and the compiler vendors were pleased that
most of their user base was pleased, why was there another standard at
all? Who was the main driving force behind C99? I was informed that a
section of compiler vendors, users, and other organisations pressed
WG14 for the inclusion of a greater range of mathematical features, so
that they might replace more of their Fortran code with C, and this was
one of the chief reasons for WG14 to come out with C99. Is this your
take on the matter too?

Hmm. I'm not familiar enough with the history to comment on that.

Your account seems to imply that vendors demanded these new features,
and then when they got them in a new standard, they declined to
implement that new standard. Is that really what happened?
And what of the next, C1x standard in discussion?
Can the community and WG14 ensure that, this time at least, only
features that have the support of a broad swathe of users and compiler
vendors would be added? Should C continue to be a minimalist
general-purpose language or should it include specialised support for
those domains that WG14 thinks would constitute the mainstay of C in
the future, at the risk of the language falling further into disuse on
desktops?

In my opinion, the best chance for the survival of C and for
widespread support for any new standard (note that these are two
different, but related, things) is for C to remain fairly minimalist.
If that makes it a niche language, rather than the universal
programming language it seemed to be a few decades ago, that's not
necessarily a bad thing. (I'm willing to radically change this
opinion at the slightest provocation.)

Of course, what should happen is that the next standard should include
the features *I* like, but reject all other new features in the
interests of simplicity. :-}
 

jacob navia

Malcolm said:
A complex number library is neither here nor there. There's one in my
book Basic Algorithms. The problem with C99 was that the extensions were
neither so trivial as to be doable in a moment, such as adding #defines
for true and false, nor so extensive as to represent a radically
improved language that could be sold for extra money.

The problem is the lack of generality of the proposed changes.

Complex numbers would have been much more interesting in the context
of operator overloading, since that would have allowed ANY kind
of numbers to be defined.

The generic math package proposed by C99 would have been much more
interesting if true generic functions had been proposed, allowing not
only a selected set of math functions to be defined as generic, but ANY
function that the user of the language wants to make generic!

And the problems with the outdated C library remained untouched.
Nothing was done in this respect: not even gets() went away; it is
still there, together with asctime() and many other abominations.
 

Bartc

Malcolm McLean said:
You've never coded a Mandelbrot? You haven't lived.

Of course. But I don't remember complex numbers. If they were needed, I
probably worked with the two parts separately.

But, that's all complex numbers are, just a pair of floats given special
treatment.

In that case, why stop there? Far more useful are operations on 3D points
and matrices:

q = m * p; /* transform point p into q */
c = (p+q)/2; /* midpoint of p,q */

instead of the cumbersome:

transformpoint (&m,&p,&q) and so on.

This is where the operator overloads jacob is always on about start to
become useful.
 

santosh

Keith said:
Hmm. I'm not familiar enough with the history to comment on that.

Your account seems to imply that vendors demanded these new features,
and then when they got them in a new standard, they declined to
implement that new standard. Is that really what happened?

Well this is what I gathered from various posts here and in comp.std.c:
that it was a fairly small group that lobbied for inclusion of complex
arithmetic, VLAs, fenv.h and tgmath.h.

It does explain (if it is true) why the features listed above are among
the ones that are least widely implemented.
In my opinion, the best chance for the survival of C and for
widespread support for any new standard (note that these are two
different, but related, things) is for C to remain fairly minimalist.
If that makes it a niche language, rather than the universal
programming language it seemed to be a few decades ago, that's not
necessarily a bad thing. (I'm willing to radically change this
opinion at the slightest provocation.)

Maybe C should follow what ISO did for Pascal and include features that
are (or might be) poorly implemented into an "extended" standard for
the language, with the core standard being more or less frozen around
C95?

Then nearly all implementors could have the satisfaction of labelling
their products "fully conforming to the Core C Standard" while
ambitious vendors could implement "the extended C Standard". This way
programmers who want their source to be maximally portable could stick
to the core standard while simultaneously those who want to use widely
implemented but not ubiquitous features could write to the extended
standard.

Features like VLAs, complex arithmetic, fenv.h etc. could be moved to
the extended standard and widely available things like APIs for
traversing directories etc., could be added to it, without
necessitating that either all implementations implement them or risk
being labelled "non-conforming".

<snip humour>
 

robertwessel2

vippstar said:
I never understood why Microsoft chose to implement __int64 but *not*
long long.
Surely, the semantics aren't the same but the MS operating systems
don't run on exotic architectures.


MS's __int64 support long predates C99, and "long long" isn't in C++
(yet).

And even as regards C99, it was far from clear until late in the
process if long long was going to be accepted - there was a large, and
very vocal, community that did not want long long in the standard.
 

Malcolm McLean

Keith Thompson said:
In my opinion, the best chance for the survival of C and for
widespread support for any new standard (note that these are two
different, but related, things) is for C to remain fairly minimalist.
If that makes it a niche language, rather than the universal
programming language it seemed to be a few decades ago, that's not
necessarily a bad thing. (I'm willing to radically change this
opinion at the slightest provocation.)

Of course, what should happen is that the next standard should include
the features *I* like, but reject all other new features in the
interests of simplicity. :-}
We need one programming language for everything except the nichiest of
niche areas. I have over twenty years' programming experience.
Occasionally I have to knock up little Perl scripts. I find myself
puzzling over the Perl handbook trying to work out how to break out of
a loop, or how to sort a list of files by suffix. Essentially Perl does
these things in the same way as C, but with tiny differences to make it
look more like a Unix shell script, or maybe just to be different, to
emphasise that it is not C. It's a huge waste of time. As I said, this
is someone with 20 years' experience who can't get his tool to sort
files by suffix. However useless the individual concerned, that would
be unacceptable in any other industry. You wouldn't tolerate engineers
being unable to calculate bolt tolerances because someone had suddenly
decided to use a weird and wonderful new measuring system, or lawyers
unable to read new legislation because the Federal government had
decided on Latin. However, we tolerate the same in software.
 
