C99 portability challenge


Antoninus Twink

So it only uses a small amount of C99. Here is another C99 program and
the output on my fully patched Ubuntu machine...
markg@brenda:~$ cat t.c
#include <stdio.h>
#include <math.h>

int main(void)
{
printf("%f\n",atan2(1,1));
}
markg@brenda:~$ gcc -std=c99 -pedantic -Wall -Wextra t.c
/tmp/cc8KagMH.o: In function `main':
t.c:(.text+0x1d): undefined reference to `atan2'
collect2: ld returned 1 exit status

Compiles fine on my current Debian system.

You have proved to the world that your machine has an obsolete
development environment, nothing more, nothing less.
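(For reference: "undefined reference to `atan2'" is a failure at link time, not a compile error. On many systems the maths library is not linked in by default, so the usual fix is simply to add -lm, i.e.

gcc -std=c99 -pedantic -Wall -Wextra t.c -lm

Whether the library needs naming explicitly varies from system to system.)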
 

Ben Bacarisse

Lassie said:
jacob navia said:
Several people in this group argue that standard C
is not portable since there are no compilers for it, etc.

I propose this program in Standard C, that I have compiled
in several OSes to test if this is actually true. My
basic idea is to see which systems do not have a compiler that
supports standard C.

Try this program:

#include <stddef.h>
#include <stdio.h>

void func(size_t n, char arr[*])

This form is not allowed in a function definition. It is not a
constraint violation, so no diagnostic is required. I am not 100%
sure of the status of a program that breaks a rule that is phrased as
this one is:

6.7.5.2p4:

If the size is * instead of being an expression, the array type is a
variable length array type of unspecified size, which can only be
used in declarations with function prototype scope;124) such arrays
are nonetheless complete types.

(Footnote 124 says: Thus, * can be used only in function declarations
that are not definitions (see 6.7.5.3))
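(A small illustration of the distinction footnote 124 draws -- my own sketch, not from the original posts; func is just the same hypothetical name as above:)

#include <stddef.h>

/* declaration that is not a definition: [*] is allowed here, because
   the parameter list has function prototype scope */
void func(size_t n, char arr[*]);

/* in the definition the size must be given as an expression instead */
void func(size_t n, char arr[n])
{
    if (n > 0)
        arr[0] = 0;
}

int main(void)
{
    char buf[4];
    func(sizeof buf, buf);
    return 0;
}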
 

Anand Hariharan

Several people in this group argue that standard C
is not portable since there are no compilers for it, etc.

I propose this program in Standard C, that I have compiled
in several OSes to test if this is actually true. My
basic idea is to see which systems do not have a compiler that
supports standard C.
(...)

Thanks in advance.

I wish you didn't multi-post. Had Keith Thompson not hinted that you
had posted in comp.std.c as well, I would have missed posts there (I
don't follow that group at all).

I think Charlie Gordon's post in that group was a very balanced
critique of your program, and he makes some valid suggestions on how
you can extend your test to truly ascertain what you want to find.

- Anand
 

vippstar

I wish you didn't multi-post. Had Keith Thompson not hinted that you
had posted in comp.std.c as well, I would have missed posts there (I
don't follow that group at all).

Not only did he multi-post, but it's also off-topic in comp.std.c
(note: he said "in this group" and he multi-posted that message...)
 

Nick Keighley

Rui Maciel said:
[C99 program] fails to compile on Windows using Visual Studio 2005 and
Visual Studio 2008.
When compiling using C mode (/TC) the application fails to compile due
to the declaration of 'int top', 'int i' and any other variable
declaration that is not at the beginning of the function. I believe this
is default C90 behavior.
He did mention C99 in the subject line, which doesn't require that all
variable declarations should be restricted to the beginning of a
function.
Why not try to compile the code with a decent, standard compliant
compiler?

begging the question
So you seem to be saying that C99 programs are portable to conforming C99
compilers, is that right? What do you want us to do - hold the front page?

As I understand it, the point (such as it is) of Jacob Navia's challenge is
to establish whether a rather naive little program that makes undemanding
use of a VLA is portable across a variety of not-quite-C99
implementations.

I don't think it's unreasonable to produce some sort of basic
validation test case for C99. Of course Jacob's program
falls *way* short of being that.

If he had written the program slightly differently:

#include <stdlib.h>
#include <stdio.h>
int main(int argc, char **argv)
{
  int rc = EXIT_SUCCESS;

  unsigned long n = argc > 1 ? strtoul(argv[1], NULL, 10) : 0;
  if(n > 0 && n <= 100)
  {
    printf("The sum of the first %lu integers is %lu\n",
           n, n * (n + 1) / 2);
  }
  else
  {
    puts("Please use an argument in the range 1 to 100 inclusive.");
    rc = EXIT_FAILURE;
  }
  return rc;

}

then it would have been portable to *all* conforming hosted
implementations, C90 as well as C99 (as well as being faster and shorter
than his original).

um. and not a test of C99 implementations (using "implementation" in
its weak form).

But that would have been beside his point. He has
purposefully introduced non-portable features into the code with the
intent of demonstrating that they aren't really as non-portable as all
that, really. How far he has succeeded is rather questionable, since his
code steps around many C99 conformance issues.

(Frankly, I'd rather use the shorter, faster, more portable version, but
YMMV.)

but that misses the point.

Doesn't C99 have rather more new features than mixing declarations
with statements and VLAs?


--
Nick Keighley

"an easy to use computer should do what I mean, not what I say,
and by no means send me a dancing paper clip to ask"
Nicholas Negroponte (MIT Professor)
 

Nick Keighley

many people claim that C99 programs are not very portable
because few implementations of C99 exist.

Are there *any* embedded implementations of C99?

Tests are meaningless.

so how can *any* implementation of any language
claim to be conformant with any standard?

Do the implementors submit proofs of correctness?

No, they use a validation suite of some sort.

(of course Jacob's program *very much* isn't a validation suite)
 

Nick Keighley

What I think Jacob Navia is trying to show - and it's a perfectly
reasonable thing to try, I hasten to add - is that a typical C program
that uses typical C99 constructs (in what we might call a vanilla,
non-destruct-test way) is portable *not only* among implementations that
*conform* to C99 - which of course it must be, by definition - but *also*
among implementations that merely "support" C99 - such as gcc, for
example.

I don't actually agree that the experiment will tell us very much that is
useful (for reasons which I can explain fully if anyone cares enough)

go on then.

wouldn't a well-defined, moderately portable subset of C99 be a useful
thing to have?

<snip>
 

jacob navia

Not only did he multi-post, but it's also off-topic in comp.std.c
(note: he said "in this group" and he multi-posted that message...)

I did NOT multi post, and if you read comp.std.c you will
see that the messages are different.
 

vippstar

I did NOT multi post, and if you read comp.std.c you will
see that the messages are different.

Oh yeah, you are right, I'm sorry. The messages are indeed different.
(their hash might differ; in essence they are the same; multi-post for
me)
Your message in comp.std.c is _STILL_ off-topic for that group.
 

jacob navia

Richard said:
Nick Keighley said:


<sigh> I'd have thought you'd know this already. Fortunately, I've found an
easy way to explain it.

Let's just imagine a world in which Jacob Navia's test program can be
compiled on a dozen different compilers that offer "support" for C99, with
the right semantics and everything. What does that tell us about the
portability of *this* program (which, as far as I'm aware, is legal C99,
although obviously I haven't tried it because I don't have a C99
implementation):

#include <stdio.h>
#include <math.h>
struct vector { double x; double y; };
double vector_length(const struct vector *p)
{
return sqrt(p->x * p->x + p->y * p->y);
}
int main(void)
{
printf("%f\n", vector_length(&(const struct point){.x=3.0,.y=4.0}));
return 0;
}

? And of course the answer is that it doesn't tell us anything at all.


Well, it could tell us something if you would get rid of the
errors to begin with...

PTOTBBC!!!!

(Please Turn On The Brain Before Coding)
 

Bartc

Richard Heathfield said:
Nick Keighley said:

Sure. We could call it "C90".

Suppose C99 offered 1000 extra features over C90.

How many of those 1000 features would have to be fully implemented for a
valid C99 implementation? All of them?

So you wouldn't even use 1 useful C99 feature, because a particular
implementation only manages 998 of the rest?

(I'd imagine such an attitude would be quite dispiriting to any C99
developers; they can nearly kill themselves getting 997 of the 1000 in
place, but that might as well be 0 out of 1000 as far as some people are
concerned.)

Let's look at just one C99 feature: being able to declare ints with a
guaranteed 64 bits. That makes sense to have for someone programming
with 64-bit values, especially on 64-bit-capable hardware.

This seems to be available on several C compilers that are not considered
(especially by you) to be C99-capable.

Is someone using such an extension effectively using a small subset of C99?
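(By way of illustration, and purely as a sketch of my own: with a compiler that provides the C99 <stdint.h>/<inttypes.h> headers and long long, such use might look like this --)

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    long long big = 1LL << 40;         /* long long: at least 64 bits in C99 */
    int64_t exact = INT64_C(1) << 40;  /* int64_t: exactly 64 bits, where provided */

    printf("%lld\n", big);
    printf("%" PRId64 "\n", exact);
    return 0;
}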

It seems to me a good idea, when using extensions, to use those which are
part of C99, even if your current compiler is not C99. Then your code can be
portable to other compilers which /are/ C99, or which implement a subset of
C99 that is a superset of yours.

In this case C99 does become useful now, even if you have no full
implementation to hand. It will give encouragement to C99 compiler
developers, because there will be code around using C99 features, and
therefore there will be a demand for compilers which implement at least that
subset.

And, you get the benefit of those extra features!

So Nick's suggestion for there to exist a well-defined set of C99 features,
that are an /enhancement/ to C90, sounds good. Better than everyone using
ad-hoc extensions to their compilers.
 

Ben Bacarisse

jacob navia said:
Well, it could tell us something if you would get rid of the
errors to begin with...

One typo. Presumably you are grateful for that typo since the
corrected version gives 12 compile errors with lcc-win32.

Does lcc-win32 compile the VLA example I posted in reply to your
suggestion that C99 is portable? Personally, I think C99 is
reasonably portable for some application domains provided one avoids
lcc-win32, at least for the moment. The problems with designated
initialisers, compound literals, VLAs, variable macro argument lists
and non-constant initialisations make it too fragile.

I have no problem with your decision to implement extensions before
C99 features (it probably makes commercial sense given that, on
Windows, the dominant compiler vendor is not interested in C99) but it
sits oddly with your apparent passion for C99 here.
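(To make that list concrete, a small sketch of my own, exercising exactly those constructs -- designated initialisers, a compound literal, a VLA, a variable macro argument list and a non-constant initialiser; struct point and TRACE are purely illustrative names:)

#include <stdio.h>

#define TRACE(...) printf(__VA_ARGS__)       /* variable macro argument list */

struct point { double x, y; };

int main(void)
{
    int n = 3;
    double grid[n];                          /* variable length array */
    struct point p = { .y = 4.0, .x = 3.0 }; /* designated initialisers */
    struct point q = { .x = p.x * n };       /* non-constant initialiser */

    for (int i = 0; i < n; i++)              /* declaration inside for */
        grid[i] = i * p.x;

    TRACE("%f %f %f\n", grid[0], grid[1], grid[2]);
    printf("%f\n", ((struct point){ .x = 1.0, .y = 2.0 }).y);  /* compound literal */
    printf("%f\n", q.x);
    return 0;
}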
 

Bartc

Richard Heathfield said:
Bartc said:
.....
It depends on your goals. If you don't know in advance on which
implementations your program is expected to work, you can't be sure that
the subset of features you use is supported by all the implementations.

Never mind a thousand - let's try six. We'll call them P Q R S T U.

Implementation   Features supported
      A          P Q S T U
      B          P Q R T
      C          P Q T U
      D          P R S U
      E          P Q R S U
      F          P R S T
      G          ? ? ? ? ? ?

Seems a 'well-defined subset of C99' is still a good idea.

Which subset this should be is hard to tell from your example: there are 4
implementations of each feature apart from P.

I would go with the easiest to implement and/or those which already exist
anyway. It might be that there is already a subset implemented across many
compilers, for example //-comments and long-long-int. That would at least be
a start, and require little effort. E.g. the subset [P] in your example.

And for platforms for which there is only G, and there exists a good reason
to port your code to that platform, then the customers of that platform
should demand a [P]-compliant compiler. Otherwise why should G stifle the
development and progress of all the others?
 

Keith Thompson

Bartc said:
Seems a 'well-defined subset of C99' is still a good idea.

I'll grant you that a "well-defined subset of C99" *would be* a good
idea. Though I'd be much happier if all the relevant implementers
just implemented the whole language, eliminating the need for subsets.
Note that the latter is essentially what happened for C90; you don't
see many C compilers that fail to implement all of the C90 standard.

The problem is that nobody has actually defined such a subset -- other
than C90 itself, of course (or rather, the intersection of C90 and
C99).
Which subset this should be is hard to tell from your example: there are 4
implementations of each feature apart from P.

And in the real world, where there are more than 6 C99-specific
features and more than 6 implementations, it's nearly *impossible* to
tell what this subset should be.
I would go with the easiest to implement and/or those which already exist
anyway. It might be that there is already a subset implemented across many
compilers, for example //-comments and long-long-int. That would at least be
a start, and require little effort. E.g. the subset [P] in your example.

And how would you go about enforcing this?

Let's assume, for the sake of argument, that all C compilers implement
// comments and long long. (Note that the latter also requires
library support for the *printf and *scanf functions; the library
isn't necessarily provided by the same vendor as the compiler.)

But as far as I know, no compiler enforces that particular subset.
For example, gcc also implements mixed declarations and statements.
If I'm going to write code using a well-defined subset of C99, I want
my compiler to warn me when I violate that subset. If I tell gcc to
accept // comments and long long, it will *silently* accept mixed
declarations and statements as well, and when I try to port my code to
another compiler that supports this well-defined subset, it chokes
because the other compiler *doesn't* support mixed declarations and
statements.
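(For concreteness, a fragment of the kind being described, sketched for illustration only: it uses // comments and long long, the intended subset, but also slips in a declaration after a statement, and nothing flags the slip --)

#include <stdio.h>

int main(void)
{
    long long total = 1;   // C99-style comment and long long: the intended subset
    total *= 2;
    int later = 3;         /* declaration after a statement: also C99-only,
                              accepted just as silently, so nothing warns that
                              the intended subset has been exceeded */
    printf("%lld\n", total + later);
    return 0;
}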

A conforming C90 compiler or a conforming C99 compiler will diagnose
code that fails to conform to the specified standard. That's one of
the great advantages of having a standard. If there were widely
available conforming well-defined-subset-of-C99 compilers that
diagnosed code that fails to conform to that subset, we could use that
subset with some confidence. But there aren't -- *unless* the subset
you choose is just the intersection of C90 and C99. Even then you
could miss warnings for things like implicit int that are valid C90
but invalid C99, but there are few enough features that we can live
with that.
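(An example of the sort of thing meant here, again just a sketch: the following is valid C90, where a missing type specifier defaults to int, but is a constraint violation in C99, which dropped implicit int, so a conforming C99 compiler must issue a diagnostic --)

/* C90: return type defaults to int; C99: diagnostic required */
foo(void)
{
    return 42;
}

int main(void)
{
    return foo() == 42 ? 0 : 1;
}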

Now if you only care about implementations that support some certain
subset of C99 features, and you're willing to accept the risk that one
compiler won't diagnose code that goes beyond that subset, and you
therefore can't be sure of how portable your code is until you test it
on all relevant systems, then that's fine; go ahead and use whatever
C99 features you like. It may even be the case that most C
programmers are in a position to do so.
And for platforms for which there is only G, and there exists a good reason
to port your code to that platform, then the customers of that platform
should demand a [P]-compliant compiler. Otherwise why should G stifle the
development and progress of all the others?

Have you ever tried *demanding* that a vendor support some particular
feature?
 

Flash Gordon

jacob navia wrote, On 27/08/08 13:55:
Should have been:
printf("%f\n", vector_length(&(const struct vector){.x=3.0,.y=4.0}));
Well, it could tell us something if you would get rid of the
errors to begin with...

PTOTBBC!!!!

(Please Turn On The Brain Before Coding)

Everyone makes mistakes occasionally. Now I've fixed it I get the
correct result from gcc here. Does it work on the other compilers you
have access to?

Also consider the following program:

#include <stdio.h>
#include <math.h>
#include <fenv.h>
#include <assert.h>

void foo(int round_dir)
{
#pragma STDC FP_CONTRACT OFF
#pragma STDC FENV_ACCESS ON
int save_round;
int setround_ok;
save_round = fegetround();
setround_ok = fesetround(round_dir);
assert(setround_ok == 0);
printf("%99.99f\n",1.0/3);
/* ... */
fesetround(save_round);
/* ... */
}

int main(void)
{
foo(FE_UPWARD);
}

I believe that if it compiles it should run without triggering the
assert and print a number slightly higher than the mathematical result
of 1/3. However gcc gives me the following:

markg@brenda:~$ gcc -std=c99 -pedantic -Wall -Wextra -lm t.c
t.c: In function ‘foo’:
t.c:8: warning: ignoring #pragma STDC FP_CONTRACT
t.c:9: warning: ignoring #pragma STDC FENV_ACCESS
markg@brenda:~$ ./a.out
0.333333333333333314829616256247390992939472198486328125000000000000000000000000000000000000000000000

Note that the warnings suggest that gcc is not correctly handling the
pragmas and that the result is rounded down, not up as requested. The function
foo is only slightly modified from an example in the standard, modified
*just* enough to give us some output to check.

Do the compilers you have access to do the right thing? Have I
misinterpreted the standard or is gcc wrong?
 

jacob navia

Flash said:
Also consider the following program:

#include <stdio.h>
#include <math.h>
#include <fenv.h>
#include <assert.h>

void foo(int round_dir)
{
#pragma STDC FP_CONTRACT OFF
#pragma STDC FENV_ACCESS ON
int save_round;
int setround_ok;
save_round = fegetround();
setround_ok = fesetround(round_dir);
assert(setround_ok == 0);
printf("%99.99f\n",1.0/3);
/* ... */
fesetround(save_round);
/* ... */
}

int main(void)
{
foo(FE_UPWARD);
}

I believe that if it compiles it should run without triggering the
assert and print a number slightly higher than the mathematical result
of 1/3. However gcc gives me the following:

markg@brenda:~$ gcc -std=c99 -pedantic -Wall -Wextra -lm t.c
t.c: In function ‘foo’:
t.c:8: warning: ignoring #pragma STDC FP_CONTRACT
t.c:9: warning: ignoring #pragma STDC FENV_ACCESS
markg@brenda:~$ ./a.out
0.333333333333333314829616256247390992939472198486328125000000000000000000000000000000000000000000000


Note that the warnings suggest that gcc is not correctly handling the
pragmas and that the result is rounded down, not up as requested. The function
foo is only slightly modified from an example in the standard, modified
*just* enough to give us some output to check.

Do the compilers you have access to do the right thing? Have I
misinterpreted the standard or is gcc wrong?

You have set up the rounding direction of the processor.

What is the rounding direction?

When the processor determines that the true result of a calculation
IN THE LAST BIT lies between the 1 and 0 it must ROUND the LAST BIT
either to the nearest (1 or 0) or upwards (towards + or minus inf) or
downwards (towards zero).

Note that this is done to the LAST BIT of the calculation.

Since 1/3 --> 0.33333333333333333333333333333333333333333333 (etc)
somewhere the processor should round to
0.3333333 (precision number of "3") 4

That result should be the one displayed by printf.
 

Keith Thompson

Oh yeah, you are right, I'm sorry. The messages are indeed different.
(their hash might differ; in essence they are the same; multi-post for
me)
Your message in comp.std.c is _STILL_ off-topic for that group.

They're very similar, and they have the same subject header. jacob
should at least have mentioned in each that he was posting a very
similar article to the other newsgroup.

I don't necessarily agree that it was off-topic for csc. The question
of whether C99 is sufficiently widely implemented is relevant to the
maintenance of the current standard and the development of the next
one. (Whether jacob's article actually says anything about that is
another question, but that was the intent.)
 

Flash Gordon

jacob navia wrote, On 27/08/08 21:55:
You have set up the rounding direction of the processor.

What is the rounding direction?

When the processor determines that the true result of a calculation
IN THE LAST BIT lies between the 1 and 0 it must ROUND the LAST BIT
either to the nearest (1 or 0) or upwards (towards + or minus inf) or
downwards (towards zero).

Note that this is done to the LAST BIT of the calculation.

Since 1/3 --> 0.33333333333333333333333333333333333333333333 (etc)
somewhere the processor should round to
0.3333333 (precision number of "3") 4

That result should be the one displayed by printf.

OK, I can accept that the result could possibly be correct. However, if
I change FE_UPWARD to FE_DOWNWARD I get the same result. I really find
it hard to believe that the correct result for upward and downward
rounding is the same when the number cannot be accurately represented.

Also if I change 1.0/3 to 1.0/10 I get, whatever rounding I select, a
result of:
0.100000000000000005551115123125782702118158340454101562500000000000000000000000000000000000000000000

Again, I find it *incredibly* hard to believe that this is the correct
result for both upward and downward rounding.

Also you have not said whether the other compilers you have access to
(including your own) do the correct thing (whatever that is) with this
code. It is not complex, and for some people to whom C99 could be of
interest, the correct operation of things like this is important.

Finally you have not addressed the point that if gcc is ignoring the
pragmas I am using (as it claims) then there is something that it is
doing wrong.
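(Not part of the original exchange, but one way to probe whether the run-time rounding mode is honoured at all is to force the division to happen at run time on operands the compiler cannot fold. A sketch only: it assumes FE_UPWARD and FE_DOWNWARD are defined, and it still depends on the FENV_ACCESS pragma being respected.)

#include <stdio.h>
#include <fenv.h>

#pragma STDC FENV_ACCESS ON

/* volatile operands stop the compiler evaluating a/b at translation
   time, so the current rounding mode has a chance to matter */
static double divide(volatile double a, volatile double b)
{
    return a / b;
}

int main(void)
{
    if (fesetround(FE_UPWARD) == 0)
        printf("up:   %.60f\n", divide(1.0, 3.0));
    if (fesetround(FE_DOWNWARD) == 0)
        printf("down: %.60f\n", divide(1.0, 3.0));
    return 0;
}

If the two printed values differ in their last digits, the run-time rounding mode is taking effect; if they are identical, something (constant folding, or the ignored pragma) is getting in the way.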
 

Bill Reid

Doesn't C99 have rather more new features than mixing declarations
with statements and VLAs?

Yes, clueless Google(TM) Groups user, yes it does.

As one example, apparently C99 adds a "hh" input specifier
modifier to *scanf(). The other day I got curious about this. Although
my compiler does not include this in the documentation, and generally
SEEMS to mean C89 when it refers to "ANSI Standard C", it does have
several C99-"like" "extensions" (some of which are actually documented,
others not), so I tried reading a small integer into an unsigned char
using "%hhu".
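(Roughly this sort of thing -- a hypothetical reconstruction rather than Bill's actual code:)

#include <stdio.h>

int main(void)
{
    unsigned char value = 0;

    /* C99's "hh" length modifier tells *scanf() that the %u conversion
       stores into an unsigned char rather than an unsigned int */
    if (sscanf("19", "%hhu", &value) == 1)
        printf("read %u\n", (unsigned)value);
    return 0;
}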

It worked! So I tried changing an entire library that uses HUNDREDS
of *scanf()s, did a fairly quick global replace, ran them through the
library test suite, and found they ALL WORKED!!!

EXCEPT ONE!!! One lousy *scanf() returned 0 trying to scan like
"19" or something!!! Out of SEVERAL HUNDRED conversions, ONE
DIDN'T WORK RIGHT!!!

So is that a bug in my compiler? Remember, they don't document
the "hh" modifier AT ALL, so I can't very well complain can I? How
portable is that, I mean, it's not even RELIABLY portable to my OWN
"implementation"...

Just for the record, VLAs are supported and documented as a
"special extension" to the "ANSI C Standard", as are "//" comments
(it's a C/C++ compiler anyway) and some other stuff like that (again,
some documented, some not), but NOT mixing declarations...
 
