jaysome said:
I'd believe you if you could give me an example of a compiler that
does nothing at all, or formats my hard drive, or melts my monitor, or
makes demons fly out of my nose, when presented with the following
code:
int main(void)
{
    int i = 0;
    i = i++;
    return 0;
}
Why would you want to write "i = i++;" in the first place? "i++;" is
both simpler and correct.
Nasal demons presumably aren't a real possibility, but the point is
that the standard doesn't say *anything* about what happens when that
statement is executed. If demons do fly out of your nose when you
execute the program, you can complain that the compiler is broken, but
you can't complain (on that basis) that it's non-conforming.
And there are real-world instances of undefined behavior that can
cause Really Bad Things to happen. Many (most?) viruses take
advantage of buffer overruns, for example. A while ago the hard drive
on my laptop died and had to be replaced; I have no idea what caused
it, but who can say it wasn't a result of undefined behavior in some C
program?
The ANSI C standard introduced the "#pragma" directive, which "causes
the implementation to behave in an implementation-defined manner".
Some versions of gcc started up a game of nethack or rogue when they
encountered it. This was obviously silly, but it was conforming
behavior.
[...]
I've seen code like the following:
i = i++ % ARRAY_SIZE; /* increment circular buffer index */
and never thought twice about it--until I *really* started reading
this newsgroup. Perhaps others will have a similar experience. I sure
hope so.
That's just bad code.
I think a lot of programmers, particularly those who come to C from
other languages, think that the "++" operator is really cool, and use
it when it's not at all necessary. Both the examples you've shown
indicate that the programmer is using "i++" as a substitute for "i+1".
C has an operator (prefix "++") that yields the successor of its
argument and has the side effect of modifying its argument. It
doesn't have an operator that just yields the successor of its
argument (it doesn't need one, "1+i" or "i+1" serves the purpose just
fine), but too many programmers mistakenly think that "++" should be
used that way because it's cool and impressively terse.
If you misuse the language, your programs will misbehave. In C, the
language often imposes no constraints on how badly it can misbehave.
And even if your instance of undefined behavior appears harmless
today, it could literally reformat your hard drive tomorrow.