jacob said:
As many posters have replied here, you *can* test for overflow
yourself. What bothers me is that the language does not enforce this,
even though overflow violates the standard, as Jack Klein has noted in
this same thread.
It does not do this for the same reason that all other undefined behavior is
not necessarily diagnosed: to avoid constraining implementations.
If an implementation generates a hardware trap on overflow, for example, and
there is nothing a C program or compiler can do to modify this behavior in
the slightest, there is little to gain from making provisions in the
standard about it.
If defined behavior is necessary on such platforms, two integer calculations
have to be performed for every operation to be checked. It would obviously
be ill-advised for the standard to mandate this.
The lcc-win32 compiler tests for overflow with a command-line
switch. This wasn't very difficult to do, and the impact
on run time is minimal; in any case it is not even measurable
for a normal application in a PC environment.
OF COURSE there are other environments where each microsecond counts
and where the quality of the results doesn't matter so much...
Or where the C compiler is not expected to guarantee the quality of the
results, but other methods are employed.
For *those* environments the language standard *could* make an
exception, for instance
#pragma STDC integeroverflow(off)
or similar.
The same could be done for many language features that trigger UB, giving
constructs well-defined behavior unless a pragma turned the checks off. The
only fly in the ointment is that some checks would be very cheap on most
platforms (integer overflow), while others could be impossible to perform
without degrading performance to nonexistent levels on some platforms
(making sure the aliasing rules are not broken).
In the end, that's just not what C is about.
But in the normal case, integer overflow is a serious error,
and one that is very difficult to catch unless it is done at the
compiler level. You think you could have an overflow *here* or
*there*, but actually you get an overflow in a completely unexpected
place!
And the same is true for accessing null pointers, uninitialized variables,
defining duplicate external symbols, accessing freed memory and a myriad
other things that the standard could mandate well-defined behavior for, but
deliberately doesn't. Many languages safer than C have been created in
response to this, but C persists.
In C, it is always the programmer's responsibility to make sure
platform-defined limits are not exceeded, and the programmer's
responsibility to explicitly check for them if it cannot be guaranteed. The
programmer may even choose to exploit their platform's known behavior
despite it being left undefined by the standard, and sacrifice portability
for performance. The merits of this are debatable, of course.
[snip x86 code]
Other mechanisms could be considered, and the generated code *could*
be better. I have found that in PC environments, where lcc-win32 runs,
this is absolutely not measurable for normal applications...
And I'm sure the lcc overflow detection is a valuable help to Win32
programmers. But from a practical and philosophical standpoint, the standard
can't include it. You could argue that it's so cheap and portable that it
ought to be done on every platform, but I don't know if that's true, and I'm
not on the committee.
S.