Guest
Is the behavior of INT_MIN/-1 defined?
6.5.5p6 specifies a consistency requirement on a/b, namely that
(a/b)*b + a%b equals a, provided a/b is representable. But if it is
not representable (INT_MIN/-1 is such a case for int), that paragraph
is silent. In other places the standard explicitly says a result is
undefined; here it neither says so nor defines the result. Am I to
take it as undefined anyway?
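To make the representability problem concrete: on a two's-complement
machine, INT_MIN/-1 mathematically equals -INT_MIN, which is INT_MAX+1
and so does not fit in an int. Here is a minimal sketch of a guard one
can apply before dividing; the helper name safe_div is my own, not
anything from the standard library:

    #include <limits.h>
    #include <stdio.h>

    /* Returns 1 and stores the quotient if a/b is representable,
     * 0 otherwise (b == 0, or the INT_MIN/-1 overflow case). */
    int safe_div(int a, int b, int *result)
    {
        if (b == 0 || (a == INT_MIN && b == -1))
            return 0;
        *result = a / b;
        return 1;
    }

    int main(void)
    {
        int q;
        if (safe_div(INT_MIN, -1, &q))
            printf("quotient: %d\n", q);
        else
            printf("INT_MIN / -1 is not representable as int\n");
        return 0;
    }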
The behavior I see on x86, when the compiler does not optimize the
division away at compile time, is that the process dies with a
floating-point exception (which probably really means an integer
arithmetic exception at the CPU level).
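For reference, here is a minimal reproducer of what I observe, assuming
an x86 target; the volatile qualifiers are only there to keep the
compiler from evaluating the division at translation time. On Linux the
process is typically killed by SIGFPE, which despite the name is the
signal delivered for integer divide errors (the CPU's #DE exception):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* volatile forces an actual run-time division */
        volatile int a = INT_MIN;
        volatile int b = -1;
        printf("%d\n", a / b);  /* traps on x86 */
        return 0;
    }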