Barry said:
> On Wed, 05 Dec 2007 17:31:44 +0000, Ben Bacarisse wrote:
>>> Does the standard really require an object to have padding bits in
>>> order for a value to be a trap representation?
>> Not in general. However, the rules for unsigned types require that
>> every possible pattern of the value bits represent a valid value.
>> Therefore, the only way an unsigned type can have a trap
>> representation is to have padding bits. Since unsigned char is not
>> allowed to have padding bits, that option isn't open for it.
> By this reasoning, if an int of indeterminate value happens to contain
> an unspecified value (as opposed to a trap representation), then
> accessing that int would not invoke undefined behavior.
Correct. But, since you have no way of knowing or ensuring that this is
the case when you leave an int uninitialized, it doesn't do you much good.
> By the way, 6.2.6.1p5 ensures that all character types, not just
> unsigned ones, are free of trap representations.
No, it merely restricts the undefined behavior that can result from
producing or reading such a representation to lvalues that do not have
character type. The defining characteristic of a trap representation is
not the undefined behavior; it is the fact that the bit pattern does not
"represent a value of the object type". I find this a meaningless
distinction, and probably contrary to the intent of the authors, but it
is what they actually wrote.