Hi,
I'm a bit confused about the following:
#include <iostream>
#include <limits>  // required for std::numeric_limits

int main(int argc, char** argv) {
    std::cout << "Bits of unsigned char: "
              << std::numeric_limits<unsigned char>::digits << std::endl;
    std::cout << "Bits of char: "
              << std::numeric_limits<char>::digits << std::endl;
    std::cout << "Bits of signed char: "
              << std::numeric_limits<signed char>::digits << std::endl;
    return 0;
}
prints, on my system:
Bits of unsigned char: 8
Bits of char: 7
Bits of signed char: 7
I read in the standard (3.9.1) that all three variants "occupy the
same amount of storage and have the same alignment requirements; that
is, they have the same object representation. For character types, all
bits of the object representation participate in the value
representation". Shouldn't that mean that their size, in bits, must be
the same? Isn't numeric_limits<> supposed to return that size?
Going on, I read that "For unsigned character types, all possible bit
patterns of the value representation represent numbers. These
requirements do not hold for other types", which I interpret as "for,
say, signed char, one bit may be used for the sign". Still, signed
char's size, in bits, should be the same as for unsigned char, shouldn't
it?
Nicola