Hi,
I was wondering something today. The following code:
unsigned char a = 200;
char b = 200;
printf("%d, %d", a, b);
gives:
200, -56
How come? I didn't tell printf() that the first argument was unsigned,
yet it detected that on its own. That doesn't seem possible with
varargs. How is it possible?
Thanks,
This has nothing to do with printf(), per se: it never sees the types
of the original objects that provided the values.
What you are seeing is the result of what the C standard calls the
"default argument promotions". These occur when calling a function
without a prototype in scope, or for the "..." arguments in a variadic
function.
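
As a minimal sketch of what the promotions mean for a variadic callee
(the function 'show' and its arguments are made up for illustration,
not taken from your code):

#include <stdarg.h>
#include <stdio.h>

/* Inside a variadic function the "..." arguments have already
   undergone the default argument promotions, so va_arg must read
   them with the promoted types: int rather than char or short,
   double rather than float. */
static void show(int count, ...)
{
    va_list ap;
    va_start(ap, count);
    int    i = va_arg(ap, int);    /* a char argument arrives as int */
    double d = va_arg(ap, double); /* a float argument arrives as double */
    va_end(ap);
    printf("%d %f\n", i, d);
}

int main(void)
{
    char  c = 'A';
    float f = 1.5f;
    show(2, c, f); /* the promotions happen here, at the call */
    return 0;
}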
In the case of integer types of lesser rank than int, the "integer
promotions" occur. Since int can represent every value of an unsigned
char on your system (including 200), the value in the unsigned char
'a' is converted to the int value 200.
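
In code, the promotion behaves like this explicit, value-preserving
conversion (a trivial sketch):

#include <stdio.h>

int main(void)
{
    unsigned char a = 200;
    int promoted = a;          /* integer promotion: int can hold 200, so the value is unchanged */
    printf("%d\n", promoted);  /* prints 200 */
    return 0;
}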
On your particular system, char is 8 bits wide and 'plain' char is
signed. The binary representation of 200 in an 8-bit byte (1100 1000)
happens to be the same as the 8-bit 2's complement representation of
-56 (200 - 256 = -56), and that is the implementation-defined result
of assigning the out-of-range value to a signed 'plain' char.
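
You can watch that conversion in isolation. This sketch assumes a
typical implementation (8-bit char, two's complement, signed plain
char); ISO C does not guarantee the -56:

#include <stdio.h>

int main(void)
{
    char b = 200;      /* 200 is out of range: the result is implementation-defined */
    /* On a two's complement machine the stored bit pattern is
       1100 1000, which read as a signed 8-bit value is 200 - 256 = -56. */
    printf("%d\n", b); /* commonly prints -56 */
    return 0;
}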
The integer promotions take the 'plain' char value of -56 and convert
it to an int value of -56.
So the values of 200 and -56 are generated during argument evaluation,
before printf() is called.
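
As a follow-up, in case you want both values printed as 200: convert
back explicitly before the call (one way among several):

#include <stdio.h>

int main(void)
{
    unsigned char a = 200;
    char b = 200;                            /* -56 on your system */
    printf("%d, %d\n", a, (unsigned char)b); /* -56 converts to 200 (well-defined), then promotes to int */
    return 0;
}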
On some platforms, char has more than 8 bits, and 200 would remain 200
in a 'plain' char or signed char. On some platforms, 'plain' char is
unsigned, not signed. On platforms with either of these
characteristics, the output would be "200, 200".
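
You can ask your own implementation which case it falls into with a
small probe using <limits.h>:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT = %d\n", CHAR_BIT); /* bits in a byte; 8 on most hosts */
    printf("plain char is %s\n",
           CHAR_MIN < 0 ? "signed" : "unsigned"); /* CHAR_MIN is 0 when plain char is unsigned */
    return 0;
}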
And on some platforms, the output would not appear at all: whether a
final line of output needs a terminating '\n' to be written is
implementation-defined, and you did not add one.
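
The easy fixes are to terminate the last line, or to flush the stream
explicitly; for example:

#include <stdio.h>

int main(void)
{
    unsigned char a = 200;
    char b = 200;
    printf("%d, %d\n", a, b); /* a terminating '\n' ends the last line */
    /* or keep the original format string and flush the buffer:
       printf("%d, %d", a, b); fflush(stdout); */
    return 0;
}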