C
Christopher Benson-Manica
In a thread from substantially earlier this week,
Harald van Dijk said: getchar does not work with plain chars, it works with unsigned chars. 163
fits just fine in an unsigned char, so getchar is allowed to return 163.
Being rather pedantic, I decided to try to verify whether this was
true. I would appreciate knowing whether my reading of the Standard
is correct.
7.19.7.1 (as we all know) states that fgetc() (and thus its friends)
"obtains [a] character as an unsigned char converted to an int".
There is nothing in the Standard (that I was able to find) which
states that sizeof(int) may not be 1, so it occurred to me to ask, "Is
163 always representable as a signed int if sizeof(int) is 1?"
5.2.4.2.1 states that INT_MAX may not be less than 32767, so the
answer to that question appears to be "yes".
On the other hand, I do not see anything in 5.2.4.2.1 which requires
that UCHAR_MAX not be greater than INT_MAX - and indeed UCHAR_MAX must
exceed INT_MAX if sizeof(int) == 1, since unsigned char then has at
least as many value bits as int, correct? In such a case, fgetc() may
return UCHAR_MAX (right?), so either fgetc() must work
behind-the-scenes magic to return a signed int representing UCHAR_MAX,
or it must invoke UB by overflowing the signed type int. Both of these
alternatives seem ridiculous to me, so what am I missing?