Luke Wu
Whenever one runs across programs that use the return value from
getchar() to read input, it's almost always stored in a variable of
type int.
I've read explanations on this and have always used this form.
Recently, someone asked me why an int was needed, since -1 (the usual
value of EOF) can be safely represented in a signed char. In fact, any
negative value that fits in a signed char wouldn't conflict with the
0-127 positive character values.
I ran the following program under two different environments and it
worked fine:
#include <stdio.h>

int main(void)
{
    signed char c;    /* char-sized variable instead of the usual int */

    while ((c = getchar()) != EOF)
        putchar(c);

    return 0;
}
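For comparison, the form I've always used (the one the explanations
recommend) keeps the result in an int before testing for EOF; a minimal
version looks like this:

#include <stdio.h>

int main(void)
{
    int c;    /* int, so every valid character and EOF stay distinguishable */

    while ((c = getchar()) != EOF)
        putchar(c);

    return 0;
}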
If one is only expecting input in the form of a 7-bit character set
(i.e., US-ASCII), is it safe to use signed char? I'm asking because I
use C to program small, resource-poor, 8-bit microcontrollers, and
often have to implement 7-bit ASCII-based communications. Using a
signed char instead of an int could improve timing/ROM/RAM usage. But
is there something I'm missing (a lurking bug)?
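To show what I mean on the micro side, here's roughly the pattern I
have in mind. The uart_getc()/uart_putc() names are just stand-ins for
my own target-specific routines (here they wrap stdio so the sketch can
be tried on a PC):

#include <stdio.h>

/* Placeholder names -- on the real target these would poll the UART. */
static int uart_getc(void)             { return getchar(); }
static void uart_putc(unsigned char c) { putchar(c); }

int main(void)
{
    signed char c;    /* 8-bit object instead of a (16-bit) int on the micro */

    /* EOF (i.e. a negative "no data" sentinel) ends the loop */
    while ((c = uart_getc()) != EOF)
        uart_putc((unsigned char)c);

    return 0;
}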
Help will be greatly appreciated. Thanks