Malcolm said:
> char may be signed or unsigned, but this is a hangover from K and R and
> shouldn't be of any interest to you. char holds a human-language character,
> unsigned char an arbitrary byte, signed char is of limited use but you might
> occasionally want a small signed integer.
That's what I originally thought. But another post suggested that the
defined range of char is 7 bits (0-127), and that in newer standards it
is distinct from 'signed char' and 'unsigned char'. Which standard that
was, I couldn't tell from the reply. Thinking about it more, I don't
quite believe this, as it would mean the type specifies char to be
7 bits rather than 8.
I'm interested insofar as I'm getting a diagnostic from a compiler for
a small piece of code like this:
01 #include <stdio.h>
02 int ctest(char s)
03 {
04     if (s < 0 || s >= 128) {
05         printf("Out of range\n");
06         return 0;
07     } else {
08         printf("In range\n");
09         return 1;
10     }
11 }
file.c:4: warning: comparison is always false due to limited range of
data type
and I want to know whether the compiler is generating a useless
diagnostic in this case (so I can ignore it, or figure out how to turn
it off), or whether it points to a problem with my code.
So, if 'char' may be either 'signed' or 'unsigned', the check on line 4
should stay as it is. I assume the particular compiler I'm using treats
'char' as 'signed char'.
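For what it's worth, one way to write the check so it behaves the same whether plain char is signed or unsigned (and so the comparison is never vacuously false) is to go through unsigned char first. A sketch, with ctest2 as an illustrative name:

```c
#include <stdio.h>

/* Same check as ctest(), but written against unsigned char so the
 * comparison is meaningful regardless of whether plain char is signed
 * or unsigned on this platform. */
int ctest2(char s)
{
    unsigned char u = (unsigned char)s;  /* value is now 0..UCHAR_MAX */
    if (u >= 128) {
        printf("Out of range\n");
        return 0;
    }
    printf("In range\n");
    return 1;
}
```

Since u is unsigned, the compiler has nothing to warn about: both branches of `u >= 128` are reachable.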
> As a concession to efficiency, characters '0'-'9' are always consecutive and
> you might want to exploit this fact to convert numbers to and from text.
> This is the only occasion in which the machine representation of a
> particular character should normally affect your program.
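(For reference, the digit guarantee mentioned above is used like this; digit_value and digit_char are illustrative names, not anything from the thread:)

```c
/* '0'..'9' are guaranteed consecutive in C, so converting between a
 * digit character and its numeric value is a simple offset. */
int digit_value(char c)
{
    return c - '0';         /* e.g. '7' - '0' == 7 */
}

char digit_char(int v)
{
    return (char)('0' + v); /* e.g. '0' + 3 == '3' */
}
```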
Not quite. I don't actually care about the text itself, as humans don't
need to read it; I'm treating it literally as a string of arbitrary
values. The range of the values concerns me because I'm using an array
of structures indexed by the character value itself. I define that array
to have 128 elements and check that the 'char' is in the range 0 to 127
(for a POSIX system) before using it as an index.
And of course, the size of a char is defined to be at least 8 bits
(exactly 8 on POSIX), so I interpret this as meaning I must check the
8th bit somehow.
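If the goal really is to test the 8th bit, that can be done directly with a mask; a sketch (converting to unsigned char first keeps the result well-defined even for negative char values):

```c
/* Returns 1 if the top bit of the byte is set, 0 otherwise.
 * The cast to unsigned char avoids relying on the signedness of
 * plain char. */
int high_bit_set(char c)
{
    return ((unsigned char)c & 0x80u) != 0;
}
```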
It seems a bit wasteful of CPU cycles and code clarity to take the
value, cast it to int (or assign it to an int), make the comparison,
and then index the array with the int, just to avoid the warning.
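Alternatively, the int round-trip can be skipped entirely by keeping the value in an unsigned char, which then serves as both the range check and the array index. A sketch, with a hypothetical struct entry standing in for the real structure:

```c
#include <stddef.h>

/* Hypothetical per-character record; stands in for the real struct. */
struct entry {
    int count;
};

static struct entry table[128];

/* Returns a pointer to the entry for c, or NULL when the value falls
 * outside 0..127. The unsigned char carries both the check and the
 * index, so no separate int variable is needed. */
struct entry *lookup(char c)
{
    unsigned char u = (unsigned char)c;
    if (u >= 128)
        return NULL;
    return &table[u];
}
```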
Thanks,
Jason.