"Signed" chars may sound strange, but use the default (signed) type,
unless you have a specific need for unsigned.
In a strange historical quirk, plain char is actually a third type: it
has the same values and representation as one of signed char or unsigned
char, but it is a distinct type from both. And it is not always the case
that a plain char is signed, which makes it unlike all the other integer
types.
For all the other integer types, if you omit the qualifier, you get
the signed version -- and you really do get the signed version of that
type; there is NO difference between "int" and "signed int".
For char, though, "char" may be either signed or unsigned, and whichever
it is, it remains a distinct type, even though it has the same range,
representation, and behavior. Which one it is is implementation-defined,
so it should be in the compiler's docs somewhere. (Many compilers even
let you choose.)
-s