Richard Heathfield said:
Keith Thompson said:
I suppose that this attitude is more natural for those of us who have
written code that has to work on real live non-ASCII systems, simply
because we're so used to not being /able/ to assume ASCII that it never
occurs to us to rely on ASCII even when we might get away with it.
Perhaps, but I've never really programmed for a non-ASCII system.
It just wouldn't occur to me to write code that depends on the
assumption that 'A' == 65. If I want 'A', I write 'A'.
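To make the contrast concrete, here is a small illustrative sketch (not from the thread itself); only the digit trick and the <ctype.h> call are guaranteed by the standard, the rest is just demonstration:

#include <ctype.h>
#include <stdio.h>

int main(void)
{
    char ch = 'A';

    /* Non-portable: hard-codes the ASCII code point. */
    if (ch == 65)
        puts("ASCII-only comparison matched");

    /* Portable: 'A' is whatever code the execution character set
       assigns, so this works on ASCII and EBCDIC alike. */
    if (ch == 'A')
        puts("portable comparison matched");

    /* The digits '0'..'9' are the one range the standard guarantees
       to be contiguous, so this conversion is portable. */
    printf("'7' has digit value %d\n", '7' - '0');

    /* For classification, <ctype.h> asks the implementation instead
       of hard-coding code points. */
    if (isupper((unsigned char)ch))
        puts("isupper agrees");

    return 0;
}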
Indeed. And, on a related note, I find it very difficult to understand this
fascination with integers that have a particular number of bits. If I need
8 bits, I'll use char (or a flavour thereof). If I need 9 to 16 bits, I'll
use int (or unsigned). If I need 17 to 32 bits, I'll use long (or unsigned
long). And if I need more than 32 bits, I'll use a bit array. I see
absolutely no need for int_leastthis, int_fastthat, and int_exacttheother.
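That rule leans on the minimum ranges the standard guarantees for each type, not on any exact widths. A tiny sketch, purely illustrative, that prints the guarantees involved:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* C promises minimums, not exact sizes: char is at least 8 bits,
       int at least 16, long at least 32. That is what makes picking
       a type by the range you need portable. */
    printf("CHAR_BIT = %d (at least 8)\n", CHAR_BIT);
    printf("INT_MAX  = %d (at least 32767)\n", INT_MAX);
    printf("LONG_MAX = %ld (at least 2147483647)\n", LONG_MAX);
    return 0;
}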
But there are times when you need some exact number of bits,
particularly when you're using an externally imposed data format.
(But then whoever is imposing the data format on you should have
provided a header that declares the appropriate types.)
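As a sketch of the sort of case meant here, suppose a file format specifies a little-endian header with a 16-bit version field and a 32-bit length field. The struct and function names below are invented for illustration, and the exact-width types come from C99's <stdint.h> (where they are optional, though nearly universal in practice):

#include <stdint.h>

/* Hypothetical on-disk layout: the format, not the compiler, says
   these fields are exactly 16 and 32 bits wide. */
struct record_header {
    uint16_t version;
    uint32_t payload_length;
};

/* Reassemble the fields byte by byte, so the result depends neither
   on the host's endianness nor on struct padding. */
static struct record_header parse_header(const unsigned char *buf)
{
    struct record_header h;
    h.version        = (uint16_t)(buf[0] | ((uint16_t)buf[1] << 8));
    h.payload_length = (uint32_t)buf[2]
                     | ((uint32_t)buf[3] << 8)
                     | ((uint32_t)buf[4] << 16)
                     | ((uint32_t)buf[5] << 24);
    return h;
}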
Something that might be more useful would be a way to ask for an
integer type with (at least) a specified *range*. If I'm using a type
to hold numbers rather than bags of bits, I care what numbers I can
store in it, not how many bits it uses to store them.
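C never grew such a facility; about the closest one can get is to compare the range you need against <limits.h> in the preprocessor and pick a type yourself. The macro and typedef names here are invented for illustration:

#include <limits.h>

/* "Give me a type that can count to three million" has no direct
   spelling in C, so the choice is made by hand at compile time. */
#define COUNTER_MAX_NEEDED 3000000L

#if INT_MAX >= COUNTER_MAX_NEEDED
typedef int counter_t;      /* int is wide enough on this target */
#else
typedef long counter_t;     /* long is guaranteed to reach 2147483647 */
#endif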
The introduction of long long int was, in my continued opinion, a mistake.
All the ISO guys had to do was - nothing at all! Any implementation that
wanted to support 64-bit integers could simply have made long int rather
longer than before - such a system would have continued to be fully
conforming to C90. And if it broke code, well, so what? Any code that
wrongly assumes long int is precisely 32 bits is already broken, and needs
fixing.
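A sketch of the kind of breakage in question, assuming a hash-style checksum whose author quietly relied on unsigned long wrapping at 2^32; the function names are invented:

#include <stddef.h>

/* Broken by a wider long: the result silently changes when
   unsigned long stops wrapping at 2^32. */
static unsigned long checksum_broken(const unsigned char *p, size_t n)
{
    unsigned long sum = 0;
    while (n--)
        sum = sum * 31 + *p++;
    return sum;
}

/* Fixed: mask to the 32 bits the algorithm actually wants, so the
   answer is the same whether long is 32, 64, or 128 bits wide. */
static unsigned long checksum_fixed(const unsigned char *p, size_t n)
{
    unsigned long sum = 0;
    while (n--)
        sum = (sum * 31 + *p++) & 0xFFFFFFFFUL;
    return sum;
}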
That's true, but 64 bits is the effective limit for this. The
following:
    char        8 bits
    short      16 bits
    int        32 bits
    long       64 bits
is a reasonable set of types, but if you go beyond that to 128 bits,
you're going to have to leave gaps: with only those four names to cover
five widths, some width has to go missing (for example, there might not
be any 16-bit integer type).
My objection to C's integer type system is that the names are
arbitrary: "char", "short", "int", "long", "long long", "ginormous
long". I'd like to see a system where the type names follow a regular
pattern, and if you want to have a dozen distinct types the names are
clear and obvious. I have a few ideas, but since this will never
happen in any language called "C", I won't go into any more detail.