So we've got rid of that annoying overloading of the 'char' keyword to
mean "small integer". Where int is 64 bits, short 32, and short short 16,
we can introduce a short short short type of 8 bits.
Now I know what you're going to say. Shouldn't we have a short short short
short? The answer is no. That violates the rule of three (see website).
short short short short should be s4, and is, conveniently, 4 bits on a
64-bit system.
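
For the sake of argument, here's a minimal sketch of how that hierarchy
could be spelled today, using the exact-width types from <stdint.h> as
stand-ins. The s1 through s3 names are my own extrapolation from the s4
naming scheme above, not anything standard:

#include <stdint.h>

/* Hypothetical spelling of the proposed hierarchy on a 64-bit system.
 * Each extra "short" halves the width, so sN is 64 / 2^N bits wide. */
typedef int32_t s1;  /* short:              32 bits */
typedef int16_t s2;  /* short short:        16 bits */
typedef int8_t  s3;  /* short short short:   8 bits */
/* s4 (short short short short) would be 4 bits; standard C has no
 * 4-bit integer type, though C23's _BitInt(4) comes close. */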
LOL, I enjoyed your addition to the "rule of three," but I don't
totally agree with it. I have a five-dimensional array container that I
conceptualized and implemented, and it functions quite well. True, most
developers who follow my logic have difficulty visualizing beyond three
dimensions, but they can learn how it works (a minimal sketch follows).
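
A minimal sketch of what such a container boils down to in C: a flat
buffer plus row-major index arithmetic. The dimension sizes and the at5
helper are made up for illustration, not my actual implementation:

#include <stdio.h>

/* Hypothetical 5-D container: a flat buffer indexed in row-major order.
 * Dimension sizes are chosen arbitrarily for the example. */
enum { D0 = 2, D1 = 3, D2 = 4, D3 = 5, D4 = 6 };

static double buf[D0 * D1 * D2 * D3 * D4];

/* Map a 5-D coordinate to a flat offset. */
static size_t at5(size_t i, size_t j, size_t k, size_t l, size_t m)
{
    return (((i * D1 + j) * D2 + k) * D3 + l) * D4 + m;
}

int main(void)
{
    buf[at5(1, 2, 3, 4, 5)] = 42.0;           /* write one element */
    printf("%f\n", buf[at5(1, 2, 3, 4, 5)]);  /* read it back */
    return 0;
}

Visualizing five dimensions is optional; the compiler only ever sees a
one-dimensional offset.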
Physics is rapidly approaching complete acceptance of eleven
dimensions, since realizing that number "magically" united the various
string theories into one.
That said, a short short is as redundant as long long. I understand
that as our technology advances, increasing variable storage to
currently inconceivable widths, new "language" will always be needed to
ensure backward portability with existing software. But at some point we
will have to stop this practice.
Perhaps a long should be the largest integer available on the hardware,
an int half the size of a long, a short half the size of an int, and a
char half the size of a short. As long as the number of bits in each
primitive is accessible, one can adapt their use accordingly (see the
sketch below).
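
Standard C already makes those bit counts accessible; a quick sanity
check using CHAR_BIT from <limits.h> (the widths printed vary by
platform, which is exactly the point):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT is the number of bits in a char (at least 8). */
    printf("char:  %zu bits\n", sizeof(char)  * CHAR_BIT);
    printf("short: %zu bits\n", sizeof(short) * CHAR_BIT);
    printf("int:   %zu bits\n", sizeof(int)   * CHAR_BIT);
    printf("long:  %zu bits\n", sizeof(long)  * CHAR_BIT);
    return 0;
}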
When our technology extends to 128-bit systems, we should not simply
adopt a "long long long" declaration for a 128-bit integer.
cj