Even assuming you meant 8 bits, this is not true. If one system uses ASCII
and the other uses EBCDIC, you're screwed. Even the subtle distinctions
between ISO Latin-1 and ISO Latin-15, two almost compatible and widely used
character sets, might bite you. All of these use 8 bits (well, OK, ASCII
uses 7).
But there is no requirement in either C or C++ for a byte to be exactly 8
bits; only that it must be /at least/ 8 bits.
But note the unfortunate discrepancy between the meaning of the word byte
in C/C++ and its meaning when measuring storage. However, C/C++ is not
alone here: Internet standards talk about octets when they mean 8 bits.
The same goes for the unit 'word': it means different things to different
people. The way I learned it at uni, a very long time ago, was that a word
was the basic unit of storage, the same as the definition of byte in
C/C++. Along came Microsoft and institutionalised the word size of the
8086 as a WORD, so to others a word is now 16 bits. I've seen even more
uses of the word 'word'; anyone got an example?
Why am I saying this? Because in the context of C/C++, a byte has a
defined meaning. However, in the context of disks and memory, a byte has a
different meaning. When the context is not clear, it is very easy to get
confused. Ah, I hear you say, but this is a C/C++ group, so the meaning is
clear. That may be true, but:
- The problem was described in a certain context, one where many people
(incorrectly) use the word byte to mean 8 bits.
- It is very confusing to people anyhow. Youngsters are raised on the
notion that a byte is 8 bits.
In the end, we can only conclude that this difference in meaning is very
unfortunate. Technically, an octet is the correct term for 8 bits. But
we're never going to change the common use of byte now. In the meantime
we'll just have to live with it.
I just wish the C/C++ standards had used a different term than byte.
Even 'word' would have been better.
M4