pellepluttrick
Hi,
I thought I understood this stuff - but...
This little program (taken from W. Richard Stevens' Unix Network
Programming, Volume 1) determines a machine's endianness:
#include <iostream>
using namespace std;

int main()
{
    union {
        unsigned short us;
        char c[sizeof(unsigned short)];
    } un;

    un.us = 0x1234;
    if (sizeof(unsigned short) == 2) {
        if (un.c[0] == 0x12 && un.c[1] == 0x34)
            cout << "big-endian\n";
        else if (un.c[0] == 0x34 && un.c[1] == 0x12)
            cout << "little-endian\n";
        else
            cout << "unknown\n";
    } else
        cout << "sizeof(unsigned short) = " << sizeof(unsigned short)
             << ".\n";
}
On my machine it prints little-endian (as expected). This obviously
must mean that the bytes are laid out in memory as 0x34, 0x12. Right?
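To double-check that (in case the union trick itself is suspect), the
same layout can be dumped with memcpy. A quick sketch along the same
lines:

#include <cstdio>
#include <cstring>

int main()
{
    unsigned short us = 0x1234;
    unsigned char bytes[sizeof us];
    // Copy the object representation out, lowest address first.
    memcpy(bytes, &us, sizeof us);
    // Should print 0x34, 0x12 on a little-endian machine.
    printf("0x%.2X, 0x%.2X\n", bytes[0], bytes[1]);
}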
So if I want to send an unsigned short over the network from this
machine (which should be done in network byte order, i.e. big-endian),
I must reverse the order of the bytes in memory so this unsigned short
is sent as 0x12, 0x34. Right? (Oh, btw, I do not have htons on this
platform :-( ...)
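As far as I understand, on a little-endian machine htons just swaps
the two bytes, so a hand-rolled stand-in would look something like
this (swap16 is my own name for it; on a big-endian machine htons
would instead return its argument unchanged):

unsigned short swap16(unsigned short v)
{
    // Exchange the high and low bytes of the 16-bit value.
    return (unsigned short)(((v & 0xFF) << 8) | ((v >> 8) & 0xFF));
}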
Now the following short block of code converts the unsigned short into
a stream of two bytes:
// Remember: stored in memory as 0x34 0x12
unsigned short us = 0x1234;
char buf[2];
// Convert from little-endian to big-endian
buf[0] = us & 0xFF; // Should now contain 0x12
buf[1] = (us >> 8) & 0xFF; // Should now contain 0x34
// Print the stream...
printf("0x%.2X", buf[0]);
printf(", ");
printf("0x%.2X", buf[1]);
printf("\n");
To my great surprise it printed:
0x34, 0x12
What?!?!?!?
If I change to:
buf[0] = (us >> 8) & 0xFF;
buf[1] = us & 0xFF;
everything works OK, but I don't think it should! Sigh...
Please enlighten me!
/Pelle