nandh_dp
When the program below is compiled and executed,
#include <stdio.h>
#define MASK 0xFFFFFFULL

int main(void) {
    unsigned long long a;
    unsigned long b, c;

    a = 0x12345678ULL;
    b = a & MASK;
    c = 0x00345678UL;
    printf("%d %d\n", sizeof(unsigned long), sizeof(unsigned long long));
    printf("0x%x %lu \t 0x%x %lu \t 0x%x %lu \n",
           a & MASK, a & MASK, b, b, c, c);
    return 0;
}
the result from a big-endian machine is
4 8
0x0 3430008 0x0 3430008 0x345678 3430008
Surprisingly, the results for 'a & MASK' and 'b' are 0x0 when the 0x%x
format is used.
And the result from a little-endian machine is
4 8
0x345678 0 0x345678 0 0x345678 3430008
Even more surprisingly, the results are 0 when %lu is used. Why these
unexpected results?
Also, when the printf is changed like this,
printf("0x%x %lu \t 0x%x %lu \t 0x%x %lu \n",
(unsigned long)(a & MASK), a & MASK, b, b, c, c);
the result is
0x345678 0 0x345678 3430008 0x345678 3430008
in a big-endian machine and
0x345678 3430008 0x0 3430008 0x345678 3430008
in a little-endian one.
And with the following,
printf("0x%x %lu \t 0x%x %lu \t 0x%x %lu \n",
(unsigned long)(a & MASK), (unsigned long)(a & MASK), b, b,
c, c);
the results are as expected
0x345678 3430008 0x345678 3430008 0x345678 3430008
on both machines.
How does typecasting one argument affect the printed values of the
others? Or is there another reason behind this, and is what I see a
misconception of some hidden fact?