Barry Schwarz said:
I found something tricky this morning.
char *p = "abc";
1. char m = 1[p];            // m == 'b'
2. char n = sizeof('h')[p];  // n == 1
I guess the reason for 1 is:
1 + p == p + 1, so 1[p] is the same as p[1]:
*(1+p) = *(p+1) = p[1] = 1[p].
While your demonstration clears up some of the OP's questions, it does not
account for 2.
Let's delve into this tricky problem:
sizeof('h')[p] is a very different beast from (sizeof('h'))[p].
In C++, (sizeof('h'))[p] would do the same thing as (sizeof(char))[p] and
yield 1[p], which is 'b'.
In C, however, sizeof('h') is the same as sizeof(int), which is
implementation defined.
On some cumbersome DSPs, sizeof('h') is 1, and (sizeof('h'))[p] evaluates
to 'b'.
On 16-bit MSDOS, sizeof('h') is 2, and (sizeof('h'))[p] evaluates to 'c'.
I do not know any system on which sizeof('h') is 3, but that is not
impossible; (sizeof('h'))[p] would then equal 0, the terminating NUL.
On most modern systems sizeof('h') is at least 4, and (sizeof('h'))[p]
invokes undefined behaviour by reading past the end of the 4-byte array.
On the other hand, sizeof('h')[p] does not evaluate as sizeof(char)[p] nor
sizeof(int)[p], both of which are syntax errors.
Indeed, sizeof('h')[p] evaluates as sizeof (('h')[p]), or more simply
sizeof 'h'[p], which is equivalent to sizeof p['h'].
I can hear purists argue that p points to a 4-byte array and 'h' is most
likely far larger than 4 (104 in ASCII), so p['h'] is an out-of-bounds
access. Yet the expression argument to sizeof does not get evaluated; only
its type is determined. The type here is char, and sizeof(char) is 1 by
definition. QED.
Note that you can further mesmerize your audience by defining this
innocuous-looking macro:
#define sizeof(x) (sizeof(x))
What becomes of sizeof('h')[p] in this case?
What other side effects does this macro have?