Given the following code snippet:
int a[10];
int* p=(int*)((&a)+1);
When I debugged the code in an IDE such as VC7.1, I found that:
a = 0x0012fc50 (int [10])
p = 0x0012fc78 (int *)
(&a)+1 = 0x0012fc51 (char *)
(&a)+2 = 0x0012fc52 (char *)
(&a)+3 = 0x0012fc53 (char *)
Why isn't p equal to (&a)+1? And why did "(&a)+1" become a "char*" type?