Suppose we have:
int x;
int *p = &x;
int *q;
q = p + 1;
Is calculating p + 1 correct? My understanding is that only for arrays
can we take the address of one element past the end of the array. In
this case, p + 1 should invoke undefined behaviour, even though it
merely points sizeof(int) bytes beyond the location of x.
In practical terms, q now points to the memory one int beyond x; the
contents of that memory are entirely unknown to you, and attempting to
do anything with that memory is where you'll run into trouble.
(Since you never assigned anything to x or to *p, attempting to
access the contents of x, *p, or *q is also going to run into
trouble.)
I'm not a standards guru, so I can't tell you whether this trouble is
"undefined behavior" or not, but *writing* to memory that was never
allocated to you (also called "smashing memory") is the source of the
nastiest bugs to track down, and probably the number one reason for
avoiding pointer arithmetic. (Imagine that the memory address q now
points to is the start of a string used somewhere else in the code
entirely; writing '\0' there truncates the string, and while you're
trying to figure out why that string disappears, you might never think
to look in *this* block of code. Nasty stuff.)
Reading from uninitialized memory can also be confusing: *most* of the
time the uninitialized value might be 0, as you expect, but then
occasionally it's not, which also makes for some headaches. Bugs that
can't reliably be reproduced are never fun. <OT> Also, some compilers
will zero out uninitialized automatic variables at low optimization
levels, but not at higher ones; so if you compile in a debug mode
during development, and then turn on optimization when you're ready to
release your product: voila, bugs you've never seen before. Again, not
fun.</OT>