Eric Sosman said:
Pointers are like numbers that denote positions on a
scale of some kind. It makes sense to subtract two such
positions to find the interval between them, and it makes
sense to add two intervals, or to add a position and an
interval. But it doesn't make sense to add two positions;
one can only do so by agreeing on a zero reference and then
converting one or both positions to intervals.
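In C and C++ those are exactly the rules for pointer arithmetic.
A minimal sketch in C++ (the array and the offsets are made up):

    #include <cstddef>

    int main() {
        int a[10];
        int *p = &a[2];            // a position
        int *q = &a[7];            // another position

        std::ptrdiff_t d = q - p;  // position - position = interval (5)
        int *r = p + d;            // position + interval = position (== q)
        (void)r;

        // int *s = p + q;         // ill-formed: position + position
        return 0;
    }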
If I take a tape measure and put its end at some arbitrary distance
away, and measure the positions of two points (along the line of the
measure) with it, then I can find the average position by adding the
two measurements and dividing by two, and the point that average
denotes is the same regardless of where the end of the measure is.
Similarly, it works perfectly well to add two temperatures in
Fahrenheit and divide by two to get the average, and it works equally
well in Celsius and Kelvin, and you get the same temperature whichever
scale you use.
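A quick numeric check of the temperature case, again in C++ (the two
readings are made up):

    #include <cstdio>

    // Celsius -> Fahrenheit: an affine map with a zero offset of 32.
    static double c_to_f(double c) { return c * 9.0 / 5.0 + 32.0; }

    int main() {
        double c1 = 10.0, c2 = 30.0;

        double avg_c = (c1 + c2) / 2.0;                  // 20 C
        double avg_f = (c_to_f(c1) + c_to_f(c2)) / 2.0;  // (50+86)/2 = 68 F

        // 68 F converts back to exactly 20 C: averaging commutes
        // with the change of scale.
        std::printf("%g C, %g F\n", avg_c, avg_f);
        return 0;
    }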
If a scale has some kind of zero offset, then of course the sum of two
values will have two of those offsets in it. But provided the meaning
you ascribe to the sum requires you to subtract off a value of the
same kind, or to divide by two, before interpreting the result on the
original scale, the offsets all cancel out: dividing by two leaves
exactly one offset, which is just what any single reading on that
scale carries anyway.
The sum of two pointers doesn't make sense *as a pointer* any more
than the sum of two Fahrenheit temperatures makes sense as a
Fahrenheit temperature. But that doesn't mean it doesn't make sense
at all.
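In C and C++ you can in fact compute the "average" of two pointers
legally, by phrasing it as a position plus half an interval rather
than half a sum. A sketch (array and indices made up):

    #include <cstdio>

    int main() {
        int a[10];
        int *p = &a[2];
        int *q = &a[8];

        // (p + q) / 2 is ill-formed, but the same midpoint can be
        // written as a position plus half an interval:
        int *mid = p + (q - p) / 2;   // &a[5]

        std::printf("mid is a[%td]\n", mid - a);
        return 0;
    }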
In an object-oriented language you could define a type "sum of two
pointers", whose constructor takes two pointers. Then you could
define methods on the type, such as subtraction of a pointer, which
would return a pointer.
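For instance, in C++ such a type might look like this (a sketch; the
name PointerSum and its representation are made up):

    #include <cstdio>

    // A "sum of two pointers": not itself a pointer, but subtracting
    // a pointer from it yields a pointer again.
    template <typename T>
    class PointerSum {
        T *a_;
        T *b_;
    public:
        PointerSum(T *a, T *b) : a_(a), b_(b) {}

        // (a + b) - p is rearranged as a + (b - p), so only
        // well-defined operations (pointer difference, pointer +
        // interval) ever occur.
        T *operator-(T *p) const { return a_ + (b_ - p); }
    };

    int main() {
        int arr[10];
        PointerSum<int> s(&arr[2], &arr[8]);

        int *mid = s - &arr[4];   // (2 + 8) - 4 == index 6

        std::printf("mid is arr[%td]\n", mid - arr);
        return 0;
    }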
-- Richard