First,
it seems kind of rude to me to post the question elsewhere and expect
someone reading the message here to follow a link to find the real question.
(e-mail address removed) said:
hi,
I had a question about taking the difference between two pointers.
For example:
if you have 2 char pointers pointing to members of an array, and you advance one
till you encounter a space, will taking the difference between the two give
the correct length of the string irrespective of whether char is represented
by 2 bytes (like in Unicode)?
Second, by definition, char is always 1 byte; there are no exceptions. (A byte
may be larger than 8 bits, but sizeof(char) is always equal to 1.)
Third, if you're using Unicode, then the functions in ctype.h (like isspace)
are insufficient.
I believe it would give the correct length, because the compiler is
responsible for scaling the difference when one advances a pointer to the
next element (like ptr++). Or is it that pointer difference is not pointer
arithmetic, and we need to scale it ourselves?
I'm not sure what you mean by scaling, but assuming that all pointers are
valid, you don't run off the end of the array, etc.:
If you increment the pointer x times, then (end - start) == x.
e.g. problem:
char *s;                    /* original string */
char *start = s, *end = s;  /* the two pointers */
int length;
while(*end != isspace(*end))
ITYM: while(!isspace((unsigned char)*end))