arithmetic on a void * pointer

S

spinoza1111

Nick Keighley said:
Ian Collins wrote:
Richard Heathfield wrote:
spinoza1111 wrote:
[spinoza stuff]
[technical C stuff]
I do not expect you to understand the difference.
Why do you bother?
Very good question. The guy seems to be completely incapable of
understanding even the simplest things.
for The Lurkers

That has been my justification, such as it is.

Richard, why not follow your own advice (that I have been trying to
follow) and respond only to technical, on-topic, matters?  I know it
is going to be harder for you because EGN is taking pot shots at you,
but by this stage that is surely a compliment.

Oops, you made an excellent suggestion...and then fired off a
ricocheting salvo. I think for the most part you do avoid insulting
verbiage that STARTS fights (as opposed to verbal self defense which
is justified) but you goofed. You imply that when I take a pot-shot it
should be taken as a compliment. This is flattering and sucking up to
Heathfield, who doesn't deserve your admiration since you are a much
better programmer than he, and I of course reject its implication
about my credibility.
I'd go further and stop when it becomes a matter of style like the
recent case concerning the relative merits of do {...} while (0) vs. a
plain block in a macro, but any technical discussion will typically be
short.

An excellent suggestion. I would say that there's an inverse
relationship between a poster's technical depth and the number of
times he STARTS an exchange by challenging credibility and ability
(where again, I'm not counting responses in self-defense). You seldom
take pot-shots but there are exceptions.
 
S

spinoza1111

spinoza1111 wrote:



Right. They're chars.



In C, which is what we discuss in comp.lang.c, it's a byte, not a character.
Conceded.



On machines in which the smallest addressable unit is 36 bits wide,
either the byte is 36 bits wide too, or the implementation has to do
extra work (e.g. if the implementor wants to use 9-bit bytes, then there
has to be some mechanism for translating "C bytes" into "machine bytes"
- those aren't formal terms, by the way; I'm just trying to find a way
to express it that's easy to understand).
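
A minimal C sketch of that distinction, assuming a hosted C99
implementation: sizeof(char) is 1 by definition, while CHAR_BIT
reports how many bits that one byte holds - 8 on mainstream desktops,
but a 36-bit machine could legitimately report 36.

#include <limits.h>  /* CHAR_BIT: width of a C byte in bits */
#include <stdio.h>

int main(void)
{
    /* sizeof(char) is 1 by definition, whatever the byte width. */
    printf("sizeof(char) = %zu\n", sizeof(char));
    /* CHAR_BIT must be at least 8, but may be larger (9, 16, 36...). */
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    return 0;
}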

That means that C is not a universal or ideal machine-oriented
language, but just another way of thinking that carries with it its
own serious limitations. The fact that you have to make a silly
distinction so it's easy to understand means that C is Procrustean.
The whole point of my reply was that we should *not* conflate char* (or
rather unsigned char *) and void *.

Nor should we foolishly deny that void is a type like any other. Both
ways are valid, like Kepler v Copernicus, but Adler's and GCC's way is
simpler.

void doesn't point anywhere, since it's an incomplete type that cannot

Begging the question, aren't we? Adler was using it as a complete type,
and he (and Bacarisse) is one of the few competent specialists in C to
post here.
be completed. sizeof 1 is equivalent to sizeof(int), which is typically
4 on modern desktop systems (but needn't be).

sizeof yields a size_t, which is guaranteed to be an unsigned integral type.
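
For the record, a short sketch of both claims, assuming a C99
compiler (the %zu length modifier is C99):

#include <stdio.h>

int main(void)
{
    /* sizeof applied to the expression 1 gives the size of an int;
       the result has type size_t, an unsigned integer type. */
    printf("sizeof 1    = %zu\n", sizeof 1);      /* typically 4 */
    printf("sizeof(int) = %zu\n", sizeof(int));   /* same value */
    return 0;
}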

Begging the question, aren't we
The blind man does not see
Argal (as the Gravedigger said) says he
Sizeof yields a size underscore t
But that was my point:
C is totally out of joint.
 
S

spinoza1111

Nick Keighley said:
Ian Collins wrote:
Richard Heathfield wrote:
spinoza1111 wrote:
[spinoza stuff]
[technical C stuff]
I do not expect you to understand the difference.
Why do you bother?
Very good question. The guy seems to be completely incapable of
understanding even the simplest things.
for The Lurkers

That has been my justification, such as it is.

Richard, why not follow your own advice (that I have been trying to
follow) and respond only to technical, on-topic, matters?  I know it
is going to be harder for you because EGN is taking pot shots at you,
but by this stage that is surely a compliment.

I'd go further and stop when it becomes a matter of style like the
recent case concerning the relative merits of do {...} while (0) vs. a
plain block in a macro, but any technical discussion will typically be
short.

Well, we're all fond of little techniques that we invented. I think I
can advance technico-aesthetic arguments in the minor key for using
{ ... } in place of "do something once while false is true": my
solution takes fewer characters and is more elegant in the sense that
"behold, e pluribus unum: out of many statements my macro maketh one"
is more elegant than "do something once while falsity is false" as
English.
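
For the lurkers, the usual counter-argument, sketched below: a plain
brace block breaks under if/else, which is exactly what do { ... }
while (0) is there to fix. The LOG_* macros are hypothetical
illustrations, not anyone's posted code.

#include <stdio.h>

/* Plain-block version: fine in isolation, but see below. */
#define LOG_BRACES(msg) { fputs("log: ", stdout); puts(msg); }

/* do/while(0) version: absorbs the trailing semicolon, so the
   expansion behaves as a single statement everywhere. */
#define LOG_SAFE(msg) do { fputs("log: ", stdout); puts(msg); } while (0)

int main(void)
{
    int err = 1;

    /* Had LOG_BRACES been used here, the semicolon after the block
       would terminate the if, leaving the else dangling - a syntax
       error. LOG_SAFE compiles cleanly. */
    if (err)
        LOG_SAFE("something failed");
    else
        puts("all fine");

    return 0;
}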

Do bracket print eff newline semicolon bracket while falsity is true?
What is this, a Boy named Sue?
Oh, now I get it, a microsecond later
The guy is trying to make his macro safer!
Fucking guy, I say with a sigh,
That just took a piece of my life
I could have had a little more time in the sack with the wife
Had I not had to ponder this trouble and strife.
Just place the code in braces
Or you'll have me making faces.
 
N

Nick Keighley

I agree that Linux was ported at light speed,

and other Unix before it
and partly because it was written in the high-level/low-level language
C.


it was this I was intending to refute
well, no, not necessarily. OO is [not] the be all and end all of programming
technology
[...] it is NOT. But this argument (which I've heard
more than once in corporate nurseries and kindergartens) is
fallacious: "because your better solution is not best, my inferior
solution is preferable and therefore (ta da) the best, or (sighs and
flute notes) better than yours."

no. what I was saying is that if you want abstraction and type safety
you wouldn't necessarily use OO. It's not the only tool in the box.
Let A be "my solution such as C"
Let B be "your solution such as C Sharp"
Let P be "the language of the gods such as Lisp, Scheme, Haskell or
what ev er"
Let .orEven. be an or operator connotative of rude astonishment

Let quality(x) be "the quality of x"

1. quality(B) < quality(P)
2. ergo quality(A) > quality(B) .orEven. quality(A) == quality(P)

This is of course an invalid argument.

luckily I didn't make it.

"The statement [Dealing with large numbers of interrelated types while
still preserving modularity in the design of large systems is very
difficult, and is an area of much current research.], [...] is just as
true now as it was when we wrote it twelve years ago. Developing a
useful, general framework for expressing the relations among different
types of entities (what philosophers call ``ontology'') seems
intractably difficult. The main difference between the confusion that
existed ten years ago and the confusion that exists now is that now a
variety of inadequate ontological theories have been embodied in a
plethora of correspondingly inadequate programming languages. For
example, much of the complexity of object-oriented programming
languages -- and the subtle and confusing differences among
contemporary object-oriented languages -- centers on the treatment of
generic operations on interrelated types. Our own discussion of
computational objects in chapter 3 avoids these issues entirely.
Readers familiar with object-oriented programming will notice that we
have much to say in chapter 3 about local state, but we do not even
mention ``classes'' or ``inheritance.'' In fact, we suspect that these
problems cannot be adequately addressed in terms of computer-language
design alone, without also drawing on work in knowledge representation
and automated reasoning."
Structure and Interpretation of Computer Programs.

They may be right, but it's silly to quote Abelson et al.  in defense
of C, as you seem to be doing.

in defence of OO not being the Only Way.

They would replace OO not with C but with
logic programming in Lisp and in Scheme, and both of those languages
(especially Lisp) also reflect their origins. And when you get a
language that does "knowledge representation and automated reasoning"
your troubles have only begun, since the Logical Positivists like
Carnap and Nelson Goodman discovered that knowledge is hard to
represent. The problem becomes in fact an unsolved philosophical
problem.

The best is the enemy of the good. Sure, I might migrate to F Sharp
and I should probably learn a lot more about lisp. But it's INSANE to
quote these guys to defend C. They'd laugh at you.

no comment?


no comment?


<snip>
 
S

spinoza1111

I agree that Linux was ported at light speed,

and other Unix before it
and partly because it was written in the high-level/low-level language
C.

it was this I was intending to refute
well, no, not necessarily. OO is [not] the be all and end all of programming
technology
[...] it is NOT. But this argument (which I've heard
more than once in corporate nurseries and kindergartens) is
fallacious: "because your better solution is not best, my inferior
solution is preferable and therefore (ta da) the best, or (sighs and
flute notes) better than yours."

no. what I was saying is that if you want abstraction and type safety
you wouldn't necessarily use OO. It's not the only tool in the box.
Let A be "my solution such as C"
Let B be "your solution such as C Sharp"
Let P be "the language of the gods such as Lisp, Scheme, Haskell or
what ev er"
Let .orEven. be an or operator connotative of rude astonishment
Let quality(x) be "the quality of x"
1. quality(B) < quality(P)
2. ergo quality(A) > quality(B) .orEven. quality(A) == quality(P)
This is of course an invalid argument.

luckily I didn't make it.

But you did. You posted a long and silly quote from Abelson et al.
critical of OO which doesn't make your case, and they haven't shown
that they can formalize knowledge outside of the laboratory.
"The statement [Dealing with large numbers of interrelated types while
still preserving modularity in the design of large systems is very
difficult, and is an area of much current research.], [...] is just as
true now as it was when we wrote it twelve years ago. Developing a
useful, general framework for expressing the relations among different
types of entities (what philosophers call ``ontology'') seems
intractably difficult. The main difference between the confusion that
existed ten years ago and the confusion that exists now is that now a
variety of inadequate ontological theories have been embodied in a
plethora of correspondingly inadequate programming languages. For
example, much of the complexity of object-oriented programming
languages -- and the subtle and confusing differences among
contemporary object-oriented languages -- centers on the treatment of
generic operations on interrelated types. Our own discussion of
computational objects in chapter 3 avoids these issues entirely.
Readers familiar with object-oriented programming will notice that we
have much to say in chapter 3 about local state, but we do not even
mention ``classes'' or ``inheritance.'' In fact, we suspect that these
problems cannot be adequately addressed in terms of computer-language
design alone, without also drawing on work in knowledge representation
and automated reasoning."
Structure and Interpretation of Computer Programs.
They may be right, but it's silly to quote Abelson et al.  in defense
of C, as you seem to be doing.

in defence of OO not being the Only Way.

I never said it was. I've used C, Fortran, assembler, Cobol and PL/I,
none of which are OO. Do me the courtesy of not confusing me, ever
again, with the half-educated language Fundamentalists here.

However, it is a BETTER solution.
 
A

Antoninus Twink

Firstly, it's not BS

You are hardly an impartial judge on that matter, are you?
Secondly, let's count them, shall we?

Why don't you do so - and honestly this time? You may find it
enlightening.
Six months ago, it was five: you, Kenny, Richard NoName, Twink, and
Han - all except you being widely recognised as trolls.

Just about everyone who posts to this group is widely recognized as a
"troll" by one constituency or another - hardly a distinguishing
feature.
Judging by the lack of feedback, Han appears to have gone

Laughable. No one believes your claims that you've killfiled Han - he
demolished them comprehensively while he was still with us.
And if we reduce the number to those who actually post under their
real name, that number shrinks to one.

Say what? As far as I'm aware, all of those you mentioned (except Han)
post under our real names.
And, unless their posting behaviour has altered dramatically since
they hit my killfile, none of the above has anything worthwhile to say
about C.

Heathfield, you are deluding yourself if you believe that. Stick around
and you might learn something about C and its uses in the real world
from us "trolls".
For those who have a higher level approach, however, the abstract
machine (AM) comes to the fore - and the AM doesn't define arithmetic
on void pointers because it can't be sure of the outcome of such
arithmetic on the real machines on which the AM sits.

Nonsense. It's a pointless omission from the standard. If p is a void*
then defining
p + n := (void*) ((char*)p + n)
breaks nothing and would be convenient in many situations.
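
For lurkers keeping score, a sketch of both sides, assuming GCC for
the extension: ISO C leaves void * arithmetic undefined, so portable
code casts through unsigned char * - exactly the definition proposed
above - while GCC simply builds that definition in by treating
sizeof(void) as 1. The advance() helper is a hypothetical name.

#include <stddef.h>
#include <stdio.h>

/* Portable ISO C: implement p + n := (void *)((char *)p + n)
   explicitly, by casting through unsigned char *. */
static void *advance(void *p, size_t n)
{
    return (unsigned char *)p + n;
}

int main(void)
{
    int arr[4] = {10, 20, 30, 40};
    void *p = arr;

    p = advance(p, sizeof(int));   /* step over one int's bytes */
    printf("%d\n", *(int *)p);     /* prints 20 */

    /* Under GCC's extension (not ISO C) the cast is unnecessary:
       p = p + sizeof(int); works because sizeof(void) is taken as 1. */
    return 0;
}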
Now, if p is a void *, then we're asking for p to be incremented by the
value of sizeof(void), which is obviously meaningless, since void is an
incomplete type that cannot be completed and so there is no way to know
how big it is.

It's only meaningless because your precious standard assigns it no
meaning.

Defining sizeof(void) := 1 would break nothing and would be convenient
in many situations.
The point of void* is not to be a mere synonym for char*, but to allow
us to abstract away the differences between types when performing
operations on objects for which their type is immaterial

What is the relevance to the question at hand? The abstraction is very
valuable - but permitting arithmetic would be a convenience that would
do nothing whatsoever to break the abstraction, any more than allowing a
memcmp or memcpy given a void* and the size of the object pointed to
breaks the abstraction.
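
A sketch of the kind of type-blind operation being described, in the
style of memcpy and qsort: the function sees only two void * values
and a size, and the abstraction survives intact. swap_bytes() is a
hypothetical helper.

#include <stdio.h>
#include <string.h>

/* Swap two objects of any one type, given only untyped pointers
   and the object size - the same contract memcpy and qsort use. */
static void swap_bytes(void *a, void *b, size_t size)
{
    unsigned char tmp[64];   /* sketch only: assume objects fit */
    if (size > sizeof tmp)
        return;              /* real code would allocate or loop */
    memcpy(tmp, a, size);
    memcpy(a, b, size);
    memcpy(b, tmp, size);
}

int main(void)
{
    double x = 1.5, y = 2.5;
    swap_bytes(&x, &y, sizeof x);
    printf("%g %g\n", x, y);   /* prints 2.5 1.5 */
    return 0;
}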
 
F

Flash Gordon

Antoninus said:
On 12 Jan 2010 at 3:33, Richard Heathfield wrote:




Say what? As far as I'm aware, all of those you mentioned (except Han)
post under our real names.

<snip>

So your real name is Antoninus Twink?

Sorry people, I couldn't resist this one. Oh, and I know my real name
isn't Flash, but anyone who wants to find my real name and has a modicum
of intelligence can find it easily enough.
 
P

Phil Carmody

Richard Heathfield said:
spinoza1111 wrote:
[SNIP - that which addresses things not C]
It may well be that the OP's attitude to void * stems from what might
be called a "pigeon-hole" approach to programming - memory is a bunch
of pigeon-holes, and a program is just a list of instructions for
visiting them (and modifying their contents) in a particular, dynamic,
order. This very low-level approach is absolutely fine and highly
justifiable for some people (especially writers of object code!). One
can see why such a person might have little patience with restrictions
on pointer arithmetic (and indeed with other forms of abstraction).

I think it stems simply from the fact that in most architectures,
(address) registers are *not typed*, and therefore *nominally* map
onto void* in C. And yet adding 1 to them increments the pointer.

Some low level C code occasionally escapes this paradigm by using
uint32_t as the rawest "pointer" type. (It is, after all, what you
need to pass to the DMA controller's register, say.)
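
A sketch of that idiom, using C99's uintptr_t rather than a bare
uint32_t (uintptr_t is optional in the standard but, where provided,
is guaranteed to round-trip any object pointer; uint32_t only works
when pointers happen to be 32 bits):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int x = 42;

    /* Round-trip a pointer through an integer, as low-level code
       does when handing an address to hardware such as a DMA
       controller's register. */
    uintptr_t raw = (uintptr_t)(void *)&x;
    int *back = (int *)(void *)raw;

    printf("%d\n", *back);   /* prints 42 */
    return 0;
}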

Phil
 
S

spinoza1111

Including Flash himself, who explicitly mentioned it in his post to
which yours is an antecedent.

Still, often, pseudonyms are handy. For example, it's useful to remember
that Kenny McCormack is a deliberately incoherent cartoon character,

This ability (to think of people as cartoon characters) is Fascistic.
Germans were prepared to sit idle while Jews were shipped to death
camps by any number of American cartoons in which it was OK for the
cartoon character to be subject to simulated violence. And, cartoon
violence ALSO defangs the impulse to dignified self-defense.
 
S

spinoza1111

...


Excellent.  Of course no one else here has any clue as to what you are
talking about (pearls before swine to the max!), but you have got it so
dead to rights.

Especially the bit about "girlie" stuff.  As I've noted before, the ethic
of this group is so tightly bound to the prototype of an uneducated (I
don't need no stinkin' college degree!) but manly (no girlie stuff for
me!) programming stud.  And, as you've noted elsewhere, they share the
idea that they don't wanna go into management, because going into
management means the end of their eternal youth.

Paradoxically, even Seebach (who appears to be a feminine-identified
gay man) adheres to the ethic, for the same reason a lot of gay men
overcompensated...by being the macho penetrator who gave the "weak"
partner AIDS.

There are two ways to be self-educated. One is fox and grapes, in
which Seebach reports proudly that he's never taken a computer science
class, implying that to do so would have diminished his *mana*, and
laughably comparing himself to Dijkstra et al. ... who, as I pointed
out, didn't take no steenking computer science because there was no
steenking computer science: they invented it, and for a script kiddie
like Seebach to wear their robes would be laughable if it weren't just
sad.

The Urban Legend (what Adorno would call the myth) is that guys like
Heathfield are "practical men" uncorrupted by the obfuscations of
their (feminine or effeminate) teachers, who tried to pull a fast one
on Dickie when he was in school, telling Dickie that each individual
statement in a multi-statement for loop is followed by a test. They
showed their awestruck (or terrified, probably) teacher that she was
wrong. Likewise they "prove" that "Nilges isn't in comp.risks" by
deliberately using the wrong way, and then act all innocent when their
fraud is immediately exposed.

Their problem here is that they can't Call Security or start shoving
me around the meeting room, and this bugs the shit out of them. They
are forever playing Whack-a-Mole.

It is possible to be wholly or partially self-educated in CS and
retain respect for education. However, I sense here a rage to destroy
teachers.
 
F

Flash Gordon

I was not complaining.

Richard Heathfield listed a number of posters, and Antoninus Twink
claimed that the only poster in the list not posting under his/her real
name was Han. That was a factually incorrect claim since Antoninus Twink
was on the list and that is not his real name.
Including Flash himself, who explicitly mentioned it in his post to
which yours is an antecedent.

I also pointed out that it is easy to find out what my real name is.
Still, often, pseudonyms are handy. For example, it's useful to remember
that Kenny McCormack is a deliberately incoherent cartoon character,
aimed at adolescents and easily-amused adults.

A lot of the versions of the Flash Gordon comics/cartoons have not been
that serious either.
 
S

spinoza1111

Gareth said:
(e-mail address removed) (Kenny McCormack) writes:
I was not complaining.

Richard Heathfield listed a number of posters, and Antoninus Twink
claimed that the only poster in the list not posting under his/her real
name was Han. That was a factually incorrect claim since Antoninus Twink
was on the list and that is not his real name.


I also pointed out that it is easy to find out what my real name is.


A lot of the versions of the Flash Gordon comics/cartoons have not been
that serious either.

The difference is you name yourself, while calling Kenny, who appears
to post under his own name, a cartoon character.
 
F

Flash Gordon

spinoza1111 said:
The difference is you name yourself, while calling Kenny, who appears
to post under his own name, a cartoon character.

Nope, you've got it wrong yet again. I've never called Kenny a cartoon
character nor compared him to one. Other people have, but not me.
 
D

David Thompson

[...]
And actually, no. size_t is the size-in-bytes, canonically, since that's
what sizeof(x) yields.

There are, however, functions in the Standard library taking
size_t parameters that do not specify byte counts. calloc(),
qsort(), bsearch(), fwrite(), and fread() come to mind immediately,
and there may be others.

All of those take both bytes-per-foo and num-foo as size_t, so they
could count as evidence on both sides of the issue!

Many of the wcs* and *w* functions take num-wchar_t, and wchar_t is
nearly always bigger than a byte -- since its entire raison d'etre is
to handle a larger character set than char=byte generally can.
(It isn't formally required to be bigger though.)
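
To make the both-sides point concrete, a sketch using calloc and
fwrite, each of which takes one size_t counting elements and another
counting bytes per element (out.bin is a hypothetical filename):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* calloc(num-foo, bytes-per-foo): two size_t parameters, one
       an object count, the other a byte count. */
    size_t n = 100;
    double *a = calloc(n, sizeof *a);
    if (a == NULL)
        return EXIT_FAILURE;

    /* fwrite(ptr, size, nmemb, stream) follows the same pattern. */
    FILE *f = fopen("out.bin", "wb");
    if (f != NULL) {
        size_t written = fwrite(a, sizeof *a, n, f);
        printf("wrote %zu elements\n", written);
        fclose(f);
    }

    free(a);
    return 0;
}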

Ironic aside: and yes I know etre should have an accent, but it's too
much trouble for me to get non-ASCII to work in Usenet. Glurk.
 
S

Seebs

Let's see, one mouse-click to bring up Keyboard Viewer, one key press
(of the option key) to see which modifier gives me the accent, then two
key presses gives me ê.

... Which appears as a diamond with a question mark in it on my display.

The fundamental problem here goes a lot further than "how do I type it".

-s
 
N

Nick

Tim Streater said:
Let's see, one mouse-click to bring up Keyboard Viewer, one key press
(of the option key) to see which modifier gives me the accent, then
two key presses gives me ê.

Wow! What a cumbersome way of doing it. I just press the "compose" key
(which, for some strange reason, has a little graphic of a piece of loo
roll on it) then 'e. And here we are: é
 
N

Nick

Tim Streater said:
Well, of course, having activated Keyboard Viewer I can just leave it
there if I'm likely to be needing a number of accents.

Besides which, you got é, but I wanted ê. What do you do for that? And
how do you find it out? I'm certainly not going to bother
*remembering* key combinations.

I'm not sure I've ever typed one of those, and I certainly don't know
the key combination. But I'm going to guess that it's compose, shift-6,
e. Shall we see what happens... ê

Yup, that did it. I can guess how to do an e with two dots on it as
well - ë, as well as one with a line over it: ē. Never typed those or
learnt the combinations either.
 
N

Nobody

Besides which, you got é, but I wanted ê. What do you do for that?

Compose e ^ or Compose ^ e.
And how do you find it out? I'm certainly not going to bother
*remembering* key combinations.

They're all mnemonic, e.g. apart from accented characters:

Compose s s = ß
Compose s o = §
Compose c o = ©
Compose ! p = ¶
Compose + - = ±
Compose L - = £
Compose Y - = ¥
Compose - : = ÷
Compose c - = ¢
Compose 1 2 = ½

and so on. The order doesn't matter; case only matters if the resulting
character has upper- and lower-case versions, many of them have several
forms (e.g. yen and sterling can use - or =, cent can use | or /, etc).
 
