arithmetic on a void * pointer


spinoza1111

I am not a troll. A troll posts in bad faith to "get a rise" out of
people. It is not a person with a different point of view, who used C
heavily enough to be asked to assist Nash. You use the word because
you're a Nordic racist: it referred to the peoples of northern and
western Europe whose culture was destroyed by invaders.

Furthermore, whatever your opinion, asshole, I've contributed code and
ideas to this newsgroup which have been the source of productive
discussion, most recently a demonstration of the fact that C Sharp
does not have the time complexity relationship to C that a truly
interpreted language would have. To do so, I have taken risks that
most posters are too cowardly to take.

Kenny is right. You are an autistic and sad little creep.
 

spinoza1111

What case is that?  That since others feed the troll, it's ok for
you to do so?

In talking about me as if I weren't here, you're:

(1) Feeding the troll, and
(2) Being a boor
 

Kenny McCormack

I am not a troll. A troll posts in bad faith to "get a rise" out of
people.

I think that, in actuality, we are all trolls (with the possible, no,
probable, exception of Kiki - aka, Mr. Android - who is, thus, a much
sadder case). The working definition of "troll" is someone who posts to
get a reaction, and the fact is, we all do that. If we aren't doing
that, then we're truly wasting our time here.
Yup.

....
Furthermore, whatever your opinion, asshole, I've contributed code and
ideas to this newsgroup which have been the source of productive
discussion, most recently a demonstration of the fact that C Sharp
does not have the time complexity relationship to C that a truly
interpreted language would have. To do so, I have taken risks that
most posters are too cowardly to take.

Kenny is right. You are an autistic and sad little creep.

While I think we have all contributed to the development of the idea,
it was Twink who has most recently articulated this truth. So, credit
where credit is due.
 

Richard Bos

Kaz Kylheku said:
Hypothesis: maybe, in the ancient early history of GCC, that's in
fact what it meant.

Counter-hypothesis: it still does. However, it refers (and probably
always has referred) to all warnings for Ganuck, not for real C.

Richard
 

Edward A. Falk

I wasn't talking about the standard. I was talking about the compilers.

GCC is the only compiler I've ever used that did arithmetic on void*.

I'm sure there may be others, but that's the only one I know about.

Frankly, I wish the standard *did* allow it, but there you have it.

I generally try not to use non-standard language extensions, as I
prefer to keep my code portable.
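
For concreteness, a minimal sketch of the difference being discussed; the
names here (buf, p, q) are invented, not from anyone's post. Under the GNU C
extension, arithmetic on void * behaves as if sizeof(void) were 1; the
portable ISO C form casts to unsigned char * first.

#include <stdio.h>
#include <stddef.h>

int main(void)
{
    unsigned char buf[16];
    void *p = buf;

    /* GNU C extension: gcc accepts void * arithmetic and moves in bytes,
       as if sizeof(void) were 1 (warned about with -pedantic).          */
    /* void *q_gnu = p + 4; */

    /* Portable ISO C: convert to unsigned char * before the arithmetic. */
    unsigned char *q = (unsigned char *)p + 4;

    printf("offset = %td\n", (ptrdiff_t)(q - buf));   /* prints 4 */
    return 0;
}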
 

spinoza1111

...


I think that, in actuality, we are all trolls (with the possible, no,
probable, exception of Kiki - aka, Mr. Android - who is, thus, a much
sadder case).  The working definition of "troll" is someone who posts to
get a reaction, and the fact is, we all do that.  If we aren't doing
that, then we're truly wasting our time here.

We post for attention (Hegelian recognition) but a troll posts purely
for this reason, AND he doesn't mean what he says. So, none of the
major players here are Trolls. Heathfield actually means what he says.
More the pity.

No big deal. Just what Heathfield calls "terminology". If Kiki thinks
I'm a troll, fine, although it amuses me to point out his limitations
from time to time.
While I think we have all contributed to the development of the idea,
it was Twink who has most recently articulated this truth.  So, credit
where credit is due.

That is correct. But congratulations to the growing number of people
who are calling Heathfield on his BS.
 

Seebs

The point of void* is not to be a mere synonym for char*, but to allow
us to abstract away the differences between types when performing
operations on objects for which their type is immaterial, or to allow us
to process them so far, and then hand on their type-specific part to a
function that knows about their type. Obvious examples are mem*, qsort,
bsearch, fread, fwrite, and abstract container libraries.

In particular:

I find that it's *USEFUL* to get warned if I try to do arithmetic on
a (void *), because it means I think I know what I have a pointer to, and
that either I should change the type of the pointer to reflect what I know
it points to, or I don't know enough to do that arithmetic.

If I want a buffer of unsigned chars, I know where to find it.

-s
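
A small illustration of the kind of warning being described; the function
names are invented. With gcc -Wpointer-arith the first form draws a warning,
and giving the pointer its real type (or unsigned char * for raw bytes)
removes it.

/* Hypothetical example: skip a fixed-size header at the front of a buffer. */

void *skip_header_untyped(void *buf)
{
    return buf + 8;          /* gcc -Wpointer-arith warns here:
                                void * used in arithmetic (a GNU extension) */
}

unsigned char *skip_header(unsigned char *buf)
{
    return buf + 8;          /* same intent, no warning: the type says bytes */
}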
 

spinoza1111

Firstly, it's not BS - but I realise you aren't able to recognise that.
Secondly, let's count them, shall we? Six months ago, it was five: you,
Kenny, Richard NoName, Twink, and Han - all except you being widely
recognised as trolls. Judging by the lack of feedback, Han appears to
have gone, so it seems that either the number has actually reduced to
four or everybody simultaneously killfiled Han some months ago. And if
we reduce the number to those who actually post under their real name,
that number shrinks to one. And, unless their posting behaviour has
altered dramatically since they hit my killfile, none of the above has
anything worthwhile to say about C.

If one's companions are an indication of one's credibility, you couldn't
have made a worse choice as far as C programming is concerned. And I'm
beginning to see that Keith is right. A cost/benefit analysis suggests
that discussing anything with you is far too expensive, since you are
too stupid and arrogant to learn quickly. You assume you know it all, as
a result of which it takes ages to teach you even the simplest thing,
because you're too busy denying it to bother learning it. And even when
(or rather if) the knowledge does finally break through, you then claim
you knew it all along (despite having argued for its falsity).

The rest of this article addresses the original subject of the thread,
"arithmetic on a void * pointer".

It may well be that the OP's attitude to void * stems from what might be
called a "pigeon-hole" approach to programming - memory is a bunch of
pigeon-holes, and a program is just a list of instructions for visiting
them (and modifying their contents) in a particular, dynamic, order.
This very low-level approach is absolutely fine and highly justifiable
for some people (especially writers of object code!). One can see why
such a person might have little patience with restrictions on pointer
arithmetic (and indeed with other forms of abstraction).

For those who have a higher level approach, however, the abstract
machine (AM) comes to the fore - and the AM doesn't define arithmetic on
void pointers because it can't be sure of the outcome of such arithmetic
on the real machines on which the AM sits.

The Standard's definition of p++ can be thought of as "increment p by
sizeof *p" - i.e. if p's object representation is 0xF000 and it's a
pointer to an int where int is known to be 6 bytes, then we should not
be entirely surprised if p's object representation changes to 0xF006.
Now, if p is a void *, then we're asking for p to be incremented by the
value of sizeof(void), which is obviously meaningless, since void is an
incomplete type that cannot be completed and so there is no way to know
how big it is.

The point of void* is not to be a mere synonym for char*, but to allow
us to abstract away the differences between types when performing
operations on objects for which their type is immaterial, or to allow us
to process them so far, and then hand on their type-specific part to a
function that knows about their type. Obvious examples are mem*, qsort,
bsearch, fread, fwrite, and abstract container libraries.

If the OP can't see the point of void *, it may be that this is simply a
facet of the way he approaches programming - in other words, maybe for
him there isn't any point to it. It is certainly true that, for his
compression needs, he doesn't actually need void * - he could easily
make do with unsigned char * instead.

Pompous and content-free, since if you were really interested in
abstraction and type safety you would not use C. You would learn an
object oriented language, but you refuse to be vulnerable and to
learn.

The original poster is in fact a competent C programmer. You're not.

Object oriented languages allow both "incomplete classes" and
"abstract classes" and they separate these notions, whereas void is
neither.

In C Sharp, for example, an incomplete class definition is simply part
of its source code. An abstract class is one that cannot be
instantiated but must be inherited.

But void is-not inherited in any meaningful sense in C. You cannot say
that an int is-a void with additional properties and additional
restrictions.

Which means in C that the GCC option makes perfect sense and provides
the ability to do arithmetic on pure pointers which point to sod-all.
This is in fact the world of assembler: Never-Never land, a dream time
when programmers had Fun as opposed to the corporate reality of today,
where they, like Seebach, have to send bugs to Timbuktoo for fixing
and at best write tools and scripts that nobody asked them to write to
keep busy, as I myself have done at more than one job...because
there's no competitive advantage to be had, any more, from new and
risky software.

C programmers are Peter Pans who want to be simultaneously recognized
as grownups interested in Grownup things like reliability and
portability, but at the same time to have Fun in a time-less Never-
Never land where they can fantasize that they are close to the
machine.

But my Dad asked me in 1971 a very good question. He said, what will
happen when you programmers are done with your work? I had no answer
and the reality today is 12% unemployment and mass homelessness, even
among former programmers.

But JM Barrie's Peter Pan not only comes to mind. Another book that
comes to mind in these discussions is Lord of the Flies, because the
bullies in that book pass a law: we must have fun. The goal of
Heathfield et al. is to have fun, if necessary at the expense of the reputations of
solid professionals like Peter Neumann, Herb Schildt, and Jacob Navia.
Anyone who forms his own correct view and is able, precisely because
he's done the real homework, to put it into his own words, abandoning
the shibboleths of the Lost Boys here, is a Piggy and a spoil sport
who must die.

Most of you creeps flunked English, which is part of your problem, so
at this point I can see you saying WTF.
 

spinoza1111

Firstly, it's not BS - but I realise you aren't able to recognise that.
Secondly, let's count them, shall we? Six months ago, it was five: you,
Kenny, Richard NoName, Twink, and Han - all except you being widely
recognised as trolls. Judging by the lack of feedback, Han appears to
have gone, so it seems that either the number has actually reduced to
four or everybody simultaneously killfiled Han some months ago. And if
we reduce the number to those who actually post under their real name,
that number shrinks to one. And, unless their posting behaviour has
altered dramatically since they hit my killfile, none of the above has
anything worthwhile to say about C.

If one's companions are an indication of one's credibility, you couldn't
have made a worse choice as far as C programming is concerned. And I'm
beginning to see that Keith is right. A cost/benefit analysis suggests
that discussing anything with you is far too expensive, since you are
too stupid and arrogant to learn quickly. You assume you know it all, as
a result of which it takes ages to teach you even the simplest thing,
because you're too busy denying it to bother learning it. And even when
(or rather if) the knowledge does finally break through, you then claim
you knew it all along (despite having argued for its falsity).

The rest of this article addresses the original subject of the thread,
"arithmetic on a void * pointer".

It may well be that the OP's attitude to void * stems from what might be
called a "pigeon-hole" approach to programming - memory is a bunch of
pigeon-holes, and a program is just a list of instructions for visiting
them (and modifying their contents) in a particular, dynamic, order.
This very low-level approach is absolutely fine and highly justifiable
for some people (especially writers of object code!). One can see why
such a person might have little patience with restrictions on pointer
arithmetic (and indeed with other forms of abstraction).

For those who have a higher level approach, however, the abstract
machine (AM) comes to the fore - and the AM doesn't define arithmetic on
void pointers because it can't be sure of the outcome of such arithmetic
on the real machines on which the AM sits.

The Standard's definition of p++ can be thought of as "increment p by
sizeof *p" - i.e. if p's object representation is 0xF000 and it's a
pointer to an int where int is known to be 6 bytes, then we should not
be entirely surprised if p's object representation changes to 0xF006.
Now, if p is a void *, then we're asking for p to be incremented by the
value of sizeof(void), which is obviously meaningless, since void is an
incomplete type that cannot be completed and so there is no way to know
how big it is.

The point of void* is not to be a mere synonym for char*, but to allow
us to abstract away the differences between types when performing
operations on objects for which their type is immaterial, or to allow us
to process them so far, and then hand on their type-specific part to a
function that knows about their type. Obvious examples are mem*, qsort,
bsearch, fread, fwrite, and abstract container libraries.

If the OP can't see the point of void *, it may be that this is simply a
facet of the way he approaches programming - in other words, maybe for
him there isn't any point to it. It is certainly true that, for his
compression needs, he doesn't actually need void * - he could easily
make do with unsigned char * instead.

Not if the size of the character is larger than the byte. News flash,
Heathfield. The "smallest addressable unit" of most modern platforms
is no longer == char, because of internationalization: the wide char
IS the char in reality. Therefore Adler needs to code what he MEANS,
which is the calculation of byte and not character addresses.
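
For what it's worth, a sketch (invented names) of how byte offsets and
element offsets can be kept distinct in standard C without assuming anything
about character width: scale by the element size explicitly.

#include <stddef.h>
#include <wchar.h>

/* Hypothetical helper: byte address of the n-th wide character in a buffer. */
unsigned char *byte_address(wchar_t *wbuf, size_t n)
{
    return (unsigned char *)wbuf + n * sizeof *wbuf;
}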
 

spinoza1111

Firstly, it's not BS - but I realise you aren't able to recognise that.
Secondly, let's count them, shall we? Six months ago, it was five: you,
Kenny, Richard NoName, Twink, and Han - all except you being widely
recognised as trolls. Judging by the lack of feedback, Han appears to
have gone, so it seems that either the number has actually reduced to
four or everybody simultaneously killfiled Han some months ago. And if
we reduce the number to those who actually post under their real name,
that number shrinks to one. And, unless their posting behaviour has
altered dramatically since they hit my killfile, none of the above has
anything worthwhile to say about C.

If one's companions are an indication of one's credibility, you couldn't
have made a worse choice as far as C programming is concerned. And I'm
beginning to see that Keith is right. A cost/benefit analysis suggests
that discussing anything with you is far too expensive, since you are
too stupid and arrogant to learn quickly. You assume you know it all, as
a result of which it takes ages to teach you even the simplest thing,
because you're too busy denying it to bother learning it. And even when
(or rather if) the knowledge does finally break through, you then claim
you knew it all along (despite having argued for its falsity).

The rest of this article addresses the original subject of the thread,
"arithmetic on a void * pointer".

It may well be that the OP's attitude to void * stems from what might be
called a "pigeon-hole" approach to programming - memory is a bunch of
pigeon-holes, and a program is just a list of instructions for visiting
them (and modifying their contents) in a particular, dynamic, order.
This very low-level approach is absolutely fine and highly justifiable
for some people (especially writers of object code!). One can see why
such a person might have little patience with restrictions on pointer
arithmetic (and indeed with other forms of abstraction).

For those who have a higher level approach, however, the abstract
machine (AM) comes to the fore - and the AM doesn't define arithmetic on
void pointers because it can't be sure of the outcome of such arithmetic
on the real machines on which the AM sits.

The Standard's definition of p++ can be thought of as "increment p by
sizeof *p" - i.e. if p's object representation is 0xF000 and it's a
pointer to an int where int is known to be 6 bytes, then we should not
be entirely surprised if p's object representation changes to 0xF006.
Now, if p is a void *, then we're asking for p to be incremented by the
value of sizeof(void), which is obviously meaningless, since void is an
incomplete type that cannot be completed and so there is no way to know
how big it is.

Nope. In verse:

The size of void is unity
We can say so with impunity
Because this ain't theology
It's JUST technology.

Throw the standard in the trash
'Twas writ to save vendor cash

Stop feigning false ignorance
Surplus to your genuine stupidity
There's no need to pretend to be a dunce
When it's clear you so clue-challenged be.

Besides (to lapse back to prose), if Adler uses a GCC compiler that
allows him to use voids as real pointers that point to the smallest
addressable unit of memory, the only problem will be when he needs to
retarget to a machine that's not supported by GCC (and GCC is both
retargetable and runs on many platforms).
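
A sketch of that point, with invented names: gcc does treat sizeof(void) as
1, so void * arithmetic moves byte-by-byte under the extension, but the same
effect is available in strictly conforming C, so nothing is lost if the code
ever has to move off GCC.

#include <assert.h>

/* Hypothetical: advance a raw-memory pointer by a number of bytes. */
static void *advance_bytes(void *p, unsigned long nbytes)
{
    /* GNU C would also accept  return p + nbytes;  under the extension. */
    return (unsigned char *)p + nbytes;     /* conforming spelling */
}

int main(void)
{
    unsigned char buf[32];
    assert(advance_bytes(buf, 5) == &buf[5]);
    return 0;
}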
 

spinoza1111

We're talking about chars, not characters. I do not expect you to
understand the difference.

Pompous fool.
A char is guaranteed to occupy exactly one byte of storage, and the byte
is guaranteed to be at least 8 bits wide (but it can be wider).

That's silly, because it means that I can't program the IBM 1401
second-generation transistor computer being reconstructed at the
Computer Museum in Mountain View in C! Its "byte" is six bits, seven
bits if you count the word mark.

Although C was invented on the DEC 10 (I think) during the 1401's
useful life, a search for "C programming on the IBM 1401" gets no
hits.

Alternatively, how would you write a simulator for a machine whose
basic addressable unit is six bits?

You pretend that these conventions are knowledge.

Silly bastard.
> News flash, Heathfield. The "smallest addressable unit" of most modern
> platforms is no longer == char,


Olds flash: the fact that the smallest addressable unit of storage is
the byte, and one char requires exactly one byte of storage, remains
true for all conforming C implementations, even on modern platforms.

So let the void have sizeof one
And with this nonsense, be done.
> because of internationalization: the wide char IS the char in reality.
> Therefore Adler needs to code what he MEANS, which is the calculation of
> byte and not character addresses.


Which is precisely what unsigned char * will give him.

That's absurd, because there's no meaning to signing a character
outside of C. The only question is whether enough bits are in the
character to represent all possible languages world-wide.
 

Nick Keighley

spinoza1111 wrote:
But congratulations to the growing number of people
who are calling Heathfield on his <rubbish>

[...] let's count them, shall we? [...] Judging by the
lack of feedback, Han appears to have gone, so it seems that either
the number has actually reduced [...] or everybody simultaneously
killfiled Han some months ago.

I don't use a kill file. Han really has gone.

And if
we reduce the number to those who actually post under their real name,
that number shrinks to one.

using/not using a pseudonym isn't a 100% indication of the quality of
postings

The rest of this article addresses the original subject of the thread,
"arithmetic on a void * pointer".

It may well be that the OP's attitude to void * stems from what might be
called a "pigeon-hole" approach to programming - memory is a bunch of
pigeon-holes, and a program is just a list of instructions for visiting
them (and modifying their contents) in a particular, dynamic, order.
This very low-level approach is absolutely fine and highly justifiable
for some people (especially writers of object code!). One can see why
such a person might have little patience with restrictions on pointer
arithmetic (and indeed with other forms of abstraction).

For those who have a higher level approach, however, the abstract
machine (AM) comes to the fore - and the AM doesn't define arithmetic on
void pointers because it can't be sure of the outcome of such arithmetic
on the real machines on which the AM sits.

yes, but that was a design choice. They could have defined the AM
differently if they'd wanted to.

The Standard's definition of p++ can be thought of as "increment p by
sizeof *p" - i.e. if p's object representation is 0xF000 and it's a
pointer to an int where int is known to be 6 bytes, then we should not
be entirely surprised if p's object representation changes to 0xF006.
Now, if p is a void *, then we're asking for p to be incremented by the
value of sizeof(void), which is obviously meaningless, since void is an
incomplete type that cannot be completed and so there is no way to know
how big it is.

The point of void* is not to be a mere synonym for char*,

there wouldn't be much point in introducing it if it had been.

but to allow
us to abstract away the differences between types when performing
operations on objects for which their type is immaterial, or to allow us
to process them so far, and then hand on their type-specific part to a
function that knows about their type. Obvious examples are mem*, qsort,
bsearch, fread, fwrite, and abstract container libraries.

If the OP can't see the point of void *,

....then perhaps he should use unsigned char* instead. I usually define
a type Byte or Octet if I want to manipulate buffers of bytes.

it may be that this is simply a
facet of the way he approaches programming - in other words, maybe for
him there isn't any point to it. It is certainly true that, for his
compression needs, he doesn't actually need void * - he could easily
make do with unsigned char * instead.

compression seems a very good candidate for operating on bytes/octets.
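
A sketch of the Byte/Octet idea as I read it; the typedef name and the
checksum helper below are invented, not Nick's actual code.

#include <stddef.h>

typedef unsigned char Byte;   /* the smallest addressable unit, spelled portably */

/* Hypothetical helper: XOR all the bytes of a buffer into one checksum byte. */
static Byte xor_fold(const Byte *buf, size_t len)
{
    Byte acc = 0;
    for (size_t i = 0; i < len; i++)
        acc ^= buf[i];
    return acc;
}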


--
We recommend, rather, that users take advantage of the extensions of
GNU C and disregard the limitations of other compilers. Aside from
certain supercomputers and obsolete small machines, there is less
and less reason ever to use any other C compiler other than for
bootstrapping GNU CC.
(Using and Porting GNU CC)
 

spinoza1111

Why do you bother?

That's another one for the Red Book, Dickie.

"comp.programming is not about programmers"

"Nilges isn't in comp.risks"

"I am smarter than Peter Neumann"

"chars aren't characters"

You see, the referents of your language are fantasies and the words
themselves, which you foolishly have learned to fit together without
troubling to see if they have any relationship to reality.

Adler's and GCC's usage reflects a possibility of which you are
unaware. This is that while on most computers, the smallest
addressable piece of memory is a character (where this is a Western
letter or ideogram), this wasn't the case in the second and first
generation of computers, and given the reality that "retro" computing
is practised at places like the Computer Museum in Mountain View, we
need to program physically reconstructed or simulated machines. And
not only for shits and giggles: especially in military applications,
the baby-killers might want to reprogram older baby-killing machines
(I don't like it but it's a "need", and they might be beating swords
into plowshares or reprogramming older land mines so they don't go off
when touched by a kid).

For example, the IBM 7094 had a 36 bit word with six 6 bit characters
packed inside words using BCD. The smallest addressable unit of memory
was the word, not the character. Implementing C on the 7094 means that
the sizeof the character is 1/6, a double or single precision value.

This means that you want foolishly to conflate char* and void*, but on
the 7094, Adler's default (void points to sizeof 1) is not the same as
char, and on the 7094, sizeof has to return a float. You foolishly
confuse C with a language like Lisp or Java, not seeing how much (bad)
history is incorporated in it. And like Seebach you try to destroy
reputations of people far more qualified than you based on your
Sophistry.

This is foolish because you think that C is "portable" whereas it is
tied to a machine in which the smallest addressable unit of memory is
the character.

Of course, this is a "retro" issue, but going forward, multibyte
characters mean that "the smallest addressable unit of memory" is no
longer the character.

C can of course be "hacked" to accommodate both retro and multibyte
computing, but you think foolishly that this shows the power of C.
What it shows is the power of Turing machines no matter how silly they
are. C is a Turing machine...one that sucks and that obscures what
goes on.

But hey don't go changin'. Your world is self-referential and you are
hopeless when it comes to the possibility of retraining. This was
clear to me in 2003 when you made a big whoop tee doo about an
invariant in a for loop. That is, when you can't dominate the agenda,
you do ANYTHING to invent agenda items, you lie about people, and you
deny your own lies.
 

gwowen

Very good question. The guy seems to be completely incapable of
understanding even the simplest things.

So stop. What are you trying to prove anyway? That you're smarter
than Nilges? I would suggest that stopping would go some way to
prove that, whereas not stopping provides strong counter-evidence.

It's not as if this hasn't been going on for the best part of a
decade. What's that they say about the definition of insanity is
doing the same thing over and over again and expecting different
results?
 

Nick Keighley

<snip longish post by Richard heathfield>

learn to trim posts you idiot
Pompous and content-free, since if you were really interested in
abstraction and type safety you would not use C.

C is less abstract than some but probably more so than you appreciate.
The quite portable Linux Kernel is written in (a dialect of) C
You would learn an
object oriented language,

well, no, not necessarily. OO is the be all and end all of programming
technology

"The statement [Dealing with large numbers of interrelated types while
still preserving modularity in the design of large systems is very
difficult, and is an area of much current research.], [...] is just as
true now as it was when we wrote it twelve years ago. Developing a
useful, general framework for expressing the relations among different
types of entities (what philosophers call ``ontology'') seems
intractably difficult. The main difference between the confusion that
existed ten years ago and the confusion that exists now is that now a
variety of inadequate ontological theories have been embodied in a
plethora of correspondingly inadequate programming languages. For
example, much of the complexity of object-oriented programming
languages -- and the subtle and confusing differences among
contemporary object-oriented languages -- centers on the treatment of
generic operations on interrelated types. Our own discussion of
computational objects in chapter 3 avoids these issues entirely.
Readers familiar with object-oriented programming will notice that we
have much to say in chapter 3 about local state, but we do not even
mention ``classes'' or ``inheritance.'' In fact, we suspect that these
problems cannot be adequately addressed in terms of computer-language
design alone, without also drawing on work in knowledge representation
and automated reasoning."

Structure And Interpretation of Computer Programs.


but you refuse to be vulnerable and to learn.

The original poster is in fact a competent C programmer. You're not.

Object oriented languages allow both "incomplete classes" and
"abstract classes" and they separate these notions, whereas void is
neither.

quite, so why mention them? void* is more like an opaque type. Only
Ada (that I'm aware of) dealt with these explicitly.
In C Sharp, for example, an incomplete class definition is simply part
of its source code.
what?

An abstract class is one that cannot be
instantiated but must be inherited.

and so not an opaque type
But void is-not inherited in any meaningful sense in C.

unsurprising as C doesn't have inheritance.

You cannot say
that an int is-a void with additional properties and additional
restrictions.

no you can't. And more importantly ***you don't want to*** as that is
not the appropriate semantic abstraction.
Which means in C that the GCC option makes perfect sense and provides
the ability to do arithmetic on pure pointers which point to sod-all.

doing arithmetic on pointers that point to sod-all seems at best
pointless and at worst dangerous. Typically in C this will lead to
undefined behaviour. Good.
This is in fact the world of assembler: Never-Never land, a dream time
when programmers had Fun

well assembler can be fun but it takes a long time to get anything
done. This is why things like C hash were invented.

Most of you creeps flunked English, which is part of your problem, so
at this point I can see you saying WTF.

I've got 2 O Levels (though my spelling doesn't give it away)
 

spinoza1111

In particular:

I find that it's *USEFUL* to get warned if I try to do arithmetic on
a (void *), because it means I think I know what I have a pointer to, and
that either I should change the type of the pointer to reflect what I know
it points to, or I don't know enough to do that arithmetic.

If you had enough education and experience, Petey, you'd know that
sometimes you need to forget what you know in order to write software.
That should be easy for you IF forgetting takes the same amount of
energy as learning (and there are neurochemical reasons for thinking
that it may), since you haven't learned much.

If C were truly a portable and safe programming language, then I could
write ONE data compression program that wouldn't have to "know or
care" whether it was compressing bytes or structs or whatever. All I
want is a void pointer that points to the smallest addressable unit of
memory. If I am compressing 16 bit international characters then I can
use multiplication by two (by means of shifting) after calculating the
offset. I don't want some clown porting it to a source code ecosystem
where the char is multibyte, because then my calculations will be off
by two times.
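
The closest thing standard C offers to the "one routine, no knowledge of the
element type" goal is the qsort/fread pattern: pass a void * plus an explicit
element size, and do the address arithmetic through unsigned char *. A
minimal sketch, with invented names:

#include <stddef.h>

/* Hypothetical: apply a callback to every element of an array whose type
   this function never learns: base pointer, element count, element size. */
static void for_each_element(void *base, size_t count, size_t size,
                             void (*visit)(void *elem))
{
    unsigned char *p = base;              /* byte-wise arithmetic, portably */
    for (size_t i = 0; i < count; i++)
        visit(p + i * size);
}

qsort and bsearch take their arguments the same way, which is the point of
the void * parameter in the first place.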

I don't have much experience in Adler's area but it appears to me that
GCC is a good tool if he must use C, since it separates the idea of a
character from the idea of a pure memory pointer. If you want to use
strong types at all times, go back to school and learn Java. But if
C's claims to be "close to the machine" are at all true, then it needs
to provide a pure assembler mode, in which we are Noble Savages
innocent of anything except computer memory. We don't know the sizeof
its units, and we don't know if they contain characters, parts of
characters, or several characters.

This is where I think void* would come in handy. In fact, I am
beginning to think its inclusion in GCC was a stroke of genius. You
see, the GCC developers didn't have greedy vendors on their ass.

But hey what do I know, I just have thirty years of experience, and
shit.

ROTFLMFAO
 

Nick Keighley

by (I consider) an unfortunate historical accident C made a byte the
same as a char ***by definition***. And a byte/char must be at least 8
bits. C implementations must respect this definition.
That's silly, because it means that I can't program the IBM 1401
second-generation transistor computer being reconstructed at the
Computer Museum in Mountain View in C! Its "byte" is six bits, seven
bits if you count the word mark.

you can't make the char type equal to the native byte, no. They'll
have to use some larger "object".

Alternatively, how would you write a simulator for a machine whose
basic addressable unit is six bits?

I'd use a char to represent the basic addressable unit and make sure
it never used more than 6 bits. Since a 6 bit byte machine is likely
pretty old I don't expect I'd get a memory problem. My key ring
contains more than 2000 Vaxs.
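
Roughly what that would look like, with invented names: keep each simulated
6-bit unit in an unsigned char and mask on every store so it can never hold
more than 6 bits. (CHAR_BIT still has to be at least 8 on the host, so the
simulated unit simply wastes the top bits.)

#include <stddef.h>

#define UNIT_MASK 0x3Fu               /* keep only the low 6 bits */

typedef unsigned char Unit6;          /* one simulated 6-bit storage unit */

static void store_unit(Unit6 *mem, size_t addr, unsigned value)
{
    mem[addr] = (Unit6)(value & UNIT_MASK);
}

static unsigned load_unit(const Unit6 *mem, size_t addr)
{
    return mem[addr];                 /* already masked at store time */
}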
You pretend that these conventions are knowledge.

they're handy if you want to write C programs. If you don't they are
pretty esoteric.

That's absurd, because there's no meaning to signing a character
outside of C. The only question is whether enough bits are in the
character to represent all possible languages world-wide.

a C char is only very loosely related to characters in natural written
languages. C was written by Americans before Unicode was even a dream.
They think a pound symbol is some sort of noughts and crosses grid.
 

spinoza1111

<snip longish post by Richard heathfield>

learn to trim posts you idiot


C is less abstract than some but probably more so than you appreciate.
The quite portable Linux Kernel is written in (a dialect of) C

I agree that Linux was ported at light speed, but I'd ask whether this
was because it was ported by virtual slaves who didn't know they were
slaves. I can see that Windows generated despair and anguish: when I
read about OS/360, I wanted to write my own operating system. It was
cool that Linux got done, but its developers should have been paid
more, say a fraction of the 2008 bailout.
well, no, not necessarily. OO is the be all and end all of programming
technology

You mean it's NOT, and it is NOT. But this argument (which I've heard
more than once in corporate nurseries and kinder-gartens) is
fallacious: "because your better solution is not best, my inferior
solution is preferable and therefore (ta da) the best, or (sighs and
flute notes) better than yours:

Let A be "my solution such as C"
Let B be "your solution such as C Sharp"
Let P be "the language of the gods such as Lisp, Scheme, Haskell or
what ev er"
Let .orEven. be an or operator connotative of rude astonishment

Let quality(x) be "the quality of x"

1. quality(B) < quality(P)
2. ergo quality(A) > quality(B) .orEven. quality(A) == quality(P)

This is of course an invalid argument.
"The statement [Dealing with large numbers of interrelated types while
still preserving modularity in the design of large systems is very
difficult, and is an area of much current research.], [...] is just as
true now as it was when we wrote it twelve years ago. Developing a
useful, general framework for expressing the relations among different
types of entities (what philosophers call ``ontology'') seems
intractably difficult. The main difference between the confusion that
existed ten years ago and the confusion that exists now is that now a
variety of inadequate ontological theories have been embodied in a
plethora of correspondingly inadequate programming languages. For
example, much of the complexity of object-oriented programming
languages -- and the subtle and confusing differences among
contemporary object-oriented languages -- centers on the treatment of
generic operations on interrelated types. Our own discussion of
computational objects in chapter 3 avoids these issues entirely.
Readers familiar with object-oriented programming will notice that we
have much to say in chapter 3 about local state, but we do not even
mention ``classes'' or ``inheritance.'' In fact, we suspect that these
problems cannot be adequately addressed in terms of computer-language
design alone, without also drawing on work in knowledge representation
and automated reasoning."

Structure And Interpretation of Computer Programs.

They may be right, but it's silly to quote Abelson et al. in defense
of C, as you seem to be doing. They would replace OO not with C but with
logic programming in Lisp and Scheme, and both of those languages
(especially Lisp) also reflect their origins. And when you get a
language that does "knowledge representation and automated reasoning"
your troubles have only begun, since the Logical Positivists like
Carnap and Nelson Goodman discovered that knowledge is hard to
represent. The problem becomes in fact an unsolved philosophical
problem.

The best is the enemy of the good. Sure, I might migrate to F Sharp
and I should probably learn a lot more about lisp. But it's INSANE to
quote these guys to defend C. They'd laugh at you.
quite, so why mention them? void* is more like an opaque type. Only
Ada (that I'm aware of) dealt with these explicitly.


and so not an opaque type


unsurprising as C doesn't have inheritance.


no you can't. And more importantly ***you don't want to*** as that is
not the appropriate semantic abstraction.


doing arithmetic on pointers that point to sod-all seems at best
pointless and at worst dangerous. Typically in C this will lead to
undefined behaviour. Good.


well assembler can be fun but it takes a long time to get anything
done. This is why things like C hash were invented.



I've got 2 O Levels (though my spelling doesn't give it away)

You're the exception that proves the rule
I congratulate you on your performance in school
 
