subroutine stack and C machine model


John Kelly

For example, Kenny has wondered why I don't go out and get
laid more: the answer is that desublimation is a tool of control.

Whose control?

There are also any number of lurkers who are afraid to post because of
Richard Heathfield's behavior, which borders on the bizarre.

You could study psychiatry in this ng.
 

Dennis (Icarus)

spinoza1111 said:
Perhaps. But if a text (not a "statement": a "statement" is a
fetishised and reified piece of a formal language spoken by
troglodytes) is clear this means you've understood it and have
verified that it has a precise relation to reality, e.g., is for the
most part, true.

You may want to check the definition again, as I'm rather sure that's not in
the OED.
Good to see that you now realize that statements that are understood can
either be true or false, contradicting what you said above.
<snip>

Dennis
 

Julienne Walker

AND today's previously-unreported and not-on-Seebs's-page error from
CTCR is...p683. "For most keys on the keyboard, these scan codes are
converted into 8-bit ASCII values by the operating system." No, Mr
Schildt. ASCII is a 7-bit code.

The code table is different from the physical representation. If I
store an ASCII value in a 32-bit integer, is it no longer an ASCII
value because I didn't use exactly 7 bits? This mix-up comes up quite
often with Unicode (code table) and the representations for
Unicode (UTF-8, UTF-16, UTF-32 are the usual suspects), but I'll admit
I hadn't seen it with ASCII before now. Usually it's the assumption
that ASCII is the only possible character set, or that char is always
8 bits.

Further, taking into account that the smallest addressable object in
any C program is 8 bits (as per the minimum limits in the standard),
this "error" strikes me as looking too hard for a mistake. I'm
confident that you can find something better, even in the sentence you
quoted from page 683. ^_^
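Julienne's distinction between a code table and its physical representation can be shown concretely in C. This is a minimal sketch (the helper name is_ascii is mine, not from the thread): the same code point remains the same value no matter how wide the container is.

```c
#include <stdint.h>

/* A code point is a number; the width of the container holding it is
   a storage detail. Hypothetical helper: returns 1 if v is a valid
   ASCII code point (ASCII defines code points 0..127). */
static int is_ascii(uint32_t v)
{
    return v <= 0x7F;
}

/* The same ASCII value 'P' (decimal 80) held in containers of three
   different widths: it remains the 7-bit code 1010000 throughout. */
static uint8_t  p8  = 80;
static uint16_t p16 = 80;
static uint32_t p32 = 80;
```

The point of the sketch is that is_ascii answers the same way for all three objects; nothing about the 8-, 16-, or 32-bit storage changes what the value is.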
 

Colonel Harlan Sanders


spinoza1111


spinoza1111 wrote:


I have asked [Peter Seebach], repeatedly, to post the data
base of 100s of known errors and he has not done so.
Do your own homework. He found a bunch on a flickthrough. I find
one
You keep claiming this in the same way you repeat claims about
people here, but you haven't documented these claims.

Yes I have. Like I said, I find one or two new ones every time I open
the book. That can't remain true forever (because if I keep doing it,
eventually I'll run out of book). But it's been true for a while.

AND today's previously-unreported and not-on-Seebs's-page error from
CTCR is...p683. "For most keys on the keyboard, these scan codes are
converted into 8-bit ASCII values by the operating system." No, Mr
Schildt. ASCII is a 7-bit code.

This is idiotic. Although the original ASCII was a seven bit code, it
is universally used in an 8 bit code or a 16 bit code for the very
good reason that no competent computer architect would make a word
size into a prime number, you moron.

This is insane. It's the replacement of useful information by
shibboleths and idioms which show membership in the in-crowd, because
in America, the UK and other developed countries, "programmers" no
longer code. Instead they are a sort of eunuch class which "sends bugs
to the compiler experts" and wastes my time by posting crap here using
their employer's computers.

The reader needs to know that information is normally sent in 8-bit
bytes universally, but you prefer, like my student "Otto", to show off
what you consider deep knowledge rather than teach people how to be
effective on the job.

You don't know how to program and you hide this ignorance, the secret
contour of your weakness, by running software you do not understand
and could not write, by repeating folklore, and running deliberate
campaigns of personal destruction.

I hope to see you in a court of law. I hope to see you in prison.
 

spinoza1111

The code table is different from the physical representation. If I
store an ASCII value in a 32-bit integer, is it no longer an ASCII
value because I didn't use exactly 7 bits? This mix-up comes up quite
often with Unicode (code table) and the representations for
Unicode (UTF-8, UTF-16, UTF-32 are the usual suspects), but I'll admit
I hadn't seen it with ASCII before now. Usually it's the assumption
that ASCII is the only possible character set, or that char is always
8 bits.

Further, taking into account that the smallest addressable object in
any C program is 8 bits (as per the minimum limits in the standard),
this "error" strikes me as looking too hard for a mistake. I'm
confident that you can find something better, even in the sentence you
quoted from page 683. ^_^

You're pissing in the wind, Julienne. With Seebach and Heathfield,
what everyone needs to know, from the standard use of twos complement
arithmetic, to the 8 bit byte, to the need for a stack, is "wrong"
because this would devalue their "knowledge", a set of quasi-religious
incantations and shibboleths which have here created confusion, and
make this ng useless for its intended purpose. And you enable these
creeps when you confuse justified self-defense against their campaigns
of personal destruction by not doing your homework.
 

spinoza1111


Forgot about that one. OK, Orville, I did it there too, in
self-defense against a bunch of redneck creeps who remind me of you. But
Kenny McCormick and Richard are not sock puppets since I change my
gender when I sock puppet.

And in my book, it takes more brains to sock puppet than to copy and
paste the OED without understanding, and it takes more morals than to
conduct campaigns of personal destruction.

As a matter of fact, maybe Lilith needs to post to this thread.
 

Julienne Walker

[...] by not doing your homework.

You keep stating this without any rationale for it. Collegial people
tend to assume that their peers are both intelligent *and* informed,
just as collegial people tend not to butt into conversations unless
they really *are* informed. So please, tell me what homework I haven't
done. To save you some effort, I'll cover what I believe to be
relevant to your claims of "personal destruction":

1) I've read the majority of your posts on comp.lang.c and
comp.programming as far back as Google Groups history goes.
2) I've been a regular reader of these newsgroups (clc, clcm,
comp.programming, etc...) for many years, and I'm familiar with the
frequent participants.
3) I've kept up-to-date on Schildt's writings enough to avoid looking
like an ass when I talk about them.

Please try to make a distinction between "not doing your homework" and
"I don't agree with your conclusions", because it seems very much like
you're using the former to justify the latter. I try to treat everyone
here with some modicum of respect, and I rather expect the same in
return. Otherwise things degenerate into petty and childish arguments.
 

Colonel Harlan Sanders

Forgot about that one. OK, Orville, I did it there too, in
self-defense against a bunch of redneck creeps who remind me of you.

R i i i i g h t.
But
Kenny McCormick and Richard are not sock puppets since I change my
gender when I sock puppet.

And in my book, it takes more brains to sock puppet than to copy and
paste the OED without understanding, and it takes more morals than to
conduct campaigns of personal destruction.

Brains? In every case your attempts at creating a supporting voice
were so obviously you, and very obviously not a female, it was
laughable. You were outed in that thread immediately.

Morals? You're an unrepentant liar.
 

John Kelly

I try to treat everyone here with some modicum of respect, and I
rather expect the same in return. Otherwise things degenerate into
petty and childish arguments.

If there were a maturity requirement for posting, it could eliminate
selected prominent regulars. Their bad manners, childish jealousy, and
outright hatred contribute to the decline of Usenet.
 

Kenny McCormack

If there were a maturity requirement for posting, it could eliminate
selected prominent regulars. Their bad manners, childish jealousy, and
outright hatred contribute to the decline of Usenet.

If it weren't for CLC, their human contact would be limited to the guy
who slips their meals under the door each morning.
 

Seebs

You may want to check the definition again, as I'm rather sure that's not in
the OED.
Good to see that you now realize that statements that are understood can
either be true or false, contradicting what you said above.

Again, the problem is that he's failing to distinguish between knowledge
about the statement and knowledge about the things the statement describes.

Consider, to make it even more interesting, the category of statements
about fiction. Is the statement "Treebeard is an ent" clear? I think it
is. How about the statement "Treebeard is a dragon"? There's no reason
for which one should be easier or harder to understand than the other --
but understanding gets you only as far as having accurate knowledge of
the statement. To decide whether the statement is true, you have to also
have knowledge of the topic of the statement, and then compare the two.

-s
 

Julienne Walker

Interesting question. I would say no, but I suspect you're not going
to agree. :)

Your suspicion is correct, and your answer borders on being asinine. I
can only assume that you're thinking in the abstract, because
otherwise you'd be saying that it's impossible to use ASCII in C.
 

Julienne Walker

ASCII code-points are values 0..127 and can be held in char or short or int or
whatever arithmetic types. It's a value thing. A code value > 127 is not ASCII.

Yes, and unless RH is being especially obtuse or I misinterpreted his
answer, he's saying that any value within that range in any integer
type is *not ASCII* unless the integer type is exactly 7 bits, which
is impossible in standard C because the smallest addressable unit is
at least 8 bits.
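Julienne's reading can be stated directly in C. A minimal sketch of the point: whether a value is an ASCII code point depends only on the value lying in 0..127, not on the width of the type that holds it (holds_ascii is a hypothetical helper name, not from the thread):

```c
#include <limits.h>

/* ASCII-ness is a property of the value, not of the container.
   Note that CHAR_BIT is at least 8 in any conforming implementation,
   which is why an exactly-7-bit char type is impossible in
   standard C. */
static int holds_ascii(long v)
{
    return v >= 0 && v <= 127;
}
```

By this test, `char ch = 80;`, `int i = 80;`, and `long l = 80;` all hold the same ASCII code point for 'P', regardless of their sizes.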
 

Moi

[...] by not doing your homework.

You keep stating this without any rationale for it. Collegial people
[snip]

1) I've read the majority of your posts on comp.lang.c and
comp.programming as far back as Google Groups history goes. 2) I've been
[snip]


Please try to make a distinction between "not doing your homework" and
"I don't agree with your conclusions", because it seems very much like

[snip]
Please don't breast-feed the troll. He'll bite.

:)
AvK
 

Julienne Walker

Well, as for "asinine", I guess that's a matter of opinion. I am
indeed thinking in the abstract, and I'm not sure that it makes sense
to talk about "using ASCII" in C. C doesn't mandate any particular
character set, and is just as at home with, say, EBCDIC as it is with
ASCII.

As I thought, you were going for something more abstract.
char ch = 80;

(a perfectly legal line of C), is it meaningful to say that ch holds
an ASCII code? I would argue that it is *not* meaningful within the
context of a portable C program.

Then again, if you say that ch holds an ASCII value, you've applied a
restriction that removes any claims of portability beyond the ASCII
character set.
On an ASCII-based machine, it is not unreasonable to say that 80 is
the ASCII code point for 'P'. But it /is/ unreasonable, in my view,
to claim that it is an 8-bit value. Either it's ASCII or it isn't. If
it's ASCII, it's a 7-bit value. The fact that it's stored in a data
type with more than 7 bits is neither here nor there. Putting a 7"
pencil in a 32" pencil box doesn't make the pencil 32" long - it just
means that there's some empty space in the box too. Similarly with
ASCII, the 7-bit code 1010000 remains a 7-bit code even if you store
it in an 8-bit object.

Sorry, I still think the "error" you found was too pedantic to be
anything but a stretch for finding something to complain about. It
seems I did indeed misinterpret your answer to my question, and we
agree for the most part.
 

Hektor Rottweiler

spinoza1111 wrote:


One of the more successful process control computers had a
13 bit word.

Yes, and I developed the compiler for the SL/1 XT, a PBX with a 24-bit
word. The software was unnecessarily complex because it should have
been a 32-bit word, but the mechanical engineers couldn't figure out
how to fit 32-bit data paths into the rack.
Two ASCII characters fit nicely in the 14-bit word of some of
Microchip's mid-range PICs.

....only to have to be unpacked...
Clearly your experience is limited when sending information.

Of course, I didn't say all platforms had an 8 bit byte or were powers
of two word length (see above). But the limitations are yours, since
you apparently confuse overspecialized experience with some crummy
little chip with knowledge of computer science.
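The packing mentioned above (two 7-bit ASCII codes in one 14-bit word) can be sketched in C. The function names are illustrative, not from any PIC toolchain, and the 14-bit value is simply held in a uint16_t:

```c
#include <stdint.h>

/* Pack two 7-bit ASCII codes into one 14-bit value: hi occupies
   bits 7..13, lo occupies bits 0..6. */
static uint16_t pack2(uint8_t hi, uint8_t lo)
{
    return (uint16_t)(((hi & 0x7F) << 7) | (lo & 0x7F));
}

/* The corresponding unpacking steps: shift and mask. */
static uint8_t unpack_hi(uint16_t w) { return (uint8_t)((w >> 7) & 0x7F); }
static uint8_t unpack_lo(uint16_t w) { return (uint8_t)(w & 0x7F); }
```

This also illustrates spinoza1111's complaint in miniature: the packing is free, but every access to a packed character costs a shift and a mask to unpack it again.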
 

Hektor Rottweiler

As I thought, you were going for something more abstract.



Then again, if you say that ch holds an ASCII value, you've applied a
restriction that removes any claims of portability beyond the ASCII
character set.

This is absurd. It was a mistake to use 7 bits for ASCII, made at a
time when engineers wanted to save bits at any price, including making
it difficult and slow to pack and unpack data in prime number widths.
It was an era in which hardware engineers hated software engineers,
thinking them to be dispensable, bearded and/or effeminate beatniks
and hippies; at Motorola, the development of the first cell phone was
considerably delayed by the hardware engineers' bullying and the
passive-aggressive responses of the software designers in my (limited
but first-hand) experience.

To want to preserve the mistake as a shibboleth, an idiom, or a manner
of speaking merely shows your loyalty to the wrong way of doing
things.
Sorry, I still think the "error" you found was too pedantic to be
anything but a stretch for finding something to complain about. It
seems I did indeed misinterpret your answer to my question, and we
agree for the most part.

Don't go there, Julienne, and stop enabling this person. He will
continue, I predict, to insist on his silly idioms.
 

Hektor Rottweiler

The ego of the man. Just because they're not /his/ sock puppets, he
claims that they're not sock puppets at all! I don't actually believe
them to be sock puppets, by the way - it's his reasoning, not
necessarily his conclusion, that is at fault here.

Creeps like you use language to obscure, in the same way language is
dysfunctional in the back offices of criminal firms. So let's see:
Kenny & Richard are sock puppets, but not mine. OK, someone else is
making them his sock puppets. Maybe John Nash? And what would that
mean?

In a trivial sense, for the same reason "duh, it's not ASCII if it is
in 8 bits", we could define x as his own sock puppet universally as a
null case. I'm my own sock puppet, Kenny is his own sock puppet,
etcetera.

This would make you trivially right
And to be so is your heart's delight
Not for you the grand gesture, no:
Of truth you are at best, the big toe.
You've never met a tautology
But to fall madly
In love with the hackneyed saw
And then spew it forth, from out your maw.
And now he has moved from denying that he uses sock puppets (except
once, er, twice, er, three times, er...) to extolling the virtues of
sock puppeteers. No limits, it seems, to his ego or his
shamelessness.

Creeps like you love to define the minor infraction for the same
reasons criminal offices have a lot of work rules. Yes, I'd rather see
pranks here, or what was originally meant by "trolling", than people
destroying reputations while being paid to do so, or perhaps out of
bitterness at working for a string of crappy little Blight banks and
Limey insurance companies with nothing to show for it.
 
