casts


Kenny McCormack

Tom St Denis (obvious sock puppet) said:
This sort of experience and forethought is what separates people like
me [real developers] from people like you [angry usenet trolls who act
like little kids].

Rich...

And priceless.
 

spinoza1111

If by "abstracting tasks away", you mean unethically forcing your
client to bankroll Yet Another development of a vanity system for
supporting GUIs, yeah I dunno how do dat. In True Basic, I was able to
develop a system for hydrostatic stability in 1986 that allowed a
vessel's captain to point to parts of the vessel and determine weight
and balance. Whereas in C a few months later I found myself having to
invent string processing.

You clearly have not worked in a team larger than yourself.  It's
typical in large enough organizations that you have a team dedicated
to the GUI, another to localizations, another to QA, another to the
underlying engine [whatever the application does], etc...

(Sigh) I prefer (as do most bright people) to work on things myself.

Most software delivered as above is unusable prior to release 2.0. Why
do you suppose that might be?
I'm happy you were able to write your "My First Program (tm)" in Basic
in 1986, but not all applications amenable to that development

No, I wrote My First Program in the then-new library of Northwestern
University in machine language for the IBM 1401 in 1970. I was not
allowed to smoke, being already addicted to Gauloises, but nonetheless
managed by careful desk checking to produce a bug free program.
process.  You complain that C sucks at GUI work.  I don't disagree.
But I also contend that languages like VB or Perl or Python or ...,
tend to suck at doing math, bit twiddling, etc, sort of work.

Platonism causes people to confuse their psychological ideas with
reality. VB .Net, in using the same semantics as any other .Net
language including C, doesn't "suck", but programmers don't know how,
any more, to express proper structured programming constructs in
languages that don't fully support them. I do, since I learned how to
do structured programming while being employed in a Cobol job.
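Concretely, that discipline is single-entry, single-exit code whose loop exits are
carried in the data rather than in a break or goto; a minimal sketch in C (the
function and the data are invented purely for illustration):

  #include <stdio.h>

  /* Linear search written as a single-entry, single-exit loop:
     the exit condition lives in a flag, the style one learns
     writing structured code in languages without break/EXIT. */
  static int find(const int *a, int n, int key)
  {
      int i = 0;
      int found = 0;
      while (i < n && !found)
      {
          if (a[i] == key)
              found = 1;
          else
              i++;
      }
      return found ? i : -1;
  }

  int main(void)
  {
      int a[5] = { 3, 1, 4, 1, 5 };
      printf("%d\n", find(a, 5, 4)); /* prints 2 */
      return 0;
  }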
So you split the difference, write the engine in C, and GUI in a
language where that is better suited.  If you do your work properly
the C code amounts to a library that performs the required work that
you can reuse in other applications/platforms.  So only your GUI work
runs the huge risk of being platform specific [e.g. GTK+ in Linux, GDI
in Win32, etc...].

This sort of experience and forethought is what separates people like
me [real developers] from people like you [angry usenet trolls who act
like little kids].

In the sense that I refused to "grow up" in a society where "growing
up" is constituted in sacrificing your brains and imagination, to the
extent where "expert programmers" become so adept at doing the
corporate shuck and jive that they wind up permanently unemployed at
40 (the corporate shuck and jive replacing their flexibility), then
yes I act like a child.

The fact is that most software written by teams that separate people
into cubicles is dead on arrival and has to be debugged by the user
community.

The "maturity" is in the organization, not in the people, who act like
vicious children right here, for example when they hound and harass
"trolls".

Ponder this passage from Theodore W Adorno's Minima Moralia. It
describes precisely how "research technicians" are dialectically
moronized, not only because the tools they use are of ever greater
elaboration and sophistication, but also because they are smart enough
to master and create those tools, yet are secretly "in denial" of
their inner weakness.

"But far from finding anything inimical in the prohibitions on
thinking, the candidates - and all scientists are candidates for posts
- feel relieved. Because thinking burdens them with a subjective
responsibility which their objective position in the productive
process does not allow them to meet, they renounce it, shiver a bit,
and run to join their opponents. Dislike of thinking rapidly becomes
incapacity for it: people who can effortlessly discover the most
sophisticated statistical objections when it is a question of
sabotaging a piece of knowledge, are unable to make ex cathedra the
simplest predictions. They hit out at speculation and in it kill
common sense. The more intelligent of them suspect the sickness of
their intellectual powers, since it first appears not universally but
in the organs whose services they sell. Many wait in fear and shame
for their defect to be discovered. But they all find it publicly
acclaimed as a moral achievement, and see themselves recognized for a
scientific asceticism which to them is none, but the secret contour of
their weakness. Their rancour is socially rationalized with the
argument: thinking is unscientific. At the same time, their mental
power has, in a number of dimensions, been prodigiously increased by
control mechanisms. The collective stupidity of research technicians
is not simply an absence or regression of intellectual faculties, but
a proliferation of the thinking faculty itself, which consumes thought
with its own strength."
 

Tom St Denis

What I love about your posts is that you string together words like
they have plausible meanings that are correct. You sir are a king
amongst trolls.

(Sigh) I prefer (as do most bright people) to work on things myself.

Most software delivered as above is unusable prior to release 2.0. Why
do you suppose that might be?

Lax testing and verification standards? I dunno, my customers expect
my software to work, let alone "be useable" the first time I deliver
it.
No, I wrote My First Program in the then-new library of Northwestern
University in machine language for the IBM 1401 in 1970. I was not
allowed to smoke, being already addicted to Gauloises, but nonetheless
managed by careful desk checking to produce a bug free program.

That's uninspiring. I was hoping for another story about how you
taught another random famous person how to program, or at least that
you were involved in the Apollo moon landing missions or something.
Platonism causes people to confuse their psychological ideas with
reality. VB .Net, in using the same semantics as any other .Net
language including C, doesn't "suck", but programmers don't know how,
any more, to express proper structured programming constructs in
languages that don't fully support them. I do, since I learned how to
do structured programming while being employed in a Cobol job.

Cobol is the last language I'd want to learn if I had to write an ASN.
1 library from scratch. And if you think VB.net is better than C at
manipulating data you either [or both] don't know VB or C.
In the sense that I refused to "grow up" in a society where "growing
up" is constituted in sacrificing your brains and imagination, to the
extent where "expert programmers" become so adept at doing the
corporate shuck and jive that they wind up permanently unemployed at
40 (the corporate shuck and jive replacing their flexibility), then
yes I act like a child.

I was more so referring to the fact that you troll usenet posting
about how C is so bad and useless despite not knowing thing 1 about
how to develop software. That's childish.
The fact is that most software written by teams that separate people
into cubicles is dead on arrival and has to be debugged by the user
community.

That's patently false. Look at, for example, console video games.
I'd say >95% of them work flawlessly out of the gate [whether they're
"good" games is another story] and maybe 1/100 games are dead out of
the gate, and the remaining 4% have minor improvement/tweaks to fix
things. That's just one industry.
The "maturity" is in the organization, not in the people, who act like
vicious children right here, for example when they hound and harass
"trolls".

Well I call you a troll because you outright ignore good common sense,
and rant about things that are just not true. C isn't bad because it
has no GUI facilities, it's just bad at GUI tasks. Just like VB isn't
bad because it can't bit-twiddle, it's just not good at that task.

But you insist on posting inflammatory material, calling people names,
insisting that they're persecuting you. That makes you a troll.

If you want a more positive reaction stop trying to insult us all the
time. Calling me stupid or "indoctrinated" because I tend to code a
lot in C isn't going to win me over (I also happen to code in shell
and Perl too...). You don't like C, great, but don't call me stupid
because I know how to use it to get things done.

<snip ramblings>

Let me enumerate your faulty ideas

1. Developers who like C only program everything in C. False.
2. Developers who like C think that all problems are solved with C.
False.
3. C is the best language for every problem. False.
4. All applications written in C are buggy and non-portable. False.
5. All applications written in Java are bug-free and portable.
False.
6. All developers who use C are indoctrinated and fear change.
False.
7. A company can only have teams that work in one language. False.

and so on...

I'm sorry you don't have any useful real world experience working in a
software team, but you have no idea, whatsoever, about what you're
talking. Smart developers use the right tool for the job. Sometimes
that's C, sometimes that isn't. But a professional developer wouldn't
get all emotional and upset because the tool they happen to prefer
isn't the right one.

I like C and all, but if I have to do string manipulations [often when
doing test generators] I use Perl. Why? Not because C sucks, but
because it's so much easier in Perl that the time I spend on learning/
relearning things in Perl is made up by not having to code the
equivalent application in C. That being said, when I need to make a
vector generator or something that meddles with bytes, I write it in
C, even if it has to parse some strings. Why? Because Perl sucks at
working with bits.
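For instance, a minimal sketch of the kind of byte-meddling generator meant here,
in C; the field layout and the values are invented for illustration only, not
taken from any real project:

  #include <stdio.h>
  #include <stdint.h>

  /* Pack a 4-bit opcode, a 4-bit flag nibble and a 24-bit operand
     into a single 32-bit test vector, then print it in hex. */
  static uint32_t make_vector(unsigned op, unsigned flags, uint32_t operand)
  {
      return ((uint32_t)(op    & 0x0Fu) << 28) |
             ((uint32_t)(flags & 0x0Fu) << 24) |
             (operand & 0x00FFFFFFu);
  }

  int main(void)
  {
      unsigned op;
      for (op = 0; op < 4; op++)
          printf("%08lX\n", (unsigned long)make_vector(op, 0x3u, 0x1234u));
      return 0;
  }

The masks and shifts spell out exactly which bits go where, which is the part
that gets clumsy in Perl.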

etc, and so on.

In short, if you want to stop being called a troll, stop posting
things you can't back up with facts, stay on topic for clc, and stop
the name calling.
 

Dik T. Winter

>
> Cobol is the last language I'd want to learn if I had to write an ASN.
> 1 library from scratch.

Dijkstra: "The use of COBOL cripples the mind; its teaching should,
therefore, be regarded as a criminal offence."

BTW, C was not stolen from PL/1 (Dijkstra: "PL/I - 'the fatal
disease' - belongs more to the problem set than to the solution set.")
but from BCPL, showing how much spinoza knows about history, but he is
of course only the 1111-th.
 

spinoza1111

...
...
 > > Platonism causes people to confuse their psychological ideas with
 > > reality. VB .Net, in using the same semantics as any other .Net
 > > language including C, doesn't "suck", but programmers don't know how,
 > > any more, to express proper structured programming constructs in
 > > languages that don't fully support them. I do, since I learned how to
 > > do structured programming while being employed in a Cobol job.
 >
 > Cobol is the last language I'd want to learn if I had to write an ASN.
 > 1 library from scratch.

Dijkstra: "The use of COBOL cripples the mind; its teaching should,
           therefore, be regarded as a criminal offence."

BTW, C was not stolen from PL/1 (Dijkstra: "PL/I - 'the fatal
disease' - belongs more to the problem set than to the solution set.")
but from BCPL, showing how much spinoza knows about history, but he is
of course only the 1111-th.

Not true. By the end of the 1960s, PL/I had popularised and
legitimated block structure in the USA. This is just a failure to
credit an "evil" IBM. IBM screwed up PL/I bad in the 1960s but did a
very effective job in writing a good PL/I compiler in 1974.

I'd suggest, Dik, that you overcome your anti-American and anti-IBM
biases and truly learn software history. Dijkstra praised Fortran, and
Saul Rosen's 1967 collection of papers, "Programming Systems and
Languages" has papers about PL/I in addition to the Algol 60 report.
It makes no mention of BCPL, as I recall, and BCPL was a curiosity. It
had no influence on anything.

Kernighan and Ritchie got tired of Multics, which was PL/I centric,
and used block structure, from Algol by way of PL/I. When they were
innovative, they made mistakes as in the case of the For statement and
aliasing. Had Ritchie been a truly great programmer, he would have
made Algol-60 run on the DEC 10.
 

Tim Streater

spinoza1111 said:
Not true. By the end of the 1960s, PL/I had popularised and
legitimated block structure in the USA. This is just a failure to
credit an "evil" IBM. IBM screwed up PL/I bad in the 1960s but did a
very effective job in writing a good PL/I compiler in 1974.

I'd suggest, Dik, that you overcome your anti-American and anti-IBM
biases and truly learn software history. Dijkstra praised Fortran, and
Saul Rosen's 1967 collection of papers, "Programming Systems and
Languages" has papers about PL/I in addition to the Algol 60 report.
It makes no mention of BCPL, as I recall, and BCPL was a curiosity. It
had no influence on anything.

Hardly surprising given that the language was only designed in 1967, and
underwent development over the following five years.
 

spinoza1111

What I love about your posts is that you string together words like
they have plausible meanings that are correct.  You sir are a king
amongst trolls.




Lax testing and verification standards?  I dunno, my customers expect
my software to work, let alone "be useable" the first time I deliver
it.

You say it works. But that doesn't imply it works. My experience with
people whose primary persona in communication is competitive and zero-
sum is that they can't code.
That's uninspiring.  I was hoping for another story about how you
taught another random famous person how to program, or at least that
you were involved in the Apollo moon landing missions or something.

No, I don't lie. Do you?
Platonism causes people to confuse their psychological ideas with
reality. VB .Net, in using the same semantics as any other .Net
language including C, doesn't "suck", but programmers don't know how,
any more, to express proper structured programming constructs in
languages that don't fully support them. I do, since I learned how to
do structured programming while being employed in a Cobol job.

Cobol is the last language I'd want to learn if I had to write an ASN.
1 library from scratch.  And if you think VB.net is better than C at
manipulating data you either [or both] don't know VB or C.

VB.Net IS better at manipulating data than C. I wrote a 26000 line
compiler in VB.Net while living in motels to demonstrate this, and I
am TIRED of arrogant programmers who open up VB.Net, fail to write a
Hello, World, and conclude that it's no good.
I was more so referring to the fact that you troll usenet posting
about how C is so bad and useless despite not knowing thing 1 about
how to develop software.  That's childish.

Yeah, whatever, kid.
The fact is that most software written by teams that separate people
into cubicles is dead on arrival and has to be debugged by the user
community.

That's patently false.  Look at, for example, console video games.
I'd say >95% of them work flawlessly out of the gate [whether they're
"good" games is another story] and maybe 1/100 games are dead out of
the gate, and the remaining 4% have minor improvement/tweaks to fix
things.  That's just one industry.

Wow. Video games. Look, they have bugs, but these are features because
14 year olds are happy if they show flashy graphics and make noise.
They LIKE finding the numerous bugs in video games! Stop making such
idiotic claims, kid.
Well I call you a troll because you outright ignore good common sense,
and rant about things that are just not true.  C isn't bad because it
has no GUI facilities, it's just bad at GUI tasks.  Just like VB isn't
bad because it can't bit-twiddle, it's just not good at that task.

No, YOU can't bit-twiddle in VB, because YOU suck at VB. What part of
shift don't you understand?
But you insist on posting inflammatory material, calling people names,
insisting that they're persecuting you.  That makes you a troll.

That's not the definition, kiddo.
If you want a more positive reaction stop trying to insult us all the

Oh, you're "insulted". Boo hoo.
time.  Calling me stupid or "indoctrinated" because I tend to code a
lot in C isn't going to win me over (I also happen to code in shell
and Perl too...).  You don't like C, great, but don't call me stupid
because I know how to use it to get things done.

Who's calling you stupid? I haven't, yet. I've just questioned your
smug wisdom by saying things you don't want to hear. But I can call
you stupid if you like.
<snip ramblings>

Let me enumerate your faulty ideas

Complete fantasy.
1.  Developers who like C only program everything in C.  False.
2.  Developers who like C think that all problems are solved with C.
False.
3.  C is the best language for every problem.  False.
4.  All applications written in C are buggy and non-portable.  False.
5.  All applications written in Java are bug-free and portable.
False.
6.  All developers who use C are indoctrinated and fear change.
False.
7.  A company can only have teams that work in one language.  False.

and so on...

I'm sorry you don't have any useful real world experience working in a
software team, but you have no idea, whatsoever, about what you're
talking.  Smart developers use the right tool for the job.  Sometimes
that's C, sometimes that isn't.  But a professional developer wouldn't
get all emotional and upset because the tool they happen to prefer
isn't the right one.

I like C and all, but if I have to do string manipulations [often when
doing test generators] I use Perl.  Why?  Not because C sucks, but
because it's so much easier in Perl that the time I spend on learning/
relearning things in Perl is made up by not having to code the
equivalent application in C.  That being said, when I need to make a
vector generator or something that meddles with bytes, I write it in
C, even if it has to parse some strings.  Why?  Because Perl sucks at
working with bits.

etc, and so on.

In short, if you want to stop being called a troll, stop posting
things you can't back up with facts, stay on topic for clc, and stop
the name calling.

Excuse me, I start "the name calling" after you guys go nuclear in ALL
cases. You see one of my technical claims, it bothers you, and because
you're too inarticulate to defend the opposite, you take the easy way
out by impugning a person's technical credentials. Then I invite you
to go **** yourself, and you are such girliemen that this is an
"insult". So you say increasingly absurd things about me, and I reply
with the facts, my views, and my reasons, and then I invite you to go
**** yourself with a Roto Rooter. This causes you to repeat the
process, but we are by no means the same. I am using this newsgroup on-
topic, I am contributing and discussing code, but I don't take shit
from wet behind the ear punks, or pompous jerks who think it's cute to
say things that would lead to a punch up in real life.

I have in all cases been quite honest about my technical credentials
and background. It's different from and in some ways inferior to that
of SOME people here, such as Dik and Ben B. This is because I started
out at an inferior, labor-oriented university on a machine that was
out of date, and used Cobol for a number of years. At the same time I
have far greater depth in general culture and the history of
technology than many here. And using inferior tools while studying
computer science means that one gets SOME skills in the silk purse
from sow's ear dept.

I'd suggest that we all come in here with something to contribute, but
Richard Heathfield has enforced a pecking order which prevents this ng
from being productive, since people like you and me are constantly
forced by behavior, which originates here from you but is inspired by
Richard, in which people's rights to be here are always in question.

This is what happens when lower middle class anarchist libertarians
say that "there are no rules", and it happened at the Altamont
Speedway at a Rolling Stones concert in 1969. Since most people are
little more than apes, they are made anxious by freedom and then you
have Hells Angels with pool cues. Usenet was supposed to be open to
all views but this is of course not so. There's a prejudice in favor
of brutality, and aliteracy, and the poster who spends, as Richard
spends, most of his time bullying others.

I'm tired of it.
 

Ben Bacarisse

spinoza1111 said:
I'd suggest, Dik, that you overcome your anti-American and anti-IBM
biases and truly learn software history.
and BCPL was a curiosity. It had no influence on anything.

BCPL had a significant influence on C; even down to the name of the
language. Ken Thompson used BCPL on Multics, and his B programming
language (C's predecessor) is a syntactic variant of BCPL.

<snip>
 

Walter Banks

spinoza1111 said:
Not true. By the end of the 1960s, PL/I had popularised and
legitimated block structure in the USA. This is just a failure to
credit an "evil" IBM. IBM screwed up PL/I bad in the 1960s but did a
very effective job in writing a good PL/I compiler in 1974.

I'd suggest, Dik, that you overcome your anti-American and anti-IBM
biases and truly learn software history. Dijkstra praised Fortran, and
Saul Rosen's 1967 collection of papers, "Programming Systems and
Languages" has papers about PL/I in addition to the Algol 60 report.
It makes no mention of BCPL, as I recall, and BCPL was a curiosity. It
had no influence on anything.

The Development of the C Language
Dennis M. Ritchie

The C programming language was devised in the early 1970s as a system implementation language for the nascent Unix operating system. Derived from the typeless language BCPL, it evolved a type structure; created on a tiny machine as a tool to improve a
meager programming environment, it has become one of the dominant languages of today.
. . .
This paper is about the development of the C programming language, the influences on it, and the conditions under which it was created. For the sake of brevity, I omit full descriptions of C itself, its parent B [Johnson 73] and its grandparent BCPL
[Richards 79], and instead concentrate on characteristic elements of each language and how they evolved.

I believe that you owe Dik Winter an apology for your juvenile outburst.

w.
 

Walter Banks

Ben said:
BCPL had a significant influence on C; even down to the name of the
language. Ken Thompson used BCPL on Multics, and his B programming
language (C's predecessor) is a syntactic variant of BCPL.

There was a debate at the time whether the next language should have been called D or P.

w..
 

Tim Streater

Ben Bacarisse said:
BCPL had a significant influence on C; even down to the name of the
language. Ken Thompson used BCPL on Multics, and his B programming
language (C's predecessor) is a syntactic variant of BCPL.

As it says in the Introduction to K&R.
 

Tom St Denis

You say it works. But that doesn't imply it works. My experience with
people whose primary persona in communication is competitive and zero-
sum is that they can't code.

http://en.wikipedia.org/wiki/List_of_fallacies

This is an "Appeal to Probabilities."
VB.Net IS better at manipulating data than C. I wrote a 26000 line
compiler in VB.Net while living in motels to demonstrate this, and I
am TIRED of arrogant programmers who open up VB.Net, fail to write a
Hello, World, and conclude that it's no good.

This smacks of several fallacies, most notably you're denying the
existence of alternatives, and are using unsubstantiated anecdotal
evidence as fact.

Also LOC stats are *not* impressive. Impressive is degrees of
functionality with MINIMAL lines of code.
Wow. Video games. Look, they have bugs, but these are features because
14 year olds are happy if they show flashy graphics and make noise.
They LIKE finding the numerous bugs in video games! Stop making such
idiotic claims, kid.

I'd contend that the video game industry employs more people, and
ultimately performs more in sales than pretty much any other consumer
software industry. So it's hardly something to scoff at just because
you don't happen to play video games. They're also highly complex
applications that deal with efficient tree parsing [BSP trees for
instance], lists, vector math, deal with hardware [graphic shaders,
other routines on the GPU], etc. It pulls together a lot of
disciplines.

Long gone are the side scroller games where a basic knowledge of 2d
graphics can make you a game. Nowadays if you want to be a graphics
programmer you have to know how to efficiently manipulate a variety of
data structures, algebra, and general computer knowhow [e.g. how to
optimize down to the last bit].

I also don't navigate nuclear submarines, doesn't mean the software on
there isn't important. You're dismissing a valid point because you're
not personally involved in it (that's a cognitive bias btw).
No, YOU can't bit-twiddle in VB, because YOU suck at VB. What part of
shift don't you understand?

I never said there is no way to accomplish the same tasks in VB. I am
contending that it's easier and more straight forward to work with raw
data types in C.
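For example, a minimal sketch of what working with raw data types looks like in
C: pulling a big-endian 32-bit length field out of a byte buffer (the buffer
contents here are invented for illustration):

  #include <stdio.h>
  #include <stdint.h>

  /* Assemble a 32-bit big-endian unsigned integer from raw bytes,
     as when parsing a length field in a binary format. */
  static uint32_t read_be32(const unsigned char *p)
  {
      return ((uint32_t)p[0] << 24) |
             ((uint32_t)p[1] << 16) |
             ((uint32_t)p[2] <<  8) |
              (uint32_t)p[3];
  }

  int main(void)
  {
      const unsigned char buf[4] = { 0x00, 0x01, 0x02, 0x03 };
      printf("%lu\n", (unsigned long)read_be32(buf)); /* prints 66051 */
      return 0;
  }

The equivalent is certainly possible in VB.Net, but in C the byte layout is
right there in the expression.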

<snip ramblings>

You assume a lot of things that are just not the case. For example,
you seem to assume that we think we're superior to other developers
because we use C. But as I pointed out several times I also program
in assembler, perl, and bash shell. You seem to contend that if you
found a way to do something it's ultimately the best and only way to
do something. That isn't the case either.

You constantly insult people because ad hominem attacks are the only
way you seem able to discuss matters.  I'm still not clear what
you're trying to achieve here. You openly admit you're trying to
"rediscover/relearn" C, but then assert your ultimate goal is to prove
to all us wankers that C is useless.

I contend that if you admittedly lack experience working in a software
development shop which focuses on C, you're ill-qualified to have an
opinion whether C is good or not.

I further contend that whether C is good or not is not a clc topic.
So move this discussion elsewhere.

Tom
 

bartc

spinoza1111 said:
Kernighan and Ritchie got tired of Multics, which was PL/I centric,
and used block structure, from Algol by way of PL/I. When they were
innovative, they made mistakes as in the case of the For statement and
aliasing. Had Ritchie been a truly great programmer, he would have
made Algol-60 run on the DEC 10.


Eh? Algol60 ran fine on the Dec-10, although I've no idea who was
responsible for it. It just was.

We had a good selection of languages, except for C (or perhaps us students
were kept well away from it).
 

spinoza1111

http://en.wikipedia.org/wiki/List_of_fallacies

This is an "Appeal to Probabilities."


This smacks of several fallacies, most notably you're denying the
existence of alternatives, and are using unsubstantiated anecdotal
evidence as fact.

Well, part of the behavior in dysfunctional families and dysfunctional
social systems is to normally dismiss actual lived experience as
"anecdotes".
Also LOC stats are *not* impressive.  Impressive is degrees of
functionality with MINIMAL lines of code.

No, impressive is literate programming.
Wow. Video games. Look, they have bugs, but these are features because
14 year olds are happy if they show flashy graphics and make noise.
They LIKE finding the numerous bugs in video games! Stop making such
idiotic claims, kid.

I'd contend that the video game industry employs more people, and
ultimately performs more in sales than pretty much any other consumer
software industry.  So it's hardly something to scoff at just because
you don't happen to play video games.  They're also highly complex
applications that deal with efficient tree parsing [BSP trees for
instance], lists, vector math, deal with hardware [graphic shaders,
other routines on the GPU], etc.  It pulls together a lot of
disciplines.

Long gone are the side scroller games where a basic knowledge of 2d
graphics can make you a game.  Nowadays if you want to be a graphics
programmer you have to know how to efficiently manipulate a variety of
data structures, algebra, and general computer knowhow [e.g. how to
optimize down to the last bit].

Well, my information is that video game developers today use C++.
I also don't navigate nuclear submarines, doesn't mean the software on
there isn't important.  You're dismissing a valid point because you're
not personally involved in it (that's a cognitive bias btw).

The world would be better off without nuclear submarines. Their
software is frighteningly out of date and written in bad
programming languages to the best of my knowledge, such as a
bastardized Algol called Jules Schwartz's Own Version of the
International Algebraic Language (JOVIAL).

Apart from ADA, US military data systems are pretty irresponsible as
far as I know. For example, the Veteran's Administration uses a
horrible 1960s vintage language called MUMPS which regularly screws up
veteran care, and is probably responsible for the obscene spectacle of
homeless Iraq veterans.
I never said there is no way to accomplish the same tasks in VB.  I am
contending that it's easier and more straight forward to work with raw
data types in C.

Programmers who use this language never in my experience have the
courage to weight what they say by their ignorance of the alternative.

As I'd done in Fortran II, I simulated bit twiddling in earlier
releases of VB using simple mathematics. This is a performance hit but
I am not impressed, like a child or adolescent, by blinding "speed",
especially because this "speed" is a metaphor. Anything is better than
being in the semantic world of C, wherein neither the preprocessor nor
aliasing can be avoided.
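The arithmetic in question is just the identity that, for unsigned values, a left
shift by n is a multiplication by 2^n and a right shift is a division by 2^n; a
minimal sketch in C (the helper names are made up):

  #include <stdio.h>

  /* Shift-free bit twiddling for unsigned values:
     x << n  ==  x * 2^n    and    x >> n  ==  x / 2^n  */
  static unsigned long shift_left_by_multiply(unsigned long x, unsigned n)
  {
      unsigned long p = 1;
      while (n-- > 0)
          p *= 2;
      return x * p;
  }

  static unsigned long shift_right_by_divide(unsigned long x, unsigned n)
  {
      unsigned long p = 1;
      while (n-- > 0)
          p *= 2;
      return x / p;
  }

  int main(void)
  {
      printf("%lu %lu\n", shift_left_by_multiply(5, 3),
                          shift_right_by_divide(40, 3)); /* prints 40 5 */
      return 0;
  }

The multiply and divide forms port to any language with integer arithmetic, at
the cost in speed already conceded above.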
<snip ramblings>

You assume a lot of things that are just not the case.  For example,
you seem to assume that we think we're superior to other developers
because we use C.  But as I pointed out several times I also program
in assembler, perl, and bash shell.  You seem to contend that if you
found a way to do something it's ultimately the best and only way to
do something.  That isn't the case either.

Absurd. I thought Cobol sucked, yet I simulated a digital switch in
it. I thought VB sucked A LOT in 1994, and steadily sucked less on
each release in the manner of Microsoft software to arrive at a steady
state of suck by Visual Basic Express 2008. At this time I'd cut over
to C Sharp which sucks less but still sucks somewhat.
You constantly insult people because ad hominem attacks are the only
way you seem able to discuss matters.  I'm still not clear what
you're trying to achieve here.  You openly admit you're trying to
"rediscover/relearn" C, but then assert your ultimate goal is to prove
to all us wankers that C is useless.

Yup. And ad hominem is irrelevantly arguing from the personal
characteristics of your opponent to the invalidity of their
conclusions. I don't know you people, nor do I wish to know most of
you.
I contend that if you admittedly lack experience working in a software
development shop which focuses on C, you're ill-qualified to have an
opinion whether C is good or not.

This is silly. Employment in such a shop isn't controlled by the
workers. It is controlled in all cases and in the last analysis by the
suits. I don't want to work with the personalities I see here, but I have
worked, in other languages, in software development "shops".
Furthermore, I claim that software development shops are white male
laagers such that working in them produces mental blocks rather than
valuable experience.
 

spinoza1111

bartc said:

Eh? Algol60 ran fine on the Dec-10, although I've no idea who was
responsible for it. It just was.

We had a good selection of languages, except for C (or perhaps us students
were kept well away from it).

I will admit that in 1970, Algol probably would not have been a good
choice for writing OS code, and my understanding is that this is what
Ritchie wanted to write.

But note that the point of "kernel" OS design is minimizing OS code.
Perhaps C made it too easy to write too much nonkernel code.

The vision of writing the whole OS in a language like Algol has gone
by the wayside because of a somewhat adolescent rage for efficiency.
But this is just my personal opinion. When I want to ram my opinions
down your throat, you'll know it.
 

Dik T. Winter

>
> Not true. By the end of the 1960s, PL/I had popularised and
> legitimated block structure in the USA. This is just a failure to
> credit an "evil" IBM. IBM screwed up PL/I bad in the 1960s but did a
> very effective job in writing a good PL/I compiler in 1974.

What is the relevance?
> I'd suggest, Dik, that you overcome your anti-American and anti-IBM
> biases and truly learn software history. Dijsktra praised Fortran,

This is praise?
1972 Turing Award Lecture:
Fortran's tragic fate has been its wide acceptance, mentally chaining
thousands and thousands of programmers to our past mistakes.
" When Fortran has been called an infantile disorder, full PL/1, with its
growth characteristics of a dangerous tumor, could turn out to be a
fatal disease.
EWD498, 1975
Fortran, "the infantile disorder', by now nearly 20 years old, is
hopelessly inadequate for whatever computer application you have in
mind today: it is now too clumsy, too risky, and too expensive to use.
" In the good old days physicists repeated each other's experiments, just
to be sure. Today they stick to Fortran, so that they can share each
other's programs, bugs included.
> and
> Saul Rosen's 1967 collection of papers, "Programming Systems and
> Languages" has papers about PL/I in addition to the Algol 60 report.
> It makes no mention of BCPL, as I recall, and BCPL was a curiosity. It
> had no influence on anything.

I think that you should read:
<http://cm.bell-labs.com/cm/cs/who/dmr/chist.html> by
Dennis M. Ritchie, and see that BCPL had a large influence. The abstract:
"The C programming language was devised in the early 1970s as a system
implementation language for the nascent Unix operating system. Derived
from the typeless language BCPL, it evolved a type structure; created
on a tiny machine as a tool to improve a meager programming environment,
it has become one of the dominant languages of today. This paper studies
its evolution."
You might also read the BCPL reference manual put online by Ritchie as:
<http://cm.bell-labs.com/cm/cs/who/dmr/bcpl.pdf> dated 1967 as part of
project MAC at MIT, and note that it contains blocks (not so very surprising
as Martin Richards was British).
> Kernighan and Ritchie got tired of Multics, which was PL/I centric,
> and used block structure, from Algol by way of PL/I.

As BCPL had block structure, I would think it also came to C from BCPL.
Which in turn had it (probably) from CPL which in turn was derived
from Algol 60.
> When they were
> innovative, they made mistakes as in the case of the For statement and
> aliasing. Had Ritchie been a truly great programmer, he would have
> made Algol-60 run on the DEC 10.

At that time at many places there were still difficulties with the
creation of Algol 60 compilers, one of the reasons that subsets have
been created. I have still somewhere a report from (I think Pittsburgh
University) where it is investigated how to implement dynamic arrays.
 

Tom St Denis

<snip off topic material>

This isn't a usenet group to discuss Algol, ADA, Fortran, COBOL, C++,
Visual Basic or politics.

Please restrict your posts to the C language.

Thanks,
Tom
 

spinoza1111

<snip off topic material>

This isn't a usenet group to discuss Algol, ADA, Fortran, COBOL, C++,
Visual Basic or politics.

Please restrict your posts to the C language.

That's impossible and insane. The "C language" is itself a fuzzy set;
nobody can give a straight answer to some of the simplest questions
about it owing to the fact that its initial design was so deficient
that it had to be jiggered and extended with libraries, preprocessor
code, and new versions to be minimally useful.

It is true that the "kernel" approach in OS design, and the RISC
approach in hardware design, create good designs by eliminating
nonessentials. However, a programming language, unlike an OS kernel or
the machine instructions of a computer, is used by far more people in
the real world.

No more than ignorance is skepticism is lack of culture an
attractive sparse aesthetic. Ignorance is ignorance, and a narrow
education is a narrow education.

Furthermore, C is political. Kernighan and Ritchie were able to make
it popular while better languages designed in unfashionable regions of
the US and worldwide failed because Kernighan/Ritchie worked at the
intersection of power and prestige: Princeton and the old Bell Labs.
Had the United Nations had the political and military clout its
founders had intended for it, UNESCO would have created a world
programming language and saved us all a great deal of time for nobler
pursuits: hell, we might even have health insurance in the USA, and a
one-state solution in Israel.

Chaos theory teaches us that a butterfly fart in Beijing can cause a
tornado in Kansas, and everything is connected. The point is being
able to use language to make valid connections. If you do not have
this ability, you are well-advised to be sparse in speech, but I'd
recommend that this economy extend to your criticism of what I say
here. In fine, zip it.

You regard me as "offtopic" primarily because of stylistic cues:
proper names, larger words spelled in most instances correctly,
complex sentence structure, and in general a writing style considered
girlish in American and world schools that tend to track boys into
technology. However, I started in programming before the modern "ideal
programmer", and unattractively fat and inarticulate male, was created
by the media, so I don't choose to be this persona here.
 

spinoza1111

...
 > > BTW, C was not stolen from PL/1 (Dijkstra: "PL/I - 'the fatal
 > > disease' - belongs more to the problem set than to the solution set.")
 > > but from BCPL, showing how much spinoza knows about history, but he is
 > > of course only the 1111-th.
 >
 > Not true. By the end of the 1960s, PL/I had popularised and
 > legitimated block structure in the USA. This is just a failure to
 > credit an "evil" IBM. IBM screwed up PL/I bad in the 1960s but did a
 > very effective job in writing a good PL/I compiler in 1974.

What is the relevance?
Not much, just clarifying the record. PL/I can be made useful and safe
with something of the same "discipline" (surplus repression in
Frankfurt school terms) one must show with C.
 > I'd suggest, Dik, that you overcome your anti-American and anti-IBM
 > biases and truly learn software history. Dijkstra praised Fortran,

This is praise?
  1972 Turing Award Lecture:
     Fortran's tragic fate has been its wide acceptance, mentally chaining
     thousands and thousands of programmers to our past mistakes.
  "  When Fortran has been called an infantile disorder, full PL/1, with its
     growth characteristics of a dangerous tumor, could turn out to be a
     fatal disease.
  EWD498, 1975
     Fortran, "the infantile disorder', by now nearly 20 years old, is
     hopelessly inadequate for whatever computer application you have in
     mind today: it is now too clumsy, too risky, and too expensive to use.
  "  In the good old days physicists repeated each other's experiments, just
     to be sure. Today they stick to Fortran, so that they can share each
     other's programs, bugs included.
Look at his 1999 *tour d'horizon*.
 >                                                                    and
 > Saul Rosen's 1967 collection of papers, "Programming Systems and
 > Languages" has papers about PL/I in addition to the Algol 60 report.
 > It makes no mention of BCPL, as I recall, and BCPL was a curiosity. It
 > had no influence on anything.

I think that you should read:
<http://cm.bell-labs.com/cm/cs/who/dmr/chist.html> by
Dennis M. Ritchie, and see that BCPL had a large influence.  The abstract:
  "The C programming language was devised in the early 1970s as a system
   implementation language for the nascent Unix operating system.  Derived
   from the typeless language BCPL, it evolved a type structure; created
   on a tiny machine as a tool to improve a meager programming environment,
   it has become one of the dominant languages of today.  This paper studies
   its evolution."
You might also read the BCPL reference manual put online by Ritchie as:
<http://cm.bell-labs.com/cm/cs/who/dmr/bcpl.pdf> dated 1967 as part of
project MAC at MIT, and note that it contains blocks (not so very surprising
as Martin Richards was British).

 > Kernighan and Ritchie got tired of Multics, which was PL/I centric,
 > and used block structure, from Algol by way of PL/I.

As BCPL had block structure, I would think it also came to C from BCPL.
Which in turn had it (probably) from CPL which in turn was derived
from Algol 60.

PL/I influenced the lads too: but there is a cultural ban on crediting
IBM, because Kernighan and Ritchie were already living the foolish
fantasy that they were fighting the Forces of Darkness.

 >                                                      When they were
 > innovative, they made mistakes as in the case of the For statement and
 > aliasing. Had Ritchie been a truly great programmer, he would have
 > made Algol-60 run on the DEC 10.

At that time at many places there were still difficulties with the
creation of Algol 60 compilers, one of the reasons that subsets have
been created.  I have still somewhere a report from (I think Pittsburgh
University) where it is investigated how to implement dynamic arrays.

But these problems have been overcome, no thanks to C. C was a net
waste of time. Basically, the US was damned if the Europeans were to
go ahead with Algol since that would disrupt the Cold War hegemony of
the US. Therefore, difficulties in the Algol development process which
also occurred in the development of Fortran were exaggerated by US
computer media.

In my view, it would have been better for programmers world-wide to
wait for Algol while using assemblers with conditional and macro
facilities: this technology was good to go in the 1950s. Instead,
Fortran was adopted and although it was supposed to enable end users
to program, it merely created one of the first hordes of coding bums.

Of course, it was easier for me to debug Fortran in machine language
than write a full-featured assembler in 8K for the IBM 1401 without
having to create 50+ passes in 8K of storage, and I used Fortran
wherever practicable.

The problem with Algol was, IMO, that whenever developers started
actually THEORIZING (thinking), their managers at banks and in
government would flip out because as my fat pal Adorno noticed in the
1930s when he worked in market research, capitalism is anti-thought:
the "thinker" is suspected of soldiering and wasting time in
"scientific" management. The quotations from Wittgenstein in the Algol
report were probably considered too la-de-da, not only by American
auto industry managers, but also, probably, by Dutch merchant seamen,
Danish agribusiness managers, and other Practical Men.

The basic idea being something noticed by Kant circa 1790: that the
philosopher, or the man doing philosophy, must always be considered an
impractical dreamer while the rest of us go about our business,
starting Seven Years' Wars, developing Fortrans and Cs, creating
Chernobyls, and in Shakespeare, "getting wenches with child and
wronging the ancientry".
 
