How to convert infix notation to postfix notation


Nick

spinoza1111 said:
Thanks for the bug report. This will be fixed in the next release.
Would you rather be Nick or Anon2?

I'm not an "anon". You can find out my complete name with very little
effort (in a comment in the GD source for a start - some of us are
motivated by status!), but if you don't want to, "Nick" will do fine.
Ewwwwww nasty problem. Sounds like for portability I should follow
this advice. But on the other hand, if in your environment this is a
possibility, it sounds like you have too many editors.

Editors can be people too. I really don't have a clue where the data
coming into my programs comes from (for example, my main hobby program
takes input from web browsers, which I have no control over and which
can do all sorts of things).

My view is that if there is a handy macro provided in the standard
library to recognise spaces, it costs me nothing to use it, experienced
programmers will recognise what it does, and I gain some bug protection
for no cost.

comp.lang.c people will tell you things about casting the parameters for
isspace. It's a nasty wart on C, but nevertheless I would recommend it
(sooner or later someone is going to throw UTF-8 or something similar at
your code).
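
For what it's worth, here is a minimal sketch of the cast being recommended here (the helper name is mine, not something from the program under discussion): the ctype classifiers take an int that must be EOF or an unsigned char value, so a plain char, which may be signed and negative for bytes above 0x7F, is converted first.

    #include <ctype.h>
    #include <stdio.h>

    /* Wrap the cast once so call sites stay tidy; "is_space_char" is an
       illustrative name, not one used in the program under discussion. */
    static int is_space_char(char c)
    {
        return isspace((unsigned char)c);
    }

    int main(void)
    {
        const char *s = "10 113 +";
        int count = 0;

        while (*s)
            if (is_space_char(*s++))
                count++;

        printf("%d space characters\n", count);   /* prints 2 */
        return 0;
    }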

I generally take the position of being as flexible as possible until it
gets difficult. Then I decide whether to trade flexibility off against
limiting the environment that the program will operate in. But it's in
that order, rather than saying "I don't need to do this, because it
won't happen in my environment". There is code at the heart of
CanalPlanAC that started life on quite a different computer to the one
it's running on - and thinking like this has saved me a lot of hassle
along the way (needing to go from ASCII to UTF-8 recently when I
expanded my database from the UK to the European mainland, for example).
 

spinoza1111


spinoza1111 wrote:


Unless you're paying him, he's under no obligation to pay the
slightest attention to your orders.



You will recall that I expressed several other concerns. Let's see if
you've addressed them yet.

FINALLY something substantial...I hope.
My first step was to remove all your comments. That's because they are
C99-style comments, and I don't have a C99 compiler. (Nor do most
people. You probably don't.) Under C90, // is a syntax error. On the
off chance that you do have a C99 compiler, removing the comments
seemed to be an easy way of sidestepping the issue.

Oh for Christ's sake...you're too incompetent to use a global edit
change? No, this doesn't make the cut, and I won't change the use of
the comments. I refer you to my definition of programming, per
Dijkstra and Knuth: it's the communication of intent as to using
computers between human beings, and if you act like an ape, you aren't
qualified to be in this conversation. It was clear even to you that
the lines are comments, and you know how to change them. But because
you're here to throw your weight around, you won't, and you endanger
my reputation with your posts as you harm Navia's business and
personal reputation.

If this were a physical space I'd have long since asked you to leave,
and physically ejected you if necessary. And no, I wouldn't call
Security, either. I'd do it myself.

I've already said that I don't use balanced comments because they are
error prone.
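
For readers unfamiliar with the hazard usually meant here: /* */ comments do not nest, so "commenting out" a block that already contains a comment ends the outer comment early. A minimal sketch of one conventional C90-safe workaround follows (whether this is what the author would choose is not stated):

    #include <stdio.h>

    int main(void)
    {
    #if 0
        /* Disabled code; comments inside this region are harmless,  */
        /* because #if 0 ... #endif, unlike block comments, nests.   */
        printf("never compiled\n");
    #endif
        printf("enabled code\n");
        return 0;
    }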

"For thou has made this happy earth thy hell". This facility could be
an ESCAPE from the savage, brutal market relations in which
programmers must work - one in which programmers could, as in the old days,
exchange information without having their competence constantly challenged by
managers less competent than they. Somehow, this bothered you, so you
came in here and transformed it into a nasty little business office,
one in which good people who don't "toe the mark" in some militaristic
form find themselves talked about behind their back, and the people
who get ahead are the back-stabbers: who write attacks with 20
"errors" and then make reference to "100s".
Line-wrapping is a Usenet problem that has caught you out before, but
you're too stupid to learn from it, so it's caught you out again.

**** you, asshole.
Still, I know there's no point arguing about it with you because
you're too stupid to gain useful knowledge from discussions, so I'll
just fix the linewrap and carry on.

I fixed it like this:

    testCase("((10+(113-(2+((((((2/(4+3)))))))+2+2))))*a",
             "10 113 2 2 4 3 + / + 2 + 2 + - + a *",
             intMallocMax);

so that, on the off chance that you are overcome by an insane desire
to be sensible, you can copy and paste it into your own source so
that it won't line-wrap the next time you post it.
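
As an aside for anyone following the thread for the conversion itself rather than the argument: the program under discussion uses a recursive-descent parser (addFactor, mulFactor, and so on), but the same translation can be sketched with the classic operator-precedence ("shunting-yard") loop. The sketch below is illustrative only - the names, buffer sizes, and lack of error handling are mine - and it reproduces the expected output of the test case quoted above.

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    /* Illustrative shunting-yard conversion: numbers, identifiers,
       + - * / and parentheses in; space-separated RPN out.
       No error handling, unary operators, or overflow checks. */
    static int prec(char op)
    {
        return (op == '*' || op == '/') ? 2 : 1;   /* + and - bind less tightly */
    }

    static void toPostfix(const char *in, char *out)
    {
        char stack[128];
        int top = 0;
        size_t n = 0;
        size_t i;

        for (i = 0; in[i] != '\0'; i++) {
            unsigned char c = (unsigned char)in[i];

            if (isalnum(c)) {                      /* copy a whole operand */
                while (isalnum((unsigned char)in[i]))
                    out[n++] = in[i++];
                out[n++] = ' ';
                i--;                               /* for loop re-advances */
            } else if (c == '(') {
                stack[top++] = '(';
            } else if (c == ')') {
                while (top > 0 && stack[top - 1] != '(') {
                    out[n++] = stack[--top];
                    out[n++] = ' ';
                }
                if (top > 0)
                    top--;                         /* discard the '(' */
            } else if (strchr("+-*/", c) != NULL) {
                while (top > 0 && stack[top - 1] != '(' &&
                       prec(stack[top - 1]) >= prec((char)c)) {
                    out[n++] = stack[--top];       /* left-associative pop */
                    out[n++] = ' ';
                }
                stack[top++] = (char)c;
            }                                      /* anything else: ignored */
        }
        while (top > 0) {
            out[n++] = stack[--top];
            out[n++] = ' ';
        }
        if (n > 0)
            n--;                                   /* drop trailing blank */
        out[n] = '\0';
    }

    int main(void)
    {
        char rpn[256];

        toPostfix("((10+(113-(2+((((((2/(4+3)))))))+2+2))))*a", rpn);
        puts(rpn);    /* 10 113 2 2 4 3 + / + 2 + 2 + - + a * */
        return 0;
    }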

This suggestion is too minor to include in the Change Record but I
will make it. Consider yourself credited with a minor formatting
correction.
And now we come to the first "real" problem, and it's a showstopper:
  printf("%sInvalid malloc request %s\n", intMallocMax);
This was noticed by Nick and I added it to Issues 30 minutes before
this post appeared, so no cigar. Furthermore, it's not a showstopper,
because it appears only when (1) an unusual request for malloc
limitation is made to check the program's use of malloc (a test made
in tester() as well) and (2) the request is made using an invalid non-
number.

So again, no credit for you.

[...] failed to remind you that intMallocMax is an int. Kind of makes you
doubt the point of such a stupid stupid convention, doesn't it?

Remove the first %s, and change the other one to %d to fix this
problem.

Duh. Ever hear of a typo? And what is more, are you even fucking aware
that to find typos we ask thugs and bullies to leave the room when
they open their big fat filthy mouths and start calling people incompetent
based on stylistics and typos, jerk face?

I knew the fix as soon as Nick pointed out this minor error.
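
For concreteness, the corrected line - intMallocMax being an int, as noted - would read along these lines:

    /* one %d conversion with one matching argument replaces the "%s...%s" pair */
    printf("Invalid malloc request %d\n", intMallocMax);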
Next, return a value from main.

OK, this will be done in rel 5 and you will be credited since it seems
to be somewhat of a consensus. What should your name be in the Change
Record? Heathfield? Anon? Fat Bastard? Whatever you like.
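
A minimal sketch of the change being agreed to here; under C90, falling off the end of main() returns an undefined termination status to the host environment, so the portable fix is simply an explicit return:

    int main(void)
    {
        /* ... existing program body ... */
        return 0;
    }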
When you've fixed those, I've got another 60 diagnostic messages for
you. And that's just trivial syntax stuff. When - in, at this rate,
another six or seven years - we get a clean compile, then we can
start looking at the semantics.

Note what this clown is doing. It would have been a simple matter for
him to paste the "60 diagnostic messages" into this email. But like
his friend Seebach, his style is to post a few trivial errors (here, a
format nit and an error in a rare diagnostic message) and then make
reference (as did Sen. McCarthy during the Army-McCarthy hearings) to
"many more errors".

[Alger Hiss may very well have passed information to the Soviets: this
is unknown. And, he was caught out in a lie by Whittaker Chambers. But
the man who nailed him was Richard Nixon, not Sen. McCarthy: McCarthy
took what Nixon had found and used electronic media to multiply
"communists in government" to an absurd level. Anyone who got in
McCarthy's way was a "communist" by virtue of...getting in his way.
Likewise, to disagree with Heathfield or Seebach is an "incompetent".]

Nobody believes you, Richard. You are a liar who's bullied too many
people here. You started in on me ten years ago because your lack of
education was exposed in a thread on programming professionalism, and
you've been keeping your nasty stuff up for ten years...with such
frequency that yes I believe you are being paid to harass people here.

But your SINGLE valid point, which does not amount to a bug report,
will be on the next Change Record and you will be credited. That's
because I am a professional and this is my experiment in making this
ng useful for its announced purpose: not your little fiefdom.

FYI, the code compiles under Microsoft C++ .Net Express in a .C file
type as a C program with zero compiler and zero linker errors or
warnings. You're pulling the unprofessional stunt that a psychotic
Randy Thompson pulled with the code for my book in 2006: either
running the code in an environment deliberately set to make it break,
or using your ignorance and incompetence to make your case.

Note that I don't claim that you CANNOT change the comment style
automatically. Instead you prefer to pretend you can't. Like Sen.
McCarthy, you're a rat in a trap who's willing to look like an
incompetent to make his charges stick.
 

Moi

On Nov 7, 10:09 pm, Moi <[email protected]> wrote:

[snip snip snip]
Sloppy English won't make your case. How can a program be "error prone"
for the "reader"? Do you mean "conducive to misreading?" That's
precisely what my style is not.

Since you seem to be very good at learning new languages, I suggest you
put your writings in Dutch, then. French or German is acceptable, too.

AvK
 

spinoza1111

I'm not an "anon".  You can find out my complete name with very little
effort (in a comment in the GD source for a start - some of us are
motivated by status!), but if you don't want to, "Nick" will do fine.





Editors can be people too. I really don't have a clue where the data
coming into my programs comes from (for example, my main hobby program
takes input from web browsers, which I have no control over and which
can do all sorts of things).

My view is that if there is a handy macro provided in the standard
library to recognise spaces, it costs me nothing to use it, experienced
programmers will recognise what it does, and I gain some bug protection
for no cost.

comp.lang.c people will tell you things about casting the parameters for
isspace.  It's a nasty wart on C, but nevertheless I would recommend it
(sooner or later someone is going to throw UTF-8 or something similar at
your code).

I generally take the position of being as flexible as possible until it
gets difficult. Then I decide whether to trade flexibility off against
limiting the environment that the program will operate in.  But it's in
that order, rather than saying "I don't need to do this, because it
won't happen in my environment".  There is code at the heart of
CanalPlanAC that started life on quite a different computer to the one
it's running on - and thinking like this has saved me a lot of hassle
along the way (needing to go from ASCII to UTF-8 recently when I
expanded my database from the UK to the European mainland, for example).

Misunderstanding. I thought you actually meant that an editor MIGHT
put a tab in the SOURCE CODE inside the character delimiters.
Unlikely, and if it happens, change your editor.

I am of course aware of the C concept of "white space" and I elected
to make blanks and only blanks (the last character in the white space
range) the separator character, so this is for me not an issue.
 

Ben Bacarisse

spinoza1111 said:
On Thu, 05 Nov 2009 20:52:43 -0800, spinoza1111 wrote:

[snipped]
#define ADDFACTOR \
        int addFactor(char *strInfix, \
                      char *strPolish, \
                      int *intPtrIndex, \
                      int intEnd, \
                      int intMaxPolishLength)
ADDFACTOR;
[snipped]


// ---------------------------------------------------------------
// Parse add factor
//
// int addFactor(char *strInfix,
//               char *strPolish,
//               int *intPtrIndex,
//               int intEnd,
//               int intMaxPolishLength)
//
// addFactor = mulFactor [ *|/ mulFactor ]
//
ADDFACTOR
{
    char chrMulOp = ' ';
    int intStartIndex = 0;
    if (!mulFactor(strInfix,
                   strPolish,
                   intPtrIndex,
                   intEnd,
                   intMaxPolishLength))

[snipped]

The strange thing about this coding style is that it
will save you *no* keystrokes, since you actually duplicate the
function prototype in the comment block just above the function definition.
Whenever you have to change the function's arguments
you will have to change the (redundant) comment as well.

Yes. This, however, gives the best result for the program reader.

I disagree. In fact I think the more usual style helps the reader at
only a small cost to the writer. Your macros might help a casual
reader, but someone who is, say, reviewing the code or, more
importantly, tracking down a bug, needs to know the actual types, not
those in the comment. The comment might even lead a reader astray if
there is a typo in the macro.

<snip>
 

spinoza1111

But as lots of other people have pointed out, doing it twice means that
sooner or later (usually sooner) one of them will be wrong.

I quite like the idea of avoiding having to keep the prototype and the
definition in synch if you change them (not enough to want to adopt it,
but I do quite like it).  But what you need to do is make the macro
definition act as the documentation as well.  Without this, you are - as
they pointed out - defining it twice anyway.

No, I'm not. I'm defining it once, and documenting it once as a
courtesy. A professional programmer will keep the comments up to date
as the return courtesy in gratitude for the effort made in formatting
the code.
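
For concreteness, here is a sketch of the arrangement Nick suggested above - keeping the explanatory comment and the single authoritative statement of the signature together, so there is nothing elsewhere to fall out of date. The ADDFACTOR name and parameters are taken from the code quoted earlier; the descriptive comment is an illustrative guess at intent, not the author's own documentation:

    /* addFactor: parse one factor of the infix string in strInfix,
       appending its postfix translation to strPolish and advancing
       *intPtrIndex (illustrative description only). Editing the
       parameter list below updates declaration, definition, and the
       documentation's anchor point in one place. */
    #define ADDFACTOR \
            int addFactor(char *strInfix, \
                          char *strPolish, \
                          int *intPtrIndex, \
                          int intEnd, \
                          int intMaxPolishLength)

    ADDFACTOR;    /* forward declaration */

    ADDFACTOR     /* definition reuses the same single source of truth */
    {
        /* ... body as in the quoted code ... */
        return 1;
    }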

[The thugs here will be certain to not understand this use of the word
"courtesy", any more than they have any notion of common decency.]
 

Nick

spinoza1111 said:
No, I'm not. I'm defining it once, and documenting it once as a
courtesy. A professional programmer will keep the comments up to date
as the return courtesy in gratitude for the effort made in formatting
the code.

That was "defining" as an English word rather than a C word - my fault
for being ambiguous. If you really can guarantee that you will always
remember to change the comments when you change the code, when they are
in different parts of the file, you're a better programmer -
irrespective of language - than me. Of course, you're also working on
the assumption that anyone else who edits your code will be that good -
and from what I know that's pretty unlikely.

I'm vain enough to hope that people will be interested in my code when
I'm no longer around - so I try to write it in a way that only requires
them to be very good, not superhuman.
 

spinoza1111

spinoza1111 said:
On Thu, 05 Nov 2009 20:52:43 -0800, spinoza1111 wrote:
[snipped]
#define ADDFACTOR \
        int addFactor(char *strInfix, \
                      char *strPolish, \
                      int *intPtrIndex, \
                      int intEnd, \
                      int intMaxPolishLength)
ADDFACTOR;
[snipped]
// ---------------------------------------------------------------
// Parse add factor
//
// int addFactor(char *strInfix,
//               char *strPolish,
//               int *intPtrIndex,
//               int intEnd,
//               int intMaxPolishLength)
//
// addFactor = mulFactor [ *|/ mulFactor ]
//
ADDFACTOR
{
    char chrMulOp = ' ';
    int intStartIndex = 0;
    if (!mulFactor(strInfix,
                   strPolish,
                   intPtrIndex,
                   intEnd,
                   intMaxPolishLength))
[snipped]
The strange thing about this coding style is that it
will save you *no* keystrokes, since you actually duplicate the
function prototype in the comment block just above the function definition.
Whenever you have to change the function's arguments
you will have to change the (redundant) comment as well.
Yes. This, however, gives the best result for the program reader.

I disagree.  In fact I think the more usual style helps the reader at

If you mean by "the more usual style" putting trivia before important
functions, I reject this. It's more important to me that main() not be
last than that it return a value.

If on the other hand you mean a list of prototypes followed by the
functions in which the function prototype is re-coded, I reject this.
It is far, far worse than an extra comment, since mistakes in
correspondence cause the program not to even compile, or compile with
errors. Mistakes in the comment cause at worst a mistake in
apprehension which is correctable by changing the comment.

My way is clearly the best. The problem here is that like "Levine the
Genius Tailor" in Gerald Weinberg's Psychology of Computer
Programming, people have been so crippled by the defects of C that
they fetishize, reify and rationalize those defects, making a normal
human response to those defects (including coding great "pseudo-code"
which C cannot handle) itself a defect, and "normalizing" their own
deviance. Their being mentally crippled into people who write code
that is objectively disorganized (trivia before important things)
becomes in their own eyes a virtue.

As it is, C programs start not with a bang but a whimper. A program
should start with "front matter" including an announcement *sans
honte* which tells the reader what it does: but C programs in the real
world start in most cases with trivia which represents a waste of
time for the maintainer.

Programming languages were NOT invented to create a class of
normalized deviants who have discovered that their learning disorder
is overlooked or a benefit in some job. They were invented so that
people could say what they meant. Therefore a programming language in
which the central issues are what main returns or in which trivia must
precede important code is an abomination.
only a small cost to the writer.  Your macros might help a casual
reader, but someone who is, say, reviewing the code or, more
importantly, tracking down a bug, needs to know the actual types, not
those in the comment.  The comment might even lead a reader astray if
there is a typo in the macro.
In that case he need only split the screen and view the macro
definition versus the code.

The global autism of programming, however, is sure to make for
programmers who, when they see code not laid out "their" way, will
declare without due diligence that their way is the only sensible way
of arranging code, no matter how deviant, and that "everybody" does
things their fucked up way unless they are "incompetent", where
"incompetence" is actually the name of what Adorno called the secret
contour of their weakness.

This generates the absurdity of appealing to the nonprogramming "user"
in discussing what is or is not readable code, for the nonprogramming
"user" doesn't read code...by definition.

The absurdity is generated because, as in the case of my coworker at
Princeton, the programmer feels himself in a closed system. There's no
way to "prove" apart from empirical sociological research which no one
programmer is qualified to do that predefining decls as macros is
"more readable", and it becomes his word against another's, so the
*deus ex machina* is invoked.

Since programming shops are staffed (either in reality or in the
*mythos* which creates the reality as is narrated by its participants)
by deviants who are by the terms of their employment subaltern, the
meaning of the symbol "user" is necessarily someone external to the
otherwise global capitalist system...an absent Lacanian phallus, an
absent father, or what Zizek calls "big Other".

But unlike Big Brother in Orwell's 1984, who's not an actual character
but is represented, not incarnated, by the midlevel Party functionary
O'Brien, the big Other today has to be incarnate in order to maintain
the illusion and the control. He's Donald Trump, or Rupert Murdoch.

He's the "father I never had", but also the person who says "you're
fired, for you have revealed the secret contour of your incompetence
and weakness to me". This persona is internalized but emerges as the
Decider in these sorts of "programming style" issues, which
programmers are simply not qualified to answer if they are the issue
of empirical, sociological reality.

Years ago, I gave a talk at the old "Data Processing Management
Association" in Peoria on the subject of how to simulate, with Go To,
the structured constructs of Bohm and Jacopini, and in that talk I
said that what most discourse neglected about the issue of
"readability and maintainability" was that it's silly to speak of
"readability and maintainability" without knowing much about the so-
called "user" (the reader of your code, who's not a user but another
programmer): brutally, if he won't read it, it's unreadable as far as
we know and in this case.

I'd already seen people labor hard to comment and format their code,
only to be viciously attacked by software-ignorant managers for that
great crime against Holy Private Property, "wasting the company's
resources"...where the company's resources happen to be not JUST your
time but also what Marx called your very power to labor, which, if the
company is on its toes from the fiduciary point of view of the
majority stockholders, is taken completely...leaving the employee with
nary a jot or tittle of the time-resource to do anything outside of
his remit.

The remit is never quite defined, for that would give away the game,
which is power, and power is the destruction of other people's
autonomy.

This is why wounded spirits drag themselves in here not to be friendly
or to act in solidarity but to demonstrate that they are like Tony
Soprano, the chosen one who always pleases the dead Father. The
problem is that this dreamworld always demands the sacrifice of other
people's reputations to shore up the self-images of people like
Heathfield and Seebach.

Heathfield is literally incapable of discussion abstract enough to
remove any imputations about the competence of other people, and
Seebach was unable to address the genuine issue of C portability
across the MS and unix great divide. For Seebach to do this would have
required him to become actually familiar with Microsoft systems but
unlike me he is unable to step outside his autistic comfort zone.

I'm willing to program in a language I dislike to prove my point, but
Seebach failed to research adequately why many of Herb's
recommendations work on Microsoft and not on unix. He preferred to
create a disorganized list of trivia which became the source for
nearly all the claims about Herb's competence because it flattered the
vanity of unix losers...I mean users.
 

spinoza1111

That was "defining" as an English word rather than a C word - my fault
for being ambiguous.  If you really can guarantee that you will always
remember to change the comments when you change the code, when they are
in different parts of the file, you're a better programmer -
irrespective of language - than me.  Of course, you're also working on
the assumption that anyone else who edits your code will be that good -
and from what I know that's pretty unlikely.

I'm vain enough to hope that people will be interested in my code when
I'm no longer around - so I try to write it in a way that only requires
them to be very good, not superhuman.

Wait a minute. This is bullshit.

Since when is it "superhuman" to be capable of a simple clerical task?

Again, the passage "Diagnosis" from Adorno's Minima Moralia applies in
an almost occult way:

"Even when they prove to be quite humane and reasonable beings outside
of the enterprise, they freeze into pathic stupidity the moment they
think professionally."

That is, people successful at paraprofessions and unprotected by
tenure learn to use shibboleths (code words and turns of phrase that
are clumsy and without meaning, but accepted as idiomatically
expressing what's often called "a good work attitude", such as an
irrelevant reference to an abstract computer user). They mocked the
tenured class, sometimes with good reason (for professors can be
intellectually corrupt) but blind to the fact that corresponding to
corruption that results from a privilege such as tenure, a reciprocal
corruption (sucking up to the boss and using meaningless language
exhibiting "a good attitude") exists when your employment is at will.
This is the pathic stupidity of which Adorno writes.

"Far from perceiving such prohibitions on thought as something
hostile, the candidates – and all scientists are candidates – feel
relieved. Because thinking burdens them with a subjective
responsibility, which their objective position in the production-
process prevents them from fulfilling, they renounce it, shake a bit
and run over to the other side."

Sure. If you've been told that arcane specialists on the other side of
the world must fix all compiler bugs, god forbid you should. You have
a "subjective" responsibility when you see something wrong in the
global process to fix it, but your fix can easily break the release:
note that Adorno sometimes used "subjective" in a way opposite to its
common use: he felt that "subjective" thought actually engages reality
whereas most "objectivity" is gulled and tricked by artifice including
cooked statistics...and silly assed programming languages.

"The displeasure of thinking soon turns into the incapacity to think
at all: people who effortlessly invent the most refined statistical
objections, when it is a question of sabotaging a cognition, are not
capable of making the simplest predictions of content ex cathedra
[Latin: from the chair, e.g. Papal decision]. They lash out at the
speculation and in it kill common sense. The more intelligent of them
have an inkling of what ails their mental faculties, because the
symptoms are not universal, but appear in the organs, whose service
they sell. "

This is astonishing since Adorno was writing about what happened, in
his direct experience, to the promotion of classical music to a mass
audience. His own recommendations (that the audience be actually told
about themes and structure in the same way music theory students learn
it) were ignored and the result was the near incoherence of the way in
which Deems Taylor says one should listen to classical music in
Fantasia, which arguably destroyed classical music listening as a mass
phenomenon in America for good in 1940.

Adorno noted that intelligent, civilized men produced baby-talk in the
form of guides to classical music for "everyman". Likewise, large
software systems are full of bugs because the dislike of thinking
emerges FIRST in the "organs whose service they sell".

"Many still wait in fear and shame, at being caught with their defect.
All however find it raised publicly to a moral service and see
themselves being recognized for a scientific asceticism, which is
nothing of the sort, but the secret contour of their weakness. Their
resentment is socially rationalized under the formula: thinking is
unscientific. Their intellectual energy is thereby amplified in many
dimensions to the utmost by the mechanism of control. The collective
stupidity of research technicians is not simply the absence or
regression of intellectual capacities, but an overgrowth of the
capacity of thought itself, which eats away at the latter with its own
energy. The masochistic malice [Bosheit] of young intellectuals
derives from the malevolence [Bösartigkeit] of their illness."

Astonishing, because a popular computer book is "Don't Make Me Think!".

You actually dislike the idea that the program maintainer MIGHT have
to change a comment "unnecessarily" although in the scheme of things,
that's doing his job. Not even clerical thinking is approved anymore
in specific cases while "thinking" in general receives more than lip
service.
 

Flash Gordon

bartc said:
I could pick up radio telephone calls on airband radio in the 70's. I doubt
microprocessors were involved.

They have been involved for a long time. Identifying whether the call is
for this particular phone. Selecting the base station with the strongest
signal, negotiating which channel to use (you need some mechanism of
having multiple calls on a base station, and it involves some mechanism
of selecting a channel/sideband/whatever to do it). All sorts of things.
And while the sophistication of GSM would be difficult without them,
wireless telephony has been around for a long time.

Even simple mobile phones (as opposed to two-way radios) are hard to do
without a processor (you could use a lot of custom hardware instead).
For those of us outside that world [industry etc], who only really have
access to consumer-level PCs, why should we concern ourselves with
anything else? At most we might worry about portability between Windows,
MacOS and Linux.

So? A lot of people don't know either English or Chinese. If someone who
only spoke Swahili said that, because he does not know any English or
Chinese speakers, Swahili is the most common language in the world,
and English and Chinese speakers are not people, would you think he was
correct?

No. But that's nothing to do with the undeniable ubiquity of PCs.

It is in part. You were saying that servers, among other things, were
not computers. You are the Swahili speaker saying that the servers etc.
are not people because you never come across them and that Swahili
(Windows) is the most common language (OS) because that is all you come
across whilst ignoring the majority of the world.

PCs are only undeniably the most common computers in your little corner
of the world, not in the world at large.
 

Nick

spinoza1111 said:
Wait a minute. This is bullshit.

Since when is it "superhuman" to be capable of a simple clerical task?

Not to be capable. To do it every time, without fail and without the
smallest error, even when not particularly experienced with the system
(we're talking about other people maintaining code, remember). That's
not going to happen. And this is a field I have some experience in.

Can we avoid throwing words like "bullshit" around please - I'm being
polite to your opinions (I happen to think a lot of them are wrong) but
I don't just leap in with "they are bullshit". I'm not objecting to the
term through any sort of prudishness, just wanting a more elevated
level of debate.
Again, the passage "Diagnosis" from Adorno's Minima Moralia applies in
an almost occult way:

"Even when they prove to be quite humane and reasonable beings outside
of the enterprise, they freeze into pathic stupidity the moment they
think professionally."

That is, people successful at paraprofessions and unprotected by
tenure learn to use shibboleths (code words and turns of phrase that
are clumsy and without meaning, but accepted as idiomatically
expressing what's often called "a good work attitude", such as an
irrelevant reference to an abstract computer user). They mocked the
tenured class, sometimes with good reason (for professors can be
intellectually corrupt) but blind to the fact that corresponding to
corruption that results from a privilege such as tenure, a reciprocal
corruption (sucking up to the boss and using meaningless language
exhibiting "a good attitude") exists when your employment is at will.
This is the pathic stupidity of which Adorno writes.

"Far from perceiving such prohibitions on thought as something
hostile, the candidates – and all scientists are candidates – feel
relieved. Because thinking burdens them with a subjective
responsibility, which their objective position in the production-
process prevents them from fulfilling, they renounce it, shake a bit
and run over to the other side."

Sure. If you've been told that arcane specialists on the other side of
the world must fix all compiler bugs, god forbid you should. You have
a "subjective" responsibility when you see something wrong in the
global process to fix it, but your fix can easily break the release:
note that Adorno sometimes used "subjective" in a way opposite to its
common use: he felt that "subjective" thought actually engages reality
whereas most "objectivity" is gulled and tricked by artifice including
cooked statistics...and silly assed programming languages.

"The displeasure of thinking soon turns into the incapacity to think
at all: people who effortlessly invent the most refined statistical
objections, when it is a question of sabotaging a cognition, are not
capable of making the simplest predictions of content ex cathedra
[Latin: from the chair, e.g. Papal decision]. They lash out at the
speculation and in it kill common sense. The more intelligent of them
have an inkling of what ails their mental faculties, because the
symptoms are not universal, but appear in the organs, whose service
they sell. "

This is astonishing since Adorno was writing about what happened, in
his direct experience, to the promotion of classical music to a mass
audience. His own recommendations (that the audience be actually told
about themes and structure in the same way music theory students learn
it) were ignored and the result was the near incoherence of the way in
which Deems Taylor says one should listen to classical music in
Fantasia, which arguably destroyed classical music listening as a mass
phenomenon in America for good in 1940.

Adorno noted that intelligent, civilized men produced baby-talk in the
form of guides to classical music for "everyman". Likewise, large
software systems are full of bugs because the dislike of thinking
emerges FIRST in the "organs whose service they sell".

"Many still wait in fear and shame, at being caught with their defect.
All however find it raised publicly to a moral service and see
themselves being recognized for a scientific asceticism, which is
nothing of the sort, but the secret contour of their weakness. Their
resentment is socially rationalized under the formula: thinking is
unscientific. Their intellectual energy is thereby amplified in many
dimensions to the utmost by the mechanism of control. The collective
stupidity of research technicians is not simply the absence or
regression of intellectual capacities, but an overgrowth of the
capacity of thought itself, which eats away at the latter with its own
energy. The masochistic malice [Bosheit] of young intellectuals
derives from the malevolence [Bösartigkeit] of their illness."

Astonishing, because a popular computer book is "Don't Make Me Think!".

You actually dislike the idea that the program maintainer MIGHT have
to change a comment "unnecessarily" although in the scheme of things,
that's doing his job. Not even clerical thinking is approved anymore
in specific cases while "thinking" in general receives more than lip
service.

I've left all of this in because I don't understand a word of it (well,
I understand the words, but ...).

It's my experience that if you have multiple copies of things - it
doesn't matter whether we are talking C source code, or copies of
canteen menus, or regulations for what to do in an emergency - unless you
are incredibly rigorous in setting up processes and enforce them through
complicated procedures and cross-checks, the average individual will,
sooner or later, forget to update every copy of them. Maybe you won't.
I find that difficult to believe, but even if so, you may be that good.
Not everyone is like you.

So, to return to the actual example away from whatever Adorno was saying
about classical music in 1940 (?!), it seems to me that all the benefit
of not having two versions of the code in the prototype and the
definition (which the compiler will catch if they mismatch) is thrown
away by having another version in the documentation that nothing will
catch. That's why I suggested that if you were so keen to avoid
duplication (which I thought you were; if you're not, why on earth not
stick to the traditional repetition of the code?) then putting the
comments in there as well would be a good idea.

Why not take the opportunity to make it easy for yourself, never mind
everyone else?

Isn't this what literate coding is all about?

PS - can you snip signatures please?

Thanks
 

Seebs

You do keep making these unfounded assertions, don't you? How do *we*
know you're not written by Douglas Hofstadter?

I don't think you do. I'm not even sure I do.

But in practice, it's been a good working assumption, and I think the idea
that I should be especially worried that someone might think I'm a hypocrite
ridiculous. Should I be worried that they will also figure out that I
sometimes shit? Whatever. People have flaws. They are not necessarily
worth making a big deal over, in general.

Now, if the question were not whether people would *think* I were a hypocrite,
but whether I was behaving hypocritically in a given area, that might be
a legitimate cause for interest and concern. However, given the sheer
bulk of medical research available on the question, I'm not going to worry
about this one.

-s
 

Seebs

Kenny is right.

Not usually.
May as well admit that you're motivated by status.

If someone were to present evidence that I cared about it, sure. But
correcting factual errors isn't evidence of that.

I wonder if you could find any examples of me correcting factual errors
in cases where it led, predictably, to a decrease in my apparent status
with the people who have the same model of status you and Kenny share.

(Originally I thought that was a rhetorical question, but considering
your research skills, it's actually a substantive question of fact. I
will point out that, for just about anyone else, the answer would be an
obvious "yes".)

-s
 

Seebs

The point is your job function is not a programmer,

Actually, it is.

My involvement with compiler bug reports isn't programming, mostly. The
rest of what I do is programming.

The point, I think, is that you appear to have about the worst reading
comprehension I've ever seen. You keep conflating separate events into
single things and oversimplifying complex relationships. It's actually
sorta spooky sometimes. Hmm.

I am not sure I've ever seen you correctly parse qualifiers.

-s
 

Nick Keighley

Itemise them. NOW. Otherwise you're again making a small number of
errors into a full scale trashing as you did with Schildt.

Look back through this (admittedly rather large) thread. Richard
posted a long list of compiler errors. Many of them seemed to be
mismatches between the prototypes and the actual functions.
 

Seebs

Itemise them. NOW.

No.

(Also, don't be stupider than you have to be; Richard Heathfield already
gave you a list of dozens that a compiler caught.)
Otherwise you're again making a small number of
errors into a full scale trashing as you did with Schildt.
Incorrect.

Release 4 of the infix2Polish conversion program is attached to this
reply. All of your concerns, including the small H issue, have been
addressed, and you have been credited in the Change Record. From now
on, I want you to economically report problems as they arise or
contribute your own code to this discussion.

What makes you think I care what you want?

Consider. The first post of yours I saw contained various allegations about
the nature and purpose of the standardization process. You've been asked
several times, by both me and others, to give any kind of evidence or support,
but none has been forthcoming.

Since that time, there has been no period during which I have believed you
to be basically sane. Everything available suggests that you are a
charter member of the tinfoil hat brigade. You can't think, you can't
argue, you believe stuff without evidence, and you consider that you believe
something to be persuasive evidence. You rant at length about how qualified
various people other than you are, and how important your work history is,
but fundamentally, you can't make an argument.

So why should I care what you think, or what you want? You're a kook.
You have consistently demonstrated a categorical inability to support
your claims at any level. The closest you've ever gotten is that you
once cited the abstract of a totally unrelated paper because it used a
particular word and you didn't bother to find out what it actually said.
That's not, by normal standards, evidence.
When I want anything more, I'll beat it out of you.

No, you won't.
If you by your own repetition force me to repeat then yes indeed, your
own reputation is being damaged by this online discussion.

Non-sequitur.

Your use of "as I have said" to introduce unsupported claims does not
create support for them. You have never supported your claims.
I choose to, and in so doing I repeat what you've told me
about your learning disorder and the fact that you don't seem to have
either academic qualifications or real programming experience.

Which establishes only your inability to read for comprehension.
Here is version 4 of infix2Polish.

That's nice.

Here is an ASCII cow:

         (__)
         (oo)
  /-------\/
 / |     ||
*  ||----||
   ^^    ^^

-s
 

Seebs

Literally untrue, for the comment being wrong won't make the actual
program error-prone.

But it will make the process of reading the program error-prone, because the
user won't notice when you change the calling semantics of the function and
forget to update the comment. So the reader will expect it to have
different arguments...

And given the number of crazy things like "two %s for a single int value"
you've done, I don't think it's reasonable to just assume that won't
happen.

-s
 

Seebs

Since when is it "superhuman" to be capable of a simple clerical task?

Remembering everything one's seen is pretty much superhuman; very few
people can do it. The alternative is to spend a bunch of time bouncing
around a source file reading unrelated things to compare them.

Remember, if you get the prototype and the function declaration out of
sync, the compiler catches it instantly, so it's not a risk -- there's no
chance of keeping it screwed up if you pay even casual attention to your
compiler's output. By contrast, requiring someone to ignore the comment
and look at a #define hundreds of lines away is just asking for trouble.
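
A small self-contained illustration of that point (the function and its name are mine, not taken from the program under discussion): the prototype repeats the definition's signature, and it is the compiler, not a comment, that keeps the two honest.

    #include <stdio.h>

    /* If the definition below drifted away from this prototype - a changed
       parameter type, a different count - the compiler would be required
       to diagnose the mismatch. A comment carrying the same information
       gets no such check. */
    int countSpaces(const char *s);              /* prototype */

    int countSpaces(const char *s)               /* definition */
    {
        int n = 0;

        while (*s)
            if (*s++ == ' ')
                n++;
        return n;
    }

    int main(void)
    {
        printf("%d\n", countSpaces("10 113 +")); /* prints 2 */
        return 0;
    }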

You really don't seem to have any clue what the *point* of documenting
things is. Documentation which is maintained with your "you can check it
out if it's wrong" attitude is worse than no documentation at all.

-s
 

Seebs

Well people are motivated by hundreds of things to a greater or lesser
extent.

Yes. And he's mostly right, but he's understated the frequency of the
exceptions. Hint: Clinical autism is frequently one of them.
Most of comp.lang.c seems to be about you two claiming people are
motivated by status. Could we return to talking about C please?

You know, a few days ago, I would have made a snarky comment about Kenny
not being able to, but I saw a post from him recently that, if I recall
correctly, was both topical and informative. So I guess I'll agree; I
wish he'd talk about C more, because he seems to be an interesting and
capable writer in the field. I just wish he'd abandon his talking about
psychology, which appears to be a field in which he is less versed.

And, since he appears to rely heavily on status, it is perhaps worth
pointing out that I *do* have a degree in psychology, not to mention a
great deal of time spent talking to psychologists and other specialists
about at least one particular expression of autism, which has among
other things revealed that I don't have the "status" wiring, although I
can sort of fake an awareness of it enough to get along with people who
care about it when I have to.

-s
 

Seebs

Twice, now. I expect to have to post it twice or three times more
before he eventually notices. At that point, he'll claim that he knew
all along, but that he'd made a deliberate decision to include those
mistakes, since any decent compiler ought to be capable of guessing
what he intended. Or something along those lines, anyway.

There was an amusing bit in a book (Tom Holt, I think) in which someone
does something which results in a massive object of some sort falling
on him. (Not fatal for various reasons.) From under it, a weak voice
calls out: "Okay, now who spotted my deliberate mistake."
Yes. But he claims a clean compile. This suggests that he is failing
to invoke his compiler correctly.

He might be looking only at errors, rather than warnings -- many IDE-based
compilers show you errors, but if it's just warnings all you get is a
tiny little icon somewhere changing color.

-s
 
