Warning to newbies


spinoza1111

What were those fatal flaws?
It was thought that the proof might have an error.
[...]

Furthermore, nobody understood Dijkstra's point: that the proof could
be developed alongside the program.

except for you of course!
Yes.
Also, I'd add that a properly constructed software walkthrough is a
venue for the presentation of software correctness proofs in ordinary
language.

I'm not so sure about ordinary language. Mathematicians invented their
notations for a reason.

....and they write Journal articles in a combination of natural
language and formal language, not just formal language, for a reason.
Airbus don't prove their programs formally correct. Or they didn't
used to. Several UK comp.sci people got quite hot under the collar
about it.


I've seen more impressive joysticks on the average games machine.



I try to put those sorts of comments in. And there's assert(), the
ghost of formal methods.

In object oriented code I developed a methodology and documented it in
"Build Your Own .Net Language and Compiler" (Apress 2004). Each object
with state has an inspect() method which checks state values using
assertions. If any fails, the object marks itself unusable and
subsequently it refuses to do anything.
 

spinoza1111

[ snip ]
I have no respect, it is true, for Peter Seebach. Why? Because he's
bragged that he's never taken a computer science class in his life.

Why do you keep claiming that Seebs "bragged" or boasted about not
having taken CS classes? As best I can tell, you made a guess about
various people's backgrounds, in message ID


and he followed up to someone else's response, in message ID

<[email protected]>

as follows (slightly reformatted for readability):

He later wrote, in message ID

<[email protected]>





Where's the bragging or boasting here?

It's in the fact that he's unqualified, and to say that the absence of
this training is in any way a good thing is bragging and boasting and
false.

His humility is feigned. I detected this when he called me a "moron".
Rahm Emanuel has apologized for this type of language: Seebs has not.
He hurls invective, which is not humble, and deletes email
offering to discuss differences UNREAD.

He's in my opinion what Minnesota poet Robert Bly calls the soft male,
who is unable to face his shadow side. It was the reason he failed to
see that a disorganized laundry list of minor complaints could
seriously damage Herb Schildt's reputation unnecessarily, in a way
that was NOT a contribution to our understanding of C.
This aporia caused him to make absurd claims and
inferences, including "the 'heap' is a DOS term", forbidding Schildt
to explain runtime using a stack, and fantasizing that Windows and
Linux use different models for multitasking when both must use some
variant of a semaphore. Furthermore, he calls people, out of the blue
and without a previous history, vile names. He wants to be tolerated
as a gay man and as having a fashionable disease, but bullies and
disrespects others like a common fagbasher and mocks them as mentally
disordered in preference to answering their objections.

(Quoted to preserve context.  I'm rather impressed that Seebs doesn't
respond to this sort of thing.  I'm not sure I'd be able to resist
the temptation to defend -- something.)

[ snip ]
 

Seebs

interesting question.

It is.

FWIW: You can't.

First off, Moore's Law doesn't expand as fast as some particularly
expensive operations do. Secondly, even if it does expand with the right
big-O relation, it may not expand fast enough to do much good. Thirdly,
a whole lot of devices have issues like "battery life" to think about,
such that reducing CPU usage is still a very valuable endeavor.

In short, no matter how fast your CPUs are, 20% is still 20%.
we might be sad if it happens at an inappropriate moment!

Any World of Warcraft player can tell you how much that can matter!

Well, anyone who played back in the day...
I believe
modern garbage collectors are less "lumpy" than the old mark-and-sweep
ones. Perhaps another example of where efficiency matters!

Yes.

And even if it's less efficient in absolute CPU cycles, it's useful for
the certainty that you won't suddenly get a three-second lag in combat.
compilers aren't usually classified as real time.

I think that was his point, but I can't honestly say.
quite right. On the other hand he may not *ask* for it and still
expect it. "You never told me it would take an hour to reload the
system!"

I cannot suggest a circumstance in which it would NOT be an unalloyed good,
all else being equal. Which it might not be.

-s
 

Seebs

(Quoted to preserve context. I'm rather impressed that Seebs doesn't
respond to this sort of thing. I'm not sure I'd be able to resist
the temptation to defend -- something.)

A couple of points:

1. I'm not all *that* strongly autistic, but enough that I don't really
care whether people are "attacking" me or not.
2. I have Nilges plonked.
3. Even when I didn't have him plonked, I'd realized after about two tries
at communicating with him that he was hilarious. I went out of my way to
wave the autism thing at him to get him to spend weeks trying to insult me
with it, and the decision to mention the lack of a CS degree was made with
full awareness that he'd obsess on it for months.
4. If behavior is sufficiently predictable, and sufficiently stupid, it is
not meaningfully an attack.

Basically, if your mom's a drill sergeant, and another kid finds out that
your mom has combat boots, the ensuing cries of "Your mommy wears combat
boots!" are not a particularly emotionally significant thing. :)

-s
 

Tim Rentsch

Kaz Kylheku said:
[snip]

I believe that the optimization argument in favor of unspecified
evaluation orders is fairly weak to begin with, and that local
declarative mechanisms like restrict pointers basically squash it.

The vast majority of other people in the C programming community
don't share this opinion. Do you have anything to offer on the
subject besides your opinion?
 

Charlton Wilbur

NK> reasonable thing to do. Though if your requirement notation was
NK> rich enough to describe your problem you could argue you'd
NK> already written a program...

I think this is the core of the problem of software proofs.

When you have a sufficiently detailed spec, you can prove that the
software behaves correctly according to that spec.

This just shifts the problem from writing the program to writing the
spec; and when users complain "It doesn't do what it's supposed to!"
they rarely care whether the problem is in the code or in the spec.

Charlton
 

spinoza1111

    NK> reasonable thing to do. Though if your requirement notation was
    NK> rich enough to describe your problem you could argue you'd
    NK> already written a program...

I think this is the core of the problem of software proofs.

When you have a sufficiently detailed spec, you can prove that the
software behaves correctly according to that spec.

This just shifts the problem from writing the program to writing the
spec; and when users complain "It doesn't do what it's supposed to!"
they rarely care whether the problem is in the code or in the spec.

I for one think it's part of the problem that any time anyone tries to
speak with any precision in such a manner as to question social
relations in programming, the Urban Legend of the Users is unleashed
as a *deus ex demona*. The fact that many programmers are
overdominated by powerful users, many of whom in the financial
services force programmers to create fraudulent software, is an
unfortunate fact, not a mathematical argument.

In the *mythos*, the Users are orality and its revenge on writing, for
they must be "listened to" even if they screech nonsense like Harpies
and they must always be said to have the floor...be said to have
because this is a legend told around de campfire first and foremost.
Therefore, talking about this abstraction is an excellent way of
ending a conversation. It's a command to stop talking above the level
of the technopeasant.
 

spinoza1111

A couple of points:

1.  I'm not all *that* strongly autistic, but enough that I don't really
care whether people are "attacking" me or not.
2.  I have Nilges plonked.
3.  Even when I didn't have him plonked, I'd realized after about two tries
at communicating with him that he was hilarious.  I went out of my way to
wave the autism thing at him to get him to spend weeks trying to insult me
with it, and the decision to mention the lack of a CS degree was made with
full awareness that he'd obsess on it for months.
4.  If behavior is sufficiently predictable, and sufficiently stupid, it is
not meaningfully an attack.

It's plain to me from this that you grew up in a dysfunctional and
female-dominated world, since you make yourself absurd by trying to
map what is on to what you know.
Basically, if your mom's a drill sergeant, and another kid finds out that
your mom has combat boots, the ensuing cries of "Your mommy wears combat
boots!" are not a particularly emotionally significant thing.  :)

-s

News flash. If you killfile a person, you need to refrain from
discussing that person. Otherwise you're like some creep at a party
who talks loud and offensively about a person to others.
 

spinoza1111

It is.

FWIW:  You can't.

First off, Moore's Law doesn't expand as fast as some particularly
expensive operations do.  Secondly, even if it does expand with the right
big-O relation, it may not expand fast enough to do much good.  Thirdly,

You should stay out of this conversation, since that's what
"killfiling" means.

Furthermore, your comment is nonsense, and I don't think you
understand computational complexity, for if "it" expands with "the
right big O relation", then it follows that this is fast
enough...although in fact, we shouldn't rely on Moore's Law to justify
the use of mathematically slow algorithms, and that was not implied by
the original post. We weren't discussing buying faster computers so as
to code bubble sort, we were discussing running effective algorithms
faster. You seem to me unqualified to join this conversation, since
like many programmers educated only on the job, you're applying an
oversimplified and normative picture to a science.

You read in the Awesome Computer Journal of Awesome Computers about
computational complexity, and decided that anyone who doesn't strike
the appropriate corporate pose, required for employability, is a Bad
person, because he's not Awesome; for less than Awesomeness is
unacceptable given the inner contour of everyone's secret wound. Thus,
Chris T could not have been a real professional talking about a real
problem as opposed to someone finding bugs and writing scripts,
therefore you're trying to distort the conversation into your defense
of Awesome Computing and caricature people who you don't understand or
like as people who would defend something other than Awesome
Computing.

Get a life.
a whole lot of devices have issues like "battery life" to think about,
such that reducing CPU usage is still a very valuable endeavor.

In short, no matter how fast your CPUs are, 20% is still 20%.

True overall. But not what is under discussion, and you don't belong
here if you've killfiled people in the discussion.
Any World of Warcraft player can tell you how much that can matter!

Computer games and TV ("Baldrick") debase grownup discussions IMO.
Well, anyone who played back in the day...


Yes.

And even if it's less efficient in absolute CPU cycles, it's useful for
the certainty that you won't suddenly get a three-second lag in combat.

(Sigh). (Eye roll). (Crotch grab). (Pantomime of shooting oneself in
the head, including mime of the exit spray of blood and brains, an
important part of any proper pantomime of this action).
I think that was his point, but I can't honestly say.

Blow me.
 

Seebs

I know of no Usenet convention to that effect.

Me neither.

And I've been doing this since before the Great Renaming.
I'd add an ObC, only I can't think of one.

I'll contribute one.

I recently wanted to write something which would derive a string value,
then substitute it into another string. Actually, into several other strings.

The problem is that it might be substituted several times, so I couldn't
just do something like:

char *replace = "replace_here:%s";

and use sprintf.

(In this particular case, the values are "environment variables", but that's
a detail of no particular relevance to C.)

My solution, which is a bit hacky, but I sorta liked:

struct { char *name, *template; } new_environs[] = {
{ "PERL5LIB", "%s/lib64/perl5/5.10.0/i686-linux:%s/lib64/perl5/5.10.0:%s/lib64/perl5/site_perl/5.10.0/i686-linux:%s/lib64/perl5/site_perl/5.10.0:%s/lib64/perl5/vendor_perl/5.10.0/i686-linux:%s/lib64/perl5/vendor_perl/5.10.0:%s/lib64/perl5/vendor_perl:." },
{ "LD_LIBRARY_PATH", "%s/lib64:%s/lib:%s/lib64" },
{ 0, 0 }
};

Then, down below:

        int i;
        char *s;

        for (i = 0; new_environs[i].name; ++i) {
          char *new_environ;
          char *t;
          char *u;
          size_t len = strlen(new_environs[i].template) + 5;
          for (s = new_environs[i].template; (t = strchr(s, '%')); s = t + 1) {
            len += strlen(parent_path);
          }
          new_environ = malloc(len);
          u = new_environ;
          *u = '\0';
          for (s = new_environs[i].template; (t = strchr(s, '%')); s = t + 2) {
            u += snprintf(u, (len - (u - new_environ)), "%.*s%s",
                          (int)(t - s), s, parent_path);
          }
          snprintf(u, (len - (u - new_environ)), "%s", s);
          setenv(new_environs[i].name, new_environ, 1);
          free(new_environ);
        }

The "setenv" function is a Unixism, which, given a name and a value, makes
that name/value pair available to future execution.

The calculations are very careless; my goal was not to calculate exactly
the right length, but rather, to calculate a length which could be easily
shown to be definitively long enough.

I am, however, pretty happy with this part:

          for (s = new_environs[i].template; (t = strchr(s, '%')); s = t + 2) {
            u += snprintf(u, (len - (u - new_environ)), "%.*s%s",
                          (int)(t - s), s, parent_path);
          }

Note that I don't check, in this version, whether it's "%s" or some other
character after a %. This was one of those "we need something within half
an hour that will work for this circumstance, we can clean it up tomorrow"
jobs.

-s
p.s.: Side note for Unix users: Yes, it's okay to free the argument to
setenv after passing it in, unlike putenv().
 

Richard Tobin

spinoza1111 said:
It's plain to me from this that you grew up in a dysfunctional and
female-dominated world

You're starting to sound like Andrew Usher.

-- Richard
 

Flash Gordon

Not really. The only way to shut down such an appliance is brutally,
by turning the power off.

No, many programs have just this form, but shut down cleanly when
requested to. Look up the exit() function.

Many other programs deliberately have no termination under any
conditions (and a watchdog to reboot the system if they lock somehow).
Any state which needs to be preserved between power cycles is always
kept up to date in an appropriate manner which, like databases and
modern file systems, is resilient to power failures during the update.
 

Michael Foukarakis

On Feb 4, 4:08 pm, Nick Keighley <[email protected]>
wrote:
From what I remember, proving a given program correct was equivalent

That is not correct. The Halting Problem is a proof that a Turing
Machine cannot be constructed that will be able to detect whether any
other TM will halt. Whereas programs can be proven correct (for much
the same reason we proved the negative truth of the Halting Problem).

It's amazing that in "break room [or Hooters] bullshit sessions",
programmers have in my experience regularly confused "writing a proof
that a program is correct" with "writing a program that can determine
algorithmically whether another program is correct". The latter is an
instance of the Halting problem. The former is applied mathematics.

Indeed, there is (probably?) no such thing as a correct program that
cannot be proven correct, especially if we're mathematical
intuitionists or pragmatists. If the "halting problem" on the Toyota
is traced to software then program correctness proving may get new
life.

[I really miss the late Dik Winter, familiar to many readers here.
He'd jump me using a Dutch rule: if you speak of things that come from
Holland, such as Intuitionism, you'd better be Dutch. Otherwise you
probably don't understand them. A variant in fine of what
Shakespeare's Captain MacMorris says in Henry V: what is my nation?]

For all sorts of correctness proofs of programs see the Edsger W
Dijkstra archive: http://www.cs.utexas.edu/users/EWD.

"Beware of 'the real world'. A speaker's appeal to it is always an
invitation not to challenge his tacit assumptions. When a speaker
refers to the real world, a useful counterploy is pointing out that he
only refers to HIS perception of it. We find it nowadays hyphenated in
'real-world problems'; please remember that those are typically the
ones we are left with when we have refused to apply their effective
solutions."

Edsger W Dijkstra, 1982
to solving the halting problem. Nevertheless, a team of researchers
some time ago (under Gernot Heiser, iirc) claimed to have proven a
7.5k LOC kernel formally correct at about 250 LOC per man-year. Linux
& Windows probably have more than 50 MLOC, which would take something
like 200,000 years to verify, assuming no increased complexity from,
well, the increased complexity.

This reasoning is absurd. Dijkstra had a lot of laughs at its expense:

"We have shown her representatives what measures we had taken, such as
our ISAPS (Interactive Semi-Automatic Proof System) and even the more
impressive TDF (Theory Development Facility). The first one enables an
average mathematician to increase his proof productivity from 30 to 60
lines a month, the second one allows a mathematician with a minimum of
skills to produce up to a 100 times as much text as he has keyed in."

The silliness is imposing bureaucracy, bullshit and bullying on
software development. If you force the programmers to check in a
working, if at first minimal, system at 5:00 PM every day, along with an
informal correctness proof in the form of minutes of a 4:00 PM
structured walkthrough, you'd have your OS in about the same amount of
time...simply by not dividing the work of proving from that of coding,
an alienation which is based on a silly respect for specialization and
a contempt for humanity.

You refuse to acknowledge a simple joke (assuming you're not anal
enough not to understand it). Let's try a simpler one:

Blow me.
 

Nick Keighley

I think that was his point, but I can't honestly say.

I think you're right. I think I misunderstood him.
I cannot suggest a circumstance in which it would NOT be an unalloyed good,
all else being equal.  Which it might not be.

You probably know this but considering performance an "unalloyed good"
can lead to the "micro-optimisation trap". I've been guilty of
fiddling with things which may have been good enough. I've also been
on a customer's site when a previously working bit of code broke
because somebody had "optimised" it (though that implies our process
was broken as well).
 

Nick Keighley

[...] your comment is nonsense, and I don't think you
understand computational complexity, for if "it" expands with "the
right big O relation", then it follows that this is fast
enough...

what's the "right" big O notation?

although in fact, we shouldn't rely on Moore's Law to justify
the use of mathematically slow algorithms, and that was not implied by
the original post. We weren't discussing buying faster computers so as
to code bubble sort, we were discussing running effective algorithms
faster.

not all problems have a known "good" (polynomial) algorithm. Think NP
complete (travelling salesman etc.). I gave you a bunch of examples in
an earlier post of areas where Moore's Law won't bail them out. In
general waiting for N Moore's law doubling just isn't good enough.
There's a reason cell phones didn't exist in 1975.


good point

we might be sad if [garbage collection] happens at an inappropriate moment!
Any World of Warcraft player can tell you how much that can matter!

Computer games and TV ("Baldrick") debase grownup discussions IMO.

games are a commercially important application. If you don't call that
grownup we'll have to agree to differ.

Even if you don't like games you can think of them as simple (not so
simple sometimes) physical simulations. Anything written about the
games problem probably applies to many problems in the physics
community as well. Those graphical accelerators get used for all sorts
of odd stuff.

<snip>
 

Seebs

You probably know this but considering performance an "unalloyed good"
can lead to the "micro-optimisation trap". I've been guilty of
fiddling with things which may have been good enough. I've also been
on a customer's site when a previously working bit of code broke
because somebody had "optimised" it (though that implies our process
was broken as well).

Thus the "all else being equal".

If improved performance has a cost, which it probably usually does, then
it may not be worth it. Performance *itself* is an unalloyed good, but
achieving it through things (such as less readable code) may not be.

-s
 

spinoza1111

[...] your comment is nonsense, and I don't think you
understand computational complexity, for if "it" expands with "the
right big O relation", then it follows that this is fast
enough...

what's the "right" big O notation?
although in fact, we shouldn't rely on Moore's Law to justify
the use of mathematically slow algorithms, and that was not implied by
the original post. We weren't discussing buying faster computers so as
to code bubble sort, we were discussing running effective algorithms
faster.

not all problems have a known "good" (polynomial) algorithm. Think NP
complete (travelling salesman etc.). I gave you a bunch of examples in

Traveling salesman has a "good" algorithm. It's called simulated
annealing.
an earlier post of areas where Moore's Law won't bail them out.  In
general waiting for N Moore's law doubling just isn't good enough.
There's a reason cell phones didn't exist in 1975.

I worked at Motorola in 1979. Years later, a Siemens engineer told me
that the reason Motorola didn't have a marketable cellphone in 1975
was that under Nixon, the FCC was so embroiled in politics, as a
consequence of Vietnam and Watergate, that it would not release
frequencies needed to implement cellphones.
good point

we might be sad if [garbage collection] happens at an inappropriate moment!
Any World of Warcraft player can tell you how much that can matter!
Computer games and TV ("Baldrick") debase grownup discussions IMO.

games are a commercially important application. If you don't call that
grownup we'll have to agree to differ.

I don't. I don't call it grown-up when grown men ruin their health and
lives meeting deadlines at Electronic Arts to release another version
of John Madden football. I don't call it grown-up when the US Army
uses a video game to recruit kids.
 
 
