In the matter of Herb Schildt: the question of a "bad" book


BruceS

Yes. No. I appreciate your input.

FYI, even the C152 I flew had a pitot tube. It was not what we call
fly-by-wire, though there were wires between the controls and the
control surfaces. This was a pretty primitive aircraft.

As for DO-200A, you might be interested in reading up on it for
reasons why languages with garbage collection are avoided in critical
systems.
 

spinoza1111

FYI, even the C152 I flew had a pitot tube.  It was not what we call
fly-by-wire,

Didn't say it was.
though there were wires between the controls and the
control surfaces.  This was a pretty primitive aircraft.

As for DO-200A, you might be interested in reading up on it for
reasons why languages with garbage collection are avoided in critical
systems.

OK. However, there seems to be a superstition that software cannot be
proven correct or predicted to stay within a certain performance
bound.
 

Seebs

C99 of course also allows the above. Your error is reasonable, as code
with declarations in inner blocks is quite unusual (in my experience).

Note that the block needn't be associated with a control structure:

int foo(void) {
        int i;

        i = 3;
        {
                int j;
                j = 4;
                printf("%p\n", (void *) &j);
        }
        /* j is no longer in scope here */
        {
                int k;
                k = 4;
                printf("%p\n", (void *) &k);
        }
        return 0;
}

There certainly exist compilers on which &j and &k will be the same
value.

-s
 

BruceS

Didn't say it was.

You seemed to be implying a connection between a craft having a pitot
tube, and it being fly-by-wire, so I was simply pointing out that
there is no such connection.
OK. However, there seems to be a superstition that software cannot be
proven correct or predicted to stay within a certain performance
bound.

I haven't seen that, but I do know that there's a problem with software
that uses GC being provable with regards to timing and behavior. IMO,
the only reason to have GC is that too many programmers are too lazy
and/or incompetent to manage memory usage.
 

Keith Thompson

Seebs said:
Note that the block needn't be associated with a control structure:

int foo(void) {
        int i;

        i = 3;
        {
                int j;
                j = 4;
                printf("%p\n", (void *) &j);
        }
        /* j is no longer in scope here */
        {
                int k;
                k = 4;
                printf("%p\n", (void *) &k);
        }
        return 0;
}

There certainly exist compilers on which &j and &k will be the same
value.

Yes, but you can't prove it without invoking undefined behavior.
(semi-smiley)

Of course, you can convert the addresses to uintptr_t (if it
exists), store the results in objects in an enclosing scope, and
compare them that way. Or you can print each one with printf("%p",
...) while it's in scope and compare the output (as you've done
here). Or you can just compare the pointers and not worry about it;
the undefined behavior is unlikely to cause any real problems.

Unless of course the compiler figures out that you're comparing addresses
of distinct objects and decides to assume that they can't possibly be
equal.
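
A minimal sketch of that uintptr_t route, assuming the implementation
actually provides uintptr_t in <stdint.h> (it's optional), might look
like this:

#include <stdio.h>
#include <stdint.h>

int main(void) {
        uintptr_t pj, pk;

        {
                int j;
                pj = (uintptr_t) (void *) &j;   /* record the address while j exists */
        }
        {
                int k;
                pk = (uintptr_t) (void *) &k;   /* likewise for k */
        }
        /* pj and pk are ordinary integers now, so comparing them is well defined */
        printf("j and k %s the same address\n", pj == pk ? "shared" : "did not share");
        return 0;
}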

But yes, j and k can certainly be stored in the same memory location.
 

Tim Streater

BruceS said:
I haven't seen that, but I do know that there's a problem with software
that uses GC being provable with regards to timing and behavior. IMO,
the only reason to have GC is that too many programmers are too lazy
and/or incompetent to manage memory usage.

Certainly, with careful design there should be no issues with managing
memory yourself in C. Again, referring to the portable apps I wrote 20
years ago: one of them had about 5k page faults initially, and then none
during the subsequent 9 months it typically ran (until the next site-wide
power outage). Setting the app's memory parameters under VMS so that it
was allocated adequate memory right from the start meant that monitoring
it for leaks was very easy: just check the page fault count.

These days I'm too lazy to do that so (using PHP and JavaScript) I'm
happy to let someone else worry about memory.
 

Dan Henry

since McGraw Hill
intended to sell to a Microsoft audience.

That audience was not stated when I paid^H^H^H^H was robbed of my
money for the first edition.

Evidence please.
 

Seebs

That audience was not stated when I paid^H^H^H^H was robbed of my
money for the first edition.

The book clearly states that it's applicable to all C environments.
As is usually the case, Nilges is asserting things which would make him
feel good if they were true. I recommend that you not waste time trying
to bother him with facts; he's transcended such things.

-s
 

spinoza1111

The book clearly states that it's applicable to all C environments.
As is usually the case, Nilges is asserting things which would make him
feel good if they were true.  I recommend that you not waste time trying
to bother him with facts; he's transcended such things.

It's true that I have lost interest in reified "facts". Those of us
(like Schildt and myself) who have constructed a compiler/interpreter
are, in "fact", quite different from people like Seebach, since not
having had the opportunity to see how "facts" are HUMAN CONSTRUCTIONS
creates an excessive respect for them.

If the flaws in the book were obvious and it was not shrink-wrapped,
the OP was responsible for looking through it.
 

spinoza1111

Note that the block needn't be associated with a control structure:

        int foo(void) {
                int i;

                i = 3;
                {
                        int j;
                        j = 4;
                        printf("%p\n", (void *) &j);
                }
                /* j is no longer in scope here */
                {
                        int k;
                        k = 4;
                        printf("%p\n", (void *) &k);
                }
                return 0;
        }

There certainly exist compilers on which &j and &k will be the same
value.

I'll alert the media. If you knew about stacks, you'd know that they
will almost always have the same value.

Pretty sloppy code example, worthy of the treatment you dished out to
Schildt:

* The declaration and assignment of i are unnecessary and add nothing
to your exposition
* Likewise for the assignments of j and k (see the trimmed sketch below)
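
A trimmed sketch along those lines, keeping only what the block-scope
demonstration needs (it still assumes <stdio.h> has been included for
printf; this is an illustration, not code from Seebs's post):

int foo(void) {
        {
                int j;
                printf("%p\n", (void *) &j);    /* address of j while it's in scope */
        }
        {
                int k;
                printf("%p\n", (void *) &k);    /* k may well reuse j's slot */
        }
        return 0;
}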
 

Dan Henry

The book clearly states that it's applicable to all C environments.

... and says nothing about Nilges' lie of a Microsoft-intended
audience.

When I pulled my copy to verify the above, I saw this little gem
taunting me on the back cover's bottom right corner, just above the
$39.95 (USD):

"Get Answers -- Get Osborne"
"For Accuracy, Quality and Value."

0 out of 3.
 

Nick Keighley

Not really in Pascal.  Variable declarations must go at the top of a
function.  It is true that this syntactic concept is called a "block",
but that's just an accident of naming.  The compound statements
delimited by begin and end don't constitute blocks in that sense and so
can't contain variable declarations.

ah! Serves me right for relying on memory! thanks.
 

spinoza1111

... and says nothing about Nilges' lie of a Microsoft-intended
audience.

When I pulled my copy to verify the above, I saw this little gem
taunting me on the back cover's bottom right corner, just above the
$39.95 (USD):

       "Get Answers -- Get Osborne"
    "For Accuracy, Quality and Value."

0 out of 3.

Never give a sucker an even break. You see, if you insist on treating
books as tools, and if you won't enter into a conversation with a book
and critically read it, taking what you need, leaving the rest, then
you're gonna get taken in by an ad.
 

Malcolm McLean

IMO,
the only reason to have GC is that too many programmers are too lazy
and/or incompetent to manage memory usage.

Back in the 1970s I remember reading a science fiction story about two
sides who were fighting a war, each trying to outsmart the others with
better computers. Victory came when one of the good guys produced
pencil and paper. "A paper computer?" said the good guy's commander.
"No," said the good guy, and proceeded to write down columns of
figures.

The story is a bit dated. What starts as a crutch for the lazy rapidly
turns into something that exceeds all human capabilities. No human can
perform 2 billion floating point calculations a second. Equally,
there's an upper limit on the complexity of a memory system that can
be managed by hand.
 

BruceS

Back in the 1970s I remember reading a science fiction story about two
sides who were fighting a war, each trying to outsmart the others with
better computers. Victory came when one of the good guys produced
pencil and paper. "A paper computer?" said the good guy's commander.
"No," said the good guy, and proceeded to write down columns of
figures.

The story is a bit dated. What starts as a crutch for the lazy rapidly
turns into something that exceeds all human capabilities. No human can
perform 2 billion floating point calculations a second. Equally,
there's an upper limit on the complexity of a memory system that can
be managed by hand.

True, but you can generally split resource management into two
categories: things that should be released within the same function as
they're allocated, and those that add to the resources "owned" by the
function. The first are very easy to clean up, though that seems to
be where most leaks occur. The second category just moves the problem
up a level: the calling function then has the same issue of whether
to manage the memory internally. Even with interrupt-driven code that
allocates resources, there should be another part that frees those
resources. I spent some time finding and fixing leaks in an
application years ago, and very few were the result of the
application's complexity. Almost all were obvious where the release
should have been. In languages with GC, it seems that programmers end
up spending a bit of time trying to goose the GC to run. For the most
part, I think it's cleaner and easier to just manage the memory
yourself, and be diligent about not leaving pieces behind when you're
through with them.
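
A rough C sketch of those two categories (the function names here are
invented for the illustration):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Category 1: the resource lives and dies inside one function. */
static void print_doubled(const char *s) {
        size_t len = strlen(s);
        char *buf = malloc(2 * len + 1);        /* allocated here...        */
        if (buf == NULL)
                return;
        memcpy(buf, s, len);
        memcpy(buf + len, s, len + 1);          /* copies the trailing '\0' */
        puts(buf);
        free(buf);                              /* ...and released here     */
}

/* Category 2: the resource is handed to the caller, who then owns it. */
static char *make_copy(const char *s) {
        char *copy = malloc(strlen(s) + 1);
        if (copy != NULL)
                strcpy(copy, s);
        return copy;                            /* caller must free() this  */
}

int main(void) {
        print_doubled("ab");                    /* nothing for us to clean up  */

        char *owned = make_copy("hello");       /* ownership moves up a level  */
        if (owned != NULL) {
                puts(owned);
                free(owned);                    /* ...so the cleanup moves too */
        }
        return 0;
}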
 

Richard Bos

Malcolm McLean said:
Back in the 1970s I remember reading a science fiction story about two
sides who were fighting a war, each trying to outsmart the others with
better computers. Victory came when one of the good guys produced
pencil and paper. "A paper computer?" said the good guy's commander.
"No," said the good guy, and proceeded to write down columns of
figures.

Doctor Asimov, OTOH, went one better than that. At least twice.

Richard
 

spinoza1111

spinoza1111 wrote:

You clearly don't understand the problem from either an
application point of view or compiler point of view. For this
kind of code resource management *needs* to be
predetermined before execution. There are several posts
that have explained this in small words.

It usually "needs" to be predetermined because you haven't done your
homework, and are incapable of reasoning beyond determinism.
Failure to allocate will result in an unresolved fault.
   - Hal to captain: you have a problem.

Garbage collection results in indeterminate timing.
   - I might do this now but then again I might not

Yes, and YOU DON'T CARE. If you do care, then you have an unnecessary
precondition.
The reliability of allocation and garbage collection are
added as series terms in the system reliability.

This is untrue and crude. It assumes that each line of code adds a
constant value to the inverse of reliability. While this is true for
the average incompetent programmer, it need not be true in general.

It generates the idiotic argument that "my code should not check for
errors because the additional lines represent a risk".
   - The most reliable allocator possible will still
     make the runtime code less reliable than
     allocation at system build time.
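
(For reference, the series model that the "series terms" claim appeals
to multiplies component reliabilities rather than adding anything:

        R_{\text{system}} = \prod_i R_i ,
        \qquad\text{e.g.}\quad 0.999 \times 0.999 \approx 0.998 ,

so adding a component can only lower the total.)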

Look up some references on mechanical reliability then apply
the analysis to software application source code. As a starting

I'd rather not, since software isn't physical. Furthermore, I'd rather
not accept input from mechanical engineers given the incompetence of
British Petroleum engineers.

The problem started when instead of music, mathematics and philosophy
majors, American managers started hiring as programmers the sort of
grease monkeys and Mama's boys who brag that they got D in high school
English and never took computer science. These clowns are the source
of one of the most idiotic programming metaphors: the computer as car.

point make some gross assumptions like every generated
instruction has the same reliability. You won't be able to tell

Which is insane.
when a program will fail but you will be able to compare two
applications that are programmed to have the same functionality
and be able to predict which of them is most likely to fail.

This exercise makes much better programmers.

No, it creates managers.
 
