Ben Pfaff
SM Ryan said:Apparently it is okay to go off topic if you are in the Elect.
Apparently it is okay to quote 100 lines to top-post a snarky comment?
(e-mail address removed) said:
Well, for starters, either teach them how to program or tell them not to
apply for programming jobs.
Seriously, it's not my place to teach you how to teach. But IF IT WERE, then
I'd suggest the following:
Make sure they have a thorough understanding of an abstract machine, to the
extent that they can write non-trivial machine code programs in it.
Then, once they know all that, teach them programming from the clean end.
High level, in other words. You should find that they pick it up rather
easily, because they understand what's going on underneath.
However, when I teach
intro to programming at the local junior college, I have a "Machine
Language Game" to teach people how the computer works. I have a 16x16
board representing RAM, a whiteboard segmented into registers, a
one-line card for the display, and a stack of papers which tell how to
execute different command codes. I have one student be the data bus,
one student be the ALU, and one student be the instruction handler, and
one student be the graphics card. The instruction handler tells the
data bus to grab the next instruction. The data bus reads it from RAM
and comes back with the instruction and the next two bytes. The
instruction handler then looks up the first number in the instruction
code manual, and does what the manual says. The ALU does what the
instruction handler says, and the graphics card just watches some memory
and updates his output whenever it changes, based on the ASCII codes.
It really helps students figure out how the computer really works
underneath.
In all seriousness though, the fundamental teaching problem is when
and where to draw the line between detail and abstraction, and
persuade the student to choose the appropriate view. The lack
today of simple but complete systems, such as were built around the
8080, is a drawback. A stack is a detail, automatic storage is an
abstraction.
[ snip ]
[ snip repeat of the "Machine Language Game" description ]
Cool idea. And you say this works well in practice? Hm ...
Ben Pfaff said:I know some excellent computer scientists who are not
programmers. It is not what they do. Their papers read like
math papers. I could never do what they do, and I wouldn't try
to claim that I could.
For their part, I suspect that they wouldn't claim they are good
programmers either, at least not in languages like C. Perhaps
that's the real problem: people who claim to be what they are
not. But I wouldn't criticize computer scientists for not being
programmers, as long as they don't claim to be.
Cool idea. And you say this works well in practice? Hm ....
I had an adult Walmart stocking clerk who had attended my class by
mistake (she was supposed to be in the "how to use MS Word" class). She
decided to stay and learn to program... and did every bit as good a job
as the kid who played around with Linux in the off-time.
Because of the game/simulation, she felt she could actually understand
how it all worked together. That would likely be the main reason that
she was motivated.
Indeed. I would rather criticise someone who calls himself a
programmer and who isn't able to program a sub-quadratic sorting
algorithm without having to consult a textbook (or use a library
routine).
I can write a Quicksort function if I have to -- but I don't have to.
And if I tried it without consulting any references, it's likely the
resulting function would have some (probably subtle) bugs.
Richard Heathfield said:Keith Thompson said:
I think that's unlikely - in the sense that I am quite confident your
function would sort correctly (after a few muffled curses and a bit more
typing, obviously).
I think it's far /more/ likely that it would in fact be a SlowSort, and
that you could easily write a Shell sort to out-perform it.
Implementing Quicksort is pretty easy. Implementing it *efficiently* is a
fair bit harder.
I guess your basic argument is something like: efficiency is why
academics study such things. Programmers often only need concern
themselves with evaluating whether the algorithm is or is not in
conflict with NP-completeness.
When you define success that way, I doubt anyone could possibly
disagree.
When I first started out (auto industry) success was often being able to
reduce the cost of a part by 1 cent (US). Efficiency counted.
Richard said:David Lightstone said:
No, that's not what I'm saying at all. In fact, I was agreeing with the idea
that it's far better to write a call to qsort (possibly two minutes to bash
out a comparison function, and another twenty seconds to check the order of
the args in K&R) than to spend two hours getting Quicksort /wrong/. We all
[ snip ]
Yeah, but if saving a cent is success, then saving a dollar is a triumph.
The best way to make your sort Quick is to use the standard library's
sorting routine. That cuts down on writing time, debugging time, and run
time.
Richard said:The best way to make your sort Quick is to use the standard library's
sorting routine.
The truth of that is entirely implementation dependent.
pete said:The truth of that is entirely implementation dependent.
CBFalconer said:It is always possible to make quicksort go quadratic solely by the
appropriate input data. See:
SOFTWARE-PRACTICE AND EXPERIENCE, VOL. 29(4), 341-344 (1999)
A Killer Adversary for Quicksort
M. D. MCILROY
Dartmouth College, Hanover, NH 03755, USA
Richard Heathfield said:David Lightstone said:
No, that's not what I'm saying at all. In fact, I was agreeing with the idea
that it's far better to write a call to qsort (possibly two minutes to bash
out a comparison function, and another twenty seconds to check the order of
the args in K&R) than to spend two hours getting Quicksort /wrong/. We all
know that Quicksort is faster than Shell sort, so it isn't a question of
choosing which algorithm to use. But having coded up a Shell sort just for
the hell of it, and then a Quick sort because I'm-so-smart-I-can, I was
astounded to discover that my Shell sort consistently outperformed my Quick
sort over a large range of data set sizes - tiny, through mid, and right
the way up to huge. Obviously I got my Quick sort wrong. It sorted just
fine, but it took way too long to do it.
Googmeister said:No, that's not true. If you partition on the median element,
you can guarantee O(n log n) - admittedly finding the median
in linear time is not so practical. Alternatively, if you
partition on a random element (using a good source of
randomness), no fixed input will make it go quadratic.