What is different with Python?

Andrea Griffini

I fail to see the relationship between your reply and my
original message. I was complaining about the illusion that
in the old days people were more interested in programming
than they are now. Your reply, instead, is about low-level
languages being more suitable for beginners than high-level
languages. I don't see the connection.

I've been told in the past that one reason it's good to
start from high-level languages is that you can do more
with less. In other words, I've been told that showing a
nice image and maybe some music is more interesting than
just making an LED blink. But if this is not the case
(because only 1% are interested in those things no matter
what), then why start from high level at all?

I would say (indeed I would *hope*) that 1% is a low
estimate, but probably I'm wrong, as others with more
teaching experience than me agree with you. Having more
experience than me in teaching programming is a very easy
bar to clear... I never taught anyone except myself. About
the 1%: I have two brothers, and one of them got hooked on
programming before I did... the other never got interested
in computers and is now just a basic (no macros) MS Office
user.

So in my case it was about 66%, and it all started with a
programmable pocket RPN calculator... but there were no
teachers involved; maybe this is a big difference.

Andrea
 
Andrea Griffini

And the fact that he's teaching C++ instead of just C seems
to go against your own theories anyway... (though I realize
you weren't necessarily putting him forth as support for
your position).

He strongly advocates starting from high level;
comp.lang.c++.moderated is where I first posted on this
issue.

While I think that Python is not a good first language, C++
is probably the *worst* first language I can think of.

C++ has so many traps, asymmetries and ugly parts (many for
backward compatibility) that I would say one should try to
put aside logic when learning it and just absorb the facts;
in many respects C++ is the way it is for historical
reasons or inexplicable accidents: IMO there's simply no
way someone can deduce those using logic, no matter how
smart s/he is. C++, IMO, must be learned by reading...
thinking is pointless and in a few places even dangerous.

Also, given the C/C++ philosophy of "the programmer always
knows perfectly what he is doing", experimenting is
basically impossible; trial and error doesn't work because
in C++ there is no error: you have undefined-behaviour
daemons instead of runtime-error angels. Add to the picture
the quality of compile-time error messages from the
primitive template technology, and even compile-time errors
often look like riddles; if you forget a "const" you don't
get "const expected"... you get two screens full of insults
pointing you into the middle of a system header.

Thinking of some of its bad parts, it's quite shocking that
C++ is good for anything, but indeed it does work, and it
can be better than C. I think C++ can be a great tool (if
you understand how it works, i.e. if it has no magic at all
for you) or your worst nightmare (if you do not understand
how it works).

I think that using C++ as the first language for someone
learning programming is absurd. Francis thinks otherwise.

Andrea
 
Roy Smith

Andrea Griffini said:
Add to the picture the quality of [C++] compile-time error
messages from the primitive template technology, and even
compile-time errors often look like riddles;

Yeah, but what they lack in quality, they make up for in quantity.
if you forget a "const" you don't get "const expected"...
you get two screens full of insults pointing you into the
middle of a system header.

Python and C++ complement each other quite nicely. For example, one
of the first things I did when I was learning C++ was to write a
Python program which parsed and re-formatted C++ compiler error
messages so they were easier to read :)
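Not the actual program, but a minimal sketch of that kind of
filter could be as simple as this (the long spelling of
std::string below is the classic g++ one; other compilers
word it differently, and the file names are hypothetical):

    import sys

    # g++ expands std::string to its full template form in
    # error messages; folding it back makes them readable.
    LONG_STRING = ("std::basic_string<char, "
                   "std::char_traits<char>, "
                   "std::allocator<char> >")

    for line in sys.stdin:
        sys.stdout.write(line.replace(LONG_STRING, "std::string"))

    # usage: g++ broken.cpp 2>&1 | python unmangle.py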
 
Michele Simionato

Andrea said:
Why hinder?

Suppose you have to accomplish a given task using a framework
which is unknown to you. The manual is 1000 pages long.
In order to get the job done, it is enough to study 50 pages
of it. There are people with the ability to figure out very
quickly which are the relevant 50 pages and ignore the other
950. Granted, these people will have a shallow knowledge with
respect to somebody studying the whole manual, but they
will get the job done much faster and in some circumstances
speed is more valuable than deep knowledge.
To be able to content oneself with shallow knowledge
is a useful skill ;)

Michele Simionato
 
Andrea Griffini

....
To be able to content oneself with shallow knowledge
is a useful skill ;)

Ah! ... I agree. Currently, for example, my knowledge
of Zope is pretty close to 0.00%, but I'm using it
and I'm happy with it. I did what I was asked to do,
and it took way less time than hand-writing the CGI
stuff would have required. Every single time I have
to touch those scripts I have to open the Zope book
to get the correct method names. But I'd never dare
to call myself a Zope developer... with it I'm just
at the "hello world" stage, even if I accomplished
what would otherwise require a lot of CGI expertise.
But I remember once running into a problem: there was
a file of about 80MB uploaded into the Zope database
that I wasn't able to extract. I was simply helpless:
the download always stopped around 40MB without any
error message. I wandered on IRC for a day, finding
only other people that were better than me (that's
easy) but not good enough to help me.
In the end someone gave me the right suggestion: I
just installed a local Zope on my PC, copied the
database file, extracted the file from the local
instance and, don't ask me why, it worked.
This very kind of problem solving (just try doing
stupid things without understanding until you get
something that looks like it's working) is what I hate
*MOST*. That's one reason why I hate Windows
installation/maintenance; it's not an exact science,
it's more like try and see what happens.
With programming, that is something that IMO doesn't
pay in the long run.
I'm sure that someone who really knows Zope would
have been able to get that file out in a minute,
maybe by doing exactly what I did.
But knowing why! And that is a big difference.

Indeed, when talking about whether learning "C" can
hinder or help learning "C++", I remember thinking
that to learn "C++" *superficially*, learning "C"
first is surely pointless and can even hinder.
But to learn "C++" deeply (with all its quirks) I
think that learning "C" first helps.

So maybe this better explains my position: if you want
to become a "real" programmer, one that really has things
under control, then learning a simple assembler first
is the main path (OK, maybe even a language like C
can be a reasonable start, but even in such a low-level
language there are already so many things that are
easier to understand if you really started from bytes).

However, to be able to do just useful stuff with a
computer you don't need to start that low; you can
start from Python (or, why not, even Dreamweaver).

Andrea
 
Peter Hansen

D said:
So you say he "has done relatively little serious development" and that
he may not even know about Python. I didn't see any evidence from those
pages to draw either conclusion. In fact, the 4th paragraph quite
contradicts them both.

Clearly this is a matter of opinion. Now that you've expressed yours,
did you have a point to make besides that you like to contradict my
posts? Maybe you'd like to take the opportunity to mention Boo?

-Peter
 
D H

Peter said:
Clearly this is a matter of opinion.

You're kidding, right? Did you even read the about page you
cited? The guy has been doing C++ development for decades,
he wrote a book on it, and yet you say he has done "little
serious development"??? That's absurd. And he was the
president of ACCU; if you look at the ACCU page, at the
very top there is a mention of Python. And yet you
suggested that he hasn't even heard of Python. Again,
absurd.
 
Rocco Moretti

Andrea said:
Indeed, when talking about whether learning "C" can
hinder or help learning "C++", I remember thinking
that to learn "C++" *superficially*, learning "C"
first is surely pointless and can even hinder.
But to learn "C++" deeply (with all its quirks) I
think that learning "C" first helps.

I think you are mistakenly bringing order into the picture,
when extent is more likely the issue. If you want to master
C++, I think most would agree you need to understand C. But
there are many who would disagree that the path to C++ must
*start* at C. (In fact, many people argue that a lot of bad
C++ is due to people programming C in C++.) Instead they
would argue that you should start by learning C++
"superficially", then learn C, and re-evaluate your C++
practices in light of the lessons learned from C.

The example I'll pull out is natural languages - I
understood the grammar & construction of my native tongue
*much* better after learning a foreign language. From
people I've talked to, this is a common occurrence. But few
people would advocate that one should learn a foreign
language before learning one's native tongue.
 
Mike Meyer

Andrea Griffini said:
Whoops.

So you know assembler; no other possibility, as it's such
a complex language that unless someone already knows it
(and on the specific architecture) what I wrote is pure
line noise.

You studied it after Python, I suppose.

Nope. I don't think I've learned any assemblers since I learned Python.
Of course, I'd been writing code for 20 years before I learned Python.
In assembler the details are simply more explicit.
Unfortunately with computers you just cannot avoid details,
otherwise your programs will suck badly. When I write in a
high-level language, or even a very high-level one, the
details are understood even if I'm not writing them down.
After a while a programmer will even be able to handle them
at a subconscious level, and e.g. just by looking at O(N^2)
code that could easily be rewritten as O(N) or O(1), a
little bell will ring in your brain telling you "this is
ugly". But you cannot know if something is O(1) or O(N) or
O(N^2) unless you know some detail. If you don't like
details then programming is just not the right field.

I've never argued otherwise.
Think that "a = b + c" computes the sum of two real
numbers and your program will fail (expecting, how foolish,
that adding 0.1 ten times gives you 1.0) and you'll spend
some time wondering why the plane crashed... your code
was "correct" after all.

Especially if b and c aren't floats. I've always used
"real" as a mathematical term, since mathematicians had it
first. Computers don't deal with reals.
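Which is easy to demonstrate; a minimal example:

    # 0.1 has no exact binary floating-point representation,
    # so ten additions accumulate a small error.
    total = 0.0
    for _ in range(10):
        total += 0.1

    print(total == 1.0)   # False
    print(repr(total))    # 0.9999999999999999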
To use that I have to understand which registers will be
affected and how ugly (i.e. inefficient) the code could
get. Programming in assembler using such a high-level
feature without knowing those little details would be
just suicidal.

The assembler lets you specify which registers to use. You
either name them in place of variables, or use variables
that are labels for the registers.
But by saying, for example, that

del v[0]

just "removes the first element from v", you will end up
with programs that do that in a stupid way; actually you
can easily get unusable programs, and programmers that
go around saying "Python is slow" for that reason.

That's an implementation detail. It's true in Python, but isn't
necessarily true in other languages.

Yeah. And you must know which is which. Otherwise you'll
write programs that just do not give the expected result
(because the user killed them earlier).

Actually, it isn't always true in Python. What if v is a
dictionary (in which case the description is wrong), or a
class that maps an SQL table's row IDs to objects holding
the data for that row? In either case, the statement will
be O(1).

You do need to know which is which.
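A rough timing sketch shows the difference for a plain list
(absolute numbers will vary by machine):

    import time

    def drain_front(v):
        # del v[0] shifts every remaining element, so it is
        # O(n) per deletion and O(n^2) to empty the list.
        while v:
            del v[0]

    for n in (10000, 20000, 40000):
        v = list(range(n))
        t0 = time.perf_counter()
        drain_front(v)
        # the time roughly quadruples each time n doubles
        print(n, time.perf_counter() - t0)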
I think that a *decent* programmer must understand whether
the code being written is roughly O(n) or O(n^2). Without
at least that, the possibility of writing useful code,
excluding maybe toy projects, is a flat zero.
Looking that information up later may be just "too" late,
because the wrong data structure has already been used
and nothing can be done (except rewriting everything).

I don't think those two statements contradict each other. A decent
programmer will know the O() of the code they write - or where to
find that information. And they'll check it beforehand.

The advantage of using an HLL is that rewriting everything
to try other data structures is cheap (after all, the
constants that O() notation ignores matter as well, so the
fastest O() notation may not be the fastest solution for
the problem at hand).
The problem is that unless you have really internalized
what that means, you'll forget about it. Don't ask me why,
but it happens. Our mind works that way. You just cannot
live with a jillion unrelated details you cannot place
in a scheme. It doesn't work. One would spend a thousand
times the effort that would be needed using instead a
model able to justify those details.

Again, you're generalizing from "your mind" to "everyone's mind". My
experience indicates that's not true for me. For instance, I find that
learning a typical assembler involves learning a jillion unrelated
details - because it's not at all uncommon for the opcode mnemonics to
be seemingly random strings of characters. Or random words.
Architectures with irregular register usages seem to have little rhyme
or reason behind those irregularities (though I did avoid those
architectures, so may have missed the reason(s) behind some of them).
Even on architectures with symmetric register usage, register usage
conventions are pretty much arbitrary.
Except that marketing will continuously shift what your
application is supposed to do. And this is good, and
essential. This is "building". Sometimes marketing will
change the specifications *before* you complete the very
first prototype. For complex enough projects this is more
the rule than the exception. The nice "The Pragmatic
Programmer" book (IIRC) says that there's no known
complex project in which the specification was changed
fewer than four times before the first release... and the
only time it was changed just three times was when the
guy coming with the fourth variation was hit by
lightning in the street.

Except that those specification changes rarely change the top-level
object/method/whatever. At least, all the ones I dealt with wound
up changing things that were in the middle of the design. The easiest
ones were the ones that were anticipated, and so the changes were all
in data, and not in code.
What you will obtain is people that build wrong models.
Omitting details, if they can really affect the result,
is not a good idea.

Well, the "result" largely depends on why the project is being built.
If you're doing exploratory programming, the "result" is a better
understanding of the objects in the problem domain. The details that
affect that result are radically different from the details that affect
the result if you're building a production application, which is again
different from the details that affect the result if you're teaching
people how to program.

Again, the critical thing is teaching students what details matter, and
which ones don't.
As mentioned, you see it all the time in c.l.python. People come from
other languages, and try to write Python as if the rules for that
other language apply.

That's exactly because they don't know the details of
any of the languages they used. Someone knowing the
details would be curious to know *how* "del v[0]"
is implemented in Python. Actually it could easily be
changed into an O(1) operation, with just a little
slowdown in element access (still O(1) but with a bigger
constant). This is a compromise that has not been
accepted, and this very fact is important to know if you
plan to use Python seriously.

Actually, you don't need to know *anything* about the
compromise if you plan on using Python seriously. You do
need to know that "del v[0]" on a list is O(n).
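For what it's worth, the compromise Andrea describes can be
sketched in a few lines (a toy wrapper, not a real proposal;
the class and method names are made up):

    class ShiftedList:
        # Deleting the front is O(1): instead of shifting
        # elements we advance a hidden offset, and every
        # access pays one extra addition (a bigger constant).
        def __init__(self, items):
            self._items = list(items)
            self._head = 0

        def __len__(self):
            return len(self._items) - self._head

        def __getitem__(self, i):
            return self._items[self._head + i]

        def delfront(self):
            self._items[self._head] = None  # drop the reference
            self._head += 1                 # O(1), no shifting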
Sorry, but I really don't agree that big O is a "detail"
that can be ignored. Only bubble-and-arrow PowerPoint
gurus could think that; I'm not in that crew.
Ignore those little details and your program will be
just as good as one that doesn't even compile.

I've never argued that you should treat O() behavior as a
detail that can be ignored. I've argued that it's an
implementation detail. As such, you worry about it when you
do the implementation. If you need to delete from both ends
of an ordered set of objects, you can't use a Python list
and get reasonable performance.
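That's what collections.deque (in the standard library
since Python 2.4) is for; a minimal example:

    from collections import deque

    d = deque(range(5))
    d.popleft()    # O(1) at the left end
    d.pop()        # O(1) at the right end
    print(d)       # deque([1, 2, 3])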
Sorting is abstract?

Yeah. Remember, I'm talking about M.E., Chem.E., etc.
engineering students here. Not software engineers or any
other kind of CS types.
I'll blame my bad English for understanding that you

If you wish. But since you posted your list of misconceptions about
what I said, I'm going to correct them.
said that abelian groups should be taught before
relative numbers (somehow I crazily thought the point
of discussion was the correct order of learning
how to program),

Again, I never said that. I said *I* understood them better
than relative numbers, because *you* asked whether or not I
did. That says *nothing* about how I think they should be
taught. I'm not so egotistical as to think that everybody
thinks the same way I do.
that TAOCP is too abstract (a book
where every single code listing is in assembler!)

I said it was too abstract for a specific group - one that deals
with concrete problems. In FORTRAN, usually.
and that big-O when programming is a detail that can
be safely ignored (good luck; IMO you'll need a hell
of a lot of it).

No, I said it was an implementation detail. I've maintained all along
that good programmers need to know those details.

<mike
 
Mike Meyer

Andrew Dalke said:
Some physicists (often mathematical physicists) propose
alternate worlds because the math is interesting.

Mathematicians, on the other hand, tried to demonstrate
that their alternate worlds couldn't exist - and found the
math in their failures interesting. Hence we get
non-Euclidean geometries and other interesting things -
that physicists find useful. (To be fair, some of the
alternate mathematical worlds were first explored by
physicists.)

<mike
 
Mike Meyer

Claudio Grondi said:
What has it all to do with Python? To be not fully off-topic, I
suggest here, that it is much easier to discuss programming
related matters (especially in case of Python :) or mathematics
than any other subjects related to nature, because programming is
_so easy_ compared to what is going on in the "real world".
I see the reason for that in the fact, that programming is based
on ideas and rules developed by humans themselves, so it is
relatively easy to test and proove if statements are right or not.

As a mathematician, I have to say "ugh". Not all statements
are easy to test and prove. In fact, in any non-trivial
mathematical system, there will be statements that *cannot*
be proven to be either true or false. Some of those
statements are interesting. The legends of mathematics are
problems that aren't easy to test and prove: Fermat's last
theorem, the four-color map theorem, and so on. Check out
<URL: http://mathworld.wolfram.com/UnsolvedProblems.html >
for a longer list.

It's not clear that the ideas/rules were "developed" by humans. I'd
say "discovered". In some cases in the past, mathematicians unhappy
about some rule set out to show that it must be true (or false). In
failing to show that, they invented a new branch of mathematics.

I'd say programming is more like that. But I approach
programming from a mathematician's viewpoint.

<mike
 
Christos TZOTZIOY Georgiou

Actually, a Python string is only good for modelling ROM. If you want to
model read-write memory, you need a Python list.

This is a misquote, since Andrea paraphrased what Peter
Maas said. It was Peter who suggested using a string to
model memory (and obviously forgot momentarily about string
immutability in Python).

If you included (even better, read :) the rest of Andrea's
paragraph, it would be obvious that you actually agree with
Andrea.
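A two-line illustration of why a string can only model ROM
while a list can model RAM:

    ram = ["\x00"] * 16
    ram[0] = "\xff"   # fine: lists are mutable

    rom = "\x00" * 16
    rom[0] = "\xff"   # TypeError: strings are immutable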
 
Christos TZOTZIOY Georgiou

At one point, a friend and I founded a university to give
our recreational random hackery a bit more credibility
(well, we called ourselves a university, anyway; it was
mostly a joke). We called the programming department
'Executable Poetry'.

That's a good idea for a t-shirt:

"Python: executable poetry"

(kudos to Steve Holden for
(e-mail address removed) where the term PIPO
(Poetry In, Poetry Out) could be born)

and then, apart from t-shirts, the PSF could sell Python-branded
shampoos named "poetry in lotion" etc.
 
Peter Otten

Christos said:
and then, apart from t-shirts, the PSF could sell Python-branded
shampoos named "poetry in lotion" etc.

Which will once and for all solve the dandruff problem
prevalent among the snake community these days.

Not funny? Know then that German has one term for both
'dandruff' and 'scale' (Schuppe).

Still not funny? At least you have learned some German.

Peter
 
Christos TZOTZIOY Georgiou

(kudos to Steve Holden for
(e-mail address removed) where the term PIPO
(Poetry In, Poetry Out) could be born)

Oops! Kudos to Michael Spencer (I never saw Michael's
message on my news server, so I archived Steve's).
 
Scott David Daniels

Peter said:
Which will once and for all solve the dandruff problem
prevalent among the snake community these days.

And once again the Pythonistas will be known as snake-oil salesmen.
 
