What is different with Python?

Philippe C. Martin

I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels).


Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.

Having followed this newsgroup for some time, I now have the gut feeling
(see 3)) that other people have that feeling too.


Quid?

Regards,

Philippe
 
Martin v. Löwis

Philippe said:
I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .


Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.

As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.

Unfortunately, you didn't give many examples of what you did for the
last 18 years (except that they also included RT kernels).

So let me guess two aspects:

1. In these 18 years, you got acquainted to a variety of concepts
in various languages. When dealing with Python, you could easily
correlate between Python concepts and the ones you are familiar
with. This is one of Python's strengths: it tries not to be
surprising, but builds on what most people consider standard.
Try "import this" some time; you may be experiencing the Zen:

Readability counts.
...
Special cases aren't special enough to break the rules.
Although practicality beats purity.
...
In the face of ambiguity, refuse the temptation to guess.

2. You may not have dealt with a weakly-typed language before. If
that is the case, your feeling of "something being different"
most likely comes from that difference.

Regards,
Martin
 
Robert Kern

Philippe said:
I apologize in advance for launching this post, but I might get enlightenment
somehow (PS: I am _very_ agnostic ;-).

- 1) I do not consider my intelligence/education above average
- 2) I am very pragmatic
- 3) I usually move forward when I get the gut feeling I am correct
- 4) Most likely because of 1), I usually do not manage to fully explain 3)
when it comes true.
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels) .

Yet for the first time I get (most) of my questions answered by a language I
did not know 1 year ago.

I cannot understand this sentence. What questions? Which language?

Do you mean that, currently, when you need to solve a problem, you
usually use Python even though you are relatively new to it? And that
before learning Python you usually used a variety of languages, none
dominating the others?
As I do try to understand concepts when I'm able to, I wish to try and find
out why Python seems different.

Python is my language of choice because it doesn't get in the way. I
don't have to contort my problem into strict class hierarchies or
recursive functions. I don't have to construct the whole system to test
just a part of it. The interactive prompt has become vital to my
workflow. By and large, I just Get It Done.

The "one and preferably only one obvious way to do it" principle and
Python's emphasis on readability means that I gain knowledge and
capability as I write code. When I need to do a similar task six months
later, I don't have to spend an inordinate amount of time figuring out
what the hell I was thinking back then. In the same vein, I can also
read and learn from others' code much more than I could from, say, Perl.

--
Robert Kern
(e-mail address removed)

"In the fields of hell where the grass grows high
Are the graves of dreams allowed to die."
-- Richard Harter
 
Peter Hansen

Martin said:
2. You may not have dealt with a weakly-typed language before. If
that is the case, your feeling of "something being different"
most likely comes from that difference.

If he's done realtime kernels, he's most definitely worked with a weakly
typed language before (assembly most likely), but I think you meant to
say (or should have said) "dynamically typed".

-Peter
 
Philippe C. Martin

Thanks,
I have gotten many answers already, some not posted.

1) Typing is not the issue - even with RT-Kernels, people use C++
2) Yes, I find dynamic binding very nice
3) "... you didn't give many examples of what you did for the
last 18 years (except that that also included RT kernels). ...." assembly
(lots), Basic, COBOL, Lisp, Java, C, C++, Perl, Tcl, JavaCard .....

I know the "interactive" aspect helps also, the runtime error/exception
checking, the many libraries/tools, the responsiveness of the people on
this newsgroup, the "introspectiveness" of the system, the cross-platform
portability, the way it "pushes" people to code in a clean way, the GUI
support, the stability, the extensibility (in and out) .... I'm sure you'll
agree none of that can explain why, after 1 week of playing with it, I was
more productive in Python than in C/C++, just as I know my product (I will
not describe it here as I am not marketing) would not exist today were it
not for Python.
4) Yes, I agree a mix ("... well spiced soup ...") seems to be the answer,
but my brain somehow wants to formalize it.

Regards,

Philippe
 
Claudio Grondi

Re: What is different with Python?

from my point of view, to be honest, nothing
except mixing a well spiced soup of what
was available within other programming
languages.

I think, that what currently makes a real
difference is not the language as such,
but the people using it, posting here and
writing new modules for it.

I can imagine, that with becoming more
popular and less supported by the
core development team (following a
special kind of programming philosophy
called "Pythonic way" of approaching
things) this can change, so I can only hope,
that this won't happen.

Don't ask _me_ what "Pythonic way" is -
I think, you have to feel it yourself in order to
understand it (I haven't yet seen any definition
of it different from what is already also known
from other programming languages, but maybe
someone can provide it?).

by the way: I see a contradiction between
- 1) I do not consider my intelligence/education above average and
- 5) I have developed for many years (>18) in many different environments,
languages, and O/S's (including realtime kernels).
because 5) is the story many of the programmers who chose Python as their
tool or language went through, and because of the fact that you are here
asking that question.

Claudio
 
Philippe C. Martin

I agree about '...choice for the very beginners ...': a hundred years ago I
was a Pascal TA, and although I like the language, I found people struggled
as much with the language as with the algorithm they were supposed to
implement.

"...mostly variants of Basic..." What I truly liked going from Basic (which
has greatly evolved) to Pascal was that I saw a definite risk in not having
to declare variables, or rather I understood the danger of doing without
declarations. The one (so I thought) glitch with Python that almost made me
stop playing with it was that very fact. Yet I agree a complete beginner
would simply find the approach most meaningful: "why should I write
int i = 1 since I know 1 is an int?" And since the "dangers" of old Basic
are gone from Python (you can't do i = y if y has never been initialized),
I must agree with that too. I'm actually pushing the few CS professors I
know to use Python for CS 101. Yet many issues that a future software
engineer should know are mostly hidden by Python (e.g. memory management),
and that could be detrimental.
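[Editor's note: the uninitialized-name behaviour mentioned above can be
demonstrated in a couple of lines at the interactive prompt; a minimal
sketch:]

```python
# Unlike old-style Basic, Python raises NameError when a name that
# was never bound is read, so "i = y" cannot silently succeed.
try:
    i = y  # y has never been initialized
except NameError as exc:
    print("caught:", exc)
```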

Regards,

Philippe
 
Tom Anderson

Yet for the first time I get (most) of my questions answered by a
language I did not know 1 year ago.

Amazing, isn't it? Rest assured that you're not alone in feeling this way.
I don't know quite why, but python just makes writing programs
immensely easier than any other language i've ever used; i think it's the
very minimal amount of boilerplate it requires, the clean and powerful set
of builtin types and functions and, for me, the higher-order functions. I
can do in a few lines of python what would have taken me pages and pages
of java.

tom

PS: http://jove.prohosting.com/~zahlman/cpp.html
 
Claudio Grondi

4) Yes I agree a mix ("... well spiced soup ...")
seems to be the answer but
my brain somehow wants to formalize it.

Here is one further suggestion, trying to point out that
it probably can't generally be formalized, because
the experience one develops after going through
the story of "assembly, basic, cobol, lisp,
JAVA, c, c++, perl, Tcl, Java, JavaCard" has
in my opinion a vital impact on the shortcuts one uses
and the way of doing things. I mean that the concept
of Python arose from such experience, so anyone
who went through all this will get the core ideas
implemented in Python without any effort, because
they were already there as a kind of meta-language
used in thinking, unconsciously looking for the
chance of being expressed in formalized form
as a new programming language.
To support my thesis I can mention that,
from my experience, Python seems not to be
the language of choice for the very beginners,
who prefer other approaches, which are
mostly variants of Basic.

Claudio
 
Peter Hansen

Philippe said:
too. I'm actually pushing the few CS professors I know to use Python for CS
101. Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.

I think new CS students have more than enough to learn with their
*first* language without having to discover the trials and tribulations
of memory management (or those other things that Python hides so well).

Simple concepts like variables, control structures, input and output are
more than enough to start with. In fact, I suspect any course that
attempts to teach with a language that requires things like manual
memory management will be failing to provide an effective grounding in
computer science because of all the noise. Seeing the forest for the
trees and all that...

-Peter
 
Roy Smith

Philippe C. Martin said:
Yet, many issues that a future software engineer should know are
mostly hidden by Python (ex: memory management) and that could be
detrimental.

I know I'm going out on a limb by asking this, but why do you think future
software engineers should know about memory management?

I used to worry about register allocation. Today, I don't even know how
many registers any machine I work on has. I used to worry about word size,
and byte order. I used to worry about whether stacks grew up or down and
addressing modes and floating point formats. Sure, somebody's got to worry
about those things, but most people who write software can be blissfully
ignorant (or, at best, dimly aware) of these issues because somebody else
(compiler writer, hardware designer, operating system writer, etc) has
already done the worrying.

There used to be a time when you had to worry about how many tracks to
allocate when you created a disk file. When's the last time you worried
about that?
 
John Machin

Roy said:
I know I'm going out on a limb by asking this, but why do you think future
software engineers should know about memory management?

Perhaps we have a terminology problem here i.e. different meanings of
"software engineer". Philippe started talking about "CS" courses,
whereas you may be referring to people who have done an "IT" course or
achieved a certification in the use of app development tool X.
I used to worry about register allocation. Today, I don't even know how
many registers any machine I work on has. I used to worry about word size,
and byte order. I used to worry about whether stacks grew up or down and
addressing modes and floating point formats. Sure, somebody's got to worry
about those things, but most people who write software can be blissfully
ignorant (or, at best, dimly aware) of these issues because somebody else
(compiler writer, hardware designer, operating system writer, etc) has
already done the worrying.

You would hope they'd done more than worry about it. However sometimes
one's fondest hopes are dashed. You must have noticed the anguish in the
timbot's posts that mention Windows 95 memory management.
There used to be a time when you had to worry about how many tracks to
allocate when you created a disk file. When's the last time you worried
about that?

Seeing you asked: early 1970s, on an IBM 1800. But much more recently it
certainly helped if one were slightly more than dimly aware of the
difference between a FAT filesystem and an NTFS filesystem :)

Cheers,
John
 
Mike Meyer

John Machin said:
Perhaps we have a terminology problem here i.e. different meanings of
"software engineer". Philippe started talking about "CS" courses,
whereas you may be referring to people who have done an "IT" course or
achieved a certification in the use of app development tool X.

While I agree with John - software engineers should know something
about memory management - I sort of agree with Roy as well, in that,
like Peter, I think memory management is something that doesn't need
to be taught immediately. A modern programming environment should take
care of the details, but a software engineer will be cognizant of the
details, and know enough to know when they have to worry about it and
when they can safely ignore it.
You would hope they'd done more than worry about it. However sometimes
one's fondest hopes are dashed. You must have noticed the anguish in
the timbot's posts that mention Windows 95 memory management.

I think most of those things are indeed things that your average
software engineer can ignore 90+% of the time. What makes someone a
software engineer is that they know about those details, and know how
they will affect the code they are writing - and hence when they have
to worry about those details.

Oddly enough, I find similar comments apply to a lot of the data
structures I learned in school. I recently applied for a job that had
a series of essay questions in the application. They had a series of
problems with requests for solutions, and my immediate reaction to
each was to reach for off-the-shelf software to solve the
problem. While they wanted - and I provided - a discussion of data
structures and big-O running time for various operations, all the
things they wanted to do were essentially solved problems, and there
was debugged and tuned code available to deal with them - and it's much
faster, if you can manage it, not to write software at all to solve the
problem.

For instance, one problem was "You have two files that have lists of 1
billion names in them. Print out a list of the names that only occur
in one of the files."

That's a one-line shell script: "comm -3 <(sort file_one) <(sort file_two)"

I gave them that answer. I also gave them a pseudo-code solution, but
frankly, in real life, I'd install the shell script and get on with
things. If I were hiring someone, I'd hire the person who gave me the
shell script. Rather than spending hours/days debugging a program to
solve the problem, I get a solution in minutes. If it runs into
problems, *then* it's time to start hand coding the solution.
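[Editor's note: in Python the same task reduces to a set symmetric
difference, assuming both name lists fit in memory; a minimal sketch with
illustrative file names:]

```python
def names_in_only_one(path_a, path_b):
    """Return the names that appear in exactly one of the two files."""
    with open(path_a) as f:
        a = {line.strip() for line in f if line.strip()}
    with open(path_b) as f:
        b = {line.strip() for line in f if line.strip()}
    return a ^ b  # symmetric difference: in a or in b, but not both
```

For a billion names the sets would have to fit in RAM; past that point the
external-sort approach of the shell script wins.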
Seeing you asked: early 1970s, on an IBM 1800. But much more recently
it certainly helped if one were slightly more than dimly aware of the
difference between a FAT filesystem and an NTFS filesystem :)

For me it was the late 1970s, on an IBM 3081. But I was worried about
disk sector sizes well into the 1990s. Since then I've worked on
systems that didn't have a file system as such; it had a database of
databases, and you queried the former to find the latter.

<mike
 
Philippe C. Martin

I guess because I have mostly worked with embedded systems and that,
although I have always tried to put abstraction layers between my
applications and the hardware, some constraints still remain at the
application level: (memory, determinism, re-entrance, ...). You will notice
that 99% of the embedded systems with realtime constraints use assembly,
C/C++, or Ada.

I agree with you and Peter though that these issues need not be treated on a
first course. Yet, and depending on the ultimate goal (John spoke of IT
versus CS) some of the newly trained folks should know about it. We could
not enjoy Python if no one were here to implement its VM, I have not looked
at the code, but I gather it is fairly complex and does require an amount
of "low level" skills.

Regards,

Philippe
 
Roy Smith

John Machin said:
Perhaps we have a terminology problem here i.e. different meanings of
"software engineer". Philippe started talking about "CS" courses,
whereas you may be referring to people who have done an "IT" course or
achieved a certification in the use of app development tool X.

No, you've missed the point entirely.

No, the problem is that I'm out on the limb, and you're still comfortably
standing on the ground leaning up against the trunk. Climb up and come out
on the limb with me. Now, stop hugging the trunk and take a few steps out
here with me. Don't worry about how it's swaying, and whatever you do,
don't look down.

The point I was trying to make was that as computer science progresses,
stuff that was really important to know a lot about becomes more and more
taken for granted. This is how we make progress.

I used to worry about memory busses at the millivolt and microsecond level.
I knew about termination impedances and wired-OR logic, and power budgets
and all that good stuff. Today all I know about memory is you go to
www.crucial.com, type in your Visa card number, and the nice UPS guy shows
up with some SIMMs in a few days.

I expect that's all most current CS students know as well. Is that bad?
Is their education somehow lacking because they don't understand why
"memory bus" and "transmission line" belong in the same sentence? Not at
all. All that's happened is that very important stuff has become so
standardized that they don't have to worry about it any more and can spend
their time and energy thinking about other problems that need to be solved
today.

There are lots of really important, hard, theoretical problems that today's
CS majors need to be solving. User interfaces for the most part still
suck. Self-configuring and self-healing high speed networks on a global
scale. AI hasn't really progressed in 30 years. Computer vision and
speech. Robotics. Cryptography and security. And what about flying cars?

Just like you can't even begin to think about building today's GUI-driven
desktop applications if you're still worrying about individual logic gates,
you can't begin to think about solving some of these really hard problems
(and others we haven't even imagined) if you're still worrying about memory
buffer reference counting and garbage collection. Yesterday's research
projects are today's utilities and tomorrow's historical footnotes.
 
Tom Anderson

For instance, one problem was "You have two files that have lists of 1
billion names in them. Print out a list of the names that only occur
in one of the files."

That's a one-line shell script: "comm -3 <(sort file_one) <(sort file_two)"

Incidentally, how long does sorting two billion lines of text take?

The complementary question, of course, is "how long does it take to come
up with an algorithm for solving this problem that doesn't involve sorting
the files?"!

the best thing i can come up with off the top of my head is making a pass
over one file to build a Bloom filter [1] describing its contents, then
going over the second file, checking if each name is in the filter, and if
it is, putting it in a hashtable, then making a second pass over the first
file, checking if each name is in the hashtable. this would work without
the filter, but would require storing a billion names in the hashtable;
the idea is that using the filter allows you to cut this down to a
tractable level. that said, i'm not sure if it would work in practice - if
you have a billion names, even if you have a filter a gigabyte in size,
you still have a 2% false positive rate [2], which is 20 million names.
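[Editor's note: the filter described above is just a bit array plus a few
hash functions; a minimal, untuned sketch, with sizes and hash choice
purely illustrative:]

```python
import hashlib

class BloomFilter:
    """Toy Bloom filter: no false negatives, tunable false positives."""

    def __init__(self, num_bits, num_hashes):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray((num_bits + 7) // 8)

    def _positions(self, name):
        # Derive num_hashes bit positions from salted MD5 digests.
        for salt in range(self.num_hashes):
            digest = hashlib.md5(("%d:%s" % (salt, name)).encode()).hexdigest()
            yield int(digest, 16) % self.num_bits

    def add(self, name):
        for pos in self._positions(name):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, name):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(name))
```

A membership test can wrongly answer yes (the 2% above) but never wrongly
answer no, which is why the second pass over the first file is still needed.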

tom

[1] http://en.wikipedia.org/wiki/Bloom_filter
[2] http://www.cc.gatech.edu/fac/Pete.Manolios/bloom-filters/calculator.html
 
Steven D'Aprano

The point I was trying to make was that as computer science progresses,
stuff that was really important to know a lot about becomes more and more
taken for granted. This is how we make progress.

I used to worry about memory busses at the milivolt and microsecond level.
I knew about termination impedances and wired-OR logic, and power budgets
and all that good stuff. Today all I know about memory is you go to
www.crucial.com, type in your Visa card number, and the nice UPS guy shows
up with some SIMMs in a few days.

Yes. But (to a first approximation) memory either works or it doesn't. And
we never need to worry about it scaling, because you don't get to assemble
your own SIMMs -- you buy them pre-made. Software is nothing like that.

[snip]
Just like you can't even begin to think about building today's
GUI-driven desktop applications if you're still worrying about
individual logic gates, you can't begin to think about solving some of
these really hard problems (and others we haven't even imagined) if
you're still worrying about memory buffer reference counting and garbage
collection. Yesterday's research projects are today's utilities and
tomorrow's historical footnotes.

Nice in theory, but frequently breaks down in practice. Let's take a nice,
real, Python example:

I write a text-handling application in Python. I've taken your advice,
and don't worry about messy details about the language implementation,
and concentrated on the application logic. Consequently, I've used the
idiom:

new_text = ""
for word in text:
    new_text = new_text + process(word)

I test it against text containing a few thousand words, and performance is
good. Then my customers use my application in the real world, using texts
of a few hundreds of millions of words, and performance slows to a painful
crawl.

Python does a good job of insulating the developer from the implementation
details, but even in Python those details can sometimes turn around and
bite you on the behind. And your users will discover those bum-biting
situations long before your testing will.

Joel of "Joel On Software" discusses this issue here:

http://www.joelonsoftware.com/articles/fog0000000319.html

Of course, we should not prematurely optimise. But we should also be aware
of the characteristics of the libraries we call, so we can choose the
right library.

Fortunately, a high-level language like Python makes it comparatively easy
to refactor a bunch of slow string concatenations into the list-append
plus string-join idiom.
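[Editor's note: the refactoring is mechanical; a small sketch, where
process() is a hypothetical stand-in for whatever per-word work the
application does:]

```python
def process(word):
    # Hypothetical per-word transformation.
    return word.upper() + " "

text = ["just", "get", "it", "done"]

# Quadratic: each + copies the whole accumulated string so far.
slow = ""
for word in text:
    slow = slow + process(word)

# Linear: build the pieces, then join once at the end.
fast = "".join(process(word) for word in text)

assert slow == fast
```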
 
Philippe C. Martin

Taking stuff for granted is unrelated to progress.

I agree that the "trade" of software engineering evolves and that, thanks to
hardware advances, we _usually_ can now "object orient" our software, add
billions of abstraction layers, and consume memory without a second
thought. But the trade evolves in the sense that "sub"-trades are created:
one person becomes a database expert while another will write HTML all of
his/her life (I personally find that sad). I'm being redundant here: the
reason we can use Python and take many issues for granted is that some very
skilled people handle the issues we find cumbersome.
 
