Is Python very slow compared to C?


Alex Martelli

The "=" operator in Python

....doesn't exist, since '=' is not an operator in Python (just like it
isn't, say, in VB). But, OK, you mean "assignment".
is also quite different from many languages
people have experience with, like C

Yes, but quite similar to assignment in Java, which is apparently the
most widely taught language nowadays.

Not really, because VB had TWO assignment verbs: LET (assignment as a
copy) and SET (assignment by reference, like Java and Python). The past
"had" is important, because VB today has (among many other changes,
which overall bring its semantics much closer to what Python has offered all along)
sort of revolutionized the area of assignment... while hiding the big
differences under syntax that's much like that of previous VB versions,
an arrangement that's guaranteed to cause trouble (one of the underlying
reasons why so many people and firms are sticking with the old version,
VB6, and refusing to move to the new one, VB.NET aka VB7; of course,
Microsoft can force the issue, by not selling VB6 any more and
eventually removing all traces of support for it).
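To make the LET-versus-SET distinction concrete in Python terms, here is a minimal sketch (made up for illustration, not from the posts): assignment rebinds a name to an existing object, and an explicit copy is needed if you want copy semantics.

import copy

a = [1, 2, 3]
b = a               # rebinds the name b to the *same* list object (SET-style)
b.append(4)
print(a)            # [1, 2, 3, 4] -- the change is visible through both names
print(a is b)       # True -- one object, two names

c = copy.copy(a)    # an explicit shallow copy, the rough analogue of LET
c.append(5)
print(a)            # still [1, 2, 3, 4] -- the copy is independent
print(a is c)       # False -- a distinct object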
etc. And using the same argument,
everyone may still be programming in COBOL now.

If Cobol's key underlying concepts had proved satisfactory to the needs
of contemporary software development, as the "class" concept appears so
far to have, then no doubt Cobol would enjoy yet more widespread
continuing acceptance (instead, while still widely used, it's also
generally seen as being on its long, slow way out).
And if we use market penetration as a measure, Perl seems to be easier
for people?

Perl historically did gain enormous traction by leveraging the
familiarity effect, "sucking in" its early adopters from the ranks of
sh/ksh, awk, and sed programmers. Of course, that effect was most
important in Perl's early years -- once solidly established, a language
creates its own "familiarity effect".

JavaScript has leveraged its early advantage in the Netscape browser to
become the only "universally available" language for client-side "in the
browser" execution, and thus established a foothold in a strong and
growing niche for prototype-based, rather than class-based, object
models. However, there doesn't appear to be further spreading of such
object models; "big" new languages like C# (and indeed the whole
underlying CLR framework, which also forced the semantics of VB) are
strongly class-based.
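To make the class-versus-prototype distinction concrete, here is a made-up Python sketch (Python itself is class-based, so the Proto class below is only a crude stand-in for prototype-style delegation; none of this is from the posts):

class Dog:                         # class-based: behaviour lives on the class
    def speak(self):
        return "woof"

class Proto:
    """Crude prototype-style object: attributes it lacks are looked up
    on another object (its prototype) instead of on a class."""
    def __init__(self, proto=None, **slots):
        self.proto = proto
        self.__dict__.update(slots)
    def __getattr__(self, name):   # only reached when normal lookup fails
        if self.proto is not None:
            return getattr(self.proto, name)
        raise AttributeError(name)

animal = Proto(speak=lambda: "generic noise")
dog = Proto(proto=animal)          # "inherits" by delegating to its prototype
print(Dog().speak())               # woof
print(dog.speak())                 # generic noise -- found via the prototype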


Alex
 

Cameron Laird

.
.
.
JavaScript has leveraged its early advantage in the Netscape browser to
become the only "universally available" language for client-side "in the
browser" execution, and thus established a foothold in a strong and
growing niche for prototype-based, rather than class-based, object
models. However, there doesn't appear to be further spreading of such
object models; "big" new languages like C# (and indeed the whole
underlying CLR framework, which also forced the semantics of VB) are
strongly class-based.


Alex

Were we to deepen this analysis, the next aspect to examine is that,
in my estimation, the JavaScript user community is unusual in the
extent to which its users don't aspire to comprehensive understanding.
Say to a Java or Eiffel or Lua programmer, "You didn't know about
{inner definitions|nested dictionaries|...}," and my bet is he'll say,
"Tell me more." Tell the majority of JS users that they don't seem to
be aware of inheritance constructs, and they respond, "So?"

That's a crude generalization, but one I think useful.

If true, it helps explain the comparative lack of penetration of
prototype-based object orientation.
 

rickman

Cameron said:
.
.
.
.
.
.
The programmers of, among other things, the FedEx bar-code reader,
the Sun boot loader, and parts of the Space Shuttle.

The original post seems to be missing, but my answer to the title
question is, No, Forth is not real.
 

Isaac Gouy

Steven said:
We do actually agree. You did explain why the speed of the language itself
is rarely the critical factor. My criticism is that whatever good your
post would have done was nullified by your opening comment stating that
Python is very slow -- a comment which I think is not only harmful, but
wrong, benchmarks like the one you linked to notwithstanding.

I think it is wrong to call Python "very slow" just because it is slower
than some other language or languages, for the same reason it would be
wrong to describe the population of the UK as "very low" because 60
million people is a smaller number than China or India's one billion plus.
Doing so merely reinforces the premature optimizer's message that any
language that isn't C (and sometimes Lisp) is "not fast enough".

There was some context: Is Python very slow compared to C?

With similar context your example becomes: Is the population of the UK
very low compared to the population of China or India? (Which seems to
be a reasonable question.)

We can make the missing context in the OP's question more obvious like
this: Is the population of the UK very /slow/ compared to the
population of China or India?

The benchmarks you pointed to are of limited use for application
developers. (Their value to language designers is another story.)

Limited use for what purpose?
(Yes, it really is difficult to make all our assumptions explicit in our
writing.)
Given
that Ocaml is (say) 200 times faster than Python on average, it tells the
application developer virtually nothing about the performance of his
application. And that's what users care about -- they couldn't care less
about binary trees or partial sums, they care about how long it takes to
render a JPEG, open an email, save their files, query the database,
display a window, and so forth. Few application level tasks are limited by
the speed of the language, not these days.

As in
http://shootout.alioth.debian.org/miscfile.php?file=benchmarking&title=Flawed Benchmarks
You don't believe me? Consider the following:

When you drag your mouse over text in a graphical text editor, the text
highlights. How much money would you be prepared to pay to make that
highlighting occur 200 times faster? $100? $1? One cent? Chances are you
wouldn't pay a thing, because it is already fast enough, and making it
faster is of zero value to you -- even if highlighting ten gigabytes of
text might be uncomfortably slow.

What about opening an email? How much would you pay to reduce the time it
takes to open and display an email from a hundredth of a second to
virtually instantaneously? I suggest that most people would consider 0.01s
to already be "virtually instantaneous".

The important question isn't "is Python fast or slow?", it is "is Python
fast enough?". That's a question that doesn't have a simple answer,
because it depends. Fast enough to do what?

But, in general, more often than not, Python is fast enough. The extra
value of using something like Lua or Ocaml or even C is just not enough to
make up for the disadvantages of using those languages.
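A concrete (made-up) way to put the "fast enough" question is to time the application-level operation users actually wait for, rather than extrapolating from shootout figures; the JSON-parsing task below is just a placeholder for rendering, queries, and so on:

import json
import timeit

# A stand-in for an application-level task: parse a modest JSON document.
DOC = json.dumps({"rows": [{"id": i, "name": "item%d" % i} for i in range(1000)]})

def handle_request():
    return json.loads(DOC)

# Time the real task; if one call already takes a millisecond or two,
# a 200x faster language changes nothing the user can perceive.
runs = 1000
elapsed = timeit.timeit(handle_request, number=runs)
print("%.3f ms per call" % (elapsed / runs * 1000))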

Seems like you're having your cake and eating it too -- if it's
meaningless for others to talk in generalities about fast or slow, then
it's just as meaningless to talk in generalities about "more often than
not".
 

Magnus Lycka

Isaac said:
There was some context: Is Python very slow compared to C?

But that is a stupid context. It doesn't really tell us
anything. Slow at what?

No one writes a program for the purpose of doing loops,
comparing integers, setting up stack frames or jumping
between instructions.

The context for programming typically involves solving some
problem or executing a function that matters to the user.

It's just as with database benchmarks. TPC-C benchmarks tell us
how fast a certain database is at performing TPC-C benchmarks.
Does that tell you anything about how fast it will be compared
to competing products for your real world problem? Probably
not. Whichever product you choose, you might end up with
situations where a particular function is too slow. You can
typically solve this either by changing the code or the
configuration of the server. Sometimes you have to rethink
your system and solve the problem in another way. Changing
from e.g. DB2 to Oracle is unlikely to have more impact than
these smaller changes. It's the same thing with most software
development.
Limited use for what purpose?

They are more or less useless for anyone who wants to decide
what programming language to use in a real world situation.
It's simply stupid to implement sorting algorithms in Python.
It's there already. Solving Ackermann's is also rather far
from what people typically want to achieve. If people actually
had the time and resources to create real software products,
(i.e. things that take man-months to create) from the same
spec, and developed e.g. ten programs each in ten different
programming languages in cleanroom conditions, we'd probably
learn something useful about the usefulness of a certain type
of programs with certain programming languages in certain
conditions, but only in these conditions. I'm rather certain
that aspects such as team size, development methodology etc
will influence the relative ratings of programs.
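To make the sorting remark concrete, here is a small made-up timing sketch: the built-in sorted() runs as C code, so a hand-written pure-Python sort only makes things slower (sizes and run counts are arbitrary).

import random
import timeit

data = [random.random() for _ in range(2000)]

def insertion_sort(seq):            # naive pure-Python sort, for comparison only
    out = list(seq)
    for i in range(1, len(out)):
        key = out[i]
        j = i - 1
        while j >= 0 and out[j] > key:
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = key
    return out

print("built-in sorted():", timeit.timeit(lambda: sorted(data), number=3))
print("pure Python sort: ", timeit.timeit(lambda: insertion_sort(data), number=3))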

I'm not saying that the typical toy benchmarks are completely
useless. They might tell us things about particular language
features that should be improved, or give us clues that a
certain way of solving a particular problem etc, but they
don't really help Joe Newbie Programmer who wants to write
yet another web site toolkit.
 

igouy

Magnus said:
But that is a stupid context. It doesn't really tell us
anything. Slow at what?

As I said: We can make /the missing context/ in the OP's question more
obvious...

They are more or less useless for anyone who wants to decide
what programming language to use in a real world situation.

Good, that's more explicit.

One of the program contributors told me they do something quite similar
to fasta, k-nucleotide and reverse-complement in their real world
situation.

Isn't it possible that someone would look through The Computer Language
Shootout programs and decide that language X was unusable convoluted
gobbledegook?

It's simply stupid to implement sorting algorithms in Python.
It's there already. Solving Ackermann's is also rather far
from what people typically want to achieve.

What "sorting algorithm" are you talking about?
There isn't one on http://shootout.alioth.debian.org/gp4/

Solving Ackermann's? Well if that was really the point then the
programs would be allowed to use memoization rather than simple
recursion.
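A tiny sketch (made up, with deliberately small arguments so both versions finish instantly) of what "allowed to use memoization" would mean in Python:

def ack(m, n):                       # simple recursion, as the benchmark rules required
    if m == 0:
        return n + 1
    if n == 0:
        return ack(m - 1, 1)
    return ack(m - 1, ack(m, n - 1))

def memoize(fn):                     # cache results keyed on the argument pair
    cache = {}
    def wrapper(m, n):
        if (m, n) not in cache:
            cache[(m, n)] = fn(m, n)
        return cache[(m, n)]
    return wrapper

@memoize
def ack_memo(m, n):                  # same definition; repeated sub-calls become lookups
    if m == 0:
        return n + 1
    if n == 0:
        return ack_memo(m - 1, 1)
    return ack_memo(m - 1, ack_memo(m, n - 1))

print(ack(3, 3), ack_memo(3, 3))     # 61 61 -- same answer, far fewer calls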
 

astrobe

(e-mail address removed) wrote:
Not for real, for Integer.

No, it's for me and you (well, perhaps more for you than for me).
But 4IM is forever mine :)

Best regards,
Astrobe
 
