Will multithreading make Python less popular?


Steven D'Aprano

Paul said:
How old is your computer, why did you buy it, and is it the first one
you ever owned?

For most of us, I suspect, it is not our first one, and we bought it
to get a processing speedup relative to the previous one.

My computer is about eight months old, and I bought it because the previous
one died.

If such
speedups were useless or unimportant, we would not have blown our hard
earned cash replacing perfectly good older hardware,

Oh the assumptions in that statement...

"Blowing hard-earned cash" assumes that people buy computers only when they
need to. That's certainly not true -- there's a lot of irrational purchases
involved. I have a friend who has just recently spent $2000 on storage so
he can store more games and videos, which he cheerfully admits he'll never
play or watch. He describes it as his "dragon's horde": it is for knowing
it's there, not for using.

Often hardware is upgraded because it's broken, or because you can't get
parts, or because the software you need will only run on newer machines. I
say *newer* rather than *faster*, because speed is only sometimes a factor
in why software won't run on old machines. My old Mac running a 68030 in
1990 ran Microsoft Word perfectly fast enough for even advanced word
processing needs, and nearly twenty years later, there's nothing I need
from a word processor that I couldn't do in 1990.

so we have to
accept the concept that speed matters and ignore those platitudes that
say otherwise.

The *perception* that speed matters, matters. The reality is that the
majority of computing tasks outside of certain specialist niches are I/O
bound, not CPU. Office server software is rarely CPU bound, and when it is,
in my experience there's one rogue process using all the CPU: the software
is broken, and a faster CPU would just let it be broken at a faster speed.

Gamers need better graphics cards and more memory, not faster CPUs. Internet
surfers need faster ethernet, more bandwidth and more memory, not faster
CPUs. Graphics designers need bigger hard drives and more memory, not
faster CPUs. (Hmm. There seems to be a pattern there...)

Of course, there are a few niches that do require faster CPUs: video
editing, some (but by no means all) Photoshop filters, number crunching,
etc. But even for them, you can often get more bang-for-your-buck
performance increase by adding more memory.

Speaking for myself, I'd happily take a 20% slower CPU in exchange for more
reliable, faster DVD/CD burning. I don't much care whether it takes my
computer 120ms to open a window instead of 100ms, but I care a lot whether a
burn takes 7 minutes and produces a good disc, or 10 minutes and produces a
coaster.
 

Steven D'Aprano

Steve said:
What Guido doesn't seem to have accepted yet is that slowing [C]Python
down by 50% on a single-processor CPU will actually be a worthwhile
tradeoff in ten years time, when nothing will have less than eight cores
and the big boys will be running at 64 kilo-cores.

Ten years?

There's no guarantee that Python will still even be around in ten years. It
probably will be, I see no reason why it won't, but who knows? Maybe we'll
have mandatory software warranties tailored to suit the Microsofts and
Apples, and Guido and the PSF will be forced to abandon the language.

I think it would be a design mistake to hamstring Python now for a
hypothetical benefit in a decade. But, well, in five years' time, or three?
I don't know.
 

Hendrik van Rooyen

Steve Holden said:
<heresy>Perhaps it's time Python stopped being a dictatorship?</heresy>

This will need a wholesale switch to the worship of Freya - it is rumoured
that She is capable of herding cats.

- Hendrik
 

rushenaly

I would say, slow execution is a drawback that we put up with in order
to gain benefits of Python programming that are mostly unrelated to
the causes of the slowness.  The slowness itself can be addressed by
technical means, such as native-code compilation and eliminating the
GIL.  I believe (for example) that the PyPy project is doing both of
these.

Do you believe that there is an effort to remove the GIL in PyPy? As far as
I know there is no intention to remove the GIL from PyPy; the GIL will
possibly still be used in PyPy. There is a mistake either in your reply or
in mine.

Thank you
Rushen
 

sturlamolden

What am I actually seeing? If Python only uses one of the cores,
why do both light up?

Because of OS scheduling. You have more than one process running. The
Python process does not stay on one core. Try to put CPython into a
tight loop ("while 1: pass"). You will see ~50% use of both cores. If
you had 4 cores, you would see ~25% use.

Is everything much more complicated (due to
OS scheduling, etc.) than the simple explanations of GIL?

No. Your Python code cannot use more than one core simultaneously.
It's just that scheduling happens so fast and so often that you don't
notice it.
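
For what it's worth, here is a minimal sketch of that experiment (assuming
CPython 3 on a multi-core machine): two CPU-bound threads together still use
only about one core's worth of CPU, even though a system monitor shows the
load spread across cores as the scheduler moves the process around.

import threading
import time

def spin():
    # Tight CPU-bound loop. The GIL lets only one thread execute Python
    # bytecode at a time, so both threads together use roughly 100% of
    # a single core in total.
    while True:
        pass

for _ in range(2):
    threading.Thread(target=spin, daemon=True).start()

while True:
    time.sleep(1)   # park the main thread; watch top / Task Manager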
 

Mensanator

sturlamolden said:
Because of OS scheduling. You have more than one process running. The
Python process does not stay on one core. Try to put CPython into a
tight loop ("while 1: pass"). You will see ~50% use of both cores. If
you had 4 cores, you would see ~25% use.

Saw that once when I had access to a four core machine.

sturlamolden said:
No. Your Python code cannot use more than one core simultaneously.
It's just that scheduling happens so fast and so often that you don't
notice it.

Don't you mean "yes"? Or that the Task Manager can't track the switches
fast enough to show the interleaving, giving the illusion that both cores
are operating simultaneously.
 

rushenaly

I want to correct my last post, where I said that there is no intention to
remove the GIL from PyPy. There is an intention, actually a wish from a
wizard :).
On the PyPy blog there is an explanation about the GIL and PyPy:
"Note that multithreading in PyPy is based on a global interpreter
lock, as in CPython. I imagine that we will get rid of the global
interpreter lock at some point in the future -- I can certainly see
how this might be done in PyPy, unlike in CPython -- but it will be a
lot of work nevertheless. Given our current priorities, it will
probably not occur soon unless someone steps in."
So nothing new about the GIL in CPython, or even in PyPy.

Thank you...
Rushen

 

Paul Rubin

Joshua Judson Rosen said:
What cost is that?

The cost of messing with the multiprocessing module instead of having
threads work properly, and the overhead of serializing Python data
structures to send to another process by IPC, instead of using the
same object in two threads. Also, one way I often use threading is by
sending function objects to another thread through a Queue, so the
other thread can evaluate the function. I don't think multiprocessing
gives a way to serialize functions, though maybe something like it
can be done at higher nuisance using classes.
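
A minimal sketch of that queue-of-functions pattern, written for Python 3's
threading and queue modules: function objects go through a Queue and a
worker thread evaluates them. This works because threads share the objects
directly; with the multiprocessing module the callables would have to be
picklable, which rules out lambdas and most closures.

import threading
import queue

tasks = queue.Queue()

def worker():
    # Pull (function, args) pairs off the queue and evaluate them.
    while True:
        func, args = tasks.get()
        if func is None:                 # sentinel: shut the worker down
            break
        print(func(*args))

t = threading.Thread(target=worker)
t.start()

tasks.put((lambda x, y: x + y, (2, 3)))  # shared directly; never pickled
tasks.put((None, ()))                    # stop the worker
t.join()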
 

Hendrik van Rooyen

Paul Rubin said:
The cost of messing with the multiprocessing module instead of having
threads work properly, and the overhead of serializing Python data
structures to send to another process by IPC, instead of using the
same object in two threads. Also, one way I often use threading is by
sending function objects to another thread through a Queue, so the
other thread can evaluate the function. I don't think multiprocessing
gives a way to serialize functions, though maybe something like it
can be done at higher nuisance using classes.

There are also Pyro and xmlrpc and shm - all of them apparently more hassle
than threads, and all of them better able to exploit parallelism.

That said, this has made me think the following:

<conjecture>
It is an error to pass anything but plain data between processes,
as anything else does not scale easily.

Passing plain data between processes means either serialising
the data and using channels such as pipes or sockets, or
passing a pointer to a shared memory block through a similar
channel.

Following this leads to a clean design, while attempting to pass
higher order stuff quickly leads to convoluted code when
you try to make things operate in parallel.
</conjecture>

The above can be crudely summed up as:

You can share and pass data, but you should not pass
or share code between processes.

Now the above is congruent with my own experience,
but I wonder if it is generally applicable.

- Hendrik
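
As a rough illustration of the conjecture above (a sketch only, with made-up
names), here is the first of the two options it describes - serialising
plain data over a channel - using the multiprocessing module: picklable data
goes down a Pipe, plain results come back, and no code objects cross the
process boundary.

import multiprocessing

def worker(conn):
    # Receive plain data, send plain results back; no code crosses the pipe.
    while True:
        chunk = conn.recv()
        if chunk is None:              # sentinel: no more work
            break
        conn.send(sum(chunk))
    conn.close()

if __name__ == "__main__":
    parent_end, child_end = multiprocessing.Pipe()
    p = multiprocessing.Process(target=worker, args=(child_end,))
    p.start()

    for chunk in ([1, 2, 3], [10, 20, 30]):
        parent_end.send(chunk)
        print(parent_end.recv())       # 6, then 60

    parent_end.send(None)              # tell the worker to stop
    p.join()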
 

Aahz

If Aahz was trolling, then he got me. I know about William of Occam,
after whom the language was named, and his razor, but did not make the
association, and answered seriously.

Not trolling, but making a joke. Not always easy to tell the
difference, of course.
 
