slowdown on looping


Trevor Perrin

After running a simple loop around 100-200 million times, there's a
speed drop of about a factor of 7. This happens within several
minutes (on a 1.7 GHz machine), so it's not that hard to see. I'm
using Python 2.3.2 under Windows.

Is anyone familiar with this?

import time
startTime = time.time()
x = 0
while 1:
    x += 1
    if x % 1000000 == 0:
        endTime = time.time()
        print "time = ", endTime - startTime, x
        startTime = endTime


Trevor
 

David C. Fox

Trevor said:
After running a simple loop around 100-200 million times, there's a
speed drop of about a factor of 7. This happens within several
minutes (on a 1.7 Ghz machine), so it's not that hard to see. I'm
using Python 2.3.2 under Windows.

Is anyone familiar with this?

import time
startTime = time.time()
x = 0
while 1:
    x += 1
    if x % 1000000 == 0:
        endTime = time.time()
        print "time = ", endTime - startTime, x
        startTime = endTime


Trevor

I tried something similar to the loop above, but used a fixed loop size
(for i in range(100000)) and varied the starting value of x. I get a
slowdown of about a factor of 5 if I start it with x > sys.maxint.
Presumably that is because both += and % are now operating with long
integers rather than integers. Long integers have unlimited precision,
which is not usually supported by hardware arithmetic, so it is bound to
be slower.
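(An aside for anyone reading this later: Python 3 unified int and long, but the same slowdown still shows up once values exceed the machine word size. A rough timeit sketch, with made-up iteration counts; absolute numbers will vary by machine:)

```python
import timeit

# Same loop body; only the starting magnitude of x differs.  Values past
# the machine word size use multi-word ("long") arithmetic internally,
# which costs noticeably more per operation.
stmt = 'x += 1; x % 1000000'
small = timeit.timeit(stmt, setup='x = 0', number=200000)
big = timeit.timeit(stmt, setup='x = 10**100', number=200000)
print(small, big)  # big is typically several times larger
```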

Did you initially discover the problem with the loop above, or was that
just a simplified test to analyze the effect? If the latter, then
another loop might slow down for an entirely different reason (e.g.
memory use).

David
 

Andrew Dalke

Trevor Perrin:
After running a simple loop around 100-200 million times, there's a
speed drop of about a factor of 7. This happens within several
minutes (on a 1.7 Ghz machine), so it's not that hard to see.

Are computers really that fast these days? I'm startled.

Here's what I first thought.

Python's integers on your machine are 32-bit two's-complement
signed integers. The largest number they can represent is
2**31-1 == 2,147,483,647. If you get beyond that, Python
switches to a 'long', which can store as many digits as you
can feed it memory. Because the math is no longer done as
low-level assembly instructions, the code is slower.

However, that limit is 10 times greater than 200 million, so that
isn't the problem. (Could you double check that you're not
off by an order of magnitude?)
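(For reference, the boundary in question, checked in today's Python 3, where int is arbitrary-precision throughout so the old int-to-long promotion is invisible:)

```python
# The 32-bit signed limit discussed above; note it ends in ...647.
limit = 2**31 - 1
print(limit)                 # 2147483647

# The reported loop count is well below it, so promotion to long
# cannot have kicked in yet:
print(200 * 10**6 < limit)   # True
```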

I also wondered if your output was in an IDE which was slowing
down as the screen buffer got more and more full, but you'll
only have one or two hundred lines on the screen.

So I don't know. Next idea anyone?

Andrew
(e-mail address removed)
 

Bengt Richter

After running a simple loop around 100-200 million times, there's a
speed drop of about a factor of 7. This happens within several
minutes (on a 1.7 Ghz machine), so it's not that hard to see. I'm
using Python 2.3.2 under Windows.

Is anyone familiar with this?

import time
startTime = time.time()
x = 0
while 1:
    x += 1
    if x % 1000000 == 0:
        endTime = time.time()
        print "time = ", endTime - startTime, x
        startTime = endTime
There will be a point where it switches gears from 32-bit int counting to unlimited
(except for memory) bitwidth long integer counting, i.e., when the count becomes 2**31.
But that is over 2,000 million, not 100-200 million, so I would wonder if you have
a hot-running laptop that decides to slow down when it gets to a certain temp. Or maybe
your fan/heatsink is clogged with dust, or the heatsink isn't making good thermal contact
with the CPU?
>>> 2**31 - 1
2147483647L

That says "L" for long, because 2**31 became long before the 1 was subtracted.

>>> 2147483647
2147483647

One way to avoid that overflow is to do the arithmetic in another order:

>>> 2**30 - 1 + 2**30
2147483647

Simulating your code at the point where the counter crosses the boundary:

>>> x = 2147483647
>>> x += 1
>>> x
2147483648L

Now you are in the long-integer representation mode, which is not as fast as int,
and as the size of the number grows, it will gradually get slower, because it
will be manipulating larger and larger representations. A rough measure of how big
is len('%x' % x). But as mentioned, I don't think this is the problem.
To see that kind of effect, calculate pi to thousands of decimals or monster
factorials or something like that ;-)
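(Bengt's len('%x' % x) size measure can be sketched like so in today's Python 3 — the hex-digit count roughly tracks how many machine words the long representation needs, and hence the per-operation cost:)

```python
# len('%x' % x) counts hex digits: about 8 per 32-bit word of the
# internal representation, so it is a crude proxy for arithmetic cost.
for x in (2**31, 10**50, 10**1000):
    print(len('%x' % x))
# The first is exactly 8 (0x80000000); the counts grow with the number.
```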

Regards,
Bengt Richter
 

Trevor Perrin

After running a simple loop around 100-200 million times, there's a
speed drop of about a factor of 7. This happens within several
minutes (on a 1.7 Ghz machine), so it's not that hard to see.

Never mind, Python's just too speedy for my laptop - the CPU's
overheating, as Bengt thought. I'll have to switch to VB or something
;-) Thanks, people..

Trevor
 

Greg Ewing (using news.cis.dfn.de)

Trevor said:
Never mind, Python's just too speedy for my laptop - the CPU's
overheating, as Bengt thought.

Oh, great. As if there weren't enough tricky things to take into
account when benchmarking on modern machines - we now have to take
care to pre-heat the CPU, too!
 
