List Performance

Ampedesign

If I happen to have a list that contains over 50,000 items, will the
size of the list severely impact the performance of appending to the
list?
 
Peter Otten

Ampedesign said:
If I happen to have a list that contains over 50,000 items, will the
size of the list severely impact the performance of appending to the
list?

No.

$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.554 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.529 usec per loop

http://wiki.python.org/moin/TimeComplexity

Peter
 
Maric Michaud

On Monday 30 June 2008 09:23:46, Peter Otten wrote:
Ampedesign said:
If I happen to have a list that contains over 50,000 items, will the
size of the list severely impact the performance of appending to the
list?

No.

$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.554 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.529 usec per loop

But it surely could, if the box happens to run out of memory and begins to
swap; that's not, of course, an issue specific to Python lists...
 
Peter Otten

Larry said:
Peter said:
Ampedesign said:
If I happen to have a list that contains over 50,000 items, will the
size of the list severely impact the performance of appending to the
list?

No.

$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.554 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.529 usec per loop

http://wiki.python.org/moin/TimeComplexity

Peter

Peter,

So it's actually faster to append to a long list than to an empty one? That
certainly would not have been intuitively obvious, now would it?

You shouldn't blindly trust the numbers.

Here's what happens if I repeat the measurements a few times:

$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.531 usec per loop
$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.511 usec per loop
$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.512 usec per loop
$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.51 usec per loop
$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.514 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.506 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.512 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.543 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.522 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.51 usec per loop

The difference is within the error margin. All you can say is that both
operations take roughly the same time.

In general, if no error margin (e.g. 0.5 ± 0.1) is given, that is always a
warning sign, be it in opinion polls or timeit output.
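
One way to get a feel for the spread is timeit.repeat (a minimal sketch; the
exact figures will differ from machine to machine):

import timeit

# Run the same append micro-benchmark several times and report the
# spread, not just a single number.
times = timeit.repeat(
    "items.append(42)",
    setup="items = []",
    repeat=5,
    number=20000,
)
per_loop = [t / 20000 * 1e6 for t in times]  # microseconds per append
print("min %.3f / max %.3f usec per loop" % (min(per_loop), max(per_loop)))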

Peter
 
Gerhard Häring

Larry said:
[...]
So it's actually faster to append to a long list than to an empty one? That
certainly would not have been intuitively obvious, now would it?

Maybe not intuitively, but if you know how dynamically growing data
structures are implemented, it's plausible. They overallocate, and the
amount of overallocation is dependent on the current size. Relevant
source snippet from Python 2.6:

/* This over-allocates proportional to the list size, making room
* for additional growth. The over-allocation is mild, but is
* enough to give linear-time amortized behavior over a long
* sequence of appends() in the presence of a poorly-performing
* system realloc().
* The growth pattern is: 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
*/
new_allocated = (newsize >> 3) + (newsize < 9 ? 3 : 6);
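
You can watch those allocation steps from pure Python (a minimal sketch using
sys.getsizeof; the exact byte counts depend on platform and Python version):

import sys

items = []
prev = None
for i in range(32):
    size = sys.getsizeof(items)  # current footprint of the list object
    if size != prev:
        # a jump here means the list just over-allocated
        print("len=%2d -> %d bytes" % (len(items), size))
        prev = size
    items.append(None)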

If, on the other hand, we knew beforehand how big the list will get
approximately, we could avoid all these reallocations. No problem with
Python's C API:

PyAPI_FUNC(PyObject *) PyList_New(Py_ssize_t size);

But you can't do it directly from Python, unless you (ab)use ctypes.

-- Gerhard
 
Cédric Lucantis

On Monday 30 June 2008 15:13:30, Larry Bates wrote:
Peter said:
Ampedesign said:
If I happen to have a list that contains over 50,000 items, will the
size of the list severely impact the performance of appending to the
list?

No.

$ python -m timeit -n20000 -s"items = []" "items.append(42)"
20000 loops, best of 3: 0.554 usec per loop
$ python -m timeit -n20000 -s"items = [42]*10**6" "items.append(42)"
20000 loops, best of 3: 0.529 usec per loop

http://wiki.python.org/moin/TimeComplexity

Peter

Peter,

So it's actually faster to append to a long list than to an empty one? That
certainly would not have been intuitively obvious, now would it?

That test only demonstrates that it's faster to append to a million-item list
than to an empty one (and this on a particular platform with a particular
Python version). Different sizes may give different results. I guess this is
because of some internal optimisations (items are probably allocated in
chunks, so sometimes append() involves a realloc, sometimes not).

So the only thing you should remember is that list.append() has a complexity
of O(1), and thus should be considered a constant-time operation for any
length. Just be aware of the note:

[1] = These operations rely on the "Amortized" part of "Amortized Worst Case".
Individual actions may take surprisingly long, depending on the history of
the container.

Also note that 50000 items is a lot for a human being, not for a modern
computer.
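
To see the amortized behavior for yourself, you can check that N appends take
total time roughly proportional to N (a minimal sketch; absolute timings are
machine-dependent):

import timeit

# If each append is amortized O(1), the total time for N appends grows
# linearly with N, so the time per append stays roughly flat.
for n in (10**4, 10**5, 10**6):
    t = timeit.timeit(
        "for i in range(%d): items.append(i)" % n,
        setup="items = []",
        number=1,
    )
    print("%8d appends: %.4f s total, %.2e s each" % (n, t, t / n))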
 
Maric Michaud

On Monday 30 June 2008 15:52:56, Gerhard Häring wrote:
Larry said:
[...]
So it's actually faster to append to a long list than to an empty one? That
certainly would not have been intuitively obvious, now would it?

Maybe not intuitively, but if you know how dynamically growing data
structures are implemented, it's plausible. They overallocate, and the
amount of overallocation is dependent on the current size. Relevant
source snippet from Python 2.6:

/* This over-allocates proportional to the list size, making room
* for additional growth. The over-allocation is mild, but is
* enough to give linear-time amortized behavior over a long
* sequence of appends() in the presence of a poorly-performing
* system realloc().
* The growth pattern is: 0, 4, 8, 16, 25, 35, 46, 58, 72, 88, ...
*/
new_allocated = (newsize >> 3) + (newsize < 9 ? 3 : 6);

If, on the other hand, we knew beforehand how big the list will get
approximately, we could avoid all these reallocations. No problem with
Python's C API:

PyAPI_FUNC(PyObject *) PyList_New(Py_ssize_t size);

But you can't do it directly from Python, unless you (ab)use ctypes.

-- Gerhard

Well, as I posted a few days ago, one could envisage, as a pure-Python
optimization for dealing with long lists, replacing an algorithm with a lot
of appends by something like this:

mark = object()

datas = [ mark ] * expected_size

# working with the datas while maintaining the effective currently used size

Of course one could even subclass list and redefine __len__, append, and some
other methods to deal with this "allocated by block" list.
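
A minimal sketch of that idea (the class name and details here are
illustrative, not from the thread):

# Hypothetical sketch: preallocate, track the used size, and fall back
# to real appends once the preallocated block is exhausted.
class PreallocatedList(list):
    def __init__(self, expected_size, filler=None):
        list.__init__(self, [filler] * expected_size)
        self._used = 0  # number of slots actually in use

    def append(self, item):
        if self._used < list.__len__(self):
            self[self._used] = item  # overwrite a preallocated slot
        else:
            list.append(self, item)  # grow normally past the estimate
        self._used += 1

    def __len__(self):
        return self._used

Iteration and slicing would still see the filler slots, so a real
implementation would have to override more methods, as discussed below.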
 
Terry Reedy

Maric said:
On Monday 30 June 2008 15:52:56, Gerhard Häring wrote:
Larry Bates wrote:
If, on the other hand, we knew beforehand how big the list will get
approximately, we could avoid all these reallocations. No problem with
Python's C API:

PyAPI_FUNC(PyObject *) PyList_New(Py_ssize_t size);

But you can't do it directly from Python, unless you (ab)use ctypes.

-- Gerhard

Well, as I posted few days ago, one could envisage, as a pure python
optimization for dealing with long list, to replace an algorithm with a lot
of append by something like this :

mark = object()

datas = [ mark ] * expected_size

datas = [None] * expected_size has been a standard idiom since before
object() existed ;-) and works fine *unless* one wants to add None
explicitly and have that be different from 'unused'.

Maric said:
# working with the datas while maintaining the effective currently used size

Of course one could even subclass list and redefine __len__, append, and some
other methods to deal with this "allocated by block" list.

An interesting idea if one does this at least a few times and wants to
use .append and .extend instead of explicit indexing.

One could also make such a subclass a 'no-grow' list if appropriate
(when an attempt to grow it would indicate a bug).
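
A minimal sketch of such a 'no-grow' variant (hypothetical name, illustrative
only):

# Fixed-size list: any attempt to grow it is treated as a bug.
class NoGrowList(list):
    def __init__(self, size, filler=None):
        list.__init__(self, [filler] * size)

    def append(self, item):
        raise RuntimeError("NoGrowList is fixed-size; assign by index")

    def extend(self, items):
        raise RuntimeError("NoGrowList is fixed-size; assign by index")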

tjr
 
Maric Michaud

On Monday 30 June 2008 22:21:35, Terry Reedy wrote:
Well, as I posted a few days ago, one could envisage, as a pure-Python
optimization for dealing with long lists, replacing an algorithm with a
lot of appends by something like this:

mark = object()

datas = [ mark ] * expected_size

datas = [None] * expected_size has been a standard idiom since before
object() existed ;-) and works fine *unless* one wants to add None
explicitly and have that be different from 'unused'.


Yes, in fact I used a marker because I thought of it primarily as an end
marker for the list (like \0 for strings in C), but it doesn't matter which
object you put in the list, as long as you know its size at every moment.

A subclass of list will indeed have to override most of the methods of its
parent (not just some, as I assumed before), using extend for reallocation
with some sort of iterator that has a size, as it works in my previous
example with xrange, something like this:

import itertools

class iter_with_len(object):
    def __init__(self, size, obj=None):
        self.size = size
        self.obj = obj

    def __len__(self):
        return self.size

    def __iter__(self):
        return itertools.repeat(self.obj, len(self))
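
A usage sketch (list.extend can presize its target because this iterable
reports its length via __len__):

items = []
items.extend(iter_with_len(10**6))  # the length hint lets extend allocate up front
print(len(items))  # 1000000 slots, all None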
 
