creating really big lists


Dr Mephesto

Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

David.
 

Paul Rudin

Dr Mephesto said:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.
 

Diez B. Roggisch

Paul said:
Dr Mephesto said:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.

You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion times people have stumbled over this) not the desired one...

Diez
 

Bryan Olson

Paul said:
Dr said:
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.

Actually, that produces a list of 3,000,000 references to the same
5-element list. A reduced example:
>>> lst = [[[],[],[],[],[]]] * 3
>>> lst[1][1].append(42)
>>> print lst
[[[], [42], [], [], []], [[], [42], [], [], []], [[], [42], [], [], []]]
 

Paul Rudin

Diez B. Roggisch said:
Paul said:
Dr Mephesto said:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

but it's not soooo fast. Is there a way to do this without looping?

You can do:

[[[],[],[],[],[]]] * 3000000

although I don't know if it performs any better than what you already
have.

You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion times people have stumbled over this) not the desired one...

Err, yes sorry. I should try to avoid posting before having coffee in
the mornings.
 

Dr Mephesto

yep, that's why I'm asking :)

Paul said:
Hi!
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Does anyone know the most efficient way to do this? I have tried:
list = [[[],[],[],[],[]] for _ in xrange(3000000)]
but it's not soooo fast. Is there a way to do this without looping?
You can do:
[[[],[],[],[],[]]] * 3000000
although I don't know if it performs any better than what you already
have.

You are aware that this is hugely different, because the nested lists are
references, not new instances? Thus the outcome is most probably (given the
gazillion times people have stumbled over this) not the desired one...

Diez
 

Hrvoje Niksic

Dr Mephesto said:
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will
append data to each of the 5 sublists, so they will be of varying
lengths (so no arrays!).

Does anyone know the most efficient way to do this? I have tried:

list = [[[],[],[],[],[]] for _ in xrange(3000000)]

You might want to use a tuple as the container for the lower-level
lists -- it's more compact and costs less allocation-wise.

But the real problem is not list allocation vs tuple allocation, nor
is it looping in Python; surprisingly, it's the GC. Notice this:

$ python
Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import time
>>> t0=time.time(); l=[([],[],[],[],[]) for _ in xrange(3000000)]; t1=time.time()
>>> t1-t0
143.89971613883972

Now, with the GC disabled:
$ python
Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
[GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import gc
>>> gc.disable()
>>> import time
>>> t0=time.time(); l=[([],[],[],[],[]) for _ in xrange(3000000)]; t1=time.time()
>>> t1-t0
2.9048631191253662

The speed difference is staggering, almost 50-fold. I suspect GC
degrades the (amortized) linear-time list building into quadratic
time. Since you allocate all the small lists, the GC gets invoked
every 700 or so allocations, and has to visit more and more objects in
each pass. I'm not sure if this can be fixed (shouldn't the
generational GC only have to visit the freshly created objects rather
than all of them?), but it has been noticed on this group before.

If you're building large data structures and don't need to reclaim
cyclical references, I suggest turning GC off, at least during
construction.
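
For concreteness, here is a minimal sketch of that advice - collection is switched off only while the structure is built and restored afterwards; the function and variable names are mine, not from the post above:

import gc
import time

def build_rows(n):
    gc.disable()    # skip cyclic-GC passes during the bulk allocation
    try:
        return [([], [], [], [], []) for _ in xrange(n)]
    finally:
        gc.enable() # always restore collection, even if the build fails

t0 = time.time()
rows = build_rows(3000000)
print "built %d rows in %.2f seconds" % (len(rows), time.time() - t0)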
 

Aahz

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Why do you want to pre-create this? Why not just create the big list and
sublists as you append data to the sublists?
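
A minimal sketch of that create-as-you-go idea (the dict-based layout and names are my own illustration, not from Aahz's post) - a row is only allocated the first time data arrives for it:

rows = {}

def add_value(row, col, value):
    if row not in rows:
        rows[row] = ([], [], [], [], [])  # create the 5 sublists on first use
    rows[row][col].append(value)

add_value(1001, 3, 'x')
add_value(42000, 2, 'y')
print rows[1001]   # ([], [], [], ['x'], [])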
--
Aahz ([email protected]) <*> http://www.pythoncraft.com/

"Many customs in this life persist because they ease friction and promote
productivity as a result of universal agreement, and whether they are
precisely the optimal choices is much less important." --Henry Spencer
http://www.lysator.liu.se/c/ten-commandments.html
 

John Machin

Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Will each and every one of the 3,000,000 slots be used? If not, you may be
much better off storage-wise with a dictionary instead of a
list, at the cost of slower access.

Cheers,
John
 

Dr Mephesto

Hrvoje said:
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will
append data to each of the 5 sublists, so they will be of varying
lengths (so no arrays!).
Does anyone know the most efficient way to do this? I have tried:
list = [[[],[],[],[],[]] for _ in xrange(3000000)]
If you're building large data structures and don't need to reclaim
cyclical references, I suggest turning GC off, at least during
construction.

This is good advice, but another question is whether you really want
such a list. You may well be better off with a database of some kind -
they're designed for manipulating large amounts of data.

Tim Delaney

I need some real speed! A database is waaay too slow for the algorithm
I'm using. And because the sublists are of varying size, I don't think I
can use an array...
 

Paul McGuire

I need some real speed! A database is waaay too slow for the algorithm
I'm using. And because the sublists are of varying size, I don't think I
can use an array...

How about a defaultdict approach?

from collections import defaultdict

dataArray = defaultdict(lambda : [[],[],[],[],[]])
dataArray[1001][3].append('x')
dataArray[42000][2].append('y')

for k in sorted(dataArray.keys()):
    print "%6d : %s" % (k,dataArray[k])

prints:
  1001 : [[], [], [], ['x'], []]
 42000 : [[], [], ['y'], [], []]

-- Paul
 

Dr Mephesto

I need some real speed! A database is waaay too slow for the algorithm
I'm using. And because the sublists are of varying size, I don't think I
can use an array...

How about a defaultdict approach?

from collections import defaultdict

dataArray = defaultdict(lambda : [[],[],[],[],[]])
dataArray[1001][3].append('x')
dataArray[42000][2].append('y')

for k in sorted(dataArray.keys()):
    print "%6d : %s" % (k,dataArray[k])

prints:
  1001 : [[], [], [], ['x'], []]
 42000 : [[], [], ['y'], [], []]

-- Paul

hey, that defaultdict thing looks pretty cool...

what's the overhead like for using a dictionary in Python?

dave
 

Gabriel Genellina

hey, that defaultdict thing looks pretty cool...

what's the overhead like for using a dictionary in Python?

Dictionaries are heavily optimized in Python. Access time is O(1),
adding/removing elements is amortized O(1) (that is, constant time unless
it has to grow/shrink some internal structures.)
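
If you want to see that for yourself, a rough check (my own sketch, not from Gabriel's post) is to time a lookup in a small dict against one in a 3,000,000-key dict - the per-lookup cost should be about the same:

import timeit

# One million lookups each; only the dictionary size differs.
small = timeit.Timer("d[500]", "d = dict.fromkeys(xrange(1000))").timeit(1000000)
large = timeit.Timer("d[1500000]", "d = dict.fromkeys(xrange(3000000))").timeit(1000000)
print "1,000-key dict:     %.3f s per million lookups" % small
print "3,000,000-key dict: %.3f s per million lookups" % large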
 

Dr Mephesto

Dictionaries are heavily optimized in Python. Access time is O(1),
adding/removing elements is amortized O(1) (that is, constant time unless
it has to grow/shrink some internal structures.)

well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists. Tens of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on Monday too :)

thanks
 

Ricardo Aráoz

Dr said:
well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists. Tens of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on Monday too :)

thanks

targetList = myDict[someKey]  # This takes normal dict access time
for j in xrange(5):
    for i in xrange(50000000):  # Add a LOT of data to targetList
        targetList[j].append(i)  # This takes normal list access time
 

Paul Rubin

Dr Mephesto said:
well, I want to (maybe) have a dictionary where the value is a list of
5 lists. And I want to add a LOT of data to these lists. Tens of
millions of pieces of data. Will this be a big problem? I can just try
it out in practice on Monday too :)

Yes, that may be a problem both because of the amount of memory
required, and because of how the GC works. You may want to turn off
the GC while building these lists. Otherwise, think of some other
strategy, like files on disk.
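
If the files-on-disk route is worth exploring, the stdlib shelve module is one low-effort way to try it. This is only a sketch of the idea (the filename, helper and per-row layout are mine, not Paul's); every update goes through pickle, so it is far slower per operation than an in-memory list, but nothing has to fit in RAM at once:

import shelve

db = shelve.open("rows.db")   # rows live on disk; shelve keys must be strings

def append_value(row, col, value):
    key = str(row)
    entry = db.get(key, ([], [], [], [], []))  # load the row, or start an empty one
    entry[col].append(value)
    db[key] = entry                            # write the updated row back

append_value(1001, 3, 'x')
append_value(42000, 2, 'y')
print db['1001']
db.close()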
 

Bruno Desthuilliers

Dr Mephesto wrote:
Hi!

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).

Does anyone know the most efficient way to do this?

Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object requires 256 bytes (if I refer
to some old post by GvR, it's probably more - 384 bytes at least. Any
Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
build this list of lists, which works out to roughly 3.8 GB. Not
counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...
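
For what it's worth, on Python 2.6+ you can measure the container sizes directly instead of guessing (this check is mine, not part of the post above, and the exact numbers vary by platform and build):

import sys

print sys.getsizeof([])                     # one empty list (roughly 36-72 bytes depending on the build)
print sys.getsizeof(([], [], [], [], []))   # the 5-slot tuple itself, not counting the lists it refers to

# Container sizes alone for 3,000,000 rows of 5 empty lists each,
# ignoring the outer list and everything that gets appended later:
per_row = sys.getsizeof(([], [], [], [], [])) + 5 * sys.getsizeof([])
print "approx %.1f GB just for the empty containers" % (3000000 * per_row / 1e9)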
 

Bruno Desthuilliers

Dr Mephesto wrote:
Dr Mephesto wrote:

I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Does anyone know the most efficient way to do this?

Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object requires 256 bytes (if I refer
to some old post by GvR, it's probably more - 384 bytes at least. Any
Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
build this list of lists, which works out to roughly 3.8 GB. Not
counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...



Unused ram is wasted ram :)


Indeed. But when your app eats all RAM and swap and brings the system
down, users are usually a bit unhappy !-)
I tried using MySQL, and it was too slow.
Possibly.

and I have 4gb anyway...

*You* have 4 GB. Yes, fine. But:

1/ please take time to re-read my post - the 3.8 GB is based on a very
optimistic estimate (256 bytes) of the size of an empty list. If you
choose the (probably much closer to reality) estimate of 384 bytes, then
you need (1 + (3000000 * 5)) * 384 bytes, which makes =~ 5.8 GB. More than
what *you* have. BTW, please remember that your OS and the Python
interpreter are going to eat some of these 4 GB, and that you intend to
actually *store* something - object references - in these lists. Even if
you do have a few shared references, this means you'll need RAM space for
*at least* 3000000 * 5 *more* Python objects (which makes only *1* object
per list...), which will at a minimum use about the same amount of RAM as
the list of lists itself. That takes us to something like 10 GB... for
*1* object per list. And I of course suppose you plan to store much more
than 1 object per list !-)

2/ now ask yourself how many users of your application will have enough
RAM to run it...

So IMVHO, the question is not how to build such a list in less than x
minutes, but how to *not* build such a list. IOW, do you *really* need
to store all that stuff in RAM ?
 

Dr Mephesto

Dr Mephesto wrote:
I would like to create a pretty big list of lists; a list 3,000,000
long, each entry containing 5 empty lists. My application will append
data to each of the 5 sublists, so they will be of varying lengths (so no
arrays!).
Does anyone know the most efficient way to do this?

Hem... Did you consider the fact that RAM is not an unlimited resource?

Let's do some simple math (please someone correct me if I'm going off
the road): if a Python (empty) list object requires 256 bytes (if I refer
to some old post by GvR, it's probably more - 384 bytes at least. Any
Python guru around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
build this list of lists, which works out to roughly 3.8 GB. Not
counting all other needed memory...

FWIW, run the following code:

# eatallramthenswap.py
d = {}
for i in xrange(3000000):
    d[i] = ([], [], [], [], [])

And monitor what happens with top...


Unused ram is wasted ram :)

I tried using MySQL, and it was too slow. And I have 4gb anyway...
 
