creating really big lists

Discussion in 'Python' started by Dr Mephesto, Sep 5, 2007.

  1. Dr Mephesto

    Dr Mephesto Guest

    Hi!

    I would like to create a pretty big list of lists; a list 3,000,000
    long, each entry containing 5 empty lists. My application will append
    data to each of the 5 sublists, so they will be of varying lengths (so no
    arrays!).

    Does anyone know the most efficient way to do this? I have tried:

    list = [[[],[],[],[],[]] for _ in xrange(3000000)]

    but it's not soooo fast. Is there a way to do this without looping?

    David.
     
    Dr Mephesto, Sep 5, 2007
    #1

  2. Paul Rudin

    Paul Rudin Guest

    Dr Mephesto <> writes:

    > Hi!
    >
    > I would like to create a pretty big list of lists; a list 3,000,000
    > long, each entry containing 5 empty lists. My application will append
    > data to each of the 5 sublists, so they will be of varying lengths (so no
    > arrays!).
    >
    > Does anyone know the most efficient way to do this? I have tried:
    >
    > list = [[[],[],[],[],[]] for _ in xrange(3000000)]
    >
    > but it's not soooo fast. Is there a way to do this without looping?


    You can do:

    [[[],[],[],[],[]]] * 3000000

    although I don't know if it performs any better than what you already
    have.
     
    Paul Rudin, Sep 5, 2007
    #2

  3. Paul Rudin wrote:

    > Dr Mephesto <> writes:
    >
    >> Hi!
    >>
    >> I would like to create a pretty big list of lists; a list 3,000,000
    >> long, each entry containing 5 empty lists. My application will append
    >> data to each of the 5 sublists, so they will be of varying lengths (so no
    >> arrays!).
    >>
    >> Does anyone know the most efficient way to do this? I have tried:
    >>
    >> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
    >>
    >> but it's not soooo fast. Is there a way to do this without looping?

    >
    > You can do:
    >
    > [[[],[],[],[],[]]] * 3000000
    >
    > although I don't know if it performs any better than what you already
    > have.


    You are aware that this is hugely different, because the nested lists are
    references, not new instances? Thus the outcome is most probably (given the
    gazillion of times people stumbled over this) not the desired one...

    Diez
     
    Diez B. Roggisch, Sep 5, 2007
    #3
  4. Bryan Olson

    Bryan Olson Guest

    Paul Rudin wrote:
    > Dr Mephesto <> writes:
    >> I would like to create a pretty big list of lists; a list 3,000,000
    >> long, each entry containing 5 empty lists. My application will append
    >> data to each of the 5 sublists, so they will be of varying lengths (so no
    >> arrays!).
    >>
    >> Does anyone know the most efficient way to do this? I have tried:
    >>
    >> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
    >>
    >> but it's not soooo fast. Is there a way to do this without looping?

    >
    > You can do:
    >
    > [[[],[],[],[],[]]] * 3000000
    >
    > although I don't know if it performs any better than what you already
    > have.


    Actually, that produces a list of 3000000 references to the same
    5-element list. A reduced example:

    >>> lst = [[[],[],[],[],[]]] * 3
    >>> lst[1][1].append(42)
    >>> print lst

    [[[], [42], [], [], []], [[], [42], [], [], []], [[], [42], [], [], []]]


    --
    --Bryan
     
    Bryan Olson, Sep 5, 2007
    #4
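
    For contrast, the nested-comprehension form builds independent sublists,
    so the same append touches only one entry - a reduced example in the
    same style (a minimal sketch, not part of the original thread):

    >>> lst = [[[] for _ in xrange(5)] for _ in xrange(3)]
    >>> lst[1][1].append(42)
    >>> print lst

    [[[], [], [], [], []], [[], [42], [], [], []], [[], [], [], [], []]]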
  5. Paul Rudin

    Paul Rudin Guest

    "Diez B. Roggisch" <> writes:

    > Paul Rudin wrote:
    >
    >> Dr Mephesto <> writes:
    >>
    >>> Hi!
    >>>
    >>> I would like to create a pretty big list of lists; a list 3,000,000
    >>> long, each entry containing 5 empty lists. My application will append
    >>> data to each of the 5 sublists, so they will be of varying lengths (so no
    >>> arrays!).
    >>>
    >>> Does anyone know the most efficient way to do this? I have tried:
    >>>
    >>> list = [[[],[],[],[],[]] for _ in xrange(3000000)]
    >>>
    >>> but it's not soooo fast. Is there a way to do this without looping?

    >>
    >> You can do:
    >>
    >> [[[],[],[],[],[]]] * 3000000
    >>
    >> although I don't know if it performs any better than what you already
    >> have.

    >
    > You are aware that this is hugely different, because the nested lists are
    > references, not new instances? Thus the outcome is most probably (given the
    > gazillion of times people stumbled over this) not the desired one...


    Err, yes sorry. I should try to avoid posting before having coffee in
    the mornings.
     
    Paul Rudin, Sep 5, 2007
    #5
  6. Dr Mephesto

    Dr Mephesto Guest

    yep, that's why I'm asking :)

    On Sep 5, 12:22 pm, "Diez B. Roggisch" <> wrote:
    > Paul Rudin wrote:
    > > Dr Mephesto <> writes:

    >
    > >> Hi!

    >
    > >> I would like to create a pretty big list of lists; a list 3,000,000
    > >> long, each entry containing 5 empty lists. My application will append
    > >> data to each of the 5 sublists, so they will be of varying lengths (so no
    > >> arrays!).

    >
    > >> Does anyone know the most efficient way to do this? I have tried:

    >
    > >> list = [[[],[],[],[],[]] for _ in xrange(3000000)]

    >
    > >> but it's not soooo fast. Is there a way to do this without looping?

    >
    > > You can do:

    >
    > > [[[],[],[],[],[]]] * 3000000

    >
    > > although I don't know if it performs any better than what you already
    > > have.

    >
    > You are aware that this is hugely different, because the nested lists are
    > references, not new instances? Thus the outcome is most probably (given the
    > gazillion of times people stumbled over this) not the desired one...
    >
    > Diez
     
    Dr Mephesto, Sep 5, 2007
    #6
  7. Dr Mephesto <> writes:

    > I would like to create a pretty big list of lists; a list 3,000,000
    > long, each entry containing 5 empty lists. My application will
    > append data to each of the 5 sublists, so they will be of varying
    > lengths (so no arrays!).
    >
    > Does anyone know the most efficient way to do this? I have tried:
    >
    > list = [[[],[],[],[],[]] for _ in xrange(3000000)]


    You might want to use a tuple as the container for the lower-level
    lists -- it's more compact and costs less allocation-wise.

    But the real problem is not list allocation vs tuple allocation, nor
    is it looping in Python; surprisingly, it's the GC. Notice this:

    $ python
    Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
    [GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import time
    >>> t0=time.time(); l=[([],[],[],[],[]) for _ in xrange(3000000)];
    >>> t1=time.time()
    >>> t1-t0

    143.89971613883972

    Now, with the GC disabled:
    $ python
    Python 2.5.1 (r251:54863, May 2 2007, 16:56:35)
    [GCC 4.1.2 (Ubuntu 4.1.2-0ubuntu4)] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import gc
    >>> gc.disable()
    >>> import time
    >>> t0=time.time(); l=[([],[],[],[],[]) for _ in xrange(3000000)];
    >>> t1=time.time()
    >>> t1-t0

    2.9048631191253662

    The speed difference is staggering, almost 50-fold. I suspect GC
    degrades the (amortized) linear-time list building into quadratic
    time. Since you allocate all the small lists, the GC gets invoked
    every 700 or so allocations, and has to visit more and more objects in
    each pass. I'm not sure if this can be fixed (shouldn't the
    generational GC only have to visit the freshly created objects rather
    than all of them?), but it has been noticed on this group before.

    If you're building large data structures and don't need to reclaim
    cyclical references, I suggest turning GC off, at least during
    construction.
     
    Hrvoje Niksic, Sep 5, 2007
    #7
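
    A minimal sketch of that advice - pausing the cyclic GC only around
    construction and restoring it afterwards (assumes CPython 2.x; the
    try/finally keeps collection enabled even if construction fails):

    import gc
    import time

    def build_rows(n):
        # Build n rows of 5 independent lists with the cyclic GC paused.
        gc.disable()
        try:
            return [([], [], [], [], []) for _ in xrange(n)]
        finally:
            gc.enable()  # restore normal collection once construction is done

    t0 = time.time()
    rows = build_rows(3000000)
    print "built %d rows in %.2fs" % (len(rows), time.time() - t0)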
  8. Aahz

    Aahz Guest

    In article <>,
    Dr Mephesto <> wrote:
    >
    >I would like to create a pretty big list of lists; a list 3,000,000
    >long, each entry containing 5 empty lists. My application will append
    >data to each of the 5 sublists, so they will be of varying lengths (so no
    >arrays!).


    Why do you want to pre-create this? Why not just create the big list and
    sublists as you append data to the sublists?
    --
    Aahz () <*> http://www.pythoncraft.com/

    "Many customs in this life persist because they ease friction and promote
    productivity as a result of universal agreement, and whether they are
    precisely the optimal choices is much less important." --Henry Spencer
    http://www.lysator.liu.se/c/ten-commandments.html
     
    Aahz, Sep 5, 2007
    #8
  9. John Machin

    John Machin Guest

    On Sep 5, 7:50 pm, Dr Mephesto <> wrote:
    > Hi!
    >
    > I would like to create a pretty big list of lists; a list 3,000,000
    > long, each entry containing 5 empty lists. My application will append
    > data to each of the 5 sublists, so they will be of varying lengths (so no
    > arrays!).


    Will each and every one of the 3,000,000 slots be used? If not, you may be
    much better off storagewise if you used a dictionary instead of a
    list, at the cost of slower access.

    Cheers,
    John
     
    John Machin, Sep 5, 2007
    #9
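
    A sketch of the sparse variant John suggests - create a row only on
    first touch, so memory scales with the slots actually used (the helper
    name and layout here are illustrative, not from the thread):

    def get_row(table, key):
        # Return the 5-sublist row for key, creating it lazily.
        row = table.get(key)
        if row is None:
            row = table[key] = ([], [], [], [], [])
        return row

    table = {}
    get_row(table, 1001)[3].append('x')  # only slot 1001 exists so far
    print len(table)  # -> 1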
  10. Dr Mephesto

    Dr Mephesto Guest

    On 6 Sep., 01:34, "Delaney, Timothy (Tim)" <> wrote:
    > Hrvoje Niksic wrote:
    > > Dr Mephesto <> writes:

    >
    > >> I would like to create a pretty big list of lists; a list 3,000,000
    > >> long, each entry containing 5 empty lists. My application will
    > >> append data to each of the 5 sublists, so they will be of varying
    > >> lengths (so no arrays!).

    >
    > >> Does anyone know the most efficient way to do this? I have tried:

    >
    > >> list = [[[],[],[],[],[]] for _ in xrange(3000000)]

    > > If you're building large data structures and don't need to reclaim
    > > cyclical references, I suggest turning GC off, at least during
    > > construction.

    >
    > This is good advice, but another question is whether you really want
    > such a list. You may well be better off with a database of some kind -
    > they're designed for manipulating large amounts of data.
    >
    > Tim Delaney


    I need some real speed! A database is waaay too slow for the algorithm
    I'm using. And because the sublists are of varying size, I don't think
    I can use an array...
     
    Dr Mephesto, Sep 6, 2007
    #10
  11. Paul McGuire

    Paul McGuire Guest

    On Sep 6, 12:47 am, Dr Mephesto <> wrote:
    >
    > I need some real speed! A database is waaay too slow for the algorithm
    > I'm using. And because the sublists are of varying size, I don't think
    > I can use an array...


    How about a defaultdict approach?

    from collections import defaultdict

    dataArray = defaultdict(lambda : [[],[],[],[],[]])
    dataArray[1001][3].append('x')
    dataArray[42000][2].append('y')

    for k in sorted(dataArray.keys()):
        print "%6d : %s" % (k, dataArray[k])

    prints:
      1001 : [[], [], [], ['x'], []]
     42000 : [[], [], ['y'], [], []]

    -- Paul
     
    Paul McGuire, Sep 6, 2007
    #11
  12. Dr Mephesto <> writes:

    > I need some real speed!


    Is the speed with the GC turned off sufficient for your usage?
     
    Hrvoje Niksic, Sep 6, 2007
    #12
  13. Dr Mephesto

    Dr Mephesto Guest

    On 6 Sep., 09:30, Paul McGuire <> wrote:
    > On Sep 6, 12:47 am, Dr Mephesto <> wrote:
    >
    >
    >
    > > I need some real speed! A database is waaay too slow for the algorithm
    > > I'm using. And because the sublists are of varying size, I don't think
    > > I can use an array...

    >
    > How about a defaultdict approach?
    >
    > from collections import defaultdict
    >
    > dataArray = defaultdict(lambda : [[],[],[],[],[]])
    > dataArray[1001][3].append('x')
    > dataArray[42000][2].append('y')
    >
    > for k in sorted(dataArray.keys()):
    >     print "%6d : %s" % (k, dataArray[k])
    >
    > prints:
    >   1001 : [[], [], [], ['x'], []]
    >  42000 : [[], [], ['y'], [], []]
    >
    > -- Paul


    hey, that defaultdict thing looks pretty cool...

    what's the overhead like for using a dictionary in Python?

    dave
     
    Dr Mephesto, Sep 7, 2007
    #13
  14. On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <> wrote:

    > hey, that defaultdict thing looks pretty cool...
    >
    > what's the overhead like for using a dictionary in Python?


    Dictionaries are heavily optimized in Python. Access time is O(1),
    adding/removing elements is amortized O(1) (that is, constant time unless
    it has to grow/shrink some internal structures.)

    --
    Gabriel Genellina
     
    Gabriel Genellina, Sep 8, 2007
    #14
  15. Dr Mephesto

    Dr Mephesto Guest

    On Sep 8, 3:33 am, "Gabriel Genellina" <> wrote:
    > On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <> wrote:
    >
    > > hey, that defaultdict thing looks pretty cool...

    >
    > > what's the overhead like for using a dictionary in Python?

    >
    > Dictionaries are heavily optimized in Python. Access time is O(1),
    > adding/removing elements is amortized O(1) (that is, constant time unless
    > it has to grow/shrink some internal structures.)
    >
    > --
    > Gabriel Genellina


    well, I want to (maybe) have a dictionary where the value is a list of
    5 lists. And I want to add a LOT of data to these lists. Tens of
    millions of pieces of data. Will this be a big problem? I can just try
    it out in practice on Monday too :)

    thanks
     
    Dr Mephesto, Sep 8, 2007
    #15
  16. Dr Mephesto wrote:
    > On Sep 8, 3:33 am, "Gabriel Genellina" <> wrote:
    >> On Fri, 07 Sep 2007 16:16:46 -0300, Dr Mephesto <> wrote:
    >>
    >>> hey, that defaultdict thing looks pretty cool...
    >>> what's the overhead like for using a dictionary in Python?

    >> Dictionaries are heavily optimized in Python. Access time is O(1),
    >> adding/removing elements is amortized O(1) (that is, constant time unless
    >> it has to grow/shrink some internal structures.)
    >>
    >> --
    >> Gabriel Genellina

    >
    > well, I want to (maybe) have a dictionary where the value is a list of
    > 5 lists. And I want to add a LOT of data to these lists. Tens of
    > millions of pieces of data. Will this be a big problem? I can just try
    > it out in practice on Monday too :)
    >
    > thanks
    >
    >


    targetList = myDict[someKey]  # This takes normal dict access time
    for j in xrange(5):
        for i in xrange(50000000):  # Add a LOT of data to targetList
            targetList[j].append(i)  # This takes normal list access time
     
    Ricardo Aráoz, Sep 8, 2007
    #16
  17. Paul Rubin

    Paul Rubin Guest

    Dr Mephesto <> writes:
    > well, I want to (maybe) have a dictionary where the value is a list of
    > 5 lists. And I want to add a LOT of data to these lists. Tens of
    > millions of pieces of data. Will this be a big problem? I can just try
    > it out in practice on Monday too :)


    Yes, that may be a problem both because of the amount of memory
    required, and because of how the GC works. You may want to turn off
    the GC while building these lists. Otherwise, think of some other
    strategy, like files on disk.
     
    Paul Rubin, Sep 8, 2007
    #17
  18. Dr Mephesto wrote:
    > Hi!
    >
    > I would like to create a pretty big list of lists; a list 3,000,000
    > long, each entry containing 5 empty lists. My application will append
    > data to each of the 5 sublists, so they will be of varying lengths (so no
    > arrays!).
    >
    > Does anyone know the most efficient way to do this?


    Hem... Did you consider the fact that RAM is not an unlimited resource?

    Let's do some simple math (please someone correct me if I'm going off
    the road): if a Python (empty) list object requires 256 bytes (if I refer
    to some old post by GvR, it's probably more - 384 bytes at least. Any
    Python gurus around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
    build this list of lists. Which would make something around 3.8 GB. Not
    counting all other needed memory...

    FWIW, run the following code:

    # eatallramthenswap.py
    d = {}
    for i in xrange(3000000):
        d[i] = ([], [], [], [], [])

    And monitor what happens with top...
     
    Bruno Desthuilliers, Sep 8, 2007
    #18
  19. Dr Mephesto wrote:
    > On Sep 8, 8:06 pm, Bruno Desthuilliers
    > <> wrote:
    >
    >>Dr Mephesto wrote:
    >>
    >>
    >>>Hi!

    >>
    >>>I would like to create a pretty big list of lists; a list 3,000,000
    >>>long, each entry containing 5 empty lists. My application will append
    >>>data to each of the 5 sublists, so they will be of varying lengths (so no
    >>>arrays!).

    >>
    >>>Does anyone know the most efficient way to do this?

    >>
    >>Hem... Did you consider the fact that RAM is not an unlimited resource?
    >>
    >>Let's do some simple math (please someone correct me if I'm going off
    >>the road): if a Python (empty) list object requires 256 bytes (if I refer
    >>to some old post by GvR, it's probably more - 384 bytes at least. Any
    >>Python gurus around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
    >>build this list of lists. Which would make something around 3.8 GB. Not
    >>counting all other needed memory...
    >>
    >>FWIW, run the following code:
    >>
    >># eatallramthenswap.py
    >>d = {}
    >>for i in xrange(3000000):
    >>    d[i] = ([], [], [], [], [])
    >>
    >>And monitor what happens with top...

    >
    >
    > Unused RAM is wasted RAM :)


    Indeed. But when your app eats all RAM and swap and brings the system
    down, users are usually a bit unhappy !-)

    > I tried using MySQL, and it was too slow.


    Possibly.

    > And I have 4 GB anyway...


    *You* have 4 GB. Yes, fine. But:

    1/ please take time to re-read my post - the 3.8 GB is based on a very
    optimistic estimation (256 bytes) of the size of an empty list. If you
    choose the (probably much closer to reality) estimate of 384 bytes, then
    you need (1 + (3000000 * 5)) * 384 bytes, which makes =~ 5.4 GB. More
    than what *you* have. BTW, please remember that your OS and the Python
    interpreter are going to eat some of these 4 GB, and that you intend to
    actually *store* something - object references - in these lists. Even if
    you do have a few shared references, this means you'll need RAM space
    for *at least* 3000000 * 5 *more* Python objects (and that's just *1*
    object per list...), which will *at minimum* use about the same amount
    of RAM as the list of lists itself. That takes us to something like
    10 GB... for *1* object per list. I of course suppose you plan to store
    much more than 1 object per list !-)

    2/ now ask yourself how many users of your application will have enough
    RAM to run it...

    So IMVHO, the question is not how to build such a list in less than x
    minutes, but how to *not* build such a list. IOW, do you *really* need
    to store all that stuff in RAM?
     
    Bruno Desthuilliers, Sep 9, 2007
    #19
  20. Dr Mephesto

    Dr Mephesto Guest

    On Sep 8, 8:06 pm, Bruno Desthuilliers
    <> wrote:
    > Dr Mephesto wrote:
    >
    > > Hi!

    >
    > > I would like to create a pretty big list of lists; a list 3,000,000
    > > long, each entry containing 5 empty lists. My application will append
    > > data to each of the 5 sublists, so they will be of varying lengths (so no
    > > arrays!).

    >
    > > Does anyone know the most efficient way to do this?

    >
    > Hem... Did you consider the fact that RAM is not an unlimited resource?
    >
    > Let's do some simple math (please someone correct me if I'm going off
    > the road): if a Python (empty) list object requires 256 bytes (if I refer
    > to some old post by GvR, it's probably more - 384 bytes at least. Any
    > Python gurus around?), you'd need (1 + (3000000 * 5)) * 256 bytes just to
    > build this list of lists. Which would make something around 3.8 GB. Not
    > counting all other needed memory...
    >
    > FWIW, run the following code:
    >
    > # eatallramthenswap.py
    > d = {}
    > for i in xrange(3000000):
    >     d[i] = ([], [], [], [], [])
    >
    > And monitor what happens with top...


    Unused RAM is wasted RAM :)

    I tried using MySQL, and it was too slow. And I have 4 GB anyway...
     
    Dr Mephesto, Sep 11, 2007
    #20