Re: what happens when the file being read is too big for all lines to be read with "readlines()"

Discussion in 'Python' started by Fredrik Lundh, Nov 19, 2005.

  1. Ross Reyes wrote:

    > When I use readlines, what happens if the number of lines is huge? I have
    > a very big file (4GB) I want to read in, but I'm sure there must be some
    > limitation to readlines and I'd like to know how it is handled by python.


    readlines itself has no limitation, but it reads all the lines into memory, so
    you'll probably run out of memory before the call returns. Python raises
    a MemoryError exception when this happens.

    > I am using it like this:
    > slines = infile.readlines()  # reads all lines into a list of strings called "slines"


    as others have pointed out, an iterator is the best way to solve this.
    the iterator code will read blocks of data from the file, and return the
    lines one by one.
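    [editor's note: a minimal sketch of the iterator approach described above. The
    sample file and its contents are made up for illustration; the point is that
    iterating over the file object itself keeps memory use bounded.]

    ```python
    import os
    import tempfile

    # Build a small sample file to stand in for the huge input.
    with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
        f.write("first\nsecond\nthird\n")
        path = f.name

    lines_seen = []
    # Iterating over the file object reads buffered blocks internally and
    # yields one line at a time, so memory stays small no matter how large
    # the file is -- unlike readlines(), which builds the whole list at once.
    with open(path) as infile:
        for line in infile:
            lines_seen.append(line.rstrip("\n"))

    os.remove(path)
    print(lines_seen)
    ```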

    if you need to support older versions of Python, xreadlines or repeated
    calls to readlines(N) can be useful (see the third example on this page:
    http://effbot.org/zone/readline-performance.htm )
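    [editor's note: a minimal sketch of the readlines(N) batching idiom mentioned
    above, using a made-up sample file. readlines(N) reads roughly N bytes' worth
    of complete lines per call, so the loop handles one modest batch at a time.]

    ```python
    import os
    import tempfile

    # Build a small sample file to stand in for the huge input.
    with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
        f.write("a\nb\nc\nd\n")
        path = f.name

    count = 0
    with open(path) as infile:
        while True:
            # readlines(N) returns a list of whole lines totalling about
            # N bytes; an empty list signals end of file.
            batch = infile.readlines(100000)
            if not batch:
                break
            for line in batch:
                count += 1  # process each line here

    os.remove(path)
    print(count)
    ```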

    </F>
    Fredrik Lundh, Nov 19, 2005
