reading file objects in chunks


Martin Marcher

Hi,

I'm looking for something that will give me an iterator over a
file-(like)-object. I have large files with only a single line in them,
made up of fixed-length fields: the record length is 26 bytes, dataA is
10 bytes, dataB is 16 bytes.

Now I've written my parsing stuff, but I can't find anything that will
let me read those files efficiently (guess I'm just thinking too
complicated). I'd like to have something like:

f = file("datafile.dat", buffering=26)

for chunk in f.read_in_chunks():
    compute_data(chunk)

iter(f) looked promising at first, but somehow it doesn't do "the
right thing"(tm). itertools doesn't quite seem to be what I want
either. Maybe I just need coffee, but right now I'm in the dark.

I'd really like something nicer than

chunksize = 26
f = file("datafile.dat", buffering=chunksize)

chunk = f.read(chunksize)
while len(chunk) == chunksize:
    compute_data(chunk)
    chunk = f.read(chunksize)

I just don't feel comfortable with it for some reason I can't explain...
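Basically what I have in mind is a small generator along these lines
(just an untested sketch to show the interface I mean; it stops at EOF
or on a truncated final record):

def read_in_chunks(f, chunksize=26):
    # Yield successive fixed-size records from a file object.
    while True:
        chunk = f.read(chunksize)
        if len(chunk) < chunksize:
            break  # EOF, or a short trailing record
        yield chunk

for chunk in read_in_chunks(open("datafile.dat", "rb")):
    compute_data(chunk)

but I keep thinking the standard library must already have something
for this.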

thanks
martin
 

Marc 'BlackJack' Rintsch

I'd really like something nicer than

chunksize = 26
f = file("datafile.dat", buffering=chunksize)

chunk = f.read(chunksize)
while len(chunk) == chunksize:
    compute_data(chunk)
    chunk = f.read(chunksize)

I just don't feel comfortable with it for some reason I can't explain...

chunksize = 26
f = open('datafile.dat', 'rb')
for chunk in iter(lambda: f.read(chunksize), ''):
    compute_data(chunk)
f.close()
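
If you then need to split each record into its fields, struct should
do, assuming the 10+16 byte layout from your post (dataA/dataB are
just your placeholder names):

import struct

record = struct.Struct('10s16s')  # 10 bytes dataA + 16 bytes dataB = 26

def compute_data(chunk):
    data_a, data_b = record.unpack(chunk)  # expects exactly 26 bytes
    # ... work with the two fields here ...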

Ciao,
Marc 'BlackJack' Rintsch
 
