Need advice on reading contents of a file into memory


vinjvinj

f = open(someFilePath, "rb")
content = []
for data in f.read():
    content.append(data)
fullContent = "".join(content)

Is there a more efficient way of doing this? I'll be running this
operation on 10,000+ files, where each file is an image of 50k-100k.
 

Felipe Almeida Lessa

On Wed, 2006-03-15 at 13:49 -0800, vinjvinj wrote:
f = open(someFilePath, "rb")
content = []
for data in f.read():
    content.append(data)
fullContent = "".join(content)

Is there a more efficient way of doing this? I'll be running this
operation on 10,000+ files, where each file is an image of 50k-100k.

If you really need everything in memory, why not just...

fullContent = open(someFilePath, "rb").read()

...?
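
For the 10,000-file case the same one-liner just goes inside a loop. A minimal sketch, assuming the images sit in one directory (the directory name and the images dict are made up for illustration, not taken from the thread):

import os

image_dir = "/path/to/images"       # hypothetical directory with the 10,000+ files
images = {}                         # filename -> raw file contents
for name in os.listdir(image_dir):
    path = os.path.join(image_dir, name)
    f = open(path, "rb")
    try:
        images[name] = f.read()     # read the whole file in one call
    finally:
        f.close()

At 50k-100k per file, holding 10,000+ files in memory at once comes to roughly 0.5-1 GB.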
 

Fredrik Lundh

vinjvinj said:
f = open(someFilePath, "rb")
content = []
for data in f.read():
    content.append(data)
fullContent = "".join(content)

Is there a more efficient way of doing this?

read() reads until end of file, so unless the source is something unusual,
a plain

fullContent = f.read()

should be good enough (not that it matters much; that join will be a
no-op, and the list/append overhead is marginal compared to the time
required to get the data from disk).

</F>
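
One way to sanity-check the "marginal overhead" claim is to time both versions with the timeit module. A rough sketch (the file name and repeat count are arbitrary, and the numbers depend heavily on whether the file is already in the OS cache):

import timeit

setup = 'path = "some_image.jpg"'   # hypothetical 50k-100k file

loop_version = """
f = open(path, "rb")
content = []
for data in f.read():
    content.append(data)
fullContent = "".join(content)
f.close()
"""

plain_version = """
f = open(path, "rb")
fullContent = f.read()
f.close()
"""

print timeit.Timer(loop_version, setup).timeit(number=100)
print timeit.Timer(plain_version, setup).timeit(number=100)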
 

vinjvinj

Thanks. read() did not work when I opened the file with:

f = open(someFilePath)

But after changing to f = open(someFilePath, "rb"), read() works fine.

VJ
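
That behaviour usually points at Windows: in text mode (no "b"), the C runtime translates \r\n pairs and treats a 0x1a byte as end-of-file, so a binary image read without "rb" comes back altered or truncated. A small Python 2-style sketch of the effect (the probe file name is made up; on Unix both reads return the same data):

data = "\x00\x01\x1a\x02\r\n\x03"   # bytes that trip up text mode on Windows

f = open("probe.bin", "wb")
f.write(data)
f.close()

text_read = open("probe.bin", "r").read()      # text mode
binary_read = open("probe.bin", "rb").read()   # binary mode

print len(text_read), len(binary_read)         # on Windows the text-mode read is shorter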
 
