open function fails after running a day


alexteo21

I have created a script using Python that will batch process data
files every hour.
The script is running on Solaris, Python version 2.3.3.

t=open(filename,'rb')
data=t.read()
#processing data...
t.close()

The script works fine on the day of execution.
It is able to process the data files every hour. However, the
processing fails one day later, i.e. after the date increments by 1.

Traceback (most recent call last):
  File "./alexCopy.py", line 459, in processRequestModule
    sanityTestSteps(reqId,model)
  File "./alexCopy.py", line 699, in sanityTestSteps
    t = open(filename, 'rb')
IOError: [Errno 24] Too many open files:

I have explicitly closed the file. Is there something else I need to
do?

Appreciate your comments
 

Nikita the Spider

alexteo21 said:
The script works fine on the day of execution.
It is able to process the data files every hour. However, the
processing fails one day later, i.e. after the date increments by 1.

Traceback (most recent call last):
  File "./alexCopy.py", line 459, in processRequestModule
    sanityTestSteps(reqId,model)
  File "./alexCopy.py", line 699, in sanityTestSteps
    t = open(filename, 'rb')
IOError: [Errno 24] Too many open files:

I have explicitly closed the file. Is there something else I need to
do?

Sounds like the .close() isn't getting executed as you think. Try using
the logging module to log a line immediately before each open and close
so that you can ensure you're really closing all the files.
Alternatively, some other bit of code may be the guilty party. A utility
like fstat can show you who has files open.
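
A rough sketch of that kind of instrumentation, assuming the loop
variable is called filename and that /proc/<pid>/fd is available on
your Solaris box (both are assumptions about your setup):

import logging, os

logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)

def count_open_fds():
    # How many descriptors this process currently holds open.
    # /proc/<pid>/fd exists on Solaris and Linux, but this is an
    # assumption about your environment.
    return len(os.listdir('/proc/%d/fd' % os.getpid()))

logging.debug("opening %r (open fds: %d)", filename, count_open_fds())
t = open(filename, 'rb')
data = t.read()
# processing data...
logging.debug("closing %r", filename)
t.close()
logging.debug("closed %r (open fds: %d)", filename, count_open_fds())

If the fd count keeps growing from one hourly run to the next, something
is opening files without closing them.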

Good luck
 

Vinay Sajip

I have created a script using Python that will batch process data
files every hour.
The script is running on Solaris, Python version 2.3.3.

t=open(filename,'rb')
data=t.read()
#processing data...
t.close()

Try the following approach:

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
finally:
    t.close()

and see if that improves matters. If you want to add logging for a
quick check, then...

import logging

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
except:
    logging.exception("Failed to process file %r", filename)
finally:
    t.close()

Regards,

Vinay Sajip
 

Vinay Sajip

Try the following:

import logging

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
except:
    logging.exception("Failed to process %r", filename)
finally:
    t.close()
 

Vinay Sajip

Try the following (Python 2.5.x):

import logging

t = open(filename, 'rb')
try:
    data = t.read()
    # processing data...
except:
    logging.exception("Failed to process %r", filename)
finally:
    t.close()

For earlier versions of Python, you will need to nest the try blocks:

import logging

t = open(filename, 'rb')
try:
    try:
        data = t.read()
        # processing data...
    except:
        logging.exception("Failed to process %r", filename)
finally:
    t.close()
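
As an illustration of how that pattern might sit inside the hourly
batch loop (process_data and filenames are placeholders here, not names
from the original script):

import logging

def read_one_file(filename):
    # Guarantee the close on every iteration, even when processing raises.
    t = open(filename, 'rb')
    try:
        try:
            data = t.read()
            process_data(data)   # placeholder for the real processing step
        except:
            logging.exception("Failed to process %r", filename)
    finally:
        t.close()

for filename in filenames:   # filenames: whatever the hourly batch collects
    read_one_file(filename)

That way a failure in one file is logged, and its descriptor is still
released before the next file is opened.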

Regards,


Vinay Sajip
 

Vinay Sajip

Sorry for the multiple posts. I kept getting network errors and it
looked like the posts weren't getting through.

Regards,

Vinay
 
