(Too many open files)


Collin VanDyck

Hi,

This problem just started coming up in our application. It is a publishing
application that writes pages and other items to disk, possibly managing
many different sites at the same time.

The user has the option of publishing all of the sites, which then invokes a
routine that writes all of the pages in all of the sites to disk before
FTP'ing them to the remote server.

The problem I'm seeing is that after a couple of minutes, I get a "(Too many
open files)" message as java.io.FileNotFoundExceptions start cropping up.

In my application, I am using java.io.File in the following manner:

protected java.io.File writeFile(params...) {
    java.io.File outputFile = new java.io.File(fileName);
    FileOutputStream out = new FileOutputStream(outputFile);
    out.write(fileContent.toByteArray());
    out.close();
    return outputFile;
}

The outputFile that is returned is used later to locate the file by pathName
when sending over FTP, but I get this error without invoking the FTP step.
For some reason, the FileOutputStream.close() method does not seem to be
releasing the file back into the operating system. I verified this by
trying to delete the temporary file created by the above method, and WinXP
told me that it could not, as it was being used by another process.

Anyone have any ideas on why this is happening? I think that the close()
method should do what I want, but apparently it is not. Thanks!

Collin
 

VK

Your "outputFile" contains a reference to "out", actually it's created
from it.
Evidently as long as "outputFile" exists, Garbage Collector cannot
destroy "out".
Re-think your algorithm to make it suitable for Java.
 

Collin VanDyck

VK -- I had thought the same thing, but a couple of minutes ago I found a
forgotten-about read routine that uses the RandomAccessFile class to read the
file's contents. I had not been closing that object, and doing so resolved the
problem.
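
The read routine boiled down to something along these lines (the names here
are illustrative, not the actual application code); moving the close() into a
finally block so it always runs was the fix:

protected byte[] readFile(java.io.File file) throws IOException {
    RandomAccessFile in = new RandomAccessFile(file, "r");
    try {
        // read the whole file into memory
        byte[] contents = new byte[(int) in.length()];
        in.readFully(contents);
        return contents;
    }
    finally {
        // this close() was missing, so every read leaked a file handle
        in.close();
    }
}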

thanks,
 

Steve Horsley

Collin VanDyck said:
The problem I'm seeing is that after a couple of minutes, I get a "(Too many
open files)" message as java.io.FileNotFoundExceptions start cropping up.

In my application, I am using java.io.File in the following manner:

protected java.io.File writeFile(params...) {
    java.io.File outputFile = new java.io.File(fileName);
    FileOutputStream out = new FileOutputStream(outputFile);
    out.write(fileContent.toByteArray());
    out.close();
    return outputFile;
}

For some reason, the FileOutputStream.close() method does not seem to be
releasing the file back to the operating system.

Look at your exception handling - an exception that's caught and ignored
perhaps. See if it's possible that an exception thrown somewhere in
writing the file causes the out.close() to be skipped. This would then
accumulate an open FileOutputStream for every exception thrown.

Try always to use a try/finally pair when allocating external resources to
avoid unexpected disappointments:

protected java.io.File writeFile(params...) throws IOException {
    java.io.File outputFile = new java.io.File(fileName);
    FileOutputStream out = null;
    try {
        out = new FileOutputStream(outputFile);
        out.write(fileContent.toByteArray());
    }
    finally {
        if (out != null) {
            out.close();
        }
    }
    return outputFile;
}

With the above, whatever happens you will not be left with an open stream.
Understanding this pattern sooner would have saved me a few midnight
callouts - for a 24*7 app that, funnily enough, started reporting "too many
open files". Always in the middle of the night.

Steve
 

Mike Schilling

Steve Horsley said:

Try always to use a try/finally pair when allocating external resources to
avoid unexpected disappointments:

protected java.io.File writeFile(params...) throws IOException {
    java.io.File outputFile = new java.io.File(fileName);
    FileOutputStream out = null;
    try {
        out = new FileOutputStream(outputFile);
        out.write(fileContent.toByteArray());
    }
    finally {
        if (out != null) {
            out.close();
        }
    }
    return outputFile;
}

Good advice, though I'd think

FileOutputStream out = new FileOutputStream(outputFile);
try {
    out.write(fileContent.toByteArray());
}
finally {
    out.close();
}

is sufficient, since if the FileOutputStream constructor throws an
exception, there's no cleanup that can be done (and in that case it's
FileOutputStream's responsibility to release any resources it has taken).
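
Worth noting: on Java 7 and later, a try-with-resources block expresses the
same idea even more compactly, closing the stream automatically whether or not
write() throws. A sketch, assuming the same fileName and fileContent as above:

protected java.io.File writeFile(params...) throws IOException {
    java.io.File outputFile = new java.io.File(fileName);
    // the stream is closed automatically when the try block exits,
    // whether it completes normally or via an exception
    try (FileOutputStream out = new FileOutputStream(outputFile)) {
        out.write(fileContent.toByteArray());
    }
    return outputFile;
}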
 

Collin VanDyck

Wow, thanks guys for the responses. I will revisit that code to make sure
the cleanup happens in all circumstances.

:)

Collin
 
