Too many open files? How to know?

Derrell Durrett

Howdy-

I have a situation in which a program that executes on Solaris, a Red Hat
flavor of Linux (32- and 64-bit), and Windows XP (32- and 64-bit) fails
on 32-bit XP with $! set to the string "Too many open files."

I am running an external program whose output to stderr I want to
capture, in case it's interesting. The algorithm is (as suggested in
recipe 7.20 in the Perl Cookbook):

1. Dup STDERR (open using ">&") to a new filehandle.
2. Create a new filehandle on a temporary file (using IO::File->new_tmpfile).
3. Take the file descriptor of that filehandle (using fileno()).
4. Alias STDERR to the new filehandle (open STDERR using ">&=$fileno").

I then run my external program, close the filehandles, undef the
temporary variable attached to the temporary file, and reopen STDERR so
it points back at the original.
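
In code, the sequence looks roughly like this (error checking omitted,
and $external_command is just a stand-in for the real program being run):

use IO::File;

my $external_command = 'the_real_program';   # placeholder

open( OLDERR, ">&STDERR" );              # 1. dup STDERR
my $tmp_fh = IO::File->new_tmpfile;      # 2. new filehandle on a temporary file
my $tmp_fd = $tmp_fh->fileno();          # 3. its file descriptor
open( STDERR, ">&=$tmp_fd" );            # 4. alias STDERR to that descriptor

system( $external_command );             # run the external program

close STDERR;                            # close the filehandles...
undef $tmp_fh;                           # ...drop the temp-file variable...
open( STDERR, ">&OLDERR" );              # ...and point STDERR back at the original
close OLDERR;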

The program fails at the open() that aliases STDERR to the temporary
file's descriptor.

I've run this in the debugger, and when I look at the symbol table for
either main or the package in which the filehandles are being created,
I don't see anything unusual (only STDOUT, STDIN, STDERR, and the
duplicate). I did this using the 'x \%main::' and
'x \%<package_name>::' commands at the debugger command line.

Is there a better way to see what files are open? Is this likely a red
herring?
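
The closest thing I've come up with myself is to probe the low-numbered
C runtime descriptors with POSIX::dup(), which only succeeds on
descriptors that are actually open. Just a crude sketch, and I don't
know how trustworthy it is on Win32:

use strict;
use warnings;
use POSIX ();

# dup() succeeds only for descriptors that are currently open and fails
# (EBADF) on free slots, so this lists the open ones.
my @open_fds;
for my $fd ( 0 .. 255 ) {
    my $copy = POSIX::dup( $fd );
    if ( defined $copy ) {
        push @open_fds, $fd;
        POSIX::close( $copy );    # don't leak the probe's own copy
    }
}
print "Currently open descriptors: @open_fds\n";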

Thanks,

Derrell
 
Derrell Durrett

Derrell said:
I have a situation in which a program that executes on Solaris, a
Red Hat flavor of Linux (32- and 64-bit), and Windows XP (32- and
64-bit) fails on 32-bit XP with $! set to the string
"Too many open files."

I can duplicate the problem using the following code:

use strict;
use warnings;
use English;
use IO::File;

my $count = 0;
while (1) {

    my @output;
    $count++;
    runCmd( 'ls', \@output );
}

sub runCmd {

    my ( $cmd, $container ) = @ARG;

    # The following code mimics recipe 7.20 from the Perl Cookbook and is
    # necessary because the commands being run may output to STDERR and we
    # want to capture that.
    unless ( open( ORIGINAL_STDERR, ">&STDERR" ) ) {
        die( "Could not redirect STDERR" );
    }

    my $error_fh;
    unless ( $error_fh = IO::File->new_tmpfile ) {
        die( "Could not open temporary file for STDERR: $OS_ERROR" );
    }

    my $error_fd = $error_fh->fileno();
    unless ( open( STDERR, ">&=$error_fd" ) ) {
        die( "Iteration: $count\nCould not duplicate temporary filehandle for ",
             "STDERR: ", $OS_ERROR );
    }
    STDERR->autoflush( 1 );

    @{ $container } = (`$cmd`);

    # Close the temporary filehandle.
    close $error_fh
        or die( "Couldn't close temporary STDERR. ", $OS_ERROR );

    # Close the redirected handle.
    close STDERR
        or die( "Could not close redirected STDERR" );

    # Clean up after ourselves.
    undef $error_fh;

    # Restore STDERR.
    open( STDERR, ">&ORIGINAL_STDERR" )
        or die( "Could not restore STDERR" );

    # Close the copy to prevent leaks.
    close ORIGINAL_STDERR
        or die( "Could not close copied STDERR" );
}

This gives varying numbers of iterations, depending on whether I'm
executing the program locally or via rsh, but the error is the same:

"Could not duplicate temporary filehandle for STDERR: Too many open
files at test_opens.plx line 34"

Since I can reproduce it without the original program, it's clearly this
bit of code that matters. What am I missing? I've tried to be scrupulous
about closing every filehandle I open, but I seem to have a leak nevertheless.
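
One variant I haven't tried on XP yet would be to let Perl dup the
handles itself with the three-argument form of open, instead of aliasing
the raw descriptor with ">&=". This is only a sketch of the idea
(runCmd2 is just a placeholder name), not something I've verified avoids
the descriptor exhaustion:

use strict;
use warnings;
use IO::File;

sub runCmd2 {
    my ( $cmd, $container ) = @_;

    # Save the current STDERR in a lexical handle.
    open( my $original_stderr, '>&', \*STDERR )
        or die "Could not save STDERR: $!";

    # Anonymous temporary file for the captured STDERR.
    my $error_fh = IO::File->new_tmpfile
        or die "Could not open temporary file for STDERR: $!";

    # Let Perl dup the handle rather than aliasing the raw fd.
    open( STDERR, '>&', $error_fh )
        or die "Could not point STDERR at the temporary file: $!";
    STDERR->autoflush( 1 );

    @{ $container } = (`$cmd`);

    close STDERR;
    undef $error_fh;

    # Restore the original STDERR and drop the saved copy.
    open( STDERR, '>&', $original_stderr )
        or die "Could not restore STDERR: $!";
    close $original_stderr;
}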

Anything helps,

Derrell
 
Derrell Durrett

When I replace the previous code with File::Temp, I see the same problem:
the XP OS complains after about 500 iterations that I've run out of file
descriptors. The error message isn't particularly informative: bldperl
writestderr exited with 16777215 and core dumped from signal 127, where
writestderr is the following:

use strict;
use warnings;

print STDERR q[I'm freaking out! ];
die "Still!\n";

and bldperl is a simple wrapper around perl.

Has anyone else seen this problem? I can work around it by using real
files to capture stdout and stderr, but there are often cases where I'm
doing this on multiple machines, over a short span of time, in a shared
network directory, so unique filenames become a concern. IO::File's
new_tmpfile (if I understood the docs correctly) was using memory to
create these files, not real files; even if it was using real files, at
least I didn't have to come up with algorithms to make the filenames more
likely to be unique. In any case, I preferred that method, and it worked
on NT, on SunOS 5.8, and on Red Hat 7.2 (or whatever the Enterprise
equivalent is that we're using now).
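
If I do end up falling back to real files in the shared directory,
File::Temp can at least generate the unique names for me. Something
along these lines (the share path is made up):

use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical shared directory; substitute the real network path.
my $shared_dir = '//server/share/tmp';

# tempfile() picks a name that won't collide with other processes writing
# to the same directory; UNLINK => 1 removes the file when the program exits.
my ( $err_fh, $err_name ) = tempfile(
    'stderr_capture_XXXXXX',
    DIR    => $shared_dir,
    UNLINK => 1,
);

print "Capturing STDERR in $err_name\n";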

I found (but have since lost and cannot find again) a mention that some
file-related module was not available for XP, and it sounded potentially
related to this.

If anyone has had a similar experience with XP, where the error
"Too many open files" appears even though you're sure you're closing
the files (even if only because they go out of scope), I'd be
interested in comparing notes.

Thanks,

Derrell
 
