too many open files? How to know?

Discussion in 'Perl Misc' started by Derrell Durrett, May 23, 2005.

  1. Howdy-

    I have a situation in which a program that executes on solaris, a RedHat
    flavor of Linux (32- and 64-bit), and Windows XP (32- and 64-bit) fails
    on 32-bit XP w/the $! value equivalent to the string "Too many open files."

    I am running an external program whose output to stderr I want to
    capture, in case it's interesting. The algorithm is (as suggested in
    recipe 7.20 in the Perl Cookbook):

    1. dup STDERR to a new filehandle (open using ">&").
    2. create a new filehandle on a temporary file (using IO::File->new_tmpfile).
    3. take the file descriptor of the new filehandle (using fileno()).
    4. alias STDERR to that descriptor (open STDERR using ">&=$fileno").

    I then run my external program, and close the filehandles, undef the
    temporary variable attached to the temporary file, and reopen stderr to
    point back to the original.

    In opening STDERR to alias it to the temporary file's descriptor, the
    program fails.

    I've done this in the debugger, and when I look at the symbol table for
    either main or the package in which the filehandles are being created,
    I don't see anything unusual (only STDOUT, STDIN, STDERR, and the
    duplicate). I did this using the 'x \%main::' and 'x \%<package_name>::'
    commands at the debugger command line.

    Is there a better way to see what files are open? Is this likely a red
    herring?
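    One way to answer the "what's open" question from inside the process
    itself, since the symbol table only shows package filehandles and not
    lexical handles or raw descriptors, is to probe each low-numbered
    descriptor with POSIX::dup(), which only succeeds on an open slot. A
    minimal sketch; list_open_fds and the 0..63 range are my own arbitrary
    choices, not anything from the thread:

```perl
use strict;
use warnings;
use POSIX ();

# Probe each low-numbered C-runtime descriptor: POSIX::dup() succeeds
# only if the descriptor is open, so collect the numbers that work.
# The 0..63 range is an arbitrary cap for this sketch.
sub list_open_fds {
    my @open;
    for my $fd ( 0 .. 63 ) {
        my $copy = POSIX::dup($fd);
        if ( defined $copy ) {
            POSIX::close($copy);    # release the probe's duplicate at once
            push @open, $fd;
        }
    }
    return @open;
}

print "open descriptors: @{[ list_open_fds() ]}\n";
```

    Running this right after the failing open() should show whether
    descriptors really are accumulating, which the symbol table can't.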

    Thanks,

    Derrell

    --
    Derrell Durrett
    Xilinx, Inc. / Software Productivity Tools
    Longmont, Colorado / 720.652.3843
    ***remove bits about .processed meats and .death from e-mail to reply
    Derrell Durrett, May 23, 2005
    #1

  2. Derrell Durrett wrote:

    > I have a situation in which a program that executes on solaris, a
    > RedHat flavor of Linux (32- and 64-bit), and Windows XP (32- and
    > 64-bit) fails on 32-bit XP w/the $! value equivalent to the string
    > "Too many open files."


    I can duplicate the problem using the following code:

    use strict;
    use warnings;
    use English;
    use IO::File;

    my $count = 0;
    while (1) {
        my @output;
        $count++;
        runCmd( 'ls', \@output );
    }

    sub runCmd {
        my ( $cmd, $container ) = @ARG;

        # The following code mimics recipe 7.20 from the Perl Cookbook and is
        # necessary because the commands being run may output to STDERR and we
        # want to capture that.
        unless ( open( ORIGINAL_STDERR, ">&STDERR" ) ) {
            die( "Could not redirect STDERR" );
        }

        my $error_fh;
        unless ( $error_fh = IO::File->new_tmpfile ) {
            die( "Could not open temporary file for STDERR: $OS_ERROR" );
        }
        my $error_fd = $error_fh->fileno();
        unless ( open( STDERR, ">&=$error_fd" ) ) {
            die( "Iteration: $count\nCould not duplicate temporary filehandle for ",
                 "STDERR: ", $OS_ERROR );
        }
        STDERR->autoflush( 1 );

        @{ $container } = (`$cmd`);

        # Close the temporary filehandle.
        close $error_fh
            or die( "Couldn't close temporary STDERR. ", $OS_ERROR );

        # Close redirected handle
        close STDERR
            or die( "Could not close redirected STDERR" );

        # Clean up after ourselves
        undef $error_fh;

        # Restore STDERR
        open( STDERR, ">&ORIGINAL_STDERR" )
            or die( "Could not restore STDERR" );

        # Close copy to prevent leaks
        close ORIGINAL_STDERR
            or die( "Could not close copied STDERR" );
    }

    This gives varying numbers of iterations, depending on whether I'm
    executing the program locally, or via rsh, but the error is the same:

    "Could not duplicate temporary filehandle for STDERR: Too many open
    files at test_opens.plx line 34"

    >
    > I've done this in the debugger, and when I look at the symbol table
    > for either main or the package in which the filehandles are being
    > created, I don't see anything unusual (only STDOUT, STDIN, STDERR,
    > and the duplicate). I did this using the 'x \%main::' and
    > 'x \%<package_name>::' commands at the debugger command line.
    >
    > Is there a better way to see what files are open? Is this likely a
    > red herring?


    Since I can reproduce it without the original program, it's clearly this
    bit of code that matters. What am I missing? I've tried to be scrupulous
    about closing all opened filehandles, but seem to have a leak nevertheless.
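    For comparison, here is a rearranged sketch of the same capture, under
    the assumption that the ">&=" alias is part of the problem: it uses a
    real dup (">&") into a File::Temp handle, so STDERR and the temporary
    handle each own their own descriptor and closing both can never
    double-close one. run_cmd and its error messages are my own naming, not
    the Cookbook's:

```perl
use strict;
use warnings;
use File::Temp ();

# Rearranged capture: save STDERR aside with a real dup, point STDERR
# at a File::Temp handle (again a real dup, not a ">&=" alias), run the
# command, restore STDERR, then read the captured text back.
sub run_cmd {
    my ($cmd) = @_;

    my $tmp = File::Temp->new();          # unlinked automatically on destroy

    open my $saved_err, '>&', \*STDERR
        or die "Could not save STDERR: $!";
    open STDERR, '>&', $tmp
        or die "Could not redirect STDERR: $!";

    my @stdout = `$cmd`;

    open STDERR, '>&', $saved_err         # reopening STDERR closes the dup first
        or die "Could not restore STDERR: $!";
    close $saved_err
        or die "Could not close saved STDERR: $!";

    seek $tmp, 0, 0
        or die "Could not rewind capture file: $!";
    my @stderr = <$tmp>;
    return ( \@stdout, \@stderr );
}
```

    Because every dup here is a real duplicate, the close order no longer
    matters; whether this avoids the XP descriptor exhaustion is exactly
    what would need testing.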

    Anything helps,

    Derrell

    Derrell Durrett, May 23, 2005
    #2

  3. XP implementation bug? [was Re: too many open files? How to know?]

    Derrell Durrett wrote:

    > Derrell Durrett wrote:
    >
    >> I have a situation in which a program that executes on solaris, a
    >> RedHat flavor of Linux (32- and 64-bit), and Windows XP (32- and
    >> 64-bit) fails on 32-bit XP w/the $! value equivalent to the string
    >> "Too many open files."

    When I replace the previous code w/File::Temp, I see the same problem:
    the XP OS complains after about 500 iterations that I've run out of file
    descriptors. The error message isn't particularly informative: bldperl
    writestderr exited with 16777215 and core dumped from signal 127, where
    writestderr is the following:

    use strict;
    use warnings;

    print STDERR q[I'm freaking out! ];
    die "Still!\n";

    and bldperl is a simple wrapper around perl.

    Has anyone else seen this problem? I can work around it using real
    files to capture stdout and stderr, but there are often cases where I
    am doing this on multiple machines, over a short span of time, in a
    shared network directory. IO::File's new_tmpfile (if I understood the
    docs correctly) was using memory to create these files, not real files.
    Even if it was doing it w/real files, at least I didn't have to think
    of algorithms that make the file names merely more likely to be unique.
    In any case, I preferred that method, and it worked on NT, on SunOS
    5.8, and on RedHat 7.2 (or whatever the Enterprise equivalent is that
    we're using now).
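    On the naming worry specifically: if real files do turn out to be
    necessary, File::Temp already solves the uniqueness problem, since it
    opens each candidate name with O_CREAT|O_EXCL and retries on collision,
    so several machines writing into one shared directory need no hand-rolled
    scheme. A sketch; the template and directory are placeholders:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# File::Temp fills in the trailing X's with random characters and opens
# the file exclusively, retrying if the name is taken, so uniqueness is
# guaranteed even across many hosts sharing one directory.
my ( $fh, $name ) = tempfile(
    'stderr_capture_XXXXXXXX',    # placeholder template
    DIR    => '.',                # e.g. the shared network directory
    UNLINK => 1,                  # delete the file when the process exits
);

print {$fh} "captured output\n";
print "temporary file: $name\n";
```

    With UNLINK set, cleanup happens even if the script dies, which also
    removes one class of leaked handles.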

    I found (but lost and cannot find again) a mention that some
    file-related module was not available for XP, and it sounded potentially
    related to this.

    If anyone has had an experience w/XP similar to this, where the error
    "Too many open files" has appeared even though you're sure you're
    closing the files (because they go out of scope, if nothing else), I'd
    be interested in comparing notes.

    Thanks,

    Derrell

    Derrell Durrett, May 27, 2005
    #3
