Yash
Hi,
I am facing a problem with a Perl 5.6.1 program on HP-UX 11i.
My program reads lines from a set of hundreds of files and based on a
particular field in every line, redistributes the lines to a new set
of files.
A new file is created using:
$fh{$timePartition} = IO::File->new("> $ENDIR/$timePartition.new") or
print LOGFILE "$!\n";
While the program is running, it eventually fails at the line above with
"Too many open files".
I believe this is because I only close the output file handles after I am
done with all the input files, so one handle stays open for every partition.
Is there a way I can get around the problem from within Perl, by
increasing the value of some special variable?
If that is not possible, what system parameter should I increase to
avoid the problem?
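For reference, the loop looks roughly like this (a minimal sketch, not the
actual program: the field position, %fh hash, and $ENDIR path are
simplified placeholders):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::File;

# Sketch of the redistribution loop described above. One output
# handle is cached in %fh per time partition; because handles are
# only closed after all input is processed, the cache can grow past
# the per-process open-file limit and trigger "Too many open files".
my $ENDIR = '/tmp/out';    # placeholder output directory
my %fh;

while ( my $line = <STDIN> ) {
    my ($timePartition) = split /\s+/, $line;    # assumed field position
    $fh{$timePartition} ||= IO::File->new("> $ENDIR/$timePartition.new")
        or die "cannot open $ENDIR/$timePartition.new: $!";
    $fh{$timePartition}->print($line);
}

# Handles are closed only here, after all input files are done.
$_->close for values %fh;
```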
Thanks