Scott Stark
Hi, I'm running a script that reads through a large number of HTML
files (1500-2000 or so) in each of about 20 directories, searching for
strings in the files.
For some reason the script quits midway through, and I get a
"Terminated" message. It quits while checking a batch of files at a
different point in the file system every time, so I don't think it's a
bug in my code. In fact, if I limit the total number of files processed
to a couple of hundred, the script runs fine.
Is this some kind of memory problem or other resource problem? I've
tried breaking each directory pass into separate subroutine calls,
and even breaking the individual directory lists into smaller batches
of 300 files each, thinking that might free up resources.
Something like this:
foreach my $d (@dirs) {
    my @files = glob("$basedir/$d/*.html $basedir/$d/*.htm");
    if (scalar(@files) > 300) {
        ... # split @files into batches (my @shortList) of 300 each
        search_files(@shortList);
    }
    else {
        search_files(@files);   # small directories go through in one call
    }
}
sub search_files {
    my @files = @_;
    ... # search through each file
}
I've tried running the script under perl -d and with #!/usr/bin/perl -w;
neither reports any errors, and the script still gets terminated, just
at a different point in the file system each time.
Any thoughts? If it's a memory problem, is there some way to free up
memory?
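For example, would it help to make sure everything is held in lexicals
that go out of scope before the next file is read? Something like this
(a simplified, hypothetical version of my search loop):

sub search_files {
    my @files = @_;
    foreach my $file (@files) {
        open(my $fh, '<', $file) or next;
        while (my $line = <$fh>) {
            # check $line against the search strings here;
            # only one line is in memory at a time
        }
        close($fh);
        # $fh and $line go out of scope here, so perl can
        # reuse that memory for the next file
    }
}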
thanks,
Scott