pppe
Hi,
I am cross-posting this from my post to Perl Programming.
--
Can someone tell me whether, when using open to read data from a file,
a large file (say 80 MB) would be loaded largely into memory, or would
Perl only read one record at a time?
To expand further: I currently use a Perl script to open a data file,
search each record for a match against user input, and store the
matched data in an array. I stop the search once there are more than
200 results, so the array never exceeds that size; but because the
data file is so large, I wonder whether there is much load on server
memory, especially if numerous users were accessing it at once.
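For what it's worth, here is a minimal sketch of the kind of loop I mean. The file name, search term, and 500-record demo file are made up for illustration; the point is that the while (<$fh>) loop reads one record at a time rather than slurping the whole file:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Build a throwaway data file for the demo (500 matching records).
my ($out, $file) = tempfile(UNLINK => 1);
print {$out} "record foo $_\n" for 1 .. 500;
close $out;

my $query = 'foo';    # hypothetical user input
my $limit = 200;
my @matches;

open my $fh, '<', $file or die "Cannot open $file: $!";
# while (<$fh>) reads ONE line at a time, split on the input record
# separator $/ -- Perl does not load the whole file into memory here.
while (my $line = <$fh>) {
    if ($line =~ /\Q$query\E/) {
        push @matches, $line;
        last if @matches >= $limit;   # stop early; array stays bounded
    }
}
close $fh;

print scalar(@matches), "\n";   # prints 200
```

So memory use is roughly one record plus the (capped) results array, not the 80 MB file.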
Also, should I expect the server load to be greater with such a large
file due to execution time? I already run this type of script on
smaller files (5 MB+), but an 80 MB file concerns me on a shared server.
Thanks
pppe