Witold Rugowski
Hi!
I have the following problem: my Perl script parses large log files (to be exact, many medium-sized ones), up to 3 or 4 GB of data in total. All it does is extract some data (IP addresses and traffic volumes).
The data is read line by line, file by file, with something like this (not the real code):
while (@files) {
    my $file = shift @files;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    while (<$fh>) {
        # accumulate per-IP traffic for every matching line
        call_sum_function($1) if /(MY_PATTERN)/;
    }
    close $fh;
}
print_results();
All the hashes storing the data are very compact (after processing 90% of the 1.6 GB of data, all my defined hashes together have fewer than 30 entries, each holding two integers).
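For reference, the actual memory taken by a hash can be checked with something like the following (a rough sketch; it assumes the Devel::Size module from CPAN is installed, and %traffic_by_ip stands in for one of my real hashes):

use Devel::Size qw(total_size);
# total_size() follows references, so this counts the keys and values too
print "traffic hash: ", total_size(\%traffic_by_ip), " bytes\n";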
And after those 90%, it blows up:
Out of memory during request for 156 bytes, total sbrk() is 536711168 bytes!
And I have no idea why. Of course I can read from the message that perl has already allocated 512 MB and cannot get more, but I have no idea:
a) what consumed this memory
b) how to avoid this error
Any help?