Out of memory! when running Perl script on Windows


felad

Hi

I have a Perl program on Windows that is supposed to run on about 10,000
items in a loop (it should take 2-3 days to run).
The problem is that after several hours I get this error:
Out of memory!
Callback called exit.

I have found that on every loop iteration the committed memory grows by 6000 K,
so I guess this is why the program crashes.
But I really can't find the cause of it; most of my variables are local
and I can't find any infinite loop.

Any ideas?
 

Matt Garrish

felad said:
I have a Perl program on Windows that is supposed to run on about 10,000
items in a loop (it should take 2-3 days to run).
The problem is that after several hours I get this error:
Out of memory!
Callback called exit.

I have found that on every loop iteration the committed memory grows by 6000 K,
so I guess this is why the program crashes.
But I really can't find the cause of it; most of my variables are local
and I can't find any infinite loop.

Any ideas?

Why does it take 2-3 days to run 10,000 items? What is it that you are
doing? Are you calling some other program that has a memory leak? Are
you appending data to a variable that doesn't get reset with each
iteration of the loop?

It's nearly impossible to give you any meaningful advice if you don't
give specifics and show some code.

Matt
 

Peter J. Holzer

I have found that on every loop iteration the committed memory grows by 6000 K,
so I guess this is why the program crashes.
But I really can't find the cause of it; most of my variables are local
and I can't find any infinite loop.

Any ideas?

First, check that you aren't creating cyclic data structures: the Perl
garbage collector cannot free cyclic data structures. If you really need
them, use weak references or destroy them explicitly after you are done
with them.
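The first point can be sketched in a few lines with Scalar::Util's weaken(); the hash and its self-reference below are purely illustrative:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

# A minimal cycle: the hash references itself, so its reference count
# can never reach zero and perl's refcounting GC cannot free it.
my $node = { name => 'root' };
$node->{self} = $node;

# weaken() turns the inner reference into a weak one; it no longer
# counts toward the refcount, so the structure is freed normally
# when $node goes out of scope.
weaken( $node->{self} );
```

A weak reference still points at the data while it lives, but does not keep it alive on its own.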

Second, lexical variables aren't freed immediately after they go out of
scope but only after the function returns. So if you create large
temporary variables, use them inside functions, not in the main program.

Third, modules which may be helpful: Devel::Cycle, Devel::Leak.

Fourth, perl has quite a bit of overhead for each object: if you must keep
lots of data in memory, try to keep a few large objects instead of many
little ones.

hp
 

alpha_beta_release

Hi,
I used to do scripting for bioinformatics and still do. Most of
the tasks involve swallowing a huge chunk of text, parsing it, then
transferring or converting it into another form (SQL database, other
text format, etc.). I have met this kind of problem several times.

Normally what I do is:
1. check for infinite loops, which are caused by
+ unchanged variable values (a missing $i++, etc.)
+ never-fulfilled conditions (-1 > 0, etc.)
2. undef unused variables in the middle of the script (sometimes these
variables hold huge data from a file and are used only before parsing).
So undef them after parsing, and before doing other tasks, to save memory.
3. optimize (reading a file chunk by chunk can save memory compared to
slurping the whole file, etc.)

etc.
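Points 2 and 3 can be sketched like this; the file name and the five-line demo file are stand-ins for a real data set:

```perl
use strict;
use warnings;

my $file = 'demo_input.txt';   # stand-in for a real (huge) data file

# Build a tiny demo file so the sketch is self-contained.
open my $out, '>', $file or die "open $file: $!";
print {$out} "record $_\n" for 1 .. 5;
close $out;

# Read line by line instead of slurping: only one line is in memory
# at a time, no matter how big the file grows.
my $count = 0;
open my $in, '<', $file or die "open $file: $!";
while ( my $line = <$in> ) {
    $count++;                  # parse / convert $line here
}
close $in;
unlink $file;

# Point 2: release a big variable explicitly once it is no longer needed.
my $raw = 'x' x 1_000;         # pretend this held slurped input
undef $raw;                    # the memory can now be reclaimed
```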
 

Klaus

felad said:
I have a Perl program on Windows that is supposed to run on about 10,000
items in a loop (it should take 2-3 days to run).
The problem is that after several hours I get this error:
Out of memory!
Callback called exit.

I have found that on every loop iteration the committed memory grows by 6000 K,
so I guess this is why the program crashes.
But I really can't find the cause of it; most of my variables are local
and I can't find any infinite loop.

Can you post some code?

(That might not be easy: preferably a short but complete version of
the program that still causes the Out of memory error.)
 

felad

Hi

The script is very long, so I don't think posting it will help.
I still couldn't find the problem, but now I was thinking of printing
how much memory the program/process takes in several functions, to try
to find the problem.
Which Perl module can I use to print the memory the script takes?

Thanks
 

Dr.Ruud

alpha_beta_release wrote:
2. undef unused variables in the middle of the script (sometimes these
variables hold huge data from a file and are used only before parsing).
So undef them after parsing, and before doing other tasks, to save memory.

The variable needs to go out of scope AND no longer be referenced
for the memory to be returned.

perldoc -q memory
perldoc -q shrink
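Dr.Ruud's point, as a small sketch; $ref and $big are illustrative names:

```perl
use strict;
use warnings;

my $ref;
{
    my $big = 'x' x 1_000;   # lexical, leaves scope at the closing brace
    $ref = \$big;            # ...but this reference keeps it alive
}
# $big is out of scope here, yet its storage is still reachable:
my $still_alive = defined $$ref;   # true

undef $ref;   # drop the last reference; only now can perl reclaim it
```

Going out of scope alone is not enough: as long as any reference remains, the data stays allocated.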
 

xhoster

felad said:
Hi

I have a Perl program on Windows that is supposed to run on about 10,000
items in a loop (it should take 2-3 days to run).

Wow, that is a long time for 10,000 items. What are you doing to them?

The problem is that after several hours I get this error:
Out of memory!
Callback called exit.

I have found that on every loop iteration the committed memory grows by 6000 K,
so I guess this is why the program crashes.
But I really can't find the cause of it; most of my variables are local
and I can't find any infinite loop.

Any ideas ?

It is hard to say without knowing more. If each item is processed
separately (i.e. the results of the current item don't depend on the results
of past items), then it should be easy to add a "next" or "return" part way
through the processing code, so that each item is only partially processed.
Move the next or return statement around in a binary-search-like way to figure
out where the leak is triggered. Monitor the memory usage closely so that
you can figure out whether it is leaking without having to wait 3 days
and/or for an OOM error.
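The bisection idea above might look like this; stage_one() and stage_two() are placeholder names for the first and second halves of the real per-item processing:

```perl
use strict;
use warnings;

my @items           = 1 .. 10;
my $stage_one_calls = 0;
my $fully_processed = 0;

for my $item (@items) {
    stage_one($item);
    next;                # move this line up/down to bisect where the leak is
    stage_two($item);    # skipped while the "next" sits above it
    $fully_processed++;
}

sub stage_one { $stage_one_calls++ }
sub stage_two { }
```

If memory still grows with the "next" in place, the leak is in the stages above it; if it stops, the leak is below. Repeat, halving the remaining code each time.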

Xho
 

alpha_beta_release

Ah yes, that is the automatic deallocation (using scope blocks {}). To tell
Perl explicitly ('we are done with this thing here') we use undef. BTW,
I'm also suggesting trying multithreading.
 
