Martin Trautmann
Hi all,
I have a Perl script here that takes more memory than expected: it
starts at about 10 MB, which is fine. Halfway through the job (3000 URL
checks) it has grown to more than 500 MB, and it keeps growing until
termination. I've also seen several segmentation faults right when the
script ends, but I don't know why. It's just a while loop that runs
over and over without problems, and there are no further statements
after the end of this loop. Maybe the growing memory size is related to
this error.
I use strict and warnings, and I have verified that there is nothing
wrong with the variables or with opening and closing files.
I make extensive use of WWW::Mechanize, without any knowledge of what
is going on inside.
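
The core of the script is roughly the following (a simplified sketch:
the real URL list is read from elsewhere, autocheck is turned off here
only so that failed fetches don't die, and all other Mechanize options
are left at their defaults, including stack_depth):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # Defaults throughout; in particular stack_depth is not set,
    # so Mechanize keeps its full page history.
    my $mech = WWW::Mechanize->new( autocheck => 0 );

    # Placeholder list; the real script checks ~6000 URLs.
    my @urls = ( 'http://example.com/', 'http://example.org/' );

    while ( my $url = shift @urls ) {
        my $response = $mech->get($url);
        print "$url: ", $response->code, "\n";
    }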
I used Devel::Leak to get a sense that there is a lot going on, but I
do not yet know what to do with that kind of information.
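
For reference, this is roughly how I called it (a minimal sketch; the
commented line stands in for one pass of the real loop):

    use Devel::Leak;

    my $handle;
    my $before = Devel::Leak::NoteSV($handle);   # count live SVs now

    # ... one iteration of the URL-checking loop here ...

    my $after = Devel::Leak::CheckSV($handle);   # dumps SVs created since NoteSV
    print "SVs before: $before, after: $after\n";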
What's your recommendation for a Perl novice on how to track down these
leaks?
Thanks,
Martin