Small memory problem


David Morel

Hi all,

I am having a small problem with memory management.
Here is my code:

my $num = 500000;
my @paths;
# memory usage is now low
sleep 5;

for (my $i = 0; $i <= $num; $i++) {
    my $path = [$i, $i];
    push @paths, $path;
}
# memory usage is now high
sleep 5;

undef @paths;
# now I want to return to low memory usage,
# but the above undef statement doesn't get the job done!
sleep 5;

Notice that in the second block of code I create an array with 500000
pointers. Then, in the third block of code, I delete all of the
pointers with undef (I think). Then why doesn't the garbage collector
kick in? I mean the total memory usage after the second block is about
75MB, and then it only drops to about 73MB, as opposed to about 1MB?

Thanks!
 

Nicholas Dronen

DM> Hi all,

DM> I am having a small problem with memory management.
DM> Here is my code:

DM> my $num = 500000;
DM> my @paths;
DM> # memory usage is now low
DM> sleep 5;

DM> for (my $i = 0; $i <= $num; $i++) {
DM>     my $path = [$i, $i];
DM>     push @paths, $path;
DM> }
DM> # memory usage is now high
DM> sleep 5;

DM> undef @paths;
DM> # now I want to return to low memory usage,
DM> # but the above undef statement doesn't get the job done!
DM> sleep 5;

DM> Notice that in the second block of code I create an array with 500000
DM> pointers. Then, in the third block of code, I delete all of the
DM> pointers with undef (I think). Then why doesn't the garbage collector
DM> kick in? I mean the total memory usage after the second block is about
DM> 75MB, and then it only drops to about 73MB... (as opposed to about
DM> 1MB) ?

Did you check 'perldoc -q free'?

Found in /usr/share/perl/5.8.0/pod/perlfaq3.pod
How can I free an array or hash so my program shrinks?

You usually can't. On most operating systems, memory allocated to a
program can never be returned to the system. That's why long-running
programs sometimes re-exec themselves. Some operating systems (notably,
systems that use mmap(2) for allocating large chunks of memory) can
reclaim memory that is no longer used, but on such systems, perl must
be configured and compiled to use the OS's malloc, not perl's.

. . . .

The pages of memory perl allocated *are* freed within the context
of the process. What you're seeing, however, is that the memory
allocator hasn't shrunk the heap -- this is normal behavior. Pages
allocated from the heap are "returned" to the operating system's
free pool of memory only when the process exits.
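A quick way to convince yourself the freed memory is still usable by the
process (a sketch only; the actual process-size numbers vary with your OS
and malloc):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build the big structure, free it, then build it again.
# The second pass draws on perl's internal free pool, so the
# process size should barely grow beyond the first peak.
my @paths = map { [ $_, $_ ] } 1 .. 500_000;
undef @paths;    # SVs go back to perl's pool, not to the OS

my @again = map { [ $_, $_ ] } 1 .. 500_000;
print scalar(@again), "\n";    # prints 500000
```

Watching the process in top(1) while this runs, you'd see one jump in
resident size at the first build and almost none at the second.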

Regards,

Nicholas
 

Benjamin Goldberg

David Morel said:
# now I want to return to low memory usage,
# but the above undef statement doesn't get the job done!

That's right, and that's the way it is; when a program frees memory
internally, it doesn't necessarily return it to the OS until the process
exits. Check 'perldoc -q free' for information about this.

A comment on your code, however. Seems a bit C'ish to me, as you could
have written the same thing like this;

my @paths = ();
sleep 5;
for ( 1 .. 500_000 ) {
    push( @paths, [$_, $_] );
}
sleep 5;

Surely you mean:

my @paths;
sleep 5;
@paths = map [$_, $_], 1 .. 500_000;
sleep 5;

:)
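For what it's worth, both spellings build exactly the same structure; here
is a quick sanity check (a sketch, using a smaller count than the original
500_000 for speed):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $num = 1000;    # smaller than the original 500_000, same shape

# C-ish push loop
my @loop;
for ( 1 .. $num ) {
    push @loop, [ $_, $_ ];
}

# map one-liner
my @map = map [ $_, $_ ], 1 .. $num;

# same length, same contents
print "same\n"
    if @loop == @map
    && !grep { $loop[$_][0] != $map[$_][0] || $loop[$_][1] != $map[$_][1] }
            0 .. $#loop;
```

So the choice between them is purely one of style (and, as discussed
below, of how the range is evaluated).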
 

Ilya Zakharevich

[A complimentary Cc of this posting was sent to Benjamin Goldberg], who wrote:
my @paths = ();
sleep 5;
for ( 1 .. 500_000 ) {
    push( @paths, [$_, $_] );
}
sleep 5;

Surely you mean:

my @paths;
sleep 5;
@paths = map [$_, $_], 1 .. 500_000;
sleep 5;

Are you sure that 1 .. 500_000 is lazy in this context too?

Thanks,
Ilya
 

Anno Siegel

Ilya Zakharevich said:
[A complimentary Cc of this posting was sent to Benjamin Goldberg], who wrote:
my @paths = ();
sleep 5;
for ( 1 .. 500_000 ) {
    push( @paths, [$_, $_] );
}
sleep 5;

Surely you mean:

my @paths;
sleep 5;
@paths = map [$_, $_], 1 .. 500_000;
sleep 5;

Are you sure that 1 .. 500_000 is lazy in this context too?

Does it matter? I'd expect the list of arrayrefs to occupy about
ten times what the list of integers takes, and rough measurements
bear that out. That's not too much of an impact.

Anno
 

Anno Siegel

Ilya Zakharevich said:
[A complimentary Cc of this posting was sent to Anno Siegel], who wrote:
@paths = map [$_, $_], 1 .. 500_000;
Are you sure that 1 .. 500_000 is lazy in this context too?
Does it matter? I'd expect the list of arrayrefs to occupy about
ten times what the list of integers takes, and rough measurements
bear that out. That's not too much of an impact.

I think it matters. 1..500K is a *constant* array. It is created at
*compile* time, and is never free()ed. @paths may be bigger, but it
has a chance to come and go (e.g., when the subroutine exits).

I still wouldn't worry about it much if the overall memory consumption
is much larger.
However, if you use this construct in 50 subroutines, the *minimal*
footprint of your program is 50 large arrays - even before the program
starts.

This would indeed be an unfortunate state of affairs -- one I didn't
think of.

use constant A_MILLION_TIMES => 0 .. 999_999;

map { ... } A_MILLION_TIMES;


...would actually save space in this case. But that's no longer serious.
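To make the laziness point concrete, here is a sketch (it assumes a perl
where foreach over a literal range is optimised into a counting loop,
which is true of any remotely recent perl):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# foreach over a literal range is special-cased to a counting loop,
# so no 500_000-element list is built up front ...
my $sum = 0;
$sum += $_ for 1 .. 500_000;

# ... while map in list context flattens the range into a real list
# before producing its results.
my @paths = map [ $_, $_ ], 1 .. 500_000;

print "$sum ", scalar(@paths), "\n";    # prints "125000250000 500000"
```

Both loops visit the same 500_000 values; only the second one pays for
the flattened list.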

Anno
 
