freeing memory

Harald Kirsch

Eric Sossman wrote (in an article Google is too stupid to retrieve,
grrr.)
So: The overall picture is that of a program that runs
in "phases," where the early phases require a lot of memory
and the later phases (which run for a long time) require very
little. One can imagine such programs, but they're fairly
unusual -- and some of the counter-examples might profitably
be broken into two or more programs anyhow.

A typical setup I run into is a program which compiles
things into a data structure. As it turns out, a huge
amount of intermediate memory is needed during
compilation, but the resulting structure needs only
roughly 20% of the peak usage (peak at 1GB).
The compilation takes time, so the program
is set up as a server in order to have faster
response times. In addition I need many of those beasts.
Without giving the memory back, far fewer of those
beasts would fit onto one machine.

Other solutions could be:
1) Let the OS take care of swapping the unused stuff out.
It will just lie there and will never be swapped in again.
Well, this does not work as nicely as actually giving
the memory back. I tried it.

2) Write the resulting data structure to a file after
compilation. I tried this with serialization, but due to
the high connectivity of the data structure this was even
slower than compiling, and it also seemed to need a large
amount of memory to set up the data structure in memory.
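
What I tried was plain Java serialization, roughly like
the sketch below. The file name and the helper class are
just placeholders for illustration, not my real code:

    import java.io.*;

    class SnapshotDemo {
        // Write the compiled structure to disk after the build phase.
        static void save(Object data) throws IOException {
            try (ObjectOutputStream out =
                     new ObjectOutputStream(new FileOutputStream("compiled.bin"))) {
                out.writeObject(data);   // data and everything it references
            }                            // must implement Serializable
        }

        // Read it back later, e.g. in a fresh, smaller JVM.
        static Object load() throws IOException, ClassNotFoundException {
            try (ObjectInputStream in =
                     new ObjectInputStream(new FileInputStream("compiled.bin"))) {
                return in.readObject();  // slow and memory-hungry for
            }                            // highly connected object graphs
        }
    }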

The second approach could probably be improved, but with
-XX:MaxHeapFreeRatio and calling the GC a few times
to convince it that there is memory to give back,
I got exactly what I want and it works fine.
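
Roughly, what I do now looks like the sketch below. The
flag values and the helper methods are just placeholder
examples, not my real code:

    // Launched with something like:
    //   java -XX:MinHeapFreeRatio=10 -XX:MaxHeapFreeRatio=30 Server
    public class Server {
        public static void main(String[] args) throws InterruptedException {
            Object data = compile();     // peak: ~1GB of intermediates
            // The intermediates are unreachable now. Ask for a few
            // collections so the heap shrinks back toward MaxHeapFreeRatio.
            for (int i = 0; i < 3; i++) {
                System.gc();
                Thread.sleep(100);       // give the collector a moment
            }
            serve(data);                 // long-running phase, ~200MB live
        }

        private static Object compile() { /* build the structure */ return new Object(); }
        private static void serve(Object data) { /* answer requests */ }
    }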

Harald.
 
