Eric Sossman wrote (in an article Google is too stupid to retrieve):
> So: The overall picture is that of a program that runs
> in "phases," where the early phases require a lot of memory
> and the later phases (which run for a long time) require very
> little. One can imagine such programs, but they're fairly
> unusual -- and some of the counter-examples might profitably
> be broken into two or more programs anyhow.
A typical setup I run into is a program which compiles
things into a data structure. As it turns out, the
compilation needs a huge amount of intermediate memory,
but the resulting structure needs only roughly
20% of the peak usage (peak at 1 GB).
Since compilation takes time, the program
is set up as a server in order to get faster
response times. In addition I need many of those beasts.
Without giving back the memory, far fewer of those
beasts would fit into one machine.
Other solutions could be:
1) Let the OS take care of swapping the unused stuff out.
It will just lie there and never be swapped in again.
This works, but not as nicely as actually giving the
memory back. I tried it.
2) Write the resulting data structure to a file after
compilation. I tried this with serialization, but due to
the high connectivity of the data structure it was even
slower than compiling, and rebuilding the structure from
the file also seemed to need a large amount of memory.
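For reference, the serialization attempt would look roughly like this. The Node class is a hypothetical stand-in for the real compiled structure; the point is that ObjectOutputStream walks the entire object graph (tracking references so cycles are written only once), which is exactly what gets expensive when the structure is highly connected:

```java
import java.io.*;

// Hypothetical stand-in for the compiled structure: a highly
// connected graph of nodes.
class Node implements Serializable {
    int value;
    Node[] neighbors;
    Node(int value, int fanout) {
        this.value = value;
        this.neighbors = new Node[fanout];
    }
}

public class Persist {
    // Serialize the whole graph reachable from root to a file.
    static void save(Node root, File f) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject(root); // traverses the entire object graph
        }
    }

    // Rebuild the graph in memory from the file.
    static Node load(File f) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(new FileInputStream(f))) {
            return (Node) in.readObject();
        }
    }
}
```

Note that load() reconstructs the full graph on the heap again, which matches the observation that reading it back needs a lot of memory.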
The 2nd approach could probably be improved,
but with -XX:MaxHeapFreeRatio and calling the GC a few
times to convince it that there really is memory to give
back, I got exactly what I want, and it works fine.
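A minimal sketch of that setup, assuming the flag meant is HotSpot's -XX:MaxHeapFreeRatio (whether and when the heap actually shrinks depends on the JVM and collector). The class, sizes, and compile() body here are made up for illustration:

```java
// Run with something like:
//   java -XX:MinHeapFreeRatio=10 -XX:MaxHeapFreeRatio=20 CompileServer
// so the collector is willing to shrink the heap once live data drops.
import java.util.ArrayList;
import java.util.List;

public class CompileServer {
    // The compact result kept around for serving requests.
    static int[] result;

    // Stand-in for the real compiler: builds large intermediate
    // structures, keeps only a small result.
    static void compile() {
        List<int[]> scratch = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            scratch.add(new int[1 << 18]); // ~100 MB of intermediates
        }
        result = new int[1 << 18];         // ~1 MB actually kept
        scratch.clear();                   // intermediates become garbage
    }

    public static void main(String[] args) {
        compile();
        // Nudge the collector a few times so the intermediates are
        // collected and the heap has a chance to shrink toward
        // MaxHeapFreeRatio. System.gc() is only a hint.
        for (int i = 0; i < 3; i++) {
            System.gc();
        }
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        System.out.println("live data roughly " + usedMb + " MB");
    }
}
```

The repeated System.gc() calls mirror the "calling the GC a few times" above; a single call often leaves the heap committed at its peak size.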