Hi,
We're working on a Java application at my company and we're having some
memory issues. Let me point out up front that I am not a Java programmer
myself, but I've programmed in lots of other languages, so I understand
things like memory allocation, heaps, stacks, etc.
Unlike most of the Java applications I'm familiar with, ours doesn't have
a GUI. It just sits in the background, listening on a TCP socket. Other
programs connect on that socket and hand the application a chunk of data,
which it processes and adds to our database. The application allocates the
memory it needs to process the dataset, then (supposedly) releases that
memory when it's done.
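For what it's worth, here's a stripped-down sketch of the kind of loop
I'm describing (the names and the port are made up, and our real code is
considerably more involved); the point is just that the per-dataset
buffer goes out of scope once a request has been handled:

    import java.io.DataInputStream;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class IngestServer {
        public static void main(String[] args) throws Exception {
            ServerSocket server = new ServerSocket(4000);
            while (true) {
                Socket client = server.accept();
                try {
                    DataInputStream in =
                        new DataInputStream(client.getInputStream());
                    int length = in.readInt();          // size of the incoming dataset
                    byte[] dataset = new byte[length];  // the big allocation
                    in.readFully(dataset);
                    process(dataset);                   // parse it, insert into the database
                } finally {
                    client.close();
                }
                // At this point nothing references the dataset any more,
                // so it should be eligible for garbage collection.
            }
        }

        private static void process(byte[] dataset) {
            // database work omitted
        }
    }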
Normally, the datasets are in the 1-5 MB range but, every so often,
there'll be a huge dataset that causes an enormous increase in Java's
memory usage. We've seen it climb to over half a gigabyte. That isn't a
problem short-term, but Java never releases that memory back to the OS.
This is true of every JVM we've tried, on Tru64, AIX, Linux, and Windows.
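To make sure we're comparing the right numbers (heap the VM has reserved
from the OS versus memory actually held by live objects), I asked our
developers to log the figures Java itself reports after each dataset,
roughly like this (the class name is mine, and I realize System.gc() is
only a hint to the collector):

    public class HeapReport {
        public static void print(String label) {
            System.gc();  // just a hint, but good enough for a ballpark figure
            Runtime rt = Runtime.getRuntime();
            long total = rt.totalMemory();  // heap currently reserved from the OS
            long free  = rt.freeMemory();   // unused portion of that heap
            System.out.println(label + ": used=" + (total - free)
                    + " bytes, reserved=" + total + " bytes");
        }
    }

If the objects really are being released, the "used" figure should drop
back down after a big dataset; it's the "reserved" figure that stays high.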
The Java programmers claim they're releasing their objects properly, and
I'd be inclined to agree: as long as the application doesn't receive any
more large datasets, Java doesn't grab any additional memory from the OS,
so it's apparently reusing the heap it already holds. They tell me the
only way to give the memory back is to restart the VM, but we'd rather
not do that, since this is a server-side application that's supposed to
be running 24/7.
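The workaround we're considering in the meantime is simply capping how
large the heap is allowed to grow when we start the VM. My understanding
(corrections welcome) is that -Xmx puts a hard ceiling on the heap, and
that on Sun's HotSpot VM the free-ratio flags influence when the heap is
allowed to shrink again; other vendors' VMs presumably have their own
equivalents. Something along these lines, using the class name from the
sketch above:

    java -Xmx256m \
         -XX:MinHeapFreeRatio=20 \
         -XX:MaxHeapFreeRatio=40 \
         IngestServer

That at least bounds the damage from one huge dataset, even if it doesn't
explain why the memory is never handed back.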
I find it hard to believe that memory management for a language as popular
as Java could be - well - that broken, so I'm feeling the need for some
verification. Anyone know what's going on here? At this point, I'm
beginning to wonder if it was even appropriate to use Java for a
server-side application.
Thanks,
Chris Ott
<first initial, last name at acclamation dot com>