Memory utilization

ghostwhoowalks

I was trying to profile the following piece of code using NetBeans:

public static void main(String[] args)
{
    Map<String, String> map = new HashMap<String, String>();
    for (int i = 0; i < Integer.MAX_VALUE; ++i)
    {
        map.put(Integer.valueOf(i).toString(),
                Integer.valueOf(i).toString());
    }
}


I set my max and min heap sizes to 1 GB. I find that the program dies
after having created 4M entries (at which point the heap used is at
1 GB), although the memory used by the objects as indicated by the
NetBeans memory profiler is the following:

String -- 48MB
char[] -- 48MB
HeapCharBuffer -- 45MB
Integer -- 19MB
HashMap$Entry -- 19MB


This totals to less than 200 MB, and the remaining classes, each shown
at 0% of the total used memory, account for at most another 200 MB. How
come the heap usage is at 1 GB at this point? I am extremely confused.
Could someone please explain what might be going on?

Thanks
A
 

tzvika.barenholz

ghostwhoowalks said:
How come the heap usage is at 1 GB at this point? I am extremely
confused. Could someone please explain what might be going on?

The profiler itself takes up quite a bit of memory, but doesn't show it
to you. Space is also needed for loaded classes and other JVM overhead.
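
You can get a rough sense of what sits outside the object heap by asking
the JVM directly through java.lang.management; a minimal sketch (the
class name is just a placeholder):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class MemorySplit {
    public static void main(String[] args) {
        MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = bean.getHeapMemoryUsage();
        MemoryUsage nonHeap = bean.getNonHeapMemoryUsage();
        // "used" is what is currently occupied; "committed" is what the JVM
        // has actually reserved from the operating system.
        System.out.println("heap:     used=" + (heap.getUsed() >> 20)
                + "MB committed=" + (heap.getCommitted() >> 20) + "MB");
        System.out.println("non-heap: used=" + (nonHeap.getUsed() >> 20)
                + "MB committed=" + (nonHeap.getCommitted() >> 20) + "MB");
    }
}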

T
 

GArlington

ghostwhoowalks said:
I find that the program dies after having created 4M entries (at which
point the heap used is at 1 GB) ...

Something is VERY wrong here:
"4M entries" in a Map<String, String> means 8M Strings, and each String
has an OVERHEAD of roughly 40 bytes (AFAIK). That alone should give you
about 320 MB of heap usage even if ALL the Strings were empty.
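
A crude way to sanity-check that per-String figure on your own JVM (only
an estimate; System.gc() is just a hint, and the numbers vary by VM and
architecture):

public class StringOverheadEstimate {
    private static long used(Runtime rt) {
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        int n = 1000000;
        System.gc();
        long before = used(rt);
        String[] strings = new String[n];
        for (int i = 0; i < n; i++) {
            strings[i] = Integer.toString(i);
        }
        System.gc();
        long after = used(rt);
        // Average footprint per String, including its char[] payload and the
        // 4-byte array slot holding the reference, so expect a few dozen bytes.
        System.out.println("approx bytes per String: " + (after - before) / n);
        System.out.println(strings.length + " strings still referenced");
    }
}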
 

Lew

GArlington said:
each String has an OVERHEAD of roughly 40 bytes (AFAIK)

I'm not sure that's correct. Are you sure it isn't 4 bytes per String? 40
seems ridiculously high. Where'd you get that number?
 

gwoodhouse

Lew said:
I'm not sure that's correct. Are you sure it isn't 4 bytes per String? 40
seems ridiculously high. Where'd you get that number?

On the machine you are on, is it possible to increase your max heap size
any higher? If so, problem solved!
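
For example, assuming a JVM and OS that can actually give you the extra
memory (the class name below is just a placeholder):

java -Xms2g -Xmx2g MapFillTest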

Graeme
 

Patricia Shanahan

On the machine you are on is it possible to increase your max heap
space higher? If so problem solved!

That depends on what the problem really is. The program looks more like
a test case than an actual application. I believe the problem is
explaining the relationship between the NetBeans memory profile, the
program, and the out-of-memory condition. If so, increasing the heap
space could not possibly solve it, and might make it harder to solve.

Patricia
 

Eric Sosman

Might be high, but doesn't seem ridiculous. A String
instance has its object-ness plus three ints and a char[]
reference. The char[] will have its own object-ness plus
a length. If object-ness takes eight bytes and a reference
takes four (32-bit JVM), we're up to at least 36 bytes. If
there's an eight-byte alignment requirement, 40 bytes is
right on the money.
gwoodhouse said:
On the machine you are on is it possible to increase your max heap
space higher? If so problem solved!

He's trying to create 2G pairs of String objects (72-80
bytes of overhead per pair) plus 2G HashMap.Entry objects
(~24 bytes each, under the same set of assumptions), hence
about 192-208GB -- and that's not counting the "payload" of
the char values, which will add another 80GB or thereabouts.
272GB is well beyond what a 32-bit JVM can manage, so he'll
need to go to a 64-bit version -- and this will expand the
String overhead to at least 40 bytes, and the HashMap.Entry
size to at least 36-40, raising the total to about 312-320GB.
And that's just for this one data structure; the rest of the
Java environment needs some heap, too. I'd guess that a
machine with half a terabyte of RAM could just about manage
it, with luck and a following wind.
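
In code form, using the same 32-bit guesses (these are assumptions about
object layout, not measured values):

long pairs = Integer.MAX_VALUE;          // ~2G iterations of the loop
long perString = 36;                     // header + 3 ints + char[] ref + array header/length
long perEntry = 24;                      // HashMap.Entry: header + 3 refs + cached hash
long payload = 80L * (1L << 30);         // ~80 GB of actual char data
long total = pairs * (2 * perString + perEntry) + payload;
System.out.println((total >> 30) + " GB");   // prints 271, i.e. the ~272 GB above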

... but in truth he's not really trying to populate this
enormous and pointlessly wasteful data structure; he's trying
to understand how the NetBeans profiler reports the memory
used by the program, and I'm afraid I can't help much with
that. After 4M pairs he sees
String -- 48MB
char[] -- 48 MB
HeapCharBuffer -- 45MB
Integer -- 19MB
HashMap$Entry -- 19MB

... which works out to 6 bytes per string, 6 bytes per char[],
and <5 bytes per HashMap.Entry -- clearly, the profiler reports
some other notion of "memory used" than we've been talking
about in the last few messages. All I can suggest is RTFM.
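
One thing that might be worth trying, assuming a Sun JDK: jmap's
histogram mode dumps instance counts and bytes per class straight from
the running heap, which would give an independent number to compare
with the profiler's view:

jmap -histo <pid>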
 

ghostwhoowalks

Eric Sosman said:
... but in truth he's not really trying to populate this enormous and
pointlessly wasteful data structure; he's trying to understand how the
NetBeans profiler reports the memory used by the program ...

Obviously my intention is not to fill 2G of pairs. If you thought so, I
think you are dumber than I am. The point is to understand, when the JVM
craps out, how much memory is taken up by the VM apart from the objects
used by the application. The profiler reports that the memory used by
the objects in the heap is at most 200M, but the heap memory used is at
800M. So why is there a 4x increase? Even if you throw in 200M used by
the profiler, there is an additional 400M that is unaccounted for. I
even tried the MinHeapFreeRatio and MaxHeapFreeRatio options, but to no
avail. I agree that there will be some overhead with Java, but a 3x or
4x factor seems seriously off. I am wondering if someone can point me
to what might be going wrong.

A
 

Lew

ghostwhoowalks said:
The point is to understand, when the JVM craps out, how much memory is
taken up by the VM apart from the objects used by the application. The
profiler reports that the memory used by the objects in the heap is at
most 200M, but the heap memory used is at 800M. So why is there a 4x
increase?

How are you determining that the heap consumption is 800 MB?
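
One easy cross-check is to print a number from inside the program itself
and compare it with the profiler's per-class figures; a minimal sketch,
keeping in mind that totalMemory() and freeMemory() describe only the
Java heap and that System.gc() is just a hint:

Runtime rt = Runtime.getRuntime();
System.gc();
long usedBytes = rt.totalMemory() - rt.freeMemory();
System.out.println("used heap: " + (usedBytes >> 20) + " MB"
        + ", committed heap: " + (rt.totalMemory() >> 20) + " MB");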
 
