<class>[] leaks memory?


Eric Lo

Hello all,

I seldom post such a long message, but I am really tired of fighting
with this thing, so I am asking for your help here.

My program loads a pretty big data structure from disk.

On disk, the file is 843,142 bytes: "10000c_UNI.rcs".

Since I have been stuck on this out-of-memory problem for a long
time, I have tried several profiling approaches:

1) Added Runtime memory statements:

static Runtime r = Runtime.getRuntime();

public static void main(String args[]) {

    tuning("b4 loading the conflict set:");

    String fileName = "10000c_UNI";
    ConflictTree rcs = (ConflictTree) readObjectFromFile(fileName + ".rcs");
    System.out.println("Real Conflict Set loaded from " + fileName + ".rcs");

    tuning("after loading the conflict set:");
}

public static void tuning(String msg) {

    System.err.println(msg);
    System.err.println("Memory: curr total for JVM=" + r.totalMemory());
    System.err.println("Memory: free=" + r.freeMemory());
    System.err.println("Memory: in used=" + (r.totalMemory() - r.freeMemory()));
    System.err.println("Memory: JVM max heap=" + r.maxMemory());

}

===output===
b4 loading the conflict set:
Memory: curr total for JVM=2031616
Memory: free=1265136
Memory: in used=766824
Memory: JVM max heap=66650112
Real Conflict Set loaded from 10000c_UNI.rcs

after loading the conflict set:
Memory: curr total for JVM=6451200
Memory: free=7296
Memory: in used=6444248
Memory: JVM max heap=66650112
JProfiler> Keeping VM alive until frontend disconnects.

============

Looking at the verbose output, my first guess was that the data
structure ConflictTree has a problem: it used up 6444248 - 766824 =
about 5.4M of memory, even though it takes only about 843K on disk (I
understand that an object in memory is usually bigger than its
serialized form).

However, I found no obvious memory leak in ConflictTree.java.

Therefore, I tried JProfiler 3:

2) Used JProfiler 3 (evaluation version) to track where the memory
has gone:

Interestingly, according to JProfiler (the first number is the
instance count, the second is the memory used in bytes), the
ConflictTree and its components (TreeEdge) used up only ~240K, with
exactly 10,001 objects (matching my input):

---------
<class>[ ] 47,768 2,921,384
char[ ] 20,497 681,616
java.util.ArrayList 22,673 544,152
java.lang.Integer 20,000 320,000
java.lang.String 13,219 317,256
java.util.HashMap$Entry 12,816 307,584
regtest.conflictManagement.TreeEdge 10,001 240,024
byte[ ] 229 165,472
java.lang.StringBuffer 7,253 116,048
short[ ] 338 21,080
java.lang.Class 232 20,416
.....
regtest.conflictManagement.ConflictTree 1 32
---------

I found a strange entry, <class>[ ], that uses up about 2.9M of
memory across 47,768 instances.

Up to this point, I am lost:

(1) Either <class>[] refers to an array held by one of the fields of
TreeEdge; the fields of TreeEdge are:

private Object trace;
private ArrayList data = new ArrayList();
private HashMap children;
private TreeEdge parent;

or
(2) There is something else called <class>[]?


*********************

In fact, this is just a test program that I extracted from a very
large program. The original program always throws an out-of-memory
error as it keeps running and adding more TreeEdges to the
ConflictTree, even though I set the heap size to nearly 2G (it runs
on a 4G 32-bit server).

Thanks a lot for reading this message!

Cheers and thanks!

Eric
 

Joe Smith

char[ ] 20,497 681,616
java.util.ArrayList 22,673 544,152
java.lang.Integer 20,000 320,000
java.lang.String 13,219 317,256
java.util.HashMap$Entry 12,816 307,584
regtest.conflictManagement.TreeEdge 10,001 240,024 [...]
(1) Either <class>[] refers to an array held by one of the fields of
TreeEdge; the fields of TreeEdge are:

private Object trace;
private ArrayList data = new ArrayList();
private HashMap children;
private TreeEdge parent;

or
(2) There is something else called <class>[]?

Well, I'm not really sure, but I think it's the space taken by the
"definitions" of the classes. When you first start your program, the
classloader hasn't read all the classes yet; they are read into memory
as they are needed, so when you create the ConflictTree, a lot of new
classes may be loaded. You could easily verify this by creating a second
instance of your ConflictTree and watching the profile again. I'm
guessing the number of classes will be the same before and after this
second creation.
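This check can be sketched with the JDK's class-loading MXBean. A toy stand-in (an ArrayList plays the role of ConflictTree, since the real class isn't available here); the point is the measurement technique, not the numbers:

```java
import java.lang.management.ClassLoadingMXBean;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

public class ClassCountCheck {
    public static void main(String[] args) {
        ClassLoadingMXBean cl = ManagementFactory.getClassLoadingMXBean();

        long before = cl.getLoadedClassCount();
        List<String> first = new ArrayList<>();    // stand-in for the first ConflictTree
        long afterFirst = cl.getLoadedClassCount();

        List<String> second = new ArrayList<>();   // second instance: should load no new classes
        long afterSecond = cl.getLoadedClassCount();

        // If the <class>[] growth were class loading, the count would rise on
        // the first creation and stay flat on the second.
        System.out.println("sizes=" + (first.size() + second.size()));
        System.out.println("before=" + before
                + " afterFirst=" + afterFirst
                + " afterSecond=" + afterSecond);
    }
}
```

If the count stays flat across the second creation while <class>[] keeps growing, class loading is not the culprit.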

Hope this gives you more hints.
 

John C. Bollinger

Eric said:
Hello all,

I seldom post such a long message, but I am really tired of fighting
with this thing, so I am asking for your help here.

My program loads a pretty big data structure from disk.

On disk, the file is 843,142 bytes: "10000c_UNI.rcs".

Since I have been stuck on this out-of-memory problem for a long
time, I have tried several profiling approaches:

1) Added Runtime memory statements:

public static void main(String args[]) {

tuning("b4 loading the conflict set:");

String fileName = "10000c_UNI";
ConflictTree rcs = (ConflictTree) readObjectFromFile(fileName + ".rcs");
System.out.println("Real Conflict Set loaded from " + fileName + ".rcs");

tuning("after loading the conflict set:");
}

public static void tuning(String msg){

Although it is usually a poor idea, for this particular case it would
behoove you to run a System.gc() at this point. That will give you the
best chance of profiling only reachable objects. You don't (oughtn't
to) care about unreachable ones that just haven't been collected yet.
System.err.println(msg);
System.err.println("Memory: curr total for JVM=" + r.totalMemory());
System.err.println("Memory: free=" + r.freeMemory());
System.err.println("Memory: in used=" + (r.totalMemory() -
r.freeMemory()));
System.err.println("Memory: JVM max heap=" + r.maxMemory());

}
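Applied to the tuning() helper above, the suggestion might look like this minimal, self-contained sketch (field and method names simplified from Eric's code):

```java
public class Tuning {
    static final Runtime r = Runtime.getRuntime();

    static void tuning(String msg) {
        // Ask for a collection first so the numbers mostly reflect reachable
        // objects. System.gc() is only a hint, but it is usually honored.
        System.gc();
        System.err.println(msg);
        long used = r.totalMemory() - r.freeMemory();
        System.err.println("Memory: in use=" + used);
        System.err.println("Memory: JVM max heap=" + r.maxMemory());
    }

    public static void main(String[] args) {
        tuning("baseline:");
        int[] big = new int[1_000_000];   // roughly 4 MB of int data
        tuning("after allocating ~4 MB:");
        System.out.println("kept " + big.length + " ints reachable");
    }
}
```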

===output===
b4 loading the conflict set:
Memory: curr total for JVM=2031616
Memory: free=1265136
Memory: in used=766824
Memory: JVM max heap=66650112
Real Conflict Set loaded from 10000c_UNI.rcs

after loading the conflict set:
Memory: curr total for JVM=6451200
Memory: free=7296
Memory: in used=6444248
Memory: JVM max heap=66650112
JProfiler> Keeping VM alive until frontend disconnects.

============

Looking at the verbose output, my first guess was that the data
structure ConflictTree has a problem: it used up 6444248 - 766824 =
about 5.4M of memory, even though it takes only about 843K on disk (I
understand that an object in memory is usually bigger than its
serialized form).

If you are referring to Java's built-in serialization mechanism then you
are correct, but does the file you are trying to read actually contain a
serialized object in that sense? It is not obvious that you are reading
it as if it did. Depending on the form of a particular collection of
data and of the Java classes that you choose to represent it, the object
representation can be *far* larger.
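This point is easy to demonstrate: the serialized form of a collection is often much smaller than its heap footprint, because on the heap every boxed element pays an object header plus a reference slot. A small sketch (the numbers are illustrative, not Eric's data):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.ArrayList;
import java.util.List;

public class SerializedVsHeap {
    public static void main(String[] args) throws IOException {
        List<Integer> list = new ArrayList<>();
        for (int i = 0; i < 10_000; i++) {
            list.add(i);   // each element is a boxed Integer object on the heap
        }

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(list);
        }

        // On a 32-bit JVM each Integer costs roughly 16 bytes plus a 4-byte
        // array slot -- ~200 KB of heap for this list -- while the serialized
        // stream writes compact per-element records.
        System.out.println("serialized size in bytes: " + bos.size());
    }
}
```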
However, I found no obvious memory leak in ConflictTree.java.

Therefore, I tried JProfiler 3:

2) Used JProfiler 3 (evaluation version) to track where the memory
has gone:

Interestingly, according to JProfiler (the first number is the
instance count, the second is the memory used in bytes), the
ConflictTree and its components (TreeEdge) used up only ~240K, with
exactly 10,001 objects (matching my input):

And TreeEdge has no members? A reference to another object (an array,
for instance) is small, but the object referred to may consume, directly
or indirectly, a large amount of memory.
---------
<class>[ ] 47,768 2,921,384
char[ ] 20,497 681,616
java.util.ArrayList 22,673 544,152
java.lang.Integer 20,000 320,000
java.lang.String 13,219 317,256
java.util.HashMap$Entry 12,816 307,584
regtest.conflictManagement.TreeEdge 10,001 240,024
byte[ ] 229 165,472
java.lang.StringBuffer 7,253 116,048
short[ ] 338 21,080
java.lang.Class 232 20,416
....
regtest.conflictManagement.ConflictTree 1 32
---------

I found a strange entry, <class>[ ], that uses up about 2.9M of
memory across 47,768 instances.

Up to this point, I am lost:

(1) Either <class>[] refers to an array held by one of the fields of
TreeEdge; the fields of TreeEdge are:

private Object trace;
private ArrayList data = new ArrayList();
private HashMap children;
private TreeEdge parent;

This looks a bit heavyweight for a tree node (and it definitely looks
like a node rather than an edge). Not so heavyweight, however, that I
would expect the structure, in itself, to be the cause of your problem.
or
(2) There is something else called <class>[]?

I'm sure that <class>[] is a collective name representing all arrays of
reference types (e.g. Object[], String[]). Every ArrayList contains
one of these, and every HashMap contains several.
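That reading fits the profile: each of the 22,673 ArrayLists owns an Object[] backing array (JDKs of that era eagerly allocated an Object[10] for an empty list), and each HashMap owns a bucket array. A rough sketch of the arithmetic (the per-object sizes are 32-bit estimates, not measurements):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class HiddenArrays {
    public static void main(String[] args) {
        // An "empty" ArrayList still owns an Object[] backing array...
        List<Integer> data = new ArrayList<>(10);
        // ...and a HashMap owns a table of buckets, which the profiler
        // also counts as a reference array.
        Map<String, Integer> children = new HashMap<>();
        children.put("child", 1);
        System.out.println("list=" + data.size() + " map=" + children.size());

        // Back-of-envelope for the thread's profile: 22,673 lists, each with
        // a 10-slot backing array (~16-byte header + 10 * 4-byte refs).
        int lists = 22_673;
        long bytes = lists * (16L + 10 * 4);
        System.out.println("approx. backing-array bytes: " + bytes);
    }
}
```

That estimate alone accounts for over a megabyte of the ~2.9 MB of <class>[ ], before counting the HashMap tables.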
*********************

In fact, this is just a test program that I extracted from a very
large program. The original program always throws an out-of-memory
error as it keeps running and adding more TreeEdges to the
ConflictTree, even though I set the heap size to nearly 2G (it runs
on a 4G 32-bit server).

Thanks a lot for reading this message!

If the original data consume less than 1 MB, then it is a severe
program flaw that the program cannot represent the same data in 2 GB.
The main possibilities are (1) that you are making many unnecessary
copies of the data, and (2) that you are building a great deal of
metadata / structure on top of the data. Possibly both.

I really can't analyze the memory profile because it may contain a
significant number of unreachable objects. These may slow your program
down, but they will not cause an out-of-memory condition (the VM will
GC as many objects as possible to try to free memory before it throws
an OutOfMemoryError). It would also help to have profiles for
different amounts of input data; note that any analysis based on
partial data assumes the input is roughly homogeneous.
 

Richard Wheeldon

Eric said:
In fact, this is just a test program that I extracted from a very
large program. The original program always throws an out-of-memory
error as it keeps running and adding more TreeEdges to the
ConflictTree, even though I set the heap size to nearly 2G (it runs
on a 4G 32-bit server).

I suggest you post the code for the readObjectFromFile() method. It's
the most likely cause of problems. If you're running out of memory on
a 2 GB heap, it's probably an infinite loop issue of some kind.
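For reference, a typical serialization-based readObjectFromFile looks like the sketch below. This is an assumption: Eric has not posted his version, so the method name matches his call site but the body is guessed.

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class Loader {
    // Hypothetical reconstruction; Eric's actual method may differ.
    public static Object readObjectFromFile(String fileName)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(new FileInputStream(fileName))) {
            return in.readObject();  // one call deserializes the whole graph
        }
    }

    public static void main(String[] args) throws Exception {
        // Round-trip a small object just to show the shape of the API.
        File f = File.createTempFile("demo", ".rcs");
        f.deleteOnExit();
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new FileOutputStream(f))) {
            out.writeObject("hello");
        }
        Object back = readObjectFromFile(f.getPath());
        System.out.println(back);
    }
}
```

If Eric's version instead parses the file and builds the tree by hand, that loop is exactly where Richard's suspicion would apply.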

Richard
 

HArolD

Eric,

Try starting java with the -Xloggc:loggc.out parameter. It will write
all GC events to that file, along with some additional memory data.
You can analyze the file using HPjtune:
http://www.hp.com/products1/unix/java/java2/hpjtune/downloads/index.html

If you can't find anything, feel free to send me the GC output file
and I will take a look at it.

Regards,
HArolD


steepyirl

Does building your ConflictTree (by means of readObjectFromFile)
involve lots of recursive method calls? I've found that
OutOfMemoryErrors are often tied to runaway recursion: a flaw in the
code causes infinite (or nearly infinite) recursion, and each level
allocates more objects until the heap is exhausted.
 

Eric said:
[...]

public static void main(String args[]) {

tuning("b4 loading the conflict set:");

String fileName = "10000c_UNI";
ConflictTree rcs = (ConflictTree) readObjectFromFile(fileName + ".rcs");
System.out.println("Real Conflict Set loaded from " + fileName + ".rcs");

tuning("after loading the conflict set:");
}

[...]

Why not wrap the code in a try/catch and print the stack trace when
the program throws the OutOfMemoryError? That should help you narrow
down where in the program you are running out of memory.
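A minimal sketch of the idea; the oversized allocation here is just a stand-in to trigger the error deliberately, and catching OutOfMemoryError should only be done for diagnosis like this:

```java
public class CatchOom {
    public static void main(String[] args) {
        try {
            // Deliberately impossible allocation: fails fast with an
            // OutOfMemoryError regardless of heap size.
            long[] huge = new long[Integer.MAX_VALUE];
            System.out.println(huge.length);   // never reached
        } catch (OutOfMemoryError e) {
            // The stack trace points at the exact allocation site.
            e.printStackTrace();
            System.out.println("caught: " + e);
        }
    }
}
```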

Everything else just seems to be guesswork. Let the runtime tell you
exactly where it runs out of memory.
 
