Confused: Java6 32-bit vs. 64-bit

Hendrik Maryns

Hi all.

I have this program that does some very memory-intensive computations,
converting logical formulae into tree automata. The automata get so big
that after a while I get an OutOfMemoryError. There is one particular
formula that gave me problems the first time I started testing the
program. I then rewrote part of the program to make it consume less
memory, and after 6 months of work that formula still gave me problems.
However, just this Monday it suddenly managed to compute the formula,
with ‘only’ 1.2G of RAM.

However, this only works if I tell Eclipse to use the 32-bit Java 6 JRE,
which I installed in addition to the 64-bit JDK that I use for development.

Stranger still, I cannot reproduce this behavior consistently: the first
few times I tried today, it didn’t manage the formula with either the
32-bit or the 64-bit version. Now, after some switching around, I get the
same result as before: the 32-bit JRE manages the formula, the 64-bit one
doesn’t.

The result is reproducible on the command line.

I also have a 64-bit 1.5 JVM, which seems to fail as well. And since I am
in a testing phase anyway, I quickly installed a 32-bit 1.5, which didn’t
manage it either.

In short, my question: does anyone have an explanation for this? Is there
something I can learn from it? Which parts of my program should be tuned
so that it might become usable with less memory?

Many thanks, H.
--
Hendrik Maryns
http://tcl.sfs.uni-tuebingen.de/~hendrik/
==================
http://aouw.org
Ask smart questions, get good answers:
http://www.catb.org/~esr/faqs/smart-questions.html

Patricia Shanahan

Hendrik said:
[...] In short, my question: does anyone have an explanation for this? Is
there something I can learn from it? Which parts of my program should be
tuned so that it might become usable with less memory?

Well, one obvious lesson is that bigger pointers take more space. Why
use a 64 bit JDK if your intended memory allocation is small enough for
a 32 bit one?
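
To put rough numbers on that: here is a back-of-the-envelope sketch with an
invented transition class (the real data structures will differ, and the
header sizes are typical HotSpot values, not guarantees):

    // Hypothetical classes, just for the arithmetic.
    class State { /* ... */ }

    class Transition {
        State left;    // reference: 4 bytes on 32-bit, 8 bytes on 64-bit
        State right;   // reference: 4 bytes on 32-bit, 8 bytes on 64-bit
        State target;  // reference: 4 bytes on 32-bit, 8 bytes on 64-bit
        int symbol;    // 4 bytes either way
    }

    // Per Transition instance, roughly:
    //   32-bit: ~8-byte header  + 3*4 + 4            = ~24 bytes
    //   64-bit: ~16-byte header + 3*8 + 4 + padding  = ~48 bytes
    // Millions of such objects fill the same heap about twice as fast
    // under the 64-bit VM.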

Does anyone know the heap management algorithms for the JRE? Some memory
allocation systems can lose space to fragmentation, so that the exact
allocation order can affect the amount of memory used.

Also, in a multi-threaded program the sets of objects that exist
simultaneously can vary from run to run.

The big picture is that your program, as currently constructed, is right
on the edge, and will be very fragile unless you do something drastic to
either increase the available memory or reduce the memory it uses.
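
The "increase the available memory" half is just a command-line switch.
Assuming the runs so far used something like -Xmx1200m, and the machine has
RAM to spare, the 64-bit VM can simply be given a bigger heap (the class
name here is made up):

    java -Xmx1200m MyProver   # roughly what the 32-bit run gets by with
    java -Xmx2g MyProver      # 64-bit VM with more headroom

Note that a 32-bit VM tops out at roughly 2 to 3 GB of heap (less on some
operating systems), so if the automata keep growing, a 64-bit VM with a
larger -Xmx is the only way up, fatter pointers and all.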

Patricia
 

Robert Klemme

Well, one obvious lesson is that bigger pointers take more space. Why
use a 64 bit JDK if your intended memory allocation is small enough for
a 32 bit one?

Does anyone know the heap management algorithms for the JRE? Some memory
allocation systems can lose space to fragmentation, so that the exact
allocation order can affect the amount of memory used.

I suggest looking into GC-related JVM settings; -XX:NewRatio, for example,
might help (see the sketch below the links).

http://java.sun.com/javase/technologies/hotspot/vmoptions.jsp

Some more links
http://www-128.ibm.com/developerworks/ibm/library/i-gctroub/
http://java.sun.com/docs/hotspot/gc5.0/gc_tuning_5.html
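
A concrete starting point might look like this (the heap size and class
name are made up, and the right NewRatio depends entirely on the
allocation pattern):

    java -Xmx1200m -XX:NewRatio=4 -verbose:gc -XX:+PrintGCDetails MyProver

NewRatio=4 makes the old generation four times the size of the young one,
which can suit a program that builds large, long-lived structures, and the
GC log shows which generation actually fills up before the OutOfMemoryError.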

Also, in a multi-threaded program the sets of objects that exist
simultaneously can vary from run to run.

The big picture is that your program, as currently constructed, is right
on the edge, and will be very fragile unless you do something drastic to
either increase the available memory or reduce the memory it uses.

Definitely.

robert
 

Tom Hawtin

Hendrik said:
I have this program that does some very memory-intensive computations,
converting logical formulae into tree automata. The automata get so big
that after a while I get an OutOfMemoryError. [...]

Stranger still, I cannot reproduce this behavior consistently: the first
few times I tried today, it didn’t manage the formula with either the
32-bit or the 64-bit version. Now, after some switching around, I get the
same result as before: the 32-bit JRE manages the formula, the 64-bit one
doesn’t.

As Patricia pointed out, 64-bit pointers (8 bytes each) use more memory
than 32-bit pointers (4 bytes each).

The OOMEs are inconsistent because the error is thrown while there is
still some free memory available. This may seem nuts, but it isn't. If
OOME were only thrown when there was literally no memory left, then as
memory got short the garbage collector would have to run more and more
often; the process would just max out CPU usage and get nowhere. It is
better to throw OOME while memory is merely getting reasonably short, so
that the program can back out of the current operation, continue with
other operations, and save its data.
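
As a minimal sketch of that backing-out idea (Formula, Automaton and the
method names are invented, not Hendrik's actual code):

    for (Formula formula : formulas) {
        try {
            Automaton result = convert(formula);   // the memory-hungry step
            store(result);
        } catch (OutOfMemoryError e) {
            // Dropping the reference to the half-built automaton lets the GC
            // reclaim it, so the loop can usually carry on with the next
            // formula instead of the whole run dying.
            System.err.println("Out of memory while converting: " + formula);
        }
    }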

Is there something I can learn from it? Which parts of my program should be
tuned so that it might become usable with less memory?

Use some kind of memory profiling to work out where your memory is going.
jmap and jhat are included in the JDK. Even back-of-the-envelope
calculations may well help.
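For example, against the running VM (get the process id from jps):

    jmap -histo <pid>                         # per-class instance and byte counts
    jmap -dump:format=b,file=heap.bin <pid>   # write a binary heap dump
    jhat heap.bin                             # browse the dump at http://localhost:7000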

The problem is probably concentrated in very few classes. Don't waste time
optimising classes that aren't actually taking up much memory. Sounds
obvious, but it's a very easy and common trap to fall into.

Generally you will want fewer, larger objects (or arrays) rather than
lots of small ones.
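
For instance, if the transitions currently live in millions of tiny
objects, numbering the states and packing the table into a few big arrays
saves an object header and the pointer overhead per entry (again a sketch
with invented names):

    // Before: one small object per transition, each with its own header
    // and three references.
    class Transition {
        State left, right, target;
    }

    // After: one large table; transition i is described by three ints,
    // costing about 12 bytes instead of a whole object.
    class TransitionTable {
        int[] left;
        int[] right;
        int[] target;
    }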

Tom Hawtin
 
