Hi,
Something was puzzling me about when I will run out of memory. I have a simple program that does nothing but take a size as input and then allocate a boolean array with x = new boolean[size]. It throws an OutOfMemoryError when size is greater than about 6*10^7.
If I instead allocate a two-dimensional boolean array with y = new boolean[size][4], I get the error when size is greater than about 3*10^6. I would have thought that adding a second dimension of length 4 would make the maximum value of size roughly 1/4 as large, but instead it shrinks by a factor of 20. Why such a big change?
This is not an actual problem for me, since I just increased the size of the Java heap space, but I'd like to understand where the factor of 20 comes from.
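For context, in Java a two-dimensional array is an array of references to separate row objects, so each boolean[4] row carries its own object header, length field, and padding on top of its 4 bytes of data, plus a reference slot in the outer array. A minimal sketch that makes the two costs visible (class and method names are mine, not from the post; Runtime-based measurements are best-effort, since System.gc() is only a hint and exact numbers vary with JVM version and heap flags):

```java
// Approximates the heap cost of the two allocation shapes from the post.
public class BooleanArrayMemory {

    private static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        System.gc(); // encourage a collection so the delta mostly reflects our array
        return rt.totalMemory() - rt.freeMemory();
    }

    // Heap delta for new boolean[size]: about 1 byte per element.
    static long bytesForFlat(int size) {
        long before = usedHeap();
        boolean[] a = new boolean[size];
        long delta = usedHeap() - before;
        if (a.length != size) throw new AssertionError(); // keep 'a' reachable
        return delta;
    }

    // Heap delta for new boolean[size][4]: 'size' separate row objects, each
    // with its own header and padding, plus a reference to it in the outer array.
    static long bytesForNested(int size) {
        long before = usedHeap();
        boolean[][] a = new boolean[size][4];
        long delta = usedHeap() - before;
        if (a.length != size) throw new AssertionError(); // keep 'a' reachable
        return delta;
    }

    public static void main(String[] args) {
        int size = 1_000_000;
        System.out.println("new boolean[size]    ~ " + bytesForFlat(size) + " bytes");
        System.out.println("new boolean[size][4] ~ " + bytesForNested(size) + " bytes");
    }
}
```

On a typical HotSpot JVM the per-row overhead (header + padding + outer reference) is on the order of 20 bytes, which is consistent with the factor-of-20 drop in the maximum size rather than the factor of 4 one might expect from the data alone.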
Momo