Will
This is partially a question and partially a rant. I am currently
running Sun's SDK/JRE 1.3.1 on a Windows XP machine with 512MB of
total RAM. I have a program which requires a lot of recursion, since
the linked lists that it operates on are very long (8192 elements
apiece). These linked lists are generated from text files that are
all the same size (same number of columns and rows, all containing
either zeros or ones; I'm implementing the Quine-McCluskey algorithm
for reducing boolean functions from exhaustive input/output lists).
My problem is that a few of these files seem to consistently cause a
StackOverflowError while others do not. Since they are the same
format and the same number of characters, why would one be different
from another? Out of eight generated files (8-bit x 8-bit inputs ->
8 subfiles of 8192 lines each), six work fine and two cause the
error. I found a workaround by increasing the stack size with
java.exe -Xss65536K, but why is this only needed on two of the eight
files? To be clear, the exception occurs after I have read in the
8192 lines and tagged each with a line number, when I then count the
number of lines. I print this out so that I have a running tally of
how many lines I have left relative to the start of the program. My
counting method looks like:
1) public int countNodes(){
2)     if(next == null)
3)         return 1;
4)     else
5)         return next.countNodes()+1;
6) }
Predictably, it always fails on line 5.
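For what it's worth, an iterative version of the same count would
sidestep the deep call chain entirely, since it uses constant stack
space no matter how long the list is. A minimal sketch (the Node
class here is a stand-in, not my actual list class):

```java
// Stand-in for my list node class: just a next pointer.
class Node {
    Node next;

    // Iterative count: walks the list with a loop instead of
    // recursing, so stack depth stays constant.
    public int countNodes() {
        int count = 0;
        for (Node n = this; n != null; n = n.next) {
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // Build a list of 8192 nodes, like one of my subfiles.
        Node head = new Node();
        Node tail = head;
        for (int i = 1; i < 8192; i++) {
            tail.next = new Node();
            tail = tail.next;
        }
        System.out.println(head.countNodes()); // prints 8192
    }
}
```

That said, I'd still like to understand why only two of the eight
files blow the default stack with the recursive version.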
Befuddled....
-Will