Alessandro
I'm running a Java application which processes files of several
million records each (average size per file about 200 MB).
The JVM is started with -Xms32m -Xmx1024m on a server with plenty of
memory, and there is no problem when processing only one file.
Currently I load the records one by one into memory using an ArrayList.
The problem shows up with 2 files: the first one loads fine, but the
second one gives me the classic OutOfMemoryError: Java heap space.
Is it sufficient to call .clear() on the ArrayList before loading the
next file into memory?
The strange thing is that the ArrayList isn't a global variable; it
only holds the records of a single file. And the new operator should
create a fresh (empty) object under the same name, so the garbage
collector should reclaim the old one.
//START
private void adjust(String fileName, String record)
{
    // local list: a new instance is allocated on every call
    ArrayList<String[]> recordsToAdjust = new ArrayList<String[]>();
    String[] values; // fields of the record currently being parsed
    // load file into memory....
}
//END
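For context, here is a minimal self-contained sketch of the calling pattern I mean. The file names and the split(";") parsing are just placeholders, not my real code; the point is only that each call builds its own list, which should become unreachable (and GC-eligible) when the method returns:

```java
import java.util.ArrayList;
import java.util.List;

public class AdjustDemo {
    // Each call allocates a fresh list; after adjust() returns, no
    // reference to that list remains, so it is eligible for GC.
    static int adjust(String fileName, List<String> lines) {
        ArrayList<String[]> recordsToAdjust = new ArrayList<String[]>();
        for (String line : lines) {
            recordsToAdjust.add(line.split(";")); // placeholder parsing
        }
        return recordsToAdjust.size();
    }

    public static void main(String[] args) {
        List<String> fake = List.of("a;b;c", "d;e;f");
        System.out.println(adjust("file1.dat", fake)); // prints 2
        System.out.println(adjust("file2.dat", fake)); // prints 2 again
    }
}
```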
Anything wrong?
Thanks for any suggestions/explanations.
Best rgds,
Ale