** problem with news sender, apologies if message appears twice **
Hello,
I have developed an application that does some "backtesting" on
market data.
I have a series of text files that contain the data, one data point per
line.
my app opens a file, reads a line, processes it, reads the next one,
processes it, and so on.
each file is around 50,000 lines long, and I have more than 200 files
(and growing).
the whole processing is taking quite some time, and I'm trying to find
ways to make it faster.
I chose the "file" way a while ago when I was lazy and the "speed"
problem had not appeared yet. however, I know I/O is one important
thing to look at if I want to improve performance.
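(for context, here is roughly what the reading loop looks like -- a minimal sketch, assuming the app is in Java since JDBC is mentioned; the class and file names are made up. wrapping the reader in a BufferedReader with a large buffer is the usual first I/O improvement, since it turns many tiny OS reads into a few big ones:)

```java
import java.io.*;
import java.nio.file.*;

public class BacktestReader {
    // Reads a data file line by line through a 64 KB buffer.
    // Returns the number of lines processed.
    static long processFile(Path file) throws IOException {
        long lines = 0;
        try (BufferedReader in =
                 new BufferedReader(new FileReader(file.toFile()), 1 << 16)) {
            String line;
            while ((line = in.readLine()) != null) {
                // ... parse and process one data point here ...
                lines++;
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        // tiny demo: write a few quotes to a temp file and read them back
        Path tmp = Files.createTempFile("quotes", ".txt");
        Files.write(tmp, java.util.Arrays.asList("100.5", "101.2", "99.8"));
        System.out.println(processFile(tmp));  // prints 3
        Files.delete(tmp);
    }
}
```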
in particular, I'm wondering: wouldn't it be faster to go through a
database? enter all my data into a DB once, and access the DB with JDBC.
does any of you have comments on ways to make things faster, and
on database performance especially? or is my file solution good
enough?
thanks for your insight...
-Antoine