Hi
We have a requirement to query across two disparate systems. Both
systems are read-only, so once the data is loaded there is no need to
apply or check for updates; I would plan to reload the data afresh
each day. Records on the two systems map one-to-one and each system
has 7 million records.
The first system is legacy C code, which I am reluctant to redevelop.
The second is a standard Java/Tomcat/SQL application.
The non-relational query can return up to 1000 records.
This could therefore result in 1000 queries to the relational system
(just one table) before returning to the user.
To avoid 1000 relational queries I was planning to "cache" the entire
relational table in memory, via a web service that loads the whole
table at startup. The web service, running in a separate Tomcat
instance, could then be queried 1000 times, or perhaps receive a
single request with all 1000 values and return the results in one go.
Keeping it in a separate Tomcat process would also help isolate any
memory issues, e.g. JVM heap size.
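Roughly what I have in mind for the batch interface (class and method names here are made up for illustration, and the actual load from the relational table is only indicated in a comment):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the in-memory lookup service the separate Tomcat would host.
// In production the map would be populated once at startup by reading the
// relational table (e.g. one "SELECT key1, key2, payload FROM t" via JDBC),
// and pre-sized for ~7m entries (new HashMap<>(10_000_000)) to avoid rehashing.
public class LookupService {
    // Composite key as a small immutable record (Java 16+).
    public record Key(int a, int b) {}

    private final Map<Key, Object> table = new HashMap<>();

    public void put(int a, int b, Object payload) {
        table.put(new Key(a, b), payload);
    }

    // One request carrying all 1000 keys, one response with all results:
    // this avoids 1000 separate round trips to the web service.
    public List<Object> batchGet(List<Key> keys) {
        List<Object> out = new ArrayList<>(keys.size());
        for (Key k : keys) {
            out.add(table.get(k));
        }
        return out;
    }
}
```

The batch method is the important part: whether the transport ends up being a servlet, REST, or something else, one call with 1000 keys should beat 1000 calls with one key each.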
Can people recommend an approach?
Because the entire set of records would always be in memory, does
that make using something like Ehcache pointless?
Issues I would anticipate:
time to load 7m records each morning
memory issues
best Java collection to hold the map (HashMap?); the map would be
(int, int) -> Object
any suggestions regarding a specialized cache utility, e.g. Ehcache
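On the collection question, one option I have been considering (a sketch, not settled): pack the two int key fields into a single long, so no separate key object is needed per entry. A plain HashMap<Long, Object> still boxes the Long, so for 7m entries a primitive-keyed map from a library such as fastutil (Long2ObjectOpenHashMap) or Trove would cut per-entry overhead further.

```java
import java.util.HashMap;
import java.util.Map;

public class CompositeKeyDemo {
    // Pack two ints into one long: high 32 bits = a, low 32 bits = b.
    // Masking b with 0xFFFFFFFFL keeps negative values from sign-extending
    // into the high half.
    static long key(int a, int b) {
        return ((long) a << 32) | (b & 0xFFFFFFFFL);
    }

    public static void main(String[] args) {
        Map<Long, String> m = new HashMap<>();
        m.put(key(1, 2), "record-1-2");
        m.put(key(-1, 2), "record-neg1-2"); // negatives survive the packing
        System.out.println(m.get(key(1, 2)));  // record-1-2
        System.out.println(m.get(key(-1, 2))); // record-neg1-2
    }
}
```

The packing is lossless, so two distinct (int, int) pairs can never collide on the same long key.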
Thanks in advance.