Roedy Green sez:
My organic chem lab prof, chastising us for wasting expensive
solvents on test tube cleaning, would tell us that when we became
researchers, THEN we would have so much money in research grants
we could wash our feet in acetone.
In a similar way, when you have more than 4 gigs of RAM you can afford
to wash your feet in it, and use it in ways we mortals would not
imagine. It becomes more like a fast hard disk to store a complete
database. Much of it may not be accessed for weeks at a time other
than for backup.
But that's the point: it'll *take* weeks to find the objects you
want in a > 4GB heap of 'em.
Some apps may need to store > 4GB of, say, ints and process
them one at a time. Then you (impersonal "you", of course)
could do a binary search on a sorted list, and that will be
fast -- for values of "fast" in O(log(1073741824)).
In the general case you'll be storing compound objects with
multiple attributes and looking for a subset of the > 4GB
based on some combination of those attributes. When you think
of implementing the searches, sure, you can do it to satisfy
your scientific curiosity -- if you have enough money in
research grants, that is. IRL you'll just dump the data into
a database and use (Structured) Query Language to get to it.
Plus, if you have that much data and your app runs for weeks,
you'll probably hate losing it all when the janitor unplugs
your server to plug in a vacuum cleaner. So you'll want
persistence and probably some form of logging/transactions.
You'll have to implement that, too -- or let the SQL engine
worry about it.
The end result is that by the time you've finished debugging
all that code, your potential customers will have been using
an SQL-based solution from a competing vendor for years. And
they won't switch to your app even though it may be faster.
Because the SQL-based solution is smaller and easier to
maintain, meaning: cheaper, more robust, easily extensible.
(And it probably scales better, too -- how about
multi-terabyte databases?)
Dima