user923005 said:

I work for a database company, and most of our customers are large
customers. (E.g. huge US company, large university, government of
country x, etc.)

It is not at all unusual for a single file to be 20-100 GB. Needless
to say, you would not want to put this file into memory even if you
could do it.
Sometimes you do. For some time, I worked for a company that had
a gigantic Perforce repository[*]. Every developer made heavy
use of this repository. Unfortunately, Perforce doesn't scale to
anything that big or that busy. The solution turned out to be to
put 128 GB of RAM in the Perforce server. Then the whole
database was cached. Performance was then tolerable, if still
not all that great.
[*] Perforce is a version control system.
Tell me, was the file read into a fixed memory buffer or
memory-mapped?
Actually, since Perforce uses a database, the answer is obvious.