Hi,
I need to design data interfaces for accessing very large files
efficiently. The data will be accessed in fixed-size chunks (say,
16 KB blocks). The interface should support both random seeks to an
arbitrary block and sequential access block by block.
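Concretely, the access pattern I have in mind looks something like the
sketch below, using std::ifstream. The name read_block and the 16 KB
constant are just placeholders of mine, not a fixed design:

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

constexpr std::size_t kBlockSize = 16 * 1024;  // placeholder: 16 KB blocks

// Reads block `index` from an open binary stream. Random access is just a
// seek to index * kBlockSize; sequential access is index, index + 1, ...
// The returned vector is shorter than kBlockSize for a trailing partial block.
std::vector<char> read_block(std::ifstream& in, std::uint64_t index) {
    in.clear();  // clear eofbit left over from a previous short read
    in.seekg(static_cast<std::streamoff>(index) * kBlockSize);
    std::vector<char> buf(kBlockSize);
    in.read(buf.data(), static_cast<std::streamsize>(buf.size()));
    buf.resize(static_cast<std::size_t>(in.gcount()));  // actual bytes read
    return buf;
}
```

Sequential access would then just call read_block with consecutive
indices, so both access modes go through the same code path.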
One aspect of the usage is that the application is quite likely to
access the same blocks again and again, so some caching will probably
be needed for an efficient implementation.
I was wondering how such a data interface should be implemented. I
could not find much literature on the issues involved in handling very
large (GB-sized) files. Are the C++ fstream classes suitable for this
problem?
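For the caching part, I was thinking of a small LRU cache layered on
top of the reader, along these lines. This is only a rough sketch: the
names BlockFile and kBlockSize and the default cache size are my own
placeholders, and I am not sure whether std::streamoff is wide enough
for multi-GB offsets on every platform:

```cpp
#include <cstdint>
#include <fstream>
#include <list>
#include <stdexcept>
#include <string>
#include <unordered_map>
#include <utility>
#include <vector>

constexpr std::size_t kBlockSize = 16 * 1024;  // placeholder: 16 KB blocks

class BlockFile {
public:
    explicit BlockFile(const std::string& path, std::size_t max_cached = 64)
        : file_(path, std::ios::binary), max_cached_(max_cached) {
        if (!file_) throw std::runtime_error("cannot open " + path);
    }

    // Returns the block at `index`, reading from disk on a cache miss.
    // The reference stays valid only until the block is evicted.
    const std::vector<char>& block(std::uint64_t index) {
        auto it = cache_.find(index);
        if (it != cache_.end()) {
            // Hit: move this block to the front of the LRU list.
            lru_.splice(lru_.begin(), lru_, it->second.second);
            return it->second.first;
        }
        // Miss: seek to index * kBlockSize and read one block.
        file_.clear();  // clear eofbit from a previous short read
        file_.seekg(static_cast<std::streamoff>(index) * kBlockSize);
        std::vector<char> buf(kBlockSize);
        file_.read(buf.data(), static_cast<std::streamsize>(buf.size()));
        buf.resize(static_cast<std::size_t>(file_.gcount()));  // last block may be short
        if (cache_.size() >= max_cached_) {
            // Evict the least recently used block.
            cache_.erase(lru_.back());
            lru_.pop_back();
        }
        lru_.push_front(index);
        auto ins = cache_.emplace(index, std::make_pair(std::move(buf), lru_.begin()));
        return ins.first->second.first;
    }

private:
    std::ifstream file_;
    std::size_t max_cached_;
    std::list<std::uint64_t> lru_;  // most recently used at the front
    std::unordered_map<std::uint64_t,
        std::pair<std::vector<char>, std::list<std::uint64_t>::iterator>> cache_;
};
```

The idea is that repeated accesses to the same block hit the map and
only touch the disk on a miss; whether fstream itself is the right
layer underneath is exactly what I am unsure about.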
Can somebody help me with some information on how to tackle this
problem, or pointers to where relevant information can be found?
Thanks and regards,
Shailesh Kumar