filia&sofia
Let me rephrase my question, which hasn't been answered properly yet. I want
to read a file into n-bit chunks of data _extremely fast_, process the
chunks and finally write them back into a different file. The problem
is that the algorithms people usually suggest are slow.
I'm thinking that a possible algorithm should first read the file into
a buffer (in memory) that is large relative to the length of an
individual chunk of bits. Secondly, one should try to avoid slow
per-element operations such as "for" loops. Maybe operations like
memmove or memcpy would do the trick, but how?
So, basically, one could think of the algorithm as a window that moves
through the file sequentially, showing only one n-bit chunk at a time.
There isn't a trivial solution, because computers process bytes
instead of bits, and the algorithm has to be state of the art.
Any suggestions? Thank you.