Processing file input for large files [100+ MB] - Performance suggestions?

Maxim

I am wondering if anyone could suggest some performance improvements for
processing a very large file [100+ MB]. The processing taking place here is
on 30-50 MB chunks of the file.

Performance is extremely important here.

+ I initialise a StringBuilder object with the result of
System.Text.Encoding.GetString, to convert the byte[] input into a string
[potentially gobbling up an extraordinary amount of RAM]:
SB = new StringBuilder(myEncoding.GetString(fileInput));
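
For reference, a self-contained version of that step looks roughly like
this - the file path and the encoding are placeholders, since in the real
program fileInput arrives as a 30-50 MB chunk:

using System.IO;
using System.Text;

// Sketch of the conversion step as described above; the path and the
// encoding are stand-ins for whatever the real program actually uses.
byte[] fileInput = File.ReadAllBytes("bigfile.dat");
Encoding myEncoding = Encoding.ASCII;  // assumed encoding
StringBuilder SB = new StringBuilder(myEncoding.GetString(fileInput));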


+ The StringBuilder is converted to a string, capitalisation is taken
out [ToLower], and the string is split [String.Split()] on the newline
character '\n', with the result placed in a string array:
fileLines = SB.ToString().ToLower().Split(newline);
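
Putting the two steps together, with the declarations spelled out:

// Each call below walks (and mostly copies) the entire buffer,
// which is where the time seems to go.
char newline = '\n';
string[] fileLines = SB.ToString().ToLower().Split(newline);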


At the moment this is taking up to 40-50% of the program's running time,
and I'd really like to get that down as low as possible.

I'm guessing that avoiding the up-front byte[]-to-string conversion would
save quite a bit of time, as would finding a better way to split the string
into lines. But I've had no luck at all finding anything.
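
One idea I've been toying with, purely as an untested sketch, is to find
the line boundaries in the raw bytes and decode each line only when it's
needed, skipping the big GetString call entirely. This assumes '\n' is a
single byte, which holds for ASCII and UTF-8:

using System.Collections.Generic;
using System.Text;

// Untested sketch: scan the raw bytes for '\n', then decode and lowercase
// one line at a time instead of materialising the whole chunk as a string.
List<string> lines = new List<string>();
int start = 0;
for (int i = 0; i < fileInput.Length; i++)
{
    if (fileInput[i] == (byte)'\n')
    {
        lines.Add(myEncoding.GetString(fileInput, start, i - start).ToLower());
        start = i + 1;
    }
}
if (start < fileInput.Length)  // trailing line with no final '\n'
    lines.Add(myEncoding.GetString(fileInput, start, fileInput.Length - start).ToLower());

I have no idea whether many small GetString calls end up cheaper than one
big one, though - which is really my question.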


Thank you.
