Processing file input for large files [100+ MB] - Performance suggestions?

Discussion in 'ASP .Net' started by Maxim, Jul 7, 2003.

  1. Maxim

    Maxim Guest

    I am wondering if anyone could suggest some performance improvements for
    processing a very large file [100+ MB]. The processing taking place here is
    on 30-50 MB chunks of the file.

    Performance is extremely important here.

    + I initialise a StringBuilder object with the result of
    System.Text.Encoding.GetString, which converts the byte[] input to a string
    [potentially gobbling up an extraordinary amount of RAM].

    > myEncoding.GetString(fileInput);



    + The StringBuilder is converted to a string, capitalisation removed
    [ToLower], and the result split [String.Split()] on the newline character
    '\n' into a string array.

    > fileLines = SB.ToString().ToLower().Split(newline);
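    Putting the two steps together, what I have is essentially the following (a
    minimal sketch; the small byte[] here just stands in for one of the real
    30-50 MB chunks):

    ```csharp
    using System;
    using System.Text;

    class ChunkDemo
    {
        static void Main()
        {
            Encoding myEncoding = Encoding.ASCII;
            byte[] fileInput = myEncoding.GetBytes("First LINE\nSecond Line\nthird line");

            // Step 1: decode the byte[] chunk to a string (one full-size allocation).
            string text = myEncoding.GetString(fileInput);

            // Step 2: lower-case and split on '\n' into a line array
            // (ToLower and Split each allocate another full-size copy).
            string[] fileLines = text.ToLower().Split('\n');

            Console.WriteLine(fileLines.Length);   // 3
            Console.WriteLine(fileLines[0]);       // first line
        }
    }
    ```

    Each of the three calls above makes its own copy of the data, which is
    presumably where the time and RAM are going.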



    At the moment this is taking up 40-50% of the program's running time to
    perform; I'd really like to get this down as low as possible.

    I'm guessing that not converting the byte array to a string would save
    quite a bit of time, as would finding a better way to split the string into
    lines. But I've had no luck at all finding anything.


    Thank you.
     
    Maxim, Jul 7, 2003
    #1