Out Of Memory! - error

Discussion in 'Perl Misc' started by Zaphod Beeblebrox, Aug 18, 2006.

  1. Hello All;

    Please forgive me if this has already been covered in the group, but I
    am a little clueless about this:

    When running a Perl script:
    _____________________________________
    Characteristics of this binary (from libperl):
    Compile-time options: PERL_MALLOC_WRAP USE_LARGE_FILES USE_PERLIO
    Built under linux
    Compiled at Aug 17 2006 09:52:18
    @INC:
    /usr/local/lib/perl5/5.8.8/i686-linux
    /usr/local/lib/perl5/5.8.8
    /usr/local/lib/perl5/site_perl/5.8.8/i686-linux
    /usr/local/lib/perl5/site_perl/5.8.8
    /usr/local/lib/perl5/site_perl
    ___________________________

    I get an Out Of Memory! error.

    Now, I am attempting to pull all the lines out of a really large file:

    2162788 -rwx------ 1 xxxxxxx xxxxxxx 2212526628 2006-08-16 15:08
    Data_20060816.log

    Sample of data:

    20060816093111,@@a,B,NT,1.98,1.97,853,,1.99,948,,1.98,1.97,1.97,U@,Z,Z,5000,79400,,,,,,131033,,|,1.97,1155735051,2B
    20060816093118,@@a,Q00,NT,1.97,853,,,1.99,897,,,Z,0,0,R,0,378651,1155735060
    20060816093128,@@a,Q00,NT,1.97,853,,,1.99,948,,,Z,0,0,R,0,395594,1155735070
    20060816093133,@@a,Q00,NT,1.97,853,,,1.99,902,,,Z,0,0,R,0,403342,1155735075
    20060816093133,@@a,Q00,NT,1.97,1393,,,1.99,902,,,Z,0,0,R,0,404373,1155735075
    20060816093134,@@a,B,NT,1.99,1.97,1393,,1.99,902,,1.99,1.97,1.97,U@,Z,Z,4600,84000,,,,,,143314,,|,1.97,1155735075,3B

    I want to strip out all of the lines that have the 8th field blank
    (actually NULL, not blank).
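
    The test I have in mind is something like this (untested, and assuming
    the fields are plain comma-separated, so the 8th field is index 7 after
    a split):

    use strict;
    use warnings;

    # one sample record from the log (single-quoted so the @@ isn't interpolated)
    my $line   = '20060816093118,@@a,Q00,NT,1.97,853,,,1.99,897,,,Z,0,0,R,0,378651,1155735060';
    my @fields = split /,/, $line, -1;              # -1 keeps trailing empty fields
    print "drop this line\n" if $fields[7] eq '';   # 8th field (index 7) is empty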

    Using the perl debugger (perl -d <script> <file>), it fails when reading
    the file into an array:
    Loading DB routines from perl5db.pl version 1.28
    Editor support available.

    Enter h or `h h' for help, or `man perldebug' for more help.

    main::(TradeQuoteCanBlankExch.pl:10):
    10: $now_string = strftime "%a%b%e%H:%M:%S%Y", localtime;
    DB<1> n
    main::(TradeQuoteCanBlankExch.pl:11):
    11: my $symbol = $ARGV[0];#gets the file
    DB<1> n
    main::(TradeQuoteCanBlankExch.pl:12):
    12: chomp($symbol);#doesn't do shit for some reason
    DB<1> n
    main::(TradeQuoteCanBlankExch.pl:13):
    13: open(MYFILE,$symbol) || die "opening testfile: $!";
    DB<1> n
    main::(TradeQuoteCanBlankExch.pl:14):
    14: @stuff=<MYFILE>;
    DB<1> n

    It sits here forever and then:

    Out of Memory!

    This is the root of the problem. I am wondering if there is a more
    efficient way of reading a really large file into an array.

    Any words of wisdom?

    Thanks in advance.

    Z
     
    Zaphod Beeblebrox, Aug 18, 2006
    #1

  2. David Filmer

    Guest

    Zaphod Beeblebrox wrote:

    > I am wondering if there is a more efficient way of reading
    > a really large file into an array.


    The best way to do that is to not do that. It's rarely necessary or
    advisable to read a file into an array (especially a large file).

    Instead of doing something like this:

    my @stuff = <MYFILE>;
    foreach my $line_of_stuff ( @stuff ) {
        # do stuff with stuff
    }

    do something like this:

    while ( my $line_of_stuff = <MYFILE> ) {
        # do stuff with stuff
    }

    Except use lexical filehandles (with better names) and the
    three-argument form of file open... but that's all another rant (read
    more in 'Perl Best Practices' by Damian Conway)
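
    Roughly, in that style it would look something like this (untested, and
    the variable names here are just placeholders):

    my $filename = shift @ARGV;        # e.g. the log file name from the command line
    open( my $log_fh, '<', $filename )
        or die "Cannot open '$filename': $!";

    while ( my $line = <$log_fh> ) {
        # do stuff with $line
    }

    close $log_fh;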

    --
    David Filmer (http://DavidFilmer.com)
     
    David Filmer, Aug 18, 2006
    #2

  3. Zaphod Beeblebrox

    David Filmer wrote:
    > Zaphod Beeblebrox wrote:
    >
    > > I am wondering if there is a more efficient way of reading
    > > a really large file into an array.

    >
    > The best way to do that is to not do that. It's rarely necessary or
    > advisable to read a file into an array (especially a large file).
    >
    > Instead of doing something like this:
    >
    > my @stuff=<MYFILE>;
    > foreach my $line_of_stuff( @stuff ) {
    > # do stuff with stuff
    > }
    >
    > do something like this:
    >
    > while ( my $line_of_stuff = <MYFILE> ) {
    > # do stuff with stuff
    > }
    >
    > Except use lexical filehandles (with better names) and the
    > three-argument form of file open... but that's all another rant (read
    > more in 'Perl Best Practices' by Damian Conway)
    >
    > --
    > David Filmer (http://DavidFilmer.com)



    HEY!! Thanks so much!

    I think you saved me a ton of money on aspirin!!

    That did the trick! Like you stated, why do I/O when you don't have to!

    Cheers..

    G
     
    Zaphod Beeblebrox, Aug 22, 2006
    #3
  4. David Filmer

    Guest

    Zaphod Beeblebrox wrote:

    > I think you saved me a ton of money on aspirin!!


    I was serious in recommending the book Perl Best Practices - it should
    be the second Perl book that anyone ever buys (I REALLY wish it had
    been in print years ago when I was new to Perl). Use your unspent
    aspirin budget to grab a copy - you won't regret it.

    > That did the trick! Like you stated, why do I/O when you don't have to!


    You're still doing I/O (well, input at least), but you're only grokking
    one line at a time. Slurping an entire file into an array is a common
    mistake, especially among newer programmers. There are times when you
    need to do that, but usually it's not a good idea, since you typically
    process a file one line at a time anyway.
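
    In your case, the whole filter might look something like this (untested;
    the output file name, and treating the 8th comma-separated field as
    index 7, are just assumptions on my part):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $infile  = shift @ARGV or die "usage: $0 <logfile>\n";
    my $outfile = "$infile.filtered";            # output name is only an example

    open( my $in,  '<', $infile )  or die "Cannot open '$infile': $!";
    open( my $out, '>', $outfile ) or die "Cannot write '$outfile': $!";

    while ( my $line = <$in> ) {
        my @fields = split /,/, $line, -1;       # -1 keeps trailing empty fields
        # drop the line when the 8th field (index 7) is missing or empty
        next if !defined $fields[7] || $fields[7] eq '';
        print {$out} $line;                      # otherwise pass it through unchanged
    }

    close $in;
    close $out;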

    --
    David Filmer (http://DavidFilmer.com)
     
    David Filmer, Aug 22, 2006
    #4
