DBI (with Oracle) 'out of memory' error

Discussion in 'Perl Misc' started by dn.perl@gmail.com, Aug 27, 2009.

  1. Guest

    I am using perl 5.6 on FreeBSD. (This is perl, v5.6.1 built for i386-
    freebsd).
    DBI version 1.48.

    I am running a select command which returns 355,000+ rows. While
    reading it into a hash, I should have 280,000+ hash-keys when I am
    done constructing the hash.

    But the script is aborting with message: Out of memory!

    What might be wrong? If I can keep the hash-size down (deal with a
    key, complete all calculations associated with that key, then undef
    it), will it help?

    $$rh_result{field01}{field02} = 2 ;
    # do calculations for this hash-key
    undef $$rh_result{field01} ; # I do not know the syntax to undef,
    # but will find it out.

    Please advise.
    , Aug 27, 2009
    #1

  2. J. Gleixner Guest

    wrote:
    > I am using perl 5.6 on FreeBSD. (This is perl, v5.6.1 built for i386-
    > freebsd).
    > DBI version 1.48 .
    >
    > I am running a select command which returns 355,000+ rows. While
    > reading it into a hash, I should have 280,000+ hash-keys when I am
    > done constructing the hash.
    >
    > But the script is aborting with message: Out of memory!
    >
    > What might be wrong? If I can keep the hash-size down (deal with a
    > key, complete all calculations associated with that key, then undef
    > it), will it help?
    >
    > $$rh_result{field01}{field02} = 2 ;
    > do calculations for this hash-key
    > undef $$rh_result{field01} ; # I do not know the syntax to undef,


    $rh_result->{ 'field01' } is more readable.

    > but will find it out.


    perldoc -f undef

    >
    > Please advise.
    >


    Please show us the relevant code.

    In short, you should probably be doing:
    Prepare SQL
    Execute
    Bind columns
    Iterate over each row using while and fetch.
    Using the binded variables, do calculations, store results
    Finish
    Disconnect
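
    The steps above can be sketched as follows (a minimal sketch; the
    DSN, credentials, table and column names are all hypothetical, and
    it obviously needs a reachable Oracle database to run):

    ```perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN and credentials
    my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
        { RaiseError => 1, AutoCommit => 0 } );

    # Prepare SQL, then execute
    my $sth = $dbh->prepare(
        'SELECT from_queue, to_queue, admin_name FROM queue_log' );
    $sth->execute;

    # Bind columns: each fetch fills these scalars in place
    my ( $from_queue, $to_queue, $admin_name );
    $sth->bind_columns( \$from_queue, \$to_queue, \$admin_name );

    # Iterate row by row; only one row is in memory at a time
    while ( $sth->fetch ) {
        # do calculations with $from_queue, $to_queue, $admin_name
        # and store only the (small) results
    }

    $sth->finish;
    $dbh->disconnect;
    ```

    The point of bind_columns plus fetch is that the 355,000 rows
    stream through a fixed set of variables instead of piling up in
    one data structure.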

    If you're doing that and it runs out of memory, then we really
    really need to see your code.
    J. Gleixner, Aug 27, 2009
    #2

  3. On 2009-08-27 09:12, <> wrote:
    > I am using perl 5.6 on FreeBSD. (This is perl, v5.6.1 built for i386-
    > freebsd).


    Perl 5.6 is really old. 5.10 is the current release.

    > DBI version 1.48 .
    >
    > I am running a select command which returns 355,000+ rows. While
    > reading it into a hash, I should have 280,000+ hash-keys when I am
    > done constructing the hash.
    >
    > But the script is aborting with message: Out of memory!


    Since your perl is very old, I assume your computer is rather old, too.

    Hashes need a lot of memory. With 280k hash keys, I estimate that your
    hash needs at least 60 MB. If your hash has multiple levels, it can be a
    lot more.

    Also you don't say how you run the select command. If you use one of
    the selectall_* methods you'll get a nested data structure with 355,000+
    times the number of columns elements - probably a few million elements in
    total, which would use another few hundred MB ...

    How much memory does your script use just before it runs out of memory?
    Is it near the available virtual memory? Do you have any resource limits
    set?
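
    To put a number on the hash overhead: Devel::Size from CPAN (not a
    core module, so this assumes it is installed) can report how much
    memory a structure really uses. A rough sketch with the same nested
    shape as your hash, scaled down to 10,000 top-level keys:

    ```perl
    use strict;
    use warnings;
    use Devel::Size qw(total_size);

    my %h;
    # mimic the nested structure with 10,000 top-level keys
    $h{ sprintf 'from_queue%06d', $_ }{to_queue}{admin_name}{count} = $_
        for 1 .. 10_000;

    # total_size walks the whole structure, nested hashes included
    printf "%.1f MB for 10,000 keys\n", total_size( \%h ) / ( 1024 * 1024 );
    ```

    Multiply the result by 28 for 280,000 keys; three nested hash
    levels per key add up quickly.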


    > What might be wrong? If I can keep the hash-size down (deal with a
    > key, complete all calculations assoiated with that key, then undef
    > it), will it help?


    Yes, that will help. But if you can process your data one key at a time,
    why store them in a hash in the first place?

    hp
    Peter J. Holzer, Aug 27, 2009
    #3
  4. Guest

    On Aug 27, 10:21 am, "Peter J. Holzer" <> wrote:
    > On 2009-08-27 09:12, <> wrote:
    >
    > Hashes need a lot of memory. With 280k hash keys, I estimate that your
    > hash needs at least 60 MB. If your hash has multiple levels, it can be a
    > lot more.
    >
    > How much memory does your script use just before it runs out of memory?
    > Is it near the available virtual memory? Do you have any resource limits
    > set?
    >


    I do not know how much memory my script was using before crashing, and
    do not even know how to find it out. But your answer helped because I
    was using a hash with 300,000 keys and each key had sub-levels.
    $$rh_result{from_queue001}{to_queue}{admin_name}{count} = 4
    ... to ...
    $$rh_result{from_queue300_001}{to_queue}{admin_name}{count} = 10.
    Now I am deleting a key after its relevance is over.
    And the script is not crashing.
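
    For reference, deleting a finished key (and everything nested under
    it) looks like this:

    ```perl
    use strict;
    use warnings;

    my $rh_result = {};
    $rh_result->{from_queue001}{to_queue}{admin_name}{count} = 4;
    $rh_result->{from_queue002}{to_queue}{admin_name}{count} = 10;

    # delete removes the key; the nested hashes under it are freed
    delete $rh_result->{from_queue001};

    print exists $rh_result->{from_queue001} ? "still there" : "gone";  # prints "gone"
    ```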



    > > What might be wrong? If I can keep the hash-size down (deal with a
    > > key, complete all calculations associated with that key, then undef
    > > it), will it help?

    >
    > Yes, that will help. But if you can process your data one key at a time,
    > why store them in a hash in the first place?
    >


    Because each from_queue may have more than one entry assigned to it,
    and I can do calculations about that queue only after all its entries
    are read. And I just prefer to store the data in a hash rather than
    an array.
    my $last_queue = "" ;
    while ( my @row = $sth->fetchrow_array ) {
        my $current_queue = $row[0] ;
        # populate $$rh_result{$current_queue}
        if ( $current_queue ne $last_queue and $last_queue ne "" ) {
            # do-stuff about $$rh_result{$last_queue}
            delete $$rh_result{$last_queue} ; # NOW added
        }
        $last_queue = $current_queue ;
    }
    # do-stuff about $last_queue if $last_queue ne ""

    The script was working without a hitch for 2-3 months before the
    recent crash. But such problems help in understanding better how the
    whole thing works.

    Thanks for the responses.
    , Aug 28, 2009
    #4
