DBI (with Oracle) 'out of memory' error


dn.perl

I am using Perl 5.6 on FreeBSD (specifically, perl v5.6.1 built for i386-freebsd), with DBI version 1.48.

I am running a select command which returns 355,000+ rows. While
reading it into a hash, I should have 280,000+ hash-keys when I am
done constructing the hash.

But the script is aborting with the message: Out of memory!

What might be wrong? If I can keep the hash-size down (deal with a
key, complete all calculations associated with that key, then undef
it), will it help?

$$rh_result{field01}{field02} = 2 ;
# do calculations for this hash-key
undef $$rh_result{field01} ; # I do not know the syntax to undef, but will find it out.

Please advise.
 

J. Gleixner

> I am using perl 5.6 on FreeBSD. (This is perl, v5.6.1 built for i386-freebsd).
> DBI version 1.48.
>
> I am running a select command which returns 355,000+ rows. While
> reading it into a hash, I should have 280,000+ hash-keys when I am
> done constructing the hash.
>
> But the script is aborting with the message: Out of memory!
>
> What might be wrong? If I can keep the hash-size down (deal with a
> key, complete all calculations associated with that key, then undef
> it), will it help?

> $$rh_result{field01}{field02} = 2 ;
> do calculations for this hash-key
> undef $$rh_result{field01} ; # I do not know the syntax to undef,

$rh_result->{ 'field01' } is more readable.

> but will find it out.

perldoc -f undef

> Please advise.

Please show us the relevant code.
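One point worth spelling out before reaching for `perldoc -f undef`: for the goal of reclaiming memory, `undef` and `delete` behave differently. `undef $h{key}` discards the value but leaves the key in the hash; `delete $h{key}` removes the key and its value entirely. A minimal sketch (hash and key names are made up):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %h = ( a => 1, b => 2 );

undef $h{a};    # value discarded, but the key itself remains in the hash
print exists $h{a} ? "a still exists\n" : "a gone\n";   # a still exists

delete $h{b};   # key and value are both removed
print exists $h{b} ? "b still exists\n" : "b gone\n";   # b gone
```

So for trimming a large hash down as you go, `delete` is the operation you want.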

In short, you should probably be doing:
Prepare SQL
Execute
Bind columns
Iterate over each row using while and fetch.
Using the bound variables, do calculations, store results
Finish
Disconnect

If you're doing that and it runs out of memory, then we really
really need to see your code.
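The steps above can be sketched roughly as follows. This is only an illustration, not the poster's actual code: the table and column names are invented, and an in-memory DBD::SQLite database (assuming that driver is installed) stands in for Oracle, since the DBI calls are the same either way:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# In-memory SQLite database standing in for Oracle; table/columns are made up.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
                        { RaiseError => 1, AutoCommit => 1 } );
$dbh->do('CREATE TABLE queue_log (from_queue TEXT, cnt INTEGER)');
$dbh->do(q{INSERT INTO queue_log VALUES ('q1', 4)});
$dbh->do(q{INSERT INTO queue_log VALUES ('q2', 10)});

# Prepare and execute the SQL.
my $sth = $dbh->prepare('SELECT from_queue, cnt FROM queue_log ORDER BY from_queue');
$sth->execute;

# Bind Perl variables to the result columns: each fetch() refills them,
# so only one row is ever held in memory at a time.
my ( $from_queue, $cnt );
$sth->bind_columns( \$from_queue, \$cnt );

my $total = 0;
while ( $sth->fetch ) {
    $total += $cnt;          # do calculations row by row, store only results
}
$sth->finish;
$dbh->disconnect;

print "total: $total\n";     # total: 14
```

The point is that `bind_columns` plus a `fetch` loop never builds a 355,000-row structure in memory, unlike the `selectall_*` convenience methods.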
 

Peter J. Holzer

> I am using perl 5.6 on FreeBSD. (This is perl, v5.6.1 built for i386-freebsd).

Perl 5.6 is really old. 5.10 is the current release.

> DBI version 1.48.

> I am running a select command which returns 355,000+ rows. While
> reading it into a hash, I should have 280,000+ hash-keys when I am
> done constructing the hash.
>
> But the script is aborting with the message: Out of memory!

Since your perl is very old, I assume your computer is rather old, too.

Hashes need a lot of memory. With 280k hash keys, I estimate that your
hash needs at least 60 MB. If your hash has multiple levels, it can be a
lot more.

Also, you don't say how you run the select command. If you use one of
the selectall_* methods you'll get a nested data structure with 355,000+
times the number of columns elements - probably a few million elements in
total, which would use another few hundred MB ...

How much memory does your script use just before it runs out of memory?
Is it near the available virtual memory? Do you have any resource limits
set?
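One quick way to answer that question is to ask ps for the script's own resident set size from within the script. A sketch (the `-o rss= -p` flags are standard on FreeBSD and Linux; rss is reported in kilobytes):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Ask ps for this process's resident set size, in KB.
sub rss_kb {
    my $rss = `ps -o rss= -p $$`;
    $rss =~ s/\s+//g;
    return $rss;
}

print "rss before: ", rss_kb(), " KB\n";
my %big = map { $_ => [ (1) x 10 ] } 1 .. 100_000;   # allocate something sizeable
print "rss after:  ", rss_kb(), " KB\n";
```

Printing this periodically inside the fetch loop shows how fast the hash is growing and how close the script is to its limit.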

> What might be wrong? If I can keep the hash-size down (deal with a
> key, complete all calculations associated with that key, then undef
> it), will it help?

Yes, that will help. But if you can process your data one key at a time,
why store them in a hash in the first place?
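For instance, if the rows arrive sorted by the key (ORDER BY in the SQL), each key can be aggregated and thrown away as soon as the key changes, so the hash never holds more than one key's data. A sketch with a plain array of made-up rows standing in for the result set:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sample rows, already sorted by key, standing in for a sorted result set.
my @rows = ( [ 'q1', 2 ], [ 'q1', 3 ], [ 'q2', 10 ] );

my %current;          # holds data for one key at a time
my $last_key = "";

for my $row (@rows) {
    my ( $key, $value ) = @$row;
    if ( $key ne $last_key and $last_key ne "" ) {
        print "$last_key total: $current{$last_key}\n";  # finish previous key
        delete $current{$last_key};                      # and free its memory
    }
    $current{$key} += $value;
    $last_key = $key;
}
print "$last_key total: $current{$last_key}\n" if $last_key ne "";
# prints: q1 total: 5
#         q2 total: 10
```

Memory use is then bounded by the largest single key, not by the total number of keys.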

hp
 

dn.perl

> Hashes need a lot of memory. With 280k hash keys, I estimate that your
> hash needs at least 60 MB. If your hash has multiple levels, it can be a
> lot more.
>
> How much memory does your script use just before it runs out of memory?
> Is it near the available virtual memory? Do you have any resource limits
> set?

I do not know how much memory my script was using before crashing, and
do not even know how to find that out. But your answer helped because I
was using a hash with 300,000 keys and each key had sub-levels:
$$rh_result{from_queue001}{to_queue}{admin_name}{count} = 4
... to ...
$$rh_result{from_queue300_001}{to_queue}{admin_name}{count} = 10
Now I delete each key once its relevance is over,
and the script is no longer crashing.


> Yes, that will help. But if you can process your data one key at a time,
> why store them in a hash in the first place?

Because each from_queue may have more than one entry assigned to it,
and I can do calculations for that queue only after all its entries
are read. And I just prefer to store the data in a hash rather than
an array.
my $last_queue = "" ;
while ( my @row = $sth->fetchrow_array ) {
    my $current_queue = $row[0] ;
    # populate $rh_result->{$current_queue} from @row
    if ( $current_queue ne $last_queue and $last_queue ne "" ) {
        # do stuff with $rh_result->{$last_queue}
        delete $rh_result->{$last_queue} ;   # NOW added: free the finished key
    }
    $last_queue = $current_queue ;
}
# do stuff with $rh_result->{$last_queue} if $last_queue ne "" ;

The script had been working without a hitch for 2-3 months before the
recent crash. But such problems help in understanding better how the
whole thing works.

Thanks for the responses.
 
