Out of memory problem


Shikhar

Perl Gurus,

I have a Perl program that reads certain file(s) and stores the
contents in hashes. The files are not huge per se. One of the files is
100 bytes * 100,000 records (80 megs).
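
Roughly, the program does something like this (a simplified sketch;
the '|' delimiter and field handling are illustrative, not my actual
code):

    my %records;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        # each record becomes one key-value pair in the hash
        my ($key, $value) = split /\|/, $line, 2;
        $records{$key} = $value;
    }
    close $fh;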


I have always thought/read that Perl can read a whole file into
memory and work with it efficiently. I had this program running
nicely with a 40 meg file. Now I am getting an "Out of memory!" error
with the 80 meg file.


In Java I can use heap size parameters to set initial and maximum heap
sizes and they work nicely.

How can I deal with this situation here? I checked in top, and memory
utilization for the process was around 56,264 KB, which does not seem
like a lot.


System : HP-UX 11.0
This is perl, v5.6.1 built for PA-RISC1.1-thread-multi
(with 1 registered patch, see perl -V for more detail)

Copyright 1987-2001, Larry Wall

Binary build 627 provided by ActiveState Tool Corp.
http://www.ActiveState.com
Built 21:42:53 Jun 20 2001

Memory Info

4 gb physical

From glance:
    Total VM: 279.5 MB    Sys Mem: 270.6 MB    User Mem: 1.50 GB    Phys Mem: 4.00 GB
 

Ben Morrow

Quoth (e-mail address removed) (Shikhar):
> Perl Gurus,
>
> I have a Perl program that reads certain file(s) and stores the
> contents in hashes. The files are not huge per se. One of the files
> is 100 bytes * 100,000 records (80 megs).
>
> I have always thought/read that Perl can read a whole file into
> memory and work with it efficiently. I had this program running
> nicely with a 40 meg file. Now I am getting an "Out of memory!"
> error with the 80 meg file.
>
> In Java I can use heap size parameters to set initial and maximum
> heap sizes and they work nicely.

Do you have a soft limit on memory usage set? Check ulimit.

Otherwise, do you really need to process the whole file in one go? You
may be able to rewrite your program so it processes the file
sequentially.
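
Something along these lines (a minimal sketch, assuming
newline-terminated records; the '|' delimiter and process_record()
are hypothetical stand-ins for your actual record layout and
handling):

    open my $fh, '<', 'records.dat' or die "Cannot open records.dat: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($key, $rest) = split /\|/, $line, 2;
        process_record($key, $rest);  # handle one record, keep nothing around
    }
    close $fh;

Memory use then stays flat no matter how large the file grows.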
 

Martien Verbruggen

> Perl Gurus,
>
> I have a Perl program that reads certain file(s) and stores the
> contents in hashes. The files are not huge per se. One of the files
> is 100 bytes * 100,000 records (80 megs).

Perl will probably need a bit more than that 80 MB, due to per-entry
overhead on the hash, but not hugely more.
> I have always thought/read that Perl can read a whole file into
> memory and work with it efficiently. I had this program running
> nicely with a 40 meg file. Now I am getting an "Out of memory!"
> error with the 80 meg file.
>
> In Java I can use heap size parameters to set initial and maximum
> heap sizes and they work nicely.
>
> How can I deal with this situation here? I checked in top, and
> memory utilization for the process was around 56,264 KB, which does
> not seem like a lot.

That is not a lot at all, especially given:
> Memory Info
>
> 4 gb physical
>
> From glance:
>     Total VM: 279.5 MB    Sys Mem: 270.6 MB    User Mem: 1.50 GB    Phys Mem: 4.00 GB

With this to back it up, there should be no problem at all for Perl to
have 100,000 key-value pairs in a hash, with each key-value pair
representing 100 bytes of data.
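
If you want to see what the hash actually costs, the CPAN module
Devel::Size can measure it (a sketch, assuming the module is
installed; %records stands in for whatever hash the program builds):

    use Devel::Size qw(total_size);
    # total_size() walks the structure and sums the memory used by
    # keys, values, and Perl's internal bookkeeping
    my $bytes = total_size(\%records);
    print "hash uses $bytes bytes\n";

Expect somewhat more than the raw 80 MB, since every key and value
carries its own scalar overhead.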


I'd say your system probably has some user limitations configured by
default. Type in something like:

$ ulimit -a

I suspect that HP-UX has one or more entries in there for memory size
(the data segment size is the usual culprit). If they're not set to
unlimited, you're probably hitting those. Talk to your system admin
about those limits if they're there.
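
You can also check and raise the limits from inside Perl with the
CPAN module BSD::Resource (a sketch, assuming the module is installed
and the hard limit is high enough that raising the soft one helps):

    use BSD::Resource;
    # report the current soft/hard limits on the data segment
    my ($soft, $hard) = getrlimit(RLIMIT_DATA);
    print "data segment: soft=$soft hard=$hard\n";
    # try to raise the soft limit up to the hard limit
    setrlimit(RLIMIT_DATA, $hard, $hard)
        or warn "could not raise RLIMIT_DATA: $!";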

Martien
 
