Massive Memory Structures


ruu

OK - I was under the impression that perl running on a 64-bit OS,
compiled for said 64-bit OS, would be able to handle data structures in
memory greater than 4G in size. I've tried this with a couple of OSes
(Solaris on SPARC, Solaris on x86_64) with both packaged and compiled
versions of perl, with identical results - as soon as I hit 4G with my
script, I get an out-of-memory error and the script dies.

I'm pretty sure I don't have a ulimit issue, since more or less
everything is unlimited, and other programs run by the same user can
access over 4G of ram quite happily.

Two questions:

1) Is it possible to access a structure over 4G with perl running on
Solaris?
2) If so, what options do I need to compile in to make this happen?

Dave
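
For anyone wanting to check a build before digging further, a quick way is to
dump the Config values that matter for >4G scalars; a fully 64-bit perl
normally reports ptrsize=8, ivsize=8 and use64bitall=define (exact ccflags
vary by platform and compiler). A minimal sketch:

#!/usr/bin/perl
# Quick probe: print the build parameters relevant to >4G scalars.
use strict;
use warnings;
use Config;

print "$_ = ", (defined $Config{$_} ? $Config{$_} : 'undef'), "\n"
    for qw(archname ptrsize ivsize use64bitint use64bitall ccflags);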
 

ruu

No-one knows? Not even a "if you have to use more than 4G of memory,
you don't know what you are doing" reply?

Dave
 

jgraber

No-one knows? Not even a "if you have to use more than 4G of memory,
you don't know what you are doing" reply?
Dave

Possibly if you follow the steps in the posting guidelines,
such as posting a runnable short program that demonstrates the problem,
your post will both overcome the work-threshold limit
and pass the "sounds interesting" threshold,
so that one of the few people here who have not only
the knowledge, ability, and experience, but also the resources (a 4G+ Solaris
machine), will check it out on their system and, if it works, post their
configuration.

Our workgroup has been moving away from Solaris boxes
and toward Linux running on AMD Opterons, for 4G/8G/16G memory setups,
so even if you posted an example, I wouldn't have the ability to try it on
a suitable Solaris machine.
 

QoS

(e-mail address removed) wrote in message-id: <[email protected]>

[snip]
[paste] (please don't top-post)
No-one knows? Not even a "if you have to use more than 4G of memory,
you don't know what you are doing" reply?

Dave

Sounds like a Sun issue, have you tried accessing their knowledge base?
Unfortunately I haven't got 4 GB of memory to test this on my OS.
Perhaps this is not a Perl question at all;
did you try this using another programming language?
 

xhoster

I thought there was a discussion here several months ago, but I can't find
it now. It seems like the conclusion was that with Solaris you have
to start an executable in a certain way in order to get it to work with
more than 4G of memory, and apparently Perl isn't by default started in
that way. It seems like there was some kind of "extended attribute",
like some super chmod command, you could run on the perl binary file to
tell it to start up in that special way in the future. I didn't pay much
attention, because I don't use Solaris much, so this is all kind of fuzzy.
Maybe a Solaris specific group would know more.


Xho
 

ruu

(e-mail address removed) wrote in message-id: <[email protected]>

[snip]

[paste] (please don't top-post)


No-one knows? Not even a "if you have to use more than 4G of memory,
you don't know what you are doing" reply?

Sounds like a Sun issue, have you tried accessing their knowledge base?
Unfortunately I haven't got 4 GB of memory to test this on my OS.
Perhaps this is not a Perl question at all;
did you try this using another programming language?

I got a perl error (which I will concede may well have originated as
an OS error), and I have run other applications (written in C++
specifically) right up to 20+G of memory with no issues (I'm sorry - I
have a number of servers with 64G of RAM - it's not my fault). I was
really hoping for someone to say "This definitely isn't a general perl
thing, because I have done this on my Linux box". I can take this up
with Sun directly, but chances are they are going to blame Perl, so I
was hoping for some kind of answer as to whether it was even possible.

Dave
 

ruu

Possibly if you follow the steps in the posting guidelines,
such as posting a runnable short program that demonstrates the problem,
your post will both overcome the work-threshold limit
and pass the "sounds interesting" threshold,
so that one of the few people here who have not only
the knowledge, ability, and experience, but also the resources (a 4G+ Solaris
machine), will check it out on their system and, if it works, post their
configuration.

Our workgroup has been moving away from Solaris boxes
and toward Linux running on AMD Opterons, for 4G/8G/16G memory setups,
so even if you posted an example, I wouldn't have the ability to try it on
a suitable Solaris machine.


OK. Sounds fair. If you feel like running this on a Linux system, I
would be interested to know whether it works or not, even if it isn't
Solaris.

Below is a short script that works under Solaris (and probably
anything that has mkfile). It will create a 5G test file and then
attempt to pull the whole thing into $bigvar. You will need enough
space somewhere to create the 5G file, and at least 8G of RAM to
attempt it. Please, anyone reading this, DO NOT RUN THIS SCRIPT IF
YOU AREN'T SURE WHAT IT WILL DO, OR ON A PRODUCTION SYSTEM - there is a
reasonable chance that your OS may fail in exciting ways if it uses up
all of the memory. Further rules:

1) Please don't run the script if you manage the safety systems of a
nuclear power station.
2) Please don't run the script on anything labelled "life support".
3) If you work in a lab, have played "Far Cry", and thought "This
looks familiar" at any point during the game, please don't run the
script.

#!/usr/bin/perl

$filename = shift || die("Need to be passed a test file.\n");

# mkfile creates a 5G file of NUL bytes; since it contains no newlines,
# the single <FILE> read below slurps the whole file into one scalar.
`/usr/sbin/mkfile 5g $filename`;
unless (-s "$filename") {
    die("Failed to create testfile $filename.\n");
}

open(FILE, "$filename") || die("Failed to open $filename: $!\n");
$bigvar = <FILE>;
close FILE;

print "I successfully read in the test file.\n";
unlink($filename) || die("Failed to remove test file $filename.\n");


Dave
 

Dr.Ruud

(e-mail address removed) wrote:
open(FILE,"$filename");
$bigvar = <FILE>;
close FILE;

Why not create the $bigvar directly?

#!/usr/bin/perl
use strict;
use warnings;

# Keep doubling the string until it passes ~3 GB; the final doubling
# takes it just past 4 GB in a single scalar.
my $b = '#' x 128;
my $n = 0;
while ($n < 3_000_000_000) {
    $b .= $b;
    print $n = length $b, "\n";
}
__END__

See also `perl -V:* |grep 32`, and try 64 too.
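
An even more direct variant of the same idea, needing a bit over 4G of free
memory: ask perl to build one string just past the 4G mark and see whether it
survives. A minimal sketch:

#!/usr/bin/perl
# Direct probe: build a single string just past the 4G mark and print
# its length. On a build (or box) that cannot go past 4G per process,
# this should die with "Out of memory!".
use strict;
use warnings;

my $target = 2**32 + 16;    # a hair over 4G
print length("\0" x $target), " bytes held in one scalar\n";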
 

xhoster

I got a perl error (which I will concede may well have originated as
an OS error), and I have run other applications (written in C++
specifically) right up to 20+G of memory with no issues (I'm sorry - I
have a number of servers with 64G of RAM - its not my fault). I was
really hoping for someone to say "This definitely isn't a general perl
thing, because I have done this on my Linux box".

This definitely isn't a general Perl thing, because I routinely have used
10G or more on my x86_64 Linux box.

Xho
 

ruu

Can you find the (mystery) error message in perldiag.pod?

It isn't a mystery - the error is "Out of Memory!" - I'm sorry if I
wasn't clear enough. The pod only suggests ulimit tweaks, and all the
relevant settings are already unlimited as far as I can tell
(mentioned in my first post).

I wanted to limit the amount of memory the script used to a known
amount (5G in the example) and do it as quickly as possible. It was
the first thing that popped into my head to be honest, and it seemed
to work relatively quickly.

Dave
 

Ilya Zakharevich

[A complimentary Cc of this posting was sent to

open(FILE,"$filename");
$bigvar = <FILE>;
close FILE;

Typical users underestimate the amount of memory used by "a
statement". E.g., if no optimizations were present, the statement
above could use about 50GB+. Look at it:

a) You read data of unknown size into a temporary variable; the buffer
used by this variable is realloc()ed about 150 times. Assume the
"trail" of old buffers takes about 35GB; then the total size used is
40G + memory overhead on a 5GB allocation.

b) The temporary variable is copied into $bigvar (another 5GB + memory
overhead);

c) The copied value is also the value of the statement; another 5G +
memory overhead is sitting on the Perl stack.

AFAIR, "b" and "c" are currently optimized away. However, "a" is
fully dependent on the malloc() implementation, and (unless you use
Perl's malloc()) is out of Perl's control. (Perl's malloc() would use
about 8GB.)

My advice is to redo the test with 3GB allocation, and check the
actual memory usage.

Hope this helps,
Ilya
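
If the worry is the realloc trail from slurping via <FILE>, a minimal sketch
along these lines (the 64M chunk size is an arbitrary choice) reads the same
test file piecewise with sysread(), so the scalar's growth is explicit and
easy to watch in top/prstat:

#!/usr/bin/perl
# Read the test file in fixed-size chunks with sysread() and an explicit
# offset, so the target scalar grows in predictable 64M steps.
use strict;
use warnings;

my $filename = shift || die("Need to be passed a test file.\n");
open(my $fh, '<:raw', $filename) || die("Failed to open $filename: $!\n");

my $bigvar = '';
my $chunk  = 64 * 1024 * 1024;
my $offset = 0;
while (1) {
    my $got = sysread($fh, $bigvar, $chunk, $offset);
    die("sysread failed: $!\n") unless defined $got;
    last unless $got;        # 0 bytes read means end of file
    $offset += $got;
}
close $fh;
print "Read $offset bytes into one scalar.\n";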
 

ruu

[A complimentary Cc of this posting was sent to

open(FILE,"$filename");
$bigvar = <FILE>;
close FILE;

Typical users underestimate the amount of memory used by "a
statement". E.g., if no optimizations were present, the statement
above could use about 50GB+. Look at it:

a) You read data of unknown size into a temporary variable; the buffer
used by this variable is realloc()ed about 150 times. Assume the
"trail" of old buffers takes about 35GB; then the total size used is
40G + memory overhead on a 5GB allocation.

b) The temporary variable is copied into $bigvar (another 5GB + memory
overhead);

c) The copied value is also the value of the statement; another 5G +
memory overhead is sitting on the Perl stack.

AFAIR, "b" and "c" are currently optimized away. However, "a" is
fully dependent on the malloc() implementation, and (unless you use
Perl's malloc()) is out of Perl's control. (Perl's malloc() would use
about 8GB.)

My advice is to redo the test with 3GB allocation, and check the
actual memory usage.

Hope this helps,
Ilya


The first thing I did with this script was use a 3G memory size, which
used a tiny bit more than 3G of memory and worked fine (have you
tested your logic on your architecture? What you describe just doesn't
seem to happen on my Solaris boxes). Then I moved up to over 4G, and
it failed with "Out of Memory!", which I expected. This really isn't
the point anyway - the file example was just a way to build any memory
structure greater than 4G and see what would happen, and if the script
had sat there and used memory all the way up to 50G+ (which honestly
wouldn't have failed on the servers I am using anyway), then I would
probably have noticed on my 10th or 11th attempt while
watching top/prstat. What I actually needed to work was a sizeable
hash mapping customer IDs to customer details, but since it is a lot
more difficult to write a test script that keeps memory under control,
I went with the much simpler "read a file" test script instead.

My point is fairly simple - it doesn't matter what data structure I am
using, be it reading a huge file into a single variable or
maintaining a hash of 50+ million entries: as soon as the script uses
4G of memory, it dies. I'm about as sure as I can be that the problem
isn't in the code, but either in the way that perl was built or in
something the OS is doing. Since this isn't a Solaris group,
mostly I was hoping someone with a lot of experience building perl
might be able to make some suggestions on optimizing the build.

Dave
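
For the actual workload (a large ID-to-details hash), a sketch along these
lines, assuming CPAN's Devel::Size is installed and using a made-up record
format, grows the hash in steps and reports its approximate footprint, which
should show whether the process dies at the same 4G point regardless of the
data structure:

#!/usr/bin/perl
# Grow a synthetic customer hash in 5-million-key steps and report its
# approximate size. Requires Devel::Size from CPAN; the record format
# below is a placeholder, not real customer data.
use strict;
use warnings;
use Devel::Size qw(total_size);

my %customers;
my $id = 0;
for my $step (1 .. 10) {
    for (1 .. 5_000_000) {
        $customers{ ++$id } = "name:$id;address:somewhere;balance:0";
    }
    # total_size() walks the whole structure, so it gets slow as the
    # hash grows; it is only here to make the growth visible.
    printf "%3d million keys, approx %.2f GB\n",
        $step * 5, total_size(\%customers) / 2**30;
}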
 

Ilya Zakharevich

[A complimentary Cc of this posting was sent to

The first thing I did with this script was use a 3G memory size, which
used a tiny bit more than 3G of memory and worked fine (have you
tested your logic on your architecture? What you describe just doesn't
seem to happen on my Solaris boxes).

Then Solaris' malloc() got better than it used to be...

Then I moved up to over 4G, and it failed with "Out of Memory!"

From watching comp.sys.sun.hardware, I also remember some mentions of
"secret handshakes" :-(. Look for postings by Casper.

But the most probable cause is wrong compiler flags. You did not post
your -V output (not that I know the correct flags for a full 64-bit build
on Solaris)...

Hope this helps,
Ilya
 
