Massive Memory Structures

Discussion in 'Perl Misc' started by ruu@cwcom.net, Jun 13, 2007.

  1. Guest

    OK - I was under the impression that perl running on a 64-bit OS,
    compiled for said 64-bit OS, would be able to handle data structures
    in memory greater than 4G in size. I've tried this on a couple of
    platforms (Solaris on SPARC, Solaris on x86_64) with both packaged
    and compiled versions of perl, with identical results - as soon as my
    script hits 4G, I get an out-of-memory error and the script dies.

    I'm pretty sure I don't have a ulimit issue, since more or less
    everything is unlimited, and other programs run by the same user can
    access over 4G of RAM quite happily.

    Two questions:

    1) Is it possible to access a structure over 4G with perl running on
    Solaris?
    2) If so, what options do I need to compile in to make this happen?

    Dave
    , Jun 13, 2007
    #1

  2. Guest

    No-one knows? Not even a "if you have to use more than 4G of memory,
    you don't know what you are doing" reply?

    Dave



    On Jun 13, 4:44 pm, wrote:
    > OK - I was under the impression that perl running on a 64-bit OS,
    > compiled for said 64-bit OS, would be able to handle data structures
    > in memory greater than 4G in size. [...]
    , Jun 15, 2007
    #2

  3. Guest

    writes:
    > No-one knows? Not even a "if you have to use more than 4G of memory,
    > you don't know what you are doing" reply?
    > Dave
    > [original post snipped]


    Possibly, if you follow the steps in the posting guidelines, such as
    posting a short, runnable program that demonstrates the problem, your
    post will both overcome the work-threshold limit and pass the "sounds
    interesting" threshold, so that one of the few people here who have
    not only the knowledge, ability, and experience but also the resources
    (a 4G+ Solaris machine) will check it out on their system and, if it
    works, post their configuration.

    Our workgroup has been moving away from Solaris boxes and toward Linux
    running on AMD Opterons for 4G/8G/16G memory setups, so even if you
    posted an example, I wouldn't have the ability to try it on a suitable
    Solaris machine.

    --
    Joel
    , Jun 15, 2007
    #3
  4. Guest

    wrote in message-id: <>

    [snip - original post quoted in full]


    [paste] (please don't top-post)
    >
    >No-one knows? Not even a "if you have to use more than 4G of memory,
    >you don't know what you are doing" reply?
    >
    >Dave


    Sounds like a Sun issue, have you tried accessing their knowledge base?
    Unfortunately I haven't got 4 GB of memory to test this on my OS.
    Perhaps this is not a Perl question at all;
    did you try this using another programming language?
    , Jun 15, 2007
    #4
  5. Guest

    wrote:
    > On Jun 13, 4:44 pm, wrote:
    > > OK - I was under the impression that perl running on a 64 bit OS,
    > > compiled for said 64 bit OS would be able to handle data structures in
    > > memory greater than 4G in size. I've tried this with a couple of OS's
    > > (Solaris on sparc, Solaris on x86_64) with both packaged and compiled
    > > versions of perl with identical results - as soon as I hit 4G with my
    > > script, I get an out of memory error and the script dies.
    > >
    > > I'm pretty sure I don't have a ulimit issue, since more or less
    > > everything is unlimited, and other programs run by the same user can
    > > access over 4G of ram quite happily.


    I thought there was a discussion here several months ago, but I can't find
    it now. It seems like the conclusion was that with Solaris you have
    to start an executable in a certain way in order to get it to work with
    more than 4G of memory, and apparently Perl isn't by default started in
    that way. It seems like there was some kind of "extended attribute",
    like some super chmod command, you could run on the perl binary file to
    tell it to start up in that special way in the future. I didn't pay much
    attention, because I don't use Solaris much, so this is all kind of fuzzy.
    Maybe a Solaris specific group would know more.


    Xho

    , Jun 15, 2007
    #5
  6. Guest

    On Jun 15, 2:55 pm, wrote:
    > wrote in message-id: <>
    >
    > [snip]
    >
    > Sounds like a Sun issue, have you tried accessing their knowledge base?
    > Unfortunately I haven't got 4 GB of memory to test this on my OS.
    > Perhaps this is not a Perl question at all;
    > did you try this using another programming language?


    I got a perl error (which I will concede may well have originated as
    an OS error), and I have run other applications (written in C++,
    specifically) right up to 20+G of memory with no issues (I'm sorry - I
    have a number of servers with 64G of RAM - it's not my fault). I was
    really hoping for someone to say "This definitely isn't a general perl
    thing, because I have done this on my Linux box". I can take this up
    with Sun directly, but chances are they are going to blame Perl, so I
    was hoping for some kind of answer as to whether it is even possible.

    Dave
    , Jun 15, 2007
    #6
  7. Guest

    On Jun 15, 2:28 pm, wrote:
    > writes:
    > [snip]
    >
    > Possibly, if you follow the steps in the posting guidelines, such as
    > posting a short, runnable program that demonstrates the problem, your
    > post will both overcome the work-threshold limit and pass the "sounds
    > interesting" threshold [...]
    >
    > Our workgroup has been moving away from Solaris boxes and toward Linux
    > running on AMD Opterons for 4G/8G/16G memory setups, so even if you
    > posted an example, I wouldn't have the ability to try it on a suitable
    > Solaris machine.
    >
    > --
    > Joel



    OK, sounds fair. If you feel like running this on a Linux system, I
    would be interested to know whether it works, even if it isn't
    Solaris.

    Below is a short script that works under Solaris (and probably
    anything that has mkfile). It will create a 5G test file and then
    attempt to pull the whole thing into $bigvar. You will need enough
    space somewhere to create the 5G file, and at least 8G of RAM to
    attempt it. Please, anyone reading this, DO NOT RUN THIS SCRIPT IF
    YOU AREN'T SURE WHAT IT WILL DO, OR ON A PRODUCTION SYSTEM - there is
    a reasonable chance that your OS may fail in exciting ways if it uses
    up all of the memory. Further rules:

    1) Please don't run the script if you manage the safety systems of a
    nuclear power station.
    2) Please don't run the script on anything labelled "life support".
    3) If you work in a lab, have played "Far Cry", and thought "this
    looks familiar" at any point during the game, please don't run the
    script.

    #!/usr/bin/perl
    # Usage: bigmem_test.pl /path/to/testfile

    my $filename = shift || die("Need to be passed a test file.\n");

    # mkfile writes a NUL-filled file, so it contains no newline
    `/usr/sbin/mkfile 5g $filename`;
    unless (-s $filename) {
        die("Failed to create test file $filename.\n");
    }

    open(FILE, '<', $filename) || die("Failed to open $filename: $!\n");
    my $bigvar = <FILE>;    # no newline in the file: slurps all 5G
    close FILE;

    print "I successfully read in the test file.\n";
    unlink($filename) || die("Failed to remove test file $filename.\n");


    Dave
    , Jun 15, 2007
    #7
  8. <> wrote:

    > I got a perl error



    Can you find the (mystery) error message in perldiag.pod?


    --
    Tad McClellan
    email: perl -le "print scalar reverse qq/moc.noitatibaher\100cmdat/"
    Tad McClellan, Jun 16, 2007
    #8
  9. Dr.Ruud Guest

    wrote:

    > open(FILE, '<', $filename) || die("Failed to open $filename: $!\n");
    > my $bigvar = <FILE>;
    > close FILE;


    Why not create the $bigvar directly?

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $b = '#' x 128;
    my $n = 0;
    while ($n < 3_000_000_000) {
        $b .= $b;                     # double the string each pass
        print $n = length $b, "\n";   # report the new length
    }
    __END__

    See also `perl -V | grep 32`, and try grepping for 64 too.

    --
    Affijn, Ruud

    "Gewoon is een tijger."
    Dr.Ruud, Jun 16, 2007
    #9
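
    A quick way to run that check (a sketch; ptrsize, ivsize, and
    use64bitall are the standard Config names - on a full 64-bit build
    the first two report 8 and the third 'define'):

    perl -V:ptrsize
    perl -V:ivsize
    perl -V:use64bitall
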
  10. Guest

    wrote:
    > I got a perl error (which I will concede may well have originated as
    > an OS error), and I have run other applications (written in C++,
    > specifically) right up to 20+G of memory with no issues (I'm sorry - I
    > have a number of servers with 64G of RAM - it's not my fault). I was
    > really hoping for someone to say "This definitely isn't a general perl
    > thing, because I have done this on my Linux box".


    This definitely isn't a general Perl thing, because I have routinely
    used 10G or more on my x86_64 Linux box.

    Xho

    , Jun 16, 2007
    #10
  11. Guest

    >> Can you find the (mystery) error message in perldiag.pod?

    It isn't a mystery - the error is "Out of Memory!" - I'm sorry if I
    wasn't clear enough. The pod only suggests ulimit tweaks, and all the
    relevant settings are already unlimited as far as I can tell (as
    mentioned in my first post).

    >> Why not create the $bigvar directly?


    I wanted to limit the amount of memory the script used to a known
    amount (5G in the example) and do it as quickly as possible. It was
    the first thing that popped into my head, to be honest, and it seemed
    to work relatively quickly.

    Dave
    , Jun 18, 2007
    #11
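
    For completeness, a hypothetical way to double-check those limits on
    Solaris (prctl and the process.max-address-space control are
    assumptions about the Solaris release in use):

    ulimit -a                                # shell limits (sh/ksh)
    prctl -n process.max-address-space $$    # Solaris resource control
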
  12. [A complimentary Cc of this posting was sent to
    <>], who wrote in article <>:
    > open(FILE, '<', $filename) || die("Failed to open $filename: $!\n");
    > my $bigvar = <FILE>;
    > close FILE;


    Typical users underestimate the amount of memory used by "a
    statement". E.g., if no optimizations were present, the statement
    above could use 50GB or more. Look at it:

    a) You read data of unknown size into a temporary variable; the
    buffer used by this variable is realloc()ed about 150 times. Assume
    the "trail" of old buffers takes about 35GB; then the total size used
    is 40GB + memory overhead on a 5GB allocation.

    b) The temporary variable is copied into $bigvar (another 5GB +
    memory overhead).

    c) The copied value is also the value of the statement; another 5GB +
    memory overhead is sitting on the Perl stack.

    AFAIR, "b" and "c" are currently optimized away. However, "a" is
    fully dependent on the malloc() implementation and (unless you use
    Perl's malloc()) is out of Perl's control. (Perl's malloc() would use
    about 8GB.)

    My advice is to redo the test with a 3GB allocation, and check the
    actual memory usage.

    Hope this helps,
    Ilya
    Ilya Zakharevich, Jun 18, 2007
    #12
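
    A side note on the realloc() trail described above - a sketch, not
    from the thread, assuming $filename is set as in Dave's script:
    reading with sysread() at an offset makes perl grow the target scalar
    up front, so the whole file can land in a single allocation.

    my $size = -s $filename;
    open(my $fh, '<', $filename) or die "open $filename: $!";
    my ($bigvar, $got) = ('', 0);
    while ($got < $size) {
        # the first call grows $bigvar to the full $size in one go
        my $read = sysread($fh, $bigvar, $size - $got, $got)
            or die "sysread failed: $!";
        $got += $read;
    }
    close($fh);
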
  13. Guest

    On Jun 18, 3:29 am, Ilya Zakharevich <> wrote:
    > [snip]
    >
    > Typical users underestimate the amount of memory used by "a
    > statement". E.g., if no optimizations were present, the statement
    > above could use 50GB or more. [...]
    >
    > My advice is to redo the test with a 3GB allocation, and check the
    > actual memory usage.



    The first thing I did with this script was use a 3G memory size, which
    used a tiny bit more than 3G of memory and worked fine (have you
    tested your logic on your architecture? What you describe just doesn't
    seem to happen on my Solaris boxes). Then I moved up to over 4G, and
    it failed with "Out of Memory!", which I expected. This really isn't
    the point anyway - the file example was just a way to try any memory
    structure greater than 4G to see what would happen, and if the script
    had sat there and used memory all the way up to 50G+ (which honestly
    wouldn't have failed on the servers I am using anyway), then I would
    probably have noticed on my 10th or 11th attempt while watching
    top/prstat. What I actually needed to work was a sizeable hash mapping
    customer IDs to customer details, but since it is a lot more difficult
    to write a test script that keeps memory under control, I went with
    the much simpler "read a file" test script instead.

    My point is fairly simple - it doesn't matter what data structure I am
    using, be it reading a huge file into a single variable or maintaining
    a hash of 50+ million entries: as soon as the script uses 4G of
    memory, it dies. I'm about as sure as I can be that the problem isn't
    in the code, but rather in the way that perl was built or in something
    the OS is doing. Since this isn't a Solaris group, mostly I was hoping
    someone with a lot of experience building perl might be able to make
    some suggestions on optimizing the build.

    Dave
    , Jun 18, 2007
    #13
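
    A minimal sketch of the hash-style test described above (the record
    format and sizes are invented for illustration): grow a hash of
    synthetic customer records and watch how far it gets.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my %cust;
    for my $id (1 .. 50_000_000) {
        # roughly 100 bytes of dummy "customer details" per entry
        $cust{$id} = sprintf("name%08d|addr%08d|", $id, $id) . 'x' x 70;
        print "$id entries\n" if $id % 1_000_000 == 0;
    }
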
  14. [A complimentary Cc of this posting was sent to
    <>], who wrote in article <>:
    > The first thing I did with this script was use a 3G memory size, which
    > used a tiny bit more than 3G of memory and worked fine (have you
    > tested your logic on your architecture? What you describe just doesn't
    > seem to happen on my Solaris boxes).


    Then Solaris' malloc() got better than it used to be...

    > Then I moved up to over 4G, and it failed with "Out of Memory!"


    From watching comp.sys.sun.hardware, I also remember some mentions of
    "secret handshakes" :-(. Look for postings by Casper.

    But the most probable cause is wrong compiler flags. You did not post
    your -V output (not that I know the correct flags for a full 64-bit
    build on Solaris)...

    Hope this helps,
    Ilya
    Ilya Zakharevich, Jun 18, 2007
    #14
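
    For reference, a sketch of what a full 64-bit build looks like
    (-Duse64bitall is a documented Configure option; the 'gcc -m64'
    compiler switch is an assumption about the toolchain):

    sh Configure -des -Duse64bitall -Dcc='gcc -m64'
    make && make test && make install
    ./perl -V:use64bitall    # should report: use64bitall='define'
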
