Memory Limitations for Perl Programs?

Discussion in 'Perl Misc' started by Chris Hamel, Sep 5, 2006.

  1. Chris Hamel

    Chris Hamel Guest

    This may not be a Perl issue per se, but our Unix support people don't
    have any insight and I was hoping to get some direction.

    We have Perl (5.8.0) installed on an AIX server. The server itself is
    capable of handling 32-bit processes as large as about 2 GB, and we've
    never found a ceiling for the 64-bit processes (only 32 GB on the
    server, and we've run programs as large as 22 GB). When I run programs
    in Perl, however, they core dump once the process hits about 250 MB.

    In many cases, we can get around this by using the BerkeleyDB module,
    but this results in a significant performance hit and requires some
    data structure reengineering.
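    The tie itself looks roughly like this (a minimal sketch; the file
    name is made up). One catch is that a tied hash can only store flat
    strings, which is why we join fields together rather than storing
    array references:

        use BerkeleyDB;

        # Tie %part_info to an on-disk hash so it no longer lives in RAM.
        tie my %part_info, 'BerkeleyDB::Hash',
            -Filename => 'part_info.db',
            -Flags    => DB_CREATE
            or die "Cannot open part_info.db: $BerkeleyDB::Error";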

    Unfortunately, I am fairly Unix ignorant. I did not install Perl on
    Unix, nor would I know how to. I just use what the Unix admins
    installed for us. What I'm trying to find out is if there is a runtime
    option or an option on installation (compilation?) that enables Perl to
    have a higher threshold than 250 MB... or is this a limitation built
    into Perl?

    Any information or feedback I can pass on to our Unix admins would be
    most appreciated.

    Also, for what it's worth, our programs really are that large. We've
    done a number of things to try to reduce the footprint of the programs
    (other than what's in perldoc -q memory). One example I learned is
    that doing this:

    $part_info{$part} = [ $nomenclature, $cost, $min_qty, $max_qty ];

    takes up more memory than this:

    $part_info{$part} = join '|', $nomenclature, $cost, $min_qty,
    $max_qty;

    (not to mention not working with Berkeley). But the bottom line is the
    data we bring together is huge.
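    If you want to measure the difference yourself, the Devel::Size
    module from CPAN can do it (a sketch; the sample values are made up):

        use Devel::Size qw(total_size);

        my %as_ref = ( p1 => [ 'widget', 9.99, 1, 100 ] );
        my %as_str = ( p1 => join '|', 'widget', 9.99, 1, 100 );

        # total_size() walks a structure and counts every byte perl uses for it.
        printf "arrayref value: %d bytes\n", total_size(\%as_ref);
        printf "joined string:  %d bytes\n", total_size(\%as_str);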

    Thanks in advance for any insight or direction, and I apologize in
    advance if this has more to do with the OS than with Perl...

    Chris H.
    Chris Hamel, Sep 5, 2006
    #1

  2. tuser

    tuser Guest

    Chris Hamel wrote:
    > This may not be a Perl issue per se, but our Unix support people don't
    > have any insight and I was hoping to get some direction.
    >
    > We have Perl (5.8.0) installed on an AIX server. The server itself is
    > capable of handling 32-bit processes as large as about 2 GB, and we've
    > never found a ceiling for the 64-bit processes (only 32 GB on the
    > server, and we've run programs as large as 22 GB). When I run programs
    > in Perl, however, they core dump once the process hits about 250 MB.


    I don't claim to know the answer, but I think it might be helpful if
    you could show us what is displayed after you issue the command "perl
    -V" (that's perl in lowercase and with a capital "V") on your console.
    tuser, Sep 5, 2006
    #2

  3. Chris Hamel <> wrote:

    > or is this a limitation built
    > into Perl?



    No.

    perl will happily use all the available memory.


    --
    Tad McClellan SGML consulting
    Perl programming
    Fort Worth, Texas
    Tad McClellan, Sep 5, 2006
    #3
  4. In article <>, Chris Hamel says...
    >
    >
    >This may not be a Perl issue per se, but our Unix support people don't
    >have any insight and I was hoping to get some direction.
    >
    >We have Perl (5.8.0) installed on an AIX server. The server itself is
    >capable of handling 32-bit processes as large as about 2 GB, and we've
    >never found a ceiling for the 64-bit processes (only 32 GB on the
    >server, and we've run programs as large as 22 GB). When I run programs
    >in Perl, however, they core dump once the process hits about 250 MB.


    If your perl is 32 bit (as /usr/opt/perl5/bin/perl that comes with AIX),
    the following may help:

    http://publib16.boulder.ibm.com/doc...prggd/genprogc/lrg_prg_support.htm#a179c11c5d

    cheers

    Heinrich

    --
    Heinrich Mislik
    Zentraler Informatikdienst der Universitaet Wien
    A-1010 Wien, Universitaetsstrasse 7
    Tel.: (+43 1) 4277-14056, Fax: (+43 1) 4277-9140
    Heinrich Mislik, Sep 5, 2006
    #4
  5. Xho

    Guest

    "Chris Hamel" <> wrote:
    > This may not be a Perl issue per se, but our Unix support people don't
    > have any insight and I was hoping to get some direction.
    >
    > We have Perl (5.8.0) installed on an AIX server.


    You should upgrade that. 5.8.0 is fairly buggy (but I have no particular
    reason to think that that is the source of this particular problem.)

    > The server itself is
    > capable of handling 32-bit processes as large as about 2 GB, and we've
    > never found a ceiling for the 64-bit processes (only 32 GB on the
    > server, and we've run programs as large as 22 GB). When I run programs
    > in Perl, however, they core dump once the process hits about 250 MB.


    Is this with a wide variety of perl programs, or have you only tried
    one program? How about something like:

    perl -le 'my @x=1..1e7; system "ps -p $$ -o rss"; sleep'

    Can you run large non-Perl processes from the very same account
    in which you are having problems with perl? (It could be a shell
    limit/ulimit problem.)
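
    If the BSD::Resource module from CPAN happens to be installed, you
    can also check the limits from inside Perl (a sketch; -1 means
    unlimited on most builds):

        use BSD::Resource qw(getrlimit RLIMIT_DATA);

        # Soft and hard ceilings, in bytes, on the process data segment.
        my ($soft, $hard) = getrlimit(RLIMIT_DATA);
        print "data segment soft limit: $soft\n";
        print "data segment hard limit: $hard\n";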


    >
    > In many cases, we can get around this by using the BerkeleyDB module,
    > but this results in a significant performance hit and requires some
    > data structure reengineering.


    Have you considered using something like MySQL or PostgreSQL?

    > Unfortunately, I am fairly Unix ignorant. I did not install Perl on
    > Unix, nor would I know how to. I just use what the Unix admins
    > installed for us. What I'm trying to find out is if there is a runtime
    > option or an option on installation (compilation?) that enables Perl to
    > have a higher threshold than 250 MB... or is this a limitation built
    > into Perl?
    >
    > Any information or feedback I can pass on to our Unix admins would be
    > most appreciated.


    There is no such limitation intentionally built into perl. It is probably
    either something with your system itself, or some buggy interaction between
    your perl build and your system.

    > Also, for what it's worth, our programs really are that large. We've
    > done a number of things to try to reduce the footprint of the programs
    > (other than what's in perldoc -q memory). One example I learned is
    > that doing this:
    >
    > $part_info{$part} = [ $nomenclature, $cost, $min_qty, $max_qty ];
    >
    > takes up more memory than this:
    >
    > $part_info{$part} = join '|', $nomenclature, $cost, $min_qty,
    > $max_qty;
    >
    > (not to mention not working with Berkeley). But the bottom line is the
    > data we bring together is huge.


    Are you sure you can't bring it together outside of Perl, then? What you
    have shown seems like the raison d’être for SQL databases.
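
    For illustration, a minimal DBI sketch (the table and column names
    are made up, and DBD::SQLite merely stands in for whatever server
    you'd actually pick):

        use DBI;

        my $dbh = DBI->connect("dbi:SQLite:dbname=parts.db", "", "",
            { RaiseError => 1 });

        $dbh->do(q{
            CREATE TABLE IF NOT EXISTS part_info (
                part         TEXT PRIMARY KEY,
                nomenclature TEXT,
                cost         REAL,
                min_qty      INTEGER,
                max_qty      INTEGER
            )
        });

        # The database, not a giant in-memory hash, holds the data.
        my $sth = $dbh->prepare(
            'INSERT OR REPLACE INTO part_info VALUES (?, ?, ?, ?, ?)');
        $sth->execute('p1', 'widget', 9.99, 1, 100);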

    Xho

    --
    -------------------- http://NewsReader.Com/ --------------------
    Usenet Newsgroup Service $9.95/Month 30GB
    Xho, Sep 5, 2006
    #5
  6. Chris Hamel

    Chris Hamel Guest

    In Reply to various posts...

    ONE.

    > it might be helpful if you could show us what is displayed after you
    > issue the command "perl -V"


    When I run perl -V, I get the following:

    Summary of my perl5 (revision 5.0 version 8 subversion 0) configuration:
      Platform:
        osname=aix, osvers=4.3.3.0, archname=aix
        uname='aix prod21 3 4 000a932f4c00 '
        config_args='-des'
        hint=recommended, useposix=true, d_sigaction=define
        usethreads=undef use5005threads=undef useithreads=undef
        usemultiplicity=undef
        useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
        use64bitint=undef use64bitall=undef uselongdouble=undef
        usemymalloc=n, bincompat5005=undef
      Compiler:
        cc='cc', ccflags ='-D_ALL_SOURCE -D_ANSI_C_SOURCE -D_POSIX_SOURCE
          -qmaxmem=16384 -qnoansialias -DUSE_NATIVE_DLOPEN -q32
          -D_LARGE_FILES -qlonglong',
        optimize='-O',
        cppflags='-D_ALL_SOURCE -D_ANSI_C_SOURCE -D_POSIX_SOURCE
          -qmaxmem=16384 -qnoansialias -DUSE_NATIVE_DLOPEN'
        ccversion='3.6.4.0', gccversion='', gccosandvers=''
        intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=4321
        d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=8
        ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='off_t',
        lseeksize=8
        alignbytes=8, prototype=define
      Linker and Libraries:
        ld='ld', ldflags =' -brtl -L/usr/local/lib -b32'
        libpth=/usr/local/lib /lib /usr/lib /usr/ccs/lib
        libs=-lbind -lnsl -ldbm -ldl -lld -lm -lc -lcrypt -lbsd -lPW
        perllibs=-lbind -lnsl -ldl -lld -lm -lc -lcrypt -lbsd -lPW
        libc=/lib/libc.a, so=a, useshrplib=false, libperl=libperl.a
        gnulibc_version=''
      Dynamic Linking:
        dlsrc=dl_aix.xs, dlext=so, d_dlsymun=undef,
        ccdlflags=' -bE:/usr/local/lib/perl5/5.8.0/aix/CORE/perl.exp'
        cccdlflags=' ', lddlflags=' -bhalt:4 -bM:SRE
          -bI:$(PERL_INC)/perl.exp -bE:$(BASEEXT).exp -bnoentry -lc
          -L/usr/local/lib'

    Characteristics of this binary (from libperl):
      Compile-time options: USE_LARGE_FILES
      Built under aix
      Compiled at Jul 17 2003 17:16:18
      @INC:
        /usr/local/lib/perl5/5.8.0/aix
        /usr/local/lib/perl5/5.8.0
        /usr/local/lib/perl5/site_perl/5.8.0/aix
        /usr/local/lib/perl5/site_perl/5.8.0
        /usr/local/lib/perl5/site_perl/5.6.1
        /usr/local/lib/perl5/site_perl
    Chris Hamel, Sep 5, 2006
    #6
  7. Christian Winter wrote:
    > Chris Hamel wrote:


    >> We have Perl (5.8.0) installed on an AIX server. The server itself is
    >> capable of handling 32-bit processes as large as about 2 GB, and we've
    >> never found a ceiling for the 64-bit processes (only 32 GB on the
    >> server, and we've run programs as large as 22 GB). When I run programs
    >> in Perl, however, they core dump once the process hits about 250 MB.


    > I've hardly had any serious encounters with AIX, so I'm just pulling
    > together the few crumbs that crossed my way: 256MB is, if I remember
    > correctly, the standard AIX segment size and the default ceiling for
    > a 32-bit process's data segment. Whether more can be used depends on the exact OS
    > version, compile time settings and/or environment settings. AFAIR
    > there is an explicit Large-Memory readme on the IBM site where this
    > is all explained in detail.
    > The Keywords for a search should be "-bmaxdata", "LARGE_PAGE_DATA" and
    > "LDR_CNTRL".


    This is absolutely what the original poster is running into. I spend
    most of my time on AIX and long ago added the -bmaxdata flag to my
    Perl builds. If you are experienced with a hex/binary editor, it's
    quite possible to update an existing perl binary manually.

    The AIX large-memory docs (as Christian mentions) even provide a recipe
    for doing this from the command line. And, while it may not be an issue
    in your case, keep in mind that increasing maxdata to 2GB reduces the
    maximum size for memory-mapped files to 512MB.

    Finally, a word of warning: This trick is absolutely _not_ required for
    64-bit binaries and will cause problems if you try it!
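
    For the archive, the two usual command-line forms (treat these as a
    sketch and check the IBM large-program docs for your AIX level;
    0x80000000 requests eight 256MB segments, i.e. 2GB):

        # Per run, no rebuild needed: set the loader control variable.
        LDR_CNTRL=MAXDATA=0x80000000 perl big_job.pl

        # Or, if your AIX level ships ldedit, patch an existing 32-bit binary:
        ldedit -bmaxdata:0x80000000 /usr/local/bin/perl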

    Steve
    Steven N. Hirsch, Sep 8, 2006
    #7
