perl maximum memory


david

Hi all,

Is there a maximum memory limit for a perl process ?
Is there a difference between 32 and 64 bit perl ?

Thanks,
David
 

Peter J. Holzer

> Is there a maximum memory limit for a perl process ?

The same as for any other process.

> Is there a difference between 32 and 64 bit perl ?

Yes.

32-bit processes are limited to 4 GB of address space (theoretically; in
practice the limit is more likely to be 2 GB, and Linux/i386 is rather
weird with its 3 GB limit).

For 64-bit processes the limit is theoretically 16 Exabytes, but that's
well beyond the capabilities of current hardware.
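
A quick way to check which flavour of perl you have is to look at the
sizes perl was built with (a minimal sketch; the keys come from the
standard Config module, and the output depends on your build):

    #!/usr/bin/perl
    # Report the pointer and integer sizes this perl was compiled with.
    # A 64-bit perl normally shows ptrsize=8 and ivsize=8.
    use strict;
    use warnings;
    use Config;

    printf "ptrsize:     %s bytes\n", $Config{ptrsize};
    printf "ivsize:      %s bytes\n", $Config{ivsize};
    printf "use64bitall: %s\n", $Config{use64bitall} // 'undef';

The same information is available from the command line via
perl -V:ptrsize and perl -V:use64bitall.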

hp
 
I

Ilya Zakharevich

[A complimentary Cc of this posting was NOT [per weedlist] sent to
Peter J. Holzer]

> For 64-bit processes the limit is theoretically 16 Exabytes, but that's
> well beyond the capabilities of current hardware.

??? Well *within* the capabilities of current hardware. One Ethernet
card, and you have *a possibility* of unlimited (read: limited only by
software) expansion of the available disk space, which may be
memory-mapped.

So it is a question of money only. Last time I checked, 2^32 bytes of
read/write storage cost about 40 cents; the overhead to network it is
not large... So it is less than $2B to get 2^64 bytes...
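
(For reference, the arithmetic behind that estimate - a back-of-the-envelope
sketch, assuming the 40-cent figure above:)

    # Number of 2^32-byte units needed for 2^64 bytes, times $0.40 each.
    perl -e 'printf "%d units, about \$%.1f billion\n",
             2**32, 2**32 * 0.40 / 1e9'
    # prints: 4294967296 units, about $1.7 billion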

It may be much more cost-effective to have a robotic CD changer; I do
not know the price/reliability, though...

Yours,
Ilya
 

Peter J. Holzer

> ??? Well *within* the capabilities of current hardware. One Ethernet
> card, and you have *a possibility* of unlimited (read: limited only by
> software) expansion of the available disk space, which may be
> memory-mapped.

Last time I looked there was no processor which would actually use all
64 bits in the MMU. The usable number of bits is typically somewhere
between 36 and 48, which limits the usable virtual memory (including
memory-mapped files, etc.) to 2^36 to 2^48 bytes.
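
(To put those exponents into more familiar units - a quick sketch:)

    # What 36-bit and 48-bit virtual address spaces work out to:
    perl -e 'printf "2^36 = %d GiB, 2^48 = %d TiB\n",
             2**36 / 2**30, 2**48 / 2**40'
    # prints: 2^36 = 64 GiB, 2^48 = 256 TiB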

> So it is a question of money only.

If you have enough money to develop a new MMU for your CPU, you are
right ;-).

hp
 

Ilya Zakharevich

[A complimentary Cc of this posting was NOT [per weedlist] sent to
Peter J. Holzer]

> Last time I looked there was no processor which would actually use all
> 64 bits in the MMU. The usable number of bits is typically somewhere
> between 36 and 48, which limits the usable virtual memory (including
> memory-mapped files, etc.) to 2^36 to 2^48 bytes.

So, IIUC, I misinterpreted your remark. I thought you were saying that,
currently, one can't get enough MEMORY to overflow 64 bits. And now you
are saying that one can't get enough MEMORY ADDRESS SPACE to overflow
64 bits.

> If you have enough money to develop a new MMU for your CPU, you are
> right ;-).

About 10 years ago I looked through the notes for a hardware design 101
class, and one of the first homework assignments was to design an MMU:
simple, but good enough to bootstrap a processor from (a hard disk or
whatever) sitting on a bus. It needed to catch memory accesses to a
segment of memory and translate them into bus access commands; and I
think the requirement was to design this in terms of discrete components
(transistors). So, IIRC, I think even I have enough money for such a
design. ;-)
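
(A toy illustration of what such a homework MMU has to do - catch accesses
that fall inside one memory segment and turn them into bus commands. The
addresses, segment bounds, and block size below are made up for the sketch:)

    use strict;
    use warnings;

    # Hypothetical segment that is backed by a device on the bus.
    my ($SEG_BASE, $SEG_SIZE) = (0x8000_0000, 0x0010_0000);

    sub translate {
        my ($addr) = @_;
        if ($addr >= $SEG_BASE && $addr < $SEG_BASE + $SEG_SIZE) {
            # Redirect to the bus device, addressing it in 512-byte blocks.
            return sprintf "BUS READ block %d", ($addr - $SEG_BASE) >> 9;
        }
        return sprintf "RAM access 0x%08x", $addr;
    }

    print translate(0x8000_0400), "\n";   # BUS READ block 2
    print translate(0x0000_1000), "\n";   # RAM access 0x00001000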

Yours,
Ilya

P.S. Thinking about it more: the price estimate I gave is in the
ballpark of the price of a particle physics detector (LHC, Tevatron).
Given that the current design is to throw away 99.99999% (or
whatever) of the information as early as possible, any money spent
on larger storage and memory throughput improves the chance that
the data from the experiments may (later) be used for unrelated
purposes...

P.P.S. I tried to imagine other scenarios which may quickly produce
much more than 2^64 bytes of info. First I thought of LLST
(https://www.llnl.gov/str/November05/Brase.html), but that is
only 2^55 B/year. The only other "realistic" scenario I found
is a very anxious Big Brother: a "good" video camera (I'm
thinking of IMAX-like quality, 4K x 3K x 3 x 50p; maybe not
available this year, but RSN) can easily saturate a 10GBASE
connection (as a RAW stream with minimal compression).

So if the London authorities decided to replace their spycams with
such beasts, AND wanted to preserve the RAW streams, they would
generate 10 TB/sec. This is 25e18 B/month, which is >2^64 B/month.
Viva Big Brother!
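
(For the curious, the arithmetic behind those numbers - a back-of-the-envelope
sketch assuming 4096 x 3072 pixels, 3 bytes per pixel, 50 fps, and a 30-day
month:)

    # Per-camera raw data rate, and the monthly total of a 10 TB/s feed.
    perl -e '
        my $per_cam = 4096 * 3072 * 3 * 50;      # bytes/sec per camera
        printf "per camera: %.1f Gbit/s\n", $per_cam * 8 / 1e9;
        my $monthly = 10e12 * 86400 * 30;        # bytes/month at 10 TB/s
        printf "monthly: %.1fe18 B  (2^64 is about %.1fe18 B)\n",
               $monthly / 1e18, 2**64 / 1e18;
    '
    # prints: per camera: 15.1 Gbit/s
    #         monthly: 25.9e18 B  (2^64 is about 18.4e18 B)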
 

sln

Ilya Zakharevich wrote:

[...]

> So if the London authorities decided to replace their spycams with
> such beasts, AND wanted to preserve the RAW streams, they would
> generate 10 TB/sec. This is 25e18 B/month, which is >2^64 B/month.
> Viva Big Brother!

I think this falls under the category of navel research

sln
 

Ilya Zakharevich

[A complimentary Cc of this posting was sent to
> I think this falls under the category of navel research

Hmm, all the navel research I have seen was done in (Super?) 35mm; never
in (15-sprocket) IMAX format. Do you have a particular one in mind?

Yours,
Ilya
 

John W. Krahn

Ilya said:

> > I think this falls under the category of navel research
>
> Hmm, all the navel research I have seen was done in (Super?) 35mm;

Maybe you are thinking of Super 8 (8mm)?

> never in (15-sprocket) IMAX format.

AFAIK IMAX is in 70mm, but I don't know how many sprockets the projector
has, or if that matters. (Although a video camera wouldn't have
sprockets [unless you're thinking of Sprockets on SNL, but that only
appeared 14 times.])


John
 

Ilya Zakharevich

[A complimentary Cc of this posting was sent to
John W. Krahn]

> Maybe you are thinking of Super 8 (8mm)?

Nope; in the years I was using Super 8, navels were not at the top of my
priority list.

For 35mm, see, e.g., http://www.imdb.com/title/tt0114134/technical

> AFAIK IMAX is in 70mm, but I don't know how many sprockets the projector
> has, or if that matters.

With 70mm, the stuff is tricky. First, it is 65mm when shot, 70mm
when projected. Then the usual mess strikes: how you put the actual
frames on a strip of film [*].

In fact, the mess is much nicer than with 35mm film. IIRC, during the
last 40 or so years mostly two formats were used: 65mm/15-sprocket
(= IMAX) and 65mm/5-sprocket. (One is 3 times larger than the
other! [**]) This is the same difference as landscape/portrait
printing: in one the film travels vertically, so you need to fit the
width of the frame into 65mm minus the perforation; in the other the
film travels horizontally, so you fit the height of the frame.

Yours,
Ilya

[*] For 35mm, yesterday I found these gems:

http://www.cinematography.net/edited-pages/Shooting35mm_Super35mmFor_11.85.htm
http://www.arri.de/infodown/cam/ti/format_guide.pdf

[**] I expect that for most applications current digicams (e.g.,
Red One) exceed the capabilities of 65/5 [***]. However, 65/15 at
60 fps should be significantly better than what Red One gives.
I think one must have something like 4K x 3K x 3 CCD x 48 fps to
get a comparable experience. (Which leads to the bandwidth I
mentioned before. ;-)

[***] On the other hand, this calculation is based on interpolation
from digital photos. With photos, the very high extinction
resolution of the film does not enter the equation (since the eye
does not care about *extinction* resolution, only about
noise-limited resolution - the one where the S/N ratio drops below
about 3 - and digital has an order of magnitude better noise).

But with movies, the eye averages out the noise very effectively,
so the higher extinction resolution of film may have some
relevance... I do not think I have seen it investigated anywhere;
I may want to create some digital simulations...

On the gripping hand, some people in the film industry believe
that HD video (2K x 1K) is comparable to (digitized) Super 35,
and this ratio is quite similar to what one gets from photos.
With photos, the best-processed color 36mm x 24mm frame is
approximately equivalent in resolution to a digital shot
rescaled down to 4 or 5 MPix (except that the digital shot would
have practically no noise).
 
