Swap memory in Python? - three questions


Robert LaMarca

Hi,

I am using numpy and wish to create very large arrays. My system is an AMD64 X2 running Ubuntu 8.04; Ubuntu should be 64-bit. I have 3 GB of RAM and a 15 GB swap drive.

The command I have been trying to use is:

import numpy
g = numpy.ones([1000, 1000, 1000], numpy.int32)

This returns a MemoryError.
A smaller array ([500, 500, 500]) worked fine.
Two of the smaller arrays again crashed the system.

So... I did the math. A 1000x1000x1000 array of 32-bit integers should be around 4 GB, obviously larger than RAM, but much smaller than the swap drive.
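
A quick sketch of that arithmetic, computed from NumPy's own dtype metadata (Python 2 style, to match the rest of the thread):

import numpy

# 10**9 elements at 4 bytes each
nbytes = 1000 * 1000 * 1000 * numpy.dtype(numpy.int32).itemsize
print nbytes              # 4000000000, i.e. about 3.7 GiB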

1. So... does NumPy really have a lot of overhead? Or is my system somehow not making use of the 15 GB swap area?
2. Is there a way I can access the swap area, or direct NumPy to do so? Or do I have to write my own caching system for NumPy?
3. How difficult is it to use data compression internally on NumPy arrays?

Thanks very much,
Robert
 

Marc Christiansen

Robert LaMarca said:
> Hi,
>
> I am using numpy and wish to create very large arrays. My system is
> an AMD64 X2 running Ubuntu 8.04; Ubuntu should be 64-bit. I have
> 3 GB of RAM and a 15 GB swap drive.
>
> The command I have been trying to use is:
>
> import numpy
> g = numpy.ones([1000, 1000, 1000], numpy.int32)
>
> This returns a MemoryError.

Works for me on an AMD64 X2 with 2 GB RAM and 3 GB swap, with much
paging of course ;) The python process size was 4019508 kB.
> A smaller array ([500, 500, 500]) worked fine.

About 0.5 GB, no surprise.

> Two of the smaller arrays again crashed the system.

Crash as in "computer reboots"? Strange. And you say that two
[500, 500, 500] arrays are too much?

> So... I did the math. A 1000x1000x1000 array of 32-bit integers
> should be around 4 GB, obviously larger than RAM, but much smaller
> than the swap drive.

Sounds like either your kernel or your python is not 64-bit.
For the kernel, have a look at /proc/meminfo and check the value of
VmallocTotal. I get VmallocTotal: 34359738367 kB.
For python, check sys.maxint. If it is 2147483647, you have a 32-bit
python. Mine says 9223372036854775807.
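
Both checks are easy to script; a minimal Python 2 sketch, where
struct.calcsize("P") gives the running interpreter's pointer width:

import struct
import sys

print sys.maxint                  # 2147483647 on a 32-bit python
print struct.calcsize("P") * 8    # pointer width in bits: 32 or 64

# the kernel-side value mentioned above
for line in open("/proc/meminfo"):
    if line.startswith("VmallocTotal"):
        print line.strip()        # e.g. VmallocTotal: 34359738367 kB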
> 1. So... does NumPy really have a lot of overhead? Or is my system
> somehow not making use of the 15 GB swap area?

No to the first; yes to the second.

> 2. Is there a way I can access the swap area, or direct NumPy to do
> so? Or do I have to write my own caching system for NumPy?
> 3. How difficult is it to use data compression internally on NumPy
> arrays?

2 and 3: Neither should be necessary once your python is 64-bit.
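
If you do want the data explicitly backed by a file on disk instead of
anonymous swap, numpy.memmap can do that; a minimal sketch (the path is
just for illustration, and a 64-bit python is still needed to map 4 GB):

import numpy

# File-backed array: pages are faulted in on demand, so only the
# parts you touch occupy physical memory.
g = numpy.memmap("/tmp/bigarray.dat", dtype=numpy.int32,
                 mode="w+", shape=(1000, 1000, 1000))
g[0, 0, 0] = 1
g.flush()    # write dirty pages back to the file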

I just tried a 32-bit python on my 64-bit system, using Numeric instead
of numpy (I don't have a 32-bit numpy ready):

g = Numeric.zeros([1000, 1000, 1000], Numeric.Int32)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
MemoryError: can't allocate memory for array

g = Numeric.zeros([1000, 1000, 500], Numeric.Int32) succeeds.
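
That boundary fits a 32-bit address space: the failing array needs about
4 GB, more than a 32-bit Linux process can address, while the succeeding
one needs only 2 GB:

print 1000 * 1000 * 1000 * 4    # 4000000000 bytes: too big for 32 bits
print 1000 * 1000 * 500 * 4     # 2000000000 bytes: fits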

So it looks like your python (or your kernel) is 32-bit.

Marc
 

John Nagle

Robert said:
> Hi,
>
> I am using numpy and wish to create very large arrays. My system is
> an AMD64 X2 running Ubuntu 8.04; Ubuntu should be 64-bit. I have
> 3 GB of RAM and a 15 GB swap drive.

Does a full 64-bit version of CPython, one where all pointers
and sizes are 64 bits, even exist?

John Nagle
 

Tim Roberts

John Nagle said:
> Does a full 64-bit version of CPython, one where all pointers
> and sizes are 64 bits, even exist?

Absolutely.

[timr@naxier ~]# uname -a
Linux naxier.xxxxxx.com 2.6.9-42.0.3.ELsmp #1 SMP Mon Sep 25 17:24:31 EDT
2006 x86_64 x86_64 x86_64 GNU/Linux
[timr@naxier ~]# python
Python 2.3.4 (#1, Feb 18 2008, 17:16:53)
[GCC 3.4.6 20060404 (Red Hat 3.4.6-9)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
 
