Python memory deallocation


mariano.difelice

Hi,
I have a big memory problem with my application.

First, an example:

If I write:

a = range(500*1024)

I see that the Python process allocates approximately 80MB of memory.
What can I do to DEALLOCATE this memory, or a good part of it?

My real problem is that my program works with many photos, which I
open with the PIL package.
When I start it, my program allocates approximately 200MB of memory.

If I abort the current work and restart without restarting the Python
process, the memory usage goes up to approximately 380-400MB.

I would like to find something that DEallocates the memory that is no
longer used.

I've tried the gc Python module, but it doesn't work well (it
deallocates approximately 20-30MB).
I've tried Destroy and the del command, but the memory doesn't go down.

Thx

I'm very desperate
 

Tim N. van der Leeuw

Hi,

'del a' should remove 'a' as a reference to the list created by the
'range' function.
If that was also the last reference, the list can now be garbage-collected.

Of course, the question is whether you really need to allocate such a big
amount of memory. If you just need to iterate over some numbers, it's
better to use 'xrange' instead of 'range': 'xrange' will not create the
whole list of numbers in advance, but will create an iterator,
producing the desired numbers one by one. With such large ranges, this
will reduce memory consumption significantly.
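
For example (a minimal sketch of the difference; the loop body is just a
placeholder):

# range() builds the full list of 512000 ints before the loop starts:
total = 0
for i in range(500 * 1024):
    total += i

# xrange() hands out the numbers one by one, so no big list is ever built:
total = 0
for i in xrange(500 * 1024):
    total += i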

Cheers,

--Tim
 

Diez B. Roggisch

mariano.difelice said:
I see that the Python process allocates approximately 80MB of memory.
What can I do to DEALLOCATE this memory, or a good part of it?
[ ... ]
I'm very desperate

No need to. Just because the process seems to be that large doesn't mean
that there is so much memory consumed by Python. It's just that modern OSes
will not correctly display the amount of memory really consumed, as they
aren't too eager to reclaim virtual memory once it has been granted to a
process. I guess it's an optimization scheme. For example, if I do

a = range(500*1024 * 50)

the VmSize of the python process grows to 500MB. If I then issue a

del a

it shrinks to 300MB. Reissuing

a = range(500*1024 * 50)

will make it grow - but only to about 400MB. Then I do

del a
a = [chr(i % 255) for i in xrange(500*1024 * 50)]

This time I used strings, to circumvent some internal caching Python might
do on numbers. Yet still - the overall VmSize is 400MB.

Bottom line: don't get confused by the task manager's memory size display.
You only have to be concerned when the consumption steadily grows - which
indicates a memory leak.
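
If you want to watch this yourself without a task manager, something along
these lines works on Linux (a sketch only; the /proc parsing is a hypothetical
helper, not part of Python):

def vmsize():
    # read the VmSize line that top / the task manager also reports
    for line in open('/proc/self/status'):
        if line.startswith('VmSize:'):
            return ' '.join(line.split()[1:])
    return 'unknown'

print vmsize()                      # baseline
a = range(500 * 1024 * 50)          # big allocation
print vmsize()                      # VmSize jumps by hundreds of MB
del a                               # drop the only reference
print vmsize()                      # shrinks somewhat, but not back to baseline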

Diez
 

Sion Arrowsmith

mariano.difelice said:
If I write:

a = range(500*1024)

I see that the Python process allocates approximately 80MB of memory.
What can I do to DEALLOCATE this memory, or a good part of it?
[ ... ]
I've tried Destroy and the del command, but the memory doesn't go down.

It won't (much). When an object gets garbage-collected, Python
will keep hold of the memory and reuse it. Note how much memory
your process is using after assigning a above, then do:

del a

Of course, you've seen this doesn't release all the memory that was
being used back to the OS. But now do:

a = range(500*1024)

and you'll see that you're using no more memory than you were
after the first assignment. If your memory usage keeps on growing
then either (a) your program needs that much memory for the data,
and you'll just have to stick more in your box or deal with
swapping if this is causing you a problem, or (b) you've got some
stray references left over to objects you think you've deleted.
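
A quick way to see this for yourself (just a sketch; watch the process in
top or the task manager while it runs):

for cycle in range(10):
    a = range(500 * 1024)   # allocate ~half a million ints
    del a                   # drop our only reference
    # pause so you can check the process size between cycles:
    raw_input('cycle %d done - press Enter for the next one ' % cycle)
# memory should level off after the first cycle instead of climbing
# by the same amount every time through the loop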
 

mariano.difelice

Well, you're right about range vs xrange, ok, but this was a simple
example...
I repeat that my program doesn't work with range or xrange...

I MUST find a way to deallocate memory...
Otherwise, my application crashes soon after it reaches the system's
breaking point.
 

mariano.difelice

Ok, this is true.

Well, consider that my app has a first window, where I choose, for
example, application 1.
Application 1 is started, and it allocates 200MB in total.
Now I abort this operation and return to the main initial window. The
memory usage stays at 200MB, even though I've destroyed the
application 1 object with app1.Destroy() and del app1.
When I re-choose application 1 from the initial window, the memory
usage doesn't stay at 200MB, but increases to 360-380MB, and I think
this is not good!!!

If I repeat this procedure 5-6 times, the application will crash...

And it's not good!!!
 

Heiko Wundram

On Thursday, 11 May 2006 at 15:15, (e-mail address removed) wrote:
I MUST find a way to deallocate memory...
Otherwise, my application crashes soon after it reaches the system's
breaking point.

As was said before: as long as you keep a reference to an object, the object's
storage _will not be_ reused by Python for any other objects (which is
sensible, or would you like your object to be overwritten by other objects
before you're done with it?). Besides, even if Python did free the memory
that was used, the operating system wouldn't pick it up (in the general case)
anyway (because of fragmentation issues), so Python keeping the memory in an
internal free-list for new objects is a sensible choice the Python developers
took here.

Basically, what I think is happening is that you are loading all images that
you use for your program into memory at once, and these simply will eat up
your memory until your program crashes (because Python can't know you no
longer need them). As you keep a reference to the image (somewhere,
someplace), del(eting) the reference in one scope isn't going to free the
memory for Python (or PIL in this case) to reuse for the next image. An
object is kept alive because of a memory leak inherent to your application.

This is a programming problem, not a Python problem. And, if you don't post
any sources, we won't be able to help you much here, save to tell you to look
closely at where you create objects and where you store them.
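
One way to look for such stray references is to ask the garbage collector
directly (a rough sketch; it assumes the classic 'import Image' spelling of
PIL and that the leaking objects really are PIL images):

import gc
import Image   # classic PIL import

def find_live_images():
    images = [o for o in gc.get_objects() if isinstance(o, Image.Image)]
    print '%d Image objects still alive' % len(images)
    for img in images:
        for referrer in gc.get_referrers(img):
            if referrer is images:      # skip the list we just built
                continue
            print '  held by:', type(referrer)

# call find_live_images() right after the abort / Destroy / del step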

PIL isn't known to have any memory leaks, by the way (AFAICT), just to confirm
what I've written before, but the effbot should be of more help here...

--- Heiko.
 

Serge Orlov

Heiko said:
[ ... ] Besides, even if Python did free the memory that was used, the
operating system wouldn't pick it up (in the general case) anyway (because
of fragmentation issues), so Python keeping the memory in an internal
free-list for new objects is a sensible choice the Python developers took
here.

BTW, Python 2.5 now returns free memory to the OS, but if a program keeps
allocating more memory with each new iteration under Python 2.4, it will
not help.
 

bruno at modulix

mariano.difelice said:
The memory usage stays at 200MB, even though I've destroyed the
application 1 object with app1.Destroy() and del app1.
When I re-choose application 1 from the initial window, the memory
usage doesn't stay at 200MB, but increases to 360-380MB [ ... ]

If I repeat this procedure 5-6 times, the application will crash...

Either there are references somewhere keeping some objects alive, or
you're facing a known problem with pymalloc() [1]. In the second case,
the good news is that this problem seems to be solved in Python 2.5 [2].
The bad news is that Python 2.5 is still alpha...

Just a question: do you really need your app to be monolithic? If
your application is really composed of many applications (which is
what I understand from your example), you should probably have many
distinct applications, one of them being in charge of running the
others. This way, 'aborting' an app would kill the corresponding
process, and free memory.
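
A rough sketch of that idea (photoalbum.py is a made-up name for the
sub-application; the subprocess module needs Python 2.4+, on 2.3 os.spawn*
or os.system would do the same job):

import subprocess

def run_photoalbum():
    # blocks until the sub-application exits; when the process dies,
    # *all* of its memory goes back to the OS, whatever Python kept cached
    return subprocess.call(['python', 'photoalbum.py'])

rc = run_photoalbum()
print 'sub-application finished with exit code', rc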


[1] or it's yet another problem, of course - CS wouldn't be that fun if
it was so simple :(

[2] http://docs.python.org/dev/whatsnew/section-other.html

My 2 cents
 

mardif

In Python 2.5 this was resolved, ok, but I can't use any Python version
other than 2.3.5.
This project was started with this version, and now it could be
dangerous to change version, also because I use the McMillan installer
to compile and build an executable.

So, my initial window is a "menu window", where I can choose, for
example, to upload photos for printing, or to create an object such as
a PhotoAlbum.

If I choose PhotoAlbum, it will allocate 200MB. Well! If I decide
to abort this work, I return to the initial window, and if I
click on PhotoAlbum again, the memory will increase to 350-400MB.

That's it!!! I don't understand why!!! When I abort the work the first
time, I call

app.Destroy()
del app

Why doesn't Python (or the garbage collector, or....) deallocate the memory??
If that's not possible, why doesn't Python reuse the "cached" memory?

thx a lot
 

Diez B. Roggisch

mardif said:
In Python 2.5 this was resolved, ok, but I can't use any Python version
other than 2.3.5.
[ ... ]
When I abort the work the first time, I call

app.Destroy()
del app

Why doesn't Python (or the garbage collector, or....) deallocate the memory??
If that's not possible, why doesn't Python reuse the "cached" memory?

Because you might have kept references somewhere. And nobody knows where
without you showing code.

Lots of long-running, memory-intensive apps such as ZODB are written in
Python - so I doubt that it is a problem of Python itself rather than of
your program. Try to reproduce the behavior with a self-contained script -
if that still exposes the problem, we can look at it and maybe fix Python
if needed.
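
Something along these lines would do as a starting point (a sketch only;
'test.jpg' is a placeholder, and it assumes the classic PIL import):

import gc
import Image   # classic PIL import

def one_cycle(path):
    img = Image.open(path)
    img.load()        # force the pixel data to be read
    del img           # drop our only reference

for cycle in range(20):
    one_cycle('test.jpg')
    gc.collect()
    raw_input('cycle %d done - check the process size, then press Enter ' % cycle)
# if memory levels off here but grows in the real application,
# the stray references are in the application, not in Python or PIL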

Besides: _if_ it was a Python error, you'd have to move to a new Python
version anyway, wouldn't you? And just because your McMillan Installer won't
work doesn't mean that YOU can't try Python 2.4 or even 2.5 and see if it
works better.

Diez
 

Michele Petrazzo

Heiko said:
[ ... ] Python keeping the memory in an internal free-list for new objects
is a sensible choice the Python developers took here.

This isn't true. I just tried with Python 2.5a2:

d:\python25\python.exe
[ ... ]

So now, like you said, if I try to allocate another memory chunk,
Python will reuse it... But this isn't true:

d:\python25\python.exe
[ ... ]

Why doesn't Python reuse the freed memory instead of re-allocating 4 MB
(126 - 122)?

For me it's a problem...

Bye,
Michele
 

bruno at modulix

mardif said:
In Python 2.5 this was resolved, ok, but I can't use any Python version
other than 2.3.5.

This project was started with this version, and now it could be
dangerous to change version, also because I use the McMillan installer
to compile and build an executable.

Err... I'm sorry, I don't understand your problem here. You don't plan on
sticking to 2.3.5 forever, do you?
[ ... ]
When I abort the work the first time, I call

app.Destroy()
del app

Why doesn't Python (or the garbage collector, or....) deallocate the memory??

*please* re-read carefully what I and Diez wrote earlier in this thread
before jumping to possibly erroneous conclusion. I didn't say that the
problem *actually* was with Python - just that it *may* have to do with
a memory management issue fixed in 2.5. And, while we're at it: your
application uses a GUI toolkit and does image manipulations, so the
problem can also come from one of these packages...

I also suggested that it may have to do with your code keeping
references to objects. del'ing an object just decreases its reference
count - the object won't be freed until there are no more references to
it *anywhere*.
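
A tiny example of what that means in practice (Photo and cache are made-up
names):

import sys

class Photo(object):
    pass

app1 = Photo()
cache = [app1]                 # a second, easily forgotten reference
print sys.getrefcount(app1)    # 3: app1, cache[0] and getrefcount's own argument

del app1                       # only removes the *name* app1
print len(cache)               # 1 - the object is still alive inside cache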

If I was in your place, I'd *really* make sure the problem is not in my
code. Like, amongst other things, installing Python 2.5 and testing the
program with it...
 

Tim Peters

[Serge Orlov]
BTW python 2.5 now returns free memory to OS, but if a program keeps
allocating more memory with each new iteration in python 2.4, it will
not help.

No version of CPython ever returns memory to "the OS". All memory is
obtained via the platform C's malloc() or realloc(), and any memory
"returned" is given back to the platform C's free(). Whether and when
the platform C's free() in turn gives memory back to the OS is
entirely up to the OS and C's implementation of free(), varies across
OSes and platform free() implementations, typically has no easy
answer, and is something Python has no control over regardless.

It's true that Python 2.5 will, in some cases, return more memory to
free() than did previous versions of CPython. When it does, that may
or may not affect what the OS reports as the process's memory use.
 

Fredrik Lundh

bruno said:
*please* re-read carefully what I and Diez wrote earlier in this thread
before jumping to possibly erroneous conclusion. I didn't say that the
problem *actually* was with Python - just that it *may* have to do with
a memory management issue fixed in 2.5.

the only thing that has changed is that Python 2.5 is slightly more likely to release
memory areas held by the Python object allocator back to the C runtime memory
allocator. if you have a leak in your application, changing to Python 2.5 won't
change a thing.

</F>
 

Fredrik Lundh

Heiko said:
PIL isn't known to have any memory leaks, by the way (AFAICT), just to confirm
what I've written before, but the effbot should be of more help here...

PIL is heavily used in 24/7 production systems, often by people who know a lot
about how to run mission-critical systems, so memory and resource leaks in PIL
tend to be noticed.

there has been one leak fix in 1.1.6, afaik: converting a grayscale image to a palette
image would (sometimes?) leak a palette structure. but that's a couple of hundred
bytes, not a couple of hundred megabytes...

</F>
 

bruno at modulix

Fredrik said:
the only thing that has changed is that Python 2.5 is slightly more likely to release
memory areas held by the Python object allocator back to the C runtime memory
allocator. if you have a leak in your application, changing to Python 2.5 won't
change a thing.

Which is mostly what I meant - sorry if it wasn't clear.
 
