deleting objects present in a list

Sandy

Hi all,
I have a large number of objects and, to handle them properly, I
store them in a list. How can I delete all of these objects (by delete I
mean remove the object from memory, not just from the list)? I
cannot use the list to iterate through the objects and delete them,
because 'del' only reduces the reference count, and as the object is
still present in the list it is not deleted. I cannot delete the list
because then I lose control over the objects.

Can anyone give a nice solution for this?

Cheers,
dksr
 
Terry Reedy

> Hi all,
> I have a large number of objects and, to handle them properly, I
> store them in a list. How can I delete all of these objects (by delete I
> mean remove the object from memory, not just from the list)? I
> cannot use the list to iterate through the objects and delete them,
> because 'del' only reduces the reference count, and as the object is
> still present in the list it is not deleted. I cannot delete the list
> because then I lose control over the objects.

Deleting the list is the best you can do. If that deletes the last
reference, then the interpreter will delete the object when it feels
like it. For *current* CPython, this will be immediately. For other
implementations, whenever.
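A minimal sketch of this behaviour (assuming a throwaway class Thing; not
part of the original reply): take a weak reference to one of the objects
and it goes dead as soon as the list drops the last strong reference.

import weakref

class Thing(object):
    pass

objs = [Thing() for _ in range(3)]
probe = weakref.ref(objs[0])   # observe one object without keeping it alive

print(probe() is None)         # False: the list still holds a reference
del objs                       # drop the last strong references
print(probe() is None)         # True in current CPython: reclaimed at once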
 
Chris Rebert

> Hi all,
> I have a large number of objects and, to handle them properly, I
> store them in a list. How can I delete all of these objects (by delete I
> mean remove the object from memory, not just from the list)? I
> cannot use the list to iterate through the objects and delete them,
> because 'del' only reduces the reference count, and as the object is
> still present in the list it is not deleted. I cannot delete the list
> because then I lose control over the objects.

And what exactly is supposed to happen to any other references to the
objects besides the references in the list?

If there are no such references, then deleting the objects from the
list will indeed delete them "for real" (although /exactly/ when the
unreferenced objects will be garbage-collected is
implementation-dependent; in CPython, it will be right then-and-there
due to its use of refcounting).

You might want to look at using weak references
(http://docs.python.org/library/weakref.html) for all references to
the objects other than the references in the list.
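A hedged sketch of that idea (the class name Node and the list names are
made up for illustration): let the list own the only strong references and
hand out weak references everywhere else, so removing an object from the
list really does release it.

import weakref

class Node(object):
    pass

registry = [Node() for _ in range(3)]         # the list owns the objects
handles = [weakref.ref(n) for n in registry]  # non-owning references used elsewhere

registry.pop()                                # remove the last object from the list
print([h() is None for h in handles])         # [False, False, True] in CPython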

Cheers,
Chris
 
Sandy

Thanks for the replies.

Terry,
What does 'immediately' mean? I did a small test and here are the
results.

import psutil

def testing():
    class Object():
        pass

    l = {}
    apm = psutil.avail_phymem()/(1024*1024)
    print 'Before creating objs: ' + repr(apm)

    for i in xrange(500000):
        l.update({Object(): 1})

    apm = psutil.avail_phymem()/(1024*1024)
    print 'After creating objs: ' + repr(apm)
    return l

def hello():
    myl = testing()

    apm = psutil.avail_phymem()/(1024*1024)
    print 'Before deleting: ' + repr(apm)

    del myl

    # Here I want to delete the objects in the list
    # deleting myl doesn't seem to change the memory

    apm = psutil.avail_phymem()/(1024*1024)
    print 'After deleting: ' + repr(apm)


if __name__ == '__main__':
    hello()

OUTPUT:
Before creating objs: 2516L
After creating objs: 2418L
Before deleting: 2418L
After deleting: 2430L

In my original case the memory is not released even after a long
time.

- dksr
 
Chris Rebert

> What does 'immediately' mean? I did a small test and here are the
> results.
>
> [test script and output snipped; see the previous message]
>
> In my original case the memory is not released even after a long
> time.

Python does *delete* the objects, but makes no guarantees regarding
*returning memory* to the OS.
CPython holds onto the now-unused memory for a while so it's not
constantly thrashing and/or fragmenting memory by malloc()-ing some
and then free()-ing [some of] it right back.

I'm unsure if there's a way to force Python to actually free() unused
memory back to the OS.
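One way to convince yourself of that (a rough sketch, not from the thread)
is to count live instances instead of watching OS-level memory: the count
drops to zero even though the process footprint may not shrink.

class Tracked(object):
    live = 0
    def __init__(self):
        Tracked.live += 1
    def __del__(self):
        Tracked.live -= 1

d = {Tracked(): 1 for _ in range(100000)}
print(Tracked.live)   # 100000
d = None              # drop the only strong references
print(Tracked.live)   # 0 in CPython: objects freed, even if RSS stays high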

Cheers,
Chris
 
Dave Angel

(For some reason you posted your response before the message you were
replying to. That's called Top-posting, and is bad form on these
mailing lists)
> Thanks for the replies.
>
> Terry,
> What does 'immediately' mean? I did a small test and here are the
> results.
>
> [test script and output snipped; quoted in full earlier in the thread]
>
> In my original case the memory is not released even after a long
> time.
>
> - dksr
First, you're using some 3rd party library for measuring some kind of
memory usage. I'd guess you're probably using
http://code.google.com/p/psutil/. Since I'm not familiar with
how they derive these numbers, I can only make a good guess as to how
valid they are. And if avail_phymem refers to what its name implies,
it has little to do with Python. Python doesn't control physical
memory, only virtual.

So let's skip those numbers and talk about what CPython actually does.
As others have pointed out, other implementations will be different.

When you delete a large number of objects (which you can do with
myl=None), CPython may keep some of the memory under its own control for
reuse. A future object of the same size will fit nicely in the hole,
and that may be faster than calling free() and malloc() again.
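If you want to peek at that layer, newer CPython versions (3.3 and later,
CPython only) expose sys._debugmallocstats(), which dumps the state of
pymalloc's arenas and pools to stderr; a rough sketch:

import sys

objs = [object() for _ in range(200000)]
del objs
sys._debugmallocstats()   # shows the pools/arenas pymalloc keeps for reuse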

When CPython calls free(), the C runtime library almost certainly keeps
the memory for reuse by subsequent calls to malloc(). Operating system
calls for allocating and freeing memory are (usually) done in larger,
fixed-size blocks. In Windows for example, the granularity is 4k or 64k
for the more efficient methods of memory allocation. Swapfile
allocation, for example, is always in 4k multiples. See function call
VirtualAlloc(). Anyway, if there's even a single allocated byte in a
block, it can't release the block. And searching for such blocks is slow.

When the operating system is told to free something, it usually does not
"free" physical memory immediately. In the case of Windows, it marks
the block as available, and eventually a daemon task will zero it. But
it could very well be "charged to" the current process until some other
process needs the physical memory. What it does do is free it from
virtual memory. But notice that virtual memory is represented by a
swapfile on disk, and Windows doesn't support a file with gaps in it
(sparse allocation). So unless this particular allocation is at the end
of the file, the size isn't likely to go down.


If you really want to "observe" that the memory has been "released,"
I'd suggest calling a similar testing() function a second time, with
objects of the same size, and see whether the numbers get any worse.
I'd say they won't, at least not by much, assuming there's any validity
to this avail_phymem() function.

I also have to point out that using "del myl" is not necessary for
freeing up the memory. All that's necessary is for the myl name to stop
referring to the list, and the list will go away. So myl=42 will work
just as well.
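A tiny sketch of that last point (the class Item is made up), again using a
weak reference on one of the contained objects to show that rebinding the
name is enough:

import weakref

class Item(object):
    pass

myl = [Item() for _ in range(5)]
watch = weakref.ref(myl[0])

myl = 42                  # rebind the name; no 'del' needed
print(watch() is None)    # True in CPython: the list and its items are gone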

HTH
DaveA
 
Sandy

Hi,
Thanks for the detailed reply. I tried running the testing() function a
second time and, as expected, the total available memory did not
decrease further.

Cheers,
dksr
 
