Memory leak/gc.get_objects()/Improved gc in version 2.5

Discussion in 'Python' started by crazy420fingers@gmail.com, Oct 9, 2007.

  1. Guest

    I'm running a python program that simulates a wireless network
    protocol for a certain number of "frames" (measure of time). I've
    observed the following:

    1. The memory consumption of the program grows as the number of frames
    I simulate increases.

    To verify this, I've used two methods, which I invoke after every
    frame simulated:

    -- Parsing the /proc/<pid>/status file as in:
    http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/286222
    -- Using ps vg | grep python | awk '!/grep/ {print " ",$8}' in an
    os.system() call.

    The memory usage vs. frame number graph shows some big "jumps" at
    certain points, and, after a large number of frames, shows a steady
    upward slope.
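
    (For reference, the /proc-based method boils down to roughly the
    sketch below. It is only an illustration of parsing the Linux
    /proc/<pid>/status file; the helper name is mine, not from the
    recipe.)

    import os

    def vm_usage(pid=None, fields=("VmSize", "VmRSS")):
        """Return the requested Vm* fields from /proc/<pid>/status, in kB."""
        if pid is None:
            pid = os.getpid()
        usage = {}
        for line in open("/proc/%d/status" % pid):
            key = line.split(":")[0]
            if key in fields:
                # lines look like "VmRSS:     12345 kB"
                usage[key] = int(line.split()[1])
        return usage

    # e.g. once per simulated frame:
    # print frameNumber, vm_usage()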

    2. I think I've verified that the objects I instantiate are actually
    freed-- I'm therefore assuming that this "leak" is "caused" by
    python's garbage collection mechanism. I count the number of objects I
    generate that are being tracked by gc as follows:

    import gc

    # Count the live objects currently tracked by the cyclic collector,
    # grouped by class name.
    gc.collect()
    objCount = {}
    objList = gc.get_objects()
    for obj in objList:
        if getattr(obj, "__class__", None):
            name = obj.__class__.__name__
            if objCount.has_key(name):
                objCount[name] += 1
            else:
                objCount[name] = 1

    for name in objCount:
        print name, " :", objCount[name]

    del objList

    Running this snippet every hundred frames or so, shows that the number
    of objects managed by gc is not growing.

    I upgraded to Python 2.5 in an attempt to solve this problem. The
    only change in my observations from version 2.4 is that the absolute
    memory usage level seems to have dropped. However, I still see the
    jumps in memory usage at the same points in time.

    Can anybody explain why the memory usage shows significant jumps (~200
    kB or ~500 kB) over time (i.e. "frames") even though there is no
    apparent increase in the objects managed by gc? Note that I'm calling
    gc.collect() regularly.

    Thanks for your attention,

    Arvind
    , Oct 9, 2007
    #1

  2. Terry Reedy Guest

    <> wrote in message
    news:...

    Questions like this about memory consumption should start with the
    information printed by the interactive interpreter on startup and
    additional info about whether the binary is from stock CPython or has 3rd
    party modules compiled in. The latter are typically the source of real
    problems.
    Terry Reedy, Oct 9, 2007
    #2

  3. Chris Mellon Guest

    On 10/8/07, <> wrote:
    > I'm running a python program that simulates a wireless network
    > protocol for a certain number of "frames" (measure of time). I've
    > observed the following:
    >
    > 1. The memory consumption of the program grows as the number of frames
    > I simulate increases.
    >
    > To verify this, I've used two methods, which I invoke after every
    > frame simulated:
    >
    > -- Parsing the /proc/<pid>/status file as in:
    > http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/286222
    > -- Using ps vg | grep python | awk '!/grep/ {print " ",$8}' in an
    > os.system() call.
    >
    > The memory usage vs. frame number graph shows some big "jumps" at
    > certain points, and, after a large number of frames, shows a steady
    > upward slope
    >


    This would be expected if you're creating ever-larger numbers of
    objects - Python uses memory pools, and as the number of simultaneous
    objects increases, the size of the pool will need to increase. This
    isn't expected if the total number of objects you create is pretty
    much static, but the way you're trying to determine that is flawed
    (see below).

    > 2. I think I've verified that the objects I instantiate are actually
    > freed-- I'm therefore assuming that this "leak" is "caused" by
    > python's garbage collection mechanism. I count the number of objects I
    > generate that are being tracked by gc as follows:
    >
    > gc.collect()
    > objCount = {}
    > objList = gc.get_objects()
    > for obj in objList:
    >     if getattr(obj, "__class__", None):
    >         name = obj.__class__.__name__
    >         if objCount.has_key(name):
    >             objCount[name] += 1
    >         else:
    >             objCount[name] = 1
    >
    > for name in objCount:
    >     print name, " :", objCount[name]
    >
    > del objList
    >
    > Running this snippet every hundred frames or so, shows that the number
    > of objects managed by gc is not growing.
    >
    > I upgraded to Python 2.5. in an attempt to solve this problem. The
    > only change in my observations from version 2.4 is that the absolute
    > memory usage level seems to have dropped. However, I still see the
    > jumps in memory usage at the same points in time.
    >
    > Can anybody explain why the memory usage shows significant jumps (~200
    > kB or ~500 kb) over time (i.e. "frames") even though there is no
    > apparent increase in the objects managed by gc? Note that I'm calling
    > gc.collect() regularly.
    >


    You're misunderstanding the purpose of Python's GC. Python is
    refcounted. The GC exists only to find and break reference cycles. If
    you don't have ref cycles, the GC doesn't do anything and you could
    just turn it off.
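
    To see the difference, a toy example (the Node class is just a
    stand-in):

    import gc

    class Node(object):
        pass

    # No cycle: the instance is freed the moment its refcount hits zero,
    # with or without the collector enabled.
    a = Node()
    del a

    # Cycle: refcounts never reach zero, so only the cyclic collector
    # (gc.collect(), or the automatic runs) can reclaim it.
    b = Node()
    b.ref = b
    del b
    print gc.collect()   # nonzero: the cycle was found and broken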

    gc.get_objects() is a snapshot of the currently existing objects, and
    won't give you any information about peak object count, which is the
    most direct correlation to total memory use.
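
    If you want something closer to the peak, you'd have to sample from
    inside the frame loop and keep a high-water mark yourself, and even
    then you only see whatever happens to be alive at the moments you
    sample. Rough sketch (names are made up):

    import gc

    peak_objects = 0

    def sample_live_objects():
        """Call at several points inside each frame, not just once at the end."""
        global peak_objects
        n = len(gc.get_objects())
        peak_objects = max(peak_objects, n)
        return n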

    > Thanks for your attention,
    >
    > Arvind
    >
    > --
    > http://mail.python.org/mailman/listinfo/python-list
    >
    Chris Mellon, Oct 9, 2007
    #3
  4. arvind Guest

    On Oct 9, 7:54 am, "Chris Mellon" <> wrote:
    > On 10/8/07, <> wrote:
    > > [original post snipped]
    >
    > This would be expected if you're creating ever-larger numbers of
    > objects - Python uses memory pools, and as the number of simultaneous
    > objects increases, the size of the pool will need to increase. This
    > isn't expected if the total number of objects you create is pretty
    > much static, but the way you're trying to determine that is flawed
    > (see below).
    >
    > You're misunderstanding the purpose of Python's GC. Python is
    > refcounted. The GC exists only to find and break reference cycles. If
    > you don't have ref cycles, the GC doesn't do anything and you could
    > just turn it off.
    >
    > gc.get_objects() is a snapshot of the currently existing objects, and
    > won't give you any information about peak object count, which is the
    > most direct correlation to total memory use.

    Chris,

    Thanks for your reply.

    To answer the earlier question, I used CPython 2.4.3 and ActivePython
    2.5.1 in my analysis above. No custom modules added. Interpreter
    banners are at the end of this message.

    In my program, I do keep instantiating new objects every "frame".
    However, these objects are no longer needed after a few frames, and
    the program no longer maintains a reference to old objects. Therefore,
    I expect the reference-counting mechanism built into Python (whatever
    it is, if not gc) to free memory used by these objects and return it
    to the "pool" from which they were allocated. Further, I would expect
    that in time, entire pools would become free, and these free pools
    should be reused for new objects. Therefore the total number of pools
    allocated (and therefore "arenas"?) should not grow over time, if
    pools are being correctly reclaimed. Is this not expected behavior?
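
    As a crude check on that expectation, I can watch the resident size
    around a burst of allocations (Linux-only sketch; rss_kb is just a
    throwaway helper, and the numbers are only indicative):

    import os

    def rss_kb():
        """VmRSS of this process in kB, from /proc/self/status (Linux)."""
        for line in open("/proc/self/status"):
            if line.startswith("VmRSS"):
                return int(line.split()[1])

    print "before:   ", rss_kb()
    objs = [{"frame": i} for i in xrange(200000)]   # burst of small objects
    print "allocated:", rss_kb()
    del objs
    # 2.4's allocator never returns empty arenas to the OS; 2.5 can, which
    # may be why the absolute usage dropped after the upgrade.
    print "freed:    ", rss_kb()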

    Also, since I sample gc.get_objects() frequently, I would expect that
    I would stumble upon a "peak" memory usage snapshot, or at the very
    least see a good bit of variation in the output. However, this does
    not occur.

    Finally, if I deliberately hold on to references to old objects,
    gc.get_objects() clearly shows an increasing number of objects being
    tracked in each snapshot, and the memory leak is well explained.
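
    (That check looks roughly like the toy below; the class and list names
    are made up for illustration, not my actual simulator code.)

    import gc

    kept_frames = []          # deliberately never cleared

    class Frame(object):
        def __init__(self, number):
            self.number = number

    for n in xrange(1000):
        kept_frames.append(Frame(n))

    # Frame instances now show up in every gc.get_objects() snapshot,
    # and the count keeps climbing frame after frame.
    print len([o for o in gc.get_objects() if isinstance(o, Frame)])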

    Python version info:

    ActivePython 2.5.1.1 (ActiveState Software Inc.) based on
    Python 2.5.1 (r251:54863, May 2 2007, 08:46:07)
    [GCC 3.3.4 (pre 3.3.5 20040809)] on linux2

    AND

    Python 2.4.3 (#1, Mar 14 2007, 19:01:42)
    [GCC 4.1.1 20070105 (Red Hat 4.1.1-52)] on linux2


    Arvind
    arvind, Oct 9, 2007
    #4
