Re: How to measure memory footprint of Python objects?

Discussion in 'Python' started by Fredrik Lundh, Sep 20, 2006.

  1. Fredrik Lundh

    Neagu, Adrian wrote:

    > I have a Python program that takes a lot of memory (more than a hundred MB).

    > I can see the total process size of the Python process (Task Manager on MS
    > Windows or the Unix "ps" command), but that is not precise enough for me.

    I'm not sure those two statements are compatible, though.

    If your program is using hundreds of megabytes, surely kilobyte
    precision should be good enough for you?

    </F>
    Fredrik Lundh, Sep 20, 2006
    #1
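
    For reference, a minimal sketch of reading those process-level numbers
    from inside the program rather than from Task Manager or "ps". It assumes
    a Unix-like system and uses only the standard-library resource module;
    the helper name is made up for illustration:

        import resource  # standard library, Unix-only
        import sys

        def peak_rss_kb():
            # Peak resident set size of this process as reported by the OS.
            # Note: ru_maxrss is in kilobytes on Linux but in bytes on
            # macOS, so normalise before comparing across platforms.
            peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
            if sys.platform == "darwin":
                peak //= 1024
            return peak

        print("peak RSS: %d kB" % peak_rss_kb())

    As noted above, this is kilobyte-level precision at best, but that is
    enough to track a process that grows by hundreds of megabytes.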

  2. AdrianNg3 (Guest)

    Fredrik Lundh wrote:
    > Neagu, Adrian wrote:
    >
    > > I have a Python program that takes a lot of memory (more than a hundred MB).
    > >
    > > I can see the total process size of the Python process (Task Manager on MS
    > > Windows or the Unix "ps" command), but that is not precise enough for me.
    >
    > I'm not sure those two statements are compatible, though.
    >
    > If your program is using hundreds of megabytes, surely kilobyte
    > precision should be good enough for you?

    Hi Fredrik,

    I'll be more precise.

    1) Indeed, a few kilobytes are no problem for me. For example, if I
    have to write a small function to get my memory size, and that function
    allocates a few Python objects that bias the end result, that is
    still OK.

    2) The overhead of the Python execution engine (CPython, JVM, ...) in
    the total size of the process is more than just "a few kilobytes". In
    the worst case this can be ignored for my purpose at hand (it is a
    constant in my comparison of different generations of my Python
    application), but it is not really nice (for example, I cannot
    meaningfully compare the memory footprint of just my application
    across platforms).

    3) The real problem with the OS-reported process size is how it evolves
    over time. On MS Windows, for example, the size of the process keeps
    growing (unless an MS-specific consolidation function is called), so
    towards the end of the program the process size and the actual size of
    the Python heap(s) have nothing to do with each other. I believe the
    maximum size of the process is an indication of the maximum size of the
    Python heap(s), but I am not at all sure how good an indication that is
    (and what about different OSes?).

    Anyway, would it be much simpler (for the Python programmer) and much
    faster (at run time) to surface this functionality in the sys module?

    Adrian.
    AdrianNg3, Sep 20, 2006
    #2
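
    Regarding point 3 and the sys-module question: for what it's worth,
    later Python releases (2.6 and up) did add sys.getsizeof(), which
    reports the size in bytes of a single object (not counting anything it
    refers to). Combined with gc.get_objects() it can give a rough lower
    bound on the memory held by tracked Python objects, independent of how
    the OS reports the process size. A minimal sketch (the helper name is
    made up for illustration):

        import gc
        import sys

        def rough_heap_estimate():
            # Very rough lower bound on the memory held by objects the
            # garbage collector tracks.  sys.getsizeof() counts only the
            # object itself, not anything it references, and
            # gc.get_objects() returns only tracked container objects,
            # so this understates the true footprint.
            total = 0
            for obj in gc.get_objects():
                try:
                    total += sys.getsizeof(obj)
                except TypeError:
                    # some extension types do not report a size
                    pass
            return total

        data = [list(range(1000)) for _ in range(100)]  # something to measure
        print("tracked objects: ~%d bytes" % rough_heap_estimate())

    This will not match the OS numbers (the interpreter's own overhead and
    untracked objects such as plain ints and strings are not included), but
    it does not suffer from the ever-growing-process effect described in
    point 3.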
