Velocity Reviews - Computer Hardware Reviews

Re: How to measure memory footprint of Python objects?

 
 
Fredrik Lundh
09-20-2006
Neagu, Adrian wrote:

> I have a python program that takes a lot of memory (>hundred Mb).
>
> I can see the total process size of the Python process (Task manager on MS
> Win or Unix "ps" command) but that is not precise enough for me

I'm not sure those two statements are compatible, though.

if your program is using hundreds of megabytes, surely kilobyte
precision should be good enough for you?

</F>
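The kilobyte-granularity figure Fredrik is alluding to can also be read programmatically rather than from Task Manager or "ps". A minimal, Linux-only sketch (not from the thread) that reads the resident set size from /proc, returning None on platforms without that file:

```python
def rss_kb():
    """Resident set size of this process in kB, or None if /proc is absent.

    Linux reports VmRSS in kB in /proc/self/status, i.e. exactly the
    kilobyte precision under discussion.
    """
    try:
        with open("/proc/self/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1])  # second field is the kB value
    except OSError:
        pass
    return None
```

On Windows the equivalent figure would have to come from the Win32 process APIs instead; this sketch deliberately degrades to None there.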

 
AdrianNg3
09-20-2006
Fredrik Lundh wrote:
> Neagu, Adrian wrote:
>
> > I have a python program that takes a lot of memory (>hundred Mb).
> >
> > I can see the total process size of the Python process (Task manager on MS
> > Win or Unix "ps" command) but that is not precise enough for me
>
> I'm not sure those two statements are compatible, though.
>
> if your program is using hundreds of megabytes, surely kilobyte
> precision should be good enough for you?

Hi Fredrik,

I'll be more precise.

1) Indeed, a few kilobytes are no problem for me. For example, if I
have to write a small function to get my memory size, and that function
allocates a few Python objects that bias the end result, that's
still OK.
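A "small function" of the kind Adrian describes can be sketched with machinery that arrived in later Python releases (sys.getsizeof was added in 2.6, after this thread); the function itself allocates a few objects while measuring, which is precisely the small bias he says is acceptable:

```python
import gc
import sys

def tracked_bytes():
    """Rough total shallow size, in bytes, of objects the GC tracks.

    Caveats: gc.get_objects() only returns container objects, so
    untracked atoms (plain ints, strings) are missed, and getsizeof
    is shallow, so this is an estimate, not an exact heap size.
    """
    return sum(sys.getsizeof(obj) for obj in gc.get_objects())
```

Comparing the value before and after building a large structure gives a per-generation figure that is independent of the interpreter's own baseline, addressing point 2 as well.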

2) The overhead of the Python execution engine (CPython, JVM, ...) in
the total size of the process is more than just "a few kilobytes". If
need be, this can be ignored for my purpose at hand (it is a constant
in my comparison of different generations of my Python application),
but it is not really nice (for example, I cannot meaningfully compare
the memory footprint of only my application between platforms).

3) The real problem with the OS-reported process size is its evolution
over time. On MS Windows, for example, the size of the process is
ever-growing (unless an MS-specific consolidation function is called),
so that towards the end of the program the size of the process and the
actual size of the Python heap(s) have nothing to do with each other. I
believe the max size of the process is an indication of the max size of
the Python heap(s), but I'm not at all sure how good an indication that
is (and what about different OSes?).

Anyway, would it be much simpler (for the Python programmer) and much
faster (at run-time) to surface this functionality in the sys module?
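For what it's worth, later Python releases did surface part of this in sys: Python 2.6 (after this thread) added sys.getsizeof(), which reports the shallow size of a single object in bytes. A quick illustration:

```python
import sys

small = []
big = list(range(1000))

# getsizeof is shallow: it counts the list's own storage (object
# header plus the internal pointer array), not the integers the
# list refers to, so referents must be summed separately if needed.
assert sys.getsizeof(big) > sys.getsizeof(small)
```

Whole-heap accounting still has to be built on top of it (e.g. by walking references), so it answers the per-object question rather than the total-footprint one.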

Adrian.

 

