Velocity Reviews - Computer Hardware Reviews

Re: Linked list takes 10X more memory than expected. Why?

 
 
Karl Heinz Buchegger
Guest
Posts: n/a
 
      01-23-2004
Chocawok wrote:
>
>
> As you can see, the node struct should be about 4 bytes per node.
>


Don't guess. Ask your system what it computes for the size of a node.

cout << sizeof( Node );

will tell you.
On my system it comes up with 12.

> For the first file 373,683 X 4 = 1.5MegaBytes
> For the second file 1,833,930 X 4 = 7.3 MegaBytes.



That makes 373683 * 12 -> 4484196 ~ 4.5 MB
1833930 * 12 -> 22007160 ~ 22.0 MB

> Even if the pointers in the struct are long (pointers are not a strong point
> for me) then these figures should only be about 2.5 and 10 MB.
>
> AND YET, after reading in the first file the foot print is 21MB.
> If I use my other larger file its 100 megabytes of memory used!!!!
>
> Thats A LOT of memory.
>
> I have run the file without a dictionary (actually there was a file but it
> was empty) and the foot print of the standalone app is 0.5 MB.


Could it be that you are testing a debug version?

>
> Has anyone got any ideas on why so much memory is being used?


I did a quick scroll through your files but didn't notice the
usual pitfalls.

Why don't you try your program with a simpler data set? It's next
to impossible to tell anything with a data set that huge. Just
insert 3 or 4 words. You might also want to add a test function
which just dumps the content of the list (with all the terminal
flags) to see if the optimization worked as expected. (Although
1.5 million input characters ending up as only 300,000 stored
nodes is a strong indication that something has been optimized.)

--
Karl Heinz Buchegger
 
Chocawok
Guest
Posts: n/a
 
      01-25-2004
> Don't guess. Ask your system what it computes for the size of a node.
>
> cout << sizeof( Node );
>
> will tell you.
> On my system it comes up with 12


Thanks for tip, I get 12 too.

> > Thats A LOT of memory.
> >
> > I have run the file without a dictionary (actually there was a file but it
> > was empty) and the foot print of the standalone app is 0.5 MB.

>
> Could it be that you are testing a debug version?


You were absolutely right. I *was* running a debug version. I switched to a
release build with some optimisations and it's now running at 43MB (for the
larger file). This is still double what I calculate it should be (22MB), but
I'm going in the right direction.

A friend of mine mentioned something about "memory segment size". I know
that in the HD file system anything smaller than 4K will still take up 4K of
disk space. He said that a similar thing applies to memory too. Is this
right? He mentioned 8 bytes, so even if I'm just storing a bool, it will
take up 8 bytes.
Does anyone know anything about this and what to do about it (if anything)?

Thanks

Choca


 
Nils Petter Vaskinn
Guest
Posts: n/a
 
      01-26-2004
On Sun, 25 Jan 2004 17:26:37 +0000, Chocawok wrote:

> A friend of mine mentioned something about "memory segment size". I know
> that in the HD file system anything smaller than 4K will still take up 4K of
> diskspace. He said that a similar thing applies to memory too. Is this
> right? He mentioned 8 bytes, so even if I'm just storing a bool, it will
> take up 8 bytes.
> Anyone know anything about this and what to do about it (if anything)?


NOTE: If you posted any code I haven't seen it, because the original post
was not on my newsserver. It's probably been dropped because of the binary
attachment mentioned in another reply. Because I haven't seen any code,
some of this is based on assumptions and may not apply to your code.

Most OSes (all, for all I know) give out memory to applications in
fixed-size chunks (let's call these chunks pages). If a page is 1MiB and
your program needs 0.5MiB, the OS will still give it 1MiB. If your program
needs 1.5MiB you will get 2 pages, which amounts to 2MiB, so this overhead
is larger (in %) for small programs.

Another thing is that each time you allocate memory for a new node
(through the use of "new") there is additional memory overhead because
your program needs to keep track of what memory is allocated and what
memory isn't (you usually don't need to think about it except to know
there will be overhead). So you can add a few bytes of overhead for each
of your nodes.

Also, some processors are made in such a way that they are faster at
accessing data when the memory is "aligned": that means, for example, that
the processor could find your 12 bytes of data faster if the first byte
lives at a multiple of (e.g.) 16. The remainder will usually remain unused.
So if you compile with optimisations for speed you may waste 4 bytes if
your processor likes a 16-byte alignment (and worse if it likes 32).

--
NPV

"the large print giveth, and the small print taketh away"
Tom Waits - Step right up

 
