Can I improve the efficiency of 50,000 "new"'ed objects?


David Hilsee

Casper said:
Are there features or containers of standard C++ or the STL which will
assist in minimizing memory fragmentation upon creation of a huge
amount of objects on the heap?

Currently my application allocates 50,000 objects using "new". I can
store the pointers to those objects in a container/collection class, and
indeed this assists in processing (sort, delete, etc.), but is there any
way for me to make my "new"'ed objects reside in some previously
allocated memory chunk to improve performance (allocate in bigger
chunks) and avoid memory fragmentation at the same time?

Just one example of the inefficiency is when I delete my objects: I have
to loop through all of them and "delete" each one instead of simply
deleting one big chunk. Obviously I cannot use "delete[]", as this would
only delete my array and leave me with a massive memory leak.

What are these objects? Are they all of the same type? If so, a
std::vector can store them contiguously. If not, things might be more
difficult.
 

Vladimir Ciobanu

Casper said:
[snip]

In his book Modern C++ Design, Andrei Alexandrescu implements a
"Small-Object Allocator". I recommend buying the book, but you can
download Loki (the library in which he implements everything from the
book) from SourceForge. I believe you can modify it to suit your
needs, though it might be good enough as it is.

Vladimir Ciobanu
 

Casper

Are there features or containers of standard C++ or the STL which will
assist in minimizing memory fragmentation upon creation of a huge
amount of objects on the heap?

Currently my application allocates 50,000 objects using "new". I can
store the pointers to those objects in a container/collection class, and
indeed this assists in processing (sort, delete, etc.), but is there any
way for me to make my "new"'ed objects reside in some previously
allocated memory chunk to improve performance (allocate in bigger
chunks) and avoid memory fragmentation at the same time?

Just one example of the inefficiency is when I delete my objects: I have
to loop through all of them and "delete" each one instead of simply
deleting one big chunk. Obviously I cannot use "delete[]", as this would
only delete my array and leave me with a massive memory leak.

Any pointers greatly appreciated!
/Casper
 
