Chris E. Yoon
I just want to hear people's opinions on this subject.
My application has lots and lots of short-lived objects that use
dynamic allocation/deallocation. After implementing its functionality,
I found out that a significant share of the processing time is spent in new/delete calls.
So, to improve its performance, I made a little factory-like class that
pre-'new's those short-lived objects and stores them in a linked list when
the program starts.
Every time I need an instance, I would do something like the following:
// instead of new
ShortLivedObject* slo = SomeFactoryLikeClass::instance()->getOne();
// instead of delete
SomeFactoryLikeClass::instance()->returnOne(slo);
I would retrieve an instance from the pre-allocated objects and set its
members, and when I'm done, I just push it back to the end of the list.
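Concretely, the pool boils down to something like this (a sketch only; the pool size, the example member, and the use of std::list are placeholders, while SomeFactoryLikeClass, getOne(), and returnOne() are the names used above):

```cpp
#include <cstddef>
#include <list>

struct ShortLivedObject {
    int value = 0;  // example member; the real class has more state
};

// Singleton that pre-allocates objects at startup and recycles them.
class SomeFactoryLikeClass {
public:
    static SomeFactoryLikeClass* instance() {
        static SomeFactoryLikeClass inst;
        return &inst;
    }

    // Hand out a pre-allocated object; grow the pool if it runs dry.
    ShortLivedObject* getOne() {
        if (_free.empty())
            _free.push_back(new ShortLivedObject);
        ShortLivedObject* slo = _free.front();
        _free.pop_front();
        return slo;
    }

    // Return the object to the end of the list for reuse.
    void returnOne(ShortLivedObject* slo) { _free.push_back(slo); }

private:
    SomeFactoryLikeClass() {
        for (int i = 0; i < 64; ++i)  // pre-'new' a batch at startup
            _free.push_back(new ShortLivedObject);
    }
    std::list<ShortLivedObject*> _free;
};
```

The objects never leave the program's ownership; "delete" is just a push to the back of the list.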
Okay, so that was my initial solution. But an article I read recommended
that I overload the new and delete operators, as in the following.
class ShortLivedObject {
...
ShortLivedObject* _next;
static ShortLivedObject* _freelist; // pointer to free list
void* operator new(size_t); // overloaded new()
void operator delete(void*); // overloaded delete()
};
And use the _freelist for storage and recycling.
So, let's say the memory for the object is aligned correctly.
In new(), memory will either be allocated with malloc() (if _freelist
is empty), or the front of _freelist will simply be popped and its
pointer returned.
In delete(), memory will not be deallocated. Its pointer will only be
pushed back onto _freelist.
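A bare-bones sketch of that freelist scheme might look like this (single-threaded; the size check in operator new guards against a derived class coming through the same allocator, which is one of the wrinkles with this approach):

```cpp
#include <cstdlib>
#include <new>

class ShortLivedObject {
public:
    // Class-specific operator new: reuse a node from the free list when
    // one is available and the size matches; otherwise fall back to malloc().
    void* operator new(std::size_t size) {
        if (size == sizeof(ShortLivedObject) && _freelist) {
            ShortLivedObject* p = _freelist;
            _freelist = _freelist->_next;  // pop the front of the free list
            return p;
        }
        void* mem = std::malloc(size);
        if (!mem) throw std::bad_alloc();
        return mem;
    }

    // Class-specific operator delete: never free; push onto the free list.
    void operator delete(void* mem) {
        if (!mem) return;
        ShortLivedObject* p = static_cast<ShortLivedObject*>(mem);
        p->_next = _freelist;  // the storage is raw again, so reuse it as a link
        _freelist = p;
    }

private:
    ShortLivedObject* _next = nullptr;
    static ShortLivedObject* _freelist;  // pointer to free list
};

ShortLivedObject* ShortLivedObject::_freelist = nullptr;
```

With this in place, plain new/delete expressions transparently recycle storage: deleting an object and then new-ing another one of the same type hands back the same block.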
Now in my personal opinion, the first solution is simpler and safer.
You have less to worry about with inheritance. You don't have to come up
with your own storage allocator. You don't have to worry about new[]
and delete[].
Yet, lots of people, including some of my colleagues, seem to be
using the latter technique. Are there any reasons for this? Or is it
just a personal preference?