bjeremy said:
> STL Containers are not written with virtual destructors. You may open
> yourself up to memory leaks.
It's not memory leaks, it's undefined behavior. You open yourself to that if
you delete a derived object through a pointer to the base class.
However, pointers to std::some_container< some_type > are not really likely
to be used polymorphically. In fact, the STL provides template classes like
std::iterator<> that do not have a virtual destructor yet are explicitly
meant to be derived from.
A more substantial issue is surprise matches with functions like
template < typename T, typename A >
std::vector< T, A > backwards ( std::vector< T, A > const & vec );
If you derive from std::vector, your class will match the argument type, but
the result will not be what you want.
> Also, any extension to a std container may not guarantee compatibility
> with the std algorithms; your clients would need to be aware of this.
The standard algorithms work on ranges specified by pairs of iterators. They
will work just fine with iterators obtained from containers derived from
std::vector. (Actually, that is part of the problem and not part of a
solution, see below).
> You shouldn't need to rewrite "tons of code". A standard practice is
> to use composition and make an adapter class "have a" std container,
> plus whatever extra functionality you need to extend the container for
> in the first place.
That actually amounts to writing tons of code (just count the number of
methods in std::vector and you get an idea of how many stupid forwarding
methods you will have to provide to extend container classes this way). You
can cut down on that considerably by using private inheritance.
To the OP: to extend the functionality of standard containers, just provide
free-standing functions, preferably generic algorithms.
It is definitely bad practice to derive from standard containers in a way
that introduces new invariants (like the entries being sorted). The reason
is that your class inherits begin() and end() from the underlying container
whence you have no way to enforce the new invariants. E.g., mutating
sequence algorithms will happily destroy any given order of elements. This
also explains why it is by and large useless to inherit from standard
containers: you cannot do anything that you could not achieve by
free-standing functions.
There is exactly one case where I consider deriving from standard containers
useful: creating different sequence types that allow for distinct operator
overloading. The problem arises in math programming like this: Consider
typedef std::vector<int> word;
typedef std::vector<int> lattice_point;
You may want operator+ to add lattice_point objects component-wise, whereas
for objects of type word, you may want operator+ to concatenate. Typedefs
will not work since they just create alias names. In these cases, some
quick hack like
struct word : public std::vector<int> {
// forwarding templated constructors
// no further extensions.
};
struct lattice_point : public std::vector< int > {
// same as above
};
can be justified. The classes are not extending std::vector, they just
provide non-aliased versions that distinguish int-vectors according to
their meanings. That allows
word operator+ ( word const & lhs, word const & rhs );
and
lattice_point operator+ ( lattice_point const & lhs,
lattice_point const & rhs );
to coexist peacefully.
Best
Kai-Uwe Bux