Alex said:
Victor said:
Alex Vinokur wrote: [snip]
* Does STL require too much space/memory?
You need to define "too much".
I would like to know of experience of people working with STL for
embedded systems.
Are there hard/insoluble problems related to issue?
Presumably you mean, "Are there hard/insoluble problems related to
[this] issue [i.e., too much memory being required]?"
I have used STL on several different processors on embedded systems,
and I have had no problems related to memory usage that wouldn't have
also been incurred by statically allocated or new/malloc allocated
memory. However, problems related to std::vector certainly could be
incurred on an embedded (or other non-embedded!) system, especially
because of its exponential memory allocation strategy. Some embedded
systems have tighter memory requirements than others, so it will
entirely depend on your requirements. However, see below on how to
mitigate potential memory issues.
[snip]
Let v be of vector<int> type.
For instance, at the moment v.size() = 10000.
Now we do the following thing:
for (int i = 0; i < 20000; i++) v.push_back(i);
Must 'v' be reallocated/copied to preserve its contiguity once v.size()
== 30000?
std::vector allocates memory geometrically to "amortize" the growing
cost over time. The exact growth factor is implementation-defined
(commonly 1.5 or 2). With a doubling implementation, the first
push_back that exceeds the current capacity makes vector allocate a
second buffer of double the size, copy the existing data into it, and
then free the first buffer; so a capacity of 10000 grows to 20000, and
when you exceed that, it doubles again to 40000. So this will (1)
waste memory if the vector doesn't grow any more beyond that, (2) take
up some extra memory while the copy is in progress (both buffers exist
at once), and (3) possibly fragment memory beyond repair.
If you know how big it will grow (or even approximately how big), you
can use std::vector<>::reserve() to set the initial capacity, which
will help minimize copying. If you do not know how big the size will
grow and std::vector's memory allocation scheme doesn't work well for
your system, you should probably not use it.
P.S. By the way, does a similar contiguity problem exist for an
ordinary array?
Yes, and there the reallocation/copying must be done by hand, which is
more error prone. However,
you do then control the memory allocation strategy and could make it
something other than exponential if your system demands it (e.g., add
some predefined amount to the current size, allocate that mem, and then
copy the old into the new; or use some sort of "chunk allocator", where
non-contiguous chunks are tracked by a linked list; etc.).
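To illustrate the first of those alternatives, here is a rough sketch
of a hand-rolled fixed-increment growth routine (grow_fixed and CHUNK
are names made up for this example, not anything standard):

```cpp
#include <algorithm>
#include <cstddef>

const std::size_t CHUNK = 1024;  // predefined growth increment

// Grow an int buffer by a fixed amount rather than exponentially:
// allocate old_cap + CHUNK elements, copy the old contents over, free
// the old buffer, and report the new capacity through new_cap.
int* grow_fixed(int* old_buf, std::size_t old_cap, std::size_t* new_cap) {
    *new_cap = old_cap + CHUNK;
    int* new_buf = new int[*new_cap];
    std::copy(old_buf, old_buf + old_cap, new_buf);
    delete[] old_buf;
    return new_buf;
}
```

This trades more frequent copies (linear growth means many more
reallocations than geometric growth) for a tighter bound on peak
memory, which may matter more on a constrained embedded system.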
Cheers! --M