semantics of std::vector::reserve()


Dilip

If you reserve a certain amount of memory for a std::vector, what
happens when a reallocation becomes necessary because I overshot the limit?
I mean, say I reserve space for 500 elements; the insertion of the 501st element
is going to cause some more allocation -- for argument's sake, say the
vector re-grows to accommodate 1000 elements. I play around with it and
completely erase everything in it after I am done. Now how much does
the vector hold? Do I need to re-reserve 500? Does it keep the
previously re-adjusted capacity of 1000?
 

Michiel.Salters

Dilip said:
If you reserve a certain amount of memory for a std::vector, what
happens when a reallocation becomes necessary because I overshot the limit?
I mean, say I reserve space for 500 elements; the insertion of the 501st element
is going to cause some more allocation -- for argument's sake, say the
vector re-grows to accommodate 1000 elements. I play around with it and
completely erase everything in it after I am done. Now how much does
the vector hold? Do I need to re-reserve 500? Does it keep the
previously re-adjusted capacity of 1000?

What do you mean, "how much does the vector hold"? A vector has two
important properties: size and capacity. The functions to change them
are resize() and reserve(). Capacity is at least as large as the size,
obviously. If you clear the vector, you change its size but not its
capacity. Clearly, in your example the size is 0 and the capacity is
_at least_ 1000.

(And reserve(500) reserves space for at least 500 elements, but reserving
512 would be legal as well. Check capacity() to be sure.)
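
A quick way to see the distinction is a little test like this (just a
sketch; the exact capacity values printed are up to the implementation,
the standard only guarantees the minimums):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    v.reserve(500);                    // capacity() is now at least 500
    for (int i = 0; i < 500; ++i)
        v.push_back(i);
    std::cout << "size=" << v.size() << " cap=" << v.capacity() << std::endl;

    v.clear();                         // size drops to 0 ...
    std::cout << "size=" << v.size() << " cap=" << v.capacity() << std::endl;
    // ... but the capacity stays where it was, so refilling up to that
    // many elements needs no further allocation.
}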

HTH,
Michiel Salters
 

Dilip

If you clear the vector, you change its size but not its capacity.

Awesome -- this is what I wanted to hear. However...
Clearly, in your example the size is 0 and the capacity is _at least_ 1000.

If I reserve 500, I take it that space for _at least_ 500 elements is
created. So if I put an upper limit of 500 in my code to do some
processing and clear the vector after the processing, ideally there should
be no reason why the vector will ever need to re-adjust its
memory, right?
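
Roughly the pattern I have in mind, as a sketch (the three passes and the
500-element batches are just made-up numbers for illustration):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> work;
    work.reserve(500);                     // one allocation up front

    for (int pass = 0; pass < 3; ++pass) {
        for (int i = 0; i < 500; ++i)      // never more than 500 elements
            work.push_back(i);
        // ... per-pass processing here ...
        work.clear();                      // size -> 0, capacity untouched
        std::cout << "pass " << pass
                  << ": cap=" << work.capacity() << std::endl;
    }
}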
 

BobR

Dilip wrote in message ...
If I reserve 500, I take it that space for _at least_ 500 elements is
created. So if I put an upper limit of 500 in my code to do some
processing and clear the vector after the processing, ideally there should
be no reason why the vector will ever need to re-adjust its
memory, right?

Y'know, sometimes you can read 'till you are blue in the face and still not
get a clear picture. Do some tests to solidify it in your mind:

#include <iostream>
#include <vector>

int main() {
    std::vector<int> VecInt(10);
    std::cout << " size=" << VecInt.size()
              << " cap=" << VecInt.capacity() << std::endl;

    for (int i = 0; i < 11; ++i) {    // push past the current capacity
        VecInt.push_back(i);
    }
    std::cout << " size=" << VecInt.size()
              << " cap=" << VecInt.capacity() << std::endl;

    for (int i = 0; i < 50; ++i) {    // and past it again
        VecInt.push_back(i);
    }
    std::cout << " size=" << VecInt.size()
              << " cap=" << VecInt.capacity() << std::endl;
}
// - output -
// size=10 cap=10
// size=21 cap=40
// size=71 cap=80

Notice how every time it exceeds capacity it doubles the capacity[1]?
Now try your own experiment. Set (reserve()) 500, then fill it with 501
elements and see if it doesn't go to cap=1000. Add 500 more elements, what do
you get for capacity()?
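
Something along these lines (a sketch; the capacities printed depend on
your implementation's growth strategy):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    v.reserve(500);
    std::cout << "after reserve(500):   cap=" << v.capacity() << std::endl;

    for (int i = 0; i < 501; ++i)          // one past what was reserved
        v.push_back(i);
    std::cout << "after 501 push_backs: cap=" << v.capacity() << std::endl;

    for (int i = 0; i < 500; ++i)          // keep going
        v.push_back(i);
    std::cout << "after 1001 elements:  cap=" << v.capacity() << std::endl;
}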

If the vector keeps doubling, eventually it will run out of memory, whereas if
you had set the capacity at the beginning, it may have fit in memory (memory
gets 'fragmented' sometimes, and the vector's allocator won't be able to find a
big enough contiguous chunk to continue).

That help any?
[1] - (on my compiler implementation. YMMV)
 

Roland Pibinger

Notice how every time it exceeds capacity it doubles the capacity[1]?
Now try your own experiment. Set (reserve()) 500, then fill it with 501
elements and see if it doesn't go to cap=1000. Add 500 more elements, what do
you get for capacity()?

AFAIK, popular implementations produce this behavior for push_back(),
insert(), ... only, but not for reserve(). E.g. the following copies the
vector contents twice (if the capacity is initially less than 500):

VecInt.reserve(500);
VecInt.reserve(501);
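
You can watch this happen by printing capacity() around the calls (a sketch;
whether reserve() over-allocates is up to the implementation -- the standard
only requires capacity() >= the requested amount):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> VecInt(100, 0);   // some existing contents
    std::cout << "cap=" << VecInt.capacity() << std::endl;

    VecInt.reserve(500);               // reallocates (and copies) if cap < 500
    std::cout << "cap=" << VecInt.capacity() << std::endl;

    VecInt.reserve(501);               // copies again if the previous call
                                       // allocated exactly 500
    std::cout << "cap=" << VecInt.capacity() << std::endl;
}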

Moreover, clear() is not guaranteed to preserve the capacity.
[1] - (on my compiler implementation. YMMV)

YIMV (Your Implementation May Vary)
Roland Pibinger
 

Howard Hinnant

Moreover, clear() is not guaranteed to preserve the capacity.

According to the standard, vector::clear() is guaranteed to preserve
capacity. One high-profile implementation violated this guarantee in a
few versions but is standard-conforming in its latest version.

-Howard
 

Howard Hinnant

Where did you find that in the C++ Standard?

23.1.1p4, Sequence requirements:

The semantics of a.clear() is erase(begin(), end()).

clear() is not otherwise defined for vector.

Same table:

erase(q1, q2): erases the elements in the range [q1, q2).

23.2.4.3p3-p5 goes on to further define erase(iter,iter) for vector but
does not address capacity issues one way or the other.

23.2.4.2p2-p5: Defines vector::reserve. The definition includes:
After reserve(), capacity() is greater or equal to the argument of reserve if
reallocation happens; and equal to the previous value of capacity()
otherwise. Reallocation happens at this point if and only if the current
capacity is less than the argument of reserve().

There is a note which further explains:
It is guaranteed that no reallocation takes place during insertions that
happen after a call to reserve() until the time when an insertion would make
the size of the vector greater than the size specified in the most recent
call to reserve().

Consider this code:

#include <vector>

int main() {
    std::vector<int> v;
    v.reserve(100);
    v.resize(100);
    v.clear();
    v.resize(100);
}

According to the note above, neither the first nor the second call to resize
is allowed to reallocate, because of the preceding call to reserve. There is
no part of the definition of clear or erase which otherwise grants such
leeway.

Additionally the working draft for C++0X has already been changed via DR
329 to make it more clear that "most recent" really means "largest":
It is guaranteed that no reallocation takes place during insertions that
happen after a call to reserve()
until the time when an insertion would make the size of the vector greater
than the value of capacity().

There is interest in adding functionality to the C++0X vector allowing a
"shrink-to-fit" functionality as std::string already enjoys with its
reserve member.

http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2005/n1870.html#add-member-resize-capacity-to-vector-and-basic-string

This would have the effect of negating the "largest" reserve. Currently
the only way to achieve such functionality is to allocate a new smaller
buffer, copy the old data into it, and swap buffers (a sketch of this
appears at the end of this post). However, here is a proposal:

http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2006/n1953.html

which would allow the possibility that the vector could shrink in place
(i.e. without allocating a new smaller buffer). Neither proposal has
been turned down, but neither is enjoying overwhelming support either.
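
For reference, the allocate-copy-swap workaround mentioned above looks
roughly like this (a sketch; how much spare capacity the temporary copy
keeps is implementation-defined, though it is typically trimmed close to
the size):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    v.reserve(1000);
    v.resize(10);                      // size 10, capacity at least 1000
    std::cout << "before: size=" << v.size()
              << " cap=" << v.capacity() << std::endl;

    std::vector<int>(v).swap(v);       // copy into a right-sized temporary,
                                       // then swap buffers with the original
    std::cout << "after:  size=" << v.size()
              << " cap=" << v.capacity() << std::endl;
}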

-Howard
 
