default size of STL vector question


zl2k

hi, all
What is the default size of an STL vector? Suppose I have integers
that need to be stored in a vector and I know the maximum number of
integers is, say, 1000. Will it be more efficient to first allocate a
size of 1000 and then use {vector[counter] = aNumber; counter++} to
populate the vector than to use push_back(aNumber)? Of course I need
to resize the vector after it is populated. Thanks for the help.
zl2k
 

Ian Collins

zl2k said:
hi, all
What is the default size of an STL vector?

There isn't one.
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?

Probably.
 

Michael DOUBEZ

Ian Collins wrote:
zl2k said:
hi, all
What is the default size of an STL vector?

There isn't one.
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?

Probably.

You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
data, not that it matters with ints).
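
For illustration, the two approaches being compared might look roughly
like this (a sketch only; the function names are made up for the example):

#include <vector>

void fill_by_indexing(std::vector<int>& v) {
    v.resize(1000);                 // size becomes 1000, elements zero-initialized
    for (int i = 0; i < 1000; ++i)
        v[i] = i;                   // overwrite each element
}

void fill_by_push_back(std::vector<int>& v) {
    v.reserve(1000);                // capacity >= 1000, size stays 0
    for (int i = 0; i < 1000; ++i)
        v.push_back(i);             // no reallocation while size() <= 1000
}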

Michael
 

zl2k

Ian Collins wrote:
There isn't one.

Create a vector and use vector<>::capacity() to see your STL
implementation's default capacity.


Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?
Probably.

You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
data, not that it matters with ints).

Michael

But if I use reserve(maxAmount), then the maxAmount of memory is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?
 

Kai-Uwe Bux

zl2k said:
Ian Collins wrote:
[snip]
But if I use reserve(maxAmount), then the maxAmount of memory is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?

That is correct. In return you are guaranteed that reallocation will not
happen unless the vector grows beyond maxAmount in length. This will
usually gain a little bit of speed (profile to be sure it pays off). Also
it allows you to keep iterators and references valid in certain
circumstances.
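
For instance, reserve() makes the following hold (a minimal sketch,
assuming a conforming implementation; the variable names are mine):

#include <cassert>
#include <vector>

int main() {
    std::vector<int> v;
    v.reserve(1000);             // no reallocation until size() exceeds 1000
    v.push_back(42);
    int* first = &v[0];          // pointer into the vector's storage
    for (int i = 1; i < 1000; ++i)
        v.push_back(i);          // stays within the reserved capacity
    assert(first == &v[0]);      // still valid: no reallocation has happened
    return 0;
}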

Usually, you will use reserve() when you know the target size of the
vector in advance. In that case, it can actually conserve memory
compared to letting the vector grow on its own: with a naive doubling
strategy (each reallocation doubles the capacity), a randomly filled
vector ends up using only about 72% of its allocated memory on average.
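
One way to arrive at that figure (my reconstruction, assuming the final
size is equally likely to fall anywhere between one doubling and the
next, i.e. the fractional part f of log2(size) is uniform):

\mathbb{E}\!\left[\frac{\text{size}}{\text{capacity}}\right] = \int_0^1 2^{\,f-1}\,df = \frac{1}{2\ln 2} \approx 0.72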


Best

Kai-Uwe Bux
 

Juha Nieminen

Kai-Uwe Bux said:
That is correct. In return you are guaranteed that reallocation will not
happen unless the vector grows beyond maxAmount in length. This will
usually gain a little bit of speed (profile to be sure it pays off). Also
it allows you to keep iterators and references valid in certain
circumstances.

Another advantage is that it reduces memory fragmentation (especially
if the vector never grows beyond the reserved size). In most C++
implementations on most systems, when the vector grows and reallocates,
it cannot reuse the memory it has just freed. That freed block can only
be reused by other dynamically allocated objects or arrays that are
small enough to fit in it, if any are created.

Memory fragmentation is one problem of std::vector which many
programmers are not aware of. In many cases it's not a big problem, but
there are certain situations where it can become a major issue. A
typical problematic case is when the final size of the vector is not
known, and values are continuously being pushed back to it. The effect
can be worsened if there are several such vectors (because there's a
high probability that none of them can reuse the memory freed by the
others simply because they are too large).

In one application I made years ago which required extensive amounts
of memory (and it was precisely of the type where data was generated
during the execution of the program and it was impossible to know the
final sizes of the vectors before all the data had been generated), the
memory used by the program almost halved when I replaced all the vectors
with deques (thus allowing much bigger datasets to be calculated). The
execution speed of the program was hardly affected at all.
 

Michael DOUBEZ

zl2k wrote:
Ian Collins wrote:
zl2k wrote:
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?
Probably.
You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
data, not that it matters with ints).
But if I use reserve(maxAmount), then the maxAmount of memory is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?

Yes, but the result is the same with resize(). The standard does not
require that resize() give back memory, and I assume that in most
implementations it doesn't. This is one of the items in Herb Sutter's
"Exceptional C++" books. There is a way that potentially allows you to
get back the memory:

vector<int> myVector; // lots of memory reserved relative to the actual size

{
    vector<int> tmp(myVector);
    myVector.swap(tmp);
}

// here there is a chance that myVector.capacity() has been reduced

Another possibility is that vector isn't really the container you need,
or you can use another container to build the data and then copy it into
a vector.

Michael
 

James Kanze

What is the default size of an STL vector?

0. Unless you specify a size, a vector is created empty,
without any elements.
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?

It really depends on the implementation, but if you just do
push_back without any reserve, then it's almost certain that the
push_back version will be slower. On the other hand, the difference
between:

std::vector< int > a( 1000 ) ;
for ( int i = 0 ; i < 1000 ; ++ i ) {
    a[ i ] = ...
}

and

std::vector< int > a ;
a.reserve( 1000 ) ;
for ( int i = 0 ; i < 1000 ; ++ i ) {
    a.push_back( ... ) ;
}

will probably vary between implementations, and even between
processors, using the same implementation.

If, instead of int, the vector holds a user-defined type with an
expensive constructor and/or expensive assignment, push_back will
come out more favorably.
Of course I need to resize the
vector after it is populated.

Resizing a vector to make it smaller is very, very fast for the
built-in types. If a user-defined type has a non-trivial
destructor, however, that destructor will be called for each
element removed by the resize.
 

Andrew Koenig

What is the default size of an STL vector?
Zero.

Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?

Your best bet is probably to reserve enough space (that is, call reserve
rather than resize) and then call push_back repeatedly.

Example:

vector<int> v;
v.reserve(n);
for (int i = 0; i < n; ++i)
    v.push_back(i);

That way, you avoid having to reallocate vector elements, and you also avoid
having to construct elements that you overwrite later.
 

zl2k

zl2k wrote:


Ian Collins wrote:
zl2k wrote:
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?
Probably.
You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
data, not that it matters with ints).
But if I use reserve(maxAmount), then the maxAmount of memory is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?

Yes, but the result is the same with resize(). The standard does not
require that resize() give back memory, and I assume that in most
implementations it doesn't. This is one of the items in Herb Sutter's
"Exceptional C++" books. There is a way that potentially allows you to
get back the memory:

vector<int> myVector; // lots of memory reserved relative to the actual size

{
    vector<int> tmp(myVector);
    myVector.swap(tmp);
}

// here there is a chance that myVector.capacity() has been reduced

Another possibility is that vector isn't really the container you need,
or you can use another container to build the data and then copy it into
a vector.

Michael

Thanks for all the very helpful replies. One more question:

{
    std::vector<int> v(1000, 0); // allocate memory and do nothing
}

Here (after the closing brace), is the memory previously allocated
released and available for the program to use? Or do I need to use the
swap trick inside the {} in order to make sure the memory is reusable?
 

Guest

zl2k wrote:


Ian Collins wrote:
zl2k wrote:
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?
Probably.
You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
data, not that it matters with ints).
But if I use reserve(maxAmount), then the maxAmount of memory is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?

Yes, but the result is the same with resize(). The standard does not
require that resize() give back memory, and I assume that in most
implementations it doesn't. This is one of the items in Herb Sutter's
"Exceptional C++" books. There is a way that potentially allows you to
get back the memory:

vector<int> myVector; // lots of memory reserved relative to the actual size

{
    vector<int> tmp(myVector);
    myVector.swap(tmp);
}

// here there is a chance that myVector.capacity() has been reduced

Another possibility is that vector isn't really the container you need,
or you can use another container to build the data and then copy it into
a vector.

Michael

Thanks for all the very helpful replies. One more question:

{
std::vector<int> v(1000, 0); //allocate memory and do nothing

Actually, you not only allocate memory, you also create 1000 elements
(each with the value 0). To just create a vector and allocate space for
1000 elements, use reserve() instead.

Here (after the closing brace), is the memory previously allocated
released and available for the program to use? Or do I need to use the
swap trick inside the {} in order to make sure the memory is reusable?

There was no previous memory allocated. If you are asking whether the
memory previously used when the vector reallocates is freed, then yes,
it is. The swap trick is mostly useful when you have a large vector,
reduce its size, and want to reclaim the memory no longer in use;
otherwise, the capacity of a vector will never shrink, only grow.
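
A small demonstration of that behaviour (a sketch; the exact capacity
values printed depend on the implementation):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v(1000, 0);
    std::cout << v.capacity() << '\n';   // at least 1000

    v.resize(10);
    std::cout << v.capacity() << '\n';   // typically still >= 1000

    std::vector<int>(v).swap(v);         // copy-and-swap shrink trick
    std::cout << v.capacity() << '\n';   // likely close to 10 now
    return 0;
}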
 

Guest

zl2k wrote:


Ian Collins wrote:
zl2k wrote:
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?
Probably.
You can use vector<>::reserve() instead. That way push_back() won't
cause reallocation and you can keep your vector's size consistent with
the amount of data held (and it doesn't initialize/destroy unnecessary
data, not that it matters with ints).
But if I use reserve(maxAmount), then the maxAmount of memory is
actually occupied by the vector, no matter how many elements are
actually there. Is that right?

Yes, but the result is the same with resize(). The standard does not
require that resize() give back memory, and I assume that in most
implementations it doesn't. This is one of the items in Herb Sutter's
"Exceptional C++" books. There is a way that potentially allows you to
get back the memory:

vector<int> myVector; // lots of memory reserved relative to the actual size

{
    vector<int> tmp(myVector);
    myVector.swap(tmp);
}

// here there is a chance that myVector.capacity() has been reduced

Another possibility is that vector isn't really the container you need,
or you can use another container to build the data and then copy it into
a vector.

Michael

Thanks for all the very helpful replies. One more question:

{
    std::vector<int> v(1000, 0); // allocate memory and do nothing
}

Here (after the closing brace), is the memory previously allocated
released and available for the program to use? Or do I need to use the
swap trick inside the {} in order to make sure the memory is reusable?

Now I get what you are asking about. Yes, the memory used by the vector
will be freed when it goes out of scope.
 

James Kanze

What is the default size of an STL vector?
Suppose I have integers that need to be stored in a vector and I know
the maximum number of integers is, say, 1000. Will it be more efficient
to first allocate a size of 1000 and then use {vector[counter] = aNumber;
counter++} to populate the vector than to use push_back(aNumber)?
Your best bet is probably to reserve enough space (that is, call reserve
rather than resize) and then call push_back repeatedly.

vector<int> v;
v.reserve(n);
for (int i = 0; i < n; ++i)
    v.push_back(i);
That way, you avoid having to reallocate vector elements, and
you also avoid having to construct elements that you overwrite
later.

That was more or less what I'd assumed until I actually measured
it. For a std::vector<int>, at least for the implementation I
measured (g++ on Sun Sparc, IIRC), doing the initializations was
slightly faster than the push_backs.

This obviously depends on the type (if default construction
followed by assignment is significantly slower than for int, for
example) and on the implementation. As a general rule, I'd still
go with reserve() plus push_back(), since it will always be almost
as fast, and it protects against the case where construction plus
assignment is expensive. But if performance matters, you have to
profile.
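
For anyone who wants to repeat the measurement, a minimal timing sketch
along these lines could be a starting point (illustrative only; real
results depend heavily on compiler, flags, and hardware, and a serious
benchmark needs more care to keep the optimizer from discarding the work):

#include <cstdio>
#include <ctime>
#include <vector>

int main() {
    const int n = 1000;
    const int repeats = 100000;
    unsigned long sink = 0;               // consume results so the loops aren't optimized away

    std::clock_t t0 = std::clock();
    for (int r = 0; r < repeats; ++r) {
        std::vector<int> a(n);            // creates n zero-initialized ints
        for (int i = 0; i < n; ++i)
            a[i] = i;                     // then assigns each one
        sink += a.back();
    }
    std::clock_t t1 = std::clock();
    for (int r = 0; r < repeats; ++r) {
        std::vector<int> b;
        b.reserve(n);                     // allocate once, size stays 0
        for (int i = 0; i < n; ++i)
            b.push_back(i);               // append within the reserved capacity
        sink += b.back();
    }
    std::clock_t t2 = std::clock();

    std::printf("construct+assign:  %.0f ms\n",
                1000.0 * (t1 - t0) / CLOCKS_PER_SEC);
    std::printf("reserve+push_back: %.0f ms\n",
                1000.0 * (t2 - t1) / CLOCKS_PER_SEC);
    std::printf("(checksum %lu)\n", sink);
    return 0;
}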
 
