vector container resize std::bad_alloc

Per

This is OS dependent, and unrelated to C++. Unfortunately, the
answer is probably 'no'.

Because of the way resize() works, you may be trying to allocate more
memory than you really need. If you know in advance what the maximum
is, you can try using resize() to pre-allocate the memory.

Perhaps a std::list will suffice in that case since no extra memory
space will be allocated beforehand. Or perhaps a std::deque will not
allocate the same amount of memory.

/Per
 
fabian.lim

Hi all,

I'm struggling with a memory allocation problem. I need to keep track
of the history of my computations, and I do so by filling up a growing
vector of elements, resizing it every time. At some point
std::allocator throws a std::bad_alloc exception; when it does, the
array is about size 40,000. Are there ways to increase the
amount of memory my program can use?

Thanks in advance,
Fabian
 
joecook

std::allocator throws a std::bad_alloc exception; when it does, the
array is about size 40,000. Are there ways to increase the
amount of memory my program can use?

This is OS dependent, and unrelated to C++. Unfortunately, the
answer is probably 'no'.

Because of the way resize() works, you may be trying to allocate more
memory than you really need. If you know in advance what the maximum
is, you can try using resize() to pre-allocate the memory.

HTH,
Joe
 
joecook

This is OS dependent, and unrelated to C++. Unfortunately, the
answer is probably 'no'.

Because of the way resize() works, you may be trying to allocate more
memory than you really need. If you know in advance what the maximum
is, you can try using resize() to pre-allocate the memory.

HTH,
Joe

Of course, I meant to say "you can try using _reserve()_ to
preallocate the max size you may need"

Joe
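
A minimal sketch of the difference between the two calls, using only
the standard vector interface:

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;

    // reserve() allocates capacity up front without changing size(),
    // so subsequent push_back()s up to that capacity never reallocate.
    v.reserve(40000);
    std::cout << "after reserve: size = " << v.size()          // 0
              << ", capacity = " << v.capacity() << '\n';      // >= 40000

    // resize() instead creates 40000 value-initialized elements.
    v.resize(40000);
    std::cout << "after resize: size = " << v.size() << '\n';  // 40000
    return 0;
}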
 
Maxim Yegorushkin

Hi all,

  I'm struggling with a memory allocation problem. I need to keep track
of the history of my computations, and I do so by filling up a growing
vector of elements, resizing it every time. At some point
std::allocator throws a std::bad_alloc exception; when it does, the
array is about size 40,000. Are there ways to increase the
amount of memory my program can use?

There certainly are. You may like to tell us a bit more about your
problem.

What is your platform/environment?

What is the element type of your vector? 40000 elements does not seem
to be a lot, but it depends on the size of the element.
 
Salt_Peter

Hi all,

I'm struggling with a memory allocation problem. I need to keep track
of the history of my computations, and I do so by filling up a growing
vector of elements, resizing it every time. At some point
std::allocator throws a std::bad_alloc exception; when it does, the
array is about size 40,000. Are there ways to increase the
amount of memory my program can use?

Thanks in advance,
Fabian

A std::vector will store elements in a contiguous block of memory. The
std::bad_alloc is thrown if a contiguous block can't be allocated for
the entire array following a resize where capacity == size.

A std::deque doesn't suffer from such reallocation issues but elements
are not stored in one block. In the end, it all depends on the
requirements.
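
A small sketch illustrating both behaviors (the exact output depends
on the implementation's growth policy):

#include <deque>
#include <iostream>
#include <vector>

int main() {
    // vector: when size() reaches capacity(), push_back allocates a
    // new, larger contiguous block and copies every element across.
    std::vector<int> v;
    std::vector<int>::size_type last = v.capacity();
    for (int i = 0; i < 100; ++i) {
        v.push_back(i);
        if (v.capacity() != last) {
            last = v.capacity();
            std::cout << "vector reallocated; capacity now " << last << '\n';
        }
    }

    // deque: grows in fixed-size chunks, so it never needs one big
    // contiguous block and never copies existing elements to grow.
    std::deque<int> d;
    for (int i = 0; i < 100; ++i)
        d.push_back(i);
    std::cout << "deque size: " << d.size() << '\n';
    return 0;
}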
 
Rolf Magnus

This is OS dependent, and unrelated to C++. Unfortunately, the
answer is probably 'no'.

Because of the way resize() works, you may be trying to allocate more
memory than you really need.

... in addition to the already allocated memory. So temporarily, the vector
will use up more than twice as much memory as is really needed at that point.
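
A rough illustration of that peak (the growth factor is
implementation-defined, commonly 1.5 or 2):

#include <iostream>
#include <vector>

int main() {
    // One contiguous block of 2^20 doubles, about 8 MB.
    std::vector<double> v(1u << 20);

    // Growing past capacity triggers a reallocation. While it runs,
    // the old 8 MB block and the new, larger block both exist, so
    // peak demand is roughly old + new, i.e. more than twice the
    // data actually being kept.
    v.push_back(0.0);
    std::cout << "capacity after growth: " << v.capacity() << '\n';
    return 0;
}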
 
jason.cipriani

There certainly are. You may like to tell us a bit more about your
problem.

What is your platform/environment?

What is the element type of your vector? 40000 elements does not seem
to be a lot, but it depends on the size of the element.


Yes, this does not seem like a lot. More details are in order.

You could reserve() before resize()ing and you won't run the risk of
allocating twice as much space as you need, but you lose the
performance benefits of exponentially growing the capacity on
resize().
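
For illustration, here is roughly what that trade-off looks like; on
a typical implementation the first version reallocates on every call:

#include <vector>

// Reserving in exact increments avoids over-allocation, but on a
// typical implementation every reserve() call that raises capacity
// reallocates, so n insertions cost O(n^2) element copies:
void slowGrow(std::vector<int>& v, int n) {
    for (int i = 0; i < n; ++i) {
        v.reserve(v.size() + 1);  // usually reallocates each time
        v.push_back(i);
    }
}

// A single up-front reserve() gives n insertions with no
// reallocations at all:
void fastGrow(std::vector<int>& v, int n) {
    v.reserve(v.size() + n);
    for (int i = 0; i < n; ++i)
        v.push_back(i);
}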

You could use a list instead of a vector; perhaps a highly fragmented
heap is leaving you without a large enough single block of memory. A
list has a bit more overhead (although from what I'm inferring about
the sizes of your elements, it's probably negligible), but it doesn't
require contiguous blocks.

You could handle a bad_alloc by committing the current contents to the
file system, for example, and then clearing them to free memory.
Although, unless your swap file is too small (or you're using
incredible amounts of memory), the OS should basically be doing this
for you anyway.
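
A rough sketch of that commit-on-bad_alloc idea; 'Sample' and the
file name are made up for illustration:

#include <fstream>
#include <new>
#include <vector>

struct Sample { int a, b; };  // hypothetical history element

void record(std::vector<Sample>& history, const Sample& s) {
    try {
        history.push_back(s);
    } catch (const std::bad_alloc&) {
        // Append the accumulated history to disk...
        std::ofstream out("spill.bin", std::ios::binary | std::ios::app);
        if (!history.empty())
            out.write(reinterpret_cast<const char*>(&history[0]),
                      history.size() * sizeof(Sample));
        // ...release the vector's block with the swap trick (clear()
        // alone does not free capacity), then retry the insertion.
        std::vector<Sample>().swap(history);
        history.push_back(s);
    }
}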

You could queue the data you're storing and have a second thread
running and committing it to disk in the background; this may or may
not give good performance depending on the nature of your
"calculations".

You could use a SQL database to store data in. A decent DBMS is
optimized for high performance, and while it's certainly slower than
sticking things in a vector, there may be useful benefits (perhaps
combined with the commit-on-bad_alloc strategy). It may well be faster
than committing to disk on your own as well. I'm not pushing that as a
solution, just illustrating that there are other options.

You could reduce the memory usage by "compressing" the data somehow,
depending on the nature of your computations. For example, if you are
performing computations on 100,000 objects, but typically only a small
fraction of those are actually modified each iteration, then you do
not need to store the complete state of every object each time -- you
can store information only about what has changed since the last
iteration. I've used techniques like this (+ asynchronous commit to
file system) before for recording and playback of massive amounts of
real-time data with great success -- the disadvantage, of course, is
that to reconstruct a specific frame you must analyze all frames
before it (but there are techniques to reduce this problem as well).
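
A sketch of that delta idea with made-up names, assuming a fixed-size
collection of values per iteration:

#include <cstddef>
#include <map>
#include <vector>

// Each frame records (index -> new value) for only the objects that
// actually changed, instead of a full snapshot of every object.
typedef std::map<std::size_t, double> Frame;

void commitFrame(std::vector<Frame>& history,
                 const std::vector<double>& prev,   // last state
                 const std::vector<double>& curr) { // current state
    Frame delta;
    for (std::size_t i = 0; i < curr.size(); ++i)
        if (curr[i] != prev[i])
            delta[i] = curr[i];
    history.push_back(delta);  // usually far smaller than a snapshot
}

// Reconstructing iteration k then means replaying frames 0..k onto
// the initial state -- the trade-off mentioned above.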

There are a lot of alternatives with a wide range of complexity. What
kind of computations are you performing? What is an "element"?

Jason
 
peter koch

Hi all,

  I'm struggling with a memory allocation problem. I need to keep track
of the history of my computations, and I do so by filling up a growing
vector of elements, resizing it every time. At some point
std::allocator throws a std::bad_alloc exception; when it does, the
array is about size 40,000. Are there ways to increase the
amount of memory my program can use?

Thanks in advance,
Fabian

I am not sure it is an allocation problem unless you use a lot of heap
memory elsewhere, or your elements are really huge. In the latter
case, std::deque is almost like std::vector, just without such big
memory requirements, and is likely to be a very easy replacement.
My fear is that you might have a memory leak somewhere, or that you
might be overwriting some memory somewhere, causing heap corruption.
There are tools to help you find that, but they are OS/compiler
specific, so you should find a better group. (If you are under
Linux/Unix, your OS might limit the amount of memory you have
available: man ulimit.)

/Peter
 
fabian.lim

Dear All,

Thank you so much for your comments. An element of the particular
vector (let's call it vector A) which threw the exception is a vector
of size 2. I didn't mention in my earlier post that I actually have
another vector B, where each element of vector B is a vector. Vector B
is of size 2^22, but most of its components are empty vectors. Each
component of B is a vector which stores vectors of size 2. Again, each
component of B grows as new data comes in.

I did some experimenting and found the following.

Experiment 1: Disable the routine which stores stuff into B -> A does
not throw std::bad_alloc.

Experiment 2: Run the program as normal and check what happens when A
throws std::bad_alloc -> the routine which is putting data into B
seems to be fine.

Thanks again in advance,
Fabian
 
jason.cipriani

Dear All,

 Thank you so much for your comments. An element of the particular
vector (let's call it vector A) which threw the exception is a vector
of size 2. I didn't mention in my earlier post that I actually have
another vector B, where each element of vector B is a vector. Vector B
is of size 2^22, but most of its components are empty vectors. Each
component of B is a vector which stores vectors of size 2. Again, each
component of B grows as new data comes in.

 I did some experimenting and found the following.

 Experiment 1: Disable the routine which stores stuff into B -> A does
not throw std::bad_alloc.

 Experiment 2: Run the program as normal and check what happens when A
throws std::bad_alloc -> the routine which is putting data into B
seems to be fine.

Thanks again in advance,
Fabian


It's clear you are storing a lot of data in memory, but I'm a little
confused about exactly what you are storing. To try to get this
straight you have...

Vector A = vector with 40000 elements (mentioned in OP).
Each of those elements is a vector of 2 elements.

Vector B = vector with 2^22 = 4194304 elements
Each of those elements is also a vector.
    Each of those vectors stores vectors of 2 elements, although most
are empty?

Right? So what are you storing in the end?

Vector A = vector<vector<what> > ?
Vector B = vector<vector<vector<what> > > ?

Unless peter koch is right about you corrupting the heap or leaking
memory, it's probably not significant that one operation or the other
throws a bad_alloc -- it's significant that you're running out of
memory in general. Have you taken a look at your program in top /
Process Explorer / something similar depending on your platform to
check how much memory you're actually using?
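
One thing worth checking while you're at it: even empty vectors carry
fixed bookkeeping. A quick sketch to estimate that overhead (the
exact sizeof is implementation-specific, commonly three pointers,
roughly 12 bytes on 32-bit platforms and 24 on 64-bit):

#include <iostream>
#include <vector>

int main() {
    typedef std::vector<int> V2;

    // Even an empty std::vector occupies sizeof(std::vector) bytes.
    std::cout << "sizeof(empty vector<int>): " << sizeof(V2) << '\n';

    // With 2^22 mostly-empty inner vectors, that bookkeeping alone
    // can reach tens of megabytes before any data is stored:
    double mb = sizeof(std::vector<V2>) * double(1u << 22)
                / (1024.0 * 1024.0);
    std::cout << "minimum overhead of 2^22 empty vectors: "
              << mb << " MB\n";
    return 0;
}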
 
fabian.lim

Hi Jason

To summarize, I'm storing

Vector A = vector<V1>
Vector B = vector<vector<vector<V2> > >

where both V1 and V2 are vector<int>, with sizes |V1| <= 2 and
|V2| <= 10. I run on Windows, and when I check the memory usage, it
seems to be around 90,000K when the exception is thrown. I don't
really have a good feel for these figures, so I compared to my Firefox
browser, which seems to occupy approximately the same amount of
memory; hence I concluded that not a lot of memory is being used. My
computers have around 1/2 GB of RAM. I do not see the typical OS
slowdown which accompanies a lot of paging. And to be more precise,
I'm actually using the vector templates from the Boost uBLAS
libraries.

Thanks,
Fabian
 
Juha Nieminen

Per said:
Perhaps a std::list will suffice in that case since no extra memory
space will be allocated beforehand. Or perhaps a std::deque will not
allocate the same amount of memory.

std::deque consumes much less memory than std::list, so it's the
preferred method. (Basically the only situation where you want to use
std::list is when you want to be able to add or remove elements from the
middle in O(1) (assuming you already have an iterator to the position to
be added or removed) or for O(1) list splicing.)
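
For reference, the O(1) splice operation mentioned above looks like
this:

#include <iostream>
#include <list>

int main() {
    std::list<int> a, b;
    for (int i = 0; i < 3; ++i) a.push_back(i);    // a: 0 1 2
    for (int i = 10; i < 13; ++i) b.push_back(i);  // b: 10 11 12

    // splice() moves all of b's nodes into a in O(1): nothing is
    // copied, the nodes are simply relinked. This is the operation
    // that justifies std::list's per-node memory overhead.
    a.splice(a.end(), b);                          // a: 0 1 2 10 11 12

    for (std::list<int>::const_iterator it = a.begin();
         it != a.end(); ++it)
        std::cout << *it << ' ';
    std::cout << '\n';
    return 0;
}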
 
