Memory usage/Fragmentation

DLPnet

Hello,



I'm working on Windows and Mac in C++. My application uses a lot of
memory since it's dealing with a lot of images.
So I load images into a cache when needed, and if I don't have enough
memory to load the next image I remove the oldest ones from the cache.
But I have a lot of problems with memory. A lot of the time when I ask
for memory, a big chunk like 200 MB, new/malloc returns NULL, but as far
as I can see in the resource monitor I still have a lot of memory
available (far more than 200 MB)!
From what I found on the web this seems to be a problem due to memory
fragmentation, but how can I change this? What can I do?

Another thing is that it seems my application cannot use more than
2 GB of RAM even if the computer has more than that (3, 4 or even
5 GB). Any info on this?

And last, should I consider using memory-mapped files to reduce memory
usage (allocation/deallocation)? Any benefit from this?

Thanks,

Dlp
 
EventHelix.com

Are you dealing with 200 MB images?

If the images are smaller but you are allocating memory for a set of
images, I would recommend allocating memory for each image separately.

Deepa
 
Rolf Magnus

DLPnet said:
Yes, and even bigger ones, for example 20000x10000-pixel images...

There are libraries that are specifically designed to deal with such big
images. They don't load the whole image at once, but just the parts of it
that are needed. Depending on the image format, this can be very efficient.
 
Karl Heinz Buchegger

DLPnet said:
Hello,

I'm working on Windows and Mac in C++. My application uses a lot of
memory since it's dealing with a lot of images.
So I load images into a cache when needed, and if I don't have enough
memory to load the next image I remove the oldest ones from the cache.
But I have a lot of problems with memory. A lot of the time when I ask
for memory, a big chunk like 200 MB, new/malloc returns NULL, but as far
as I can see in the resource monitor I still have a lot of memory
available (far more than 200 MB)!
From what I found on the web this seems to be a problem due to memory
fragmentation,

Could be.
but how can I change this? What can I do?

By not allocating that much memory in one big chunk.

You could, e.g., set up a structure where each line in the image has
its own memory allocation. So instead of allocating one memory block
for one 20000 * 10000 pixel image, you have 20000 allocations of
10000 pixels each, plus one allocation for an array of 20000 pointers.
The allocator then only has to find many modest free blocks rather
than one huge contiguous one.
Another thing is that it seems my application cannot use more than
2 GB of RAM even if the computer has more than that (3, 4 or even
5 GB). Any info on this?

That's compiler-dependent and operating-system-dependent. A 32-bit
process simply cannot address more than 4 GB, and the OS reserves part
of that address space for itself.
And last, should I consider using memory-mapped files to reduce memory
usage (allocation/deallocation)? Any benefit from this?

Don't know. Try it.
When you get into such large memory usage, you need to get creative
about reducing problems: problems for the programmer *and* for the
operating system *and* for the runtime system.
 
Ioannis Vranos

DLPnet said:
Another thing is that it seems my application cannot use more than
2 GB of RAM even if the computer has more than that (3, 4 or even
5 GB). Any info on this?


On 32-bit Windows there is a limit on the address space per process:
2 GB of user address space by default (3 GB if the system is booted
with the /3GB switch), no matter how much physical RAM is installed.


A similar restriction applies on other 32-bit operating systems,
including the Mac.
 
