Dan Elliott
Hello all,
I am writing a program which needs to run as quickly as possible, but holds
a lot of data in memory (around 1GB for a usual run). Is this too much
memory to even consider putting most/all of it on the stack? Does it
matter?
Any considerations I might take into account when deciding how to allocate
the many data structures? By the way, the system will consist of a handful
of very large data structures (primarily matrices and vectors).
Historically, I have always used the heap. This was more a result of my C
background than anything else. However, I am now writing a system which
will use only C++ and OO practices. Therefore, I feel I have a new choice
to make. Any assistance in making this design as efficient as possible would
be greatly appreciated.
Thank you,
dan elliott