Kamaraju Kusumanchi
Hi gurus,
I am a relative newbie to C++ programming for numerical computation,
and I am looking for advice from someone who has used C++ for this
purpose. Specifically, these are my questions; any references or links
would be useful.
1) Does object-oriented programming introduce significant overhead
compared to using plain arrays? I know this is a very subjective issue
and depends on the operating system, compiler, and so on. What I have
in mind is
class vector {
    double x[3];
    // some functions here
};

class node {
    vector position, velocity;
    double temp, pres, density;
    // some functions here
};
Now say I have
node domain[128][128][128]; // 3-dimensional array
In a given function I will usually be accessing just one quantity (say
velocity) at all nodes in the domain. So every time I access velocity,
the code first has to find which node the velocity belongs to, then
call the function to get the velocity, then pick out the component
that is needed. I am looking for some numbers on how this overhead
compares with the traditional way of doing things. (By traditional I
mean no classes, just 3-D arrays of xvelocity, yvelocity, zvelocity,
pressure, etc.)
For the above model, can I quantify the overhead of each such call
(say 1-5%)? A ballpark figure would be sufficient. I am using
g++ 3.3.3 as my compiler on a Debian testing distribution.
2) Does object-oriented programming significantly affect parallelising
the code? Is MPI support in C++ good enough, or is there a better
parallel programming language?
3) Does OOP overhead significantly affect performance when calling the
FFTW (3.0) libraries?
As you have probably guessed by now, I am a Fortran programmer trying
to decide whether it is worth shifting to OOP for numerical
computation. Should I stick with Fortran 90, or is it worth moving to
C++? Any references or helpful hints are appreciated.
thanks
raju