px1138
I am in a grad level computer architecture class. Although it does not
have any programming prereqs, most of the students are CS (I am EE), so
the first assignment was to write a little program to determine the
number of flops per second your computer runs at. It is supposed to be a
short program, and apart from making sure you use a good timer and run
for long enough to ensure accuracy, it doesn't have to include any
really big optimizations (e.g. making sure all the data fits in cache
to minimize memory transfer, doing something like LINPACK and making
sure your # ops >> # data, etc.).
My personal caveat is that I suck at C/C++; I wouldn't even know where
to start. However, I have coded in Perl at many internships and on other
projects, so I wanted to use Perl. Well, I am getting downright horrible
performance from my Perl script (I would say a factor of 100 or more
slower than my CPU should be). I have pretty much the same sort of
algorithm as a friend of mine has in C++, and he is getting something
like ~500 Mflops on his laptop, while my Perl script (running Perl for
Windows, I know, I know) says 2-5 Mflops! Now I know Perl is interpreted,
but should it be THAT slow? If so, why? If not, am I just coding
something oddly?
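For concreteness, here is a minimal sketch of the kind of timing loop I mean (not my actual script; Time::HiRes is assumed for the high-resolution timer, and the iteration count and the two-flops-per-iteration bookkeeping are placeholder choices you'd tune):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);

# Placeholder iteration count -- tune so the run lasts a few seconds.
my $n = 10_000_000;
my $x = 1.0;

my $t0 = [gettimeofday];
for my $i (1 .. $n) {
    # One multiply and one add per iteration = 2 flops.
    $x = $x * 1.000001 + 1e-9;
}
my $elapsed = tv_interval($t0);   # wall-clock seconds since $t0

printf "%.2f Mflops (%.3f s elapsed)\n", 2 * $n / $elapsed / 1e6, $elapsed;
```

Note the loop itself carries Perl's per-iteration interpreter overhead (opcode dispatch, scalar boxing), which is part of why a loop like this reports far fewer Mflops than the equivalent C++.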
-Josh