We have the same floating point intensive C++ program that runs on
Windows on Intel chips and on Sun Solaris on SPARC chips. The program
reads exactly the same input files on the two platforms. However, the
two builds generate slightly different results for floating point
numbers. Are they really supposed to generate exactly the same
results? I would guess so, because both platforms are supposed to be
compliant with the IEEE floating point standard (IEEE 754?).
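
To illustrate the kind of difference I mean, here is a minimal sketch
(the values are made up for the example). My understanding is that on
Intel, without strict flags, intermediates may stay in the 80-bit x87
registers, while SPARC rounds every intermediate to a 64-bit double:

    #include <cstdio>

    int main() {
        double a = 1.0;
        double b = 1e-16;           // less than half of DBL_EPSILON

        volatile double t = a + b;  // the volatile store forces rounding
                                    // to a 64-bit double, so t == 1.0
        double stepwise = t + b;    // still 1.0: b is lost both times

        double chained = a + b + b; // without strict flags, x87 code may
                                    // keep this sum in an 80-bit register,
                                    // so the two b's accumulate and
                                    // survive the final rounding to double

        std::printf("stepwise: %.17g\n", stepwise); // 1 on both platforms
        std::printf("chained:  %.17g\n", chained);  // may print
                                                    // 1.0000000000000002
                                                    // on Intel, 1 on SPARC
        return 0;
    }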
I have turned on the Visual C++ compiler flag that is supposed to make
the Windows build generate standard compliant code (the /Op flag).
However, the two platforms still produce different results. I suspect
this may be due to a commercial mathematical library we use, which
can't be compiled with the /Op option. If I could recompile everything
with /Op, the two platforms should produce the same results.
Am I right?
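
In the meantime, one workaround I am considering (my own assumption,
not something the library vendor documents) is forcing the x87
precision control to 53-bit double precision at startup with
_controlfp() from <float.h>. Since this is a per-thread hardware
setting, it should also apply inside the library we cannot recompile:

    #include <cstdio>
    #include <float.h>

    int main() {
        // Round every x87 operation to a 53-bit (double) significand
        // instead of the 64-bit extended one. Being a hardware control
        // word, this also affects code we cannot rebuild with /Op.
        _controlfp(_PC_53, _MCW_PC);

        std::printf("x87 precision control set to 53 bits\n");

        // ... run the floating point intensive computation here ...
        return 0;
    }

Even then, I understand the x87 registers keep a wider exponent range
than a double's, so results near underflow or overflow could still
differ.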
Thanks a lot.