Julián Albo
Hello.
This test:
#include <stdio.h>

int main ()
{
    double a = -1.0e-120;

    if (a < 0.0)
        printf ("%g < 0\n", a);
    if (a > 0.0)
        printf ("%g > 0\n", a);
    if (a == 0.0)
        printf ("%g == 0\n", a);

    return 0;
}
Compiled with gcc 3.3 as a C program it prints "-1e-120 < 0", but
compiled as C++ it prints "-1e-120 == 0".
I suspect this is a gcc problem, but is there a workaround? How can I
reliably compute the sign of a?
Regards.