Chris Brat
Hi,
I've found this little oddity in the application I'm working on: the result of adding two double primitives is either 0.0000000000000002 more or 0.0000000000000001 less than the expected result.
This sample app illustrates it.
Does anyone know why this happens?
Thanks
Chris
public class A {
    public static void main(String[] args) {
        double a = 67.41;
        double b = 51.85;
        double result = a + b;
        // I get 119.25999999999999
        System.out.println(result);

        a = 1.01;
        b = 2.02;
        result = a + b;
        // I get 3.0300000000000002
        System.out.println(result);

        a = 1.100;
        b = 2.103;
        result = a + b;
        // I get 3.2030000000000003
        System.out.println(result);
    }
}
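For what it's worth, none of these decimal values can be represented exactly as a binary double, so each literal is already rounded before the addition happens. A quick way to see this is the BigDecimal(double) constructor, which exposes the exact binary value a double holds (unlike Double.toString, which rounds to the shortest decimal that round-trips), and BigDecimal(String) arithmetic as a workaround. A minimal sketch (class name B is just for illustration):

```java
import java.math.BigDecimal;

public class B {
    public static void main(String[] args) {
        // Show the exact binary values the doubles actually store;
        // this constructor performs no decimal rounding.
        System.out.println(new BigDecimal(67.41));
        System.out.println(new BigDecimal(51.85));

        // Exact decimal arithmetic via the String constructor.
        BigDecimal exact = new BigDecimal("67.41").add(new BigDecimal("51.85"));
        System.out.println(exact); // 119.26
    }
}
```

The first two lines print long decimal expansions that differ slightly from 67.41 and 51.85, which is where the stray trailing digits in the double sums come from.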