# Double data type subtraction changes precision

Discussion in 'ASP .Net' started by neerajb@noida.nospamhcltech.com, Feb 13, 2009.

1. ### Guest

I have 3 variables, all Double data type. When I subtract dblA - dblB and
assign the result to dblC, what should be a simple number comes back with a
long string of decimals. For example, 2.04 - 1.02 becomes
1.0199999999999999.... Why is this? It seems that if dblA and dblB only have
2 decimals, the result should have 2 decimals or less. It's critical in the
application I'm working on that this number comes out to the expected 2
decimals, but it can vary to any number of decimals depending on what values
are passed into dblA and dblB.

This is ridiculous. This is happening in both .NET 1.1 and .NET 2.0.
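
Here's a minimal, self-contained C# sketch of the effect (my own illustration, not code from the thread; I've used 0.3 - 0.1 because that pair reliably shows the stray digits, and the exact string printed can vary with the formatting used):

```csharp
using System;

class DoubleSubtraction
{
    static void Main()
    {
        // Neither 0.3 nor 0.1 has an exact binary (base-2) representation,
        // so the stored values are very close approximations and the
        // difference is not exactly 0.2.
        double a = 0.3;
        double b = 0.1;
        double c = a - b;

        Console.WriteLine(c.ToString("R")); // round-trip format, e.g. 0.19999999999999998
        Console.WriteLine(c == 0.2);        // False - exact equality on doubles is unreliable
    }
}
```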

Feb 13, 2009

2. ### Guest

Thanks, Mark, for the quick response.

"Mark Rae [MVP]" wrote:

> ""
> <> wrote in message
> news:...
>
> > I have 3 variables, all Double data type. When I subtract dblA - dblB and
> > assign the result to dblC, what should be a simple number comes back with
> > a long string of decimals. For example, 2.04 - 1.02 becomes
> > 1.0199999999999999.... Why is this? It seems that if dblA and dblB only
> > have 2 decimals, the result should have 2 decimals or less. It's critical
> > in the application I'm working on that this number comes out to the
> > expected 2 decimals, but it can vary to any number of decimals depending
> > on what values are passed into dblA and dblB.
> >
> > This is ridiculous. This is happening in both .NET 1.1 and .NET 2.0.

>
> Completely standard behaviour, not just in .NET but in any language that
> uses binary floating-point arithmetic:
> http://www.yoda.arachsys.com/csharp/floatingpoint.html
>
> Use Decimal instead of double...
>
>
> --
> Mark Rae
> ASP.NET MVP
> http://www.markrae.net
>
>
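
For what it's worth, here's a small sketch of that suggestion using the values from the original post (my own example, not Mark's code); decimal stores base-10 digits exactly, and rounding the double result is a common fallback when the type can't change:

```csharp
using System;

class DecimalSubtraction
{
    static void Main()
    {
        // decimal keeps base-10 digits exactly, so 2-decimal inputs
        // subtract to a 2-decimal result
        decimal a = 2.04m;
        decimal b = 1.02m;
        Console.WriteLine(a - b);                // 1.02

        // If the values have to stay double, round the result to the
        // precision the application expects
        double x = 2.04;
        double y = 1.02;
        Console.WriteLine(Math.Round(x - y, 2)); // 1.02
    }
}
```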

Feb 14, 2009