Double data type subtraction changes precision

neerajb

I have 3 variables, all of the Double data type. When I subtract, dblC = dblA - dblB, what should be a simple number comes back with a long string of decimals. For example, 2.04 - 1.02 becomes 1.0199999999999999... Why is this? It seems that if dblA and dblB have only 2 decimal places each, the result should have 2 decimal places or fewer. It's very critical in the application I'm working on that this number comes out to the expected 2 decimals, but it can vary to any number of decimals depending on what values are passed into dblA and dblB.

This is ridiculous. This is happening in both .NET 1.1 and .NET 2.0.
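
For reference, here is a minimal, self-contained repro of the effect. It is written in C# on the assumption that the same behaviour applies in VB.NET; the variable names follow the post, the values 0.3 and 0.1 are just a pair that shows the rounding clearly, and the last lines sketch the two usual ways to pin the result to two decimals (rounding the double, or switching to the decimal type).

using System;

class DoubleSubtractionDemo
{
    static void Main()
    {
        // Double is IEEE 754 binary floating point, so most two-decimal
        // values (0.1, 0.3, 1.02, ...) have no exact binary representation,
        // and the tiny representation errors surface after a subtraction.
        double dblA = 0.3;
        double dblB = 0.1;
        double dblC = dblA - dblB;

        Console.WriteLine(dblC.ToString("G17")); // 0.19999999999999998 (full stored value)
        Console.WriteLine(dblC == 0.2);          // False

        // Workaround 1: round the binary result to the scale you need.
        Console.WriteLine(Math.Round(dblC, 2));  // 0.2

        // Workaround 2: Decimal is base-10, so 0.30 - 0.10 stays exact.
        decimal decA = 0.30m, decB = 0.10m;
        Console.WriteLine(decA - decB);          // 0.20
    }
}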