Double data type subtraction changes precision

Discussion in 'ASP .Net' started by neerajb, Feb 13, 2009.

  1. neerajb

    neerajb Guest

    I have 3 variables, all of the Double data type. When I subtract them,
    dblC = dblA - dblB, what should be a simple two-decimal number comes back
    with a long string of decimals. For example, 2.04 - 1.02 becomes
    1.0199999999999999.... Why is this? It seems that if dblA and dblB have
    only 2 decimals, the result should have 2 decimals or fewer. It's critical
    in the application I'm working on that this number come out to the expected
    2 decimals, but it can vary to any number of decimals depending on what
    values are passed into dblA and dblB.

    This is ridiculous. This is happening in both .NET 1.1 and .NET 2.0.
     
    neerajb, Feb 13, 2009
    #1
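
    For the record, the behavior comes from Double being a binary floating-point
    type: values like 0.1 or 0.3 (and many two-decimal amounts) have no exact
    binary representation, so the stored values carry a tiny rounding error that
    a subtraction can expose. Below is a minimal C# sketch (C# is assumed; the
    thread never says which language, and 0.3 - 0.1 is substituted for the
    poster's values because it reliably reproduces the effect) showing the value
    actually stored and two common workarounds: Math.Round for display, or
    System.Decimal for exact base-10 arithmetic.

    using System;

    class DoublePrecisionDemo
    {
        static void Main()
        {
            // 0.3 and 0.1 have no exact binary representation, so the
            // subtraction exposes the rounding error already stored in them.
            double dblA = 0.3;
            double dblB = 0.1;
            double dblC = dblA - dblB;

            // The "R" (round-trip) format shows the value actually stored,
            // independent of the runtime's default display rounding.
            Console.WriteLine(dblC.ToString("R"));   // 0.19999999999999998

            // Workaround 1: round the result to the scale you actually need.
            Console.WriteLine(Math.Round(dblC, 2));  // 0.2

            // Workaround 2: use decimal, which is base-10 and keeps
            // two-decimal (currency-style) values exact.
            decimal decA = 0.3m;
            decimal decB = 0.1m;
            Console.WriteLine(decA - decB);          // 0.2
        }
    }

    If the values really are two-decimal amounts such as prices, Decimal is
    usually the better fit; Double is intended for scientific-style values where
    a relative error of roughly 1 part in 10^15 is acceptable.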
