Velocity Reviews > Double data type subtraction changes precision

# Double data type subtraction changes precision

neerajb@noida.nospamhcltech.com

 02-13-2009
I have three variables, all of the Double data type. When I subtract (dblC = dblA - dblB), what should be a simple number gains a long tail of decimals. For example, 2.04 - 1.02 becomes 1.0199999999999999.... Why is this? It seems that if dblA and dblB have only 2 decimals, the result should have 2 decimals or fewer. It's critical in the application I'm working on that this number comes out to the expected 2 decimals, but it can vary to any number of decimals depending on the values passed into dblA and dblB.

This is ridiculous. This is happening in both .NET 1.1 and .NET 2.0.
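[This behaviour is easy to reproduce outside .NET, since it comes from IEEE 754 binary doubles rather than the framework. A minimal sketch in Java (used here only as a stand-in for the .NET Double type), with values chosen to make the rounding error visible:]

```java
// Neither 0.3 nor 0.1 is exactly representable as a binary double,
// so their stored values carry tiny rounding errors. Subtracting them
// exposes the error instead of giving a clean 0.2.
public class DoubleSubtraction {
    public static void main(String[] args) {
        double a = 0.3;
        double b = 0.1;
        double c = a - b;      // mathematically 0.2
        System.out.println(c); // prints 0.19999999999999998
    }
}
```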

neerajb@noida.nospamhcltech.com

 02-14-2009
Thanks, Mark, for the quick response.

"Mark Rae [MVP]" wrote:

> "(E-Mail Removed)" wrote in message news:(E-Mail Removed)...
>
> > [snip: original question quoted]

>
> Completely standard behaviour, not just in .NET but in any language that
> uses binary floating-point:
> http://www.yoda.arachsys.com/csharp/floatingpoint.html
>
> Use Decimal instead of double...
>
>
> --
> Mark Rae
> ASP.NET MVP
> http://www.markrae.net
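[The suggested fix works because a decimal type stores the value in base 10, so amounts like 2.04 and 1.02 are held exactly and the subtraction stays at two decimal places. In .NET that type is System.Decimal; the same idea sketched in Java, using BigDecimal as a stand-in:]

```java
import java.math.BigDecimal;

// Constructing BigDecimal from strings stores the decimal values exactly,
// so the subtraction is exact and keeps the two-decimal scale.
public class DecimalSubtraction {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("2.04");
        BigDecimal b = new BigDecimal("1.02");
        BigDecimal c = a.subtract(b); // exact decimal arithmetic
        System.out.println(c);        // prints 1.02
    }
}
```

The trade-off is that decimal arithmetic is slower than hardware doubles, but for money or anything else that must come out to an exact number of decimal places, it is the right tool.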