Why does comparing a *nullable* decimal with `0` differ from comparing a decimal with `0`?

Can someone explain why result1 is false and result2 is true? The code is given below:

namespace TestCsharp
{
    class Program
    {
        static void Main(string[] args)
        {
            Order objOrder = new Order(0.0M);

            bool result1 = objOrder.PriceNullable.Equals(0);       // returns false
            bool result2 = objOrder.PriceNullable.Value.Equals(0); // returns true
        }
    }

    public class Order
    {
        public decimal? PriceNullable { get; set; }
        public Order(decimal? priceNullable)
        {
            PriceNullable = priceNullable;
        }
    }
}
Yaqub Ahmad asked Oct 01 '12

1 Answer

Because System.Decimal exposes an overload of Equals that accepts a decimal value, and your second case invokes that overload (after the int argument has been implicitly converted to a decimal), returning true.
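
To make the overload resolution visible, here's a minimal sketch (the variable names are illustrative, not from the question):

decimal price = 0.0M;

// decimal.Equals(decimal) is picked; the int literal 0 is implicitly
// converted to 0M at the call site, so the values compare equal.
bool sameValue = price.Equals(0);           // true

// Forcing the object overload skips the conversion: a boxed int is
// not a decimal, so the comparison fails.
bool boxedInt = price.Equals((object)0);    // false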

Whereas in the first case, the Nullable<decimal> is trying its best, but it can only invoke Object.Equals, which fails when a boxed int is compared with a decimal. If your first call were:

bool result1 = objOrder.PriceNullable.Equals(0M);

You'd be comparing two decimals, and it would return true.
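
For contrast, a small sketch using the question's Order class (assuming the same setup as in the question) shows both argument types side by side:

Order objOrder = new Order(0.0M);

// Nullable<decimal> only exposes Equals(object), so the argument is boxed either way.
bool withInt     = objOrder.PriceNullable.Equals(0);   // false - boxed int is not a decimal
bool withDecimal = objOrder.PriceNullable.Equals(0M);  // true  - boxed decimal matches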


The Nullable<T> Equals method can invoke neither the implicit conversion from int to decimal nor the overload of Equals that accepts a decimal value.
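
As an aside (an addition, not part of the original answer): the lifted == operator, unlike Equals, does go through the implicit int-to-decimal conversion, so it behaves the way the question expected:

decimal? nullablePrice = 0.0M;

// The int 0 is implicitly converted to 0M, then the comparison is
// lifted over the nullable, so this evaluates to true.
bool viaOperator = nullablePrice == 0;    // true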

Damien_The_Unbeliever answered Sep 28 '22