I came across some weird behavior when I changed an if-else to a ternary (conditional) operator in a return statement. Here's a simplified version of the code:
class Foo
{
    private bool condition;
    private int intValue = 1;
    private decimal decimalValue = 1M;

    public object TernaryGet
    {
        get
        {
            return condition ? decimalValue : intValue;
        }
    }

    public object IfElseGet
    {
        get
        {
            if (condition)
                return decimalValue;
            return intValue;
        }
    }

    public Foo(bool condition)
    {
        this.condition = condition;
    }
}
using System;

class Program
{
    static void Main(string[] args)
    {
        var fooTrue = new Foo(true);
        var fooFalse = new Foo(false);

        Console.WriteLine("{0}, {1}", fooTrue.TernaryGet.GetType(), fooTrue.IfElseGet.GetType());
        Console.WriteLine("{0}, {1}", fooFalse.TernaryGet.GetType(), fooFalse.IfElseGet.GetType());
    }
}
The output from this is:
System.Decimal, System.Decimal
System.Decimal, System.Int32
I'd expect the second row to output Int32 for both getters, but for the ternary I'm getting back the wrong CLR type for the int.
Never mind what the code is trying to do; I'm curious why this is happening, so if anyone can explain it, I'd appreciate it.
The result of the ternary (conditional) operator is always of a single type: one or both operands are converted to a common type:

var result = condition ? decimalValue : intValue;

The type of result must be known statically, at compile time. Since there is an implicit conversion from int to decimal (but not the other way around), decimal is selected as the type of the whole ?: operator.
So your whole getter can be written as (showing the automatic conversions):

public object TernaryGet
{
    get
    {
        /*decimal*/ var result = condition ? decimalValue : (decimal)intValue;
        return (object)result;
    }
}
condition ? decimalValue : intValue;

means

condition ? decimalValue : (decimal)intValue;
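One way to see this without the Foo class at all is to let the compiler infer the operator's type with var; a minimal sketch (variable names chosen to match the question):

```csharp
using System;

class Demo
{
    static void Main()
    {
        int intValue = 1;
        decimal decimalValue = 1M;
        bool condition = false;

        // The compile-time type of the whole ?: expression is decimal,
        // so intValue is converted to decimal even when condition is false.
        var result = condition ? decimalValue : intValue;
        Console.WriteLine(result.GetType()); // System.Decimal
    }
}
```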
Try whether this works (I'm a stranger to C#, but this works in Java):

condition ? (object)decimalValue : (object)intValue;
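For what it's worth, this does work in C# as well: casting each operand to object makes object the common type of the operator, so each branch is boxed with its original runtime type before the selection happens. A small sketch comparing the two forms:

```csharp
using System;

class CastDemo
{
    static void Main()
    {
        int intValue = 1;
        decimal decimalValue = 1M;
        bool condition = false;

        // Common type is decimal: intValue is converted, then boxed.
        object viaTernary = condition ? decimalValue : intValue;
        Console.WriteLine(viaTernary.GetType()); // System.Decimal

        // Common type is object: each operand is boxed first,
        // so the chosen branch keeps its original runtime type.
        object viaCast = condition ? (object)decimalValue : (object)intValue;
        Console.WriteLine(viaCast.GetType()); // System.Int32
    }
}
```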