I have this code:
static int test = 100;
static int Test
{
    get
    {
        return (int)(test * 0.01f);
    }
}
Output is: 0
But this code returns a different result:
static int test = 100;
static int Test
{
    get
    {
        var y = (test * 0.01f);
        return (int)y;
    }
}
Output is: 1
I also have this code:
static int test = 100;
static int Test
{
    get
    {
        return (int)(100 * 0.01f);
    }
}
Output is: 1
I looked at the IL output, and I don't understand why C# does this mathematical operation at compile time and produces a different result.
What is the difference between these two pieces of code? Why does the result change when I use a variable?
Because the compiler tricks you. The compiler is smart enough to do some of the math ahead of time so it does not need to do it at run time, which would be pointless. The expression 100 * 0.01f is evaluated by the compiler, without the float's lack of precision that trips you up at run time.
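For illustration, here is a minimal sketch (the class name and the field name runtimeValue are made up for this example) contrasting a constant expression, which the compiler folds, with the same arithmetic on a non-constant field, which is evaluated at run time:

using System;

static class PrecisionDemo
{
    // Not a constant, so the multiplication and the cast happen at run time,
    // where 0.01f is the nearest float to 0.01 (roughly 0.0099999998).
    static int runtimeValue = 100;

    static void Main()
    {
        // Constant expression: folded by the compiler at compile time.
        Console.WriteLine((int)(100 * 0.01f));          // prints 1, as in the third snippet

        // Non-constant expression: evaluated at run time with float precision.
        Console.WriteLine((int)(runtimeValue * 0.01f)); // printed 0 in the question
    }
}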
To prove this, try making the static field test a const. You will see that the compiler is then able to do the math for you at compile time as well, as in the sketch below. It has nothing to do with writing to a variable first, as in your sample; the difference is run-time versus compile-time evaluation.
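For example, a sketch of that change, reusing the names from the question:

// With test declared const, test * 0.01f is a constant expression, so the
// compiler folds it at compile time, just like the literal 100 above.
const int test = 100;

static int Test
{
    get
    {
        // Evaluated by the compiler; should now return 1, matching the third snippet.
        return (int)(test * 0.01f);
    }
}

Comparing the IL of this version with the original static-field version should make the difference visible: the folded constant replaces the run-time multiplication and conversion.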