Can somebody explain the following System.Numerics.BigInteger
behavior?
Console.WriteLine(Math.Log10(100)); // prints 2
Console.WriteLine(Math.Log10(1000)); // prints 3 (as expected)
Console.WriteLine((int)Math.Log10(100)); // prints 2
Console.WriteLine((int)Math.Log10(1000)); // prints 3 (as expected)
var bi100 = new BigInteger(100);
var bi1000 = new BigInteger(1000);
Console.WriteLine(BigInteger.Log10(bi100)); // prints 2
Console.WriteLine(BigInteger.Log10(bi1000)); // prints 3 (as expected)
Console.WriteLine((int)BigInteger.Log10(bi100)); // prints 2
Console.WriteLine((int)BigInteger.Log10(bi1000)); // prints 2 ???????
Console.WriteLine(Math.Floor(BigInteger.Log10(bi100))); // prints 2
Console.WriteLine(Math.Floor(BigInteger.Log10(bi1000))); // prints 2 ???????
Console.WriteLine(Math.Round(BigInteger.Log10(bi100))); // prints 2
Console.WriteLine(Math.Round(BigInteger.Log10(bi1000))); // prints 3 (as expected)
EDIT: Please note that I know it's a rounding problem. I want to know why the behavior of Math.Log10 and BigInteger.Log10 differs.
It is due to precision and rounding.
This line:
Console.WriteLine((int)BigInteger.Log10(bi1000));
truncates the value 2.9999999999999996 to 2 (a cast from double to int discards the fractional part rather than rounding), whereas Console.WriteLine rounds the double when formatting it and therefore writes it out as 3.
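As for why the two APIs give different values in the first place: Math.Log10 has its own dedicated implementation and, for 1000, returns a value of at least 3.0 (the question's output shows the cast gives 3), while BigInteger.Log10 appears to be implemented as BigInteger.Log(value, 10), i.e. a natural-log style computation divided by Math.Log(10), and that final division is what leaves the result just below 3. A minimal sketch of the effect (the exact digits can vary between runtimes, and the Log(value, 10) detail comes from the reference source, not from anything in the question):

using System;
using System.Numerics;

class Log10Demo
{
    static void Main()
    {
        // Dedicated log10 routine: the cast gives 3, as in the question's output.
        Console.WriteLine((int)Math.Log10(1000));

        // BigInteger.Log10 goes through a division by Math.Log(10), so the result
        // is typically 2.9999999999999996 and the cast truncates it to 2.
        Console.WriteLine((int)BigInteger.Log10(new BigInteger(1000)));

        // The same error can usually be reproduced with plain doubles:
        Console.WriteLine((int)(Math.Log(1000) / Math.Log(10))); // typically 2 as well
    }
}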
You can verify this by storing the result in an intermediate double variable and inspecting its value in the debugger:
double x = BigInteger.Log10(bi1000);
Console.WriteLine((int)x);
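If you would rather not attach a debugger, the round-trip ("R") format specifier shows the value stored in x directly; a short sketch continuing the snippet above (the trailing digits may differ slightly between runtimes):

Console.WriteLine(x.ToString("R")); // e.g. 2.9999999999999996 -- what x actually holds
Console.WriteLine(x);               // 3 (as in the question's output) -- the default format rounds for display
Console.WriteLine(Math.Round(x));   // 3 -- rounding first, as in the question's last lines, gives the expected value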