I want to calculate the average of two numbers as a floating-point value, but whatever the input, I get a whole number back.
What should I do to make this work?
using System;

public class Program
{
    public static float Average(int a, int b)
    {
        return (a + b) / 2;
    }

    public static void Main(string[] args)
    {
        Console.WriteLine(Average(2, 1));
    }
}
When the operand types differ, the numeric arguments are first converted to a common type. In C#, dividing one integer by another performs integer division: the fractional part of the mathematical result is discarded (truncated toward zero), so the result is again an integer.
The division operator / means integer division when both operands are integers. If one or both operands are floating-point numbers, it means floating-point division.
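A quick illustration of that rule (a minimal sketch; the f suffix makes a literal a float):

Console.WriteLine(5 / 2);    // 2   — both operands are ints, so integer division
Console.WriteLine(5 / 2f);   // 2.5 — one operand is a float, so floating-point division
Console.WriteLine(5f / 2f);  // 2.5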
Integers and floats are two different kinds of numeric data. An integer (commonly called an int) is a whole number with no fractional part. A float is a floating-point number, i.e. a number that can carry digits after the decimal point. Floats are used when fractional precision is needed.
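Applied to your code, one minimal fix (a sketch, not the only option) is to make one operand of the division a float, so the division itself happens in floating point:

public static float Average(int a, int b)
{
    // Dividing by the float literal 2f forces floating-point division.
    return (a + b) / 2f;   // equivalently: (float)(a + b) / 2
}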
There are two problems with your code:
1. 1 / 2 == 0, not 0.5, because the result of dividing two ints must itself be an int.
2. a + b can overflow int.MaxValue, and then you'd get a negative result.
An implementation that avoids both problems is
public static float Average(int a, int b)
{
    // Halving each operand before adding keeps the intermediate
    // values within range, so the sum cannot overflow.
    return 0.5f * a + 0.5f * b;
}
Tests:
Average(1, 2); // 1.5
Average(int.MaxValue, int.MaxValue); // ≈ 2.1474836E+09 — a large positive value instead of a negative overflow
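One caveat: converting large ints to float rounds them (float has only a 24-bit significand), so the result above is only approximate. A sketch of an alternative, assuming exact results matter, is to widen to long before adding:

public static double Average(int a, int b)
{
    // long holds the sum of any two ints exactly, so there is no
    // overflow, and double represents every int value exactly.
    return ((long)a + b) / 2.0;
}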