
Why does 5/2 result in '2' even when I use a float? [duplicate]

I entered the following code (and had no compiling problems or anything):

float y = 5/2;
printf("%f\n", y);

The output was simply: 2.000000

My math isn't wrong, is it? Or am I wrong about the / operator? It means divide, doesn't it? And shouldn't 5/2 equal 2.5?

Any help is greatly appreciated!

asked Dec 24 '22 by Alex Lord

1 Answer

5 is an int and 2 is an int. Therefore, 5/2 will use integer division. If you replace 5 with 5.0f (or 2 with 2.0f), making one of the ints a float, you will get floating point division and get the 2.5 you expect. You can also achieve the same effect by explicitly casting either the numerator or denominator (e.g. ((float) 5) / 2).
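
A minimal sketch showing the difference (the variable names are just for illustration):

#include <stdio.h>

int main(void)
{
    float a = 5 / 2;          /* integer division yields 2, which is then converted to 2.0f */
    float b = 5.0f / 2;       /* a float literal forces floating-point division: 2.5f */
    float c = (float) 5 / 2;  /* an explicit cast on the numerator has the same effect */

    printf("%f %f %f\n", a, b, c);  /* prints 2.000000 2.500000 2.500000 */
    return 0;
}

The key point is that the conversion to float in your original code happens only at the assignment, after the integer division has already thrown away the fractional part.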

answered Dec 28 '22 by R_Kapp