NSLog(@"CEIL %f",ceil(2/3));
should return 1. However, it shows:
CEIL 0.000000
Why does this happen, and how do I fix it? I use ceil([myNSArray count]/3)
and it returns 0 when the array count is 2.
The same rules as C apply: 2 and 3 are ints, so 2/3 is an integer divide. Integer division truncates, so 2/3 produces the integer 0. That integer 0 will then be cast to a double precision float for the call to ceil, but ceil(0) is 0.
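You can watch each step happen by splitting the expression apart (a minimal sketch; the variable names are just for illustration):
int quotient = 2/3;                 // integer divide truncates: quotient is 0
double promoted = quotient;         // 0 is converted to 0.0 for ceil
NSLog(@"CEIL %f", ceil(promoted));  // prints CEIL 0.000000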
Changing the code to:
NSLog(@"CEIL %f",ceil(2.0/3.0));
will display the result you're expecting. Adding the decimal point causes the constants to be recognised as double precision floating point numbers (and 2.0f is how you'd type a single precision floating point number).
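For illustration, the literal's suffix controls the precision (nothing beyond standard C is assumed here):
double d = 2.0/3.0;     // double precision literals
float  f = 2.0f/3.0f;   // single precision literals
NSLog(@"%f %f", d, f);  // both print 0.666667 with %f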
Maudicus' solution works because (float)2/3 casts the integer 2 to a float, and C's promotion rules mean that the denominator will be promoted to floating point too in order to divide a floating point number by an integer, giving a floating point result.
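For instance, casting just the numerator is enough to force a floating point divide (a small sketch of the promotion):
float ratio = (float)2/3;        // 2 becomes 2.0f, then 3 is promoted to 3.0f
NSLog(@"CEIL %f", ceil(ratio));  // prints CEIL 1.000000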
So, your current statement ceil([myNSArray count]/3) should be changed to either:
([myNSArray count] + 2)/3 // no floating point involved
or:
ceil((float)[myNSArray count]/3) // arguably more explicit
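The integer version relies on the usual trick of adding divisor - 1 before dividing, which makes integer truncation round the quotient up instead of down. A quick check with a count of 2 (myNSArray stands in for whatever array you have):
NSUInteger count = 2;                           // e.g. [myNSArray count]
NSUInteger groups = (count + 2)/3;              // (2 + 2)/3 = 4/3 = 1 in integer math
NSLog(@"groups = %lu", (unsigned long)groups);  // prints groups = 1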
2/3 evaluates to 0 unless you cast one of the operands to a float. So you have to be careful that your values aren't truncated to ints before you intend.
float decValue = (float) 2/3;
NSLog(@"CEIL %f",ceil(decValue));
==>
CEIL 1.000000
For your array example:
float decValue = (float) [myNSArray count]/3;
NSLog(@"CEIL %f",ceil(decValue));