What is the point of having a separate unsigned type, aka NSUInteger, if there is no guarantee (nor even, it seems, a chance) that you can assume, bet your bottom dollar on, or cry yourself to sleep for what the name implies: an inherently nonnegative result?
NSUInteger normal = 5;
NSUInteger freaky = normal - 55;
NSLog(@"%ld, %ld", normal, freaky);
Output: 5, -50
Sure, I can bend over backwards trying to get zero, or some kind of normalized value…
NSUInteger nonNeg = (((normal - 55) >= 0) ? (normal - 55) : 0);
Parallel-universe output: 5, -50
But here the compiler complains, rightfully so, that "comparison of unsigned expression >= 0 is always true" - and there it is, an answer I didn't want or expect. Someone slap my face, get me a drink, and tell me what year it is... or better yet, how to make it - you know - not do that.
%ld tells NSLog to print the value as a signed integer. Try %lu instead.
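For example (a minimal sketch; the printed value assumes a 64-bit platform, where NSUInteger is unsigned long):

NSUInteger normal = 5;
NSUInteger freaky = normal - 55; // wraps around; there is no negative value
NSLog(@"%lu, %lu", (unsigned long)normal, (unsigned long)freaky);
// Output: 5, 18446744073709551566 (that is, 2^64 - 50)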
See Two's Complement on Wikipedia for an explanation of what's going on at the bit level.
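A quick sketch to see this at the bit level (again assuming a 64-bit NSUInteger): the subtraction produces the same bit pattern as -50, which reads as 2^64 - 50 when interpreted as unsigned.

NSUInteger a = 5;
NSUInteger b = 55;
NSUInteger u = a - b;                            // arithmetic is modulo 2^64
NSLog(@"unsigned: %lu", (unsigned long)u);       // 18446744073709551566
NSLog(@"signed:   %ld", (long)u);                // -50: same bits, read as two's complement
NSLog(@"wrapped?  %d", u == NSUIntegerMax - 49); // 1: the value is 2^64 - 50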
What is happening here is that the subtraction is causing the unsigned integer representation to wrap around. To protect against this, you need to check before you do the subtraction.
NSUInteger x = 5;
NSUInteger y = 55;
// If 0 makes sense in your case
NSUInteger result = (x >= y) ? (x - y) : 0;
// If it should be an error
if (x < y)
{
    // Report error
}
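If you do this in more than one place, you could wrap the check in a small helper; clampedSubtract below is a hypothetical name, not part of Foundation:

#import <Foundation/Foundation.h>

// Hypothetical helper: returns x - y, clamped to 0 when it would underflow.
static NSUInteger clampedSubtract(NSUInteger x, NSUInteger y) {
    return (x >= y) ? (x - y) : 0;
}

int main(void) {
    @autoreleasepool {
        NSLog(@"%lu", (unsigned long)clampedSubtract(5, 55)); // 0
        NSLog(@"%lu", (unsigned long)clampedSubtract(55, 5)); // 50
    }
    return 0;
}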