Int to Double casting issue

I'm an Objective-C developer with little C/C++ experience (and zero formal training), and I encountered something strange today with hard-coded numeric values.

I'm sure it's a simple/stupid question, but can someone please explain why this works:

NSDate *start = [NSDate date];
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC);

dispatch_after(popTime, dispatch_get_main_queue(), ^{
  NSLog(@"seconds: %f", [start timeIntervalSinceNow]);
});
// output: seconds: -1.0001

And this also works (note the number of seconds has changed):

NSDate *start = [NSDate date];
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 2 * NSEC_PER_SEC);

dispatch_after(popTime, dispatch_get_main_queue(), ^{
  NSLog(@"seconds: %f", [start timeIntervalSinceNow]);
});
// output: seconds: -2.0001

But this is executed immediately:

NSDate *start = [NSDate date];
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 4 * NSEC_PER_SEC);

dispatch_after(popTime, dispatch_get_main_queue(), ^{
  NSLog(@"seconds: %f", [start timeIntervalSinceNow]);
});
// output: seconds: -0.0001

However, using 4.0 instead of 4 fixes it:

NSDate *start = [NSDate date];
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 4.0 * NSEC_PER_SEC);

dispatch_after(popTime, dispatch_get_main_queue(), ^{
  NSLog(@"seconds: %f", [start timeIntervalSinceNow]);
});
// output: seconds: -4.0001

Why do 1 and 2 properly cast to the relevant double value, but bigger numbers (I tested 3 and 4) appear to be represented as 0?

I'm compiling with Xcode 4.2, configured to use LLVM 3.0.

EDIT:

dispatch_time_t is defined as:

typedef uint64_t dispatch_time_t;

And dispatch_time is:

dispatch_time_t dispatch_time(dispatch_time_t when, int64_t delta);

And NSEC_PER_SEC is:

#define NSEC_PER_SEC    1000000000  /* nanoseconds per second */
Asked Dec 07 '11 by Abhi Beckert

1 Answer

There are 1,000,000,000 nanoseconds in a second, so I'm going to assume that NSEC_PER_SEC is defined as 1000000000.

  • 4 is of type int
  • 4.0 is of type double

Now, assuming that an int is 32 bits in size, its range is [-2,147,483,648, 2,147,483,647].

4 * NSEC_PER_SEC is 4,000,000,000, and 4,000,000,000 > 2,147,483,647, so the multiplication overflows the int, which is what makes the delay behave as if it were 0.

EDIT: I probably could've worded the above statement better. Signed overflow is technically undefined behavior in C, but on typical two's-complement hardware the 32-bit multiplication wraps around, leaving the int (assuming it's 32 bits in size, as stated above) with the value -294,967,296. A negative delta gives dispatch_time a point in time that's already in the past, and dispatch_after treats a past deadline as "run as soon as possible". That's where the "0" above came from.
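To see the wrap-around in isolation, here is a minimal standalone sketch (my illustration, not from the original answer; compilable as C or Objective-C). The wrapped value shown is what typical two's-complement platforms produce, since the C standard makes no guarantee about signed overflow:

#include <stdio.h>
#include <stdint.h>

int main(void) {
    int seconds = 4;
    int nsec_per_sec = 1000000000; // same value as NSEC_PER_SEC

    // Both operands are 32-bit ints, so the multiplication happens in
    // 32-bit arithmetic and wraps *before* the result widens to int64_t.
    int64_t overflowed = seconds * nsec_per_sec;       // -294967296

    // Widening one operand first keeps the multiplication in 64 bits.
    int64_t correct = (int64_t)seconds * nsec_per_sec; // 4000000000

    printf("overflowed = %lld\n", (long long)overflowed);
    printf("correct    = %lld\n", (long long)correct);
    return 0;
}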

A double, on the other hand, can hold far larger values than a 32-bit int. In fact its 53-bit significand represents every integer up to 2^53 exactly, so 4.0 * NSEC_PER_SEC yields exactly 4,000,000,000, which then converts to the int64_t parameter without loss.
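For completeness, here's a sketch of equivalent fixes: any spelling that forces the multiplication out of 32-bit int arithmetic works, not just the 4.0 literal (the 4LL and (int64_t) variants below are alternatives I'm adding, not from the original post):

NSDate *start = [NSDate date];

// All three deltas evaluate to the full 4000000000 nanoseconds:
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 4.0 * NSEC_PER_SEC);
// ... or: dispatch_time(DISPATCH_TIME_NOW, 4LL * NSEC_PER_SEC);          // 64-bit int literal
// ... or: dispatch_time(DISPATCH_TIME_NOW, (int64_t)4 * NSEC_PER_SEC);   // explicit widening cast

dispatch_after(popTime, dispatch_get_main_queue(), ^{
  NSLog(@"seconds: %f", [start timeIntervalSinceNow]);
});
// output: seconds: -4.0001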

Answered by AusCBloke