Take the following piece of code:
NSError *error;
NSString *myJSONString = @"{ \"foo\" : 0.1}";
NSData *jsonData = [myJSONString dataUsingEncoding:NSUTF8StringEncoding];
NSDictionary *results = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&error];
My question is: is results[@"foo"] an NSDecimalNumber, or something with finite binary precision like a double or float? Basically, I have an application that requires the lossless accuracy that comes with an NSDecimalNumber, and I need to ensure that the JSON deserialization doesn't introduce rounding because of doubles/floats etc.
For example, if the value were interpreted as a float, I'd run into precision problems like this:
float baz = 0.1;
NSLog(@"baz: %.20f", baz);
// prints baz: 0.10000000149011611938
I've tried interpreting foo as an NSDecimalNumber and printing the result:
NSDecimalNumber *fooAsDecimal = results[@"foo"];
NSLog(@"fooAsDecimal: %@", [fooAsDecimal stringValue]);
// prints fooAsDecimal: 0.1
But then I found that calling stringValue on an NSDecimalNumber doesn't print all significant digits anyway, for example:
NSDecimalNumber *barDecimal = [NSDecimalNumber decimalNumberWithString:@"0.1000000000000000000000000000000000000000000011"];
NSLog(@"barDecimal: %@", barDecimal);
// prints barDecimal: 0.1
...so printing fooAsDecimal doesn't tell me whether results[@"foo"] was at some point rounded to finite precision by the JSON parser or not.
To be clear, I realise I could use a string rather than a number in the JSON representation to store the value of foo, i.e. "0.1" instead of 0.1, and then use [NSDecimalNumber decimalNumberWithString:results[@"foo"]]. But what I'm interested in is how the NSJSONSerialization class deserializes JSON numbers, so I know whether this is really necessary or not.
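For reference, that string-based round trip would look something like the following sketch (it illustrates the workaround itself, not what the parser does with plain JSON numbers):

```objc
// Store the value as a JSON string and rebuild the NSDecimalNumber from it,
// bypassing any binary floating-point representation entirely.
NSString *json = @"{ \"foo\" : \"0.1\" }";
NSData *data = [json dataUsingEncoding:NSUTF8StringEncoding];
NSDictionary *results = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
NSDecimalNumber *foo = [NSDecimalNumber decimalNumberWithString:results[@"foo"]];
```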
NSJSONSerialization (and JSONSerialization in Swift) follow this general pattern:

1. Try to parse the number as a long long. If that doesn't overflow, return an NSNumber wrapping the long long.
2. Otherwise, try to parse it with strtod_l. If that doesn't overflow, return an NSNumber wrapping the double.
3. Otherwise, fall back to an NSDecimalNumber, which supports a much larger range of values: a mantissa of up to 38 digits and an exponent between -128 and 127.

If you look at other examples people have posted, you can see that when the value exceeds the range or precision of a double you get an NSDecimalNumber back.
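A minimal probe of this fallback behavior might look like the sketch below. The resulting classes are observed implementation details, not documented guarantees, so they may differ across OS versions:

```objc
// Parse three payloads that should exercise the three tiers described above.
NSArray *payloads = @[@"[42]",       // fits in a long long
                      @"[0.1]",      // representable (with rounding) as a double
                      @"[0.10000000000000000000000000000000000001]"]; // beyond double precision
for (NSString *json in payloads) {
    NSData *data = [json dataUsingEncoding:NSUTF8StringEncoding];
    NSArray *parsed = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
    NSLog(@"%@ -> %@", json, [parsed[0] class]);
}
// If the pattern holds, the first two log an NSNumber subclass and the last
// logs NSDecimalNumber, but don't rely on the exact classes in production code.
```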
The short answer is that you should not serialize to JSON if you require NSDecimalNumber levels of precision. JSON has a single number type, which most implementations parse as a double, and a double has less precision than an NSDecimalNumber.
The long answer, which is of academic interest only because the short answer is also the right answer, is "not necessarily". NSJSONSerialization does sometimes deserialize to NSDecimalNumber, but the behavior is undocumented, and I have not determined exactly which circumstances trigger it. For instance:
NSMutableDictionary *dict = [NSMutableDictionary dictionary];
BOOL boolYes = YES;
int16_t int16 = 12345;
int32_t int32 = 2134567890;
uint32_t uint32 = 3124141341;
unsigned long long ull = 312414134131241413ull;
double dlrep = 1.5;
double dlmayrep = 1.1234567891011127;
float fl = 3124134134678.13;
double dl = 13421331.72348729 * 1000000000000000000000000000000000000000000000000000.0;
long long negLong = -632414314135135234;
unsigned long long unrepresentable = 10765432100123456789ull;
dict[@"bool"] = @(boolYes);
dict[@"int16"] = @(int16);
dict[@"int32"] = @(int32);
dict[@"dlrep"] = @(dlrep);
dict[@"dlmayrep"] = @(dlmayrep);
dict[@"fl"] = @(fl);
dict[@"dl"] = @(dl);
dict[@"uint32"] = @(uint32);
dict[@"ull"] = @(ull);
dict[@"negLong"] = @(negLong);
dict[@"unrepresentable"] = @(unrepresentable);
NSData *data = [NSJSONSerialization dataWithJSONObject:dict options:NSJSONWritingPrettyPrinted error:nil];
NSDictionary *dict_back = (NSDictionary *)[NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingMutableContainers error:nil];
and in the debugger:
(lldb) po [dict_back[@"bool"] class]
__NSCFBoolean
(lldb) po [dict_back[@"int16"] class]
__NSCFNumber
(lldb) po [dict_back[@"int32"] class]
__NSCFNumber
(lldb) po [dict_back[@"ull"] class]
__NSCFNumber
(lldb) po [dict_back[@"fl"] class]
NSDecimalNumber
(lldb) po [dict_back[@"dl"] class]
NSDecimalNumber
(lldb) po [dict_back[@"dlrep"] class]
__NSCFNumber
(lldb) po [dict_back[@"dlmayrep"] class]
__NSCFNumber
(lldb) po [dict_back[@"negLong"] class]
__NSCFNumber
(lldb) po [dict_back[@"unrepresentable"] class]
NSDecimalNumber
So make of that what you will. You should definitely not assume that if you serialize an NSDecimalNumber to JSON that you will get an NSDecimalNumber back out.
But, again, you should not store NSDecimalNumbers in JSON.
I had the same problem, except I'm using Swift 3. I made a patched version of the JSONSerialization class that parses all numbers as Decimal values. It can only parse/deserialize JSON; it does not have any serialization code. It's based on Apple's open source re-implementation of Foundation in Swift.
To answer the question in the title: no, it doesn't; it creates NSNumber objects. You can easily test this:
NSArray *a = @[[NSDecimalNumber decimalNumberWithString:@"0.1"]];
NSData *data = [NSJSONSerialization dataWithJSONObject:a options:0 error:NULL];
a = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
NSLog(@"%@", [a[0] class]);
will print __NSCFNumber.
You can convert that NSNumber object to an NSDecimalNumber with [NSDecimalNumber decimalNumberWithDecimal:[number decimalValue]], but note the warning in the docs for decimalValue:
The value returned isn't guaranteed to be exact for float and double values.
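Given that caveat, one possible workaround (an approach suggested here, not one the docs prescribe) is to convert through the number's string representation rather than through decimalValue:

```objc
NSNumber *number = @(0.1); // stands in for a double-backed NSNumber from the parser

// Direct conversion; may be inexact for float/double-backed numbers per the docs:
NSDecimalNumber *viaDecimal =
    [NSDecimalNumber decimalNumberWithDecimal:[number decimalValue]];

// Conversion via the number's decimal string representation instead:
NSDecimalNumber *viaString =
    [NSDecimalNumber decimalNumberWithString:[number stringValue]];
```

Whether the string route is actually lossless depends on how NSNumber formats the underlying double, so verify it against your own data before relying on it.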