I have a Node.js app that is receiving JSON data strings that contain the literal NaN, like:
"[1, 2, 3, NaN, 5, 6]"
This crashes JSON.parse(...) in Node.js. I'd like to parse it into an object if I can.
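For reference, a minimal reproduction of the failure (the exact error message varies by engine and version):
try {
  JSON.parse("[1, 2, 3, NaN, 5, 6]");
} catch (e) {
  console.log(e.name); // "SyntaxError" -- the bare NaN is rejected by the parser
}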
I know NaN is not part of the JSON spec. Most SO links (sending NaN in json) suggest fixing the output.
Here, though, the data is produced on a server I don't control, by a commercial Java library whose source code I can see. It generates the output with Google's Gson library:
private Gson gson = (new GsonBuilder().serializeSpecialFloatingPointValues().create());
...
gson.toJson(data[i], Vector.class, jsonOut)
So that seems like a legitimate source. And the Gson API Javadoc suggests I should be able to parse it:
Section 2.4 of JSON specification disallows special double values (NaN, Infinity, -Infinity). However, Javascript specification (see section 4.3.20, 4.3.22, 4.3.23) allows these values as valid Javascript values. Moreover, most JavaScript engines will accept these special values in JSON without problem. So, at a practical level, it makes sense to accept these values as valid JSON even though JSON specification disallows them.
Despite that, this fails in both Node.js and Chrome: JSON.parse('[1,2,3,NaN,"5"]')
Is there a flag to set in JSON.parse(), or an alternative parser that accepts NaN as a literal?
I've been Googling for a while but can't seem to find a doc on this issue.
Have a node.js app that is receiving JSON data strings that contain the literal NaN…

Then your Node.js app isn't receiving JSON; it's receiving text that's vaguely JSON-like. NaN is not a valid JSON token.
Three options:

1. Get the source to send valid JSON

This is obviously the preferred course. The data is not JSON; that should be fixed, which would fix your problem.
2. Handle the NaN in a simple-minded way

You could replace it with null before parsing it, e.g.:
var result = JSON.parse(yourString.replace(/\bNaN\b/g, "null"));
...and then handle nulls in the result. But that's very simple-minded; it doesn't allow for the possibility that the characters NaN might appear in a string somewhere.
Alternately, spinning off Matt Ball's reviver idea (now deleted), you could change it to a special string (like "***NaN***") and then use a reviver to replace that with the real NaN:
var result = JSON.parse(yourString.replace(/\bNaN\b/g, '"***NaN***"'), function(key, value) {
    return value === "***NaN***" ? NaN : value;
});
...but that has the same issue of being a bit simple-minded: it assumes the character sequence NaN never appears anywhere it shouldn't be replaced (for instance, inside a string).
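To make the caveat concrete (the sample string here is made up, not your actual data):
var s = '["temperature was NaN today"]';                 // NaN inside a string value
console.log(JSON.parse(s.replace(/\bNaN\b/g, "null")));  // [ 'temperature was null today' ]
// The string's contents were silently corrupted. The "***NaN***" variant is no better here:
// it produces '["temperature was "***NaN***" today"]', which isn't valid JSON and throws.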
3. Use eval

If you know and trust the source of this data and there's NO possibility of it being tampered with in transit, then you could use eval to parse it instead of JSON.parse. Since eval allows full JavaScript syntax, including NaN, that works. Hopefully I made the caveat bold enough for people to understand that I would only recommend this in a very, very, very tiny percentage of situations. But again, remember that eval allows arbitrary execution of code, so if there's any possibility of the string having been tampered with, don't use it.
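For completeness, a sketch of that option, assuming yourString is the same variable as above and, again, only for fully trusted input:
// DANGER: eval runs arbitrary code; use only when the source is completely trusted.
// Wrapping in parentheses lets object literals be evaluated as expressions too.
var result = eval('(' + yourString + ')');
console.log(result); // e.g. [ 1, 2, 3, NaN, 5, 6 ]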
When you deal with just about anything mathematical, or with industrial data, NaN is terribly convenient (and often the infinities are too). And it has been an industry standard since IEEE 754.
That's obviously why some libraries, notably Gson, let you include these values in the JSON they produce, losing standard purity and gaining sanity.
Reviver and regex solutions aren't reliably usable in a real project when you exchange complex, dynamic objects.
And eval has problems too: one of them is that it's prone to crashing on IE when the JSON string is big, another is the security risk.
That's why I wrote a specific parser (used in production): JSON.parseMore
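For readers who just want the general idea rather than the library itself, here is a minimal sketch of that approach (this is not the actual JSON.parseMore source; parseTolerant and the placeholder strings are names made up for illustration). It rewrites NaN/Infinity tokens only when they occur outside string literals, then restores them with a reviver:
// Sketch only, NOT the actual JSON.parseMore implementation.
function parseTolerant(text) {
  var out = '';
  var inString = false;
  for (var i = 0; i < text.length; i++) {
    var ch = text[i];
    if (inString) {
      out += ch;
      if (ch === '\\') { out += text[++i]; }      // copy escaped character verbatim
      else if (ch === '"') { inString = false; }
    } else if (ch === '"') {
      inString = true;
      out += ch;
    } else if (text.startsWith('NaN', i)) {
      out += '"***NaN***"'; i += 2;               // skip the 3-char token
    } else if (text.startsWith('-Infinity', i)) {
      out += '"***-Inf***"'; i += 8;
    } else if (text.startsWith('Infinity', i)) {
      out += '"***Inf***"'; i += 7;
    } else {
      out += ch;
    }
  }
  return JSON.parse(out, function (key, value) {
    if (value === '***NaN***') return NaN;
    if (value === '***-Inf***') return -Infinity;
    if (value === '***Inf***') return Infinity;
    return value;
  });
}

console.log(parseTolerant('{"a": NaN, "note": "NaN inside a string", "b": -Infinity}'));
// { a: NaN, note: 'NaN inside a string', b: -Infinity }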