"Unexpected token in JSON at position 0" using JSON.parse in Node with valid JSON

I've been tearing my hair out over this one for hours now.

I have a simple Node server that's making a call to an external API to get a (massive, like 4+ MB) bit of JSON. I'm using about as boilerplate a request as you can get, taken straight from the Node docs:

const http = require('http');

const muniURL = `http://api.511.org/transit/vehiclemonitoring?api_key=${API_KEYS.API_KEY_511}&format=json&agency=sf-muni`;

http.get(muniURL, (res) => {
  const statusCode = res.statusCode;
  const contentType = res.headers['content-type'];
  console.log('Status Code:', statusCode);
  console.log('Content Type:', contentType);

  let error;
  if (statusCode !== 200) {
    error = new Error(`Request Failed.\n` +
                      `Status Code: ${statusCode}`);
  } else if (!/^application\/json/.test(contentType)) {
    error = new Error(`Invalid content-type.\n` +
                      `Expected application/json but received ${contentType}`);
  }
  if (error) {
    console.log(`Request error: ${error.message}`);
    // consume response data to free up memory
    res.resume();
    return;
  }

  res.setEncoding('utf8');
  let rawData = '';
  res.on('data', (chunk) => rawData += chunk);
  res.on('end', () => {
    try {
      const parsedData = JSON.parse(rawData);
      console.log('parsedData:', parsedData);
    } catch (e) {
      console.log(`Caught error: ${e.message}`);
    }
  });
}).on('error', (e) => {
  console.log(`Got error: ${e.message}`);
});

...and every single time, it hits the catch block with: Caught error: Unexpected token  in JSON at position 0. (Note the two spaces between 'token' and 'in' — the mystery token itself seems to be some invisible character.)

I've checked the JSON returned from both Chrome and Postman with two different web-based JSON validators, and it comes back as valid. Meanwhile, writing rawData to a file and hex-dumping it looks something like a buffer(?)...

1fef bfbd 0800 0000 0000 0400 efbf bdef
bfbd efbf bd72 efbf bdc8 b62d efbf bd2b
0c3f 7547 1cef bfbd 00ef bfbd efbf bd0b
efbf bd5b 49ef bfbd 2def bfbd 6c6b efbf
bd5c 55ef bfbd efbf bd44 3fef bfbd 126c
71ef bfbd 021c 2029 6def bfbd 13ef bfbd
efbf bdef bfbd 437f 52ef bfbd 4227 48ef
bfbd efbf bd4d efbf bd31 13ef bfbd 09ef
bfbd 5d2f 7bef bfbd efbf bde5 aa81 745e
efbf bd65 efbf bd31 efbf bdef bfbd efbf
...

...Buffer.isBuffer comes back false.

Thus far I've tried JSON.stringify-ing first, toString-ing, converting to a new Buffer and then stringifying, .trim-ing whitespace, and replace-ing all sorts of escaped characters, all to no avail.

What am I missing here?


EDIT: I realized that I was validating JSON fetched by Chrome and Postman, which apparently are doing some pre-processing of some sort. curling the URL yields a whole bunch of mess that's definitely not JSON. Still left with the questions of what data type that mess actually is, and why I'm not getting JSON when I'm specifically requesting it.
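(Editor's note: a quick way to confirm what curl is actually receiving — a sketch, with `$URL` standing in for the full request URL: hex-dump the first bytes and look for the gzip magic number `1f 8b`, then let curl negotiate and decompress the encoding itself:)

```shell
# Peek at the first raw bytes of the response; gzip streams begin with 1f 8b
curl -s "$URL" | xxd -l 16

# Send Accept-Encoding: gzip and have curl decompress the body itself
curl -s --compressed "$URL" | head -c 200
```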

asked Dec 03 '16 by dangerismycat

1 Answer

It appears api.511.org applies gzip compression to any API call that supplies a valid api_key. It also returns an invalid first character (character code 65279, a UTF-8 byte order mark) at the start of the JSON response.

Here is a workaround:

// Requires the request package: npm install request
var request = require('request');

// Note the backticks: template literals don't interpolate inside single quotes
var apiUrl = `http://api.511.org/transit/vehiclemonitoring?api_key=${API_KEYS.API_KEY_511}&format=json&agency=sf-muni`;
//apiUrl = 'http://ip.jsontest.com/';

request({
    method: 'GET',
    uri: apiUrl,
    gzip: true // have request decompress the gzipped body
}, function(error, response, body) {
    if (error) {
        console.log('Request error:', error.message);
        return;
    }

    //* workaround for issue with this particular apiUrl
    var firstChar = body.substring(0, 1);
    var firstCharCode = body.charCodeAt(0);
    if (firstCharCode == 65279) { // U+FEFF, the byte order mark
        console.log('First character "' + firstChar + '" (character code: ' + firstCharCode + ') is invalid so removing it.');
        body = body.substring(1);
    }
    //*/

    var parsedJson = JSON.parse(body);
    console.log('parsedJson: ', parsedJson);
});
answered Oct 20 '22 by Rocky Sims