I've written a program in Node and Express that uses Request to connect to an API and download a bunch of data (think 3,000 API requests, all within the usage limits of the API, mind you).
When running this in a Docker container, I'm getting a lot of getaddrinfo ENOTFOUND errors, and I'm wondering if this is a resourcing issue. My requests look like this:
request.get(url, function(err, resp, body){
  // do stuff with the body here,
  // like create an object and handball to a worker function
});
For the first few hundred requests this always works fine, but then I get lots and lots of either ENOTFOUND or timeout errors, and I think the issue might be the way my code is dealing with all these requests.
I've batched them in a queue with timeouts so the requests happen relatively slowly. That helps a little, but doesn't solve the problem completely.
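To be concrete, here is a simplified sketch of what I mean by batching with timeouts (not my exact code; `fetchFn` is a stand-in for the `request.get` call, and the delay value is illustrative):

```javascript
// Process URLs one at a time, pausing between requests so they
// don't all fire at once. fetchFn(url) returns a promise with the body.
async function processQueue(urls, fetchFn, delayMs) {
  const results = [];
  for (const url of urls) {
    results.push(await fetchFn(url));
    // Pause before the next request to spread the load out.
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  return results;
}
```

In the real code each result is handed off to a worker function instead of collected in an array, but the pacing works the same way.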
Do I need to destroy the body/response objects to free up memory or something?
I've encountered similar issues with an API I was using, and it turned out to be what some here suggested: rate limits. Some APIs don't return readable errors when you hit their rate limits; they allocate a certain amount of resources per client, and once you've used it all up they can't even send you a proper error response.
This happened even though I stayed within the published per-day rate limits. It turned out they had an unwritten per-minute limit (or, more likely, they simply couldn't process that many requests at once).
I solved it by mocking that API with my own code and placing the mock on the same network to keep conditions as similar as possible. Since my mocked endpoint didn't do any real work, the Node.js server never produced any errors against it, which confirmed the problem was on the API's side.
Then I added delays and timeouts where they were needed.
I suggest you do the same. Remember: just because they publish a per-hour limit doesn't mean they don't also have a separate per-second or per-minute limit.
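If you do discover an unwritten per-minute limit, a simple window counter is enough to stay under it. A hedged sketch (the limit and window values are placeholders; `now` is injectable so it can be tested):

```javascript
// Allow at most `limit` calls per `windowMs`. Returns a function that,
// when asked, says how many milliseconds to wait before the next call:
// 0 means go now; otherwise sleep that long first.
function makeLimiter(limit, windowMs, now = Date.now) {
  let windowStart = now();
  let count = 0;
  return function waitMs() {
    const t = now();
    if (t - windowStart >= windowMs) {
      // The window has rolled over; start counting again.
      windowStart = t;
      count = 0;
    }
    if (count < limit) {
      count++;
      return 0; // still under the limit, call immediately
    }
    return windowStart + windowMs - t; // time left in the current window
  };
}
```

Wrap each request with this (sleeping for whatever `waitMs()` returns) and you can probe for the real limit by tightening it until the ENOTFOUND errors stop.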