How to queue http get requests in Nodejs in order to control their rate?

Tags:

node.js

I have a NodeJS app which sends HTTP GET requests from various places in the code; some of them are even dependent (send a request, wait for the reply, process it, and based on the result send another request). I need to limit the rate of the requests (e.g., 10 requests per hour).

I thought about queuing the requests and then releasing them in a controlled manner from some central point, but got stuck on how to queue the callback functions together with their parameters.

I would be happy to hear suggestions on how to handle this scenario with minimal restructuring of the app.

Thanks

asked May 26 '12 by user971956


2 Answers

I think that you have already answered your question. A central queue that can throttle your requests is the way to go. The only problem is that the queue has to hold the full information for each request plus the callback(s) that should be used. I would abstract this in a QueueableRequest object that could look something like this:

var QueueableRequest = function(url, params, httpMethod, success, failure){
  this.url = url;
  this.params = params;
  this.httpMethod = httpMethod;
  this.success = success;
  this.failure = failure;
}
//Then you can queue your request with

queue.add(new QueueableRequest(
  "api.test.com",
  {"test": 1},
  "GET",
  function(data){ console.log('success'); },
  function(err){ console.log('error'); }
));

Of course this is just sample code that could be much prettier, but I hope you get the picture.
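For illustration, here is a minimal sketch of such a central queue, assuming the QueueableRequest shape above, Node's built-in http module, and a simple interval-based drain (the RequestQueue name, the GET-only handling, and the 6-minute spacing for roughly 10 requests per hour are illustrative choices):

var http = require('http');

function RequestQueue(intervalMs) {
  this.intervalMs = intervalMs; // e.g. 6 * 60 * 1000 for ~10 requests per hour
  this.items = [];
  this.timer = null;
}

// Add a QueueableRequest and start draining if the queue is idle.
RequestQueue.prototype.add = function (request) {
  this.items.push(request);
  if (!this.timer) this._drain();
};

// Send the next queued request, then wait intervalMs before sending another.
RequestQueue.prototype._drain = function () {
  var self = this;
  var request = this.items.shift();
  if (!request) { this.timer = null; return; }

  http.get({ host: request.url, path: '/' }, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () { request.success(body); });
  }).on('error', request.failure);

  this.timer = setTimeout(function () { self._drain(); }, this.intervalMs);
};

var queue = new RequestQueue(6 * 60 * 1000);

Dependent requests still work with this pattern: inside a success callback you simply call queue.add() with the follow-up request, and it gets released at the same controlled rate.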

answered Oct 03 '22 by topek


The Async module has a number of control flow options that could help you. Its queue sounds like a good fit, since it lets you limit concurrency.
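For example, a rough sketch with async.queue (note that async.queue caps concurrency rather than requests per hour, so the setTimeout spacing in the worker below is an added assumption to approximate a 10-requests-per-hour limit; the host and path are placeholders):

var async = require('async');
var http = require('http');

// Worker: perform one GET, hand the body to the task's callback,
// then wait before telling async the slot is free again.
var queue = async.queue(function (task, done) {
  http.get({ host: task.host, path: task.path || '/' }, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      task.callback(null, body);
      setTimeout(done, 6 * 60 * 1000); // ~10 requests per hour
    });
  }).on('error', function (err) {
    task.callback(err);
    setTimeout(done, 6 * 60 * 1000);
  });
}, 1); // concurrency of 1: only one request in flight at a time

// Push tasks from anywhere in the app.
queue.push({
  host: 'api.test.com',
  path: '/test',
  callback: function (err, data) {
    if (err) return console.log('error', err);
    console.log('success', data.length);
  }
});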

answered Oct 03 '22 by Wes Johnson