I have a NodeJS app which sends HTTP GET requests from various places in the code; some are even dependent (sending a request, waiting for the reply, processing it, and based on the results sending another request). I need to limit the rate of the requests (e.g., 10 requests per hour).
I thought about queuing the requests and then at some central point releasing them in a controlled manner, but got stuck at how to queue the callback functions and their dependent parameters.
Would be happy to hear suggestions on how to overcome this scenario with minimal restructuring of the app.
Thanks
I think that you have answered your question already. A central queue that can throttle your requests is the way to go. The only problem here is that the queue has to have the full information for the request and the callback(s) that should be used. I would abstract this into a QueueableRequest object that could look something like this:
// Holds everything needed to replay the request later: target, payload,
// HTTP verb and the callbacks to invoke when it eventually runs.
var QueueableRequest = function(url, params, httpMethod, success, failure){
  this.url = url;
  this.params = params;
  this.httpMethod = httpMethod;
  this.success = success;
  this.failure = failure;
};
// Then you can queue your request with
queue.add(new QueueableRequest(
  "api.test.com",
  {"test": 1},
  "GET",
  function(data){ console.log('success'); },
  function(err){ console.log('error'); }
));
Of course this is just sample code that could be much prettier, but I hope you get the picture.
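For the central queue itself, a minimal sketch could hold the pending QueueableRequest objects in an array and release one per interval. The RequestQueue name, the 360000 ms interval, and the use of Node's built-in https module are assumptions for illustration; swap in whatever HTTP client the app already uses:

var https = require('https');

// Minimal rate-limited queue: releases at most one request per `interval` ms
// (360000 ms is roughly 10 requests per hour). Names are illustrative.
var RequestQueue = function(interval){
  this.interval = interval;
  this.pending = [];
  this.timer = null;
};

RequestQueue.prototype.add = function(request){
  this.pending.push(request);
  if (!this.timer) this._drain();          // start the pump if it is idle
};

RequestQueue.prototype._drain = function(){
  var req = this.pending.shift();
  if (!req) { this.timer = null; return; } // nothing left, stop the timer
  // Assumes req.url is a full URL, e.g. 'https://api.test.com/...'
  https.get(req.url, function(res){
    var body = '';
    res.on('data', function(chunk){ body += chunk; });
    res.on('end', function(){ req.success(body); });
  }).on('error', req.failure);
  this.timer = setTimeout(this._drain.bind(this), this.interval);
};

var queue = new RequestQueue(360000);      // ~10 requests per hour

The nice part of keeping the callbacks inside the queued object is that the calling code stays almost unchanged: it just hands its existing success/failure handlers to the queue instead of invoking the HTTP client directly.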
The Async module has a number of control flow options that could help you. queue sounds like a good fit, since it lets you limit concurrency.
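A minimal sketch, assuming the async package is installed: note that async.queue caps how many requests run at once rather than requests per hour, so to enforce an hourly rate the worker below also holds its slot with a delay before signalling completion. The 360000 ms delay and the task shape are assumptions.

var async = require('async');
var https = require('https');

// Worker processes one task at a time (concurrency = 1) and waits before
// calling done, which spaces the requests out over time.
var q = async.queue(function(task, done){
  https.get(task.url, function(res){
    var body = '';
    res.on('data', function(chunk){ body += chunk; });
    res.on('end', function(){
      task.success(body);
      setTimeout(done, 360000);   // hold the slot to approximate 10/hour
    });
  }).on('error', function(err){
    task.failure(err);
    setTimeout(done, 360000);
  });
}, 1);

// Usage: push a request together with its callbacks.
q.push({
  url: 'https://api.test.com/',
  success: function(data){ console.log('success'); },
  failure: function(err){ console.log('error'); }
});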