 

Why is node.js only processing six requests at a time?

Tags:

node.js

We have a node.js server which implements a REST API as a proxy to a central server which has a slightly different, and unfortunately asymmetric, REST API.

Our client, which runs in various browsers, asks the node server to get the tasks from the central server. The node server gets a list of all the task ids from the central one and returns them to the client. The client then makes two REST API calls per id through the proxy.
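In essence, for each id the proxy just forwards two GETs to the central server, something like this (simplified illustration, not the actual code; the central server's host name is a placeholder):

var http = require('http');

// Simplified sketch: forward one GET to the central server and stream the
// response back to the browser. (Host name is a placeholder.)
function proxyGet(path, clientRes) {
    http.get({
        host: 'central.example.com',
        path: path
    }, function (upstream) {
        upstream.pipe(clientRes);
    }).on('error', function () {
        clientRes.statusCode = 502;
        clientRes.end();
    });
}

// The client ends up triggering two of these per task id, e.g.:
// proxyGet('/api/v1/tasks/id/438', res);
// proxyGet('/api/v1/workflow/id/438', res);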

As far as I can tell, this stuff is all done asynchronously. In the console log, it looks like this when I start the client:

Requested GET URL under /api/v1/tasks/*: /api/v1/tasks/ 

This takes a couple seconds to get the list from the central server. As soon as it gets the response, the server barfs this out very quickly:

Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/438 
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/438 
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/439 
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/439 
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/441 
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/441 

Then, each time a pair of these requests gets a result from the central server, another two lines are barfed out very quickly.

So it seems our node.js server is only willing to have six requests out at a time.

Almo asked Aug 21 '12




1 Answer

There are no TCP connection limits imposed by Node itself. (The whole point is that it's highly concurrent and can handle thousands of simultaneous connections.) Your OS may limit TCP connections.

It's more likely that you're either hitting some kind of limitation of your backend server, or hitting the built-in HTTP library's connection limit, but it's hard to say without more details about that server or your Node implementation.

Node's built-in HTTP library (and, by extension, most libraries built on top of it) maintains a connection pool (via the Agent class) so that it can take advantage of HTTP keep-alive. This improves performance when you're making many requests to the same server: rather than opening a TCP connection, making an HTTP request, getting a response, closing the TCP connection, and repeating, new requests can be issued over reused TCP connections.

In Node 0.10 and earlier, the HTTP Agent will only open 5 simultaneous connections to a single host by default. You can change this easily (assuming you've required the HTTP module as http):

http.globalAgent.maxSockets = 20; // or whatever 

Node 0.12 sets the default maxSockets to Infinity.
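If you'd rather not touch the process-wide globalAgent, you can also give these requests their own Agent with its own cap, for example (host and path here are placeholders):

var http = require('http');

// A dedicated Agent for calls to the central server, with its own socket cap.
// (Passing keepAlive to the Agent constructor works in Node 0.11+/0.12.)
var centralAgent = new http.Agent({ maxSockets: 20, keepAlive: true });

http.get({
    host: 'central.example.com',
    path: '/api/v1/tasks/id/438',
    agent: centralAgent
}, callback);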

You may want to keep some kind of connection limit in place. You don't want to completely overwhelm your backend server with hundreds of HTTP requests in under a second; performance will most likely be worse than if you just let the Agent's connection pool do its thing, throttling requests so as not to overload your server. Your best bet is to run some experiments to see what the optimal number of concurrent requests is in your situation.
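For example, one rough way to experiment is to time the same batch of requests under different maxSockets values (a sketch; host and paths are placeholders):

var http = require('http');

// Time how long a batch of GETs takes with a given socket cap.
function timeBatch(maxSockets, paths, done) {
    var agent = new http.Agent({ maxSockets: maxSockets });
    var start = Date.now();
    var remaining = paths.length;

    paths.forEach(function (path) {
        http.get({ host: 'central.example.com', path: path, agent: agent }, function (res) {
            res.resume(); // drain the body so the socket can be reused
            res.on('end', function () {
                if (--remaining === 0) done(maxSockets, Date.now() - start);
            });
        }).on('error', function () {
            if (--remaining === 0) done(maxSockets, Date.now() - start);
        });
    });
}

// timeBatch(5, taskPaths, function (n, ms) { console.log(n + ' sockets: ' + ms + 'ms'); });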

However, if you really don't want connection pooling, you can bypass the pool entirely by setting agent to false in the request options:

http.get({host:'localhost', port:80, path:'/', agent:false}, callback); 

In this case, there will be absolutely no limit on concurrent HTTP requests.

josh3736 answered Oct 23 '22