I am running a Node.js server on port 8080, and it can only process one request at a time: if I send multiple requests in one shot, the new requests are queued and executed sequentially, one after another.
What I am trying to find out is how to run multiple instances/threads of this process, like gunicorn does for Python servers. Is there something similar, instead of running the Node.js server on multiple ports, one per instance?
I have placed nginx in front of the Node.js process with the config below. Is that a sufficient and recommended approach?
worker_processes auto;
worker_rlimit_nofile 1100;

events {
    worker_connections 1024;
    multi_accept on;
    use epoll;
}

pid /var/run/nginx.pid;

http {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;

    server {
        listen 80;
        server_name localhost;
        access_log /dev/null;
        error_log /dev/null;

        location / {
            proxy_pass http://localhost:8080;
        }
    }
}
Where Node.js really shines is in building fast, scalable network applications: it is capable of handling a huge number of simultaneous connections with high throughput, which equates to high scalability.
Node.js offers easy scalability. With Node.js, you'll have no problem scaling the application horizontally across multiple servers, or vertically to increase its performance on a single server.
Scale across multiple machines with a load balancer: scaling your Node.js application horizontally across multiple machines is similar to scaling across multiple cores on a single machine. As long as your application can run as an independent process, it can be distributed to run across several machines.
First off, make sure your node.js process is ONLY using asynchronous I/O. If it's not compute intensive and using asynchronous I/O, it should be able to have many different requests "in-flight" at the same time. The design of node.js is particularly good at this if your code is designed properly. If you show us the crux of what it is doing on one of these requests, we can advise more specifically on whether your server code is designed properly for best throughput.
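As a rough illustration of that point (the file path and port here are placeholders, not anything from your code), a handler built on a synchronous call produces exactly the one-request-at-a-time behaviour described in the question, while the asynchronous equivalent leaves the event loop free to accept and serve other requests while the I/O is pending:

// Hypothetical handler: the commented-out readFileSync line blocks the event
// loop, so requests queue up; the readFile version keeps many requests in flight.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
    // Blocking (bad): nothing else runs until this read finishes.
    // const data = fs.readFileSync('./data.json');

    // Non-blocking: the event loop stays free while the read is pending.
    fs.readFile('./data.json', (err, data) => {
        if (err) {
            res.statusCode = 500;
            return res.end('read failed');
        }
        res.end(data);
    });
}).listen(8080);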
Second, instrument and measure, measure, measure. Understand where your bottlenecks are in your existing node.js server and what is causing the delay or sequencing you see. Sometimes there are ways to dramatically fix/improve your bottlenecks before you start adding lots more clusters or servers.
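A very rough sketch of that kind of instrumentation, assuming the plain http module and port 8080 from the question (real instrumentation would log somewhere more structured than the console): time each request on the response's 'finish' event and look for the slow paths.

// Hypothetical per-request timing to find where the time is going.
const http = require('http');

http.createServer((req, res) => {
    const start = process.hrtime.bigint();
    res.on('finish', () => {
        const ms = Number(process.hrtime.bigint() - start) / 1e6;
        console.log(`${req.method} ${req.url} took ${ms.toFixed(1)} ms`);
    });

    // ... existing request handling goes here ...
    res.end('ok');
}).listen(8080);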
Third, use the node.js cluster module. This creates one master node.js process that automatically balances connections between several child processes. You generally want to create one cluster child per CPU core in your server, since that will get the most use out of your CPUs.
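A minimal sketch of the cluster module, assuming the server from the question listens on port 8080 (the handler body is just a placeholder):

// Master process forks one worker per CPU core; workers share port 8080.
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) { // cluster.isPrimary on newer Node versions
    // Fork one worker per core and replace any worker that dies.
    for (let i = 0; i < os.cpus().length; i++) {
        cluster.fork();
    }
    cluster.on('exit', (worker) => {
        console.log(`worker ${worker.process.pid} exited, forking a replacement`);
        cluster.fork();
    });
} else {
    // Each worker runs the same server; the master distributes connections.
    http.createServer((req, res) => {
        res.end(`handled by pid ${process.pid}\n`);
    }).listen(8080);
}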
Fourth, if you need to scale to the point of multiple actual server computers, then you would use either a load balancer or a reverse proxy such as nginx to share the load among multiple hosts. If each server had a quad-core CPU, you could run a cluster of four node.js processes on each server computer and then use nginx to balance among the several server boxes you had.
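In nginx terms, that means an upstream block listing the hosts inside the http block of the config shown in the question; the addresses below are placeholders for your own server boxes:

upstream node_backends {
    # One entry per server box, each running its own node.js cluster on 8080.
    server 10.0.0.11:8080;
    server 10.0.0.12:8080;
}

server {
    listen 80;
    location / {
        proxy_pass http://node_backends;
    }
}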
Note that adding multiple hosts that are load balanced by nginx is the last option here, not the first option.
Like @poke said, you would use a reverse proxy and/or a load balancer in front.
But if you want software that runs multiple instances of Node for you, with load balancing and other features, you should check out pm2:
http://pm2.keymetrics.io/
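For example (assuming your entry point is app.js; the name is a placeholder), pm2's cluster mode forks one instance per CPU core and lets you reload them without downtime:

# Start app.js in cluster mode, one instance per core.
pm2 start app.js -i max

# Inspect the running instances.
pm2 list

# Zero-downtime reload of the app (pm2 names it after the script by default).
pm2 reload app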