
NodeJS - express server, pm2 cluster and nginx balancing - multiple threads

I'm trying to set up an Express server that handles requests in parallel across multiple processes. I've used pm2 to start the application on ports 3000, 3001, 3002 and 3003, but requests are still waiting for each other...
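
The exact pm2 configuration isn't shown here, so the ecosystem file below is only an assumed sketch of how the four instances could have been started. pm2 sets NODE_APP_INSTANCE (0, 1, 2, ...) for each cluster instance, and index.js below uses it to pick its port:

// ecosystem.config.js - assumed sketch, not the original configuration
module.exports = {
  apps: [{
    name: 'index',
    script: './index.js',
    exec_mode: 'cluster',
    instances: 4,      // pm2 exposes NODE_APP_INSTANCE = 0..3 to each process
    env: {
      PORT: 3000       // base port; index.js adds NODE_APP_INSTANCE to it
    }
  }]
};

Started with: pm2 start ecosystem.config.js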

express application index.js:

const express = require('express')
const axios = require('axios')
const app = express()

app.get('/', async (req, res) => {
    console.log('-----> GOT REQUEST -> ' + (new Date()).getTime());
    // simulate some work with an outgoing HTTP call (URL omitted here)
    let resp = await axios.get("here some correct http get");
    res.send("Hello world!")
    console.log('    Response time: ' + (new Date()).getTime());
})

// pm2 sets NODE_APP_INSTANCE per cluster instance, so each process
// listens on its own port: 3000, 3001, 3002, 3003
let instance  = +process.env.NODE_APP_INSTANCE || 0;
let port      = (+process.env.PORT || 3000) + instance;
app.listen(port, () => console.log('Example app listening on port ' + port))

So every instance listens on its own port. Now it's time for nginx:

upstream test_upstream {
    least_conn;
    server 127.0.0.1:3000;
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
    server 127.0.0.1:3003;
}


server {
    listen 8000;

    location / {
        proxy_hide_header Access-Control-Allow-Origin;

        add_header Access-Control-Allow-Origin * always;

        proxy_hide_header Access-Control-Allow-Methods;
        add_header Access-Control-Allow-Methods "GET,POST,DELETE,PUT,OPTIONS" always;
        proxy_hide_header Access-Control-Allow-Headers;
        add_header Access-Control-Allow-Headers "Authorization, X-Requested-With, Content-Type" always;
        proxy_hide_header Access-Control-Allow-Credentials;
        add_header Access-Control-Allow-Credentials "true" always;

        if ($request_method = OPTIONS ) { # Allow CORS
            add_header Access-Control-Allow-Origin *;
            add_header Access-Control-Allow-Methods "GET,POST,DELETE,PUT,OPTIONS";
            add_header Access-Control-Allow-Headers "Authorization, X-Requested-With, Content-Type";
            add_header Access-Control-Allow-Credentials "true" always;
            add_header Content-Length 0;
            add_header Content-Type text/plain;
            add_header Allow GET,POST,DELETE,PUT,OPTIONS;
            return 200;
        }
        proxy_pass http://test_upstream;
    }

}

So far so good. My environment:

  • node v10.3.0
  • 8 CPU cores, but I'm only using 4 instances

OK, the application is started:

┌──────────┬────┬─────────┬───────┬────────┬─────────┬────────┬─────┬───────────┬───────────────┬──────────┐
│ App name │ id │ mode    │ pid   │ status │ restart │ uptime │ cpu │ mem       │ user          │ watching │
├──────────┼────┼─────────┼───────┼────────┼─────────┼────────┼─────┼───────────┼───────────────┼──────────┤
│ index    │ 0  │ cluster │ 57069 │ online │ 6       │ 17m    │ 0%  │ 39.7 MB   │ administrator │ disabled │
│ index    │ 1  │ cluster │ 57074 │ online │ 6       │ 17m    │ 0%  │ 39.0 MB   │ administrator │ disabled │
│ index    │ 2  │ cluster │ 57091 │ online │ 6       │ 17m    │ 0%  │ 37.5 MB   │ administrator │ disabled │
│ index    │ 3  │ cluster │ 57097 │ online │ 6       │ 17m    │ 0%  │ 38.8 MB   │ administrator │ disabled │
└──────────┴────┴─────────┴───────┴────────┴─────────┴────────┴─────┴───────────┴───────────────┴──────────┘

Now it's time to invoke it. I want to send multiple requests at the same time:

async function sendRequest() {
  const startTime = performance.now();
  console.log("Sending request");
  // fire 10 GET requests at once and wait for all of them to finish
  const els = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
  const promises = els.map(() => axios.get("http://my_server_with_nginx:8000"));
  let results = await Promise.all(promises);
  console.log(results);
  const stopTime = performance.now();
  console.log("Time of request: " + (stopTime - startTime));
}

Running the test in the browser shows the requests stalling instead of completing in parallel (screenshot omitted).

And finally, the node app log:

0|index    | -----> GOT REQUEST -> 1527796135425
0|index    |     Response time: 1527796135572
1|index    | -----> GOT REQUEST -> 1527796135595
1|index    |     Response time: 1527796135741
2|index    | -----> GOT REQUEST -> 1527796135766
2|index    |     Response time: 1527796136354
3|index    | -----> GOT REQUEST -> 1527796136381
3|index    |     Response time: 1527796136522
0|index    | -----> GOT REQUEST -> 1527796136547
0|index    |     Response time: 1527796136678
1|index    | -----> GOT REQUEST -> 1527796136702
1|index    |     Response time: 1527796136844
2|index    | -----> GOT REQUEST -> 1527796136868
2|index    |     Response time: 1527796137026
3|index    | -----> GOT REQUEST -> 1527796137098
3|index    |     Response time: 1527796137238
0|index    | -----> GOT REQUEST -> 1527796137263
0|index    |     Response time: 1527796137395
1|index    | -----> GOT REQUEST -> 1527796137419
1|index    |     Response time: 1527796137560

As you can see, the requests are correctly balanced across the nodes, but something is stalling them. How can I force them to run in parallel?

1 Answer

It turns out that everything works just fine. The problem was in the browser: when the browser sends the same HTTP GET request multiple times, the requests are queued. To avoid that, I had to change the invocation from:

const els = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
const promises = els.map(() => axios.get("http://my_server_with_nginx:8000"));

to this:

const els = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9];
const promises = els.map((el) => axios.get(`http://my_server_with_nginx:8000?number=${el}`));
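
Adding the unique query parameter makes every URL distinct, so the browser no longer queues the requests behind each other. As an alternative sketch (not from the original answer, and assuming the queuing is tied to the browser's cache handling of identical URLs), the server could mark the response as non-cacheable:

app.get('/', async (req, res) => {
    // hypothetical alternative: tell the browser not to cache the response,
    // so identical parallel GETs are not held back waiting for a cacheable result
    res.set('Cache-Control', 'no-store');
    res.send("Hello world!");
});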