
Node.js HTTP server with Redis only reaches 6000 req/s

Testing with the node_redis benchmark shows INCR at more than 100000 ops/s:

$ node multi_bench.js   
Client count: 5, node version: 0.10.15, server version: 2.6.4, parser: hiredis  
INCR,     1/5 min/max/avg/p95:    0/   2/   0.06/   1.00   1233ms total, 16220.60 ops/sec  
INCR,    50/5 min/max/avg/p95:    0/   4/   1.61/   3.00    648ms total, 30864.20 ops/sec  
INCR,   200/5 min/max/avg/p95:    0/  14/   5.28/   9.00    529ms total, 37807.18 ops/sec    
INCR, 20000/5 min/max/avg/p95:   42/ 508/ 302.22/ 467.00    519ms total, 38535.65 ops/sec

Then I added Redis to a Node.js HTTP server:

var http = require("http"),
    redis_client = require("redis").createClient(),
    server;

server = http.createServer(function (request, response) {
    response.writeHead(200, {
        "Content-Type": "text/plain"
    });

    redis_client.incr("requests", function (err, reply) {
        response.write(reply + '\n');
        response.end();
    });
}).listen(6666);

server.on('error', function(err){
    console.log(err);
    process.exit(1);
});

Testing with the ab command, it only reaches 6000 req/s:

$ ab -n 10000 -c 100 localhost:6666/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient)
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
Completed 10000 requests
Finished 10000 requests


Server Software:        
Server Hostname:        localhost
Server Port:            6666

Document Path:          /
Document Length:        7 bytes

Concurrency Level:      100
Time taken for tests:   1.667 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Total transferred:      1080000 bytes
HTML transferred:       70000 bytes
Requests per second:    6000.38 [#/sec] (mean)
Time per request:       16.666 [ms] (mean)
Time per request:       0.167 [ms] (mean, across all concurrent requests)
Transfer rate:          632.85 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       2
Processing:    12   16   3.2     15      37
Waiting:       12   16   3.1     15      37
Total:         13   17   3.2     16      37

Percentage of the requests served within a certain time (ms)
  50%     16
  66%     16
  75%     16
  80%     17
  90%     20
  95%     23
  98%     28
  99%     34
 100%     37 (longest request)

Finally, I tested a plain 'hello world' handler (no Redis call); it reached about 7.2K req/s:

Requests per second:    7201.18 [#/sec] (mean)

How can I profile this and figure out why adding Redis to the HTTP server loses performance?

asked Dec 15 '22 by linbo

1 Answer

I think you have misinterpreted the results of the multi_bench benchmark.

First, this benchmark spreads the load over 5 connections, while your node.js program uses only one. More connections mean more communication buffers (allocated on a per-socket basis) and better performance.
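As an illustration, the INCR calls could be spread over several connections with a small round-robin pool (a sketch: `makePool` and the pool size are my own names, and in the real server each entry would be a `require("redis").createClient()` connection):

```javascript
// Round-robin pool: hand out clients in rotation so requests are spread
// over several Redis connections instead of serializing on a single one.
function makePool(clients) {
    var i = 0;
    return {
        next: function () {
            var client = clients[i];
            i = (i + 1) % clients.length;
            return client;
        }
    };
}

// In the HTTP handler of the question's server, the call would become:
//   pool.next().incr("requests", callback);
```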

Then, while a Redis server is able to sustain 100K op/s (provided you open several connections, and/or use pipelining), node.js and node_redis are not able to reach this level. The result of your run of multi_bench shows that when pipelining is not used, only 16K op/s are achieved.

Client count: 5, node version: 0.10.15, server version: 2.6.4, parser: hiredis  
INCR,     1/5 min/max/avg/p95:    0/   2/   0.06/   1.00   1233ms total, 16220.60 ops/sec  

This result means that with no pipelining, and with 5 concurrent connections, node_redis is able to process 16K op/s globally. Please note that measuring a throughput of 16K op/s while only sending 20K ops (the default value in multi_bench) is not very accurate. You should increase num_requests for better accuracy.
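In node_redis versions of that era, the request count is a constant near the top of multi_bench.js (this is an assumption about your copy); raising it gives the benchmark a longer, steadier run:

```javascript
// In multi_bench.js (node_redis): increase the sample size for
// steadier throughput numbers. 20000 is the stock value.
var num_requests = 200000;
```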

The result of your second benchmark is not so surprising: you add an HTTP layer (which is more expensive to parse than the Redis protocol itself), use only 1 connection to Redis while ab tries to open 100 concurrent connections to node.js, and end up at 6K op/s, a 1.2K op/s throughput loss compared to the "Hello world" HTTP server. What did you expect?

You could try to squeeze out a bit more performance by leveraging node.js clustering capabilities, as described in this answer.

answered Dec 18 '22 by Didier Spezia