Edit(2): Now using db-mysql with the generic-pool module. The error rate has dropped significantly and hovers at 13%, but the throughput is still around 100 req/sec.
Edit(1): After someone suggested that ORDER BY RAND() causes MySQL to be slow, I removed that clause from the query. Node.js now hovers around 100 req/sec, but the server still reports 'CONNECTION error: Too many connections'.
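(Aside: if a random user is still needed, a cheaper approach than ORDER BY RAND(), which forces MySQL to sort the entire table, is to pick a random offset. This is a minimal sketch using the same db-mysql query API as the test code below; the two-step count-then-offset approach and the connected db handle are my assumptions, not part of the original test.)

// Hedged sketch: pick one random row without ORDER BY RAND().
// Assumes a connected db-mysql handle `db`, as in the pooled code below.
db.query('SELECT COUNT(*) AS n FROM tb_users').execute(function(err, rows) {
    if (err) return console.error(err);
    var offset = Math.floor(Math.random() * rows[0]['n']);
    // LIMIT 1 OFFSET x avoids sorting the whole table just to take one row
    db.query('SELECT username FROM tb_users LIMIT 1 OFFSET ' + offset)
      .execute(function(err, rows) {
          if (err) return console.error(err);
          console.log(rows[0]['username']);
      });
});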
You have probably seen many "Hello World" benchmarks of node.js... but "Hello World" tests, even those delayed by 2 seconds per request, are nowhere close to real-world production usage. I also ran those variations of "Hello World" tests with node.js and saw a throughput of about 800 req/sec with a 0.01% error rate. However, I decided to run some tests that were a bit more realistic.
Maybe my tests are incomplete, or most likely something is REALLY wrong with node.js or my test code, so if you're a node.js expert, please help me write some better tests. My results are published below. I used Apache JMeter to do the testing.
The test is pretty simple: a MySQL query selects all users in random order, and the first user's username is retrieved and displayed. The MySQL database connection goes through a unix socket. The OS is FreeBSD 8+, with 8GB of RAM and an Intel Xeon quad-core 2.x GHz processor. I had tuned the Lighttpd configuration a bit before I even came across node.js.
Number of threads (users): 5000 (I believe this is the number of concurrent connections)
Ramp-up period (in seconds): 1
Loop count: 10 (this is the number of requests per user)
Label | # Samples | Average | Min | Max | Std. Dev. | Error % | Throughput | KB/sec | Avg. Bytes
HTTP Requests Lighttpd | 49918 | 2060ms | 29ms | 84790ms | 5524 | 19.47% | 583.3/sec | 211.79 | 371.8
HTTP Requests Node.js | 13767 | 106569ms | 295ms | 292311ms | 91764 | 78.86% | 44.6/sec | 79.16 | 1816
Node.js was so bad I had to stop the test early. [Fixed: tested completely]
Node.js reports "CONNECTION error: Too many connections" on the server. [Fixed]
Most of the time, Lighttpd had a throughput of about 1200 req/sec.
However, node.js had a throughput of about 29 req/sec. [Fixed: now at 100 req/sec]
var cluster = require('cluster'),           // LearnBoost's cluster module
    http = require('http'),
    mysql = require('db-mysql'),
    generic_pool = require('generic-pool');

// Pool of up to 10 MySQL connections, shared across requests
var pool = generic_pool.Pool({
    name: 'mysql',
    max: 10,
    create: function(callback) {
        new mysql.Database({
            socket: '/tmp/mysql.sock',
            user: 'root',
            password: 'password',
            database: 'v3edb2011'
        }).connect(function(err) {
            // inside db-mysql's connect callback, 'this' is the database handle
            callback(err, this);
        });
    },
    destroy: function(db) {
        db.disconnect();
    }
});

var server = http.createServer(function(request, response) {
    response.writeHead(200, {'Content-Type': 'text/html'});
    pool.acquire(function(err, db) {
        if (err) {
            return response.end('CONNECTION error: ' + err);
        }
        db.query('SELECT * FROM tb_users').execute(function(err, rows, columns) {
            pool.release(db);   // hand the connection back to the pool
            if (err) {
                return response.end('QUERY ERROR: ' + err);
            }
            response.write(rows.length + ' ROWS found using node.js<br />');
            response.end(rows[0]['username']);
        });
    });
});

cluster(server)
    .set('workers', 5)
    .listen(8080);
<?php
$conn = new mysqli('localhost', 'root', 'password', 'v3edb2011');
if (!$conn->connect_error) {    // new mysqli() always returns an object, so check connect_error
    $result = $conn->query('SELECT * FROM tb_users ORDER BY RAND()');
    if ($result) {
        echo ($result->num_rows).' ROWS found using Lighttpd + PHP (FastCGI)<br />';
        $row = $result->fetch_assoc();  // only the first row is fetched
        echo $row['username'];
    } else {
        echo 'Error : DB Query';
    }
} else {
    echo 'Error : DB Connection';
}
?>
This is a bad benchmark comparison. In node.js you're selecting the whole table and buffering it in an array, while in PHP you're only fetching the first row, so the bigger your table is, the slower node will look. If you made PHP use mysqli_fetch_all, it would be a similar comparison. While db-mysql is supposed to be fast, it's not very full-featured and lacks the ability to make this a fair comparison. Using a different node.js module like node-mysql-libmysqlclient should allow you to process only the first row.
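For illustration, here is a minimal sketch of how the node.js handler could be made comparable without switching modules, assuming the goal is to transfer only a single row as the PHP version effectively displays (the LIMIT 1 query is my assumption, not part of the original test):

// Hedged sketch: fetch one row instead of buffering the whole table.
// Note this drops the "N ROWS found" line, since only one row is returned.
db.query('SELECT username FROM tb_users LIMIT 1').execute(function(err, rows, columns) {
    pool.release(db);
    if (err) {
        return response.end('QUERY ERROR: ' + err);
    }
    // only one row crosses the wire, matching what the PHP version displays
    response.end(rows[0]['username']);
});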
100 is MySQL's default value for the maximum number of connections (the max_connections setting).
So somehow your connections aren't being reused for different requests. Probably you already have one query running on each connection.
Maybe the node.js MySQL library you are using doesn't queue queries on the same MySQL connection, but instead tries to open another connection and fails.
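One way to check the reuse theory is to ask MySQL itself how many threads are connected while the benchmark runs. This is a hedged sketch reusing the pooled db-mysql handle from the test code above; the SHOW STATUS query is standard MySQL, but the wiring around it is my assumption:

// Hedged sketch: log how many client threads MySQL currently sees.
// If this climbs toward max_connections under load, connections are
// being opened per request instead of reused from the pool.
pool.acquire(function(err, db) {
    if (err) return console.error('acquire failed: ' + err);
    db.query("SHOW STATUS LIKE 'Threads_connected'").execute(function(err, rows) {
        pool.release(db);
        if (err) return console.error('query failed: ' + err);
        console.log('Threads_connected = ' + rows[0]['Value']);
    });
});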