Alright, I've searched everywhere and can't seem to find a detailed resource on how to interpret the results from Apache's ab server benchmarking tool. I've run several tests with what I thought were drastically different parameters, but have seen very similar results (I have a hard time believing this means my site is scaling perfectly!). If someone could point me to a detailed resource on understanding the results from this test, or feels like writing one here, I think it would be very useful to me and others.
Apache Bench (ab) is a load-testing and benchmarking tool for Hypertext Transfer Protocol (HTTP) servers. It can be run from the command line and is very simple to use; a quick load-test result can be obtained in just one minute.
ab is a tool for benchmarking your Apache Hypertext Transfer Protocol (HTTP) server. It is designed to give you an impression of how your current Apache installation performs. This especially shows you how many requests per second your Apache installation is capable of serving.
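As a minimal example (the URL is a placeholder for whatever page you want to hammer), a basic run looks like this:

# -n 100: 100 total requests; -c 10: issue 10 of them concurrently
ab -n 100 -c 10 http://www.example.com/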
Frustrating, isn't it? I am trying to do the same thing, see how my newly provisioned and configured dedicated server compares to others.
What I am ending up doing is comparing my current production server (dual core, 4GB RAM) to the new server (quad core, 8GB RAM).
I need to 'play nice' with my side-by-side comparisons, as the production server is live and I don't want to 'break' it for my users.
Comparing the current vs. the new with the following command, against a PHP page that just calls phpinfo():

ab -kc 20 -t 60
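That's keep-alive enabled, 20 concurrent requests, for at most 60 seconds. Note that per the ab man page, -t also implies an internal cap of -n 50000, which is why the 'Complete requests' counts below top out at 50000. Spelled out, with a placeholder URL standing in for my test page:

# -k     enable HTTP keep-alive
# -c 20  20 concurrent requests at a time
# -t 60  run for at most 60 seconds (internally implies -n 50000)
ab -k -c 20 -t 60 http://www.example.com/phpinfo.php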
On my current production server, I see something like the following, where it couldn't complete all the requests in the given amount of time:
Time taken for tests:   60.1234 seconds
Complete requests:      24538
Failed requests:        58
   (Connect: 0, Length: 58, Exceptions: 0)
Requests per second:    408.96 [#/sec] (mean)
Time per request:       48.905 [ms] (mean)
Time per request:       2.445 [ms] (mean, across all concurrent requests)
vs. the following on the new server, which completed all the requests in half the time:
Time taken for tests:   29.838791 seconds
Complete requests:      50000
Failed requests:        11
   (Connect: 0, Length: 11, Exceptions: 0)
Requests per second:    1675.67 [#/sec] (mean)
Time per request:       11.936 [ms] (mean)
Time per request:       0.597 [ms] (mean, across all concurrent requests)
Now, this isn't really a 'fair' test, as the current server is handling 20 websites in addition to the benchmark. Also, it is really only testing Apache & PHP.
Putting this same test against one of my more complex home pages, one that 'feels' slow on the current server, I see the following:

Current Server:
Time taken for tests:   60.14170 seconds
Complete requests:      510
Requests per second:    8.50 [#/sec] (mean)
Time per request:       2353.497 [ms] (mean)
Time per request:       117.675 [ms] (mean, across all concurrent requests)
New Server:
Time taken for tests:   60.18651 seconds
Complete requests:      1974
Requests per second:    32.89 [#/sec] (mean)
Time per request:       608.092 [ms] (mean)
Time per request:       30.405 [ms] (mean, across all concurrent requests)
This test loads a Joomla CMS dynamically generated page, so it is a bit more of a 'real world' test. Again, the new server isn't dealing with current site traffic, so it's not an apples-to-apples comparison. I don't want to test much harder, or I risk my end users' experience on my sites.
After migrating the sites to the new server, I plan on running the above tests again so I can see what effect my regular site traffic has on the benchmarks: the same machine's production vs. idle results.
Now, I am also looking at stressing the new server and making sure that it reacts well. Running the command ab -n 50000 -c 200, I watch the top command to see how much CPU and memory are being used, while also refreshing (F5) the page in my browser to see if I get any errors and to get a feel for how long the server takes to respond.
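For reference, the rough shape of the setup, in two terminals (the URL is a stand-in for the home page under test):

# Terminal 1: the stress test itself
ab -n 50000 -c 200 http://www.example.com/

# Terminal 2: watch CPU, memory, and load average while it runs
top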
My first test gave me:
Concurrency Level:      200
Time taken for tests:   692.160011 seconds
Complete requests:      50000
Failed requests:        30102
   (Connect: 0, Length: 30102, Exceptions: 0)
Write errors:           0
Non-2xx responses:      30102
Total transferred:      456568770 bytes
HTML transferred:       442928962 bytes
Requests per second:    72.24 [#/sec] (mean)
Time per request:       2768.640 [ms] (mean)
Time per request:       13.843 [ms] (mean, across all concurrent requests)
Transfer rate:          644.17 [Kbytes/sec] received
Note the very high failed request rate. My Apache is set to a max of 250 simultaneous requests, but my MySQL was capped at only 175 connections. MySQL was the failure point here: it couldn't process all the requests coming from Apache. My browser page loads were giving me a MySQL connection error page on many refreshes.
So, I bumped MySQL up to 300 simultaneous connections. (I had actually made the change already, but had forgotten to restart MySQL, so this turned out to be a good test: I had identified a needed change, and accidentally ran an empirical test validating its necessity.)
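For anyone retracing this, the knob is MySQL's max_connections setting; the change is roughly the following, though the config file location and service name vary by distro, so treat this as a sketch:

# /etc/my.cnf (location varies by distro)
[mysqld]
max_connections = 300

# The new value only takes effect after a restart -- the step I had missed:
service mysqld restart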
The next run gave me the following results:
Concurrency Level:      200
Time taken for tests:   1399.999463 seconds
Complete requests:      50000
Failed requests:        5054
   (Connect: 0, Length: 5054, Exceptions: 0)
Write errors:           0
Non-2xx responses:      5054
Total transferred:      1016767290 bytes
HTML transferred:       995713274 bytes
Requests per second:    35.71 [#/sec] (mean)
Time per request:       5599.998 [ms] (mean)
Time per request:       28.000 [ms] (mean, across all concurrent requests)
Transfer rate:          709.24 [Kbytes/sec] received
This took over twice as long, but the failed request rate was much, much lower. Basically, the server is now configured to handle at least 200 simultaneous page views of one of my sites' home pages, but it will take about 5 seconds per page to serve them. Not great, but much better than the MySQL errors I was getting previously.
During all of this, my server's CPU usage pegs at 100%, with the load average hovering upwards of 180. MySQL uses about 8-9% of the CPU and isn't using much of the RAM I've allotted it, since I am just repeatedly hammering the same page, so it's only dealing with a single database: about 400MB of the 4GB+ it is configured to grow into. top shows buffers and cached memory at about 50% of the total available RAM. So while I am loading the machine up with this test, it's not getting near the overloaded point memory-wise. Under real-world database usage, MySQL should grab much more of the memory I've allotted it, so the server should be pretty close to full load at that point.
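For anyone wanting to watch the same numbers while a test runs, the stock tools are enough; roughly:

# Load average at a glance
uptime

# How RAM is split between applications, buffers, and cache
free -m

# MySQL's own view of its threads and query counts (prompts for password)
mysqladmin -u root -p status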
My next test was to test Apache at its 'full load' of 250 connections:

ab -n 50000 -c 250
Concurrency Level:      250
Time taken for tests:   1442.515514 seconds
Complete requests:      50000
Failed requests:        3509
   (Connect: 0, Length: 3509, Exceptions: 0)
Write errors:           0
Non-2xx responses:      3509
Total transferred:      1051321215 bytes
HTML transferred:       1029809879 bytes
Requests per second:    34.66 [#/sec] (mean)
Time per request:       7212.577 [ms] (mean)
Time per request:       28.850 [ms] (mean, across all concurrent requests)
Transfer rate:          711.73 [Kbytes/sec] received
This shows results similar to the 200-connection test with the proper MySQL connection cap, which is good, I think. I don't like the 7 seconds to return a page, but I believe I can improve that at the Joomla level by enabling caching in Joomla with either APC or Memcache, both of which are installed but not yet utilized by Joomla.
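As a sketch of the APC route, assuming a standard PECL install (the directive names are APC's own; the size is just a starting point to tune):

; php.ini -- load and enable the APC cache
extension = apc.so
apc.enabled = 1
apc.shm_size = 64M   ; older APC builds expect a bare MB count, e.g. 64

Joomla then needs its cache handler switched over in its Global Configuration before it will actually use it.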
Trying to push my luck, I thought I would try 300 simultaneous connections:

ab -n 50000 -c 300

The browser shows a long wait before a quick page load. Otherwise, not much change in the results.
Concurrency Level:      300
Time taken for tests:   1478.35890 seconds
Complete requests:      50000
Failed requests:        2266
   (Connect: 0, Length: 2266, Exceptions: 0)
Write errors:           0
Non-2xx responses:      2266
Total transferred:      1079120910 bytes
HTML transferred:       1057241646 bytes
Requests per second:    33.83 [#/sec] (mean)
Time per request:       8868.215 [ms] (mean)
Time per request:       29.561 [ms] (mean, across all concurrent requests)
Transfer rate:          712.99 [Kbytes/sec] received
I don't know if my interpretation of these results is 'right', or if I am missing something valuable, but given the lack of instruction I could find, this is what I came up with.
I just used the results to make sure that I got a good response rate. The lack of a perfect response rate concerns me, but I don't know how to see or reproduce the failures in a way that lets me inspect them.
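For what it's worth, ab's man page describes a -v verbosity flag: at level 3 and above it prints each request's response code, which would at least make the failing requests visible (placeholder URL, and a smaller run so the output stays readable):

# Print per-request response codes so non-2xx failures become visible
ab -v 3 -n 1000 -c 50 http://www.example.com/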
The slow time per request also concerns me, but I think I can address much of that at the application layer.
I am confident that while the server would slow to a crawl, it could handle a heavy load situation.
Looking at other performance tuning tools like MonYog after these benchmarking tests has also shown me that my current configurations are 'good enough'.
I wish there were a place where people posted the results of reproducible tests, along with hardware descriptions and software configs, so I would know whether I am 'competitive' or whether I still have a lot of work to do to best utilize my equipment. Hence why I am posting my results.
Please note that for the "Failed requests" line, a failed request is determined by comparing the length of each response against that of the first response. For a dynamic website, this doesn't necessarily mean that the request failed at all! So don't worry too much about the Failed requests line.
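Newer versions of ab also accept a -l flag that tolerates variable-length responses instead of counting them as Length failures; if your ab has it, the numbers for dynamic pages become more meaningful:

# -l: accept responses of varying length (available in newer ab releases)
ab -l -n 1000 -c 50 http://www.example.com/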
See also: http://www.celebrazio.net/tech/unix/apache_bench.html