How to simulate a huge amount of simultaneous requests to a web-server?

I want to see how far my nginx + node.js setup can go and what changes I can make to squeeze out extra performance. I've stumbled on a great article detailing some OS tuning that can be done to withstand more requests (which I'm not sure I completely understand).
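For reference, OS tuning articles of this sort usually revolve around a handful of kernel settings. The values below are illustrative only (the linked article's exact numbers are unknown):

```shell
# /etc/sysctl.conf — typical knobs covered by high-concurrency tuning guides
fs.file-max = 1000000                       # system-wide open file descriptor limit
net.core.somaxconn = 65535                  # max pending connections in the accept queue
net.ipv4.ip_local_port_range = 1024 65535   # widen the ephemeral port range for outgoing connections
net.ipv4.tcp_tw_reuse = 1                   # allow reuse of sockets stuck in TIME_WAIT
```

Settings like these take effect after `sudo sysctl -p`; per-process file descriptor limits (`ulimit -n`) usually need raising as well.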

Say I want to see how it handles 60,000 requests per second for a duration of time.

I've tried apachebench and beeswithmachineguns. apachebench seems to be limited locally to about 3,500 requests per second; raising the concurrency only decreases the average req/s somehow. With beeswithmachineguns I was able to see a (claimed) ~5,000 requests per second against a test page, but that's still nowhere close to what I want, and it seems to be a bit on the buggy side.
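For context, an apachebench run of the sort described might look like this (the URL is a placeholder and the parameters are illustrative, not the exact ones used):

```shell
# apachebench: 100,000 total requests at 1,000 concurrent connections,
# with HTTP keep-alive enabled so connections are reused between requests
ab -k -c 1000 -n 100000 http://your-server.example/
```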

Is there a reliable way to simulate a huge amount of requests like this?

dsp_099 asked Nov 03 '13


1 Answer

You could give siege a try as well.

The article you've linked looks good to me.

Generating 60,000 rq/s and answering them on the same machine will be a problem because you will almost certainly run out of resources. It would be best to have some other computers (ideally on the same network) generate the requests and let your server handle only the answering.

Here's an example siege configuration for your desired 60,000 rq/s that will hit your server for one minute.

# ~/.siegerc

logfile         = $(HOME)/siege.log
verbose         = true
csv             = true
logging         = true
protocol        = HTTP/1.1
chunked         = true
cache           = false
accept-encoding = gzip
benchmark       = true
concurrent      = 60000
connection      = close
delay           = 1
internet        = false
show-logfile    = true
time            = 1M
zero-data-ok    = false
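With that file in place, a run could look like the following (the URL is a placeholder for the server under test; the command-line options mirror the config and can override it):

```shell
# Run siege against the server under test.
#   -c  number of concurrent simulated users
#   -t  test duration (1M = one minute)
siege -c 60000 -t 1M http://your-server.example/
```

Note that pushing a single siege instance to 60,000 concurrent users will itself require raising the generator machine's file descriptor limits.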

If you don't have the infrastructure to generate the load, rent it. One such service is Blitz.IO (I'm not affiliated with them). It has an easy, intuitive interface and, most importantly, can generate nearly any traffic pattern for you.

Fleshgrinder answered Oct 14 '22