 

Is it realistic for a single server to handle 2000 HTTP requests per second?

I am building a Java-based web service (using JSON as the data encoding) that will need to handle up to 2,000 HTTP requests per second. The processing required for each request is almost negligible (a HashMap.put() call); parsing the JSON will probably be the dominant overhead.
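For concreteness, the per-request work would look roughly like this (assuming Jackson for the JSON parsing and a ConcurrentHashMap as the store; the field names "key" and "value" are just illustrative):

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;
    import java.util.concurrent.ConcurrentHashMap;

    public class RequestWork {
        // ObjectMapper is thread-safe and should be shared across requests.
        private static final ObjectMapper MAPPER = new ObjectMapper();
        private static final ConcurrentHashMap<String, String> STORE = new ConcurrentHashMap<>();

        // Essentially all the work one request needs: parse JSON, then one put().
        static void handle(String body) throws Exception {
            JsonNode json = MAPPER.readTree(body);   // likely the dominant cost
            STORE.put(json.get("key").asText(),      // near-negligible by comparison
                      json.get("value").asText());
        }
    }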

I am wondering whether a single High-Memory Quadruple Extra Large EC2 instance (68 GB RAM, 8 cores, 64-bit) would be capable of handling as many as 2,000 HTTP requests per second.

I realize that an exact answer will be difficult; I'm just wondering whether this is within the bounds of possibility, or whether I'm smoking crack.

I'm currently using the SimpleWeb framework, although it no longer appears to be maintained. Can anyone recommend alternative embeddable HTTP servers that would be well suited to this kind of high-volume usage?

asked Oct 13 '11 by sanity



2 Answers

2000 requests per second (or 2 krps) should be well within the realm of possibility for a Java servlet, provided that you don't introduce huge bottlenecks and that the framework you are using doesn't suck too much. Given that you apparently aren't accessing any backends, the task should be CPU-bound and scale very well.

The JSON serialization test of the Web Framework Benchmarks shows many Java frameworks giving very good results; even with 20 database queries, results are still well over 2 krps. On Amazon they use m1.large instances, which are smaller than the one you plan to use (m2.4xlarge, I gather).

You might try Undertow, which provides a convenient servlet API and is well maintained. Netty is another possibility, although it has its own API.
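For illustration, here is a minimal embedded Undertow server (a sketch only, using Undertow's native handler API rather than its servlet layer; the port and handler body are placeholders):

    import io.undertow.Undertow;
    import io.undertow.util.Headers;

    public class Server {
        public static void main(String[] args) {
            Undertow server = Undertow.builder()
                    .addHttpListener(8080, "0.0.0.0")   // port/host are placeholders
                    .setHandler(exchange -> {
                        // Real code would parse the JSON request body here.
                        exchange.getResponseHeaders()
                                .put(Headers.CONTENT_TYPE, "application/json");
                        exchange.getResponseSender().send("{\"status\":\"ok\"}");
                    })
                    .build();
            server.start();
        }
    }

Undertow runs handlers on its XNIO worker threads, so a CPU-bound handler like this scales across cores without extra configuration.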

Note: I realize that the question is a bit old, but the problem should still be valid.

answered Nov 14 '22 by alexfernandez

This is definitely possible. According to this question, Netty can handle well over 100,000 interactions per second. A JSON parser can convert the request string into a JSON object, or you could even use a binary variant, BSON, as described here (useful if the messages are long or very complex). From this question, it appears that the number of connections a server with a recent operating system can handle is over 300,000, far more than your task would need.
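For illustration, a minimal Netty 4 HTTP server looks like this (a sketch only; the port, aggregator size and response body are placeholder values):

    import io.netty.bootstrap.ServerBootstrap;
    import io.netty.buffer.Unpooled;
    import io.netty.channel.*;
    import io.netty.channel.nio.NioEventLoopGroup;
    import io.netty.channel.socket.SocketChannel;
    import io.netty.channel.socket.nio.NioServerSocketChannel;
    import io.netty.handler.codec.http.*;
    import io.netty.util.CharsetUtil;

    public class NettyHttpServer {
        public static void main(String[] args) throws InterruptedException {
            EventLoopGroup boss = new NioEventLoopGroup(1);   // accepts connections
            EventLoopGroup workers = new NioEventLoopGroup(); // handles I/O
            try {
                ServerBootstrap b = new ServerBootstrap()
                    .group(boss, workers)
                    .channel(NioServerSocketChannel.class)
                    .childHandler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel ch) {
                            ch.pipeline().addLast(new HttpServerCodec());
                            // Aggregate chunks into one FullHttpRequest (64 KB max here).
                            ch.pipeline().addLast(new HttpObjectAggregator(65536));
                            ch.pipeline().addLast(new SimpleChannelInboundHandler<FullHttpRequest>() {
                                @Override
                                protected void channelRead0(ChannelHandlerContext ctx, FullHttpRequest req) {
                                    // The request body (req.content()) would be parsed as JSON here.
                                    FullHttpResponse res = new DefaultFullHttpResponse(
                                            HttpVersion.HTTP_1_1, HttpResponseStatus.OK,
                                            Unpooled.copiedBuffer("{\"status\":\"ok\"}", CharsetUtil.UTF_8));
                                    res.headers().set(HttpHeaderNames.CONTENT_TYPE, "application/json");
                                    res.headers().set(HttpHeaderNames.CONTENT_LENGTH, res.content().readableBytes());
                                    ctx.writeAndFlush(res);
                                }
                            });
                        }
                    });
                b.bind(8080).sync().channel().closeFuture().sync();
            } finally {
                boss.shutdownGracefully();
                workers.shutdownGracefully();
            }
        }
    }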

Of course, this also depends on what actions you need to take to handle each request; they may well be the limiting factor.

answered Nov 14 '22 by Audrius Meškauskas