In our application we need to handle request volumes in excess of 5,000 requests per second. We've been told that this is feasible with Jetty in our type of application (where we must expose a JSON-HTTP API to a remote system, which will then initiate inbound requests and connections to us).
We receive several thousand inbound HTTP connections, each of which is persistent and lasts about 30 seconds. The remote server then fires requests at us as quickly as we can respond to them on each of these connections. After 30 seconds the connection is closed and another is opened. We must respond in less than 100ms (including network transit time).
Our server is running in EC2 with 8GB of RAM, 4GB of which is allocated to our Java VM (past research suggested that you should not allocate more than half the available RAM to the JVM).
Here is how we currently initialize Jetty based on various tips we've read around the web:
Server server = new Server();
SelectChannelConnector connector = new SelectChannelConnector();
connector.setPort(config.listenPort);
connector.setThreadPool(new QueuedThreadPool(5120)); // originally 512
connector.setMaxIdleTime(600000);                    // 10 minutes
connector.setRequestBufferSize(10000);
server.setConnectors(new Connector[] { connector });
server.setHandler(this);
server.start();
Note that we originally had just 512 threads in our thread pool; we tried increasing to 5,120, but this didn't noticeably help.
We find that with this setup we struggle to handle more than 300 requests per second. We don't think the problem is our handler, as it is just doing some quick calculations and a Gson serialization/deserialization.
When we manually make an HTTP request of our own while the server is handling this load, we find that it can take several seconds before it begins to respond.
We are using Jetty version 7.0.0.pre5.
Any suggestions, either for a solution, or techniques to isolate the bottleneck, would be appreciated.
The Jetty Continuations API and the newer Servlet 3.0 support provide mechanisms to release threads back to the primary thread pool so they can spend time accepting and processing other requests.
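As a sketch of the continuations approach on Jetty 7 (the handler class name and the worker hand-off are illustrative; only the `Continuation` calls are the actual API):

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.eclipse.jetty.continuation.Continuation;
import org.eclipse.jetty.continuation.ContinuationSupport;
import org.eclipse.jetty.server.Request;
import org.eclipse.jetty.server.handler.AbstractHandler;

// Illustrative handler: suspend the request instead of blocking a pool
// thread while the real work completes elsewhere.
public class AsyncApiHandler extends AbstractHandler {
    public void handle(String target, Request baseRequest,
                       HttpServletRequest request, HttpServletResponse response)
            throws IOException, ServletException {
        Continuation continuation = ContinuationSupport.getContinuation(request);
        if (continuation.isInitial()) {
            continuation.suspend(response);   // releases this thread back to the pool
            dispatchToWorker(continuation);   // hypothetical: queue the work elsewhere
        } else {
            // resumed: the worker has attached a result, so write the response
            response.setContentType("application/json");
            response.getWriter().write((String) continuation.getAttribute("result"));
            baseRequest.setHandled(true);
        }
    }

    private void dispatchToWorker(final Continuation c) {
        // e.g. hand off to an executor; when the work is done, the worker calls:
        //   c.setAttribute("result", json);
        //   c.resume();
    }
}
```

The point is that the pool thread returns to accepting work the moment `suspend()` is called, so slow back-end work no longer occupies one thread per in-flight request.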
In HTTP/1.1, a destination holds a pool of connections (controlled by maxConnectionsPerDestination, 64 by default). So if you send requests in a tight loop, assuming connection establishment takes zero time, you can have at most 64 (on the network) + 1024 (queued, controlled by maxRequestsQueuedPerDestination) outstanding requests. The next one will be rejected.
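If a client is hitting those limits, both are configurable on Jetty's HttpClient; a minimal sketch (the values here are arbitrary examples, not recommendations):

```java
import org.eclipse.jetty.client.HttpClient;

// Raise the per-destination connection and queue limits before starting
// the client; defaults are 64 and 1024 respectively.
HttpClient httpClient = new HttpClient();
httpClient.setMaxConnectionsPerDestination(128);
httpClient.setMaxRequestsQueuedPerDestination(2048);
httpClient.start();
```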
In your jetty-base folder, notice the webapps directory. This is where your web apps will go. To add a web app to your server, create a folder inside the webapps directory, and then add your files inside your folder.
In order for this application to be able to connect to your Jetty server, you will need to un-comment the last section of the etc/jetty-jmx.xml configuration file and optionally modify the endpoint name. That will create a JMX HTTP connector and register a JMX URL that will be output to the stderr log.
First, Jetty 7.0.0.pre5 is VERY old. Jetty 9 is now out, and has many performance optimisations.
Download a newer version of the 7.x line at https://www.eclipse.org/jetty/previousversions.html
The following advice is documented in the Jetty project's tuning documentation.
Be sure you read those documents.
Next, the thread pool size is for handling accepted requests. 512 is high; 5120 is ridiculous. Pick a number higher than 50 and less than 500.
If you have a Linux based EC2 node, be sure you configure the networking for maximum benefit at the OS level. (See the document titled "High Load" in the above mentioned list for details)
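On the OS side, a few kernel and limit settings are commonly raised for servers holding thousands of concurrent connections. These are illustrative starting points, not measured optima; check each against your kernel version and workload:

```shell
# Raise the per-process open-file limit (each connection is a file descriptor).
ulimit -n 65536

# Deepen the TCP accept backlog so connection bursts are not dropped.
sysctl -w net.core.somaxconn=4096
sysctl -w net.ipv4.tcp_max_syn_backlog=8192

# Widen the ephemeral port range for outbound/loopback connections.
sysctl -w net.ipv4.ip_local_port_range="10000 65535"
```

To make the sysctl values survive a reboot, put them in /etc/sysctl.conf.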
Be sure you are using a recent JRE/JDK, such as Oracle Java 1.6u38 or 1.7u10. Also, if you have a 64 bit OS, use the 64 bit JRE/JDK.
Set your acceptor count, SelectChannelConnector.setAcceptors(int), to a value between 1 and (number_of_cpu_cores - 1).
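Pulling the thread-pool and acceptor advice together, a re-worked version of the question's startup code might look like this on Jetty 7 (a sketch; `config.listenPort` and `handler` come from the question, and the numbers are starting points to measure against, not known optima):

```java
import org.eclipse.jetty.server.Connector;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.nio.SelectChannelConnector;
import org.eclipse.jetty.util.thread.QueuedThreadPool;

Server server = new Server();

// Bounded pool in the 50-500 range instead of 5120 threads.
QueuedThreadPool pool = new QueuedThreadPool();
pool.setMinThreads(50);
pool.setMaxThreads(500);
server.setThreadPool(pool);

SelectChannelConnector connector = new SelectChannelConnector();
connector.setPort(config.listenPort);
connector.setAcceptors(3);        // e.g. 4 cores - 1 on the EC2 node described
connector.setMaxIdleTime(30000);  // match the ~30 s connection lifetime
server.setConnectors(new Connector[] { connector });

server.setHandler(handler);
server.start();
```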
Lastly, setup optimized Garbage Collection, and turn on GC Logging to see if the problems you are having are with jetty, or with Java's GC. If you see via the GC logging that there are massive GC "stop the world" events taking lots of time, then you know one more cause for your performance issues.
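As a starting point for the GC side on a Java 6/7 HotSpot VM, the launch command could look like this (the jar name is hypothetical, and flags should be verified against your exact JVM version):

```shell
# Fix the heap size, use the low-pause CMS collector, and log every GC
# event with timestamps so "stop the world" pauses are visible.
java -Xms4g -Xmx4g \
     -XX:+UseConcMarkSweepGC \
     -XX:+PrintGCDetails -XX:+PrintGCDateStamps \
     -Xloggc:/var/log/myapp/gc.log \
     -jar myapp.jar
```

If the GC log shows full-heap pauses longer than your 100 ms response budget, the collector, not Jetty, is the first thing to tune.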