I recently tested a simple HTTP server written with Vert.x (Java based) and was amazed by the throughput and API latency of the server; it is blazingly fast.
The same HTTP server logic was also run as a plain Java application with a single thread, lock-free and non-blocking. Its performance was less than a third of the Vert.x version.
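For context, a minimal sketch of what such a single-threaded, lock-free, non-blocking Java server might look like (the actual benchmark code is not shown here, so the class name and details below are assumptions):

```java
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.nio.charset.StandardCharsets;
import java.util.Iterator;

// Hypothetical single-threaded NIO server: one thread, one Selector, no locks.
// It skips request parsing and ignores partial writes, which is fine for a sketch.
public class PlainNioHttpServer {
    private static final byte[] RESPONSE =
            ("HTTP/1.1 200 OK\r\nContent-Length: 2\r\nConnection: keep-alive\r\n\r\nok")
                    .getBytes(StandardCharsets.US_ASCII);

    public static void main(String[] args) throws Exception {
        Selector selector = Selector.open();
        ServerSocketChannel server = ServerSocketChannel.open();
        server.bind(new InetSocketAddress(8080));
        server.configureBlocking(false);
        server.register(selector, SelectionKey.OP_ACCEPT);

        ByteBuffer readBuf = ByteBuffer.allocate(4096);
        while (true) {
            selector.select();
            Iterator<SelectionKey> it = selector.selectedKeys().iterator();
            while (it.hasNext()) {
                SelectionKey key = it.next();
                it.remove();
                if (key.isAcceptable()) {
                    SocketChannel client = server.accept();
                    if (client != null) {
                        client.configureBlocking(false);
                        client.register(selector, SelectionKey.OP_READ);
                    }
                } else if (key.isReadable()) {
                    SocketChannel client = (SocketChannel) key.channel();
                    readBuf.clear();
                    int n = client.read(readBuf);
                    if (n == -1) {
                        key.cancel();
                        client.close();
                    } else if (n > 0) {
                        // Reply with a canned response without inspecting the request.
                        client.write(ByteBuffer.wrap(RESPONSE));
                    }
                }
            }
        }
    }
}
```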
What I do not understand is: what is the core technical difference that lets Vert.x outperform a non-reactive Java application?
STATS:
Testing was done using JMeter. Both JMeter and the application ran on the same machine, with JMeter consuming 25-50% CPU and the application 20-30%. All tests were run for 5 minutes.
Plain Java application, JMeter with 1 client thread bombarding requests: throughput 3474 per sec.
Plain Java application, JMeter with 50 client threads bombarding requests concurrently: throughput 4285 per sec.
Vert.x application, JMeter with 1 client thread bombarding requests: throughput 9382 per sec.
Vert.x application, JMeter with 50 client threads bombarding requests concurrently: throughput 20785 per sec.
Because Vert.x is fundamentally asynchronous and can therefore scale pretty well out of the box, there isn't a whole lot of multi-threading going on in its internals. This makes the system extremely efficient by removing much of the thread synchronization overhead that is a performance killer in so many other tools.
Vert.x is an open source, reactive and polyglot software development toolkit from the Eclipse Foundation. Reactive programming is a programming paradigm built around asynchronous streams that respond to changes or events.
A Vert.x component is called a verticle. It's a single-threaded, event-driven (generally non-blocking) component that receives events (HTTP requests, ticks from a timer) and produces responses (an HTTP response, console output). It can do other things as well.
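As an illustration (a minimal sketch, not the original benchmark code), a verticle that serves HTTP requests can look like this:

```java
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Vertx;

// Minimal sketch of a verticle serving HTTP. The request handler runs on a
// single event-loop thread and must never block it.
public class HelloVerticle extends AbstractVerticle {
    @Override
    public void start() {
        vertx.createHttpServer()
             .requestHandler(req -> req.response()
                 .putHeader("content-type", "text/plain")
                 .end("ok"))
             .listen(8080);
    }

    public static void main(String[] args) {
        Vertx.vertx().deployVerticle(new HelloVerticle());
    }
}
```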
One of the key advantages of Vert.x over many legacy application platforms is that it is almost entirely non-blocking (of kernel threads). This allows it to handle a lot of concurrency (e.g. many connections or messages) using a very small number of kernel threads, which allows it to scale very well.
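One way to observe this (a small sketch, not part of the original post) is to report which thread serves each request; with a single server instance the answer stays the same no matter how many concurrent connections JMeter opens:

```java
import io.vertx.core.Vertx;

// Small sketch: a single HTTP server instance handles all of its connections
// on one event-loop thread, typically named "vert.x-eventloop-thread-0".
public class WhichThread {
    public static void main(String[] args) {
        Vertx vertx = Vertx.vertx();
        vertx.createHttpServer()
             .requestHandler(req -> req.response()
                 .end("served by " + Thread.currentThread().getName()))
             .listen(8081);
    }
}
```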
There are many reasons for that.
The first is that you compare a single-threaded bare Java application with Vert.x, which is actually multithreaded.
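For example (a sketch, reusing the hypothetical HelloVerticle from above), Vert.x lets you deploy several instances of the same verticle so connections are spread across multiple event-loop threads and CPU cores:

```java
import io.vertx.core.DeploymentOptions;
import io.vertx.core.Vertx;

public class ScaledServer {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        Vertx vertx = Vertx.vertx();
        // One verticle instance per core: each instance gets its own event loop,
        // and incoming connections on the shared port are distributed across
        // them, so all cores are used without locking between handlers.
        vertx.deployVerticle(HelloVerticle.class.getName(),
                new DeploymentOptions().setInstances(cores));
    }
}
```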
The second is how you use lock-free data structures. Lock-free doesn't necessarily mean "faster under all conditions".
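To illustrate the point (a rough, hypothetical snippet, not a rigorous benchmark such as one written with JMH): an uncontended lock-free counter still pays for atomic read-modify-write operations that a plain single-threaded counter avoids, so "lock-free" alone does not guarantee speed:

```java
import java.util.concurrent.atomic.AtomicLong;

// Illustrative only: with a single thread and no contention, the CAS-based
// counter does a fenced read-modify-write per increment, the plain one does not.
public class LockFreeCost {
    public static void main(String[] args) {
        final int N = 50_000_000;

        long t0 = System.nanoTime();
        long plain = 0;
        for (int i = 0; i < N; i++) plain++;
        long t1 = System.nanoTime();

        AtomicLong atomic = new AtomicLong();
        for (int i = 0; i < N; i++) atomic.incrementAndGet();
        long t2 = System.nanoTime();

        // Print the results so the JIT cannot discard either loop entirely.
        System.out.printf("plain=%d in %d ms, atomic=%d in %d ms%n",
                plain, (t1 - t0) / 1_000_000, atomic.get(), (t2 - t1) / 1_000_000);
    }
}
```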
Third, and I think this is the main point, some of the best Red Hat developers contributed to Vert.x development. You can examine the source code and see some very smart use of buffers, for example. It's a bit too much to expect an example project to outperform such a framework on the first shot. If you're interested in alternatives, check Rapidoid's performance, which should be on par with Vert.x.