I have a pretty simple Jetty-based WebSocket server, responsible for streaming small binary messages to connected clients.
To avoid any blocking on the server side I was using the sendBytesByFuture method.
After increasing the load from 2 clients to 20, they stopped receiving any data. During troubleshooting I decided to switch to the synchronous send method and finally got a potential reason:
java.lang.IllegalStateException: Blocking message pending 10000 for BLOCKING
at org.eclipse.jetty.websocket.common.WebSocketRemoteEndpoint.lockMsg(WebSocketRemoteEndpoint.java:130)
at org.eclipse.jetty.websocket.common.WebSocketRemoteEndpoint.sendBytes(WebSocketRemoteEndpoint.java:244)
The clients are not doing any calculations upon receiving data, so potentially they can't be slow joiners.
So I'm wondering: what can I do to solve this problem? (using Jetty 9.2.3)
If the error message occurs from a synchronous send, then you have multiple threads attempting to send messages on the same RemoteEndpoint - something that isn't allowed per the protocol. Only 1 message at a time may be sent. (There is essentially no queue for synchronous sends.)
If the error message occurs from an asynchronous send, then that means you have messages sitting in a queue waiting to be sent, yet you are still attempting to write more async messages.
Try not to mix synchronous and asynchronous sends at the same time (it would be very easy to accidentally produce output that becomes an invalid protocol stream).
Using Java Futures:
You'll want to use the Future objects returned by the sendBytesByFuture() and sendStringByFuture() methods to verify whether the message was actually sent (it could have failed), and if enough unsent messages start to queue up, back off on sending more until the remote endpoint can catch up. Standard Future behavior and techniques apply here.
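A minimal sketch of that back-off idea, assuming a per-connection helper class (the FutureBackoffSender name and MAX_PENDING threshold are illustrative, not part of the Jetty API):

```java
import java.nio.ByteBuffer;
import java.util.Iterator;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Future;

import org.eclipse.jetty.websocket.api.Session;

public class FutureBackoffSender
{
    // Hypothetical threshold: tune to your message size and rate.
    private static final int MAX_PENDING = 8;

    private final Queue<Future<Void>> pending = new ConcurrentLinkedQueue<>();

    /** Attempts to send; returns false (message skipped) if the endpoint is backed up. */
    public boolean trySend(Session session, ByteBuffer payload)
    {
        // Drop finished futures, surfacing any send errors.
        for (Iterator<Future<Void>> it = pending.iterator(); it.hasNext();)
        {
            Future<Void> f = it.next();
            if (f.isDone())
            {
                it.remove();
                try
                {
                    f.get(); // throws if the send failed
                }
                catch (Exception e)
                {
                    e.printStackTrace(); // handle/log as appropriate
                }
            }
        }

        if (pending.size() >= MAX_PENDING)
            return false; // back off: let the remote endpoint catch up

        pending.add(session.getRemote().sendBytesByFuture(payload));
        return true;
    }
}
```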
Using Jetty Callbacks:
There is also the WriteCallback behavior available in the sendBytes(ByteBuffer, WriteCallback) and sendString(String, WriteCallback) methods, which will call your own code on success or error; there you can put some logic around what you send (limit it, send it slower, queue it, filter it, drop some messages, prioritize messages, etc. - whatever you need).
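A rough sketch of rate-limiting with WriteCallback, assuming a per-connection in-flight counter (the CallbackThrottledSender name and MAX_IN_FLIGHT limit are illustrative assumptions; here messages over the limit are simply dropped, but you could queue or prioritize instead):

```java
import java.nio.ByteBuffer;
import java.util.concurrent.atomic.AtomicInteger;

import org.eclipse.jetty.websocket.api.Session;
import org.eclipse.jetty.websocket.api.WriteCallback;

public class CallbackThrottledSender
{
    // Hypothetical limit on in-flight messages per connection.
    private static final int MAX_IN_FLIGHT = 8;

    private final AtomicInteger inFlight = new AtomicInteger();

    /** Sends only when the connection has capacity; otherwise drops the frame. */
    public void send(Session session, ByteBuffer payload)
    {
        if (inFlight.get() >= MAX_IN_FLIGHT)
            return; // drop (or queue/filter/prioritize, depending on your needs)

        inFlight.incrementAndGet();
        session.getRemote().sendBytes(payload, new WriteCallback()
        {
            @Override
            public void writeSuccess()
            {
                inFlight.decrementAndGet();
            }

            @Override
            public void writeFailed(Throwable x)
            {
                inFlight.decrementAndGet();
                x.printStackTrace(); // handle/log as appropriate
            }
        });
    }
}
```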
Using Blocking:
Or you can just use blocking sends to never have too many messages queue up.
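For completeness, a minimal blocking sketch (the wrapper class and error handling are illustrative): the blocking sendBytes() only returns once the frame has been written, so the sender naturally paces itself to the remote endpoint, but remember that only one thread at a time may do a blocking send on a given RemoteEndpoint.

```java
import java.io.IOException;
import java.nio.ByteBuffer;

import org.eclipse.jetty.websocket.api.Session;

public class BlockingSender
{
    /** Blocks the calling thread until the frame is written (or an error occurs). */
    public void send(Session session, ByteBuffer payload)
    {
        try
        {
            // Only one thread may perform a blocking send per RemoteEndpoint at a time.
            session.getRemote().sendBytes(payload);
        }
        catch (IOException e)
        {
            e.printStackTrace(); // handle/log as appropriate
        }
    }
}
```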