WebSocket async send can result in blocked send once queue filled

I have a pretty simple Jetty-based WebSocket server, responsible for streaming small binary messages to connected clients.

To avoid any blocking on the server side I was using the sendBytesByFuture method.

After increasing the load from 2 clients to 20, they stopped receiving any data. During troubleshooting I decided to switch to the synchronous send method and finally got a potential reason:

java.lang.IllegalStateException: Blocking message pending 10000 for BLOCKING
at org.eclipse.jetty.websocket.common.WebSocketRemoteEndpoint.lockMsg(WebSocketRemoteEndpoint.java:130)
at org.eclipse.jetty.websocket.common.WebSocketRemoteEndpoint.sendBytes(WebSocketRemoteEndpoint.java:244)

The clients aren't doing any calculations upon receiving the data, so they shouldn't be slow joiners.

So I'm wondering, what can I do to solve this problem? (using Jetty 9.2.3)

asked Oct 08 '14 by rimas

1 Answer

If the error message occurs from a synchronous send, then you have multiple threads attempting to send messages on the same RemoteEndpoint - something that isn't allowed per the protocol. Only 1 message at a time may be sent. (There is essentially no queue for synchronous sends)

If the error message occurs from an asynchronous send, then that means you have messages sitting in a queue waiting to be sent, yet you are still attempting to write more async messages.

Try not to mix synchronous and asynchronous sends at the same time (it would be very easy to accidentally produce output that becomes an invalid protocol stream).

Using Java Futures:

You'll want to use the Future objects returned by the sendBytesByFuture() and sendStringByFuture() methods to verify whether each message was actually sent (it could have failed with an error). If enough unsent messages start to queue up, back off on sending more until the remote endpoint can catch up.

Standard Future behavior and techniques apply here.
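As a minimal sketch (the BoundedAsyncSender wrapper and the limit of 8 pending sends are just illustrative choices, not part of the Jetty API), you could keep the returned Futures in a small deque, drop the ones that have completed, and refuse to send more once too many are still outstanding:

    import java.nio.ByteBuffer;
    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.concurrent.Future;

    import org.eclipse.jetty.websocket.api.RemoteEndpoint;

    public class BoundedAsyncSender
    {
        private static final int MAX_PENDING = 8; // arbitrary illustrative threshold
        private final Deque<Future<Void>> pending = new ArrayDeque<>();
        private final RemoteEndpoint remote;

        public BoundedAsyncSender(RemoteEndpoint remote)
        {
            this.remote = remote;
        }

        /**
         * Attempts an async send; returns false if too many earlier
         * sends are still unconfirmed, so the caller can back off.
         */
        public synchronized boolean trySend(ByteBuffer message)
        {
            // Discard futures that have already completed (success or failure)
            while (!pending.isEmpty() && pending.peekFirst().isDone())
            {
                pending.pollFirst();
            }

            if (pending.size() >= MAX_PENDING)
            {
                // Remote endpoint isn't keeping up - don't queue more right now
                return false;
            }

            pending.addLast(remote.sendBytesByFuture(message));
            return true;
        }
    }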

Using Jetty Callbacks:

There is also the WriteCallback behavior available in the sendBytes(ByteBuffer,WriteCallback) and sendString(String,WriteCallback) methods, which will call your own code on success/error. At that point you can put some logic around what you send (limit it, send it slower, queue it, filter it, drop some messages, prioritize messages, etc., whatever you need).
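A rough sketch of that idea (the MAX_IN_FLIGHT limit and the drop-on-overload policy are placeholder choices, not something Jetty imposes), counting in-flight sends via the callback and dropping messages once the remote endpoint falls behind:

    import java.nio.ByteBuffer;
    import java.util.concurrent.atomic.AtomicInteger;

    import org.eclipse.jetty.websocket.api.RemoteEndpoint;
    import org.eclipse.jetty.websocket.api.WriteCallback;

    public class CallbackSender
    {
        private static final int MAX_IN_FLIGHT = 8; // placeholder limit
        private final AtomicInteger inFlight = new AtomicInteger();
        private final RemoteEndpoint remote;

        public CallbackSender(RemoteEndpoint remote)
        {
            this.remote = remote;
        }

        public void send(ByteBuffer message)
        {
            if (inFlight.get() >= MAX_IN_FLIGHT)
            {
                // Overloaded: drop the message (or queue/prioritize it instead)
                return;
            }

            inFlight.incrementAndGet();
            remote.sendBytes(message, new WriteCallback()
            {
                @Override
                public void writeSuccess()
                {
                    inFlight.decrementAndGet();
                }

                @Override
                public void writeFailed(Throwable cause)
                {
                    inFlight.decrementAndGet();
                    // Log or otherwise handle the failed write here
                }
            });
        }
    }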

Using Blocking:

Or you can just use blocking sends to never have too many messages queue up.
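For completeness, a blocking send is just the plain sendBytes(ByteBuffer) call; it doesn't return until the frame has been written, so no backlog can build up. Just make sure all calls for a given RemoteEndpoint come from a single thread (the BlockingSender wrapper below is only illustrative):

    import java.io.IOException;
    import java.nio.ByteBuffer;

    import org.eclipse.jetty.websocket.api.RemoteEndpoint;

    public class BlockingSender
    {
        private final RemoteEndpoint remote;

        public BlockingSender(RemoteEndpoint remote)
        {
            this.remote = remote;
        }

        // Blocks until the frame is flushed. Calling this from multiple
        // threads at once would trigger the "Blocking message pending"
        // IllegalStateException seen in the question.
        public void send(ByteBuffer message) throws IOException
        {
            remote.sendBytes(message);
        }
    }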

answered Oct 05 '22 by Joakim Erdfelt