
long polling vs streaming for about 1 update/second

Is streaming a viable option? Will there be a performance difference on the server end depending on which I choose? Is one better than the other for this case?

I am working on a GWT application with Tomcat running on the server end. To understand my needs, imagine updating the stock prices of several stocks concurrently.

jcee14 asked Jul 03 '09


1 Answer

Do you want the process to be client-driven or server-driven? In other words, do you want to push new data to the clients as soon as it's available, or would you rather the clients request new data whenever they see fit, even though that might not be once per second? What is the likelihood that the client will be able to stick around and wait for an answer?

Even though you expect the events to occur about once per second, how long does a round trip take between a client's request and the server's response? If it's longer than a second, I'd expect you to lean towards pushing the events to the clients; if it's shorter, polling should be okay. If the response takes longer than the interval, then you're essentially streaming anyway, since there's a new event ready by the time the client receives the last one - the client could poll continually and always receive events. In that case, streaming the data would actually be more lightweight, since you're removing the connection/negotiation overhead from the process.
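To make the long-polling side concrete, here is a minimal sketch of how it might look on Tomcat, assuming a Servlet 3.0 container (Tomcat 7 or later) and a hypothetical PriceFeed component that invokes a callback when the next quote arrives - not the only way to do it, just one shape the "client asks, server answers when it has something" model can take:

```java
// Minimal long-polling sketch. PriceFeed is a hypothetical component that
// invokes the given callback once, when a new quote becomes available.
import java.io.IOException;
import javax.servlet.AsyncContext;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet(urlPatterns = "/quotes", asyncSupported = true)
public class LongPollServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Park the request instead of tying up a container thread while we wait.
        final AsyncContext ctx = req.startAsync();
        ctx.setTimeout(30_000); // give up after 30s; the client simply re-polls

        // Hypothetical feed: calls back once when the next price arrives.
        PriceFeed.onNextQuote(quote -> {
            try {
                HttpServletResponse r = (HttpServletResponse) ctx.getResponse();
                r.setContentType("application/json");
                r.getWriter().write(quote.toJson());
            } catch (IOException e) {
                // client probably went away; nothing useful to do here
            } finally {
                ctx.complete(); // finish this response; the client opens a new request
            }
        });
    }
}
```

A real version would also register an AsyncListener to handle the timeout cleanly, but the point is the shape: every update costs the client a fresh request/response cycle.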

I would expect server load to be higher for a client-driven (pull) subscription than for a streaming configuration, since the client has to re-negotiate the connection each time instead of leaving a connection open - though each open connection in a streaming model requires server resources as well. It depends on the trade-off between how expensive your negotiation process is and how much memory/processing each open connection requires. I'm no expert, though, so there may be other factors.
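For contrast, a rough sketch of the streaming side - again with a hypothetical PriceFeed, this time exposing a blocking nextQuote() call - could look like this:

```java
// Minimal HTTP-streaming sketch: one long-lived response, one flushed chunk per update.
// PriceFeed.nextQuote() is a hypothetical blocking call returning the next price.
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/quotes-stream")
public class StreamingServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("text/plain");
        PrintWriter out = resp.getWriter();

        // The connection stays open; each update is written and flushed immediately,
        // so the client sees roughly one chunk per second without reconnecting.
        while (!out.checkError()) {               // checkError() turns true once the client disconnects
            String quote = PriceFeed.nextQuote(); // hypothetical: blocks until the next update
            out.println(quote);
            out.flush();
        }
    }
}
```

The trade-off described above is visible here: no per-update connection negotiation, but each connected client holds a response (and, in this naive form, a container thread) open for as long as it stays subscribed.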

UPDATE: This guy talks about the trade-offs between long-polling and streaming, and he seems to say that with HTTP/1.1, the connection re-negotiation process is trivial, so that's not as much of an issue.

SqlRyan answered Nov 03 '22