I'm working with Ruby on Rails 4.1, streaming data from a Sidekiq process using ActionController::Live. In development, my streaming is working fantastic. In production (using Nginx/Puma), it's not going quite as well. Here's what's happening.
In production, referring to the Firebug screenshot below, the request to "/events" is being fired multiple times. Why would my EventSource fire repeatedly instead of staying open and waiting for my data? This does not happen in development.
As long as my Sidekiq process is running, it fires repeatedly at random intervals. As soon as my Sidekiq process is complete, it hangs and doesn't fire any more; that last request then eventually times out (see the red text in the image).
Here is my coffeescript:
source = new EventSource('/events')

source.addEventListener 'my_event', (e) ->
  console.log 'Got a message!'
  # Add the content to the screen somewhere
Referring to the Firebug image, it's almost as if the request times out while my Sidekiq process runs. I'm publishing to Redis from my worker.
Initializer
REDIS = Redis.new(url: 'redis://localhost:6379')
Sidekiq Worker
REDIS.publish('my_event', 'some data')
Then I have the controller action that the EventSource is hooked up to:
def events
  response.headers["Content-Type"] = "text/event-stream"
  redis = Redis.new(url: "redis://localhost:6379")

  # Blocks the current thread until the subscription ends
  redis.subscribe('my_event', 'heartbeat') do |on|
    on.message do |event, data|
      if event == 'heartbeat'
        response.stream.write("event: heartbeat\ndata: heartbeat\n\n")
      elsif event == 'my_event'
        response.stream.write("event: #{event}\n")
        response.stream.write("data: #{data.to_json}\n\n")
      end
    end
  end
rescue IOError
  logger.info 'Events stream closed'
ensure
  logger.info 'Stopping events streaming thread'
  redis.quit
  response.stream.close
end
Again, this works 100% fantastic in development. My data from Sidekiq streams perfectly to the browser. The heartbeat stuff above is there because of this answer (I had the same issue).
Does anyone see something I'm doing wrong? Am I missing some form of setup for ActionController::Live to work in production with Nginx/Puma?
IMPORTANT NOTE
At the VERY END of each of those requests in the image (where it seems to time out and start the next one), I get one line of data through successfully. So something is working. But the single EventSource is not staying open and listening for data long enough; it just keeps repeating.
I had a similar issue with a recent project; it turned out to be my Nginx config.
Add the following to the location block that proxies your streaming endpoint:
proxy_set_header Connection '';
proxy_http_version 1.1;
chunked_transfer_encoding off;
proxy_buffering off;
proxy_cache off;
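In context, a minimal location block might look like the sketch below. The upstream name and path are placeholders for your own setup, and the `proxy_read_timeout` line is an optional addition (not from the original answer) to keep long-lived streams from being cut off by Nginx's default 60-second read timeout:

```nginx
location /events {
    proxy_pass http://app_server;   # placeholder: your Puma upstream
    proxy_set_header Connection '';
    proxy_http_version 1.1;
    chunked_transfer_encoding off;
    proxy_buffering off;
    proxy_cache off;
    proxy_read_timeout 24h;         # optional: allow long-lived SSE connections
}
```

The key directives are `proxy_buffering off` (so Nginx forwards each SSE frame immediately instead of buffering the response) and the empty `Connection` header with HTTP/1.1 (so the upstream connection stays open).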
This information was obtained from the following Stack Overflow discussion:
EventSource / Server-Sent Events through Nginx
Hope this helps.