
Is there a way to run a command on all Heroku dynos?

I have N dynos for a Rails application, and I'd like to run a command on all of them. Is there a way to do it? Would running rails r "SomeRubyCode" execute that code on all dynos?

I'm using a plugin that syncs with a 3rd-party service every M minutes. The problem is that the 3rd-party service sometimes times out, and I'd like to run the sync again without having to wait another M minutes.

asked Feb 11 '13 by Geo

2 Answers

No. One-off commands (like heroku run bash) are run on a separate, one-off dyno. To accomplish this, you would need to set up some kind of pub/sub or message queue that all dynos listen to. https://devcenter.heroku.com/articles/one-off-dynos
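
As a rough illustration of that pub/sub idea, here is a minimal sketch assuming the redis gem and a REDIS_URL config var; the channel name, the "resync" message, and MyPlugin.sync! are hypothetical stand-ins for your own code:

    # config/initializers/dyno_broadcast.rb
    # Sketch: every dyno subscribes to a Redis channel in a background
    # thread, so a message published to that channel reaches all of them.
    require "redis"

    Thread.new do
      # SUBSCRIBE blocks its connection, so use a dedicated one
      redis = Redis.new(url: ENV["REDIS_URL"])
      redis.subscribe("broadcast") do |on|
        on.message do |_channel, message|
          # "resync" / MyPlugin.sync! are placeholders for your plugin's code
          MyPlugin.sync! if message == "resync"
        end
      end
    end

Any process (a one-off dyno, for example) could then trigger all dynos at once:

    Redis.new(url: ENV["REDIS_URL"]).publish("broadcast", "resync")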

answered Sep 28 '22 by catsby


(Asked to turn my comment into an answer... will take this opportunity to expound.)

I don't know the details of what your plugin does to 'sync' with the 3rd-party service, but I'm going to proceed on the assumption that it basically fetches some transient data which your web application then uses somehow.

Because the syncing or fetching process occasionally fails, and your web application relies on up-to-date data, you want the option of running the 'sync' process manually. Currently, the only way to do that is from within the plugin itself, which means you would need to run some code on all dynos, which, as others have pointed out, isn't currently possible.

What I've done in a previous, similar scenario (fetching analytics from an external service) is simple:

  1. Provision and configure your Heroku app with Redis
  2. Write a rake task that simply executes the code (that would otherwise be run by the plugin) to fetch the data, then writes that data into the cache (see the sketch after this list)
  3. Where you would normally fetch the data in the app, first try to fetch it from the cache (on a cache miss, just run the same fetch code again; it only means the data expired from the cache before it was refreshed)
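
A minimal sketch of step 2, assuming Rails.cache is backed by the provisioned Redis; Analytics.fetch_remote, the task name, and the cache key are placeholders for whatever your plugin actually does:

    # lib/tasks/analytics.rake
    namespace :analytics do
      desc "Fetch data from the 3rd-party service and cache it"
      task sync: :environment do
        data = Analytics.fetch_remote  # placeholder for the plugin's fetch code
        # expire a little before the scheduler interval (see below)
        Rails.cache.write("analytics_data", data, expires_in: 9.minutes)
      end
    end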

I then went further and used the Heroku Scheduler add-on to execute that rake task every n minutes, to keep the data freshly updated and always in cache (cache expiry was set to a little less than n minutes) and to reduce instances of perceivable lag while a fetch occurs. I could have set the cache expiry to never, or to greater than n, but this wasn't mission-critical.
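
On the read side (step 3 above), Rails.cache.fetch handles the cache-miss fallback in a single call; this uses the same placeholder names as the sketch above, with the expiry set just under the scheduler's n-minute interval (here n = 10):

    # Wherever the app needs the data: serve from cache, refetch on a miss.
    def analytics_data
      Rails.cache.fetch("analytics_data", expires_in: 9.minutes) do
        Analytics.fetch_remote  # only runs when the cached copy has expired
      end
    end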

This way, if I did want to ensure that the latest analytics were displayed, all I had to do was either a) connect to Redis and remove the item from the cache, or (easier) b) just run the rake task with heroku run.
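
With the placeholder names used above, those two options would look roughly like:

    # a) drop the cached item so the next read refetches
    heroku run rails runner 'Rails.cache.delete("analytics_data")'

    # b) re-run the sync task directly
    heroku run rake analytics:sync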

Again: this mainly works if you're just pulling data that needs to be shared among all dynos.

This obviously doesn't work the other way around, for instance if you had a centralized service to which you wanted to periodically send metrics (say, time spent per request) on a per-dyno basis. I can't think of an easy, elegant way to do that on Heroku (other than doing it in real time, with all the overhead that entails).

answered Sep 28 '22 by Alistair A. Israel