By workers I mean a Python script that runs some task in an infinite loop. The script should be deployed to a separate EC2 instance and run indefinitely (perhaps under Supervisor).
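For concreteness, something like this (the task itself is just a placeholder):

```python
import time

def process_next_task():
    # Placeholder for the real work, e.g. polling a queue
    # and handling one message.
    pass

if __name__ == "__main__":
    while True:            # run forever; Supervisor restarts us on crash
        process_next_task()
        time.sleep(1)      # avoid a busy loop when there is no work
```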
I'm successfully using Elastic Beanstalk to deploy the web app with git aws.push, and that works fine. However, the setup for a worker needs to be a little different: workers don't need a web server or an Elastic IP, and they are started with a different command.
From what I've read I think what I want is pretty similar to Heroku's worker dynos vs web dynos, but I don't have any experience with Heroku either, so I could be wrong.
So is this possible with Elastic Beanstalk? Or should I be using something completely different for deployment?
By the way, I'm using Linux and the Elastic Beanstalk CLI.
The ability to deploy background processes (workers) on Elastic Beanstalk was actually just introduced on December 11th.
Take a look at the announcement: http://aws.typepad.com/aws/2013/12/background-task-handling-for-aws-elastic-beanstalk.html
and the docs: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features-managing-env-tiers.html
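In the worker tier, a local daemon pulls messages from an SQS queue and POSTs each one to an HTTP endpoint of your application; responding with a 200 acknowledges the message so it gets deleted from the queue. A minimal sketch of such a handler (Flask here, and the /work path is just whatever you configure for the environment):

```python
from flask import Flask, request

app = Flask(__name__)

def do_work(payload):
    # Placeholder for the real task logic.
    print("processing:", payload)

@app.route("/work", methods=["POST"])   # the HTTP path configured for the worker tier
def handle_task():
    payload = request.get_data(as_text=True)  # body of the SQS message
    do_work(payload)
    return "", 200                             # 200 tells the daemon to delete the message

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```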
I don't think Elastic Beanstalk has an exact equivalent of Heroku's worker dynos vs. web dynos (although I'm not sure those map directly to your case anyway, since you're trying to run your own custom worker process).
As it stands, you can run your workers somewhere else (a plain EC2 instance, your own cloud server, some custom SaaS, etc.).
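If you go the plain-EC2 route, the Supervisor setup you mentioned is enough to keep the script alive; a minimal config sketch (program name and paths are placeholders):

```ini
; /etc/supervisor/conf.d/worker.conf  -- illustrative names and paths
[program:worker]
command=/usr/bin/python /opt/app/worker.py
directory=/opt/app
autostart=true          ; start when supervisord starts
autorestart=true        ; restart the worker if it exits
stdout_logfile=/var/log/worker.out.log
stderr_logfile=/var/log/worker.err.log
```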
Sidekiq/Resque (in the Ruby world) are the equivalent of running workers independently. You can try the approach described in this blog and apply it to your specific worker.
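The rough Python-world analogues would be Celery or RQ. For example, with RQ (assuming a running Redis server and that your task is an importable function), enqueueing looks like the sketch below, and a separate `rq worker` process plays the role the Sidekiq process plays in Ruby:

```python
from redis import Redis
from rq import Queue

from myapp.tasks import process_item   # hypothetical importable task function

q = Queue(connection=Redis())  # connects to localhost:6379 by default
q.enqueue(process_item, 42)    # run `rq worker` in a separate process to consume
```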