I have a Docker application on Elastic Beanstalk with a web-server environment and a worker environment.
The worker environment currently runs scheduled jobs via cron. I'm trying to connect the server to the worker to achieve the following:
I haven't been able to find documentation on what the JSON message should look like. Some HTTP headers are mentioned in the official documentation, but there is no mention of a header to specify the desired endpoint in the worker environment.
# server.py
from bottle import post, HTTPResponse

@post('/trigger_job')
def trigger_worker_job():
    # Should send a JSON message to SQS to trigger '/perform_job'.
    # Need help with what the JSON message looks like.
    return HTTPResponse(status=200, body={'Msg': 'Sent message'})
# worker.py
from bottle import post, HTTPResponse

@post('/perform_job')
def perform_job():
    # Job is performed in the worker environment.
    return HTTPResponse(status=200, body={'Msg': 'Success'})
In Python, you can see how this works in the Python sample application, which you can find in the AWS docs under step 4: Deploy a New Application Version.
You can configure the SQS endpoint for the Beanstalk worker environment in the console: Configuration > Worker > Select a Worker Queue. The worker daemon (sqsd) delivers every queue message as an HTTP POST to the single HTTP path configured there (e.g. /perform_job), so there is no header for choosing an endpoint per message.
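For the sending side, here is a minimal sketch of publishing a JSON message with boto3; the queue URL, job name, and payload shape are all hypothetical, and the body can be whatever JSON your worker endpoint understands:

```python
import json

def build_job_message(job_name, params):
    # The body can be any JSON your '/perform_job' handler understands;
    # the worker daemon forwards it verbatim as the POST body.
    return json.dumps({'job': job_name, 'params': params})

body = build_job_message('nightly_report', {'user_id': 42})

# Actually sending it requires AWS credentials and a real queue URL:
#   import boto3
#   sqs = boto3.client('sqs')
#   sqs.send_message(
#       QueueUrl='https://sqs.us-east-1.amazonaws.com/123456789012/my-worker-queue',
#       MessageBody=body,
#   )
print(body)
```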
# For example, the daemon's headers show up in the WSGI environ:
environ['HTTP_X_AWS_SQSD_TASKNAME']
environ['HTTP_X_AWS_SQSD_SCHEDULED_AT']
logger.info("environ X-Aws-Sqsd-Queue %s" % environ['HTTP_X_AWS_SQSD_QUEUE'])

# Regarding your message attributes: if the attribute name is Email,
# you can extract it via environ['HTTP_X_AWS_SQSD_ATTR_EMAIL'].
# Make sure the attribute-name part of the key is all capitals.
logger.info("environ X-Aws-Sqsd-Attr Email %s" % environ['HTTP_X_AWS_SQSD_ATTR_EMAIL'])
The message will arrive with headers like the ones above. You can read more in the AWS Elastic Beanstalk Worker Environments documentation.
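To make the header handling concrete, here is a self-contained sketch of extracting the sqsd headers and a message attribute; the environ values are made up for illustration:

```python
# Simulated WSGI environ as the sqsd daemon's POST would produce it
# (values are hypothetical).
environ = {
    'HTTP_X_AWS_SQSD_TASKNAME': 'nightly-report',
    'HTTP_X_AWS_SQSD_QUEUE': 'my-worker-queue',
    'HTTP_X_AWS_SQSD_ATTR_EMAIL': 'user@example.com',
}

def sqsd_attr(environ, name):
    # A message attribute named 'Email' arrives under the WSGI key
    # HTTP_X_AWS_SQSD_ATTR_EMAIL, so upper-case the attribute name.
    return environ.get('HTTP_X_AWS_SQSD_ATTR_' + name.upper())

print(environ['HTTP_X_AWS_SQSD_QUEUE'])   # -> my-worker-queue
print(sqsd_attr(environ, 'Email'))        # -> user@example.com
```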