I was trying to set up Amazon SQS as the broker for Celery, and I have the configuration below:
BROKER_BACKEND = "SQS"
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-1',
}
AWS_ACCESS_KEY_ID = # access id
AWS_SECRET_ACCESS_KEY = # secret access key
os.environ.setdefault("AWS_ACCESS_KEY_ID", AWS_ACCESS_KEY_ID)
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", AWS_SECRET_ACCESS_KEY)
BROKER_URL = 'sqs://'
CELERY_IMPORTS = ("tasks", )
CELERY_TASK_RESULT_EXPIRES = 300
CELERY_DEFAULT_QUEUE = # queue name
CELERY_DEFAULT_EXCHANGE = CELERY_DEFAULT_QUEUE
CELERY_DEFAULT_EXCHANGE_TYPE = CELERY_DEFAULT_QUEUE
CELERY_DEFAULT_ROUTING_KEY = CELERY_DEFAULT_QUEUE
CELERY_QUEUES = {
    CELERY_DEFAULT_QUEUE: {
        'exchange': CELERY_DEFAULT_QUEUE,
        'binding_key': CELERY_DEFAULT_QUEUE,
    }
}
In my SQS configuration on the AWS account, I have a queue with the name specified in CELERY_DEFAULT_QUEUE. When I run this locally, everything works, but for some reason it creates another queue on SQS with the name format <user_id>-celery-pidbox, something like: MyUser-MacBook-Pro-local-celery-pidbox.

Is this normal? Why would it create another queue when I already have one created with the specified name? Otherwise it's working; I'm not sure if that other queue is required or if I missed something. Any help is appreciated, as I could not find this in the docs.
EDIT
Turns out this is normal. For some reason django-celery creates an extra queue for each box that accesses the queue you actually want to use. This is supposed to be fixed in a future release. If somebody knows how to work around this temporarily, please let me know. Thanks!
Celery is a great and simple task-queueing system for Python. It lets you offload heavy tasks to another server and run them asynchronously, and it can run periodic tasks too. It is also surprisingly easy to set up with AWS Simple Queue Service (SQS), especially if your app is hosted on AWS.
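The core idea of offloading work to a background worker can be sketched with Python's standard library alone; this toy version uses an in-process queue and a thread instead of Celery and SQS, and all names here are invented for illustration:

```python
import queue
import threading

# A hypothetical in-process stand-in for a broker-backed task queue.
task_queue = queue.Queue()
results = []

def worker():
    # Consume tasks until a None sentinel arrives, much like a Celery
    # worker polling its broker for messages.
    while True:
        func, args = task_queue.get()
        if func is None:
            break
        results.append(func(*args))

def heavy_task(n):
    # Placeholder for an expensive job you would normally offload.
    return n * n

t = threading.Thread(target=worker)
t.start()

# Enqueue tasks and return immediately, as with task.delay() in Celery.
for n in range(5):
    task_queue.put((heavy_task, (n,)))

task_queue.put((None, ()))  # sentinel: shut the worker down
t.join()
print(results)  # [0, 1, 4, 9, 16]
```

With a real broker such as SQS, the queue lives outside your process, so the producer and the worker can run on different machines.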
If you want a delivery point for these messages where a relevant sub-system can pick them up and process them later, you can use a service like SQS. One thing to note is that SNS and SQS do not need to be coupled in all cases: SNS can deliver messages to email, SMS, or HTTP endpoints as well as to SQS.
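To make the decoupling concrete, here is a toy fan-out sketch in plain Python: a hypothetical `Topic` class stands in for SNS and pushes each published message to every subscriber, only one of which is a queue standing in for SQS. All class and variable names are invented for illustration:

```python
from collections import deque

class Topic:
    """Toy stand-in for an SNS topic: fans each message out to all subscribers."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, endpoint):
        self.subscribers.append(endpoint)

    def publish(self, message):
        for endpoint in self.subscribers:
            endpoint(message)

# Subscriber 1: a queue, standing in for SQS.
sqs_like_queue = deque()

# Subscriber 2: an "email" endpoint, standing in for SNS email delivery.
emails_sent = []

topic = Topic()
topic.subscribe(sqs_like_queue.append)
topic.subscribe(lambda msg: emails_sent.append("email: " + msg))

topic.publish("order-created")

print(list(sqs_like_queue))  # ['order-created']
print(emails_sent)           # ['email: order-created']
```

The queue subscriber keeps the message until a consumer picks it up, which is exactly the "delivery point" role SQS plays; the other subscribers receive the same message without any queue involved.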
Q: Do Amazon SQS FIFO queues support multiple producers?
A: Yes. One or more producers can send messages to a FIFO queue. Messages are stored in the order in which they were successfully received by Amazon SQS.
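A rough sketch of that property, using a toy queue class that stores messages in plain arrival order regardless of producer and, like SQS FIFO queues, drops duplicates by deduplication ID (the class and method names are invented for illustration):

```python
class ToyFifoQueue:
    """Toy model of an SQS FIFO queue: arrival order plus de-duplication."""
    def __init__(self):
        self.messages = []
        self.seen_dedup_ids = set()

    def send(self, producer, body, dedup_id):
        # FIFO queues ignore messages whose deduplication ID was already seen.
        if dedup_id in self.seen_dedup_ids:
            return False
        self.seen_dedup_ids.add(dedup_id)
        self.messages.append((producer, body))
        return True

q = ToyFifoQueue()
# Two producers interleave; the queue keeps plain arrival order.
q.send("producer-a", "msg-1", "id-1")
q.send("producer-b", "msg-2", "id-2")
q.send("producer-a", "msg-1", "id-1")  # duplicate: ignored
q.send("producer-b", "msg-3", "id-3")

print(q.messages)
# [('producer-a', 'msg-1'), ('producer-b', 'msg-2'), ('producer-b', 'msg-3')]
```

The real service additionally groups messages by message group ID for ordered parallel consumption, which this sketch omits.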
This is actually reasonable behavior: the *-celery-pidbox queue is used by Celery's remote-control (broadcast) commands, and its name includes the host name, so you can also see which instances (by IP or local name) are accessing your SQS account. The extra requests involved are minimal and typically fall well within the SQS free tier.
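If you would rather not have the pidbox queue created at all, one commonly suggested workaround is to disable Celery's remote-control machinery, which is what that queue exists for. This is a hedged sketch, not a guaranteed fix: check the setting name against the Celery version you run (newer releases spell it worker_enable_remote_control, and disabling it also disables commands like celery inspect):

```python
# Celery settings (old-style names, matching the configuration above).
# With remote control disabled, no broadcast/pidbox queue is created.
CELERY_ENABLE_REMOTE_CONTROL = False
```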