I have a Celery worker that handles some counters for my application:
from celery.task import Task
from django.conf import settings
import redis

class IncrementStatsCounterTask(Task):
    def run(self, count, shortcode, stat_type, operator_id, date, **kwargs):
        # A new Redis connection is opened on every task run
        r_server = redis.Redis(settings.REDIS_HOST)
        # key_mask is a module-level format string defined elsewhere
        key = key_mask % {
            'shortcode': shortcode,
            'stat_type': stat_type,
            'operator_id': operator_id,
            'date': date.strftime('%Y%m%d')
        }
        return key, r_server.incr(key, count)
It all works great; however, this opens and closes the Redis connection every time the task runs. Is there a better way to handle the connections, maybe some sort of persistent connection?
I'm running the latest django-celery.
In the Python redis library you can use connection pooling. Create a pool globally in one of your modules and use it for every new client instead of opening a fresh connection per task run.
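A minimal sketch of that approach, assuming the same settings.REDIS_HOST your task already uses; the module name redis_pool.py and the pool options are illustrative:

# redis_pool.py -- the pool is created once per worker process at import time
import redis
from django.conf import settings

pool = redis.ConnectionPool(host=settings.REDIS_HOST, port=6379, db=0)

Then, inside the task, build the client from the shared pool:

# in the task module
import redis
from redis_pool import pool  # hypothetical import path

class IncrementStatsCounterTask(Task):
    def run(self, count, shortcode, stat_type, operator_id, date, **kwargs):
        # Borrows a connection from the pool; it is returned to the pool
        # after the command completes instead of being closed.
        r_server = redis.Redis(connection_pool=pool)
        key = key_mask % {
            'shortcode': shortcode,
            'stat_type': stat_type,
            'operator_id': operator_id,
            'date': date.strftime('%Y%m%d')
        }
        return key, r_server.incr(key, count)

redis.Redis instances are cheap to construct; the expensive part, the TCP connection, lives in the pool and is reused across task runs.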