I'm using python-rq to manage Redis-based jobs and I want to determine which jobs are currently being processed by my workers.
python-rq offers a get_current_job function to find 'the current job' for a connection, but I can't get it to work. Here is my code (which always returns None):
import os
from urllib import parse

from redis import Redis
from rq import Queue, get_current_job

redis_url = os.getenv('REDIS_FOO')
parse.uses_netloc.append('redis')
url = parse.urlparse(redis_url)
conn = Redis(host=url.hostname, port=url.port, db=0, password=url.password)
q = Queue(connection=conn)
job = get_current_job(connection=conn)  # always returns None
Does anyone have any ideas on getting the above code to work and, more importantly, on a way to get a list of all current jobs from all workers on all queues for this connection?
I looked into the source code, and I figure this is what you need.
One more thing to note: the number of running jobs equals the number of rq workers, because each worker processes only one job at a time.
from redis import Redis
from rq import Queue
from rq.registry import StartedJobRegistry
from jobs import count_words_at_url

redis_conn = Redis()
q = Queue('default', connection=redis_conn)

for i in range(5000):
    job = q.enqueue(count_words_at_url, 'http://nvie.com', ttl=43)

registry = StartedJobRegistry('default', connection=redis_conn)
running_job_ids = registry.get_job_ids()          # jobs currently being executed
expired_job_ids = registry.get_expired_job_ids()  # jobs whose ttl ran out before they finished