I have some update triggers which push jobs onto the Sidekiq queue, so in some cases there can be multiple jobs queued to process the same object.
There are a couple of uniqueness plugins ("Middleware", "Unique Jobs"), but they're not documented much, and they seem to act more like throttlers that prevent repeat processing. What I want is something that prevents repeat enqueueing of the same job, so that an object is always processed in its freshest state. Is there a plugin or technique for this?
Update: I didn't have time to make a middleware, but I ended up with a related cleanup function to ensure queues are unique: https://gist.github.com/mahemoff/bf419c568c525f0af903
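The gist itself isn't reproduced here, but the core idea of a queue-cleanup pass can be sketched as follows. The duplicate_indexes helper is a hypothetical name introduced for illustration; the commented-out section assumes the standard Sidekiq::Queue API from sidekiq/api.

```ruby
# Pure helper: given an ordered list of [klass, args] pairs, return the
# positions of duplicates (every occurrence after the first). Keeping it
# pure makes the dedup logic easy to test without Redis.
def duplicate_indexes(job_keys)
  seen = {}
  dups = []
  job_keys.each_with_index do |key, i|
    if seen.key?(key)
      dups << i
    else
      seen[key] = true
    end
  end
  dups
end

# Applying it to a live queue (requires Sidekiq and Redis):
#
#   require 'sidekiq/api'
#   jobs   = Sidekiq::Queue.new("default").to_a
#   doomed = duplicate_indexes(jobs.map { |j| [j.klass, j.args] })
#   doomed.each { |i| jobs[i].delete }
```

Note this is a periodic cleanup rather than a guarantee: duplicates can still be enqueued between runs, which is why a client middleware (below) is the more direct fix.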
What about a simple client middleware?
    module Sidekiq
      class UniqueMiddleware
        # Client middleware: runs just before a job is pushed to Redis.
        # Not yielding (returning false) cancels the push.
        def call(worker_class, msg, queue_name, redis_pool)
          if msg["unique"]
            # Scan the target queue for an identical pending job.
            # Note: this is O(queue size) per push, and not atomic,
            # so a race between two clients can still let a duplicate through.
            queue = Sidekiq::Queue.new(queue_name)
            queue.each do |job|
              return false if job.klass == msg["class"] && job.args == msg["args"]
            end
          end
          yield
        end
      end
    end
Just register it:
    Sidekiq.configure_client do |config|
      config.client_middleware do |chain|
        chain.add Sidekiq::UniqueMiddleware
      end
    end
Then, in your job, just set unique: true in its sidekiq_options when needed.
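For example, a worker opting into the middleware above might look like this (ProfileSyncWorker is a hypothetical name; requires the sidekiq gem and a running Redis):

```ruby
require 'sidekiq'

class ProfileSyncWorker
  include Sidekiq::Worker
  # The "unique" key ends up in the job payload (msg["unique"]),
  # which is what the middleware checks before pushing.
  sidekiq_options unique: true

  def perform(user_id)
    # ... process the object in its freshest state ...
  end
end

# While the first job is still waiting in the queue, the second
# push is a no-op:
#   ProfileSyncWorker.perform_async(42)
#   ProfileSyncWorker.perform_async(42)
```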