We are building a Push API for our platform and expect it to push out a high number of webhook events per minute (somewhere in the thousands). It's possible that two identical events could be created within milliseconds of each other, and we'd like to aggregate those together.
Goals:
All that being said, in a setup where we have maybe 16 processes queuing jobs, would using Redis Sorted Sets be useful?
I was thinking about using the timestamp as the score and continually "popping" items with a transaction like:
MULTI
ZRANGE queue 0 {demand - 1}
ZREMRANGEBYRANK queue 0 {demand - 1}
EXEC
But I'm not sure about performance costs or other considerations. Would anyone recommend this? I feel like sorted sets make sense here.
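To make the pattern concrete, here is a minimal in-memory model of the idea (plain Python standing in for Redis; the class and member names are illustrative, not part of any library): a ZADD-style insert collapses identical members, which is what aggregates duplicate events, and the ranged read-plus-remove pops the oldest `demand` items in one step, like the MULTI/EXEC transaction above.

```python
class SortedSetQueue:
    """In-memory stand-in for a Redis sorted set used as a queue."""

    def __init__(self):
        self._scores = {}  # member -> score; duplicate members collapse

    def zadd(self, member, score):
        # Like ZADD: re-adding an identical member just updates its score,
        # so two identical events created milliseconds apart become one entry.
        self._scores[member] = score

    def pop_batch(self, demand):
        # Models MULTI / ZRANGE queue 0 demand-1 /
        # ZREMRANGEBYRANK queue 0 demand-1 / EXEC:
        # take the `demand` lowest-scored members and remove them.
        batch = sorted(self._scores, key=self._scores.get)[:demand]
        for member in batch:
            del self._scores[member]
        return batch


q = SortedSetQueue()
q.zadd("event:user:42:updated", 1700000000.001)
q.zadd("event:user:42:updated", 1700000000.003)  # duplicate, collapses
q.zadd("event:user:7:created", 1700000000.002)
print(q.pop_batch(10))  # oldest score first, duplicates merged
```

Against a real Redis instance the two commands would be sent inside a MULTI/EXEC block (a pipeline with `transaction=True` in redis-py) so no other consumer can read the same batch between the ZRANGE and the ZREMRANGEBYRANK.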
Your approach makes sense.
The time complexity of both ZRANGE and ZREMRANGEBYRANK is
O(log(N)+M), where N is the number of elements in the sorted set and M is the number of elements returned (or removed).
That means the cost depends on how many events are queued and on the batch size of each request.
Redis performs very well in theory, but to be sure it fits your use case, the best approach is to run your own benchmarks: with your code, your configuration, a production-like number of items in the ZSET, etc.
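As a rough back-of-the-envelope illustration of what O(log(N)+M) means here (the figures below are made up for illustration, not measured):

```python
import math

# Hypothetical figures: 10,000 pending events, popping 100 per call.
N = 10_000
M = 100

# The log(N) term is tiny compared to M, so the per-call cost is
# dominated by the batch size you choose, not the queue depth.
cost = math.log2(N) + M
print(f"log2(N) = {math.log2(N):.1f}, M = {M}, total ~ {cost:.1f} units")
```

In other words, even as the queue grows by orders of magnitude, the log(N) term barely moves; tuning `demand` (the batch size) is what controls the work per call.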