So I've already read this post about there not being an MGET analog for Redis hashes. One of the answers said to use MULTI/EXEC to do the operation in bulk, and that does work for lists and regular keys, but not for hashes, unfortunately. Right now I'm doing a call over the wire for every single hash I want to retrieve, which seems like bad news to me.
So my question is: what is the most efficient way to get several hashes back from Redis, where efficiency means the fewest network calls? I'm using Redis 2.0.4, programming with the Python client. Thanks!
Querying multiple key/values from Redis is easy using KEYS and MGET (these work on plain string keys, not hashes). If you need to write the key/values out to a JSON, TXT or CSV file, just use the Python redis-mass-get CLI. Quick and easy.
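For plain string keys, a single MGET fetches many values in one round trip. A minimal sketch, assuming a local Redis on the default port and string keys matching a made-up user:* pattern; note that KEYS scans the entire keyspace, so it is best avoided on large production databases:

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

keys = r.keys('user:*')       # e.g. [b'user:1', b'user:2', ...]; O(N) over the keyspace
if keys:
    values = r.mget(keys)     # one round trip for all the string values
    for k, v in zip(keys, values):
        print(k, v)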
In Redis a value can also be a list, which overcomes the problem of needing more than one value for a key: you get an ordered list holding multiple values (so “they'll none of 'em be missed”).
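As a quick illustration (the 'colors' key name is just an example): RPUSH appends several values under one key and LRANGE reads them back in order.

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

r.rpush('colors', 'red', 'green', 'blue')   # one key, several ordered values
print(r.lrange('colors', 0, -1))            # [b'red', b'green', b'blue']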
We use various data structures (linked lists, arrays, hashes, etc.) in our applications. They are usually implemented in memory, but sometimes we need persistence AND speed. This is where an in-memory DB like Redis can be very useful.
A Redis hash is a data type that represents a mapping between a string field and a string value. Hashes can hold many field-value pairs and are designed to not take up much space, making them ideal for representing data objects.
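For example (the 'user:1' key and its fields are made-up), HSET stores individual field-value pairs and HGETALL returns the whole mapping:

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

r.hset('user:1', 'name', 'Alice')              # set one field-value pair at a time
r.hset('user:1', 'email', 'alice@example.com')
print(r.hgetall('user:1'))                     # {b'name': b'Alice', b'email': b'alice@example.com'}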
The most efficient way would be using a pipeline.
Assuming you want everything for a given key and know all the keys already:
import redis

r = redis.Redis(host='localhost', port=6379, db=0)

p = r.pipeline()
for key in keys:           # keys: your list of hash key names
    p.hgetall(key)
for h in p.execute():      # one round trip; replies come back in command order
    print(h)
More information about pipelines can be found here: http://redis.io/topics/pipelining
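If you also want each reply matched back to its key, pipeline replies arrive in the order the commands were queued, so zipping them against the key list works. A small sketch, with example key names:

import redis

r = redis.Redis(host='localhost', port=6379, db=0)

keys = ['user:1', 'user:2', 'user:3']         # example hash key names
p = r.pipeline()
for key in keys:
    p.hgetall(key)
results = dict(zip(keys, p.execute()))        # one round trip for all the hashes
print(results['user:1'])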