Is there an upper limit to the suggested size of the value stored for a particular key in Redis?
Is 100KB too large?
Values can be strings (including binary data) of every kind, for instance you can store a jpeg image inside a value. A value can't be bigger than 512 MB.
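For instance, writing and reading binary data is just an ordinary SET and GET. A minimal sketch using the redis-py client (the host, port, file name, and key name are all assumptions for illustration):

```python
import redis

# Assumes a local Redis instance and the redis-py client; adjust host/port as needed.
r = redis.Redis(host="localhost", port=6379)

# Read an image file as raw bytes and store it under an arbitrary key.
with open("photo.jpg", "rb") as f:
    jpeg_bytes = f.read()

r.set("image:photo:1", jpeg_bytes)

# GET returns the same bytes: Redis treats the value as an opaque, binary-safe string.
restored = r.get("image:photo:1")
assert restored == jpeg_bytes
```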
The maximum allowed size of a key is 512 MB, but to reduce memory usage and keep key lookups efficient, it is recommended that each key not exceed 1 KB. The maximum allowed size of a String value is 512 MB, and each element of a Set, List, or Hash can likewise be up to 512 MB.
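One common way to keep keys under that suggested 1 KB even when your natural identifiers are long (full URLs, for example) is to key on a fixed-length digest of the identifier. A sketch in Python; the prefix and the choice of SHA-1 are assumptions for illustration, not a Redis requirement:

```python
import hashlib
import redis

r = redis.Redis(host="localhost", port=6379)

def short_key(natural_key: str, prefix: str = "cache") -> str:
    # SHA-1 gives a fixed 40-character hex digest, so the Redis key stays small
    # no matter how long the original identifier is.
    digest = hashlib.sha1(natural_key.encode("utf-8")).hexdigest()
    return f"{prefix}:{digest}"

long_url = "https://example.com/some/very/long/path?with=many&query=parameters"
r.set(short_key(long_url), b"cached response body")
```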
Azure Redis Cache is generally available in sizes up to 53 GB with an availability SLA of 99.9%; the premium tier offers sizes up to 530 GB and adds support for clustering, VNET, and persistence, also with a 99.9% SLA. (These figures describe the total size of a cache instance, not individual values.) Benefits of the premium tier include better performance with large data volumes and distribution across multiple systems. Redis itself also offers advantages such as fast access thanks to its in-memory design and client support for most common programming languages.
There are two things that you need to take into consideration when deciding if something is "too big".
Does Redis have support for the size of key/value object that you want to store?
The answer to this question is documented pretty well on the Redis site (https://redis.io/topics/data-types), so I won't go into detail here.
For a given key/value size, what are the consequences I need to be aware of?
This is a much more nuanced answer as it depends heavily on how you are using Redis and what behaviors are acceptable to your application and which ones are not.
For instance, larger key/value sizes can lead to fragmentation of the memory space within your server. If you aren't using all the memory in your Redis server anyway, this may not be a big deal to you. However, if you need to squeeze every bit of memory out of your Redis server that you can, then larger values reduce how efficiently memory is allocated and cost you access to memory you would otherwise have.
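If memory efficiency matters to you, Redis exposes a metric you can watch: the INFO memory command reports a mem_fragmentation_ratio field (resident set size divided by the memory Redis has allocated). A minimal check with redis-py; the 1.5 alert threshold is only an illustrative assumption:

```python
import redis

r = redis.Redis(host="localhost", port=6379)

# INFO memory includes mem_fragmentation_ratio: OS-resident memory / bytes allocated by Redis.
mem = r.info("memory")
ratio = mem["mem_fragmentation_ratio"]

# A ratio well above 1.0 means the OS is holding noticeably more memory than Redis
# is actually using; 1.5 here is just an example cutoff, not an official rule.
if ratio > 1.5:
    print(f"High fragmentation: ratio={ratio}, used={mem['used_memory_human']}")
```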
As another example, when you are reading these large key/value entries from Redis, it means you have to transfer more data over the network from the server to the client. Some consequences of this are:
It takes more time to transfer the data, so your client may need to have a higher timeout value configured to allow for this additional transfer time (see the client sketch after this list).
Requests made to the server on the same TCP connection can get stuck behind the big transfer and cause other requests to time out. See here for an example scenario.
Your network buffers used to transfer this data can impact available memory on the client or server, which can aggravate the available memory issues already described around fragmentation.
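To make the timeout point concrete: most clients let you raise the socket timeout, and some workloads go further and isolate large transfers on their own connection pool so a slow transfer cannot stall unrelated requests. A sketch with redis-py, where the specific timeout values and the two-client split are assumptions rather than recommendations:

```python
import redis

# Client for ordinary small requests: a short timeout so failures surface quickly.
small_ops = redis.Redis(host="localhost", port=6379,
                        socket_connect_timeout=2, socket_timeout=2)

# Separate client (and therefore a separate connection pool) for known-large values:
# the longer timeout allows for the extra transfer time, and keeping these reads on
# their own connections means a multi-megabyte GET cannot queue up behind (or in
# front of) the small, latency-sensitive requests.
large_ops = redis.Redis(host="localhost", port=6379,
                        socket_connect_timeout=2, socket_timeout=30)

small_ops.set("counter", 1)
big_value = large_ops.get("report:latest")  # hypothetical key holding a large value
```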
If these large key/value items are accessed frequently, the impacts described above are magnified because you are transferring the same large payloads over and over again.
So, the answer is not a crisp "yes" or "no", but some things that you should consider and possibly test for your expected workload. In general, I do advise our customers to try to stay as small as possible and I have often said to try to stay below 100 KB, but I have also seen plenty of customers use Redis with larger values (in the MB range). Sometimes those larger values are no big deal. In other cases, it may not be an issue until months or years later when their application changes in load or behavior.
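If you want to see how your own data measures up against a budget like 100 KB, you can walk the keyspace with SCAN and ask Redis how much memory each key consumes. A sketch with redis-py; the 100 KB threshold and scan batch size are assumptions, and MEMORY USAGE requires Redis 4.0 or later:

```python
import redis

r = redis.Redis(host="localhost", port=6379)

THRESHOLD = 100 * 1024  # illustrative 100 KB budget, not a hard Redis limit

# SCAN iterates the keyspace incrementally, so it is safe against a live server
# (unlike KEYS *). MEMORY USAGE reports the bytes a key and its value consume.
for key in r.scan_iter(count=1000):
    size = r.memory_usage(key)
    if size is not None and size > THRESHOLD:
        print(f"{key!r}: {size} bytes")
```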
Is there an upper limit to the suggested size of the value stored for a particular key in Redis?
According to the official docs, the maximum size of a key (a String) in Redis is 512 MB.
Is 100KB too large?
It depends on the application and how the value is used; for general-purpose applications, 100 KB should be fine.