Avoiding Memcache "1000000 bytes in length" limit on values


My model has different entities that I'd like to calculate once, like the employees of a company. To avoid running the same query again and again, the calculated list is saved in Memcache (duration = 1 day). The problem is that the app sometimes gives me an error because more bytes are being stored in Memcache than is permissible:

Values may not be more than 1000000 bytes in length; received 1071339 bytes

Is storing a list of objects something you should be doing with Memcache at all? If so, what are the best practices for avoiding the error above? I'm currently pulling 1000 objects. Do you limit values to fewer than 200? Checking an object's size in memory doesn't seem like a good idea, because the objects are probably processed (serialized or something like that) before going into Memcache.
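For what it's worth, since the 1,000,000-byte limit applies to the serialized value rather than the in-memory object, one way to check ahead of time is to measure the pickled representation directly. A minimal sketch (assuming the value is pickled before storage, as Python memcache clients typically do):

```python
import pickle

MEMCACHE_LIMIT = 1000000  # Memcache's per-value cap in bytes

def serialized_size(value):
    """Size in bytes of the pickled form of value (protocol 2)."""
    return len(pickle.dumps(value, 2))

# Example: 1000 small dicts stay well under the limit.
employees = [{'id': i, 'name': 'employee-%d' % i} for i in range(1000)]
print(serialized_size(employees) < MEMCACHE_LIMIT)  # → True
```

This measures what Memcache will actually see, so it avoids guessing from in-memory object sizes.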

David Haddad asked Feb 03 '12 11:02



1 Answer

David, you don't say which language you're using, but in Python you can do the same thing Ibrahim suggests using pickle. All you need to do is write two little helper functions that read and write a large value to memcache in chunks. Here's an (untested) sketch:

import pickle

from google.appengine.api import memcache  # or whichever memcache client you use

def store(key, value, chunksize=950000):
    # Serialize once, then split into chunks safely under the 1 MB limit.
    serialized = pickle.dumps(value, 2)
    values = {}
    for i in xrange(0, len(serialized), chunksize):
        values['%s.%s' % (key, i // chunksize)] = serialized[i:i + chunksize]
    return memcache.set_multi(values)

def retrieve(key):
    # Fetch up to 32 chunks (about 30 MB at the default chunksize).
    result = memcache.get_multi(['%s.%s' % (key, i) for i in xrange(32)])
    # Reassemble in numeric chunk order (lexicographic sorting would put
    # 'key.10' before 'key.2'), stopping at the first missing chunk.
    parts = []
    for i in xrange(32):
        chunk = result.get('%s.%s' % (key, i))
        if chunk is None:
            break
        parts.append(chunk)
    return pickle.loads(''.join(parts))
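To see the chunking round-trip without a live memcache server, here's a standalone sketch (Python 3; a plain dict stands in for the memcache client, and `set_multi`/`get_multi` are the only operations assumed):

```python
import pickle

# A plain dict stands in for the memcache client in this sketch.
cache = {}

def set_multi(mapping):
    cache.update(mapping)
    return []  # memcache-style: list of keys that failed to store

def get_multi(keys):
    return {k: cache[k] for k in keys if k in cache}

def store(key, value, chunksize=950000):
    # Serialize once, then split into chunks under the 1 MB limit.
    serialized = pickle.dumps(value, 2)
    chunks = {'%s.%s' % (key, i // chunksize): serialized[i:i + chunksize]
              for i in range(0, len(serialized), chunksize)}
    return set_multi(chunks)

def retrieve(key):
    result = get_multi(['%s.%s' % (key, i) for i in range(32)])
    parts = []
    for i in range(32):  # reassemble in numeric chunk order
        chunk = result.get('%s.%s' % (key, i))
        if chunk is None:
            break
        parts.append(chunk)
    return pickle.loads(b''.join(parts))

employees = [{'id': i, 'name': 'employee-%d' % i} for i in range(1000)]
store('employees', employees, chunksize=10000)  # tiny chunks to force splitting
assert retrieve('employees') == employees
```

The small `chunksize` in the example forces the value to split across several keys, exercising the reassembly path.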
Guido van Rossum answered Sep 17 '22 16:09
