I'm using Django's per-view @cache_page
decorator and have set a different key_prefix
for each view.
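Roughly, the views are decorated like this (the view name, timeout, and prefix here are just placeholders):

from django.views.decorators.cache import cache_page

@cache_page(60 * 15, key_prefix="letters")
def letters_view(request):
    ...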
I've previously deleted the cache with:
from django.core.cache import cache
cache.clear()
But what if I just want to delete the keys containing a specific key_prefix? I can obviously do it by connecting to the database and deleting with raw SQL, but I wonder if it can be done with 'pure' Django?
I'm using a database cache, not a memory cache.
I'm using Django 1.11 and Python 3.6
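For reference, the cache is configured roughly like this (the table name is just an example):

# settings.py -- database cache backend; the table is created with
# `python manage.py createcachetable`
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'my_cache_table',
    }
}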
For convenience, Django offers different levels of cache granularity: You can cache the output of specific views, you can cache only the pieces that are difficult to produce, or you can cache your entire site. Django also works well with “downstream” caches, such as Squid and browser-based caches.
As its name implies, local-memory caching stores cached data in RAM on the machine where Django is running. Local memory caching is fast, responsive, and thread-safe. The downside is that it works best if you're only running a single instance of Django.
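For example, a local-memory cache is configured along these lines (LOCATION is only needed to tell multiple locmem caches apart):

# settings.py -- local-memory cache backend
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'unique-snowflake',
    }
}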
Memcached can be used with Django by pointing the cache backend in the CACHES setting at the IP address and port on which the Memcached daemon is running.
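On Django 1.11 that configuration looks like this (address and port are placeholders):

# settings.py -- Memcached backend (requires the python-memcached package)
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
    }
}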
As @e4c5 mentioned, the cache is meant for fast storage, so you should really be using Redis for this. But since your question is about the database backend, I'll answer for that.
There is no existing function in Django to do this. But the best part of Python is that you can easily monkey-patch to add new functionality. Below is a test view I created:
from django.core.cache import cache
from django.http import HttpResponse

def index(request):
    cache.set("name", "tarun")
    cache.set("name_1", "tarun")
    cache.set("name2", "tarun")
    cache.set("name_4", "tarun")
    cache.set("nam", "tarun")
    cache.clear(prefix="name")  # uses the patched clear() shown below
    nam = cache.get("nam")
    name_4 = cache.get("name_4", default="deleted")
    return HttpResponse("Hello, world. nam={nam}, name_4={name_4}".format(nam=nam, name_4=name_4))
To get the prefix functionality you need to add the patch code below somewhere; I used settings.py, as follows:
# keep a reference to the stock clear() in case you need to restore it later
original_clear = None

def patch_clear():
    from django.db import connections, router
    from django.core.cache.backends.db import DatabaseCache

    def __clear(self, prefix=None, version=None):
        # patched clear(): optionally delete only rows whose key starts with a prefix
        db = router.db_for_write(self.cache_model_class)
        connection = connections[db]
        table = connection.ops.quote_name(self._table)
        with connection.cursor() as cursor:
            if prefix is None:
                cursor.execute('DELETE FROM %s' % table)
            else:
                # make_key applies KEY_PREFIX/version, matching how the keys were stored
                prefix = self.make_key(prefix, version)
                # parameterised LIKE avoids quoting problems in the prefix
                cursor.execute("DELETE FROM %s WHERE cache_key LIKE %%s" % table,
                               [prefix + '%'])

    global original_clear
    original_clear = DatabaseCache.clear
    DatabaseCache.clear = __clear

patch_clear()
TL;DR: cache.delete and cache.delete_many are your available options.
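For example (the key names here are just placeholders for whatever keys your views actually produced):

from django.core.cache import cache

cache.delete('some-single-key')                 # remove one entry
cache.delete_many(['key-a', 'key-b', 'key-c'])  # remove several entries in one call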
Long answer.
@cache_page is overrated. When you use this decorator, you often find that the cache contains many more entries than you expected, and you end up wanting to delete a whole bunch of them, which seems to be exactly what has happened here.
I'm using a database cache, not a memory cache.
One of the main ideas of caching is to reduce the load on the server; another is to avoid expensive calculations or DB queries. But in reality a great many web pages do not involve expensive calculations, and most slow queries can be optimized by carefully choosing your indexes.
If the database itself is the cache, you are not reducing the load on the database. And what if you need to display different content for different users? This gets awfully complicated.
what if I just want to delete the keys containing a specific key_prefix?
Consider using Redis. It is one of the best caching backends available for Django (as a third-party module). Being able to delete multiple keys in a single command is one of its many useful features.
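For example, with the django-redis backend (the location and the pattern below are assumptions about your setup):

# settings.py -- assumes the django-redis package is installed
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/1',
    }
}

# anywhere in your code: delete every key matching a glob pattern
from django.core.cache import cache
cache.delete_pattern('*myprefix*')  # delete_pattern is specific to django-redis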
I achieved what I wanted with code like this:
cache.delete_many(keys=cache.keys('*.letters.*'))
It deletes all cache entries whose keys contain "letters".
EDIT: I use a Redis server; I haven't tested this with other cache backends.