 

Limiting Memory Use in a *Large* Django QuerySet

I have a task which needs to run on 'most' of the objects in my database once every so often (once a day, once a week, whatever). Basically this means that I have some query that looks like this running in its own thread.

for model_instance in SomeModel.objects.all():
    do_something(model_instance)

(Note that it's actually a filter() rather than all(), but nonetheless I still end up selecting a very large set of objects.)

The problem I'm running into is that after running for a while, the thread is killed by my hosting provider because I'm using too much memory. I'm assuming all this memory use happens because, even though the QuerySet object returned by my query initially has a very small memory footprint, it grows as the QuerySet caches each model_instance while I iterate through them.
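Roughly, my mental model of what's going on (just a sketch; _result_cache is a Django internal, so the exact details may vary by version):

qs = SomeModel.objects.all()       # no query yet -- QuerySets are lazy
for model_instance in qs:          # first iteration runs the query...
    do_something(model_instance)   # ...and every row lands in qs._result_cache
# by the end of the loop the entire result set is pinned in memory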

My question is, "what is the best way to iterate through almost every SomeModel in my database in a memory-efficient way?" or perhaps my question is "how do I 'un-cache' model instances from a Django queryset?"

EDIT: I'm actually using the results of the queryset to build a series of new objects. As such, I don't end up updating the queried-for objects at all.

Chris W. asked Jan 31 '11 22:01


People also ask

Is Django QuerySet lazy?

This is because a Django QuerySet is a lazy object. It contains all of the information it needs to populate itself from the database, but will not actually do so until the information is needed.
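A quick sketch (SomeModel and the field name are just placeholders):

qs = SomeModel.objects.filter(active=True)  # builds the query; nothing sent to the DB
qs = qs.order_by('id')                      # still nothing sent to the DB
first = qs[0]                               # the SQL actually runs here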

How do I get the size of a QuerySet in Django?

If the QuerySet only exists to count the number of rows, use count(). If the QuerySet is used elsewhere, i.e. in a loop, use len() (or the |length template filter).
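For example (the filter is illustrative):

n = SomeModel.objects.filter(active=True).count()  # SELECT COUNT(*) -- no rows fetched

qs = SomeModel.objects.filter(active=True)
n = len(qs)        # runs the query and caches every row
for obj in qs:     # reuses the cache -- no second query
    do_something(obj)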

What does &lt;QuerySet []&gt; mean?

A QuerySet is a collection of data from a database. A QuerySet is built up as a list of objects. QuerySets makes it easier to get the data you actually need, by allowing you to filter and order the data.
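In the shell, &lt;QuerySet []&gt; is simply the repr of an empty QuerySet (output shown for a hypothetical SomeModel):

>>> SomeModel.objects.filter(pk=-1)       # matches nothing
<QuerySet []>
>>> SomeModel.objects.order_by('id')[:2]  # matches something
<QuerySet [<SomeModel: SomeModel object (1)>, <SomeModel: SomeModel object (2)>]>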

How does Django handle large data?

Use bulk queries to efficiently query large data sets and reduce the number of database requests. The Django ORM can perform several insert or update operations in a single SQL query. If you're planning on inserting more than 5000 objects, specify batch_size.
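A minimal bulk_create sketch (the model and field name are placeholders):

objs = [SomeModel(name='item %d' % i) for i in range(100000)]
SomeModel.objects.bulk_create(objs, batch_size=5000)  # ~20 INSERT statements instead of 100,000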


2 Answers

What about using Django core's Paginator and Page objects, documented here:

https://docs.djangoproject.com/en/dev/topics/pagination/

Something like this:

from django.core.paginator import Paginator
from djangoapp.models import SomeModel

paginator = Paginator(SomeModel.objects.all(), 1000)  # chunks of 1000

for page_idx in paginator.page_range:  # 1..num_pages, inclusive
    for row in paginator.page(page_idx).object_list:
        # here you can do what you want with the row
        pass
    print("done processing page %s" % page_idx)
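Each paginator.page(n) call issues its own LIMIT/OFFSET query, so only one page's worth of objects needs to be in memory at a time (assuming you don't hold references to the rows yourself).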
mpaf answered Sep 26 '22 01:09


So what I actually ended up doing was building something you can 'wrap' a QuerySet in. It works by taking a deepcopy of the QuerySet and slicing it--e.g., some_queryset[15:45]--and then making another deepcopy of the original QuerySet once that slice has been completely iterated through. This means that only the objects returned in 'this' particular slice are held in memory.

import copy
import logging

logger = logging.getLogger(__name__)


class MemorySavingQuerysetIterator(object):

    def __init__(self, queryset, max_obj_num=1000):
        self._base_queryset = queryset
        self.max_obj_num = max_obj_num
        self._generator = self._setup()

    def _setup(self):
        for i in range(0, self._base_queryset.count(), self.max_obj_num):
            # By making a copy of the queryset and using that to actually access
            # the objects we ensure that there are only `max_obj_num` objects in
            # memory at any given time
            smaller_queryset = copy.deepcopy(self._base_queryset)[i:i + self.max_obj_num]
            logger.debug('Grabbing next %s objects from DB', self.max_obj_num)
            for obj in smaller_queryset.iterator():
                yield obj

    def __iter__(self):
        return self

    def __next__(self):
        return next(self._generator)

So instead of...

for obj in SomeObject.objects.filter(foo='bar'):  # <-- something that returns *a lot* of objects
    do_something(obj)

You would do...

for obj in MemorySavingQuerysetIterator(SomeObject.objects.filter(foo='bar')):
    do_something(obj)

Please note that the intention of this is to save memory in your Python interpreter. It essentially does this by making more database queries. Usually people are trying to do the exact opposite--i.e., minimize database queries as much as possible without regard to memory usage. Hopefully somebody will find this useful though.
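For what it's worth, on Django 2.0 and newer the built-in QuerySet.iterator() accepts a chunk_size argument and streams rows without populating the result cache, which gets much the same effect without a wrapper class:

for obj in SomeObject.objects.filter(foo='bar').iterator(chunk_size=1000):
    do_something(obj)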

Chris W. answered Sep 26 '22 01:09