What is the best design pattern for batch insertion using the Django REST Framework?

Background

I have a Django app that allows record insertion via the Django REST Framework.

Records will be inserted periodically, row by row, by client applications that interrogate spreadsheets and other databases. The REST API allows these other applications, which handle data transformation etc., to be abstracted from Django.

Problem

I'd like to decouple the actual record insertion from the API to improve fault tolerance and the potential for scalability.

Suggested Approach

I am considering doing this with Celery, though I've not used it before. The idea would be to override perform_create() in my existing DRF ModelViewSets (perform_create() was added in DRF 3.0) to create Celery tasks that workers would then grab and process in the background.

The DRF documentation says that perform_create() "should save the object instance by calling serializer.save()". I'm wondering whether, in my case, I could ignore this recommendation and instead have my Celery tasks invoke the appropriate serializer to perform the object saves.
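For reference, the default perform_create() in DRF's CreateModelMixin is just a thin hook around the save call:

def perform_create(self, serializer):
    serializer.save()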

Example

If for example I've got a couple of models:

class Book(models.Model):
    name = models.CharField(max_length=32)

class Author(models.Model):
    surname = models.CharField(max_length=32)

And I've got DRF views and serializers for those models:

class BookSerializer(serializers.ModelSerializer):
    class Meta:
        model = Book
        fields = '__all__'

class AuthorSerializer(serializers.ModelSerializer):
    class Meta:
        model = Author
        fields = '__all__'

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

class AuthorViewSet(viewsets.ModelViewSet):
    queryset = Author.objects.all()
    serializer_class = AuthorSerializer

Would it be a good idea to override perform_create() in e.g. BookViewSet:

def perform_create(self, serializer):
    # .delay() queues the task for a worker instead of running it inline
    create_book_task.delay(serializer.data)

Where create_book_task is separately something like:

@shared_task
def create_book_task(data):
    serializer = BookSerializer(data=data)
    serializer.is_valid(raise_exception=True)  # must validate before save()
    serializer.save()

I haven't really been able to find examples of other developers doing something similar or solving the same problem. Am I overcomplicating this? My database is still going to be the limiting factor for the physical insertion, but at least the insertion won't block API clients from queueing up their data. I'm not committed to Celery if it isn't suitable. Is this the best solution, are there obvious problems with it, or are there better alternatives?



1 Answer

I find your approach sound. Celery is great, except for some edge cases that can get a little nasty in my experience (though I wouldn't expect you to run into those in the use case you outline in the question).

However, consider the following simplified approach using Redis directly. It has some pros and cons.

In BookViewSet:

from redis import StrictRedis
from rest_framework import viewsets, renderers

redis_client = StrictRedis()

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

    def perform_create(self, serializer):
        # Render the validated data to JSON bytes and queue them in Redis
        # instead of saving to the database here
        json = renderers.JSONRenderer().render(serializer.data)
        redis_client.lpush('create_book_task', json)

In a separate worker script:

# Note: a standalone worker needs Django configured first (set
# DJANGO_SETTINGS_MODULE and call django.setup()), or run it as a
# management command instead.
from io import BytesIO

from redis import StrictRedis
from rest_framework.parsers import JSONParser

from myproject import BookSerializer, Book

MAX_BATCH_SIZE = 1000

redis_client = StrictRedis()

def create_book_task():
    bookset = []
    while len(bookset) < MAX_BATCH_SIZE:
        # brpop blocks until an item is available (or the timeout expires)
        # and returns a (key, value) tuple, or None on timeout
        item = redis_client.brpop(('create_book_task',), timeout=1)
        if item is None:
            break  # flush whatever we have after a quiet second
        _key, json = item
        data = JSONParser().parse(BytesIO(json))
        serializer = BookSerializer(data=data)
        serializer.is_valid(raise_exception=True)
        # Build an unsaved instance so the inserts can be batched
        bookset.append(Book(**serializer.validated_data))

    if bookset:
        Book.objects.bulk_create(bookset)

while True:
    create_book_task()

Pros

  • You don't need to add Celery (again, I love it, but it makes testing a little trickier and can sometimes get a little hairy depending on workloads, configuration, etc.)
  • It handles bulk creation, so if you get thousands of books submitted over a short timespan (a second or less), only a few bulk inserts will be executed on the DB (as opposed to thousands of individual inserts)

Cons

  • You're taking care of the low-level serialization yourself instead of Celery doing it "magically"
  • You will need to manage the worker script yourself (daemonizing it, maybe packaging it as a management command, taking care of restarts, etc.) instead of handing that off to Celery; a minimal management-command sketch follows below
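For the last point, here is a minimal sketch of packaging the worker as a management command. The module path myproject/management/commands/consume_books.py is hypothetical, and create_book_task is assumed to be importable from wherever the worker code above ends up living:

# Hypothetical path: myproject/management/commands/consume_books.py
from django.core.management.base import BaseCommand

from myproject.tasks import create_book_task  # assumed location of the worker code

class Command(BaseCommand):
    help = 'Consume queued book records from Redis and bulk-insert them'

    def handle(self, *args, **options):
        # Loop forever; keep the process alive with systemd, supervisord, etc.
        while True:
            create_book_task()

You would then run it with python manage.py consume_books under your process supervisor of choice, and Django's setup comes for free.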

Of course the above is a first approach; you might want to make it more generic so it can be reused for additional models (a rough sketch of that follows below), move MAX_BATCH_SIZE to your settings, use pickling instead of JSON, or make a variety of other adjustments, improvements or design decisions according to your specific needs.
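As a rough illustration of the "more generic" direction (the registry, the queue names and the import path here are all hypothetical):

from io import BytesIO

from redis import StrictRedis
from rest_framework.parsers import JSONParser

from myproject import Author, AuthorSerializer, Book, BookSerializer

MAX_BATCH_SIZE = 1000
redis_client = StrictRedis()

# Hypothetical registry: Redis queue name -> (serializer class, model class)
QUEUES = {
    'create_book_task': (BookSerializer, Book),
    'create_author_task': (AuthorSerializer, Author),
}

def consume(queue_name):
    serializer_class, model_class = QUEUES[queue_name]
    instances = []
    while len(instances) < MAX_BATCH_SIZE:
        item = redis_client.brpop((queue_name,), timeout=1)
        if item is None:
            break  # flush a partial batch after a quiet second
        _key, raw = item
        serializer = serializer_class(data=JSONParser().parse(BytesIO(raw)))
        serializer.is_valid(raise_exception=True)
        instances.append(model_class(**serializer.validated_data))
    if instances:
        model_class.objects.bulk_create(instances)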

In the end, I would probably go along with the approach outlined in my answer, unless there are several other tasks you anticipate will be offloaded to asynchronous processing where the case for using Celery would become much stronger.

PS: Since the actual insertion will be done asynchronously, consider responding with a 202 Accepted response code instead of 201 Created (unless this screws up your clients). A minimal sketch of that follows.
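A minimal sketch, repeating the perform_create() override from above so the class is self-contained; only the create() override is new (the default implementation returns 201 Created):

from rest_framework import renderers, status, viewsets
from rest_framework.response import Response

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

    def perform_create(self, serializer):
        # As above: queue the validated data instead of saving it
        json = renderers.JSONRenderer().render(serializer.data)
        redis_client.lpush('create_book_task', json)

    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)  # queues the data; nothing is saved yet
        # 202 Accepted signals "queued for processing" rather than "created"
        return Response(serializer.data, status=status.HTTP_202_ACCEPTED)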
