 

Django multidb: write to multiple databases

With Django multidb, it's fairly easy to write a router that drives a master/slave infrastructure. But is it possible to write a router that writes to multiple databases? My use case is a collection of projects, all running on the same domain. To save users from registering and logging in on every site, I'd like to synchronize the contrib.auth and contrib.sessions tables. Is that possible with Django multidb, or should I look into the replication features of the database system (MySQL in my case)?
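For reference, the master/slave router mentioned above can be sketched in a few lines. This is only an illustration: the `'default'` and `'slave'` aliases are assumed entries in the DATABASES setting, not anything from the question.

```python
import random

class MasterSlaveRouter(object):
    """Send all writes to the master and spread reads across both.
    The 'default' (master) and 'slave' aliases are hypothetical
    entries in settings.DATABASES."""

    def db_for_read(self, model, **hints):
        # Reads may go to either database.
        return random.choice(['default', 'slave'])

    def db_for_write(self, model, **hints):
        # Writes always go to the master.
        return 'default'
```

The question is whether `db_for_write` can return more than one alias; it can't, which is what makes the synchronization problem below non-trivial.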

asked Oct 28 '10 by Benjamin Wohlwend

2 Answers

I think you would be better off implementing an SSO or OAuth service.

But if you want to synchronize your users table between two databases and you are using your own user model, you can do something like this:

from django.contrib.auth.models import User
from django.db import models

class MyUser(models.Model):
    name = models.CharField(max_length=100)
    user = models.ForeignKey(User, unique=True)

    def save(self, *args, **kwargs):
        # Write the same row to both configured databases.
        super(MyUser, self).save(using='database_1')
        super(MyUser, self).save(using='database_2')

You can also wrap this up in a class decorator, so you can reuse it to synchronize other tables:

def save_multi_db(model_class):

    def save_wrapper(save_func):
        def new_save(self, *args, **kwargs):
            super(model_class, self).save(using='database_1')
            super(model_class, self).save(using='database_2')
        return new_save

    func = getattr(model_class, 'save')
    setattr(model_class, 'save', save_wrapper(func))

    # return the class, so @save_multi_db works as a class decorator
    return model_class

# and use it like this:

@save_multi_db
class MyUser(models.Model):
      ....

Hope this will help :)

answered Oct 09 '22 by mouad

I'm working on a Django sharding scheme right now.

I looked at the Django router but decided to roll my own.

Some thoughts on your issue:

  1. One idea is to use one database and then copy over the appropriate data via Django's post_save signal.

something like--

from django.conf import settings

def post_save_function(sender, instance, **kwargs):
     # Copy the instance to every other configured database.
     # (Real code must guard against re-entrancy: these saves
     # fire post_save again.)
     for db in settings.DATABASES:
          if db != instance._state.db:
               instance.save(using=db, force_insert=True)

Saving User objects (at least on a single-DB setup) seems to save session, auth, etc. data under the hood via the various magic occurring inside django.contrib, so there's a chance you might not have to go in and figure out the names and types of all those database tables.

In support of the possibility of this working, I swear I read somewhere recently (probably in one of Alex Gaynor's blog posts) that if an object has a foreign key, Django will attempt to use the same DB the object lives on (for obvious reasons, in light of the way Django typically operates).
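One wrinkle with the post_save idea: every copy written inside the handler fires post_save again, so the handler must not re-enter itself. Here is a framework-free sketch of that guard; `save_to` and the `_syncing` flag are stand-ins for `instance.save(using=...)` and whatever re-entrancy marker a real handler would use.

```python
def sync_copies(instance, aliases, source_db, save_to):
    """Copy `instance` to every database alias except `source_db`.

    `save_to(instance, alias)` stands in for instance.save(using=alias);
    the `_syncing` flag stops the function from re-entering itself when
    those copies trigger the signal handler again.
    """
    if getattr(instance, '_syncing', False):
        return
    instance._syncing = True
    try:
        for alias in aliases:
            if alias != source_db:
                save_to(instance, alias)
    finally:
        instance._syncing = False
```

With this shape, a copy triggered from inside the handler sees `_syncing` set and returns immediately instead of fanning out again.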

  2. Another idea:

from the example on the Django multiDB page you referenced I wonder if something like the following would work:

their sample code:

def allow_syncdb(self, db, model):
    "Explicitly put all models on all databases."
    return True

a possible modification:

def allow_syncdb(self, db, model):

    # the router receives the model *class*, so test with issubclass
    if issubclass(model, User):
         return True

    elif issubclass(model, Session):
         return True

    else:
         ''' something appropriate --
             whatever your sharding schema is for other objects '''

Looking at it again, this code would probably be more useful as the "db_for_write" function. But you get the idea.
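As a sketch of that reshaping: the same app-level check dropped into db_for_write (and db_for_read), routing contrib.auth and contrib.sessions models to a hypothetical 'shared' database alias. Checking `model._meta.app_label` instead of specific classes catches all the related auth tables at once; both the alias name and that design choice are assumptions, not anything from the original answer.

```python
class SharedAuthRouter(object):
    """Route auth/session models to a shared database; return None for
    everything else so other routers (or the default) can decide.
    The 'shared' alias is a hypothetical entry in settings.DATABASES."""

    shared_apps = ('auth', 'sessions')

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.shared_apps:
            return 'shared'
        return None

    def db_for_write(self, model, **hints):
        # same routing decision for reads and writes
        return self.db_for_read(model, **hints)

    def allow_syncdb(self, db, model):
        # shared apps live only on the shared database
        if model._meta.app_label in self.shared_apps:
            return db == 'shared'
        return None
```

Note this still routes each write to a single database; it solves "where do auth tables live", not "write one row to two places", which is why the signal or replication approaches remain relevant.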

There are no doubt other model classes you would have to add to make this work (all the auth stuff, which is extensive).

Good luck! Hope this is helpful in some way.

I'm interested in your findings and comments!

jb

answered Oct 09 '22 by jsh