
Django database synchronization for offline usage


I have one master Django server where the data is stored (MySQL database).

Online: I would like many users to have a synchronized copy of this database on their laptops (SQLite DB); only deltas should be copied.

Offline (users do not have access to the master server): users can view and update their local database.

Back online: what has been modified on the users' laptops is synchronized back to the master Django server.

I think, as I have two kinds of databases, I need to synchronize at the Django object level. Is there a Django application that does this? If not, how would you proceed to code such a feature?

Eric asked Jul 29 '11 at 08:07


1 Answer

Turns out that I'm running a system like this in Django.

This is not a complete answer, just the approach that is currently (mostly) solving the problem for me.

  • Use UUIDs for primary keys. That greatly reduces primary-key collisions between objects created independently on different machines.
  • Use Django's serialization framework for data interchange. The central admin site has an option to download the objects selected in the changelist to a Django-compatible serialized file. The user can then go offline, start a local admin site, and upload the serialized file there. When offline editing is finished, the same process is used in reverse: in the "offline" admin site the objects are serialized to a file, which is then uploaded to the central admin site.
  • The serialization framework is very useful, since you get an actual (but unsaved) object, can decide whether or not to save it, and can modify some fields before saving (see the sketch after this list).
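
To make the first two points concrete, here is a minimal sketch. The Article model and the two helper functions are hypothetical examples of mine, not part of the original setup, and UUIDField requires Django 1.8+. It shows a UUID-keyed model plus a serialization round-trip with django.core.serializers:

    import uuid

    from django.core import serializers
    from django.db import models


    class Article(models.Model):
        # UUID primary key: generated locally on each laptop, so objects
        # created offline are very unlikely to collide when merged back
        # into the master database.
        id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
        title = models.CharField(max_length=200)
        body = models.TextField()


    def export_articles(queryset, path):
        # Dump a queryset to a Django-compatible JSON fixture file.
        with open(path, "w") as fh:
            serializers.serialize("json", queryset, stream=fh)


    def import_articles(path):
        # Load a fixture file and decide, object by object, whether to save it.
        with open(path) as fh:
            for deserialized in serializers.deserialize("json", fh.read()):
                obj = deserialized.object  # actual, still-unsaved model instance
                # inspect or tweak obj's fields here before committing
                deserialized.save()

The same JSON file works against both the MySQL master and the SQLite copies, which is what makes this object-level exchange database-agnostic.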

We have run into very little trouble with this simple system, helped also by the fact that the content is properly categorized and each editor only creates/edits a non-overlapping set of categories.

I have talked about this with some people, and they proposed several solutions:

  • Use a timestamp field: that helps decide which version to keep and which to discard (see the first sketch after this list).
  • Use version fields, with major and minor version numbers. Minor edits (like spelling corrections) only bump the minor version number, while major changes bump the major version number and reset the minor to 0. That way, when comparing, you always know which copy gets higher priority. However, this needs education and conventions among the editing users.
  • Object updates. A separate model stores updates coming from offline edits. A 'chief' editor then merges them into the actual object, helped by some additional admin views that show differences (using google-diff-match-patch and the like). An object can also be flagged to allow direct updates, that is, its updates are not queued but applied directly on arrival. The inconvenience is that the chief editor has to review all the updates, and how much work that is depends on how much information gets updated (a rough sketch follows below).
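
For the first two suggestions, a hedged sketch of what the conflict-resolution fields could look like, again on the hypothetical Article model from above and with a made-up incoming_wins helper:

    from django.db import models


    class Article(models.Model):
        title = models.CharField(max_length=200)
        body = models.TextField()
        # Timestamp-based resolution: the most recently modified copy wins.
        modified_at = models.DateTimeField(auto_now=True)
        # Version-based resolution: minor edits (spelling fixes) bump `minor`,
        # real changes bump `major` and reset `minor` to 0.
        major = models.PositiveIntegerField(default=1)
        minor = models.PositiveIntegerField(default=0)


    def incoming_wins(local, incoming):
        # Decide whether an offline edit should replace the central copy.
        if (incoming.major, incoming.minor) != (local.major, local.minor):
            return (incoming.major, incoming.minor) > (local.major, local.minor)
        return incoming.modified_at > local.modified_at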
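
And for the third suggestion, a rough sketch of a separate model that queues offline edits until a chief editor merges them. The names are again hypothetical, and the serialized payload could come from the serialization round-trip shown earlier:

    from django.conf import settings
    from django.db import models


    class PendingUpdate(models.Model):
        # An offline edit waiting to be reviewed and merged by a chief editor.
        target = models.ForeignKey(
            "Article", on_delete=models.CASCADE, related_name="pending_updates"
        )
        submitted_by = models.ForeignKey(
            settings.AUTH_USER_MODEL, on_delete=models.SET_NULL, null=True
        )
        serialized_data = models.TextField()  # Django-serialized object payload
        submitted_at = models.DateTimeField(auto_now_add=True)
        merged = models.BooleanField(default=False)

        def mark_merged(self):
            # Called after the chief editor has reviewed the diff (e.g. with
            # google-diff-match-patch) and copied the wanted fields onto
            # self.target. Objects flagged for direct updates could skip the
            # queue entirely and be saved on arrival.
            self.merged = True
            self.save(update_fields=["merged"])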

Hope this helps in some way. If anyone decides to implement something along these lines, I'd love to hear about it.

Armando Pérez Marqués answered Oct 20 '22 at 17:10