
Synchronize home directories from multiple clients to a server

Tags: bash, backup, rsync

I'm using multiple Linux laptops/desktops and want them to "share" home directories.

NFS is unfortunately not an option. Therefore I was trying to create a bash script using rsync, but I can't figure out how to do it.

This is my example right now:

```bash
#!/bin/bash

sync() {
  # Mirror $1 to $2 over ssh, honouring the exclude list and deleting
  # files on the destination that no longer exist on the source.
  rsync -azvR --exclude-from=/home/ME/.rsync_excludes --delete -e 'ssh -ax' "$1" "$2"
}

sync /home/ME server.domain:/home/ME
#sync server.domain:/home/ME /home/ME
```

I think this would work great if I were only using a single client machine to update the server's files. Correct?

What if I delete a file on one client? That file won't be deleted on the other clients (after syncs, of course), will it?

Can I use rsync for this purpose, or should I look for another program? Hopefully not, though...

Edit: Since this solution shouldn't be just for me, I would appreciate it if the solution were more or less automatic.

Edit 2: Maybe there is a solution involving a repository of some sort: Subversion, Git, Mercurial, or something else.

Asked Apr 03 '09 by Daniel


2 Answers

rsync is good for keeping one location in sync with a master, or in other terms, mirroring A to B. That's not what you're doing, though. You'd have to rsync A to B and B to A, which brings a whole new set of problems. If a file disappeared, do you need to delete it on the other side or rsync it back? Maybe it was modified on the other side; you can't tell.
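To illustrate the problem, here is a minimal sketch of a naive two-way rsync using the paths from the question; it either resurrects deleted files or, with --delete, silently discards changes made on the other side.

```bash
# Naive two-way mirror (sketch). Without --delete, a file removed on the
# laptop is copied straight back from the server on the return pass.
rsync -az /home/ME/ server.domain:/home/ME/
rsync -az server.domain:/home/ME/ /home/ME/

# With --delete, the first pass removes anything that exists only on the
# server (e.g. changes made from another client) instead of fetching it.
rsync -az --delete /home/ME/ server.domain:/home/ME/
```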

Anyway, the solution to this problem comes in the form of Unison. It's a tool (works on Linux, OS X, Windows, BSD, ...; has CLI tools and GUI tools, and can be scheduled nicely in cron) which will keep your home directory, or any other directory, nicely in sync, and it is made to deal with almost any type of conflict or problem. Those people thought it all out way better than we could here.
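A minimal sketch of what that could look like; the profile name, server name, and ignore patterns here are assumptions for illustration, not part of the answer:

```bash
#!/bin/bash
# Write a Unison profile once, then run the same profile from each client.
mkdir -p ~/.unison
cat > ~/.unison/home.prf <<'EOF'
root = /home/ME
root = ssh://server.domain//home/ME
batch = true
ignore = Path .cache
ignore = Name *.tmp
EOF

# Non-interactive run; every client syncs against the same server copy,
# and Unison propagates deletions and detects conflicts on both sides.
unison home -batch
```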

Alternatively, there are SCMs. Many people use SCMs to manage their home directories. Subversion is popular for this, but I wouldn't recommend it at all: it will consume massive amounts of space, make everything horribly slow, and make keeping in sync depend on an active connection to the master repository. There are alternatives, like Git, and others, but they all have their downsides.

Either way, any SCM-based solution violates one very big rule of SCMs: you should never keep big binary data in them. SCMs are not made for this. You don't keep your photo collections, movies, documents, downloads, and the like in an SCM, even though you may want to keep them in sync or keep a history of them (especially so for pictures/documents).

It's important to understand that there is a difference between keeping backups and keeping in sync. Your backups should be kept in a remote/detached location and can contain a history of everything you own. I personally recommend rdiff-backup for this. It keeps a history of everything beautifully, uses the rsync algorithm under the hood to minimize traffic, and the backup location always looks like the most current state of the backup: you can just browse through it like you do normal files.
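As a rough sketch of that workflow; the backup path, restore target, and retention period are assumptions, not from the answer:

```bash
# Push the home directory to a remote rdiff-backup repository; the latest
# state is browsable there as plain files, older versions as increments.
rdiff-backup /home/ME server.domain::/backups/ME

# Restore a file as it existed three days ago (hypothetical path).
rdiff-backup --restore-as-of 3D server.domain::/backups/ME/some/file /tmp/some_file

# Optionally prune increments older than six months.
rdiff-backup --force --remove-older-than 6M server.domain::/backups/ME
```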

To summarize, I recommend you combine unison and rdiff-backup for an all-round solution to keeping your data safe and reliably in sync.
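Putting the two together could look roughly like this in each client's crontab; the schedule and paths are only an illustration:

```bash
# crontab -e on each client (sketch; times and paths are assumptions)
# Sync the home directory with the server every hour, quietly.
0 * * * *  unison home -batch -silent
# Take a versioned backup to the server once a night.
30 2 * * * rdiff-backup /home/ME server.domain::/backups/ME
```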

Answered Nov 15 '22 by lhunath


Why not do this using Subversion? The linked article details how the author synchronises and stores history using source control (you don't have to use Subversion, obviously; there are alternatives).
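A rough sketch of that workflow, assuming a repository already exists on the server; the repository URL and commit message are made up for illustration:

```bash
# One-time checkout of the shared home repository over the existing home
# directory on each client (--force allows unversioned files to remain).
svn checkout --force svn://server.domain/home-repo /home/ME

cd /home/ME
# Stage anything new, then push local changes to the server...
svn add --force .
svn commit -m "sync from this machine"
# ...and pull in changes (including deletions) made on the other clients.
svn update
```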

Answered Nov 15 '22 by Brian Agnew