
Version control of deliverables

We need to regularly synchronize many dozens of binary files (project executables and DLLs) between many developers at several different locations, so that every developer has an up-to-date environment to build and test in. Due to the nature of the project, updates must be done often and on demand (overnight updates are not sufficient). This is not pretty, but we are stuck with it for a time.

We settled on using a regular version (source) control system: put everything into it as binary files, get the latest before testing, and check in the updated DLLs after testing.

It works fine, but a version control client has a lot of features that don't make sense for us, and people occasionally get confused.

Are there any tools better suited for the task? Or maybe a completely different approach?

Update:

I need to clarify that it's not a tightly integrated project - more like an extensible system with a heap of "plugins", including third-party ones. We need to make sure those plugin modules work nicely with recent versions of each other and of the core. A centralised build, as was suggested, was considered initially, but it's not an option.

ima asked Mar 01 '23 08:03
2 Answers

I'd probably take a look at rsync.

Just create a .CMD file that contains the call to rsync with all the correct parameters and let people call that. rsync is very smart in deciding which parts of a file need to be transferred, so it'll be very fast even when large files are involved.

What rsync doesn't do, though, is conflict resolution (or even detection), but in the scenario you described it's more like reading from a central place, which is exactly what rsync is designed to handle.
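A minimal sketch of such a wrapper script, assuming the binaries are exported from a hypothetical `buildserver` host via an rsync daemon module named `deliverables` (both names are placeholders to adapt to your setup):

```shell
#!/bin/sh
# sync-deliverables.sh -- pull the latest binaries from the central share.
#
# -a        archive mode: recurse and preserve timestamps/permissions,
#           so rsync can skip files that haven't changed
# -v        verbose, so developers see which binaries were updated
# -z        compress data during transfer
# --delete  remove local binaries that were removed centrally
rsync -avz --delete rsync://buildserver/deliverables/ ./bin/
```

On Windows the same call would sit inside the .CMD file the answer mentions; rsync's delta-transfer algorithm means only the changed parts of each DLL cross the wire.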

pilif answered Mar 08 '23 15:03

Another option is unison.
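Unlike rsync, Unison synchronizes in both directions and detects conflicting updates. A hypothetical invocation (host and paths are placeholders), run non-interactively so it can be scripted the same way as the rsync wrapper:

```shell
# Two-way sync between the local binaries directory and the central copy
# over SSH. -batch runs without prompting; genuinely conflicting changes
# are skipped and reported rather than silently overwritten.
unison ./bin ssh://buildserver//srv/deliverables -batch
```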

user11087 answered Mar 08 '23 15:03