I'm wondering what the best way is to keep binary dependencies in a "team-friendly" way.

We have many dependencies for a cross-platform application. That includes a commercial library in 32/64-bit Linux/Windows/Mac versions and a few open-source libraries compiled in non-standard, not-so-trivial-to-reproduce environments. Moreover, we have graphical assets that are sometimes large (250 MB is quite common).

Libraries need to be updated and sometimes recompiled; assets are updated too, and we need them to stay in sync with the code.

I want to achieve something close to a one-click update-and-build system.

I tried keeping everything in SVN, but that leads to very long updates, even when the data hasn't been modified.

Now I am thinking about a scripted system that would download and decompress zipped files with the libs and assets, but only when needed, as a build event. The scripts would be versioned; the data would not.
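For illustration, here is a minimal sketch of such a script in Python. The manifest format, paths, and keys are hypothetical; the point is the download-verify-extract-and-skip-if-current pattern:

```python
# fetch_deps.py -- sketch of an on-demand dependency fetcher run as a build event.
# Manifest format, paths, and names are hypothetical.
import hashlib
import json
import urllib.request
import zipfile
from pathlib import Path

MANIFEST = Path("deps/manifest.json")  # versioned alongside the code
CACHE = Path("deps/cache")             # ignored by version control
STAMPS = Path("deps/.stamps")          # records what is already installed

def install(name: str, url: str, sha256: str, dest: Path) -> None:
    """Download and unpack one dependency, skipping it if already current."""
    stamp = STAMPS / name
    if stamp.exists() and stamp.read_text() == sha256:
        print(f"{name}: up to date")
        return
    CACHE.mkdir(parents=True, exist_ok=True)
    archive = CACHE / f"{name}.zip"
    print(f"{name}: downloading {url}")
    urllib.request.urlretrieve(url, archive)
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    if digest != sha256:
        raise RuntimeError(f"{name}: checksum mismatch")
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)
    STAMPS.mkdir(parents=True, exist_ok=True)
    stamp.write_text(sha256)

if __name__ == "__main__":
    # manifest.json maps each name to {"url": ..., "sha256": ..., "dest": ...}
    for name, entry in json.loads(MANIFEST.read_text()).items():
        install(name, entry["url"], entry["sha256"], Path(entry["dest"]))
```

The manifest would live in version control next to the code, so updating a library or asset pack is just an edit to one small text file.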
But maybe there is a ready-to-use solution? Do you have any experience with that?
With Mercurial 2.0, you can use the largefiles extension to manage such files. They will be kept outside of the normal history and downloaded on demand. The extension still has some sharp edges, so make sure you use at least version 2.0.1.
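For reference, the extension is enabled in your Mercurial configuration, and big binaries are then marked explicitly when added (the file name here is just an example):

```
# In .hgrc (or Mercurial.ini on Windows):
[extensions]
largefiles =
```

```
# Mark a big binary as a largefile when adding it:
hg add --large assets/world_map.psd
hg commit -m "Add large asset via largefiles"
```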
That being said, I prefer to use a dedicated tool for this, since dependency management is really best done outside of the source control system.
Submodules work great for this. On one project, the application code was hosted privately on unfuddle.com, and all the DLLs and other large files, which were in the public domain anyway, were kept on GitHub as a submodule. This ended up using very little of our private storage.
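A typical setup looks like the following; the repository URL and path are hypothetical:

```
# Attach the public repository of binaries as a submodule:
git submodule add https://github.com/example/app-binaries.git vendor/binaries
git commit -m "Track the binaries repository as a submodule"

# Teammates who clone the main repository fetch the pinned revision with:
git submodule update --init
```

The main repository records only a commit pointer into the binaries repository, so the private host stores almost nothing extra.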