 

replicate Hg repo with all largefiles

Tags:

mercurial

We have a large, old repository with largefiles. I want to replicate the repository to a backup server using a cron script that runs hg pull. However, this command doesn't retrieve the largefiles.

I currently have 2GB of history replicated, but I'm missing 6GB of largefiles. How can I get Hg to pull down those important files?
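
For reference, a minimal sketch of the kind of cron job described above (the repository path, remote URL, and schedule are assumptions, not taken from the question):

    # crontab entry: run the mirror script every night at 03:00
    0 3 * * * /usr/local/bin/hg-mirror.sh

    #!/bin/sh
    # /usr/local/bin/hg-mirror.sh -- nightly mirror of the backup clone
    cd /srv/backup/bigrepo || exit 1
    hg pull    # fetches new changesets, but not the largefile contents themselves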

asked Aug 25 '14 by Chris Jones

1 Answer

By default, only the largefiles for the revision you update to are downloaded.

'hg help largefiles' says:

When you pull a changeset that affects largefiles from a remote repository,
the largefiles for the changeset will by default not be pulled down. However,
when you update to such a revision, any largefiles needed by that revision are
downloaded and cached (if they have never been downloaded before). One way to
pull largefiles when pulling is thus to use --update, which will update your
working copy to the latest pulled revision (and thereby downloading any new
largefiles).

If you want to pull largefiles you don't need for update yet, then you can use
pull with the "--lfrev" option or the "hg lfpull" command.

You should be able to use 'hg lfpull --rev "all()"' for this purpose.
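
Putting that together, a sketch of a backup step that mirrors both the history and every largefile (the repository path is an assumption):

    #!/bin/sh
    cd /srv/backup/bigrepo || exit 1
    hg pull                      # bring in any new changesets
    hg lfpull --rev "all()"      # download largefiles for every revision

    # Equivalent one-step form: pull changesets and largefiles together
    # hg pull --lfrev "all()"

Both the --lfrev option and the lfpull command come from the largefiles extension, so it needs to be enabled on the backup clone as well.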

answered Nov 06 '22 by Mathiasdm