
Get changes from Mercurial to an FTP site

Tags: mercurial, ftp

I work with a partner on a PHP site for a client. We have a common Mercurial repository (on Bitbucket), local copies on both our machines, and the live site. We have only FTP access to the live site (which can't be changed, since it is a hosting package with FTP only).

I want to be able to push changes from the repository to the live site.

Until now I have simply kept track of changed files in the repo and copied them manually with FileZilla, an error-prone and annoying task. My idea is to mount the remote location locally (e.g. using CurlFtpFS) and tell Mercurial to automagically copy changed files to the site. Ideally I want to be able to specify which changes, but this would be a bonus. It would be sufficient if the local state of the files within the repo were synced.
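For reference, mounting an FTP site locally with CurlFtpFS looks roughly like this (a minimal sketch; the hostname, credentials, and mount point below are placeholders):

    # mount the FTP site at a local path (placeholders throughout)
    mkdir -p ~/mnt/livesite
    curlftpfs ftp://user:password@ftp.example.com ~/mnt/livesite

    # ... work with the files as if they were local ...

    # unmount when done
    fusermount -u ~/mnt/livesite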

Is there a good way to do this using Linux command-line tools?

Martin Thurau asked Mar 15 '11

3 Answers

My first recommendation is, if at all possible, to get a hosting package that allows more access. FTP-only is just brutal.

But since you are looking for a real answer to your question, I have two ideas for you:

  1. I would suggest looking into the Mercurial FTP Extension. I personally have never used it, since I haven't gotten myself stuck in an FTP-only situation (not for a long time, at least), but it looks promising. It looks like it will work really well for you if you make sure to tag your production releases (and make sure to use the -uploaded param).

  2. Also, if you only ever want the tip to be installed on your production env, then you could look at the suggestion Martin Geisler made on the Bitbucket user group a few days ago. Basically, his suggestion is to utilize Bitbucket's "ping url" functionality. You would have to write a server-side script/URL handler that accepts that ping, fetches the tip from Bitbucket (as a zip), and then unzips/unpacks it. This is a bit complicated, but if you are looking for complete automation and the tip is always what you want deployed, this could work for you; a sketch of such a handler follows this list.
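As a rough illustration of option 2, the server-side step could boil down to something like the following. This is only a sketch: the repository URL, archive format, and paths are assumptions, and a real handler would also need authentication and error checking.

    #!/bin/sh
    # sketch of a deploy step triggered by Bitbucket's ping URL
    # (repo URL, archive layout, and paths are assumptions)

    # fetch the tip of the repository as a zip archive
    curl -L -o /tmp/tip.zip https://bitbucket.org/youruser/yourrepo/get/tip.zip

    # the zip contains a single top-level directory; unpack it and copy
    # its contents over the document root
    unzip -o /tmp/tip.zip -d /tmp/tip
    cp -R /tmp/tip/*/. /var/www/site/

    # clean up
    rm -rf /tmp/tip /tmp/tip.zip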

shaune answered Oct 29 '22


One option is to use the hg archive command:

hg archive /path/to/curlftpsfs

which will put a snapshot of your repo in that location; note, however, that it will overwrite any files already there.

Another option is to create a Mercurial clone in that same /path/to/curlftpsfs and then just do hg pull ; hg update in it on your local system with the remote one mounted. Setting that up initially will mean transferring the whole thing, but subsequently you'll only be sending deltas.
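Concretely, that might look like the following (a sketch; the paths are placeholders):

    # one-time setup: clone the repository into the mounted FTP location
    hg clone ~/work/site /path/to/curlftpsfs/site

    # each subsequent deploy only transfers new changesets
    cd /path/to/curlftpsfs/site
    hg pull ~/work/site
    hg update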

Some folks don't like this last option because it also exposes your entire .hg directory, but you can block access to that at the web server.
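For example, assuming the host runs Apache and honors .htaccess files (an assumption; the rule and path are illustrative), you could hide the metadata directory like this:

    # write an .htaccess that returns 404 for anything under .hg
    cat > /path/to/curlftpsfs/site/.htaccess <<'EOF'
    RedirectMatch 404 /\.hg(/|$)
    EOF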

Ry4an Brase answered Oct 29 '22


I came across this problem a while ago, after switching from AWS to a local web host that provides only SSH/FTP.

My previous approach of updating a production site on AWS using "hg pull; hg update -C" can no longer be used on the new host, since they don't have Mercurial installed for shared hosts.

So what I did was mount the remote location over FTP onto a local machine (i.e. your laptop), then run the hg pull and hg update commands locally, at the path where the remote FTP site is mounted.
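Put together, a deploy then boils down to a mount followed by the usual Mercurial commands (a sketch; the host, credentials, and paths are placeholders, and it assumes a clone already exists on the FTP site):

    # mount the live site over FTP
    curlftpfs ftp://user:password@ftp.example.com ~/mnt/livesite

    # update the working copy that lives on the FTP site
    cd ~/mnt/livesite
    hg pull https://bitbucket.org/youruser/yourrepo
    hg update -C

    # unmount when finished
    cd ~ && fusermount -u ~/mnt/livesite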

Maxim Mai answered Oct 29 '22