 

How to set up a mixed (S)FTP + Git workflow for a website

I have a lot of virtual-hosted websites that are currently updated via chrooted SFTP. Updates are done by clients (typically using Dreamweaver and CuteFTP/FileZilla) and by staff at my company (typically using Eclipse Team Synchronisation with JCraft SFTP). This setup works OK unless the clients are editing their site at the same time we are. In that case we have to constantly sync to check for changes, which is both slow and unreliable. SFTP transfers are also slow, given the huge number of files in each site, slow directory traversal, and the lack of delta compression.

I want to move to a (partial) Git workflow, mostly to benefit from delta compression but also to introduce some rudimentary revision control. I say "rudimentary" because the typical Git workflow of developing and testing locally before bundling changes into commits is a poor fit for our needs:

  • Some of our clients will need to use SFTP, not Git, for convenience or compatibility.
  • I don't want to force a particular SFTP client (like git-ftp). Our clients are comfortable with their own choice (Dreamweaver or CuteFTP, for example). They may be on any OS and they don't use shells.
  • None of our clients can test locally; every small change must be uploaded to the webserver and go "live" immediately (we have test sites, so "live" doesn't necessarily mean "production" or "public" in this case).
  • About 90% of changes will be small one-line CSS/HTML changes, making commit messages time-consuming.
  • Some changes are done directly on the server webroot via a shell or scripts.

I guess what I want is for the remote Git repo to use the webroot as a working directory and auto-commit any file changed there with a generic message indicating which file(s) changed. However, I keep reading that the primary repo for a project shouldn't have a working directory (git init --bare), and even if it does, it wouldn't normally commit changes made there. I don't care if this setup loses ownership details of commits (since it won't know who changed the files, and I typically don't care).
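To make the idea concrete, what I'm picturing is roughly the watcher loop below. This is a sketch only: the webroot path is made up, it assumes inotify-tools is installed, and it assumes the webroot is already a non-bare Git clone.

```bash
#!/bin/bash
# Sketch of the desired auto-commit behaviour (hypothetical path; requires
# inotify-tools and a non-bare git clone serving as the webroot).
WEBROOT=/var/www/example.com/htdocs
cd "$WEBROOT" || exit 1

# Block until something changes under the webroot, then commit whatever
# changed with a generic message listing the affected files.
while inotifywait -qq -r -e modify,create,delete,move --exclude '\.git/' .; do
    sleep 2                       # let a burst of SFTP uploads finish
    git add -A
    if ! git diff --cached --quiet; then
        git commit -m "Auto-commit: $(git diff --cached --name-only | tr '\n' ' ')"
    fi
done
```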

Failing that, I basically want to use Git as an alternative to rsync-over-SSH (which does not appear to be supported by Eclipse Team Sync). Any answer suggesting another technology needs to support Eclipse (my tool of choice) and Dreamweaver (my clients' main tool of choice). I realise this is not ideal, but it isn't negotiable. If I could force the use of Git I would, and this issue would be irrelevant. I have to deal with hundreds of clients, mostly graphic designers on Macs.

PS. I realise there will be issues when the clients don't resync their local files regularly enough. Any suggestions on handling that would be appreciated.

Can anybody provide guidance on this setup (CentOS 6.4 Linux)?

asked Apr 03 '13 by SpliFF

2 Answers

For changes made directly on the server (or via SFTP):

Take a look at gitwatch, which is "a bash script to watch a file or folder and commit changes to a git repo." It seems to be just the thing you are looking for.

You will probably want your webroot to be not the primary repo, but a remote clone. Set up your primary somewhere else (like GitHub or Bitbucket) and clone it on the webserver. Use gitwatch to monitor changes, auto-commit, and push to your primary/GitHub repo.
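For example, something along these lines. The repository URL and paths are placeholders, and gitwatch's exact flags can vary between versions, so check `gitwatch -h` before relying on them.

```bash
# Clone the primary repo into the webroot first (placeholder URL and path):
git clone git@github.com:yourorg/example-site.git /var/www/example.com/htdocs

# Watch the webroot, auto-committing changes and pushing each commit to
# origin/master. (-r/-b are the remote/branch flags in current gitwatch;
# verify with gitwatch -h for your version.)
gitwatch -r origin -b master /var/www/example.com/htdocs &
```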

A big benefit of using a separate primary repo like github/bitbucket is that you can easily see/review changes that are being made, and pull them down locally yourself if you want, without needing direct access to the git repo on the webserver.

For changes made via git:

Are you looking to also make changes via Git and have them auto-pulled on the webserver? In theory you could hook up a post-receive hook in Git to do this for you. However, you could easily get merge conflicts if changes are being made directly on the server at the same time.
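If you did want to experiment with that, the hook might look roughly like the sketch below, assuming a bare repo hosted on the webserver itself (a GitHub/Bitbucket primary would need a webhook instead). Paths are placeholders, and the conflict caveat still applies.

```bash
#!/bin/bash
# hooks/post-receive in a bare repo on the webserver (sketch only).
WEBROOT=/var/www/example.com/htdocs

# Hooks run with GIT_DIR set to the bare repo; clear it so git operates
# on the webroot clone instead.
unset GIT_DIR
cd "$WEBROOT" && git pull origin master
```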

On second thought, avoid this. Stick with a single direction: webserver -> local commit -> push to primary Git repo.

Insert disclaimer here about how this is not a proper use of Git, breaks all kinds of rules, and violates best practices. You seem well aware of this already, so I'll skip it. =)

answered Oct 27 '22 by jszobody


In my view, you should set up a continuous integration tool such as Jenkins, CruiseControl, or one of the many other alternatives, configured to deploy to the web server.

You need:

  1. a repository for each website, accessible via SFTP and watched with gitwatch to commit changes (as jszobody suggests), plus a git hook to pull / push / ask_for_help_via_email_to_a_developer_if_merge_fails (see the sketch after this list)
  2. a "central" repository where developers and the hook will push each commit.
  3. a CI tool of your choice that, on each push received by the central repository, wakes up and sends the updated contents to the webserver (SFTP, direct filesystem access... whatever works best for you).
  4. a web server that now doesn't need to provide public access via either Git or SFTP.
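As a rough illustration of the hook in step 1: everything below is an assumption (remote name, branch, and notification address are placeholders, mail/mailx must be available, and older Git needs `git reset --merge` instead of `git merge --abort`).

```bash
#!/bin/bash
# Sketch of a post-commit hook for the per-site staging repo: after gitwatch
# commits an SFTP change, pull from the central repo, push the new commit,
# and email a developer if the merge fails. Placeholder names throughout.
CENTRAL=origin
BRANCH=master
DEVELOPER=dev-team@example.com
SITE=$(basename "$(pwd)")

if git pull "$CENTRAL" "$BRANCH"; then
    git push "$CENTRAL" "$BRANCH"
else
    # Back out the failed merge (older git: git reset --merge), then warn.
    git merge --abort 2>/dev/null || git reset --merge
    echo "Merge of $CENTRAL/$BRANCH failed in $SITE; manual merge needed." \
        | mail -s "[$SITE] merge conflict" "$DEVELOPER"
fi
```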

Pros:

  • covers all your requirements
  • no need for a working directory in the central repository
  • no need to open more ports at the webserver (except for http/https, obviously)
  • no need to install git on the web server
  • no need to overload the web server's filesystem with the repository history (which lives in the .git/ folder)
  • quite flexible
  • easy to evolve to a full featured CI solution (automated tests, code metrics and so on)
  • copes with clients being out of sync, because the hook won't be able to merge and a developer will be warned

Cons:

  • complex, and thus expensive to set up
  • you will have to tell your clients the new SFTP address to use (unless you wisely used a different subdomain already, in which case you can just wait for DNS to update).
answered Oct 27 '22 by Giacomo Tesio