 

Can Git software (e.g. Gitbox, GitHub, SourceTree) use a remote repo instead of a local one?

I like using Git software to push commits, but the apps I use (Gitbox, GitHub, SourceTree) all ask for a local repo when I add a new repo to them.

The thing is, my repo is on my development server, not my local machine.

So can Git software use a remote Git repo as a development repo, then push it to your main repo (e.g. GitHub or Bitbucket)?

Otherwise it seems you cannot use the software at all and have to resort to the command line over SSH.

Thanks

asked Oct 16 '12 by Laurence Cope


2 Answers

One solution, which doesn't rely on the front-end to support manipulating a remote repo directly, would be to mount the remote as a networked filesystem. If you only have SSH access to the remote machine, you could try using SSHFS via FUSE (on Linux) or OSXFUSE on Mac OS X. Or depending on your preferences and setup, you could use SMB, NFS, DAV, or another network filesystem.
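The SSHFS approach can be sketched like this; the hostname, username, and paths are placeholders for your own setup, not anything from the original question:

```shell
# Create a local mount point for the remote repo
mkdir -p ~/mnt/devserver

# Mount the repo directory on the development server over SSH
# (requires sshfs via FUSE on Linux, or OSXFUSE + sshfs on Mac OS X)
sshfs user@dev.example.com:/var/www/myapp ~/mnt/devserver

# Now point Gitbox / SourceTree at ~/mnt/devserver as if it were local.

# When you're done, unmount:
#   Linux:    fusermount -u ~/mnt/devserver
#   Mac OS X: umount ~/mnt/devserver
```

Be aware that Git operations over a network filesystem can be noticeably slower than on a local disk, since Git touches many small files.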

Another way to do it, that I bring up in the comments, is to export the network filesystem from your development machine to your server. I do this so that I can mount my current working copy on multiple machines at once, and also so that I still have my local working copy even when I'm not connected to the server.
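The reverse setup might look like the following sketch, assuming NFS; the export path, hostnames, and options are illustrative assumptions, not part of the original answer:

```shell
# /etc/exports on the development machine -- export the working copy
# (path and client hostname are placeholders):
#
#   /home/me/projects/myapp  devserver.example.com(rw,sync,no_subtree_check)

# Re-read the exports table after editing /etc/exports:
sudo exportfs -ra

# On the server, mount the exported working copy:
sudo mount -t nfs devmachine.example.com:/home/me/projects/myapp /var/www/myapp
```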

You write:

I am surprised git software can't deal with remote repos as the working version.

Most Git GUIs do some of their work by calling out to the git command. In order for them to support remote operation, core Git would have to as well. It is written in a mix of C and shell script; all of that would have to be rewritten to cope with remote files.

A text editor has a much easier job: it reads one file when you open it and writes one file when you save, while Git reads and writes many files in the course of a single operation like a commit.

A networked filesystem means that all tools (Git and otherwise) work on your remote files. Instead of building a remote-access layer into each and every application, doing it in the kernel (or via FUSE) and then treating it like a local filesystem gives you that support in every application for free.

answered Oct 14 '22 by Brian Campbell


Remember, Git is a DVCS. The fact that you don't connect to a remote server to commit stuff is by design.

What you want to do is have local Git repos that push code to your integration server (the one that actually runs the code). It's like deploying, only you deploy to a test server instead of production.

This is normally achieved with a shared Git repository that you push to; this repo should be bare. Alongside the bare shared repo, you'll want a non-bare clone of it to serve as your Apache docroot.
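The server-side layout can be sketched in two commands; the paths are placeholders for your own setup:

```shell
# 1. The bare shared repo that everyone pushes to
git init --bare /srv/git/myapp.git

# 2. A normal (non-bare) clone of it, served by Apache as the docroot
git clone /srv/git/myapp.git /var/www/myapp
```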

When the shared repo receives a push, it makes the docroot repo execute git pull.

This can be achieved with a post-receive hook on the shared repo.

The docroot repo has a specific branch checked out (say, develop). So even if you commit to other branches and push them, the server won't be affected.
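A minimal sketch of such a hook, saved as hooks/post-receive in the bare shared repo and made executable; the docroot path and branch name are assumptions for illustration:

```shell
#!/bin/sh
# post-receive hook: update the docroot checkout when the deploy
# branch is pushed; pushes to any other branch are ignored.
DOCROOT=/var/www/myapp
DEPLOY_BRANCH=develop

# Git feeds one "<oldrev> <newrev> <refname>" line per updated ref on stdin
while read oldrev newrev ref; do
    branch=${ref#refs/heads/}
    if [ "$branch" = "$DEPLOY_BRANCH" ]; then
        git --git-dir="$DOCROOT/.git" --work-tree="$DOCROOT" \
            pull origin "$DEPLOY_BRANCH"
    fi
done
```

The explicit --git-dir is there because Git sets GIT_DIR to the bare repo while running hooks, which would otherwise confuse the pull in the docroot.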

This also allows you to set up multiple deployment repositories: you could have another branch, prod, associated with one of them, which would actually update production code when you push to it.

It also lets you push incomplete, in-progress work to a shared branch that doesn't deploy at all. That way you know the work on your laptop is safely backed up on the shared repo, even though it isn't ready for the test server, where broken code would stop other people from working.

This article describes in detail how to set all of this up. I've done it before, and it works well.

answered Oct 14 '22 by Gui Prá