How to work on a large number of remote files with PHPStorm

I have a small Debian VPS-box on which I host and develop a few small, private PHP websites.
I develop on a Windows desktop with PHPStorm.

Most of my projects only have a few dozen source files but also contain a few thousand lib files.

I don't want to run a webserver on my local machine because it creates a whole set of problems I don't want to be bothered with for such small projects, e.g.:
  - setting up another webserver;
  - synching files between my desktop and the VPS-box;
  - managing different configurations for Windows and Debian (different hosts, paths, ...);
  - keeping the db schema and data in sync.

I am looking for a good way to work with PHPStorm on a large number of remote files.

My approaches so far:

  1. Mounting the remote file system in Windows (tried via pptp/smb, ftp, webdav) and working on it with PHPStorm as if it were local files.
    => Indexing, synching, and PhpStorm's VCS support became unusably slow, probably due to the high latency of file access.
  2. PHPStorm offers the possibility to automatically copy the remote files to the local machine and then synching them when changes are made.
    => After the initial copying, this is fast. Unfortunately, with this setup, PHPStorm is unable to provide VCS support, which I use heavily.


Any ideas on this are greatly appreciated :)

asked Aug 21 '13 by user2704297


1 Answer

I use PhpStorm in a setup very similar to your second approach (local copies, automatically synced changes) AND, importantly, with VCS support.

Ideal; easiest: In my experience the easiest solution is to check out or clone your VCS branch on your local machine and use your remote file system as a staging platform that knows nothing about VCS, i.e. a plain file system.
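As a rough illustration of keeping the remote side VCS-ignorant, here is a sketch using a local clone plus a plain-file export; a local "staging" directory stands in for the VPS, and all names and paths are illustrative:

```shell
set -e
cd "$(mktemp -d)"                       # scratch area for the sketch
# The local clone is the only place that knows about VCS.
git init -q repo && cd repo
git config user.email dev@example.com   # illustrative identity
git config user.name Dev
echo '<?php phpinfo();' > index.php
git add index.php
git commit -q -m "initial"

# "staging" stands in for the remote VPS file system.
mkdir -p ../staging
# Export the tracked files as plain files, with no .git metadata.
git archive HEAD | tar -x -C ../staging
ls ../staging                           # index.php only, no .git
```

In practice the export step would be replaced by PhpStorm's deployment sync (or rsync/sftp) to the VPS; `git archive` is just one way to show that the staged tree carries no VCS metadata at all.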

Real world; remote VCS required: If, however (as in my case), it is necessary to have VCS on each system (perhaps your remote environment is the standard for your shop, or your shop's proprietary review/build tools are platform specific), then a slightly different remote setup is required. Treating your remote system as staging is still the best approach, though.

Example: Perforce - centralized VCS (client workspace): In my experience, workspace-based VCS systems (e.g. Perforce) are best handled by sharing the same client workspace between the local and remote systems, which has the benefit that VCS file-status changes only have to be applied once. The disadvantage is that file-system changes on the remote system typically must be handled manually. In my case I manually chmod (or the OS equivalent) my remote files and wash my hands of it (problem solved). The alternative (dual-workspace) approach requires more moving parts, which I do not advise.
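A minimal sketch of the manual permission fix mentioned above, assuming the common case where synced, unopened files arrive read-only on the remote box (the project path and files are hypothetical):

```shell
set -e
cd "$(mktemp -d)"
# Simulate a synced project whose files arrived read-only.
mkdir -p myproject/lib
echo '<?php' > myproject/index.php
chmod -R a-w myproject

# Restore write permission for the owner, as described above.
chmod -R u+w myproject
ls -l myproject/index.php
```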

Example: Git - distributed VCS: The easier approach is certainly Git, which has its wonderful magic of detecting file changes without file permissions being directly coupled to the VCS. This makes life easy: you can simply start from a common working branch and create two separate branches, for example "my-feature" and "my-feature-remote-proxy". Once you decide to merge your changes upstream, you do so (ideally) from your local environment. The remote proxy branch can then be reverted or discarded as you see fit. NOTE: in the case of Git I always use two branches because it's easy, and when your hard drive melts in a freak lightning strike you have extra redundancy :|
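The two-branch setup above can be sketched as follows; a throwaway repository stands in for the real project, and the branch names are the illustrative ones from the text:

```shell
set -e
cd "$(mktemp -d)"
git init -q
git config user.email dev@example.com   # illustrative identity
git config user.name Dev
git commit -q --allow-empty -m "base"
base=$(git rev-parse --abbrev-ref HEAD)

# Local working branch, plus a proxy branch mirroring the remote box.
git checkout -q -b my-feature
git branch my-feature-remote-proxy

# ... develop locally; the remote box checks out the proxy branch ...
git commit -q --allow-empty -m "work on my-feature"

# Merge upstream from the LOCAL environment; the proxy branch can
# simply be reset or deleted afterwards.
git checkout -q "$base"
git merge -q my-feature
git branch --list
```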

Hope this helps.

answered Sep 27 '22 by Lance Caraccioli