I need to know the best practice for updating one or more files on my website without taking it down.
For example, if I update the Model.php file, the upload takes a second or two before the file is fully replaced on the server. During that window the site throws errors saying Model.php is not found or is incomplete, and even if I suppress the errors, the site will eventually break.
What is the best practice for handling this?
When you're updating production, create a folder called 1.1 and put the new application in there (whether manually or through some VCS), then symlink the public html directory to it. The switch will be instantaneous.
This is not an uncommon approach, and one of its benefits is that if there's something wrong with the new code, the admin can immediately symlink back to the 1.0 folder.
Another good thing is to name the VCS tag the same as the folder, so the version in use can be easily tracked.
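A minimal sketch of that flow in shell, assuming the releases live under /var/www/example.com and the web root is the symlink /var/www/example.com/public_html (both paths are illustrative):

    # Check the new release out into its own versioned folder; using a
    # tag named like the folder keeps the running version easy to track.
    git clone --branch 1.1 --depth 1 https://example.com/repo.git /var/www/example.com/1.1

    # Repoint the web root at the new release. -n stops ln from following
    # the existing symlink and creating the new link inside the old folder.
    ln -sfn /var/www/example.com/1.1 /var/www/example.com/public_html

    # Rolling back is the same one-liner aimed at the previous folder:
    ln -sfn /var/www/example.com/1.0 /var/www/example.com/public_html

Because the swap takes effect in a single moment, requests see either the old tree or the new one, never a half-uploaded file.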
One practice I've often used and seen elsewhere:
- use a Version Control System (VCS) like SVN or Git
- on the live server, make the site's web root a symbolic link to a directory containing the latest revision of your web site (e.g. /www/domain.com/r555 for revision no. 555)
- when a change comes up, check the changes in to the VCS
- have a script that checks out the latest revision to a new directory carrying the revision name (say, /www/domain.com/r556); a sketch follows this list
- when the checkout is done, change the symbolic link to point to /www/domain.com/r556.
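A minimal sketch of such a checkout-and-switch script, assuming Git, GNU coreutils, and a hypothetical htdocs symlink as the web root (with SVN the directory name would come from the numeric revision in svn info instead):

    #!/bin/sh
    # deploy.sh -- check out the latest revision into a directory named
    # after it, then flip the web root symlink. All paths are illustrative.
    set -e

    ROOT=/www/domain.com
    REPO=https://example.com/repo.git

    # Name the release folder after the revision being deployed,
    # e.g. /www/domain.com/r1a2b3c4 for commit 1a2b3c4.
    REV=$(git ls-remote "$REPO" HEAD | cut -c1-7)
    RELEASE="$ROOT/r$REV"

    git clone --depth 1 "$REPO" "$RELEASE"

    # Build the new link under a temporary name, then rename it over the
    # live one; rename(2) is atomic, so there is no moment without a root.
    rm -f "$ROOT/htdocs.tmp"
    ln -s "$RELEASE" "$ROOT/htdocs.tmp"
    mv -T "$ROOT/htdocs.tmp" "$ROOT/htdocs"   # -T: GNU mv, rename in place

Old release directories stay on disk, so rolling back is just pointing htdocs at the previous rNNN folder again.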
There are a few snags if you have dynamic data, like file uploads, that you can't keep in the VCS, but they can all be dealt with without downtime.
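One common way to handle uploads (an assumption here, not spelled out in the answer above) is to keep them in a shared directory outside the release folders and link it into every new checkout before the switch:

    # Shared data lives once, outside any release (illustrative paths).
    mkdir -p /www/domain.com/shared/uploads

    # In each fresh checkout, replace the uploads directory with a link
    # to the shared copy, so every release sees the same files.
    rm -rf /www/domain.com/r556/uploads
    ln -s /www/domain.com/shared/uploads /www/domain.com/r556/uploads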
Things like database changes may still require some kind of maintenance mode, though.