Deploy PHP web system to multiple locations - php

I am developing (as a solo web developer) a rather large web-based system which needs to run at various different locations. Unfortunately, because some clients are on dialup, we have had to deploy to each location rather than run a central server for them all. Each client is part of our VPN, and those on dialup/ISDN get dialed on demand from our Cisco router. All clients are accessible within a matter of seconds.
I was wondering what the best way to release an update to all these clients at once would be. Automation would be great, as there are 23+ locations to deploy the system to, each of which is used on a very regular basis. Because of this, when deploying I need to display an 'updating' page so that the clients don't try to access the system while the update is partially complete.
Any thoughts on what would be the best solution?
EDIT: Found FileSyncTask, which allows me to rsync with Phing. Going to use that.
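For reference, FileSyncTask is essentially a wrapper around rsync, so a rough shell equivalent that pushes a build to every location (hostnames and paths below are purely hypothetical) might be:
#!/bin/bash
# push the current build to each VPN location in turn
for site in site01.vpn site02.vpn site03.vpn; do    # ...and the other 20+ hosts
    rsync -az --delete ./build/ "$site:/var/www/htdocs/"
done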

There's also a case here for maintaining a "master" code repository (in SVN, CVS or maybe Git). This isn't your standard "keep editions of your code in the repo and allow rollbacks"... this repo holds your current production code only. Once an update is ready, you check the working updated code into the master repo. All of your servers then check the repo on a schedule to see if it has changed, downloading the new code if so. That check process could even include turning on the maintenance.php file (that symcbean suggested) before starting the repo download and removing the file once the download is complete.
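A minimal sketch of that scheduled check, assuming a Subversion master repo and entirely hypothetical paths:
#!/bin/bash
# run from cron on each location; pulls new code only when the repo has changed
DOCROOT=/var/www/htdocs
REPO=https://svn.example.com/production/trunk
REMOTE=$(svn info --show-item last-changed-revision "$REPO")
LOCAL=$(svn info --show-item last-changed-revision "$DOCROOT")
if [ "$REMOTE" != "$LOCAL" ]; then
    cp /usr/local/share/maintenance.php "$DOCROOT/maintenance.php"    # maintenance page on
    svn update "$DOCROOT"
    rm -f "$DOCROOT/maintenance.php"                                  # maintenance page off
fi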

At the company I work for, we work with huge web-based systems which are both Java and PHP. For all systems we have our development environments and production environments.
This company has over 200 developers, so I guess you can imagine the size of the products we develop.
What we have done is use ANT and RPM build archives for creating deployment packages. This is done quite easily. I haven't done this myself, but it might be worth your while to look into.
Because we use Linux systems we can easily deploy RPM packages; the setup scripts within an RPM package can make sure everything gets to the correct place. You also get proper version handling and a release process.
Hope this helped you.
Br,
Paul

There are two parts to this; let's deal with the simple one first:
I need to display a 'updating' page
If you need to disable the entire site while maintaining transactional integrity, and publishing a message to the users from the server being updated, then the only practical way to do this is via an auto-prepend - this needs to be configured in advance (note - I believe this can be done using a .htaccess file without having to restart the webserver for a new PHP config):
<?php
// prepended to every request via auto_prepend_file (set from .htaccess as noted above)
if (file_exists($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php')) {
    include_once($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php');
    exit;
}
Then just drop maintenance.php into your webroot and that file will be displayed instead of the expected file. Note that it should probably include a session_start() and an auto-refresh to ensure the session does not expire. You might want to extend the above to allow a grace period where POSTs will still be processed, e.g. by adding a second PHP file.
In terms of deploying to remote sites, I'd recommend using rsync over ssh for copying content files - which should be invoked via a controlling script (sketched below) which:
applies the lock file(s) as shown above
runs rsync to replicate files
runs any database deployment script
removes the lock file(s)
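A minimal sketch of such a controlling script for a single site, assuming key-based ssh and entirely hypothetical paths:
#!/bin/bash
set -e
SITE="$1"                 # e.g. site01.vpn
DOCROOT=/var/www/htdocs
# 1. apply the lock file so the auto-prepend starts serving the maintenance page
scp maintenance.php "$SITE:$DOCROOT/maintenance.php"
# 2. replicate the content files
rsync -az --delete --exclude='maintenance.php' ./build/ "$SITE:$DOCROOT/"
# 3. run any database deployment script
ssh "$SITE" "php $DOCROOT/scripts/migrate.php"
# 4. remove the lock file
ssh "$SITE" "rm -f $DOCROOT/maintenance.php"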
If each site has a different set-up then I'd recommend either managing the site-specific stuff via a hierarchy of include paths, or even maintaining a complete image of each site locally.
C.


How can I have my CMS upgrade itself?

I've built a CMS (using the Codeigniter PHP framework) that we use for all our clients. I'm constantly tweaking it, and it gets hard to keep track of which clients have which version. We really want everyone to always have the latest version.
I've written it in a way so that updates and upgrades generally only involve uploading the new version via FTP and deleting the old one - I just don't touch the /uploads or /themes directories (everything specific to the site is either there or in the database). Everything is a module, and each module (as well as the core CMS) has its own version number, along with an install and uninstall script for each version, but I have to manually FTP the files first, then run the module's install script from the control panel. I wrote and will continue to write everything personally, so I have complete control over the code.
What I'd like is to be able to upgrade the core CMS and individual modules from the control panel of the CMS itself. This is a "CMS for Dummies", so asking people to FTP or do anything remotely technical is out of the question. I'm envisioning something like a message popping up on login, or in the list of installed modules, like "New version available".
I'm confident that I can sort out most of the technical details once I get this going, but I'm not sure which direction to take. I can think of ways to attempt this with cURL (to authenticate and pull source files from somewhere on our server) and PHP's native filesystem functions like unlink(), file_put_contents(), etc. to perform the actual updates to files, or stuff the "old" CMS in a backup directory and set up the new one, but even as I'm writing this post - it sounds like a recipe for disaster.
I don't use git/github or anything, but I have the feeling something like that could help? How should (or shouldn't) I approach this?
There are a bunch of ways to do this, but the least complicated is just to have Git installed on your client servers and set up a cron job that runs a git pull origin master every now and then. If your application uses migrations it should be easy as hell to do.
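Something like this in each client's crontab would do it (path and schedule hypothetical):
# pull the latest master every 15 minutes, logging the output
*/15 * * * * cd /var/www/cms && git pull origin master >> /var/log/cms-update.log 2>&1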
You can do this as it sounds like you are in full control of your clients. For something like PyroCMS or PancakeApp that doesn't work because anyone can have it on any server and we have to be a little smarter. We just download a ZIP which contains all changed files and a list of deleted files, which means the file system is updated nicely.
We have a list of installations which we can ping with an HTTP request so the system knows to run the download, or the client can hit "Upgrade" when they log in.
You can use Git from within your CMS via Glip. The cron job would then just hit a URL on your own system, without installing Git.
@Obsidian: Wouldn't a DNS poisoning attack also compromise most of the methods mentioned in this thread?
Additionally, SSH could be compromised by a man-in-the-middle attack as well.
While total paranoia is a good thing when dealing with security, WordPress being a GPL codebase means it would be easy to detect an unauthorized change in your code if such an attack did occur, so resolution would be easy.
SSH and Git do sound like a good solution, but what is the intended audience?
Have you taken a look at how WordPress does it?
That would seem to do what you want.
Check this page for a description of how it works.
http://tech.ipstenu.org/2011/how-the-wordpress-upgrade-works/

Using SVN in web-development

Recently I've read this article: http://www.smashingmagazine.com/2009/09/25/svn-strikes-back-a-serious-vulnerability-found/
Developers of many popular sites like apache.org, php.net (http://ru2.php.net/.svn/entries), classmates.com and the Russian Yandex use SVN, but do not follow the recommendation given by SVN (to use the export command).
So, what are the reasons for not using svn export instead of updating the public copy, as they all do?
Some people, not including myself, think that to deploy onto production you should just issue an svn up. If you do an export it loses the metadata about the versioning, so you can't do that; you have to use another mechanism for tracking which version is where. It is an easy solution, but I think it can make for lazy packaging and also for "fixing in production", as if you do this you can also check back in from production...
From my perspective, what I do is lock off/block access to any .svn files on the server (either Apache2 or IIS). This way the hidden folders are not accessible externally, and it still allows version tracking for the sites we run which do not require compiling before rollout.
Languages like:
PHP
ASP (not .NET)
PLAIN HTML
COLDFUSION
PDF / IMAGE versioning (if needed, in my case we needed it for updated PDF docs for customers).
So certainly you can use SVN for web development, but you do need to be careful, as you expose your .svn folders to the world if you are not cautious. Otherwise it is a tool you can use to make your job easier and more efficient.
With that said, we simply run an svn update on production to update changed files, and with a limited number of developers working on one piece of code at a time (like I said, in my case) we don't get mix-ups with the wrong things getting deployed. PLUS, to be safe, always do a check for modifications first to see what is going to be updated - and hey, if you do make a mistake, roll it back.
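From the command line that preview looks something like this (docroot hypothetical):
cd /var/www/site
svn status --show-updates    # lists what 'svn update' is about to change
svn update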
With svn export, files can never get deleted, only added and modified. This can be an issue sometimes.
When the entire website is open source and available for download from a public resource (like PHP's), protecting the .svn directories so others can't get at the source code is probably not worth the effort over simply doing an svn up.

publishing a website using svn export

I currently FTP all my files to my website when I do an update (over a slowish ADSL connection).
I want to make things easier, so I recently started using a hosted SVN service, and I thought I could speed things up a bit by doing an svn export of my website directly onto my webserver.
I have tried that a few times and it seems to work OK; however, it fetches the entire site every time, which is a bit slow for a one-file update.
So my questions are:
Is it possible to do an export and only get the changes since the last export (and how will this handle deleted files)?
Or will it be easier to do an svn checkout and svn update it all the time instead of svn export, and just hide the .svn folders using Apache htaccess?
Is this a good idea, or is there a better way to publish my website?
I am trying to achieve the one-click-deploy ideal.
Maybe there are some gotchas I haven't thought of that someone else has run into.
Debian/Apache/PHP
I would do an svn checkout, and have done so successfully on a live site for a number of years. You should add mod_rewrite rules to 404 the .svn directories (and files) though.
This is what I'm doing on my host:
For every project I have a structure that looks more less like this:
~/projects/myproj
~/public_html/myproj
The first dir is a checkout from SVN, while the second one is just an svn export.
I have a small bash script:
#!/bin/bash
SOURCE="$HOME/projects/"
TARGET="$HOME/public_html/"
# update each working copy, then export it into the matching web directory
for x in "$SOURCE"*; do
    if [ -d "$x" ]; then
        svn update "$x"
        svn export --force "$x" "$TARGET$(basename "$x")"
    fi
done
Export is done from the working copy, so it's very fast.
It might not be exactly the answer you are looking for, but if you have SSH access to your webserver (it depends on your hosting service; some "low cost" hosts don't give that kind of access), you can use rsync to synchronise the remote site with what you have on your disk.
In the past, I was using something like the idea you are describing (fetching the svn log between the last revision pushed to production and HEAD, analysing every line, and in the end calculating what to send to the server), but it was not really a great process; I now use rsync and like it way better.
(Here too, you will have to exclude the .svn directories, btw - see the command below.)
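A minimal sketch of that rsync call, assuming SSH access and hypothetical paths:
rsync -az --delete --exclude='.svn' ~/mysite/ user@webserver:/var/www/mysite/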
You can just accept having .svn directories in your website (generally not a problem, especially if you configure the server not to permit access to them) - this is the easy option. Alternatively, do what RaYell does, and have two copies of your website on the webserver: one normal checkout outside of the web directory, and one export in your web directory. When you update, simply export the svn (just a copy with the .svn dirs deleted) into the web directory (and you should make sure to first delete old files if you wish to avoid files that have been removed from SVN remaining on your website).
I do something like this, using robocopy set to mirror the svn checkout while excluding .svn directories, and get both the export and the old-file deletion in one step, thus minimizing downtime if the copy takes long. I'm sure this is easy on unix too, if that's your hosting environment. For example, you can use a local rsync: http://blog.gilluminate.com/2006/12/12/yes-you-can-rsync-between-two-local-directories/
Old topic, but since this is what came up in Google during my research I thought I would add to this. I recommend doing an export to the site instead of checking out to it. I like to keep the repositories and the sites separate. I also don't recommend exporting the entire repo to the site each time, especially if only a few files change at a time. What you can do instead is do a diff on the repo to see what's changed from one release to another and only export those files (roughly as sketched below). More info at: http://www.joeyrivera.com/2011/automate-svn-export-to-site-w-bash-script/
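A rough sketch of that diff-driven approach, run from an up-to-date working copy (the recorded revision and all paths are hypothetical):
cd ~/projects/mysite && svn update
LAST_DEPLOYED=1234    # revision recorded at the previous release
svn diff --summarize -r "$LAST_DEPLOYED:HEAD" . | while read -r status path; do
    if [ "$status" = "D" ]; then
        rm -rf "$HOME/public_html/mysite/$path"            # removed from the repo
    elif [ -f "$path" ]; then
        cp --parents "$path" "$HOME/public_html/mysite/"   # added or modified file
    fi
done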

How should I set up my development environment for rolling code into a live website?

In the past, I've always edited all my sites live; wasn't too concerned about my 2 visitors seeing an error message.
However, there may come a day when I get more than 2 visitors. What would be the best approach to testing my changes and then making all the changes go live simultaneously?
Should I copy and paste every single file into a sub-folder and edit these, then copy them back when I'm done? What if I have full URLs in my code (they'll break if I move them)? Maybe I can use some .htaccess hackery to get around this? What about database dummy test data? Should I dupe all my MySQL tables and reference those instead?
I'm using CakePHP for the particular project I'm concerned about, but I'm curious to know what approaches people are taking both with Cake (which may have tools to assist with this?), and without a framework.
I've been getting a lot of recommendations for SVN, which sounds great, but unfortunately my host doesn't support it :\
The best thing you can do is to create a staging environment in which you test your changes. The staging environment (ideally) is a complete, working duplicate of your production system. This will prevent you from experiencing many headaches and inadvertent production crashes.
If you are working on a small project the best thing to do is to recreate your remote site locally (including the database). Code all your changes there and then, once you are satisfied that you are finished, deploy the changes to your remote site in one go.
I would recommend putting your website code under full version control (git or subversion). Test and maintain your source in a separate, private sandbox server, and just check out the latest stable version at the production site whenever it's ready for release.
For database support, even for small projects I maintain separate development and production databases. You can version the SQL used to generate and maintain your schema and testing or bootstrapping data along with the rest of your site. Manage the database environment used by your site from an easily separated configuration file, and tell your version control solution to ignore it.
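With Subversion, for example, that combination might look like this (filenames hypothetical):
svn add db/schema.sql db/test_data.sql          # schema and test data are versioned
svn propset svn:ignore 'database.php' config/   # the per-environment config is not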
Absolute URLs are going to be a problem. If you can't avoid them, you could always store the hostname in the same configuration file and read it as needed... except within stylesheets and Javascript resources, of course. My second choice for that problem would be URL-rewriting magic or its equivalent in the development server, and my last choice would be just messing with the /etc/hosts file when I wanted to test features that depend on them.
I set up a development server on my laptop that duplicates my web server as closely as possible (server software and configuration, operating system, filesystem layout, installed software, etc.) That way I can write the code on my laptop and test it locally; once I've gotten things working there, I copy it to the server. Sometimes a few problems arise because of slight differences between the two computers, but those are always quickly resolved (and just in case they're not, I have my site in an SVN repository so I can always revert it).
On another website I used to maintain, I used a slightly different tactic: I designated a URL path within the site that would be a development version of the base site. That is, http://www.example.com/devweb would ordinarily mirror http://www.example.com, http://www.example.com/devweb/foo/bar.php would mirror http://www.example.com/foo/bar.php, etc. I created a folder devweb under the document root, but instead of copying all the files, I configured the server so that if a requested file didn't exist in the /devweb directory, it would look for it under the document root. That was a more fragile setup than having a separate development server, though.
I have a number of websites written in CakePHP. I develop and test on my local machine, using the database on my production server (I just have a MySQL login that works for my static IP address).
All code is checked into Subversion, and I then have a continuous integration server - Hudson:
https://hudson.dev.java.net/
This builds and deploys my project on the production machine. It just checks out the code in Subversion for a specific project, then runs a simple script to SSH/copy the files into the staging or production location on the server. You can either set this up to be a manual process (which I have currently) or you can set it up so that it deploys once code has been checked in. There are lots of other CI tools that can be set up to do this (have a look at Xinc as well).
http://code.google.com/p/xinc/
As for absolute URLs, you can always set something up in your hosts file to resolve the site locally on your machine instead. It works for me; just don't forget to take it out afterwards : )
Hope that helps...
I have a version of config/database.php that uses the PHP server variable SERVER_NAME to determine which system the app is running on. Then, when I clone my git repo across my home system, the development site (which shares the same specs as the live machine), and the live machine, they all connect to their respective databases.
I pasted it here, but I also believe it's available on the Bakery.
http://pastebin.com/f1a701145

How do you deploy a website to your webservers?

At my company we have a group of 8 web developers for our business web site (entirely written in PHP, but that shouldn't matter). Everyone in the group is working on different projects at the same time and whenever they're done with their task, they immediately deploy it (cause business is moving fast these days).
Currently the development happens on one shared server with all developers working on the same code base (using RCS to "lock" files away from others). When deployment is due, the changed files are copied over to a "staging" server and then a sync script uploads the files to our main webserver from where it is distributed over to the other 9 servers.
Quite happily, the web dev team asked us for help in order to improve the process (after us complaining for a while) and now our idea for setting up their dev environment is as follows:
A dev server with virtual directories, so that everybody has their own codebase,
SVN (or any other VCS) to keep track of changes
a central server for testing holding the latest checked in code
The question is now: how do we manage to deploy the changed files onto the server without accidentally uploading bugs from other projects? My first idea was to simply export the latest revision from the repository, but that would not give full control over the files.
How do you manage such a situation? What kind of deployment scripts do you have in action?
(As a special challenge: the website has organically grown over the last 10 years, so the projects are not split up in small chunks, but files for one specific feature are spread all over the directory tree.)
Cassy - you obviously have a long way to go before you'll get your source code management entirely in order, but it sounds like you are on your way!
Having individual sandboxes will definitely help things. Next, make sure that the website is ALWAYS just a clean checkout of a particular revision, tag or branch from Subversion.
We use git, but we have a similar setup. We tag a particular version with a version number (in git we also get to add a description to the tag; good for release notes!) and then we have a script that anyone with access to "do a release" can run that takes two parameters -- which system is going to be updated (the datacenter and if we're updating the test or the production server) and then the version number (the tag).
The script uses sudo to then run the release script in a shared account. It does a checkout of the relevant version, minimizes JavaScript and CSS¹, pushes the code to the relevant servers for the environment and then restarts what needs to be restarted. The last line of the release script connects to one of the webservers and tails the error log.
On our websites we include an html comment at the bottom of each page with the current server name and the version -- makes it easy to see "What's running right now?"
¹ and a bunch of other housekeeping tasks like that...
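A rough sketch of what such a release script might look like (hosts, paths and service names all hypothetical):
#!/bin/bash
set -e
TARGET="$1"     # e.g. "test" or "production"
VERSION="$2"    # the git tag to release
git clone --branch "$VERSION" --depth 1 git@git.example.com:site.git /tmp/release
# (minify JavaScript/CSS and other housekeeping here)
rsync -az --delete --exclude='.git' /tmp/release/ "deploy@web1.$TARGET.example.com:/var/www/site/"
ssh "deploy@web1.$TARGET.example.com" sudo service apache2 restart
# watch the error log for fallout from the release
ssh "deploy@web1.$TARGET.example.com" tail -f /var/log/apache2/error.log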
You should consider using branching and merging for individual projects (on the same codebase), if they make huge changes to the shared codebase.
We usually have a local dev environment (meaning a local webserver) for testing uncommitted code (you don't want to commit non-functioning code at all), but that dev environment could even be on a separate server using shared folders.
However, committed code should be deployed to a staging server for testing before putting it in production.
You can probably use Capistrano; even though it is more of a Ruby tool, there are some articles that describe how to use it for PHP.
I think Phing can be used with CVS but not with SVN (at least that's what I last read).
There are also some projects around that mimic Capistrano but are written in PHP.
Otherwise there is also a custom-made solution (sketched below):
tag the files you want to deploy
check out the files using that tag into a specific directory
symlink the directory to the current document root (easy to roll back to the previous version)
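A minimal sketch of that approach with SVN (repo URL and paths hypothetical):
TAG="$1"    # e.g. 1.4.2
svn checkout "https://svn.example.com/repo/tags/$TAG" "/var/www/releases/$TAG"
# atomically repoint the docroot; rolling back is just symlinking the previous release
ln -sfn "/var/www/releases/$TAG" /var/www/current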
Naturally check out SVN for the repository, Trac to track things, and Apache Ant to deploy.
The basic process is managing the code in Subversion, tracking the repository and developers in Trac, and using Ant deployment scripts to push your site out with the settings needed. Ant allows you to easily deploy a project to a specific location (dev/test/prod, etc.).
You need to look at:
Continuous Integration
Running unit tests on check-in of code to check it is bug free
Potentially rejecting code if it contains a bug
Having nightly builds
Releasing only the last build that was bug free
You may not get to a perfect solution, especially not at first, but the more you use your chosen solution, the more comfortable everyone will get and be able to make suggestions on improving it.
We check stability with Ant every night, and use an Ant script to deploy. It is very easy to configure and use.
I gave a similar answer yesterday to another question. Basically you can work in branches and integrate before going live.
The biggest thing you will have to get your head round is that you are dealing with changes to files, rather than individual files. Once you have branches there isn't really a current version; there are just versions with different changes in them.
