deploying code changes [duplicate] - php

Possible Duplicate:
Best Practices for Code/Web Application Deployment?
I don't know if this is relevant to Stack Overflow or not.
I always update and test code locally, but when it's time to upload the changes it becomes a nightmare.
I either have to go through the Git repo, check what I've done, and selectively upload files (not to mention DB changes), or I have to upload the whole thing all over again.
Selectively uploading files and applying DB changes is a headache, and a full upload interrupts service for a few minutes, or even worse.
Is there some standard way of doing this?

CodeIgniter-specific:
Name your CodeIgniter application folder something relevant, such as the date or your version number. Do the same with the system folder so you can quickly see which version of CodeIgniter you are running.
If you need to do a full upload, here's how to do it with no interruption: name the application folder with a newer date or version, upload the folder, and then in your index.php file change the name of the application folder to the new version. Besides switching to the new version with no downtime, this also gives you instant rollback if you find a bug: just change the application name back in your index file. Much less stress than trying to selectively replace files on a live site.
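A minimal sketch of that index.php switch, using CodeIgniter's standard $system_path and $application_folder variables (the versioned folder names here are hypothetical):

    <?php
    // index.php (CodeIgniter front controller, excerpt)

    // Point at the newly uploaded version to go live; point back at the
    // previous folder name for an instant rollback.
    $system_path        = 'system_2.1.3';             // hypothetical versioned system folder
    $application_folder = 'application_2013_04_02';   // hypothetical versioned app folder

    // ... the rest of the stock index.php bootstrap stays unchanged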
Get an app like SmartGit or similar so you can do your Git tracking and changes locally on your computer. Make your commits locally; you will do it more often because it's local, so you will have a better commit history, and it will be much faster when you need to push to the remote.
A small point, but if you put your application and system folders one level above the index.php file, it's much more secure, and it keeps things organized if you have different application folders.

Related

Work as a group on a webapp using CVS

My friend and I are in different countries and have been developing a LAMP web app for several weeks. All this time we have been sharing source code over FTP, and the PHP files have become messy. I have heard about CVS and have been reading about it, but I still cannot figure out how exactly it works.
How could CVS help me in this matter?
I would much appreciate it if someone could point me in the right direction.
OK, here comes a very simple explanation of VCS. After using it for a while you'll laugh at this explanation, but for now I guess it should help you.
What are the problems of your current ftp file sharing?
If two people upload the same file, one of them gets overwritten
After uploading a file you'll only see who changed it (the last time), but not where it was changed
You can't provide information about the changes (other than putting comments in the files themselves)
You can't go back in time; once uploaded, old files are lost
With version control you can solve these problems:
Files either get merged into one new file, or get overwritten, but the old file is still stored so you can roll back if needed
You can see who made which changes, and when
You can provide comments when you "upload" your files about what changed (without storing these comments inside the files)
You can always go back in time and restore old "uploads"/changes
You can also create small side projects by branching. This basically lets you split your project into smaller pieces and work on them separately.
So at the beginning of your work you usually bring your local sources up to date by pulling all the changes that have been made. Then you do your work, and afterwards you update the online version with your changes so that other developers can pull them and continue working on them, or integrate them into their own current changes.
How to implement this sorcery?
You could google for "how to implement git" or "how to implement svn" but I would recommend you to use an online service as a beginner. Here is a list of services: https://git.wiki.kernel.org/index.php/GitHosting
My personal preference for closed source projects with a low number of developers is https://bitbucket.org/. You get a small wiki page and bug tracking tool provided with some of the services. If you want to use bitbucket, here is the very easy to understand documentation: https://confluence.atlassian.com/display/BITBUCKET/Bitbucket+101
Important to know:
Soon you'll learn that you don't really "upload files" as I've written multiple times; you change lines of code. And you don't upload them, you "commit" them.
While cvs could help, not many developers will recommend using it for new projects. It has largely been replaced with Subversion (svn), but even that is falling out of favour. Many projects these days use distributed version control with git or Mercurial (hg).
A good introduction to git can be found in the free online book Pro Git.
In any case, these things are all version control systems. They help to synchronize the code between developers, and also let you track
who changed code,
when it was changed,
why it was changed, and
how it was changed.
This is very important on projects with multiple developers, but there is value in using such a system even when working on your own.

How to start creating a PHP script that will be installed on many servers?

I was wondering how to start coding a PHP script that will be used on many websites.
Should I start by creating the database first? And then start creating the PHP files that will process data from the database?
And should I start thinking about an install wizard for this script from the beginning, or create one later when I finish the project?
I'm really confused about how to start a project. Can you please give me some advice?
Thanks everyone :D
Should I start by creating the database first?
If you are going to use a database in your PHP script, then yes, you should set up the database first. MySQL is a good start.
And then start creating PHP files that will process data from the database?
I would start on one server first, and create one PHP file called index.php that will do a database query. Then work your way to multiple PHP files from there.
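A minimal sketch of such a starting point, assuming a local MySQL database with a hypothetical users table and credentials:

    <?php
    // index.php - the simplest possible "query the database and show it" page.
    $pdo = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');

    // Fetch a few rows and print them; the table and columns are hypothetical.
    foreach ($pdo->query('SELECT id, name FROM users LIMIT 10') as $row) {
        echo $row['id'], ': ', htmlspecialchars($row['name']), "<br>\n";
    }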
And should I start thinking about an install wizard for this script at first, or create one later when I finish the project?
Installing PHP files is, 90% of the time, as simple as just copying them onto your new server. I wouldn't worry about an install wizard just yet.
Another general tip because you are a beginner: install WampServer, which is a webserver, PHP and MySQL server in one that runs on your local computer. This is great for developing, because you can just put your PHP files in WAMP's www directory (e.g. C:\wamp\www), edit them, and directly see the result in your browser through http://localhost/. Then when you are happy you can upload to the server, or to multiple servers (just by copying).
Most PHP software does not have, or need for that matter, what you would call an install wizard.
I would suggest you develop whichever way feels most natural to you.
Some people find it easier to start with the database design, while others prefer to write some code first and then expand the DB schema further. There really is no right way to do it.
Starting a PHP project can be as easy as creating a text file and pumping out lines of code; however, if you plan on creating a sizeable project, I would suggest a fully featured IDE.
Decide what dependencies your script has.
Decide which minimum version of PHP the script will be compatible with.
Work out a script which queries the user's setup to detect whether these conditions are met (e.g. does it rely on the mysql extension being installed?); see the sketch after this list.
Detail how to meet each of the dependencies in case they are missing.
Explain what the minimum supported version is if your script detects it is running below that version.
Test it on your target operating systems.
Run a script which creates a database, and test whether it was created. Provide detailed instructions on how to do this manually, and on how to grant the correct privileges.
If necessary give them a config file which permits them to enter key information such as doc_root etc.
Conform to common wisdom such as short_open_tag = Off, or else override these settings. Imagine the user is on shared hosting and is running with safe_mode = On.
Try and follow your own instructions and re-install it on your localhost, then on a live server - ideally on a variety of OSs too.
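A minimal sketch of the setup-check script mentioned in the list above (the required PHP version and extension are hypothetical examples):

    <?php
    // install_check.php - verify the user's environment before installing.
    $errors = array();

    if (version_compare(PHP_VERSION, '5.3.0', '<')) {
        $errors[] = 'PHP 5.3.0 or newer is required; you are running ' . PHP_VERSION . '.';
    }
    if (!extension_loaded('mysqli')) {
        $errors[] = 'The mysqli extension is required but not installed.';
    }

    if ($errors) {
        foreach ($errors as $error) {
            echo $error, "\n";   // tell the user exactly which condition failed
        }
        exit(1);
    }
    echo "Environment looks good.\n";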

How to self-update PHP+MySQL CMS?

I'm writing a CMS in PHP+MySQL. I want it to be self-updating (through one click in the admin panel). What are the best practices?
How do I compare the current version of the CMS with the version of the update (the application itself and the database)? Should it just download a zip archive, unzip it and overwrite the files? (But what to do with files that are no longer used?) How do I check whether an update downloaded correctly? It also supports modules, and I want these modules to be downloadable from the admin panel of the CMS.
And how should I update the MySQL tables?
Keep your code in a separate location from configuration and other variable files (uploaded images, cache files, etc.)
Keep the modules separate from the main code as well.
Make sure your code has file system permissions to change itself (use SuPHP for example).
If you do these, the simplest approach would be to completely download the new version (no incremental patches) and unzip it into a directory adjacent to the one containing the current version. Because there won't be variable files inside the code directory, you can just remove or rename the old one and rename the new one to replace it.
You can keep the version number in a global constant in the code.
As for MySQL, there's no other way than making an upgrade script for every version that changes the DB layout. Even automatic solutions to change the table definition can't know how to update the existing data.
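A minimal sketch of such per-version upgrade scripts, assuming a schema_version table and one numbered SQL file per version bump (the file layout, credentials and target version are all hypothetical):

    <?php
    // upgrade.php - apply every DB upgrade script between the installed
    // version and the version shipped with this release.
    $pdo = new PDO('mysql:host=localhost;dbname=cms', 'dbuser', 'dbpass');

    $current = (int) $pdo->query('SELECT version FROM schema_version')->fetchColumn();
    $target  = 5; // version number this release ships with (hypothetical)

    for ($v = $current + 1; $v <= $target; $v++) {
        // each migrations/N.sql holds the changes for one version bump;
        // this sketch assumes one statement per file to keep exec() simple
        $pdo->exec(file_get_contents(__DIR__ . "/migrations/$v.sql"));
        $pdo->exec("UPDATE schema_version SET version = $v");
    }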
A slightly more experimental solution could be to use something like the phpsvnclient library.
With features:
List all files in a given SVN repository directory
Retrieve a given revision of a file
Retrieve the log of changes made in a repository or in a given file between two revisions
Get the repository latest revision
This way you can see if there are new files, removed files or updated files and only change those in your local application.
I reckon this will be a little harder to implement, but the benefit would probably be that it is easier and quicker to push updates to your CMS.
You have two scenarios to deal with:
The web server can write to files.
The web server can not write to files.
This just dictates whether you will be decompressing a ZIP file or using FTP to update the files. In either case, your first step is to take a dump of the database and a backup of the existing files, so that the user can roll back if something goes horribly wrong. As others have said, it's important to keep anything that the user will likely customize out of the scope of the update. WordPress does this nicely. If a user has made changes to core logic code, they are likely smart enough to resolve any merge conflicts on their own (and smart enough to know that a one-click upgrade is probably going to lose their modifications).
Your second step is to make sure that your script doesn't die if the browser is closed. This is a process that really should not be interrupted. You could accomplish this via ignore_user_abort(true);, or some other means. Or, if you like, allow the user to check a box that says "Keep going even if I get disconnected". I'm assuming that you'll be handling errors internally.
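A minimal sketch of that guard, using PHP's ignore_user_abort() and set_time_limit():

    <?php
    // Keep the upgrade running even if the admin closes the browser tab.
    ignore_user_abort(true);
    set_time_limit(0);   // the upgrade may take longer than max_execution_time

    // ... download, verify, unpack, migrate, clean up ...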
Now, depending on permissions, you can either:
Compress the files to be updated to the system /tmp directory
Compress the files to be updated to a temporary file in the home directory
Then you are ready to:
Download and decompress the update in situ, i.e. in place, or
Download and decompress the update to the system's /tmp directory and use FTP to update the files in the web root
You can then:
Apply any SQL changes as needed
Ask the user if everything went OK
Roll back if things went badly
Clean up your temp directory in the system /tmp directory, or any staging files in the user's web root / home directory.
The most important aspect is making sure you can roll back changes if things go bad. The other thing to ensure is that if you use /tmp, be sure to check the permissions of your staging area. 0600 should do nicely (see the sketch below).
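A small sketch of creating such a staging area; note that a directory needs the execute bit to be traversable, so 0700 on the directory itself, with 0600 for the files inside:

    <?php
    // Create a private staging directory for the update files.
    $staging = '/tmp/cms-upgrade-' . uniqid();
    mkdir($staging, 0700);   // owner-only access; files inside can be 0600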
Take a look at how WordPress and others do it. If your choice of licenses and theirs agree, you might even be able to re-use some of that code.
Good luck with your project.
There is a SQL library called SQLOO (which I created) that attempts to solve this problem. It's still a little rough, but the basic idea is that you set up the SQL schema in PHP code, and then SQLOO changes the current database schema to match the code. This allows the SQL schema and the attached PHP code to be changed together, and in much smaller chunks.
http://code.google.com/p/sqloo/
http://code.google.com/p/sqloo/source/browse/#svn/trunk/example <- examples
Based on experience with a number of applications, CMS and otherwise, this is a common pattern:
Upgrades are generally one-way. It's possible to take a snapshot of full system state for a restore upon failure, but to restore usually entails losing any data/content/logs added to the system since the upgrade. Performing an incremental rollback can put data at risk if something were not converted properly (e.g. database table changes, content conversions, foreign key constraints, index creation, etc.) This is especially true if you've made customizations that rollback scripts couldn't possibly account for.
Upgrade files are packaged with some means of authentication/verification, such as MD5 or SHA-1 hashes and/or a digital signature, to ensure they came from a trusted source and were not tampered with. This is particularly important for automated upgrade processes; suppose a hacker exploited a vulnerability and told it to upgrade from a rogue source. (See the sketch after this list.)
Application should be in an offline mode during the upgrade.
Application should perform a self-check after an upgrade.
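A minimal sketch of the hash verification mentioned above, assuming the update archive ships alongside a published SHA-1 checksum (file names and URL are hypothetical):

    <?php
    // verify_package.php - check a downloaded update archive before unpacking.
    $archive  = '/tmp/cms-update-2.0.zip';
    $expected = trim(file_get_contents('https://updates.example.com/cms-update-2.0.sha1'));

    if (sha1_file($archive) !== $expected) {
        unlink($archive);   // discard the corrupt or tampered download
        exit("Update package failed verification; aborting upgrade.\n");
    }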
I agree with Bart van Heukelom's answer, it's the most usual way of doing it.
The only other option would be to turn your CMS into a bunch of remote Web Services/scripts and external CSS/JS files that you host in one location only.
Then everyone using your CMS would connect to your central "CMS server" and all that would be on their (calling) server is a bunch of scripts to call your Web Services/scripts that do all the processing and output. If you went down this route you'd need to identify/authenticate each request so that you returned the corresponding data for the given CMS user.
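A rough sketch of what the thin script on each customer's server might look like under this model (the central URL, endpoint and site key are all hypothetical):

    <?php
    // page.php - thin client stub; all processing happens on the central CMS server.
    $response = file_get_contents(
        'https://cms.example.com/render?page=home&site_key=ABC123'  // hypothetical endpoint and key
    );
    echo $response;   // the central server returns ready-made HTML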

Website doesn't work during uploading of crucial files

I have a problem with maintenance of my PHP-based website. My website is built on the Zend Framework. When I upload a new copy or version, then during the upload, especially while crucial files like models and controllers are being uploaded and rewritten, the site understandably won't run.
Is there a way to upload a website without having to go through this issue?
My updates are quite regular, say once or twice a week in this case.
You can make use of the fact that renaming directories is quick and easy, even through FTP. What I usually do is (see the sketch after these steps):
Have two directories, website_live and website_upload
website_live contains the live website (obviously)
Upload contents to website_upload
Rename website_live to website_old (or whatever)
Rename website_upload to website_live
done! Downtime less than two seconds if you rename quickly.
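A minimal PHP version of that swap, run from the parent directory of the two folders (the dated backup name is just one way to keep old versions around):

    <?php
    // deploy.php - swap the freshly uploaded tree into place.
    if (is_dir('website_old')) {
        // keep the previous backup under a dated name instead of deleting it
        rename('website_old', 'website_old_' . date('Ymd_His'));
    }
    rename('website_live', 'website_old');      // step: live -> old
    rename('website_upload', 'website_live');   // step: upload -> live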
It gets a bit more complex if you have uploaded content in the old version (e.g. from a CMS) that you need to transfer to the new one. It's cumbersome to do manually every time, but basically, it's just simple rename operations too (renaming works effortlessly in FTP as well).
This is a task that can be automated quite nicely using a simple deployment script. If you're on Linux, setting up a shell script for this is easy. On Windows, a very nice tool I've worked with to do automated FTP synchronizing, renaming and error handling - even with non-technical people starting the process - is ScriptFTP. It comes with a good scripting language, and good documentation. It's not free, though.
If you're looking to get into hard-core automated PHP deployment, I've been doing some research in that field recently. Maybe the answers to my recent bounty question can give you inspiration.

How do you deploy a website to your webservers?

At my company we have a group of 8 web developers for our business web site (entirely written in PHP, but that shouldn't matter). Everyone in the group is working on different projects at the same time, and whenever they're done with their task they immediately deploy it (because business moves fast these days).
Currently the development happens on one shared server with all developers working on the same code base (using RCS to "lock" files away from others). When deployment is due, the changed files are copied over to a "staging" server and then a sync script uploads the files to our main webserver from where it is distributed over to the other 9 servers.
Quite happily, the web dev team asked us for help in order to improve the process (after us complaining for a while) and now our idea for setting up their dev environment is as follows:
A dev server with virtual directories, so that everybody has their own codebase,
SVN (or any other VCS) to keep track of changes
a central server for testing holding the latest checked in code
The question is now: how do we manage to deploy the changed files onto the server without accidentally uploading bugs from other projects? My first idea was to simply export the latest revision from the repository, but that would not give full control over the files.
How do you manage such a situation? What kind of deployment scripts do you have in action?
(As a special challenge: the website has organically grown over the last 10 years, so the projects are not split up in small chunks, but files for one specific feature are spread all over the directory tree.)
Cassy - you obviously have a long way to go before you'll get your source code management entirely in order, but it sounds like you are on your way!
Having individual sandboxes will definitely help things. Next, make sure that the website is ALWAYS just a clean checkout of a particular revision, tag or branch from Subversion.
We use git, but we have a similar setup. We tag a particular version with a version number (in git we also get to add a description to the tag; good for release notes!) and then we have a script that anyone with access to "do a release" can run that takes two parameters -- which system is going to be updated (the datacenter and if we're updating the test or the production server) and then the version number (the tag).
The script uses sudo to then run the release script under a shared account. It does a checkout of the relevant version, minimizes JavaScript and CSS [1], pushes the code to the relevant servers for the environment and then restarts what needs to be restarted. The last line of the release script connects to one of the webservers and tails the error log.
On our websites we include an html comment at the bottom of each page with the current server name and the version -- makes it easy to see "What's running right now?"
[1] and a bunch of other housekeeping tasks like that...
You should consider using branching and merging for individual projects (on the same codebase), if they make huge changes to the shared codebase.
We usually have a local dev environment (meaning a webserver running locally) for testing uncommitted code (you don't want to commit non-functioning code at all), but that dev environment could even be on a separate server using shared folders.
However, committed code should be deployed to a staging server for testing before putting it in production.
You can probably use Capistrano; even though it's more for Ruby, there are some articles that describe how to use it for PHP.
I think Phing can be used with CVS but not with SVN (at least that's what I last read).
There are also some projects around that mimic Capistrano but are written in PHP.
Otherwise there is also a custom-made solution:
Tag the files you want to deploy.
Check out the files using the tag into a specific directory.
Symlink that directory to the current document root (easy to roll back to the previous version).
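A minimal sketch of the tag/checkout/symlink flow in PHP (paths and tag name are hypothetical; the release is assumed to be checked out already):

    <?php
    // release.php - atomically point the document root at a new release.
    $tag     = 'v1.4.2';                       // hypothetical release tag
    $release = "/var/www/releases/$tag";       // tag already checked out here
    $current = '/var/www/current';             // the web server's DocumentRoot

    @unlink($current . '.new');                // clear any leftover temp link
    symlink($release, $current . '.new');      // build the new link beside the old one
    rename($current . '.new', $current);       // atomic swap: no moment without a docroot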
Naturally, check out SVN for the repository, Trac to track things, and Apache Ant to deploy.
The basic process is managing the code in Subversion, tracking the repository and developers in Trac, and using Ant deployment scripts to push your site out with the settings needed. Ant allows you to easily deploy a project to a specific location (dev/test/prod, etc.).
You need to look at:
Continuous Integration
Running unit tests on check-in of code, to check it is bug-free
Potentially rejecting code if it contains a bug
Having nightly builds
Releasing only the last build that was bug-free
You may not get to a perfect solution, especially not at first, but the more you use your chosen solution, the more comfortable everyone will get and be able to make suggestions on improving it.
We check stability with Ant every night, and we use an Ant script to deploy. It is very easy to configure and use.
I gave a similar answer yesterday to another question. Basically you can work in branches and integrate before going live.
The biggest thing you will have to get your head around is that you are dealing with changes to files, rather than with individual files. Once you have branches, there isn't really a "current version"; there are just versions with different changes in them.
