Push website updates without affecting the user - php

I'm working on my first website. When I update a file on the server (uploading the newest version), for example the stylesheet or a .php file, and I try to load the page while the upload is still in progress, I get a blank screen or broken CSS. Sometimes the cache has to be cleared before the correct stylesheet loads.
When pushing updates to users, how can this be prevented?

One way to do this is with "cache busting". Every time you make changes to your .css or .js file, you rename the file:
style_v1.0.css // Release 1.0
style_v1.1.css // Release 1.1
style_v2.0.css // Release 2.0
Or append a version query string after the filename:
style.css?v=1.0 // Release 1.0
style.css?v=1.1 // Release 1.1
style.css?v=2.0 // Release 2.0
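If you don't want to rename files or bump a version number by hand, the query string can be derived from the file's modification time. This is just a minimal sketch (not part of the answer above), and the stylesheet path is an assumption:
<?php
// Version the stylesheet by its last-modified time so the URL changes
// automatically whenever a new copy is uploaded (path is illustrative).
$cssPath = __DIR__ . '/css/style.css';
$version = file_exists($cssPath) ? filemtime($cssPath) : '1.0';
?>
<link rel="stylesheet" href="/css/style.css?v=<?php echo $version; ?>">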

Regarding the .php file updates, the users should just receive the new responses from the updated files.
With respect to the CSS, I'd suggest trying some JavaScript via jQuery.
Either way, it is good practice not to work on a live version of a website, but on a "sandbox" copy of it saved on your local machine or in another folder on the server. After you've achieved the desired features, upload the files at night or when your website has minimal traffic.

We've all been there and done this. Although the question probably isn't a good fit for Stack Overflow, there is a chance it'll remain here and other folks will find it. With this in mind here's an example workflow I use and some tips:
General tips
Never, ever, ever, ever work on a website live. Always take a backup and set it up locally on your machine using WAMP, MAMP, LAMP, or similar.
Set up source control. There are plenty of providers out there that will give you a free account.
Learn about Git and branching.
Don't use FTP. It's full of problems.
SSH to your server. If you're unsure about what you're doing, read some tutorials.
Workflow
My typical workflow when working on websites is as follows:
Clone the repo from Git
Set up website locally
Make changes to the files/website
Commit those changes back into source control and push to the develop branch.
Run some tests (selenium/similar) on the develop code
If those tests pass, merge the develop branch to the master branch by doing git merge develop --no-ff -m "message"
Have a webhook set up to push code to your server/live/production website (a rough sketch of such a receiver follows this list).
Only ever do a push to production at a time when both:
Visitors are at a low point (i.e. night)
When you have enough time to fix something going wrong
If any issues happen when pushing to Production, rollback to the last version and figure out what went wrong.
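As a rough illustration of the webhook step mentioned above (the endpoint name, secret, header and paths are assumptions, not part of this answer), a minimal GitHub-style receiver on the live server might look like this:
<?php
// deploy.php - illustrative webhook receiver, assuming the live server can run git.
$secret  = 'change-me'; // shared secret configured in the webhook settings
$payload = file_get_contents('php://input');
$header  = isset($_SERVER['HTTP_X_HUB_SIGNATURE_256']) ? $_SERVER['HTTP_X_HUB_SIGNATURE_256'] : '';

// Reject requests whose HMAC signature does not match the payload.
$expected = 'sha256=' . hash_hmac('sha256', $payload, $secret);
if (!hash_equals($expected, $header)) {
    http_response_code(403);
    exit('invalid signature');
}

// Pull the master branch into the live checkout (path is an assumption).
exec('cd /var/www/site && git pull origin master 2>&1', $output, $status);
http_response_code($status === 0 ? 200 : 500);
echo implode("\n", $output);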

You can also append a timestamp to your file name, so each time you refresh you get a unique URL:
<script src="../assets/js/header.js?rndstr=<?php echo urlencode(getCurrentTimeStamp()); ?>"></script>
<?php
// urlencode() keeps the spaces/colons in the timestamp from breaking the URL;
// note this busts the cache on every single request.
function getCurrentTimeStamp() {
    return date('Y-m-d G:i:s');
}
?>

Related

How to separate application from data in a good manner?

During development of my php-app I've stored all data that the app produces within the app folder, and hence the data is part of my git repo. The root of the repo is also the webroot. This has worked well for me during the beginning of development, and it sort of made sense at the time to commit the app together with the data.
But I feel this needs to be redone now for several reasons. One being that the data size is getting too big for me to comfortably store it in the git repo.
Another is that I simply do not need to have a snapshot of the data for a given commit.
The main reason, I guess, is that I've deployed my app to my NAS (Synology) for real use, and I push commits to it via WebDAV. And to my knowledge of git, it's not possible, or at least not practical, to do pushes while leaving certain data in the remote repo untouched.
Also, it would be nice (though not necessary) if both the app on the NAS and the app on my computer had access to the same data directory while testing. This made me think of moving the data out of the app directory and accessing it via FTP. But I'm not sure that would be a good idea, especially since all the "real" access to the files would be from the very same device, the NAS.
What would be a good structure and file access methods etc for this?
I always keep my data in a separate directory which I take out of git with a .gitignore file. This means that when I push changes I am not overwriting the live data.
If I need fresh data to test on I use ftp to download a new version of the data directory from the live server to my development server.
This works for me because most of the data is uploaded by users so it quickly gets out of sync with my development copy and I don't want to overwrite it. It also means that I can't accidentally corrupt their data with a mistake in development.
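To make the layout concrete, here is a hedged sketch of the arrangement described above; the directory names and the constant are illustrative, not something the answer prescribes:
<?php
// Assumed layout (illustrative):
//   /var/www/app        <- the git repo / webroot
//   /var/www/app-data   <- user-uploaded data, never touched by deploys
// If the data directory lives inside the repo instead, exclude it with a
// one-line .gitignore entry such as "data/".
define('DATA_DIR', dirname(__DIR__) . '/app-data');

// All reads and writes go through the constant, so pushing code never
// overwrites live data.
file_put_contents(DATA_DIR . '/example.txt', 'hello');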

Make GIT auto-commit every file change

I know that this was already asked, but I want to share my situation also to focus on alternative solutions and tips. We develop PHP websites, and we test everything on a remote semi-production server, to see how our web applications work in the real world.
Usually, we mount remote SFTP on our computers and read/write directly on the server (then we have a nightly backup from the test server to another), so we don't have to save-and-upload every file we change.
Now we are considering switching to GitHub to track file changes. I already installed and configured GitHub on my Mac, and a webhook that automatically pulls from our GIT repository to our test web server. So, when I commit and sync to GitHub, I can refresh the page(s) I changed in my browser to see it in action.
The problem is that web scripting is often made of many micro-changes; sometimes you have to continuously hit "save" and switch to the browser to check if it works correctly... so having to commit (and specify a summary!) every time is unacceptable.
How we can solve this problem with the workflow?
Thank you in advance.
Commit only good code. Test with sftp if you have to, but only commit good code.
Losing a benefit like git bisect is a pretty serious side effect of what you're proposing, among others.
Simply put: you shouldn't do that.
Source code management systems are not for testing and transferring your code under development to testing servers. You should either work with SFTP as before or set up rsync. These are the tools for transfer.
You should use GitHub to store only working, complete changes. After testing your "micro-changes" and getting to a working snapshot you should commit and describe everything you modified since the last commit.
Anything else would be just hacking the tools you try to use. Doing what you propose would just mean losing (nearly) all benefits that git gives you and would be much less convenient than using plain sftp/rsync.

How to upload changes automatically?

I'm developing a simple website (on a VPS) using PHP.
My question is: how can I push the changes I make using a secure protocol?
At the moment I use scp, but I need software that checks whether a file has changed and, if so, sends it to the server.
Advice?
use inotify+rsync or lsyncd http://code.google.com/p/lsyncd/
check out this blog post for the inotify method http://www.cyberciti.biz/faq/linux-inotify-examples-to-replicate-directories/
If you have complete access to your VPS, then you can mount your remote home directory, via ssh and webdav, locally as a normal filesystem. Afterwards you can edit on your computer locally. Any changes should be automatically uploaded.
Have a look at this relevant article
FTP(S): Set up an FTP server on the target, together with an SSL cert (a personal cert is good enough).
SSH: You can pass files around through SSH itself, but I found it most unreliable (imho).
Shares: You could share a folder (as the other people suggested) with your local machine. This would be the perfect solution if it wasn't hard at diagnosing disconnection issues. I've been through this and it is far from a pleasant experience.
Dropbox: Set up a dropbox server on your VPS and a dropbox client on your local PC. Using dropbox (or similar methods like skydrive, icloud..), also has some added advantages, eg, versioning, recovery, conflict detection...
All in all, I think the FTPS way is the most reliable and popular one I've ever used. Via SSH, it caused several issues after using it for about two days. The disk sharing method caused me several issues, mostly because it requires that you are somewhat knowledgeable about the protocol, etc.
The dropbox way, although a bit time-consuming to set up, is still a good solution. However, I would only use it for larger clients, not the average pet hobby website :)
Again, this is just from experience.
You could also use Subversion to do this, although the setup may take a little while.
You basically create a Subversion repository and commit all of your code. Then on your web server, make the root directory a checkout of that repository.
Create a script that runs an svn update command, and set it as a post-commit hook. Essentially, the script would be executed every time you do a commit from your workstation.
So in short, you would...
Update your code locally
Commit via Subversion (use Tortoise SVN, it's easy and simple to use)
Automatically, the commit would trigger the post-commit hook on the server and run the svn update command to synchronize the code in your root directory.
You would of course have to hide all of the .svn folders from the public using .htaccess files (providing you're using Apache).
I know this would work, as I have done it in the past.
The beauty of this option is that you also get two in one: source versioning and easy updates. If you make a mistake, you can also easily revert it.
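As a rough illustration (not part of the answer above), the post-commit hook could look something like this, assuming the repository and the webroot live on the same machine; hooks are usually shell scripts, but any executable works, including a PHP CLI script:
#!/usr/bin/env php
<?php
// post-commit - Subversion passes the repository path and revision as arguments.
list(, $repo, $rev) = $argv;

// Update the live checkout (the webroot path is an assumption).
exec('svn update ' . escapeshellarg('/var/www/html') . ' 2>&1', $output, $status);

// Log the result so a failed update is visible (log path is an assumption).
file_put_contents('/var/log/svn-deploy.log',
    date('c') . " $repo r$rev exit=$status\n" . implode("\n", $output) . "\n",
    FILE_APPEND);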

Process for updating a live website

What is the best process for updating a live website?
I see that a lot of websites (e.g. StackOverflow) have warnings that there will be downtime for maintenance in advance. How is that usually coded in? Do they have a config value which determines whether to display such a message in the website header?
Also, what do you do if your localhost differs from the production server, and you need to make sure that everything works the same after you transfer? In my case, I set up development.mydomain.com (.htaccess authentication required), which has its own database and is basically my final staging area before uploading everything to the live production site. Is this a good approach to staging?
Lastly, is a simple SFTP upload the way to go? I've read a bit about some more complex methods like using server-side hooks in Git.. Not sure how this works exactly or whether it's the approach I should be taking.
Thanks very much for the enlightenment..
babonk
This is (approximately) how it's done on Google App Engine:
Each time you deploy an application, it is associated with a subdomain according to its version:
version-1-0.example.com
version-1-1.example.com
while example.com is associated with one of the versions.
When you have a new version of the server-side software, you deploy it to version-2-0.example.com, and when you are ready to put it live, you associate example.com with it.
I don't know the details, because Google App Engine does that for me, I just set the current version.
Also, when SO or another big site has downtime, it is more likely to be a hardware issue than a software one.
That will really depend on your website and the platform/technology behind it. For a simple website, you just update the files with FTP, or if the server is locally accessible, you just copy your new files over. If your website is hosted by some cloud service, then you have to follow whatever steps they offer, because a cloud-based hosting service usually won't let you access the files directly. For a complicated website that has a backend DB, it is not uncommon that whenever you update code, you have to update your database as well. In order to make sure both are updated at the same time, you will have to take your website down. To minimize the downtime, you will probably want a well-tested update script to do the actual work. That way you can take down the site, run the script, and fire it up again.
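As a hedged sketch of such an update script (the paths, flag file and migration step are assumptions, not something the answer specifies), the take-down/update/bring-back-up sequence could be:
<?php
// update.php - illustrative only.
$flag = '/var/www/site/maintenance.flag';

// 1. Take the site down: the front controller serves a 503 "down for
//    maintenance" page while this flag file exists.
touch($flag);

// 2. Update code and database together.
exec('cd /var/www/site && git pull origin master 2>&1', $out, $codeStatus);
exec('php /var/www/site/migrations/run.php 2>&1', $out, $dbStatus);

// 3. Bring the site back up only if both steps succeeded.
if ($codeStatus === 0 && $dbStatus === 0) {
    unlink($flag);
    echo "Update complete.\n";
} else {
    echo "Update failed, site left in maintenance mode:\n" . implode("\n", $out) . "\n";
}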
With PHP (and Apache, I assume), it's a lot easier than some other setups (having to restart processes, for example). Ideally, you'd have a system that knows to transfer just the files that have changed (i.e. rsync).
I use Springloops (http://www.springloops.com/v2/) to host my git repository and automatically deploy over [S/]FTP. Unless you have thousands of files, the deploy feels almost instantaneous.
If you really wanted to, you could have an .htaccess file (or equivalent) to redirect to a "under maintenance" page for the duration of the deploy. Unless you're averaging at least a few requests per second (or it's otherwise mission critical), you may not even need this step (don't prematurely optimize!).
If it were me, I'd have an .htaccess file that holds redirection instructions, and set it to only redirect during your maintenance hours. When you don't have an upcoming deploy, rename the file to ".htaccess.bak" or something. Then, in your PHP script:
<?php if (file_exists('/path/to/.htaccess')) : ?>
<h1 class="maintenance">Our site will be down for maintenance...</h1>
<?php endif; ?>
Then, to get REALLY fancy, set up a Springloops pre-deploy hook to make sure your maintenance redirect is in place, and a post-deploy hook to change it back on success.
Just some thoughts.
-Landon

Making Changes to a Live Site (CodeIgniter, but not specific to it)

I'm using CodeIgniter, if this makes it easier. If a website is live, with a populated database and users accessing it, and I have a new idea to implement into the website, how should I do it? Do you work directly on the live site?
Or do you copy the database and the files to a local server (MAMP/WAMP/XAMPP) and work on it there, then if it works update the live site with the changes? For this second method, is there any way to check which files have been changed and upload only those? What if it works on the local server, but after updating the live site it does not work?
CodeIgniter's configuration also has the option of a default database and other databases. I wonder how these can be used for testing?
Don't work directly on the live site. Instead, have a development environment (using, say, VMware or VirtualBox on your machine) and clone the live environment. Get your code in version control (I'll say it again: GET YOUR CODE IN VERSION CONTROL), do your development on the development machine, against a dev branch in version control. After you're done testing and happy with the changes, commit them to a 'deployments' or 'live' branch, and deploy on the live site from there. Be sure to do a backup of the database before you roll out the new code.
Edit: use symlinks to stage your new code base on the live site. If it doesn't work, just switch it back to the old directory. Saves you a lot of grief!
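A rough sketch of that symlink trick (the release paths are assumptions): deploy the new code next to the old one, then switch a single symlink; because rename() on the same filesystem is atomic, visitors never see a half-updated site, and rolling back is just pointing the link at the previous release.
<?php
// Illustrative only. The webroot points at the "current" symlink, e.g.
//   /var/www/current -> /var/www/releases/v41
$current = '/var/www/current';
$new     = '/var/www/releases/v42';   // freshly uploaded release

// Create the link under a temporary name, then atomically rename() it over
// the existing symlink.
$tmp = $current . '.tmp';
@unlink($tmp);
symlink($new, $tmp);
rename($tmp, $current);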
Read up on version control (svn, git, et al.).
Never work on a live site. Work preferably on another server (to prevent while(1){...} crashes etc.), or at least on the same server in another documentroot/domain, preferably with access limited to your IP only.
Normally I only copy the table definitions (mysqldump -d, i.e. --no-data, is nice for that) and have another database altogether. If you need the latest and greatest data, you could replicate your main database to a test database, which also gives you the advantage of a cheap backup if you haven't got one already.
I usually set a switch in the Apache/vhost configuration (SetEnv DEV 1), so that in code I can use if(getenv('DEV')==1) to check whether I can just dump variables on error conditions; this also limits the possibility of accidentally committing/uploading code with a 'development switch' still on.
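A minimal sketch of that switch, assuming the dev vhost contains the line "SetEnv DEV 1" (everything else here is illustrative):
<?php
// Behaviour that is safe on the dev copy but not on the live site.
if (getenv('DEV') == 1) {
    // Dev: show everything, dump variables freely on error conditions.
    ini_set('display_errors', '1');
    error_reporting(E_ALL);
} else {
    // Live: never display errors to visitors, log them instead.
    ini_set('display_errors', '0');
    ini_set('log_errors', '1');
}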
The typical answer to this question is going to be do your work in the test environment, not the production environment. And I agree that that is often the best way to handle changes. If you have the luxury of a test environment, then take full advantage of it. After all, that's what it's there for--to test.
However, that doesn't mean that working in the production environment is completely off-limits. Your decision should be based on a few factors:
Is the operation of your website critical to your business needs?
If so, do all your work in a test environment and deploy it to your live environment when you've fully tested your changes.
Are the changes you're about to make going to have a large impact on the rest of the website?
For example, are you about to change the Database schema? Are you about to change the way users log in or out of your website? If so, do your work in the test environment. If you're changing the behavior of a page that doesn't have any effect elsewhere, you could get away with making the change in the production environment.
How long will your changes take to implement?
If you can't guarantee that your changes won't take longer than 15-20 minutes, do your work in a test environment.
