How to upload changes automatically? - php

I'm developing a simple website (on a VPS) using PHP.
My question is: how can I upload the changes I make using a secure protocol?
At the moment I use scp, but I need software that checks whether a file has changed and, if so, sends it to the server.
Any advice?

Use inotify+rsync or lsyncd: http://code.google.com/p/lsyncd/
Check out this blog post for the inotify method: http://www.cyberciti.biz/faq/linux-inotify-examples-to-replicate-directories/
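As a rough sketch, a watch-and-sync loop built on inotify-tools and rsync might look like the following (the ./site directory and user@your-vps.example.com are placeholders for your own paths and host):
#!/bin/sh
# re-sync the site over SSH every time something changes locally
while inotifywait -r -e modify,create,delete,move ./site; do
    rsync -az --delete -e ssh ./site/ user@your-vps.example.com:/var/www/site/
done
lsyncd does essentially the same thing as a daemon, so you don't have to keep a loop like this running in a terminal.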

If you have complete access to your VPS, you can mount your remote home directory locally as a normal filesystem, via SSH (sshfs) or WebDAV. Afterwards you can edit on your computer locally, and any changes are written straight to the server.
Have a look at this relevant article
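As a minimal sketch of the SSH route, assuming sshfs is installed on your machine and user@your-vps.example.com stands in for your server:
mkdir -p ~/vps-www
sshfs user@your-vps.example.com:/var/www/site ~/vps-www
# ...edit the files under ~/vps-www with your normal editor; saves land on the server directly...
fusermount -u ~/vps-www    # unmount when you're done (umount on macOS)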

FTP(S): Set up an FTP server on the target, together with an SSL cert (a personal cert is good enough).
SSH: You can pass files around through SSH itself, but I found it rather unreliable (imho).
Shares: You could share a folder (as others suggested) with your local machine. This would be the perfect solution if disconnection issues weren't so hard to diagnose. I've been through this and it is far from a pleasant experience.
Dropbox: Set up a Dropbox client on your VPS and another on your local PC. Using Dropbox (or similar services such as SkyDrive or iCloud) also has some added advantages, e.g. versioning, recovery, conflict detection...
All in all, I think the FTPS way is the most reliable and popular one I've used (a quick example follows below). The SSH route caused several issues after about two days of use. The share method also caused me issues, mostly because it requires some familiarity with the underlying protocol.
The Dropbox way, although a bit time-consuming to set up, is still a good solution. However, I would only use it for larger clients, not the average pet hobby website :)
Again, this is just from experience.
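As an illustration of the FTPS option, a one-shot upload with lftp might look like this (host, credentials and paths are placeholders, and lftp is just one client that speaks FTPS):
# mirror the local ./site directory up to the server over FTPS
lftp -u deploy,secret -e "set ftp:ssl-force true; set ftp:ssl-protect-data true; mirror -R ./site /var/www/site; quit" your-vps.example.com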

You could also use Subversion to do this, although the setup may take a little while.
You basically create a Subversion repository and commit all of your code. Then on your web server, make the root directory a checkout of that repository.
Create a script that runs an svn update command, and set it as a post-commit hook on the repository. Essentially, the script is executed every time you commit from your workstation (a minimal hook is sketched at the end of this answer).
So in short, you would...
Update your code locally
Commit via Subversion (use TortoiseSVN; it's simple to use)
The commit automatically triggers the post-commit hook on the server, which runs svn update to synchronize the code in your root directory.
You would of course have to hide all of the .svn folders from the public using .htaccess files (provided you're using Apache).
I know this would work, as I have done it in the past.
The beauty of this option is that you get two in one: source versioning and easy updates. If you make a mistake, you can also easily revert it.
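For reference, a minimal post-commit hook along those lines might look like the following (it assumes the repository and the web server live on the same machine and that /var/www/html is the checkout; the script goes in the repository's hooks/ directory and must be executable):
#!/bin/sh
# hooks/post-commit -- bring the live checkout up to date after every commit
/usr/bin/svn update /var/www/html --non-interactive >> /var/log/svn-deploy.log 2>&1
If the repository lives on a different machine, the hook would have to ssh to the web server and run the update there instead.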


Make GIT auto-commit every file change

I know this has already been asked, but I want to share my situation as well, to focus on alternative solutions and tips. We develop PHP websites, and we test everything on a remote semi-production server to see how our web applications work in the real world.
Usually we mount the remote server over SFTP on our computers and read/write directly on it (with a nightly backup from the test server to another machine), so we don't have to save-and-upload every file we change.
Now we are considering switching to GitHub to track file changes. I have already installed and configured the GitHub client on my Mac, plus a webhook that automatically pulls from our Git repository to our test web server. So, when I commit and sync to GitHub, I can refresh the page(s) I changed in my browser to see them in action.
The problem is that web scripting is often made up of many micro-changes; sometimes you have to continuously hit "save" and switch to the browser to check whether it works correctly... so having to commit (and write a summary!) every time is unacceptable.
How can we solve this problem in our workflow?
Thank you in advance.
Commit only good code. Test with sftp if you have to, but only commit good code.
Losing a benefit like git bisect, among others, is a pretty serious side effect of what you're proposing.
Simply put: you shouldn't do that.
Source code management systems are not meant for testing, or for transferring code that is still under development to testing servers. You should either keep working over SFTP as before or set up rsync; those are the tools for transfer.
You should use GitHub to store only working, complete changes. After testing your "micro-changes" and getting to a working snapshot you should commit and describe everything you modified since the last commit.
Anything else is just abusing the tools you're trying to use. Doing what you propose would mean losing (nearly) all the benefits Git gives you, and it would be much less convenient than plain sftp/rsync.
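If you do set up rsync for the micro-change cycle, a single command (or an editor save hook that runs it) is usually enough; the hostname and paths below are placeholders:
# push the working tree to the test server, leaving the .git directory out of it
rsync -avz --delete --exclude='.git' ./ deploy@test-server.example.com:/var/www/app/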

Process for updating a live website

What is the best process for updating a live website?
I see that a lot of websites (e.g. StackOverflow) give advance warning that there will be downtime for maintenance. How is that usually implemented? Do they have a config value which determines whether to display such a message in the website header?
Also, what do you do if your localhost differs from the production server, and you need to make sure that everything works the same after you transfer? In my case, I set up development.mydomain.com (.htaccess authentication required), which has its own database and is basically my final staging area before uploading everything to the live production site. Is this a good approach to staging?
Lastly, is a simple SFTP upload the way to go? I've read a bit about more complex methods like using server-side hooks in Git... I'm not sure how this works exactly or whether it's the approach I should be taking.
Thanks very much for the enlightenment..
babonk
This is (approximately) how it's done on Google App Engine:
Each time you deploy an application, it is associated with a subdomain according to its version:
version-1-0.example.com
version-1-1.example.com
while example.com is associated with one of the versions.
When you have a new version of the server-side software, you deploy it to version-2-0.example.com, and when you are ready to put it live, you associate example.com with it.
I don't know the details, because Google App Engine does that for me, I just set the current version.
Also, when SO or another big site has downtime, it is more likely to be a hardware issue than a software one.
That will really depend on your website and the platform/technology behind it. For a simple website, you just update the files over FTP, or, if the server is locally accessible, copy your new files over. If your website is hosted by some cloud service, you have to follow whatever steps they offer, because a cloud-based hosting service usually won't let you access the files directly.
For a complicated website with a backend DB, it is not uncommon that whenever you update the code, you have to update the database as well. To make sure both are updated at the same time, you will have to take your website down. To minimize the downtime, you will probably want a well-tested update script to do the actual work. That way you can take down the site, run the script and fire it up again.
With PHP (and Apache, I assume), it's a lot easier than some other setups (no processes to restart, for example). Ideally, you'd have a system that knows to transfer just the files that have changed (e.g. rsync).
I use Springloops (http://www.springloops.com/v2/) to host my git repository and automatically deploy over [S/]FTP. Unless you have thousands of files, the deploy feels almost instantaneous.
If you really wanted to, you could have an .htaccess file (or equivalent) redirect to an "under maintenance" page for the duration of the deploy. Unless you're averaging at least a few requests per second (or it's otherwise mission critical), you may not even need this step (don't prematurely optimize!).
If it were me, I'd have an .htaccess file that holds the redirection instructions, and have it redirect only during your maintenance window. When you don't have an upcoming deploy, rename the file to ".htaccess.bak" or something. Then, in your PHP script:
<?php if (file_exists('/path/to/.htaccess')) : ?>
<h1 class="maintenance">Our site will be down for maintenance...</h1>
<?php endif; ?>
Then, to get REALLY fancy, set up a Springloops pre-deploy hook to make sure your maintenance redirect is in place, and a post-deploy hook to change it back on success.
Just some thoughts.
-Landon

How to optimize upgrading web application?

I maintain a custom PHP application (built for me) that is hosted on a web server. Sometimes I add new features or fix bugs, and after testing locally I upload the changes to the web server. It's not a critical application (it's a game), but most of the time there are some people connected.
The steps that I make to upgrade the application:
Access via FTP (Filezilla)
Upload a .htaccess file that redirects everyone (except my IP) to a mantain.html file
Check that access is denied for every IP except mine.
Backup old code
Upload new code
Go to phpMyAdmin
Backup DB
Execute scripts for the DB
Test that all works fine (if not -> revert the backups)
Remove the .htaccess file
I usually spend an average of 30 minutes doing these steps, and I'm wondering if there is any way to optimize or automate them to spend less time. I also know that if I can automate some steps, they will be less error-prone.
Several other answers suggest PHP-specific deployment tools, but since I'm not very familiar with PHP, I'll offer some general tips. These suggestions may be made redundant by some of the tools already mentioned, though.
First off, don't upload a new .htaccess file every time--just keep two of them on your server. Perhaps call them .htaccess-permanent and .htaccess-maintenance. Then create a symlink to the one that ought to be active. Once you've tested that access is properly denied, you don't have to repeat that manual testing phase every single time you do an upgrade.
I'd also write a shell script to do most everything for me. My new work flow would look like this:
Upload new code to server in a directory called new/
Log in to the server via shell, and execute the upgrade script
Test the new site
Run upgrade-finalize
The end.
Now for the interesting part, the upgrade script will do this:
It will delete the .htaccess symlink, and re-create it, pointing to .htaccess-maintenance.
It will copy the current code in current/ to backup/
It will back up the DB, using the exact same commands that PHPMyAdmin uses
It will move the contents of new/ (which you just uploaded) to current/
It will execute the scripts for the DB
And the upgrade-finalize script will simply:
Delete the .htaccess symlink, and re-create it, pointing to .htaccess-permanent once again
The only possibly tricky part here will be getting the exact commands that phpMyAdmin uses to back up your database, but it's probably a simple mysqldump command, and you can probably get that info from phpMyAdmin or its logs. Sorry, I don't know enough about phpMyAdmin to help in this specific area.
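Put together, the upgrade script might look roughly like this (all paths, database names and credentials are assumptions; adjust them to your layout, and make sure Apache is allowed to follow the .htaccess symlink):
#!/bin/sh
# upgrade.sh -- rough sketch of the steps described above
set -e
ln -sfn .htaccess-maintenance /var/www/.htaccess                  # point the symlink at the maintenance rules
rsync -a --delete /var/www/current/ /var/www/backup/              # back up the current code
mysqldump -u dbuser -p'secret' gamedb > /var/www/backup/db.sql    # back up the database
rsync -a --delete /var/www/new/ /var/www/current/                 # promote the newly uploaded code
mysql -u dbuser -p'secret' gamedb < /var/www/current/upgrade.sql  # run the DB scripts for this release
The upgrade-finalize script would then be just the matching one-liner: ln -sfn .htaccess-permanent /var/www/.htaccess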
Look into a deployment tool like Capistrano that allows you to automate those steps.
I usually spend an average of 30 minutes doing these steps, and I'm wondering if there is any way to optimize, automatize or doing something to spend less time.
There are many ways. For starters, steps one through eight can be done in a single shell script. You could check out Phing, an automated deployment system. Also, you might want to delve into continuous integration for even more control over how and when the software is deployed.
Doing this manually is, like you say, asking for trouble.
For starters, you could upload your files into a new webroot and, when done, switch over the DocumentRoot in Apache, so the site stays available during the copy. For any shared files you could use a symlink to a common folder (e.g. uploaded images).
You could probably take the backup during operation as well, if you don't care about consistency in the database. For migrations that don't break existing functionality, you could also migrate and test on the new webroot under another hostname, again provided consistency isn't a problem.
The best option is always to use multiple webservers, so that you can take one offline for testing while the other one stays operational, but you will still have consistency problems; I assume that is not an option here since you don't mention it.
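A common variant of the "new webroot" idea is to make the served directory a symlink and flip it once the new code is ready, instead of editing the vhost. A rough sketch, assuming /var/www/site is already a symlink that Apache serves and the release name is made up:
# stage the new code next to the old release
rsync -a ./build/ /var/www/releases/r42/
# shared data (e.g. uploaded images) lives outside the releases and is linked in
ln -s /var/www/shared/uploads /var/www/releases/r42/uploads
# flip the webroot symlink to the new release; the old release stays around for rollback
ln -sfn /var/www/releases/r42 /var/www/site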

Making Changes a Live Site (Codeigniter, but not specific to it)

I'm using CodeIgniter, if that makes it easier. I'm wondering: if a website is live, with a populated database and users accessing it, and I have a new idea to implement, how should I do it? Do you work directly on the live site?
Or do you copy the database and the files to a local server (MAMP/WAMP/XAMPP), work on them there, and if everything works, update the live site with the changes? For this second method, is there any way to check which files have been changed and upload only those? And what if it works on the local server, but after updating the live site it does not work?
CodeIgniter's configuration also has the option of a default database and other databases. I wonder how these can be used for testing?
Don't work directly on the live site. Instead, have a development environment (using, say, VMware or VirtualBox on your machine) and clone the live environment. Get your code in version control (I'll say it again: GET YOUR CODE IN VERSION CONTROL), do your development on the development machine against a dev branch, and after you're done testing and happy with the changes, commit them to a 'deployments' or 'live' branch and deploy to the live site from there. Be sure to back up the database before you roll out the new code.
Edit: use symlinks to stage your new code base on the live site. If it doesn't work, just switch back to the old directory. Saves you a lot of grief!
Read up on version control (svn, git, et al.).
Never work on a live site. Work preferably on another server (to prevent while(1){..} crashes and the like), or at least in another document root/domain on the same server, preferably with access limited to your IP only.
Normally I only copy the table definitions (mysqldump -d, i.e. --no-data, is nice for that) and have another database altogether. If you need the latest and greatest data, you could replicate your main database to a test database, which also gives you the advantage of a cheap backup if you haven't got one already.
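For example, copying just the schema of the live database into the test database could be a one-liner like this (user, password and database names are placeholders):
# -d / --no-data dumps only the table definitions, not the rows
mysqldump -d -u dbuser -p'secret' live_db | mysql -u dbuser -p'secret' test_db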
I usually set a switch in the Apache/vhost configuration (SetEnv DEV=1), so that in code I can use if (getenv('DEV') == 1) to check whether I can just dump variables on error conditions; it also limits the possibility of accidentally committing/uploading code with a 'development switch' still on.
The typical answer to this question is going to be: do your work in the test environment, not the production environment. And I agree that this is often the best way to handle changes. If you have the luxury of a test environment, take full advantage of it. After all, that's what it's there for--testing.
However, that doesn't mean that working in the production environment is completely off-limits. Your decision should be based on a few factors:
Is the operation of your website critical to your business needs?
If so, do all your work in a test environment and deploy it to your live environment when you've fully tested your changes.
Are the changes you're about to make going to have a large impact on the rest of the website?
For example, are you about to change the Database schema? Are you about to change the way users log in or out of your website? If so, do your work in the test environment. If you're changing the behavior of a page that doesn't have any effect elsewhere, you could get away with making the change in the production environment.
How long will your changes take to implement?
If you can't guarantee that your changes won't take longer than 15-20 minutes, do your work in a test environment.

Downloading PHP content from another domain (safe way)?

So, if this question has been asked before, I'm sorry. I'm not exactly sure what to search for.
Introduction:
All the domains I maintain are currently hosted on my server, so I have not run into this problem yet.
I have created a structure, similar to WordPress, for uploading and editing images.
I regularly make changes to the functions and upload them to a single folder. When a user logs in, the contents are automatically downloaded into their folder.
What I am wanting to do:
Now, say I have a user that is not hosted on my server. I cannot use copy(), but is there a safe and secure way to echo the contents of each PHP file (obviously, I can echo) into another file on the user's server?
For example:
Currently I can copy from jasonleodurbin.com to geodun.com (same server), but say I want to copy jasonleodurbin.com/test.php to somedomain.com/test.php.
I had some thoughts, like giving each user a private key and sending it to a file like echo.php. echo.php would grab the contents of every file that has been modified recently and echo them to the screen. The requesting server would take that content and copy it into its respective .php file.
I assume I could send the key through GET, but since I have never dabbled in the security implications of anything (I am a hobbyist), I don't know how secure this is.
Are there any suggestions or directions that someone could send me?
I appreciate the help!
I'm assuming this is sensitive data. If that's the case, then I would suggest encrypting the file using PGP keys. Either way, you need a method to send the file from your server to theirs. I used to send encrypted data files from our remote server to a server in house; we used PGP keys to encrypt and decrypt once they arrived. As for the method we used to send the files across the web, I believe we used SCP (you need shell access on the server).
You could use FTP, but how about setting it up so that they only have access to a particular directory and can't touch anything else? You'll need a script to grab the file from the FTP location and store it in the appropriate directory for each user.
Just thought of something: store the file in a protected folder and have the user download it using curl. You can specify a username/password with curl.
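On the client's side that could be as simple as the following (URL and credentials are placeholders; over HTTPS the password is not sent in the clear):
# fetch the updated file from the password-protected folder
curl -u clientname:secret -o test.php https://jasonleodurbin.com/updates/test.php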
Several options:
Upload the newest version of test.php as test.phps (a PHP source file, which will be displayed instead of run) in a location known to the client. It is then up to them to download this file and install it on their web server.
pros: not much effort required on your part, no keys or encryption required.
cons: everyone can view the contents of your PHP file if they know where to look, no guarantee that clients will actually get updated versions of the file.
Copy the file to the client's web server. Use scp, ftp, or some such method to update test.php on the client's web server whenever you change it.
pros: the file will always be up to date. Reasonably secure if you use scp
cons: extra step required for you; you will have to remember to do this each time you change test.php. You will need access to the client's web server for this to work
Automated copy at a timed interval. Set up a cron script that syncs test.php to the client's web server at a certain time each hour/day/week/whatever (a sample crontab entry is sketched at the end of this answer)
pros: Not much repeated effort required on the part of either party. Reasonably secure if you use scp
cons: could break silently if something changes and you aren't emailed when an error occurs. You will also still need access to the client's machine for this to work.
There are probably plenty of other ways to do this as well, but these are just a few to get you started
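For the cron option, the entry could look something like this (host, user and paths are placeholders; scp here assumes key-based SSH authentication so no password prompt is needed):
# push test.php to the client's server every night at 02:00
0 2 * * * scp -q /var/www/site/test.php deploy@somedomain.com:/var/www/html/test.php >> /var/log/test-sync.log 2>&1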
Use a version control system, such as Subversion. Just check your code into the repository each time you make changes you want to push, and run an update from the clients. If you're already using a version control system, create a production branch where you commit your changes when they're ready to be pushed to clients.
It can be done on the clients in pure PHP (slightly experimental) with a library from here or here, with a PHP extension, or with a wrapper around the native svn client.
This gives you security, as each user can have their own password, which you can revoke if you so please. You can also get encryption by running through an SSH tunnel (which limits your library choices to the wrapper, I think), but really, I wouldn't worry too much about encryption; who's going to be looking at the traffic between the servers? Unless you're doing top-secret stuff.
It also gives you automatic change detection: you don't have to roll your own way of keeping track of which files have been updated, as that is done when you commit your changes.
It's a proven way of keeping code bases up to date, so I don't see why you would implement your own. It also gives you the extra advantage of being able to roll back changes if (when) there's a problem with a code update.
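On each client's server, the "run an update from the clients" part could be as simple as a scheduled call to the native svn client (the checkout path and schedule are assumptions):
# nightly pull of the latest committed code from the production branch
0 3 * * * svn update --non-interactive /var/www/html >> /var/log/svn-update.log 2>&1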
