How to avoid crashing your web app while replacing files? - php

Let's say you have a big web app with heavy traffic, and you don't want the app to crash or let visitors see the PHP or MySQL errors that happen while files are being replaced over FTP. How do you avoid that? How do you keep executing the old version of a file until the replacement is done?
Thanks

You can follow at least one of these two approaches:
Use an accelerator (like APC) with mtime checking turned off, so the old versions keep being served from memory until you clear the cache manually.
Use a virtual host symlinked to the directory containing your project. Say you store your project at /home/project/www, and /home/project/public_html is your real webroot, symlinked to www. Create /home/project/www2, check the files out there, set up and test whatever you want. After that, just change the symlink.
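A minimal sketch of that swap (the repository URL is hypothetical):
# check the new version out next to the live one
svn export http://svn.example.com/project/tags/1.2.4 /home/project/www2
# flip the webroot in one command: -n treats the existing symlink as a
# plain file and -f replaces it
ln -snf /home/project/www2 /home/project/public_html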

I use git to upload my changes to a staging website on the same server; after testing, I push to the production website. None of the files are changed until they have all been received. On the plus side, it only sends the changes, compressed, so I don't even have to send an entire file.
The staging area isn't strictly required. I work with a lot of servers, and the staging copy sometimes catches server-specific configuration problems (mostly finding that an extension isn't installed).
I'm sure you can do the same with another version control system. You need to be careful, though: the tutorial I linked specifically stores the git information OUTSIDE the document root. Otherwise someone can just clone all the source code for your website.
If you like SVN, the .svn directory inside every folder can be a little annoying. Make sure that people can't download what they shouldn't be able to.
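The tutorial itself isn't linked here, but a common way to wire this up is a bare repository plus a post-receive hook, sketched below with hypothetical paths; the .git data stays outside the document root:
# one-time setup on the server
git init --bare /home/user/repos/site.git
# /home/user/repos/site.git/hooks/post-receive (make it executable):
#!/bin/sh
# check the pushed code out into the webroot (which must already exist)
GIT_WORK_TREE=/var/www/site git checkout -f master
# then, from your workstation:
git remote add production user@server:/home/user/repos/site.git
git push production master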

Deploy your app into a temporary directory. Then, once you are done, rename the original app directory to app.old and rename the directory you deployed into to app.
Note that this should work fine in Unix environments, but only if all of the above directories are on the same filesystem. In the rare case that a user hits the app after you have renamed the original directory to app.old and before you have renamed the temp directory into place, they may see a 404 error.
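A sketch of that rename swap, with hypothetical paths:
# unpack the new version on the SAME filesystem, so each mv is just a rename
mkdir /var/www/app.tmp
tar -xzf release.tar.gz -C /var/www/app.tmp
# the tiny window between these two renames is when a 404 is possible
mv /var/www/app /var/www/app.old
mv /var/www/app.tmp /var/www/app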

Related

MAMP moodle project

I am working on a Moodle-based project, which I inherited from someone else. Having copied the files into the htdocs folder and started MAMP, the files still don't show in the browser; instead, the browser automatically initiates a download. I might need to change the config file, but since I do not have much experience with PHP and SQL, I am not sure exactly what to change. My part of the project is to develop the HTML and CSS, but I need to be able to run it locally first.
What do I need to do to get the files to run locally? The route I use is localhost:8888/whatever/whatever/index.php
In case someone else runs into the same problem (Apache downloading PHP files instead of executing them), here is what helped me.
The .htaccess file may need changes if the application has changed servers.
Delete config.php (or at least rename it, if you don't want to remove it) and open the application in the browser. It should initiate the install automatically.
To run PHP and SQL I used MAMP.

Looking for a safe way to deploy PHP code

How we do things now
We have a file server (using NFS) that multiple web servers mount and use as their webroot. When we deploy our codebase, we SCP an archive (tar.gz) to the NFS server and unarchive the data directly in the "web directory" of the file server.
The issue
During the deploy process we see some I/O errors, mostly when a requested file cannot be read: Smarty error: unable to read resource: "header.tpl". These errors go away once the deploy has finished, so we assume the cause is that unarchiving the data directly into the web directory isn't the safest of things. I'm guessing we need something atomic.
My Question
How can we atomically copy new files into an existing directory (the web server's root directory)?
EDIT
The files that we are uncompressing into the web directory are not the only files in that directory. We are adding files to a directory that already has files in it, so copying the whole directory or using a symlink is not an option (that I know of).
Here's what I do.
DocumentRoot is, for example, /var/www/sites/www.example.com/public_html/:
cd /var/www/sites/www.example.com/
svn export http://svn/path/to/tags/1.2.3 1.2.3
ln -snf 1.2.3 public_html
You could easily modify this to expand your .tar.gz before changing the symlink instead of exporting from svn. The important part is that the change is the atomic application of the symlink.
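For instance, the tarball variant might look like this (archive path hypothetical):
cd /var/www/sites/www.example.com/
mkdir 1.2.3
tar -xzf /tmp/release-1.2.3.tar.gz -C 1.2.3
ln -snf 1.2.3 public_html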
I think rsync is a better choice than scp, since only the changed files get synchronized. But deploying code with hand-rolled scripts is not convenient for a development team, and the errors such scripts report during deployment are not very human-friendly.
You can look at Capistrano, Magallanes, or Deployer, but they are scripts too. I would recommend trying walle-web, a deployment tool written in PHP with Yii2, out of the box. I have hosted it at our company for months; it works smoothly for deploying to test, simulation, and production environments.
It depends on a group of standard tools (rsync, git, ln), but a web UI is generally nicer for day-to-day operation. Have a try :)
Why don't you just have two directories with two different versions of the site? When you have finished deploying to the site_2 directory, you switch the site directory in your web server config (for example Apache) and copy all the files over to the site_1 directory. Next time you deploy to site_1 and switch back from site_2 with the same method.
RSync was born to run... er... I mean to do this very thing
RSync works over local file systems and ssh - it's very robust and fast - sending/copying only changed files.
It can be configured to delete any files that have been deleted (or are simply just missing from the source), or it can be configured to leave them alone. You can set up exclude lists to exclude certain files/directories when syncing.
Here's a link to a tutorial.
Re: atomic - link to another question on SO
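A typical invocation for this kind of deploy, as a sketch with hypothetical paths:
# -a preserves permissions and times, -z compresses in transit; rsync
# writes each file under a temporary name and renames it into place,
# so readers never see a half-written file
rsync -az --exclude '.svn/' --exclude 'uploads/' \
    /local/build/ user@webserver:/var/www/site/
# add --delete if files missing from the source should be removed too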
I like the NFS idea. We also deploy our code to an NFS server that is mounted on our frontends. In fact we run a shell script when we want to release a new version. What we do is keep a symlink named current pointing to the latest release directory, like this:
/fasmounts/website/current -> /fasmounts/website/releases/2013120301/
And apache document root is:
/fasmounts/website/current/public
(in fact apache document root is /var/www which is a symlink to /fasmounts/website/current/public)
The shell script updates the current symlink to the new release AFTER everything has been uploaded correctly.
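A sketch of such a release script, assuming the layout above (mv -T is GNU coreutils):
#!/bin/sh
# usage: release.sh 2013120301
REL="/fasmounts/website/releases/$1"
[ -d "$REL" ] || { echo "no such release: $REL" >&2; exit 1; }
# build the new link under a temporary name, then rename it over the old
# one: renaming over a symlink on the same filesystem is atomic
ln -s "$REL" /fasmounts/website/current.tmp
mv -T /fasmounts/website/current.tmp /fasmounts/website/current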

Using Version Control but still access the files normal way

I am trying hard to use version control, but I am so used to the old method of editing files directly via FTP that I am confused about what to do. So I am thinking of one solution; please help me work out whether it is possible or not.
I have this user folder on a Linux VPS (a single VPS only):
/home/user/public_html2
That will be linked to http://demo.server.com.
That directory will be version controlled against a repo in the folder /usr/bin/repo/ so that I can commit the changes.
Then I will have another directory called
/home/user/public_html which will have its contents updated as commits are made in /home/user/public_html2.
That /home/user/public_html will be linked to http://main.server.com.
So I keep working and editing files in public_html2 via FTP as normal and test the code as-is, but the main site is version controlled.
Is it possible?
When you are using a version control system, why do you need to go to an FTP location to edit the files? It defeats the whole purpose.
Rather, check out the files from the repository onto your local machine and, after the changes are done, commit the code.
If you don't want the version metadata while working with the files locally, simply export the code instead of checking it out.
The problem I see with your approach is that the files/directories may go out of sync. Also, if there are multiple users working, this approach would bomb.
I would suggest keeping the working copy of the files on your local machine and using it there, instead of mirroring it to another folder.
Maybe it's possible. But once you start to use a version control system, you should keep accessing your version-controlled files through that system.
Or better: if you want to access files the "normal way", as you said, then reading those files without committing any change should be fine. But if you want to modify them, you should always go through your version control system; otherwise you can end up with conflicting file versions.
You need a deployment script. Create a script that uploads the files to the appropriate locations, and run it whenever you want your changes reflected on the site. If you want to maintain two separate locations on a server running different versions, simply add a command-line argument to the script that chooses which location to upload to.
Use rsync instead of FTP if you can, to avoid transferring unchanged files every time.
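A sketch of such a script, with the target location as the command-line argument (names and paths hypothetical):
#!/bin/sh
# usage: deploy.sh staging|production
case "$1" in
    staging)    DEST="user@server:/var/www/staging/" ;;
    production) DEST="user@server:/var/www/main/" ;;
    *) echo "usage: $0 staging|production" >&2; exit 1 ;;
esac
# rsync only transfers files that have changed since the last run
rsync -az --exclude '.git/' ./ "$DEST"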

php, user-uploaded files, version control, and website deployment

I have a website whose code I update regularly. I keep it in version control. When I want to deploy a new version of the site, I do an export and then symlink the served directory name to the directory of the deployment.
There is a place where users can upload files, and I noticed once that, after I had deployed a new version, the user files were gone! Of course, I hadn't added them to the repository, and since the served site was from an export, they weren't uploaded into a version-controlled directory anyways.
PHP doesn't yet have integrated svn functionality, so I couldn't do much programmatically with user-uploaded files. My solution was to create an additional website, files.website.com, which sits in a directory parallel to the served website and is served out of a directory that is under version control. That way the uploads don't get obliterated when I upgrade the website. From time to time, I manually add uploaded files to the svn project, delete the ones users have deleted, and commit the new version. I'm working on a shell script to run from cron to do this, but it isn't my forte, so it's on the backburner as it's not a pressing need.
Is there a better way to do this?
I usually don't keep user-generated data/files in svn, only the code and the DB schema/test data. What I usually do to deploy is an rsync from an up-to-date working copy that excludes the upload dir and the .svn dirs. IMO content should be handled by more traditional filesystem/DB backup mechanisms, not version control.
EDIT:
Just to be clear, your symlinking strategy seems like good practice; you're just missing the backup part, I think. I'd probably just tar | gzip the uploaded files in the cron job instead of interacting with SVN, and then have a separate job that uses mysqldump to dump the DB and gzips that as well.
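The two cron jobs might look like this, as a sketch with hypothetical paths (% must be escaped in crontab entries):
# nightly tarball of the user uploads
0 2 * * * tar -czf /backups/uploads-$(date +\%F).tar.gz /var/www/site/uploads
# nightly compressed database dump
30 2 * * * mysqldump --single-transaction mydb | gzip > /backups/mydb-$(date +\%F).sql.gz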
I would continue the practice of exporting the site as upgrades are needed, but keep the user-uploaded content in a directory outside the version-controlled tree, reached via a symbolic link. This way, when you do an export you only need to recreate the symlink if it gets blown away. You should, of course, still be backing up that user content as needed.
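A sketch of that layout, with hypothetical paths:
# user content lives outside the exported tree, so exports never touch it
mkdir -p /var/www/shared/uploads
# inside each fresh export, re-point uploads at the shared directory
ln -snf /var/www/shared/uploads /var/www/site/current/uploads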
Rather than manually doing the export and managing symlinks, you could make your deployment directory a subversion checkout (from a production branch). That way, deploying is as simple as checking in your updates to the production branch.
This works as long as you have sufficient control of your subversion server and hosting setup, and your subversion repository is "ready to run." In this situation, your user directory could be an empty placeholder in subversion and would be left alone by the update process that runs on commit (business as usual for svn update). I'd still recommend (as mentioned by #Flash84x and #prodigitalson) a separate process to back up the user content.
There's an Ars Technica article with a description of how to set this up.
Update: If you follow this approach, make sure that your web server does not allow access to the .svn files in the deployment checkout.

How should I move my code from dev to production?

I have created a PHP web-application.
I have 3 environments: DEV, TEST, PROD.
What's a good tool / business practice for me to move my PHP web-application code from DEV to TEST to the PROD environment?
Note that my TEST environment connects only to my TEST database, whereas I need the PROD environment to connect to my PROD database. So the code is mostly the same, except that once it has moved to PROD it must connect to the PROD database rather than the TEST database.
I've heard of people taking down Apache in such a way that it doesn't allow new connections and, once all the existing connections are idle, simply bringing down the web server.
Then people manually copy the code and manually update the config files of the PHP application to point to the PROD instance.
That seems terribly dangerous.
Does a best practice exist?
1) Separate configuration from the rest of the code. The rest of the code should then be able to run on all 3 locations without modifications. A typical config file could be:
<?php
$db = "main_db"; $db_user = "web1"; $db_pass = "xyz123";
$site = "example.com";
$htroot = "/var/www/prod/htdocs";
?>
And for the test environment:
<?php
$db = "test_db"; $db_user = "web1"; $db_pass = "xyz123";
$site = "test.example.com";
$htroot = "/var/www/test/htdocs";
?>
Do not include the configuration files with the passwords in the code base. The code base may be copied through insecure connections or be stored on third party code hosting servers later (see below). And you maybe don't want your passwords on all your backup disks of the code.
You could also create one single config file and use a switch depending on the environment the code runs in:
<?php
$site = $_SERVER["HTTP_HOST"];
if ($site == "example.com") {
    $db = "main_db"; $db_user = "web1"; $db_pass = "xyz123";
    $htroot = "/var/www/prod/htdocs";
}
if ($site == "test.example.com") {
    $db = "test_db"; $db_user = "web1"; $db_pass = "xyz123";
    $htroot = "/var/www/test/htdocs";
}
?>
But now you could be tempted to put it back into the code base, which is less secure, as explained above. And if you do not put it there, you have to update 3 files, or use one fixed location per server and make sure the code finds the file on each server. I personally prefer the one-file-per-site solution above.
2) You already have "versions". There is one version running on prod now. Give it a unique name and number, which will never change again. You can use that version name when you back up the code, when you refer to the version, or when you move it somewhere and name the subdirectory that will contain it after the version.
The version that you will put on prod in the near future is a different version, and if you make changes again this is also a different version.
As a rule of thumb: increase the version number when you move or export the code, when you swap or exchange or upgrade between locations, when you make a demo, after each feature or milestone, and each time you do a full backup.
Please note that the config files (3 of them: one each for prod, test, and dev) are NOT part of the versions. So you can "move the versions around", but not the config files. If you can, put the config files OUTSIDE the tree with the rest of the code, so you do not need to separate them out and handle them specially when you move the versions around later. You could move the config one directory "up" and access it from the files like this:
include "../config.php";
3) If you want to use version control systems, they do a great job, but it takes some time to get used to them, and if you are in a hurry with your update, now is probably not the right moment to go live with one. For the future, though, I would recommend a latest-generation distributed version control system. Distributed means you do not need to set up a server, among many other advantages. I will name Bazaar; if you need to update over FTP, it can do that. Please note that a version control system makes exchanging a version very fast, because only the differences between the versions are written. Bazaar has a community and documentation which make it easy to start. There is also Git, which has the most up-to-date commercial hosting site: http://github.com. You can view the code online and compare between versions, and there are many more helpful features, even if you are the only coder; in a group it is even better. The choice between the systems is not easy. I cannot recommend CVS, which is outdated. SVN is also not a latest-generation distributed version control system; I would not recommend it without a specific reason, and it pollutes all your subdirectories with special .svn subdirectories, which can be annoying. For people who are used to it and already have code in it, it is fine, but for a starter I would say don't.
There are also Mercurial and Darcs among the distributed and open-source version control systems. Mercurial also has a great commercial site for collaboration and online code viewing (http://bitbucket.org).
4) As long as you do not use a version control system, how about using symlinks?
You could have a directory on the server, say src/versions/, and put the named versions in there, each one in its own subdirectory. You only ever add versions (a version that exists is never changed; if you change it, it becomes a new version).
You could have src/versions/v001/ src/versions/v002/ src/versions/v003/ or whatever naming scheme you use.
Now here comes the trick: /var/www/prod/htdocs is a symlink to src/versions/v001/
When you upgrade to v002 you just do the following:
shutdown apache
remove the old symlink /var/www/prod/htdocs (at this point the apache webroot is gone!)
create the new symlink /var/www/prod/htdocs being a link to src/versions/v002
start apache
You can also write a script for this with parameters (a sketch appears at the end of this step) and call it like this:
upgrade-web prod 002
This makes the gap even shorter.
Sometimes you have to do a "fallback" when you find out that the new version has errors in production and you have to go back. This is easy, because you never removed the old directories: you just stop Apache, delete the symlink, and re-create it pointing to the former location, in this case src/versions/v001.
And if test and dev are on the same server, you can of course symlink the same directories, so there is no need to move or copy anything.
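Here is a sketch of that upgrade-web script (paths hypothetical; ln -snf replaces the remove-and-recreate pair in one command, which shortens the gap even further):
#!/bin/sh
# usage: upgrade-web prod 002
SITE="$1"; VERSION="$2"
apachectl stop
ln -snf "/src/versions/v$VERSION" "/var/www/$SITE/htdocs"
apachectl start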
5) If you do it manually without symlinks, why not move instead of copy?
(When the files are not yet on the same server, you can copy them somewhere nearby first and then start the migration, so you do not have to stop the server for such a long time.)
If there are several directories at the root level of the project, you can move them one at a time. Be sure NOT to move the config files, or find some strategy to bring them back. The workflow, sketched below, would be:
Stop apache
Move away all current prod directories and files on the root level except config file(s)
Move all new prod directories and files to the root level except config file(s)
Start apache
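As a sketch, with the config kept one level up as suggested in step 2 (paths hypothetical):
apachectl stop
mv /var/www/prod/htdocs /var/www/prod/htdocs.old
mv /var/www/prod/htdocs.new /var/www/prod/htdocs
# config.php lives outside htdocs, so the swap does not touch it
apachectl start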
6) Try to find your perfect individual file and directory layout and your perfect workflow. This may take some time and some thinking, but it pays off. Do it on a piece of paper until you find the best solution. It could mean that you have to refactor your code and change server config files, but afterwards your life is easier when you do administration and upgrades. From my experience: do not wait too long to take this step. Your layout should make it easy and safe to upgrade. Upgrading is not something extraordinary; it is routine, and it should be safe and simple to do.
7) If you describe your server and workstation environments (the server OS is Linux, I guess, but is it shared hosting or a root server? Do you have FTP access, or also shell (SSH) access, or SFTP? Where do you develop: on a Windows machine, or a Mac?), then people can suggest tools to do the copying and moving. Also interesting: are the test and dev servers the same machine, and if not, how are they connected, or aren't they? If they aren't, you would make a 3-way transfer (copy to your local workstation and then to the server).
8) Think about file permissions. If you move files around or copy them, the file permissions may change, and if the application depends on some of them, there should be a way to check and maybe fix them. Example: some applications need writable directories where they put uploaded files, session files, or template caches. Other applications forbid writing to certain files for security reasons.
Use configuration files to determine what database you're connecting to. That is, have a DEV configuration file, a TEST configuration file, and a PROD configuration file. It's generally the best way to avoid costly and frustrating mistakes.
Actually, I don't see any reason why the TEST environment should miraculously migrate to PROD without any server shutdown. The TEST environment is supposed to be for testing purposes. And even if you are actually testing on the production server, bring it down (stop Apache), change the one line in your main config file that determines which set of minor config files to use, and bring it up again (start Apache). This will take no more than 1-3 minutes, and since you are surely not going to do it twelve times a day, you will be fine.
Have your code in a revision control system (I prefer Subversion (svn)). This makes it easy to keep your DEV, TEST, and PROD environments in sync; you don't have to keep track of which files you modified. Once you are happy with your modifications on DEV, you commit the changes to svn and then run "svn update" on the TEST server and eventually, after testing, on the PROD server. Most Linux hosting providers have an svn client installed, or you can install one yourself.
I don't like having a different version of a config file for each site, because it requires manually renaming one file and removing the other two. I prefer having the DEV, TEST, and PROD configurations in the same config file. In the config file I determine which server the code is running on by checking either the hostname or the request URL; then an "if" or "switch" statement loads the configuration settings for whichever server is currently running the code.
You might also need to sync the database structure between your servers. I use SQLyog for this purpose; it has a built-in database structure synchronization tool that compares two database structures and prepares the SQL to synchronize them.
When we push updates live, we rarely have to restart Apache. This may be a side effect of not running high-traffic sites (< 1M page draws a month).
We have 3 branches, for various stages of development/QA: alpha (bleeding edge but available for viewing by non-developers and testers), beta (somewhat frozen for a particular release, final QA phase) and live (production).
To migrate from one branch to another, we perform a merge between, say, alpha and beta, and commit that merge. Then we run a deployment script which updates the branch from SVN on our development machine and rsyncs the code to the web servers' beta document root.
As already mentioned by others, each branch can contain its own config file with appropriate settings to cater for environment differences.
We are in the process of migrating to git to smooth out the branch merging process, which can be a little traumatic in SVN for large projects/releases.
