PHP installation script

I'm selling a PHP script which customers can install and run on their own sites. Currently the install process is a somewhat "standard" one, that is:
1. Check if your hosting covers the install requirements
2. Download a zip package
3. Unzip
4. Upload to server
5. Read install instructions
6. Create mysql DB
7. Set up folder permissions where needed
8. Run install.php
9. FINALLY..we're done!
This may not be rocket science, but even for an experienced user it's way too clumsy a procedure. So, I'm thinking of providing just one setup.php file instead of a whole package. The user will create a /myscript folder on the server, upload setup.php there and run it. Then setup.php downloads all the needed files from my server and the whole installation is done automatically (except the DB creation, which the user still has to do himself). That should be much more user-friendly, and you don't need to read a manual just to install the damn thing.
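Roughly what I have in mind is something like this (a minimal sketch; the package URL is made up, and it assumes the host allows URL downloads and has the zip extension):

<?php
// setup.php - minimal sketch of the one-file installer idea.
// Assumes allow_url_fopen is enabled and the zip extension is available;
// the package URL below is hypothetical.
$packageUrl = 'https://example.com/myscript/latest.zip';
$zipPath    = __DIR__ . '/package.zip';

// Download the full package from my server.
$data = file_get_contents($packageUrl);
if ($data === false) {
    die('Download failed - does your host allow outgoing HTTP requests?');
}
file_put_contents($zipPath, $data);

// Extract it next to setup.php.
$zip = new ZipArchive();
if ($zip->open($zipPath) !== true) {
    die('Could not open the downloaded package.');
}
$zip->extractTo(__DIR__);
$zip->close();
unlink($zipPath);

echo 'All files installed - now create the database and finish the setup.';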
So, I'm going to create such a setup script, but before I start, my question is: why isn't everyone doing it? Laziness is the most obvious excuse - let the user sweat over the installation; if they want the software badly enough, they'll install it somehow anyway.
But even the big companies are following the same "standard", so there must be something else. Can you think of any technical obstacle that would prevent such a setup script from working well enough, and that's why everyone resorts to downloadable zip packages?
Thanks!

Not all hosts allow file downloads via PHP. And since you are downloading a zip file, you'll need to unzip it, which is probably the biggest obstacle.
system() and the like are very often forbidden, and unzip is very often not installed, for security reasons.
When you ship all required files in a package, you are always on the safe side.
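If you do go down that road, the least you can do is feature-detect up front. A rough sketch of the checks I mean:

<?php
// Rough pre-flight checks before attempting a download-and-unzip install.
$problems = [];
if (!ini_get('allow_url_fopen')) {
    $problems[] = 'allow_url_fopen is off - PHP cannot download files from a URL.';
}
if (!class_exists('ZipArchive')) {
    $problems[] = 'The zip extension is missing - PHP cannot unpack the package.';
}
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (in_array('system', $disabled, true)) {
    $problems[] = 'system() is disabled - shelling out to unzip will not work either.';
}
if ($problems) {
    exit("This host cannot run an automatic installer:\n" . implode("\n", $problems));
}
echo 'All checks passed.';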

Related

Is it ABSOLUTELY NECESSARY to use Composer to manage a Drupal 8 project?

Apologies in advance if this is off-topic, or something that has been asked before, but is there a way to maintain a stable Drupal 8 website WITHOUT using Composer?
Why I ask is because I am just unable to use Composer on my shared GoDaddy hosting. While it is powerful enough to run multiple Drupal 7 and 8 installations, it is brought to its knees as soon as I run a Composer command over SSH (via terminal). Things freeze for a while and then I get a "Killed" message. If I check the server's processor, I/O or RAM statistics during this time, they are all in the red.
I have read somewhere else on this site that it is not advisable to run composer on a live site. The recommended approach is to run composer on a local (localhost) copy of the website, and upload updated files, but it seems impractical, because sometimes all I need to do is install a small module, or something else that involves only a few files.
Any insight is very welcome. If I am doing something wrong, please suggest the right path. Things have reached a head now.
Thanks in advance.
In a basic setup, Composer downloads libs and creates the autoloader.
So if you have the same or a similar environment (PHP version!!) on localhost and on the server, you can simply upload the vendor dir to the server, and that's all.
It'll keep working until you add some new lib (then you need to upload the vendor dir again).
So, answering your question: you don't need SSH or Composer on the server.
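The reason this works: at runtime the app only ever touches the generated autoloader, never Composer itself. A minimal sketch (the file name is just an example):

<?php
// index.php - at runtime the app only needs the files under vendor/;
// Composer itself is never executed on the server, which is why
// uploading the vendor dir is enough.
require __DIR__ . '/vendor/autoload.php';

// From here on, every library class installed by Composer is autoloadable.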
There's no need to use Composer at all; it just makes things a lot easier.
You can also manually add/remove (uninstall first) modules with FTP... I don't recommend this though!

Where do I use Composer for PHP?

I'm still new to coding and I'm learning everything on my own. This may be a silly question for you, but after reading a dozen articles I am still confused.
I have a PHP-based website on a shared host. After reading various articles on the benefits of using repositories and Composer, I decided to give it a try. These are my difficulties so far:
Which operating system's version of Composer should I download, so that I can install/update repositories on my cPanel-based shared hosting?
If I am to install the Windows version, how do I connect to my shared hosting to install/update the repositories?
My apologies for my silly questions, but it would really help.
If you are using shared hosting, you are unlikely to be able to use Composer on the host itself. Furthermore, you are not encouraged to use Composer "on production".
I would recommend you use Composer locally (on the O/S of your local machine), to compose your project and install your dependent packages. Once it's all working and tested with your own code, you upload your entire development directory tree including the resulting vendor library - as one big FTP/SCP upload of "flat files".
Once you get more advanced you could adventure into automated deployment techniques, but I feel for now you would be best to stick to using Composer as a local development tool to manage your codebase.
Update, further details:
Composer is really a tool to help you manage your codebase in development. It's not intended as a "deployment" tool. Previously, you would find a library you liked, download it, unzip it into your codebase somewhere random like "lib/stuff", link to it, and commit it into your version control system (VCS). OK, but then a year later you want to update it, and you have to download it again, figure out where you saved it and how to overwrite the files, or delete old ones... it gets hard. Also, your VCS repository fills up with third-party components - even duplicates of the same one! Composer solved this by bringing order to this long-term dependency-management chaos.
The reason you don't want to run Composer "on production" (i.e. your live website) is that during the download/update/composition process your website will probably be broken. Even if the Composer run works, that could mean several minutes of a broken site. And after the update has finished, you have a completely new set of third-party packages: how do you know they are compatible with your codebase?
So you only run Composer updates locally, test everything, amend your code to work with the shiny new updates, and only then decide to upload the whole new site to the server - just as if you'd cobbled it all together manually. The deployment is independent.

Faster composer install

A composer install normally takes a few minutes, and in a production environment that feels too slow.
Is it possible to run composer install into a temp directory and then switch it over? If that is possible, the downtime should be close to zero.
Or is there any other way to make composer install faster?
I created a composer plugin to download packages in parallel.
https://packagist.org/packages/hirak/prestissimo
$ composer global require hirak/prestissimo
Please try it. In my environment, composer install becomes 10 times faster.
You can sometimes speed up composer install significantly by using the --prefer-dist flag, which just happens to be recommended for production use:
--prefer-dist: Reverse of --prefer-source, composer will install from dist if possible. This can speed up installs substantially on build servers and other use cases where you typically do not run updates of the vendors.
composer install docs here: http://getcomposer.org/doc/03-cli.md#install
Edited to clarify "sometimes":
I say it sometimes speeds up composer install because there are plenty of factors that make an install feel slow, not least network performance and the current GitHub status. A slow install can be really frustrating, but it's not always because of Composer.
You are asking two different and unrelated things.
Yes, it is a solution to build the next version of your site in a separate directory and then put it in place after moving the old version out of the way. It is, actually, the best solution.
This is how the deployment scripts I build work:
1. Prepare the next version of the site in a separate directory (let's say /var/www/new); the following sub-steps and their order are not set in stone, some projects need a different flow:
   - get the latest version of the code from the repo;
   - remove the files that are not needed on the live site (.gitignore, IDE project files, placeholders etc.);
   - run composer install;
   - copy/generate the configuration files containing the settings for the live servers (the ones stored in the repository contain dummy values);
   - change the user and permissions of all files; make some directories writable by the web server;
   - create symlinks, directories etc.; for example, the directory that contains the user-uploaded files lives outside the server directory, and a symlink to it is created during deploy to make its content available through the web server.
2. Move the live code out of the way; I use mv to move the entire directory (/var/www/html to /var/www/old, for example).
3. Move the prepared new version to the proper place (mv /var/www/new /var/www/html).
4. Move the previous version into the archive (after removing the content of vendor and other files that do not change or are external).
The advantages:
the downtime is zero (microseconds, probably, between steps #2 and #3);
a failed build doesn't affect the live site (if handled properly);
reverting to the previous version (if needed) can be done easily.
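For what it's worth, steps #2 and #3 can also be scripted in PHP instead of shell; a minimal sketch (paths as above, error handling omitted):

<?php
// The actual switch: two directory moves on the same filesystem are
// nearly instantaneous, so the site is only "gone" for a moment.
rename('/var/www/html', '/var/www/old'); // step #2: move the live code away
rename('/var/www/new', '/var/www/html'); // step #3: put the new version in place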
Regarding the other question, the only way I know to speed composer up is to avoid running it using a PHP that loads the xdebug extension. The xdebug extension shouldn't be loaded on the production server anyway.

(Lightweight) backup strategies for a LAMP application stack?

I'm researching some lightweight tools to back up a LAMP stack. The two most important pieces are the
PHP codebase and
the MySQL database.
I can tar/bz2 the code and take a mysqldump, then restore them on a new server (if the old one crashes), and this is more or less fine.
Anyway, are there more complete solutions to this?
e.g. track and re-install additionally installed PEAR packages;
track other packages related to the LAMP stack installed via Linux package managers, e.g. APC;
keep MySQL and PHP configurations alongside the backups and be able to restore them automatically...
possibly complete server images, which can be restored without the need to reinstall everything...
I'm curious about hints, tips, experiences, solutions...
The PHP code base should be managed in a version control system such as SVN, Git, etc. Just creating a tar doesn't give you many capabilities that a proper version control system gives you.
The trouble with mysqldump is that you have to lock the tables you are dumping to ensure a consistent snapshot. If this takes a long time, other DB operations might time out while waiting. We use a wonderful script for snapshotting the running database without excessive locking. It was designed for the Amazon/EC2 environment, but the principles apply to any Linux system with the XFS file system.
Here is a great guide for imaging an Ubuntu machine (obviously you can use it on other distros):
http://ubuntuforums.org/showthread.php?t=35087
In a nutshell (from the article):
tar cvpzf backup.tgz --exclude=/proc --exclude=/lost+found --exclude=/backup.tgz --exclude=/mnt --exclude=/sys /
to back up the system, then FTP the archive to another server.
I can answer a few points. I know it is not a popular package, but I've always revisioned the schema with RCS on the server. It doesn't have to be RCS, but there's no reason not to dump the CVS/RCS repository with the backup.
For "complete server images," instead of autonomously installing application-requirements (PHP packages &c) we deploy our own bin/ src/ usr/ var/ lib/ structure as per each application which simplifies the backup and system req's perspective.
Hope that helps.
I've also seen mysqldumps RCS'd to save only changes. I'm sure that would be somewhat non-trivial in terms of change management though.

What's the best way to use SVN to version control a PHP site?

When creating sites, I've always just FTPed files down, edited them and put them back up, but I feel it's worth learning to do things properly.
I've just committed everything to an SVN repo, and have tried SSHing into the server and checking out a tagged build, as well as updating that build using svn switch.
All good, but it's a lot slower than my current process.
What's the best way to set something like this up? Most of my time is just bug fixes or small changes rather than large rewrites, so I'm frequently updating things.
You don't necessarily need to use SVN to deploy the files to the server. Keep using FTP for that and just use SVN for revision history.
You should look at installing rsync to upload changes to your server.
Rsync is great because it compares your local copy of the repo to the copy that's currently on the server and then only sends files that have changed.
This saves you having to remember every file that you changed and selecting them manually to FTP, or having to upload your whole local copy to the server again (and leaving FTP to do the comparisons).
Rsync also lets you exclude files/folders (e.g. .svn/ folders) when syncing between your servers.
I'd recommend you keep using Subversion to track all changes, even bug fixes. When you wish to deploy to your production server, you should use SSH and call svn update. This process can be automated using Capistrano, meaning that you can sit at your local box and call cap deploy -- Capistrano will SSH into your server and perform the Subversion update. Saves a lot of tedious manual labor.
For quick updates I just run svn update from the server.
Sometimes for really really quick updates I edit the files using vim and commit them from the server.
It's not very proper, but quick and quite reliable.
If you want to do this properly, you should definitely look into setting up a local SVN repository. I would also highly recommend setting up a continuous integration (CI) server such as CruiseControl, which would automatically run any tests against your PHP code whenever you check in to SVN. Your CI server could also be used to publish your files via FTP to your host at the click of a button, once they have passed the tests.
Although this sounds like a lot of work, it really isn't and the benefits of a smooth deployment process will more than pay for itself in the long run.
For my projects, I usually have a repo. On my laptop is a working copy, and the live website is a working copy. I make my changes on the local copy, using my local webserver. When everything is tested and ready to go, I commit the changes, then I ssh into the remote server and svn update.
I also keep a folder in this repository containing SQL files for any changes I've made to the database structure, labelled according to their revision number. For instance, when I commit revision 74 and it has a couple of extra columns in one of the tables, the commit will include dbupdates/rev74.sql. That way, after I do my svn update, all I have to do is run the SQL file (mysql db_name -p -u username < dbupdates/rev74.sql) and I'm good to go.
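If you ever want to automate that last step, a small runner can apply any dbupdates files newer than the last applied revision. A rough sketch (the schema_rev bookkeeping table and the credentials are made up):

<?php
// Apply any dbupdates/revNN.sql files that have not been run yet.
// Assumes a table `schema_rev` (rev INT) recording applied revisions.
$pdo  = new PDO('mysql:host=localhost;dbname=db_name', 'username', 'password');
$last = (int) $pdo->query('SELECT MAX(rev) FROM schema_rev')->fetchColumn();

$files = glob(__DIR__ . '/dbupdates/rev*.sql');
sort($files, SORT_NATURAL); // so rev9 sorts before rev74

foreach ($files as $file) {
    $rev = (int) preg_replace('/\D/', '', basename($file));
    if ($rev <= $last) {
        continue; // already applied
    }
    $pdo->exec(file_get_contents($file)); // files with many statements may need extra care
    $pdo->exec("INSERT INTO schema_rev (rev) VALUES ($rev)");
    echo "Applied rev$rev\n";
}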
If you want to get real funky with it, you could use a build script to get the current version from SVN, then compile your PHP code, then on a successful build, automatically push the changes to your server.
This will help in debugging and may make your code run faster. Also, getting into the build habit has really improved my coding over just pushing the PHP straight to the server and debugging via Firefox.
The benefits of source control reveal themselves as the complexity of the project and number of developers increase. If you are working directly on a remote server, and are only making quick patches most of the time, source control might not be worth the effort to you.
Preferably, you should be working from a local working copy of the repository (meaning you should also set up a local server). Working against a remote server using SVN as the only means to update it would slow you down quite considerably.
Having said that, working with SVN (or any other source control) will yield many benefits in the long run - you have a complete history of changes, you can always be sure the server is up-to-date (if you ran update) and if you add more developers to the project you can avoid costly source overwrites from each other.
What I do at work is use FTP to upload changes to a test server. Then, when I am finished with the section of the site I was working on, I commit the changes and update both servers. Sometimes, if I am working on something and change a lot of files in different directories, I commit and update the test server, but not the production server. I am the only programmer here, though; I wouldn't recommend committing possibly buggy code if there is more than one programmer.
I use Zend Studio for Eclipse (currently version 6.1), and I use SVN to keep my source code available. Initially I thought the process was somewhat slow because of the commit step (entering a commit comment and waiting until it finishes).
However, after learning to commit with Ctrl+Alt+C and checking 'Always run in Background', the process doesn't slow me down at all.
Plus, I run everything locally and only SSH in once in a while.
I set up a post-commit hook to automatically update my web. It's fast, but you can make mistakes.
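The hook itself can be trivial; a minimal sketch of the idea as a PHP CLI script (the working-copy path is made up):

#!/usr/bin/env php
<?php
// post-commit hook sketch: update the live working copy after every commit
// and keep a log, since a broken commit goes straight to the site.
$output = shell_exec('svn update /var/www/html 2>&1');
file_put_contents('/tmp/svn-deploy.log', date('c') . "\n" . $output . "\n", FILE_APPEND);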
IF you are on a *nix server AND you have the appropriate SSH access AND you have space to keep multiple copies of the website, THEN the single most useful versioning technique I have found is to use a symbolic link that points to the "current" version of the website. (You can still use SVN to version the source code - this is a way to easily/instantly switch between versions of the website on the server.)
Set up the webserver to point to /whatever.com as the root of the website.
Have a folder like /website/r1v00 to which you FTP the website files, then create a symlink called "whatever.com" that points to /website/r1v00.
When you have an updated version of the website, create another folder called /website/r1v01, FTP all the files for the updated site, then change the "whatever.com" symlink to point to /website/r1v01. If there are any problems with the new site, you can back it out instantly by simply pointing the "whatever.com" symlink back to /website/r1v00.
Of course, you can/should set up scripts to automate the creation and switching of the symlink. In my case, I have an "admin" page written in PHP that lists all the available versions, and allows me to switch to any of them. This technique has saved my bacon several times...!
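The switch itself can be done near-atomically from PHP, which is roughly what that admin page does. A sketch (paths as in the example above):

<?php
// Repoint the "whatever.com" symlink to another release: create the new
// link under a temporary name, then rename() it over the old one, so
// there is never a moment without a valid link.
function switchVersion(string $target): void
{
    $link = '/whatever.com';
    symlink($target, $link . '.tmp');
    rename($link . '.tmp', $link);
}

switchVersion('/website/r1v01');   // go live with the new version
// switchVersion('/website/r1v00'); // ...or back it out instantly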
Obviously this does not address any issues with versioning database schemas or database content.
