Using PEAR and Phing for a robust deployment solution? (PHP)

I'm starting a new project and want a one-step build process for my development, continuous integration, and production servers.
I'll want the process to be included in my source control (git/github), and to run robustly on any of my three servers, which are pretty similar, but do have different paths for the project root.
I had planned to use Phing, PHPDocumentor, PHP_CodeSniffer, PHPUnit, etc. installed via Pyrus into a localized PEAR/PEAR2 install that could be deployed along with the project, so I could be confident in my dependencies at build time.
However, I've had nothing but problems with getting this set up.
This is the first time I've tried to set up such a build system, and *nix installs aren't my strong point (though I always do end up with a reliable system eventually), so the weak link here may be me. However, a lot of the problems I'm having seem to come from PEAR.
For example, no matter how carefully I install via Pyrus, everything ends up with path issues. Looking into some of the PEAR packages, all paths are hard-coded (probably set at install time) with the relative paths I used during install (./pear, etc.). This means I can only run the packages successfully from the folder Pyrus was in during install, even though the run scripts were put in ./pear/bin and I did set the bin folder during install. Sometimes the paths even conflict internally within a single package, so it wants to run from here, but it wants to manage the config from over there...
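For reference, a typical install attempt of mine looks something like this (a sketch; the paths and package name are just examples), which is why I expected absolute paths to be recorded, if I'm reading the Pyrus docs correctly:
$ cd /var/www/myproject
$ php pyrus.phar /var/www/myproject/pear install pear/PHP_CodeSniffer
$ /var/www/myproject/pear/bin/phpcs --version   # only works when run from the install folder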
PHPDocumentor doesn't handle PHP 5.3 (and won't run for me at all, whether because of its own issues or because I've mis-installed it), so I've replaced it with PHPDoctor, which initially seems pretty good.
Another example: after setup, Phing will run, but it just dies silently. After digging in and tracing it, I found that an obscure function on line 70 of io/PhingFile is getting null for a required argument, which throws a ConfigurationException and causes the silent death.
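For anyone else chasing something like this: Phing takes Ant-style verbosity flags, which at least surface the exception instead of a silent exit (the build file name here is illustrative):
$ phing -verbose -f build.xml
$ phing -debug -f build.xml   # noisier still; this is what exposed the ConfigurationException for me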
I'm confident that I can get all this working ok, but I'm NOT confident that it will ever be a deployable robust system, and I really don't want to have to debug, tweak, and then maintain a big pile of self-modified PEAR packages that have to be retweaked at every upgrade.
So, finally the question =o)
Does anyone have a really good robust build system using these apps? Was there some trick to it?
Or does everyone have a nice robust build system and it's just my naivete with *nix installs and system config that makes this all seem like a double-sized helping of clusterphuckery?
Does anyone have any pointers on getting such a system set up to work across multiple servers, or am I just kidding myself? Maybe I should just do separate installs on the systems outside the project root and get on with my development?

At work, we use many PEAR packages in our software, install them via Pyrus into local directories, and use Phing extensively as our build tool to run tests and deploy the software - on different Linux systems, and some developers even run them on their Windows boxes.
It's working reliably, and we're not experiencing the issues you described.
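As a rough sketch of the shape of such a setup (illustrative paths; the channel and package names follow Phing's published PEAR install instructions, so adjust for your Pyrus version), each project carries its own tool directory and invokes the build through it:
$ php pyrus.phar ./tools channel-discover pear.phing.info
$ php pyrus.phar ./tools install phing/phing
$ ./tools/bin/phing -f build.xml test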

Where do I use Composer for PHP?

I'm still new to coding and I'm learning everything on my own. This may be a silly question for you, but after reading a dozen articles I am still confused.
I have a php based website on a shared host. After reading the various articles on benefits of using repositories and Composer, I decided to give it a try. These are my difficulties so far:
Which operating-system version of Composer should I download, to enable me to install/update packages on my cPanel-based shared hosting?
If I install the Windows version, how do I connect to my shared hosting to install/update the packages?
My apologies for my silly questions, but it would really help.
If you are using shared hosting, you are unlikely to be able to use Composer on the host itself. Furthermore, you are not encouraged to use Composer "on production".
I would recommend you use Composer locally (on the O/S of your local machine) to compose your project and install your dependent packages. Once it's all working and tested with your own code, you upload your entire development directory tree, including the resulting vendor library, as one big FTP/SCP upload of "flat files".
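In concrete terms, the cycle looks like this (the upload target is a placeholder):
$ composer install            # reads composer.json and fills in vendor/
$ php -S localhost:8000       # or however you test locally
$ scp -r . user@yourhost:public_html/   # then ship the lot, vendor/ included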
Once you get more advanced you could adventure into automated deployment techniques, but I feel for now you would be best to stick to using Composer as a local development tool to manage your codebase.
Update, further details:
Composer is really a tool to help you manage your codebase in development; it's not intended as a deployment tool. Previously, you would find a library you liked, download it, unzip it into your codebase somewhere random like "lib/stuff", link to it, and commit it into your version control system (VCS). OK, but then a year later you want to update it, and you have to download it again, figure out where you saved it, and work out how to overwrite the files or delete old ones... it gets hard. Also, your VCS repository fills up with third-party components, even duplicates of the same one! Composer solved this by bringing order to that long-term dependency-management chaos.
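For example, adding a library and updating it a year later is two commands rather than a manual download-and-unzip (the package name is just an illustration):
$ composer require monolog/monolog   # records the dependency in composer.json and installs it to vendor/
$ composer update monolog/monolog    # later: fetch a newer version that still satisfies your constraint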
The reason you don't want to run Composer "on production" (i.e. your live website) is that during the download/update/composition process your website will probably be broken. Even if the Composer run works, that could mean several minutes of broken site. And after the update has finished, you have a completely new set of third-party packages: how do you know they are compatible with your codebase?
So you only run Composer updates locally, test everything, amend your code to work with the shiny new updates, and only then decide to upload the whole new site to the server, just as if you'd cobbled it all together manually. The deployment is independent.

What ways are there to work on a project in a testing environment where the Git commit needs to be different to the PHP code used for testing and dev?

In my project, the deployable version needs to include a copy of each of the external libs, a different config file, and install and setup files; for security reasons, the main project is set to refuse to run if those install files are present. Thus the upstream copies of the other projects need to be committed to the repo. How can I work on code running on localhost when the file layout, and sometimes the file contents, used for dev and testing differ from what I need to commit?
Background
I am working on a project hosted on GitHub, and my main IDE is NetBeans, which has imperfect Git support (good enough for >99% of my needs). The project is in PHP and uses several other projects as libraries.
As NetBeans does not have the best support for sub-repos, I have chosen to keep each additional project as a separate project. This works because the central project reads config data telling it where to find these outside libs.
Half an answer
My instinct is to suppose that there will need to be some "build stage" prior to committing to the github repo but how on earth do I go about setting all that up?
I could write some sort of homebrew thing, but then when I pull other people's contributions I would need to reverse the process, unless we had a branch for builds and a branch for working copies, which seems needlessly complex and could leave the devs' config data on public display (not to mention updates becoming a mess).
I have seen that others have wrestled with somewhat similar problems without reaching a conclusion (at the time of asking) (How to push and pull from github without sharing sensitive information? Smudge & clean?), so I am looking for anything that might help me come up with a solution.
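For context, the smudge/clean mechanism that linked question mentions is a pair of content filters Git runs on checkout and on staging; a rough sketch, with placeholder names:
$ git config filter.secrets.clean 'sed s/MY_REAL_KEY/PLACEHOLDER/'
$ git config filter.secrets.smudge 'sed s/PLACEHOLDER/MY_REAL_KEY/'
# plus a .gitattributes entry such as: config.php filter=secrets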
my main IDE is netbeans which has imperfect git support
Most devs just use the command line. I switch to the NetBeans conflict resolver occasionally, which is very good, but for normal stuff the console is usually faster.
My instinct is to suppose that there will need to be some "build stage" prior to committing to the github repo
... unless we had a branch for builds and a branch for working copies
No, there is only ever one repository. It is better to think of your repo as your code history, rather than your deployment state. Branches should just be for features or large changes, which merge into your mainline/master.
There are a good number of options available to you when deploying. The first is Composer, which Mark points out: when deploying, you issue an install or update command, which recursively fetches the dependencies that satisfy your library requirements. You can use Bower to do the same for your JavaScript dependencies.
Some deployment strategies prefer to build locally and then scp/rsync to a remote server. Composer and Bower are still probably a good idea, but you write a build script (using Ant or Phing, for example) to create a build copy in a local temporary folder, and then send it to the server. It is common here also to push it to a new release folder on the server, and then swap a symlink or Apache config file when it's ready to go live.
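A bare-bones sketch of that release-folder pattern (the host and paths are placeholders):
$ rsync -az build/ deploy@yourserver:/var/www/releases/r1234/
$ ssh deploy@yourserver 'ln -sfn /var/www/releases/r1234 /var/www/current'   # swap the live symlink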
the deployable version needs to have a copy of each of the external libs, a different config file and install and setup files, for security concerns
Assuming this is a web project, have you tried adding your sensitive environment data to your Apache configuration file (e.g. as SetEnv directives in the vhost)? These can be trivially read in PHP via getenv(), and of course PHP does not care that this information differs according to whether you are developing, testing, demoing a branch, or operating live.
Further reading: an excellent PHP deployment book, free of charge, that suggests Phing and Capistrano.

Setting up a Laravel 4 app on a VPS

I am trying to deploy my first Laravel App. So I hope I am providing all the necessary info. I have walked down several paths trying to deploy this app. I tried a shared hosting account, but found too many errors to continue deploying my Laravel app. In the meantime, someone has said to me I need a VPS, so I may go with that.
So with a new VPS, I am now trying to install the following: phpMyAdmin, Node.js, Composer, and Laravel 4. These are the technologies I am using on my local server with MAMP. After being overwhelmed with the information on installing each of them on a VPS, I have found myself extremely confused. Some places say I need to install Ubuntu. Some say I need to install Apache first. Some talk about using CentOS. I honestly have no idea what I need to install, or in what order. All I really need is to figure out how to set up a PHP environment on my VPS with phpMyAdmin, Node.js, and Composer. After that I am pretty sure it's all straightforward, as far as installing my app.
I also saw someone talking about committing my app to Git and then cloning it to the VPS. If I did this, I would still need to set up the environment, correct? Once again, I hope I have provided the necessary information. If my question is not clear, could you please refer me to a resource that I can study?
You don't need to install Laravel separately from the app it is part of; these days a PHP app simply contains everything it needs in its vendor folder. How to deploy depends on how you have arranged your dependencies locally, but the simplest way is to copy everything in your local project to your remote server (FTP or rsync). I don't think Laravel demands a VPS, but if you are using Node as well, then yes, you will need one.
So, the short answer is: if it works locally, copy it up to the remote host, and it should work there. Make sure you've set up your config system in your app so that it can cope with the different settings you need in local/remote environments, such as database connection settings.
My feeling is that a shared host would be easier for you as a beginner - is the Node.js component of your app critical? Running your own VPS is not difficult, but there is quite a bit to learn. Your distro (such as Ubuntu) would come pre-installed, and on top of that you would use its package system (something like apt-get) to install Apache, PHP, PHP modules, phpMyAdmin, Git, and whatever else you need.
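On Ubuntu, for instance, that amounts to a handful of package installs plus Composer's own installer (these package names are the Ubuntu 12.04-era ones and vary by release):
$ sudo apt-get update
$ sudo apt-get install apache2 php5 libapache2-mod-php5 php5-mysql phpmyadmin git nodejs
$ curl -sS https://getcomposer.org/installer | php   # drops composer.phar in the current directory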
Yes, you can certainly deploy using Git. One way is to create a bare repository on your server in a private place, set it up as a remote on your local dev machine, and push to it as your off-site copy. Then, from your dev or production web folders, you pull and update submodules. This is not trivial and requires at least a working knowledge of Git, so for now I wouldn't recommend this route.
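If you do try it later, the moving parts are roughly these (names and paths are placeholders):
# on the server: a private bare repo to push to
$ git init --bare ~/repos/app.git
# on your dev machine
$ git remote add live ssh://you@yourserver/home/you/repos/app.git
$ git push live master
# in the server's web folder
$ git clone ~/repos/app.git /var/www/app                        # first deployment
$ cd /var/www/app && git pull && git submodule update --init    # subsequent ones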

PHP Deployment to Live Server

I am new to this; I was just reading about how I should not edit code on the live production server. I don't know anything about source control or SVN.
I would like to start coding on a test server then once everything is confirmed working, I want to send all the files over to the production server.
How should I go about this? I am on Mac OS X and was looking into apps like http://versionsapp.com/, but I am not sure if this is the right solution.
What do you suggest?
If you are new to web development, I wouldn't suggest jumping into Subversion right away. You should have a firm grasp of Subversion before actually using it in any production environment, as it's surprisingly easy to screw things up. Don't let that scare you off, though, as version management (whether through SVN or another tool) is highly useful.
And if the project in question is small enough, I don't see anything wrong with the old "develop locally then ftp it to the server" approach. Sometimes a full-blown version management tool just isn't necessary.
Whether or not a SVN deployment strategy is appropriate depends on factors such as the size of the site, your familiarity with using the command line, and whether you are working as part of a team.
It is worth noting that in most shared hosting environments you won't have the option to install SVN on the server, which narrows your options somewhat!
I don't think there is anything inherently bad about using good old fashioned FTP to get files up to a server, especially for smaller sites where you are the only person working on the site.
Even then, SVN can still be very useful. I keep all my sites under version control even if they are going to be deployed by FTP.
Just go with SVN, as it covers the basics. After you get a taste of it you can explore alternatives like Git or whatever.
You should learn to do the basic SVN operations through the console; there is no way around that, especially if you are going to work on a live server. Your live server should have SVN installed and some SSH access so you can execute your SVN commands there.
You can also get TortoiseSVN which is a nice visual client for SVN.
The basic SVN usage boils down to these three commands:
$ svn co # Checkout
$ svn ci # Commit
$ svn up # Update
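In full form, with a placeholder repository URL, those look like:
$ svn co http://svn.example.com/project/trunk project
$ svn ci -m "describe what you changed"
$ svn up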
There are tons of tutorials on how to do this; here is the first one from Google:
http://paulstamatiou.com/how-to-subversion-basics
After you feel comfortable with that and start working with a team and contributing to public repositories, you might explore the advanced stuff like branches, tags, and all the other terms that make some people feel important when they mention them.
That versions-whatever app I would hold off on for now; explore the popular choices before specializing, especially since SVN works on UNIX and you can put it on Linux, which runs most of the web hosting out there.
Cheers!
If you're just starting out, I recommend you avoid SVN. Try Git; there are numerous Mac tools, such as GitX. There's also GitHub and others to host your projects for easy pulling and revision tracking within a group of people.

What's the best process / app for automated deployment of PHP apps?

There's another post on SO relating to .NET -- not us. Pure PHP. Trying to find the best way/process to deploy a stable version of our PHP app. I've seen an article on Capistrano, but am curious what else is out there. Aside from the obvious reasons, I'm also looking to add some scripting so that the SVN revision number gets embedded as well.
Much thanks.
I've used a home-grown script for quite some time. It will (based on an application configuration file):
Run svn export on the repository based on a tag.
Package the export into a tar or zip file, which includes the tag in the name.
Use scp to copy the package to the appropriate server (QA or release).
Connect to the server with ssh to install the package and run post-install scripts.
The application configuration file is part of the project. It can tell the script (at step 2) to strip paths and otherwise process specified files. It also specifies server names and how to handle externals.
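Boiled down to plain commands, the four steps are essentially the following (the tag, hosts, and paths here are illustrative; the real script takes them from the config file):
$ svn export http://svn.example.com/app/tags/1.2.0 app-1.2.0
$ tar -czf app-1.2.0.tar.gz app-1.2.0
$ scp app-1.2.0.tar.gz deploy@qa.example.com:/tmp/
$ ssh deploy@qa.example.com 'tar -xzf /tmp/app-1.2.0.tar.gz -C /var/www && /var/www/app-1.2.0/post-install.sh'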
I've recently migrated the script to support Git as well as Subversion. I'm also probably going to migrate it to PHP, since we're now running in a mixed (Linux and Windows) setup, with Linux now in the minority.
I have plans to automatically call the script with post-commit hooks, but haven't had the need to implement that just yet.
Coincidentally, I was just reading about Phing, an Apache Ant / GNU Make-style build tool. What I like about it is the ability to write custom extensions in PHP!
I don't know if it works for deploying an app live, but phpUnderControl is a continuous integration suite (which I'm just now starting to look into). If it doesn't support doing deployments natively, it can probably be extended to do them.
Chris Hartjes has a nice view on this: Deployment is not a 4 letter word
We're using Webistrano, which is a web frontend for Capistrano, to deploy a few dozen projects. It's built as a Ruby on Rails app, and provides a nice, centralized and consistent user interface for Capistrano deployments.
Instead of having cap recipes in every project, and running command-line tools, Webistrano stores the recipes in its database, and allows you to attach the recipes to multiple projects and stages. This reduces duplication of scripts.
Also nice is that all deployment logs are stored so there's an auditing trail. Who deployed which revision to the live server, that sort of thing.
As you requested, the revision number is stored in the deployed project as well.
All in all, we're very pleased with it.
