This may be more of a systems question, but as I need it for a Vagrant development environment, I thought I would ask it here.
I am using Vagrant and Chef to create a PHP development environment. It's working correctly, but provisioning takes forever because I'm installing PHP from source, since I need some custom configuration options.
Is there a way to install PHP through Vagrant once, capture all the changes made by the build in an archive, and then just unpack that archive on subsequent provisions?
It is worth considering building the custom-configured PHP into a .deb (or any other) package, so that provisioning installs an already-compiled PHP with your custom stuff baked in.
It still depends how customizable you want your build to remain - I have no idea what you're actually trying to achieve, as you haven't mentioned the specifics of your custom PHP build.
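For example, a minimal sketch of the staged-build approach (the version number, paths, and the choice of fpm are my assumptions, not from the question):

```sh
# Inside the VM, after your usual ./configure with the custom flags:
make -j"$(nproc)"
make install DESTDIR=/tmp/php-stage   # stage the install instead of scattering it system-wide

# Option 1: capture the staged tree as a plain archive in the shared /vagrant folder
tar czf /vagrant/php-custom.tar.gz -C /tmp/php-stage .
# on later provisions: sudo tar xzf /vagrant/php-custom.tar.gz -C /

# Option 2: turn the staged tree into a real package, e.g. with fpm
fpm -s dir -t deb -n php-custom -v 5.4.0 -C /tmp/php-stage .
# on later provisions: sudo dpkg -i php-custom_5.4.0_amd64.deb
```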
I'm still new to coding and I'm learning everything on my own. This may be a silly question for you, but after reading a dozen articles I am still confused.
I have a PHP-based website on a shared host. After reading various articles on the benefits of using repositories and Composer, I decided to give it a try. These are my difficulties so far:
Which operating system's version of Composer should I download to enable me to install/update packages on my cPanel-based shared hosting?
If I am to install the Windows version, how do I connect to my shared hosting to install/update the packages?
My apologies for my silly questions, but it would really help.
If you are using shared hosting, you are unlikely to be able to use Composer on the host itself. Furthermore, you are not encouraged to use Composer "on production".
I would recommend you use Composer locally (on the OS of your local machine) to compose your project and install your dependent packages. Once it's all working and tested with your own code, you upload your entire development directory tree, including the resulting vendor directory, as one big FTP/SCP upload of "flat files".
Once you get more advanced you could venture into automated deployment techniques, but I feel for now you would be best to stick to using Composer as a local development tool to manage your codebase.
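A minimal sketch of that local workflow (the package name is just an example):

```sh
# On your local machine, in the project root:
composer require monolog/monolog               # pull in a dependency (example package)
composer install --no-dev --optimize-autoloader

# Test the site locally, then upload the whole tree - vendor/ included -
# to the shared host over FTP/SCP as plain files.
```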
Update, further details:
Composer is really a tool to help you manage your codebase in development. It's not intended as a "deployment" tool. Previously, you'd find a library you liked, download it, unzip it into your codebase somewhere random like "lib/stuff", link to it, and commit it into your version control system (VCS). OK, but then a year later you want to update it, and you have to download it again, figure out where you saved it and how to overwrite the files, or delete old ones... it gets hard. Also, your VCS repository fills up with third-party components - even duplicates of the same one! Composer solved this by bringing order to this long-term dependency-management chaos.
The reason you don't want to run Composer "on production" (i.e. your live website) is that while it downloads and updates packages, your website will probably be broken. Even if the Composer run works, that could mean several minutes of broken site. And after the update has finished, you have a completely new set of third-party packages: how do you know they are compatible with your codebase?
So you only run Composer updates locally, test everything, amend your code to work with the shiny new updates, and only then decide to upload the whole new site to the server - just as if you'd cobbled it all together manually. The deployment is independent.
In my project, the deployable version needs to include a copy of each of the external libs, a different config file, and install and setup files; for security reasons, the main project is set to refuse to run if those install and setup files are present. Thus the upstream copies of the other projects need to be committed to the repo. How can I work on code running on localhost, when the file layout (and sometimes file contents) used for dev and testing differ from what I need to commit?
Background
I am working on a project hosted on GitHub, and my main IDE is NetBeans, which has imperfect Git support (good enough for >99% of my needs). The project is in PHP and uses several other projects as libraries.
As NetBeans does not have the best support for sub-repos, I have chosen to keep each additional project as a separate project. This is fine, as the central project looks at config data to find out where these outside libs live.
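To illustrate, the config the central project reads looks roughly like this (simplified, with made-up names):

```php
<?php
// config.php - each machine has its own copy; not committed to the repo.
// The central project uses these paths to locate the external libs.
return array(
    'lib_paths' => array(
        'auth_lib'   => '/home/me/projects/auth-lib/src',   // local dev checkout
        'markup_lib' => '/home/me/projects/markup-lib/src',
    ),
);
```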
Half an answer
My instinct is to suppose that there will need to be some "build stage" prior to committing to the github repo, but how on earth do I go about setting all that up?
I could write some sort of homebrew thing, but then when I pull other people's contributions I would need to reverse the process - unless we had a branch for builds and a branch for working copies, which seems needlessly complex and could leave the devs' config data on public display (not to mention updates becoming a mess).
I have seen that others have wrestled with somewhat similar problems to no conclusion (at the time of asking) (How to push and pull from github without sharing sensitive information? Smudge & clean?), so I am looking for anything that might help me come up with a solution.
my main IDE is NetBeans, which has imperfect Git support
Most devs just use the command line. I switch to the NetBeans conflict resolver occasionally, which is very good, but for normal stuff the console is usually faster.
My instinct is to suppose that there will need to be some "build stage" prior to committing to the github repo
... unless we had a branch for builds and a branch for working copies
No, there is only ever one repository. It is better to think of your repo as your code history, rather than your deployment state. Branches should just be for features or large changes, which merge into your mainline/master.
There are a good number of options available to you when deploying. The first is Composer, which Mark points out: when deploying, you issue an install or update command, which recursively fetches the dependencies that satisfy your library requirements. You can use Bower to do the same thing for your JavaScript dependencies.
Some deployment strategies prefer to build locally and then scp/rsync to a remote server. Composer and Bower are still probably a good idea, but you write a build script (using Ant or Phing, for example) to create a build copy in a local temporary folder, and then send it to the server. It is common here also to push it to a new release folder on the server, and then swap a symlink or Apache config file when it's ready to go live.
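A bare-bones sketch of that release-folder pattern (the host name and paths are invented):

```sh
# Build into a local temp folder, then ship it to a timestamped release directory
RELEASE=release-$(date +%Y%m%d%H%M%S)
rsync -az build/ deploy@example.com:/var/www/releases/$RELEASE/

# Flip the symlink once the new release is fully in place
ssh deploy@example.com "ln -sfn /var/www/releases/$RELEASE /var/www/current"
```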
the deployable version needs to include a copy of each of the external libs, a different config file, and install and setup files, for security reasons
Assuming this is a web project, have you tried adding your sensitive environment data to your Apache configuration file? This can be trivially read in PHP, and of course PHP does not care that this information is different according to whether you are developing, testing, demoing a branch or operating live.
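For example (the variable names here are mine; adapt them to your vhost):

```php
<?php
// In the vhost config or .htaccess:
//   SetEnv APP_ENV     "production"
//   SetEnv DB_PASSWORD "s3cret"
//
// Under mod_php, those SetEnv values are visible to the script:
$env    = getenv('APP_ENV');        // "production"
$dbPass = $_SERVER['DB_PASSWORD'];  // the same values also land in $_SERVER
```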
Further reading: an excellent PHP deployment book, free of charge, that suggests Phing and Capistrano.
Just a quick one about installing a PHP website: are there any tools out there that would allow me to create an install package to fully install this website on a Windows platform? If possible, it could maybe even take details like company name and database connection settings and then update the necessary PHP files?
If the latter cannot be done, that's fine, but a free tool for the installation would be great!
Thank you!
Ash.
It's possible with a tool like Inno Setup.
It lets you build a setup package, in which you can put whatever you want (web pages, other installers, ...).
If you want to do something a little more advanced (installing Apache, configuring files, ...) you will need to write some code (Delphi-style Pascal scripting), but the documentation is pretty clear about all the possibilities.
Don't be fooled by the simplicity of the tool; it's very powerful. For example, you can check whether a specific service is running in order to launch (or skip) a specific part of your setup (if httpd is already running, just copy the web pages and skip the Apache installation, for example).
You can combine Inno Setup with Server2Go.
Download the Server2Go package you want as a .zip file, extract it, put your website in the "htdocs" folder, and then create an installer with Inno Setup.
Edit: You can also edit pms_config.ini to suit your needs.
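A minimal Inno Setup script for this might look something like the following (names and paths are placeholders):

```
[Setup]
AppName=My PHP Site
AppVersion=1.0
DefaultDirName={pf}\MyPhpSite

[Files]
; the extracted Server2Go runtime, plus your site dropped into htdocs
Source: "server2go\*"; DestDir: "{app}";        Flags: recursesubdirs
Source: "mysite\*";    DestDir: "{app}\htdocs"; Flags: recursesubdirs
```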
I'm starting a new project and want a one-step build process for my development, continuous integration, and production servers.
I'll want the process to be included in my source control (git/github), and to run robustly on any of my three servers, which are pretty similar, but do have different paths for the project root.
I had planned to use Phing, PHPDocumentor, PHP_CodeSniffer, PHPUnit, etc. installed via Pyrus into a localized PEAR/PEAR2 install that could be deployed along with the project, so I could be confident in my dependencies at build time.
However, I've had nothing but problems with getting this set up.
This is the first time I've tried to set up such a build system, and *nix installs aren't my strong point (though I always do end up with a reliable system eventually), so the weak link here may be me. However, a lot of the problems I'm having seem to come from PEAR.
For example, no matter how carefully I install via Pyrus, everything ends up having path issues. Looking into some of the PEAR packages, all paths are hard-coded (probably substituted at install time) with the relative paths I used during install (./pear, etc.). This means I can only successfully run the packages from the folder where Pyrus was during install, even though the run scripts were put in ./pear/bin and I did set the bin folder during install. Sometimes the paths conflict internally within a single package, so it wants to run from here, but it wants to manage the config from over there...
PHPDocumentor doesn't handle PHP 5.3 (and won't run for me at all - maybe because it has issues, maybe because I've mis-installed it?), so I've replaced it with PHPDoctor, which initially seems pretty good.
Another example: after setup, Phing will run, but it just dies silently. After digging in and tracing it, I found that an obscure function on line 70 of /io/PhingFile is getting null for a required argument, which throws a ConfigurationException, causing it to die silently.
I'm confident that I can get all this working ok, but I'm NOT confident that it will ever be a deployable robust system, and I really don't want to have to debug, tweak, and then maintain a big pile of self-modified PEAR packages that have to be retweaked at every upgrade.
So, finally the question =o)
Does anyone have a really good robust build system using these apps? Was there some trick to it?
Or does everyone have a nice robust build system and it's just my naivete with *nix installs and system config that makes this all seem like a double-sized helping of clusterphuckery?
Does anyone have any pointers on getting such a system set up to work across multiple servers, or am I just kidding myself? Maybe I should just do separate installs on the systems outside the project root and get on with my development?
At work, we use many PEAR packages in our software, install them via Pyrus into local directories, and use Phing extensively as our build tool to run tests and deploy the software - on different Linux systems, and some developers even run them on their Windows boxes.
It's working reliably, and we're not experiencing the issues you described.
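For what it's worth, a stripped-down version of that kind of build file looks roughly like this (the target names are illustrative, and phpunit is assumed to be on the PATH):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project name="myapp" basedir="." default="build">
    <!-- resolve everything relative to this file, so the project root can differ per server -->
    <property name="src.dir" value="${project.basedir}/src"/>

    <target name="lint" description="Syntax-check all PHP files">
        <phplint haltonfailure="true">
            <fileset dir="${src.dir}">
                <include name="**/*.php"/>
            </fileset>
        </phplint>
    </target>

    <target name="test" description="Run the unit tests">
        <exec command="phpunit tests" checkreturn="true" passthru="true"/>
    </target>

    <target name="build" depends="lint,test"/>
</project>
```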
There's another post on SO relating to .NET -- that's not us; we're pure PHP. I'm trying to find the best way/process to deploy a stable version of our PHP app. I've seen an article on Capistrano, but am curious what else is out there. Aside from the obvious reasons, I'm also looking to add some scripting so that the SVN revision number gets added in there as well.
Much thanks.
I've used a home-grown script for quite some time. It will (based on an application configuration file):
Run svn export on the repository based on a tag.
Package the export into a tar or zip file, which includes the tag in the name.
Use scp to copy the package to the appropriate server (QA or release).
Connect to the server with ssh to install the package and run post-install scripts.
The application configuration file is part of the project. It can tell the script (at step 2) to strip paths and otherwise process specified files. It also specifies server names and how to handle externals.
I've recently migrated the script to support Git as well as Subversion. I'm also probably going to port it to PHP, since we're now running in a mixed (Linux and Windows) setup, with Linux now in the minority.
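A rough sketch of what that PHP port might look like (the host, repository URL, and post-install script are placeholders):

```php
<?php
// deploy.php <tag> - mirrors the four steps above.
$tag     = $argv[1];
$repo    = "https://svn.example.com/myapp/tags/{$tag}";
$package = "myapp-{$tag}.tar.gz";

run("svn export {$repo} build/{$tag}");              // 1. export the tagged revision
run("tar czf {$package} -C build {$tag}");           // 2. package it, tag in the name
run("scp {$package} deploy@qa.example.com:/tmp/");   // 3. copy to the target server
run("ssh deploy@qa.example.com 'cd /tmp && tar xzf {$package} && {$tag}/post-install.sh'"); // 4. install

function run($cmd)
{
    passthru($cmd, $status);                         // echo output as the command runs
    if ($status !== 0) {
        fwrite(STDERR, "Command failed: {$cmd}\n");
        exit(1);
    }
}
```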
I have plans to automatically call the script with post-commit hooks, but haven't had the need to implement that just yet.
Coincidentally, I was just reading about an Apache Ant/GNU Make-style build tool called Phing. What I like about it is the ability to write custom extensions in PHP!
I don't know if it works for deploying an app live, but phpUnderControl is a continuous integration suite (which I'm just now starting to look into). If it doesn't support doing deployments natively, it can probably be extended to do them.
Chris Hartjes has a nice view on this: Deployment is not a 4 letter word
We're using Webistrano, which is a web frontend for Capistrano, to deploy a few dozen projects. It's built as a Ruby on Rails app, and provides a nice, centralized and consistent user interface for Capistrano deployments.
Instead of having cap recipes in every project, and running command-line tools, Webistrano stores the recipes in its database, and allows you to attach the recipes to multiple projects and stages. This reduces duplication of scripts.
Also nice is that all deployment logs are stored, so there's an audit trail: who deployed which revision to the live server, that sort of thing.
As you requested, the revision number is stored in the deployed project as well.
All in all, we're very pleased with it.