Creating Docker images from Puppet for Jenkins - PHP

I'm currently trying to set up continuous integration testing for the company I work for. The system is a LAMP-based web application, and I'm trying to get Jenkins configured to run that testing.
The current stumbling block is finding a Docker image that runs Jenkins with PHP available. Most of the images I've found are out of date (they haven't been updated in a year) and then don't work when you try to run through the Jenkins setup.
We currently have a Puppet configuration for our production servers, and my current line of thinking is to use puppet image build to create a Docker image based on our production setup, with the added bonus of like-for-like testing.
The problem here, though, is that we don't want to install Jenkins on all our production servers. Is there a way to add Jenkins to an already-built Docker image (preferably via the CLI)? And am I tackling this the right way, or am I putting the cart before the horse, i.e. should I be looking at putting PHP on a working Jenkins image instead?

Where I work, we have a... few lines of Puppet that we maintain, and we've been discussing internally leveraging that existing Puppet in Docker, but how? I know nothing of the project you've referenced here, but it's a bit of a bad smell that the project has sat dormant for nearly a year without a commit. We're taking a hard look at HashiCorp's Packer; for your purposes, the free and open-source version should do just fine!
But, to the second half of your question: the cart is probably before the horse right now. I'm confused as to what you're trying to accomplish by installing Jenkins inside the Docker image you test against. I think what you're looking for is a longer-running Jenkins server (with or without slaves) that checks out the code, runs unit tests, integration tests, and so on. If you have the resources, GitLab CE with GitLab CI is a pretty simple way to get started developing an on-premise CI/CD workflow.
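As a rough sketch of that longer-running server (the image tag and volume name here are assumptions; check Docker Hub for the current official Jenkins image), you can keep a Jenkins master alive in Docker with something like:

# Create a named volume so jobs, plugins and config survive container restarts.
docker volume create jenkins_home
# Run the official Jenkins image, exposing the web UI and the agent port.
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts

PHP, Composer, and your test tooling then belong on build slaves (or in throwaway per-build containers) rather than being baked into the Jenkins image itself.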

Related

Coding at multiple locations?

At the moment I am using a poor method to do web development at home and at work.
I use WAMP for testing/development and then upload to a production web server (Linux) via FTP.
If I continue with the project at home, I have to download the files from the FTP server again.
What is a good method for working on the same projects at multiple locations?
Someone suggested I learn Git and get a GitHub private account. They also suggested installing Vagrant at work and at home. Do I need to install Git in the Vagrant VM or on my local machine?
Learn git: http://try.github.io
Create a Vagrant/VirtualBox VM by following the directions at https://puphpet.com
One of the tricks here is to put the Vagrant stuff you get from Puphpet directly in your project and then commit all of it to git. You'll then be able to check out the project in a new environment and, as long as Vagrant and VirtualBox are installed, you can run vagrant up and be working in about 5 mins.
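As a minimal sketch (the repository URL is just a placeholder), bringing the project up in a new environment then looks roughly like this:

# Clone the project; the Puphpet/Vagrant config is committed alongside the code.
git clone https://github.com/your-user/your-project.git
cd your-project
# Boot and provision the VM described by the committed Vagrantfile.
vagrant up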
Here's an example of how I'm doing just that to allow people to easily try out a library I've written: https://github.com/jeremykendall/query-auth-impl.
Enjoy! Your life as a developer is about to get a lot easier and a whole lot better.
GitHub or Bitbucket. Git or Mercurial, and also SVN if it's just for yourself and you want an easier learning curve.
Any source control system would be ideal for this.
You don't want your production server to be the source of truth for the actual code. Those two concerns should definitely be separated. The production application is the output of the code, not the code itself. For a language like PHP the two may be identical, but the concerns themselves should still be separated. Indeed, for small systems the two services may even be hosted on the same server, but should still be logically separated.
The source control system maintains the changes made to the code over time, the production server is a snapshot of the current release version of the code.

Continuous Deployment with Jenkins & PHP

I'm sure there are answers for this all over stackoverflow but I was unable to find anything specific.
I have a PHP project which I am revisiting. It's running on a RHEL5 box, and I have SVN on the same box.
Out of curiosity I recently added Jenkins to the machine and set up the Jenkins PHP template at...
http://jenkins-php.org/
There was a bit of playing around with the setup but I more or less have this all running and doing Continuous Inspection builds when something is committed to SVN.
What I want to do now is have Jenkins copy my updated files across to the server when the build completes.
I am running a simple LAMP setup and would ideally only like to copy across the files that have actually changed.
Should I just use Ant & sync? Currently the files reside on the same box as the server, but this may change, at which point I will need to sync the files across to multiple remote boxes.
Thanks
Check these out: the Copy Artifact Plugin and the job's environment variables.
Now set up two jobs, one on the source machine and one on the destination server (make it a slave). Use the plugin to copy the required artifacts, locating them with the environment variables.
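For example (the artifact directory and deploy path here are made up), an "Execute shell" step on the destination job could use the standard Jenkins variables once the Copy Artifact step has run:

# The Copy Artifact build step has already pulled the artifacts into this workspace.
echo "Deploying ${JOB_NAME} build ${BUILD_NUMBER}"
# Copy the fetched artifacts into the web root (target path is an assumption).
cp -r "${WORKSPACE}/artifacts/." /var/www/html/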
Do you have your project (not Jenkins, but the one with the LAMP setup) under SVN? If so, I'd recommend creating a standalone job in Jenkins that just does an svn up, and tying it to your main job so that when your main build is OK, Jenkins automatically runs the job that updates your project.
For copying to other servers, take a look at the Publish Over plugins.
It's very easy to set up servers and rules. The bad thing is that you can't set it up to copy only the new files for the current build, which means the entire project is uploaded on every build.
If your project is too big, another solution is to use rsync as a post-build action.
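A hedged sketch of such a post-build "Execute shell" step (the host, user, and paths are placeholders):

# Sync only files that changed, skipping SVN metadata; --delete removes files
# on the target that no longer exist in the workspace.
rsync -az --delete --exclude='.svn' \
  "$WORKSPACE/" deploy@webserver:/var/www/myapp/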

2 cloud servers, one dev, one prod; what's a good deployment process?

Currently using LAMP stack for my web app. My dev and prod are in the same cloud instance. Now I am getting a new instance and would like to move the dev/test environment to the new instance, separating it from the prod environment.
It used to be a simple Phing script that would do an SVN export into the prod directory (pointed to by my vhost.conf). How do I make a good build process now that the environments are separated?
I'm thinking of transferring the SVN repository to the dev server and then doing an ssh+svn push (is this possible with Phing?).
What's the best/common practice for this type of setup?
More Info:
I'm currently using CodeIgniter for MVC framework, Phing for automated builds for localhost deployment. The web app is also supported by a few CRON scripts written in Java.
Update:
Ended up using Phing + Jenkins. Working well so far!
We use Phing for doing deployments similar to what you have described. We also use the Symfony framework for our projects (which is not that important here, but Symfony supports the concept of different environments, so it's a plus).
However, we still need to produce different configuration files for the database, front controllers, etc.
So we ended up with a folder of build.properties files that define the configuration for different environments (and, in our case, also for the different clients we ship our product to). This folder is linked into the file structure using svn:externals (again, not strictly necessary).
The Phing build.xml file then accepts a property file as a parameter on the command line, takes the values from it, and produces all the necessary configuration files, controllers, and other environment-specific files.
We store the configuration in template files and then use the copy/filter feature in Phing to replace the placeholders in the templates with the specific values.
The whole task of configuring the given environment can then be as simple as something like this:
phing configure-environment -DpropertyFile=./build_properties/build.properties.prod
In your build file you check whether the propertyFile property that specifies the properties file is defined, and load the file using <property file="${propertyFile}" override="true" />. Then you just do any magic with the values as you need.
You can still use your svn checkout/update and add all the generated configuration files to svn:ignore (Phing will produce them for you). We actually use additional steps in Phing, which in the end produce a self-deploying Linux shell installation package. This is built automatically in Jenkins. We then send the package to our clients, or the support team can grab it from Jenkins and do the whole deployment just by executing it (we still prefer manual deployments to production servers), or Jenkins can deploy it automatically (for example to test servers).
I'll be happy to write more info if needed.
I recommend using Capistrano (it looks like they haven't updated the docs since they moved the site) and railsless-deploy for doing deployment. Eventually you are probably going to need to add more app boxes and run other tasks as part of your deployment, so choosing a framework that will support this can save you a lot of time in the future. I have used Capistrano for two PHP deployments (one small and one large) and, although it's not perfect, it works well. It also handles all of the code checkout/update, moving symlinks into place, and rolling back if something goes wrong.
Once you have capistrano configured, all you have to do is something like:
cap dev deploy
cap prod deploy
Another option that I have explored for doing this is Fabric. Although I haven't used it, if I had to deploy a complex app again I would consider it. The interface is simple and straightforward.
A third option you might take a look at, though it's still in the early stages of development, is gantry (pardon the self-promotion). This is something I have been working on out of frustration with using Capistrano to deploy a PHP application in an environment with a lot of moving pieces. Capistrano is great and works well for non-PHP application deployments, but you still have to do some poking around in the code to understand what is happening and tweak it to suit your needs. This is also why I suggest giving Fabric a good look.
I use a similar config now: LAMP + SVN + CodeIgniter + prd and dev servers.
I run the SVN repos on dev. I check out the repos into the root folder of the dev domain, then use a post-commit hook to update the root folder every time any developer commits.
When we are happy and have fully tested the code, I SSH into the prd server and rsync the dev root to the prd root (sketched below).
Here's my solution for the different configs. Outside the root folder I have a config.ini file. I parse the file in my CodeIgniter constants.php script. This means that the prd and dev servers can have separate settings without them ever being in the repos.
If you want help with the post-commit, rsync, and ini code, let me know.
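A rough sketch of that rsync step (paths and hostnames are placeholders; config.ini lives outside the synced root, so it is never overwritten):

# Dry run first to review what would change on prod...
rsync -avzn --delete --exclude='.svn' /var/www/dev_root/ deploy@prd:/var/www/prd_root/
# ...then run it for real.
rsync -avz --delete --exclude='.svn' /var/www/dev_root/ deploy@prd:/var/www/prd_root/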

Deploying / Continuous integration of a Symfony 2 application with Jenkins/Hudson

I've developed an application which uses the Symfony 2 framework. The application code resides in a Bundle, and on my local machine I just downloaded the Symfony2 Standard Distribution and added the Bundle to the src folder as the tutorials describe, before editing the config / routing files appropriately. That's served me well from a development perspective.
I'm now starting to think about how to handle the framework dependencies with regards to deploying to a production environment / a continuous integration setup. Should I continue as I have to date, using a distribution and perhaps a build tool like Phing to check out my bundle and any other dependencies? Or should I be checking out only the Symfony source from Github, and maintain a custom 'distribution' for my application?
I'm hoping someone else has had to do a similar thing and can recommend a solution that works with minimum fuss!
Thanks.
Are your tests written with PHPUnit? If so you can run the tests directly using Ant, and then run Ant from Jenkins. In my setup I then have a second project that updates the git checkout in our staging environment if the tests pass (a rough sketch of that step follows below). I ran into a bunch of issues duct-taping this all together (mostly around GitHub keys, user permissions, user shell environments, etc.), but the phpqa tools work very well. I just saw this post that seems like a more recent guide on getting everything running:
http://edorian.posterous.com/setting-up-jenkins-for-php-projects
I've got this 'Hello world' project, including a working build.xml, that should work if Ant and the PHP tools are set up correctly:
https://github.com/canuckistani/JenkinsTest
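The staging-update job itself doesn't need to be fancy. A sketch, assuming SSH access and an existing clone on the staging box (host and path are placeholders):

# Triggered by Jenkins only when the test job succeeds:
# refresh the staging checkout over SSH.
ssh deploy@staging 'cd /var/www/myapp && git pull --ff-only origin master'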

Version control PHP Web Project

We have a PHP project that we would like to version control. Right now there are three of us working on a development version of the project, which resides in an external folder that all of our Eclipse IDEs are linked to, with no version control at all.
What is the right way and the best way to version control this?
We have SVN set up, but we just need to find a good way to check code in and out that still allows us to test on the development server. Any ideas?
We were in a similar situation, and here's what we ended up doing:
Set up two branches: a release branch and a development branch.
For the development branch, include a post-commit hook that deploys the repository to the dev server so you can test (a sketch of such a hook is included below).
Once you're ready, you merge your changes into the release branch. I'd also suggest putting in a post-commit hook for deployment there.
You can also set up individual development servers for each of the team members, on their workstations. I find that it speeds things up a bit, although you do have some more setup time.
We had to use a single development server, because we were using a proprietary CMS and ran into licensing issues. So our post-commit hook was a simple FTP bot.
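A hedged sketch of that development-branch hook (the repository layout, credentials, and paths are all assumptions; ours was an FTP bot, but an svn-based variant is more common):

#!/bin/sh
# hooks/post-commit: deploy to the dev server only when the development branch changes.
REPOS="$1"
REV="$2"
if /usr/bin/svnlook changed -r "$REV" "$REPOS" | grep -q 'branches/development/'; then
  /usr/bin/svn update --username svn_user --password svn_pass /var/www/dev
fi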
Here is what we do:
Each dev has a VM that is configured like our integration server.
The integration server has space for Trunk, each user, and a few slots for branches.
There is also a production server.
Hooks are set up in Subversion to send e-mail when commits are made.
At the beginning of a project, the user makes a branch and checks it out on their personal VM as well as grabs a clean copy of the database. They do their work, committing as they go.
Once they have finished everything in their own personal space, they log into the integration server, check out their branch, run their tests, etc. When all of that passes, their branch is merged into Trunk.
Trunk is rebuilt, the full suite of tests is run, and if all is good it gets the big ol' stamp of approval, is tagged in SVN, and is promoted to production at the end of the night.
If at any point a commit by someone else is made, we get an e-mail and can merge those changes into our individual branches.
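The SVN mechanics behind that branch/merge/tag cycle look roughly like this (the repository URL and names are illustrative):

# Start of a project: branch trunk and check the branch out on the personal VM.
svn copy http://svn.example.com/repo/trunk \
         http://svn.example.com/repo/branches/jsmith-feature \
         -m "Create working branch"
svn checkout http://svn.example.com/repo/branches/jsmith-feature work

# When another dev's commit e-mail arrives, pull their changes into the branch.
cd work
svn merge http://svn.example.com/repo/trunk

# Once everything passes on the integration server, merge back into trunk and tag.
cd ../trunk-checkout
svn merge --reintegrate http://svn.example.com/repo/branches/jsmith-feature
svn commit -m "Merge jsmith-feature into trunk"
svn copy http://svn.example.com/repo/trunk \
         http://svn.example.com/repo/tags/nightly-approved \
         -m "Tag approved build"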
Beanstalk has built-in post-commit hooks for deploying to development, staging, and production servers.
One way to use Subversion for PHP development is to set up a repository for one or all three developers, and use this repository more as a syncing tool than as true version control.
You could:
Make a repo
Add your project's entire PHP document structure
Check out a copy of this repo into the correct spot on your dev server
Use an svn hook that activates on commit
This hook will automatically update the contents of the dev server whenever anybody on the team checks in any code.
The hook resides in:
svn_dir/repo_name/hooks/post-commit
And could look something like:
#!/bin/sh
/usr/bin/svn up /path_to/webroot --username svn_user --password svn_pass
That will update your working copy on the dev server to the latest check-in.
What about something distributed? You could start with Mercurial, for example, try different workflows, and see which one fits you best.
Each of you could run it locally, or on your own dev server (or even the same one with a different port...).
One possible way (there are probably better ways):
Each of you should have your own checked out version of the project.
Have a local copy of the server on your computer and test it there throughout the day. Then at the end of each day (or whenever), you merge together whatever you are ready to test, and you check it out onto the dev server and test it.
Another tool you can use for the builds is TeamCity, which is free for up to 20 build configurations (enough for most small companies/projects). This way you can run your tests as well as schedule builds.
