I am pretty new to Jenkins; I have some basic understanding, but need further guidance.
I have a PHP application on a Git repo, that uses Composer, has Assets, has user uploaded Media files, uses Memcache/Redis, has some Agents/Workers, and has Migration files.
So far I understood I need to create two jobs in Jenkins.
Job 1 = Build
Job 2 = Deploy
In the Build job, I set up the Git repo as the source and added a shell build step containing a single line: composer update.
1) My first question relates to how/where the files are cloned. I understand there is a Workspace, but is everything cloned there every time, or are only new changes pulled?
2) composer update seems to download the same stuff again and again, and it looks like nothing is cached across builds. I'd love to hear opinions here, but I was expecting the next build to check for changes and fetch only the diff. Doing a full composer update takes several minutes.
In the Deploy job, I would love to set up a process that takes the most recent stable build and moves the files to a dedicated folder like releases2. Then it runs some provisioning scripts, and at the end it updates the /htdocs symlink to point at the new releases2 folder, so the webserver starts serving the website from there.
3) How can I get the latest build (in the build folder I saw only a couple of log and XML files, and couldn't locate the files from Git) and move it to a fresh destination?
4) How shall I set up the destination so that I can keep media files between different deploys?
5) When shall I deal with the assets (like publishing to a CDN): after a successful build, or before the deploy finishes? Should this be a pre/post hook, or a different job?
6) When shall I clear the caches (Memcache, Redis)?
7) How can I roll back to previous versions? And how can I set it up to keep the last 5 successful releases?
8) How can I get email alerts for failed builds and failed deploys?
9) How can operations get a list of recent commit messages by email after a successful deploy?
I noticed Jenkins has a lot of plugins. I'm not sure whether these tasks are handled by plugins, but feel free to recommend anything that gets them done. I also read about Phing, but I'm not sure what it is or where I should use it.
I understand there are lots of questions in this topic, but if you know the answer to a few of them, please post it as an answer.
Warning: tl;dr ahead.
OK, you want it all. Lots of questions; long story.
Jenkins is "just" a continuous integration server.
Continuous integration basically means that you don't run the compilation and unit-testing steps on the developer machine, but pull them over to a central server, right?
Because compilation and linking now happen on a central server, the developer has more time to develop instead of waiting for compilation to finish. That's how this CI thing started.
Now, when looking at PHP projects, there isn't any compilation or linking process involved.
The job of a continuous integration server working on PHP projects boils down to running the unit tests and maybe generating some reports.
You can clearly see that when looking at helper projects like Jenkins-PHP, which provides a template setup for PHP projects on Jenkins: http://jenkins-php.org/example.html
The starting point is: "Jenkins does something after you committed source."
You already have a configuration for your Git repository. It is monitored and whenever a new commit arrives, a new "build process" is triggered.
What is this "build process"?
The build process can partly be configured in the Jenkins GUI.
Partly means the focus is on the configuration of triggers, notifications and report generation. Report generation means that after certain build tools have finished their jobs, their log files are processed and turned into a better format.
E.g. when phpunit has finished its job, it's possible to take the code-coverage log, turn it into a nice HTML page and move it to the /www folder for public viewing.
But most of the real work of this build process is described in a build configuration file. This is where build tools like Phing, Ant (the big brother of Phing) and NAnt (Windows) come into play.
The build tool provides the foundation for scripting tasks.
This is where your automation happens!
You will have to script the automation steps yourself.
Jenkins is just a GUI on top of that, providing some buttons for displaying the build log and reports and for re-starting a build.
Or in other words: you can't simply stick Jenkins and your PHP project together, hoping that you can click your build and deployment process together in the GUI.
We are not there, yet! The tools are getting better, but it's a long way.
Let's forget about Jenkins for a while. Let's focus on the build steps.
How would you build and deploy your project, when you are on the CLI only?
Do it! You might want to write down all the commands and steps involved in a simple text file.
Now, turn these steps into automation steps.
Create a "build.xml" in the root folder of your project.
<?xml version="1.0" encoding="UTF-8"?>
<project name="name-of-project">
    <!-- build steps -->
</project>
Now we need some build steps.
Build tools refer to them as "targets". A build target groups tasks.
You can execute each target on its own, and you can also chain them.
<?xml version="1.0" encoding="UTF-8"?>
<project name="name-of-project" default="build">
    <target name="build">
        <!-- tasks -->
    </target>
    <target name="deploy">
        <!-- tasks -->
    </target>
</project>
Rule: keep targets small; a maximum of 5-7 CLI commands per target.
Now let's introduce target chaining with dependencies.
Let's assume that your target "build" should run "phpunit" first.
On the CLI you would just run phpunit, then your build commands.
Inside a build configuration you have to wrap such calls in exec tasks.
So you create a "phpunit" target and add it as a dependency of the target "build".
Dependencies are executed before the target specifying them.
<?xml version="1.0" encoding="UTF-8"?>
<project name="name-of-project" default="build">
    <target name="phpunit" description="Run unit tests with PHPUnit">
        <exec executable="phpunit" failonerror="true"/>
    </target>
    <target name="build" depends="phpunit">
        <!-- tasks -->
    </target>
    <target name="deploy">
        <!-- tasks -->
    </target>
</project>
A build tool like Phing provides a lot of core tasks, like chown, mkdir, delete, copy, move, exec (...) and additional tasks (for Git, SVN, notifications). Please see the documentation of Phing http://www.phing.info/docs/master/hlhtml/index.html or Ant http://ant.apache.org/manual/
A good thing about Phing is the possibility to write ad-hoc tasks in PHP inside the build configuration file and run them. This is also possible with Ant: just build an exec task that executes PHP with the script.
OK, let's fast-forward: you have re-created the complete build and deployment procedures inside this build configuration. You are now in the position to use the target commands standalone. Now we switch back to Jenkins CI (or any other CI server) and configure it to run the build tool with the target tasks. Normally you would have a default target, called main or build, which chains all your targets (steps) together.
Now, when a new commit arrives, Jenkins starts the build process by executing the build script.
Given these pieces of information about how Jenkins interacts with a build tool,
some of your questions answer themselves. You just have to create the steps needed to get done what you want...
Let's start the Q&A round:
Q1: Jenkins workspace folder
The workspace is where the projects live. New commits arrive there.
Under "Advanced" you can select a working directory for the project without changing the Jenkins home directory. Check the "Use custom workspace" box and set the directory that Jenkins will pull the code into and build in.
It's also possible to configure the build folders there, and how many builds to keep.
Q2: Composer
Composer keeps a local cache; it lives in $COMPOSER_HOME/cache. The local cache is used when the same dependencies are requested again, which avoids re-downloading them. If a new dependency is introduced or a version changed, then that thing gets fetched and will be re-used by later composer install and composer update runs.
Composer installs/updates are always fresh from the net or from the cache.
The vendor folder is not kept alive: dependencies are deleted and re-installed.
If it takes long, it takes long. End of story.
If it takes too long, use Composer once, then add two new build targets, "zip-copy-vendor-folder" and "copy-unzip-vendor-folder". I guess you can imagine what these do.
Now you have to introduce an if-check for the zipped vendor file: if the vendor zip exists, you skip the composer install target and proceed with "copy-unzip.." .. OK, you got it. This is a tweak; do it only if your dependencies are pretty stable and far from changing often.
In general, you will need a build target "get-dependencies-with-composer", which executes composer install. Composer uses its cache automatically.
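The zip/unzip tweak above can be sketched as a shell build step. Everything here is an assumption for illustration: the cache directory, keying the archive on a hash of composer.lock, and the offline fallback branch; only the composer install call itself is from the answer.

```shell
#!/bin/sh
# Sketch of the "cache the vendor folder" tweak. CACHE_DIR and the
# lock-file hash key are assumptions, not Jenkins or Composer conventions.
set -e
[ -f composer.lock ] || echo '{"demo": true}' > composer.lock   # demo stand-in

CACHE_DIR="${CACHE_DIR:-.build-cache}"
mkdir -p "$CACHE_DIR"
KEY=$(md5sum composer.lock | cut -c1-32)      # key the archive on the lock file
ARCHIVE="$CACHE_DIR/vendor-$KEY.tar.gz"

if [ -f "$ARCHIVE" ]; then
    tar -xzf "$ARCHIVE"                       # "copy-unzip-vendor-folder"
elif [ -f composer.json ] && command -v composer >/dev/null 2>&1; then
    composer install --no-interaction         # full fetch (Composer's cache helps)
    tar -czf "$ARCHIVE" vendor                # "zip-copy-vendor-folder"
else
    mkdir -p vendor && touch vendor/autoload.php   # offline demo fallback
    tar -czf "$ARCHIVE" vendor
fi
echo "vendor ready (cache key $KEY)"
```

Because the key changes whenever composer.lock changes, a dependency bump automatically bypasses the stale archive.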
Q3: get the latest build and move to a fresh destination
The latest build is in the build folder, or, if you defined a step to move the files, it's already in your desired folder.
Q4: how to get media files in
Just add a build target that copies the media folders into the project.
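A different, common take on the same goal (my assumption, not part of the answer above): keep user uploads outside the release folders entirely and symlink them into each new release, so a copy step is never needed. The demo paths below are invented.

```shell
#!/bin/sh
# Sketch of the "shared media folder" layout: uploads survive deploys because
# every release links to the same folder. All paths here are demo placeholders.
set -e
BASE=demo-site                   # stand-in for the real web root
mkdir -p "$BASE/shared/media" "$BASE/releases/release2/public"
echo "user-upload" > "$BASE/shared/media/photo.jpg"

# Inside the deploy target: link the shared media into the fresh release.
# The ../../../ climbs from public/ back up to the web root.
ln -sfn ../../../shared/media "$BASE/releases/release2/public/media"

# The upload is visible through the new release:
cat "$BASE/releases/release2/public/media/photo.jpg"
```

The symlink approach also keeps each release folder small, since media is never duplicated per deploy.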
Q5: add build targets for asset handling
You already know the position: it's "after build". That means it's a deployment step, right? Add a new target that uploads your folder, maybe via FTP, to your CDN.
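One possible body for such a target, sketched with rsync over SSH instead of FTP (my substitution, since it only transfers changed files). The host, key variable and asset path are placeholders; the demo setup lines stand in for a real build output.

```shell
#!/bin/sh
# Sketch of an "upload assets to the CDN" deploy step. Host, credentials
# and paths are placeholders; the mkdir/touch lines are demo setup only.
ASSET_DIR="${ASSET_DIR:-public/assets}"
CDN_TARGET="${CDN_TARGET:-cdn.example.com:/var/www/assets}"

mkdir -p "$ASSET_DIR"            # a real build would have produced these
touch "$ASSET_DIR/app.css"

if command -v rsync >/dev/null 2>&1 && [ -n "${CDN_SSH_KEY:-}" ]; then
    # -a preserves attributes, -z compresses, -c compares checksums
    # so unchanged assets are skipped entirely.
    rsync -azc -e "ssh -i $CDN_SSH_KEY" "$ASSET_DIR/" "$CDN_TARGET/"
else
    echo "CDN credentials not set -- skipping upload (demo mode)"
fi
```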
Q6: when shall I clear the caches (memcache, redis)
I suggest to go with a simple: "deploy - flush cache - rewarm caches" strategy.
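That strategy can be sketched as a post-deploy shell step. The hosts, ports and warm-up URL are placeholders, and every external call is guarded so the sketch degrades gracefully where the tools are absent.

```shell
#!/bin/sh
# Hedged sketch of a "flush then rewarm" deploy step. Hosts, ports and the
# warm-up URL are placeholders; adjust them to the real environment.
REDIS_HOST="${REDIS_HOST:-127.0.0.1}"
MEMCACHE_HOST="${MEMCACHE_HOST:-127.0.0.1}"

# Flush Redis (FLUSHALL wipes every database -- make sure that is intended).
command -v redis-cli >/dev/null 2>&1 && redis-cli -h "$REDIS_HOST" FLUSHALL || true

# Flush memcached via its plain-text protocol.
command -v nc >/dev/null 2>&1 \
    && printf 'flush_all\r\nquit\r\n' | nc -w 1 "$MEMCACHE_HOST" 11211 || true

# Rewarm: request the hottest pages so the first real visitor
# does not pay the cache-miss penalty.
command -v curl >/dev/null 2>&1 && curl -s -o /dev/null "http://localhost/" || true

date > .cache-flush-stamp     # marker so later build steps can see this ran
```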
Hot-swapping of PHP applications is complicated. You need a PHP class that supports changing the underlying components while the system runs two versions at once: the old version fades out of the cache while the new one fades in.
Please ask this question as a standalone one!
This is not as easy as one would think.
It's complicated, and also one of the beloved topics of Rasmus Lerdorf.
Q7.1: How can I rollback to previous versions?
By running the deploy target/tasks in the folder of the previous version.
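With the symlink layout from the question, the rollback is just repointing the link, as this self-contained demo shows (the demo-app folder and release names are invented stand-ins for the real paths):

```shell
#!/bin/sh
# Sketch of symlink-based switching and rollback, run in a demo folder.
# The releases/htdocs layout mirrors the question; real paths will differ.
set -e
APP=demo-app
mkdir -p "$APP/releases/release1" "$APP/releases/release2"

switch_to() {
    # -sfn replaces an existing link in one step instead of descending into it.
    ln -sfn "releases/$1" "$APP/htdocs"
}

switch_to release1       # initial deploy
switch_to release2       # new deploy: the webserver now serves release2
switch_to release1       # rollback: point the link back -- nothing is copied

readlink "$APP/htdocs"   # -> releases/release1
```

Because nothing is copied, a rollback is near-instant, which is exactly why keeping the last few release folders around is worth the disk space.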
Q7.2: And how can I set it up to keep the last 5 successful releases?
Jenkins has a setting for "how many builds to keep" in the build folder.
Set it to 5.
Q8: How can I get email alerts for failed builds and failed deploys?
Automatically. Email notifications are on by default. If I'm wrong, look into the "email" notifiers.
Q9: How can operations get a list of recent commit messages by email after a successful deploy?
Add a build target "send-git-log-via-email-to-operations".
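A possible body for that target, as a sketch: the recipient address is a placeholder, and the GIT_* variables are an assumption that the Jenkins Git plugin is in use (it exports GIT_COMMIT and GIT_PREVIOUS_SUCCESSFUL_COMMIT to build steps).

```shell
#!/bin/sh
# Sketch of "send-git-log-via-email-to-operations". Recipient is a
# placeholder; GIT_* variables come from the Jenkins Git plugin (assumed).
OPS_EMAIL="${OPS_EMAIL:-ops@example.com}"
RANGE="${GIT_PREVIOUS_SUCCESSFUL_COMMIT:+$GIT_PREVIOUS_SUCCESSFUL_COMMIT..}${GIT_COMMIT:-HEAD}"

# Collect the messages since the last successful build (capped at 20 here).
git log --oneline -n 20 "$RANGE" > commits.txt 2>/dev/null \
    || echo "(no git history available)" > commits.txt

# "mail" must be configured on the build node; otherwise archive commits.txt
# as a build artifact and let an email plugin attach it.
if command -v mail >/dev/null 2>&1; then
    mail -s "Deployed: recent commits" "$OPS_EMAIL" < commits.txt || true
fi
echo "commit summary written to commits.txt"
```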
I feel like I wrote a book today...
I don't have answers to all of them, but the few I have are:
1) My first question relates to how/where the files are cloned. I understand there is a Workspace, but is everything cloned there every time, or are only new changes pulled?
You're correct in your understanding that files are cloned into the workspace. However, if you want, you can set your own custom workspace in Advanced Project Options (enable "Use custom workspace"), which is just above the 'Source Code Management' section.
2) composer update seems to download the same stuff again and again, and it looks like nothing is cached across builds. I'd love to hear opinions here, but I was expecting the next build to check for changes and fetch only the diff. Doing a full composer update takes several minutes.
I have no idea about Composer, but if this stuff is also checked out from Git, then a shallow clone might be what you're looking for. This is present under: Source Code Management section > Git > Additional Behaviours > Advanced clone behaviours.
In the Deploy job, I would love to set up a process that takes the most recent stable build and moves the files to a dedicated folder like releases2. Then it runs some provisioning scripts, and at the end it updates the /htdocs symlink to point at the new releases2 folder, so the webserver starts serving the website from there.
I'm not sure why you need a separate job for deployment. All that you've stated above can be accomplished in the same job, I guess. In the Build section (just above the Post-build Actions section), you can specify a script (Windows batch/bash/Perl/...) that performs all the actions required on the stable build that just got created.
3) How can I get the latest build (in the build folder I saw only a couple of log and XML files, and couldn't locate the files from Git) and move it to a fresh destination?
From your description, I'm almost sure you don't have a master-slave setup for Jenkins. In that case, the easiest way to find out the location of the files fetched from Git is to check the 'Console Output' of the latest build (or any build, for that matter). In the first two or three lines of the console output, you will see the path to your build workspace. For example, in my case it's something like:
Started by timer
Building remotely on Slave1_new in workspace /home/ec2-user/slave/workspace/AutoBranchMerge
4) How shall I set up the destination so that I can keep media files between different deploys?
I'm not really sure what you're looking for. It looks like something that has to be handled by your script. Please elaborate.
5) When shall I deal with the assets (like publishing to a CDN): after a successful build, or before the deploy finishes? Should this be a pre/post hook, or a different job?
If you mean artifacts, then you should check 'Post-build Actions'. It has several options such as 'Archive the artifacts', '[ArtifactDeployer] - Deploy artifacts from workspace to remote repositories', etc. The number of options you see in this section depends on the number of plugins you've installed.
One useful artifact-related plugin is https://wiki.jenkins-ci.org/display/JENKINS/ArtifactDeployer+Plugin
6) When shall I clear the caches (Memcache, Redis)?
Sorry, no idea about this.
7) How can I roll back to previous versions? And how can I set it up to keep the last 5 successful releases?
Previous version of what? The build is always overwritten in the workspace; only the logs of past builds remain viewable. You will have to explicitly put a mechanism (script) in place to take backups of builds. Also check the 'Discard Old Builds' section at the top of the project configuration page. There are also a few plugins you can install that help you configure which builds to keep, delete, etc.
8) How can I get email alerts for failed builds and failed deploys?
This option is available in Post-build Actions. There is 'E-mail Notification', which provides basic functionality. If you need better features, I suggest you install the 'Email-ext' plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Email-ext+plugin
You can search from hundreds of plugins by going to Jenkins > Manage Jenkins > Manage Plugins > Available tab
9) How can operations get a list of recent commit messages by email after a successful deploy?
This functionality might not be directly available through a plugin. Some scripting effort will be required, I guess. This might be of some help to you: How to include git changelog in Jenkins emails?
I want to upload my test suite to a Subversion repository, and I was wondering where in the repo the test suite should be placed. At the moment we have a root folder (containing all the source code) and a documentation folder. Should we create a testing folder within the root? We also have an automatic build system that makes a new build every minute. How can we get the tests to run automatically in parallel?
Also, whether the tests fail or pass, how will I get to know the result once the suite is in Subversion?
First, the decision of how to structure your code has to be discussed within your team. But I would always suggest moving the source out of the root and into a separate directory, and the same for docs, externals, etc.
However, why do you wish to upload results into a repository?
When using a CI server like Jenkins or TFS, the results should be displayed on the server to be reviewed by all, not stored in SVN; the results are generated (often binary) data that is usually not checked in for your own project.
Additionally, why does the build system make a build every minute? Why is it not polling the source repo for changes?
The literal answer to your question would be to check the result output of a test run into the repository, indicating whether the test worked or not, but I would not recommend that.
We're considering using a CI server soon.
From my reading, I've found that Sismo and Hudson are available for PHP projects.
Considering that we're actually using Git and PHPUnit, what are the big differences between Hudson and Sismo that we should know about in order to make the best choice for our situation?
Thanks
The language match is not key in your hunt for the best CI server; what matters is all the features around it:
source control
concurrent build
trigger build
notification
Even for a simple project, Jenkins (the new name for Hudson) is easy to use and quick to install. It is then really easy to scale Jenkins up by adding more nodes (satellite machines that can execute builds) when you need to. Jenkins also has hundreds of plugins for numerous tasks.
Have a look at "Bamboo, Jenkins, TeamCity, and CruiseControl Features" to compare some of the features of the big names (you might actually want to consider Bamboo, TeamCity or CruiseControl over Jenkins).
I would lean towards Sismo since it matches the language of the project you are developing (PHP) and can be run from just a single PHP file and a config file. Then you don't have to maintain a Java environment just for Hudson.
There is a really good PHP integration for Jenkins by the PHPUnit inventor Sebastian Bergmann. You should really have a look at it.
As far as I can see, the biggest downside of Sismo is that it is not a "real" CI server, but more of a build-and-report environment, because you need to trigger the builds yourself (or let something trigger them).
I'll preface this by saying that I haven't used Sismo.
We use Hudson with applications being built and tested in both Java and PHP. It has a nice plugin system, and getting it up and running on a CentOS box took about 15 minutes yesterday (we had to move it from one box to another).
For PHP, Hudson integrates with both PHPUnit and Selenium, so we run both unit tests and functional tests against the same codebase. Hudson has a great 'one-click' plugin system that really lets you customize your installation.
One thing we had to get a plugin for was sending an email on every build, whether successful or not. By default, Hudson will only email when your build goes from good (tests pass) to bad, from bad to good, or stays repeatedly bad. This means it will not send an email for every build if two builds in a row were successful. The email plugin solves this, but it was confusing to uncover.
So this is my dilemma: I am using the excellent codeigniter-simpletest library by Eric Barnes (http://github.com/ericbarnes/codeigniter-simpletest). It's perfect for my purposes, as it adds an endpoint to the test deployment of my CodeIgniter application, from which I have a dashboard to run all my unit tests and view the results. Everything is fine so far.
But now I come to integrating it into my Phing build script (so that a phing release call on my test build will trigger the unit tests and fail the build if any of the tests fail), and I'm stuck.
Due to the integration package that makes CodeIgniter play nice with SimpleTest, the command-line runner of SimpleTest is not an option (I don't think).
Is there a way to invoke a URL from Phing and grab the resulting HTML? I could insert some hidden HTML into the built unit-test results page and check for it from the Phing task.
A new task, HttpGetTask, was recently added to Phing that should help you; it will be released in the next release (2.4.3).
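Until that release, a small script called from a Phing exec task with failonerror set can do the fetch-and-check you describe. The URL variable and the marker string below are assumptions based on your hidden-HTML idea; when no URL is given, the script falls back to a canned demo page.

```shell
#!/bin/sh
# Fetch the results page and fail the build unless a hidden marker is present.
# TEST_URL and the marker string are placeholders from the question's idea.
MARKER="ALL_TESTS_PASSED"

if [ -n "${TEST_URL:-}" ] && command -v curl >/dev/null 2>&1; then
    BODY=$(curl -fsS "$TEST_URL") || BODY=""
else
    # Standalone demo: pretend this is what the test endpoint returned.
    BODY='<html><body>42 passes <!-- ALL_TESTS_PASSED --></body></html>'
fi

case "$BODY" in
    *"$MARKER"*) echo pass > url-check.status ;;               # build continues
    *)           echo fail > url-check.status; exit 1 ;;       # build fails
esac
```

The nonzero exit is what lets Phing's failonerror abort the release when the marker is missing.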
I am looking for some sort of dynamic testing tool. Let me explain what I mean. I have a modular application, and part of my business strategy is to license each module independently. I would like to test things such as the fact that one module's tests all still pass even when another module is not present on the file system.
Another thing I would like to be able to verify during my test suite is that unrelated settings do not change behavior they are not supposed to be coupled to.
To write an ad-hoc script for each test case and inject the defect would be trivial, but it sounds like a violation of DRY. Are there any tools designed for this kind of defect injection, specifically in PHP?
So, in an ideal world, I would like to be able to write a set of "environment modifying" scripts, each of which alters the environment in some way. For each "environment modification", PHPUnit should run all my tests. I should be able to view, in some kind of report, which tests failed under which "environment extremes".
So with N "environment modifications" and M tests, the sum of passes and failures should equal N times M, similar to how GUI test tools such as Selenium run the suite in each different environment (browser).
This might work using PHPUnit and some magic inside the fixtures. But I'd look into a build tool (like Ant or Phing) which creates a checkout from your source repository, sets up the required modules and then runs the PHPUnit tests.
Phing or Ant can be called from a continuous integration system like phpUnderControl or Hudson on a regular basis, so you get regular automated feedback.
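The N x M matrix from the question can be driven by a small loop around the test runner, which a build tool would invoke once per checkout. This is only a sketch: the env.d scripts, report names and the MODULE_B_ENABLED variable are invented, and the actual phpunit call is left as a commented placeholder.

```shell
#!/bin/sh
# Run the whole suite once per "environment modification" script,
# writing one report per environment. All names here are demo placeholders.
set -e
mkdir -p env.d reports

# Demo environment scripts (a real suite would have meaningful ones):
printf 'MODULE_B_ENABLED=0\nexport MODULE_B_ENABLED\n' > env.d/no-module-b.sh
printf 'MODULE_B_ENABLED=1\nexport MODULE_B_ENABLED\n' > env.d/with-module-b.sh

for ENV_SCRIPT in env.d/*.sh; do
    NAME=$(basename "$ENV_SCRIPT" .sh)
    # Subshell so every environment starts from a clean slate.
    (
        . "./$ENV_SCRIPT"
        # A real runner would be: vendor/bin/phpunit --log-junit "reports/$NAME.xml"
        echo "suite ran with MODULE_B_ENABLED=$MODULE_B_ENABLED" > "reports/$NAME.txt"
    )
done
ls reports/        # one report per environment = the N rows of the N x M matrix
```

Collecting one JUnit XML per environment also gives Jenkins/Hudson exactly the per-environment breakdown the question asks for.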