Uploading testsuite to Subversion repository - php

I want to upload my testsuite to Subversion repository. I was wondering where in the Subversion repo the test suite should be placed. At the moment we have a root folder (containing all the source code) and a documentation folder. Should we create a testing folder within the root? We also have an automatic build system that makes a new build every minute. How can we get the tests to be run automatically in parallel?
And also, whether the tests fail or pass, how will I get to know the result? Once it's uploaded to Subversion?

First, the decision on how to structure your code has to be made within your team. But I would always suggest moving the source out of the root and into a separate directory, and the same for docs, externals, etc.
However, why do you wish to upload test results into a repository?
When using a CI server like Jenkins, TFS, etc., the results should be displayed on the server to be reviewed by all, not stored in SVN: the results are generated binary data and usually not checked in for your own project.
Additionally, why does the build system make a build every minute? Why is it not polling the source repo and building only when something changed?
The literal answer to your question would be to check in the result output of a test run, which indicates whether the test worked or not. But I would not recommend that.
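If you want automated runs without setting up a full CI server yet, a minimal sketch could look like this (assuming an SVN working copy, phpunit on the PATH and a configured mail command; all paths and addresses are hypothetical):

#!/bin/sh
# Poll the working copy; run the suite only when new revisions arrived.
cd /srv/checkout || exit 1
OLD_REV=$(svnversion .)
svn update -q
NEW_REV=$(svnversion .)
if [ "$NEW_REV" != "$OLD_REV" ]; then
    if ! phpunit --log-junit results.xml tests/; then
        mail -s "Test suite failed at r$NEW_REV" team@example.com < results.xml
    fi
fi

A CI server does essentially this for you, and also archives and displays the results.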


Jenkins guide needed for build, deploy, provision and rollback, keeping 5 releases

I am pretty new to Jenkins and have some understanding, but I need further guidance.
I have a PHP application on a Git repo, that uses Composer, has Assets, has user uploaded Media files, uses Memcache/Redis, has some Agents/Workers, and has Migration files.
So far I understood I need to create two jobs in Jenkins.
Job 1 = Build
Job 2 = Deploy
In the Build job, I set up the Git repo as the source, and I set up a post-build shell script that has one single line: composer update.
1) My first question relates to how/where the files are cloned. I understand there is a Workspace; does everything get cloned there every time, or are only new things pulled?
2) composer update seems to load the same stuff again and again, and it looks like it's not being cached across builds. I'd love to hear opinions here, but I was expecting that on the next build it would check for changes and get only the diff. Doing a full composer update takes several minutes.
In the Deploy job, I would love to set up a process that takes the most recent stable build and moves the files to a dedicated folder like releases2. Then it runs some provisioning scripts and, in the end, updates the /htdocs symlink to point to the new releases2 folder, so the webserver starts serving the website from there.
3) How can I get the latest build (in the build folder I saw only a couple of log and XML files; I couldn't locate the files from Git) and move it to a fresh destination?
4) How shall I set up the destination, so that I can keep media files between different deploys?
5) When shall I deal with the assets (like publishing to a CDN): after a successful build, or before the deploy is finished? Shall this be a pre/post hook, or a different job?
6) When shall I clear the caches (memcache, redis)?
7) How can I roll back to previous versions? And how can I set it up to keep the last 5 successful releases?
8) How can I get email alerts for failed builds and failed deploys?
9) How can operations get a list of recent commit messages by email, after a successful deploy?
I noticed Jenkins has a lot of plugins. I'm not sure if these are handled by those plugins, but feel free to recommend anything that gets these done. I also read about Phing, but I'm not sure what it is and where I should use it.
I understand there are lots of questions in this topic, but if you know the answer to a few of them, please post it as an answer.
Warning: tl;dr
Ok - you want it all. Lots of questions - long story.
Jenkins is "just" a continuous integration server.
Continuous integration basically means that you don't have to run the compilation and unit-testing steps on the developer machine, but pull this work over to a central server.
Because compilation and linking now happen on a central server, the developer has more time to develop, instead of waiting for compilation to finish. That's how this CI thing started.
Now, when looking at PHP projects, there isn't any compilation or linking process involved.
The job of a continuous integration server working on PHP projects boils down to just doing the unit testing and maybe some report generation.
You can clearly see that when looking at helper projects like Jenkins-PHP, which provides a template setup for PHP projects on Jenkins - http://jenkins-php.org/example.html
The starting point is: "Jenkins does something after you committed source".
You already have a configuration for your Git repository. It is monitored and whenever a new commit arrives, a new "build process" is triggered.
What is this "build process"?
The build process can partly be configured in the Jenkins GUI.
Partly means the focus is on the configuration of "triggers" and "notifications" and also on "report generation". Report generation means that, when certain build tools have finished their jobs, their log files are processed and turned into a better format.
E.g. when phpunit has finished its job, it's possible to take the code-coverage log, turn it into a nice HTML page and move it to the /www folder for public viewing.
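A sketch of such a report step (assuming phpunit with a coverage driver like Xdebug; the paths are just examples):

phpunit --coverage-html build/coverage tests/
cp -r build/coverage /www/coverage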
But most of the real work of this build process is described in a build configuration file. This is where build tools like "Phing", "Ant" (the big brother of Phing) and "NAnt" (Windows) come into play.
The build tool provides the foundation for scripting tasks.
This is where your automation happens!
You will have to script the automation steps yourself.
Jenkins is just a GUI on top of that, providing some buttons for displaying the build.log and reports and for re-starting a build.
Or in other words: you can't simply stick Jenkins and your PHP project together, hoping that you can click your build and deployment process together in the GUI.
We are not there yet! The tools are getting better, but it's a long way.
Let's forget about Jenkins for a while. Let's focus on the build steps.
How would you build and deploy your project when you are on the CLI only?
Do it! You might want to write down all commands and steps involved in a simple text file.
Now, turn these steps into automation steps.
Create a "build.xml" in the root folder of your project.
<?xml version="1.0" encoding="UTF-8"?>
<project name="name-of-project">
    <!-- build steps -->
</project>
Now we need some build steps.
Build tools refer to them as "target"s. A build target groups tasks.
You can execute each target on its own and you can also chain them.
<?xml version="1.0" encoding="UTF-8"?>
<project name="name-of-project" default="build">
    <target name="build">
        <!-- tasks -->
    </target>
    <target name="deploy">
        <!-- tasks -->
    </target>
</project>
Rule: keep targets small - a maximum of 5-7 CLI commands in one target.
Now let's introduce target chaining with dependencies.
Let's assume that your target "build" should run "phpunit" first.
On the CLI you would just run phpunit, then your build commands.
Inside a build config you have to wrap such calls in exec tasks.
So, you create a "phpunit" target and add it as a dependency to the target "build".
Dependencies are executed before the target specifying them.
<?xml version="1.0" encoding="UTF-8"?>
<project name="name-of-project" default="build">
    <target name="phpunit" description="Run unit tests with PHPUnit">
        <exec executable="phpunit" failonerror="true"/>
    </target>
    <target name="build" depends="phpunit">
        <!-- tasks -->
    </target>
    <target name="deploy">
        <!-- tasks -->
    </target>
</project>
A build tool like Phing provides a lot of core tasks, like chown, mkdir, delete, copy, move, exec (...) and additional tasks (for git, svn, notify). Please see the documentation of Phing http://www.phing.info/docs/master/hlhtml/index.html or Ant http://ant.apache.org/manual/
A good thing with Phing is the possibility to write ad-hoc tasks in PHP inside the build configuration file and run them. This is also possible with Ant; just build an exec task executing PHP and the script.
Ok - let's fast-forward: you re-created the complete build and deployment procedure inside this build configuration. You are now in the position to use the targets standalone. Now we switch back to Jenkins CI, or any other CI server, and configure it to run the build tool with the target tasks. Normally you would have a default target, called main or build, which chains all your targets (steps) together.
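For example, running the targets from the CLI might look like this (assuming Phing is installed and the build.xml above sits in the current directory):

phing phpunit   # unit tests only
phing build     # runs phpunit first (dependency), then the build tasks
phing deploy    # deployment tasks
phing           # no target given: runs the default target ("build")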
Now, when a new commit arrives, Jenkins starts the build process by executing the build script.
Given these pieces of information about how Jenkins interacts with a build tool, some of your questions answer themselves. You just have to create the steps to get done what you want...
Let's start the Q&A round:
Q1: Jenkins workspace folder
Workspace is where the projects live. New commits arrive there.
Under "Advanced" you select a working directory for the projects without changing the Jenkins home directory. Check the "Use custom workspace" box and set the directory that Jenkins will pull the code to and build in.
It's also possible to configure the build folders there and the amount of builds to keep.
Q2: Composer
Composer keeps a local cache - it lives in $COMPOSER_HOME/cache. The local cache is used when the same dependencies are requested again, which avoids re-downloading them. If a new dependency is introduced or a version has changed, then that thing gets fetched and is re-used on the next composer install or composer update.
Composer installs/updates are always fresh from the net or from the cache.
There is no keep-alive of the vendor folder. The dependencies are deleted and re-installed.
If it takes long, it takes long. End of story.
If it takes too long, use Composer one time, then add new build targets "zip-copy-vendor-folder" and "copy-unzip-vendor-folder". I guess you can imagine what these things do.
Now you have to introduce an if-check for the zipped vendor file: if the vendor zip file exists, you skip the composer install target and proceed with "copy-unzip-vendor-folder". Ok, you got it. This is a tweak - do it only if your dependencies are pretty stable and far from changing often (see the sketch below).
In general, you will need a build target "get-dependencies-with-composer", which executes composer install. The cache will be used automatically by Composer.
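A shell sketch of that tweak (the vendor.zip location is just an example):

if [ -f ../vendor.zip ]; then
    unzip -oq ../vendor.zip             # copy-unzip-vendor-folder
else
    composer install --no-interaction   # get-dependencies-with-composer
    zip -rq ../vendor.zip vendor        # zip-copy-vendor-folder
fi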
Q3: get the latest build and move to a fresh destination
The latest build is in the build folder - or, if you defined a step to move the file, it's already in your desired folder.
Q4: how to get media files in
Just add a build target for copying the media folders into the project.
Q5: add a build target for asset handling
You already know the position: it's "after build". That means it's a deployment step, right? Add a new target to upload your folder, maybe via FTP, to your CDN.
Q6: when shall I clear the caches (memcache, redis)
I suggest going with a simple "deploy - flush cache - re-warm caches" strategy.
Hot-swapping of PHP applications is complicated. You have to have a PHP class supporting the change of underlying components while the system is running two versions: the old version fades out of the cache, the new version fades in.
Please ask this question standalone!
This is not as easy as one would think.
It's complicated, and also one of the beloved topics of Rasmus Lerdorf.
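The simple flush-and-rewarm strategy, sketched in shell (default ports assumed; the nc flags vary between netcat variants; the warm-up URL list is hypothetical):

redis-cli FLUSHALL                         # clear Redis
echo flush_all | nc -q 1 localhost 11211   # clear memcached
while read -r url; do
    curl -s -o /dev/null "$url"            # re-warm the hot pages
done < warmup-urls.txt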
Q7.1: How can I rollback to previous versions?
By running the deploy target/tasks in the folder of the previous version.
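With a symlink-based release layout like the one described in the question, a rollback sketch could be (paths hypothetical; assumes numbered directories under /var/www/releases and htdocs as a symlink):

PREVIOUS=$(ls -1d /var/www/releases/* | tail -n 2 | head -n 1)
ln -sfn "$PREVIOUS" /var/www/htdocs   # point the webserver back at the previous release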
Q7.2: And how can I set it up to keep the last 5 successful releases?
Jenkins has a setting for "how many builds to keep" in the build folder.
Set it to 5.
Q8: How can I get email alerts for failed builds and failed deploys?
Automatically. Email notifications are on by default. If I'm wrong, look into the "email" notifiers.
Q9: How can operations get a list of recent commit messages by email, after a successful deploy?
Add a build target "send-git-log-via-email-to-operations".
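A sketch of what that target could execute (the last-deploy tag and the address are hypothetical):

git log --oneline last-deploy..HEAD | mail -s "Deployed changes" ops@example.com
git tag -f last-deploy HEAD   # move the marker for the next deploy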
I feel like I wrote a book today...
I don't have answers to all of them, but the few that I have are:
1) My first question relates to how/where the files are cloned. I understand there is a Workspace; does everything get cloned there every time, or are only new things pulled?
You're correct in your understanding that files are cloned into the workspace. However, if you want, you can set your own custom workspace by setting it up in Advanced Project Options (enable "Use custom workspace"), which is just above the 'Source Code Management' section.
2) composer update seems to load the same stuff again and again, and it looks like it's not being cached across builds. I'd love to hear opinions here, but I was expecting that on the next build it would check for changes and get only the diff. Doing a full composer update takes several minutes.
I have no idea about Composer, but if this thing is also getting checked out from Git, then a shallow clone might be what you're looking for. This is present under: Source Code Management section > Git > Additional Behaviours > Advanced clone behaviours.
In the Deploy job, I would love to set up a process that takes the most recent stable build and moves the files to a dedicated folder like releases2. Then it runs some provisioning scripts and, in the end, updates the /htdocs symlink to point to the new releases2 folder, so the webserver starts serving the website from there.
I'm not sure why you need a separate job for deployment. All that you've stated above can be accomplished in the same job, I guess. In the Build section (just above the Post-build Actions section), you can specify your script (Windows batch/bash/Perl/...) that will perform all actions required on the stable build that was just created.
3) How can I get the latest build (in the build folder I saw only a couple of log and XML files; I couldn't locate the files from Git) and move it to a fresh destination?
From your description, I'm almost sure that you don't have a master-slave setup for Jenkins. In that case, the easiest way to find out the location of the files fetched from Git is to check the 'Console Output' of the latest build (or any build, for that matter). In the first two or three lines of the console output, you will see the path to your build workspace. For example, in my case, it's something like:
Started by timer
Building remotely on Slave1_new in workspace /home/ec2-user/slave/workspace/AutoBranchMerge
4) How shall I set up the destination, so that I can keep media files between different deploys?
I'm not really sure what you're looking for. It looks like something that has to be handled by your script. Please elaborate.
5) When shall I deal with the assets (like publishing to a CDN): after a successful build, or before the deploy is finished? Shall this be a pre/post hook, or a different job?
If you mean artifacts, then you should check 'Post-build Actions'. It has several options such as 'Archive the artifacts', '[ArtifactDeployer] - Deploy artifacts from workspace to remote repositories', etc. The number of such options you see in this section depends on the number of plugins you've installed.
One useful artifact-related plugin is https://wiki.jenkins-ci.org/display/JENKINS/ArtifactDeployer+Plugin
6) When shall I clear the caches (memcache, redis)?
Sorry, no idea about this.
7) How can I roll back to previous versions? And how can I set it up to keep the last 5 successful releases?
Previous versions of what? The build is always overwritten in the workspace; only the logs of past builds will be there to view. You will have to explicitly put a mechanism (script) in place to take backups of builds. Also check the 'Discard Old Builds' section at the top of the project configuration page. There are a few plugins available too which you can install; these will help you configure which builds to keep, which to delete, etc.
8) How can I get email alerts for failed builds and failed deploys?
This option is available in Post-build Actions. There is 'E-mail Notification', which provides basic functionality. If you need better features, I suggest you install the 'Email-ext' plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Email-ext+plugin
You can search through hundreds of plugins by going to Jenkins > Manage Jenkins > Manage Plugins > Available tab.
9) How can operations get a list of recent commit messages by email, after a successful deploy?
This functionality might not be directly available through a plugin. Some scripting effort will be required, I guess. This might be of some help to you: How to include git changelog in Jenkins emails?

How to deploy only modified/new files GIT+Jenkins+PHP?

I am trying to use the Jenkins CI server for my PHP application. As we are using our own Git repository, I am using Jenkins's Git plugin to take files from the central repo.
Currently, when my Jenkins job runs, it takes the files from the Git repo and makes a build, but that build contains all the files.
In my current scenario I only want the modified and new files in that build, so that I can deploy only them, not the whole bunch of files.
Is this possible somehow, or is it fundamentally wrong in build environments?
You need two things: the commit of the previous build, and then the changed files between that commit and the current HEAD.
For the first: there might be ways to find the commit from Jenkins via the REST API (as it does display it on the build page). But I think it will be easier if you put the git commit into a file and archive it as a build artifact. Then you can use the Copy Build Artifacts plugin to get the file from the previous build.
For the second: you can use git diff --name-status <previous commit> HEAD, as in the script below.
To tie all of this together:
Set up the build to Copy artifacts from the same job, last successful build.
Assuming the file where you store the commit id is called "commit_id", set a build step to run something like:
git diff --name-status `cat commit_id` HEAD |
while read status file; do
    case $status in
        D)   echo "$file was deleted" ;;           # handle (or ignore) the deletion here
        A|M) echo "$file was added or modified" ;; # deploy the file here
    esac
done
# record this commit to be archived (use git rev-parse HEAD if the repo has no tags for git describe)
git describe > commit_id
In the post build actions, configure the job to archive the file commit_id.
There is nothing "fundamentally wrong" with that; however, at least your release builds should be "clean full" builds (not incremental).
As for "how" to do that... you have to do it yourself. I haven't seen any plugins like this. Why? Because in the majority of compiled builds there is no 1-to-1 relationship between a source file and a corresponding compiled file. How would the system know which new files produced which new artifacts? (In PHP it's clear; in other languages, not.)
You've got to write your own build script that would:
Parse the console log for SCM changes, or query the SCM directly.
Build
Archive/Package/Zip only the files that were changed, based on your parsing in step 1.
Deploy that subset of files.

The point of Yii2 environments folder

I am trying to work out what the point of the environments folder is.
Originally I had the idea that you could point the webserver to the different dev and prod folders in the environments folder, but after reading up a bit I realise this is not the case.
In Yii 1 you would solve this by just having multiple index.php files, i.e.:
index.php
index-local.php
So the question is: what benefit does this new environments structure actually give me over the old way?
I've found environments very useful in allowing me to keep a common code base for multiple client projects (based on Yii App Advanced) and setting up a different environment for each specific client, keeping their custom code private and separate.
To do this I store the environments folder in a separate git repo from the rest of the code and pull down the relevant folder on a client/project basis.
This lets me use a common code base for all projects and add/override any file for a specific client or project, whilst still allowing separate dev/prod config settings. If the client uses other developers too, they are also catered for. In this way, only common code I choose will be shared amongst clients, and custom code will be kept private.
I've also moved the composer.json file into the environments folder so I can pull in different extensions per client/project keeping those private too.
The init command can be a very powerful tool, and you don't have to limit yourself to the template provided by the core developers.
If you don't need environments, then don't use them, but I assure you some people will find it very useful.
Yii2 documentation is a work in progress, but you should read this:
https://github.com/yiisoft/yii2/blob/master/docs/guide/apps-advanced.md#configuration-and-environments
You need to use the yii init command to switch between these environments.
EDIT:
This new environments feature is more than just using different config files. You can use a different folder structure, a different entry script, etc.
Personally I won't use this feature, I don't need it (I will use a different entry script, as with Yii 1), but I think it is not useless.
I think you didn't get the real purpose of environments introduced in Yii2.
I'll try to explain the main purpose of adding environments to Yii from the developers' point of view with an example, and I hope you will really appreciate its usefulness.
Let's suppose for a moment that you are a team of developers (e.g. 5-7 people) working on a mid-to-large project implemented in Yii. To work effectively on that project, your team decides to use some VCS (e.g. Git) and keep all the files of the project in a repository in the cloud for the whole team. That's the de facto standard when working on mid-to-large projects in teams, and nobody will dispute that it's the only comfortable and easy way.
Ok, now let's suppose you use Yii 1.x, or Yii2 with the approach of different entry scripts, to differentiate between local (development) and production environments, to connect to the db or set some other environment-specific configs. Everything is ok and working. But suppose your team members implement something new on the project, and you check out the repository to work on the updated version, and you suddenly find out that your local config file (in this case the entry script with the config) has been overwritten by another team member's file when they pushed their changes to the repository (because each of you is using a local machine db with a different database name, OS or config; or because your team uses one local development server db, but you are on vacation and can't use anything except your local machine).
So, generally, Yii2 environments add more flexibility for using different environments, each with its own specific configuration, while also sharing general (common) configs, when working in teams on mid-to-large projects. Hence why the example in the guide is given on the advanced app project.
Surely you can overcome everything stated above with other solutions, or with .gitignore, which is the default way to work around the problem that Yii2's environments solve. But:
Why bother if everything is already done?
and
It was just one little example of the usefulness of Yii2 environments. More depends on the project and your imagination.
Overall, Yii2 is a great product. Not only does it add many new features to an already great framework, it is also more robust and flexible than Yii 1.x (despite the fact that Yii 1.x was already very robust).
As for Laravel or any other PHP framework, it really depends... Everyone will find his/her own favorite.
For those who are tired of copying files around, I created a useful script that you can run in the background to keep the files in sync in your dev environment:
File sync-env-files.sh
#!/bin/bash
ENVIRONMENT_DIR="/var/www/example.com/environments/dev/"
DIR="/var/www/example.com/"

while true; do
    for envFile in $(find "$ENVIRONMENT_DIR" -type f); do
        # map the environment file to its counterpart in the project dir
        file=${envFile/$ENVIRONMENT_DIR/$DIR}
        # copy the project file back over the environment file if it is newer
        if [ "$(stat -c %Y "$file")" -gt "$(stat -c %Y "$envFile")" ]; then
            /bin/cp -f "$file" "$envFile"
        fi
    done
    sleep 2
done
Then run the script in the background, or add it to cron with flock:
nohup server/sync-env-files.sh >/dev/null 2>&1 &
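The cron alternative with flock makes sure only one copy of the script ever runs (the lock file path is arbitrary):

* * * * * flock -n /tmp/sync-env.lock /var/www/example.com/server/sync-env-files.sh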
I would like to mention, in addition to @AngelCoding, since this question still gets seen, that I use the environments folder a lot now and definitely see the point of it.
The very first thing I do in any open source project is create one repository for the code base on GitHub and then another, private, one on Bitbucket for the configuration, in other words the environments folder.
Having this folder has made it a lot easier for me to separate my configuration into a private repository.
So the environments folder has a lot of uses and really helps to separate configuration for easier usage even if it does not seem like it initially.

How do you manage the unit test files in projects? Do you add them in git?

How do you manage your PHPUnit files in your projects?
Do you add them to your git repository or do you ignore them?
Do you use the @assert tag in your PHPDoc blocks?
Setup
I'm not using PHP currently, but I'm working with Python unit testing and Sphinx documentation in git. We add our tests to git and even have certain requirements on tests passing for pushes to the remote devel and master branches (master stricter than devel). This assures a bit of code quality (test coverage should also be evaluated, but that's not implemented yet :)).
We have the test files in a separate directory next to the top-level source directory, in the subdirectories where they belong, prefixed with test_, so that the unit testing framework finds them automagically.
For documentation it's similar: we just put the Sphinx doc files into their own subdirectory (docs), which is in our case an independent git submodule; this might be changed in the future.
Rationale
We want to be able to track changes in the tests, as changes should be rare. Frequent changes indicate immature code.
Other team members need access to the tests, otherwise they're useless. If they change code in some places, they must be able to verify it doesn't break anything.
Documentation belongs to the code. In the case of Python, the code directly contains the documentation. So we have to keep both together, as the docs are generated from the code.
Having the tests and the docs in the repository allows for automated testing and doc building on the remote server, which gives us instantly updated documentation and testing feedback. The implementation of "code quality" restrictions based on test results also works that way (it's actually more a reminder for people to run tests, as code quality cannot be checked with tests without looking at test coverage too). Refs are rejected by the git server if tests do not pass.
We, for example, require that on master all tests have to pass or be skipped (sadly we need skipped, as some tests require OpenGL, which is not available on headless machines), while on devel it's okay if tests just "behave as expected" (i.e. pass, skip or expected failure; no unexpected success, error or failure).
Yes to keeping them in git. The other conventions I picked up by looking at projects, including phpunit itself. (A look at the doctrine2 example shows it seems to follow the same convention.)
I keep tests in a top-level tests directory. Under that I have meaningfully named subdirectories, usually following the main project directory structure. I have a functional subdirectory for tests that test multiple components together (where applicable).
I create a phpunit.xml.dist telling it where to find the tests (which also immediately tells anyone looking at the source code that we use phpunit, and by looking at the XML file they can understand the convention too).
I don't use @assert or the skeleton generator. It feels like a toy feature: you do some typing in one place (your source file) to save some typing in another place (your unit test file). But then you'll expand on the tests in the unit test files (see my next paragraph), maybe even deleting some of the original asserts, and now the @assert entries in the original source file are out of date and misleading to anyone looking at just that code.
You have also lost a lot of the power that you end up needing for real-world testing of real-world classes (simplistic BankAccount example, I'm looking at you): no setUp()/tearDown(), no instance variables, no support for all the other built-in assert functions, let alone custom ones, no @depends and no @dataProvider.
One more reason against @assert, and for maintaining a separate tests directory tree: I like different people to write the tests and the actual code, where possible. When tests fail, it sometimes points to a misunderstanding of the original project specs by either your coder or your tester. When code and tests live close together, it is tempting to change them at the same time. Especially late on a Friday afternoon when you have a date.
We store our tests right with the code files, so developers see the tests to execute and ensure they change the tests as required. We simply add an extension of .test to the file name. This way, we can automatically include the original file in each test file, which may then be created from a template. When we release the code, the build process deletes the .test files from all directories.
/application/src/
    Foo.php
    Foo.php.test
/application/src/CLASS/
    FOO_BAR.class
    FOO_BAR.class.test

Each .test file includes the file under test with:

require_once(substr(__FILE__, 0, -5)); // strip the '.test' extension

Syncing changes in an SVN repository with a folder on my server

I work for a consumer software company. Our core software runs on the desktop (as opposed to in a browser) and uses a lot of XML and audio files. We're frequently committing new versions of the XML files (and sometimes audio files) to our SVN repository.
However, we have an additional online-only application that uses these same XML and audio files. Assume these files are stored in http://example.com/data/xml and http://example.com/data/audio. Is there a way to automatically check on a daily basis for new commits to the SVN repository, and then update the /data/xml and /data/audio folders with the new files found in SVN?
I'm using PHP on Apache.
You might look here. That describes post-commit hooks. That means you can write a script which is executed on your server after every commit to the repository, e.g. a script which updates these two directories.
If you just want to update them daily, you might look at daily cron jobs (see e.g. http://en.wikipedia.org/wiki/Cron).
Keep it simple. Assuming the two folders you want to update are working copies, and are not locally edited where you want them updated, just run a daily cron job that issues an svn update on each folder.
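For example, a crontab entry like this (assuming the two folders are SVN working copies; the 3 a.m. schedule is arbitrary):

# m h dom mon dow  command
0 3 * * *  svn update -q /var/www/data/xml /var/www/data/audio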
