Continuous Deployment with Jenkins & PHP - php

I'm sure there are answers for this all over stackoverflow but I was unable to find anything specific.
I have a PHP project which I am revisiting. It's running on a RHEL5 box, and I have SVN on the same box.
Out of curiosity I recently added Jenkins to the machine and set it up using the Jenkins PHP template at...
http://jenkins-php.org/
There was a bit of playing around with the setup but I more or less have this all running and doing Continuous Inspection builds when something is committed to SVN.
What I want to do now is have Jenkins copy my updated files across to the server when the build completes.
I am running a simple LAMP setup and would ideally like to copy across only the files that have actually changed.
Should I just use Ant and its sync task? Currently the files reside on the same box as the web server, but this may change, in which case I will need to sync the files across to multiple remote boxes.
Thanks

Check out these two things: the Copy Artifact Plugin and the job's environment variables.
Then set up two jobs: one on the source machine and one on the destination server (make it a slave). Use the plugin to copy the required artifacts, referencing the environment variables.

Is your project (not Jenkins, but the one with the LAMP setup) under SVN? If so, I'd recommend creating a standalone job in Jenkins that just does an svn up, and tying it to your main job so that when the main build succeeds, Jenkins automatically runs the job that updates your project.
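If it helps, here is a minimal sketch of what that chained job's shell build step might look like, assuming the web root is an SVN working copy at /var/www/html (both the path and that assumption are mine, not from the question):
# Hypothetical "Execute shell" build step for the downstream Jenkins job.
# Assumes /var/www/html is an SVN working copy of the project's trunk.
cd /var/www/html
svn update --non-interactive
The main job would then trigger this one only when its own build is stable, e.g. via the "Build other projects" post-build action.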

For copying to other servers, take a look at the Publish Over plugins.
It's very easy to set up servers and rules. The downside is that you can't configure it to copy only the files changed in the current build, which means the entire project is uploaded on every build.
If your project is too big, another solution is to use rsync as a post-build action.
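For example, a post-build "Execute shell" step along these lines would transfer only changed files; the destination host and path here are just placeholders:
# Hypothetical post-build shell step on the Jenkins box.
# $WORKSPACE is provided by Jenkins; the host and target path are assumptions.
rsync -avz --delete --exclude='.svn' "$WORKSPACE"/ deploy@webserver:/var/www/html/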

Related

Creating docker images from puppet for jenkins

I'm currently trying to set up continuous integration testing for the company I work for. The system is a LAMP-based web application, and I'm trying to get Jenkins set up to do the continuous integration testing.
The current stumbling block is finding a Docker image to run Jenkins with PHP; most of the images I've found are out of date (they haven't been updated in a year) and don't work when you try to run through the Jenkins setup.
We currently have a Puppet configuration for our production servers, and my current line of thinking is to use puppet image build to create a Docker image based on our production setup, with the added bonus of like-for-like testing.
The problem here, though, is that we don't want to install Jenkins on all our production servers. Is there a way to add Jenkins to an already-created Docker image (preferably via the CLI)? And am I tackling this the right way, or am I putting the cart before the horse, i.e. should I be looking at trying to put PHP on a working Jenkins image?
Where I work, we have a... few lines of Puppet that we maintain, and we've been discussing internally leveraging the existing Puppet in Docker, but how? I know nothing of the project you've referenced here, but it's a bit of a bad smell that the project has sat dormant for nearly a year without a commit. We're taking a hard look at HashiCorp's Packer; for your purposes, the free and open-source version should do just fine!
But, to the second half of your question: the cart is probably before the horse right now. I'm confused as to what you're trying to accomplish by installing Jenkins inside the Docker image to conduct your testing. I think what you're looking for is a longer-running Jenkins server (with or without slaves) to run the setup, unit tests, integration tests, and so on. If you have the resources, GitLab CE with GitLab CI is a pretty simple way to get started on developing an on-premise CI/CD workflow.

Deploy thousands of files to FTP during web app deployment process

Recently I investigated the continuous integration development process, so I installed a TeamCity server, set up the build, and tried to make it automatically deploy the build artifacts (my web app) to the web hosting server through FTP. However, I failed at the last step (deployment) because deploying thousands of PHP files takes a very long time. So I'm wondering if there is a way to do it more quickly, perhaps using zip archives or something else.
So my question is: is there a common way to solve this kind of problem?
Use git for deployment. It transfers only the changes.
Example using hooks: gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa – Łukasz Gawrylu
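The usual shape of that approach (not necessarily identical to the gist above) is a bare repository on the hosting server with a post-receive hook that checks the pushed code out into the web root; the paths and branch name below are assumptions:
#!/bin/sh
# Sketch of hooks/post-receive in a bare repo on the web host (make it executable).
GIT_WORK_TREE=/var/www/html git checkout -f master
After that, a git push from the build machine only sends the changed objects over the wire, so deployments stay fast even with thousands of files.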

PHP + MySQL Deployment

I'm just trying to find an easier way to deploy a site I'm working on. I'm working alone with a test and a production server, and right now deployment means copying a subset of the files and database data onto my computer and uploading it to the prod site. I'm sure there's a simple synchronization tool out there, but so far I've had no luck finding anything.
What I'd really like is an application I can run locally (on Windows), or something I could install on my server, that gives me one-click deployment. Any suggestions?
Thanks!
godwin
Edit
I have decided for now to go with GoodSync and Toad. Thanks for the suggestions.
man scp
SCP(1) BSD General Commands Manual SCP(1)
NAME
scp - secure copy (remote file copy program)
SYNOPSIS
scp [-1246BCpqrv] [-c cipher] [-F ssh_config] [-i identity_file] [-l limit] [-o ssh_option] [-P port] [-S program] [[user@]host1:]file1 [...] [[user@]host2:]file2
DESCRIPTION
scp copies files between hosts on a network. It uses ssh(1) for data transfer, and uses the same authentication and provides the same
security as ssh(1). Unlike rcp(1), scp will ask for passwords or passphrases if they are needed for authentication.
Any file name may contain a host and user specification to indicate that the file is to be copied to/from that host. Copies between two
remote hosts are permitted.
When copying a source file to a target file which already exists, scp will replace the contents of the target file (keeping the inode).
If the target file does not yet exist, an empty file with the target file name is created, then filled with the source file contents. No
attempt is made at "near-atomic" transfer using temporary files.
The options are as follows:
-1 Forces scp to use protocol 1.
-2 Forces scp to use protocol 2.
...
I use GoodSync http://www.goodsync.com/ for this sort of thing. It's really good. It runs on Windows and can sync between any combination of local files, (S)FTP, Windows and Linux network shares, etc.
Then create a scheduled task/cron job to run an export of the database into the synchronised folder, and have one at the other end do an import. Obviously this process is one-way.
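The export side could be as simple as cron entries like the following; the database name, credentials and folder paths are made up for illustration (mysqldump/mysql assumed, since it's a LAMP setup):
# Hypothetical crontab entry: nightly export into the folder GoodSync watches.
0 2 * * * mysqldump -u deploy -pSECRET myapp_db > /home/deploy/sync/myapp_db.sql
# Matching import at the other end, after the sync has run.
0 4 * * * mysql -u deploy -pSECRET myapp_db < /home/deploy/sync/myapp_db.sql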
http://www.phing.info/docs/guide/stable/
Phing is an automated build system made for PHP. It works with Git, SVN, PHPUnit, etc.
You basically set up XML files that give Phing instructions on what to do. It lets you run test suites as part of the build, build multiple variants at a time, copy files as well as the DB, and has a bunch of other useful features.
Also, it's open source and platform independent.
What are you using for source control? Some tools like Git and SVN have ready-made methods for this sort of thing. See here for a quick Git solution.
I would second the advice about Git/SVN, but would put in a strong plug for Git via GitHub. Use GitHub as your "central" Git repository. Your local Git repository will push to GitHub, and your production server will pull from GitHub.
There is some overhead to learning Git/GitHub, but really, in the situation you've described (a single engineer and two servers), Git isn't any more complicated than SVN (or CVS or anything else).
We use an FTP Synchronizer, which seems to work pretty well. I don't know offhand of any good free ones.
Example: http://www.ftpsynchronizer.com/
It depends on what type of server you are running, but you could run SVN (Subversion). There are plugins for Eclipse, Aptana, and Zend Studio if you use one of those to develop.
Essentially you could have a development repository that sits on the server. You would pull your code down to your local environment and commit it back after changes. Then you can set up another repository for your live/production data that's linked back to your development repository.
When you want to update the live site, you just update it, so if any trouble happens you can roll back that code without having to roll back your development code. Once you get good at all that, you can start branching and tagging your projects.
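As a rough sketch of that workflow (the repository URL, paths and revision number are invented for the example):
# Local development copy:
svn checkout svn://server/repos/myproject/trunk ~/work/myproject
# ...edit, test, then commit back:
svn commit -m "Describe the change" ~/work/myproject
# On the production box, the web root is its own working copy:
svn update /var/www/html           # deploy the latest committed code
svn update -r 1234 /var/www/html   # or roll back to a known-good revision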
I personally use both SVN and Git, but I prefer Git because it works so much better. Though if you are using Windows, the command-line tools just aren't the same as on Linux.

2 cloud servers, one dev, one prod; what's a good deployment process?

Currently I'm using a LAMP stack for my web app. My dev and prod environments are on the same cloud instance. Now I am getting a new instance and would like to move the dev/test environment to the new instance, separating it from the prod environment.
It used to be a simple Phing script that would do an SVN export into the prod directory (pointed to by my vhost.conf). How do I set up a good build process now that the environments are separated?
I'm thinking of transferring the SVN repository to the dev server and then doing an ssh+svn push (is this possible with Phing?).
What's the best/common practice for this type of setup?
More Info:
I'm currently using CodeIgniter for MVC framework, Phing for automated builds for localhost deployment. The web app is also supported by a few CRON scripts written in Java.
Update:
Ended up using Phing + Jenkins. Working well so far!
We use Phing for deployments similar to what you have described. We also use the Symfony framework for our projects (which is not that important here, but Symfony supports the concept of different environments, so it's a plus).
However, we still need to produce different configuration files for the database, front controllers, etc.
So we ended up with a folder of build.properties files that define the configuration for different environments (and, in our case, also for the different clients we ship our product to). This folder is linked into the file structure using svn externals (again, not necessary).
The Phing build.xml file then accepts a property file as a parameter on the command line, takes the values from it and produces all the necessary configuration files, controllers and other environment-specific files.
We store the configuration in template files and then use the copy/filter feature in Phing to replace the placeholders in the templates with the specific values.
The whole task of configuring the given environment can then be as simple as something like this:
phing configure-environment -DpropertyFile=./build_properties/build.properties.prod
In your build file you check whether the propertyFile property that specifies the properties file is defined, and load it using <property file="${propertyFile}" override="true" />. Then you just do whatever you need with the values.
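Outside of Phing, the same template-plus-properties idea looks roughly like this in plain shell, just to illustrate the mechanism (the file names and placeholder style are made up, and it assumes simple KEY=value lines in the properties file):
# Illustration only; Phing's copy task with filters does this for us.
. ./build_properties/build.properties.prod    # defines e.g. DB_HOST and DB_NAME
sed -e "s/@DB_HOST@/${DB_HOST}/g" \
    -e "s/@DB_NAME@/${DB_NAME}/g" \
    config/database.ini.template > config/database.ini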
You can still use your svn checkout/update and put all the resulting configuration files into svn:ignore (they will be generated by Phing). We actually use additional steps in Phing which, in the end, produce a Linux shell self-deploy installation package. This is built automatically in Jenkins. We then send the package to our clients, or the support team can grab the package from Jenkins and do the whole deployment just by executing it (we still prefer manual deployments to production servers), or Jenkins can deploy it automatically (for example to test servers).
I'll be happy to write more info if needed.
I recommend using Capistrano (it looks like they haven't updated the docs since they moved the site) and railsless-deploy for doing deployment. Eventually you are probably going to need to add more app boxes and run other tasks as part of your deployment, so choosing a framework that will support this can save you a lot of time in the future. I have used Capistrano for two PHP deployments (one small and one large) and although it's not perfect, it works well. It also handles all of the code checkout/update, moving symlinks into place, and rolling back if something goes wrong.
Once you have capistrano configured, all you have to do is something like:
cap dev deploy
cap prod deploy
Another option that I have explored for this is Fabric. Although I haven't used it, if I had to deploy a complex app again I would consider it. The interface is simple and straightforward.
A third option you might take a look at, though it's still in the early stages of development, is gantry (pardon the self-promotion). This is something I have been working on out of frustration with using Capistrano to deploy a PHP application in an environment with a lot of moving pieces. Capistrano is great and works well for non-PHP application deployments, but you still have to do some poking around in the code to understand what is happening and tweak it to suit your needs. This is also why I suggest giving Fabric a good look.
I use a similar configuration now: LAMP + SVN + CodeIgniter + prd and dev servers.
I run the SVN repository on dev and check it out into the root folder of the dev domain, then use a post-commit hook to update that root folder every time any developer commits.
When we are happy and have fully tested the code, I ssh into the prd server and rsync the dev root to the prd root.
Here's my solution for the different configs: outside the root folder I have a config.ini file, which I parse in my CodeIgniter constants.php script. This means that the prd and dev servers can have separate settings without those settings ever being in the repository.
If you want help with the post-commit, rsync or ini code, let me know.
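For reference, the first two pieces look roughly like this (host names and paths are placeholders, not the real setup):
# 1) Body of hooks/post-commit in the SVN repository on the dev box:
/usr/bin/svn update /var/www/dev.example.com/public_html
# 2) Once tested, pushed from the dev box to production:
rsync -avz --delete --exclude='.svn' /var/www/dev.example.com/public_html/ \
    user@prod.example.com:/var/www/example.com/public_html/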

How do you manage your build process [using Phing]?

I'm trying to use Phing to automate:
running tests
running DB migrations on each developer machine [using dbdeploy]
deployment to production when needed
I think it makes sense to add a build folder to my project, put all my build configuration files and DB deltas in that folder, and commit all of that to the SVN repository, so every developer gets the updated build files when he checks out from SVN and can run the build to update his DB with the new changes.
on the production server:
I was planning to add another build file there to get the latest tagged version from SVN and perform CSS & JS compression.
I was also planning to implement continuous integration using phpUnderControl, so I can keep track of the result of each build and get notified whenever the build fails.
So, does this all make sense, or do you have other, better suggestions?
What you say makes sense: it's pretty close to what I often use (sometimes with Ant, sometimes with Phing, and sometimes with shell scripts).
In the build directory, I would have something like this:
build/
testing/
development/
staging/
production/
common/
There is one build.xml file in each sub-directory, all of them including another build.xml file located in the common directory; the idea is to put as much shared code as possible in the "common" build.xml file, and to have per-environment files that contain as little XML as possible.
This can be done with the import task that exists in the latest version of Phing (I'm not sure it's in the stable release; I'm using an SVN checkout of Phing to get it for the project I'm currently working on).
One thing, though: you say you want to deploy to production from the production server; I would rather do the following instead (a rough shell sketch of these steps comes after the lists):
on a "development" server:
export from SVN
compress JS/CSS and all that
create a tar.gz archive
upload that archive to the production server
on the "production" server:
un-compress that uploaded archive
change a couple of symlinks to use the new version of the sources (see the answer I gave here for more information)
update what has to be done in the DB
The idea being to:
do as few things as possible on the production server
just in case something goes wrong
and in case, one day, your production server doesn't have access to the SVN server
to have a physical archive that can be deployed on several production servers
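Here is that rough shell sketch, with invented URLs, paths and version numbers, and the JS/CSS compression step left as a placeholder:
# On the development/build server:
svn export http://svn.example.com/myproject/tags/1.2.3 myproject-1.2.3
# ...compress JS/CSS in the exported tree here...
tar czf myproject-1.2.3.tar.gz myproject-1.2.3
scp myproject-1.2.3.tar.gz deploy@prod.example.com:/var/www/releases/
# On the production server:
cd /var/www/releases && tar xzf myproject-1.2.3.tar.gz
ln -sfn /var/www/releases/myproject-1.2.3 /var/www/current   # switch the symlink
# ...then apply whatever DB updates this release needs...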
Oh, and, as a sidenote : you have to write some kind of documentation of "how to deploy to production", step by step !
This will be really useful the day you are on holiday and someone else has to deploy to production because of an urgent bug-fix ;-)
