We're a web development shop using PHP or .NET. Here's the proposed path for development:
Developer -> SVN Server -> Jenkins Server -> Dev Server
A developer commits or creates a tag to the SVN Server
Either a commit or tag triggers a job on Jenkins asynchronously
The job on Jenkins will run an svn update (or clear the destination space, then svn export) on the Dev Server, bringing the code in from the SVN Server
For .NET, after having copied the code, I need to compile it using aspnet_compiler.exe (see the sketch below)
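For reference, that build step could boil down to a command sequence along these lines (a minimal sketch; the repository URL, paths, and .NET framework version are placeholder assumptions):

svn export --force https://svn.example.com/project/trunk C:\builds\project
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_compiler.exe -v / -p C:\builds\project C:\builds\project-precompiled

Here -p points at the physical source directory and the last argument is the target directory for the precompiled site.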
My question is twofold: is this a proper use of Jenkins? And, if I were to compile .NET using aspnet_compiler.exe, would installing a slave on the Microsoft OS be the way to go (to enable this)?
To further the answer:
Check the "Restrict where this project can be run" option and select the slave.
This sounds like a fairly standard build/deploy workflow and therefore a good use of Jenkins, but I'd suggest doing step 4 as part of your Jenkins build and then deploying the compiled artifacts to your dev server, also using Jenkins. This way Jenkins can report compilation problems as build failures, and you can run tests against the compiled code as part of the Jenkins job, per @thescientist's comment.
If your Jenkins master is running on an operating system other than Windows, you can use a Windows slave to do the compilation. You could even have the slave process running on your dev server to make deployment easier; I wouldn't recommend this for a production setup, but it's OK for dev.
Here are some links which may help you:
Running Jenkins on Windows
Building ASP.NET code with Jenkins
Jenkins and PHP
I run a custom PHP site on a Managed VPS (LAMP stack) and am the solo developer, but want to start using GIT so freelancers can contribute. Currently I don't use GIT.
GIT
For GIT, I wanted to use Visual Studio Online (VSO) since I've used it before but am open to suggestions if it's better for the suggested deployment process.
Deployment
I kept a "Dev" folder and a "Live" folder on the web server and simply did all of my dev in the Dev folder, tested there, then ran rsync to push it to the Live folder. I couldn't easily run it locally since it has things like Linux symlinks, and I work on a Windows computer.
The Goal
I want to start adding GIT to this process, integrate it into a decent build process, and still use a Windows IDE for development. Or maybe I should install a Linux VM on my Windows machine, so I can run the site locally from the latest version pulled from GIT?
I need a setup that would be easy for other developers to join as I find freelancers to help out.
Suggestions?
Here's what I do for Git deployment:
For deployment, I follow a very-simple git-flow technique (http://nvie.com/posts/a-successful-git-branching-model/#the-main-branches) with 2 branches:
master for prod
dev for new features
You always develop new features on the dev branch, so dev is always ahead of master in terms of functionality.
When you want to deploy, you SSH into your server, git pull the dev branch, and if everything works fine, you merge it into master.
You can always switch the prod server between the prod and dev versions with git checkout master and git checkout dev. No need for two folders; git handles this!
If you have a bug in the prod environment, you can switch to the master branch on your local computer and fix the bug, then upload the fix via SSH. A sketch of the deploy flow follows below.
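A minimal sketch of that pull-and-promote deploy, assuming the repository lives at /var/www/example on the server:

# on the production server
cd /var/www/example
git checkout dev && git pull origin dev
# test the site on the dev branch, then promote it to prod
git checkout master && git merge dev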
Multi-collaborator development (developers on Windows/Linux/Mac, but one production server)
I personally use Docker: it gives you a single image which you can customize to match your prod environment (Linux version, Apache version, PHP version and MySQL version). You create one Docker image, and then every collaborator can download this image and run it on their computer.
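For example, running a stock PHP+Apache container against your working copy can be as simple as this (a sketch; the image tag and port are assumptions, so pick whatever matches your prod versions):

# serve the current working copy at http://localhost:8080
docker run --rm -p 8080:80 -v "$PWD":/var/www/html php:7.4-apache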
I am in charge of launching web projects, and it currently takes a little too long from client sign-off to final launch. Everything is on a server which I have root access to, but it runs Plesk so that the boss can set up VirtualHosts, which means there are many sites running on it.
Each project has its own git repository so currently I have the following setup.
On my staging server there is a clone of the repo and I have two bare repositories. One is on the forge (powered by Indefero) and the other is on the live server.
Each release of a project is tagged with today's date, e.g. git tag -a deployed-2011-04-20.
So on the staging server I execute something similar to git push --tags live master, which targets the bare repo on the live server.
Then over SSH on the live server I execute a short bash script which basically clones the repository from the live bare repo to the folder Apache will serve.
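That script might look something like this (a sketch; paths and the tag name are illustrative):

# clone the tagged release out of the live bare repo and flip the symlink
TAG=deployed-2011-04-20
git clone /srv/git/site.git /var/www/releases/$TAG
cd /var/www/releases/$TAG && git checkout $TAG
ln -sfn /var/www/releases/$TAG /var/www/htdocs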
So, if that all makes sense, could you recommend a tool (or anything to make my life easier) that follows that workflow or can be adapted to it?
It looks something like this:
Forge (authoritative source)
^
|
v
Staging/development server
|
v
Live server bare repo
|
v
Releases folder (symlinked to htdocs)
One solution that comes to mind is to add a post-receive hook on the live server bare repo in order to detect any deployed-2011-xx-yy tag coming from the staging repo, and to trigger the ssh script from there.
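Such a hook might look like this (a sketch; the deploy script path is an assumption):

#!/bin/sh
# post-receive on the live bare repo: run the deploy script for deployed-* tags
while read oldrev newrev refname; do
    case "$refname" in
        refs/tags/deployed-*)
            /usr/local/bin/deploy-release.sh "${refname#refs/tags/}"
            ;;
    esac
done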
The other solution is to have a scheduler (like Hudson, mentioned in pderaaij's answer) in order to:
monitor the staging repo and, on the right tag, trigger the push to the live server
monitor the live bare repo, and trigger the ssh script.
The second solution has the advantage of keeping a record of all release instances in a Hudson job report, each time said job detects the right tags and executes the release process.
Take a look at Capistrano, which happily does the symlink dance you describe here.
If you use Hudson as a continuous integration server, you can make use of the build pipeline plugin.
You have your normal build process, but add an extra job which contains the commands to deploy your application. The plugin gives you a nice button to execute that build.
The Hudson job could execute all the needed commands, or you can take a peek at Maven for PHP and use the available plugins to invoke the remote scripts.
Perhaps it is a bit off the path you've already chosen, but it's worth the research.
We have a couple of Drupal sites we develop for; we are a team of 4 developers and 20+ non-technical content managers.
Each developer has his own dev environment; we all share a beta environment where we test code integration and performance, a staging environment where the content managers test features before we push to the live environment, a training environment we use to train people, and an environment specific to usability testing.
All of that is set up with only one bare repo on a central server, where each environment is a branch. We use the post-receive hook, with passwordless SSH keys, to do the auto-pulling on the appropriate checkout based on a case statement like the following:
#!/bin/sh
# post-receive hook: update the matching environment when its branch is pushed
while read line; do
    BRANCH=`echo $line | sed 's/.*\///g'`
    LOG="`date` - updating $BRANCH instance"
    case $BRANCH in
        "beta" )
            ssh www-data@beta "cd /var/www/beta.example.com; git pull"
            ;;
    esac
done
I have worked within a web development company where we had our local machines, a staging server, and a number of production servers. We worked on Macs in Perl, used SVN to commit to staging, and used Perl scripts to load code onto the production servers. Now I am working on my own project and would like to find good practices for web development when using shared web hosting and not working from a Unix-based environment (with all the magic I could do with Perl/bash scripting, cron jobs, etc.).
So my question is, given my conditions, which are:
I am using a single standard shared web hosting from an external provider (with ssh access)
I am working with at least one other person and intend to use SVN for source control
I am developing php/mysql under Windows (but using linux is a possibility)
What setup do you suggest for testing, deployment, and migration of code/data? I have a XAMPP server installed on my local machine, but was unsure which methods to use to migrate data etc. under Windows.
I have some personal PHP projects on shared hosting; here are a couple of thoughts from what I'm doing on one of them (the one that is the most active, and needs at least a semi-automated way of synchronizing):
A few words about my setup:
Some time ago, I had everything on SVN; now I'm using Bazaar, but the idea is exactly the same (except that, with Bazaar, I have local history and all that)
I have ssh access to the production server, like you do
I work on Linux exclusively (so what I do might not be as easy on Windows)
Now, how I work:
Everything that has to be on the production server (source code, images, ...) is committed to SVN/Bazaar/whatever
I work locally, with Apache/PHP/MySQL (I use a dump of the production DB that I import locally once in a while)
I am the only one working on that project; it would probably be OK for a small team of 2-3 developers, but not more.
What I did before:
I had a PHP script that checked the SVN server for modifications between "last revision pushed to production" and HEAD
I'm guessing this homemade PHP script looks like the Perl script you are currently using ^^
That script built a list of directories/files to upload to production
And uploaded those via FTP access
This was not very satisfying (there were bugs in my script, I suppose; I never took the time to correct them), and it forced me to remember the revision number from the last time I pushed to production (well, it was automatically stored in a file by the script, so not that hard ^^)
What I do now:
When switching to Bazaar, I didn't want to rewrite that script, which didn't work very well anyway
I have dropped the script totally
As I have ssh access to the production server, I use rsync to synchronise from my development machine to the production server, when what I have locally is considered stable/production-ready.
A couple of notes about that way of doing things:
I don't have a staging server: my local setup is close enough to the production one
Not having a staging server is OK for a simple project with one or two developers
If I had a staging server, I'd probably go with:
do an "svn update" on it when you want to stage
when it is OK, launch the rsync command from the staging server (which will ba at the latest "stable" revision, so OK to be pushed to production)
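That staging flow could look like this (a sketch; host names and paths are assumptions):

# on the staging server, once the update has been verified:
svn update /var/www/staging
rsync -az /var/www/staging/ deploy@prod.example.com:/var/www/site/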
With a bigger project and more developers, I would probably not go with that kind of setup, but I find it quite OK for a (not too big) personal project.
The only thing "special" here, which might be "Linux-oriented", is using rsync; a quick search seems to indicate there is an rsync executable that can be installed on Windows: http://www.itefix.no/i2/node/10650
I've never tried it, though.
As a side note, here's what my rsync command looks like:
rsync --checksum \
--ignore-times \
--human-readable \
--progress \
--itemize-changes \
--archive \
--recursive \
--update \
--verbose \
--executability \
--delay-updates \
--compress --skip-compress=gz/zip/z/rpm/deb/iso/bz2/t[gb]z/7z/mp[34]/mov/avi/ogg/jpg/jpeg/png/gif \
--exclude-from=/SOME_LOCAL_PATH/ignore-rsync.txt \
/LOCAL_PATH/ \
USER@HOST:/REMOTE_PATH/
I'm using the private/public key mechanism, so rsync doesn't ask for a password, btw.
And, of course, I generally run the same command in "dry-run" mode first, with the "--dry-run" option, to see what is going to be synchronized.
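For instance, with the same placeholders as above, the preview run is just:

rsync --dry-run --archive --verbose /LOCAL_PATH/ USER@HOST:/REMOTE_PATH/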
And ignore-rsync.txt contains the list of files that I don't want pushed to production:
.svn
cache/cbfeed/*
cache/cbtpl/*
cache/dcstaticcache/*
cache/delicious.cache.html
cache/versions/*
Here, I just prevent cache directories from being pushed to production; it seems logical not to send those, as production data is not the same as development data.
(I'm just noticing there's still ".svn" in this file... I could remove it, as I don't use SVN anymore for that project ^^)
Hope this helps a bit...
Regarding SVN, I would suggest you go with a dedicated SVN host like Beanstalk, or run an SVN server on the same server machine, so both developers can work off it.
In the latter case, your deployment script would simply move the bits to a staging web folder (accessible via beta.mysite.com) and then another deployment script could move that to the live web directory. Deploying directly to the live site is obviously not a good idea.
If you decide to go with a dedicated host, or want to deploy from your machine to the server, use rsync. This is also my current setup. rsync does differential syncs (over SSH), so it's fast, and it was built for just this sort of thing.
As you grow you can start using build tools with unit tests and whatnot. This leaves only the data sync issue.
I only sync data from remote -> local, and use a DOS batch file that does this over SSH using mysqldump. Cygwin is useful for Windows machines, but you can skip it. The SQL import script also runs a one-line query to update some cells, such as hostname and web root, for local deployment.
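The shell equivalent of that remote -> local data sync might look like this (a sketch; the database name, host, and the config table in the last query are assumptions):

# dump the remote DB over SSH and load it into the local MySQL
ssh user@example.com "mysqldump --single-transaction mydb" > mydb.sql
mysql -u root mydb < mydb.sql
# then patch environment-specific values for local use
mysql -u root mydb -e "UPDATE config SET value='http://localhost' WHERE name='hostname';"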
Once you have this set up, you can focus on just writing code; remote deployment or local sync and deployment becomes a one-click process.
One option is to use a dedicated framework for the task. Capistrano fits very well with scripting languages such as PHP. It's based on Ruby, but if you do a search, you should be able to find instructions on how to use it for deploying PHP applications.
I've worked on several PHP projects and I always have problems organizing my work. Where do you develop your application: on localhost, a remote server, or maybe the production one (!)?
When I work on my localhost, after making some major patch I send the new files by FTP, but sometimes I forget about one file, and it is just tiring to browse several directories to copy the changed files.
What best practices do you propose?
This is how we manage our commercial site:
Develop in a local sandbox
Check in to SVN
Automated build/test from SVN onto internal dev server
Scripted deploy using rsync to staging server for QA/UAT
Scripted deploy onto production servers.
Staging and production servers are both hosted by the ISP; they are hardware- and version-matched and run RHEL, while the internal dev server is a version-matched CentOS.
This way, when code hits the production servers there shouldn't be any nasty surprises, as even the deploy scripts get checked at stage 4.
Google App Engine has an apposite tool that automatically uploads modified files to the production environment; I don't know if there's something similar for PHP.
So, writing a dev2prod script (a script that does this automatically) should be a good solution.
For local source file management, anyone would suggest using a source code control system.
I'm developing on a development machine that has an identical environment to the production one, which prevents different behaviour caused by different versions or configs. After finishing, I just move all the files onto the production server.
WinMerge is a nice, free Windows tool for diffing your files between the development and production machines.
Develop on your local machine with the exact same configuration as your production environment (that is, Apache mods, PHP extensions and so on), using a version control system (I prefer SVN) to keep track of modified files and whatnot.
Then you can write a script in your production or testing environment to copy/update files from the repository to the web server path.
I'll probably get told off for resurrecting an old post, but here's how I do it using free tools:
I use NetBeans, Git, Bitbucket, SourceTree, git-flow and FTPloy.
Bitbucket.com: sign up for a free account.
SourceTree: free from Bitbucket. A great tool for managing Git repositories; all commits, merges and pulls can be done in here, and issues in Bitbucket can be tracked.
In SourceTree, take the master branch and click "Git Flow" (google gitflow): it's a fantastic workflow of feature, hotfix, development and release branches, and SourceTree helps automate the process.
FTPloy.com automates the deployment of your master branch. The free plan is one repo/server, but it's worth the upgrade if you want to push the development branch to a server sandbox for testing.
Hope this helps someone searching the web!
This is my own PHP development lifecycle.
Create a Git private repository on GitHub or Bitbucket.
Connect Git with cPanel. (It's very easy to push the changes to the repo as well as to the production server.)
Develop on localhost (probably XAMPP on Windows) with Visual Studio Code or Sublime Text.
Initialise Git and connect it with the private repository.
Push the changes to the Git repository (master/branches); changes to master will upload to the production server automatically.
I have to deploy my PHP/HTML/CSS/etc. code to multiple servers, and I am looking at my options for software that allows easy and secure deployment to multiple servers.
It would also help if it could be tied into my SVN.
Any suggestions?
Capistrano is pretty handy for that. There's a few people using it (1, 2, 3) for deploying PHP code as evidenced by doing a quick search.
Setting up passwordless public-key authentication with SSH would allow you to scp your files to any of your servers very quickly (or the process could be automated by a shell script).
Here's a simple tutorial: http://rcsg-gsir.imsb-dsgi.nrc-cnrc.gc.ca/documents/internet/node31.html
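In practice that's a couple of one-time commands, then one copy per deploy (a sketch; the host and paths are placeholders):

# one-time: create a key pair and install the public key on each server
ssh-keygen -t ed25519
ssh-copy-id user@server1.example.com
# per deploy: copy the tree without being prompted for a password
scp -r ./site/ user@server1.example.com:/var/www/html/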
If you're running on Red Hat or Debian, consider packaging your code into RPMs or debs. Then build a yum or dpkg repository and put your packages there. You can then use your system's package management to do upgrades, rollbacks, etc. You can even use Puppet to automate the process.
If you want to tie it into Subversion, you can create a branch for each new version, and use commit scripts to build the RPMs when a new branch shows up in a directory.
I'll second Capistrano. It's incredibly powerful and flexible. Our current project uses Capistrano for deploying to different servers as well as multiple servers. We pass two arguments to the cap command:
1) the name of the set of machine-specific config options to use, and
2) the name of the action to run.
It ends up looking like this:
cap -f deploy.rb live deploy
or
cap -f deploy.rb dev deploy
Of course the default use case - deploy to lots of machines at once - is a doddle with Capistrano AND you don't need to have Capistrano on the machines you are deploying to. All in all, tasty technology.
I've used Automated Build Studio before for a similar task. It gives you a lot of flexibility in what you can do.
I concur -- set up your svn tree, and use rsync over ssh to copy the tree out to the remote locations. rsync will make it fast and efficient, copying only the changes rather than full files.
You want to export your svn tree to some directory, then rsync from there to the remote host's directory tree.
I also forgot to mention that if you use rsync, you can set it up to use ssh, so you will transfer only the files that have changed, saving time and bandwidth.
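Put together, the export-then-sync step might look like this (a sketch; the repository URL and paths are placeholders):

# export a clean, .svn-free tree, then push only the differences over SSH
svn export --force https://svn.example.com/repo/trunk /tmp/site-export
rsync -az -e ssh /tmp/site-export/ user@remote.example.com:/var/www/html/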
You can also use kwateeSDCM, which is free and allows remote installation via ssh. It also enables you to manage server-specific configuration from a central location and makes upgrades seamless.
I had bookmarked a post on how to deploy your websites using Subversion: http://blog.lavablast.com/post/2008/02/I2c-for-one2c-welcome-our-new-revision-control-overlords!.aspx
I found Capistrano to be very easy to use once it's set up. The configuration file can be a bit confusing at first for more complicated environments, but it soon becomes worthwhile. I deploy to 14 servers in production, and I also use multiple environments for deployment to a staging server. One quirk: there's a bug in Ruby that breaks parallel deployment, but deploying serially isn't too bad with svn exports.
Capistrano setup is just too complicated. We found that KwateeSDCM was very straightforward to use, with a simple web interface and no scripting. We got our deployment configuration done in no time for dev and QA setups on Windows and Linux servers.