It seems the standard way to deploy with Rocketeer is to do a pull deploy, that is, it runs a git clone on the server you are deploying to. What I want to do is push a set of files to the server being deployed to, after having done a build on a CI server.
The reason I want to do this is that my projects usually have lots of extra stuff not required for production. I usually like to construct a build folder and run a build script that packages a final product. I want to use Rocketeer to push the results to staging/production servers. It was suggested in this article that it can be done: Deploying PHP Applications with Rocketeer and Docker
However, after reading the Rocketeer documentation there is nothing that speaks to that strategy, and it seems a bit against the grain to try it. I'm open to ideas, given my problem.
As the author of the article, I owe you a clarification. I mentioned those two deployment paradigms in a general sense just to introduce the different concepts. As far as I am aware, Rocketeer supports only "pull" deployments. Sorry for the confusion!
For deploying the generated files to your server from a CI, I think the most straightforward way is to use a tool like scp or rsync, or to just download the built package from S3 if you're storing it in a bucket.
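As a rough sketch of what that push could look like from a CI job (the host name and paths here are just placeholders, not anything Rocketeer provides):

    # Push the build output with rsync (assumes ./build holds the packaged app)
    rsync -az --delete --exclude '.git' ./build/ deploy@staging.example.com:/var/www/app/

    # Or package a single archive, copy it with scp, and unpack it over SSH
    tar -czf release.tar.gz -C build .
    scp release.tar.gz deploy@staging.example.com:/tmp/
    ssh deploy@staging.example.com 'tar -xzf /tmp/release.tar.gz -C /var/www/app'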
Related
Apparently there are two strategies used for the deployment of web applications. Please correct me if I am wrong.
Pull Deployment
I have my own build and deploy scripts, and I use Git as the VCS. The deploy script pulls the code from the Git repository and the build script configures the app.
Pros
Easy installation.
Better scalability: since an SSH key resides on each server, every server can contact our VCS server directly, so even as the number of application servers grows we don't have to do anything extra.
Cons
Security concern, as an SSH key has to live on every application server.
Push Deployment
I used this method on an older project, where I pushed the code with rsync. I pushed a copy from my local machine, but we still used a VCS.
Pros
Full control and flexibility, as I don't have to push the code to the VCS first.
Cons
Poorer scalability.
I have looked at some tools that offer both strategies (http://capifony.org/).
Questions
How do you handle this for a large-scale project (built with PHP)?
Is there a better strategy?
Which of the two is better?
What if there are many application servers behind a load balancer? Does push still make sense there?
Thanks in advance.
Full control and flexibility, as I don't have to push the code to the VCS first
This, to me, is not a good thing. You will have more control using a VCS than without one. I generally create a production branch alongside development and feature branches; that way the production server only ever pulls down code that I've deliberately put into the production branch.
Furthermore, if you ever run into a problem where your production code suddenly breaks, if you're using a VCS you should be able to roll back to a working version while you figure out what's wrong with your code. This, to me, is one of the most beneficial aspects of using a Pull Deployment.
If you use a continuous integration tool like Jenkins, you can periodically check for changes on a specific branch in your VCS, and have Jenkins automatically pull and build your project for you, without anyone ever needing to log in to the production server themselves. This makes deployment as easy as updating your code in the repository.
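As a rough sketch of that branch-based pull flow (branch names, paths, and the rollback tag are only examples):

    # Promote reviewed work into the production branch
    git checkout production
    git merge --no-ff develop
    git push origin production

    # On the production server (or run by a Jenkins job over SSH)
    cd /var/www/app
    git fetch origin
    git checkout production
    git pull --ff-only origin production

    # Rolling back to a known-good tag while you investigate a breakage
    git checkout v1.4.2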
Security concern, as an SSH key has to live on every application server
Depending on where your code is hosted, you might be able to set up deployment-only keys. This is how Bitbucket is set up, so our production servers can only pull code, not push. Furthermore, if one of these servers is compromised, we can easily revoke the access on our repository to that specific key.
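For instance, a rough sketch of creating such a read-only key (the file name is arbitrary; the public key is then added as a deployment key in the repository host's settings):

    # Generate a key pair used only for pulling the repository
    ssh-keygen -t ed25519 -N '' -f ~/.ssh/deploy_key

    # Paste the public key into the repository's deployment-key settings
    cat ~/.ssh/deploy_key.pub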
I use Git to track local changes in my PHP web applications, and I was wondering if it would be a good idea to use Git on the server as well, so that I could just use git push to deploy my changes. Would there be any pitfalls with this approach?
This seems like a nice way to do things. If you're tagging and branching properly it will enable you to quickly switch back to working versions of your site too in the event that something breaks.
I think this is a fine way to do it. I handle things in a similar manner, where live sites are just a checkout from the repository, and I update them as necessary.
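If you do go the git push route, a common way to wire it up is a bare repository on the server plus a post-receive hook. Here is a minimal sketch (paths, branch, and host are hypothetical):

    # On the server: a bare repository plus a work tree the hook checks out into
    git init --bare /srv/repos/site.git
    mkdir -p /var/www/site
    cat > /srv/repos/site.git/hooks/post-receive <<'EOF'
    #!/bin/sh
    git --git-dir=/srv/repos/site.git --work-tree=/var/www/site checkout -f master
    EOF
    chmod +x /srv/repos/site.git/hooks/post-receive

    # On your local machine: add the server as a remote and push to deploy
    git remote add live ssh://user@server.example.com/srv/repos/site.git
    git push live master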
Git is fine, but you can do a lot better than just using git pull. Take a look at railsless-deploy for Capistrano.
Capistrano basically does a combination of rsync and git pull to deploy copies of your website. It supports rollback, staging, and distributed deployments.
And online hotfixes can be pushed back to development.
Being able to do a git status on a live system can be a life saver.
Go for it!
Caveats
Make sure the ".git" folder isn't accessible from the web.
With PHP the source code is usually present on the web server anyway, so this doesn't add additional risk in case the server is hacked.
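One simple way to handle the first caveat on Apache (a sketch, assuming .htaccess overrides are enabled; nginx would need an equivalent location rule):

    # Return 404 for anything under /.git so the repository metadata is never served
    cat >> /var/www/site/.htaccess <<'EOF'
    RedirectMatch 404 /\.git
    EOF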
I would be in favor of using a technique like this if only because you can be sure anything on your deployed site is also being tracked in git. That is, it encourages a best practice and discourages ad hoc changes that aren't under source control.
For another alternative, check out this article about how Twitter uses BitTorrent to manage deployment: http://torrentfreak.com/twitter-uses-bittorrent-for-server-deployment-100210/
It's probably most useful when you need to deploy quickly across a large collection of servers.
I think it's a great solution. I have been using it to deploy my website for a long time. It's nice because you can almost instantly push your changes into production just by updating the folder. I have encountered no security issues or anything with it.
Enjoy!
I'm developing a big project; I have the dev folder (connected to a specific subdomain) and then the "real" folder, the live one. When I'm ready to push patches or whole new versions I currently copy the files individually. Is there a program that can help me with this task?
Keep in mind that some files (the config one and the .htaccess) and folders (the dev ones) should not be copied to the live version.
Thank you
Yes: subversion (or any other version control system) will allow you to push changes painlessly.
A simple solution would be to have one checkout where you develop and commit, and another checkout which is the deployment. When you are ready, you go to the deployment directory and do an svn up to sync it. It won't overwrite modified or excluded files.
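A sketch of that two-checkout setup (the repository URL and paths are just examples):

    # One working copy for development, one that is the live deployment
    svn checkout http://svn.example.com/project/trunk ~/dev/project
    svn checkout http://svn.example.com/project/trunk /var/www/live

    # When a release is ready: commit from the dev copy, then sync the live one
    cd /var/www/live && svn update

    # Live-only files (config, .htaccess) stay unversioned, so the update never
    # touches them; svn:ignore keeps them out of the svn status noise
    svn propset svn:ignore 'config.php' /var/www/live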
There are build packages like Capistrano and Phing which can help with more complicated deployments. Capistrano is Ruby-based, so it is a more natural choice for RoR applications, and Phing (being PHP-based) can be a little more convenient for PHP-based projects. In my experience, Phing seems less mature than Capistrano, but is a little more flexible because it doesn't assume you are working with a Ruby project like Capistrano seems to do. That's entirely opinion, of course.
Both tend to take more thought and work to configure up front, but once you've designed the deploy script, you can run a single command and have everything happen for you while you watch. Both tools can integrate with source control like SVN, and bring copies of your project out of the repository for you. You can also break your deployment out into sub-parts, like a traditional Makefile, which helps with testing and reuse. If you want the process you go through for your releases to be bulletproof and consistent, you need to use a tool that will manage all the steps involved for you so you remove the human-error component.
What would be a good tool for the job to do automated deployments of LAMP-based applications (MySQL, PHP, Zend Framework) to integration and staging environments?
I am looking specifically for tools that handle deploys to remote hosts. I assume build tools such as Phing and Ant could be used for that, but I was wondering if there is something better for this case.
For integration, especially continuous integration, I like phpUnderControl (a tool for PHP projects, itself based on CruiseControl, which is quite well known in the Java world); it handles:
fetching the latest revision from SVN
launching the automated tests (PHPUnit)
php_CodeSniffer
Generation of the PHP Documentation (phpDocumentor)
and provides a nice interface for users to see the results of each build.
And, to begin, here's an article that explains how to set up phpUnderControl: Getting started with phpUnderControl
(Each time I, or some colleagues, have installed phpuc, we did it almost exactly as explained in that article, from what I remember.)
For staging, I generally go with a couple of phing tasks to build a tar.gz archive, that I deploy to the staging server once in a while, using another phing task to un-tar the archive, and create the required symlinks (or stuff like that).
The idea being that Continuous Integration happens all the time and has to be fully automatic, while deploying to staging is done only once in a while (once per week, for instance) and can be done semi-automatically.
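The shell steps those phing tasks automate look roughly like this (the archive name, host, and paths are only examples):

    # Build an archive of the application
    tar -czf myapp-week42.tar.gz -C build .

    # Copy it to the staging server, unpack into a dated release directory,
    # and flip a "current" symlink to the new release
    scp myapp-week42.tar.gz deploy@staging.example.com:/tmp/
    ssh deploy@staging.example.com '
        mkdir -p /var/www/releases/week42 &&
        tar -xzf /tmp/myapp-week42.tar.gz -C /var/www/releases/week42 &&
        ln -sfn /var/www/releases/week42 /var/www/current
    '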
Configure a build server; something like CruiseControl is excellent for this. Roll your own custom NAnt scripts if needed, or use Exec tasks to take care of the deployment.
For things like specific deployments, each with their own configuration issues and intricacies, there is hardly ever an out-of-the-box solution.
Look at it this way: rolling your own scripts and batch files means you know all about the steps and can configure and modify them any way you like, rather than relying on some magic fairy dust and, when things break, having no idea where to fix them.
This is a question mainly about PHP. I was wondering: how do you make sure that all necessary libraries are packaged with your application when you do a deployment to (production) servers?
A more concrete example: I have an app running on Zend Framework, and each time I roll the application out to a server the deployment process creates a fresh "installation" on that system. Therefore, I need to bundle Zend Framework together with my application and copy the files to the right places (this is done automatically). Currently, I am using an svn:externals definition to get the files out of Zend's SVN repository during deployment; however, I don't want to rely on that SVN server, and I also don't want to put traffic on external SVNs with each deployment.
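For reference, the svn:externals definition in question looks something like this (the URL and pinned revision are illustrative); pinning a revision at least keeps deployments reproducible, though it still hits the external server on checkout:

    # Old-style externals definition: local dir, pinned revision, external URL
    svn propset svn:externals 'library/Zend -r24000 http://framework.zend.com/svn/framework/standard/trunk/library/Zend' .
    svn commit -m "Pin Zend Framework via svn:externals"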
In the Java world, I am used to Maven, which handles such stuff using central artifact repositories. I know that there is a Maven4PHP version; however, I am looking more for a PHP-based solution. Additionally, I don't believe that PEAR is a good way to go, as it doesn't really fulfill my requirement of bundling the application (incl. libs) into a single deployable.
Is there some tool available already that I am not aware of? Or do you have any great technique that I should know about?
Thanks much for your help!
Michael
There's a build system called Phing which is written in PHP and based on Apache Ant.
I personally can very well live with externals.
I think vendor branching would solve the problem from your example quite straightforwardly, but if you also don't like large repositories I'd recommend keeping an eye on modern tools like Composer and what it solves (and maybe Phark, which I'd never heard of before :)).
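A minimal sketch of the Composer route (the package name and version constraint are just examples); the dependencies land in vendor/, which then ships as part of the single deployable:

    cat > composer.json <<'EOF'
    {
        "require": {
            "zendframework/zendframework1": "1.12.*"
        }
    }
    EOF

    # Resolve and download the dependencies into vendor/
    composer install --no-dev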
It isn't production ready yet but you might want to keep an eye on the Phark project. It is a port of Bundler to PHP.
While looking through the Simplify your external dependency management slides I came across a tool called pantr, which can be used as a PEAR installer that allows you to specify your dependencies in a project-specific file.
The article Version Control != Dependency Management has some information about using the new PEAR installer called Pyrus.