Apparently there are two strategies used for deploying web applications. Please correct me if I am wrong.
Pull Deployment
I have my own build and deploy scripts, and I use Git as the VCS. The deploy script pulls the code from the Git repository, and the build script configures the app.
Pros
Easy installation.
Better scalability: since my SSH key resides on each server, every server can contact our VCS server directly. So even as the number of application servers grows, we don't have to do anything extra.
Cons
Security issue: an SSH key has to live on every application server.
Push Deployment
I used this method on an old project, where I pushed a copy of the code from my local machine with rsync, though we still used a VCS.
Pros
Full control and flexibility, as I don't have to push the code to the VCS first.
Cons
Scales poorly.
I have looked at some tools that offer both strategies (e.g. http://capifony.org/).
Questions
How do you handle this for a large-scale project (built with PHP)?
Is there any better strategy?
Which of these two is better?
What if there are many application servers behind a load balancer? Does push still make sense there?
Thanks in advance.
Full control and flexibility, as I don't have to push the code to the VCS first
This, to me, is not a good thing. You will have more control using a VCS than without one. I generally create a production branch alongside the development and feature branches; that way the production server only ever pulls down code that I've deliberately put into the production branch.
Furthermore, if your production code ever suddenly breaks, a VCS lets you roll back to a working version while you figure out what went wrong. This, to me, is one of the most beneficial aspects of using a Pull Deployment.
If you use a continuous integration tool like Jenkins, you can periodically check for changes on a specific branch in your VCS, and have Jenkins automatically pull and build your project for you, without anyone ever needing to log in to the production server themselves. This makes deployment as easy as updating your code in the repository.
Security issue: an SSH key has to live on every application server
Depending on where your code is hosted, you might be able to set up deployment-only keys. This is how Bitbucket is set up, so our production servers can only pull code, not push. Furthermore, if one of these servers is compromised, we can easily revoke the access on our repository to that specific key.
Related
It seems the standard way to deploy with Rocketeer is to do a pull deploy; that is, it will do a git clone on the server you are deploying to. What I want to do is push a set of files to the server being deployed to, after having done a build on a CI server.
The reason I want to do this is that my projects usually have lots of extra stuff not required for production. I usually like to construct a build folder and run a build script that packages a final product. I want to use Rocketeer to push the results to staging/production servers. It was suggested in this article that it can be done: Deploying PHP Applications with Rocketeer and Docker
However, after reading the Rocketeer documentation there is nothing that speaks to that strategy, and it seems a bit against the grain to try. I'm open to ideas given my problem.
As the author of the article, I owe you a clarification. I mentioned those two deployment paradigms in a general sense just to introduce the different concepts. As far as I'm aware, Rocketeer supports only "pull" deployments. Sorry for the confusion!
For deploying the generated files to your server from a CI, I think the most straightforward way is to use a tool like scp or rsync, or to just download the package from S3 if you're storing your built artifact in a bucket.
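For instance, the CI job's deploy step could be as small as this (the host, user, and paths below are placeholders, not anything Rocketeer-specific):

    # Copy the built package to the target server
    scp build/app.tar.gz deploy@www.example.com:/tmp/app.tar.gz

    # Or sync an unpacked build directory, removing files deleted from the build
    rsync -az --delete build/ deploy@www.example.com:/var/www/app/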
I'm trying to figure out a way to deploy my company intranet PHP apps automatically, using GitLab, but so far, I'm not sure if the options that I've found on the internet will do the job.
Here's the scenario:
VM1: Remote Git server, that uses GitLab to administrate projects repos.
VM2: Development server. Each project has its own development server.
VM3: Production server. Each project has its own, as well.
Developers: Each developer uses a vagrant box, based on the project's development server.
What I want to do:
Whenever a developer pushes their commits to the development branch, a hook on the remote server must update the development server's tree with the latest commit on that branch.
The same must occur when the master branch is updated, but it must update the production server tree.
However, since we use Laravel, we must run some extra console commands using Artisan, depending on each case.
We are following Vincent Driessen's Git Branching Model.
What I know so far:
I know that GitLab supports web hooks, but I'm not sure that will work in this case, since it means exposing a custom URL, which doesn't sound very safe. But if it's the only solution, I can write a script to handle just that.
The other possible solution is to use Jenkins, but I'm not an expert and I don't know yet whether Jenkins is overkill for my case, since I'm not using unit tests yet.
Have you implemented a solution that could help in my case? Can anyone suggest an easy and elegant way to do this?
Thanks guys! Have a nice one
We do it the following way:
Developers check out whatever Git branches, and as many branches as they want/need, locally (Debian in VMware on a Mac)
All branches get pulled to the dev server whenever a change is pushed to Git. They are available at feature-x.dev.domain.com, feature-y.dev.domain.com, etc., running against the dev database.
Release branches are manually checked out on the live system for testing, and made available at release-x.test.domain.com etc. against the live database (if possible, depending on migrations).
We've semi-automated this using our own scripts.
Database changes are made manually, due to their sensitive nature. However, we don't find that a hassle after getting used to migrations - it's just a matter of remembering to note the alterations. Cloning databases locally for each branch that needs changes gives good support here. An automated schema comparison then quickly catches it if changes to a migration file have been forgotten.
(The second point is the most productive one, making instant testing on the dev platform available to everyone as soon as the first commit of a new branch is pushed.)
I would suggest keeping things simple and working with Git, hooks, and remote repositories.
Pulling out heavy guns like Jenkins or GitLab for this task could be a bit too much.
I see your request as the following: "git after push and/or git after merge: push to remote repo".
You could set up "bare" remote repositories - one for "dev-stage", one for "production-stage".
Their only purpose is to receive pushes.
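Setting these up takes only a few commands (the paths and host below are placeholders):

    # On the deployment host: one bare repo per stage
    git init --bare /srv/git/dev-stage.git
    git init --bare /srv/git/production-stage.git

    # On each developer's machine: register them as remotes
    git remote add dev-stage deploy@deployhost:/srv/git/dev-stage.git
    git remote add production deploy@deployhost:/srv/git/production-stage.git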
Each developer works on their feature branch, based on the development branch.
When the feature branch is ready, it is merged back into the main development branch.
Both events can trigger a "post-merge" or "post-receive" hook, which executes a script.
The executed script can do whatever you want.
(Same approach for production: when the dev branch has enough new features, it is merged into the prod branch, which triggers the merge event and the scripts...)
Here you want two things:
You want to push a specific branch to a specific remote repo.
In order to do this, you have to find out the specific branch in your hook script.
That's tricky, but solvable; see https://stackoverflow.com/a/13057643/1163786 (writing a "git post-receive hook" to deal with a specific branch).
You want to execute additional steps for configuration/setup, like artisan, etc.
You might add these steps directly or as triggers to the hook script.
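As a rough sketch, the post-receive hook of the bare dev-stage repo could look like this (the branch name, web root, and artisan steps are assumptions for illustration):

    #!/bin/sh
    # hooks/post-receive in the bare repo; git feeds one line per updated ref
    TARGET=/var/www/dev
    while read oldrev newrev ref; do
        # Only react to pushes on the development branch
        if [ "$ref" = "refs/heads/development" ]; then
            # Check the pushed revision out into the web root
            git --work-tree="$TARGET" checkout -f development
            # Extra framework steps, e.g. Laravel's Artisan
            cd "$TARGET" && php artisan migrate --force
        fi
    done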
I think this request is related to internal and external deployment via git.
You might also search for tutorials, like "deployment with git", which might be helpful.
For example: http://ryanflorence.com/deploying-websites-with-a-tiny-git-hook/
http://git-scm.com/book/en/Git-Basics-Working-with-Remotes
http://githooks.com/ & https://www.kernel.org/pub/software/scm/git/docs/githooks.html
If you prefer to keep things straightforward and don't mind using paid third-party options, check out one of these:
http://deploybot.com/
https://www.deployhq.com/
https://envoyer.io/
Alternatively, if you fancy shifting to an integrated solution, I've not used anything better than Beanstalk.
So - let's say I develop a PHP app in a Vagrant box identical to the production environment. As an end result I would have a *.tar.gz file with the code...
How would one organize deployment into a production environment where there are a lot of application servers? I mean - I'm confused about how to push code into production synchronously, all at once.
More information:
On the server, code is stored like this:
project
+current_revision -> symlink to revisions/v[n]
+revisions
+v1
+v2
+v3
...
+data
So when I have to deploy changes I usually run a deploy script that uploads the updated tar onto the server over SSH, untars it into a specific dir under revisions, symlinks that dir to current_revision, and restarts php-fpm... This way I can roll back anytime just by symlinking to an older revision.
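Simplified, the script does something like this (the host, revision number, and paths are placeholders):

    #!/bin/sh
    REV=v4                          # revision being deployed
    HOST=deploy@app1.example.com

    # Upload the tar and unpack it into a fresh revision dir
    scp "app-$REV.tar.gz" "$HOST:/tmp/"
    ssh "$HOST" "mkdir -p /srv/project/revisions/$REV \
        && tar xzf /tmp/app-$REV.tar.gz -C /srv/project/revisions/$REV \
        && ln -sfn revisions/$REV /srv/project/current_revision \
        && sudo service php-fpm restart"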
With multiple servers, what bothers me is that not all boxes will be updated at once, i.e. technically some glitches might be possible.
If you're looking for a "ready-to-go" answer, you'll need to provide some more info about your setup. For example, if you plan to use Git for VCS, you could write a simple shell script that pulls the latest commit and rsyncs with the server(s). Or if you're building on top of Symfony, capifony is a great tool. If you're using AWS, there's a provider plugin written by the author of Vagrant that's super easy to use, and you can specify a regex for which machines to bring up or provision.
If instead you're looking for more of a "roadmap", then the considerations that you'll want to take are:
Make building of identical boxes in the remote and local environments as easy as possible, and try to make sure that your provisioning emphasizes idempotence.
Consider your versioning/release structure; what resources will rarely or never change? Include those in a setup function instead of a deploy function, and don't include them in your sync run.
Separate your development and system administration concerns; i.e. do not just package a vagrant box with a *.tar.gz and tie it through config.vm.box_url. The reason for this is that you'd have to repackage every production server with a new box every time you deploy, instead of just changing files on the server, or adding/removing some packages from the server.
Check out some config management tools like Chef and Puppet; even if you don't end up using them, they'll give you an idea of how sysadmin professionals approach this problem.
Lots of ways. If starting from bare bones (no cloud infrastructure), I'm a fan of the SVN branch hook. Have an SVN repo for your code. Set up a post-commit hook on it which checks whether anything in /branch/production/ has changed.
If it has, let the post-commit hook fire your whole automated roll-out procedure - and in this case, an easy way to do so is to have all your known* servers svn export the branch. As simple as that!
(* that's the hard step)
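A minimal sketch of such a post-commit hook (the repo URL, hostnames, and target path are assumptions):

    #!/bin/sh
    REPOS="$1"
    REV="$2"

    # Did this commit touch the production branch?
    if svnlook changed -r "$REV" "$REPOS" | grep -q "branch/production/"; then
        # Tell every known server to export that branch (assumes SSH access)
        for host in web1 web2 web3; do
            ssh deploy@"$host" \
                "svn export --force http://svn.example.com/repo/branch/production /var/www/app"
        done
    fi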
Good day to you all,
I am currently developing a project on Laravel. So far I have always developed online, directly editing my files on the webserver through FTP (using PSPad or similar simple editing tools).
What I want to do now (and what I believe most people actually do) is set up a (W)LAMP stack on my local machine and program locally. However, it is a little unclear to me how to keep my local code (including databases) in sync with the live website. How do you folks do that? I know there are probably lots of ways and tools to do it, but what would be your advice for best practice? Any advice would be very welcome :)
What many companies do is build offline, then push their edits up to a server using git.
I'm no expert on the software, so I'll describe what you do in basic form:
My advice would be to create an online repo (repository) to store your project while you edit/update.
There are several Git project management systems, such as GitHub or Bitbucket. I personally use Bitbucket.
What Git does: when you have built or added what you need offline on your local (W)LAMP stack, you git push it up to your repo or server. The changed files then get merged with what already exists in the repo or on the server. If you'd like the most recent version of your project, you simply git pull it down.
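In day-to-day use that boils down to a handful of commands (the repo URL is a placeholder):

    # One-time: get a local copy of the repo
    git clone git@bitbucket.org:team/project.git
    cd project

    # After editing locally:
    git add -A
    git commit -m "Describe the change"
    git push origin master    # send your work up to the repo

    # To fetch the most recent version later:
    git pull origin master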
Read the full documentation here to see the wide range of options available when using git
We have a settings array within our platform available as $res::Config.
At runtime, a variable is changed from 'dev' to 'live' after checking the HTTP host (which obviously depends on the IP address).
Within our framework's bootstrapping, the settings for the database connection are chosen depending on the value of $res::Config->$env - the environment set previously as either dev or live. You store these settings in the Config array as db_live or db_dev.
However you do it, use an environment variable to figure out whether you want live or dev, and set up an array of settings accordingly.
We also have sandbox and staging for intermittent development stages.
As for version control, use Git or Subversion.
Edit: it's also possible to set an environment variable in the vhost file as either live or dev, and have the application read from it accordingly. I'd suggest this approach :)
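For example, a vhost could export the variable like this (a sketch assuming Apache; the application would then read it with getenv('APP_ENV')):

    <VirtualHost *:80>
        ServerName dev.example.com
        # Tell the application which settings block to load
        SetEnv APP_ENV dev
    </VirtualHost>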
There are a number of ways of doing this. But this is a deceptively HUGE question you've asked.
Here is some good practice advice - go and research these items, then have a look at my approach.
Typically you use a process called version control, which allows you to create "versions" or snapshots of your system.
The commonly used SVN software is good, but the newer kid on the block (not really new any more) is Git, and I personally recommend that.
You can use this system to push the codebase live in a controlled fashion. While the file-upload part is essentially similar to FTP, it allows you to put a specific version of your site live.
In environments where there are multiple developers, this is ideal - you can compare/test and work around each other, and version control tends to prevent errors between devs.
So - advice part 1: Look up and understand version control, then use it to release CODE to the live environment.
Part 2: I use database dumps and farm them back to my machine to work with.
If the live database needs updating, I can work locally and simply export, then re-import on the live system.
For example: on a recent Moodle project I worked on, to refresh the whole database took seconds... I could push a patch and database update in a few minutes.
However: you should think about maintenance and scheduling... if the site is live and has ongoing data changes then you need to be careful with this. Consider adding a maintenance page.
Advice 2: go research SQL dump/export and importing.
I personally use phpmyadmin to dump and re-import, as it's very convenient.
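The command-line equivalent is just as short (a sketch; the database name, credentials, and host are placeholders):

    # Dump the local database to a file
    mysqldump -u root -p myapp_db > myapp_db.sql

    # Copy it to the live server and re-import it there
    scp myapp_db.sql deploy@live.example.com:/tmp/
    ssh -t deploy@live.example.com "mysql -u root -p myapp_db < /tmp/myapp_db.sql"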
Advice 3: working locally and then pushing live is MUCH BETTER PRACTICE. You're starting down a much safer and better road than the one you're on!
Hope that helps... but bear in mind - this is a big subject, so you'll need to research a fair bit.
I use Git to track local changes in my PHP web applications, and I was wondering if it would be a good idea to use Git on the server as well, so that I could just use git push to deploy my changes. Would there be any pitfalls with this approach?
This seems like a nice way to do things. If you're tagging and branching properly it will enable you to quickly switch back to working versions of your site too in the event that something breaks.
I think this is a fine way to do it. I handle things in a similar manner, where live sites are just a checkout from the repository, and I update them as necessary.
Git is fine, but you can do a lot better than just git pull. Take a look at railsless-deploy for Capistrano.
Capistrano basically does a combination of rsync and git pull to deploy copies of your website. It supports rollback, staging, and distributed deployments.
And online hotfixes can be pushed back to development.
Being able to do a git status on a live system can be a life saver.
Go for it!
Caveats
Make sure the ".git" folder isn't accessible from the web.
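For example, with Apache you can deny access to it in the server config (a sketch; nginx has an equivalent location rule):

    # Apache 2.4: block any URL that maps into a .git directory
    <DirectoryMatch "/\.git">
        Require all denied
    </DirectoryMatch>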
With PHP the source code is usually present on the webserver, so that doesn't add additional risk in case the server is hacked.
I would be in favor of using a technique like this if only because you can be sure anything on your deployed site is also being tracked in git. That is, it encourages a best practice and discourages ad hoc changes that aren't under source control.
For another alternative, check out this article about how Twitter uses BitTorrent to manage deployment: http://torrentfreak.com/twitter-uses-bittorrent-for-server-deployment-100210/
It's probably most useful when you need to deploy quickly across a large collection of servers.
I think it's a great solution. I have been using it to deploy my website for a long time... It's nice because you can almost instantly push your changes into production just by updating the folder. I have encountered no security issues or anything with it.
Enjoy!