Using Git and GitLab to deploy PHP (Laravel) - php

I'm new to the topic of CI & CD, therefore I have some basic questions.
But first about my current environment:
On my server I have Debian running with GitLab and one web page. I run the update commands manually on my website, like:
php artisan down &&
git checkout . &&
git checkout master &&
git pull &&
composer install &&
php artisan migrate &&
php artisan up
In the future I would like to improve my system landscape to have multiple stages: DEV, QA (demo for future releases), and PROD. In the long term, at least PROD will get its own server. But I'm not sure about the right setup.
Now my questions:
When I have a gitlab-ci.yml, where does this script run? I believe on the server where GitLab runs, right? How can I run the commands on the client side, e.g. my external PROD server?
So far I like the concept of webhooks, but I can't have multiple webhooks for different stages, can I? Does it make sense to split the different branches and validate within the webhook script on the web host's side?

When I have a gitlab-ci.yml, where does this script run? I believe on the server where GitLab runs, right?
No, it runs on a separate machine on which a GitLab CI Runner is installed. https://docs.gitlab.com/runner/
How can I run the commands on the client side, e.g. my external PROD server?
Use SSH in the gitlab-ci.yml file; for authentication you can use secret variables, which will be available in the build script. Don't forget to verify that you are connected to the right host, for example:
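A minimal sketch of such a deploy job, assuming hypothetical secret variables SSH_PRIVATE_KEY and PROD_HOST and a Debian-based runner image (adapt names and paths to your setup):

deploy_prod:
  stage: deploy
  only:
    - master
  script:
    - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
    - eval $(ssh-agent -s)
    - echo "$SSH_PRIVATE_KEY" | ssh-add -
    - mkdir -p ~/.ssh
    # record the host key so you are sure you talk to the right host
    - ssh-keyscan -H "$PROD_HOST" >> ~/.ssh/known_hosts
    - ssh deploy@"$PROD_HOST" "cd /var/www/app && php artisan down && git pull && composer install --no-dev && php artisan migrate --force && php artisan up"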
So far I like the concept of webhooks, but I can't have multiple webhooks for different stages, can I? Does it make sense to split the different branches and validate within the webhook script on the web host's side?
I wouldn't use webhooks. See the documentation linked above.
In the future I would like to improve my system landscape to have multiple stages: DEV, QA (demo for future releases), and PROD. In the long term, at least PROD will get its own server. But I'm not sure about the right setup.
You should have a separate server or VM for each stage. Is it possible to do this all on a single server? Sure, but it is far from recommended. If you do, you can use GitLab environments, for example:
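In .gitlab-ci.yml, each stage can then be declared as an environment, so GitLab keeps track of what is deployed where. A sketch, with the branch name and the deploy helper script being assumptions:

deploy_qa:
  stage: deploy
  script:
    - ./deploy.sh qa   # hypothetical helper that pushes to the QA host
  environment:
    name: qa
  only:
    - qa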
Welcome to Stack Overflow!

Related

Updating Laravel application from git on multiple servers

I am working on the development of a web app (for learning) in Laravel, and I'm using Bitbucket for source control. It will be deployed on a couple of servers (20 or so, perhaps more over time), and I would like to be able to update all of them as the app changes over time.
The problem is that I will not have SSH access to most of those servers, so I won't be able to use a simple "git pull" (one server we test on does not even have git installed, so shell_exec is not an option either).
My plan was to make a script that downloads the latest zip from the Bitbucket server, unpacks it overwriting the old code, and then runs a Laravel script to run the migrations (for eventual database changes).
Is there maybe a more sensible way of doing this?
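For concreteness, the plan described above could look roughly like the following server-side script. The URL and paths are placeholders, and note that Bitbucket's zip download nests the code in a top-level USER-REPO-<hash> folder, which a real script would have to account for:

<?php
// deploy.php - a sketch of the zip-based update plan (URL and paths are made up)
$zipUrl  = 'https://bitbucket.org/USER/REPO/get/master.zip';
$zipPath = sys_get_temp_dir() . '/release.zip';
file_put_contents($zipPath, file_get_contents($zipUrl));

$zip = new ZipArchive();
if ($zip->open($zipPath) === true) {
    $zip->extractTo('/var/www/app-release'); // unpack the new code
    $zip->close();
}

// Run migrations through the framework itself, since shell_exec is unavailable.
require '/var/www/app/vendor/autoload.php';
$app = require '/var/www/app/bootstrap/app.php';
$kernel = $app->make(Illuminate\Contracts\Console\Kernel::class);
$kernel->call('migrate', ['--force' => true]);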
What you are looking for is CI/CD, i.e. Continuous Integration / Continuous Delivery. There are many ways to automatically deploy or pull code onto a server. You can use the following methods:
Automating Deployment to EC2 Instance With Git
Using Bitbucket for Automated Deployments
CI/CD workflow with BitBucket Cloud, Bamboo, AWS CodeDeploy
Bitbucket - Manage Web Hooks
Apart from these you can find many more articles on the subject, but if you want to automate the process at the Laravel level, use Laravel Envoy, for example:
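An Envoy task file (Envoy.blade.php) for the kind of update shown in the first question might look like this, with the server name and path being assumptions:

@servers(['web' => 'deploy@example.com'])

@task('deploy', ['on' => 'web'])
    cd /var/www/app
    php artisan down
    git pull origin master
    composer install --no-dev
    php artisan migrate --force
    php artisan up
@endtask

You would then trigger it from your machine with envoy run deploy.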

Continuous Deployment with Jenkins & PHP

I'm sure there are answers for this all over stackoverflow but I was unable to find anything specific.
I have a PHP project which I am revisiting. It's running on a RHEL5 box. I have SVN on the same box.
Out of curiosity I recently added Jenkins to the machine and set it up with the Jenkins PHP template at...
http://jenkins-php.org/
There was a bit of playing around with the setup but I more or less have this all running and doing Continuous Inspection builds when something is committed to SVN.
What I want to do now is have Jenkins copy my updated files across to the server when the build completes.
I am running a simple LAMP setup and would ideally only like to copy across the files that have actually changed.
Should I just use Ant & sync? Currently the files reside on the same box as the server, but this may change, whereby I will need to sync these files across to multiple remote boxes.
Thanks
Check these: the Copy Artifact Plugin and the job's environment variables.
Now set up two jobs: one on the source machine and one on the destination server (make it a slave). Use the plugin to copy the required artifacts, locating them via environment variables, for example:
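For instance, a shell build step on the destination job, once the Copy Artifact Plugin has populated the workspace, might look like this ($WORKSPACE, $BUILD_NUMBER and $JOB_NAME are standard Jenkins variables; the target path is an assumption):

# copy the artifacts that were just pulled into this job's workspace
rsync -av "$WORKSPACE/build/" /var/www/html/
echo "deployed build $BUILD_NUMBER of $JOB_NAME"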
Do you have your project (not Jenkins, but the one with the LAMP setup) under SVN? If so, I'd recommend creating a standalone job in Jenkins that just does an svn up, and chaining it to your main job, so that when your main job runs and the build is OK, Jenkins automatically runs the job that updates your project.
For copying to other servers, take a look at the Publish Over plugins.
It's very easy to set up servers and rules. The bad thing is that you can't set it up to copy only the new files for the current build, which means that the entire project is uploaded on every build.
If your project is too big, another solution is to use rsync as a post-build action, for example:
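A post-build shell step along those lines could be as simple as this (user, host and paths are assumptions):

rsync -az --delete --exclude='.svn' ./ deploy@webserver:/var/www/html/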

How do you do an automatic git pull on remote server?

Before I begin, I know there are a lot of questions similar to this one, but I am really having difficulty finding a concise, secure best practice, since the feedback on them has varied so widely.
What I want to do:
Complete work on my local machine on a development branch.
Push changes to git. Git posts to a webhook URL and automatically has my remote server pull the changes on a development site.
Once QA'd and confirmed to be proper on the development site, push the master branch to the production site (on the same server as the development site).
Where I am at:
I have git installed on my local machine and the remote server. I can push mods to the development branch to git. On the remote server, I can pull the updates and it works like a charm. The problem is that I cannot get the remote server to automatically update when changes are pushed from my local machine.
My questions are:
For the remote server development site directory, should I git init or git init --bare? I don't plan on having updates made on the server itself. I would like my dev team to work locally and push mods to the server. I believe I need to use git init, as the working tree is needed to set up a remote alias to the git repository, but I wanted to confirm.
I am pretty sure the issue with the webhook post from git is due to user privileges. How can I safely get around this? I have read many tutorials that suggest updating git hook files, but I feel as though that is a more drastic measure than I need to take. I would love to be able to have the webhook hit a URL that safely pulls the files without adding a boatload of code (if that is possible).
I am a web developer by nature, so git and sysadmin tasks are generally the bane of my existence. Again, I know this question is similar to others, but I have yet to find a comprehensive, concise, secure, and most logical approach to resolving the issue. I am about 16 hours in and have officially hit the "going in circles with no progress" point.
You can do this quite easily with GitHub service hooks.
You'll need to create one more file that will handle the process of performing the git pull. Add a new file, called github.php (or anything you wish), and add:
<?php `git pull`;
Save that file and upload it to the repository directory on your server. Then go to Service Hooks -> Post-Receive URL, copy the URL to that file, and paste it into the "Post-Receive URL" field, e.g. http://demo.test.com/myfolder/github.php
So, when you push, GitHub will automatically visit this URL, thus causing your server to perform a git pull.
To see this in more detail, have a look at this tutorial.
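A slightly hardened variant of that one-liner, with the repository path, log file and shared-secret token all being assumptions you would adapt:

<?php
// github.php - reject calls that don't carry our (made-up) secret token
if (!isset($_GET['token']) || $_GET['token'] !== 'CHANGE_ME') {
    http_response_code(403);
    exit('Forbidden');
}
// run the pull from the repository root and log the result
chdir('/var/www/mysite');
exec('git pull 2>&1', $output, $status);
file_put_contents('/var/log/deploy.log',
    date('c') . " exit=$status\n" . implode("\n", $output) . "\n",
    FILE_APPEND);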
I had the exact same issue: strong in coding and development skills, weak in sysadmin skills. When I was finally ready to push code I asked a GitHub rep what their suggested method was, and they responded with Capistrano. It's a Ruby application that runs commands (such as git pull) on remote servers, along with pretty much any other command you can imagine.
Here are a few articles you can read to get more info:
GitHub - Deploy with Capistrano
How to compile the Capistrano stack on your *nix system
Another example of how to deploy code with Capistrano
Not going to lie, the learning curve was pretty steep, but once you start working with Capistrano, you will see that it works well for pushing code. I develop my applications in Symfony and have my Capistrano set up to pull code, clear the cache and clear log files, all in one command from my local machine.

PHP deployment using Git. How can I make it more automated?

I am in charge of launching web projects, and it currently takes a little too long from client sign-off to final launch. It is on a server which I have root access to, but it runs Plesk so that the boss can set up VirtualHosts, which means there are many sites running on it.
Each project has its own git repository so currently I have the following setup.
On my staging server there is a clone of the repo and I have two bare repositories. One is on the forge (powered by Indefero) and the other is on the live server.
Each release of a project is tagged with today's date, e.g. git tag -a deployed-2011-04-20.
So on the staging server I execute something similar to git push --tags live master, which targets the bare repo on the live server.
Then over SSH on the live server I execute a short bash script which basically clones the repository from the live bare repo to the folder Apache will serve.
So if that all makes sense would you be able to recommend a tool or anything to make my life easier that follows that work flow or can be adapted?
It looks something like this:
Forge (authoritative source)
^
|
v
Staging/development server
|
v
Live server bare repo
|
v
Releases folder (symlinked to htdocs)
One solution that comes to mind is to add a post-receive hook on the live server's bare repo that detects any deployed-2011-xx-yy tag coming from the staging repo and triggers the ssh script from there, for example:
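Such a hook, placed in hooks/post-receive on the live bare repo, could look like this (the tag pattern and the path to the existing release script are assumptions):

#!/bin/sh
# deploy whenever a deployed-* tag is pushed to this bare repo
while read oldrev newrev refname; do
  case "$refname" in
    refs/tags/deployed-*)
      # hand off to the short bash script mentioned in the question
      /home/deploy/bin/release.sh "${refname#refs/tags/}"
      ;;
  esac
done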
The other solution is to have a scheduler (like Hudson, mentioned in pderaaij's answer) in order to:
monitor the staging repo and, on the right tag, trigger the push to the live server
monitor the live bare repo and trigger the ssh script.
The second solution has the advantage of keeping a trace of all release instances in a Hudson job report, each time said job detects the right tags and executes the release process.
Take a look at Capistrano, which happily does the symlink dance you describe here.
If you use Hudson as a continuous integration server, you can make use of the build pipeline plugin.
You have your normal build process, but add an extra job which contains the commands to deploy your application. The plugin gives you a nice button to execute that build.
The Hudson job could execute all the needed commands, or you can take a peek at Maven for PHP and use the available plugins to invoke the remote scripts.
Perhaps it is a bit out of range considering the path you've chosen already, but it's worth the research.
We've got a couple of Drupal sites we develop for; we are a team of 4 developers and about 20+ non-technical content managers.
Each developer has his own dev environment, and we all have a beta environment where we test code integration and performance, a staging environment where the content managers test features before we push to the live environment, a training environment we use to train people, and an environment specifically for usability testing.
All of that is set up with only one bare repo on a central server, where each environment is a branch. We use the post-receive hook with passwordless SSH keys to do the auto-pull on the appropriate repo, based on a case statement like the following:
#!/bin/sh
# post-receive: read the pushed refs from stdin and update the matching environment
while read oldrev newrev refname; do
  BRANCH=`echo $refname | sed 's/.*\///g'`
  LOG="`date` - updating $BRANCH instance"
  case $BRANCH in
    "beta" )
      ssh www-data@beta "cd /var/www/beta.example.com; git pull"
      ;;
  esac
done

Best practices for (php/mysql) deployment to shared hosting?

I have worked at a web development company where we had our local machines, a staging server and a number of production servers. We worked on Macs in Perl and used SVN to commit to stage, and Perl scripts to load to the production servers. Now I am working on my own project and would like to find good practices for web development when using shared web hosting and not working from a unix-based environment (with all the magic I could do with Perl / bash scripting / cron jobs etc.).
So my question is given my conditions, which are:
I am using a single standard shared web hosting from an external provider (with ssh access)
I am working with at least one other person and intended to use SVN for source control
I am developing PHP/MySQL under Windows (but using Linux is a possibility)
What setup do you suggest for testing, deployment, and migration of code/data? I have an XAMPP server installed on my local machine, but was unsure which methods to use to migrate data etc. under Windows.
I have some PHP personal projects on shared hosting; here are a couple of thoughts, from what I'm doing on one of those (the one that is the most active and needs at least some semi-automated way of synchronizing):
A few words about my setup:
Some time ago, I had everything on SVN; now I'm using Bazaar; but the idea is exactly the same (except that with Bazaar I have local history and all that)
I have SSH access to the production server, like you do
I work on Linux exclusively (so what I do might not be as easy with Windows)
Now, how I work:
Everything that has to be on the production server (source code, images, ...) is committed to SVN/Bazaar/whatever
I work locally, with Apache/PHP/MySQL (I use a dump of the production DB that I import locally once in a while)
I am the only one working on that project; it would probably be OK for a small team of 2 or 3 developers, but not more.
What I did before:
I had a PHP script that checked the SVN server for modifications between the "last revision pushed to production" and HEAD
I'm guessing this homemade PHP script looks like the Perl script you are currently using ^^
That script built a list of directories/files to upload to production
And uploaded those via FTP access
This was not very satisfying (there were bugs in my script, I suppose; I never took the time to correct those), and it forced me to remember the revision number from the last time I pushed to production (well, it was automatically stored in a file by the script, so not that hard ^^ )
What I do now:
When switching to Bazaar, I didn't want to rewrite that script, which didn't work very well anyway
I have dropped the script entirely
As I have SSH access to the production server, I use rsync to synchronize from my development machine to the production server, when what I have locally is considered stable/production-ready.
A couple of notes about that way of doing things:
I don't have a staging server: my local setup is close enough to the production one
Not having a staging server is OK for a simple project with one or two developers
If I had a staging server, I'd probably go with:
doing an "svn update" on it when you want to stage
when it is OK, launching the rsync command from the staging server (which will be at the latest "stable" revision, so OK to be pushed to production)
With a bigger project, with more developers, I would probably not go with that kind of setup; but I find it quite OK for a (not too big) personal project.
The only thing "special" here, which might be "linux-oriented", is using rsync; a quick search seems to indicate there is an rsync executable that can be installed on Windows: http://www.itefix.no/i2/node/10650
I've never tried it, though.
As a sidenote, here's what my rsync command looks like :
rsync --checksum \
--ignore-times \
--human-readable \
--progress \
--itemize-changes \
--archive \
--recursive \
--update \
--verbose \
--executability \
--delay-updates \
--compress --skip-compress=gz/zip/z/rpm/deb/iso/bz2/t[gb]z/7z/mp[34]/mov/avi/ogg/jpg/jpeg/png/gif \
--exclude-from=/SOME_LOCAL_PATH/ignore-rsync.txt \
/LOCAL_PATH/ \
USER@HOST:/REMOTE_PATH/
I'm using the public/private key mechanism, so rsync doesn't ask for a password, btw.
And, of course, I generally use the same command in "dry-run" mode first, to see what is going to be synchronized, with the option "--dry-run".
And ignore-rsync.txt contains a list of files that I don't want pushed to production:
.svn
cache/cbfeed/*
cache/cbtpl/*
cache/dcstaticcache/*
cache/delicious.cache.html
cache/versions/*
Here, I just prevent the cache directories from being pushed to production -- it seems logical not to send those, as production data is not the same as development data.
(I'm just noticing there's still ".svn" in this file... I could remove it, as I don't use SVN anymore for that project ^^ )
Hope this helps a bit...
Regarding SVN, I would suggest you go with a dedicated SVN host like Beanstalk, or use the same server machine to run an SVN server so both developers can work off it.
In the latter case, your deployment script would simply move the bits to a staging web folder (accessible via beta.mysite.com), and then another deployment script could move that to the live web directory, as sketched below. Deploying directly to the live site is obviously not a good idea.
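That second script can stay tiny; a sketch with made-up paths:

#!/bin/sh
# promote.sh - copy the staged site into the live web root
rsync -a --delete /var/www/beta.mysite.com/ /var/www/mysite.com/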
If you decide to go with a dedicated host or want to deploy from your machine to the server, use rsync. This is also my current setup. rsync does differential syncs (over SSH), so it's fast, and it was built for just this sort of thing.
As you grow you can start using build tools with unit tests and whatnot. This leaves only the data sync issue.
I only sync data from remote -> local and use a DOS batch file that does this over SSH using mysqldump; Cygwin is useful for Windows machines, but you can skip it. The SQL import script also runs a one-line query to update some cells, such as the hostname and web root, for local deployment. For example:
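That remote-to-local sync might look like this from a Cygwin (or any unix-like) shell; the hosts, credentials and fix-up query are assumptions:

# pull a fresh dump of the remote database over SSH
ssh deploy@remotehost "mysqldump -u app -pSECRET appdb" > dump.sql
# load it into the local MySQL
mysql -u root appdb < dump.sql
# one-line fix-up so the app works locally
mysql -u root appdb -e "UPDATE settings SET hostname='localhost' WHERE id=1;"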
Once you have this set up, you can focus on just writing code, and remote deployment or local sync and deployment becomes a one-click process.
One option is to use a dedicated framework for the task. Capistrano fits very well with scripting languages such as PHP. It's based on Ruby, but if you do a search, you should be able to find instructions on how to use it for deploying PHP applications.
