Jenkins build strategy with a PHP project from Git

I have Jenkins set up on a remote server which hosts my PHP project. Jenkins creates a build from a Git repository, and I would then like to move the files to the server's document root.
However, the problem is that all the files have the Jenkins user as their owner, so if I moved the workspace to my document root I wouldn't have permission to access them.
I'm wondering how people automate this. Maybe execute shell commands after the build is completed? I'm very new to Jenkins, so I may be on completely the wrong track.

The short answer is yes: you need a post-build step that runs the chown (change ownership) and/or chmod (change permissions) Linux commands. This lets you switch the owning user (if the jenkins user has sufficient rights) and set read/write permissions on files, up to 777 (free-for-all).
However, there's more to it than that. You've stumbled upon the deployment phase of a project, which is much more complex. Deployment is usually handled by external tools called via bash (I'm not very familiar with the PHP ones, but Capistrano is certainly an industry standard in Ruby), and those tools usually update project files, update dependencies, run database migrations, manage permissions, clean caches, and so on; every project may require its own scenario.
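As a concrete illustration, a post-build "Execute shell" step along these lines could do the copy and permission fix-up. This is only a sketch: the www-data user and the file layout are assumptions, and for the demo a temp directory stands in for the real document root (the chown, which needs root, is shown commented out).

```shell
#!/bin/sh
# Sketch of a Jenkins "Execute shell" post-build step. The docroot and
# web user are assumptions; a temp directory stands in for the docroot.
DOCROOT="$(mktemp -d)"

# Stand-in for the Jenkins workspace produced by the build.
WORKSPACE_DEMO="$(mktemp -d)"
echo "<?php echo 'hello';" > "$WORKSPACE_DEMO/index.php"

# 1. Copy the build output into the docroot.
cp -R "$WORKSPACE_DEMO/." "$DOCROOT/"

# 2. Hand ownership to the web server user. chown needs root, so on a
#    real server the jenkins user would run it via sudo (commented here):
# sudo chown -R www-data:www-data "$DOCROOT"

# 3. Tighten permissions: directories traversable, files read-only for
#    everyone but the owner.
find "$DOCROOT" -type d -exec chmod 755 {} +
find "$DOCROOT" -type f -exec chmod 644 {} +

ls -l "$DOCROOT"
```

On a real server you would point DOCROOT at the actual document root and give the jenkins user a narrow sudoers entry covering just that chown.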

Related

Allow Laravel app to create/modify project files?

I have plans to make a generator GUI, accessible on a route while in development mode. When you take some action in the GUI, it will create or update corresponding files in the project folder. This means the web server user needs permission to handle the files.
What is a good way to accomplish that? I have thought of a few approaches but am not sure:
1. After you pull in my package, let the user chmod everything themselves. This might be an issue, as these changes will be committed to source control.
2. Add some kind of installer script that you can run with sudo to do 1) for you.
3. Add a temporary work folder for the generator where it has full access, then set up some symlink arrangement to the actual project folder so it stays in sync. That way you avoid the Git issues.
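For what it's worth, option 2 might look roughly like the sketch below. The group name, the directory list, and the script itself are all hypothetical; run against a real project it would need sudo for the chgrp, and the demo defaults to a temp directory.

```shell
#!/bin/sh
# Hypothetical installer for option 2: run once with sudo so the web
# server's group can write the generator's target directories.
WEB_GROUP="${WEB_GROUP:-www-data}"
PROJECT_ROOT="${PROJECT_ROOT:-$(mktemp -d)}"   # demo default: temp dir

mkdir -p "$PROJECT_ROOT/app"                   # demo target directory
touch "$PROJECT_ROOT/app/generated.php"

# Group-own the files (chgrp needs sudo on a real box, hence the
# fallback) and grant group read/write. Capital X adds execute only
# where it makes sense (directories), not on plain files.
chgrp -R "$WEB_GROUP" "$PROJECT_ROOT/app" 2>/dev/null || true
chmod -R u=rwX,g=rwX,o=rX "$PROJECT_ROOT/app"

stat -c '%A' "$PROJECT_ROOT/app/generated.php"
```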
I am excited to hear what you think about this.

Handling File ownership issues in a PHP apache application

Env: Linux
PHP apps runs as "www-data"
PHP files in /var/www/html/app owned by "ubuntu". Source files are pulled from git repository. /var/www/html/app is the local git repository (origin: bitbucket)
Issue: Our developers and DevOps folks would like to pull the latest sources frequently, and would like to initiate this over the web (rather than via PuTTY and running the git pull command).
However, since PHP runs as "www-data", it cannot run a git pull (as the files are owned by "ubuntu").
I am not comfortable with either alternative:
Running the Apache server as "ubuntu", due to the obvious security issues.
Making the git repository files owned by "www-data", as that makes it very inconvenient for developers logging into the server and editing files directly.
What is the best practice for handling this situation? I am sure this must be a common issue for many setups.
Right now, we have a mechanism where DevOps triggers the git pull from the web: a PHP job running as "www-data" creates a temp file, and a cron job running as "ubuntu" reads the temp-file trigger and then issues the git pull command. The time lag between the trigger and the actual pull is a minor irritant for now.
I am also in the process of setting up Docker containers, and need to update the repo running in multiple containers on the same host. I wanted to use this opportunity to solve the problem in a better way, and am looking for advice.
We use Rocketeer and groups to deploy. Rocketeer deploys with the user set to the deployment user (ubuntu in your case) with read/write permission for it, and the www-data group with read/execute permission. Then, as a last step, it modifies the permissions on the web-writable folders so that PHP can write to them.
Rocketeer executes over SSH, so it can be triggered from anywhere, as long as it can connect to the server (public keys help). You might be able to set up your continuous integration / automated deployment to trigger a deploy automatically when a branch is updated or tests pass.
In any case, something where the files are owned by one user that can modify them and the web group can read the files should solve the main issue.
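A minimal sketch of that scheme, assuming the ubuntu/www-data pairing from the question; a temp directory stands in for /var/www/html/app, and the chown (which needs root) is shown commented out:

```shell
#!/bin/sh
# Demo of "one user owns, web group reads, writable dirs opened last".
APP="$(mktemp -d)"        # stands in for /var/www/html/app
mkdir -p "$APP/src" "$APP/storage/cache"
touch "$APP/src/index.php"

# Deploy user owns everything, web group gets read/execute only:
# sudo chown -R ubuntu:www-data "$APP"   # needs root on a real box
chmod -R u=rwX,g=rX,o= "$APP"

# Last step: only the web-writable folders get group write.
chmod -R g+w "$APP/storage"

stat -c '%A' "$APP/src/index.php" "$APP/storage/cache"
```

With this in place the "ubuntu" user can still edit and git-pull freely, while PHP as "www-data" can read everything and write only where it must.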
If you are planning on using Docker, the simplest approach would be to generate a new Docker image for each build that you can distribute to your hosts. The Docker build process would simply pull the latest changes on creation and never update itself. If a new version needs to be deployed, a new immutable image with the latest code is created and distributed.

Set up my DEV environment for my PHP site with SVN

I am creating my first website that I'm taking live soon. It is a big application that I hope goes somewhere eventually, but I will be working on it while it is actually up on the web. I want to do this the right way and set everything up efficiently. It is very difficult to gain knowledge on how to go about this from a book or even the web.
Firstly, I've been doing a lot of research about SVN, and although I am currently the only one working on the site, I would like the ability to add more people to the project eventually. I know that you have to create a repository where your project will actually reside, and that some places, like Codesion, offer hosting for free. I have a good understanding of how it works.
I would like a dev subdomain for the current website that I can test and develop on, and that SVN would commit to. How does your SVN repository communicate with and upload files to that subdomain? Is this even how you would go about doing this? And how would I go about managing the different databases for each domain?
When I am done with several changes for a new rollout, do I just manually upload all the files to the live site? I am very new to this whole process and it is difficult to find information on it. What is the best practice for rolling out updates? Are there any other mechanisms or applications that I should know about for maintaining my website as it grows?
Please, any help in the right direction would be greatly appreciated.
It depends on how much control you have over your Web server.
Most people (IMO) push their files up to the server through something like FTP/SFTP. Many SVN hosting services offer this as a 'deployment' service. You should absolutely use this option if you don't have shell access or anything of that nature.
Personally, I prefer to have finer levels of control and let the Web server do the pulling. On a system with full SSH access, you can just have the server do a checkout of the code:
svn co http://server/project/trunk --username foo --password bar
If you want ease of use (mixed with potential control levels), you can also create a PHP administration class to do this for you.
<?php
echo `svn co http://server/project/trunk --username foo --password bar --non-interactive`;
And often enough, I build a bridged solution. You can create post-commit hooks so that the SVN server reaches out to a specified URL with the commit comment. The client server then checks the comment and performs actions based on it (it might ignore anything without the tag 'AUTODEPLOY', for instance).
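A sketch of that bridged flow, with a hypothetical hook URL and marker tag; the hook side is shown as comments, and only the client-side comment check actually runs here:

```shell
#!/bin/sh
# The SVN-server side, in hooks/post-commit, would be roughly:
#   MSG="$(svnlook log -r "$2" "$1")"
#   curl -s --data-urlencode "comment=$MSG" https://client.example.com/deploy-hook
# (the URL is illustrative, not a real endpoint)

# Client side: only act on comments carrying the marker tag.
handle_comment() {
    case "$1" in
        *AUTODEPLOY*) echo "deploying" ;;   # would kick off svn co here
        *)            echo "ignored"   ;;
    esac
}

handle_comment "fix typo in README"          # -> ignored
handle_comment "release v1.2 AUTODEPLOY"     # -> deploying
```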
There is also Git (which you should be able to do similar things with), but I haven't wanted to blow up my current processes (and hundreds of SVN repos) in order to go there yet.
That is a lot of concepts to cover in one question. First: I host my repo directly on my web server, but I can cover what happens if it's on a remote server as well.
When you set up an SVN repo, you will find a post-commit.tmpl file in its hooks directory. Rename it to post-commit; this is what runs on every commit.
Inside it:
#!/bin/sh
# POST-COMMIT HOOK
# [1] REPOS-PATH (the path to this repository)
# [2] REV (the number of the revision just committed)
REPO="$1"
REV="$2"
if svnlook log -r "$REV" "$REPO" | grep -qF "*DEPLOY-DEV*"
then
rm -rf ~/tmp/path/to/repo
svn export -r "$REV" "file://$REPO" ~/tmp/path/to/repo
rsync -az ~/tmp/path/to/repo/trunk/ ~/path/to/webroot
fi
if svnlook log -r "$REV" "$REPO" | grep -qF "*DEPLOY-LIVE*"
then
rm -rf ~/tmp/path/to/repo
svn export -r "$REV" "file://$REPO" ~/tmp/path/to/repo
rsync -avz -e "ssh -i /path/to/.ssh/rsync" ~/tmp/path/to/repo/trunk/ root@website.com:/path/to/webroot
ssh -i /path/to/.ssh/rsync root@website.com chown -R apache:apache /path/to/webroot/
fi
Now for the important parts: include *DEPLOY-DEV* or *DEPLOY-LIVE* in your commit message to deploy your site to the path you specify.
The DEPLOY-DEV part of the example works nicely when everything is on the same server; you just need to supply the correct paths.
If the server lives somewhere else, it may require SSH keys and other trickery that is a little deeper than I'll get into here, but that should give you the direction you need to set it up.
If you have SSH access to your web server, check out two working copies on the server (in the dev and production document roots).
That way you can easily deploy any changes.
You can use a stable branch that you merge your changes into for production.
Define the database connection in a config ini file (or something similar) that is not under version control. That way you can have different values for dev and prod.
If you don't have SSH access and you want to do it "efficiently", look for a different server with SSH access.

Symfony 2 without SSH access

I have developed a small web app in Symfony 2 and Doctrine 2.
Can I deploy it to a web host that doesn't give SSH access?
I ask because I see there are a lot of tasks that must be done from the terminal, like updating the database schema, creating symlinks for the assets, clearing the cache, etc.
It should not be a problem:
1. Create a copy of the system somewhere, ideally with DB connection params identical to the production system.
2. Run all the necessary tasks with the --env=prod parameter, if your DB settings allow it.
3. Clone the created database to the production system (with phpMyAdmin, for example). To update an existing schema, run app/console doctrine:schema:update --dump-sql locally and then run the generated SQL on the production server.
4. Copy all the files, excluding the dirs in app/cache and app/log.
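The "copy all the files" step can be sketched with tar, which makes it easy to leave out the cache and log contents before uploading over FTP. The project layout below is a minimal stand-in, not a real Symfony tree:

```shell
#!/bin/sh
# Build a minimal fake project tree to demonstrate the exclusions.
PROJECT="$(mktemp -d)"
mkdir -p "$PROJECT/app/cache/prod" "$PROJECT/app/log" "$PROJECT/src" "$PROJECT/web"
touch "$PROJECT/src/Kernel.php" "$PROJECT/app/cache/prod/classes.php"

# Pack everything except the contents of app/cache and app/log; the
# empty directories themselves stay in the archive.
OUT="$(mktemp -u).tar.gz"
tar -C "$PROJECT" --exclude='app/cache/*' --exclude='app/log/*' \
    -czf "$OUT" .

# List what actually went into the archive.
tar -tzf "$OUT"
```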
I have done this many times with SF 1.4, and it should be just as easy with SF 2.
Some low-end hosts have restrictions that will cause issues for Symfony, so it's important to run the Symfony compatibility checker script (you can upload it and then enter its URL in your browser to get the output). Once that's done, follow these simple steps:
1. Copy over all the files for the project. I usually zip/tar the project folder, upload it, and unpack.
2. Export the database from your development environment and import it on your new server.
3. Edit the config and update your database settings. If you have hardcoded paths somewhere in your code, now is the time to fix those as well.
4. Make sure that the user for Apache (or whatever server software your host uses) has full access to the cache and log directories. This can be tricky on some hosts; I have had to contact support in the past to have someone log in and change permissions.
5. In your web host's configuration tool, set the webroot for your site to the web folder in your project.
Maybe there is a way (with SFTP, for example), but it would be like trying to ride a bike with square wheels ;)

How to get started deploying PHP applications from a subversion repository?

I've heard the phrase "deploying applications" which sounds much better/easier/more reliable than uploading individual changed files to a server, but I don't know where to begin.
I have a Zend Framework application that is under version control (in a Subversion repository). How do I go about "deploying" my application? What should I do if I have an "uploads" directory that I don't want to overwrite?
I host my application through a third party, so I don't know much other than FTP. If any of this involves logging into my server, please explain the process.
Automatic deploy + run of tests to a staging server is known as continuous integration. The idea is that if you check in something that breaks the tests, you would get notified right away. For PHP, you might want to look into Xinc or phpUnderControl
You'd generally not want to automatically deploy to production, though. The normal thing to do is to write some scripts that automate the task but that you still initiate manually. You can use frameworks such as Phing or other build tools for this (a popular choice is Capistrano), but you can also just whisk a few shell scripts together. Personally I prefer the latter.
The scripts themselves could do different things, depending on your application and setup, but a typical process would be:
1. ssh to the production server; the rest of the commands are run there, through SSH.
2. Run svn export svn://path/to/repository/tags/RELEASE_VERSION /usr/local/application/releases/TIMESTAMP
3. Stop services (Apache, daemons).
4. Run unlink /usr/local/application/current && ln -s /usr/local/application/releases/TIMESTAMP /usr/local/application/current
5. Run ln -s /usr/local/application/var /usr/local/application/releases/TIMESTAMP/var
6. Run /usr/local/application/current/scripts/migrate.php
7. Start services.
(Assuming you have your application in /usr/local/application/current)
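The symlink juggling in the middle steps can be sketched locally like this; a temp directory stands in for /usr/local/application, and a plain echo stands in for the svn export of the release tag:

```shell
#!/bin/sh
# Demo of the release-dir + symlink swap + shared var/ pattern.
BASE="$(mktemp -d)"                 # stands in for /usr/local/application
TS="$(date +%Y%m%d%H%M%S)"
mkdir -p "$BASE/releases/$TS"
echo "RELEASE_VERSION" > "$BASE/releases/$TS/VERSION"   # fake svn export

# Repoint "current" at the new release (ln -sfn replaces the
# unlink + ln -s pair from the steps in one command).
ln -sfn "$BASE/releases/$TS" "$BASE/current"

# Shared state (var/) lives outside the release and is linked in, so
# it survives from one deploy to the next.
mkdir -p "$BASE/var"
ln -s "$BASE/var" "$BASE/releases/$TS/var"

cat "$BASE/current/VERSION"
```

The payoff of this layout is that rolling back is just repointing the current symlink at the previous timestamped directory.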
I wouldn't recommend automatic updating. Just because your unit tests pass doesn't mean your application is 100% working. What if someone checks in a random new feature without any new unit tests, and the feature doesn't work? Your existing unit tests might pass, but the feature could be broken anyway. Your users might see something that's half-done. With automatic deployment from a check-in, you might not notice for a few hours if something made it live that shouldn't have.
Anyhow, it wouldn't be that difficult to get an automatic deployment going if you really wanted. You'd need a post-check-in hook, and really the steps would be:
1) Do an export from the latest check-in
2) Upload export to production server
3) Unpack/config the newly uploaded export
I've always performed the last steps manually. Generally it's as simple as svn export, zip, upload, unzip, configure; for the last two steps I just alias a couple of bash commands together. Then I swap out the root app directory with the new one, making sure I keep the old one around as a backup, and it's good to go.
If you're confident in your ability to catch errors before they'd automatically go live, then you could look at automating that procedure. It gives me the jibbly-jibblies though.
At my webdev company we recently started using Webistrano, which is a Web GUI to the popular Capistrano tool.
We wanted an easy to use, fast deployment tool with a centralized interface, accountability (who deployed which version), rollback to previous versions and preferably free. Capistrano is well-known as a deployment tool for Ruby on Rails applications, but not centralized and targeted mainly to Rails apps. Webistrano enhances it with a GUI, accountability, and adds basic support for PHP deployment (use the 'pure file' project type).
Webistrano is itself a Ruby on Rails app, that you install on a development or staging server. You add a project for each of your websites. To each project you add stages, such as Prod and Dev.
Each stage can have different servers to deploy to, and different settings. You write (or modify) a 'recipe', which is a Ruby script that tells Capistrano what to do. In our case I just used the supplied recipe and added a command to create a symlink to a shared uploads dir, just as you mentioned.
When you click Deploy, Webistrano SSHs into your remote server(s), does an svn checkout of the code, and any other tasks that you require such as database migrations, symlinking or cleanup of previous versions. All this can be tweaked of course, after all, it's simply scripted.
We're very happy with it, but it took me a few days to learn and set up, especially since I wasn't familiar with Ruby and Rails. Still, I can highly recommend it for production use in small and medium companies, since it's proven very reliable, flexible and has saved us many times the initial investment. Not only by speeding up deployments, but also by reducing mistakes/accidents.
This sort of thing is what you would call "continuous integration". Atlassian Bamboo (commercial), Sun's Hudson (free) and CruiseControl (free) are all popular options (in order of my preference) and can handle PHPUnit output (because PHPUnit supports JUnit output).
The deployment part can be done with a post-build trigger. Like some other people in this thread, I would exercise great caution before doing automated deployments on check-in (and tests passing).
Check out Fredistrano, a Capistrano clone. It works great (the installation is a little confusing, but after that it runs fine):
http://code.google.com/p/fredistrano/
To handle uploads, the classic solution is to move the actual directory out of the main webspace, leaving only a fresh version to be checked out (as I do in the script below), and then using Apache to 'Alias' it back into place as part of the website.
Alias /uploads /home/user/uploads/
There are fewer choices available to you if you don't have that much control of the server, however.
I've got a script I use to deploy a given site to the dev/live environments (they both run on the same server).
#!/bin/sh
REV=2410
REVDIR=$REV.20090602-1027
REPOSITORY=svn+ssh://topbit@svn.example.com/var/svn/website.com/trunk
IMAGES=$REVDIR/php/i
STATIC1=$REVDIR/anothersite.co.uk
svn export --revision $REV $REPOSITORY $REVDIR
mkdir -p $REVDIR/tmp/templates_c
chown -R username: $REVDIR
chmod -R 777 $REVDIR/tmp $REVDIR/php/cache/
chown -R nobody: $REVDIR/tmp $REVDIR/php/cache/ $IMAGES
dos2unix $REVDIR/bin/*sh $REVDIR/bin/*php
chmod 755 $REVDIR/bin/*sh $REVDIR/bin/*php
# chmod -x all the non-directories in images
find $IMAGES -type f -perm /a+x | xargs -r chmod --quiet -x
find $STATIC1 -type f -perm /a+x | xargs -r chmod --quiet -x
ls -l $IMAGES/* | grep -- "-x"
rm dev && ln -s $REVDIR dev
I put the revision number, and the date/time, into the name of the checked-out directory. The chmods in the middle also make sure the permissions on the images are OK, as they are symlinked to our dedicated image server.
The last thing that happens is an old symlink .../website/dev/ is relinked to the newly checked out directory. The Apache config then has a doc-root of .../website/dev/htdocs/
There's also a matching .../website/live/htdocs/ docroot, and again, 'live' is another symlink. This is my other script that will remove the live symlink, and replace it with whatever dev points to.
#!/bin/sh
# remove live, and copy the dir pointed to by dev, to be the live symlink
rm live && cp -d dev live
I'm only pushing a new version of the site every few days, so you might not want to use this several times a day (my APC cache wouldn't like more than a few versions of the site around), but I find this to be very much problem-free for my own deployment.
After 3 years, I've learned a bit about deployment best practices. I currently use a tool called Capistrano because it's easy to set up and use, and it nicely handles a lot of defaults.
The basics of an automated deployment process goes like this:
Your code is ready for production, so it is tagged with the version of the release: v1.0.0
Assuming you've already configured your deployment script, you run your script, specifying the tag you just created.
The script SSH's over to your production server which has the following directory structure:
/your-application
    /shared/
        /logs
        /uploads
    /releases/
        /20120917120000
        /20120918120000   <-- latest release of your app
            /app
            /config
            /public
            ...etc
    /current --> symlink to latest release
Your Apache document root should be set to /your-application/current/public
The script creates a new directory in the releases directory with the current datetime. Inside that directory, your code is updated to the tag you specified.
Then the original symlink is removed and a new symlink is created, pointing to the latest release.
Things that need to be kept between releases go in the shared directory, and symlinks are created from each release to those shared directories.
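The shared-directory linking can be sketched like this, with a temp directory standing in for the real application root from the layout above:

```shell
#!/bin/sh
# Demo: link shared logs/uploads into a release, then flip "current".
APP="$(mktemp -d)/your-application"
REL="$APP/releases/20120918120000"
mkdir -p "$APP/shared/logs" "$APP/shared/uploads" "$REL/public"

# Link the kept-between-releases dirs into the new release...
ln -s "$APP/shared/logs"    "$REL/logs"
ln -s "$APP/shared/uploads" "$REL/public/uploads"

# ...then point "current" at it (ln -sfn replaces an existing link).
ln -sfn "$REL" "$APP/current"

# An upload written via the release path lands in shared/ and
# survives the next deploy.
touch "$APP/current/public/uploads/avatar.png"
ls "$APP/shared/uploads"
```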
It depends on your application and how solid the tests are.
Where I work everything gets checked into the repository for review and is then released.
Auto-updating out of a repository wouldn't be smart for us, as sometimes we check in just so that other developers can pull a later version and merge their changes in.
To do what you are talking about would need some sort of secondary check-in and check-out to allow for collaboration between developers in the primary check-in area, although I don't know anything about that or whether it's even possible.
There are also issues with branching and similar features that would need to be handled.
