I'm going to copy my web-application code to the production server with git pull. First I make an SSHFS mount to the remote server, and then run git pull in the right directory.
If I need to skip production configs etc., I can use .gitignore.
Very clean and effective so far (compared to manually dragging all changed files from folder to folder)!
But what if I have different directories on the remote server? E.g. my development localhost has ~/app/ and ~/app/webroot/, but the production server has ~/app/ and ~/public_html/.
How can I solve this kind of problem?
The generic answer is: Git is not a deployment tool.
Write a deployment script, and use that script.
You can version the script with your project in your repo, of course.
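For instance, a minimal deployment script for the layout in the question might look like this (a sketch only; the paths, branch name, and rsync usage are assumptions to adapt to your setup):
#!/bin/sh
# deploy.sh - pull the repo, then copy the webroot to where production serves it
set -e
APP_DIR="$HOME/app"              # repo checkout on the production server
WEBROOT_SRC="$APP_DIR/webroot"   # webroot inside the repo
WEBROOT_DST="$HOME/public_html"  # directory production actually serves
cd "$APP_DIR"
git pull origin master
# copy the webroot into place, leaving VCS metadata behind
rsync -a --delete --exclude='.git' "$WEBROOT_SRC/" "$WEBROOT_DST/"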
Env: Linux
The PHP app runs as "www-data".
PHP files in /var/www/html/app are owned by "ubuntu". Source files are pulled from a git repository; /var/www/html/app is the local git repository (origin: Bitbucket).
Issue: Our developers and DevOps would like to pull the latest sources frequently, and would like to initiate this over the web (rather than opening PuTTY and running the git pull command).
However, since the PHP code runs as "www-data", it cannot run a git pull (the files are owned by "ubuntu").
I am not comfortable with either alternative:
Running the Apache server as "ubuntu", due to the obvious security issues.
Making the git repository files owned by "www-data", as it makes it very inconvenient for developers logging into the server and editing files directly.
What is the best practice for handling this situation? I am sure this must be a common issue for many setups.
Right now, we have a mechanism where DevOps triggers the git pull from the web: a PHP job running as "www-data" creates a temp file, and a cron job running as "ubuntu" reads the temp-file trigger and then issues the git pull command. The time lag between the trigger and the actual git pull is a minor irritant for now. I am in the process of setting up Docker containers, and have the requirement to update the repo running in multiple containers on the same host. I wanted to use this opportunity to solve this problem in a better way, and I am looking for advice.
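For reference, a minimal sketch of that trigger mechanism (file names are illustrative; the PHP job running as "www-data" only needs to create the flag file, e.g. touch('/var/tmp/app-pull-requested')):
#!/bin/sh
# pull-on-trigger.sh - run from ubuntu's crontab, e.g.:
#   * * * * * /home/ubuntu/pull-on-trigger.sh
FLAG=/var/tmp/app-pull-requested
REPO=/var/www/html/app
if [ -f "$FLAG" ]; then
    rm -f "$FLAG"                # consume the trigger
    cd "$REPO" && git pull       # runs as ubuntu, so file ownership is preserved
fi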
We use Rocketeer and groups to deploy. Rocketeer deploys with the user set to the deployment user (ubuntu in your case), with read/write permission for it and read/execute permission for the www-data group. Then, as a last step, it modifies the permissions on the web-writable folders so that PHP can write to them.
Rocketeer executes over SSH, so it can be triggered from anywhere, as long as it can connect to the server (public keys help). You might be able to set up your continuous integration/automated deployment to trigger a deploy automatically when a branch is updated/tests pass.
In any case, a setup where the files are owned by one user who can modify them, while the web group can read them, should solve the main issue.
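For illustration, that ownership scheme could be set up with something like this (paths and users taken from the question; the cache path is an assumption standing in for wherever PHP actually needs write access):
# owner "ubuntu" can modify, group "www-data" can read/traverse
chown -R ubuntu:www-data /var/www/html/app
find /var/www/html/app -type d -exec chmod 750 {} \;
find /var/www/html/app -type f -exec chmod 640 {} \;
# give the web group write access only where PHP writes
chmod -R g+w /var/www/html/app/cache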
If you are planning on using Docker, the simplest way would be to generate a new Docker image for each build that you can distribute to your hosts. The build would simply pull the latest changes at image-creation time and never update itself afterwards. If a new version needs to be deployed, a new immutable image with the latest code is created and distributed.
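A rough sketch of that build flow (the repository URL, tag scheme, and Dockerfile contents are all assumptions):
#!/bin/sh
# build.sh - build one immutable image per release
set -e
git clone --depth 1 git@bitbucket.org:yourteam/app.git build-context
cd build-context
# A Dockerfile in the repo can be as simple as:
#   FROM php:7.4-apache
#   COPY . /var/www/html/
#   RUN chown -R www-data:www-data /var/www/html
TAG="app:$(git rev-parse --short HEAD)"
docker build -t "$TAG" .
# distribute $TAG to your hosts instead of running git pull inside containers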
I am looking to set up a remote repository on a server that I can push changes to a website to. However, I also want the server to have a working directory, or at least to have the development version of the website running with the latest changes to the repository.
Do I need to create a local branch on the server to checkout the repository files every time I push changes? Or is there a better way?
I highly recommend you use Capistrano. It is a tool that makes it very easy to deploy a website. It would handle creating a local git repository on your server and checking out the latest version when you run the deploy script. It makes it very easy to deploy to multiple environments (development, staging, production) and it also makes it easy to rollback to previous versions if you discover a bug. All of this is done from your local computer, you would not have to log in to the server after having everything setup.
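Typical usage looks like this (Capistrano is a Ruby gem; the commands below are its standard CLI, but check the docs for your version):
gem install capistrano
cap install                      # generates Capfile and per-stage config files
cap production deploy            # check out the latest version on the server
cap production deploy:rollback   # roll back if you discover a bug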
I am currently developing a medium-sized web application using PHP and need to use some kind of version control. I also need to have the master copy of the code running in the Apache document root so I can test the app while it is in development. Does anyone have any suggestions?
Thanks,
RayQuang
You can't go wrong with Git; everything you need to know is here: http://progit.org/book/
Yeah, you should definitely use version control (Git or Subversion).
Here's a short explanation of how I use it in my web projects (I am using SVN):
I have an SVN project which I have checked out on my local machine and on the web server.
Whenever you change something, you can commit your current running version.
Log into the server (could also be multiple servers) and do an svn update, so the newest code gets deployed on the machine automatically. The only thing you have to do afterwards is restart the web server.
Note:
Take care what you commit. You may have a different database configuration file on your local machine than on your server; you can put it on the svn:ignore list (Git has something similar with .gitignore). See the example after this list.
It also makes it easy for multiple people to work on the same project.
Don't commit logfiles
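For example (the config file name is just an illustration):
# locally: keep the machine-specific config file out of version control
svn propset svn:ignore "config.local.php" .
svn commit -m "Ignore local database config"
# on the server, deployment is then just:
svn update
# ...followed by a web server restart, e.g.:
sudo apachectl restart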
Links:
Git: http://git-scm.com/
Subversion: http://subversion.tigris.org/
I'd recommend Mercurial for its ease of use, and because it keeps the working copy uncluttered: all versioning information is kept in a single .hg folder. I'd do it like this:
Set up a Mercurial repository at the server (hg init)
Do a hg clone of that repository to where you want your working copy
Work away!
When you want to test on the server, do a hg commit and hg push to move the changed files to the server
Run hg update on the server, or add
[hooks]
changegroup = hg update >&2
to the .hg/hgrc file (create it if it doesn't exist) on the server to have it automatically update.
For more info, you can also check out: http://hginit.com/
I have worked at a web development company where we had our local machines, a staging server, and a number of production servers. We worked on Macs in Perl and used SVN to commit to stage, and Perl scripts to load to the production servers. Now I am working on my own project and would like to find good practices for web development when using shared web hosting and not working from a Unix-based environment (with all the magic I could do with Perl/bash scripting, cron jobs, etc.).
So my question is: given my conditions, which are:
I am using a single standard shared web host from an external provider (with SSH access).
I am working with at least one other person and intend to use SVN for source control.
I am developing PHP/MySQL under Windows (but using Linux is a possibility).
What setup do you suggest for testing, deployment, and migration of code/data? I have an XAMPP server installed on my local machine, but was unsure which methods to use to migrate data etc. under Windows.
I have some PHP personal projects on shared hosting; here are a couple of thoughts, based on what I'm doing for one of them (the most active one, which needs at least a semi-automated way of synchronizing):
A few words about my setup:
Some time ago, I had everything in SVN; now, I'm using Bazaar. But the idea is exactly the same (except that, with Bazaar, I have local history and all that).
I have SSH access to the production server, like you do.
I work exclusively on Linux (so what I do might not be as easy on Windows).
Now, how I work:
Everything that has to be on the production server (source code, images, ...) is committed to SVN/Bazaar/whatever.
I work locally, with Apache/PHP/MySQL (I use a dump of the production DB that I import locally once in a while).
I am the only one working on that project; it would probably be OK for a small team of 2-3 developers, but not more.
What I did before:
I had a PHP script that checked the SVN server for modifications between "the last revision pushed to production" and HEAD.
I'm guessing this homemade PHP script looks like the Perl script you are currently using ^^
That script built a list of directories/files to upload to production,
and uploaded those via FTP access.
This was not very satisfying (there were bugs in my script, I suppose; I never took the time to correct them), and it forced me to remember the revision number from the last time I pushed to production (well, it was automatically stored in a file by the script, so not that hard ^^).
What I do now:
When switching to Bazaar, I didn't want to rewrite that script, which didn't work very well anyway,
so I have dropped the script totally.
As I have SSH access to the production server, I use rsync to synchronize from my development machine to the production server, whenever what I have locally is considered stable/production-ready.
A couple of notes about that way of doing things:
I don't have a staging server: my local setup is close enough to the production one.
Not having a staging server is OK for a simple project with one or two developers.
If I had a staging server, I'd probably go with:
do an "svn update" on it when you want to stage;
when it is OK, launch the rsync command from the staging server (which will be at the latest "stable" revision, so OK to push to production).
With a bigger project and more developers, I would probably not go with that kind of setup, but I find it quite OK for a (not too big) personal project.
The only thing "special" here, which might be "Linux-oriented", is using rsync; a quick search seems to indicate there is an rsync executable that can be installed on Windows: http://www.itefix.no/i2/node/10650
I've never tried it, though.
As a side note, here's what my rsync command looks like:
rsync --checksum \
--ignore-times \
--human-readable \
--progress \
--itemize-changes \
--archive \
--recursive \
--update \
--verbose \
--executability \
--delay-updates \
--compress --skip-compress=gz/zip/z/rpm/deb/iso/bz2/t[gb]z/7z/mp[34]/mov/avi/ogg/jpg/jpeg/png/gif \
--exclude-from=/SOME_LOCAL_PATH/ignore-rsync.txt \
/LOCAL_PATH/ \
USER@HOST:/REMOTE_PATH/
I'm using the private/public key mechanism, so rsync doesn't ask for a password, btw.
And, of course, I generally run the same command in "dry-run" mode first, with the "--dry-run" option, to see what is going to be synchronized.
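That is, something like this (same placeholders as above):
rsync --dry-run --itemize-changes --archive \
      --exclude-from=/SOME_LOCAL_PATH/ignore-rsync.txt \
      /LOCAL_PATH/ \
      USER@HOST:/REMOTE_PATH/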
And ignore-rsync.txt contains the list of files that I don't want pushed to production:
.svn
cache/cbfeed/*
cache/cbtpl/*
cache/dcstaticcache/*
cache/delicious.cache.html
cache/versions/*
Here, I just prevent cache directories from being pushed to production; it seems logical not to send those, as production data is not the same as development data.
(I'm just noticing there's still ".svn" in this file... I could remove it, as I don't use SVN anymore for that project ^^)
Hope this helps a bit...
Regarding SVN, I would suggest you go with a dedicated SVN host like Beanstalk, or run an SVN server on the same server machine so both developers can work off it.
In the latter case, your deployment script would simply move the bits to a staging web folder (accessible via beta.mysite.com), and then another deployment script could move that to the live web directory. Deploying directly to the live site is obviously not a good idea.
If you decide to go with a dedicated host, or want to deploy from your machine to the server, use rsync. This is also my current setup. rsync does differential syncs (over SSH), so it's fast, and it was built for just this sort of thing.
As you grow you can start using build tools with unit tests and whatnot. This leaves only the data sync issue.
I only sync data from remote -> local, using a DOS batch file that does this over SSH with mysqldump. Cygwin is useful on Windows machines, but you can skip it. The SQL import script also runs a one-line query to update some cells, such as the hostname and web root, for local deployment.
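A shell equivalent of that sync might look like this (the host, database names, and the settings table/columns are all assumptions; the batch file does the same thing via Cygwin/SSH, and MySQL credentials are assumed to be configured):
# dump the remote database and load it locally
ssh user@example.com "mysqldump --single-transaction remote_db" > dump.sql
mysql local_db < dump.sql
# one-line fix-up for local settings (hypothetical table/columns)
mysql local_db -e "UPDATE settings SET value='http://localhost/' WHERE name='web_root';"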
Once you have this set up, you can focus on just writing code, and remote deployment or local sync and deployment becomes a one-click process.
One option is to use a dedicated framework for the task. Capistrano fits very well with scripting languages such as PHP. It's based on Ruby, but if you do a search, you should be able to find instructions on how to use it for deploying PHP applications.
I've worked on several PHP projects, and I always have problems organizing my work. Where do you develop your application: on localhost, a remote server, or maybe the production one (!)?
When I work on my localhost, after making some major patch I send the new files by FTP; but sometimes I forget about one file, and it is just tiring to browse several directories to copy the changed files.
What best practices do you propose?
This is how we manage our commercial site:
1. Develop in a local sandbox
2. Check in to SVN
3. Automated build/test from SVN onto an internal dev server
4. Scripted deploy using rsync to a staging server for QA/UAT
5. Scripted deploy onto the production servers
The staging and production servers are both hosted by the ISP, are hardware- and version-matched, and run RHEL; the internal dev server is version-matched CentOS.
This way, when code hits the production servers there shouldn't be any nasty surprises, as even the deploy scripts used in stage 4 are checked in.
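As a sketch, the stage 4 deploy script can be as small as this (the paths and host are assumptions):
#!/bin/sh
# deploy-staging.sh - push the tested build to staging
set -e
svn update /srv/build/app
rsync -a --delete --exclude='.svn' /srv/build/app/ deploy@staging.example.com:/var/www/app/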
Google App Engine has a tool that automatically uploads modified files to the production environment; I don't know if there's something similar for PHP.
So writing a dev2prod script (a script that does this automatically) should be a good solution.
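Such a dev2prod script can be a one-liner if you have SSH access (the paths and host are hypothetical):
#!/bin/sh
# dev2prod.sh - push everything that changed to production
rsync -az --delete --exclude='.git' ~/projects/app/ user@prod.example.com:~/public_html/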
For local source file management, the usual suggestion is to use a source code control system.
I'm developing on a development machine that has an environment identical to the production one, which prevents differing behaviour due to different versions or configs. After finishing, I just move all the files onto the production server.
WinMerge is a nice, free Windows tool for diffing your files between the development and production machines.
Develop on your local machine with the exact same configuration as your production environment (that is, Apache mods, PHP extensions, and so on), using a version control system (I prefer SVN) to keep track of modified files and whatnot.
Then you can write a script on your production or testing environment to copy/update files from the repository to the web server path.
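For example (the repository URL and webroot are assumptions; --force lets svn export overwrite files in an existing directory):
svn export --force http://svn.example.com/app/trunk /var/www/html/app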
I'll probably get told off for resurrecting an old post, but here's how I do it using free tools:
I use NetBeans, Git, Bitbucket, SourceTree, git-flow, and FTPloy.
Bitbucket.com: sign up for a free account.
SourceTree: free from Bitbucket. A great tool for managing Git repositories. All commits, merges, and pulls can be done in here, and Bitbucket issues can be tracked.
In SourceTree, take the master branch and click "Git Flow" (Google "git-flow"); it's a fantastic workflow of feature, hotfix, development, and release branches, and SourceTree helps automate the process.
FTPloy.com automates the deployment of your master branch. The free tier is one repo/server, but it's worth the upgrade if you want to push a development branch to a server sandbox for testing.
Hope this helps someone searching the web!
This is my own PHP development lifecycle:
Create a private Git repository on GitHub or Bitbucket.
Connect the Git repository to cPanel (it's very easy to push changes to the repo as well as to the production server); see the sample deployment file after this list.
Develop on localhost (probably XAMPP on Windows) with Visual Studio Code or Sublime Text.
Initialise Git locally and connect it to the private repository.
Push the changes to the Git repository (master/branches); changes on master are uploaded to the production server automatically.
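For reference, cPanel's Git deployment is driven by a .cpanel.yml file at the repository root; a minimal sketch might look like this (the deploy path is an assumption; check cPanel's Git deployment docs for your setup):
---
deployment:
  tasks:
    - export DEPLOYPATH=/home/youruser/public_html
    - /bin/cp -R * $DEPLOYPATH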