Before I begin, I know there are a lot of questions similar to this one, but I am really having difficulty finding a concise, secure best practice, since the feedback on them has varied so widely.
What I want to do:
Complete work on my local machine on a development branch.
Push changes to GitHub. GitHub posts to a webhook URL and automatically has my remote server pull the changes into a development site.
Once QA'd and confirmed to be proper on the development site, push the master branch to the production site (on the same server as the development site).
Where I am at:
I have git installed on my local machine and the remote server. I can push mods on the development branch to the remote repository. On the remote server, I can pull the updates and it works like a charm. The problem is that I cannot get the remote server to update automatically when changes are pushed from my local machine.
My questions are:
For the remote server development site directory, should I git init or git init --bare? I don't plan on having updates made on the server itself. I would like my dev team to work locally and push mods to the server. I believe I need to use git init, since a working tree is needed to set up a remote alias to the git repository, but I wanted to confirm.
I am pretty sure the webhook POST issue is due to user privileges. How can I safely get around this? I have read many tutorials that suggest updating git hook files, but that feels like a more drastic measure than I need to take. I would love to have the webhook hit a URL that safely pulls the files without adding a boatload of code (if that is possible).
I am a web developer by nature, so git and sysadmin tasks are generally the bane of my existence. Again, I know this question is similar to others, but I have yet to find a comprehensive, concise, secure, and most logical approach to resolving the issue. I am about 16 hours in and have officially hit the "going in circles with no progress" point.
You can do this quite easily with GitHub service hooks.
You'll need to create one more file that will handle the process of performing the git pull. Add a new file, called github.php (or anything you wish), and add:
<?php shell_exec('git pull'); // runs `git pull` as the web-server user, which needs pull access to the repo
Save that file and upload it to the repository directory on your server. Then, in your repository's admin area on GitHub, go to Service Hooks -> Post-Receive URL and paste that file's URL into the "Post-Receive URL" field, e.g. http://demo.test.com/myfolder/github.php
So, when you push, GitHub will automatically visit this URL, thus causing your server to perform a git pull.
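If the pull fails when GitHub hits the URL, it is usually a permissions problem, because the pull runs as the web-server user. A minimal sketch of one way to fix that, assuming Apache runs as www-data and the dev site lives in /var/www/dev (both names are assumptions):

# Give the web-server user ownership of the working tree so it can pull:
sudo chown -R www-data:www-data /var/www/dev
# Test the pull as that user before relying on the hook:
cd /var/www/dev && sudo -u www-data git pull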
For more details, see this tutorial.
I had the exact same issue: strong in code and development skills, weak in sysadmin skills. When I was finally ready to push code, I had to ask a GitHub rep what their suggested method was, and they responded with Capistrano. It's a Ruby application that runs commands (such as git pull) on remote servers, along with pretty much any other command you can imagine.
Here are a few articles you can read to get more info:
GitHub - Deploy with Capistrano
How to compile the Capistrano stack on your *nix system
Another example of how to deploy code with Capistrano
Not going to lie, the learning curve was pretty steep, but once you start working with Capistrano you will see that it works well for pushing code. I develop my applications in Symfony, and I have my Capistrano setup configured to pull code, clear the cache, and clear log files, all in one command from my local machine.
Related
At the moment I am using a poor method for doing web development at home and at work.
I use WAMP for testing/development and then I upload to a production web server (Linux) via FTP.
If I continue with the project at home, I have to download the files from FTP.
What is a good method for working on the same projects at multiple locations?
Someone suggested I learn Git and get a GitHub private account. They also suggested installing Vagrant at work and at home. Do I need to install Git in the Vagrant VM or on my local machine?
Learn git: http://try.github.io
Create a Vagrant/VirtualBox VM by following the directions at https://puphpet.com
One of the tricks here is to put the Vagrant stuff you get from Puphpet directly in your project and then commit all of it to git. You'll then be able to check out the project in a new environment and, as long as Vagrant and VirtualBox are installed, you can run vagrant up and be working in about 5 mins.
Here's an example of how I'm doing just that to allow people to easily try out a library I've written: https://github.com/jeremykendall/query-auth-impl.
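For instance, trying out that example repository is just (assuming Vagrant and VirtualBox are already installed):

git clone https://github.com/jeremykendall/query-auth-impl.git
cd query-auth-impl
vagrant up   # builds and provisions the VM; the first run takes a few minutes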
Enjoy! Your life as a developer is about to get a lot easier and a whole lot better.
GitHub or Bitbucket. Git or Mercurial, or even SVN if it's just for yourself and you want an easier learning curve.
Any source control system would be ideal for this.
You don't want your production server to be the source of truth for the actual code; those two concerns should definitely be separated. The production application is the output of the code, not the code itself. For a language like PHP the two may be identical in form, but the concerns should still be kept separate. Indeed, for small systems the two services may even be hosted on the same server, but they should still be logically separated.
The source control system maintains the changes made to the code over time, the production server is a snapshot of the current release version of the code.
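One way to make that distinction concrete in Git terms (the tag name and host are illustrative): the repository keeps the full history, while production only ever receives an exported snapshot of a release:

git tag v1.2                    # mark the release in the repository's history
git archive v1.2 | ssh user@prod.example.com 'tar -x -C /var/www/site'   # ship a snapshot; no .git on prod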
I have decided that it's time for me to start using Git on a PHP project that I have been developing casually for over a decade. (Please, no lectures from the version control police!) Due to the complex setup required on my VPS to do everything the project needs (esp. single-codebase-multiple-client structure and a Japanese-capable installation of TeX to create specialty PDFs), it is not possible to set up a development environment on my local Windows box. But I do have a testbed area on the server that I can play in, so it's my development area.

Currently I use Filezilla to access the server and open files directly into Notepad++, and when I'm ready to see my edit in action, I just save and let Filezilla upload. When everything looks good on the testbed, I copy the files to the production codebase area. Yeah, that gives me no history of my changes other than my own comments, and I have to be careful not to mix bug fixes with half-finished new features. I can see the value of Git's branches for different upgrades in progress.
Yesterday I got my toes wet. First I created a GitHub account, and then (at the recommendation of a tutorial) installed Git for Windows (with its own Bash and tiny-looking GUI) and KDiff3, and followed some instructions for configuring Git Bash. After all that, though, I ended up having to install something else in order to interface with my GitHub account (appropriately named GitHub for Windows), which seems to do all the stuff the other two programs were supposed to do for me.

Anyway, then I did a simple task as my first foray into the GitHub world - I had added functionality to someone else's jQuery plugin and wanted to share it with the developer, so I forked his repo, cloned it to my machine, overwrote the file I had previously edited and tested, synced to my GitHub account, and sent a pull request. All the terminology in that last sentence was brand new to me, so I was pretty proud of myself that I got that far. ;) But I guess I only needed the GitHub software, not the Git software - it's hard to know which tutorials to believe.
Anyway, now I want to figure out a workflow for my own stuff, which is my actual question for you guys. From what I can tell, having the master repo anywhere but the public GitHub costs money, and I don't care if others see my code (I don't expect anyone else to work on my oddball project made of spaghetti code, but if they want to, that's great). Okay, but then what? Perhaps one of these scenarios, or something else:
1. Clone branches of the repo to my PC, do edits on the local files, and upload them in Filezilla for testing (a couple more clicks than my current workflow because Filezilla doesn't automatically see the relationship between the local file and the remote file, but not a big deal). Then when I'm happy with the code, commit locally, sync to GitHub, and copy the files (from somewhere - not sure on this point) to the production area.
2. Install the Linux flavor of Git on my VPS so that the "local" Git file location is the testbed, and use Git through PuTTY to do the local commits. Simpler for file structure (no need for a copy on my PC at all) but more cumbersome to use Git:
- I'm not on PuTTY very frequently, and for some reason the connection often dies on me and I have to restart.
- Even though the Linux command line is Git's native habitat, I am probably more comfortable with a GUI (because I forget command syntax quickly - old brain, I guess).
- Also, since I never ended up using the Git program I installed here, I'm not sure whether it would be Git or GitHub I would be using on the server.
3. Some other scenario, since neither #1 nor #2 uses Git/GitHub to manage the production file area at all, which would probably be a good idea so that I don't forget to copy everything I need.
I tried to research the possibility of a PHP-based GUI to go with idea #2 (so I don't have to use PuTTY for day-to-day operations), but the discussions of such tools all seem to assume either that you are trying to create your own GitHub service, or that the "local" cloned repo is physically on your local PC (with xAMP running on whatever OS it is). But maybe the GitHub software I used is enough to do all that - it's hard to tell. I don't yet understand the interplay between a master public repo on GitHub, branches somewhere (on GitHub also?), at least two sets of files on my web server (the testbed and the production area), GitHub software, Git software, and the keyboard/screen of the computer I'm sitting at.
So pardon my newbie ramblings, but if someone out there has a similar development situation: what's your workflow? Or what would you suggest for me?
Here's one way to approach the issue:
You will need three repositories:
a local repo to edit code. [1]
a bare remote repository on your server. This will be in a location that is not publicly viewable, but that you can ssh in to. [2]
The production environment. [3]
Here's the implementation:
workstation$ cd localWorkingDirectory/
workstation$ git init
workstation$ git add .
workstation$ git commit -m 'initial commit'
workstation$ ssh login@myserver
myserver$ mkdir myrepo.git
myserver$ cd myrepo.git
myserver$ git init --bare
myserver$ exit
workstation$ cd localWorkingDirectory/
workstation$ git remote add origin login@myserver:myrepo.git
workstation$ git push origin master
every time you make a commit on any branch, back it up with:
workstation$ git push origin BRANCH
When you are ready to move branch version2 into production, do this:
workstation$ git push origin version2
workstation$ ssh login@myserver
myserver$ git clone path/to/myrepo.git productionDirectory
myserver$ cd productionDirectory
myserver$ git checkout version2
Oh no! It doesn't work! Better switch back to version1!
workstation$ ssh login@myserver
myserver$ cd productionDirectory
myserver$ git checkout version1
You don't need GitHub (or any other central store) to start using git, especially since you're a lone developer. Git runs directly on your own machine, without any server component (unlike, for example, Subversion). Just git init and start committing away.
I agree with the other commenters here that you should aim to get a local development environment up and running. Even if it takes some effort, it's certainly worth it. One of the side effects of doing so may be that you are forced to decouple some of your current hard dependencies, thereby getting a better overall application architecture out of it. The things that can't easily be replicated in your development environment could be replaced with mock services instead.
Once that is in place, look into a scripted deployment process, e.g. a shell script that syncs your development machine's codebase with the production server. There are many ways to do this, but I suggest you start really simple, then review your options (Capistrano is one).
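As a minimal sketch of such a script, assuming SSH access to the server (the host and paths here are hypothetical and would need adjusting):

#!/bin/sh
# One-way deploy: mirror the local codebase to the production docroot.
# --delete removes remote files that no longer exist locally; .git stays home.
rsync -avz --delete --exclude '.git' ./ deploy@example.com:/var/www/mysite/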
I'd certainly look at something like Capistrano for your current development setup.
I can understand why you might be reticent to use the terminal, but it would probably help your understanding of git in context. It doesn't take long to get to grips with the commands, and when they're tied into a system such as Capistrano you'll be rocking development code up to your environment in no time:
git commit -a
git push origin develop
cap deploy:dev
When I'm working on Windows, I generally try to replicate the deployment environment locally with virtual machines, using something like Sun's VirtualBox. That way you can minimise potential environment issues while still developing locally. Then you can just use PuTTY to ssh to your local VM. Set up sharing between the VM and your host OS and all your standard IDEs/editors will work too. I find this preferable to having to set up a VPS remotely, but whatever works.
I have just implemented Git version control for a team of three people. We are managing to pull/push/fetch etc. to our Bitbucket repository. My only concern is how I should push to the live site after we make an update to the repository.
On another server, I wrote a shell script to clone the repo into a directory so I would be able to test, but on the live server I cannot do the same, since on some occasions I have to push just one file.
Please note that our live server is inside a network and not accessible over the internet (thus I cannot use BitBucket's push service).
The only advice I can give is to make sure that the local repository (which concentrates changes from the developers) is up to date and in shape before pushing.
Locally, you can define your own workflow. Perhaps a master repo into which only official commits are taken, branches for each developer's work once it passes local tests, and each developer doing as they please on their own machine. Or something fancier. There are suggested workflows ranging from almost centralized to completely distributed; check them out. If you adopt one way of working, git won't stand in your way if you decide to change it later.
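As a rough sketch of the simple version (branch names are illustrative):

git checkout -b dev-alice      # each developer works on their own branch
# ...hack, commit, test locally...
git checkout master
git merge dev-alice            # only tested work lands on master
git push origin master         # master remains the official line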
I am an avid webdev hobbyist and freelancer. Up until now I have simply edited websites live (putting a maintenance message up while changes are being made), and all my projects until now have been very small.
E.g. I make a site, show 'em, take money and go; I've never had to work on a site after it's gone live.
Now my new project is pretty big, and I know I will have to edit it after it's gone live, and maybe have a small team of devs (atm just me).
So how do people professionally handle this? I know I will need a prefix-AMP app cos I run an Apache server. I've also heard that people use GitHub for versioning, but I'm not really sure, because apparently it's not SVN?
Thanks
ps. I have a windows 7 pc, so no mac apps please
Up until now I have simply edited websites live
Terrible in my book ;)
So how do people professionally handle this?
First you need to set up a development server (it is best to keep it as close as possible to the expected live environment). On this server you install all the software you need.
You may also want to setup a staging server.
I know I will need a prefix-AMP app
I hope you are not talking about those one-click installers. If you want to do it professionally, you should install everything yourself; that way you can set it up exactly the way you need it.
I've also heard that people use GitHub for versioning, but I'm not really sure, because apparently it's not SVN?
GitHub is just a website. What you are looking for is git or SVN for versioning. You could also set up a git or SVN server locally instead of using services like GitHub. Basically, version control means that when somebody makes a change to the code, he/she commits that change. This makes it easy to keep track of changes in the codebase (what was changed, when it was changed, and by whom).
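For example, in git (a minimal sketch; the file name and message are made up):

git add index.php
git commit -m "Fix broken login redirect"   # records the change with a message
git log                                     # shows each change: what, when, and by whom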
A local *AMP stack (LAMP or WAMP) for development
An intranet system for test and maybe staging
Of course, the live system
Version control: I prefer git. Of course you can use SVN too, but... let's say: it's SVN.
Make changes locally, test those changes locally
Everything's fine? Push it into the "master" VCS repository
New version ready (or it's "Sunday-night release time")? Push all that stuff to test/stage
Everything's fine there too? Push it into the live system
That's very abbreviated, of course, but it should give you an idea.
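Roughly, in git commands (the staging and live remote names are illustrative, and the server side would need a checkout step or hook to actually update its files):

git commit -am "Finish feature X"   # local change, tested locally
git push origin master              # into the master VCS repository
git push staging master             # onto the test/stage system
git push live master                # onto the live system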
The tool you use to manage your software versions is not that important. Use Git, or SVN, or whatever you like most. But use _one_.
Equally important is that you run the "page" on two sites, a test and a live system, kept strictly apart. Both systems have to be very close in their layout; all changes must first be made on the test system, be verified, and then be made in the same manner on the live system. Do not allow changes to be made only on the live system ("because it's just a small change"). No exceptions.
Then think about deployment: how will you transfer changed files to the target system? You need routines for this that, once started, run through without forgetting a step in between.
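One minimal routine, assuming the live site is itself a clone of the repository (host and path are hypothetical):

#!/bin/sh
# Update the live checkout in one step; --ff-only aborts instead of half-merging.
ssh deploy@live.example.com 'cd /var/www/site && git pull --ff-only'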
Firstly, you need some kind of versioning system: either SVN or Git. GitHub is simply an online service that provides managed Git repositories. Secondly, you need a development server.
If it were just you doing development, you could host both of these on your local desktop PC, but since other developers are going to be joining, you need a remote server. If you don't want to be running a server out of your home, the best option is a VPS (virtual private server) on which you can install Git, Apache, etc. and anything else you need.
As for development software, take your pick - there are loads of options. A common choice is the NetBeans IDE and TortoiseGit combo. You use NetBeans to develop your code, automatically uploading to your development server, then use TortoiseGit to commit and sync changes.
Only when you're ready to go live do you copy the code from the dev server to the production server.
I've read a number of topics in the same sort of ballpark as this one, but in all honesty I'm still not exactly sure of the best approach (as a starting point). I am a solo developer in a small office and I have around 30 websites which are hosted on a Linux VPS. I want to start using version control (probably SVN) and also set up a staging server. At the moment, I do development either locally on my machine before using FTP to upload to the live server, or occasionally, for small changes, I edit the remote files directly, which is not an ideal approach.
I'm looking for some guidance on how to improve my development environment. I imagine I should be installing SVN on the web server, which would then allow me to check out versions to my local machine (which would also require SVN, I think). Also, if I want to set up a staging server, should I just set up subdomains for each of the live websites, then use these subdomains to show clients changes to a site before making them live?
Hope this makes sense!
This is what we do at work:
We have a staging server running Apache and a Subversion server. We have a post-commit hook that updates a working copy in the htdocs directory; that way, when a developer commits something, it automatically gets updated on the staging server, so everyone can see the latest code.
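A minimal sketch of such a hook, assuming the working copy lives in /var/www/htdocs (the path is an assumption):

#!/bin/sh
# hooks/post-commit - Subversion runs this after every commit.
# $1 is the repository path, $2 the new revision (unused here).
/usr/bin/svn update --non-interactive /var/www/htdocs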
On the clients' production servers (the ones we control) we have the Subversion client installed, and the website is a working copy. When we need to update the live site, we log in to a shell and run svn up. If you do something like this, make sure to limit access to the .svn directories, either with .htaccess files or from the main Apache config.
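For example, in the main Apache config (2.2-style syntax; Apache 2.4 would use "Require all denied" instead):

<DirectoryMatch "\.svn">
    Order deny,allow
    Deny from all
</DirectoryMatch>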
We have a custom app that manages the projects, but that is only because we're lazy and don't want to setup each project by hand, the app creates the necessary directories and working copies. You could write a quick script to do this.
We never, ever edit files via FTP on the live site. All in all, we have been using this setup for almost two years, and aside from the occasional conflict on the staging server, we have never had any problems.
You can actually install the SVN server on your local machine, which I would recommend in lieu of installing it on the web server (assuming you make backups). The easiest thing to do, since it's only you using it, would be to use the file:// protocol, but using svnserve is a little more robust and is the preferred method if you want to take the time to set it up.
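A minimal sketch of the svnserve route (the paths and project name are assumptions):

svnadmin create /home/me/repos        # create the repository
svnserve -d -r /home/me/repos         # serve it as a daemon on the default port 3690
svn checkout svn://localhost/mywebsite1/trunk mywebsite1   # working copy over svn://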
@Michael, I disagree - I would say it's better to install it on the Linux VPS, especially if you are already paying for the hosting service. I find it very helpful to be able to browse and download stuff from my svn repo wherever I am, from whatever computer I'm on.
@nicky, I started with svn (and version control) several years ago, and I took baby steps, which made it easier to tackle.
If I had to do it over again, I'd read the svn book to start with. The book is very well laid out and doesn't take more than 1-2 days to plow through.
While you're reading, install svn on your Linux VPS with an Apache front end.
Once you have that up, pick one of your websites and import it into svn. This is how I structure my svn repo. For example, say my repo is hosted at http://mysvn.mydomain.com/svn/:
mywebsite1
- trunk
- tags
- branches
mywebsite2
- trunk
- tags
- branches
Don't worry about creating the perfect structure. It's pretty easy to re-organize especially when you're starting out. After you import a few projects into svn, you'll start to get a feel for which projects should have their own "trunk/tags/branches" dir structure and which can be combined.
For creating test environments, I do exactly what you describe. I use build scripts to check code out of svn and download files into dirs that are mapped to subdomains like "test.clientsite.com" (I work primarily in Java and use Ant and Maven, but you can use whatever scripting language you're familiar with).
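A minimal sketch of such a build script in shell (the docroot path is an assumption; the repo URL follows the example above):

#!/bin/sh
# Check out the project's trunk into the docroot behind test.clientsite.com.
svn checkout http://mysvn.mydomain.com/svn/mywebsite1/trunk /var/www/test.clientsite.com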
Once you get used to version control, you'll never go back. Good luck!