I am currently developing a medium-sized web application using PHP and need to use some kind of version control. I also need to have the master copy of the code running in the Apache document root so I can test the app while it is in development. Does anyone have any suggestions?
Thanks,
RayQuang
You can't go wrong with Git; everything you need to know is here: http://progit.org/book/
Yeah you should definitely use version control (Git or Subversion).
Here's a short explanation of how I use it in my web projects (I use SVN):
I have an SVN project which I have checked out on both my local machine and the web server
Whenever you change something, commit your current working version
Log into the server (could also be multiple servers) and do an svn update, so the newest code gets deployed on the machine. The only thing left to do is restart the web server
Note:
Take care with what you commit. You may have a different database configuration file on your local machine than on your server; you can add it to the svn:ignore list (Git has something similar with .gitignore)
It also makes it easy for multiple people to work on the same project
Don't commit log files
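A minimal sketch of that workflow on the command line (the config file name, checkout path, and restart command are assumptions; adjust them to your setup):
# On your local machine: keep the environment-specific config out of version control
svn propset svn:ignore "config.php" .
svn commit -m "Ignore the local database configuration"
# ...work, then commit your changes...
svn commit -m "Describe what you changed"
# On the web server: pull the newest code and restart Apache
svn update /var/www/html
sudo service apache2 restart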
Links:
Git: http://git-scm.com/
Subversion: http://subversion.tigris.org/
I'd recommend Mercurial for its ease of use and because it keeps the working copy uncluttered; all versioning information is kept in a single .hg folder. I'd do it like this:
Set up a Mercurial repository at the server (hg init)
Do a hg clone of that repository to where you want your working copy
Work away!
When you want to test on the server, do a hg commit and hg push to move the changed files to the server
Run hg update on the server, or add
[hooks]
changegroup = hg update >&2
to the .hg/hgrc file (create it if it doesn't exist) on the server to have it automatically update.
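Condensed, the whole setup looks roughly like this (the user, server, and paths are hypothetical):
# On the server: create the repository that will also hold the served working copy
ssh user@server "hg init /var/www/myapp"
# On your machine: clone it and work
hg clone ssh://user@server//var/www/myapp myapp
cd myapp
hg commit -m "Describe the change"
hg push
# the changegroup hook above then runs 'hg update' on the server automatically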
For more info, you can also check out: http://hginit.com/
Related
I'm going to copy my web application code to the production server with git pull. First I make an SSHFS mount to the remote server and then run git pull in the right directory.
If I need to skip production configs etc. I can use .gitignore.
Very clean and effective so far (compared to manually dragging all changed files from folder to folder)!
But what if I have different directories on the remote server? E.g. my development localhost has ~/app/ and ~/app/webroot/, but the production server has ~/app/ and ~/public_html/.
How do I solve this kind of problem?
The generic answer is: Git is not a deployment tool.
Write a deployment script, and use that script.
You can version the script with your project in your repo, of course.
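As a minimal sketch of such a script, rsync can map the differing directory layout from the question (the SSH login and the excluded config file are assumptions):
#!/bin/sh
# deploy.sh - push the local working copy into the production layout
set -e
HOST="user@production"
# ~/app on the dev box maps to ~/app on the server, except the webroot,
# which maps to ~/public_html instead; the production config is left untouched.
rsync -az --delete --exclude 'webroot/' --exclude 'config/production.php' ~/app/ "$HOST:app/"
rsync -az --delete ~/app/webroot/ "$HOST:public_html/"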
Env: Linux
The PHP app runs as "www-data"
PHP files in /var/www/html/app are owned by "ubuntu". Source files are pulled from a Git repository; /var/www/html/app is the local Git repository (origin: Bitbucket)
Issue: Our developers and DevOps would like to pull the latest sources frequently, and would like to initiate this over the web (rather than via PuTTY and running the git pull command).
However, since the PHP app runs as "www-data", it cannot run a git pull (the files are owned by "ubuntu").
I am not comfortable with both alternatives:
Running the Apache server as "ubuntu", due to the obvious security issues.
Making the Git repository files owned by "www-data", as it makes it very inconvenient for developers logging into the server and editing the files directly.
What is the best practice for handling this situation? I am sure this must be a common issue for many setups.
Right now, we have a mechanism where DevOps triggers the git pull from the web (a PHP job running as "www-data" creates a temp file), and a cron job running as "ubuntu" reads the temp-file trigger and then issues the git pull command. The time lag between the trigger and the actual git pull is a minor irritant. I am now setting up Docker containers and need to update the repository running in multiple containers on the same host, so I want to use this opportunity to solve the problem in a better way. I am looking for advice on this.
We use Rocketeer and groups to deploy. Rocketeer deploys with the user set to the deployment user (ubuntu in your case) and read/write permission for it, and the www-data group with read/execute permission. Then, as a last step, it modifies the permissions on the web-writable folders so that php can write to them.
Rocketeer executes over SSH, so it can be triggered from anywhere, as long as it can connect to the server (public keys help). You might be able to set up your continuous integration/automated deployment to trigger a deploy automatically when a branch is updated or tests pass.
In any case, a setup where the files are owned by one user who can modify them, and are readable by the web server's group, should solve the main issue.
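A rough sketch of that permission scheme for the paths from the question (the writable directory is hypothetical; which directories PHP needs to write to depends on your application):
# Files owned by the deployment user, readable by the web server's group
sudo chown -R ubuntu:www-data /var/www/html/app
sudo chmod -R u=rwX,g=rX,o= /var/www/html/app
# Give group write access only where PHP actually needs it (cache, uploads, ...)
sudo chmod -R g+w /var/www/html/app/tmp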
If you are planning on using Docker, the simplest way would be to generate a new Docker image for each build and distribute it to your hosts. The image build would simply pull the latest changes on creation and never update itself. If a new version needs to be deployed, a new immutable image with the latest code is created and distributed.
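As a sketch of that approach (the registry, image name, and base image are assumptions; the official php:apache image is just one possible base):
# Write a minimal Dockerfile; the build context is assumed to be a fresh checkout of the code
cat > Dockerfile <<'EOF'
FROM php:apache
COPY . /var/www/html/
EOF
# Build an immutable image tagged with the current commit, then distribute it
TAG=$(git rev-parse --short HEAD)
docker build -t registry.example.com/myapp:"$TAG" .
docker push registry.example.com/myapp:"$TAG"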
I know this has been asked before, but I couldn't get the answer I needed.
Currently I'm developing a website using PHP. I was using Notepad++ before, and it all worked well because I'm developing with a co-worker and we both keep changing different files over FTP.
I switched to NetBeans. All went OK: I pulled the entire website via FTP to my local computer, and every time I edited and saved a file it was uploaded to the FTP server. But there is a problem: if my colleague updates a file, it doesn't update in my local folder. So I thought: "Let's try versioning".
Created a team on Bitbucket, created a repository. All went OK.
But now I'm struggling to get everything up and running in both NetBeans installations (mine and my colleague's), so that my colleague can edit a file in his NetBeans, keep saving so that it gets saved to FTP, and only push it to Bitbucket when he stops working on that file, so that I can pull it afterwards.
Suggestions?
About setting up your work environment:
In order to set up your Bitbucket repository and local clone, go read this link (official doc).
You will need to repeat the cloning part once for each PC (e.g. once on yours, once on your colleague's).
Read the account management part to see how you can tag your actions with your account, and your colleague's actions with his own account.
Start using your Git workflow; when you are tired of always typing your password to upload modifications to your Bitbucket account, take the time to read the SSH keys setup part. Read carefully: you will need to execute the procedure once for you and once for your colleague.
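In terms of commands, that setup boils down to something like this on each PC (the repository URL and identity are placeholders):
# Identify yourself, so commits are tagged with the right account
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
# Clone the Bitbucket repository once per PC (you will be asked for your password)
git clone https://bitbucket.org/<team>/<repository>.git
# Later, to stop typing your password: generate an SSH key, add the public part
# (~/.ssh/id_rsa.pub) to your Bitbucket account, then switch the remote to SSH
ssh-keygen -t rsa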
Using your local Git repository with NetBeans is pretty straightforward:
From NetBeans, run the File > New Project... command (default: Ctrl+Shift+N),
Select PHP Application with Existing Sources and click Next >,
For the Sources Folder: line, select your local Git directory,
Fill in the remaining fields (and, if you want, the last Run Configuration screen), then click Finish.
After the project is created in NetBeans, you can modify the Run Configuration part by right-clicking the project's icon, selecting the Properties menu entry, and going to the Run Configuration item.
About solving your workflow "problem":
Your current FTP workflow can lead you to blindly squash your colleague's modifications (when uploading), or have your colleague's modifications blindly squash your own local modifications (when downloading). This is bad, and you will generally notice it only after the bad stuff has happened - too late.
Correctly using version control allows you to be warned when this could potentially happen, and to keep an almost infinite undo stack on the modifications of the project's files. The cost, however, is that both of you will have to add several actions in your day to day workflow - some choices can not be made automatically.
You may find it cumbersome in the beginning, but it really pays off, and quite quickly - we're talking big bucks here. So use it and learn.
On top of using Ctrl+S to save your modifications on disk, you and your colleague will need to integrate 3 extra commands into your daily work:
Save your work to your local repository (git add / git commit)
Download the latest modifications shared by your colleague (git pull)
Upload your work to the central repository (git push)
You can access these commands:
from a terminal,
from a GUI frontend: you can try TortoiseGit on Windows, or gitk on Linux,
from Netbeans :
in the contextual menu of the files/folders in the project tree (right click on the item, there is a "Git" entry),
using the Team > Git > ... menu
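From a terminal, a typical day-to-day cycle with those three commands looks like this (the commit message is just an example):
git add -A                                 # stage your modifications
git commit -m "Describe what you changed"  # save them to your local repository
git pull                                   # download and merge your colleague's work
git push                                   # upload your commits to Bitbucket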
Since you tagged the question with git, I'll describe what to do for Git.
set up a remote bare repo on a server that you both can access (Bitbucket in your case):
http://git-scm.com/book/en/Git-on-the-Server-Getting-Git-on-a-Server
you both clone that remote repo to your local machines:
http://git-scm.com/book/en/Git-Basics-Getting-a-Git-Repository#Cloning-an-Existing-Repository
each of you works in her part of the application. When one is done, publish the work to the server:
http://git-scm.com/book/en/Git-Basics-Working-with-Remotes#Pushing-to-Your-Remotes
By now, the remote server holds the version that was just pushed. What's missing is the deployment of the website. This has been discussed here:
Using GIT to deploy website
Doing so, you will decouple your work from that of your colleague, since you're not changing files over FTP all the time. You work on your part, your partner works on hers. The work gets merged, and then a new version of the website is published.
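One common way to handle that last step is a post-receive hook in a bare repository on the web server that checks the pushed branch out into the document root; a minimal sketch, assuming /var/www/html is the document root:
#!/bin/sh
# hooks/post-receive in the bare repository on the web server
# (make it executable: chmod +x hooks/post-receive)
GIT_WORK_TREE=/var/www/html git checkout -f master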
You can create Git or Mercurial repositories in Atlassian Bitbucket (http://bitbucket.org). If your team is new to version control, I advise against using forks in your first project.
The easy solution is to use Atlassian SourceTree (http://www.sourcetreeapp.com/) to manage your code, since there is a bug in NetBeans. See NetBeans + Git on BitBucket
You need to create a new repository in Bitbucket. I assume you have already configured the SSH keys. Using Git you need:
git init && git add .
git commit -m "initial commit"
git remote add origin git@bitbucket.org:<your-user>/php_project.git
git push -u origin master
Using Mercurial you need:
hg init
hg add
hg commit -m "initial commit"
Good luck / boa sorte
Pedro
I have decided that it's time for me to start using Git on a PHP project that I have been developing casually for over a decade. (Please, no lectures from the version control police!) Due to the complex setup required on my VPS to do everything the project needs (esp. single-codebase-multiple-client structure and a Japanese-capable installation of TeX to create specialty PDFs), it is not possible to set up a development environment on my local Windows box. But I do have a testbed area on the server that I can play in, so it's my development area. Currently I use Filezilla to access the server and open files directly into Notepad++, and when I'm ready to see my edit in action, I just save and let Filezilla upload. When everything looks good on the testbed, I copy the files to the production codebase area. Yeah, that gives me no history of my changes other than my own comments, and I have to be careful not to mix bug fixes with half-finished new features. I can see the value of Git's branches for different upgrades in progress.
Yesterday I got my toes wet. First I created a GitHub account, and then (at the recommendation of a tutorial) installed Git for Windows (with its own Bash and tiny-looking GUI) and Kdiff3, and followed some instructions for configuring Git Bash. After all that, though, I ended up having to install something else in order to interface with my GitHub account (appropriately named GitHub for Windows), which seems to do all the stuff the other two programs were supposed to do for me. Anyway, then I did a simple task as my first foray into the GitHub world - I had added functionality to someone else's jQuery plugin and wanted to share it with the developer, so I forked his repo, cloned it to my machine, overwrote the file I had previously edited and tested, synced to my GitHub account, and sent a pull request. All the terminology in that last sentence was brand new to me, so I was pretty proud of myself that I got that far. ;) But I guess I only needed the GitHub software, not the Git software - it's hard to know what tutorials to believe.
Anyway, now I want to figure out a workflow for my own stuff, which is my actual question for you guys. From what I can tell, having the master repo anywhere but the public Github costs money, and I don't care if others see my code (I don't expect anyone else to work on my oddball project made of spaghetti code, but if they want to, that's great). Okay, but then what? Perhaps one of these scenarios, or something else:
Clone branches of the repo to my PC, do edits on the local files, and upload them in Filezilla for testing (a couple more clicks than my current workflow because Filezilla doesn't automatically see the relationship between the local file and the remote file, but not a big deal). Then when I'm happy with the code, commit locally, sync to Github, and copy the files (from somewhere - not sure on this point) to the production area.
Install the Linux flavor of Git on my VPS so that the "local" Git file location is the testbed, and use Git through PuTTY to do the local commits. Simpler for file structure (no need for a copy on my PC at all) but more cumbersome to use Git:
I'm not on PuTTY very frequently, and for some reason the connection often dies on me and I have to restart.
Even though the Linux command line is Git's native habitat, I am probably more comfortable with a GUI (because I forget command syntax quickly - old brain, I guess).
Also, since I never ended up using the Git program I installed here, I'm not sure whether it would be Git or Github I would be using on the server.
Some other scenario, since neither #1 nor #2 uses Git/GitHub to manage the production file area at all, which would probably be a good idea so that I don't forget to copy everything I need.
I tried to research the possibility of a PHP-based GUI to go with idea #2 (so I don't have to use PuTTY for day-to-day operations), but it seems that the discussions of such tools all assume either that you are trying to create your own Github service, or that the "local" cloned repo is physically on your local PC (with xAMP running on whatever OS it is). But maybe the Github software I used is enough to do all that - it's hard to tell. I don't yet understand the interplay between a master public repo on Github, branches somewhere (on Github also?), at least two sets of files on my web server (the testbed and the production area), Github software, Git software, and the keyboard/screen of the computer I'm sitting at.
So pardon my newbie ramblings, but if someone out there has a similar development situation, What's your workflow? Or what would you suggest for me?
Here's one way to approach the issue:
You will need three repositories:
a local repo to edit code. [1]
a bare remote repository on your server. This will be in a location that is not publicly viewable, but that you can SSH in to. [2]
The production environment. [3]
Here's the implementation:
workstation$ cd localWorkingDirectory/
workstation$ git init
workstation$ git add .
workstation$ git commit -m 'initial commit'
workstation$ ssh login@myserver
myserver$ mkdir myrepo.git
myserver$ cd myrepo.git
myserver$ git init --bare
myserver$ exit
workstation$ cd localWorkingDirectory/
workstation$ git remote add origin login@myserver:myrepo.git
workstation$ git push origin master
every time you make a commit on any branch, back it up with:
workstation$ git push origin BRANCH
When you are ready to move branch version2 into production: do this
workstation$ git push origin version2
workstation$ ssh login@myserver
myserver$ git clone path/to/myrepo.git productionDirectory
myserver$ cd productionDirectory
myserver$ git checkout version2
Oh no! It doesn't work! Better switch back to version1!
workstation$ ssh login@myserver
myserver$ cd productionDirectory
myserver$ git checkout version1
You don't need github (or any other central store) to start using git. Especially since you're a lone developer. Git runs directly on your own machine, without any server component (unlike for example subversion). Just git init and start committing away.
I agree with the other commenters here, that you should aim to get a local development environment up and running. Even if it takes some effort, it's certainly worth it. One of the side effects of doing so may be that you are forced to decouple some of your current hard dependencies and thereby getting a better overall application architecture out of it. The things that can't easily be replicated in your development environment could instead be replaced with mock services.
Once that is in place, look into a scripted deployment process, e.g. write a shell script that syncs your development machine's codebase with the production server. There are many ways to do this, but I suggest you start really simple and then review your options (Capistrano is one of them).
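One very simple starting point for such a script is to export the committed state and unpack it on the server over SSH (the login and path are hypothetical):
#!/bin/sh
# deploy.sh - copy the current master branch into the production document root
git archive master | ssh user@server "tar -x -C /var/www/html"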
I'd certainly look at something like Capistrano for your current development setup.
I can understand why you might be reticent to use the terminal, but it would probably help your understanding of Git in context. It doesn't take long to get to grips with the commands, and when tied into a system such as Capistrano you'll be pushing development code up to your environment in no time:
git commit -a
git push origin develop
cap deploy:dev
When I'm working on Windows I generally try to replicate the deployment environment locally with virtual machines, using something like Sun's VirtualBox. That way you can minimise potential environment issues while still developing locally. Then you can just use PuTTY to SSH into your local VM. Set up sharing between the VM and your host OS and all your standard IDEs/editors will work too. I find this preferable to having to set up a VPS remotely, but whatever works.
I'm just trying to find an easier way to deploy a site I'm working on. I'm working alone with a test and a production server, and right now deployment means copying a subset of the files and database data onto my computer and uploading it to the production site. I'm sure there's a simple synchronization tool out there, but so far I've had no luck in finding anything.
What I'd really like is an application I can run locally (on Windows), or something I could install on my server, that gives me one-click deployment. Any suggestions?
Thanks!
godwin
Edit
I have decided for now to go with GoodSync and Toad. Thanks for the suggestions.
man scp
SCP(1) BSD General Commands Manual SCP(1)
NAME
scp - secure copy (remote file copy program)
SYNOPSIS
scp [-1246BCpqrv] [-c cipher] [-F ssh_config] [-i identity_file] [-l limit] [-o ssh_option] [-P port] [-S program] [[user@]host1:]file1
[...] [[user@]host2:]file2
DESCRIPTION
scp copies files between hosts on a network. It uses ssh(1) for data transfer, and uses the same authentication and provides the same
security as ssh(1). Unlike rcp(1), scp will ask for passwords or passphrases if they are needed for authentication.
Any file name may contain a host and user specification to indicate that the file is to be copied to/from that host. Copies between two
remote hosts are permitted.
When copying a source file to a target file which already exists, scp will replace the contents of the target file (keeping the inode).
If the target file does not yet exist, an empty file with the target file name is created, then filled with the source file contents. No
attempt is made at "near-atomic" transfer using temporary files.
The options are as follows:
-1 Forces scp to use protocol 1.
-2 Forces scp to use protocol 2.
...
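In practice, deploying the working copy then becomes a one-liner along these lines (the login and paths are hypothetical):
# Recursively copy the contents of the local site folder into the production web root
scp -r ./mysite/* user@production:/var/www/html/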
I use GoodSync http://www.goodsync.com/ for this sort of thing. It's really good. It runs on Windows and can sync between any combination of local files, (S)FTP, and Windows/Linux network shares, etc.
Then create a scheduled task/cron job to export the database into the synchronised folder, and have another one do an import at the other end. Obviously this process is one-way.
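For the database part, the scheduled export could be a cron entry like this on the source machine, with a matching import at the other end (credentials, database name, and folder are placeholders):
# Source machine: dump the database into the synchronised folder every night at 02:00
0 2 * * * mysqldump -u dbuser -p'secret' mydatabase > /home/user/sync/mydatabase.sql
# Other end: import the synchronised dump a little later
30 2 * * * mysql -u dbuser -p'secret' mydatabase < /home/user/sync/mydatabase.sql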
http://www.phing.info/docs/guide/stable/
Phing is an automated build system made for PHP. It works with Git, SVN, PHPUnit, etc.
You basically set up XML files that give Phing instructions on what to do. It lets you run test suites as part of build creation, build multiple variants at a time, copy files as well as databases, and offers a bunch of other handy features.
Also, it's open source and platform independent.
What are you using for source control? Some tools like Git and SVN have ready-made methods for this sort of thing. See here for a quick Git solution.
I would second the advice about Git/SVN, but would put in a strong plug for Git via GitHub. Use GitHub as your "central" Git repository. Your local Git repository will push to GitHub, and your production server will pull from GitHub.
There is some overhead to learning Git/GitHub, but really, in the situation you've described (a single engineer and two servers), Git isn't any more complicated than SVN (or CVS or anything else).
We use an FTP Synchronizer, which seems to work pretty well. I don't know offhand of any good free ones.
Example: http://www.ftpsynchronizer.com/
It depends on what type of server you are running, but you could run SVN (Subversion). There are plugins for Eclipse, Aptana, and Zend Studio if you use one of those to develop.
Essentially you could have a development repository that sits on the server. You would pull your code down to your local environment and commit it back after changes. Then you can set up another repository for your live/production code that's linked back to your development repository.
When you want to update the live code, you just update it, so if anything goes wrong you can roll that code back without having to roll back your development code. Once you get good at all that, you can start branching and tagging your projects.
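Branching and tagging in SVN are just cheap copies; for example (assuming the conventional trunk/branches/tags repository layout):
# Create a release tag from the current trunk
svn copy ^/trunk ^/tags/release-1.0 -m "Tag release 1.0"
# Create a branch to work on a feature without touching trunk
svn copy ^/trunk ^/branches/new-feature -m "Branch for the new feature"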
I personally use both SVN and Git, but I prefer Git because it works so much better. Though if you are using Windows, the command-line tools just aren't the same as on Linux.