I have a website built on PHP and MySQL, and I'm trying to finally become more organised by using Git to distinguish between code that is live and code that is still in test. I am the only developer working on the site.
I cannot afford separate production and test servers, so on the same server I made separate folders: one that is live, /www/live/, and one that is in development, /www/dev/.
The development part of the site, /www/dev/, is linked to a sub-domain, dev.mydomain.com.
Problem 1: I wish to set up Git in such a way that my commits go to /www/dev/ and my pushes go live to /www/live/. Is it reasonable to do it this way?
Problem 2: There are two places in the /www/dev code where the links to the MySQL development database and folders are hard-coded. Commits would work fine, but pushes would change the live code so that it points to the development database and folders. Is there a neat way of managing this?
No - that's what Git branches are for. Create a new branch called develop for all of your dev stuff and push commits to it; that is what you pull down into /www/dev. When you are ready to go live, merge the develop branch into master and pull master down into /www/live.
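A rough sketch of that flow, assuming both folders are clones of a shared bare repository (the remote name origin and the paths are only examples):

    # day-to-day work happens in the dev working copy
    cd /www/dev
    git checkout develop
    git commit -am "work in progress"
    git push origin develop

    # going live: merge develop into master, then update the live working copy
    git checkout master
    git merge develop
    git push origin master
    cd /www/live
    git pull origin master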
Either git-ignore the config files or, better yet, define the environment based on the URL, directory path or an environment variable.
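For the git-ignore route, a minimal sketch (db_config.php is a hypothetical file name; the idea is to version a sample file and keep the real, environment-specific one untracked):

    echo "db_config.php" >> .gitignore
    git rm --cached db_config.php            # stop tracking it, but keep it on disk
    cp db_config.php db_config.sample.php    # commit the sample instead
    git add .gitignore db_config.sample.php

The environment-based alternative keeps the config under version control but has it inspect the hostname, the directory path or an environment variable and pick the dev or live credentials accordingly.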
I have a number of currently live Magento websites which I'm looking to take copies of and bring under Git with Bitbucket. I want to do this to make future changes traceable, with the ability to roll back should anything go wrong!
Here's how I understand it should work:
Create bitbucket repository
Install Git on both the local machine and the live server
Download copies of live site files and database
Clone repo to local machine
Import all files into cloned repo
Push back up to repo
Set up a PHP file on the live site to git pull from the repo (see the sketch after this list)
Link the PHP file as a Service Hook in Bitbucket
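The PHP file in the last two steps usually just shells out to something like the following on the live server (the path, remote and branch are placeholders):

    #!/bin/sh
    # run by the webhook-triggered PHP file
    cd /var/www/live || exit 1
    git pull origin master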
What I need to know is if I'm missing anything from the above steps and how you would migrate a currently live Magento site to your local machine and get it all working again.
First of all here is my suggestion for your Git architecture:
Create two branches locally and two branches in Bitbucket (let's name them Dev and Master). It is very important to have a dedicated branch (Master) in Bitbucket that your server pulls from securely; this way you reduce the chance of messing things up.
Locally, always push from your Dev branch and merge it into Master afterwards. On your server, pull only from the Master branch in Bitbucket.
To make a copy of your Magento site:
First dump your DB and import it into your local DB. Then copy all the files from the public_html folder on your server to your local machine. You might need to change a few things in your DB (in the core_config_data table) and in your .htaccess file (Here).
Take the DB user credentials from the app/etc/local.xml file and create that user in your local DB. If you are lucky, that should do the job; if not, search around and you'll find hundreds of dedicated questions/answers all over the Internet.
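For the dump/import and the base-URL change, something along these lines works (database names, host and URL are placeholders, and the MySQL credentials are omitted; Magento 1 keeps its base URLs in core_config_data under web/unsecure/base_url and web/secure/base_url):

    # dump the live database and load it into a local one
    ssh user@liveserver "mysqldump live_db | gzip" > live_db.sql.gz
    gunzip < live_db.sql.gz | mysql local_db

    # copy the site files down
    rsync -az user@liveserver:public_html/ ./public_html/

    # point the local copy at the local URL
    mysql local_db -e "UPDATE core_config_data SET value='http://dev.local/' WHERE path IN ('web/unsecure/base_url','web/secure/base_url');"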
Apart from these, for automatic updates you should look for CI (continuous integration) tools like Jenkins. Have a look at these links (1, 2).
Recently we have started exploring Git, with the goals of enabling our developers to work from anywhere and of automating the overall deployment process.
We have a central test server where we host all apps/sites for testing and/or demo purposes, and once development and testing are finalised we move the application to its respective live server.
What I have set up with Git is as follows:
1. Create a bare repo on the test server
2. Get a local clone for each involved developer; developers push to the remote (test server) dev branch
3. Someone merges all changes from the dev branch into the master branch and pushes it to the remote
4. The test server (bare repo) has a post-receive hook which checks out the master branch into the public_html folder (using GIT_WORK_TREE and checkout -f).
As of now everything works well, and I am able to see the merges to the master branch on the hosted pages (on the test server, of course). Now my questions are ...
1. Am I doing this right?
2. I guess the post-receive hook I have set up executes on pushes to the dev branch as well. How do I avoid this?
3. How can I ship this content to my live server? Some of my projects have a large code base, and checking everything out on the test server and then shipping it all to live doesn't seem efficient enough.
I've heard of CI servers, but as far as I know they check out locally and upload everything to live using 'rsync' (I don't know whether it just syncs changes or uploads everything) or similar tools. I just want to avoid that 'everything' part and keep an option open to roll back if anything goes wrong. I am fine with setting up Git on live servers.
Yes. You can see other considerations at "reset hard on git push".
You can test the name of the branch when receiving the commits:
branch=$(git rev-parse --symbolic --abbrev-ref $refname)
See also "Writing a git post-receive hook to deal with a specific branch"
An rsync is usually recommended for the live server (where Git isn't necessary): it will update only what has changed.
If you do have Git on the live server, then various approaches are described in "Git for Websites / post-receive / Separation of Test and Production Sites".
Regarding deployment, as seen in "Deploy with rsync(or svn, git, cvs) and ignore inconsistent state during deployment?", deploying (even everything) into a separate directory and symlinking the prod instance to that directory is a nice way of avoiding inconsistencies during deployment, and of facilitating rollback (symlink back to the previous live directory) in case of trouble.
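A minimal sketch of that symlink-switch deployment, with made-up paths:

    # deploy into a fresh, timestamped directory
    release=/var/www/releases/$(date +%Y%m%d%H%M%S)
    rsync -a /path/to/checkout/ "$release"/

    # repoint the live docroot at the new release; rolling back is just
    # pointing the symlink at the previous release directory again
    ln -sfn "$release" /var/www/current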
I'm part of a team of 3 (2 developers and 1 designer) who sometimes work in the office and sometimes remotely, and I'm looking at a way of using Git to develop our websites seamlessly. I have a managed account with Rackspace and have 3 servers set up on the account: development, staging and production.
I'm looking for the best way for our team to develop daily on our websites without having to FTP the files up to the server each time we make any changes. I've used SVN in the past, but I'm looking to use Git for version control. The workflow I had in mind for an example website called 'test' was the following:
The development server would have a directory (called trunk, though I'm not sure if it should be called something else?) for each user as well as a central directory, e.g. /var/www/test/jbloggs/, /var/www/test/asmith/, /var/www/test/rjohnson/ and /var/www/test/central/trunk/.
The central repository would be installed within /var/www/test/central/trunk/, and then /asmith/, /rjohnson/ and /jbloggs/ would clone the trunk, which would mean they become /var/www/test/asmith/trunk/, /var/www/test/rjohnson/trunk/ and /var/www/test/jbloggs/trunk/.
Each user would then have a copy of /trunk/ containing all the website files, each with a subdomain configured (i.e. jbloggs.test.development, rjohnson.test.development etc), and would configure their IDE to automatically SFTP to the server so that they are working directly within their directory on the development server. The central directory's domain will be test.development. When they come to commit any changes to the central repository they will SSH into the server and commit their changes, and when we want to update the central copy we will pull these changes to get the latest version, which can then be viewed at test.development.
Is this the right way of doing things, or should we all have a local LAMP stack installed (apart from the designer, who uses Windows) and keep our repositories locally? If so, should the central repo still be on the Rackspace server? The developers will be using PhpStorm and the designer Dreamweaver.
Hope the above makes sense.
Thanks
I strongly advise you to work locally and then commit to the shared server. This is what Git is made for. Development will be more responsive and easier for everybody. Make sure all the devs master Git so they can organise their own local workflow as they like. If one dev destroys their database, the others can keep on working. But you'll also need a convenient way to sync databases, so that developers work with an up-to-date local database.
The rest of your chain is fine; you can still have two test steps, such as a dev server for the dev team and a test server for testers. This keeps testers working on a more stable version, and it also makes you test the upgrade process when you copy changes from the dev server to the test server. A lot of errors arise because of untested upgrade procedures.
You can push changes to the test and production servers either by installing Git on them or just by using a simple script that FTPs the changed files. I don't like having Git on a production server, but that is a personal opinion.
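If you go the script route, one way to ship only the changed files is to diff against a tag that marks the last deploy; the tag name and host below are placeholders, and this uses rsync rather than FTP:

    # push only the files that changed since the last deploy
    git diff --name-only lastdeploy HEAD | rsync -av --files-from=- ./ deploy@prodserver:/var/www/site/
    # note: files deleted from the repo are not removed on the server this way
    git tag -f lastdeploy HEAD    # remember what is now on the server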
I'm struggling with the best approach to testing PHP development code which depends on certain framework files being present. I think there are three possible scenarios with Git:
Create a copy of the live production directory and clone this 'dev' directory to the local workstation. The next step would be to edit code on the local workstation and commit/push every change. You can check your work via the 'dev' URL on the production server. If everything is all right you can push the changes to the 'live' directory. This approach may result in a lot of commits while you are editing/fixing your code (syntax errors or other obvious mistakes), and it adds an extra step (commit/push) just to see your result.
Create a 'dev' server which mirrors the production server. This server will contain all the framework files and you'll be able to edit a copy of the 'live' directory directly and immediately see your changes. If you prefer you can mount the remote 'dev' directory to your local workstation. This requires an extra server which needs to be maintained and you would need the resources to set it up.
Create a local 'dev' workstation environment and clone the repository from the 'live' or 'dev' server. This way you'll be able to test all the code on your local machine and only push out the commits which have been tested and approved. This reduces the number of commits compared with method one. To recreate the 'dev' environment locally you might have to install a lot of framework/dependent files on your local workstation, and even then it might not be 100% reliable when the code is ported to the actual live server.
Basically I want to find the best method for the 'write-test-revise-test-revise-test-commit' cycle when you are dependent on framework files (whatever framework that may be). Would you create a 'dev' server, or would you recreate the exact production environment on your local workstation? Ideally you would only commit the code once you have done some initial testing (obvious syntax errors etc.). A 'dev' server with a local Git repo would require that you commit every little change just to test your work, which may be tedious...
I hope I have made myself clear. I'm looking for the best way to integrate Git into the 'write-test-commit' cycle. Normally you would test on the local machine, but with web development you may need a web server + framework to be able to test your code. Editing directly on the 'live' server is what I want to avoid.
Thanks for your input!
There are definitely many ways to do this, but here is my 2 cents from how I have been working lately.
First off, I would probably avoid a dev server per se, because if you have more than one developer, each developer may try to update the dev server with conflicting code or, if they are working on similar areas, overwrite some of your test code, since you are both probably working from the same branch but have each modified the code and not yet pushed the changes.
That said, you may want a dev server that closely resembles your live server so that after you and some of your other developers have made a number of changes, you can test them on the dev server before updating the code on the live server.
In my environment I develop on Linux, with Apache/PHP running the same versions and configs as the live server. I clone my Git repo and set up my environment so that my document root is the "public" directory of my Git repo (e.g. htdocs). In this case we have a dev MySQL server which is usually shared and not on the local machine, but you can do whatever is easiest there. Our system depends on constantly updated data from the field, hence the shared database; we have a system which automatically adds a lot of the necessary "test" data to the database.
This way I can pull the latest code from Git, work on it all I want (work-break-fix-work-work-work etc.), and when I have completed my task I can push the changes back to Git for the other developers.
When you are ready for a release, you can do all of your testing and stuff on the dev server, verify it is good to go, and then push to the live server.
In my case, one person is responsible for updating the "live" server, and I use rsync to sync my local working directory to it. So when I am absolutely sure we are ready to deploy, I pull the most recent code from Git and run my script, which rsyncs my Git directory to the server.
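Roughly what such a deploy script does, with placeholder paths and host (the --exclude keeps the .git directory off the live server, and --delete mirrors removals too, so drop it if the server holds files that aren't in the repo):

    #!/bin/sh
    cd /home/me/project || exit 1
    git pull origin master
    rsync -avz --delete --exclude='.git' ./ deploy@liveserver:/var/www/site/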
I'd avoid your methods 1 & 2 and go with something like 3. That will probably be the sanest thing for you to do and the easiest to manage. Depending on what your team is like, you could create a dev VM that is pre-built with all the dependencies, the correct software, and the development tools you all use, or leave it up to each developer to set themselves up.
So far this method has worked pretty well for me and the others on my team.
Call me opinionated, but every developer should have a local development AMP stack which they can develop against. If you don't know how to set up an exact mirror of your production server, solve that problem first.
Once you're there, it should be trivial to have each developer set up a virtual machine with a clean OS install, configure web/PHP/db servers and libraries/frameworks to match the production environment, check out your project, and get to work.
Developers commit against personal branches in their own local repos as they go, and after local testing, ship their code (via either a push, or a pull request, or whatever).
The exact rules about how to merge changes into master depend on your team and preferences. But developers should almost always have a complete local dev environment. If it seems like it's hard to set one up, that's a big problem. Figure out how to make it easy, and then document it.
We have a PHP project that we would like to put under version control. Right now there are three of us working on a development version of the project, which resides in an external folder to which all of our Eclipse IDEs are linked, and thus there is no version control.
What is the right way and the best way to version control this?
We have SVN set up, but we just need to find a good way to check code in and out that still allows us to test on the development server. Any ideas?
We were in a similar situation, and here's what we ended up doing:
Set up two branches: the release branch and the development branch.
For the development branch, include a post-commit hook that deploys the repository to the dev server, so you can test.
Once you're ready, merge your changes into the release branch. I'd also suggest putting a post-commit hook in place for deployment there.
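A sketch of such a branch-aware post-commit hook, using svnlook to check whether the release branch was touched (repository layout, paths and branch name are just examples):

    #!/bin/sh
    REPOS="$1"
    REV="$2"
    # deploy only when this commit touched the release branch
    if svnlook changed -r "$REV" "$REPOS" | grep -q "branches/release/"; then
        /usr/bin/svn update /var/www/live --non-interactive
    fi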
You can also set up individual development servers for each of the team members, on their workstations. I find that it speeds things up a bit, although you do have some more setup time.
We had to use a single development server, because we were using a proprietary CMS and ran into licensing issues. So our post-commit hook was a simple FTP bot.
Here is what we do:
Each dev has a VM that is configured like our integration server
The integration server has space for Trunk, each user, and a few slots for branches
The production server
Hooks are set up in Subversion to send an e-mail when commits are made
At the beginning of a project, the user makes a branch and checks it out on their personal VM as well as grabs a clean copy of the database. They do their work, committing as they go.
Once they have finished everything in their own personal space, they log into the integration server, check out their branch, run their tests, etc. When all of that passes, their branch is merged into Trunk.
Trunk is rebuilt, the full suite of tests is run, and if all is good it gets the big ol' stamp of approval, is tagged in SVN, and is promoted to Production at the end of the night.
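The tagging step is just a cheap server-side copy in SVN; something like this, with a made-up repository URL and tag name:

    # record tonight's approved Trunk as a tag
    svn copy https://svn.example.com/repo/trunk \
             https://svn.example.com/repo/tags/nightly-$(date +%Y%m%d) \
             -m "Approved nightly build, promoted to production"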
If at any point a commit by someone else is made, we get an e-mail and can merge those changes into our individual branches.
Beanstalk has built-in post-commit hooks for deploying to development, staging, and production servers.
One way to use Subversion for PHP development is to set up a repository for one or all three developers, and use this repository more as a syncing tool than as true version control.
You could:
Make a repo
Add your project's entire PHP document structure
Check out a copy of this repo into the correct spot on your dev server
Use an SVN hook that activates on commit
This hook will automatically update the contents of the dev server whenever anybody on the team checks in any code.
The hook resides in:
svn_dir/repo_name/hooks/post-commit
And it could look like (--non-interactive keeps the hook from hanging on prompts, since hooks run without a terminal):
#!/bin/sh
/usr/bin/svn up /path_to/webroot --username svn_user --password svn_pass --non-interactive
That will update your working copy on the dev server to the latest check in.
What about something distributed? You could start with Mercurial, for example, try different workflows, and see which one fits you best.
Each of you could run it locally, or on your own dev server (or even on the same one with a different port...).
One possible way (there are probably better ways):
Each of you should have your own checked out version of the project.
Have a local copy of the server on your computer and test there throughout the day. Then at the end of each day (or whenever), you merge together whatever you are ready to test, check it out onto the dev server, and test it there.
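With Mercurial that daily round-trip is only a handful of commands; the repository URL is a placeholder:

    # one-time: everyone takes a full clone of the shared repo
    hg clone ssh://devserver//srv/hg/project project
    cd project

    # during the day: commit locally as often as you like
    hg commit -m "local work"

    # end of day: bring in everyone else's work, merge, and publish
    hg pull
    hg merge
    hg commit -m "merge with upstream"
    hg push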
Another tool you can use for the builds is TeamCity, which is free for 20 build configurations (enough for most small companies/projects). This way you can run your tests as well as schedule builds.