I'm developing a web app using the CodeIgniter PHP framework. The server I'm working with does not support any type of source control (e.g. Subversion) unless you go to a higher price tier.
I would still like to put the code under some sort of source control. Does it make sense to do the following:
Install git or SVN on my local machine and develop there
Copy changes from my local machine to development directory on the server (using FileZilla, WinSCP, etc.) and test
Copy changes from development directory to production directory on the server
Does that sound reasonable? Are there better alternatives? Thanks!
Using SVN only locally is a bit dangerous, because then you also need to protect (and back up) your own computer. I think the best way is to work with one of the commercial services offering fully supported SVN/Git hosting, like http://www.beanstalkapp.com/ or http://www.github.com
You could use source control on your local machine (SVN, Git, etc.) and use an open source tool like Capistrano to deploy the code from your local source control repo to your server via SSH. Or if you're limited to FTP, this blog post has a potential solution.
An advantage of using a tool like Capistrano instead of directly mirroring the files on your local machine to the server via FileZilla or WinSCP is that Capistrano will version your deployed files so that, if you end up breaking something and need to roll back quickly to the previously-deployed version, it can be as easy as changing a symlink to the previous deployment directory.
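To make the rollback mechanics concrete: Capistrano keeps each deploy in its own timestamped directory and points a "current" symlink at the live one. A rough sketch of that layout (the app path and timestamps here are invented, not from your setup):

    # Layout Capistrano maintains on the server:
    #   /var/www/myapp/releases/20240101120000/   <- one directory per deploy
    #   /var/www/myapp/current -> releases/20240101120000
    # Rolling back amounts to repointing the symlink at an older release:
    ln -sfn /var/www/myapp/releases/20231229090000 /var/www/myapp/current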
Does that sound reasonable?
Partially: for point 1, yes. But even in this case I'd suggest also keeping your repository with a hosting service (Bitbucket, GitHub, Assembla).
For points 2-3: your deploys need to be automatic and non-interactive, so you'll have to pick other tools (to run from the post-commit hook of your SCM of choice).
A somewhat better alternative to 2-3 might be:
Use two different branches (DEVEL and PROD) as the sources of the DEVEL and PROD directories
A post-commit hook that uploads only the files changed in the committed revision to the corresponding directory (ncftp for FTP, scp with a script for SSH) - see the sketch after this list
Main development happens in the DEVEL branch
PROD receives only merges from DEVEL
The workflow is SCM-agnostic and scales to any reasonable number of branches and developers
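As a rough sketch of that hook (branch names, paths and the host are placeholders, and this variant mirrors a server-side working copy with rsync rather than uploading file-by-file):

    #!/bin/sh
    # Hypothetical Subversion post-commit hook.
    # Keep one server-side working copy per branch, then mirror each one;
    # rsync only transfers the files that actually changed.
    svn update -q /srv/checkouts/DEVEL
    svn update -q /srv/checkouts/PROD
    rsync -az --delete --exclude='.svn' /srv/checkouts/DEVEL/ user@server:/var/www/devel/
    rsync -az --delete --exclude='.svn' /srv/checkouts/PROD/ user@server:/var/www/prod/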
I was inquiring about getting a Network Attached Storage (NAS) device so that I can work on dev sites from both my desktop and my laptop without duplicating files, and so I'd always have the most current files (just in case I forget to save). My question is: if I put sites on there that use PHP, would I be able to run the sites off the NAS as I would with MAMP / WAMP? Or would I still need something else to make that work?
The point of a NAS is to share files over a network. This is usually done via Windows File & Print Sharing (aka Samba aka SMB) which is supported on most platforms.
Some NAS devices might allow you to run a web server (particularly if you can install custom firmware), but it is a poor choice of platform to run anything remotely complex in terms of web server stacks.
You can certainly store your development files on a NAS, and then access them from webservers running in both your development environments.
… but that said, I'd look at using version control software (Git would be my preference), keeping your repository on the NAS and getting into the habit of saving, committing and pushing. It makes things more manageable in the long run. (You could also use a service like Bitbucket or Github and dispense with the local NAS entirely).
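For instance (the mount point and repository name are made up), a bare repository on the NAS share works as a plain Git remote:

    # One-time: create a bare repository on the mounted NAS share.
    git init --bare /Volumes/nas/repos/mysite.git
    # On each machine: clone it, then work, commit and push as usual.
    git clone /Volumes/nas/repos/mysite.git
    cd mysite
    git commit -am "describe the change"
    git push origin master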
You could also go a step further and run a server with CI software on it that monitors your repository and automatically pulls updates from it, runs your automated tests, and then updates a local test server.
I am assuming you use Windows (it's easier to do on a Mac, I think). With WAMP, what you can do is mount a network drive to, say, W:\, then create a virtual host that points to a folder on the W:\ drive.
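A minimal virtual host along those lines might look like this (the drive letter, server name and path are assumptions, not something from your setup):

    # In WAMP's httpd-vhosts.conf
    <VirtualHost *:80>
        ServerName mysite.local
        DocumentRoot "W:/mysite"
        <Directory "W:/mysite">
            Require all granted   # Apache 2.4; on 2.2 use "Allow from all"
        </Directory>
    </VirtualHost>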
On a Mac, all you need to do is mount the remote share into your MAMP directory and everything should work as you want.
Though personally I think this is a terrible idea, and I would much rather suggest you use a VCS (version control system) to share code between multiple places. Many of them are designed with exactly this problem in mind, and you get a useful history of your code at the same time. If you want to do some research, look at Git (currently the most popular); Bitbucket has free private repositories. You can read about what a VCS can do here: https://en.wikipedia.org/wiki/Version_control
I'm part of a team of 3 (2 developers and 1 designer) who sometimes work in the office and sometimes remotely, and I'm looking at a way of using Git to develop our websites seamlessly. I've got a managed account with Rackspace and have 3 servers set up on the account: development, staging and production.
I'm looking for the best way for our team to develop daily on our websites without having to FTP the files up to the server each time we make any changes. I've used SVN in the past, but I'm looking to use Git for version control. The workflow I had in mind for an example website called 'test' was the following:
Development Server would have a directory (called trunk but not sure if it should be called something else?) for each user as well as a central directory. E.g /var/www/test/jbloggs/, /var/www/test/asmith/, /var/www/test/rjohnson/ and /var/www/test/central/trunk/.
The central repository would be installed within /var/www/test/central/trunk/ and then /asmith/, /rjohnson/ and /jbloggs/ would clone the trunk which would mean they would become /var/www/test/asmith/trunk, /var/www/test/rjohnson/trunk/ and /var/www/test/jbloggs/trunk/.
Each user would then have a copy of /trunk/ containing all the website files. Each would have a subdomain configured (jbloggs.test.development, rjohnson.test.development, etc.) and would configure their IDE to automatically SFTP to the server, so that they are working directly within their directory on the development server. The central directory's domain will be test.development. To commit changes to the central repository, they will SSH into the server and commit; when we want to update the central copy, we will pull those changes to get the latest version, which can then be viewed at test.development.
Is this the right method of doing things, or should we all have a local LAMP stack installed (apart from the designer, who uses Windows) and have our repositories locally? If so, should the central repo still be on the Rackspace server? The developers will be using PhpStorm and the designer Dreamweaver.
Hope the above makes sense.
Thanks
I strongly advise you to work locally and then push to the shared server; this is what Git is made for. Development will be more responsive and easier for everybody. Make sure all the developers know Git well enough to manage their own local branches and experiments however they like. If one developer destroys the database, the others can keep working. But you'll also need a convenient way to sync databases, so that developers work against an up-to-date local database.
The rest of your chain is fine, and you can still have two test stages: a dev server for the dev team and a test server for testers. That lets testers work on a more stable version, and it also forces you to test the upgrade process each time you copy changes from the dev server to the test server. A lot of errors arise from untested upgrade procedures.
You can push changes to the test and production servers either by installing Git on them or by using a simple script that FTPs the changed files (see the sketch below). I don't like having Git on a production server, but that's a personal opinion.
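A bare-bones version of such a script might look like this (host, credentials and the remote path are placeholders):

    #!/bin/sh
    # Hypothetical deploy script: FTP only the files changed since the last
    # deploy. (Seed .last-deployed with an initial commit hash first.)
    PREV=$(cat .last-deployed)
    HEAD=$(git rev-parse HEAD)
    git diff --name-only "$PREV" "$HEAD" | while read -r f; do
        # ncftpput -m creates missing remote directories
        ncftpput -m -u deploy -p secret ftp.example.com "/www/$(dirname "$f")" "$f"
    done
    echo "$HEAD" > .last-deployed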
I'm just trying to find an easier way to deploy a site I'm working on. I'm working alone with a test and a production server, and right now deployment means copying a subset of the files and database data onto my computer and uploading it to the prod site. I'm sure there's a simple synchronization tool out there, but so far I've had no luck finding anything.
What I'd really like is an application I can run locally (on Windows), or something I could install on my server, that would let me have one-click deployment. Any suggestions?
Thanks!
godwin
Edit
I have decided for now to go with GoodSync and Toad. Thanks for the suggestions.
man scp
SCP(1) BSD General Commands Manual SCP(1)
NAME
scp - secure copy (remote file copy program)
SYNOPSIS
scp [-1246BCpqrv] [-c cipher] [-F ssh_config] [-i identity_file] [-l limit] [-o ssh_option] [-P port] [-S program] [[user@]host1:]file1
[...] [[user@]host2:]file2
DESCRIPTION
scp copies files between hosts on a network. It uses ssh(1) for data transfer, and uses the same authentication and provides the same
security as ssh(1). Unlike rcp(1), scp will ask for passwords or passphrases if they are needed for authentication.
Any file name may contain a host and user specification to indicate that the file is to be copied to/from that host. Copies between two
remote hosts are permitted.
When copying a source file to a target file which already exists, scp will replace the contents of the target file (keeping the inode).
If the target file does not yet exist, an empty file with the target file name is created, then filled with the source file contents. No
attempt is made at "near-atomic" transfer using temporary files.
The options are as follows:
-1 Forces scp to use protocol 1.
-2 Forces scp to use protocol 2.
...
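In practice, deploying with scp can be a one-liner (host and paths here are examples):

    # Recursively copy the local working copy into the dev docroot over SSH.
    scp -r ./mysite/ user@server:/var/www/dev/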
I use GoodSync http://www.goodsync.com/ for this sort of thing. It's really good. It runs on Windows and can sync between any combination of local files, (S)FTP, and Windows or Linux network shares, etc.
Then create a scheduled task/cronjob to run an export of the database into the syncronised folder and have one do an import at the other end. Obviously this process is one way.
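The database half of that could be as simple as two cron entries (database name, credentials and the synced path are assumptions):

    # Exporting side: dump the database into the synchronised folder nightly.
    0 2 * * * mysqldump -u appuser -pSECRET mydb > /sync/mydb.sql
    # Importing side: load the dump after the sync has run.
    0 3 * * * mysql -u appuser -pSECRET mydb < /sync/mydb.sql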
http://www.phing.info/docs/guide/stable/
Phing is an automated build system made for PHP. It works with Git, SVN, PHPUnit, etc.
You basically set up XML files that give Phing instructions on what to do. It lets you run test suites along with build creation, build multiple variant versions at a time, copy files as well as databases, and a bunch of other useful things; see the minimal example below.
Also, it's open source and platform independent.
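A minimal build file, just to give the flavour (the project name, paths and target here are invented):

    <?xml version="1.0"?>
    <!-- Hypothetical build.xml: running "phing deploy" copies the site over -->
    <project name="mysite" default="deploy">
        <target name="deploy">
            <copy todir="/var/www/prod">
                <fileset dir="./src">
                    <include name="**/*.php"/>
                </fileset>
            </copy>
        </target>
    </project>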
What are you using for source control? Some tools like Git and SVN have ready-made methods for this sort of thing. See here for a quick Git solution.
I would second the advice about Git/SVN, but would put in a strong plug for Git via GitHub. Use GitHub as your "central" Git repository. Your local Git repository will push to GitHub, and your production server will pull from GitHub.
There is some overhead to learning Git/GitHub, but really, in the situation you've described (a single engineer and two servers), Git isn't any more complicated than SVN (or CVS or anything else).
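In outline (branch and path names are illustrative):

    # On your machine: push finished work up to GitHub.
    git push origin master
    # On the production server: pull the same branch back down.
    cd /var/www/mysite && git pull origin master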
We use an FTP Synchronizer, which seems to work pretty well. I don't know offhand of any good free ones.
Example: http://www.ftpsynchronizer.com/
It depends on what type of server you are running, but you could run SVN (Subversion). There are plugins for Eclipse, Aptana, and Zend Studio if you use one of those to develop.
Essentially you could have a development repository that sits on the server. You would pull your code down to your local environment and commit it back after changes. Then you can set up another repository for your live/production code that's linked back to your development repository.
When you want to update the live site, you just update that repository, so if any trouble happens you can roll back the live code without having to roll back your development code. Once you get good at all that, you can start branching and tagging your projects.
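In shell terms, updating and rolling back a working-copy deployment are each one command (the revision number is just an example):

    # Pull the latest committed code onto the live site.
    svn update /var/www/mysite
    # Trouble? Roll the working copy back to a known-good revision.
    svn update -r 1234 /var/www/mysite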
I personally use both SVN and Git, but I prefer Git because it works so much better. Though if you are using Windows, the command-line tools just aren't the same as on Linux.
So we are pushing to create good processes in our office. I work in a web shop that has been doing web sites for over a decade. And we don't use version control. I know! It's bad, not my fault. I'm the guy with a SoftE background pushing for this at a minimum.
The tech lead has been looking into it. We all use Mac workstations and mostly use Coda for editing since it is a great IDE. It has SVN support built in but expects it to work on local files. We're trying to explore mounting the web directory as a local network drive with an SFTP tool.
We are a LAMP shop, BTW.
I am wondering what the model is here. I think the typical model would be to check out the whole site to our local machines, where we have Apache running, and test it there? That isn't how we work yet; we do everything on the server. We've looked at checking things in and out, but some files are owned by apache, and the ownership changes when I check them in, because I'm not apache.
I just want to know a way to do this that works given my circumstances. Would be nice to not have to run apache locally.
You might want to check out the Coda mailing list and ask there. Lots of Coda enthusiasts there with specific experience.
If you don't want to run Apache locally, you could make Apache on your server run a copy of the site for every developer, on a different port per person, and then mount those web roots on the local Macs and make them the working directories. If you're a small shop, that's not hard to manage; I find it pretty easy to set up, and it saves a lot of resources on the local machines. One site per person helps avoid conflicts when multiple people are working on the same files at the same time.
What I'd additionally recommend is to have a script that gets the latest changes from SVN and deploys the entire site to the production server when you're ready. You could have that script change permissions on appropriate files/folders as needed to be owned by Apache. The idea once you're using source control is to never manually edit the production files -- you should have something that deploys it from SVN for you.
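A bare-bones version of that deploy script might be (the repository URL, paths and Apache user are placeholders):

    #!/bin/sh
    # Hypothetical deploy: export a clean copy (no .svn dirs) from SVN...
    svn export --force http://svn.example.com/mysite/trunk /var/www/mysite
    # ...and give Apache ownership of anything it must write to.
    chown -R www-data:www-data /var/www/mysite/uploads /var/www/mysite/cache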
A few notes:
Take a look at MacFUSE / MacFusion (the latter is the application, the former is the library behind it) to mount remote directories via SSH / FTP as local ones; there's a short sshfs sketch after these notes.
Allow your developers to check out into their local environment (with their own LAMP stack if they're savvy), or look into a shared dev environment with individual jails. This way your developers can run their own LAMP stack (which you could deploy for them on the machine) without interfering with others.
The idea being, let them use a workflow that works best for them, to minimize the pain in adapting to this change (if change management might be an issue!)
Just as an example, we have a shared dev server where jails are created with a single command for new developers. They have a full LAMP stack ready to go, and we can upgrade and re-deploy jails easily to keep software up to date. Developers have individual control to add custom settings / extensions if they need them for work, while the sysadmins have the ability to reset everything when someone accidentally breaks their environment :)
Those who prefer not to use jails, and are able to, manage their own local environments (typically through Macports or MAMP).
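For the mounting approach in the first note above, the command comes down to something like this (host and paths invented; needs MacFUSE plus sshfs installed):

    # Mount a remote docroot locally over SSH.
    sshfs dev@devserver:/var/www/mysite ~/mounts/mysite
    # Unmount when finished.
    umount ~/mounts/mysite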
I've read a number of topics in the same sort of ballpark as this one, but in all honesty I'm still not exactly sure of the best approach (as a starting point). I am a solo developer in a small office and I have around 30 websites hosted on a Linux VPS. I want to start using version control (probably SVN) and also set up a staging server. At the moment, I do development either locally on my machine before using FTP to upload to the live server, or occasionally, for small changes, I edit the remote files directly, which is not an ideal approach.
I'm looking for some guidance on how to improve my development environment. I imagine I should be installing SVN on the web server, which would then allow me to check out versions to my local machine (which would also require SVN, I think). Also, if I want to set up a staging server, should I just set up subdomains for each of the live websites, then use these subdomains for showing clients changes to the site before making them live?
Hope this makes sense!
This is what we do at work:
We have a staging server running Apache and a Subversion server. We have a post-commit hook that updates a working copy in the htdocs directory; that way, when a developer commits something, it automatically gets updated on the staging server and everyone can see the latest code.
On the client's production servers (the ones we can control) we have the Subversion client installed and the website is a working copy. When we need to update the live site we login to a shell and run svn up. If you do something like this, make sure to limit access to the .svn directories, either with .htaccess files or from the main Apache config.
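In its simplest form (paths assumed), the hook plus the Apache lockdown look like this:

    #!/bin/sh
    # post-commit hook on the Subversion server: refresh the staging copy.
    svn update -q /var/www/htdocs
    # And in the Apache config (2.4 syntax), deny access to .svn directories:
    # <DirectoryMatch "\.svn">
    #     Require all denied
    # </DirectoryMatch>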
We have a custom app that manages the projects, but that is only because we're lazy and don't want to setup each project by hand, the app creates the necessary directories and working copies. You could write a quick script to do this.
We never, ever, edit files via FTP on the live site. All in all we have been using this setup for almost 2 years, and aside from the occasional conflict on the staging server, we have never had any problems.
You can actually install the SVN server on your local machine, which I would recommend in lieu of installing it on the web server (assuming you make backups). The easiest thing to do, since it’s only you using it, would be to use the file:// protocol, but using svnserve is a little more robust, and the preferred method if you want to take the time to do it.
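For example (the repository path is made up):

    # Create a local repository and a project root, then check out via file://.
    svnadmin create ~/svnrepo
    svn mkdir -m "project root" file://$HOME/svnrepo/mysite
    svn checkout file://$HOME/svnrepo/mysite ~/work/mysite
    # Or serve it with svnserve instead, which is the more robust route:
    svnserve -d -r ~/svnrepo    # then check out from svn://localhost/mysite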
@Michael, I disagree: I would say it's better to install it on the Linux VPS, especially since you are already paying for the hosting service. I find it very helpful to be able to browse and download stuff from my SVN repo wherever I am, from whatever computer I'm on.
@nicky, I started with SVN (and version control) several years ago, and I took baby steps, which made it easier to tackle.
If I had to do it over again, I'd read the SVN book to start with. The book is very well laid out and doesn't take more than 1-2 days to plow through.
While you're reading, install SVN on your Linux VPS with an Apache front end.
Once you have that up, pick one of your websites and import it into svn. This is how I structure my svn repo. For example, say my repo is hosted at http://mysvn.mydomain.com/svn/:
mywebsite1
- trunk
- tags
- branches
mywebsite2
- trunk
- tags
- branches
Don't worry about creating the perfect structure. It's pretty easy to re-organize especially when you're starting out. After you import a few projects into svn, you'll start to get a feel for which projects should have their own "trunk/tags/branches" dir structure and which can be combined.
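Creating that layout and importing an existing site only takes a couple of commands (using the example URL above):

    # Create the trunk/tags/branches layout for one project in a single commit.
    svn mkdir --parents -m "layout" \
        http://mysvn.mydomain.com/svn/mywebsite1/trunk \
        http://mysvn.mydomain.com/svn/mywebsite1/tags \
        http://mysvn.mydomain.com/svn/mywebsite1/branches
    # Import the existing site into trunk.
    svn import ./mywebsite1 http://mysvn.mydomain.com/svn/mywebsite1/trunk -m "initial import"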
For creating test environments, I do exactly what you describe. I use build scripts to check out from SVN and download files into directories that are mapped to subdomains like "test.clientsite.com" (I work primarily in Java and use Ant and Maven, but you can use whatever scripting language you're familiar with).
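The checkout half of such a build script can be a single line (the subdomain docroot path is an assumption):

    # Refresh the copy served at test.clientsite.com.
    svn checkout -q http://mysvn.mydomain.com/svn/clientsite/trunk /var/www/test.clientsite.com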
Once you get used to version control, you'll never go back, good luck!