My friend and I are in different countries and have been developing a LAMP web app for several weeks. All this time we have been sharing source code over FTP, and the PHP files have become messy this way. I have heard about CVS and have been reading about it, but I still cannot figure out how it works exactly.
How could CVS help me in this matter?
I would much appreciate it if someone could point me in the right direction.
OK, here comes a very simple explanation of version control. After using it for a while you'll laugh at this explanation, but for now I guess it should help you.
What are the problems with your current FTP file sharing?
If two people upload the same file, one of the copies gets overwritten
After uploading a file you'll only see who changed it last, but not where it was changed
You can't provide information about the changes (other than putting comments in the files themselves)
You can't go back in time; once a file is overwritten, the old version is lost
With version control you can solve these problems:
Files either get merged into one new version or get overwritten, but the old version is still stored so you can roll back if needed
You can see who made which changes when
You can provide comments when you "upload" your files describing what changed (without storing these comments inside the files)
You can always go back in time and restore old "uploads"/changes
You can also create small side projects by branching. This basically lets you split your project into smaller pieces and work on them separately.
So at the beginning of your work you usually bring your local sources up to date by getting all the changes that have been made. Then you do your work, and afterwards you update the online version with your changes so that other developers can pull them and continue working on them or integrate them into their own changes.
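For example, a minimal sketch of that daily cycle with git might look like this (the remote name, branch and file path are just placeholders, not a prescribed setup):

    # Get the latest changes made by the other developer
    git pull origin master

    # ... edit files locally ...

    # Record your change together with a comment describing it
    git add application/models/user.php
    git commit -m "Fix e-mail validation in the user model"

    # Publish the change so your friend can pull it
    git push origin master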
How to implement this sorcery?
You could google for "how to set up git" or "how to set up svn", but as a beginner I would recommend using an online hosting service. Here is a list of services: https://git.wiki.kernel.org/index.php/GitHosting
My personal preference for closed-source projects with a small number of developers is https://bitbucket.org/. Some of these services also provide a small wiki and a bug-tracking tool. If you want to use Bitbucket, here is the very easy to understand documentation: https://confluence.atlassian.com/display/BITBUCKET/Bitbucket+101
Important to know:
Soon you'll learn that you don't really upload files, as I've written multiple times, but rather change lines of code. You also don't "upload" them, you "commit" them.
While cvs could help, not many developers will recommend using it for new projects. It has largely been replaced with Subversion (svn), but even that is falling out of favour. Many projects these days use distributed version control with git or Mercurial (hg).
A good introduction to git can be found in the free online book Pro Git.
In any case, these things are all version control systems. They help to synchronize the code between developers, and also let you track
who changed code,
when it was changed,
why it was changed, and
how it was changed.
This is very important on projects with multiple developers, but there is value in using such a system even when working on your own.
Related
Until now, I have always used the FTP transfer provided with NetBeans. Every time I write and save, it uploads the file to the server.
Now I have a project that my co-worker and I have to work on. The method above isn't the best anymore, because if I make some changes and save, and so does my co-worker, we overwrite each other's modifications.
What's the best way to cooperate on the same PHP/CSS/HTML project? Could you provide some guide too?
Thanks.
PS: OS Win7.
I personally use Mercurial (Hg) when working on projects with a team.
There is a Tutorial for using Mercurial with NetBeans.
When your team is small you could use bitbucket.org to host your project, as it is free and private (but in the free version limited to 5 people per project, I think). Otherwise you could easily set up your own hg server. There are plenty of guides on how to use Mercurial on their website.
In case you are not familiar with distributed version control systems, in a nutshell it works like this:
When a file is created it gets pushed to the server
As people make changes to their local copy of that file (or the files) they create changesets which contain only the changes they made.
Everyone pushes their changesets to the server from time to time and pulls the changes the others have made. The changesets are then merged to contain everybody's changes.
The files on your computer are updated according to the changesets you have pulled, so that everybody's files are up to date.
This works pretty well as long as you don't edit the same lines of code (the automatic merge then fails, because it doesn't know what to keep and what to discard, and you have to merge manually) and your files are not in a binary data format.
But in your case HTML, PHP and CSS are all text based, so you are good to go.
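If it helps, here is a rough sketch of that cycle with Mercurial on the command line (the repository URL is just a placeholder; the NetBeans integration offers the same operations through its menus):

    # Get a local copy of the shared repository
    hg clone https://bitbucket.org/yourteam/yourproject
    cd yourproject

    # ... edit files ...

    # Record a changeset containing only your changes
    hg commit -m "Restyle the login form"

    # Pull the changesets the others have made and merge them with yours
    hg pull
    hg merge                  # only needed if both sides committed in the meantime
    hg commit -m "Merge"

    # Publish the combined changesets back to the server
    hg push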
I hope this helped. If you have any questions feel free to ask.
Sorry I could not post more links because my reputation is too low.
I am developing an application in the Kohana PHP framework that assesses performance. The end result of the process is a webpage listing the overall scoring and a color coded list of divs and results.
The original idea was to have the option to save this as a non-editable PDF file and email it to the user. After further research I have found this to be not as straightforward as I hoped.
The best solution seemed to be installing the Unix application wkhtmltopdf, but as the destination is shared hosting I am unable to install this on the server.
My question is: what's the best option to save a non-editable review of the assessment for the user?
Thank you for help with this.
I guess the only way to generate a snapshot, or review as you call it, is by storing it on the server side and only granting access via a read-only protocol. So basically by offering it as a 'web page'.
Still, everyone can save and modify the markup. But that is the case for every file you generate, regardless of the type of file. OK, maybe except for DRM-infected files. But you don't want to do that, trust me.
Oh, and you could also print the files. Printouts are pretty hard to edit. Though even that is not impossible...
I found a PHP version that is pre-built as a Kohana Module - github.com/ryross/pdfview
I have seen a few that only have the checkout functions and options to view the diff and revisions but none with the option to commit changes, so my question:
Any PHP SVN GUI with commit functionality?
Update:
To simplify even more:
I need a browser-based interface to an SVN repo that supports write operations. Do you know of any?
I suspect the reason one doesn't exist is because it'd be clumsy to use.
You wouldn't have direct file system access (I don't think even Flash or Silverlight provide that, but I could be wrong), so you'd need to upload the files you want to commit or diff.
If you're not using any browser plugins you'd be further limited to uploading a single file at a time, which isn't the way SVN is meant to work. You commit a whole changeset, not one file at a time.
The only feasible way to do this I guess is to have your working copy also on the server and do your editing in the browser as well. At which point though you're not looking for just a web SVN GUI, you're looking for a web-based IDE that has SVN support.
--Edit--
The closest I've been able to find is http://kodingen.com/, which is still in beta and looks like it plans to have SVN support, but it's not currently available in the beta version (at the time of this answer).
I have a problem with the maintenance of my PHP-based website. My website is built on the Zend Framework. When I wish to upload a new copy or version online, the site understandably won't run during the upload, especially while crucial files like models and controllers are being uploaded and rewritten.
Is there a way to upload a website without having to go through this issue?
My updates are really quite regular, let's say once or twice a week in this case.
You can make use of the fact that renaming directories is quick and easy even through FTP. What I usually do is:
Have two directories, website_live and website_upload
website_live contains the live website (obviously)
Upload contents to website_upload
Rename website_live to website_old (or whatever)
Rename website_upload to website_live
Done! Downtime is less than two seconds if you rename quickly.
It gets a bit more complex if you have uploaded content in the old version (e.g. from a CMS) that you need to transfer to the new one. It's cumbersome to do manually every time, but basically, it's just simple rename operations too (renaming works effortlessly in FTP as well).
This is a task that can be automated quite nicely using a simple deployment script. If you're on Linux, setting up a shell script for this is easy. On Windows, a very nice tool I've worked with to do automated FTP synchronizing, renaming and error handling - even with non-technical people starting the process - is ScriptFTP. It comes with a good scripting language, and good documentation. It's not free, though.
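If you do have shell access to the server, a minimal sketch of such a deployment script might look like this (directory names as in the steps above; over plain FTP you would issue the same renames from your FTP client or ScriptFTP instead):

    #!/bin/sh
    # Assumes the new version has already been uploaded to website_upload.
    set -e

    rm -rf website_old                # drop the previous backup
    mv website_live website_old       # keep the old version around for rollback
    mv website_upload website_live    # the new version goes live

    # Rollback is just the reverse:
    #   mv website_live website_broken && mv website_old website_live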
If you're looking to get into hard-core automated PHP deployment, I've been doing some research in that field recently. Maybe the answers to my recent bounty question can give you inspiration.
At my company we have a group of 8 web developers for our business web site (entirely written in PHP, but that shouldn't matter). Everyone in the group is working on different projects at the same time, and whenever they're done with their task, they immediately deploy it (because business is moving fast these days).
Currently the development happens on one shared server with all developers working on the same code base (using RCS to "lock" files away from others). When deployment is due, the changed files are copied over to a "staging" server and then a sync script uploads the files to our main webserver from where it is distributed over to the other 9 servers.
Quite happily, the web dev team asked us for help in order to improve the process (after we had been complaining for a while), and now our idea for setting up their dev environment is as follows:
A dev server with virtual directories, so that everybody has their own codebase,
SVN (or any other VCS) to keep track of changes
a central server for testing holding the latest checked in code
The question is now: how do we manage to deploy the changed files to the server without accidentally uploading bugs from other projects? My first idea was to simply export the latest revision from the repository, but that would not give full control over the files.
How do you manage such a situation? What kind of deployment scripts do you have in action?
(As a special challenge: the website has organically grown over the last 10 years, so the projects are not split up in small chunks, but files for one specific feature are spread all over the directory tree.)
Cassy - you obviously have a long way to go before you'll get your source code management entirely in order, but it sounds like you are on your way!
Having individual sandboxes will definitely help things. Next, make sure that the website is ALWAYS just a clean checkout of a particular revision, tag or branch from Subversion.
We use git, but we have a similar setup. We tag a particular version with a version number (in git we also get to add a description to the tag; good for release notes!) and then we have a script that anyone with access to "do a release" can run that takes two parameters -- which system is going to be updated (the datacenter and if we're updating the test or the production server) and then the version number (the tag).
The script uses sudo to then run the release script under a shared account. It does a checkout of the relevant version, minifies JavaScript and CSS [1], pushes the code to the relevant servers for the environment, and then restarts what needs to be restarted. The last line of the release script connects to one of the webservers and tails the error log.
On our websites we include an html comment at the bottom of each page with the current server name and the version -- makes it easy to see "What's running right now?"
[1] ...and a bunch of other housekeeping tasks like that...
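A rough sketch of what such a release script might look like (the repository URL, host names and the minify step are placeholders, not our actual setup):

    #!/bin/sh
    # Usage: release.sh <environment> <tag>, e.g. release.sh production v2.3.1
    set -e
    ENVIRONMENT=$1
    TAG=$2

    # Check out exactly the tagged version
    git clone --branch "$TAG" git@example.com:company/website.git build

    # Housekeeping: minify JavaScript and CSS, etc. (placeholder script)
    build/scripts/minify.sh build/public

    # Push the code to the web server(s) for this environment
    rsync -az --delete build/ "deploy@web1.$ENVIRONMENT.example.com:/var/www/site/"

    # Restart what needs restarting, then tail the error log
    ssh "deploy@web1.$ENVIRONMENT.example.com" \
        "sudo service apache2 restart && tail -f /var/log/apache2/error.log"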
You should consider using branching and merging for individual projects (on the same codebase), if they make huge changes to the shared codebase.
We usually have a local dev environment (meaning, a web server running locally) for testing uncommitted code (you don't want to commit non-functioning code at all), but that dev environment could even be on a separate server using shared folders.
However, committed code should be deployed to a staging server for testing before putting it into production.
You can probably use Capistrano; even though it is more for Ruby, there are some articles that describe how to use it for PHP.
I think Phing can be used with CVS but not with SVN (at least that's what I last read).
There are also some projects around that mimic Capistrano but are written in PHP.
Otherwise there is also a custom-made solution:
Tag the files you want to deploy.
Check out the files using the tag into a specific directory.
Symlink the directory to the current document root (easy to roll back to the previous version).
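A minimal sketch of that approach with SVN, assuming the web server's document root points at /var/www/current (repository URL and paths are placeholders):

    #!/bin/sh
    # Usage: deploy.sh <tag>, e.g. deploy.sh release-2011-04-01
    set -e
    TAG=$1

    # 1. Check out the tagged files into a version-specific directory
    svn checkout "http://svn.example.com/repo/tags/$TAG" "/var/www/releases/$TAG"

    # 2. Point the document root at the new checkout
    ln -sfn "/var/www/releases/$TAG" /var/www/current

    # Rollback: point the symlink back at the previous release directory.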
Naturally check out SVN for the repository, Trac to track things, and Apache Ant to deploy.
The basic process is managing the code in Subversion, tracking the repository and developers in Trac, and using Ant deployment scripts to push your site out with the settings needed. Ant allows you to easily deploy a project to a specific location (dev/test/prod, etc.).
You need to look at:
Continuous Integration
Running unit tests on check-in of code to check it is bug free
Potentially rejecting code if it contains a bug
Having nightly builds
Releasing only the last build that was bug free
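A nightly build can be as simple as a cron job that checks out the latest code, runs the test suite, and only marks the build as releasable if it passes. A rough sketch, assuming PHPUnit and an SVN repository (all paths and URLs are placeholders):

    #!/bin/sh
    set -e
    DATE=$(date +%Y%m%d)

    # Fresh checkout of the latest code
    svn checkout http://svn.example.com/repo/trunk "/builds/nightly-$DATE"
    cd "/builds/nightly-$DATE"

    # Run the unit tests; only keep builds that pass
    if phpunit tests/; then
        ln -sfn "/builds/nightly-$DATE" /builds/last-good
    else
        echo "Nightly build $DATE failed the test suite" >&2
        exit 1
    fi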
You may not get to a perfect solution, especially not at first, but the more you use your chosen solution, the more comfortable everyone will get and be able to make suggestions on improving it.
We check for stability with Ant every night, and use an Ant script to deploy. It is very easy to configure and use.
I gave a similar answer yesterday to another question. Basically you can work in branches and integrate before going live.
The biggest thing you will have to get your head around is that you are dealing with changes to files, rather than individual files. Once you have branches there isn't really a current version; there are just versions with different changes in them.