Recently I've read this article: http://www.smashingmagazine.com/2009/09/25/svn-strikes-back-a-serious-vulnerability-found/
Developers of many popular sites like apache.org, php.net (http://ru2.php.net/.svn/entries), classmates.com and Russia's Yandex use SVN, but do not follow SVN's own recommendation to deploy with svn export.
So, what are the reasons for updating a public working copy, as they all do, instead of using svn export?
Some people (not including myself) think that to deploy onto production you should just issue an svn up. If you do an export, you lose the versioning metadata, so you can't do that; you have to use another mechanism for tracking which version is where. It is an easy solution, but I think it can make for lazy packaging, and also for "fixing in production", since if you do this you can also check back in from production...
From my perspective, what I do is block access to any .svn files and folders on the server (either Apache 2 or IIS). That way the hidden folders are not accessible externally, and it allows version tracking for the sites we use which do not require compiling before rollout (see the sketch after the list below).
Languages like:
PHP
ASP (not .NET)
PLAIN HTML
COLDFUSION
PDF / IMAGE versioning (if needed, in my case we needed it for updated PDF docs for customers).
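As an illustration of that blocking, here is a minimal sketch for Apache 2.4; IIS would use request filtering instead, and you should adapt this to your own setup:

# Deny all web access to SVN metadata directories (Apache 2.4 syntax)
<DirectoryMatch "\.svn">
    Require all denied
</DirectoryMatch>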
So you certainly can use SVN for web development, but you do need to be careful, as you expose your .svn folders to the world if you are not. Otherwise it is a tool you can use to make your job easier and more efficient.
With that said, we simply run an svn update on production to update changed files, and with a limited number of developers working on one piece of code at a time (like I said, in my case) we don't get mix-ups with the wrong things getting deployed. Plus, to be safe, always do a "check for modifications" (svn status) first to see what is going to be updated; and hey, if you do make a mistake, roll it back.
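Spelled out, that routine amounts to something like this (the path and revision numbers are only examples):

# Preview incoming changes before touching production
svn status -u /var/www/site

# Apply the update
svn update /var/www/site

# If a mistake slipped through, reverse-merge the bad revision and commit
cd /var/www/site
svn merge -r 1234:1233 .
svn commit -m "Roll back r1234"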
With svn export, files can never get deleted, only added and modified. This can be an issue sometimes.
When the entire website is open source and available for downloading from a public resource (like PHP's is), protecting the .svn directories so others can't get the source code is probably not worth the effort over simply doing an svn up.
I've built a CMS (using the Codeigniter PHP framework) that we use for all our clients. I'm constantly tweaking it, and it gets hard to keep track of which clients have which version. We really want everyone to always have the latest version.
I've written it in a way so that updates and upgrades generally only involve uploading the new version via FTP and deleting the old one; I just don't touch the /uploads or /themes directories (everything specific to the site is either there or in the database). Everything is a module, and each module has its own version number (as does the core CMS), as well as an install and uninstall script for each version, but I have to manually FTP the files first, then run the module's install script from the control panel. I wrote and will continue to write everything personally, so I have complete control over the code.
What I'd like is to be able to upgrade the core CMS and individual modules from the control panel of the CMS itself. This is a "CMS for Dummies", so asking people to FTP or do anything remotely technical is out of the question. I'm envisioning something like a message popping up on login, or in the list of installed modules, like "New version available".
I'm confident that I can sort out most of the technical details once I get this going, but I'm not sure which direction to take. I can think of ways to attempt this with cURL (to authenticate and pull source files from somewhere on our server) and PHP's native filesystem functions like unlink(), file_put_contents(), etc. to perform the actual updates to files, or to stuff the "old" CMS in a backup directory and set up the new one; but even as I'm writing this post, it sounds like a recipe for disaster.
I don't use git/github or anything, but I have the feeling something like that could help? How should (or shouldn't) I approach this?
There are a bunch of ways to do this, but the least complicated is just to have Git installed on your client servers and set up a cron job that runs a git pull origin master every now and then. If your application uses migrations, it should be easy as hell to do.
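A minimal sketch of such a cron entry, assuming the web root is a Git clone with an origin remote (the path, log file and schedule are illustrative):

# Pull the latest master into the client's web root every 15 minutes
*/15 * * * * cd /var/www/client-site && git pull origin master >> /var/log/cms-update.log 2>&1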
You can do this, as it sounds like you are in full control of your clients. For something like PyroCMS or PancakeApp that doesn't work, because anyone can have it on any server, so we have to be a little smarter. We just download a ZIP which contains all changed files and a list of deleted files, which means the file system is updated nicely (roughly as sketched below).
We have a list of installations which we can ping with an HTTP request so the system knows to run the download, or the client can click "Upgrade" when they log in.
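For illustration, a rough PHP sketch of that ZIP-plus-deletion-list approach; the update URL and the deleted-files manifest are hypothetical, not the actual PyroCMS mechanism:

<?php
// Hypothetical update endpoint and manifest; adjust to your own scheme.
$updateUrl = 'https://updates.example.com/cms/latest.zip';
$zipPath   = sys_get_temp_dir() . '/cms-update.zip';

// Download the package of changed files plus a deletion list
file_put_contents($zipPath, file_get_contents($updateUrl));

$zip = new ZipArchive();
if ($zip->open($zipPath) === true) {
    $zip->extractTo(__DIR__);   // overwrite changed/added files in place
    $zip->close();
}

// Remove files the new release no longer ships (one relative path per line)
$manifest = __DIR__ . '/deleted-files.txt';
if (is_file($manifest)) {
    foreach (file($manifest, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $path) {
        $file = __DIR__ . '/' . ltrim($path, '/');
        if (is_file($file)) {
            unlink($file);
        }
    }
}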
You can use Git from your CMS via Glip, a Git library written in pure PHP. The cron job would then just hit a URL on your own system, without installing Git.
@Obsidian: Wouldn't a DNS poisoning attack also compromise most of the methods mentioned in this thread?
Additionally, SSH could be compromised by a man-in-the-middle attack as well.
While total paranoia is a good thing when dealing with security, WordPress being a GPL codebase would make it easy to detect an unauthorized code change if such an attack did occur, so resolution would be easy.
SSH and Git do sound like a good solution, but what is the intended audience?
Have you taken a look at how WordPress does it?
That would seem to do what you want.
Check this page for a description of how it works: http://tech.ipstenu.org/2011/how-the-wordpress-upgrade-works/
I have seen a few PHP SVN GUIs that have checkout functions and options to view diffs and revisions, but none with the option to commit changes. So my question:
Is there any PHP SVN GUI with commit functionality?
Update:
To simplify even more: I need a browser-based interface to an SVN repo that supports write operations. Do you know of any?
I suspect the reason one doesn't exist is because it'd be clumsy to use.
You wouldn't have direct file system access (I don't think even Flash or Silverlight provide that, but I could be wrong), so you'd need to upload the files you want to commit or diff.
If you're not using any browser plugins you'd be further limited to uploading a single file at a time, which isn't the way SVN is meant to work. You commit a whole changeset, not one file at a time.
The only feasible way to do this, I guess, is to have your working copy on the server as well and do your editing in the browser too. At that point, though, you're not looking for just a web SVN GUI; you're looking for a web-based IDE that has SVN support.
--Edit--
The closest I've been able to find is Kodingen (http://kodingen.com/), which is still in beta. It looks like it plans to have SVN support, but that isn't currently available in the beta version (at the time of this answer).
I am developing (as a solo web developer) a rather large web-based system which needs to run at various different locations. Unfortunately, due to some clients having dialup, we have had to do it this way rather than have a central server for them all. Each client is part of our VPN, and those on dialup/ISDN get dialled on demand by our Cisco router. All clients are accessible within a matter of seconds.
I was wondering what the best way would be to release an update to all these clients at once. Automation would be great, as there are 23+ locations to deploy the system to, each of which is used on a very regular basis. Because of this, when deploying I need to display an 'updating' page so that the clients don't try to access the system while the update is partially complete.
Any thoughts on what would be the best solution?
EDIT: Found FileSyncTask, which allows me to rsync with Phing. Going to use that.
There's also a case here for maintaining a "master" code repository (in SVN, CVS, or maybe Git). This isn't your standard "keep editions of your code in the repo and allow rollbacks" use; this repo holds your current production code only. Once an update is ready, you check the working updated code into the master repo. All of your servers check the repo on a scheduled basis to see if it has changed, downloading new code if a change is found. That check process could even include turning on the maintenance.php file (that symcbean suggested) before starting the repo download and removing the file once the download is complete, along the lines of the sketch below.
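Sketched out, that scheduled check could look something like this (the paths and the maintenance-page mechanics are illustrative):

#!/bin/bash
# Cron job: update production from the "master" repo only when it has changed.
WC=/var/www/site
LOCK="$WC/maintenance.php"

LOCAL=$(svn info "$WC" | awk '/^Revision:/ {print $2}')
REMOTE=$(svn info -r HEAD "$WC" | awk '/^Revision:/ {print $2}')

if [ "$LOCAL" != "$REMOTE" ]; then
    cp /usr/local/share/maintenance.php "$LOCK"   # turn the maintenance page on
    svn update "$WC"                              # pull the new production code
    rm -f "$LOCK"                                 # turn it back off
fi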
At the company I work for, we work on huge web-based systems in both Java and PHP. For all systems we have our development environments and production environments.
This company has over 200 developers, so I guess you can imagine the size of the products we develop.
What we have done is use Ant and RPM for creating deployment packages; this is done quite easily. I haven't done it myself, but it might be worth your while to look into.
Because we use Linux systems, we can easily deploy RPM packages, and the setup scripts within an RPM package can make sure everything gets to the correct place. You also get more proper version handling and a release process.
Hope this helped you.
Br,
Paul
There are two parts to this; let's deal with the simple one first:
I need to display a 'updating' page
If you need to disable the entire site while maintaining transactional integrity, and publish a message to the users from the server being updated, then the only practical way to do this is via an auto-prepend file. This needs to be configured in advance (note: I believe this can be done using a .htaccess file, without having to restart the webserver for a new PHP config):
<?php
// Auto-prepended to every request: if the lock file exists,
// serve the maintenance page instead of the requested script.
if (file_exists($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php')) {
    include_once($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php');
    exit;
}
Then just drop maintenance.php into your webroot and that file will be displayed instead of the expected one. Note that it should probably include a session_start() and an auto-refresh to ensure the session does not expire. You might want to extend the above to allow a grace period where POSTs will still be processed, e.g. by adding a second PHP file.
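For reference, the .htaccess side of this under mod_php would be a single directive along these lines (the include path is illustrative):

# Prepend the maintenance check to every PHP request (mod_php)
php_value auto_prepend_file /var/www/includes/check_maintenance.php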
In terms of deploying to remote sites, I'd recommend using rsync over ssh for copying content files, invoked via a controlling script which (see the sketch after this list):
Applies the lock file(s) as shown above
Runs rsync to replicate files
Runs any database deployment script
Removes the lock file(s)
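Put together, such a controlling script might look roughly like this (host, paths and the migration script are illustrative):

#!/bin/bash
# Deploy one remote site: lock, sync, migrate, unlock.
set -e
HOST=deploy@site1.example.com
DOCROOT=/var/www/site

# 1. Apply the lock file so visitors get the maintenance page
ssh "$HOST" "cp /usr/local/share/maintenance.php $DOCROOT/maintenance.php"

# 2. Replicate content files, skipping SVN metadata and the lock itself
rsync -az --delete --exclude='.svn' --exclude='maintenance.php' ./build/ "$HOST:$DOCROOT/"

# 3. Run any database deployment script remotely
ssh "$HOST" "php $DOCROOT/scripts/migrate.php"

# 4. Remove the lock file
ssh "$HOST" "rm -f $DOCROOT/maintenance.php"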
If each site has a different setup then I'd recommend either managing the site-specific stuff via a hierarchy of include paths, or even maintaining a complete image of each site locally.
C.
I currently FTP all my files to my website when I do an update (over a slowish ADSL connection).
I want to make things easier, so I recently started using a hosted SVN service, and I thought I could speed things up a bit by doing an svn export of my website directly onto my webserver.
I have tried that a few times and it seems to work OK; however, it fetches the entire site every time, which is a bit slow for a one-file update.
So my questions are:
Is it possible to do an export and only get the changes since the last export (and how would this handle deleted files)?
Or would it be easier to do an svn checkout and svn update it all the time, instead of svn export, and just hide the .svn folders using Apache .htaccess?
Is this a good idea, or is there a better way to publish my website?
I am trying to achieve the one-click-deploy ideal.
Maybe there are some gotchas I haven't thought of that someone else has run into.
Debian / Apache / PHP.
I would do an svn checkout, and have done so successfully on a live site for a number of years. You should add mod_rewrite rules to 404 the .svn directories (and files) though.
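For illustration, one way to write such a rule (mod_rewrite syntax; it goes in the vhost config or .htaccess):

# Return 404 for any path that contains a .svn component
RewriteEngine On
RewriteRule (^|/)\.svn(/|$) - [R=404,L]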
This is what I'm doing on my host:
For every project I have a structure that looks more or less like this:
~/projects/myproj
~/public_html/myproj
The first dir is a checkout from SVN, while the second one is just an svn export.
I have a small bash script:
#!/bin/bash
# Update every SVN working copy under $SOURCE, then export it
# (without the .svn metadata) into the matching directory under $TARGET.
SOURCE="$HOME/projects/"
TARGET="$HOME/public_html/"

for x in "$SOURCE"*; do
    if [ -d "$x" ]; then
        name=$(basename "$x")
        svn update "$x"
        svn export --force "$x" "$TARGET$name"
    fi
done
The export is done from the working copy, so it's very fast.
It might not be exactly the answer you are looking for, but if you have SSH access to your webserver (this depends on your hosting service; some "low cost" hosts don't give that kind of access), you can use rsync to synchronise the remote site with what you have on your disk.
In the past, I used something like the idea you are describing (fetching the svn log between the last revision pushed to production and HEAD, analysing every line, and in the end calculating what to send to the server), but it was not really a great process; I now use rsync and like it way better.
(Here too, you will have to exclude .svn directories, btw)
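A minimal example of such an invocation (host and paths are illustrative):

# Mirror the local copy to the server, deleting remote files that no longer
# exist locally and skipping SVN metadata
rsync -avz --delete --exclude='.svn' ./mysite/ user@example.com:/var/www/mysite/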
You can just accept having .svn directories in your website (generally not a problem, especially if you configure the server not to permit access to them); this is the easy option. Alternatively, do what RaYell does and keep two copies of your website on the webserver: one normal checkout outside the web directory, and one export in the web directory. When you update, simply export the SVN contents (just a copy with the .svn dirs deleted) into the web directory, and make sure to first delete old files if you wish to avoid files that have been removed from SVN remaining on your website.
I do something like this, using robocopy set to mirror the svn checkout while excluding .svn directories, and get both the export and the old-file deletion in one step, thus minimizing downtime if the copy takes long. I'm sure this is easy on Unix too, if that's your hosting environment; for example, you can use a local rsync: http://blog.gilluminate.com/2006/12/12/yes-you-can-rsync-between-two-local-directories/
Old topic, but since this is what came up in Google during my research, I thought I would add to it. I recommend doing an export to the site instead of checking out to it; I like to keep the repositories and the sites separate. I also don't recommend exporting the entire repo to the site each time, especially if only a few files change at a time. What you can do instead is run a diff on the repo to see what's changed from one release to another and export only those files. More info at: http://www.joeyrivera.com/2011/automate-svn-export-to-site-w-bash-script/
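To illustrate the diff-then-export idea (repository URL and revision numbers are only examples):

# List the files that changed between two releases
svn diff --summarize -r 1500:1525 http://svn.example.com/repo/trunk

# Export just one changed file at the newer revision into the site
svn export --force -r 1525 http://svn.example.com/repo/trunk/index.php /var/www/site/index.php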
Like many projects, we deploy to many environments: QA, UA, developer trunks, etc.
What is the best way to store sensitive configuration parameters in SVN? Or, should you not and just maintain a smaller unversioned file with credentials in it on the server?
Mainly, we do not want to expose production credentials to every developer.
I'd rather provide configuration examples than real config files. In my project there is a setup.default.php file in the root directory that every user needs to copy as setup.php and amend to match the local environment. Additionally, to prevent customised setup files from being checked back in, there is a rule for this in .svnignore:
$ echo 'setup.php' > .svnignore
$ svn propset svn:ignore -F .svnignore .
This is a problem I have run into as well. I think the answer is to check in a template (such as you have with setup.php.default) and then use an automated tool such as Phing to make the push to development. If you use recognizable tokens in the setup.php file, Phing will be able to replace these tokens with individual server values (roughly as sketched below). Also, an easy one-step push-live process will be helpful to have.
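A sketch of that token replacement in a Phing build file, using its Ant-style ReplaceTokens filter (the token names and values are illustrative):

<copy file="setup.php.default" tofile="setup.php" overwrite="true">
  <filterchain>
    <replacetokens begintoken="@" endtoken="@">
      <token key="DB_HOST" value="db.example.com"/>
      <token key="DB_PASS" value="secret"/>
    </replacetokens>
  </filterchain>
</copy>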
I would not store configuration information in the repository at all. That way you don't have to worry about SVN trying to update the config when you update your source.
I would agree with adam. If it's not something that benefits everybody who works on the project, it shouldn't be under version control. If somebody checks out a copy of your code, will your personal project files help them? Probably not. They would most likely just clutter things up.