IDE, SVN and pushing to sites! - php

I'm thinking of updating my practices, and I'm looking for a little help and advice!
I do a lot of work on sites that run Joomla, osCommerce, Drupal etc., and so I have created a lot of custom components/plugins, hacks and the like. Currently each site has its own folder in my XAMPP setup. What I would like to do is have a default setup of (for example) a Joomla install, and when I make changes or updates to it, do something which updates all the other folders that contain Joomla, almost like an auto-update?
I'm also looking at using the Aptana IDE more, and an SVN service such as Unfuddle to share my work with others, but I have not used SVN before and am not sure whether it's possible to do the above using SVN?
It would be great to be able to work on a main/core copy and push the updates both to local copies and to actual servers, without having to maintain lots of different individual sites.
Suggestions?

Yes, SVN would be a great tool for this purpose. Store your code (e.g. a custom Joomla component) in source control. Wherever you want to use that component, just do a checkout or export of that particular folder into your live site. Here's one way you could structure your repository:
unfuddle.com/myRepo/trunk/com_myComponent
unfuddle.com/myRepo/trunk/com_anotherComponent
Log in to your live server via SSH and run these commands:
> cd path/to/joomla/components
> svn co http://unfuddle.com/myRepo/trunk/com_myComponent
Any time you change your code, commit the changes and then log back into the server and run:
> cd path/to/joomla/components
> svn up com_myComponent
A real benefit of this is that, should an update break something, you can always roll back to the last known "good" version.
As for automating this process, you might be out of luck if the sites are on different servers. For multiple deployments on the same server, you could quite easily write a shell script to run the above commands for each site/component. If you needed this to be fully automated, you could even set up a cron job to run the script every day at 2am or so - personally I'd stick with the manual approach, but it's still an option.
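For example, such a script could be as small as this sketch (the site folders, component names and cron schedule are all placeholders for your own setup):
#!/bin/sh
# update_components.sh - run "svn up" for each site that uses the shared components
for SITE in /var/www/site1 /var/www/site2 /var/www/site3
do
    svn up "$SITE/components/com_myComponent"
    svn up "$SITE/components/com_anotherComponent"
done
A crontab entry for a nightly 2am run would then look like:
0 2 * * * /home/me/update_components.sh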
For working locally with your SVN repositories, I'd recommend looking at TortoiseSVN (if you're on Windows): it's the simplest and easiest way to work with SVN.

For automating things, you could use SVN hooks. There is a post-commit hook, so every time you commit, your hook script can tell the other machines to run an SVN update and pick up the latest code.
For more info, see Version Control with Subversion - Implementing Repository Hooks.
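As a rough illustration, a post-commit hook along these lines could do it (the hostnames, paths and passwordless SSH keys are all assumptions, not something Subversion sets up for you):
#!/bin/sh
# hooks/post-commit - Subversion runs this after every commit,
# passing the repository path and the new revision number
REPOS="$1"
REV="$2"
# tell each deployment target to update; assumes key-based SSH for deploy@
for HOST in web1.example.com web2.example.com
do
    ssh deploy@"$HOST" "svn up /path/to/joomla/components/com_myComponent"
done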

I don't have a good answer for your situation, but I don't think Subversion by itself is the answer.
This question addresses some of the concerns about Subversion's mechanisms for sharing code across projects.
Subversion can certainly handle the source-code-management part of this puzzle. For the automated distribution, I'd use another tool.

Look into Capistrano. I've used it a couple of times, and once you figure it out it's pretty good. It's aimed at Rails, but it should work for anything where you need to get code from a repository and deploy it to different servers.

Related

How to organize a web app deployment onto multiple servers at once?

Let's say I develop a PHP app in a Vagrant box identical to the production environment, so as an end result I would have a *.tar.gz file with the code...
How would one organize deployment into a production environment where there are a lot of application servers? I mean, I'm confused about how to push code into production on all servers synchronously, all at once.
More information:
on the server, code is stored like this:
project
  + current_revision -> symlink to revisions/v[n]
  + revisions
      + v1
      + v2
      + v3
      ...
  + data
So when I have to deploy changes, I usually run a deploy script that uploads the updated tar onto the server over SSH, untars it into a specific dir under revisions, symlinks that dir to current_revision and restarts php-fpm... This way I can roll back any time just by symlinking to an older revision.
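Roughly, the deploy script does something like this (names and paths simplified; "v4" stands for whatever the next revision number is):
scp build.tar.gz deploy@server:/var/www/project/revisions/
ssh deploy@server "
    cd /var/www/project/revisions &&
    mkdir v4 && tar xzf build.tar.gz -C v4 &&
    ln -sfn /var/www/project/revisions/v4 /var/www/project/current_revision &&
    sudo service php-fpm restart
"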
With multiple servers, what bothers me is that not all boxes will be updated at once, i.e. technically some glitches might be possible.
If you're looking for a "ready-to-go" answer, you'll need to provide some more info about your setup. For example, if you plan to use git for VCS, you could write a simple shell script that pulls the latest commit and rsyncs with the server(s). Or if you're building on top of Symfony, capifony is a great tool. If you're using AWS, there's a provider plugin written by the author of Vagrant that's super easy to use, and you can specify a regex for which machines to bring up or provision.
If instead you're looking for more of a "roadmap", then the considerations you'll want to weigh are:
Make building of identical boxes in the remote and local environments as easy as possible, and try to make sure that your provisioning emphasizes idempotence.
Consider your versioning/release structure; what resources will rarely or never change? Include those in a setup function instead of a deploy function, and don't include them in your sync run.
Separate your development and system-administration concerns; i.e. do not just package a Vagrant box with a *.tar.gz and tie it in through config.vm.box_url. The reason is that you'd have to repackage every production server with a new box every time you deploy, instead of just changing files on the server or adding/removing some packages.
Check out some config management tools like Chef and Puppet; even if you don't end up using them, they'll give you an idea of how sysadmin professionals approach this problem.
Lots of ways. If starting from bare bones (no cloud infrastructure), I'm a fan of the SVN branch hook. Have an SVN repo for your code. Set up a post-commit hook on it which checks whether anything in /branch/production/ has changed.
If it has, let the post-commit hook fire off your whole automated roll-out procedure - and in this case, an easy way to do so is to have all the servers known* to it run an svn export of the branch. As simple as that!
(* that's the hard step)
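For illustration, the hook could look something like this (the repository layout, hostnames and SSH access are all assumed):
#!/bin/sh
# post-commit hook: roll out only when the production branch changes
REPOS="$1"
REV="$2"
if svnlook dirs-changed -r "$REV" "$REPOS" | grep -q "^branch/production"
then
    for HOST in web1.example.com web2.example.com
    do
        ssh deploy@"$HOST" \
            "svn export --force http://svn.example.com/myRepo/branch/production /var/www/site"
    done
fi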

How do you take your project from development to production?

Good day to you all,
I am currently developing a project on Laravel. So far I have always developed online, directly editing my files on the web server through FTP (using PSPad or similar simple editing tools).
What I want to do now (and what I believe most people actually do) is set up a (W)LAMP stack on my local machine and program locally. However, it is a little bit unclear to me how to keep my local code (including databases) in sync with the live website. How do you folks do that? I know there are probably lots of ways and tools for it, but what would be your advice for a best practice? Any advice would be very welcome :)
What many companies do is build offline, then push their edits up to a server using git.
I'm no expert on the software, so I'll describe what you do in a basic form:
My advice would be to create an online repo (repository) to store your project while you edit/update.
There are several git project-management systems, such as GitHub or Bitbucket. I personally use Bitbucket.
What git does is this: when you have built or added what you need offline on your local (W)LAMP, you git push it up to your repo or server. The changed files then get merged with what already exists on the repo or the server. If you'd like the most recent version of your project, you simply git pull it down.
Read the full documentation here to see the wide range of options available when using git
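As a minimal sketch of that cycle (the remote and branch names here are just the common defaults):
# on your local (W)LAMP machine
git add .
git commit -m "describe the change"
git push origin master    # send changes up to the repo (e.g. Bitbucket)

# on the server, or any other machine
git pull origin master    # fetch and merge the latest version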
We have a settings array within our platform available as $res::Config.
At runtime, a variable is changed from 'dev' to 'live' after checking the HTTP Host or, depending on your setup, the IP address.
Within our framework bootstrapping, depending on the value of $res::Config->$env (the environment set previously as either dev or live), the settings for the database connection are chosen. You store these settings in the Config array as db_live or db_dev.
However you do it, use an environment variable to figure out whether you want live or dev, and set up an array of settings accordingly.
We also have sandbox and staging environments for intermediate development stages.
As for version control, use git or subversion.
Edit: it's also possible to set an environment variable as either live or dev within the vhost file, and have the application read from it accordingly. I'd suggest this approach :)
There are a number of ways of doing this. But this is a deceptively HUGE question you've asked.
Here is some good practice advice - go and research these items, then have a look at my approach.
Typically you use a process called version control, which allows you to create "versions" or snapshots of your system.
The commonly used SVN software is good, but the new (not really any more) kid on the block is Git, and I personally recommend it.
You can use this system to push the codebase live in a controlled fashion. While committing and pushing files is superficially similar to FTP uploads, it allows you to deploy a specific, known version of your site.
In environments where there are multiple developers, this is ideal - you can compare/test and work alongside each other, and version control tends to prevent errors between devs.
So - advice part 1: Look up and understand version control, then use it to release CODE to the live environment.
Part 2: I use database dumps and farm them back to my machine to work with.
If the live database needs updating, I can work locally and simply export, then re-import on the live system.
For example: on a recent Moodle project I worked on, to refresh the whole database took seconds... I could push a patch and database update in a few minutes.
However: you should think about maintenance and scheduling... if the site is live and has ongoing data changes then you need to be careful with this. Consider adding a maintenance page.
Advice 2: go research SQL dump/export and importing.
I personally use phpMyAdmin to dump and re-import, as it's very convenient.
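If you prefer the command line, the equivalent dump and re-import look roughly like this (the host, user and database names are examples):
# dump the live database to a local file
mysqldump -h live.example.com -u dbuser -p mydb > mydb_dump.sql
# re-import it into the local development database
mysql -u root -p mydb_local < mydb_dump.sql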
Advice 3: working locally and then pushing live is MUCH BETTER PRACTICE. You're starting down a much safer and better road than the one you're on!
Hope that helps... but bear in mind - this is a big subject, so you'll need to research a fair bit.

SVN in web development as a base code repository

I have been told that SVN is a good way to manage multiple sites running from the same code base. I am struggling, however, to find enough information about SVN and how it would be used in this way. Basically, I would look to put my ecommerce system into SVN and then run it on multiple sites, each having specific configuration settings to specify the allowed modules.
Could anyone advise me if
(a) SVN is the correct method to do this, and
(b) there are any materials/tutorials out there on how best to do this.
SVN is a versioning tool, not a multi-site management tool. Although I can see how it can be used in that way, that is not what you get out of the box. You need an understanding of SVN first.
As an example, each website could have its own SVN branch of the code. Any updates that you want to propagate to the rest of the branches, you would merge into trunk, then update the rest of the branches.
That said, branching can very quickly get very complicated, especially with multiple developers. Git handles branching much better than SVN (my opinion). Git is also a versioning tool, just like SVN - but again, it's designed for versioning, not multi-site management.
I suppose you could treat each installation as a working copy of the SVN repository: log in to each system and check out the latest code from the SVN server.
I do not recommend this approach. It will be slow and will pollute your application folders with a bunch of SVN metadata that you do not need on your web server. You do not want to risk having it inside your web root.
I recommend using rsync. With rsync, pushing out updates is only slightly more complicated than copying a local file. If you are running Linux, the odds are that rsync is already on your system. On Windows you can run it under Cygwin or find various alternative implementations, such as DeltaCopy.
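A typical push might look like this (the paths and host are placeholders; test the --delete flag carefully, since it removes remote files that no longer exist locally):
# push the local site to the server, excluding VCS metadata
rsync -avz --delete --exclude='.svn' --exclude='.git' \
    /path/to/local/site/ deploy@server.example.com:/var/www/site/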
If for some reason rsync won't work for you, I would recommend SFTP.

Should I use Git for deployment of web apps?

I use Git to track local changes in my PHP web applications, and I was wondering if it would be a good idea to use Git on the server as well, so that I could just use git push to deploy my changes. Would there be any pitfalls with this approach?
This seems like a nice way to do things. If you're tagging and branching properly it will enable you to quickly switch back to working versions of your site too in the event that something breaks.
I think this is a fine way to do it. I handle things in a similar manner, where live sites are just a checkout from the repository, and I update them as necessary.
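One common recipe for the git push workflow the question asks about is a bare repository on the server plus a post-receive hook that checks the code out into the web root (all paths here are illustrative). A side benefit: the checked-out work tree contains no .git folder, which also covers the caveat mentioned further down.
# on the server: a bare repo to push to
git init --bare /home/deploy/site.git

# contents of /home/deploy/site.git/hooks/post-receive (make it executable):
#!/bin/sh
GIT_WORK_TREE=/var/www/site git checkout -f master

# locally: add the server as a remote and deploy with a push
git remote add live ssh://deploy@server.example.com/home/deploy/site.git
git push live master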
Git is fine, but you can do a lot better than just using git pull. Take a look at railsless-deploy for Capistrano.
Capistrano basically does a combination of rsync and git pull to deploy copies of your website. It supports rollback, staging and distributed deployments.
And online hotfixes can be pushed back to development.
Being able to do a git status on a live system can be a life saver.
Go for it!
Caveats
Make sure the ".git" folder isn't accessible from the web.
With PHP the source code is usually present on the webserver, so that doesn't add additional risk in case the server is hacked.
I would be in favor of using a technique like this if only because you can be sure anything on your deployed site is also being tracked in git. That is, it encourages a best practice and discourages ad hoc changes that aren't under source control.
For another alternative, check out this article about how Twitter uses BitTorrent to manage deployment: http://torrentfreak.com/twitter-uses-bittorrent-for-server-deployment-100210/
It's probably most useful when you need to deploy quickly across a large collection of servers.
I think it's a great solution. I have been using it to deploy my website for a long time... It's nice because you can almost instantly push your changes into production just by updating the folder. I have encountered no security issues or anything with it.
Enjoy!

How to efficiently manage multiple installations of a web application?

From my experience, one of the bigger problems we come across during our web development process is keeping different setups updated and secure across different servers.
My company has its own CMS, which is currently installed across 100+ servers. At the moment, we use a hack-ish FTP-based approach, combined with upgrade scripts at specific locations, to upgrade all of our CMS setups. Efficiently managing these setups becomes increasingly difficult and risky when there are several custom modules involved.
What is the best way to keep multiple setups of a web application secure and up-to-date?
How do you do it?
Are there any specific tips regarding modularity in applications, in order to maintain flexibility towards our clients while still being able to efficiently manage multiple "branches" of an application?
Some contextual information: we mainly develop on the LAMP stack. One of the main factors that helps us sell our CMS is that we can plug in pretty much anything our client wants. This can vary from 10 to 10,000 lines of custom code.
A lot of custom work consists of very small pieces of code; managing all these small pieces of code in Subversion seems quite tedious and inefficient to me (since we deliver around 2 websites every week, this would result in a lot of branches).
If there is something I am overlooking, I'd love to hear it from you.
Thanks in advance.
Roundup: first of all, thanks for all of your answers. All of these are really helpful.
I will most likely use an SVN-based approach, which makes benlumley's solution closest to what I will use. Since the answer to this question might differ for other use cases, I will accept the answer with the most votes at the end of the run.
Please examine the answers and vote for the ones that you think have the most added value.
I think using a version control system and "branching" the parts of the code that you have to modify could turn out to be the best approach in terms of robustness and efficiency.
A distributed version system could be best suited to your needs, since it would allow you to update your "core" features seamlessly on different "branches" while keeping some changes local if need be.
Edit: I'm pretty sure that keeping all that up to date with a distributed version system would be far less tedious than you seem to expect: you can keep the changes you are sure you're never going to need elsewhere local, and the distributed aspect means each of your deployed applications is actually independent from the others, and only the fixes you mean to propagate will propagate.
If customizing your application involves changing many little pieces of code, this may be a sign that your application's design is flawed. Your application should have a set of stable core code, extensibility points for custom libraries to plug into, the ability to change appearance using templates, and the ability to change behavior and install plugins using configuration files. In this way, you don't need a separate SVN branch for every client. Rather, keep the core code and extension plugin libraries in source control as normal. In another repository, create a folder for each client and keep all their templates and configuration files there.
For now, creating SVN branches may be the only solution that helps you keep your sanity. In your current state, it's almost inevitable that you'll make a mistake and mess up a client's site. At least with branches you are guaranteed to have a stable code base for each client. The only gotcha with SVN branches is that if you move or rename a file in a branch, SVN can't merge that change back down to the trunk automatically (you'd have to do it manually).
Good luck!
EDIT: For an example of a well-designed application using all the principles I outlined above, see Magento E-Commerce. Magento is the most powerful, extensible and easy to customize web application I've worked with so far.
I may be wrong, but it seems to me that what Aron is after is not version control. Versioning is great, and I'm sure they're using it already, but for managing updates on hundreds of customized installations, you need something else.
I'm thinking something along the lines of a purpose-built package system. You'll want every version of a module to keep track of its individual dependencies and 'guaranteed compatibilities', and use this information to automatically update only the 'safe' modules.
E.g. let's say you've built a new version 3 of your 'Wiki' module. You want to propagate the new version to all the servers running your application, but you've made changes to one of the interfaces within the Wiki module since version 2. Now, for all default installations, that is no problem, but it would break installations with custom extensions on top of the old interface. A well-planned package system would take care of this.
To address the security question, you should look into using digital signatures on your patches. There are lots of good libraries available for public-key-based signatures, so just go with whatever seems to be the standard for your chosen platform.
Not sure whether someone's said this; there are a lot of long responses here, and I've not read them all.
I think a better approach to your version control would be to have your CMS sit on its own in its own repository, and each project in its own. (Or all of these could be subfolders within one repo, I guess.)
You can then use its trunk (or a specific branch/tag if you prefer) as an svn:external in each project that requires it. This way, any updates you make to the CMS can be committed back to its repository, and will be pulled into other projects as and when they are svn updated (or the external is svn switch'ed).
As part of making this easier, you will need to make sure the CMS and the custom functionality sit in different folders, so that svn:externals works properly.
E.g.:
project
project/cms <-- cms here, via svn external
project/lib <-- custom bits here
project/www <-- folder to point Apache/IIS at
(you could have cms and lib under the www folder if needed)
This will let you branch/tag each project as you wish. You can also switch the svn:external location on a per branch/tag basis.
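Setting up the external is a one-off property change on each project (the repository URL here is an example):
> cd project
> svn propset svn:externals "cms http://svn.example.com/myRepo/cms/trunk" .
> svn commit -m "Pull the CMS in via svn:externals"
> svn up    # fetches the CMS into project/cms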
In terms of getting changes live, I'd suggest that you immediately get rid of FTP and use rsync or svn checkout/export. Both work well; the choice is up to you.
I've got the most experience with the rsync route: rsyncing an svn export to the server. If you go down this route, write some shell scripts, and you can create a test script that shows you the files it would upload without actually uploading them, using the -n flag. I generally use a pair of scripts for each environment - one to test, and one to actually do it.
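The test script can be as small as this (the URL and paths are placeholders):
#!/bin/sh
# deploy_test.sh - dry run: lists what would change without uploading anything
svn export --force http://svn.example.com/myRepo/project/trunk /tmp/project_export
rsync -avzn --delete /tmp/project_export/ deploy@server.example.com:/var/www/project/
# deploy.sh is identical except rsync is run with -avz (no -n)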
Shared-key authentication, so you don't need a password to send uploads up, may also be useful, depending on how secure the server being given access needs to be.
You could also maintain another shell script for doing bulk upgrades, which simply calls the relevant shell script for each project you want to upgrade.
Have you looked at Drupal? No, not to deploy and replace what you have, but to see how they handle customizations and site-specific modules?
Basically, there's a "sites" folder which has a directory for every site you're hosting. Within each folder is a separate settings.php which allows you to specify a different database. Finally, you can (optionally) have "themes" and "modules" folders within sites.
This allows you to do site-specific customizations of particular modules and to limit certain modules to certain sites. As a result, you end up with sites where the vast majority of the code is perfectly identical and only the differences get duplicated. Combine that with the way it handles upgrades and updates, and you might have a viable model.
Build into the code a self-updating process.
It will check for updates and run them when/where/how you have configured it for the client.
You will have to create some sort of a list of modules (custom or not) that need to be tested with the new build prior to roll-out. When deploying an update you will have to ensure these are tested and integrated correctly. Hopefully your design can handle this.
An update ideally consists of a few key steps.
a) Backup so you can back out. You should be able to back out of the entire update at any time, so that means creating a local archive of the application and database first.
b) Update Monitoring Process - Have the CMS system phone home to look for a new build.
c) Schedule Update on availability - Chances are you don't want the update to run the second it is available. This means you will have to create a cron/agent of some kind to do the system update automatically in the middle of the night. You can also consider client requirements to update on weekends, or on specific days. You can also stagger rolling out your updates so you don't update 1000 clients in 1 day and get tech support hell. Staggered roll-out of some kind might be beneficial for you.
d) Add maintenance mode to update the site -- Kick the site into maintenance mode.
e) SVN checkout or downloadable packages - ideally you can deploy via svn checkout; if not, set up your server to deliver svn-generated packages into an archive that can be deployed on client sites.
f) Deploy DB Scripts - Backup the databases, update them, populate them
g) Update site code - All this work for one step.
h) Run some tests on it. If your code has self-tests built in, it would be ideal.
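Pulled together, a nightly update script covering several of these steps might look roughly like this (every path, flag file and command is a placeholder for your own setup):
#!/bin/sh
# nightly_update.sh - rough skeleton of the roll-out steps above
set -e    # abort on the first failure
# (a) backup first, so the whole update can be backed out
tar czf /backups/site-$(date +%F).tar.gz /var/www/site
mysqldump -u dbuser -p'PASSWORD' cms_db > /backups/cms_db-$(date +%F).sql
# (d) kick the site into maintenance mode (assumes the CMS checks this flag file)
touch /var/www/site/maintenance.flag
# (f) deploy DB scripts
mysql -u dbuser -p'PASSWORD' cms_db < /var/www/updates/migrate.sql
# (e)/(g) update the site code from SVN
svn up /var/www/site
# (h) run self-tests, and only leave maintenance mode if they pass
/var/www/site/bin/selftest && rm /var/www/site/maintenance.flag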
Here's what I do...
Client-specific include path
Shared, common code is in shared/current_version/lib/
Site-specific code is in clients/foo.com/lib
The include path is set to include from clients/foo.com/lib first, and then from shared/current_version/lib
The whole thing is in a version control system
This ensures that the code uses shared files wherever possible, but if I need to override a particular class or file for some reason, I can write a client specific version in their folder.
Alias common files
My virtual host configuration will contain a line like
Alias /common <path>/shared/current_version/public_html/common
This allows common UI elements, icons, etc. to be shared across projects.
Tag the common code with each site release
After each site release, I tag the common code by creating a branch to effectively freeze that point in time. This allows me to deploy /shared/version_xyz/ to the live server. Then I can have a virtual host use a particular version of the common files, or leave it pointing at the current_version if I want it to pick up the latest updates.
Have you looked at tools such as Puppet (for system administration incl. app deployment) or Capistrano (deployment of apps in RubyOnRails but not limited to these)?
One option would be to set up a read-only version control system (Subversion). You could integrate access to the repository into your CMS and invoke updates through a menu, or automatically if you do not want the user to have a choice about an update (which could be critical). Using a version control system would also allow you to maintain different branches easily.
As people have already mentioned, using version control (I prefer Subversion for its functionality) and branching would be the best option. Another open-source tool, available on SourceForge, is CruiseControl. It's amazing: you configure CruiseControl with Subversion in such a way that it automatically detects any code modification or new code added to Subversion, and it will do the build for you. It will save you a hell of a lot of time.
I have done it this way in my company. We have four projects and have to deploy them onto different servers. I have set up CruiseControl in such a way that any modification to the code base triggers an automatic build, and another script deploys that build to the server. You are good to go.
If you use a LAMP stack, I would definitely turn the solution's files into a package for your distribution and use it to propagate changes. For that I recommend Red Hat/Fedora because of RPM, and it's what I have experience with; but you can use any Debian-based distribution too.
Some time ago I built a LAMP solution for managing an ISP's hosting servers. They had multiple servers to take care of web hosting, and I needed a way to deploy my manager's changes, because every machine was self-contained and had an online manager. I made an RPM package containing the solution files (PHP mostly) and some deployment scripts that ran with the RPM.
For automated updating, we had our own RPM repository set up on every server in yum.conf. I set up a crontab job to update the servers daily with the latest RPMs from that trusted repository.
Trust can be achieved too, because you can use the trust settings in RPM packages, such as signing them with your GPG key and accepting only signed packages.
Hm, could it be an idea to add configuration files? You wrote that a lot of small scripts are doing something. Now if you built them into the sources and steered them with configuration files, shouldn't that "ease" things?
On the other hand, having a branch for every customer looks like exponential growth to me. And how would you "know" in which areas you've changed something, and not forget to "make" those changes in all the other branches too? That looks quite ugly to me.
It seems a combination of revision control, configuration options and/or deployment recipes would be a "good" idea...
With that many variations on your core software, I think you really need a version control system to stay on top of pushing updates from the trunk to the individual client sites.
So if you think Subversion would be tedious, you've got a good sense of what the pain points will be... Personally, I wouldn't recommend Subversion for this, since it's not really that good at managing and tracking branches. Although benlumley's suggestion to use externals for your core software is a good one, it breaks down if you need to tweak the core code for your client sites.
Look into Git for version control, it's built for branching, and it's fast.
Check out Capistrano for managing your deployments. It's a Ruby tool, often used with Rails, but it can be used for all sorts of file management on remote servers, even non-Ruby sites. It can get the content to the remote end through various strategies including FTP, SCP and rsync, as well as by automatically checking out the latest version from your repository. The nice features it provides include callback hooks for every step of the deploy process (e.g. so you can copy your site-specific configuration files, which might not be in version control), and a release log system - done through symlinks - so you can quickly roll back to a previous release in case of trouble.
I'd recommend a config file with the list of branches and their hosted locations, then run through it with a script that checks out each branch in turn and uploads the latest changes. This could be cron'd to do nightly updates automatically.
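A sketch of that script (the config format, URLs and paths are invented for illustration):
#!/bin/sh
# update_all.sh - branches.conf holds one "URL directory" pair per line, e.g.
#   http://svn.example.com/myRepo/branches/clientA /var/www/clientA
while read URL DIR
do
    svn checkout "$URL" "$DIR"    # a checkout over an existing working copy updates it
done < branches.conf
# crontab entry for nightly runs:
# 0 2 * * * /home/me/update_all.sh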
