How should I organize my website in order to utilize shared packages? Let's use Stack Exchange as an example. There are many subdomains:
webapps.stackexchange.com namespace StackExchange\webapps
programmers.stackexchange.com namespace StackExchange\programmers
etc.
Each of the subdomains reuses libraries. Maybe there is a Question class that is reused by all of the sites to help lay out the "ask a question" page.
Would the Question package be bundled into every one of the subdomain packages? Or do all of the packages reference the Question package? In other words, which scenario is the case:
StackExchange\webapps\question
StackExchange\programmers\question
OR:
StackExchange\webapps
StackExchange\programmers
StackExchange\question
WHERE StackExchange\webapps and StackExchange\programmers use StackExchange\question
I ask because I am setting up my website with several independent tools on it. I plan to put each tool in its own subdomain on the same server. I only want to have to update the server in one place whenever I update a package that is used by many subdomains, so I would like it to be stored in its own namespace.
Should I be using a dependency manager like Composer to take care of this for me? What is the best-practice way of accomplishing this?
I had to deal with a similar scenario where I had two websites that shared a lot of functionality. Because I had to rewrite one of them, I tried to establish a shared codebase.
Namespaces
How you structure your namespaces is mostly a matter of taste.
Personally, I use Vendor\Package for shared stuff and Vendor\Site\Package for site-specific code.
What you do not want to do is put a copy of a shared package inside each site-specific package (as your first example implies), because then you would have to copy and modify the code files for every site (the namespace would have to change each time).
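For illustration, a minimal sketch of that convention (all names here are hypothetical):

<?php
// src/StackExchange/Question/Question.php -- shared code lives once, under Vendor\Package
namespace StackExchange\Question;

class Question
{
    public function title() { /* ... */ }
}

<?php
// src/StackExchange/Webapps/Ask/AskPage.php -- site-specific code, under Vendor\Site\Package
namespace StackExchange\Webapps\Ask;

use StackExchange\Question\Question;

class AskPage
{
    public function render(Question $question) { /* lay out the ask page */ }
}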
Where to put shared packages
You have two options where to actually put your shared packages: local to each application (subdomain) or globally on the server.
Global packages
Here you would put shared code somewhere on your server where every application can access it. rm-vanda's answer pretty much described how this could be done in practice. You don't need Composer for this.
On the pro side, you would only have one instance of the shared package instead of several copies, and you can't forget to update one of your applications. If you use a PHP accelerator like APC or OpCache, this approach will also save some memory, because shared libraries only hit the cache once.
Local packages
Another approach is to have a copy for each application. That's the approach Composer pursues by default. Usually, you'll have a "vendor" directory in your application's main directory where you put this shared stuff (and also third-party code if you use any). That way, each application can use a different version of a shared library.
Personally, I like the local approach most. I can live with the additional deployment work, and I think it's the most future-proof way to do it.
Have to split my applications across several servers? No problem, they're pretty much independent already.
The number of applications is growing and I can't be bothered to check all of them whenever I update something? Then just don't update all of them.
I can even have a production environment and a test environment on the same server featuring different versions of a shared library.
Also, using Composer feels the most natural with this approach. And Composer is a really awesome thing, especially when using third-party code.
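For instance, each application could declare the shared package in its composer.json (the package name and repository URL here are made up), and Composer would then copy the requested version into that application's vendor directory:

{
    "require": {
        "acme/question": "^1.0"
    },
    "repositories": [
        { "type": "vcs", "url": "https://git.example.com/acme/question.git" }
    ]
}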
Since everything is going to be on the same server, you can simply have a shared directory on your php.ini's include_path,
i.e.
/usr/share/php/
from which each project will then be able to include the same files.
(A well-written autoloader function could do this as well, without the need for an include path.)
And when setting the include path for project-specific purposes:
ini_set('include_path', get_include_path() . PATH_SEPARATOR . '/var/www/path/for/this/project');
Or even just include what you need on a per-need basis:
require_once '/usr/share/php/apis/Google_Client.php';
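Along the lines of the autoloader remark above, a minimal sketch (the shared root is just an example):

spl_autoload_register(function ($class) {
    // Map the class name (namespace separators become directories)
    // onto the shared directory, and load the file if it exists there.
    $file = '/usr/share/php/' . str_replace('\\', '/', $class) . '.php';
    if (is_file($file)) {
        require $file;
    }
});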
Related
I'm working on a Symfony 2 website; it is a product our company sells (every client gets their own installation).
Each of our clients gets a slightly different version of the site: sometimes the differences are small (different strings, slightly different templates), and sometimes they are bigger (different bundles activated, different database schema, different security.yml file, etc.)
Currently each version sits on its own git branch. This works for now, as we only started selling and don't have many clients (= branches) yet.
But I want to move to a better solution.
Any ideas?
Thanks,
A
I disagree. If it's a single application, then branches per client or release, i.e. configurations that need separate histories (change tracking), are precisely the correct practice. Clients use and share the same app, "core", whatever, but you may want to control when changes to one client's features/bugs are merged into the other configurations, i.e. branches. Each client will have to be "upgraded" by merging from the development branch into the client's branch, separately and manually. You pay this overhead to gain stability and isolation.
As for the specific differences you mentioned:
Different strings: client-specific configs, constants, and deployment details should probably live in a per-client/deployment module (include file, whatever) that you can simply ignore when merging, so that the core app code never contains client- or deployment-specific differences.
Slightly different templates: meh, this smells bad; if you can use symbolic constants or partials or whatever and move the differences into a config.php, then see the solution above. Otherwise, don't template diffs suggest it's not the same app anymore?
Different bundles activated: if you can put the include statements in the config.php, no problem.
Different database schema: nasty business. Can this app be made schema-independent? Can you do polymorphism somehow? Nu, I don't know, haven't touched PHP in twenty years, I only do CoffeeScript and NoSQL now. ;o)
Different security.yml: same as with config.php, not a problem, just don't merge that file, or even stop tracking it (remove from repo) altogether.
HTH.
;o)
I don't think a branch is a good solution, as it's not intended for this kind of use case. As I see it, branches should be used for staging or for features/fixes.
You have a few options (in case you are not familiar with them, you should read more in the links I attached):
Git submodule. "A submodule allows you to keep another Git repository in a subdirectory of your repository. The other repository has its own history, which does not interfere with the history of the current repository. This can be used to have external dependencies such as third party libraries for example." http://git-scm.com/docs/git-submodule
Composer. This is what I think would suit you best. Use a regular dependency manager (Composer for PHP, npm for Node.js, etc.). It will give you full control over the version being used, plus easy deployment options. https://getcomposer.org/
Another option would be to do it "old school". In case they are all on the same server, just keep the "core" stuff in the root folder and include it from each project. It's not a recommended way to go, but it might be enough and won't require many changes from you.
I have 3 PHP projects using the CodeIgniter framework which share some identical files, such as models, libraries, and controllers. What's the best way to share these files across projects without having to keep them in sync and update the same files in each one?
On Linux I thought of using symbolic links and extracting these files to a central place, but that kind of breaks our version control and would create portability issues.
Another way, perhaps, is to use Unison on these files across projects.
I'm assuming this is a common problem; what are the common approaches?
Separate them into a module, and use something like Composer.
http://getcomposer.org/
Or just put them in a separate SCM.
One thing you can do:
Put all the shared code in libraries, helpers and models and place this in a separate folder. Then use:
$this->load->add_package_path('shared location');
Also take a look here: http://codeigniter.com/user_guide/libraries/loader.html, under Application Packages.
This works for most of the stuff, except controllers.
Use version control! In SVN you can use externals; Git has submodules and subtrees.
You don't want to use hard links; you'll run into weird issues where updating one project influences another project ("that I haven't touched in weeks!").
The code can be in two physical places but shared under version control. There will always be only one authoritative copy, namely the one in your version control system; all physical copies are derivatives. The important thing is that you control when the code of a specific project is updated, so that a change in one place doesn't immediately break another project in case you made a mistake.
If you do want to catch these kinds of errors, set up a proper regression testing environment.
Sharing a development environment with another developer is also a big no. You don't want to have to wait until your colleague fixes a parse error that breaks the entire program. Each developer should have their own copy (a checkout!) of a project, and similarly each project should have its own copy (via externals) of shared code.
Separate them into folders outside your projects, then configure or include them in your projects.
Usually we rewrite the project's autoloader method so that it finds files in our new folders.
I run multiple websites all running off of a single installation of CodeIgniter on my server (separate application directories and a single system directory). This has been working fabulously and I don't see any reason to change it at this point.
I find myself writing library classes to extend/override CI all the time, and often when I find a bug or improve efficiency I have to go back to several websites to make the same adjustments, at the risk of a typo that breaks one of them. Because of this, I have to change each file and then test that site for bugs.
I have been pondering a solution of using a single libraries directory in a central location and symlinking all of my websites to that central directory. Then when I make a file change it will immediately propagate to all of the downstream websites. It will still require that I test each one for errors, but I won't have to make the changes multiple times. Anything that is specific to a single website will either be a non-shared file (still in the linked directory just not used elsewhere) or can be put in a local helper.
Also, I keep separate 'system' directories by CI version so I can migrate my websites independently if necessary; this central libraries directory would be tied to a specific version to reduce possible breakage.
Does anyone see potential issues or pitfalls from taking this approach? Has anyone accomplished this in another direction that I should consider?
Thanks in advance!
I think this actually makes sense :] Go for it. Even on the official CodeIgniter page, they mention it's possible.
Also, I don't see any reason why there should be a problem.
Edit: they touch on the problem of multiple sites here: http://codeigniter.com/user_guide/general/managing_apps.html
also:
http://codeigniter.com/wiki/Multiple_Applications/
http://www.exclusivetutorials.com/setting-multiple-websites-in-codeigniter-installation/
How to Handle Multiple Projects in CodeIgniter?
http://codeigniter.com/forums/viewthread/56436/
I have a single system directory and separate application directories for my CI apps. In order to share libraries and some view templates between my apps, I created a "Common" directory in the same folder as the CI system, with the same structure as a regular app folder, and used symlinks; alternatively, you can modify the Loader class so that it looks in the Common folder too. My setup looks something like this:
/var/CodeIgniter/
/var/Common/
/var/Common/config/
/var/Common/controllers/
...
/var/Common/libraries/
...
/var/www/someapp/
/var/www/someotherapp/
...
I'm not sure how you handle publishing your sites (assuming you actually do any of that), but I'd look into version control. For example, in SVN you can set an svn:external to another SVN directory (or file) and then just update the current directory, which grabs the external file. This approach gains one benefit over the others: when you modify the common library, the other sites aren't immediately affected. This prevents unwanted breakage before you have time to go test all the sites using the common library. You can then just update each site's folder whenever you are ready to test the changes. This is "more work", but it prevents code duplication AND unwanted breaks.
I wrote a MY_Loader to do exactly that.
http://ellislab.com/forums/viewthread/136321/
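The core of the idea is something like this (a minimal sketch, not necessarily the linked implementation; the shared path is hypothetical):

// application/core/MY_Loader.php
class MY_Loader extends CI_Loader {

    public function __construct()
    {
        parent::__construct();
        // Also resolve libraries, models, helpers and views
        // from the shared Common folder described above.
        $this->add_package_path('/var/Common/');
    }
}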
I have a folder of PHP scripts; they are mostly utility scripts. How do I share those scripts among different PHP applications so that reuse and deployment are easy?
I would have to package my app into an installer, and let the user install it.
I could put the lib somewhere and hardcode the include path, but that means I have to change the PHP code every time I deploy the web application to a new customer. This is not desirable.
Another route I am considering is to copy the lib into the other apps; but since the lib is constantly being updated, that means I would need to constantly copy it around, which will introduce a lot of problems. I want an automated way to do this.
Edit: Some of the applications are Symfony, some are not.
You could create a PEAR package.
See Easy PEAR Package Creation for more information on how to do this.
This assumes that when you say anyone, you mean outside your immediate organisation.
Updated: You do not need to upload to a website to install the PEAR package. Just extract your archive into the pear folder to use in a PHP application.
Added: Why not create a new SVN repository for your library? Let's say you create a library called FOO. Inside the repository you could use the folder hierarchy trunk/lib/foo. Your modules could then go into trunk/lib/foo/modules, with a file called trunk/lib/foo/libfoo.php. Now libfoo.php can include_once or require_once all the modules as required.
PHP now supports Phar archives. There's full documentation on php.net.
There's a complete tutorial on IBM website as well.
One neat thing you can do with Phar archives is package an entire application and distribute it that way.
http://php.net/phar
http://www.ibm.com/developerworks/opensource/library/os-php-5.3new4/index.html
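For a flavor of it, building a Phar takes only a few lines (a sketch; the paths are made up, and phar.readonly must be set to 0 in php.ini to create archives):

// build.php -- pack the library directory into a single archive
$phar = new Phar('mylib.phar');
$phar->buildFromDirectory(__DIR__ . '/src');
$phar->setStub($phar->createDefaultStub('init.php')); // entry point inside the archive

// An application can then load code straight out of the archive:
require 'phar:///usr/share/php/mylib.phar/init.php';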
Ahh, libraries...
There are two conflicting purposes here:
Sanity when updating scripts (i.e. not breaking 10 other apps).
Keeping things in one organized logical place for developer efficiency.
I suggest you take a close look at Git and git submodules.
We use git submodules extensively for this very purpose. It allows the best of both worlds because shared scripts can be upgraded at will in any project, and then that change can be moved to the other projects (deliberately) when you have time to do so and test correctly.
Of course, you need to be using git to take advantage of submodules, but if you are not using git, and you start, you'll eventually wonder how you ever lived without it.
Edit: Since the original poster is using SVN, consider using svn:externals.
UPDATED:
You just have to put the lib somewhere reachable by your apps (via HTTP, HTTPS, FTP, or something else) and include it.
If you have to update it often, you can package your library in a single Phar file and then provide your clients a function that pulls the library from some remote path and updates a parameter in their local configuration accordingly, like:
function updateLocalLibrary($remoteLibraryRepository, $libraryPharFile, $localLibraryPath)
{
    // Read the remote library into a variable
    $file = file_get_contents($remoteLibraryRepository . $libraryPharFile);
    // Give it a unique name
    $newLibraryName = $libraryPharFile . "_" . date('YmdHis');
    // Store the library in a local file
    file_put_contents($localLibraryPath . $newLibraryName, $file);
    // Update the configuration, letting your app point to the new library
    updateLatestLibraryPathInConfig($newLibraryName);
    // Possibly delete the old lib here
}
Your include then doesn't necessarily have to hardcode a path; you can include based on a parameter from your config, like:
include(getLatestLibraryPathFromConfig());
(You are responsible for securing the retrieval so that only your clients can see the library.)
Your conf can be in a DB, so that when you call updateLatestLibraryPathInConfig() you can perform an atomic operation and be sure no client reads dirty data.
The clients can then update their library as needed. They may even schedule regular updates.
There are a lot of options:
tar + ftp/scp
PEAR (see above #Wayne)
SVN
rsync
NFS
I recommend using continuous integration software (Atlassian Bamboo, CruiseControl): check out your repository, build a package, and then use rsync. Automatically.
You should also look into using namespaces in order to avoid conflicts with other libraries you might use. PEAR is probably a good choice for the delivery method; alternatively, you can just place the code in the standard path /usr/share/php/, or any other place that is on the include path in your PHP settings file.
Good question, and probably one that doesn't have a definite answer. You can basically pick between two different strategies for distributing your code: Either you put commonly used code in one place and let individual applications load from the same shared place, or you use a source-control-system to synchronise between local copies. They aren't mutually exclusive, so you'll often see both patterns in use at the same time.
Using the file system to share code
You can layer the include_path to create varying scopes of inclusion. The most obvious application of this pattern is a globally maintained PEAR repository plus a local application. If your IT system consists of multiple applications that share a common set of libraries, you can add a layer in between (a framework layer). If you structure the include_path such that the local paths come before the global paths, you can use this to make local overrides of files. This is a rather crude way to extend code, since it works per file, but it can be useful in some cases.
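A sketch of such a layered include_path (the directories are hypothetical; earlier entries win, so local files can override global ones):

set_include_path(implode(PATH_SEPARATOR, array(
    '/var/www/myapp/lib',        // local application code, checked first
    '/usr/share/framework/lib',  // shared framework layer
    '/usr/share/php',            // globally maintained PEAR repository
)));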
Use source-control
Another strategy is to make a lot of local checkouts of a single shared repository. One benefit over the layered-include pattern is that you can make more fine-grained local changes. It can be a bit of a challenge to manage the separation between application layers (infrastructure, framework, application). svn:externals can work, but has some limitations. It's also slightly more complicated to propagate global changes to all applications; an automated deployment process can help with that.
From my experience, one of the bigger problems we come across during our web development process is keeping different setups updated and secure across different servers.
My company has its own CMS, which is currently installed across 100+ servers. At the moment, we use a hack-ish FTP-based approach, combined with upgrade scripts at specific locations, to upgrade all of our CMS setups. Efficiently managing these setups becomes increasingly difficult and risky when there are several custom modules involved.
What is the best way to keep multiple setups of a web application secure and up-to-date?
How do you do it?
Are there any specific tips regarding modularity in applications, in order to maintain flexibility towards our clients while still being able to efficiently manage multiple "branches" of an application?
Some contextual information: we mainly develop on the LAMP stack. One of the main factors that helps us sell our CMS is that we can plug in pretty much anything our client wants. This can vary from 10 to 10,000 lines of custom code.
A lot of custom work consists of very small pieces of code; managing all these small pieces of code in Subversion seems quite tedious and inefficient to me (since we deliver around 2 websites every week, this would result in a lot of branches).
If there is something I am overlooking, I'd love to hear it from you.
Thanks in advance.
Roundup: first of all, thanks for all of your answers. All of these are really helpful.
I will most likely use an SVN-based approach, which makes benlumley's solution closest to what I will use. Since the answer to this question might differ in other use cases, I will accept the answer with the most votes at the end of the run.
Please examine the answers and vote for the ones that you think have the most added value.
I think using a version control system and "branching" the parts of the code that you have to modify could turn out to be the best approach in terms of robustness and efficiency.
A distributed version system could be best suited to your needs, since it would allow you to update your "core" features seamlessly on different "branches" while keeping some changes local if need be.
Edit: I'm pretty sure that keeping all of that up to date with a distributed version system would be far less tedious than you seem to expect: you can keep local any changes you are sure you're never going to need elsewhere, and the distributed aspect means each of your deployed applications is actually independent from the others, so only the fixes you mean to propagate will propagate.
If customizing your application involves changing many little pieces of code, this may be a sign that your application's design is flawed. Your application should have a set of stable core code, extensibility points for custom libraries to plug into, the ability to change appearance using templates, and the ability to change behavior and install plugins using configuration files. In this way, you don't need a separate SVN branch for every client. Rather, keep the core code and extension plugin libraries in source control as normal. In another repository, create a folder for each client and keep all their templates and configuration files there.
For now, creating SVN branches may be the only solution that helps you keep your sanity. In your current state, it's almost inevitable that you'll make a mistake and mess up a client's site. At least with branches you are guaranteed to have a stable code base for each client. The only gotcha with SVN branches is that if you move or rename a file in a branch, it's impossible to merge that change back down to the trunk automatically (you'd have to do it manually).
Good luck!
EDIT: For an example of a well-designed application using all the principles I outlined above, see Magento E-Commerce. Magento is the most powerful, extensible and easy to customize web application I've worked with so far.
I may be wrong, but it seems to me what Aron is after is not version control. Versioning is great, and I'm sure they're using it already, but for managing updates on hundreds of customized installations, you need something else.
I'm thinking something along the lines of a purpose-built package system. You'll want every version of a module to keep track of its individual dependencies and 'guaranteed compatibilities', and use this information to automatically update only the 'safe' modules.
E.g. let's say you've built a new version 3 of your 'Wiki' module. You want to propagate the new version to all the servers running your application, but you've changed one of the interfaces within the Wiki module since version 2. Now, for all default installations, that is no problem, but it would break installations with custom extensions built on top of the old interface. A well-planned package system would take care of this, as in the sketch below.
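A sketch of that compatibility check (all the structures and names here are invented for illustration):

// Only update a module if every installed extension that depends on it
// declares compatibility with the module's new interface version.
function isSafeToUpdate(array $module, array $installedExtensions)
{
    foreach ($installedExtensions as $ext) {
        if ($ext['depends_on'] === $module['name']
            && !in_array($module['interface_version'], $ext['compatible_with'])) {
            return false; // would break a custom extension, skip this server
        }
    }
    return true;
}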
To address the security question, you should look into using digital signatures on your patches. There are lots of good libraries available for public-key-based signatures, so just go with whatever seems to be the standard for your chosen platform.
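For instance, with PHP's OpenSSL extension, verifying a signed patch before applying it could look like this (a sketch; the key and file locations are made up):

$patch     = file_get_contents('/tmp/patch.tar.gz');
$signature = file_get_contents('/tmp/patch.tar.gz.sig');
$publicKey = openssl_pkey_get_public(file_get_contents('/etc/cms/update-key.pub'));

// openssl_verify() returns 1 only if the signature matches the data.
if (openssl_verify($patch, $signature, $publicKey, OPENSSL_ALGO_SHA256) !== 1) {
    exit('Refusing to apply an unsigned or tampered patch.');
}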
Not sure whether someone's said this; there are a lot of long responses here, and I've not read them all.
I think a better approach to your version control would be to have your CMS sit on its own in its own repository, and each project in its own (or all of these could be subfolders within one repo, I guess).
You can then use its trunk (or a specific branch/tag if you prefer) as an svn:external in each project that requires it. This way, any updates you make to the CMS can be committed back to its repository and will be pulled into the other projects as and when they are svn updated (or the external is svn switch'ed).
As part of making this easier, you will need to make sure the CMS and the custom functionality sit in different folders, so that svn externals works properly.
i.e.:
project
project/cms <-- cms here, via svn external
project/lib <-- custom bits here
project/www <-- folder to point apache/iis at
(you could have cms and lib under the www folder if needed)
This will let you branch/tag each project as you wish. You can also switch the svn:external location on a per branch/tag basis.
In terms of getting changes live, I'd suggest that you immediately get rid of FTP and use rsync or svn checkout/export. Both work well; the choice is up to you.
I've got the most experience with the rsync route, rsyncing an svn export to the server. If you go down this route, write some shell scripts; you can create a test script that shows you the files it would upload, without actually uploading them, using the -n flag. I generally use a pair of scripts for each environment: one to test, and one to actually do it.
Shared-key authentication, so you don't need a password to send uploads, may also be useful, depending on how secure the server being given access needs to be.
You could also maintain another shell script for doing bulk upgrades, which simply calls the relevant shell script for each project you want to upgrade.
Have you looked at Drupal? No, not to deploy and replace what you have, but to see how they handle customizations and site-specific modules?
Basically, there's a "sites" folder which has a directory for every site you're hosting. Within each folder is a separate settings.php which allows you to specify a different database. Finally, you can (optionally) have "themes" and "modules" folders within sites.
This allows you to do site-specific customizations of particular modules and limit certain modules to particular sites. As a result, you end up with sites where the vast majority of everything is perfectly identical and only the differences get duplicated. Combine that with the way it handles upgrades and updates, and you might have a viable model.
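For a taste of the pattern, here's what a per-site settings file looks like in that layout (a Drupal 6-era sketch; the names are made up, and your own CMS would have its equivalent):

// sites/example.com/settings.php
$db_url = 'mysqli://user:password@localhost/example_db'; // per-site database
$conf = array(
    'site_name' => 'Example Site', // per-site overrides of shared defaults
);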
Build into the code a self-updating process.
It will check for updates and run them when/where/how you have configured it for the client.
You will have to create some sort of a list of modules (custom or not) that need to be tested with the new build prior to roll-out. When deploying an update you will have to ensure these are tested and integrated correctly. Hopefully your design can handle this.
Updates are ideally a few key steps.
a) Backup so you can back out - You should be able to back out the entire update at any time. So, that means creating a local archive of the application and database first.
b) Update Monitoring Process - Have the CMS phone home to look for a new build (a minimal sketch of this check follows this list).
c) Schedule Update on availability - Chances are you don't want the update to run the second it is available. This means you will have to create a cron job/agent of some kind to do the system update automatically in the middle of the night. You can also consider client requirements, such as updating only on weekends or on specific days. And you can stagger your roll-outs so you don't update 1000 clients in one day and land in tech-support hell; a staggered roll-out of some kind might be beneficial for you.
d) Add maintenance mode to update the site -- Kick the site into maintenance mode.
e) SVN checkout or downloadable packages - Ideally you can deploy via svn checkout; if not, set up your server to deliver svn-generated packages as archives that can be deployed on client sites.
f) Deploy DB scripts - Back up the databases, update them, populate them.
g) Update site code - All this work for one step.
h) Run some tests on it. If your code has self-tests built in, it would be ideal.
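As mentioned in step (b), a minimal sketch of the phone-home check (URLs and file locations are invented):

// Compare the installed version against the latest published build.
$current = trim(file_get_contents('/var/www/cms/VERSION'));
$latest  = trim(file_get_contents('https://updates.example.com/latest-version.txt'));

if (version_compare($latest, $current, '>')) {
    // Don't update right away; flag it so the scheduled night-time job
    // from step (c) performs the backup, maintenance mode, and deployment.
    touch('/var/www/cms/.update-pending');
}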
Here's what I do...
Client-specific include path
Shared, common code is in shared/current_version/lib/
Site-specific code is in clients/foo.com/lib
The include path is set to include from clients/foo.com/lib first, and then shared/current_version/lib
The whole thing is in a version control system
This ensures that the code uses shared files wherever possible, but if I need to override a particular class or file for some reason, I can write a client-specific version in that client's folder.
Alias common files
My virtual host configuration will contain a line like
Alias /common <path>/shared/current_version/public_html/common
which allows common UI elements, icons, etc. to be shared across projects.
Tag the common code with each site release
After each site release, I tag the common code by creating a branch to effectively freeze that point in time. This allows me to deploy /shared/version_xyz/ to the live server. Then I can have a virtual host use a particular version of the common files, or leave it pointing at the current_version if I want it to pick up the latest updates.
Have you looked at tools such as Puppet (for system administration, including app deployment) or Capistrano (deployment of apps, often Ruby on Rails ones, but not limited to those)?
One option would be to set up a read-only version control system (Subversion). You could integrate access to the repository into your CMS and invoke updates through a menu, or automatically if you do not want the user to have a choice about an update (which could be critical). Using a version control system would also allow you to keep different branches easily.
As people have already mentioned, using version control (I prefer Subversion for its functionality) and branching would be the best option. There is also open-source software available on SourceForge called CruiseControl. It's amazing: you configure CruiseControl with Subversion in such a way that it automatically detects any code modification or new code added to Subversion and does the build for you. It will save you a hell of a lot of time.
I have done it the same way in my company. We have four projects and have to deploy them on different servers. I set up CruiseControl in such a way that any modification in the code base triggers an automatic build, and another script then deploys that build to the server. You are good to go.
If you use a LAMP stack, I would definitely turn the solution's files into a package for your distribution and use that to propagate changes. For that I recommend Red Hat/Fedora because of RPM, and it's what I have experience with; but you can use any Debian-based distribution too.
Some time ago I built a LAMP solution for managing an ISP's hosting servers. They had multiple servers taking care of web hosting, and I needed a way to deploy changes to my manager application, because every machine was self-contained and ran its own online manager. I made an RPM package containing the solution files (PHP mostly) and some deployment scripts that ran with the RPM.
For automated updating, we had our own RPM repository set in yum.conf on every server. I set up a crontab job to update the servers daily with the latest RPMs from that trusted repository.
Trust can be achieved too, because you can use the trust settings in RPM packages, like signing them with your key and accepting only signed packages.
Hm, could it be an idea to add configuration files? You wrote that a lot of small scripts are doing something. Now, if you built them into the sources and steered them with configuration files, shouldn't that "ease" things?
On the other hand, having branches for every customer looks like exponential growth to me. And how would you "know" in which areas you've changed something, and not forget to "make" those changes in all the other branches as well? That looks quite ugly to me.
It seems a combination of revision control, configuration options and/or deployment recipes would be a "good" idea...
With that many variations on your core software, I think you really need a version control system to stay on top of pushing updates from the trunk to the individual client sites.
So if you think Subversion would be tedious, you've got a good sense of where the pain points will be... Personally, I wouldn't recommend Subversion for this, since it's not really that good at managing and tracking branches. Although benlumley's suggestion to use externals for your core software is a good one, it breaks down if you need to tweak the core code for your client sites.
Look into Git for version control; it's built for branching, and it's fast.
Check out Capistrano for managing your deployments. It's a Ruby tool, often used with Rails, but it can be used for all sorts of file management on remote servers, even non-Ruby sites. It can get the content to the remote end through various strategies including FTP, SCP, and rsync, as well as automatically checking out the latest version from your repository. The nice features it provides include callback hooks for every step of the deploy process (e.g. so you can copy your site-specific configuration files, which might not be in version control) and a release log system, done through symlinks, so you can quickly roll back to a previous release in case of trouble.
I'd recommend a config file with the list of branches and their hosted locations, then running through that with a script that checks out each branch in turn and uploads the latest changes. This could be cron'd to do nightly updates automatically.
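A rough sketch of what such a script could look like (the config format, paths, and repository layout are all invented for illustration; it assumes each site directory is already a working copy):

// deploy.php - run nightly from cron
$sites = parse_ini_file('/etc/deploy/sites.ini', true); // one section per client site
foreach ($sites as $name => $cfg) {
    // e.g. repo = https://svn.example.com/cms, branch = branches/client-foo,
    //      path = /var/www/client-foo
    passthru(sprintf('svn switch %s %s',
        escapeshellarg($cfg['repo'] . '/' . $cfg['branch']),
        escapeshellarg($cfg['path'])
    ));
}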