We have a custom PHP/MySQL CMS running on Linux/Apache that's rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files, with a few files per site being customised.
The customised files for each site are:
/library/mysql_connect.php
/public_html/css/*
/public_html/ftparea/*
/public_html/images/*
There are also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than that, every site on the server uses exactly the same files, with each site sitting within /home/username/.
There is obviously a massive amount of replication here, because each time we want to deploy a system update we need to apply it to every user account. Given that the common site files are all stored in SVN, it would make far more sense if we could simply commit to SVN and deploy to a single location directly from there. Unfortunately, making a major architecture change at this stage could be problematic.
In my mind the ideal scenario would be to create an account like /home/commonfiles/ and have each site use these common files unless an account-specific file exists. For example, a request is made to /home/user/public_html/index.php, but as this file doesn't exist the request is redirected to /home/commonfiles/public_html/index.php. I know that this approach is generally possible, similar to how Zend Framework (and probably others) redirect all requests that don't match a specific file to index.php. I'm just not sure exactly how to go about implementing it, or whether it's actually advisable. I'd really welcome any input or ideas people have.
EDIT: AllenJB's comment reminded me that we have previously explored AliasMatch as a potential solution to this; we ended up with a general.conf file for a user that looked something like this:
php_admin_value open_basedir "/home/commonfi:/home/usertes:/usr/lib/php:/usr/local/lib/php:/tmp"
php_admin_value include_path "/home/commonfi"
AliasMatch (.*).php /home/commonfi/public_html/$1.php
AliasMatch (.*).html /home/commonfi/public_html/$1.html
You can set this up via the Apache configuration - you probably want Alias, but there are several options:
http://httpd.apache.org/docs/2.2/urlmapping.html
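For example, a plain Alias (the paths follow the question's layout; which directories you map is up to you) pulls a shared subtree into a site:

# Serve the shared includes from the commonfiles account
Alias /includes /home/commonfiles/public_html/includes
<Directory /home/commonfiles/public_html/includes>
    Order allow,deny
    Allow from all
</Directory>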
You certainly can build a "cascading" system as you describe (load the local file; if that doesn't exist, load the global file). The complexity is that the files are loaded in different ways (using include() in PHP, through the web, and maybe in other ways too).
Filesystem includes
If the includes/ consist of files containing one PHP class each, you could use an autoloader like Zend Framework does. The autoloader would look first for a custom version of the include file, and if it doesn't find one, include the global version instead. I happen to have such an autoloader handy if you need code to start with.
If the includes don't match the one-class-per-file structure, you would have to build a custom include() function that fetches the local version of the file or, failing that, the global one.
Pseudo-code:
function fetch_path($name)
{
    // Prefer the site-specific copy; fall back to the shared one.
    if (file_exists(LOCAL_DIRECTORY."/$name")) return LOCAL_DIRECTORY."/$name";
    if (file_exists(GLOBAL_DIRECTORY."/$name")) return GLOBAL_DIRECTORY."/$name";
    return false;
}
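A caller would then include whichever copy exists (the constant values and the file name are only examples):

define('LOCAL_DIRECTORY',  '/home/user/public_html/includes');
define('GLOBAL_DIRECTORY', '/home/commonfiles/public_html/includes');

$path = fetch_path('navigation.php');
if ($path !== false) {
    // Loads the site-specific override if present, otherwise the shared copy.
    include $path;
}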
Web resources
The second part is going to be the web part (i.e. web URLs that may map to local or global files). I think this should be pretty easily solvable using the -f switch in a .htaccess file. You would build a rule that rewrites requests for local web resources that don't exist (!-f), such as example.com/css/main_stylesheet.css, to the corresponding file under /home/commonfiles/public_html/. You would need to fiddle around with Apache's server config to be able to rewrite local requests to the commonfiles directory, but it should be possible.
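A rough sketch of such a rule (this assumes mod_rewrite is enabled and that the server config already contains something like Alias /commonfiles /home/commonfiles/public_html, since a per-directory rewrite can't point outside the document root on its own):

RewriteEngine On
# If the requested file doesn't exist locally...
RewriteCond %{REQUEST_FILENAME} !-f
# ...serve the copy from the commonfiles account instead.
RewriteRule ^(.*)$ /commonfiles/$1 [L]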
That is maybe worth a separate question.
Related
Some people have reported issues with accessing, setting, or getting the right value from baseUrl() in a view script. But I'm wondering why it is necessary to use it at all, at least in a situation like mine where the ZF application is on a virtual private host (Amazon EC2) where I have full control of the directory structure and Apache rewrite rules, as well as routes.
I know, for example, that in the filesystem foo.jpg lives in public/images/foo.jpg, and that the application's mod_rewrite will direct all requests to public - so in my view scripts it's a lot simpler/clearer and more efficient to write something like
<img src="/images/foo.jpg" />
instead of
<img src="<?php echo $this->baseUrl();?>/images/foo.jpg" />
What sort of future-proofing robustness or other benefit does the use of baseUrl() really provide? So far I haven't used it at all, and had no problem. But I've inherited some code that uses it, and my inclination is to strip out those uses whenever I'm editing a view script that contains them. Would I regret that later?
Used this way, it's not really useful, but on the other hand, using it this way
echo $this->baseUrl('/images/foo.jpg')
might prove useful in the future, since you can add logic before printing the URL. Imagine that in a few years your website grows far more than you expected and you have to move all your static content to a content delivery network (CDN): you would have to correct every image/CSS/JS URL manually (or with search and replace). With baseUrl() (or call it assetUrl()), you would just have to add your CDN's URL and it would be fixed everywhere in your application.
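As a sketch of what that logic could look like (the helper class, the 'assetBase' registry key, and the CDN host are assumptions, not part of ZF):

<?php
// application/views/helpers/AssetUrl.php - hypothetical view helper
class My_View_Helper_AssetUrl extends Zend_View_Helper_Abstract
{
    public function assetUrl($path)
    {
        // Later, register 'assetBase' as e.g. 'http://cdn.example.com' and
        // every call site starts pointing at the CDN automatically.
        $base = Zend_Registry::isRegistered('assetBase')
              ? Zend_Registry::get('assetBase')
              : $this->view->baseUrl();
        return rtrim($base, '/') . '/' . ltrim($path, '/');
    }
}

In a view script that becomes <img src="<?php echo $this->assetUrl('images/foo.jpg'); ?>" />, assuming the My_ helper prefix has been registered with the view.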
EDIT
I found a use for the baseUrl() in the code you inherited:
It would allow you to add a common URL part to all of your links and references in the case where your site is not at the root of the domain, e.g. www.mysite.com/zf-app/.
In your config file you would just have to add
resources.frontController.baseUrl = "/zf-app/"
for it to work, and all of your links would be prefixed with that part.
Perfect example. I have a couple of basic Zend-y utilities I built on separate systems. On my test platform I just virtual-host them all, each with its own document root. Generally I access these tools through a remote web browser, but that requires me to VPN to the system running them, as I didn't build these tools to live on anything more than a subnet and don't really want to expose them on an internet-facing site.
So along come Android phones and things like bitweb server, which lets you run lighttpd, PHP and MySQL in minimal form on a phone in your pocket. The only problem is that it isn't really set up to be powerful enough to do virtual hosting on an Android operating system.
No problem: it does allow basic aliasing, so I moved each tool into its own sub-directory on my SD card, used lighttpd mod_alias definitions to point to each one, and created rewrite rules for each subdirectory. But that led me to this post and others like it to fix all the static URLs that pointed to href="/some/path/to/static/content".
I even had to update some URLs to Zend tools that were absolute paths to use {view}->url() calls instead.
By adding baseUrl() calls to the front of the static content, and using the view's url() method for calling controller actions, I can now move the entire Zend MVC for any one of the independent tools into any directory I want and have it run from as deep in the web tree as I like. Zend does the rest, and all it takes is 2-3 properly formatted entries in the lighttpd conf file.
Our main website uses symfony 1, and by the time I started working on the code it seemed impossible to upgrade (too much custom code from the previous developer). Now we are adding a large addition to what the company offers. Instead of using a really old framework I wanted to use CodeIgniter, also because I'm very familiar with it. My real question:
What is a proper way of setting up a website to use multiple frameworks? The new features will be separate from the original website, but they will still need a few tables from the database.
I was going to have Apache pick the document root depending on the URL and otherwise do everything normally. The main website is example.com and the new feature will be abc.example.com.
I'm really looking for people who have done this, and any tips and warnings they have.
PHP will run whichever framework lives in the directory the server loads for that host. For instance, on many Apache servers the document root for example.com would be /www, which means all of the Symfony code would live under /www/*.
When you set up the path for your subdomain, just put it outside of the /www folder. Then, when you go to abc.example.com, Apache won't try to load the original site along with the Symfony framework.
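For example, a pair of virtual hosts along these lines (the hostnames and the second path are examples only) keeps the two document roots apart:

<VirtualHost *:80>
    ServerName example.com
    # existing Symfony site
    DocumentRoot /www
</VirtualHost>

<VirtualHost *:80>
    ServerName abc.example.com
    # new CodeIgniter app, outside /www
    DocumentRoot /srv/abc_example
</VirtualHost>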
I think it will be fine for both frameworks to share the same database tables. I'm not entirely sure how you plan for these two applications to work, but as long as you don't change the column names and types you should be okay.
If you don't want the applications to share the data in the original table, then look into using mysqldump or something similar to copy the data over to a new table.
I run multiple websites all running off of a single installation of CodeIgniter on my server (separate application directories and a single system directory). This has been working fabulously and I don't see any reason to change it at this point.
I find myself writing library classes to extend/override CI all of the time, and many times when I find a bug or improve efficiency I have to go back to several websites to make the same adjustments, at the risk of a typo that breaks one of them. Because of this I have to change each file and then test that site for bugs.
I have been pondering a solution of using a single libraries directory in a central location and symlinking all of my websites to that central directory. Then when I make a file change it will immediately propagate to all of the downstream websites. It will still require that I test each one for errors, but I won't have to make the changes multiple times. Anything that is specific to a single website will either be a non-shared file (still in the linked directory just not used elsewhere) or can be put in a local helper.
Also, I keep separate 'system' directories by CI version so I can migrate my websites independently if necessary - this central libraries directory would be tied to a specific version to reduce possible breaks.
Does anyone see potential issues or pitfalls from taking this approach? Has anyone accomplished this in another direction that I should consider?
Thanks in advance!
I think this actually makes sense :] Go for it. Even on the official CodeIgniter page they mention it's possible.
Also, I don't see any reason why there should be a problem.
Edit: they touch on the problem of multiple sites here: http://codeigniter.com/user_guide/general/managing_apps.html
also:
http://codeigniter.com/wiki/Multiple_Applications/
http://www.exclusivetutorials.com/setting-multiple-websites-in-codeigniter-installation/
How to Handle Multiple Projects in CodeIgniter?
http://codeigniter.com/forums/viewthread/56436/
I have a single system directory and separate application directories for my CI apps. In order to share libraries and some view templates between my apps, I have created a "Common" directory in the same folder as the CI system, with the same structure as a regular app folder, and used symlinks - but you can also modify the Loader class so that it looks in the Common folder too. My setup looks something like this:
/var/CodeIgniter/
/var/Common/
/var/Common/config/
/var/Common/controllers/
...
/var/Common/libraries/
...
/var/www/someapp/
/var/www/someotherapp/
...
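If you take the Loader route rather than symlinks, a minimal sketch could look like this (assuming CodeIgniter 2.x, where CI_Loader::add_package_path() exists; the /var/Common path matches the layout above):

<?php
// application/core/MY_Loader.php
// Sketch only: register the shared Common folder as an extra package path so
// that libraries, helpers, models and views can also be loaded from there.
class MY_Loader extends CI_Loader {

    public function __construct()
    {
        parent::__construct();
        $this->add_package_path('/var/Common/');
    }
}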
I'm not sure how you handle publishing your sites (assuming you actually do any of that), but I'd look into version control. For example, in SVN you can make a directory external to another SVN directory (or file) and then just update the current directory, which pulls in the external. This approach has one benefit over the others: when you modify the common library, the other sites aren't immediately affected. That prevents unwanted breakage before you have had time to test all the sites using the common library. You can then update each site's folder whenever you are ready to test the changes. This is "more work", but it prevents code duplication AND unwanted breaks.
I wrote a MY_Loader to do exactly that.
http://ellislab.com/forums/viewthread/136321/
I'm running the same php script on many domains on an apache2 server. Only some of the files change between domains, and the rest are always the same. Right now, every time I set up a new domain, I copy all the files, but I'd really like to have these common files in one place so any changes would affect all domains.
I've thought of using a bunch of symlinks to point at common files. Is this an ok approach, or are there some simple edits I can make to the php scripts or apache configuration files to make this more efficient?
Thanks!
The way I do this kind of thing is to create a "common" directory, where I place all the files that can be shared between the sites. Then I simply include them wherever they are needed.
This works pretty well because it allows you to add features across multiple sites.
I'd suggest abstracting the common code into a set of 'library' scripts, placing these in a common directory, and making that directory available by modifying PHP's include_path setting. This means you most likely won't have to modify your current scripts, while still removing the need to have more than one copy.
This path could (and probably should) be outside of your public directories. That enhances the security of your websites by not making the library code directly available to outside users.
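For instance (the directory is an example; any path outside the docroots will do):

<?php
// Make the shared library directory visible to every site's scripts.
set_include_path('/home/commonfiles/lib' . PATH_SEPARATOR . get_include_path());

// Existing includes now also resolve against the shared directory.
require_once 'db_functions.php';   // hypothetical shared library script

The same path can instead be set once in php.ini, or per vhost with php_admin_value include_path.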
This can be a bit tricky, as the application almost needs to know you're doing this. In my experience it works best when you can divide the app into common code and instance code in two separate directory trees. The common code also needs to not do anything silly, like including a file that has to be in the instance tree.
A single point of entry for loading the common code is also a big bonus, because then you can chain a few very small files: the instance code includes one in its own directory; that file includes a file outside the instance code; that file then either loads the entry-point file for the common code, or loads another file that does. This is only one way to do it, but it means you have just one file that needs to know where the common code is (so you can move it with minimal effort if you have to), and if you do it right, all the various instance code trees load it, albeit indirectly.
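Purely as an illustration of that chain (all file names and paths here are made up):

<?php
// --- /home/site1/public_html/index.php (instance code) ---
// Step 1: include a tiny file in the instance's own directory.
require dirname(__FILE__) . '/load_common.php';

// --- /home/site1/public_html/load_common.php ---
// Step 2: that file includes one outside the instance code.
require '/home/site1/common_location.php';

// --- /home/site1/common_location.php ---
// Step 3: the only file that knows where the common code lives; if the common
// tree ever moves, this single line is all that changes.
require '/home/commonfiles/app/entry.php';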
You could have a library directory that sits above all of your sites, and a config file that states which library files your sites should include by default. You can then have another config file within each site that overrides the global config. These config files can be used to generate include('../../lib/*.php') statements to build the basic function toolkit needed for each site - see the sketch after the layout below.
some_high_level_directory/
-> lib/
->*.php (library files)
-> config.php (global library includes)
-> site_1/
-> config.php (library includes that only relate to site_1)
-> www/
-> site_2/
-> config.php (library includes that only relate to site_2)
-> www/
-> etc, etc
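A rough sketch of that idea (the file names, the array-returning config format and the paths are assumptions, not a prescribed layout):

<?php
// site_1/www/index.php
define('LIB_DIR', dirname(__FILE__) . '/../../lib');

// Global defaults, e.g. lib/config.php returns array('db.php', 'forms.php').
$global_libs = include LIB_DIR . '/config.php';
// Per-site additions, e.g. site_1/config.php returns array('newsletter.php').
$site_libs = include dirname(__FILE__) . '/../config.php';

// Pull in the basic function toolkit for this site.
foreach (array_merge($global_libs, $site_libs) as $lib) {
    require_once LIB_DIR . '/' . $lib;
}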
Hopefully that makes sense... :)
Not quite sure how to phrase this, but will do my best.
I have an application (written in PHP) that I want to install somewhere like this:
/app/build/1.0/
Now, I want to be able to set up subdomains, something like this:
http://sub1.mydomain.com
http://sub2.mydomain.com
etc ..
First, I want to put as few files as possible in the subdomains (I am thinking just a config file), and point to the build folder for all PHP files.
However, in the case where a subdomain has some sort of customization, I would like to be able to place the customized file in the subdomain and have the app use that (in other words, the app first looks for a local file; if it exists, it uses it; otherwise it uses the default build-folder file).
Last, if I release 1.1, I should simply be able to re-point the subdomains to the 1.1 folder.
I have a basic understanding of this and how I might achieve it, but what I am looking for is alternative ideas or gotchas I may face (or anything else I may not have considered, like scalability issues I can't see yet, or other things I won't be able to do if I go this route).
Bottom line question: Is this a good or bad idea, and why?
I'm assuming that the config file in each subdomain means there will be config differences for every site, so that you can't combine all of the uncustomized sites into one folder and simply have the subdomains point to it (in DNS).
I have set up sites such that each subdomain has a single index.php file. The index.php file defines a bunch of config options and then calls something equivalent to startApp(). Each site has its include path set to include the application files; that can be done in the Apache config or in the index.php file.
If you want to customize a site, you change the include path to point to the customized code, which you could keep in that site's folder if you want.
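A bare-bones sketch of such an index.php (startApp() is the placeholder from above; the build path, bootstrap.php and the config values are assumptions):

<?php
// http://sub1.mydomain.com/index.php
define('APP_BUILD', '/app/build/1.0');   // repoint to /app/build/1.1 on release
define('SITE_DIR', dirname(__FILE__));   // per-subdomain customizations live here

// Site-specific config options for this subdomain.
$config = array('site_name' => 'Sub One', 'theme' => 'blue');

// Local files win; otherwise fall back to the shared build.
set_include_path(SITE_DIR . PATH_SEPARATOR . APP_BUILD . PATH_SEPARATOR . get_include_path());

require 'bootstrap.php';   // found in SITE_DIR if customized, otherwise in the build
startApp($config);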
Honestly, I think the harder problem will be keeping all the customizations documented and updated. That's a totally different problem though.