I'm trying to include a premade message board, Phorum, in one of my Zend projects. Phorum is a relatively large and intricate web of PHP includes. My website already has a signup system, so I'm trying to combine the two login systems into one.
What I've done is to make a controller that includes Phorum's index.php, which lets me use the authentication system I've already set up. My problem is that, since I'm bootstrapping, all the relative file paths within Phorum's index.php resolve relative to my Zend project's index.php, so they all seem to break.
Is there a way around this? Should I include? Render? Something better?
All help is appreciated.
Adding the appropriate chdir() (back to Phorum's include-path root) in your Bootstrap file should do it. Then, of course, you have to take care that your Zend app uses application-specific rather than relative paths.
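A minimal sketch, assuming Phorum lives at /path/to/phorum (adjust to your layout):

// In the bootstrap (or the controller), before pulling in Phorum:
chdir('/path/to/phorum');              // so Phorum's relative includes resolve
include '/path/to/phorum/index.php';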
You can use the include_path setting, which can be set at runtime using set_include_path(). This doesn't require you to change the working directory, but makes PHP search for includes in all the directories specified in this setting.
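For example (the path is an assumption):

// Make PHP also search Phorum's directory when resolving relative includes.
set_include_path(get_include_path() . PATH_SEPARATOR . '/path/to/phorum');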
Goal: to require files from an old monolithic code base that is still an active website, and use those files as logic for an API running on the same server in its own PHP instance, hoping to leverage the old code's deeply buried business logic. The old code base must keep running while we transition away from it.
Problem: the old code base uses $_SERVER['DOCUMENT_ROOT'] in its require paths, so the new API can't include those files, because PHP resolves the document root to the API's own web root. The old code files contain MANY requires, and the required files include many requires of their own, so going through and replacing the document-root variable with relative paths would be a mammoth job. For every new route I write, I would need to go through all of the old code's requires and change the paths to relative ones.
Using curl from the new API works fine, but I’m trying to avoid the overhead.
I thought of using an $_ENV variable, but again it comes from the server environment, so it has the same problem.
Is there a way to circumvent this problem?
The easiest by far would be to write a script that automatically updates all the require paths. This is probably far better than any workaround you can come up with.
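If the only dynamic piece really is $_SERVER['DOCUMENT_ROOT'], a hypothetical one-off script along these lines could do the bulk of it (the legacy path and the single-quoted form are assumptions; run it against a version-controlled copy first):

<?php
// Recursively rewrite $_SERVER['DOCUMENT_ROOT'] in the legacy tree
// to a hard-coded path, so the files can be required from anywhere.
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/var/www/legacy')
);
foreach ($files as $file) {
    if (!$file->isFile() || substr($file->getFilename(), -4) !== '.php') {
        continue;
    }
    $code = file_get_contents($file->getPathname());
    // Handles the single-quoted form; add the double-quoted variant if needed.
    $patched = str_replace('$_SERVER[\'DOCUMENT_ROOT\']', "'/var/www/legacy'", $code);
    if ($patched !== $code) {
        file_put_contents($file->getPathname(), $patched);
    }
}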
This question seems really elementary, but I have never had a use for this scenario until now, so I figured I would ask for some advice. I am building a complex application with an accompanying API in a LAMP environment.
mydomain.com will be the location of the main service and system.
api.mydomain.com will be the location of all endpoints for my API.
All of the class files, DB config files, etc. will be located in the main domain's root folder.
Is there a recommended way to handle this? Or is it as simple as including the required files on the API side/folder?
It all depends on whether api.mydomain.com is on the same server as mydomain.com. If it is, you could allow access to the shared files from both areas by using direct paths. For example, say you have a structure like this:
/webdata/shared
/webdata/api
/webdata/www
You could simply just do an include like:
include_once('/webdata/shared/config.php');
As it is on the same server, it should be no problem. You should use absolute paths for including (for example /web/www/htdocs/config.inc.php instead of ../htdocs/config.inc.php), as relative paths would get really confusing.
It would be best to have a simple pathConf.inc.php in the api directory and the root directory, containing this absolute path, so you can easily change it afterwards. Then use the path configured in pathConf.inc.php to include your classes, DB configs, etc.
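For example, a one-line pathConf.inc.php (the constant name is just an assumption):

<?php
// pathConf.inc.php - adjust this single line if the shared folder ever moves
define('SHARED_PATH', '/webdata/shared');

Then the API (or any other area) can pull in shared files like this:

include_once('pathConf.inc.php');
include_once(SHARED_PATH . '/config.php');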
We have a custom PHP/MySQL CMS running on Linux/Apache that's rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files, with a few files per site being customised.
The customised files for each site are:
/library/mysql_connect.php
/public_html/css/*
/public_html/ftparea/*
/public_html/images/*
There are also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than this, each site on the server uses the exact same files. Each site sits within /home/username/.

There is obviously a massive amount of replication here, as each time we want to deploy a system update we need to update each user account. Given that the common site files are all stored in SVN, it would make far more sense if we were able to simply commit to SVN and deploy to a single location directly from there. Unfortunately, making a major architecture change at this stage could be problematic.

In my mind the ideal scenario would mean creating an account like /home/commonfiles/ and each site using these common files unless an account-specific file exists. For example, a request is made to /home/user/public_html/index.php, but as this file doesn't exist the request is then redirected to /home/commonfiles/public_html/index.php. I know that this approach is generally possible, similar to how Zend Framework (and probably others) redirect all requests that don't match a specific file to index.php. I'm just not sure how exactly to go about implementing it, or whether it's actually advisable. I'd really welcome any input/ideas people have got.
EDIT: AllenJB's comment reminded me that we have previously explored AliasMatch as a potential solution to this; we ended up with a general.conf file for a user that looked something like this:
php_admin_value open_basedir "/home/commonfi:/home/usertes:/usr/lib/php:/usr/local/lib/php:/tmp"
php_admin_value include_path "/home/commonfi"
AliasMatch (.*).php /home/commonfi/public_html/$1.php
AliasMatch (.*).html /home/commonfi/public_html/$1.html
You can set this up via the Apache configuration - you probably want Alias, but there are several options:
http://httpd.apache.org/docs/2.2/urlmapping.html
You certainly can build a "cascading" system as you describe (load the local file; if that doesn't exist, load the global file). The complexity is that the files are loaded in different ways: using include() in PHP, through the web, and maybe more.
Filesystem includes
If the includes/ consist of files containing one PHP class each, you could use an autoloader like Zend Framework does. The autoloader would look first for a custom version of the include file, and if it doesn't find one, include the global version instead. I happen to have such an autoloader handy if you need code to start with.
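A minimal sketch of such an autoloader, assuming LOCAL_DIRECTORY and GLOBAL_DIRECTORY constants point at the two trees, and that class names map to file paths the way Zend Framework's do:

function cascading_autoload($class)
{
    $file = str_replace('_', '/', $class) . '.php';
    // Prefer the site-specific copy of the class file over the shared one.
    foreach (array(LOCAL_DIRECTORY, GLOBAL_DIRECTORY) as $dir) {
        if (file_exists($dir . '/' . $file)) {
            require $dir . '/' . $file;
            return;
        }
    }
}
spl_autoload_register('cascading_autoload');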
If the includes don't match the one-class-per-file structure, you would have to build a custom include() function that fetches the local version of the file or, failing that, the global one.
Pseudo-code:

function fetch_path($name)
{
    // Prefer the site-specific copy; fall back to the shared one.
    if (file_exists(LOCAL_DIRECTORY . "/$name")) {
        return LOCAL_DIRECTORY . "/$name";
    }
    if (file_exists(GLOBAL_DIRECTORY . "/$name")) {
        return GLOBAL_DIRECTORY . "/$name";
    }
    return false;
}
Web resources
The second part is going to be the web part (i.e. web URLs that map to local or global files). I think this should be pretty easily solvable using the !-f condition in a .htaccess file. You would build a rule that rewrites failed requests (!-f) for local web resources (e.g. example.com/css/main_stylesheet.css) to the global directory (/home/commonfiles/public_html/css/main_stylesheet.css). You would need to fiddle around with Apache's server config to be able to rewrite local requests to the commonfiles directory, but it should be possible.
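A rough, untested sketch of what that could look like in a site's vhost configuration (the alias name and paths are assumptions):

# Expose the shared tree under an internal URL prefix.
Alias /commonfiles/ /home/commonfiles/public_html/
RewriteEngine On
# Don't touch requests already pointing at the shared tree.
RewriteCond %{REQUEST_URI} !^/commonfiles/
# If the requested file doesn't exist in this site's own tree...
RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} !-f
# ...pass the request through to the shared tree instead.
RewriteRule ^/(.*)$ /commonfiles/$1 [PT]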
That is maybe worth a separate question.
I'm running the same php script on many domains on an apache2 server. Only some of the files change between domains, and the rest are always the same. Right now, every time I set up a new domain, I copy all the files, but I'd really like to have these common files in one place so any changes would affect all domains.
I've thought of using a bunch of symlinks to point at common files. Is this an ok approach, or are there some simple edits I can make to the php scripts or apache configuration files to make this more efficient?
Thanks!
The way I do this kind of thing is to create a "common" directory, where I place all the files that can be shared between the sites. Then I simply include them wherever they are needed.
This works well because it lets you add features across multiple sites at once.
I'd suggest abstracting the common code into a set of 'library' scripts, placing these in a common directory, and making that directory available by modifying PHP's include_path setting. This means you most likely won't have to modify your current scripts, while still removing the need to have more than one copy.
This path could (and probably should) be outside of your public directories. This enhances the security of your websites by not making them directly available to outside users.
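A minimal sketch, assuming the shared scripts live in /home/common/lib:

// Prepend the shared library directory to PHP's include path at runtime.
set_include_path('/home/common/lib' . PATH_SEPARATOR . get_include_path());

// Relative includes now find the shared copies automatically.
include 'db_functions.php';

The same setting can also go in php.ini, or in a vhost via php_value include_path, if you'd rather not touch the scripts at all.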
This can be a bit tricky, as the application almost needs to know you're doing this. In my experience, it works best when you can divide the app into common code and instance code in two separate directory trees. The common code also needs to not do anything silly like include a file that has to live in the instance tree.
A single point of entry for loading the common code is also a big bonus, because then you can chain a few very small files: the instance code includes one file in its own directory; that file includes a file outside the instance code; that file then either loads the entry-point file for the common code, or loads another file that does. This is only one way to do it, but it means you have just one file that needs to know where the common code is (so you can move it with minimal effort if you have to), and if you do it right, all the various instance code trees load it, albeit indirectly.
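A hypothetical sketch of that chain (all file names and paths are assumptions):

// instance/index.php - instance code; knows only its own directory
require dirname(__FILE__) . '/load_common.php';

// instance/load_common.php - the single file that knows where the common code lives
require '/srv/common/entry.php';

// /srv/common/entry.php - entry point that pulls in the rest of the common code
require dirname(__FILE__) . '/lib/core.php';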
You could have a library directory that sits above all of your sites, and a config file that states which library files your sites should include by default. You can then have another config file within each site that overrides the global config. These config files can be used to generate the individual include('../../lib/...') statements that build the basic function toolkit needed for each site (see the sketch after the layout below).
some_high_level_directory/
-> lib/
->*.php (library files)
-> config.php (global library includes)
-> site_1/
-> config.php (library includes that only relate to site_1)
-> www/
-> site_2/
-> config.php (library includes that only relate to site_2)
-> www/
-> etc, etc
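A hypothetical sketch of how those config files and the loader could fit together (all names are assumptions):

// lib/config.php - the global default set of library includes
$lib_includes = array('db.php', 'forms.php');

// site_1/config.php - adds site-specific extras to the defaults
$lib_includes[] = 'newsletter.php';

// run from site_1/www/ after both config files have been read
foreach ($lib_includes as $file) {
    include '../../lib/' . $file;
}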
Hopefully that makes sense... :)
Not quite sure how to phrase this, but will do my best.
I have an application (written in PHP) that I want to install somewhere like this:
/app/build/1.0/
Now, I want to be able to set up subdomains, something like this:
http://sub1.mydomain.com
http://sub2.mydomain.com
etc ..
First, I want to put as few files as possible in the subdomains (I am thinking just a config file), and point to the build folder for all PHP files.
However, in the case where a subdomain has some sort of customization, I would like to be able to place the customized file in the subdomain itself, and have the app use that (in other words, the app first looks for a local file; if it exists, it uses it; otherwise it falls back to the default build folder files).
Last, if I release 1.1, I should simply be able to re-point the subdomains to the 1.1 folder.
I have a basic understanding of this and how I might achieve it, but what I am looking for are alternative ideas, or gotchas I may face (or anything else I may not have considered, like scalability issues I can't see yet, or things I won't be able to do if I go this route).
Bottom line question: Is this a good or bad idea, and why?
I'm assuming that the config file in each subdomain means there will be config differences for every site, so that you can't combine all of the uncustomized sites into one folder and simply have the subdomains point to it (in DNS).
I have set up sites such that each subdomain has a single index.php file. The index.php file defines a bunch of config options and then calls something equivalent to startApp(). Each site has its include path set to point at the application files; that can be done in the Apache config or in the index.php file itself.
If you want to customize a site, then you would change the include path to point to the customized code, which you could keep in that sites folder if you want.
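A hypothetical sketch of such a per-subdomain index.php (the constants, paths, and startApp() are assumptions based on the description above):

<?php
// sub1's index.php - the only PHP file in this subdomain's web root
define('SITE_NAME', 'sub1');
define('DB_NAME',   'sub1_db');

// Point the include path at the shared build (or a customized copy).
set_include_path('/app/build/1.0' . PATH_SEPARATOR . get_include_path());

require 'startup.php'; // resolved via the include path
startApp();

Re-pointing the subdomain at a 1.1 release then means changing just that one path.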
Honestly, I think the harder problem will be keeping all the customizations documented and updated. That's a totally different problem though.