API trying to use code on the same server - require path issue - PHP

Goal: to require files from an old monolithic code base that is still an active website, and use those files as the logic for an API running on the same server in its own PHP instance. I'm hoping to leverage the old code's deeply buried business logic, since the old code base must keep running while we transition away from it.
Problem: The old code base uses $_SERVER['DOCUMENT_ROOT'] in its require paths, so the new API can't include those files: it resolves the document root to its own web root instead of the old site's. The old code files contain MANY requires, and those required files include many requires of their own, so going through and replacing the document root variable with relative paths would be a mammoth job. For every new route I write I would need to go through all of the old code's requires and change the paths to relative ones.
Using curl from the new API works fine, but I’m trying to avoid the overhead.
I thought of using an $_ENV variable, but that comes from the same server environment, so it has the same problem.
Is there a way to circumvent this problem?

The easiest by far would be to write a script that automatically updates all the require paths. This is probably far better than any workaround you can come up with.
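A minimal sketch of such a script, assuming the old code base lives under /var/www/oldsite (a made-up path) and that the requires use the exact $_SERVER['DOCUMENT_ROOT'] spelling; variants with double quotes would need extra patterns, and you should run this against a checkout under version control so the changes are easy to review:

<?php
// rewrite-requires.php - a one-off sketch, not a hardened tool.
$oldRoot = '/var/www/oldsite'; // assumed absolute path to the old web root

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($oldRoot, FilesystemIterator::SKIP_DOTS)
);

foreach ($files as $file) {
    if ($file->getExtension() !== 'php') {
        continue;
    }
    $source  = file_get_contents($file->getPathname());
    // Pin the document root to an absolute path instead of the
    // server-dependent superglobal lookup.
    $patched = str_replace("\$_SERVER['DOCUMENT_ROOT']", "'$oldRoot'", $source);
    if ($patched !== $source) {
        file_put_contents($file->getPathname(), $patched);
    }
}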

Related

Location of PHP class

A PHP newbie question. I am trying to make my first web page using a PHP class for navigation and some PHP for including HTML files - trying to be efficient. It all works fine at the top of the directory structure, but I don't know how to make it work in all the sub-folders.
The (linux) server is a fileshare and I have access to a root folder with a www folder for serving web pages.
I have trawled this site and the web but I am completely confused about how to specify the path to the php and class files to be included in each page.
I know this is really basic php but I am really old (well, almost 60) and learning is getting harder.
The most efficient way to do it is, as Jeremy Miller suggested in the comment, to use a variable holding your root folder (an absolute path) and refer to it each time. Otherwise, you should know that the "../path" notation refers to the parent folder in PHP, and you can repeat it to traverse up the tree: "../../path" and so on.
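For example, a minimal sketch of the absolute-path approach, assuming a config.php at the top of the www folder (the file name and SITE_ROOT constant are made up for illustration):

<?php
// config.php, placed at the top of the www folder.
// __DIR__ is always the absolute path of the folder containing this file.
define('SITE_ROOT', __DIR__);

A page two levels down then only needs one relative hop to load the config, after which every other include can use the absolute root:

<?php
require_once __DIR__ . '/../../config.php';
require_once SITE_ROOT . '/classes/Navigation.php'; // made-up class file
include SITE_ROOT . '/snippets/header.html';        // made-up snippet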

Sub Domain utilizing same application files as root (using PHP)

This question seems really elementary, but I have never had a use for this scenario before, so I figure I'll get some advice. I am building a complex application with an accompanying API in a LAMP environment.
mydomain.com will be the location of the main service and system.
api.mydomain.com will be the location of all endpoints for my API.
All of the class files, DB config files, etc., will be located in the main domain's root folder.
Is there a recommended way to handle this? Or is it as simple as including the required files on the API side/folder?
It all depends on whether api.mydomain.com is on the same server as mydomain.com. If they are, you could allow access to the shared files from both areas by using direct paths. So for example, say you have a structure like this:
/webdata/shared
/webdata/api
/webdata/www
You could simply do an include like:
include_once('/webdata/shared/config.php');
As it is on the same server, it should be no problem. You should use absolute paths for including, for example /web/www/htdocs/config.inc.php instead of ../htdocs/config.inc.php, as relative paths get really confusing.
It would be best to have a simple pathConf.inc.php in both the api directory and the root directory, containing this absolute path, so you can easily change it later. Then use the path configured in pathConf.inc.php to include your classes, DB configs and so on.
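A minimal sketch of that setup, following the /webdata layout above (the SHARED_PATH constant name is made up):

<?php
// pathConf.inc.php - one copy in /webdata/api and one in /webdata/www.
// If the shared folder ever moves, only this file needs editing.
define('SHARED_PATH', '/webdata/shared');

An endpoint or page then pulls in shared code through the constant:

<?php
require_once __DIR__ . '/pathConf.inc.php';
include_once SHARED_PATH . '/config.php';
include_once SHARED_PATH . '/classes/Database.php'; // made-up class file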

Consolidate multiple site files into single location

We have a custom PHP/MySQL CMS running on Linux/Apache that's rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files, with a few files per site being customised.
The customised files for each site are:
/library/mysql_connect.php
/public_html/css/*
/public_html/ftparea/*
/public_html/images/*
There are also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than these, every site on the server uses exactly the same files. Each site sits within /home/username/.

There is obviously a massive amount of replication here, as each time we want to deploy a system update we need to apply it to each user account. Given the common site files are all stored in SVN, it would make far more sense if we could simply commit to SVN and deploy to a single location directly from there. Unfortunately, making a major architecture change at this stage could be problematic.

In my mind the ideal scenario would mean creating an account like /home/commonfiles/ and having each site use these common files unless an account-specific file exists; for example, a request is made to /home/user/public_html/index.php, but as this file doesn't exist the request is then redirected to /home/commonfiles/public_html/index.php. I know that this approach is generally possible, similar to how Zend Framework (and probably others) redirects all requests that don't match a specific file to index.php. I'm just not sure how exactly to go about implementing it, and whether it's actually advisable. I'd really welcome any input/ideas people have.
EDIT: AllenJB's comment reminded me that we have previously explored AliasMatch as a potential solution to this; we ended up with a general.conf file for a user that looked something like this:
php_admin_value open_basedir "/home/commonfi:/home/usertes:/usr/lib/php:/usr/local/lib/php:/tmp"
php_admin_value include_path "/home/commonfi"
AliasMatch (.*).php /home/commonfi/public_html/$1.php
AliasMatch (.*).html /home/commonfi/public_html/$1.html
You can set this up via the Apache configuration - you probably want Alias, but there are several options:
http://httpd.apache.org/docs/2.2/urlmapping.html
You certainly can build a "cascading" system as you describe (load the local file; if that doesn't exist, load the global file). The complexity is that the files are loaded in different ways (using include() in PHP, through the web, and maybe even more ways?)
Filesystem includes
If the includes/ consist of files containing one PHP class each, you could use an autoloader like Zend Framework does. The autoloader would look first for a custom version of the include file, and if it doesn't find one, include the global version instead. I happen to have such an autoloader handy if you need code to start with.
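A minimal sketch of such an autoloader, assuming one class per file and made-up LOCAL_DIRECTORY/GLOBAL_DIRECTORY constants for the per-site and common include folders (PHP 5.3+ for the closure):

<?php
define('LOCAL_DIRECTORY', '/home/username/includes');     // assumed per-site path
define('GLOBAL_DIRECTORY', '/home/commonfiles/includes'); // assumed shared path

spl_autoload_register(function ($class) {
    // Prefer the site's customised version of the class...
    $local = LOCAL_DIRECTORY . "/$class.php";
    if (file_exists($local)) {
        require $local;
        return;
    }
    // ...and fall back to the shared global version.
    $global = GLOBAL_DIRECTORY . "/$class.php";
    if (file_exists($global)) {
        require $global;
    }
});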
If the includes don't match the one-class-per-file structure, you would have to build a custom include() function that fetches the local version of the file or, failing that, the global one.
Pseudo-code:
function fetch_path($name)
{
    // Prefer the site-specific copy if one exists...
    if (file_exists(LOCAL_DIRECTORY."/$name")) return LOCAL_DIRECTORY."/$name";
    // ...otherwise fall back to the shared global copy.
    if (file_exists(GLOBAL_DIRECTORY."/$name")) return GLOBAL_DIRECTORY."/$name";
    // The file exists in neither location.
    return false;
}
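Used at include time like this, with the false return guarding against a file that exists in neither location:

$path = fetch_path('header.php');
if ($path !== false) {
    include $path;
}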
Web resources
The second part is going to be the web part (i.e. web URLs that may resolve to local or global files). I think this should be pretty easily solvable using the -f test in a .htaccess file. You would build a rule that rewrites failed requests (!-f) for a local web resource (e.g. example.com/css/main_stylesheet.css) to the global one (/home/commonfiles/public_html/css/main_stylesheet.css). You would need to fiddle around with Apache's server config to be able to rewrite local requests to the commonfiles directory, but it should be possible.
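A minimal sketch of that rule, written for the vhost config (filesystem-path substitutions like this work in server context; in a per-directory .htaccess you would rewrite to an aliased URL path instead), assuming mod_rewrite is enabled:

RewriteEngine On
# If the requested file doesn't exist under this site's document root...
RewriteCond %{REQUEST_FILENAME} !-f
# ...serve the copy from the shared commonfiles account instead.
RewriteRule ^/(.*)$ /home/commonfiles/public_html/$1 [L]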
That is maybe worth a separate question.

Codeigniter Shared Resources - Opinions Wanted

I run multiple websites all running off of a single installation of CodeIgniter on my server (separate application directories and a single system directory). This has been working fabulously and I don't see any reason to change it at this point.
I find myself writing library classes to extend/override CI all the time, and often, when I fix a bug or improve efficiency, I have to go back to several websites to make the same adjustment, at the risk of a typo that breaks one of them. Because of this I have to change each file and then test that site for bugs.
I have been pondering a solution of using a single libraries directory in a central location and symlinking all of my websites to that central directory. Then when I make a file change it will immediately propagate to all of the downstream websites. It will still require that I test each one for errors, but I won't have to make the changes multiple times. Anything that is specific to a single website will either be a non-shared file (still in the linked directory just not used elsewhere) or can be put in a local helper.
Also, I keep separate 'system' directories by CI version so I can migrate my websites independently if necessary; this central libraries directory would be attached to a specific version to reduce possible breakage.
Does anyone see potential issues or pitfalls from taking this approach? Has anyone accomplished this in another direction that I should consider?
Thanks in advance!
I think this actually makes sense :] Go for it. Even the official CodeIgniter docs mention that it's possible.
Also, I don't see a single reason why there should be any problem.
Edit: they touch on the problem of multiple sites here: http://codeigniter.com/user_guide/general/managing_apps.html
also:
http://codeigniter.com/wiki/Multiple_Applications/
http://www.exclusivetutorials.com/setting-multiple-websites-in-codeigniter-installation/
How to Handle Multiple Projects in CodeIgniter?
http://codeigniter.com/forums/viewthread/56436/
I have a single system directory and separate application directories for my CI apps. In order to share libraries and some view templates between my apps, I created a "Common" directory in the same folder as the CI system, with the same structure as a regular app folder, and used symlinks; alternatively, you can modify the Loader class so that it looks in the Common folder too. My setup looks something like this:
/var/CodeIgniter/
/var/Common/
/var/Common/config/
/var/Common/controllers/
...
/var/Common/libraries/
...
/var/www/someapp/
/var/www/someotherapp/
...
I'm not sure how you handle publishing your sites (assuming you actually do any of that), but I'd look into version control. For example, in SVN you can make an external to another SVN directory (or file) and then just update the current directory, which grabs the external file. This approach has one benefit over the others: when you modify the common library, the other sites aren't immediately affected. This prevents unwanted breakage before you have time to go and test all the sites using the common library. You can then just update each site's folder whenever you are ready to test the changes. This is "more work", but it prevents code duplication AND unwanted breakage.
I wrote a MY_Loader to do exactly that.
http://ellislab.com/forums/viewthread/136321/
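For reference, a minimal sketch of that idea using CodeIgniter 2.x's package paths; the /var/Common path follows the layout above, and you should verify the search-order precedence against the CI version you run:

<?php
// application/core/MY_Loader.php
class MY_Loader extends CI_Loader {

    public function __construct()
    {
        parent::__construct();
        // Add the shared Common folder to the loader's search paths.
        $this->add_package_path('/var/Common/');
    }
}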

How to keep relative filepaths intact, even though I'm bootstrapping

I'm trying to include a premade messageboard, Phorum, into one of my Zend projects. Phorum is a relatively large and intricate web of PHP includes. My website already has a signup, so I'm trying to combine the two login systems into one.
What I've done is make a controller that includes the Phorum index.php. This lets me use the authentication system I've set up. My problem is that, since I'm bootstrapping, all the relative file paths within Phorum's index.php resolve relative to my Zend project's index.php, so they all break.
Is there a way around this? Should I include? Render? Something better?
All help is appreciated.
Adding the appropriate chdir() (back to Phorum's include-path root) in your Bootstrap file should do it. You then of course have to take care that your Zend app uses application-specific, not relative, paths.
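A minimal sketch of that in a wrapping controller action, assuming Phorum lives in a phorum/ folder next to the application directory (the path and action name are made up):

<?php
// Inside the wrapping controller.
public function forumAction()
{
    // Remember the current directory so the rest of the request is unaffected.
    $previous = getcwd();

    // Make Phorum's relative includes resolve against its own folder.
    chdir(APPLICATION_PATH . '/../phorum');
    include 'index.php';

    chdir($previous);
}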
You can use the include_path setting, which can be set at runtime using set_include_path(). This doesn't require you to change the working directory, but makes PHP search for includes in all the directories specified in this setting.
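The include_path variant, under the same assumed phorum/ location:

<?php
// Prepend Phorum's folder so its relative includes are found without a chdir().
set_include_path(
    APPLICATION_PATH . '/../phorum' . PATH_SEPARATOR . get_include_path()
);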
