I'm in the middle of making my own custom forum system software, much like phpBB, MyBB, vBulletin, etc., except it's obviously far less advanced. It's just a personal project for myself, and I've run into some problems since I've never had to develop something that can be repackaged for others.
The file structure is as follows:
So, config.php is the be-all and end-all of included files. It holds the database connection information and instantiates my database class, and none of the function files require/include anything themselves, since they're only ever accessed where config.php has already been required.
HERE'S THE QUESTION!
However, I'm running into simple but very annoying problems. For example, near the top of config.php I call a function that checks the user's cookie values and makes sure they all belong to the same user, deleting the cookies if not; but that call has to come after the database files are required. And things like a variable declared in config.php not always being accessible, so sometimes I have to declare it again in the header files.
Seems like it's not much of a question, but I guess I'm really asking how I can include/require in general without running into these issues.
As a general note, most people don't mix config variables and code in one file. If you look at popular open-source packages like WordPress, the config file just has config variables set. No code.
If you're using certain functions in anything more than a "one off" situation, you may want to consider putting them into your main class - that way they're available as needed.
@James is right, separate your config file. You can include it inside an "application.php" required file (so it's available globally).
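A minimal sketch of that split (the file names, Database class, and cookie-check function here are made up for illustration, not taken from the question):

<?php
// config.php - configuration values only, no executable logic
define('DB_HOST', 'localhost');
define('DB_USER', 'forum');
define('DB_PASS', 'secret');
define('DB_NAME', 'forum');

<?php
// application.php - required by every entry point
require_once dirname(__FILE__) . '/config.php';
require_once dirname(__FILE__) . '/classes/Database.php'; // hypothetical class file

$db = new Database(DB_HOST, DB_USER, DB_PASS, DB_NAME); // hypothetical constructor

// Only now, with the database ready, check the user's cookies.
validate_user_cookies($db); // hypothetical function doing the cookie check from the question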
I have run into a situation where I absolutely needed HTTP Header information prior to page build. Though it seemed a little backward, the solution was to call that file first, then include the application.php file. Checking for a cookie should be fine.
In another situation, @include('myStubbonPricing.php') was the answer. I'm not an advocate of error suppression, but in my case it only output a shipping rate (if the zip code was entered). In my defense, !isset and the like would not fix the problem due to an XML request/response scenario.
I'm looking to centralize a lot of my web applications code, so that multiple components have access to the same core functionality. This is how I have the website set up:
/var/www/website - domain.com
/var/www/subdomain1 - subdomain1.domain.com
/var/www/subdomain2 - subdomain2.domain.com
Naturally I've had a lot of trouble when it comes to the duplication of common functionality, as any changes made to one area would also need to be applied to other areas. My proposed solution is to create a new directory in /var/www which will contain all of the core scripts:
/var/www/code - core code
I would then set the PHP include directory to /var/www/code, so scripts can include these files without having to specify the absolute path.
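For example, something like this at the top of each entry script (just a sketch of the idea; the file name is made up):

// prepend the shared directory to the include path at runtime
set_include_path('/var/www/code' . PATH_SEPARATOR . get_include_path());
include 'common_functions.php'; // hypothetical file, resolved against /var/www/code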
Can you think of any more efficient ways of centralizing the code?
Many thanks!
Your approach is good enough for this purpose.
A little suggestion:
store your front-end scripts in a directory like /var/www/website/www instead of /var/www/website. The index file, AJAX processors, and scripts like that will live there, while your project-based includes (as well as other miscellaneous stuff) are stored in a directory like /var/www/website/includes. It is a simple yet effective defense against attacks on your include files.
So, your document roots will be /var/www/website/www (domain) and /var/www/website/subdomain/www/ (subdomain).
It seems that you are thinking correctly:
Share Code between multiple PHP sites
It's only a suggestion, but you should put the public content in /var/www/* (which may end up being publicly accessible, either because of your HTTP server or because of some misconfiguration) and create some other directories for your shared code/libs, like /usr/local/lib/php/*.
For more security, you could fence things in with open_basedir, adding the private and public dirs, as well as the upload and session dirs.
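For example, a php.ini (or per-vhost) line along these lines; the exact paths depend on your layout:

open_basedir = /var/www/website/www:/usr/local/lib/php:/tmp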
And don't forget to version your libs, e.g.:
/usr/local/lib/php/myLib-1.0
/usr/local/lib/php/myLib-1.2
etc.
Thus, you'll be able to make changes without breaking everything.
I have a series of web sites all hosted on the same server with different domains. I want to host some common PHP scripts and then be able to call these from the other domains.
I am a bit fresh with my PHP, so please excuse the code attempts. I have tried iterations of the following, which may help you understand what I am aiming for!
From within PHP tags...
include('http://www.mydomain/common_include.php?show_section=$section');
$show_section = $_GET['show_section'];
include('http://www.mydomain/common_include.php'); // Then $show_section would be available to the included file/code
Finally, I have tried pulling in the include which contains a function, then trying to run that function from the parent script.
I would much prefer to keep this PHP orientated rather than getting involved with the server (file systems etc.), but I can change permissions etc.
I could, but would prefer not to, just upload the same library to each of the domains separately.
I understand PHP is run on the server, hence it may be problematic to include scripts from another server.
Thanks in advance.
EDIT
OK OK - I get that it's bad practice so I will not do it.... THANKS VERY MUCH FOR THE QUICK ANSWERS.
However, are there any other recommendations for how to essentially show this basic PHP app on all of the sites without having to add the files to the root of each site? Just to prevent massive script duplication... (thinking out loud: call the scripts in from a DB, or any other solutions)
Again thanks for your assistance
That would be a huge security risk if you could just include remote PHP files in your own projects. The PHP gets parsed before the server sends it to you, so cross-domain includes would only contain the output the script generates. The only way to include PHP files so that they can be executed is via the local filesystem.
If you look at PHP.net's documentation about include, you can find this:
If "URL fopen wrappers" are enabled in PHP (which they are in the default configuration), you can specify the file to be included using a URL (via HTTP or other supported wrapper - see List of Supported Protocols/Wrappers for a list of protocols) instead of a local pathname. If the target server interprets the target file as PHP code, variables may be passed to the included file using a URL request string as used with HTTP GET. This is not strictly speaking the same thing as including the file and having it inherit the parent file's variable scope; the script is actually being run on the remote server and the result is then being included into the local script.
Which pretty much explains the whole thing.
The root of the original question seemed to be the poster's concern about using a PHP script or plugin on multiple sites and then facing an onerous task each time it needs to be updated. While trying to include PHP files across sites is a bad idea, it is a better plan to structure your script to be as self-contained as possible. Keep the entire plugin contained in one directory, and ensure your function calls to utilize it are as well formed as possible: clean, well-named functions, uniform naming conventions, and a well-thought-out plan for what parameters each function needs. Avoid using global variables.
Ideally you should then have quite an easy time each time you need to update the plugin/script in all locations. You can even set up an automated process that will upload the new directory containing the plugin to each site replacing the old one. And the function calls within your code should rarely if ever change.
If your script is big enough you might implement an automatic update process like the more recent versions of Wordpress use. Click a button and it updates itself. In the past, updating a dozen sites running Wordpress (as an example) was a massive pain.
That is very bad practice.
Actually, you're not including PHP but just its HTML output.
Include files, not URLs. That is possible for sites on the same server.
Just use an absolute path to these files.
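For example, reworking the attempt from the question (the path is illustrative):

$show_section = $_GET['show_section'];
include '/var/www/shared/common_include.php'; // local path on the same server
// the include shares this scope, so $show_section is visible inside it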
Apart from the fact that it's bad practice, you should first check whether include even allows URLs, if you really want to do that.
If, however, all the sites that need to use the script are on the same server, you could put the script somewhere in a directory accessible by the user that executes PHP, and add that dir to the php.ini include_path property (this can also be done at runtime).
(Or you could create a PHP extension and load it as an extension.)
If you have root rights on that server, you could just use an absolute path from the filesystem root, but most hosts won't let you do this.
Just something I wonder about when including files:
Say I want to include a file, or link to it. Should I just for example:
include("../localfile.php");
or should I instead use
include("http://sameserver.com/but/adirect/linkto/localfile.php");
Is one better than the other? Or more secure? Or is it just personal preference?
Clearly it would be a necessity if you had a file that you include into files in multiple directories, and THAT file includes a different file. Or is there some other way of doing that?
Reading a file is much faster than making an HTTP request and getting the response. Never include(a_uri) if you can help it.
Use $_SERVER['DOCUMENT_ROOT'] if you want to calculate a complete file path for your include.
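For example (the includes directory is an assumption):

include $_SERVER['DOCUMENT_ROOT'] . '/includes/localfile.php';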
As said before, definitely include a local file rather than making an HTTP request (which takes more time, is not cached, and whose contents are technically viewable by the whole world, if someone knows where to look for it).
One more small detail: if you use full paths to your included files, it will even be faster than relative paths, especially if you use some kind of bytecode cache.
Definitely include the local file. The PHP script doesn't really know or care that you're including a script on your local server, so the URL path causes an HTTP request, and network latency from HTTP requests is pretty much the bottleneck for rendering any HTML page in general; the fewer of them you have, the better off you're going to be.
Personally, I try to avoid plain include and require in general, in favor of require_once, because require_once encourages writing reusable code rather than code that executes immediately when you include it. Pull in class definitions, pull in function libraries, but try to avoid code that runs on include, because that makes it harder to reuse.
If your question is about keeping it so you don't have to change a billion paths when you move from staging to production, go with this little tidbit I learned:
define('BASE_DIR', '/path/to/root/');
Then use BASE_DIR in all of your path references. When it's time to move your site, just change that definition to the new path (which should just be / at that point).
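For example (the file names are made up):

require_once BASE_DIR . 'includes/database.php';
require_once BASE_DIR . 'includes/header.php';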
In addition to what other people say, these invocations will have different results, since the remote invocation will include the PHP output, not the file contents. Unless you stop PHP from processing the file, in which case you're exposing your code to the world, which is also not necessarily what you actually want.
Always include locally, because if you include remotely someone can put up a different file and do nasty things. The other problem is that you can't really test things with remote includes. As far as I know you should use require_once instead...
The question might prompt some people to say a definitive YES or NO almost immediately, but please read on...
I have a simple website where there are 30 php pages (each has some php server side code + HTML/CSS etc...). No complicated hierarchy, nothing. Just 30 pages.
I also have a set of purely back-end php files - the ones that have code for saving stuff to database, doing authentication, sending emails, processing orders and the like. These will be reused by those 30 content-pages.
I have a master php file to which I send a parameter. This specifies which one of those 30 files is needed and it includes the appropriate content-page. But each one of those may require a variable number of back-end files to be included. For example one content page may require nothing from back-end, while another might need the database code, while something else might need the emailer, database and the authentication code etc...
I guess whatever back-end page is required can be included in the appropriate content page, but one small change in the path and I have to edit tens of files. It would be too cumbersome to check which content page is requested (switch-case type of thing) and include the appropriate back-end files in the master PHP file. Again, I'd have to make many changes if a single path changes.
Being lazy, I included ALL back-end files in the master file so that no content page can request something that is not included.
First question: is this a good practice, and is it done by anyone at all?
Second, will there be a performance problem or any kind of problem due to me including all the back-end files regardless of whether they are needed?
EDIT
The website gets anywhere between 3000 - 4000 visits a day.
You should benchmark. Time the execution of the same page with different includes. But I guess it won't make much difference with 30 files.
But you can save yourself the time and just enable APC in the php.ini (it is a PECL extension, so you need to install it). It will cache the parsed content of your files, which will speed things up significantly.
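For example, after installing the extension through PECL, roughly these php.ini lines (the exact extension file name varies by platform):

extension=apc.so
apc.enabled=1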
BTW: There is nothing wrong with laziness, it's even a virtue ;)
If your site is object-oriented I'd recommend using auto-loading (http://php.net/manual/en/language.oop5.autoload.php).
This uses a magic method (__autoload) to look for a class when needed (it's lazy, just like you!), so if a particular page doesn't need all the classes, it doesn't have to get them!
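A minimal sketch of such an autoloader (the classes/ directory and naming scheme are assumptions):

function __autoload($class_name) {
    // runs only when an undefined class is first used
    require_once 'classes/' . $class_name . '.php';
}

$db = new Database(); // triggers __autoload('Database') if the class isn't loaded yet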
Again, though, this depends on if it is object-oriented or not...
It will slow down your site, though probably not by a noticeable amount. It doesn't seem like a healthy way to organize your application, though; I'd rethink it. Try to separate the application logic (e.g. most of the server-side code) from the presentation layer (e.g. the HTML/CSS).
It's not a bad practice if the files are small and contain just definitions and settings.
If they actually run code, or are extremely large, it will cause a performance issue.
Now, if your site has 3 visitors an hour, who cares; if you have 30,000... that's another issue, and you need to work harder to minimize that.
You can mitigate some of the disadvantages of PHP code compilation by using XCache. This PHP module caches the PHP opcode, which reduces compile time and improves performance.
Considering the size of your website: if you haven't noticed a slowdown, why try to fix it?
When it comes to larger sites, the first thing you should do is install APC. Even though your current method of including files might not benefit as much from APC as it could, APC will still do an amazing job speeding stuff up.
If response speed is still problematic, you should consider including all your files. APC will keep a cached version of your source files in memory, but can only do this well if there are no conditional includes.
Only when your PHP application is at a size where memory exhaustion is a big risk (note that for most large-scale websites memory is not the bottleneck) might you want to conditionally include parts of your application.
Rasmus Lerdorf (the man behind PHP) agrees: http://pooteeweet.org/blog/538
As others have said, it shouldn't slow things down much, but it's not 'ideal'.
If the main issue is that you're too lazy to go changing the paths for all the included files (if the path ever needs to be updated in the future), then you can use a constant to define the path in your main file, and use the constant any time you need to include/require a file.
define('PATH_TO_FILES', '/var/www/html/mysite/includes/go/in/here/');
require_once PATH_TO_FILES.'database.php';
require_once PATH_TO_FILES.'sessions.php';
require_once PATH_TO_FILES.'otherstuff.php';
That way if the path changes, you only need to modify one line of code.
It will indeed slow down your website, mostly because of the relatively slow loading and processing of PHP. The more code you include, the slower the application will get.
I live by "include as little as possible, as much as necessary", so I usually just include my config and session handling for everything, and then each page includes just what it needs using an include path defined in the config include; that way, for path changes you still only need to change one file.
If you include everything, the slowdown won't be noticeable until you get a lot of page hits (several hits per second), so in your case just including everything might be OK.
I was going to ask what the best way to do this is, but then decided I should ask whether or not it is even necessary. I have never seen it done in JSP development, but it appears to be common practice in PHP. What is the reasoning behind this, and if I do not protect against this, what else should I be taking into consideration?
The reason this is more common in PHP than other similar languages has to do with PHP's history. Early versions of PHP had the "register_globals" setting on as a default (in fact, it may not have even been a setting in really early versions). Register_globals tells PHP to define global variables according to the query string. So if you queried such a script thusly:
http://site.com/script.php?hello=world&foo=bar
... the script would automatically define a variable $hello with value "world" and $foo with value "bar."
For such a script, if you knew the names of key variables, it was possible to exploit the script by specifying those variables on the query string. The solution? Define some magic string in the core script and then make all the ancilliary scripts check for the magic string and bail out if it's not there.
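A sketch of that pattern (the constant and file names are made up):

// index.php - the core script
define('IN_APP', true);
require 'ancillary.php';

// ancillary.php - bail out unless pulled in by the core script
if (!defined('IN_APP')) {
    exit('No direct access');
}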
Thankfully, almost nobody uses register_globals anymore, but many scripts are still very poorly written and make stupid assumptions that cause them to do damage if they are called out of context.
Personally, I avoid the whole thing by using the Symfony framework, which (at least in its default setup) keeps the controllers and templates out of the web root altogether. The only entry point is the front controller.
If you keep your includes outside the web root, it's not an issue, as nothing can be loaded directly.
Well, this is to prevent sensitive includes from being requested from the web server directly. It's certainly not an all-inclusive security measure, but it could help with your particular setup.
If, however, your user were in a position to include the file from their own script, it won't help at all.
I emit a 404 page, not as a serious security measure but only because I don't like leaking information about the internals of a site, even the names of internal files.
But if the file just contains functions then there's no real harm in omitting the check.
It also isn't just a security feature in PHP, but more a consequence of how many MVC-based PHP sites function. If, for example, in SugarCRM you were to call a module file directly, the page load would fail because the controller, view, and model were not previously loaded, and you'd have no DB config/connection information either; so to make sure all dependencies are loaded, the user is forced through a known entry point, i.e. index.php.
I just found an approach in the .NET MVC system that you could replicate for PHP using Apache rewrites, .htaccess files or, if you are using IIS, a web.config file.
As the MVC pattern doesn't need the user to directly access .aspx files, these are not served and a 404 is sent instead. If you have a naming convention for included files ("inc.php" for example), you could redirect *.inc.php requests to a 404 for specific folders; in Apache Rewrite, supplying R=404 at the end of the rule will return that HTTP status to your client.
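For example, a rule along these lines in an .htaccess file (the naming convention is an assumption):

RewriteEngine On
RewriteRule \.inc\.php$ - [R=404,L]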
Some of these examples may help: Apache Rewrite Examples
As already mentioned in some of the other answers, you shouldn't need to do this. If a file isn't supposed to be served up by the web server, you shouldn't leave it within the web folder. Includes should be placed in a directory outside the web root.
Apart from that, the proper way to tell the user that a page doesn't exist, is by emitting a status 404, using:
header("HTTP/1.0 404 Not Found");
exit;
If you don't do this, it is hard for non-humans (e.g. search engines) to distinguish between a regular page and a non-page.
This is very important, because if you are editing your site while running the Google Toolbar, it can find your internal PHP files and put them into search results. At best this will create an awkward experience for users, but if you are a sloppy programmer, it could reveal database connection information.