Quick...possibly stupid question.
When allow_url_include is set to off, does that prohibit other computers from remotely including the files on my site, or does it say that I'm not allowed to remotely include files from other sites?
Settings in your php.ini only affect your own PHP installation.
Setting allow_url_include to Off in your configuration means that the PHP code on your server will not be able to include remote files.
But it doesn't change anything for other servers, which can still request files from your server, just like anyone on the web can. Note, though, that if a PHP file is requested, it will be interpreted, and only its output (not its content!) will be sent.
As said in the PHP manual:
http://www.php.net/manual/en/filesystem.configuration.php#ini.allow-url-include
allow_url_include is about your server including remote files; others can only include your PHP source if you expose it yourself, for example through a misconfigured directory.
Then there is the second suggestion:
It is always impossible to include the PHP source of a properly configured server, regardless of this configuration setting.
At the same time, no PHP configuration setting can forbid HTTP clients from requesting resources from your HTTP server.
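To make the distinction concrete, here is a minimal sketch (example.com is a placeholder, not a real endpoint):
<?php
// With allow_url_include = Off, a remote include simply fails with a warning:
// include 'http://example.com/library.php';

// Requesting the same URL (allow_url_fopen permitting) only ever returns the
// script's output, never its PHP source:
$output = file_get_contents('http://example.com/library.php');
var_dump($output); // rendered HTML/text, not PHP code
?>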
Related
I have several location-specific websites, and I have one file that lists all of the communities. I am calling it up on the other sites using
include('/otherdomain/file.php');
That has worked for years, but now I am having permission issues.
By default it's not possible to include a PHP file from a remote server, for security reasons, and it's not recommended to do so.
But if you want to do that, you must enable allow_url_include in your php.ini file (set it to On); it's disabled by default.
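If you do go down that road, it can help to check the setting at runtime before relying on it. A minimal sketch, where the URL and file names are placeholders, keeping in mind that allow_url_include is a system-level directive that cannot be changed with ini_set():
<?php
if (filter_var(ini_get('allow_url_include'), FILTER_VALIDATE_BOOLEAN)) {
    // Remote include: you receive the remote script's output, not its source.
    include 'http://otherdomain.example/file.php';
} else {
    // Safer fallback: keep a local copy of the file and include it by path.
    include __DIR__ . '/file.php';
}
?>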
I have a config file (config.php) containing lots of functions on my online server. I decided to include this file in a project I'm working on locally by simply coding
<?php include("http://example.com/config.php") ;?>
And I have a function defined as
<?php
function restrict() {
    // content
}
?>
This function is not working, despite the fact that I've included the file.
To allow inclusion of remote files, the directive allow_url_include must be set to On in php.ini
But that is a security problem, which is why it is generally disabled (I've never seen it enabled, actually).
In any case, you have not included the source code of the PHP file, only the output it produces, so it won't work: a web server like Apache, or nginx + FPM, interprets the source code before sending anything.
You could serve the file as-is, without any interpretation (by disabling mod_php on Apache, for instance), but you shouldn't: exposing the source code is always a bad idea and must be avoided.
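To see why restrict() stays undefined, here is a minimal sketch (example.com is the placeholder host from the question):
<?php
// The remote include only pulls in whatever the remote server outputs,
// not the PHP source that defines restrict():
include "http://example.com/config.php";
var_dump(function_exists('restrict')); // bool(false)

// Working alternative: keep a local copy of config.php and include it by path.
require __DIR__ . '/config.php';
var_dump(function_exists('restrict')); // bool(true)
?>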
I am a PHP newbie and I have a PHP security question. Is it possible for somebody to get the source code of a PHP script running on a server with the default configuration? If so, what is the best way to be protected? I am asking because I happened to download a PHP file when I requested a page from a site, which is what triggered my concerns. I think the Apache configuration may have been wrong and served that file to me as a plain file, but I am not sure. Also, what is the best place to keep "sensitive" data such as database or SMTP configuration?
Thank you,
Alex
For the most sensitive information, I'd suggest putting it outside of your web root folder and including it through require or include. This way, even if some configuration gets botched on the server, the visitor will only get served the line include('secret_stuff.php'); and not the actual script.
Exactly what David Dorward said, but I would advise you to take a look at the following patch, which modifies Apache so that it does not send source code if there is a misconfiguration.
http://mirror.facebook.net/facebook/patches/ap_source_defense.patch
Patch like so:
cd apache-1.3.x
patch -p1 -i ap_source_defense.patch
More Patches from Facebook Development Team: http://mirror.facebook.net/facebook/patches/
The best way to protect your source files is to place them outside the public root directory: Apache will not serve files that sit above public_html.
for example:
C:/server/apache/
C:/server/apache/htdocs/
C:/server/apache/htdocs/includes/
With the first structure, people can view the files directly by going to
http://hostname.tld/includes/
but having the directory structure of:
C:/server/apache/
C:/server/apache/includes/
C:/server/apache/htdocs/
and then within
C:/server/apache/htdocs/index.php
you have
<?php
require_once('../includes/config.php');
?>
This should protect all major files, bar the view file (index.php).
If the server is not configured to handle PHP files, then it will treat them like any other unknown file (and serve them as either text/plain or application/octet-stream).
PHP support is, as far as I know, always provided as an extension or external program (for CGI, FastCGI, etc) and never as a built in for an HTTP server.
If the server is properly configured to run PHP code, then people without direct access to the server cannot view the PHP source code. You don't have to do anything else.
It is only because that server was not configured to run PHP, and instead served it as text, that you could see the source.
If you have this line in your Apache httpd.conf file,
AddType application/x-httpd-php .php
Apache will pass .php files to PHP for processing, rather than showing their contents...
Also, you need to make sure the PHP service is actually running.
What you describe as "default configuration" is a webserver without php installed (or with php disabled). In these cases, it is certainly possible to download the php script.
Make sure php is installed (as it will be on ~100% of production php servers) and/or block access to your configuration file with an .htaccess file like this:
<FilesMatch "^config\.php$">
Order allow,deny
Deny from all
</FilesMatch>
If you want to be extra-tricky (and work even on servers where .htaccess files are ignored), prefix the configuration file with .ht, like .ht.config.php. Most Apache (and some other web server) configurations will refuse to serve files beginning with .ht. However, in general, the only way to be sure no web server serves your file is to move it to a directory outside of the server's document directory. On most hosts, though, you or your PHP script won't be able to access those.
Your second problem is misconfiguration. There's not much you can do about that, although there might be ways to construct a RewriteRule to prevent accidental accessibility.
The best prevention, however, is to keep all scripts outside of the DOCUMENT_ROOT. Just leave a single index.php there, and include all dependencies from it. This is also the best strategy to avoid leaking configuration data (and don't use .ini files for sensitive data; use .php scripts instead).
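A minimal sketch of that layout (the paths are assumptions, not a requirement):
/var/www/app/config.php   <- dependencies and sensitive data, outside DOCUMENT_ROOT
/var/www/htdocs/index.php <- the only script the web server exposes
<?php
// htdocs/index.php
$config = require __DIR__ . '/../app/config.php'; // config.php returns a plain PHP array
// ... bootstrap the rest of the application from here ...
?>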
Another worry is shared hosting servers, however. All concurrent users on a server can read your scripts (if not through PHP, then via Perl/bash CGIs). There is nothing you can do about that, unless you switch to a professional host that runs everything through suexec and thus allows individual permissions.
Well, "default configuration" is a vague term, but as long as the web server is configured to parse .php files through PHP, you should be fine from that angle. If your scripts themselves happen to parse other PHP files (for eg. a template system) then you need to make sure there aren't any loopholes in that too, since the PHP scripts have full access to your files on the server.
Assuming these are taken care of, you don't need to keep the "sensitive" data in any special place -- just put them in your .php files, but make sure all your scripts end in .php (for eg. config.inc.php) so that they are always parsed via PHP and not sent as plain text.
I'm having some issues debugging this in PHP. When I include this line:
require_once("http://" . $_SERVER["HTTP_HOST"] . "/dompdf/dompdf_config.inc.php");
what I get is just a blank page; I don't get any HTML code as a response. Maybe the error messages are hidden?
Quite often, when you get a WSOD (white screen of death), it's because there's a Fatal Error, and it's not displayed on the standard output -- i.e. the generated page.
To have it displayed, you need to:
set error_reporting to the right level
and enable display_errors
An easy way is to do that at the top of your PHP script, with a portion of code like this one:
error_reporting(E_ALL);
ini_set('display_errors', 'On');
In your specific case, you are trying to include/require something via HTTP, which is often disabled.
See the allow_url_include directive, about that.
A possibility would be to enable that one in your PHP configuration... But it's generally not considered a good idea: it's disabled for security reasons.
And sending an HTTP request to include a file is slow -- and it means your application will not work anymore if the remote server doesn't answer!
Also, here, you are trying to include a file from a remote server that is $_SERVER["HTTP_HOST"]...
... So, you are trying to include a file from a remote server that is, in fact, your own server? i.e. not a remote one?
If so, you should not try to include via HTTP; instead, you should work with a local file, this way (it will need some tuning):
require_once dirname(__FILE__) . "/dompdf/dompdf_config.inc.php";
This way:
No unneeded network request (you'll just read from the local disk) => faster and safer
And no need to enable allow_url_include
I should also add:
When including a local .php file, the content of the .php file is included in your page, as if it were copy-pasted
When including a .php file via HTTP, chances are that the remote server will interpret the PHP code, and only send you the output back
Which means it's not the PHP code that will get included by your script
But only the output you'd get by executing that PHP code!
You should not require/include a remote file like this. Instead provide the local absolute or relative path.
Though insecure and not recommended, it is technically possible to do if certain configuration options are set. (allow_url_include)
See the other answers regarding display_errors for future debugging concerns. I often use the PHP command-line interpreter to get the real error, without allowing error details to be presented to web visitors.
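For example (script.php is a placeholder name), a quick syntax check and a CLI run with errors forced on will show the real error without exposing anything to visitors:
php -l script.php
php -d display_errors=1 -d error_reporting=-1 script.php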
This is a very unusual and insecure way to include files, but if you still want to use it, make sure that the file you're including isn't being executed on the remote server, since you are presumably after the PHP source code with this require_once, not its final output.
The parameter to the require_once statement should be a file path, not a URL.
You are telling your web server to load a file from its own file system, not telling the client to fetch the file from the web.
It is documented on the include statement page.
Try adding this as the first line of your script (after the <?php obviously):
error_reporting(E_ALL);
I am unable to include a remote PHP file in my PHP script. I suppose my hosting changed php settings.
The code I was using was:
include "http://domain.com/folder/file.php";
How do I enable remote includes using php.ini/.htaccess?
Is there any other workaround?
Thanks.
To allow inclusion of remote files, the directive allow_url_include must be set to On in php.ini
But it is bad from a security point of view, and so it is generally disabled (I've never seen it enabled, actually).
It is not the same as allow_url_fopen, which deals with opening (not including) remote files -- that one is generally enabled, because it makes fetching data through HTTP much easier (easier than using curl).
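For instance, with allow_url_fopen enabled you can fetch (not include) a remote resource in a single call; using the URL from the question:
<?php
$output = file_get_contents('http://domain.com/folder/file.php');
// $output holds the remote script's rendered output, which you can parse or cache locally.
?>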
To use remote includes, the allow_url_fopen and allow_url_include options must be enabled in php.ini.
Be aware that if the remote server is php-enabled, you'll get the output of that remote script, not the script itself. If you do want to fetch the source, you could add a symlink on the remote server, e.g. ln -s file.php file.php.source and then make your include reference file.php.source instead.
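For example, using the domain from the question and assuming the remote server serves the .source symlink verbatim, the include then pulls in real PHP code and executes it locally (allow_url_include must still be On):
<?php
include "http://domain.com/folder/file.php.source"; // raw source, interpreted on the local side
?>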