How to disable direct PHP file execution in subdirectories - php

I have noticed an issue.
I have a server running a web application, and at the moment I can access PHP files directly by URL, like this:
http://superapp.com/libs/mongo.php
How can I restrict direct execution of all PHP files except index.php in the app root directory?
I would be grateful for any help.

Generally speaking, people accomplish this by putting the application code above (../) the public webroot. Basically, your wwwroot folder has an index.php which includes something from ../MyApp/Bootstrap.php and maybe calls a function to kick things off.
You can disable PHP execution on a per-folder basis; however, that is probably not what you want, as it will deliver the source code to the requesting user. You could instead use .htaccess or similar to refuse requests that are not for index.php, as in the sketch below.
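As a rough sketch of that .htaccess approach (assuming Apache 2.2-style access directives and that the server honours .htaccess overrides; the file goes in the app root so it is inherited by subdirectories), something like this refuses direct requests for any .php file except index.php:

# .htaccess in the app root: block direct requests to PHP files...
<FilesMatch "\.php$">
Order allow,deny
Deny from all
</FilesMatch>
# ...but still allow the front controller.
<FilesMatch "^index\.php$">
Order deny,allow
Allow from all
</FilesMatch>

Files that index.php pulls in with include/require are unaffected, because includes happen on the filesystem rather than over HTTP. Note this also permits an index.php sitting in a subdirectory, so the cleaner fix is still to move the library code above the webroot.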

Related

Apache restricting folder for script access only

I have a folder called data which has .ini files that my scripts access. I've put a .htaccess in the data folder with deny all so that the folder cannot be accessed directly. This way only my scripts on the server can access it.
Is this .htaccess method good enough? I've also tested chmodding the folder to 700, which seems to do exactly the same thing: it denies public access but still allows scripts to access the files.
Should I just use both methods, or is one method better than the other?
Ideally, I assume it would be better to place the files outside the www folder but I do not have access to that.
The .htaccess solution will stop clients from reading the files via a web browser or similar:
deny from all
The advantage of the .htaccess file is that a future developer or admin would see the file and know you had deliberately chosen to protect that code in a well-recognised way. This could help avoid issues down the line, for example if the site were copied to a new server.
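To make the setup concrete (the file names here are illustrative), the .htaccess inside data/ contains nothing but the deny rule (on Apache 2.4 the equivalent directive is Require all denied), while a script one level up keeps reading the files straight from disk:

<?php
// config.ini sits in data/, which Apache refuses to serve over HTTP,
// but PHP reads it from the filesystem without going through Apache.
$settings = parse_ini_file(__DIR__ . '/data/config.ini', true);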

How to send header:location to an out-of-web-root file

I'm making a web application which will only allow registered members to download zip folders from a folders directory.
I really need to know the proper way to secure the folder, since only members stored in my database should be able to access the files; as it stands, if somebody finds the directory and a file name, there's nothing to stop them accessing it.
I've been doing some research and found some approaches but they all have major drawbacks.
1.) Put the files outside of the webroot, then use readfile to send the data.
This is how I currently have it set up. The major drawback is that I'm on a shared server where the max execution time for a script is 30 seconds (and can't be changed), so if the file is big or the user's connection is slow, the script times out before the download completes.
2.) .htaccess and .htpasswd inside a webroot directory.
The problem with this is I don't want to have to ask the user to enter a password again, unless there's a way for PHP to send the password and then send a header to the actual zip file that needs to be downloaded.
3.) Keeping the files in the webroot but obfuscating the file names so they are hard to guess.
This is just totally lame!
What I would really like to do is keep the files outside of the web root and then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it. Is there a way around this? Is there a way to redirect to an out-of-web-root file with header:location('/file') to force a download, thus letting Apache serve the file rather than PHP with readfile?
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across? Has anybody experienced this problem before, and is there an industry-standard way to do this better?
I know this may resemble a repeat question, but none of the answers in the other, similar question gave any useful information for my needs.
What I would really like to do is keep the files outside of the web root and then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it.
More to the point, it is outside the web root so it doesn't have a URL that the server can send in the Location header.
Is there a way around this? Is there a way to redirect to an out-of-web-root file with header:location('/file') to force a download?
No. Preventing the server from simply handing over the file is the point of putting it outside the web root. If you could redirect to it, then you would just be back in "hard to guess file name" territory with the added security flaw of every file on the server being public over HTTP.
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across?
Your options (some of which you've expressed already in the form of specific implementations) are:
Use hard to guess URLs
Put the file somewhere that Apache won't serve it by default and write code that will serve it for you
Use Apache's own password protection options
There aren't any other approaches.
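A minimal sketch of the second option (the storage path and the is_logged_in() check are placeholders for your own file location and membership logic):

<?php
// download.php - serve a file stored outside the web root to authenticated members.
if (!is_logged_in()) {                                  // placeholder for your own check
    header('HTTP/1.0 403 Forbidden');
    exit;
}
$file = '/home/example/protected_files/archive.zip';    // outside the document root
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="archive.zip"');
header('Content-Length: ' . filesize($file));
readfile($file);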
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across?
No, there isn't an easier way (but that said, all three implementations you've described are "very easy").
Another approach, which I consider really dirty but might get around your resource constraints:
Keep the files outside the web root
Configure Apache to follow symlinks
On demand: Create a symlink from under the web root to the file you want to serve
Redirect to the URI of that symlink
Have a cron job running every 5 minutes to delete old symlinks (put a timestamp in the symlink filename to help with this)
It's effectively a combination of the first two options in the bulleted list above (a rough sketch follows).
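Here is that symlink trick sketched in PHP (all paths and names are illustrative, and it assumes Apache allows Options +FollowSymLinks for the downloads directory):

<?php
// After verifying the member may download, expose the file briefly via a symlink.
$target = '/home/example/protected_files/archive.zip';        // real file, outside the web root
$token  = time() . '-' . md5(uniqid(mt_rand(), true));        // timestamp helps the cleanup cron
$link   = $_SERVER['DOCUMENT_ROOT'] . '/downloads/' . $token . '.zip';
symlink($target, $link);
header('Location: /downloads/' . $token . '.zip');
exit;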

Is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?

I've been wondering: is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?
It's fairly simple. I'm caching webpages on my server with PHP in a certain directory, but I do not want web users to view these files or this directory directly. PHP, on the other hand, must be able to access these files (to serve them to the user). That may not sound logical, but what I'm trying to do is restrict users from certain pages and still be able to cache them in a webserver-savvy format.
Preferably something with .htaccess or chmod.
Thanks!
Absolutely; in fact, you don't need to use .htaccess. Simply put the protected directory above your document root (that is, store it next to the folder where your PHP scripts are stored, typically "htdocs", "httpdocs" or sometimes just "www").
So your web files would be in /my/folders/httpdocs/, and your "protected" files would be in /my/folders/protected_folder/
The idea here is that PHP can access any folder on the server, but Apache won't let the user navigate "above" the root directory.
To access the directory, you can use:
$protected_path = $_SERVER['DOCUMENT_ROOT'].'/../protected_folder/';
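For example (the file name and the $html variable are just illustrative), the cache can then be written and served entirely from PHP:

// Save a rendered page into the protected folder, then stream it back on a later request.
file_put_contents($protected_path . 'home.html', $html);   // $html is your rendered page
// ...later, when serving the cached copy...
readfile($protected_path . 'home.html');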
(Incidentally, you mentioned you're doing this to cache pages; you might want to look at Smarty, the PHP template engine, which pre-compiles your templates and also supports really smart caching. In fact, one of the Smarty "best practices" is to configure your structure so the template and cache files are not in or below the document_root folder, so users coming in from Apache can never get to them, but the Smarty PHP code can easily grab whatever it needs from there.)
Sure, just place the files in a directory outside of your web root. For instance, if your web root is /usr/local/apache/htdocs/ you can create a /usr/local/apache/private_cache/ directory that PHP should have access to, but there is no way to get to it via a web request.
You can also put a .htaccess file consisting of the line deny from all in the directory you want to protect. That will prevent Apache (but not PHP) from serving up the files.

Storing script files outside web root

I've seen recommendations to store some or all PHP include files somewhere other than the web document root directory (username/public_html in my case), specifically to protect PHP files with sensitive information (like database connection and login info) in the event that the web server hiccups, stops protecting PHP files, and they become 'visible' to outsiders who know where to look.
It seems somewhat paranoid to me, but I'm guessing people have gotten burned badly on this before, so I'm willing to go along. The suggestion usually takes the form of putting the include files in something like '../include_files/' so it's not directly in the document root and not directly accessible to outsiders through the web server.
My question is this: is there a significant difference in security between that way and just putting your 'include_files' directory under the document root and sticking an .htaccess file in there (with the appropriate entries)? Would putting an .htaccess file in '../include_files/' make any significant improvement there?
TIA,
Monte
Using .htaccess adds overhead since Apache has another item it needs to check for and process.
Keeping files out of the web root isn't being paranoid; it's good practice. What happens if someone accesses one of the "include" files directly and it throws out revealing errors because all the prerequisite files weren't loaded?
Each file needs to have its own security checks to make sure it is running under the expected environment. Each executable file in a web-accessible area is a potential security hole.
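One common guard for exactly that situation (a sketch; the constant name is arbitrary) is to define a flag in the entry script and have every include file refuse to run without it:

<?php
// index.php
define('IN_APP', true);
require __DIR__ . '/include_files/db.php';

<?php
// include_files/db.php
if (!defined('IN_APP')) {
    exit('Direct access not permitted.');
}
// ...database connection code...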
It really depends on what you have in your include_files. The most important thing is that you put any credentials you have outside of the document root ( database logins, etc ). Everything else really is secondary and doesn't matter that much.
If you don't want anyone stealing your source code, then try to follow Zend conventions:
application
library
public
DocumentRoot points to public and that just contains media files, js/css files. HTML/views, db logic, conf/credentials are in application. Third party libraries are in library.
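A minimal sketch of the single public entry point in that layout (the file names are illustrative, not actual Zend Framework code):

<?php
// public/index.php - the only PHP file under the DocumentRoot.
define('APPLICATION_PATH', dirname(__DIR__) . '/application');
set_include_path(dirname(__DIR__) . '/library' . PATH_SEPARATOR . get_include_path());
require APPLICATION_PATH . '/Bootstrap.php';   // credentials, db logic and views stay out of the webroot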
Theoretically, if you just stick a .htaccess file in the folder, you could still have the .php files called directly.
Taking them out of the server root, however, keeps them from ever being accessed by someone who is browsing your website.

How can I protect a directory using PHP?

Note: this is a remote server; I don't have access to it, only to the FTP. I want to do this using purely PHP, not .htaccess. Is there a way, similar to .NET, where you put a web.config file in place and set who can access it and their password?
I'd say the equivalent of that kind of functionality from web.config on Apache is the .htaccess file: PHP is used to generate pages, but if you are trying to work at the directory level, the check has to come before PHP is even called.
In your PHP scripts, you can access the HTTP Authentication data; see $_SERVER, especially PHP_AUTH_USER and PHP_AUTH_PW. But the protection will be at the file level, not the directory level, and, obviously, it will be enforced only for PHP files (not images in a subdirectory, for instance).
For more information, you can have a look at, for instance: HTTP Basic and Digest authentication with PHP
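For illustration, a bare-bones sketch of that HTTP Basic check in PHP (the hard-coded credentials are placeholders for a real lookup):

<?php
// Ask the browser for HTTP Basic credentials and verify them in PHP.
if (!isset($_SERVER['PHP_AUTH_USER'])
    || $_SERVER['PHP_AUTH_USER'] !== 'admin'
    || $_SERVER['PHP_AUTH_PW'] !== 'secret') {
    header('WWW-Authenticate: Basic realm="Protected area"');
    header('HTTP/1.0 401 Unauthorized');
    exit('Authentication required.');
}
// ...protected page content for authenticated users...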
The right way to do this for an entire directory is definitely with .htpasswd / .htaccess files (or directly in Apache's configuration file).
Why use PHP? .htaccess files were designed for this purpose. If you're trying to do something like store user logons in a database, look at something like mod_auth_mysql.
What you can do is place the files outside of your webroot and write a PHP script to serve those files after they pass your authentication logic.
As far as I'm aware there is no way to do this purely in PHP.
If you can use .htaccess but cannot upload it for whatever reason, then I would suggest writing the .htaccess file via PHP.
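A small sketch of that idea (the target directory is illustrative, and the rule only takes effect if the server honours .htaccess overrides):

<?php
// Generate an .htaccess that blocks all direct HTTP access to the protected directory.
$rules = "deny from all\n";
file_put_contents(__DIR__ . '/protected/.htaccess', $rules);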
Directory protection is always the work of the webserver. Whatever you do in your PHP script, it's the webserver's responsibility to execute the script in the first place. If you try to access another file in the directory, the webserver will hand it to the user without even looking at your script. If the user requests a directory listing, it's the webserver that's handing it to the user.
You will have to configure the webserver correctly, most likely using .htaccess files, if you want to protect real, "physical" directories.
