I have a few PHP scripts that generate Word documents into a folder.
The PHP scripts are locked down with a login session; however, if you know the right URL, the documents are still accessible.
How do I stop the directory from being accessible via a URL while still maintaining read and write access for the PHP scripts? Currently the chmod for the directory is set to 777. Can this be done with the folder permissions alone?
You are placing the documents in the wrong location in your file system.
Do not place the documents inside the document root and you will not have to protect them. There is no need for them to be in there at all: nothing stops PHP from accessing such documents when they are stored elsewhere. So create yourself a folder outside the document root and that's it.
General rule of thumb:
never place objects inside the document root that are not meant to be accessed directly by web requests.
documents meant to be offered for download should not be served directly but through a handler script instead (see the sketch below).
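As an illustration, here is a minimal sketch of such a handler. The storage path /var/www/private_docs and the $_SESSION['user'] login flag are assumptions for the example, not part of the original setup:

<?php
// download.php - sketch of a download handler for documents stored
// outside the document root; assumes a successful login sets $_SESSION['user'].
session_start();

if (empty($_SESSION['user'])) {
    http_response_code(403);
    exit('Not authorized.');
}

$storageDir = '/var/www/private_docs';   // outside the document root

// basename() strips any path components, blocking ../ traversal
$file = basename($_GET['file'] ?? '');
$path = $storageDir . '/' . $file;

if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found.');
}

header('Content-Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);

Because the script runs your session check before calling readfile(), knowing the URL is no longer enough to fetch a document.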
I have been told that for security reasons, all PHP data-handling files should be located outside of the website root directory. I have a website hosted in IIS 10 with the includes folder outside the root, something like this: website at C:\inetpub\wwwroot\index.php and PHP files at C:\includes\submit.inc.php. This is obviously not working, since http://localhost/includes/submit.inc.php doesn't exist. The submit.inc.php file is what AJAX uses to send form data back to the server.
So, should I be worried about PHP file separation in the first place, and if so, how can I reference files outside the website's physical path in IIS?
Thanks.
You should have PHP files (or a single file, if you are using the Front Controller pattern) inside the root.
These provide endpoints (i.e. have URLs) that browsers and other clients can make requests to.
From there you can load the dependencies from outside the root using require_once.
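For example, a minimal sketch for the layout in the question (the endpoint name submit.php is an assumption; the include path is taken from the question):

<?php
// C:\inetpub\wwwroot\submit.php - a thin endpoint inside the web root.
// Forward slashes work fine in Windows paths under PHP.
require_once 'C:/includes/submit.inc.php';

The AJAX call then posts to http://localhost/submit.php, and C:\includes is never exposed by IIS.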
The primary goal of keeping your data processing outside the root is to protect your business logic and security credentials from leaking if an HTTP server configuration error causes your PHP files to be served up raw instead of being processed by PHP.
That goal isn't served by keeping all your PHP outside the root; the endpoints have to stay inside it, and only the sensitive includes belong outside.
I understand it is commonly suggested that documents such as user-uploaded files be stored outside the web root. However, if the documents are stored in a folder in public_html, .htaccess is set up to block directory browsing, and the filename of the uploaded file is, say, a long randomly generated string, how would someone be able to access the file without permission anyway?
Yes, if the web server doesn't allow public access by configuration, then nobody can access the files. However, disallowing access by configuration is more brittle than the files simply not being there at all. It can always happen that a careless admin misconfigures the server and suddenly exposes those files, which is a non-issue if you keep the files out of the webroot to begin with. There is also the risk of a secondary vulnerability through which somebody is able to alter your .htaccess files, thereby letting themselves in through the back door. Again, not an issue if the files simply aren't there at all.
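As an aside, if you do rely on unguessable names, generate them from a cryptographically secure source rather than something predictable; a one-line sketch in PHP (the .docx extension is just an example):

<?php
// 32 hex characters from a CSPRNG: not enumerable, not guessable.
$name = bin2hex(random_bytes(16)) . '.docx';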
I have a folder called data which has .ini files that my scripts access. I've put a .htaccess in the data folder with deny all so that the folder cannot be accessed directly. This way only my scripts on the server can access it.
Is this .htaccess method good enough? I've also tried chmodding the folder to 700, which seems to do exactly the same thing: it denies public access but allows my scripts to access the files.
I am wondering if I should just use both methods, or is one method better than the other?
Ideally, I assume it would be better to place the files outside the www folder but I do not have access to that.
The .htaccess solution will stop requests from a client from reading the files via a web browser or similar:
deny from all
The advantage of the .htaccess file is that a future developer or admin would see the file and know you had deliberately chosen to protect that code in a well-recognised way. This could help avoid issues down the line, for example if the site were copied to a new server.
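For completeness, here is what the full file can look like; note that the directive above is Apache 2.2 syntax, and Apache 2.4 uses Require all denied instead, so which form you need depends on your server version:

# Apache 2.2
Order allow,deny
Deny from all

# Apache 2.4 equivalent:
# Require all denied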
I've been wondering: is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?
It's fairly simple. I'm caching webpages on my server with PHP in a certain directory, but I do not want web users to view these files or this directory directly. PHP, on the other hand, must be able to access these files (to serve them to the user). That may not sound logical, but what I'm trying to do is restrict users from certain pages while still being able to cache them in a webserver-savvy format.
Preferably something with .htaccess or chmod.
Thanks!
Absolutely, and in fact you don't need to use .htaccess. Simply put the protected directory above your document root (that is, store it next to the folder where your PHP scripts are stored, typically "htdocs", "httpdocs" or sometimes just "www").
So your web files would be in /my/folders/httpdocs/, and your "protected" files would be in /my/folders/protected_folder/.
The idea here is that PHP can access any folder on the server, but Apache won't let the user navigate "above" the root directory.
To access the directory, you can use:
$protected_path = $_SERVER['DOCUMENT_ROOT'].'/../protected_folder/';
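Building on that line, here is a minimal read-through cache sketch (the page parameter and the .html extension are assumptions for the example):

<?php
// cache.php - serve a page from the protected cache when present.
$protected_path = $_SERVER['DOCUMENT_ROOT'] . '/../protected_folder/';

// basename() drops directory parts, so the parameter cannot escape the cache dir
$page = basename($_GET['page'] ?? 'index');
$cache_file = $protected_path . $page . '.html';

if (is_file($cache_file)) {
    readfile($cache_file);                  // cache hit: send the stored copy
} else {
    $html = '...generate the page here...';
    file_put_contents($cache_file, $html);  // store it for next time
    echo $html;
}

Apache can never serve protected_folder directly, but PHP reads and writes it without restriction.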
(Incidentally, you mentioned you're doing this to cache pages: you might want to look at Smarty, the PHP template engine, which pre-compiles your templates and also supports really smart caching. In fact, one of the Smarty best practices is to configure your structure so the template and cache files are not in or below the document_root folder, so users coming in through Apache can never get to them, but the Smarty PHP code can easily grab whatever it needs from there.)
Sure, just place the files in a directory outside of your web root. For instance, if your web root is /usr/local/apache/htdocs/, you can create a /usr/local/apache/private_cache/ directory that PHP has access to, but that there is no way to reach via a web request.
You can also put a .htaccess file consisting of the line deny from all in the directory you want to protect. That will prevent Apache (but not PHP) from serving up the files.
I have a PHP script that processes file uploads. The script tries to organise the files that are uploaded and may create new folders to move the files into if needed. These files will be under the www root directory (i.e., a web browser will be able to access them).
My question is, what permissions should I set for the folders that get created and for the files that are moved into them (using mkdir() and move_uploaded_file())?
Your web server needs read and write permission in those folders; execute permission should be revoked (assuming UNIX-like systems). Otherwise a user could upload a script and have it executed by sending an HTTP request for it.
But IMO the whole concept is a potential security hole. Better to store the files in a folder outside the webserver root, so that no direct access is possible. In your web application, you can have a PHP download page that scans the upload directory and displays a list of download links. These download links lead to another script that reads the files from your storage dir and sends them to the user.
Yes, this is more work. But the scenario is very common, so you should be able to find source code with example implementations easily. And it is much less work than having your server hacked...
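To make that concrete, here is a minimal sketch of such a download page (the /var/www/uploads path and the download.php script it links to are assumptions, along the lines of the handler sketched earlier):

<?php
// list_downloads.php - list stored files as links to a download script.
$upload_dir = '/var/www/uploads';   // outside the web root

foreach (scandir($upload_dir) as $name) {
    if ($name === '.' || $name === '..') {
        continue;                   // skip the directory entries
    }
    $link = 'download.php?file=' . urlencode($name);
    echo '<a href="' . htmlspecialchars($link) . '">'
       . htmlspecialchars($name) . '</a><br>';
}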
To answer it specifically, 766 (no execute permissions for group and others) would be the loosest you would want to use. At the other end, 700 would allow no one but the web user to mess with the files.
But really, it all depends on what you are doing with the files; that is what determines the best choice.
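As a closing sketch, here is how mkdir() and move_uploaded_file() might be combined with the stricter permissions discussed above (the doc form field, the per-month layout and the base path are assumptions for the example):

<?php
// upload.php - store an uploaded file in a freshly created folder.
$base = '/var/www/uploads';          // ideally outside the web root, as advised above
$dir  = $base . '/' . date('Y-m');   // e.g. organise uploads by month

if (!is_dir($dir)) {
    mkdir($dir, 0700, true);         // web user only; recursive creation
}

$name = basename($_FILES['doc']['name']);   // drop any client-supplied path
$dest = $dir . '/' . $name;

if (move_uploaded_file($_FILES['doc']['tmp_name'], $dest)) {
    chmod($dest, 0600);              // read/write for the web user, no execute
}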