Apache restricting folder for script access only - php

I have a folder called data containing .ini files that my scripts access. I've put a .htaccess file in the data folder with deny from all so that the folder cannot be accessed directly; this way only my scripts on the server can access it.
Is this .htaccess method good enough? I've also tried chmod-ing the folder to 700, which seems to do exactly the same thing: it denies public access but allows scripts to access the files.
I am wondering if I should just use both methods, or is one method better than the other?
Ideally, I assume it would be better to place the files outside the www folder, but I do not have access to that.

The .htaccess solution will stop a client from reading the files via a web browser or similar:
deny from all
The advantage of the .htaccess file is that a future developer or admin would see the file and know you had deliberately chosen to protect that data in a well-recognised way. This could help avoid issues down the line, for example if the site were copied to a new server.
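Note that deny from all is Apache 2.2 syntax; on Apache 2.4 the equivalent directive is Require all denied (the old syntax keeps working only if mod_access_compat is loaded). A minimal .htaccess for the data folder might look like:

```apache
# Apache 2.4 (mod_authz_core)
Require all denied

# Apache 2.2, or 2.4 with mod_access_compat loaded:
# Order deny,allow
# Deny from all
```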

Related

Is it safe to store documents under public_html if htaccess doesn't allow browsing folders?

I understand it's commonly suggested that documents such as user-uploaded files be stored outside the web root. However, if the documents are stored in a folder in public_html, .htaccess is set up to block directory browsing, and the file name of an uploaded file is, say, a long randomly generated string, how would someone be able to access the file without permission anyway?
Yes, if the web server doesn't allow public access by configuration, then nobody can access the files. However, not allowing access by configuration is more brittle than the files simply not being there at all. It can always happen that a careless admin misconfigures the server and suddenly allows access to those files, which is a non-issue if you keep the files out of the webroot to begin with. Or perhaps you have a secondary vulnerability that lets somebody alter your .htaccess files, thereby opening a backdoor for themselves. Again, not an issue if the files simply aren't there at all.

How to prevent access to a directory (apache and nginx) in php

I have a script with a subdirectory containing files that should not be accessible from outside (so, users cannot directly access it even if they know the remote URL).
In order to do so, I thought about programmatically creating a .htaccess file in the directory with the line
Deny from all
This would work, but I need it to work on both Apache and nginx, and I am not very familiar with the latter. I wasn't able to find any good information on this, although as far as I understand I should NOT use .htaccess with nginx.
So my question is whether there is an alternative that would work on nginx. I did find out how to configure nginx to deny access to a specific folder, but I need the script to be able to do this by creating a local file; I can't assume I have access to anything else on the server.
By the way, any other solution that does not involve .htaccess would be fine as well, as long as it prevents direct access to the files through a URL while keeping them accessible to PHP.
In case it matters, the script is part of a WordPress plugin.
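On the nginx side there is no .htaccess equivalent: access rules live in the server configuration, which a plugin cannot create at runtime. If you (or the site admin) can edit the server block, a sketch of the rule might be (the plugin path is an assumption):

```nginx
# Inside the relevant server { } block:
# deny direct HTTP access to the plugin's private directory.
# PHP running via FastCGI can still read the files from disk.
location ^~ /wp-content/plugins/myplugin/private/ {
    deny all;
}
```

A common portable fallback for WordPress plugins is unguessable file names plus an empty index.html, but that only obscures the files; it doesn't deny access.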

How can you access files without an explicit link?

I have found in my server, that some files are stored inside the webroot, for example
example.com/files/longnamewithcode_1234293182391823212313231230812.pdf
They are stored for a web app and have sensitive info in them.
If you access example.com/files you get an empty index.html, so you can't directly list the directory. Anyway, I'm concerned about this: I feel that it is not safe, and I would like to know what kinds of attacks could be used to access the files. I understand that a brute-force attack would be possible, but with the long coded names I guess it's less of a problem.
Finally, I would say that the correct way is to store the files outside the web folder and return them with PHP, but I'm not sure I'll be able to get access to the code to change this.
If you have to make the files accessible from the webroot by the webserver, you can't really do better than using a sufficient amount of entropy in the file names, but even that doesn't account for users simply sharing the links once they get hold of them.
If you want to implement the permission checking in PHP, take a look at the various X-Sendfile implementations on popular webservers: mod_xsendfile (Apache), X-Accel-Redirect (nginx) or X-LIGHTTPD-send-file (lighttpd). These let the webserver serve the file essentially as efficiently as direct access from the webroot, after your script has validated the accessing user.
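As a sketch of that approach, a download endpoint using Apache's mod_xsendfile might look like the following (the module must be installed with XSendFilePath pointing at the protected directory; /srv/protected and the auth check are assumptions):

```php
<?php
// Sketch of an X-Sendfile download endpoint. Assumes Apache with
// mod_xsendfile enabled and XSendFilePath covering the protected
// directory; /srv/protected is a hypothetical path outside the webroot.

// Resolve a user-supplied name to a real path inside $baseDir,
// or return null on missing files / path-traversal attempts.
function resolveProtectedPath(string $name, string $baseDir): ?string
{
    $path = realpath($baseDir . '/' . $name);
    if ($path === false || strpos($path, $baseDir . '/') !== 0) {
        return null;
    }
    return $path;
}

if (PHP_SAPI !== 'cli') { // skip when run from the command line
    $path = resolveProtectedPath($_GET['file'] ?? '', '/srv/protected');
    if ($path === null) { // also plug in your own auth check here
        http_response_code(404);
        exit;
    }
    header('Content-Type: application/octet-stream');
    header('X-Sendfile: ' . $path); // Apache streams the file itself
}
```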
Have you considered an .htaccess file to restrict who is allowed to access those sensitive files? You tagged it, but I'm not sure why you are not using it. =)
If you wish to block everything in the folder you can use an .htaccess file to block all connections.
Try
deny from all
in the .htaccess file in that directory. This will deny all access to anything in that directory including the index file.
The question is
Are the files supposed to be accessed by users freely?
If yes, don't worry about those files too much (as long as they're not writable).
If no, i.e. users have to be logged in to download them, move them out of the publicly accessible folder and deeper into the application. Write a PHP script that manages the permissions for the file, e.g. /download?file_id=1.
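A minimal sketch of such a /download endpoint, assuming the files live outside the webroot; the directory, the id-to-name map and the session check are placeholders for your own logic:

```php
<?php
// Sketch of a permission-checked /download?file_id=1 endpoint.
// The directory, file names and session check are hypothetical.

const FILE_DIR = '/var/app/private_files'; // outside the webroot

// Map public ids to stored file names so clients never see a path.
function lookupFile(int $id): ?string
{
    $files = [
        1 => 'report.pdf',
        2 => 'invoice.pdf',
    ];
    return $files[$id] ?? null;
}

if (PHP_SAPI !== 'cli') { // skip when run from the command line
    session_start();
    if (empty($_SESSION['user_id'])) { // your login check
        http_response_code(403);
        exit;
    }
    $name = lookupFile((int) ($_GET['file_id'] ?? 0));
    if ($name === null) {
        http_response_code(404);
        exit;
    }
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    readfile(FILE_DIR . '/' . $name);
}
```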

Is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?

I've been wondering: is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?
It's fairly simple. I'm caching webpages on my server with PHP in a certain directory, but I do not want web users to view these files or this directory directly. PHP, on the other hand, must be able to access these files (to serve them to the user). That may not sound logical, but what I'm trying to do is restrict users from certain pages and still be able to cache them in a webserver-friendly format.
Preferably something with .htaccess or chmod.
Thanks!
Absolutely -- in fact, you don't need to use .htaccess. Simply put the protected directory above your document root (that is, store it next to the folder where your PHP scripts are stored, typically "htdocs", "httpdocs" or sometimes just "www").
So your web files would be in /my/folders/httpdocs/, and your "protected" files would be in /my/folders/protected_folder/
The idea here is that PHP can access any folder on the server, but Apache won't let the user navigate "above" the root directory.
To access the directory, you can use:
$protected_path = $_SERVER['DOCUMENT_ROOT'].'/../protected_folder/';
(Incidentally, you mentioned you're doing this to cache pages-- you might want to look at Smarty, the PHP template engine, which pre-compiles your templates and also supports really smart caching. And in fact, one of the Smarty "best practices" is to configure your structure so the template and cache files are not in or below the document_root folder, so users coming in from Apache can never get to them, but the Smarty PHP code can easily grab whatever it needs from there.)
Sure, just place the files in a directory outside of your web root. For instance, if your web root is /usr/local/apache/htdocs/ you can create a /usr/local/apache/private_cache/ directory that PHP should have access to, but there is no way to get to it via a web request.
You can also put a .htaccess file consisting of the line deny from all in the directory you want to protect. That will prevent Apache (but not PHP) from serving up the files.

How can I protect a directory using PHP?

Notice, this is a remote server, I don't have access to it, only to the FTP. I want to do this using purely PHP, not .htaccess. Is there a way similar to .net, where you put the web.config file and you set who can access it and their password?
I'd say the Apache equivalent of that kind of functionality from web.config is .htaccess files: PHP is used to generate pages, but if you are trying to work at the directory level, the check has to come before PHP is even called.
In your PHP scripts, you can access the HTTP authentication data; see $_SERVER, especially PHP_AUTH_USER and PHP_AUTH_PW. But the protection will be at the file level, not the directory level -- and, obviously, it will be enforced only for PHP files (not images in a subdirectory, for instance).
For more information, you can have a look at, for instance: HTTP Basic and Digest authentication with PHP
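A sketch of HTTP Basic authentication handled by PHP itself via the PHP_AUTH_USER / PHP_AUTH_PW variables mentioned above (the credentials are placeholders; note this protects only scripts that run the check, not other files in the directory):

```php
<?php
// Sketch of HTTP Basic authentication done in PHP via
// PHP_AUTH_USER / PHP_AUTH_PW. The credentials are placeholders;
// in practice the hash would come from a config file or database.

function checkCredentials(?string $user, ?string $pass): bool
{
    $expectedUser = 'admin';
    $expectedHash = password_hash('secret', PASSWORD_DEFAULT);
    return $user === $expectedUser
        && $pass !== null
        && password_verify($pass, $expectedHash);
}

if (PHP_SAPI !== 'cli') { // skip when run from the command line
    if (!checkCredentials($_SERVER['PHP_AUTH_USER'] ?? null,
                          $_SERVER['PHP_AUTH_PW'] ?? null)) {
        header('WWW-Authenticate: Basic realm="Protected"');
        http_response_code(401);
        exit('Authentication required');
    }
    echo 'Welcome!';
}
```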
The right way to do this for an entire directory is definitely with .htpasswd / .htaccess files (or directly in Apache's configuration file).
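For reference, that directory-level protection might look like the following .htaccess (the path and realm name are assumptions; the host must allow AllowOverride AuthConfig):

```apache
AuthType Basic
AuthName "Restricted area"
# Password file created with: htpasswd -c /home/user/.htpasswd alice
AuthUserFile /home/user/.htpasswd
Require valid-user
```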
Why use PHP? .htaccess files were designed for this purpose. If you're trying to do something like store user logins in a database, look at something like mod_auth_mysql.
What you can do is place the files outside of your webroot and write a php script to serve those files after passing your authentication logic.
As far as I'm aware there is no way to do this purely in PHP.
If you can use .htaccess but cannot upload it for whatever reason, then I would suggest writing the .htaccess file via PHP.
Directory protection is always the work of the webserver. Whatever you do in your PHP script, it's the webserver's responsibility to execute the script in the first place. If you try to access another file in the directory, the webserver will hand it to the user without even looking at your script. If the user requests a directory listing, it's the webserver that's handing it to the user.
You will have to configure the webserver correctly, most likely using .htaccess files, if you want to protect real, "physical" directories.
