How can I protect a directory using PHP?

Note: this is a remote server and I don't have access to it, only to the FTP. I want to do this using purely PHP, not .htaccess. Is there a way similar to .NET, where you put a web.config file in the directory and set who can access it and their password?

I'd say the equivalent of that kind of functionality from web.config on Apache is .htaccess files: PHP is used to generate pages, but if you are trying to work at the directory level, the check has to come before PHP is even called.
In your PHP scripts, you can access the data of HTTP Authentication; see $_SERVER, especially PHP_AUTH_USER and PHP_AUTH_PW. But the protection will be at the file level, not the directory level, and, obviously, it will be enforced only for PHP files (not images in a subdirectory, for instance).
For more information, you can have a look at, for instance: HTTP Basic and Digest authentication with PHP
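As an illustration, a minimal sketch of that per-script check (the 'admin'/'secret' credentials are placeholders):
<?php
// Sketch of HTTP Basic authentication in PHP; credentials are placeholders.
if (!isset($_SERVER['PHP_AUTH_USER'])
    || $_SERVER['PHP_AUTH_USER'] !== 'admin'
    || $_SERVER['PHP_AUTH_PW'] !== 'secret') {
    header('WWW-Authenticate: Basic realm="Protected Area"');
    header('HTTP/1.0 401 Unauthorized');
    echo 'Authentication required.';
    exit;
}
echo 'Welcome, ' . htmlspecialchars($_SERVER['PHP_AUTH_USER']);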
The right way to do this for an entire directory is definitely with .htpasswd / .htaccess files (or directly in the Apache configuration file).

Why use PHP? .htaccess files were designed for this purpose. If you're trying to do something like store user logins in a database, look at something like mod_auth_mysql.
What you can do is place the files outside of your webroot and write a PHP script that serves those files once your authentication logic passes.
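A minimal sketch of such a gatekeeper script, assuming the files live in /var/www/private/ (outside the webroot) and using a session flag as a placeholder for your real authentication logic:
<?php
// serve.php -- sketch; the path and the session check are placeholders
session_start();
$base = '/var/www/private/';
$name = basename($_GET['file'] ?? '');   // basename() strips ../ traversal attempts
if (empty($_SESSION['user_id'])) {       // placeholder: your authentication logic
    header('HTTP/1.0 403 Forbidden');
    exit;
}
$path = $base . $name;
if ($name === '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);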

As far as I'm aware, there is no way to do this purely in PHP.
If you can use .htaccess but cannot upload it for whatever reason, then I would suggest writing the .htaccess file via PHP.
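For example, a sketch of generating the .htaccess and .htpasswd pair from PHP (paths and credentials are placeholders; assumes Apache 2.4, whose .htpasswd format accepts bcrypt hashes):
<?php
// Sketch: write .htpasswd and .htaccess from PHP; everything here is illustrative.
$dir = __DIR__ . '/protected';
$htpasswd = $dir . '/.htpasswd';
file_put_contents($htpasswd, 'alice:' . password_hash('secret', PASSWORD_BCRYPT) . "\n");
file_put_contents($dir . '/.htaccess',
    "AuthType Basic\n" .
    "AuthName \"Protected\"\n" .
    "AuthUserFile {$htpasswd}\n" .
    "Require valid-user\n");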

Directory protection is always the work of the webserver. Whatever you do in your PHP script, it's the webserver's responsibility to execute the script in the first place. If you try to access another file in the directory, the webserver will hand it to the user without even looking at your script. If the user requests a directory listing, it's the webserver that's handing it to the user.
You will have to configure the webserver correctly, most likely using .htaccess files, if you want to protect real, "physical" directories.

Related

How to prevent access to a directory (apache and nginx) in php

I have a script with a subdirectory containing files that should not be accessible from outside (so users cannot access them directly even if they know the remote URL).
In order to do so, I thought about programmatically creating a .htaccess file in the directory with the line
Deny from all
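In PHP that would be something like this sketch (assuming the plugin can write to its own subdirectory):
<?php
// sketch: deny web access to this subdirectory on Apache
file_put_contents(__DIR__ . '/private/.htaccess', "Deny from all\n");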
This would work, but I need it to work on both Apache and nginx, and I am not very familiar with the latter. I wasn't able to find any good information on this, but as far as I understood I should NOT use .htaccess with nginx.
So my question is whether there is an alternative that would work on nginx. I did find out how to configure nginx to deny access to a specific folder, but I need the script to be able to do this by creating a local file; I can't assume I have access to anything else on the server.
By the way, any other solution that does not involve .htaccess would be fine as well, as long as it prevents direct access to the files through the URL while keeping them accessible to PHP.
In case it matters, the script is part of a WordPress plugin.

How can you access files without an explicit link?

I have found on my server that some files are stored inside the webroot, for example
example.com/files/longnamewithcode_1234293182391823212313231230812.pdf
They are stored for a web app and have sensitive info in them.
If you access example.com/files you get an empty index.html, so you can't directly scan the directory. Anyway, I'm concerned about this: I feel that it is not safe, and I would like to know what kinds of attacks could be made to access the files. I understand that some brute-force attack would be possible, but with the long coded names I guess it's less of a problem.
Finally, I would say that the correct way is to store the files outside the web folder and return them with PHP, but I'm not sure I'll be able to get access to the code to change this.
If you have to make the files accessible from the webroot by the webserver, you can't really make it safer than using a sufficient amount of entropy in the file names, though even that does not account for users simply sharing the links once they get hold of them.
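For reference, a sketch of generating such a high-entropy name in PHP (the 'invoice_' prefix is illustrative):
<?php
// Sketch: derive an unguessable file name from a CSPRNG (~128 bits of entropy).
$code = bin2hex(random_bytes(16));
$name = 'invoice_' . $code . '.pdf';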
If you want to implement the permission checking inside PHP, take a look at the various X-Sendfile implementations on popular webservers: mod_xsendfile (Apache), X-Accel-Redirect (nginx) or X-LIGHTTPD-send-file (lighttpd). This allows you to use the webserver to serve the file essentially as efficiently as accessing it from the webroot directly, after you have validated the accessing user.
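A sketch of that hand-off, assuming mod_xsendfile is installed and enabled on Apache (the file path and the session check are placeholders):
<?php
// Sketch: let the webserver serve the file after a PHP-side permission check.
session_start();
if (empty($_SESSION['user_id'])) {       // placeholder permission check
    header('HTTP/1.0 403 Forbidden');
    exit;
}
header('Content-Type: application/pdf');
header('X-Sendfile: /var/www/private/report.pdf');    // Apache + mod_xsendfile
// On nginx the equivalent is an internal location plus:
// header('X-Accel-Redirect: /protected/report.pdf');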
Have you considered an .htaccess file to restrict who is allowed to access those sensitive files? You tagged it, but I'm not sure why you are not using it. =)
If you wish to block everything in the folder, you can use an .htaccess file to deny all connections.
Try
deny from all
in the .htaccess file in that directory. This will deny all access to anything in that directory including the index file.
The question is
Are the files supposed to be accessed by users freely?
If yes, don't worry about those files too much (as long as they're not writable).
If no, i.e. users have to be logged in to download them, move them out of the publicly accessible folder and deeper into the application. Write a PHP script that will manage the permissions for the file, e.g. /download?file_id=1.
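A sketch of what such a download script could look like (the session check and the id-to-path map are placeholders for your real permission logic and database):
<?php
// download.php?file_id=1 -- sketch; everything here is illustrative
session_start();
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
$files = [1 => '/srv/app/storage/manual.pdf'];   // normally a database lookup
$id = (int)($_GET['file_id'] ?? 0);
if (!isset($files[$id])) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($files[$id]) . '"');
readfile($files[$id]);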

How do I secure a hardcoded login/password in PHP?

I'm writing a simple PHP script to access the Foursquare API. The PHP will always access the same Foursquare account. For the time being, I have this login information hardcoded in my script. What is the best way to secure this information?
If I follow the advice from this thread, I should just place the login information in a config file outside the website's root directory:
How to secure database passwords in PHP?
Is this the best advice? Or is there a better way to secure the login information?
The best way, of course, would be to not store it at all.
If you can't do that, storing it inside a PHP file (as variables) should ensure it's not going to be sent to the client side. If you're really paranoid about your web server suddenly no longer interpreting PHP, you can put it in a separate file, outside the document root, or somewhere access is denied (through a .htaccess directive, for instance).
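For instance, a sketch of such a config file and its use (the path is a placeholder):
<?php
// /home/user/config/foursquare.php -- stored outside the document root
return [
    'user' => 'account@example.com',
    'pass' => 'your password here',
];
And in the web-facing script:
<?php
$config = require '/home/user/config/foursquare.php';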
(There are linux-specific details here, so please forgive them if that's not your platform...)
If you're running on apache and have access to the configuration files (which may not be the case with shared hosting), you can put a line like this in your VirtualHost config (or httpd.conf, or other included config file):
SetEnv FOURSQUARE_PW "your password"
Then your php scripts can access it at $_SERVER['FOURSQUARE_PW'].
The advantage here is that you can make that config file readable only by root, since Apache is started as root via init.d.
Storing it as a variable in a .php file outside the root directory, with a filename that is not easily guessed, is a reasonably secure way of keeping your credentials safe. But if you can avoid storing it on the server at all, that would be best. Provide a login page to enter that information on demand, use it for the session, and then discard it once you no longer need it.

Is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?

I've been wondering: is it possible to shield a directory/file on a server from the outside world, but make it accessible to PHP?
It's fairly simple: I'm caching webpages on my server with PHP in a certain directory, but I do not want web users to view these files or this directory directly. PHP, on the other hand, must be able to access these files (to serve them to the user). That may not sound logical, but what I'm trying to do is restrict users from certain pages while still being able to cache them in a webserver-friendly format.
Preferably something with .htaccess or chmod.
Thanks!
Absolutely; in fact, you don't need to use .htaccess. Simply put the protected directory above your document root (that is, store it next to the folder where your PHP scripts are stored, typically "htdocs", "httpdocs" or sometimes just "www").
So your web files would be in /my/folders/httpdocs/, and your "protected" files would be in /my/folders/protected_folder/
The idea here is that PHP can access any folder on the server, but Apache won't let the user navigate "above" the root directory.
To access the directory, you can use:
$protected_path = $_SERVER['DOCUMENT_ROOT'].'/../protected_folder/';
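A cached page could then be served with something like this sketch (the cache key and lifetime are illustrative):
<?php
$protected_path = $_SERVER['DOCUMENT_ROOT'] . '/../protected_folder/';
$cache_file = $protected_path . md5($_SERVER['REQUEST_URI']) . '.html';
if (is_file($cache_file) && filemtime($cache_file) > time() - 3600) {
    readfile($cache_file);   // serve the cached copy
    exit;
}
// ...otherwise generate the page and write it to $cache_file...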
(Incidentally, you mentioned you're doing this to cache pages; you might want to look at Smarty, the PHP template engine, which pre-compiles your templates and also supports really smart caching. And in fact, one of the Smarty "best practices" is to configure your structure so the template and cache files are not in or below the document_root folder, so users coming in from Apache can never get to them, but the Smarty PHP code can easily grab whatever it needs from there.)
Sure, just place the files in a directory outside of your web root. For instance, if your web root is /usr/local/apache/htdocs/ you can create a /usr/local/apache/private_cache/ directory that PHP should have access to, but there is no way to get to it via a web request.
You can also put a .htaccess file consisting of the line deny from all in the directory you want to protect. That will prevent Apache (but not PHP) from serving up the files.

deny access to certain folder using php

Is it possible to "deny from all", Apache .htaccess style, using PHP?
I can't use .htaccess because I'm using a different webserver, so I want to use PHP to work around it.
So let's say a user is trying to access the folder named 'david': all content and subdirectories should be denied from viewing.
No.
PHP cannot be used to protect folders, because it is not PHP that serves requests, but the web server.
You can move this directory above the document root to prevent web access to it.
But permissions will not help you here.
Use chmod to change the permissions on that directory. Note that the user running PHP needs to own it in that case.
If you just want to prevent the folder from being indexed, you can create an index.php file that does a simple redirect. Note: requests for a valid filename will still be let through.
<?php
header("Location: /"); // redirect the user to the root directory
exit; // stop execution so nothing else is served
Without cooperation from the webserver, the only way to protect your files is
to encrypt them, in an archive maybe, of which your script would know the password and tell no one; that will end up wasting CPU, as the server will be decrypting it all the time, or
to use an incredibly deranged file naming scheme, a file naming scheme you won't ever describe to anyone, and that only your PHP script can sort through.
Still, data could be downloaded, bandwidth could go to waste, and encrypted files could be decrypted.
It all depends on how much that data matters. And how much your time costs, as these convoluted layers of somewhat penetrable obfuscation will likely eat huge chunks of developer time.
Now, as I said... that would be without cooperation from the webserver... but what if the webserver is cooperating and doesn't know?
I've seen some Apache webservers (can anyone confirm it's in the standard distribution?), for instance, come preloaded with a rule denying access to any file whose name starts with .ht, not only .htaccess but everything similar: .htproxy, .htcache, .htwhatever_comes_to_mind, .htyourmama...
Chances are your server could be one of those.
If that's the case... rename your hidden files .hthidden-<filename1>, .hthidden-<filename2>... and you'll get access to them only through PHP file functions, like readfile()
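If so, PHP can still read them normally, as in this one-line sketch (the file name is illustrative):
<?php
// The webserver refuses to serve .ht* files, but PHP can still read them.
readfile(__DIR__ . '/.hthidden-report.pdf');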
