Any help on this would be greatly appreciated:
I have a website running PHP on IIS6/IIS7. I am protecting all the .php files by starting a session: the .php pages can only be accessed if a session has been started by logging in through the login.php page.
All my .php files are in the following directory (using as example):
home/dir
Is it possible to use php and .htaccess to protect all files in the following directory:
home/dir/files
The files in this directory are Word files, PDFs and other file types.
Once the user has logged in through login.php, I don't want them to have to retype their username and password when trying to access home/dir/files
I hope that I made sense. Thank you.
In general, a good way to do this is to have the static files outside your website directory structure but still somewhere that the web server has permissions to access them. Then, since you're using PHP anyway, when a user requests a document, they would really be requesting a PHP page that checks the user's permissions then, if the user has adequate permissions, serves the file.
.htaccess files are generally associated with Apache, not IIS, but see "Is there a file-based equivalent to .htaccess in IIS6?"
That said, perhaps you could put your files directory out of harm's way and place it somewhere outside the document root. Then you can control the download of each file through a PHP script which checks the authentication details.
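A sketch of such a gatekeeper script (the session key 'user', the directory path, and the helper function are illustrative assumptions, not the asker's actual code):

```php
<?php
// download.php -- serve a protected file only to logged-in users.

// Reject anything that is not a bare file name (no slashes, no "..").
function is_safe_filename($name) {
    return $name !== '' && $name === basename($name)
        && strpos($name, '..') === false;
}

if (PHP_SAPI !== 'cli') {   // skip the web-serving part when run from the command line
    session_start();
    if (empty($_SESSION['user'])) {          // assumed to be set by login.php
        header('HTTP/1.1 403 Forbidden');
        exit;
    }
    $file = isset($_GET['file']) ? $_GET['file'] : '';
    $dir  = 'C:/protected_files';            // somewhere outside the website directory
    if (!is_safe_filename($file) || !is_file("$dir/$file")) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $file . '"');
    header('Content-Length: ' . filesize("$dir/$file"));
    readfile("$dir/$file");
}
```

A request then looks like download.php?file=report.pdf instead of a direct link to the document, and the session check runs on every download.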
Related
I'd like to know how I can go about denying direct web access to my application's configuration files whilst still allowing PHP to access them.
I know most answers would suggest to put the includes outside the public_html directory. But I really don't think it's that efficient.
Thanks.
PHP accesses files through the file system, whereas web users go through Apache, which honours .htaccess files. So just place a .htaccess file containing deny from all into that directory and voilà.
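For reference, that .htaccess file can be as small as this (the directory it lives in is whichever one holds your configuration files; the commented-out line is the newer Apache 2.4 equivalent):

```
# .htaccess in the configuration directory
# Apache 2.2:
Deny from all
# Apache 2.4 and later:
# Require all denied
```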
I am developing a website for myself and I wonder how I can prevent direct access to include files like header.php and footer.php. Those files should only be incorporated in pages like index.php or other pages wherein they will be called using <?php include(''); ?>. Should I do it through PHP? How about editing the .htaccess file, or are there any other methods?
Place the file(s) in a directory outside the web root.
The web server will never serve these files to users.
PHP et al. can still access the files via include/require etc.
This has been the gold-standard approach for several decades.
I offered 3 suggestions, and since you didn't provide much to go on, I will give you one elaboration.
As @Dagon alludes to, when you use include() you're reading via the file system and not via an HTTP request. You can check for an HTTP verb ($_REQUEST, $_GET, $_POST) and refuse to show content or fake a 404.
<?php
if (isset($_REQUEST) || isset($_GET) || isset($_POST)) {
    header("HTTP/1.0 404 Not Found");
    die();
}
// Do the needed
?>
I will let you figure out the gotcha on your own here.
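A related guard that sidesteps that gotcha is to define a marker constant in the entry script and test for it in the include. A sketch (the constant name IN_APP is arbitrary):

```php
<?php
// index.php would do this before pulling in the include:
//     define('IN_APP', true);
//     include 'header.php';

// header.php -- refuse to run unless reached through include():
if (!defined('IN_APP')) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
// ... normal header.php content ...
```

A direct request to header.php never passes through index.php, so the constant is undefined and the script bails out.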
It would be perfect if your server is Linux, because then you can follow Dagon's suggestion of placing the files to include outside of the web root.
The web root of course is the base folder that contains files the outside world is meant to access. On many systems, this is the public_html folder.
On a system with WHM/cPanel installed, you might have a special user account whose root (where anything can be stored) is located at /home/user. This can be viewed with the file manager utility included with cPanel when logged in. In that /home/user folder, you may find configuration files and folders starting with a period, as well as the public_ftp and public_html folders.
In the /home/user folder, you can place the PHP files you don't want the world to access directly. Then in public_html (accessible within /home/user) you can place the index.php file that has the include statement for the "protected" files. That way, in index.php you can use this statement:
include "../file-to-include.php";
Just make sure that the administrator has set the owner of the /home/user folder to the same username you log in with, or you may get access-denied messages when trying to access the file. On a decent cPanel setup, this last step will already have been done for you.
I have a situation where I want to protect a file from public access, but enable read and write from php. The file contains sensitive information like passwords.
The problem is that
I cannot put the file outside the web root (server security restriction on access from php)
I would like to avoid mysql database.
Also, I would try to avoid .htaccess files.
So if I make a folder, say private, in the web root, and do
chmod 700 private
Then, if the file to protect is private/data, I do
chmod 700 private/data
Will this be a safe setup, so that I can read and write to the file from PHP but it is not accessible to the public?
PHP runs as the same user as the webserver so if PHP can read it, so can your webserver (and vice versa).
If you don't want to use .htaccess there is another trick: save the file as a .php file. Even if someone accesses the file from the web they can't see the source; they might just get a blank page, or maybe an error, depending on what exactly is in the file.
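A sketch of that trick (the key names and values are placeholders): a config file that only returns data produces no output when requested directly, while scripts read it via include.

```php
<?php
// config.php -- requesting this over the web runs it and prints nothing;
// scripts read the values with:  $config = include 'config.php';
return array(
    'db_user' => 'app',        // placeholder values
    'db_pass' => 'change-me',
);
```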
If you're running suPHP or FastCGI PHP, you can use a setup similar to what you've described to limit access to files. Otherwise, PHP will run as the same user as the web server, and any file PHP can access is also accessible via URL.
If you want to keep the restrictions stipulated (which are rather strange), and as (I guess) you don't wish to, or cannot, use Apache config directives, consider adding the PHP user to some group and giving only that group rights to the file, i.e. Apache cannot read it (if it's not running as root/wheel).
Or make it a valid .php file (so PHP would be the only thing invoked when the file is requested) which returns nothing or redirects when invoked directly. Or just encrypt it.
Is there a way to stop users from typing in a url and gaining access to a file in a specific directory but still allow a PHP script to have access to the file to change it or download it?
In other words, I want to stop people from typing in: http://www.mysite.com/securefolder/file1.pdf. Maybe redirect them to a different page or something.
But I do not want to completely lock away the files because I still need to use a PHP script to download and modify the files.
Any help is appreciated,
Thanks
If all you want is a specific setting for a certain file, a very simple rule will be all you need (note that in a .htaccess file the pattern is matched against the path without its leading slash) -
RewriteEngine on
RewriteRule ^securefolder/file1\.pdf$ access_denied.php
What might be a better idea is to make a rule for the entire secured folder -
RewriteEngine on
RewriteRule ^securefolder/.*$ access_denied.php
One last (and probably best) way to do this is to create an additional .htaccess inside the secured folder and simply deny all access to it. Place this one line -
deny from all
In all of these solutions we are only talking about external requests. All your scripts and internal paths to scripts, files, etc. will remain intact and unaffected by the rules you define within the .htaccess files.
Disable direct access to the file on the webserver, and serve the file from a PHP script (some hints on this manual page: http://www.php.net/manual/en/function.readfile.php). Webserver access restrictions won't affect PHP, as it accesses the filesystem directly. Here's a similar question: Secure files for download
If performance is critical, there are plugins for most of the webservers which will help you serve the file directly (bypassing PHP):
Apache mod_auth_token
Lighttpd mod_secdownload
Nginx secure_download
The ideal approach will depend on whether the PHP script accesses the PDF file locally on disk, or remotely over http.
If the PHP script accesses the file locally on disk, simply place the file outside the root folder of the web site.
If the PHP script accesses the file remotely over HTTP, here are some options:
Limit access by origin IP
Password protect the resource and serve over https
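The first option (limiting by origin IP) can be sketched as follows; the allowed address is a placeholder assumption:

```php
<?php
// Allow only requests from known client addresses.
$allowed = array('203.0.113.10');   // placeholder IP
if (!isset($_SERVER['REMOTE_ADDR'])
        || !in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// ... serve the PDF ...
```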
If the files are on the same server, you don't need to download them in order to serve them. Simply read them from the filesystem and output them directly.
If, however, they're not, and you need a script to be able to download files, and others to be refused, you could password protect the directory.
To then download files using for instance cURL, you can specify the following options:
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
More information
Password Protecting Your Pages with htaccess
Sending a username and password with PHP cURL
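Put together, a minimal download sketch might look like this (the URL, credentials, and output filename are placeholders):

```php
<?php
// Fetch a password-protected file over HTTP basic auth with cURL.
$username = 'user';     // placeholder credentials
$password = 'secret';
$ch = curl_init('https://www.example.com/securefolder/file1.pdf');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // return the body instead of printing it
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
$data = curl_exec($ch);
curl_close($ch);
if ($data !== false) {
    file_put_contents('file1.pdf', $data);
}
```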
Place your data file outside of the public web space. Read the file using PHP and serve.
There is no reason for the source file to be located within the web host's DocumentRoot unless you want the file to be served publicly.
A PHP script runs as a local user and so is able to read the file even outside the scope of the Apache (or other) web server.
I have a script that allows only authorised users to upload files to a certain folder.
However, I do not know how to prevent people from downloading the files freely without logging in.
I need the solution in php.
I have googled around but found nothing straightforward as yet.
Currently in my document root I have a folder called admin, with a subfolder called uploads inside it, so only the admin role can upload. Both the editor and admin roles can download. What should I do in this case?
Please advise.
Put the files somewhere outside the public webroot directory, or configure your server to not serve the files. As long as your server will happily serve everything with a valid URL, there's nothing you can do with PHP to prevent that.
If your files are in the /public_html/ folder, take them out of that folder and place them in e.g. /secret_files/, so your directory structure looks something like this:
public_html/
    index.html
    admin/
        admin_index.php
secret_files/
    my_secret_file.txt
The webserver is only configured to serve files in the /public_html/ directory (the "web root" mentioned above), so nobody will have access to directories outside it.
To still enable somebody to download those files, do as cletus suggests and use readfile to "manually serve" the files via a PHP script. PHP will still have access to these other parts of the file system, so you can use it as a gatekeeper.
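A gatekeeper along those lines, assuming the login code stores the role in $_SESSION['role'] (that key, and the directory layout above, are assumptions):

```php
<?php
// public_html/download.php -- serves files from ../secret_files/
session_start();
// Both editor and admin may download, per the question.
if (!isset($_SESSION['role'])
        || !in_array($_SESSION['role'], array('admin', 'editor'), true)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$name = basename(isset($_GET['file']) ? $_GET['file'] : '');  // strip any path parts
$path = dirname(__DIR__) . '/secret_files/' . $name;
if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($path);
```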
Don't store the files in a directory under the document root.
Instead move them somewhere else and then a PHP script can programmatically determine if someone can download them and then use readfile() or something similar to stream them to the user.
You could also configure the Web server to not serve files from this directory but then you need PHP to serve them anyway. It's cleaner simply not to put them under the document root.
Answering the question of how to password protect with PHP:
This should solve your problem.