I have some questions about how folder and file permissions work. Say I have user directories outside 'protected', as below:
users
-- usera
---- docs
-- userb
---- docs
protected
I do not want user B, who does not have the rights, to access anything in user A's directories. I also do not want anyone to access the directories directly via URL links. Basically, I just want users to be able to access their own directories and no one else's. How can it be done?
Thanks!
I answered a similar question here, limiting users to subdirectories, which you should be able to adjust to suit your needs; I've copied it here as well.
Download.php
<?php
/** Load your user; $user is assumed below **/
$file = basename(trim($_GET['file'])); // basename() strips directory components to block traversal
$path = '/users/user'.$user->id.'/'.$file;

if (file_exists($path)) {
    // from http://php.net/manual/en/function.readfile.php
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="'.$file.'"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($path));
    ob_clean();
    flush();
    readfile($path);
    exit;
} else {
    throw new Exception('File Not Found');
}
.htaccess to deny all direct file downloads:
deny from all
You would then link to the files by using /download.php?file=filename.ext, and it would only download that file from the current user's directory.
You'll want to ensure you sanitize the input file name so you're not vulnerable to directory traversal exploits.
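As one way to do that sanitization, here is a minimal sketch; the function name is mine, and you may want stricter rules for your own file naming scheme:

```php
<?php
// Reduce a user-supplied file name to a bare name: strip any directory
// components, and reject empty names, dot files, and embedded null bytes.
function sanitize_filename(string $name): ?string
{
    $name = basename(trim($name)); // drops "../" and any leading path
    if ($name === '' || $name[0] === '.' || strpos($name, "\0") !== false) {
        return null; // reject suspicious input rather than trying to "fix" it
    }
    return $name;
}
```

Returning null for anything suspicious lets the caller respond with an error instead of silently serving a different file.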
Without more info to go on, my suggestion would be to make sure the user directories are above the web root. This will prevent them from being linked to. Then create a PHP script that validates that a user is who they say they are. Once you know the identity of a logged-in user, you can use fpassthru() (http://php.net/fpassthru) or similar to deliver the docs to the user.
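A rough sketch of the fpassthru() part; the Content-Type is assumed for Word docs, and the function name is mine:

```php
<?php
// Stream a file stored outside the web root to the client.
// Returns false if the file cannot be read, so the caller can emit a 404.
function stream_doc(string $path): bool
{
    if (!is_readable($path)) {
        return false;
    }
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    header('Content-Type: application/msword'); // assumed: serving .doc files
    header('Content-Length: ' . (string) filesize($path));
    fpassthru($fp); // copies the rest of the open stream straight to output
    fclose($fp);
    return true;
}
```

Your login check would run before calling this, and the $path would be built server-side from the authenticated user's identity, never from raw request input.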
Related
I created a PHP application that automates the creation of rental documents such as leases, extensions, notices, etc. The application creates and saves the rental documents in a designated directory as a word document.
My application requires the user to login and verifies login using a session variable. My problem is how to protect the /docs/ directory that contains completed rental documents? If someone knew this directory existed, they could simply type it into a browser. I added a blank index.html file to this directory. This keeps the file names from displaying. I'm just wondering what is the best way to protect this directory, since it will contain docs with personal information?
Ryan, thanks for your advice. As you suggested, I saved the files outside of the document root and accessed them with this code.
<?php
$doc = basename($_GET['doc']); // basename() blocks directory traversal
$path = "../test/" . $doc;
header('Content-Description: File Transfer');
header('Content-Type: application/msword');
header('Content-Disposition: attachment; filename="'.$doc.'"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($path));
readfile($path);
?>
To access the files, I include the filename in the URL that links to the above code, e.g. http://example.com/test.php?doc=filename.docx
I have a directory on my server that I want the general public to not have access to. (Nothing. No access to php files, images, anything.)
But I want to allow certain users to access this restricted area based on a boolean value in my php.
Is there any way to use PHP to determine whether or not a user can access a directory, similar to using an .htaccess file but with more customized logic?
One of the easiest ways is to redirect restricted users to your homepage:
<?php
header( 'Location: http://www.yoursite.com' ) ;
exit;
?>
You can allow access to specific users by setting a Boolean value in your DB.
The solution below will work based on your file storage location:
$file = '/file.png'; // set your file location
$userAccess = false;
if (file_exists($file) && $userAccess) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=your_file.png');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
}
Check this link also.
Hope it will help you :)
So I finally found an answer elsewhere and it involves setting up a PHP file as a sort of proxy in conjunction with htaccess - http://simpletek.raivo.id.lv/restrict-server-files-access-based-on-php-logic/
I have a website with different user groups. Each user group has different pages they can access. These pages can contain links to files (documents, pdf, etc.). Each group should now be able to access only their documents in the group-specific folder.
What is the best practice to make this work? The following things came up:
Generate a hash of each uploaded file and name the file according to the hash, so that it can't be found by guessing, and disable directory listing. (Problem: links can be shared, and the files would then be accessible to the public.)
Restrict access with an .htaccess file. (Problem: the user must type in a password each time, and the CMS can't be tied to the .htaccess file - not dynamic.)
Check if cookie exists with .htaccess e.g. http://www.willmaster.com/blog/contentprotection/htaccess-cookie.php (Problem: not dynamic if I create new user groups)
What is the best solution for this problem? Is it any of the mentioned?
What I did was wrap the files in a dynamic script with access control:
if ($userIsLoggedIn)
{
    header('Content-Type: application/pdf');
    header('Content-Transfer-Encoding: binary');
    header('Content-disposition: inline; filename="'.$filenameOfPDF.'"');
    header('Pragma: public');
    readfile($pathToPDF);
}
else echo 'You do not have access';
This is written in PHP. The file itself is stored in a directory which is not accessible from the web.
The only disadvantage is that you would need to do this for every file type. If only access control matters, you could generalize:
if ($userIsLoggedIn)
{
    header('Content-Type: application/octet-stream');
    header('Content-Transfer-Encoding: binary');
    header('Content-disposition: inline; filename="'.$filename.'"');
    header('Pragma: public');
    readfile($path);
}
else echo 'You do not have access';
But I am sure there are other solutions.
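One way to avoid a separate script per file type is to derive the Content-Type from the file extension. A sketch; the map is illustrative, not exhaustive, and the function name is mine:

```php
<?php
// Pick a Content-Type from the file extension so one download script
// can serve several file types; fall back to a generic binary type.
function mime_for(string $filename): string
{
    $map = [
        'pdf'  => 'application/pdf',
        'doc'  => 'application/msword',
        'docx' => 'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
        'png'  => 'image/png',
        'zip'  => 'application/zip',
    ];
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    return $map[$ext] ?? 'application/octet-stream'; // safe generic fallback
}
```

You would then call header('Content-Type: ' . mime_for($filename)); in the generalized version above.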
Hi, I am letting users purchase hidden virtual .mov.zip files with PayPal. I've got the transaction part and the storing of the details in the database done, but after the user comes back to the transaction page I want to link them to the zipped file, which is in a restricted folder (.htaccess deny from all). How can I grant them access to this directory to download the file for a couple of days? I can't temporarily move the file out of the directory because of its very large size (it's a package of HD action effects).
Thank you.
If your hosting allows you to change the PHP setting for the script timeout, you could just stream the file through a PHP script that checks the user's access.
For example, the request :
http://domain.com/download.php?file=be6bc64c94bbc062bcebfb40b4f93304
<?php
session_start();

if (!isset($_GET['file'])) {
    header('Location: index.php'); // invalid request
    exit;
}

// 1. check user session
...
// 2. get the file path from the hash (use a DB like MySQL)
$file = ...;
// 3. check user privilege for the given file
...
// 4. proceed to download
if (file_exists($file)) {
    set_time_limit(0);
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="'.basename($file).'"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
} else {
    // error 404
}
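To honor the "couple of days" window from the question, the privilege check in step 3 could include an expiry test against the purchase timestamp stored with the transaction. A minimal sketch; the function, its parameters, and the two-day default are mine:

```php
<?php
// Allow a download only within $days of the purchase. The purchase
// timestamp would come from your transactions table; it is a plain
// Unix timestamp parameter here to keep the sketch self-contained.
function download_allowed(int $purchasedAt, int $now, int $days = 2): bool
{
    return $now >= $purchasedAt
        && ($now - $purchasedAt) <= $days * 86400; // 86400 seconds per day
}
```

In the script above, you would call download_allowed($row['purchased_at'], time()) and redirect or return a 403 when it is false.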
I have a subdirectory of users that I want to limit each subfolder to that user only.
For example I have /users/user1 where I want to protect the user1 folder so that only user1 can access the files inside.
I tried playing around with an .htaccess and .htpasswd file, but I get prompted to log in a second time even though I have authenticated against a MySQL database.
I'm not sure what to do to have the second login request handled automatically, since the user would already be authenticated.
I can post some code that I have for my .ht files, but I thought that this info could get the ball rolling.
I think that using a PHP proxy to access the files would be sufficient in this case, something along the lines of:
Download.php
<?php
/** Load your user; $user is assumed below **/
$file = basename(trim($_GET['file'])); // basename() strips directory components to block traversal
$path = '/users/user'.$user->id.'/'.$file;

if (file_exists($path)) {
    // from http://php.net/manual/en/function.readfile.php
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="'.$file.'"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($path));
    ob_clean();
    flush();
    readfile($path);
    exit;
} else {
    throw new Exception('File Not Found');
}
.htaccess to deny all direct file downloads:
deny from all
You would then link to the files by using /download.php?file=filename.ext, and it would only download that file from the current user's directory.
You'll want to ensure you sanitize the input file name so you're not vulnerable to directory traversal exploits.
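Beyond stripping the file name, you can also verify that the final resolved path really sits inside the user's directory with realpath(). A sketch; the function name is mine:

```php
<?php
// Check that $requested, resolved relative to $base, stays inside $base.
// realpath() collapses "..", so a traversal attempt resolves outside the
// base directory (or not at all) and is rejected.
function path_is_inside(string $base, string $requested): bool
{
    $baseReal = realpath($base);
    if ($baseReal === false) {
        return false; // base directory itself does not exist
    }
    $real = realpath($baseReal . '/' . $requested);
    return $real !== false
        && strpos($real, rtrim($baseReal, '/') . '/') === 0;
}
```

Note that realpath() returns false for nonexistent files, so this check doubles as the file_exists() test.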