Prevent users from copying files from website - PHP

I have some images and PDF files on my server which I use in my website.
The path to the images looks like
<img src='/import/folder/image.jpg'>
Every image is associated with a PDF that resides alongside it, so the PDF for the above image will be at /import/folder/pdffile.pdf.
The image source is visible to users, so someone can view the page source, copy the image path and paste it after my base URL.
Suppose my base URL is localhost.com: if someone manually types localhost.com/import/folder/image.jpg, he can access all my images and PDF files, even my whole file system.
How can I prevent users from accessing my file structure?
I am using PHP and CodeIgniter.
Thanks in advance

he can access all my images and PDF files
That is how the web works.
even my whole file system
Not your whole file system, of course; only the files you put into the publicly accessible folder.
How can I prevent users from accessing my file structure?
They don't have access to your file structure, only to the public folder.
You can't prevent users from accessing the public folder, because then your site would stop working.
You have to ask a more specific question: why, and which files, do you want to secure?

In this case it's difficult to prevent people from downloading your images. Since "/import/folder/" is a public folder on your webspace, anything inside it can be fetched directly; you can protect the path with .htaccess.
For your PDF files you could deliver the PDF over PHP, e.g. a "Get PDF 123" link pointing at a script such as getpdf.php?id=123.
In the script you can check whether the user has the rights to download the file and whether the file exists, and return the PDF as application/pdf output:
header('Expires: Thu, 19 Nov 1981 08:52:00 GMT');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: no-cache');
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $filename . '.pdf"');
Then the people can download the file. But in this case you have to verify that the requested PDF actually belongs to the requesting user.
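A minimal sketch of such a script (the name getpdf.php, the user_has_access() helper and the id-to-path mapping are hypothetical, not part of the original answer):
<?php
// getpdf.php - check the user's rights, then stream the PDF (sketch).
session_start();

$id   = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$file = '/var/files/pdf/' . $id . '.pdf'; // hypothetical id-to-path mapping

// user_has_access() stands in for your own permission check.
if ($id <= 0 || !user_has_access($_SESSION, $id) || !is_file($file)) {
    http_response_code(404);
    exit;
}

header('Expires: Thu, 19 Nov 1981 08:52:00 GMT');
header('Cache-Control: must-revalidate');
header('Pragma: no-cache');
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="file-' . $id . '.pdf"');
header('Content-Length: ' . filesize($file));
readfile($file);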
Edit:
Then put a .htaccess file in the folder. To deny access to everything in it:
deny from all
Or, to block only the PDF files:
<FilesMatch "\.pdf$">
Order Allow,Deny
Deny from all
</FilesMatch>

Depending on your server capacity and how big the files are, you could do the following:
Stream both the JPEG and the PDF file using what I call a "data-proxy": a PHP script that reads the file content and streams it back to the browser, like the snippet below (similar to what Stony proposed, although he left the readfile() part out). Be careful to set the correct content type:
$file_path = $download; // path to the file on disk, resolved by your own logic
header('Content-Description: File Transfer');
header('Content-Type: audio/mpeg'); // adjust per file: image/jpeg, application/pdf, ...
header('Content-Disposition: attachment; filename="' . basename($_GET['file']) . '"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file_path));
ob_clean();     // drop anything already buffered
ob_end_flush(); // close the buffer so readfile() streams directly
readfile($file_path);
exit;
Obfuscate the files. Make the filenames something like md5($filename . $salt) and remove the file extension. If you keep the files in different folders (like /images and /pdf), you don't need the extension for the streaming, as you only read the content of the file. You could also place them outside the accessible web space (PHP can still read them, provided open_basedir doesn't forbid it), so no one except you would be able to reach them directly. Use .htaccess to further restrict access to the files as described in other answers. A combined sketch follows this list.
Employ a session check in the above script so only logged-in users get the stream.
Encrypt the files. You could encrypt the whole content of the files, so even if someone got hold of a file, its content would be encrypted; you decrypt it just before streaming. With a secure encryption algorithm this is quite safe. However, it depends on the file sizes and the server capacity to a large extent, as encrypting and decrypting a large file on every download could be a problem.
Make the PDFs password protected. Not really secure, since the password can easily be removed, but it makes basic users run into a wall. You can do that on the server side too, with an automated script.
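A combined sketch of points 1 to 3 above: obfuscated, extensionless names, storage outside the web root, and a session gate. The paths, the salt, and the session key are assumptions, not part of the original answer:
<?php
session_start();

// Session gate: only logged-in users get the stream.
if (empty($_SESSION['logged_in'])) {
    http_response_code(403);
    exit('Login required');
}

$salt     = 'replace-with-a-secret-salt';
$original = 'image.jpg'; // resolved from your database, never from raw user input

// The obfuscated, extensionless name the file was stored under, outside the web root.
$stored = '/home/private_files/' . md5($original . $salt);

if (!is_file($stored)) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg'); // set according to the real file type
header('Content-Length: ' . filesize($stored));
readfile($stored);
exit;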

Put Options -Indexes in a .htaccess file you place in localhost.com/import/folder/ (or higher up the document tree). This disables directory listing in localhost.com/import/folder, so users can no longer browse the folder's contents; files whose exact names they know remain accessible, though.
Preventing users from accessing your files at all is something different; as Stony suggested, you can stream the files using PHP.
Edit: I saw your comment about people "guessing" the URL of an image. You could store all the images under an obfuscated filename, something like an md5 hash of the filename and the upload time combined, and store those names in a database table...

You can use this in the .htaccess
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://your_site_url/.*$ [NC]
RewriteRule \.(jpg)$ - [F]
This prevents direct access to the images, but they remain visible within your website. Note that the Referer header can be spoofed, so this only deters casual hotlinking.


Downloading files with download.php

I need to deliver big files like file.zip (~2 GB) to customers, with a unique URL for each customer. I will then redirect (with .htaccess) a customer download link example.com/download/f6zDaq/file.zip to something like
example.com/download.php?id=f6zDaq&file=file.zip
But as the files are big, I don't want PHP's involvement in the download (instead of just letting Apache handle it) to become a CPU/RAM performance issue for my server. After all, routing it through PHP adds a layer, so it might cause such an issue if not done properly.
Question: among the following solutions, which one(s) are the best practice? (in particular, in terms of CPU/RAM)?
1: PHP solution with application/download
header('Content-Type: application/download');
header('Content-Disposition: attachment; filename=file.zip');
readfile("/path/to/file.zip");
CPU usage measured while downloading: 13.6%.
1bis: PHP solution with application/octet-stream (coming from Example #1 of this page)
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=file.zip');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize('file.zip'));
readfile("/path/to/file.zip");
1ter: PHP solution with application/octet-stream (coming from here):
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=file.zip');
header('Content-Transfer-Encoding: binary'); // additional line
header('Connection: Keep-Alive');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0'); // additional line
header('Pragma: public');
header('Content-Length: ' . filesize('file.zip'));
readfile("/path/to/file.zip");
1quater: Another PHP variant with application/force-download (edited; coming from here):
header("Content-Disposition: attachment; filename=file.zip");
header("Content-Type: application/force-download");
header("Content-Length: " . filesize($file));
header("Connection: close");
2: Apache solution, no PHP involved: let Apache serve the file, and use .htaccess to provide a different URL for the same file (this can be written many ways). In terms of performance, it's similar to letting the customer download example.com/file.zip served directly by Apache.
3: Another PHP solution. This would probably work:
$myfile = file_get_contents("file.zip");
echo $myfile;
but wouldn't this make PHP load the whole content into memory? (Which would be bad in terms of performance!)
4: Just do a header("Location: /abcd/file.zip"); redirection as explained in File with a short URL downloaded with original filename.
Problem with this solution: it discloses the actual location of the file
example.com/abcd/file.zip
to the end user (who can then use or share this URL without authentication), which is not wanted...
But on the other hand, it is much lighter on the CPU, since PHP just redirects the request and doesn't deliver the file itself.
CPU usage measured while downloading: 10.6%.
Note: the readfile doc says:
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
but I wanted to be 100% sure that it won't be slower or more CPU/RAM-hungry than the pure Apache solution.
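For reference, applying the doc's advice before streaming might look like this (a sketch):
<?php
// Close every active output buffer before streaming, as the readfile()
// documentation suggests, so the 2 GB file is never held in RAM.
while (ob_get_level() > 0) {
    ob_end_clean();
}
readfile('/path/to/file.zip');
exit;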
You could use .htaccess to redirect the request to the file while keeping the permalink structure:
RewriteEngine On
RewriteBase /
RewriteRule ^download\/([^\/]+)\/file.zip download.php?id=$1 [L,NC]
Then in your download.php, you can check if the provided id is valid:
// Path to file
$file = 'file.zip';
// If the ID is valid
if ($condition) {
    header("Content-Disposition: attachment; filename=\"" . basename($file) . "\"");
    header("Content-Type: application/force-download");
    header("Content-Length: " . filesize($file));
    header("Connection: close");
    readfile($file); // actually send the file contents
    exit;
} else {
    // Handle invalid ids
    header('Location: /');
}
When the user visits a valid url http://example.com/download/f6zDaq/file.zip, the download will start and the connection will be closed.
If the user visits an invalid url, they will be redirected to the home page.
The biggest problems you're going to face with files of those sizes are the following:
people downloading it with a download manager
interrupted connections
Normally, keep-alive can be a bad idea: it dedicates a connection to a download, which can bog down your network connections instead of letting them be freed up quickly. However, if you expect all of your files to be large, it is your friend, because you don't want people restarting those downloads. Keep-alive makes those download connections more reliable and easier for the client to resume, which helps reduce the number of people re-downloading massive files.
As such, of your presented options, I recommend
1ter
However, like others here, I still recommend that you test your solutions, preferably from a location separate from the one you're serving the files from.
Addendum:
That said, serving files with PHP isn't the best idea unless you need its header-control features or .htaccess-style per-request logic, because it just adds more processing. By far the better path is simply to keep the files in an accessible directory; .htaccess can rewrite access to files and folders, not just PHP scripts.
To create Apache-based protected download folders instead:
Options +FollowSymLinks
RewriteEngine On
RewriteRule ^/user/files/folder1.*$ http://example.com/userfiles/ [R=301,L]
Then, if you need to password-protect it, instead of using PHP, use Apache (which is already installed alongside most PHP installations). You do this by placing a .htaccess file in the targeted folder (if you're creating users dynamically, you may need a script that generates these for each new user) and making sure Apache is set up to handle passwords:
AuthType Basic
AuthName "Authentication Required"
AuthUserFile "/user/password/.htpasswd"
Require valid-user
(See here for more detail: Setting up Apache Passwords)
After this point, make sure there is an .htpasswd file in the password directory with one username:hashedpassword entry per line,
e.g.:
andreas:$apr1$dHjB0/..$mkTTbqwpK/0h/rz4ZeN8M0
john:$apr1$IHaD0/..$N9ne/Bqnh8.MyOtvKU56j1
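If you create users dynamically from PHP, one possible way to generate such entries is bcrypt via password_hash(); Apache 2.4's basic auth accepts bcrypt ($2y$) hashes, though you should verify that for your Apache version. This sketch reuses the .htpasswd path from above:
<?php
// Append a bcrypt entry for a new user to the .htpasswd file (sketch).
$user = 'andreas';
$hash = password_hash('the-users-password', PASSWORD_BCRYPT);
file_put_contents('/user/password/.htpasswd',
    $user . ':' . $hash . "\n", FILE_APPEND | LOCK_EX);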
Now, assuming you don't want them to type the password every single time, you can embed the credentials in the download link itself (hopefully presented behind a password-protected interface), using a URL of the form:
http://username:password@example.com/userfiles/file.zip
[Note: Do NOT use the direct password link method if passwords are not randomly assigned per file.]
OR, if you're populating users from Apache's own password management AND your site uses Apache for its login process, they might not need the user:pass part of the link at all, having already logged in with Apache.
NOTICE:
This said, the files will be accessible to anyone the full link (with username/password) is shared with. So they'll be as secure (or as insecure) as your server's HTTPS (or HTTP, if you allow it) protocol, and as your users' willingness not to share links.
Doing it this way, the files will be open to the users they're meant for, with the full capabilities of the web available to them: download helpers, browser plugins that assist, REST calls, and more, depending on your users' use cases. This can reduce security, which may or may not be a big deal depending on what you're hosting. If you're hosting private medical data (few users, high security, lower speed demands), I wouldn't do it this way. If you're hosting music albums (many users, lower security, high speed demands), I'd totally do it this way.
I would go with readfile(). I used it for years and never had memory issues, even running on a 128 MB VPS.
Using PHP means you can easily handle authentication, authorization, logging, adding and removing users, expiring URLs and so on. You could use .htaccess for all that, but you would have to write a rather large structure to handle it.
You can use X-Accel-Redirect when your web server is Nginx (the location you redirect to is typically marked internal in the Nginx config). For Apache it's mod_xsendfile with the X-Sendfile header.
<?php
header('X-Accel-Redirect: /download/f6zDaq/file.zip');
It costs less and performs better, because the web server itself handles the file.
Memory & CPU wise you should probably go with readfile() or write some custom code using fopen() and fread() with custom buffer size.
Regarding the headers you send: they do not impact the performance of the script; they just tell the client what to do with the server response (in your case, the file). You can Google each header and see exactly what it does.
You should probably have a look at this: Is there a good implementation of partial file downloading in PHP?. The things that might interest you there: download-range and resume support, ways to do this using web server plugins, and PEAR packages or libraries that offer the functionality you need.
As mentioned in Fastest Way to Serve a File Using PHP, I finally did this:
apt-get install libapache2-mod-xsendfile
a2enmod xsendfile # (should be already done by previous line)
Then I added this in apache2.conf:
<Directory />
AllowOverride All
Require all granted
XSendFile on
XSendFilePath /home/www/example.com/files/
</Directory>
I then did service apache2 restart and included this in .htaccess:
RewriteRule ^(.*)$ download.php?file=$1 [L,QSA]
and this in the download.php:
header("X-Sendfile: /home/www/example.com/files/hiddenfolder_w33vbr0upk80/" . $file);
header("Content-type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . $file . '"');
NB: strangely, even though I have AllowOverride All enabled in the apache2.conf VirtualHost, putting
XSendFile on
XSendFilePath /home/www/example.com/files/
just in the /home/www/example.com/.htaccess or /home/www/example.com/files/.htaccess file didn't work (it fails with "XSendFilePath not allowed here").
Benchmark:
10.6% CPU while downloading, exactly as if I did a direct download of the file with Apache (and no PHP at all), so it's all good!

PHP Force Download - Limit possible file download

I'm using the following to force download of MP3 files:
http://www.aaronfagan.ca/blog/2014/how-to-use-php-to-force-a-file-download/
Basically using PHP lines to force a download
<?php
if ($_GET['id']) {
$file = $_GET['id'];
header("Content-Description: File Transfer");
header("Content-Type: application/octet-stream");
header('Content-Disposition: attachment; filename="'.basename($file).'"');
header("Content-Length: ".filesize($file));
readfile($file);
}
else {
header('Location: http://www.mywebsite.com/error/');
}
?>
Am I correct in understanding that anyone who knows how it works could basically use this to download any file from any website?
For example, if I place that file in the root of mywebsite.com, anyone with knowledge could use a link like the following to download any file anywhere?:
http://www.mywebsite.com/download.php?id=http://www.anywebsite/files/file.pdf
Or would it only work on my website?
The files I want users to download are MP3 files. Would there be a way to restrict the type of files download.php will process, so that the Content-Type is set only for MP3 files and the "hack" is contained?
For example if I place that file in the root of mywebsite.com, anyone
with knowledge could use a link like the following to download any
file anywhere?:
http://www.mywebsite.com/download.php?id=http://www.anywebsite/files/file.pdf
If permissions on http://www.anywebsite/files/file.pdf are open (meaning you can open/download file.pdf with a browser), then your script can download it remotely as well (although, as far as I know, basename() is intended for local paths). Usually, though, permissions are denied for direct download, and you can lock down your own permissions too.
Also, if you want, you can add a captcha to your download method to prevent automated grabbing.
Thanks.
Your code works only on your website.
For serving resources from other servers you can use this script Resource-Proxy.
Good Luck
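For the "restrict to MP3" part of the question, here is a sketch of a stricter download.php (the mp3/ folder is an assumption): basename() strips directory components and remote URLs, and only .mp3 files from that one folder are served.
<?php
$dir  = __DIR__ . '/mp3/'; // assumed local folder holding the MP3 files
$name = basename(isset($_GET['id']) ? $_GET['id'] : ''); // drops paths and URLs
$file = $dir . $name;

if (strtolower(pathinfo($name, PATHINFO_EXTENSION)) !== 'mp3' || !is_file($file)) {
    header('Location: http://www.mywebsite.com/error/');
    exit;
}

header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;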

Copy and download file in htaccess protected folder

I have an htaccess password-protected folder with several files in it. Users are not allowed to access all files, but are allowed to download their own.
Since I can't link to the files directly, and copying/removing them isn't a real solution, I thought I'd just open the file using file_get_contents and echo it back into the page with the right headers. But I can't get it working. Here is my code. The problem is that when opening the downloaded file I get a "file is damaged" error from Acrobat.
<?php
$file = "cms/docs/5641-1.pdf";
header('Content-Description: File Transfer');
header('Content-type: application/pdf');
header('Content-Disposition: attachment; filename='.basename("exoticfilename.pdf"));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
if (file_exists($file))
{
echo file_get_contents($file);
}
?>
Also, in this example I'm just using a PDF file, but there are several types of files, so I should probably change the header depending on the file type. Is there a solution for that, or should I just use a very long if/else statement?
If there is another, better way, I am open for that.
UPDATE
The above works, but not with all files. Older PDFs (Acrobat 6) don't work, but Acrobat X files do. The same goes for docx files: some work, others don't. Very weird, since I am able to open all of them directly on my PC. I assume it has something to do with the application/pdf line (or application/vnd.openxmlformats-officedocument.wordprocessingml.document for docx). All others, like images, work.
Since you are using .htaccess/.htpasswd to protect the directory from hot-linking leeches, you are inadvertently blocking access to the files from an outside source such as the client's browser. Because the directory requires authentication to access the files within it, you need to script around it, in a sense authenticating through the script. I have seen it done before, and you can find one of many references on the subject here:
http://koivi.com/php-http-auth/
The bottom line is that .htaccess and .htpasswd overrule your scripts even on the same host machine, as they operate at, for lack of a better description, the server level, and run before PHP even starts processing the page.
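As for setting the header per file type without a long if/else: one option is PHP's fileinfo extension, assuming it is enabled on your host (a sketch):
<?php
$file = "cms/docs/5641-1.pdf";

// Detect the MIME type from the file contents rather than the extension.
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $file); // e.g. "application/pdf"
finfo_close($finfo);

header('Content-Type: ' . $mime);
header('Content-Disposition: attachment; filename="exoticfilename.pdf"');
header('Content-Length: ' . filesize($file));
readfile($file);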

Allow scripts to read a file but prevent users from viewing the file directly

Let's say I have a plain text file example.txt and a PHP script readfile.php on my web server.
What I want is to prevent users from typing http://www.example.com/example.txt and viewing the text file directly, while still letting people load http://www.example.com/readfile.php, which reads from example.txt and does something with it (possibly displaying its contents).
Also, if this can be done, what is the best way to do it?
Yes, this is easy to do.
There are two main ways to stop users from accessing example.txt. The first is to put it in a folder outside your web folder (usually called www or public_html); the second is to put a .htaccess file in the folder containing example.txt that blocks access to the file altogether. The .htaccess would look like:
<files "example.txt">
deny from all
</files>
But you could change example.txt to something like *.txt if you wanted to block all .txt files in the folder.
Then you can use file_get_contents() in your readfile.php to get the contents of the text file, or, if you just want to output the file, you can use readfile():
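A minimal readfile.php along those lines (the relative path assumes example.txt sits next to the script; adjust it if you moved the file outside the web folder):
<?php
// example.txt is blocked from direct HTTP access, but PHP can still read it.
$contents = file_get_contents(__DIR__ . '/example.txt');
echo nl2br(htmlspecialchars($contents)); // "does something with it": display it safely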
Just store the files you don't want publicly accessible outside the webroot.
/home
example.txt
/www
readfile.php
If /home/www/ is your public webroot folder, any file above it is not accessible through the web server. readfile.php can still access the file perfectly fine at ../example.txt though.
If you need to store the files in the webroot, then put the files in a folder and deny access to that folder. If you are using apache, make a .htaccess file in the folder and type in deny from all
I've done something similar where the files contain extremely sensitive information and I only want validated users to be able to retrieve the file through an HTTPS connection.
What I did was this:
I put the files in a directory path outside the scope of what the web server (Apache, for me) can see, so there is no possible URL that will result in the file being served directly by the web server. Then I created a script that lets users log in and click on the file they want; the PHP script reads the file, sends the appropriate headers, and streams the file to the user's computer.
Of course, the script that shows the user the list of files and the script that streams the file out to the user must have at least read access to the files in the path where they are being stored.
Good luck!!
You can put the file example.txt outside of the public folder and read it from readfile.php with $content = file_get_contents("../example.txt");
You can also serve a file like this, so people don't see the real filename:
<?php
$file = 'example.txt';
if (file_exists($file)) {
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($file));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($file));
ob_clean();
flush();
readfile($file);
exit;
}
?>
(source: php.net)
Would that work for you?
Quick hack: rename your file to something like .cannotReadFileWithDOT. The server will refuse to serve files whose names begin with a dot, but your scripts will still be able to read them. The plus is that typical out-of-the-box Apache and nginx configurations prohibit serving files with a leading dot in the name.

Forcing to Download A File

I'm developing a web service. With this service, users will upload their .php files, and the service will remove UTF-8 BOM characters from the PHP file. Then there will be a link like this:
<a href="uploadedfile.php">Download Your File</a>
But when I click this link, the browser navigates to the file instead of downloading it. I don't want it browsed; I want the download to start when the user clicks the link.
Any ideas?
(P.S. I don't want to modify the uploadedfile.php file. Also, I've read 5 questions about this, but I still have the problem.)
You need to supply this HTTP header:
Content-Disposition: attachment; filename=example.txt
You can usually specify this for entire directories at a time by configuring your web server appropriately. If you mention which web server you are using, somebody may be able to suggest how to do this.
The problem is that you're allowing people to upload PHP files to your server and then giving them a link that executes the PHP file. The web server automatically treats those uploaded PHP files like any other PHP file, i.e. it executes them, which opens you up to a massive security hole.
Whatever purpose your web service has, I'd suggest renaming the file on your server when it is uploaded (something 'random' is best, without an extension), then having a PHP script feed it back out with the appropriate headers set when it is requested.
The URL for such a script would look like:
http://www.example.com/get_uploaded_file.php?id=jgh3h8gjdj2389
It would link the value in id with the file on the server, and if you've saved the original filename somewhere (flat file, DB), you can serve it out using its original name, so long as you set the right HTTP headers.
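A sketch of that flow; the upload folder, the token format, and how you persist the id-to-name mapping are all assumptions:
<?php
// On upload: store under a random, extensionless name so it can never execute.
$id = bin2hex(random_bytes(8)); // random token, e.g. "9f86d081884c7d65"
move_uploaded_file($_FILES['file']['tmp_name'], '/var/uploads/' . $id);
// Persist $id => original client filename in a database or flat file here.

// get_uploaded_file.php: look the id up and stream the file back, never execute it.
$id   = preg_replace('/[^a-f0-9]/', '', $_GET['id']); // accept only hex tokens
$path = '/var/uploads/' . $id;
$name = 'original-name.php'; // fetched from your id-to-name mapping in a real app

if ($id !== '' && is_file($path)) {
    header('Content-Type: text/plain');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
}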
Linking directly to the PHP file may end up executing it. One way around this is (as somebody above suggested) to rename it. Or you can have a downloader.php that does the following:
<?php
header('Cache-Control: no-cache, must-revalidate');
header('Expires: Mon, 01 Jan 2000 01:00:00 GMT'); // some date in past
header('Content-type: text/plain');
header('Content-Disposition: attachment; filename='.basename($filepath));
header('Content-Length: ' . filesize($filepath));
flush(); // or any other flush function/mechanism you use.
readfile($filepath);
and link to it with something like:
<a href="downloader.php">Download</a>
This method will let you retain the .php extension. Also, if the PHP file is big and the connection slow, the progress bar will be accurate (because you've flushed the content length up front).
