Prevent direct URL access to files - PHP

Background info:
I am working on a website which will provide image and video content via a subscription service. That is, users should ONLY have access to the image and video content as long as they are logged in successfully. (Note: the login system uses a combination of a MySQL DB - to store the username and password - and PHP to create new user sessions, authentication, etc.)
The problem:
How do I stop a user (logged in or not) from directly accessing the image and video files? For example, a user who is not logged in could access a file directly at www.domain.com/testvideo.mp4 - this would render the video content in the browser for them to watch or share with others. (NOTE: I still need to be able to use/display the image and video files on-site via HTML, CSS, PHP, etc.)
I have tried several .htaccess solutions (including RewriteCond/RewriteRule and .htpasswd) which have successfully prevented direct access, BUT they have also prevented the ability to use the files on-site via HTML, CSS, PHP, etc.
I was thinking this must be a very common problem, and if so, what is the best way to resolve it?

It is a pretty common problem with a pretty common solution. In order to force access control you have to invoke a PHP script before serving the file and verify the credentials. Then, if the credentials are valid, serve the actual file.
You may be tempted to serve the file directly from the PHP script using something like readfile. This is going to hurt your server's performance and break download resuming for the client.
Luckily there is a solution, when you can hand over the actual file serving back to the web-server.
This works as follows:
1. The web server receives the request for /file.mp4.
2. According to the rewrite rules you've set up, it directs the request to your PHP script /serve.php instead.
3. Your script verifies the credentials, e.g. something from the session or cookies.
4. If the credentials are valid, the script issues a specially crafted header that tells the web server to serve the static file itself. If not, you may as well output a 403 HTTP code.
The example script can be something like:
<?php
// it is in your best interest to make this file inaccessible for a direct download
$file = '/tmp/file.mp4';
header('X-Sendfile: ' . $file);
header('Content-Type: ' . mime_content_type($file));
header('Content-Disposition: inline;');
In order for this to work you'll have to have mod_xsendfile (https://tn123.org/mod_xsendfile/) installed on your Apache, which is probably already the case with your host. You'll also have to drop in some lines to configure it and set up a proper rewrite.
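A minimal configuration sketch, assuming mod_xsendfile and mod_rewrite are enabled (the paths and the .mp4-only rule are assumptions to adapt to your layout):

# Apache vhost or .htaccess
XSendFile On
XSendFilePath /tmp

RewriteEngine On
# route requests for video files through the authenticating PHP script
RewriteRule ^(.+\.mp4)$ /serve.php?file=$1 [L]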
You can find a lot of stuff on Google by searching for "mod_xsendfile php", which might also help a great deal.
Hope that makes sense!

You cannot avoid that as long as your files are publicly available.
The most common way is to not serve the files directly, but to serve them through PHP so that you can check the user's access before you serve the file. The files can then reside anywhere on the server where the web-server user (www, apache, etc.) has access but the visitor doesn't.
Check the examples in the PHP manual on readfile and header to see how you can serve a file through PHP. You will find lots of examples here on SO as well.
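A minimal sketch of such a pass-through script, assuming a session-based login and a file stored outside the document root (the session key and paths are illustrative):

<?php
session_start();

// deny access unless the user is logged in
if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}

// the file lives outside the web root, where only the web-server user can reach it
$file = '/var/private/testvideo.mp4';

header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: inline; filename="testvideo.mp4"');
readfile($file);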

Related

Securing PDFs in a webserver from public access

I have a website storing personal data about people. All of this information is in a database, and the pages allowing access to this information are password protected. However, recently I have had to keep PDF files that contain some of this information. These PDFs are stored in a folder on the server. I have put an index.html in that folder so that directory listing is prevented.
However, I am worried about website copiers like HTTrack that could do a recursive lookup. I don't have access to .htaccess as the hosting service does not allow it.
How can I store PDFs in a secure way? I am using PHP and MySQL.
apartridge had it right when he suggested you use a PHP script to write the file contents to the HTTP response instead of linking directly to the file.
However, when sensitive files need to be protected from unauthorized access, there is one more step: move the files to a directory on the server that is not accessible from the web. I'm not as familiar with PHP, so I'll use an ASP.NET site as an example, but you should be able to follow along just fine.
In Windows, a path to a website might be C:\inetpub\wwwroot\MyWebsite\Index.html
I would store the files in something like C:\WebsiteFiles\ so those files simply aren't accessible from the web. Now you can control access through a php script and not have to worry about people or scripts guessing filenames.
Firstly - and please don't take offense at this - the fact you're asking this question suggests you are not qualified to write software dealing with personal information. Even if you get this particular issue right, you may be making other mistakes. I'd recommend spending some time on the OWASP website and get a basic understanding of web application security.
Next, you should not put the PDF files on a publicly accessible web folder. If someone forwards a URL (no matter how cleverly hashed the file name is) to someone who shouldn't have access to it, your security model breaks. Disallowing file enumeration is not enough - you should not allow anyone to access the PDF files without entering credentials.
You can do that most easily with an .htaccess file - and if your hosting provider doesn't support that, I'd question their suitability for a project hosting sensitive data.
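For instance, a classic Basic Auth block in .htaccess (the paths and realm name are illustrative):

AuthType Basic
AuthName "Restricted documents"
AuthUserFile /home/user/.htpasswd
Require valid-user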
If you really must, you can create a "pass-through" PHP script. So, if the URL is http://myserver.com/personalPDF.php?personID=JoeBlogs, the script personalPDF.php would use the following pseudocode:
if user is not logged in
    redirect to log-in page
if user does not have access to requested document
    redirect to "unauthorized" page
set PDF mime type
read requested document from disk and send to client
In PHP, the last two lines are something like:
<?php
// map the validated personID to a file stored outside the web root
$file = '/var/private/pdfs/JoeBlogs.pdf';
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Type: ' . mime_content_type($file));
header('Content-Transfer-Encoding: binary');
ob_clean(); // discard any buffered output so the PDF isn't corrupted
flush();
readfile($file);
?>
An index.html file will prevent the directory contents from being listed by the server, but you still have to worry about people guessing the file names. You can store the files under cryptic names; take a look at hash functions to generate random strings.
When you let a user download a file, you should use a PHP script to read the contents of the file and send a correct MIME header. You should not link directly to the cryptic names, to keep those names a secret. The PHP script can then do the proper validation of the users.
But first you should check if your host allows you to put files in a folder that is not publicly available. If you can store the files in a non-public folder, you're in good shape.
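A small sketch of generating such a cryptic stored name (purely illustrative; the original filename would stay in your database):

<?php
// hash a unique, hard-to-guess seed into the stored filename
$storedName = hash('sha256', uniqid((string)mt_rand(), true)) . '.pdf';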

How to protect PHP from the public?

So I'm a bit confused about what crafty users can and can't see on a site.
If I have a file with a bunch of PHP script, the user can't see it just by clicking "view source." But is there a way they can "download" the entire page, including the PHP?
What permission settings should pages be set to if there is PHP script that must execute on load but that I don't want anyone to see?
Thanks
2 steps.
Step 1: So long as your PHP is being processed properly this is nothing to worry about...do that.
Step 2: As an insurance measure, move the majority of your PHP code outside of the web server directory and then just include it from the PHP files that are in the directory. PHP includes via the file system and therefore has access to the files, but the web server does not. On the off chance that the web server gets misconfigured and serves your raw PHP code (this happened to Facebook at one point), the user won't see anything but a reference to a file they can't access.
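A minimal sketch of that layout (the paths are assumptions):

<?php
// public_html/index.php - thin, web-accessible entry point;
// the real application code lives outside the web root
require __DIR__ . '/../app/bootstrap.php';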
PHP files are processed by the server before being sent to your web browser. That is, the actual PHP code, comments, etc. cannot be seen by the client. For someone to access your PHP files, they would have to hack into your server through FTP or SSH or something similar, and at that point you have bigger problems than just your PHP.
It depends entirely on your web server and its configuration. It's the web server's job to take a url and decide whether to run a script or send back a file. Commonly, the suffix of a filename, file's directory, or the file's permission attributes in the filesystem are used to make this decision.
PHP is a server-side scripting language that is executed on the server. There is no way it can be accessed client-side.
If PHP is enabled, and if the files are properly wrapped in PHP tags, none of the PHP code will get past your web server. To make things more secure, disable directory browsing, and put an empty index.php or index.html in all the folders.
Ensure that you adhere to secure coding practices too. There are quite a number of articles on the web. Here is one: http://www.ibm.com/developerworks/opensource/library/os-php-secure-apps/index.html

Securing Uploaded Files (PHP and HTML)

I have a simple site which allows users to upload files (among other things, obviously). I am teaching myself PHP/HTML as I go along.
Currently the site has the following traits:
--When users register, a folder is created in their name.
--All files the user uploads are placed in that folder (with a timestamp added to the name to avoid any issues with duplicates).
--When a file is uploaded, information about it is stored in a SQL database.
simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to only uploading files with .doc and .docx extensions. Would this be enough to guard against C? I would assume not.
There are a number of things you want to do, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script might work, but you have to be very careful not to use include, require or similar constructs that might be tricked into executing code. Instead, you want to open the file, read it and serve it through file operation primitives.
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
The definition of "malicious files" is quite broad. Malicious to whom? You could run an antivirus on upload, for instance, if you are worried about your site being used to distribute malware (you should be). If you want to make sure that people can't harm the server, you have to at the very least make sure they can only upload a limited set of file types. Checking extensions and MIME type is a beginning, but don't trust that alone (you can embed code in a PNG, and it's valid if it's included via include()).
Then there is the problem of XSS, if users can upload HTML content or files that get interpreted as such. Make sure to serve a Content-Disposition header and a non-HTML content type.
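For example, something along these lines (the filename is illustrative):

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.doc"');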
That's a start, but as you said there is much more.
Your biggest threat is going to be if a person manages to upload a file with a .php extension (or some other extension that results in server-side scripting/processing). Any code in that file runs on your server with whatever permissions the web server has (this varies by configuration).
If the end result of the uploads is just that you want to be able to serve the files as downloads (rather than let someone view them directly in the browser), you'd be well off to store the downloads in a non web-accessible directory, and serve the files via a script that forces a download and doesn't attempt to execute anything regardless of the extension (see http://php.net/header).
This also makes it much easier to facilitate only allowing downloads if a person is logged in, whereas before, you would need some .htaccess magic to achieve this.
You should not upload files into directories the web server serves if you do not want them to be publicly available.
I suggest you use X-Sendfile, which is a header that instructs the server to send a file to the user. Your PHP script for fetching a given file would do whatever authentication you have in place (I assume you have something already) and then return the header. As long as the web server can access the file, it will then serve it.
See this question: Using X-Sendfile with Apache/PHP

Allow logged in users to view and download files (some 250+ MB) that would normally be 403 access denied

I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.
I've done some research on here and other sites, and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged-in user, the appropriate headers are given (likely application/octet-stream for automatic downloading), the buffer is flushed, and the file is loaded via readfile.
However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?
If there's an alternate method I missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file through normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so I couldn't access the file, neither could PHP).
Crap, while I'm at it, any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?
You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.
Read about it here: Using X-Sendfile with Apache/PHP
That could indeed become an issue with large files.
Isn't it possible to just use FTP for this? HTTP isn't really meant for large files, but FTP is.
The solution you mentioned is the best possible one when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can protect the directory with a password via an .htaccess file. This way the files won't go through PHP, but honestly there's nothing you should be worried about. I recommend you go with your method.
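If plain readfile ever becomes a memory concern (e.g. with output buffering active), a chunked variant of the same method is a common sketch (the path and chunk size are assumptions):

<?php
$path = '/var/private/video.mp4';

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="video.mp4"');

// stream the file in small chunks so it never sits in memory all at once
$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192);
    flush();
}
fclose($fh);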

How can I protect my site from being leeched?

I am using the header function of PHP to send a file to the browser with some small code. It works well, and I have it so that if anyone requests it with a referer other than my site, it redirects to a page first.
Unfortunately it's not working with Internet Download Manager.
What I want to know is how sites like RapidShare and 4shared do this.
You could use sessions to make sure the download is being requested by a valid user.
Not all browsers / software that can see web pages will send a Referer to your server. Some sites will build a browser "fingerprint", usually hashed, which might be the Referer, User-Agent and a couple of other headers strung together to make a unique identifier for that user, and thus restrict access as you describe.
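A rough sketch of such a fingerprint (which headers to combine is a judgment call, not a standard):

<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$agent   = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

// hash a few request attributes into a per-client identifier
$fingerprint = sha1($referer . '|' . $agent . '|' . $_SERVER['REMOTE_ADDR']);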
Of course, I may have completely missed the point of your post!
A typical design pattern is using a front controller to have a single entry point for all requests. By having a front controller, you can control exactly what the client sees.
You can configure this in Apache so that all requests go through a single file (it's been a while since I've done this because I now concentrate on Java). I think you would need to look at pathinfo documentation for Apache.
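A common sketch of that setup (assuming mod_rewrite; the directives are illustrative):

# .htaccess front controller: send every request that isn't a real file to index.php
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [QSA,L]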
This might require a significant change in the rest of your application code. But, the code will be more secure and maintainable in the long run.
I've served images and other binary files through this pattern. This allowed me to easily verify users were authenticated before actually sending them the file. Obfuscation is not security, so if you rely on obfuscating your URL, an attacker may be delayed in getting in, but it is just a matter of time.
Walter
The problem probably is that sending a file through a PHP script (with the headers you mentioned) doesn't support starting the download at a certain position. Download managers use this feature to download a file using several simultaneous threads (assuming the server limits each connection to a certain speed).
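If you want to keep serving through PHP, a rough sketch of honoring the Range header (simple start-end ranges only; the path is an assumption and access validation is omitted):

<?php
$path = '/var/private/file.zip';
$size = filesize($path);
$start = 0;
$end = $size - 1;

// honor a "bytes=start-end" request so clients can resume
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d*)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    if ($m[1] !== '') { $start = (int)$m[1]; }
    if ($m[2] !== '') { $end = (int)$m[2]; }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Accept-Ranges: bytes');
header('Content-Length: ' . ($end - $start + 1));
header('Content-Type: application/octet-stream');

$fh = fopen($path, 'rb');
fseek($fh, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fh)) {
    $chunk = fread($fh, min(8192, $remaining));
    echo $chunk;
    flush();
    $remaining -= strlen($chunk);
}
fclose($fh);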
For a small project, I would recommend making a copy of the file with a unique filename just for the download period and redirecting the user to this copied file. This way they get the server's full download features, and it also doesn't load the processor the way PHP does. Disadvantages: more disk space required and the need to clean up the download directory.
