I have a website storing personal data about people. All of this information is in a database, and the pages allowing access to this information are password protected. However, recently I have had to keep PDF files that contain some of this information. These PDFs are stored in a folder on the server. I have put an index.html in that folder so that directory listing is prevented.
However, I am worried about website copiers like HTTrack that could do a recursive lookup. I don't have access to .htaccess, as the hosting service does not allow it.
How can I store PDFs in a secure way? I am using PHP and MySQL.
apartridge had it right when he suggested using a PHP script to write the file contents to the HTTP response instead of linking directly to the file.
However, when sensitive files need to be protected from unauthorized access, there is one more step: move the files to a directory on the server that is not accessible from the web. I'm not as familiar with PHP, so I'll use an ASP.NET site as an example, but you should be able to follow along just fine.
In Windows, a path to a website might be C:\inetpub\wwwroot\MyWebsite\Index.html
I would store the files in something like C:\WebsiteFiles\ so those files simply aren't accessible from the web. Now you can control access through a php script and not have to worry about people or scripts guessing filenames.
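In PHP terms, a minimal sketch of that pass-through might look like the following, assuming the files sit in C:\WebsiteFiles\ and you already have a session-based login check (the parameter name and lookup table are invented for illustration):

<?php
session_start();
if (!isset($_SESSION['user_id'])) {                 // stand-in for your real login check
    http_response_code(403);
    exit;
}

// Map request IDs to files that live outside the web root
$documents = array('annual-report' => 'C:\\WebsiteFiles\\annual-report.pdf');   // hypothetical lookup table
$id = isset($_GET['doc']) ? $_GET['doc'] : '';

if (!isset($documents[$id])) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $id . '.pdf"');
readfile($documents[$id]);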
Firstly - and please don't take offense at this - the fact that you're asking this question suggests you are not qualified to write software dealing with personal information. Even if you get this particular issue right, you may be making other mistakes. I'd recommend spending some time on the OWASP website to get a basic understanding of web application security.
Next, you should not put the PDF files on a publicly accessible web folder. If someone forwards a URL (no matter how cleverly hashed the file name is) to someone who shouldn't have access to it, your security model breaks. Disallowing file enumeration is not enough - you should not allow anyone to access the PDF files without entering credentials.
You can do that most easily with an .htaccess file - and if your hosting provider doesn't support that, I'd question their suitability for a project hosting sensitive data.
If you really must, you can create a "pass-through" PHP script. So, if the URL is http://myserver.com/personalPDF.php?personID=JoeBlogs, the script personalPDF.php would use the following pseudocode:
if user is not logged in
redirect to log-in page
if user does not have access to requested document
redirect to "unauthorized" page
set PDF mime type
read requested document from disk and send to client
In PHP, the last two lines are something like:
<?php
// $file would come from the lookup above; keep it outside the public web root
$file = '/var/private_pdfs/JoeBlogs.pdf';

header('Content-Type: ' . mime_content_type($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

readfile($file);
?>
An index.html file will prevent the directory contents from being listed by the server, but you still have to worry about people guessing the names of your files. You can store them under cryptic names; take a look at hash functions for generating unguessable strings.
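For example, a sketch of generating such a cryptic name (the secret string, names and paths below are placeholders):

<?php
// Give the stored PDF an unguessable name and remember the mapping in MySQL
$realName   = 'JoeBlogs.pdf';
$storedName = hash('sha256', $realName . uniqid('', true) . 'some-long-secret') . '.pdf';

// save the file under the cryptic name and record ($realName => $storedName) in your database
rename('/path/to/pdfs/' . $realName, '/path/to/pdfs/' . $storedName);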
When you let a user download a file, you should use a PHP script to read the contents of the file and send the correct MIME header. You should not link directly to the cryptic names, so that these names stay secret. The PHP script can then properly validate the user.
But first you should check if your host allows you to put files in a folder that is not publicly available. If you can store the files in a non-public folder you're in good shape.
Related
Background info:
I am working on a website which will provide image and video content via a subscription service. That is, users should ONLY have access to the image and video content as long as they are logged in successfully. (Note: the login system uses a MySQL database to store the username and password, and PHP to create user sessions, handle authentication, etc.)
The problem:
How do I stop a user (logged in or not) from directly accessing the image and video files? For example, a user who is not logged in could access the file directly as follows: www.domain.com/testvideo.mp4 - this would render the video content in the browser for them to watch or share with others. (NOTE: I still need to be able to use / display the image and video files on-site via HTML, CSS, PHP etc)
I have tried several .htaccess solutions (including RewriteCond/RewriteRule and .htpasswd), which have successfully prevented direct access BUT have also prevented the ability to use the files on-site via HTML, CSS, PHP etc.
I imagine this must be a very common problem, so what is the best way to resolve it?
It is a pretty common problem with a pretty common solution. In order to force access control you have to invoke a PHP script before serving the file and verify the credentials. Then, if the credentials are valid, serve the actual file.
You may be tempted to serve the file directly from the PHP script using something like readfile. This is going to hurt your server performance and break download resuming (HTTP range requests) for the client.
Luckily there is a solution, when you can hand over the actual file serving back to the web-server.
This works as follows:
1. The web-server receives the request for /file.mp4.
2. According to the rewrite rules you've set up, it directs the request to your PHP script /serve.php instead.
3. Your script verifies the credentials, e.g. something from the session or cookies.
4. If the credentials are valid, the script issues a specially crafted header that tells the web-server to serve the actual static file itself. If not, you may as well output a 403 HTTP code.
The example script can be something like:
$file = '/tmp/file.mp4'; // it is in your best interest to keep this file inaccessible for a direct download
header('X-Sendfile: ' . $file);                       // hand the actual transfer back to the web-server
header('Content-Type: ' . mime_content_type($file));
header('Content-Disposition: inline;');
In order for this to work you'll need mod_xsendfile (https://tn123.org/mod_xsendfile/) installed on your Apache, which may already be the case with your hoster. You'll also have to drop in a few lines to configure it and set up a proper rewrite.
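A rough sketch of the Apache side (hedged: directive contexts vary between hosts, and XSendFilePath may have to go in the vhost config rather than .htaccess; paths and the rewrite pattern are only placeholders):

XSendFile On
XSendFilePath /tmp                        # whitelist the directory the files are read from

RewriteEngine On
RewriteRule ^file\.mp4$ serve.php [L]     # route the media request through the PHP gate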
You can find a lot of material on Google by searching for "mod_xsendfile php", which might also help a great deal.
Hope that makes sense!
You cannot avoid that as long as your files are publicly available.
The most common way is to not serve the files directly, but to serve them through PHP so that you can check the user's access before serving the file. The files can then reside anywhere on the server where the web-server user (www, apache, etc.) has access but the visitor doesn't.
Check the examples in the PHP manual for readfile and header to see how you can serve a file through PHP. You will find lots of examples here on SO as well.
I have a simple site which allows users to upload files (among other things obviously). I am teaching myself php/html as I go along.
Currently the site has the following traits:
--When users register, a folder is created in their name.
--All files the user uploads are placed in that folder (with a time stamp added to the name to avoid any issues with duplicates).
--When a file is uploaded information about it is stored in an SQL database.
simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to only uploading files with .doc and .docx extensions. Would this be enough to guard against C? I would assume not.
There are a number of things you want to do, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script might work, but you have to be very careful not to use include, require or similar constructs that might be tricked into executing code. Instead, open the file, read it and serve it through file-operation primitives such as readfile.
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
The definition of "malicious files" is quite broad. Malicious for who? You could run an antivirus on the uplaod, for instance, if you are worried about your side being used to distribute malwares (you should). If you want to make sure that people can't harm the server, you have at the very least make sure they can only upload a bunch of filetypes. Checking extensions and mimetype is a beginning, but don't trust that (you can embed code in png and it's valid if it's included via include()).
Then there is the problem of XSS, if users can upload HTML content or anything that gets interpreted as such. Make sure to serve a Content-Disposition header and a non-HTML Content-Type.
That's a start, but as you said there is much more.
Your biggest threat is going to be if a person manages to upload a file with a .php extension (or some other extension that results in server side scripting/processing). Any code in the file runs on your server with whatever permissions the web server has (varies by configuration).
If the end result of the uploads is just that you want to be able to serve the files as downloads (rather than let someone view them directly in the browser), you'd be well off to store the downloads in a non web-accessible directory, and serve the files via a script that forces a download and doesn't attempt to execute anything regardless of the extension (see http://php.net/header).
This also makes it much easier to facilitate only allowing downloads if a person is logged in, whereas before, you would need some .htaccess magic to achieve this.
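A hedged sketch of such a download script, assuming the per-user upload folders described in the question sit outside the web root and the session holds the username (all names and paths here are placeholders):

<?php
session_start();
if (empty($_SESSION['username'])) {          // stand-in for your real login check
    http_response_code(403);
    exit;
}

// basename() strips any directory parts, which blocks path traversal attempts
$requested = isset($_GET['f']) ? $_GET['f'] : '';
$file = '/var/user_uploads/' . $_SESSION['username'] . '/' . basename($requested);

if (!is_file($file)) {
    http_response_code(404);
    exit;
}

// Always force a download; never let the server interpret the file, whatever its extension
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));
readfile($file);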
You should not upload to directories the web server serves from if you do not want the files to be publicly available.
I suggest you use X-Sendfile, which is a header that instructs the server to send a file to the user. Your PHP script called 'fetch so-and-so file' would do whatever authentication you have in place (I assume you have something already) and then return the header. So long as the web server can access the file, it will then serve the file.
See this question: Using X-Sendfile with Apache/PHP
Background: I have a website where people can store transactions. As part of this transaction, they could attached a receipt if they wanted.
Question: Is there any security risk if a user is allowed to upload any type of file extension to my website?
Info:
The user will be the only person to ever re-download the same file
There will be no opportunity for the user to "run" the file
They will only be able to download it back to themselves.
No other user will ever have access to another user's files
There will be a size restriction (say 2 MB)
More info: I was originally going to restrict the files to "pdf/doc/docx" - but then realised some people might want to store a jpg, or a .xls etc - and realised the list of files they "might" want to store is quite large...
edit: The file will be stored outside public_html - and served via a "readfile()" function that accepts a filename (not a path) - so is there anything that can 'upset' readfile()?
Yes, it is definitely a security risk unless you take precautions. Let's say that, to re-download the file, the user has to go to example.com/uploads/{filename}. The user could upload a malicious PHP file and then 're-download' it by going to example.com/uploads/malicious.php. This would, of course, cause the PHP script to execute on your server, giving him enough power to completely wreck everything.
To prevent this, create a script that receives the filename as a parameter and then serves the file to the user with the correct content type.
Something like: example.com/files?filename=malicious.php
"There will be no opportunity for the user to "run" the file"
As long as you are 100% sure that that will hold true, it is secure. However, make sure the file will not be able to be executed by the webserver. For example, if the user uploads a .php file, make sure the server does not execute it.
Computers don't run programs magically by themselves, so basically you just need to ensure that the user has no ability to trick your server into running the file. This means making sure the proper handlers are disabled if the files are under the web root, or passing them through a proxy script if they are not (basically echo file_get_contents('/path/to/upload') with some other logic)
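If the uploads do sit under the web root, one hedged way to disable those handlers for that directory (Apache with mod_php assumed; directive availability depends on your host) is an .htaccess along these lines:

# uploads/.htaccess -- treat everything in this folder as inert data
php_flag engine off
RemoveHandler .php .phtml .php5
RemoveType .php .phtml .php5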
Another option would be to store the file like name.upload but this would require keeping a list of original names that map to the storage names.
I have a folder (/files) containing tons of files that users can download. I want users to be able to download only their own files and not be able to see other people's files.
For example:
User A can only view and download:
- file1.doc
- file2.jpg
User B can only view and download:
- file3.txt
- file4.jpeg
User C can only view and download:
- file1.doc
- file2.jpg
- file3.txt
My idea was to put all the files in the same folder so all users know where to go. My question is: can I use .htaccess for this, or should I build a PHP script? And what about security (which one is more secure)?
Thanks
Is it an open directory, to start with? What you could do is create a subfolder for each user, put their files in there and then assign appropriate permissions in .htaccess for those folders. However, this would require some security integration with your OS (i.e., users would have to have accounts on your machine, not just in your web application).
A quick and dirty -- and insecure -- alternative would be to prepend all uploaded filenames with the username (e.g., 'file1.jpg' uploaded by 'foobar' could be stored as 'foobar.file1.jpg'); then it's just a case of your PHP script returning only those files that carry the respective username, and perhaps stripping that part out when displaying (or again, you could use folders, as long as your script can create a new folder per user when one doesn't exist).
Another option, which is slightly more secure, is to create a hash from the filename and the username, store it in a database, rename all uploaded files to this hash and then query the database appropriately.
The best solution would definitely be OS-managed accounts, as I first mentioned, but it entails more overhead.
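A rough sketch of that last, hash-based option at upload time (the table, column and path names are invented for illustration):

<?php
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'dbuser', 'dbpass');   // placeholder credentials

$original = $_FILES['upload']['name'];                 // assumes a form field named "upload"
$stored   = md5(uniqid(mt_rand(), true));              // unguessable on-disk name

// Remember who owns which file and what it was originally called
$stmt = $pdo->prepare('INSERT INTO user_files (user_id, original_name, stored_name) VALUES (?, ?, ?)');
$stmt->execute(array($_SESSION['user_id'], $original, $stored));

move_uploaded_file($_FILES['upload']['tmp_name'], '/var/file_store/' . $stored);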
Build a PHP script where you use readfile to send the file to the browser. This way you can restrict access for individual files, and use the authentication system you already have.
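A minimal sketch of what that could look like, assuming a user_files table that records which user owns which file (every table, column and path name here is made up):

<?php
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'dbuser', 'dbpass');   // placeholder credentials

// Only hand out files that belong to the logged-in user
$stmt = $pdo->prepare('SELECT stored_name, original_name FROM user_files WHERE user_id = ? AND original_name = ?');
$stmt->execute(array($_SESSION['user_id'], isset($_GET['file']) ? $_GET['file'] : ''));
$row = $stmt->fetch();

if (!$row) {
    http_response_code(404);          // don't reveal whether the file exists for someone else
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($row['original_name']) . '"');
readfile('/var/file_store/' . $row['stored_name']);    // directory outside the web root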
You can certainly use either .htaccess or PHP for this. Neither is more secure, as far as I know, than the other - though done wrong, both can permit access where none is intended!
PHP might be marginally better, since you have more flexibility (in terms of integrating it with other PHP authentication, say) and you can put the folder outside the usual web root, which is good practice anyway.
I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.
I've done some research on here and other sites and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged in user, the appropriate headers are given (likely application/octet-stream for automatic downloading), the buffer flushed, and the file is loaded via readfile.
However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?
If there's an alternate method I missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file over normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so I can't access the file, neither can PHP).
While I'm at it, any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?
You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.
Read about it here: Using X-Sendfile with Apache/PHP
That could indeed become an issue with large files.
Isn't it possible to just use FTP for this?
HTTP isn't really meant for large files but FTP is.
The solution you mentioned is the best possible one when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can protect the directory with a password via an .htaccess file. That way the files won't go through PHP, but honestly there's nothing you should be worried about. I recommend you go with your method.
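For the 250-750 MB files, one common pattern is to stream in chunks rather than relying on a single readfile() call, so output buffering never has to hold the whole file; a hedged sketch (paths, chunk size and headers are only examples):

<?php
// After your login check has passed, stream the large file in chunks
$file = '/var/private/video.mp4';                // stored outside the public directory

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

set_time_limit(0);                               // large transfers can outlast the default time limit
$fp = fopen($file, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);                       // 8 KB at a time
    flush();                                     // push each chunk to the client as it is read
}
fclose($fp);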