I have a framework that I've written. I created a package for that framework that lets me track my employees' hours and gives them a place to dump files for the accountant (invoices, etc.).
The problem is that these file dumps are accessible through the browser. I could use a .htaccess file to prevent the files from being served at all, but I would like the accountant and the employees to be able to download their files.
A solution might be to read the files with PHP, create a temporary copy, have the user or accountant download that copy, then delete the copy...but this poses two problems.
1) It's going to be time- and resource-intensive, especially for large files.
2) For however short a time, there will be a copy of the file accessible to anyone who knows the URL.
The other solution would be to put the files outside of the public folder, but the problem is that I would like this package to be portable, and a lot of my servers are shared.
What method could I use to serve the files only to authenticated users, while avoiding the flaws described above?
Related
I'm making a web application which will only allow registered members to download zip folders from a folders directory.
I really need to know the proper way to secure the folder: only members stored in my database should be able to access the files, but as it stands, if somebody finds the directory and a file name, there's nothing to stop them accessing it.
I've been doing some research and found some approaches but they all have major drawbacks.
1.) Put the files outside of the web root, then use readfile() to send the data.
This is how I currently have it set up. The major drawback is that I'm on a shared server and the max execution time for a script is 30 seconds (it can't be changed), so if the file is big or the user's connection slow, the script times out before the download completes.
2.) .htaccess and .htpasswd inside a web root directory.
The problem with this is that I don't want to ask the user for a password again, unless there's a way for PHP to supply the password and then send a header to the actual zip file that needs to be downloaded.
3.) Keeping the files in the web root but obfuscating the file names so they are hard to guess.
This is just totally lame!
What I really would like to do is keep the files outside of the web root, then just send a header('Location: ...') to that document to force a download. Obviously, as it's not in the web root, the browser won't see it. Is there a way around this? Is there a way to redirect to a file outside the web root with header('Location: /file') to force a download, thus letting Apache serve the file rather than PHP with readfile()?
Is there some easier way to secure the folders and serve them with Apache that I'm just not coming across? Has anybody run into this problem before, and is there an industry-standard way to do this better?
I know this may resemble a repeat question but none of the answers in the other similar question gave any useful information for my needs.
What I really would like to do is keep the files outside of the web root, then just send a header('Location: ...') to that document to force a download; obviously, as it's not in the web root, the browser won't see it.
More to the point, it is outside the web root, so it doesn't have a URL that the server can send in the Location header.
Is there a way around this? Is there a way to redirect to a file outside the web root with header('Location: /file') to force a download?
No. Preventing the server from simply handing over the file is the point of putting it outside the web root. If you could redirect to it, then you would just be back in "hard to guess file name" territory with the added security flaw of every file on the server being public over HTTP.
Is there some easier way to secure the folders and serve with Apache that I am just not coming across?
Your options (some of which you've expressed already in the form of specific implementations) are:
- Use hard-to-guess URLs
- Put the file somewhere Apache won't serve it by default, and write code that serves it for you
- Use Apache's own password-protection options
There aren't any other approaches.
Is there some easier way to secure the folders and serve with Apache that I am just not coming across?
No, there isn't an easier way (but that said, all three implementations you've described are "very easy").
Another approach, which I consider really dirty but might get around your resource constraints:
- Keep the files outside the web root
- Configure Apache to follow symlinks
- On demand, create a symlink from under the web root to the file you want to serve
- Redirect to the URI of that symlink
- Have a cron job run every 5 minutes to delete old symlinks (put a timestamp in the symlink filename to help with this)
It's effectively a combination of the first two options in my previously bulleted list.
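A minimal sketch of those steps, assuming files live in a private directory outside the web root and a symlink directory under it; all paths and names here are placeholders:

```php
<?php
// Assumed layout: real files live outside the web root (e.g. /var/private),
// and $publicDir is a web-served directory where Apache has
// "Options +FollowSymLinks" enabled.
function publish_via_symlink(string $privateDir, string $publicDir, string $filename): string
{
    $source = $privateDir . '/' . basename($filename); // basename() blocks path traversal
    if (!is_file($source)) {
        throw new RuntimeException('No such file');
    }

    // Timestamp plus random token in the link name: hard to guess, and the
    // cron job can parse the timestamp to delete links older than 5 minutes.
    $link = $publicDir . '/' . time() . '-' . bin2hex(random_bytes(8))
          . '-' . basename($filename);
    if (!symlink($source, $link)) {
        throw new RuntimeException('Could not create symlink');
    }
    return $link;
}
```

The caller would then redirect with something like `header('Location: /tmp-links/' . basename($link));` so Apache, not PHP, streams the bytes.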
I have a simple site which allows users to upload files (among other things obviously). I am teaching myself php/html as I go along.
Currently the site has the following traits:
- When users register, a folder is created in their name.
- All files the user uploads are placed in that folder (with a timestamp added to the name to avoid issues with duplicates).
- When a file is uploaded, information about it is stored in a SQL database.
Simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to uploading only files with .doc and .docx extensions. Would this be enough to guard against C? I would assume not.
There are a number of things you want to do, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script can work, but you have to be very careful not to use include, require or similar constructs that might be tricked into executing code. Instead, open the file, read it and serve it through plain file primitives such as fopen() and fread().
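For the indexing part, a robots.txt at the site root along these lines would ask crawlers to stay out of the uploads directory (the path is an assumption, and remember robots.txt is advisory only; it is no substitute for real access control):

```
User-agent: *
Disallow: /uploads/
```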
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
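To illustrate the path-traversal point, here is one common way to validate a requested filename against a fixed storage directory (the directory layout is an assumption):

```php
<?php
// Resolve the requested name inside the storage directory, then verify the
// canonical path still lies inside it. This rejects requests such as
// "../../etc/passwd" even when they point at real files.
function resolve_upload(string $storageDir, string $requested): ?string
{
    $base = realpath($storageDir);
    $path = realpath($storageDir . '/' . $requested);
    if ($base === false || $path === false) {
        return null; // directory or file does not exist
    }
    // The resolved path must start with the storage dir plus a separator.
    if (strncmp($path, $base . DIRECTORY_SEPARATOR, strlen($base) + 1) !== 0) {
        return null; // traversal attempt
    }
    return $path;
}
```

The download script would call this before ever touching the filesystem, and treat a null result as a 404.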
The definition of "malicious files" is quite broad. Malicious to whom? You could run an antivirus scan on upload, for instance, if you are worried about your site being used to distribute malware (you should be). If you want to make sure people can't harm the server, you have to at the very least ensure they can only upload a limited set of file types. Checking extensions and MIME types is a start, but don't trust it (you can embed code in a PNG, and it's valid PHP if it's pulled in via include()).
Then there is the problem of XSS, if users can upload HTML content or anything that gets interpreted as such. Make sure to serve a Content-Disposition header and a non-HTML Content-Type.
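Those headers could look like the following sketch; building them as strings first keeps the example testable, and the download name is whatever your app decides (an assumption here):

```php
<?php
// Headers that force a download and stop the browser from rendering or
// content-sniffing the upload as HTML, defusing stored XSS via uploads.
function download_headers(string $downloadName): array
{
    return [
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . basename($downloadName) . '"',
        'X-Content-Type-Options: nosniff',
    ];
}

// Usage in the serving script, before any output:
// foreach (download_headers($name) as $h) { header($h); }
```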
That's a start, but as you said there is much more.
Your biggest threat is a person managing to upload a file with a .php extension (or some other extension that triggers server-side scripting/processing). Any code in that file runs on your server with whatever permissions the web server has (this varies by configuration).
If the end result of the uploads is just that you want to serve the files as downloads (rather than let someone view them directly in the browser), you'd be well off storing them in a non-web-accessible directory and serving them via a script that forces a download and doesn't attempt to execute anything, regardless of extension (see http://php.net/header).
This also makes it much easier to allow downloads only when a person is logged in, whereas otherwise you would need some .htaccess magic to achieve the same thing.
You should not upload to web-served directories if you do not want the files to be directly available.
I suggest you use X-Sendfile, a header that instructs the server to send a file to the user. Your PHP script for "fetch so-and-so file" would do whatever authentication you have in place (I assume you have something already) and then emit the header. As long as the web server can access the file, it will then serve it.
See this question: Using X-Sendfile with Apache/PHP
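A sketch of that flow, assuming mod_xsendfile is installed and XSendFilePath is configured to cover the private directory (both assumptions; without them the header is silently ignored):

```php
<?php
// After your own auth check, build the headers that hand the transfer off
// to Apache's mod_xsendfile; PHP itself sends no body at all.
function xsendfile_headers(string $path, string $downloadName): ?array
{
    if (!is_file($path)) {
        return null; // let the caller send a 404 instead
    }
    return [
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . basename($downloadName) . '"',
        'X-Sendfile: ' . $path, // Apache streams the file from disk
    ];
}

// Usage: foreach (xsendfile_headers($path, $name) ?? [] as $h) { header($h); }
```

Because Apache does the streaming, the 30-second PHP execution limit and PHP memory use stop being a concern for big downloads.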
I have a folder (/files) with tons of files that users can download. I want users to be able to download only their own files, and not be able to see other people's files.
For example:
User A can only view and download:
- file1.doc
- file2.jpg
User B can only view and download:
- file3.txt
- file4.jpeg
User C can only view and download:
- file1.doc
- file2.jpg
- file3.txt
My idea was to put all files in the same folder so all users know where to go. My question is: can I use .htaccess, or should I build a PHP script for this? And what about security (which one is more secure)?
Thanks
Is it an open directory, to start with? What you could do is create a subfolder for each user, put their files in there, and then assign appropriate permissions in .htaccess for those folders. However, this would require some security integration with your OS (i.e., users would have to have accounts on your machine, not just in your web application).
A quick and dirty (and insecure) alternative would be to prepend all uploaded filenames with the username (e.g., 'file1.jpg' uploaded by 'foobar' becomes 'foobar.file1.jpg'); then it's just a case of your PHP script returning only the files with the matching username, and perhaps stripping that prefix out when displaying. (Or again, you could use folders, as long as your script can create a new folder per user when one doesn't exist.)
Another, slightly more secure option is to store a hash of the file and usernames in a database, rename all uploaded files with this hash, and then query the database appropriately.
The best solution would definitely be OS-managed accounts, as I first mentioned, but that entails more overhead.
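The hash-rename idea could be sketched like this (the naming scheme and salt are assumptions, and the original-name-to-hash mapping is assumed to live in your database):

```php
<?php
// Derive an unguessable on-disk name from the owner, the original name and
// a random salt; only the database knows which hash maps to which file.
function stored_name(string $username, string $originalName): string
{
    $ext  = pathinfo($originalName, PATHINFO_EXTENSION);
    $hash = hash('sha256', $username . '|' . $originalName . '|' . random_bytes(16));
    return $ext === '' ? $hash : $hash . '.' . $ext;
}
```

Keeping the extension makes it easy to restrict what the listing script will serve, while the random salt means even identical uploads get distinct names.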
Build a PHP script that uses readfile() to send the file to the browser. This way you can restrict access to individual files, and use the authentication system you already have.
You can certainly use either .htaccess or PHP for this. Neither is, as far as I know, more secure than the other; though done wrong, both can permit access where none is intended!
PHP might be marginally better, since you have more flexibility (in terms of integrating it with other PHP authentication, say) and you can put the folder outside the usual web root, which is good practice anyway.
I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.
I've done some research on here and other sites and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged in user, the appropriate headers are given (likely application/octet-stream for automatic downloading), the buffer flushed, and the file is loaded via readfile.
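In code, the approach I'm describing would look something like this sketch (paths and chunk size are placeholders; a chunked fopen()/fread() loop with explicit flushes is a common refinement of the single readfile() call, so data keeps moving to the client instead of piling up in output buffers):

```php
<?php
// Stream a file to the client in 8 KB chunks, flushing after each chunk,
// so memory use stays flat even for multi-hundred-MB files.
function stream_file(string $path): void
{
    $fh = fopen($path, 'rb');
    if ($fh === false) {
        return;
    }
    while (!feof($fh)) {
        echo fread($fh, 8192);
        flush(); // push this chunk out to the client before reading more
    }
    fclose($fh);
}
```

The download headers (Content-Type, Content-Disposition) would be sent before calling this, and the auth check before that.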
However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?
If there's an alternate method I missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file over normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so I can't access the file, neither can PHP).
Crap, while I'm at it: any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?
You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.
Read about it here: Using X-Sendfile with Apache/PHP
That could indeed become an issue with large files.
Isn't it possible to just use FTP for this?
HTTP isn't really meant for large files, but FTP is.
The solution you mentioned is the best possible when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can password-protect the directory via a .htaccess file. That way the files won't go through PHP, but honestly there's nothing you should be worried about. I recommend you go with your method.
I have a site that allows people to upload files to their account, and they're displayed in a list. Files for all users are stored on different servers, and they move around based on how popular they are (its a file hosting site).
I want to add the ability for users to group files into folders. I could go the conventional route and create physical folders on the hard drive for each user on the server, and traverse them as expected. The downside is that a user's files would be bound to a single server. If that server starts running out of space (or many of its files get popular at the same time), it will get very tricky to mitigate.
What I thought about doing is keeping the files location-independent, allowing them to be stored on any of the file servers, and simply storing the folder ID (in addition to the ID of the user who owns the file) with each file in the database. So when a user decides to move a file, nothing physically moves anywhere; you just change the folder ID in the database.
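Concretely, that "move" would be a single UPDATE; a sketch with PDO, where the table and column names are made up for illustration:

```php
<?php
// "Move" a file by repointing its folder_id; no bytes move on disk.
// Scoping the UPDATE by user_id stops users from moving files they
// don't own.
function move_file(PDO $db, int $fileId, int $userId, int $targetFolderId): bool
{
    $stmt = $db->prepare(
        'UPDATE files SET folder_id = :folder WHERE id = :file AND user_id = :user'
    );
    $stmt->execute([
        ':folder' => $targetFolderId,
        ':file'   => $fileId,
        ':user'   => $userId,
    ]);
    return $stmt->rowCount() === 1; // false if the file isn't theirs
}
```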
Is that a good idea? I use PHP and MySQL.
Yes, it is.
I don't see any downside, except maybe more queries to the database; but with proper indexing of the parent folder ID, this will probably be faster than accessing the filesystem directly.
Forget about folders and let users tag their files, multiple tags per file. Then let them view files tagged X. This isn't much harder to implement than virtual folders, but it is much more flexible for the users.
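A tag model like that is a classic many-to-many: a files table, a tags table, and a join table between them (all names below are illustrative, shown with PDO):

```php
<?php
// "View files tagged X" becomes a join across the three tables instead of
// a directory walk; a file can appear under any number of tags.
function files_tagged(PDO $db, int $userId, string $tag): array
{
    $stmt = $db->prepare(
        'SELECT f.name FROM files f
           JOIN file_tags ft ON ft.file_id = f.id
           JOIN tags t       ON t.id = ft.tag_id
          WHERE f.user_id = :user AND t.name = :tag'
    );
    $stmt->execute([':user' => $userId, ':tag' => $tag]);
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}
```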