I have a web-based application which serves content to authenticated users by interacting with a SOAP server. The SOAP server has files which the users need to be able to download.
What is the best way to serve these files to users? When a user requests a file, my server will make a SOAP call to pull the file from the SOAP server and then serve it to the user by referencing a link to it.
These temporary files need to be cleaned up at some point, and my first thought was that, this being a Linux-based system, I could store them in /tmp/ and let the system take care of the cleanup.
Is it possible to store these files in /tmp and have Apache serve them to the user?
If Apache cannot access /tmp since it is outside the web root, could I create a symbolic link to /tmp/filename within the web root? (This would still require cleanup of the symbolic links at some point.)
Suggestions/comments on the best way to manage these temporary files are appreciated.
I am aware that I could write a script and have it executed as a cron job at regular intervals, but I was wondering if there is a way, similar to the one presented above, to do this without having to handle deleting the files myself.
There's a good chance that Apache can read the tmp directory, but that approach smells bad. My approach would be to have PHP read the file and send it to the user. Basically, you send out the appropriate HTTP headers to indicate what type of content you're sending and what name to use for the file, and then you just spit out the file with echo (for example).
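For illustration, a minimal sketch of that approach; the path and download name below are placeholders rather than anything from the question:

<?php
// Sketch only: $path and $downloadName stand in for wherever your
// application stores the file it pulled from the SOAP server.
$path = '/tmp/report-12345.pdf';
$downloadName = 'report.pdf';

if (!is_readable($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $downloadName . '"');
header('Content-Length: ' . filesize($path));
readfile($path); // or fopen/fread/echo in a loop, as described above
exit;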
It looks like there's a good discussion of this in another question:
HTTP Headers for File Downloads
An additional benefit of this approach is that it leaves you in full control, because PHP sits between the user and the file. This means you can add extra security measures (e.g., time-of-day controls), pull the file from various places to distribute bandwidth usage, and so on.
[additional material]
Sorry for not directly addressing your question. If you're using PHP to serve the files, they need not reside in the Apache web root, just where Apache/PHP has file-system read access to them. Thus, you can indeed simply store them in /tmp and let the OS clean them up for you. You might want to adjust the frequency of those clean-ups, however, to keep volume at the level you want.
If you want to ensure that access is reliably denied after a period of time or a certain number of downloads, you can store tracking information in your database (e.g., a flag on the user to indicate that they've downloaded the file), and then check it with your download script and possibly deny the download. This effectively separates security of access from frequency of cleanup, two things you may want to adjust independently.
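As a rough sketch of that idea (the table and column names are made up, and the PDO connection in $pdo is assumed to exist already):

<?php
// Hypothetical schema: downloads(user_id, file_id, downloads_used, expires_at)
$stmt = $pdo->prepare('SELECT downloads_used, expires_at FROM downloads WHERE user_id = ? AND file_id = ?');
$stmt->execute([$userId, $fileId]);
$row = $stmt->fetch();

if (!$row || $row['downloads_used'] >= 3 || strtotime($row['expires_at']) < time()) {
    http_response_code(403);
    exit('This download is no longer available.');
}

// Record the download, then hand the file over as in the earlier snippet.
$pdo->prepare('UPDATE downloads SET downloads_used = downloads_used + 1 WHERE user_id = ? AND file_id = ?')
    ->execute([$userId, $fileId]);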
Hope that's more helpful....
I have not been able to find solid information on preferred (best-practice) and/or secure methods of allowing PHP to access config files or other files on a Linux server that are not contained in the public web directory or owned by the Apache user, so I'm hoping to find some answers here.
I am a fairly competent PHP programmer, but I am increasingly tasked with writing web applications (most of which are not publicly accessible via the web) that require updating, changing, or adding to config files, or to files generated by some service or application on the server.
For instance, I need to create a web interface that will view, add or remove entries from a /etc/mail/spamassassin/white-list.cf file owned by root.
Another scenario is that I need PHP to parse MIME messages in /var/vmail that are owned by the user vmail.
These are just a couple of examples; there will be other files in locations owned by other processes/users. How can I write PHP applications that securely access and manipulate these files without opening security holes?
If I needed to implement something like this, I would probably look at using something like sudo to fine-tune permissions. I'm not a Linux CLI expert, so I'm sure there are issues I haven't taken into account while typing this out.
I would probably determine what tasks need to be done and write a separate script for each task. Using sudo, I'd grant the necessary level of permissions to that script only.
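A hedged sketch of what I mean; the helper script path and the sudoers rule are examples I've invented, not something to copy verbatim:

<?php
// Sketch only: /usr/local/sbin/add-whitelist.sh is a hypothetical helper
// script that does exactly one job (appending an address to white-list.cf).
// The matching sudoers rule, added with visudo, might look like:
//   www-data ALL=(root) NOPASSWD: /usr/local/sbin/add-whitelist.sh
$address = 'user@example.com';

// escapeshellarg() keeps the argument from being interpreted by the shell.
exec('sudo /usr/local/sbin/add-whitelist.sh ' . escapeshellarg($address), $output, $exitCode);

if ($exitCode !== 0) {
    error_log('add-whitelist.sh failed: ' . implode("\n", $output));
}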
Obviously, as the number of tasks increase, so would the complexity and the amount of work involved. I'm not sure how this would affect you at the moment.
I'm making a web application which will only allow registered members to download zip archives from a directory on the server.
I really need to know the proper way to secure that directory: only members stored in my database should be able to access the files, and the problem is that if somebody finds the directory and a file name, there's nothing to stop them accessing it.
I've been doing some research and found some approaches but they all have major drawbacks.
1.) Put the files outside of the web root, then use readfile to send them the data.
This is how I currently have it set up. The major drawback is that I'm on a shared server and the max execution time for a script is 30 seconds (it can't be changed), so if the file is big or the user's connection is slow, the timeout will hit before the download is complete.
2.) .htaccess and .htpasswd inside a web-root directory.
The problem with this is that I don't want to ask the user to enter a password again, unless there's a way to allow PHP to send the password and then send a header to the actual zip file that needs to be downloaded.
3.) Keeping the files in the web root but obfuscating the file names so they are hard to guess.
This is just totally lame!
What I really would like to do is keep the files outside the web root and then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it. Is there a way around this? Is there a way to redirect to an out-of-web-root file with header:location('/file') to force a download, thus allowing Apache to serve the file rather than PHP with readfile?
Is there some easier way to secure the folders and serve the files with Apache that I'm just not coming across? Has anybody experienced this problem before, and is there an industry-standard way to do this better?
I know this may resemble a repeat question but none of the answers in the other similar question gave any useful information for my needs.
What I really would like to do is keep the files outside the web root and then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it.
More to the point, it is outside the web root so it doesn't have a URL that the server can send in the Location header.
Is there a way around this? Is there a way to redirect to an out-of-web-root file with header:location('/file') to force a download?
No. Preventing the server from simply handing over the file is the point of putting it outside the web root. If you could redirect to it, then you would just be back in "hard to guess file name" territory with the added security flaw of every file on the server being public over HTTP.
Is there some easier way to secure the folders and serve the files with Apache that I'm just not coming across?
Your options (some of which you've expressed already in the form of specific implementations) are:
- Use hard to guess URLs
- Put the file somewhere that Apache won't serve it by default and write code that will serve it for you
- Use Apache's own password protection options
There aren't any other approaches.
Is there some easier way to secure the folders and serve the files with Apache that I'm just not coming across?
No, there isn't an easier way (but that said, all three implementations you've described are "very easy").
Another approach, which I consider really dirty but might get around your resource constraints:
- Keep the files outside the web root
- Configure Apache to follow symlinks
- On demand: create a symlink from under the web root to the file you want to serve
- Redirect to the URI of that symlink
- Have a cron job running every 5 minutes to delete old symlinks (put a timestamp in the symlink filename to help with this)
It's effectively a combination of the first two options in my previously bulleted list.
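A rough sketch of the on-demand symlink step (the paths are placeholders, Apache is assumed to allow FollowSymLinks for the downloads/ directory, and the cron cleanup is left out):

<?php
// Sketch only: the source file lives outside the web root.
$source = '/var/private/archive.zip';
$token  = time() . '-' . bin2hex(random_bytes(8)); // timestamp prefix helps the cron cleanup
$link   = __DIR__ . '/downloads/' . $token . '.zip';

if (symlink($source, $link)) {
    // Redirect the browser to the short-lived URL so Apache serves the bytes;
    // a cron job later deletes links whose leading timestamp is too old.
    header('Location: /downloads/' . $token . '.zip');
    exit;
}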
I have a simple site which allows users to upload files (among other things obviously). I am teaching myself php/html as I go along.
Currently the site has the following traits:
--When users register a folder is created in their name.
--All files the user uploads are placed in that folder (with a time stamp added to the name to avoid any issues with duplicates).
--When a file is uploaded information about it is stored in an SQL database.
simple stuff.
So, now my question is what steps do I need to take to:
A. Prevent Google from archiving the uploaded files.
B. Prevent users from accessing the uploaded files unless they are logged in.
C. Prevent users from uploading malicious files.
Notes:
I would assume that B would automatically achieve A. I can restrict users to only uploading files with .doc and .docx extensions. Would this be enough to protect against C? I would assume not.
There are a number of things you want to do here, and your question is quite broad.
For the Google indexing, you can work with /robots.txt. You did not specify whether you also want to apply an ACL (Access Control List) to the files, so that might or might not be enough. Serving the files through a script can work, but you have to be very careful not to use include, require or similar constructs that might be tricked into executing code. Instead, open the file, read it, and serve it using file-operation primitives.
Read about "path traversal". You want to avoid that, both in upload and in download (if you serve the file somehow).
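For example, a minimal guard against path traversal when the file name comes from the request (the base directory is a placeholder):

<?php
// Sketch only: $base stands in for wherever the files actually live.
$base = '/var/private/uploads';

// basename() strips any directory components the client may have injected,
// so "../../etc/passwd" becomes just "passwd".
$name = basename($_GET['file'] ?? '');
$path = realpath($base . '/' . $name);

// realpath() resolves ".." and symlinks, so verify the result is still
// inside the intended directory before serving it.
if ($path === false || strpos($path, $base . '/') !== 0) {
    http_response_code(404);
    exit;
}
readfile($path); // send the usual download headers first in a real script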
The definition of "malicious files" is quite broad. Malicious for whom? You could run an antivirus scan on the upload, for instance, if you are worried about your site being used to distribute malware (you should be). If you want to make sure that people can't harm the server, you have to at the very least ensure they can only upload a limited set of file types. Checking extensions and MIME types is a beginning, but don't trust that alone (you can embed PHP code in a PNG, and it will run if the file is included via include()).
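As a starting point for that extension/MIME check, a sketch only; the whitelists are examples you would extend:

<?php
// Sketch only: allow .doc and .docx uploads, checking both the client-supplied
// extension and the MIME type detected from the file contents.
$allowedExtensions = ['doc', 'docx'];
$allowedMimeTypes  = [
    'application/msword',
    'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
];

$ext = strtolower(pathinfo($_FILES['upload']['name'], PATHINFO_EXTENSION));

// finfo inspects the actual file contents rather than trusting the client.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['upload']['tmp_name']);

if (!in_array($ext, $allowedExtensions, true) || !in_array($mime, $allowedMimeTypes, true)) {
    http_response_code(400);
    exit('File type not allowed.');
}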
Then there is the problem of XSS, if users can upload HTML content or anything that gets interpreted as such. Make sure to serve a Content-Disposition header and a non-HTML Content-Type.
That's a start, but as you said there is much more.
Your biggest threat is someone managing to upload a file with a .php extension (or some other extension that results in server-side scripting/processing). Any code in that file runs on your server with whatever permissions the web server has (this varies by configuration).
If the end result of the uploads is just that you want to be able to serve the files as downloads (rather than letting someone view them directly in the browser), you'd be well off storing the uploads in a non-web-accessible directory and serving them via a script that forces a download and doesn't attempt to execute anything, regardless of the extension (see http://php.net/header).
This also makes it much easier to allow downloads only when a person is logged in, whereas otherwise you would need some .htaccess magic to achieve the same thing.
You should not upload into web-served directories if you do not want the files to be publicly available.
I suggest you use X-Sendfile, a header that instructs the server to send a file to the user. Your PHP script for "fetch so-and-so file" would do whatever authentication you have in place (I assume you have something already) and then emit the header. As long as the web server can access the file, it will then serve it.
See this question: Using X-Sendfile with Apache/PHP
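A minimal example of that, assuming mod_xsendfile is installed and XSendFilePath permits the directory; the path and file name are placeholders:

<?php
// Sketch only: run your own authentication check before reaching this point.
$path = '/var/private/uploads/report.docx';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.docx"');
header('X-Sendfile: ' . $path);
exit; // Apache streams the file itself; PHP never loads it into memory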
I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.
I've done some research on here and other sites and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged in user, the appropriate headers are given (likely application/octet-stream for automatic downloading), the buffer flushed, and the file is loaded via readfile.
However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?
If there's an alternate method I've missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file over normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so that I couldn't access the file, neither could PHP).
Crap, while I'm at it, any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?
You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.
Read about it here: Using X-Sendfile with Apache/PHP
That could indeed become an issue with large files.
Isn't it possible to just use FTP for this?
HTTP isn't really meant for large files but FTP is.
The solution you mentioned is the best possible one when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can password-protect the directory via a .htaccess file. That way the files won't go through PHP, but honestly there's nothing you should be worried about. I recommend going with your method.
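If readfile()'s memory use worries you for the 250-750MB files mentioned in the question, a common variant is to stream the file in chunks; a sketch with a placeholder path:

<?php
// Sketch only: reading in fixed-size chunks keeps PHP's memory use flat.
$path = '/var/private/video.mp4';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));

set_time_limit(0); // may be ignored on some shared hosts
$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192);
    flush();
}
fclose($fh);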
Is it possible to "deny from all", Apache .htaccess style, using PHP?
I can't use .htaccess because I'm using a different web server, so I want to use PHP to work around it.
So let's say a user is trying to access the folder named 'david'; all of its contents and subdirectories should be denied from viewing.
No
PHP cannot be used to protect folders.
Because it is not PHP that serves requests, but the web server.
You can move this directory above the document root to prevent web access to it.
But permissions alone will not help you.
Use chmod to change the permissions on that directory. Note that the user running PHP needs to own it in that case.
If you just want to prevent indexing the folder, you can create an index.php file that does a simple redirection. Note: Requests that have a valid filename will still be let through.
<?php
header("Location: /"); // redirect the user to the root directory
exit; // stop the script after issuing the redirect
Without cooperation from the web server, the only way to protect your files is
to encrypt them, perhaps in an archive of which only your script knows the password (this will waste CPU, as the server will be decrypting them all the time), or
to use an incredibly deranged file-naming scheme, one you won't ever describe to anyone and that only your PHP script can sort through.
Still, data could be downloaded, bandwidth could go to waste, and encrypted files could be decrypted.
It all depends on how much that data matters. And how much your time costs, as these convoluted layers of somewhat penetrable obfuscation will likely eat huge chunks of developer time.
Now, as I said... that would be without cooperation from the webserver... but what if the webserver is cooperating and doesn't know?
I've seen some Apache web servers, for instance, come preloaded with a rule denying access to files starting with .ht: not only .htaccess but everything similar, such as .htproxy, .htcache, .htwhatever_comes_to_mind, .htyourmama... (Can anyone confirm this is in the standard distribution?)
Chances are your server could be one of those.
If that's the case... rename your hidden files .hthidden-<filename1>, .hthidden-<filename2>... and you'll get access to them only through PHP file functions, like readfile().
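A sketch of what that would look like; the file name is made up, and it assumes the server really does refuse direct requests for anything whose name starts with .ht:

<?php
// Sketch only: the browser can never fetch this file directly under the
// assumed .ht* rule, but PHP can still read it and stream it out.
$hidden = __DIR__ . '/.hthidden-secret-report.pdf';

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');
readfile($hidden);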