What file permissions should I set for uploaded files - php

I have a PHP script that processes file uploads. The script tries to organise the uploaded files and may create new folders to move them into if needed. These files will live below the www root directory (i.e., a web browser will be able to access them).
My question is, what permissions should I set for the folders that get created and for the files that are moved into them (using mkdir() and move_uploaded_file())?

Your web server needs read and write permission in those folders; execute permission should be revoked (assuming UNIX-like systems). Otherwise, a user could upload a script and have it executed by sending an HTTP request for it.
But IMO the whole concept is a potential security hole. Better to store the files in a folder outside the web server root, so that no direct access is possible. In your web application, you can have a PHP download page that scans the upload directory and displays a list of download links. These download links lead to another script, which reads the files from your storage dir and sends them to the user.
Yes, this is more work. But the scenario is very common, so you should be able to find source code with example implementations easily. And it is much less work than having your server hacked...
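A minimal sketch of such a listing page, assuming the uploads live in a hypothetical /var/files/uploads outside the document root and that a separate download.php handler streams the files:

<?php
// List files stored outside the document root and link each one
// to a handler script that will read and send it to the user.
// The storage path and handler name are placeholders.
$storageDir = '/var/files/uploads';

foreach (scandir($storageDir) as $name) {
    if ($name === '.' || $name === '..') {
        continue;
    }
    printf(
        "<a href=\"download.php?file=%s\">%s</a><br>\n",
        rawurlencode($name),
        htmlspecialchars($name)
    );
}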

To answer it specifically: 766 would be the loosest you would want to use (note that the leading 7 still grants the owner execute permission; 666 drops execute for everyone). At the other end, 700 would allow no one but the web user to mess with the file.
But really, it all depends on what you are doing with the files; that determines the best choice.
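As a concrete sketch of those two calls with explicit modes (the paths are hypothetical, the 0700/0600 modes follow the tighter end of the advice above, and the process umask can mask mkdir()'s mode bits):

<?php
// Hypothetical storage layout; adjust to your setup.
$targetDir  = '/var/www/example/uploads/2024';
$targetFile = $targetDir . '/' . basename($_FILES['upload']['name']);

// Create the folder with an explicit mode; 0700 restricts it to
// the web server user, per the answer above.
if (!is_dir($targetDir)) {
    mkdir($targetDir, 0700, true);
}

if (move_uploaded_file($_FILES['upload']['tmp_name'], $targetFile)) {
    // Uploaded files keep the restrictive mode of PHP's temp dir,
    // so set the final mode explicitly (no execute bit).
    chmod($targetFile, 0600);
}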

Related

How can I provide a folder to write and download temp files from web application?

This question comes from my poor knowledge of server-side web development, but I'll try to make it as clear as possible in order not to make any mistake in my server configuration.
I have a web application in which pressing a Download button should trigger a PHP script, which in turn will write a file to a directory and let the user who clicked download that file.
This directory will store temporary files and should be cleared periodically.
So my doubts are:
Where is a good place to store these temporary files (in /var/www/<my_app>/tmp?)
Should I grant the apache2 user (www-data) read and write permissions to this folder?
Did I miss anything else?
EDIT1
Just saw PHP's passthru command. Will this be enough for zipped files and let me avoid thinking about the tmp folder and permissions?
1) That temp folder can be created wherever you want.
2) When a user clicks a button in your front-end application and triggers that PHP script, the script is executed on behalf of the apache2 user (www-data), which typically falls under the 'others' permission class (permissions are divided among user, group, and others). So you need to grant write permission in a way that reaches www-data, for instance by making www-data the owner of that directory.
Done that way, you can't upload a file into it via FTP or SFTP or whatever, because over FTP your user won't be www-data. And remember what you did: you gave the write permission only to www-data.
To better understand this concept, I'd advise you to read the following link, and the answer too: File permissions for Laravel 5 (and others)
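Putting the two together, a rough sketch of the Download endpoint, assuming a hypothetical /var/www/myapp/tmp directory that www-data can write to:

<?php
// Hypothetical temp directory; it just has to be writable by the
// apache2 user (www-data), as discussed above.
$tmpDir = '/var/www/myapp/tmp';
if (!is_dir($tmpDir)) {
    mkdir($tmpDir, 0770, true);
}

// Generate the file (placeholder content for this sketch).
$path = $tmpDir . '/export-' . bin2hex(random_bytes(8)) . '.csv';
file_put_contents($path, "col1,col2\n1,2\n");

// Stream it back to the user who clicked Download.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');
header('Content-Length: ' . filesize($path));
readfile($path);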

How to file_put_contents with php without permission denied error?

I'm trying to replace the contents of a file in a different folder than my PHP file's folder, and I'm getting an error: "Failed to open stream, Permission Denied".
I'm using the file_put_contents function to change the file's contents. I searched online for a solution to this problem and found that the directory is writable only by its owner/user. I checked the directory properties in FileZilla (FTP) and found that the directory was not writable by group or public.
In FileZilla, I tried allowing the directory to be written by public, and the PHP file was then able to write to the file in that folder.
Therefore, I think I can simply set the permissions on the file only, not the directory, and replace its contents by making the file writable by public. But I don't understand what the owner/group/public options mean. This is supposed to be a website's web server on a paid host, and I'm not sure whether the public write option is safe, or why there would be user groups on a web server hosting only a single website.
Since only a PHP file can change the contents on a web server, why is a public option provided for a web server? If it's for uploads, then that too means the upload page resides on the server! I cannot access a terminal via FTP or cPanel, so I cannot run chmod etc...
Please could someone provide more detail regarding security risks to files with public write permissions?
A server is still a computer, like any other. This means that several users can exist on it, and that a lot of things can happen.
Your server might not necessarily stay only a web server. Obviously it is also an FTP server, and you can probably also access it over SSH. In the future you might want to use it for another purpose.
In any case, even if you use it just as a web server, setting a folder's permissions as writable by every user is dangerous, because of the security flaws your code will almost inevitably have. Restrictive permissions act as a safeguard: even if you deploy something vulnerable in the future, they contain the damage.
That is the reason why, on some systems, the software responsible for serving HTTP requests (typically Apache) runs as a specific user such as httpd.
User groups are just what the name says: defined groups of users. You can read about them easily online.
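Since the asker has no shell access, it may help that PHP's own chmod() function can often do the job (it generally succeeds when the script runs as the file's owner, which is common on shared hosts). A sketch with a hypothetical path:

<?php
// Hypothetical target file; the idea is to open up just this
// file rather than the whole directory.
$file = __DIR__ . '/../data/settings.txt';

// 0666 = read/write for owner, group, and others; no execute.
// chmod() only succeeds if this script runs as the file's owner.
if (!chmod($file, 0666)) {
    die('Could not change permissions on ' . $file);
}

file_put_contents($file, "updated contents\n");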

Directory Protection from web access

I have a few PHP scripts that generate Word documents into a folder.
The PHP scripts are locked down behind a login session; however, if you know the right URL, the documents are accessible.
How do I prevent the directory from being accessible via a URL while still maintaining read and write access for the PHP scripts? Currently the directory's chmod is set to 777. Can this be done with the folder permissions?
You are placing the documents at the wrong location in your file system.
Do not place the documents inside the document root and you will not have to protect them. There is no need to place them exactly there, and nothing stops PHP from accessing such documents when they are stored elsewhere. So create yourself a folder outside the document root, and that's it.
General rule of thumb:
never place objects inside the document root that are not meant to be accessed directly by web requests.
documents meant to be offered for download should not be offered directly but by a handler script instead.
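A bare-bones version of such a handler script, with the storage path and session check as placeholders:

<?php
session_start();

// Placeholder session check; use whatever your login session sets.
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Forbidden');
}

$docDir = '/var/data/generated-docs';      // hypothetical, outside the document root
$name   = basename($_GET['file'] ?? '');   // basename() strips any path components

if ($name === '' || !is_file($docDir . '/' . $name)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($docDir . '/' . $name));
readfile($docDir . '/' . $name);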

how to send header:location to out of web root file

I'm making a web application which will only allow registered members to download zipped folders from a directory on the server.
I really need to know the proper way to secure the folder, as only members stored in my database should be able to access the files. The problem is that if somebody finds the directory and a file name, there's nothing to stop them accessing it.
I've been doing some research and found some approaches but they all have major drawbacks.
1.) Put the files outside of the webroot, then use readfile to send the user the data.
This is how I currently have it set up. The major drawback is that I'm on a shared server and the max execution time for a script is 30 seconds (can't be changed); if the file is big or the user's connection slow, the timeout will hit before the download completes.
2.) htaccess and htpasswd inside a webroot directory.
The problem with this is that I don't want to ask the user for a password again, unless there's a way to let PHP send the password and then send a header to the actual zip file that needs to be downloaded.
3.) Keeping the files in the webroot but obfuscating the file names so they are hard to guess.
This is just totally lame!
What I really would like to do is keep the files outside of the web root and then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it. Is there a way around this? Is there a way to redirect to an out-of-web-root file with header:location('/file') to force a download, thus letting Apache serve the file rather than PHP with readfile?
Is there some easier way to secure the folders and serve the files with Apache that I'm just not coming across? Has anybody run into this problem before, and is there an industry-standard way to do this better?
I know this may resemble a repeat question but none of the answers in the other similar question gave any useful information for my needs.
What I really would like to do is keep the files outside of the web root and then just send a header:location to that document to force a download; obviously, as it's not in the web root, the browser won't see it.
More to the point, it is outside the web root so it doesn't have a URL that the server can send in the Location header.
Is there a way around this? Is there a way to redirect to an out-of-web-root file with header:location('/file') to force a download?
No. Preventing the server from simply handing over the file is the point of putting it outside the web root. If you could redirect to it, then you would just be back in "hard to guess file name" territory with the added security flaw of every file on the server being public over HTTP.
Is there some easier way to secure the folders and serve the files with Apache that I'm just not coming across?
Your options (some of which you've expressed already in the form of specific implementations) are:
Use hard to guess URLs
Put the file somewhere that Apache won't serve it by default and write code that will serve it for you
Use Apache's own password protection options
There aren't any other approaches.
Is there some easier way to secure the folders and serve the files with Apache that I'm just not coming across?
No, there isn't an easier way (but that said, all three implementations you've described are "very easy").
Another approach, which I consider really dirty but might get around your resource constraints:
Keep the files outside the web root
Configure Apache to follow symlinks
On demand: Create a symlink from under the web root to the file you want to serve
Redirect to the URI of that symlink
Have a cron job running every 5 minutes to delete old symlinks (put a timestamp in the symlink filename to help with this)
It's effectively a combination of the first two options in my previously bulleted list.
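A rough sketch of that symlink approach (the paths are hypothetical, and Apache would need Options +FollowSymLinks on the downloads directory):

<?php
// Real files live outside the web root; the downloads directory
// is inside it. Both paths are placeholders.
$source = '/srv/files/' . basename($_GET['file'] ?? '');
if (!is_file($source)) {
    http_response_code(404);
    exit;
}

// Timestamp plus a random token: hard to guess, and the cron
// cleanup can read the timestamp to find expired links.
$name = time() . '-' . bin2hex(random_bytes(16)) . '-' . basename($source);
symlink($source, __DIR__ . '/downloads/' . $name);

// Redirect so Apache, not PHP, serves the file.
header('Location: /downloads/' . rawurlencode($name));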

php apache and temporary files

I have a web-based application which serves content to authenticated users by interacting with a SOAP server. The SOAP server has files which the users need to be able to download.
What is the best way to serve these files to users? When a user requests a file, my server will make a SOAP call to pull the file from the SOAP server and then serve it to the user by referencing a link to it.
The question is that these temporary files need to be cleaned up at some point, and my first thought was, this being a Linux-based system, to store them in /tmp/ and let the system take care of cleanup.
Is it possible to store these files in /tmp and have Apache serve them to the user?
If Apache cannot access /tmp since it is outside of the web root, could I potentially create a symbolic link to /tmp/filename within the web root? (This would require cleanup of the symbolic links at some point, though.)
Suggestions/comments on the best way to manage these temporary files are appreciated.
I am aware that I could write a script and have it executed as a cron job at regular intervals, but I was wondering if there was a way, similar to the one presented above, to do this without having to handle deleting the files myself.
There's a good chance that Apache can read the tmp directory, but that approach smells bad. My approach would be to have PHP read the file and send it to the user. Basically, you send out the appropriate HTTP headers to indicate what type of content you're sending and what name to use for the file, and then you just spit out the file with echo (for example).
It looks like there's a good discussion of this in another question:
HTTP Headers for File Downloads
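A rough sketch of those headers (the file name and type are placeholders):

<?php
// Hypothetical temp file fetched earlier from the SOAP server.
$path = '/tmp/report-1234.pdf';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($path));

readfile($path);   // or echo file_get_contents($path);
unlink($path);     // optionally clean up right away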
An additional benefit of this approach is that it leaves you in full control because there's PHP between a user and the file. This means you can add additional security measures (e.g., time-of-day controls), pull the file from various places to distribute bandwidth usage, and so on.
[additional material]
Sorry for not directly addressing your question. If you're using PHP to serve the files, they need not reside in the Apache web root, just where Apache/PHP has file-system read access to them. Thus, you can indeed simply store them in /tmp and let the OS clean them up for you. You might want to adjust the frequency of those clean-ups, however, to keep volume at the level you want.
If you want to ensure that access is reliably denied after a period of time or a certain number of downloads, you can store tracking information in your database (e.g., a flag on the user to indicate that they've downloaded the file), and then check it with your download script and possibly deny the download. This effectively separates security of access from frequency of cleanup, two things you may want to adjust independently.
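For example, a hypothetical check using PDO (table, column, and variable names are invented for this sketch):

<?php
// $pdo is an existing PDO connection; $userId and $fileId would
// come from the session and the request. All names are made up.
$stmt = $pdo->prepare(
    'SELECT has_downloaded FROM downloads WHERE user_id = ? AND file_id = ?'
);
$stmt->execute([$userId, $fileId]);

if ($stmt->fetchColumn()) {
    http_response_code(403);
    exit('This file has already been downloaded.');
}

// Mark it as downloaded, then stream the file as usual.
$pdo->prepare('UPDATE downloads SET has_downloaded = 1
               WHERE user_id = ? AND file_id = ?')
    ->execute([$userId, $fileId]);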
Hope that's more helpful....
