I have a bunch of PHP scripts inside of, say, /public_html/mydir/, and these scripts may possibly try to delete files / do other stuff to the filesystem.
I want to allow all filesystem modifications within the /public_html/mydir/ directory, but any access (or deletion) outside of the mydir directory should not be allowed.
How can I do this?
You either create a user that only has permission to access these directories, or you run PHP in a sandbox such as a chroot environment.
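Besides a dedicated user or a chroot, PHP's open_basedir ini directive (mentioned in an answer further down) is the built-in way to confine file access. A minimal php.ini fragment, assuming the directory from the question:

```ini
; limit all PHP file operations (fopen, unlink, include, ...) to this tree
; note: this does not confine external commands run via exec() or system()
open_basedir = /public_html/mydir/
```

This can also be set per virtual host or per php-fpm pool, so the restriction stays local to one site.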
Related
I am creating a website creation service where users can upload files in an FTP browser interface. I also want them to be able to create real directories, but I was worried that if they uploaded scripts, they could wreak havoc on the server.
I am using PHP, and the mkdir() function. Would it be a good idea to use mode 0664 when making directories, to prevent:
Malicious files from being executed.
Just anyone from writing or executing those files.
Is this a good idea?
No, it does not make sense to use 0664 on directories.
You very likely need the x flag on them, because without it you cannot enter the directory or reach any file inside it. "Executing" a directory has nothing in common with executing a program; removing the x flag from a directory will not prevent execution of programs stored inside it.
You first have to identify which account is actually involved in the file access. This account might be the owner of some files, and of the directories containing them. Anything PHP does will be done as this user, so it does not matter if you restrict access for the group or the world (e.g. 0700). On the other hand, your FTP access might run as a different user that only shares a group; in that case it is a bad idea to deny that group access to the files, because then you cannot read, write or delete them over FTP.
If you allow users to upload files to your server and you cannot guarantee that these files will never be executed, you probably shouldn't offer this service at all. Access flags will not improve the situation, because PHP does not need the x flag; it only needs the file to be readable. And if the FTP site is to be of any use, the files must be readable, otherwise why upload them?
Note that the usual FTP upload/download service separates uploads from downloads. An upload is a one-way operation that places the file on the server but without being accessible. An administrator has to check the upload and move it into the download area where it is read-only. Nobody can trigger the execution of any file because the server will ignore any executable flags and simply send the bytes of such a file back.
Without execute permission on a directory you cannot even open the files inside it. You can, however, open files in a directory that lacks read permission, as long as you know their names; the read permission is only needed to list the directory's contents.
Directories with 0644 (or 0664) permissions are inaccessible to anyone except root.
Examples
Inaccessible:
dir/ = drw-rw-r-- (0664) -- effectively the same as d--------- (0000)
So file_get_contents('dir/file.txt') will always fail.
Accessible, but no listing possible:
dir/ = d-wx-wx--x (0331)
So file_get_contents('dir/file.txt') will work if dir/file.txt exists. But opendir('dir') will not.
Fully accessible:
dir/ = drwxrwxr-x (0775)
file_get_contents('dir/file.txt') will work if dir/file.txt exists. And opendir('dir') will return the contents of the directory.
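The three cases above can be reproduced from a shell (a sketch using coreutils; run it as a regular user, since root bypasses permission checks entirely):

```shell
mkdir -p dir
echo "hello" > dir/file.txt

chmod 0664 dir   # drw-rw-r--: no x bit; a non-root user cannot reach dir/file.txt at all
chmod 0331 dir   # d-wx-wx--x: entries reachable by exact name, but listing is denied
cat dir/file.txt # works for the owner: traversal only needs the x bit
chmod 0775 dir   # drwxrwxr-x: both reading files and listing the directory work
ls dir
```

The owner can always chmod the directory back, which is why the intermediate states don't lock you out of fixing them.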
I want to include the file /httpdoc/a.php, but that file itself includes /httpdoc/b/c.php and /httpdoc/d/e.php. How can I allow only c.php to be included? I want to allow includes only from /httpdoc/b.
File: /httpdoc/a.php:
<?php
include '/httpdoc/b/c.php'; // should be included
include '/httpdoc/d/e.php'; // shouldn't be included
Is there any way to do this from within a PHP script, and only for this script? I want to restrict includes to a selected directory dynamically. Other scripts should still be able to include all files, so the change should be local.
Doesn't make too much sense from my point of view, because scripts will throw errors or exceptions if they cannot include what they desire. If it's necessary anyway, you could solve it by removing file or folder permissions for the user running your PHP.
Log in as a privileged user, then change the permissions of '/httpdoc/d' so that PHP can't access it anymore.
The commandline for that could simply look like this:
chmod 700 -R /httpdoc/d
With that, only the owner of this folder can use it and all of its subfiles and subfolders. (If the user under which PHP is running - typically 'www-data' or 'apache' - is the owner of this folder, you have to change the folder's owner first.)
When you have done this, suppress the error that the failing include would otherwise raise.
For example, with the @ operator:
@include '/httpdoc/a.php';
(There are other ways of doing this as well, e.g. lowering error_reporting().)
Regarding your comment:
Is there any way to do this from within a PHP script, and only for this script? I want to restrict includes to a selected directory dynamically. Other scripts should still be able to include all files, so the change should be local.
If you have files like "/httpdoc/a.php" that you don't fully trust, do the following.
Put them in a separate directory. You can put all other files except your secret ones (/httpdoc/d/e.php) in there as well.
Configure that directory as described here:
PHP - a different open_basedir per each virtual host
Is it possible to arrange file permissions/group ownership/etc in such a way that a file can be read by the function readFile() for a forced download, but it cannot be downloaded by navigating to the literal url of the file?
Maybe you could add the user that is running Apache/PHP to the group that owns the file, and set the permissions to read and write for the owner and the owning group, with no permissions at all for others (-rwxrw----, or 0760).
Never tested it, but it should work.
The Apache user will need read permissions. To prevent it from being navigated to, the best (and easiest) solution is to store the file outside of the web folder.
I'm building a website based on PHP and I want to ask where to put files that are retrieved with a require statement, so that they cannot be accessed by users from their browser.
(for example a php file that connects to my database)
EDIT: actually I think the better way is to put them outside the public root, because the Apache documentation says .htaccess has a slowdown impact. It can be done by adding a ../
for example require("../myFile.php"); (at least this works on my server)
Best regards to all
That depends on the web server configuration. Usually (or at least in all cases I have witnessed), you have an account root which cannot be accessed by users with their browser, containing a folder with all the public material (often called htdocs, httpdocs, public_html or something of the kind). Often you can place your PHP include files in that root and then require them using require("../include_file.php");
However, it depends on the configuration whether PHP can include files outside your public folder. If not, a .htaccess file is your best option.
If you place those files outside the document root of your webserver users cannot access these files with a browser.
If you use apache you can also place these files in a directory to which you do not allow access with a .htaccess file.
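For the second option, a minimal .htaccess for such a directory might look like this (Apache 2.2 syntax; on Apache 2.4 the equivalent is Require all denied):

```apache
# deny all direct browser access to files in this directory
Order allow,deny
Deny from all
```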
And as a last remark, if your files do not generate output, there is no way users can check the contents of the files.
If you mean source code, then it is not visible to users. If you want to hide folder contents, use the .htaccess directive Options -Indexes to hide file listings. If you can access PHP source from the browser, your server configuration is wrong and it is not parsing PHP files.
You normally place them into a directory that is not accessible over the webserver (outside the document or web root). Sometimes called a "private" directory.
You then include/require the file from that path as PHP has still access to the files.
See also:
placing php script outside website root
disable access to included files - a method for when you're not able to place the files in a private directory.
Just make them secure with .htaccess!
Here's a very clear tutorial for protecting files with a password. If you don't need direct access to the files via the browser, or only your scripts need access, just block them completely by changing the code between
<Files xy>
change this bit here
</Files>
to
Order allow,deny
Deny from all
Then you won't need your .htpasswd file anymore either!
You need to put these files outside of public-facing folders on your web server. Most (all?) web hosts should have the capability to change the document root of the website.
For example, let's say that all of your files are served from the following directory on your host: /home/username/www/example.com/
This means that anything that resides inside that directory is visible to the internet. If you went to http://example.com/myfile.png it would serve the file at /home/username/www/example.com/myfile.png.
What you want to do is create a new directory called, for example, public which will serve your files, and point the document root there. After you've done that, the request for http://example.com/myfile.png will be served from /home/username/www/example.com/public/myfile.png (note the public directory here). Now, anything else that resides within the example.com directory won't be visible on your website. You can create a new directory called, for example, private where your sensitive include files will be stored.
So say you have two files: index.php, which serves your website, and sensitive.php which contains passwords and things of that nature. You would set those up like this:
/home/username/www/example.com/public/index.php
/home/username/www/example.com/private/sensitive.php
The index.php file is visible to the internet, but sensitive.php is not. To include sensitive.php, you just include the full file path:
require_once("/home/username/www/example.com/private/sensitive.php");
You can also set your application root (the root of your websites files, though not the root of the publicly accessible files) as a define, possibly in a config file somewhere, and use that, e.g.:
require_once(APP_ROOT . "sensitive.php");
If you can't change the document root, then what some frameworks do is use a define to note that the file shouldn't be executed directly. You create a define in any file you want as an entry point to your application, usually just index.php, like so:
if (!defined('SENSITIVE')) {
define('SENSITIVE', 'SENSITIVE');
}
Then, in any sensitive file, you check that it's been set, and exit if it hasn't, since that means the file is being executed directly, and not by your application:
if (!defined('SENSITIVE')) {
die("This file cannot be accessed directly.");
}
Also, make sure that your include files, when publicly accessible (and really, even if not), have a proper extension, such as .php, so that the web server knows to execute them as PHP files, rather than serving them as plain text. Some people use .inc to denote include files, but if the server doesn't recognize them as being handled by PHP, your code will be publicly visible to anyone who cares to look. That's not good! To prevent this, always name your files with a .php extension. If you want to use the .inc style to show your include files, consider using .inc.php instead.
It seems this is a question about .htaccess; I'm not sure what I should use.
How can I make files inside /data/ folder available to read and write only for scripts?
Users should not have access to them from the browser.
Simplest way to block access to non-scripts: move it outside the web server's document root. If your PHP files are served from /var/www/htdocs, put your data files in /var/www/data.
If that's not possible, the .htaccess solution looks like this (Apache 2.2 syntax; on Apache 2.4 the equivalent is Require all denied):
# don't let anyone access these files directly
Order allow,deny
Deny from all
You can change the ownership of the directory to the www or apache user, depending on what user your web server is running as.
Then make sure that the permissions of the directory are set to 755 so that only the owner can write. If you want nobody else to be able to read it either, make it 700. (A directory needs the x bit to be traversed, so 644 or 600 would make it unusable even for reading.)
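A sketch of that setup from a shell; "www-data" here is an assumption, so substitute your web server's actual user (chown requires root, which is why it is shown commented out):

```shell
# hypothetical data directory owned by the web server user
mkdir -p data
# chown www-data:www-data data   # run as root with your server's user
chmod 700 data                   # owner may read, write and traverse; group and others get nothing
ls -ld data
```

With this, scripts running as the directory's owner can read and write files in data/, while no other account (and hence no direct browser request served from elsewhere) can touch it.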