I've tried searching about this online but found nothing decent. Here is my setup:
NGINX serving PHP, JavaScript, CSS, HTML, etc.
There is a folder inside the web root that ONLY localhost should be able to access. (I've handled this via a location restriction in my nginx directives.)
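Roughly along these lines, with the folder name just an example:

location /internal/ {
    allow 127.0.0.1;
    allow ::1;
    deny  all;
}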
How exactly should I set permissions for all folders that are not my web root? What should their permissions be? For example, /etc/nginx and /usr/whatever. What commands should I use?
And for the content itself, what permissions should it have to allow only execution? I've read about 740 or something like that, but what is "reading" a PHP file versus "executing" a PHP file? How exactly should I set it?
And what about ownership of the content and users? Should I make root the owner? Are there any important settings I should change in php.ini to guard against malicious use?
THE SITUATION
I have multiple folders in my /var/www/ directory.
Users are created that have control over a specific directory... /var/www/app1 belongs to app1:app1 (www-data is a member of the app1 group).
This works fine for what I want.
THE PROBLEM
If the app1 user uploads a PHP script that changes the file/folder permissions for something in app2's directory structure, the Apache process (as there's only one installed on the server) will be more than happy to run it, as it has the necessary permissions to access both the /var/www/app1 and /var/www/app2 folders and files.
EDIT:
To the best of my knowledge, something like /var/www/app1/includes/hack.php:
<?php
chmod("/var/www/app2", 0777);
?>
The Apache process (owned by www-data) will run this, as it has permissions to change both /var/www/app1 and /var/www/app2 directories. The user app1 will then be able to cd /var/www/app2, rm -rf /var/www/app2, etc., which is obviously not good.
THE QUESTION
How can I avoid this cross-contamination of the Apache process? Can I instruct Apache to only run PHP scripts that affect the files/folders that reside within the relevant vHost root directory and below?
While open_basedir would help, there are several ways of bypassing this constraint. And while you could break a lot of functionality in PHP to close off all the backdoors, a better solution is to stop executing the PHP as a user who has access to all the files. To do that, you need to use php-fpm with a separate process pool/uid/gid for each vhost.
You should still have a separate uid for the PHP execution from the uid owning the files, with a common group allowing default read-only access to the files.
You also need to have separate storage directories for session data.
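A minimal pool sketch, assuming Debian-style paths and a made-up fpm-app1 account per vhost, might look like this:

; /etc/php/8.2/fpm/pool.d/app1.conf - one pool per vhost
[app1]
user = fpm-app1
group = fpm-app1
listen = /run/php/fpm-app1.sock
listen.owner = www-data
listen.group = www-data
; keep each site's sessions and file access inside its own tree
php_admin_value[session.save_path] = /var/www/app1/sessions
php_admin_value[open_basedir] = /var/www/app1/:/tmp/

The web server then hands app1's requests to app1's socket only, so a script uploaded to app1 never executes with a uid that can write to app2.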
A more elaborate mechanism would be to use something like Apache Traffic Server in front of a container per owner, with each site running on its own instance of Apache: much better isolation, but technically demanding and somewhat more resource intensive.
Bear in mind, if you are using MariaDB or similar, that the DBMS can also read and write arbitrary files (SELECT ... INTO OUTFILE / LOAD DATA INFILE).
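One common mitigation, assuming a stock MariaDB/MySQL install, is to pin those statements to a dedicated directory with secure_file_priv in the server configuration:

[mysqld]
# LOAD DATA INFILE / SELECT ... INTO OUTFILE will then only work inside this directory
secure_file_priv = /var/lib/mysql-files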
UPDATE
Rather than the effort of maintaining separate containers, better isolation can be achieved with less effort by setting the home directory of the php-fpm uid appX to the base directory of the vhost (which should contain, not be, the document_root - see below) and using AppArmor to constrain access to the common files (e.g. .so libraries) and @{HOME}. Hence each /var/www/appX might contain:
.htaccess
.user.ini
data/ (writeable by fpm-appX)
html/ (the document root)
include/
sessions/ (writeable by fpm-appX)
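A sketch of how that layout and the split ownership could be created, assuming app1 owns the code and a hypothetical fpm-app1 account runs the pool:

# fpm-app1 runs the PHP; putting it in the app1 group gives it read-only access by default
adduser --system --home /var/www/app1 --no-create-home --ingroup app1 fpm-app1
mkdir -p /var/www/app1/{html,include,data,sessions}
chown -R app1:app1 /var/www/app1
chmod -R u=rwX,g=rX,o= /var/www/app1
# only data/ and sessions/ are writeable by the PHP process
chown fpm-app1:app1 /var/www/app1/{data,sessions}
chmod 770 /var/www/app1/{data,sessions}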
You should add an open_basedir directive to each site's vhost file. The open_basedir directive limits the directories that a site can access.
You can read more about open_basedir in the PHP manual: http://php.net/manual/en/ini.core.php#ini.open-basedir
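With mod_php that is a per-vhost php_admin_value; the paths below are only an example (with php-fpm you would set php_admin_value[open_basedir] in the pool instead):

<VirtualHost *:80>
    ServerName app1.example.com
    DocumentRoot /var/www/app1/html
    # confine this site's PHP to its own tree plus a temp directory
    php_admin_value open_basedir "/var/www/app1/:/tmp/"
</VirtualHost>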
I just made a mistake in a PHP application I'm developing with WAMP Server.
My WAMP/www folder is on my D:\ drive, where I also have my personal data. My app, due to a failure in generating a dynamic path, deleted all my music, my photos and other personal files I had.
I mean... WHAT? How was it possible? I will need a recovery tool to recover that data.
How can I keep PHP from touching anything outside its folder in www so this does not happen again? It's a disaster.
Limit the files that can be accessed by PHP to the specified directory-tree, including the file itself.
http://php.net/manual/en/ini.core.php#ini.open-basedir
Use open_basedir to restrict file operations to within specific directories, like this (in the website's VirtualHost file)...
php_admin_value open_basedir "C:/WampDeveloper/Temp/;C:/WampDeveloper/Websites/www.example.com/webroot/"
Though if you are deleting via the command line or a .bat file (i.e., you are not using PHP file functions directly), the only way to fix this is to set Apache to run under a custom account that only has permissions on WAMP's folder.
I'm having a real brain fart right now and forgot what's required in httpd.conf to include files from directories outside the web root folder. The whole idea is that I can call files from other folders without listing the location in the HTML/PHP source files.
Having a hard time searching through Google and Apache's site, could someone refresh my memory?
EDIT: For example, I'm using 'require_once "pdo-db-conn.php";' in a PHP file, but pdo-db-conn.php is actually outside the web root. The idea is that I don't need to list the external directory within my production PHP source code.
I've done it before, and it included listing directories which apache would search for files.
For Apache, you are looking for either Alias or AliasMatch.
For PHP, you are looking for include_path
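For the require_once case, a sketch assuming the shared files live somewhere like /srv/php-includes (the path is just an example). Either in php.ini:

include_path = ".:/srv/php-includes"

or at runtime, before the require:

<?php
set_include_path(get_include_path() . PATH_SEPARATOR . '/srv/php-includes');
require_once 'pdo-db-conn.php';
?>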
I think this question should be something easy, but after searching all over the web I couldn't find an answer, so I decided to ask here.
I have a file uploader on my website that works with PHP. The folder where files are uploaded is chmodded to 777. I also have a PHP script to list the files in that folder. What I need is to allow PHP to upload and browse files in that folder, but not allow people to do so directly. The only solution I have imagined is to chown that folder to a user different from the default, so I could later chmod it in FileZilla and allow only the owner access; that way people would see the files through the output of the PHP script, but not if they navigate to that folder.
I'm using Debian with Apache 2. I'd like to know what I could do.
To make it short, my aim: allow PHP to upload, read, write and execute files in that folder, but not clients, unless they use my PHP script.
Thanks in advance
Put all the files you're talking about in their own directory. Add a .htaccess file to that directory. The contents of the .htaccess should be deny from all.
This will prevent any user from manually accessing the files, as access will be blocked off. Your PHP script can still browse the contents of the folder and serve a file up as an attachment with the correct content type.
For more info on how to serve a file for download in PHP, read this: https://serverfault.com/questions/316814/php-serve-a-file-for-download-without-providing-the-direct-link
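(On Apache 2.4 the equivalent of deny from all is Require all denied.) A minimal sketch of such a download script, with the folder and parameter names made up for illustration:

<?php
// Stream a file from the .htaccess-protected folder to the browser
$dir  = '/var/www/example.com/protected';
$name = basename($_GET['file'] ?? '');   // basename() strips any directory traversal
$path = realpath($dir . '/' . $name);

if ($path !== false && strpos($path, $dir . '/') === 0 && is_file($path)) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $name . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
} else {
    http_response_code(404);
}
?>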
All services, including web servers, run in a security context, which is an account in the OS; for example, Apache starts as the apache user in the apache group (www-data on Debian). It is enough to change the mode and owner of that directory to this user and group. Never chmod a directory to 777 unless there is a good reason for it. Using this trick, only the web server process can read, write and execute in that directory.
Likewise, if you don't want browser clients to see (read) the contents of that directory, you should disable directory listing for it. I think it is disabled by default.
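On Debian the Apache account is www-data, so a sketch for the upload folder (the path is only an example) would be:

chown -R www-data:www-data /var/www/example.com/uploads
# owner (the web server) may read/write/traverse; group may read/traverse; others get nothing
chmod -R u=rwX,g=rX,o= /var/www/example.com/uploads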
See an example here: http://mattpotts.com/portal/
I put an includeme.htm in each directory on the required path to find the point of failure. It works fine on my local machine (windows) with the same directory structure but fails on my remote (linux) server.
Directory structure:
+-firefli/           drwx--x--x
  +-private_html/    drwx------
    +-foo/           drwxr-xr-x
    +-bar/           drwxr-xr-x
    +-portal/        drwxr-xr-x
  +-public_html/     drwxr-xr-x
    +-foo/           drwxr-xr-x
    +-portal/        drwxr-xr-x
The permissions confirm that it's the private_html directory causing the trouble. Hopefully you can see the purpose of the directory structure, I don't know if it's a common way of doing things but it works for me. Well, until now.
I've gone a very long way around asking it, but my question is simply this: is there anything wrong with setting private_html to drwxr-xr-x, given that I do not want it to be accessible via the web? The permissions shouldn't expose it, should they? Because it's Apache that makes the public_html directory accessible via HTTP.
You shouldn't need to block out web users with folder/file permissions on private_html, as it's outside the web root. As you say, web users can only get to stuff in public_html
For future debugging speed, if you have a relative web path you can convert it to a real path using realpath:
$path = realpath('../../private_html');
// $path is now the absolute filesystem path to private_html (or false if the path doesn't exist)
Well, if you have set up your DocumentRoot correctly to point to public_html, it won't be accessible from the web, no matter what permissions you put on it.
private_html is not accessible from the web unless you put in a .htaccess file that redirects to it. If you don't know what that means or how to do that, you are safe.
You should be fine setting these permissions to whatever your script needs.
What are the user:group on private_html? The web server needs to be either a member of the group or the owner of the files. In order to read the directory contents, the directory needs to have the execute permission so the web server can open it. Essentially, private_html should have the same user:group as public_html; you just want to disallow the write permission for the web server. If you have set your document root to public_html, private_html is not accessible via the web no matter what the permissions are. Also, I always use realpath on the path arguments to any file operation.
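A sketch of what that could look like, with the account names purely illustrative (site files owned by the shell user, web server in the group):

chown -R siteuser:www-data ~/private_html
# directories: owner rwx, group r-x so the web server can enter and read, others nothing
find ~/private_html -type d -exec chmod 750 {} \;
# files: owner rw, group read-only
find ~/private_html -type f -exec chmod 640 {} \;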