THE SITUATION
I have multiple folders in my /var/www/ directory.
Users are created, each with control over a specific directory: /var/www/app1 belongs to app1:app1 (www-data is a member of the app1 group).
This works fine for what I want.
THE PROBLEM
If the app1 user uploads a PHP script that changes file/folder permissions somewhere in app2's directory structure, the Apache process (there is only one installed on the server) will happily run it, since it has the permissions needed to access both the /var/www/app1 and /var/www/app2 folders and files.
EDIT:
To the best of my knowledge, something like, /var/www/app1/includes/hack.php:
<?php
chmod("/var/www/app2", 0777);
?>
The Apache process (owned by www-data) will run this, as it has permissions to change both /var/www/app1 and /var/www/app2 directories. The user app1 will then be able to cd /var/www/app2, rm -rf /var/www/app2, etc., which is obviously not good.
THE QUESTION
How can I avoid this cross-contamination of the Apache process? Can I instruct Apache to only run PHP scripts that affect the files/folders that reside within the relevant vHost root directory and below?
While open_basedir would help, there are several ways of bypassing this constraint. While you could break a lot of functionality in PHP to close off all the backdoors, a better solution is to stop executing PHP as a user who has access to all the files. To do that, you need to use PHP-FPM with a separate process pool/UID/GID for each vhost.
You should still keep the UID that executes PHP separate from the UID that owns the files, with a common group allowing default read-only access to the files.
You also need separate storage directories for session data.
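As a minimal sketch, a per-vhost PHP-FPM pool might look like this (the pool name, socket path, and PHP version in the file path are assumptions; adjust to your distribution):
; /etc/php/8.2/fpm/pool.d/app1.conf (hypothetical path)
[app1]
user = fpm-app1              ; runs the PHP code; does not own the files
group = app1                 ; common group giving read-only access to app1's files
listen = /run/php/fpm-app1.sock
listen.owner = www-data      ; Apache must be able to connect to the socket
listen.group = www-data
pm = ondemand
pm.max_children = 5
php_admin_value[open_basedir] = /var/www/app1
php_admin_value[session.save_path] = /var/www/app1/sessions
Repeat the pattern for app2 and so on, each with its own UID, socket, and session directory.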
A more elaborate mechanism would be to use something like Apache Traffic Server in front of a container per owner, with each site running on its own instance of Apache - much better isolation, but technically demanding and somewhat more resource-intensive.
Bear in mind that if you are using MariaDB or similar, the DBMS can also read and write arbitrary files (SELECT ... INTO OUTFILE / LOAD DATA INFILE).
UPDATE
Rather than the effort of maintaining separate containers, better isolation can be achieved with less effort by setting the home directory of the php-fpm UID appX to the base directory of the vhost (which should contain, not be, the document root - see below) and using AppArmor to constrain access to the common files (e.g. .so libs) and @{HOME}. Hence each /var/www/appX might contain:
.htaccess
.user.ini
data/ (writeable by fpm-appX)
html/ (the document root)
include/
sessions/ (writeable by fpm-appX)
You should add an open_basedir directive to each site's vhost file. The open_basedir directive limits the directories that a site can access (see the PHP manual: http://php.net/manual/en/ini.core.php#ini.open-basedir).
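As an illustrative sketch (the domain and paths are placeholders), the directive can be set per vhost like this:
<VirtualHost *:80>
    ServerName app1.example.com
    DocumentRoot /var/www/app1/html
    php_admin_value open_basedir "/var/www/app1:/tmp"
</VirtualHost>
Note that php_admin_value applies when PHP runs as mod_php; with PHP-FPM, set open_basedir in the pool definition instead (as in the pool sketch above).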
In PHP scripts, whether calling include(), require(), fopen(), or their derivatives such as include_once, require_once, or even, move_uploaded_file(), one often runs into an error or warning:
Failed to open stream : No such file or directory.
What is a good process to quickly find the root cause of the problem?
There are many reasons why one might run into this error and thus a good checklist of what to check first helps considerably.
Let's consider that we are troubleshooting the following line:
require "/path/to/file";
Checklist
1. Check the file path for typos
either check manually (by visually checking the path)
or move whatever is called by require* or include* to its own variable, echo it, copy it, and try accessing it from a terminal:
$path = "/path/to/file";
echo "Path: $path";
require $path;
Then, in a terminal:
cat <file path pasted>
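You can also check programmatically from PHP, which quickly separates "the path is wrong" from "the path is right but unreadable":
$path = "/path/to/file";
if (!file_exists($path)) {
    echo "Missing: $path";     // typo in the path, or the file really is absent
} elseif (!is_readable($path)) {
    echo "Unreadable: $path";  // the file exists but the server user cannot read it
}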
2. Check that the file path is correct regarding relative vs absolute path considerations
if it starts with a forward slash "/" then it is not referring to the root of your website's folder (the document root), but to the root of your server's filesystem.
for example, your website's directory might be /users/tony/htdocs
if it does not start with a forward slash then it is either relying on the include path (see below) or the path is relative. If it is relative, then PHP will resolve it against the current working directory.
which is not necessarily your website's root, nor the directory of the file you are editing
for that reason, always use absolute file paths
Best practices:
In order to make your script robust in case you move things around, while still generating an absolute path at runtime, you have two options:
use require __DIR__ . "/relative/path/from/current/file". The __DIR__ magic constant returns the directory of the current file.
define a SITE_ROOT constant yourself:
at the root of your web site's directory, create a file, e.g. config.php
in config.php, write:
define('SITE_ROOT', __DIR__);
in every file where you want to reference the site root folder, include config.php, and then use the SITE_ROOT constant wherever you like:
require_once __DIR__."/../config.php";
...
require_once SITE_ROOT."/other/file.php";
These two practices also make your application more portable because it does not rely on ini settings like the include path.
3. Check your include path
Another way to include files, neither relatively nor purely absolutely, is to rely on the include path. This is often the case for libraries or frameworks such as the Zend Framework.
Such an inclusion will look like this:
include "Zend/Mail/Protocol/Imap.php";
In that case, you will want to make sure that the folder containing "Zend" is part of the include path.
You can check the include path with:
echo get_include_path();
You can add a folder to it with:
set_include_path(get_include_path() . PATH_SEPARATOR . "/path/to/new/folder");
(PATH_SEPARATOR is ":" on Unix-like systems and ";" on Windows, so it is more portable than a hard-coded ":".)
4. Check that your server has access to that file
It might simply be that the user running the server process (Apache or PHP) doesn't have permission to read from or write to that file.
To check what user the server is running as, you can use posix_getpwuid:
$user = posix_getpwuid(posix_geteuid());
var_dump($user);
To find out the permissions on the file, type the following command in the terminal:
ls -l <path/to/file>
and look at the permissions in symbolic notation.
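Remember that every directory along the path needs execute permission for the server user, not just the file itself. A convenient way to inspect the whole chain at once (assuming the util-linux namei tool is available):
namei -l /path/to/file
This prints the owner, group and permissions of every component of the path, so you can spot exactly where access is lost.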
5. Check PHP settings
If none of the above worked, then the issue is probably that some PHP setting forbids access to that file.
Three settings could be relevant:
open_basedir
If this is set, PHP won't be able to access any file outside the specified directory (not even through a symbolic link).
However, the default is for it not to be set, in which case there is no restriction.
This can be checked by either calling phpinfo() or by using ini_get("open_basedir").
You can change the setting either by editing your php.ini file or your httpd.conf file.
safe_mode
if this is turned on, restrictions might apply. However, safe mode was removed in PHP 5.4; if you are still on a version that has it, upgrade to a PHP version that is still supported.
allow_url_fopen and allow_url_include
this applies only to including or opening files through a network protocol such as http://, not to files on the local file system
this can be checked with ini_get("allow_url_include"). Note that both directives are PHP_INI_SYSTEM, so they cannot be changed at runtime with ini_set(); set them in php.ini or the server configuration instead
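A quick way to dump all three settings at once:
// Diagnostic: print the settings that commonly block file access
foreach (['open_basedir', 'allow_url_fopen', 'allow_url_include'] as $setting) {
    printf("%s = %s\n", $setting, var_export(ini_get($setting), true));
}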
Corner cases
If none of the above helped diagnose the problem, here are some special situations that could arise:
1. Inclusion of a library relying on the include path
It can happen that you include a library, for example the Zend Framework, using a relative or absolute path. For example:
require "/usr/share/php/libzend-framework-php/Zend/Mail/Protocol/Imap.php";
But then you still get the same kind of error.
This could happen because the file that you have (successfully) included has itself an include statement for another file, and that second include statement assumes that you have added the path of that library to the include path.
For example, the Zend Framework file mentioned before could have the following include:
include "Zend/Mail/Protocol/Exception.php";
which is neither an inclusion by relative path, nor by absolute path. It assumes that the Zend Framework directory has been added to the include path.
In such a case, the only practical solution is to add the directory to your include path.
2. SELinux
If you are running Security-Enhanced Linux, then it might be the cause of the problem, by denying the server access to the file.
To check whether SELinux is enabled on your system, run the sestatus command in a terminal. If the command does not exist, then SELinux is not on your system. If it does exist, then it should tell you whether it is enforcing or not.
To check whether SELinux policies are the reason for the problem, you can try turning it off temporarily. However, be CAREFUL, since this disables protection entirely. Do not do this on your production server.
setenforce 0
If you no longer have the problem with SELinux turned off, then this is the root cause.
To solve it, you will have to configure SELinux accordingly.
The following context types will be necessary:
httpd_sys_content_t for files that you want your server to be able to read
httpd_sys_rw_content_t for files on which you want read and write access
httpd_log_t for log files
httpd_cache_t for the cache directory
For example, to assign the httpd_sys_content_t context type to your website root directory, run:
semanage fcontext -a -t httpd_sys_content_t "/path/to/root(/.*)?"
restorecon -Rv /path/to/root
If your file is in a home directory, you will also need to turn on the httpd_enable_homedirs boolean:
setsebool -P httpd_enable_homedirs 1
In any case, there could be a variety of reasons why SELinux would deny access to a file, depending on your policies, so you will need to investigate further.
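The audit log is the authoritative place to see why SELinux denied an access (these commands assume the audit and policycoreutils tools are installed):
# Show recent AVC (access denial) messages
ausearch -m avc -ts recent
# Explain the denials and suggest possible fixes
ausearch -m avc -ts recent | audit2why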
3. Symfony
If you are using Symfony and experiencing this error when uploading to a server, then it may be that the app's cache hasn't been reset, either because app/cache was uploaded or because the cache wasn't cleared.
You can test and fix this by clearing the cache from the console:
php app/console cache:clear
(That is the Symfony 2 form; on Symfony 3 and later the command is php bin/console cache:clear.)
4. Non-ASCII characters inside a Zip file
Apparently, this error can also happen upon calling $zip->close() when some files inside the zip have non-ASCII characters in their filename, such as "é".
A potential solution is to wrap the file name in utf8_decode() before creating the target file.
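A minimal sketch of that workaround (the entry name is hypothetical; note that utf8_decode() is deprecated as of PHP 8.2, where mb_convert_encoding() performs the same conversion):
$nameInZip = "résumé.txt";          // hypothetical entry name with non-ASCII characters
$target = utf8_decode($nameInZip);  // convert the UTF-8 name to ISO-8859-1 for the target file
// On PHP 8.2+: $target = mb_convert_encoding($nameInZip, 'ISO-8859-1', 'UTF-8');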
Credits to Fran Cano for identifying and suggesting a solution to this issue
To add to the (really good) existing answer
Shared Hosting Software
open_basedir is one that can stump you because it can be specified in a web server configuration. While this is easily remedied if you run your own dedicated server, there are some shared hosting software packages out there (like Plesk, cPanel, etc.) that configure directives on a per-domain basis. Because the software builds the configuration file (i.e. httpd.conf) you cannot change that file directly; the hosting software will just overwrite it when it restarts.
With Plesk, they provide a place to override the provided httpd.conf, called vhost.conf. Only the server admin can write this file. The configuration for Apache looks something like this:
<Directory /var/www/vhosts/domain.com>
<IfModule mod_php5.c>
php_admin_flag engine on
php_admin_flag safe_mode off
php_admin_value open_basedir "/var/www/vhosts/domain.com:/tmp:/usr/share/pear:/local/PEAR"
</IfModule>
</Directory>
Have your server admin consult the manual for the hosting and web server software they use.
File Permissions
It's important to note that executing a file through your web server is very different from a command line or cron job execution. The big difference is that your web server has its own user and permissions. For security reasons that user is pretty restricted. Apache, for instance, often runs as apache, www-data or httpd (depending on your server). A cron job or CLI execution has whatever permissions the user running it has (i.e. running a PHP script as root will execute it with the permissions of root).
A lot of times people will solve a permissions problem by doing the following (Linux example):
chmod 777 /path/to/file
This is not a smart idea, because the file or directory is now world-writable. If you own the server and are the only user then this isn't such a big deal, but if you're on a shared hosting environment you've just given everyone on your server access.
What you need to do is determine the user(s) that need access and give only those users access. Once you know which users need access, you'll want to make sure that:
That user owns the file and possibly the parent directory (especially the parent directory if you want to write files). In most shared hosting environments this won't be an issue, because your user should own all the files underneath your root. A Linux example is shown below:
chown apache:apache /path/to/file
The user, and only that user, has access. In Linux, a good practice would be chmod 600 (only the owner can read and write) or chmod 644 (the owner can read and write; everyone else can only read). A combined example follows below.
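Putting those two steps together (the user/group names and path are placeholders for your setup):
# Make the web server user the owner, then restrict access to that user
chown www-data:www-data /path/to/file
chmod 600 /path/to/file   # owner read/write only
# If other local users legitimately need to read it:
chmod 644 /path/to/file   # owner read/write, everyone else read-only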
It is worth reading a more extended discussion of Linux/Unix permissions and users if this is unfamiliar territory.
Look at the exact error
My code worked fine on all machines, but this one (which used to work fine, I think) started giving a problem. I echoed the document_root path to debug, looked closely at the error, and found this:
Warning:
include(D:/MyProjects/testproject//functions/connections.php):
failed to open stream:
You can easily see where the problem is: the double slash (//) before functions:
$document_root = $_SERVER['DOCUMENT_ROOT']; // here DOCUMENT_ROOT ended with a trailing slash
echo "root: $document_root";
include($document_root.'/functions/connections.php'); // the leading '/' then produces '//'
So simply remove the leading / from the include and it should work fine. Interestingly, this behavior differs between versions: I ran the same code on my laptop, a MacBook Pro, and this PC, and it worked fine everywhere until now. Hope this helps someone.
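A defensive way to build the path, which works whether or not DOCUMENT_ROOT ends with a slash:
// Normalize to avoid accidental double slashes in the joined path
$document_root = rtrim($_SERVER['DOCUMENT_ROOT'], '/');
include $document_root . '/functions/connections.php';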
Copy-paste the file location into the browser to make sure the file exists. Sometimes files get deleted unexpectedly (it happened to me), and that was also the issue in my case.
Samba Shares
If you have a Linux test server and you work from a Windows client, the Samba share interferes with the chmod command. So, even if you use:
chmod -R 777 myfolder
on the Linux side, it is entirely possible that the Unix group www-data still doesn't have write access. One working solution, if your share is set up so that Windows admins are mapped to root: from Windows, open the folder's Permissions, disable Inheritance for your folder (copying the existing permissions), and then grant full access to www-data.
Including a script with query parameters
That was my case. It actually links to question #4485874, but I'm going to explain it here briefly.
When you try to require path/to/script.php?parameter=value, PHP looks for a file named script.php?parameter=value, because UNIX allows you to have paths like this.
If you really need to pass some data to the included script, just declare it as $variable = ..., or $GLOBALS[...] = ..., or any other way you like.
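A minimal sketch of that pattern (the file name and variable are placeholders):
// Wrong: PHP looks for a file literally named "script.php?parameter=value"
// require "path/to/script.php?parameter=value";

// Right: set the data first; it is visible inside the included file's scope
$parameter = 'value';
require 'path/to/script.php';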
Aside from the other excellent answers, one thing I overlooked on Windows while writing a simple script: This error will be shown when trying to open a file with characters that Windows does not support in file names.
For example:
$file = fopen(date('Y-m-d_H:i:s'), 'w+');
Will give:
fopen(2022-06-01_22:53:03): Failed to open stream: No such file or directory in ...
Windows does not like : in file names, as well as a number of other characters.
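A Windows-safe variant simply avoids the colon in the timestamp:
// Use hyphens instead of colons in the time portion
$file = fopen(date('Y-m-d_H-i-s'), 'w+');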
The following PHP settings in php.ini, if set to a non-existent directory, can also raise (a quick verification is sketched after the list of settings):
PHP Warning: Unknown: failed to open stream: Permission denied in
Unknown on line 0
sys_temp_dir
upload_tmp_dir
session.save_path
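A quick check that each of these points at a real, writable directory:
// Diagnostic: confirm the configured directories exist and are writable
foreach (['sys_temp_dir', 'upload_tmp_dir', 'session.save_path'] as $setting) {
    $dir = ini_get($setting);
    printf("%s = %s (exists: %s, writable: %s)\n",
        $setting,
        var_export($dir, true),
        var_export($dir !== '' && is_dir($dir), true),
        var_export($dir !== '' && is_writable($dir), true)
    );
}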
PHP - Failed to open stream: No such file or directory on Mac
For example, I want to upload a picture but I am getting this error. The first thing to do is right-click the image and Get Info to find its real path.
$thePathOfMyPicture = "/Users/misstugba/Desktop/";
and use it with the function:
if(move_uploaded_file($_FILES["file"]["tmp_name"],$thePathOfMyPicture.$_FILES["file"]["name"])){
echo "image uploaded successfully";
}
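It can also help to check the upload error code before calling move_uploaded_file(), since a failed upload produces similar symptoms later:
// A non-zero error code means the upload itself failed before the move
if ($_FILES["file"]["error"] !== UPLOAD_ERR_OK) {
    die("Upload failed with error code " . $_FILES["file"]["error"]);
}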
I got this error because I was trying to read a file that required HTTP authentication with a username and password. Hope that helps others; it might be another corner case.
Edit
You can check if this type of authentication is present by inspecting the headers:
$file_headers = get_headers($url);
if (!$file_headers) {
    echo 'File headers missing';
} elseif (strpos($file_headers[0], '401 Unauthorized') !== false) {
    echo '401 Unauthorized';
}
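If that is the case, one option is to supply HTTP Basic credentials through a stream context (the URL and credentials here are placeholders):
// Send an Authorization header along with the request
$context = stream_context_create([
    'http' => [
        'header' => 'Authorization: Basic ' . base64_encode('user:password'),
    ],
]);
$data = file_get_contents($url, false, $context);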
In PHP, make sure Apache is started, and check that your DB name and password are set correctly in your environment file (.env), if you use one.
I just made a mistake in a PHP application I'm developing with WAMP Server.
My WAMP / WWW folder is inside my D:\ disk, where I also have my personal data. My app, due to a bug in generating a dynamic path, deleted all my music, my photos and other personal files I had.
I mean... WHAT? How was that possible? I will need a recovery tool to get that data back.
How can I keep PHP from touching anything outside its folder in www so this does not happen again? It's a disaster.
Limit the files that can be accessed by PHP to the specified directory-tree, including the file itself.
http://php.net/manual/en/ini.core.php#ini.open-basedir
Use open_basedir to restrict file operations to within specific directories, like this (in the website's VirtualHost file)...
php_admin_value open_basedir "C:/WampDeveloper/Temp/;C:/WampDeveloper/Websites/www.example.com/webroot/"
Though if you are deleting via the command line or a bat file (i.e., you are not using PHP file functions directly), the only way to fix this is to set Apache to run under a custom account that only has permissions on WAMP's folder.
I have a VPS using FastCGI (WHM/cPanel). As I understand it, open_basedir must be set using a php.ini file in each user's /home/ directory (e.g. setting it globally in the Apache config file will not work).
I want to use open_basedir for improved security, as I recently had a hack that involved traversing through different users' directories.
I have added this value to a home directory's php.ini file:
open_basedir = /home/USERNAME/public_html:/usr/lib/php:/usr/local/lib/php:/tmp
What I want to know is: is there a way to test that this is functioning properly? Presumably I would want to try to access a file in another user's directory from within the first user's account; however, I don't know of a good way to test this. Any suggestions would be greatly appreciated.
Try reading a different user's public_html folder using one of PHP's own filesystem functions ('anotheruser' is a placeholder). Note that open_basedir restricts PHP's file functions (fopen, include, scandir, ...), not commands run through shell_exec(), so a shell-based test like ls tells you nothing about it:
<?php
var_dump(scandir('/home/anotheruser/public_html/'));
?>
If open_basedir is configured properly, the call fails with an "open_basedir restriction in effect" warning instead of returning the directory listing.
I'm not particularly experienced with .htaccess (outside of simple mod_rewrite and basic deny/allow), and am unsure of how to approach the following issue:
I have a directory structure as follows:
/parentDirectory
/childDirectoryOne
/childDirectoryTwo
I have a domain that points to /parentDirectory (we'll call it parent.com), and separate subdomains for each of the child directories (we'll call them one.parent.com and two.parent.com respectively).
These are all located on a shared host. I need to be able to grant FTP access to the subdirectories, but the problem is that right now someone could upload a PHP file to childDirectoryOne that scans the parent directory, thereby discovering its sibling directory, and can then move into the sibling directory and read sensitive information from files (like a dbConfig file).
What I have been attempting to do (with no success so far) is develop a set of .htaccess files that would prevent the scripts in the child directories from accessing the parent or sibling directories. I'm not even sure if this is possible. Unfortunately, my shared host has no support for setting up a chroot jail, so this is my last option for finding a solution (next to purchasing hosting for each and every FTP user so they can't access others' information).
It's considered bad practice to grant read, write and execute permissions on a folder to people you don't absolutely trust.
The ability to upload an arbitrary script and execute it on the server is a very big deal (their accessing another folder is the least of your worries). People can completely destroy your server and all sites on it, access your DB, overwrite pages in any site, and the list goes on.
I would recommend disabling PHP entirely for uploaded files. You can put this in your .htaccess:
php_flag engine off
That being said, if you really want to do it this way, you can use open_basedir:
<Directory /parentDirectory/childDirectoryOne>
php_admin_value open_basedir "/parentDirectory/childDirectoryOne"
</Directory>
NOTE!
open_basedir alone is not enough: it does not restrict commands run through shell_exec(), exec() and similar functions, so you should also block those. (safe_mode, which used to help here, was removed in PHP 5.4; the modern approach is the disable_functions directive.) Read here fully - https://puvox.software/blog/restrict-php-access-upper-directory/
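A sketch of that directive in php.ini (the exact list to block depends on what your applications legitimately need):
; Block shell execution and process control from PHP entirely
disable_functions = exec,passthru,shell_exec,system,proc_open,popen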