Is it possible to scan the /home/ directory with opendir and scandir? When I try to execute the script it says "permission denied". What should I do?
<?php
$dir = '/home/';
$dirs = scandir($dir); // returns false and raises a warning if $dir is unreadable
?>
<pre>
<?php print_r($dirs); ?>
</pre>
You can use is_readable('/home/') to check whether you have permission. If not, you'd need to make sure the directory has read privileges, probably 0755 (rwxr-xr-x).
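A minimal sketch of that check (the echoed message is just an example):
<?php
$dir = '/home/';
if (is_readable($dir)) {
    print_r(scandir($dir));
} else {
    // the web server user lacks read permission on the directory
    echo "Cannot read $dir";
}
?>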
For security, PHP can be configured with an open_basedir, below which you are not allowed to access files. As Aleks G says, there are also the file permissions to consider.
This question talks about how to get around open_basedir restrictions: How can I relax PHP's open_basedir restriction?
Tom Haigh's answer, copied here:
You can also do this easily on a per-directory basis using the Apache configuration file (e.g. httpd.conf), assuming Apache is your web server:
<Directory /var/www/vhosts/domain.tld/httpdocs>
php_admin_value open_basedir "/var/www/vhosts/domain.tld/httpdocs:/var/www/vhosts/domain.tld/zend"
</Directory>
You can also completely remove the restriction with:
<Directory /var/www/vhosts/domain.tld/httpdocs>
php_admin_value open_basedir none
</Directory>
I'm not a PHP programmer, but I think your problem is that when PHP tries the opendir, it is running under the same user ID as Apache, which may not have permission to read your /home directory.
You could fix that by altering the permissions on /home, or by adding the Apache user to whatever group has group ownership of /home.
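For example, either approach could look like this (www-data is an assumption; substitute your distro's Apache user):
chmod o+rx /home                          # give everyone read/execute on /home
usermod -aG $(stat -c %G /home) www-data  # or add the Apache user to /home's group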
Possibly it is a problem in your Apache configuration file. Even if the filesystem permissions permit it, your httpd.conf may not allow access to /home. That would usually be the case if all of your HTML files are in /var/www.
The httpd.conf might be set up to allow serving files out of your users' home directories. In that case permission would be granted for directories within /home, but not for /home itself.
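For illustration, a grant in httpd.conf might look like this (Require all granted is Apache 2.4 syntax; 2.2 would use Allow from all):
<Directory /home>
Options FollowSymLinks
Require all granted
</Directory>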
The answer is in your question: it's a permission problem. Most likely, the user under which Apache is running does not have permission to read the /home directory (or your username, if running from the CLI). Do this manually in a terminal:
ls -ld /home
and check the attributes.
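For example, output like the following (an illustration, not your actual listing):
drwxr-xr-x 5 root root 4096 Jan 10 12:00 /home
would mean the owner (root) has rwx, the group has r-x, and everyone else has r-x, so the Apache user could read it. If the last triplet were --- instead, the web server would be denied.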
Related
Context: I have some files on a Linux web server, for example create_db.txt. They are used in my PHP scripts, but right now anyone can view them via the direct link:
http://url/create_db.txt
What is the right way to deny access to these files while still being able to read and write them from my PHP scripts? Thanks.
If you are using Apache you could restrict access to specific files by adding an .htaccess file in the web root:
<Files create_db.txt>
Order allow,deny
Deny from all
</Files>
The Files section above would restrict access for all users to the create_db.txt file.
Running nginx, the same can be achieved by adding the following to your configuration:
location ^~ /create_db.txt {
deny all;
}
As stated in the other answer, you really should consider moving the file to a directory outside of your web root. Of course the web server must still be able to access this folder. This can be done by setting the correct permissions on the folder, and perhaps by changing its owner to that of the web server. Something like this:
mkdir -m 755 -p /path/outside/webroot
mv create_db.txt /path/outside/webroot
chown -R <user>:<group> /path/outside/webroot
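Your PHP scripts can then read and write the file by its filesystem path; a minimal sketch using the example path from above:
<?php
// no longer reachable over HTTP, but PHP on the server can still use it
$path = '/path/outside/webroot/create_db.txt';
$contents = file_get_contents($path);  // read
file_put_contents($path, $contents);   // write
?>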
The best way would be to never put files that should not be accessible in your web root at all. Just put them in a folder outside it. If this is not possible, put them in a specific folder and deny access through .htaccess or whatever your web server accepts.
I couldn't find anything on the internet about how to create separate PHP jails, so that I can create a "webspace" directory for someone in /var/www/html/ whose scripts cannot leave that webspace directory: the folder would be the root for the PHP scripts in it, I could securely upload my scripts to another directory, and it would be impossible for that person to access files outside their own directory. Is there a solution for creating separate jails, or do I have to use UserDir?
You could make use of the open_basedir setting:
<Directory /var/www/html>
php_admin_value open_basedir "/var/www/html"
</Directory>
See this page for more info: http://php.net/manual/en/ini.core.php#ini.open-basedir
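To jail each webspace separately you would repeat the directive per directory, e.g. (user1 and the /tmp entry are assumptions for illustration):
<Directory /var/www/html/user1>
php_admin_value open_basedir "/var/www/html/user1:/tmp"
</Directory>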
I've seen a lot of questions on here regarding files not being accessible due to permissions with LAMP, but nothing about making files unviewable by the HTTP client using permissions.
I have files and folders in my Apache2 root folder that I don't want people to be able to access via their browser or by other external means. I set the permissions to 770, but this doesn't seem to be enough. Do outside users access files as the Apache user?
I'm running LAMP under Ubuntu Server with few modifications to the defaults, thus my Apache user is www-data, the group is :www-data, and the Apache root is /var/www.
I have a /var/www/_private folder that has 770 permissions and the same permissions on its enclosed files. However, if I access these files through a browser, they are still viewable. Are clients accessing my files as the www-data user? If so, how do I rectify this?
I've worked on hosted setups where setting the "other" permissions to 0 was sufficient for denying outside direct access to files. Do I need to install some extra module to gain this functionality?
Note: I still need my accessible-to-the-client PHP scripts to access these files via includes, fopen, etc...
Well, right, 770 means that the owner of the file and the group can read, write and execute it. I'm going to guess that Apache is the owner of that file, thus allowing it to access it and open it to the world.
Instead of modifying the permissions on the server, and possibly harming the accessibility of the file, why not use an .htaccess file? It instructs Apache to take actions in certain instances, like denying access to a file. Simply create the .htaccess file in the root of the website with:
<Files {your file name here}>
deny from all
</Files>
and you'll deny everyone from accessing it with Apache.
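Note that this only blocks HTTP requests; PHP running on the server is not subject to these rules, so your own scripts keep working, e.g. (hypothetical file name):
<?php
// denied to browsers by .htaccess, but readable server-side
$data = file_get_contents('/var/www/_private/secret.txt');
?>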
And if you want to deny an entire directory:
<Directory /var/www/_private>
Order deny,allow
Deny from all
</Directory>
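On Apache 2.4 the same block is written with the newer syntax (as the last answer in this thread also notes):
<Directory /var/www/_private>
Require all denied
</Directory>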
I am on a shared hosting package on a LAMP stack with no shell access.
I can create symlinks using PHP's symlink() function.
Let's say my web root is /home/www/user1/public
Let's say I have a real directory named /home/www/user1/public/real_dir
And I create a symlink named /home/www/user1/public/fake_dir pointing to real_dir
Why would I get a 403 Forbidden when trying to access www.mydomain.com/fake_dir but not when trying to access www.mydomain.com/real_dir?
It shouldn't be a rights problem, because when I create a file in PHP, I can access it just fine.
I tried switching FollowSymlinks off and on in .htaccess (it was on), but no luck.
Could it be that FollowSymlinks is defined as not overridable in an .htaccess file? Or is there something else to be aware of when working with symlinks in Apache?
Apache has to be configured to allow access to the directory on the filesystem. This has to be done by a system administrator by inserting a <Directory> directive in the Apache configuration files (httpd.conf).
Since the real directory is inside the web root it must be accessible, but FollowSymLinks may not have been enabled for the directory - this also has to be added to the <Directory> directive.
See http://httpd.apache.org/docs/2.0/mod/core.html#directory
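A sketch of what the administrator would add, using the paths from the question (Require all granted is Apache 2.4 syntax; 2.0/2.2 would use Allow from all):
<Directory /home/www/user1/public>
Options +FollowSymLinks
Require all granted
</Directory>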
This is possibly an SELinux security issue. Check the enforcement status:
cat /selinux/enforce
If the value is 1, set it to 0, then restart Apache.
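A sketch of those steps, assuming the standard setenforce and apachectl tools are available:
setenforce 0       # switch SELinux to permissive mode
apachectl restart  # restart Apache (or use your distro's service command)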
I have a script giving me a 403 Forbidden error. It's just a copy of another script; the difference is that it uses a different MySQL class to access the database.
My whole project is complete and this is the last file, so I don't want to redo all the work for a single file.
The server logs show: client denied by server configuration.
What should I look for?
I have tried the following:
Permissions are 644
A new file with just a simple echo gives a 403 too
Changed the name of the folder
However, index.php works perfectly.
Check the permissions and also ownership of the file. Generally, 403 means that the web server doesn't have the rights to read the file and therefore can't continue the request. The permissions may be set correctly, however the file might be owned by another account on the server - an account that isn't part of the same group as the account which is running the server.
For instance, I believe* Apache is run by default under the httpd user account, which is part of the httpd group. However, the FTP user you're logging in as (for instance ftpuser) might not be part of the httpd group. So, in copying the file, you've created it under a different user account and Apache won't get execute access with 644.
* it's been a while since I've used Apache, but it's similar under nginx.
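A few commands that can help diagnose this (paths and user names are placeholders):
ls -l /var/www/html/script.php      # shows the file's owner and group
ps aux | grep -E 'apache2|httpd'    # shows which account the web server runs as
chown www-data:www-data /var/www/html/script.php  # hand the file to that account (Debian/Ubuntu name)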
This issue occurs if you have a deny-all rule in your .htaccess file. Changing that resolves the issue.
I had the same problem. The .htaccess file in my root folder has this code:
<Files ~ "\.(php|php5|py|jsp|cgi|sh)$">
Require all denied
</Files>
But there was a folder /example where I needed to call PHP files, so I created a .htaccess file in that specific folder with this content:
<Files ~ "\.(php)$">
Require all granted
</Files>
Note: I am running Apache 2.4