Set some web directories as restricted directories - php

I am building a PHP web application with Apache.
There are a few configuration files (like App.yml) whose content I don't want to expose to users under any circumstances. Is there any way I can tweak my Apache settings so that these files won't be available when hostile users request them?

The best option would be to place the files outside of your document root.
If that's not possible, you can deny access to them in the Apache .conf file (or a .htaccess file) with
<Directory /path/to/dir>
Deny from all
</Directory>
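If you can place the files outside the document root, here is a minimal PHP sketch of reading such a config file from there (the ../config/App.yml location is just an assumption; parse the YAML with whatever library you already use):
<?php
// App.yml lives one level above the document root, so Apache can never serve it,
// but PHP can still read it from the filesystem.
$configPath = dirname($_SERVER['DOCUMENT_ROOT']) . '/config/App.yml';

if (!is_readable($configPath)) {
    die('Configuration file is missing or unreadable.');
}

$raw = file_get_contents($configPath);
// ...hand $raw to your YAML parser of choice (e.g. yaml_parse() or symfony/yaml).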

You can create a .htaccess file in that directory and place the following in it:
order deny,allow
deny from all
You can also do this if you only want to block one file.
<Files filenamehere>
order deny,allow
deny from all
</Files>

Related

Only Allow To Access Folders Via Code With .htaccess

I want to hide my uploads folder, but I still want to access it via PHP. Is this possible with .htaccess?
I tried something but it didn't work.
<files "/uploads">
order allow,deny
deny from all
</files>
<folders uploads>
Order Allow,Deny
Deny from all
</folders>
You're almost there, but it also depends on which version of Apache you're using.
The method you're trying above is for blocking access to a specific file. If you want to block a folder, add a .htaccess file to that folder and just use:
Below 2.4:
deny from all
2.4 or above:
Require all denied
IMPORTANT EDIT
You can just upload a .htaccess to the folder that you want to block with the following:
Deny from all
If there is some issue, add:
Allow from 127.0.0.1
It worked well for me.
Original answer
Try:
<Directory "/uploads">
Order allow,deny
Deny from all
Allow from 127.0.0.1
</Directory>
EDIT:
The code above will deny everyone except the local IP (of your server).
As thickguru said, it also depends on your Apache version. Here are some other ways to do it:
<Directory "/uploads">
Require local
</Directory>
It will only allow access if the request comes from the server itself (your script or something else running on the server).
Or:
<Directory "/uploads">
Require ip 127.0.0.1
</Directory>
The same as above, but using the local address. You can also add other IPs to it; all the IPs you add there will be allowed to access the folder.

Allow a specific PHP file to access a hardened wp-content folder

1) My wp-content is hardened with a .htaccess file containing this code:
<Files *.php>
deny from all
</Files>
2) I want (need) to authorize xml-sitemap-xsl.php. Otherwise I get this error in my error log: client denied by server configuration: /home/user/mysite.net/wp-content/plugins/wordpress-seo/css/xml-sitemap-xsl.php, referer: http://mysite.net/sitemap_index.xml
3) I think I should add the following code, but I'm not sure if it's the right code or where to place it:
<Files "xml-sitemap-xsl.php">
Allow from all
</Files>
The thing I want to avoid is a conflict between the deny and allow commands.
Thanks,
P.
This doesn't have much to do with WordPress and I am not an expert on .htaccess, but I believe your file is not denying access to your directory by all .php files; rather, it is denying access to all the .php files inside the directory.
The <Files> directive is used to add specific rules to specific files and, as far as I know, it cascades.
Considering your comment, this should do the trick
<Files *.php>
deny from all
</Files>
<Files "xml-sitemap-xsl.php">
Order Allow,Deny
Allow from all
</Files>
see: Deny direct access to all .php files except index.php

.htaccess allow access to files only from includes

I have various subfolders on my website and I would like the user not to be able to access them through the URL, but at the same time I want my main PHP files to be able to include them or use them as actions on forms or links.
I tried using a .htaccess file with
<Files *>
Order Allow,Deny
Deny from All
</Files>
but it denied all access, even from within my own scripts. Logical, as I found out, but I don't know how to make it work. Any ideas?
P.S. My main concern is that some of the files are not included in the main PHP files, BUT they are linked there, and their code ends with a header('Location: ../index.php'); that returns to the main page of the project.
I see a lot of answers with Allow,Deny, not Deny,Allow.
The order here matters and is what is causing the problem. You are telling the server that Deny is more important than Allow, because it is listed last. To show you, if you say:
<Files .htaccess>
Order Allow,Deny
Deny From All
Allow From xxx.xxx.xxx.xxx 127.0.0.1
</Files>
you are saying first allow anyone allowed, then deny all... which still denies ALL.
If you reverse it to Deny,Allow, you are saying deny all, then allow anyone allowed.
<Files .htaccess>
Order Deny,Allow
Deny From All
Allow From xxx.xxx.xxx.xxx 127.0.0.1
</Files>
The Allow directive, being the final command and therefore the more important one, allows the hosts listed after Allow From.
xxx.xxx.xxx.xxx = your IP
Do this:
<Files *>
Order Deny,Allow
Allow from 192.168.100.123 127.0.0.1
Deny from all
</Files>
The list of IPs should be the specific hosts you allow, like localhost.
This also works with the <Directory> directive, not just <Files>, if you only want certain directories blocked.
There is an even safer method. Store your include files outside the web-accessible folders. So if your web files are here...
/var/www/mysite.com/
Store your include files here:
/var/includes/
Then include them with a full path...
include '/var/includes/myincludes.inc.php';
From the web, the myincludes.inc.php file is completely inaccessible.
Usually, to protect these logic files from public access, you can:
- put them in a protected directory, above htdocs
- add a check for an entry-point constant, e.g. if (!defined('SOME_ROOT_CONST')) { die(); } (see the sketch after this list)
- change the extension to .inc or something similar, and deny access with .htaccess based on that
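A minimal sketch of the constant check (the SOME_ROOT_CONST name and the includes/config.inc.php layout are just placeholders): the public entry point defines the constant, and every protected include refuses to run without it.
<?php
// index.php (public entry point): define the guard constant before loading any includes.
define('SOME_ROOT_CONST', true);
require __DIR__ . '/includes/config.inc.php';

<?php
// includes/config.inc.php: bail out if the file is requested directly over HTTP,
// because then SOME_ROOT_CONST was never defined.
if (!defined('SOME_ROOT_CONST')) {
    die('Direct access not allowed.');
}
// ...the rest of the protected configuration/logic continues here.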
Put your application code outside of your public html folder. Then you can add an include path at the top of your scripts so they can access those files as if they were in the same folder.
http://php.net/manual/en/function.set-include-path.php
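A minimal sketch of that approach (the /var/app/includes directory and db_functions.php file are just example names):
<?php
// Prepend a directory that lives outside the document root to PHP's include path.
set_include_path('/var/app/includes' . PATH_SEPARATOR . get_include_path());

// This now resolves to /var/app/includes/db_functions.php, even though the
// calling script itself sits inside the public html folder.
require 'db_functions.php';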
In your Apache configuration you will have to specify which IPs and hosts you want to allow, and you can do it per directory as well. For example:
<Directory /dir/to/block>
Order Allow,Deny
Allow from 192.168.0.1 4.4.4.4
Deny from All
</Directory>
<Directory /dir/to/allow>
Order Allow,Deny
Allow from All
</Directory>

How can I stop PHP and other files from working in a specific directory on my site

I have folder containing several files and sub-folders such as the following:
-/folder
-/subfolder
-/subfolder
-/subfolder
-/etc...
-index.html
-control.php
-ads.php
-/etc...
Now I want to disable PHP and other files in the main directory only (-/folder), such as -index.html and -control.php, but I want all the files in the sub-folders to keep working.
I want this to affect the files within that main folder only.
If I understand the question correctly, you need a .htaccess in the given directory with this content:
deny from all
Edit: Since, as far as I know, .htaccess rules apply to all subdirectories as well, you might have to create a .htaccess file in each subdirectory to override the restriction specified for the root:
allow from all
The configuration directives found in a .htaccess file are applied to the directory in which the .htaccess file is found, and to all subdirectories thereof.
http://httpd.apache.org/docs/current/howto/htaccess.html#how
You can use multiple <Directory> directives so that the more specific one overrides the first:
<Directory /path/to/dir>
Order Allow,Deny
Deny from All
</Directory>
<Directory /path/to/dir/subdir>
Order Deny,Allow
Allow from All
</Directory>
If you are using Apache, you can add a <Directory> block like the following to your virtual host for each sub-directory:
<Directory /path/to/dir>
Order allow,deny
Deny from All
</Directory>
See http://httpd.apache.org/docs/2.2/howto/access.html for details.
(I hope I understood your need.)
Edit
If you need to filter files instead of directories, you can use glob matching like this:
<Files *.php>
Order allow,deny
Deny from All
</Files>
See http://httpd.apache.org/docs/2.2/en/mod/core.html#files

Block direct access to a file over HTTP but allow PHP script access

I'm loading my files (pdf, doc, flv, etc.) into a buffer and serving them to my users with a script. I need my script to be able to access the file but not allow direct access to it. What's the best way to achieve this? Should I be doing something with my permissions or locking the directory down with .htaccess?
The safest way is to put the files you want kept to yourself outside of the web root directory, like Damien suggested. This works because the web server only serves what is under the document root, while your scripts can read anything the file system permissions allow them to.
However, a lot of hosting companies only give you access to the web root. To still prevent HTTP requests to the files, put them into a directory by themselves with a .htaccess file that blocks all communication. For example,
Order deny,allow
Deny from all
Your web server, and therefore your server-side language, will still be able to read them, because the directory's local file permissions allow the web server to read and execute the files.
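For the original question (serving pdf/doc/flv files through a script), here is a rough sketch of such a gatekeeper script, assuming the protected files sit in /var/private_files outside the document root (the path and the file parameter are illustrative, and you would add your own authorization check):
<?php
// download.php: streams a protected file to the browser after your own access checks.
$dir  = '/var/private_files';             // directory outside the document root
$name = basename($_GET['file'] ?? '');    // basename() strips any ../ path tricks
$path = $dir . '/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found.');
}

// TODO: verify the user's session/permissions here before serving the file.

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);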
That is how I prevented direct URL access to my .ini files. Paste the following code into the .htaccess file in the root (no need to create an extra folder):
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
My settings.ini file is in the root, and without this code it is accessible at www.mydomain.com/settings.ini.
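The block only applies to HTTP requests; PHP can still read the file through the filesystem. A minimal sketch (the [database] section and host key are just examples):
<?php
// settings.ini stays readable to PHP even though direct URL access is denied.
$settings = parse_ini_file(__DIR__ . '/settings.ini', true);

echo $settings['database']['host'] ?? 'host not configured';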
In httpd.conf, to block browser and wget access to include files (especially, say, db.inc or config.inc). Note that you cannot chain file types in the directive; instead, create multiple <Files> directives.
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
To test your config before restarting Apache:
service httpd configtest
then do a graceful restart:
service httpd graceful
Are the files on the same server as the PHP script? If so, just keep the files out of the web root and make sure your PHP script has read permissions for wherever they're stored.
If you have access to your httpd.conf file (in Ubuntu it is in the /etc/apache2 directory), you should add the same lines that you would add to the .htaccess file in the specific directory. For example:
ServerName YOURSERVERNAMEHERE
<Directory /var/www/>
AllowOverride None
order deny,allow
Options -Indexes +FollowSymLinks
</Directory>
Do this for every directory whose information you want to control, and you will have one file in one spot to manage all access. In the example above, I did it for the root directory, /var/www.
This option may not be available with outsourced hosting, especially shared hosting, but it is a better option than adding many .htaccess files.
To prevent web access to .ini files, put the following into apache2.conf:
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
How about a module-check-based .htaccess snippet (like the one used in CodeIgniter)? I tried it and it worked well in CodeIgniter apps. Any ideas on using it in other apps?
<IfModule authz_core_module>
Require all denied
</IfModule>
<IfModule !authz_core_module>
Deny from all
</IfModule>
