I am using a Linux box and PHP, and I have many cron tasks scheduled. I want to prevent access to these files if someone tries to open them directly in a browser. How do I do this?
Don't place the files under the web root
Require authentication/authorization
Limit access via IP address
A sample solution using .htaccess (Apache 2.2 syntax; Apache 2.4 needs mod_access_compat for these directives):
<Files "cronjobs.php">
# Deny directives are evaluated first, then Allow overrides them
Order deny,allow
Deny from all
Allow from allowedmachine.com
# 127.0.0.1 is more reliable than "localhost", which requires a reverse DNS lookup
Allow from 127.0.0.1
</Files>
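Since cron normally invokes PHP through the CLI SAPI, you can also add a guard inside the script itself. A minimal sketch (the cronjobs.php name and the 403 response are just illustrations):
<?php
// cronjobs.php - refuse to run unless invoked from the command line.
// Cron runs PHP via the CLI SAPI; any other SAPI (apache2handler,
// fpm-fcgi, cgi-fcgi, ...) means the request came through the web server.
if (PHP_SAPI !== 'cli') {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}

// ... actual cron work goes below ...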
Perhaps this? Source: https://github.com/h5bp/html5-boilerplate/blob/master/.htaccess
# "-Indexes" will have Apache block users from browsing folders without a
# default document. Usually you should leave this activated, because you
# shouldn't allow everybody to surf through every folder on your server (which
# includes rather private places like CMS system folders).
<IfModule mod_autoindex.c>
Options -Indexes
</IfModule>
Related
I have a plugin which uses an XML file located in the plugin folder:
example.com/wp-content/plugins/myplugin/myxml.xml
I want to deny users access to the file, but not the plugin. If I type the URL, I can read the file. I used the following in an .htaccess file inside my plugin's folder:
<Files ~ "\.xml$">
Order Allow,Deny
Deny from All
</Files>
I get the 403 error, but now the plugin cannot read the file either. I used Options -Indexes as well.
How can I fix this?
<Files ~ "\.xml$">
# Order Deny,Allow is required here so the Allow directive can
# override the Deny (with Order Allow,Deny the Deny always wins)
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Files>
This will only work if you place it in the main .htaccess. The file is then not accessible from outside, but it is still accessible to WordPress.
The recommended solution for this issue is to set proper file permissions and user/group ownership, so that the application can access the file but public users can't.
For more information, see the Linux file permissions documentation.
There are a couple of ways to go about this:
Load the file from the filesystem and not over the network if possible.
Use access control as @Jamie_D has suggested.
His code might not work if example.com doesn't resolve to localhost (check your /etc/hosts). If the file has to be accessed over the public internet, use your public IP.
For reference, here is the documentation for mod_access.
Access can be controlled based on the client hostname, IP address, or
other characteristics of the client request, as captured in
environment variables.
And you could also use authentication for that file.
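For the first option, the plugin can read the XML straight from disk, in which case the .htaccess rules never come into play, since they only govern HTTP requests. A rough sketch (plugin_dir_path() is the standard WordPress helper; error handling is up to you):
<?php
// Inside the plugin: load myxml.xml from the filesystem instead of
// fetching it over HTTP.
$path = plugin_dir_path(__FILE__) . 'myxml.xml';
$xml  = simplexml_load_file($path);
if ($xml === false) {
    // the file is missing or not well-formed XML
}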
I'm attempting to deny access to anyone surfing for PHP files in a specific directory:
example.com/inc/
I've created an example.com/inc/.htaccess file with the contents:
Order deny,allow
Deny from all
This results in a 403 Forbidden response when I try to access one of the files. For example: example.com/inc/file.php
The problem is, my web server is also denied access and my application stops working.
How can I deny access to people surfing for such PHP files but allow my shared web server access?
Note: I'm using GoDaddy shared hosting.
I would just use a rewrite rule and block any request the user enters directly. This will block any PHP file that is requested:
RewriteEngine On
# Return 403 Forbidden for any request ending in .php (NC = case-insensitive)
RewriteRule ^.*\.php$ - [F,L,NC]
Edit, based on your comment: try it this way.
<Files ~ "(file|class)\.php$">
# Deny everyone except the addresses listed below
Order deny,allow
Deny from all
Allow from 127.0.0.1
Allow from 192.168.0.1
</Files>
Replace 192.168.0.1 with your server IP address.
Use a proper directory structure: put your files into a lib/ directory, for example, and include them from a file that is not inside that directory. This is how common frameworks work.
You can even map your URL to a web/ directory and put lib/ one level up; then you can be sure there is no URL access to your .php files, only to index.php and the assets.
You can read how this is solved in Symfony2, for example (http://symfony.com/doc/current/quick_tour/the_architecture.html); it will give you some clues.
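As a rough illustration of that layout (the web/ and lib/ paths and the run_app() entry point are hypothetical):
<?php
// web/index.php - the only PHP file inside the public document root.
// lib/ sits one level above web/, so nothing in it is reachable by URL,
// but this script can still include it through the filesystem.
require __DIR__ . '/../lib/app.php';

run_app(); // assumed entry point defined in lib/app.php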
To block navigation access to all files ending in .php you can use:
RedirectMatch 403 ^.*\.php$
To deny access to PHP files only, you can use this:
<Files *.php>
order allow,deny
deny from all
</Files>
I am using Apache on my server and I'd like to allow my visitors to download resources free of charge. However, to preserve even a little bit of bandwidth I'd like to deny direct access to the root folder of the resources like this:
www.myhost.com/resources/file.wav
If the visitor removes file.wav from the URL, they have access to all sounds at once and thus I'd have people just downloading like crazy. I don't want that.
How can I stop the users from going into that root directory, and any subfolders?
The dead easiest way to do this, without even messing with .htaccess (though that is a good idea) is to simply put an index.html file in the folder and set its permissions to 700.
If you want to just turn off directory listings, create an .htaccess file in that directory containing only:
Options -Indexes
(Note that a <Directory> block is not allowed inside .htaccess; it only works in the main server configuration.)
If you want to deny access to sub-directories, note that <Files> matches file names only, never paths; put a separate .htaccess file inside each sub-directory instead, containing:
Order deny,allow
Deny from all
If you want to allow access to just the .wav files, you can do this:
<Files *.wav>
allow from all
</Files>
I read a lot of answers here about protecting a file on an Apache server; some said .htaccess is only good for denying users access to files. That is not what happens for me:
Using the following lines I can't reach the XML file, BUT my PHP script can't either!
<Files sample.xml>
Order allow,deny
Deny from all
</Files>
So how can I protect a file from users while my PHP script can still access it? The file is in the root directory.
If you are on a Linux server and have shell access to it, you can change your file permissions:
chmod 0700 yourxmlfile.xml  # readable/writable/executable only by the file's owner (0600 is enough for a data file)
or in your .htaccess file, you can do this:
RewriteEngine on
RewriteRule ^yourfile\.xml$ 404.html [L]
where your 404.html could be either a page-not-found page or any placeholder page, which will make the URL appear invalid.
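If you don't have shell access, the same permission change can be made from PHP itself with the built-in chmod() function. A one-off sketch (upload it, run it once, then delete it):
<?php
// One-off script: tighten permissions on the XML file.
// 0600 = read/write for the owner only; a data file needs no execute bit.
chmod(__DIR__ . '/yourxmlfile.xml', 0600);
echo 'done';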
If by users you mean people accessing your website from outside, you have several solutions:
The best way would be to move it out of the web root (if you can). That will surely protect it from outside access.
If you can't put the file outside the web root, use an .htaccess file to protect it by allowing access from the local machine only:
Order deny,allow
Deny from all
Allow from 127.0.0.1
I also suggest putting it in a protected subfolder where you could store all your private files.
You can also answer such requests with a 404 by adding the code below to your .htaccess:
RewriteEngine On
# "-" means no substitution; R=404 returns a real 404 response
RewriteRule ^(.*)your_file(.*)\.xml$ - [R=404,L]
I'm loading my files (PDF, DOC, FLV, etc.) into a buffer and serving them to my users with a script. I need my script to be able to access the files but not allow direct access to them. What's the best way to achieve this? Should I be doing something with my permissions, or locking down the directory with .htaccess?
The safest way is to put the files you want kept to yourself outside of the web root directory, like Damien suggested. This works because the web server only maps URLs to paths under the document root, while your scripts can read anything on the filesystem that the server's user is permitted to read.
However, there are a lot of hosting companies that only give you access to the web root. To still prevent HTTP requests to the files, put them into a directory by themselves with a .htaccess file that blocks all communication. For example,
Order deny,allow
Deny from all
Your web server, and therefore your server-side language, will still be able to read them, because the .htaccess block applies only to HTTP requests while the directory's filesystem permissions still allow the web server to read the files.
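For the serving script itself, here is a minimal sketch (the /home/user/private_files path and the file query parameter are hypothetical; adapt the Content-Type to your file types):
<?php
// download.php - stream a protected file through PHP.
// The real files live outside the web root (or behind a "Deny from all"
// directory), so this script is the only way to reach them.
$base = '/home/user/private_files';

// basename() strips directory components, blocking ../ traversal.
$name = basename($_GET['file'] ?? '');
$path = $base . '/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);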
This is how I prevented direct URL access to my .ini files. Paste the following code into the .htaccess file at the root (no need to create an extra folder):
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
My settings.ini file is in the root, and without this code it is accessible at www.mydomain.com/settings.ini.
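Note that PHP can still read the blocked file from the filesystem. A quick sanity check (assuming settings.ini sits next to the script):
<?php
// The <Files> block above only stops HTTP requests;
// a local filesystem read is unaffected.
$config = parse_ini_file(__DIR__ . '/settings.ini');
var_dump($config);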
In httpd.conf, to block browser and wget access to include files, especially ones like db.inc or config.inc. Note that you cannot chain file types in a single <Files> directive; create multiple Files directives instead (or use one <FilesMatch "\.(inc|ini)$"> regex).
<Files ~ "\.inc$">
Order allow,deny
Deny from all
</Files>
To test your config before restarting Apache:
service httpd configtest
then (graceful restart):
service httpd graceful
Are the files on the same server as the PHP script? If so, just keep the files out of the web root and make sure your PHP script has read permissions for wherever they're stored.
If you have access to your httpd.conf file (on Ubuntu it is in the /etc/apache2 directory), you should add the same lines that you would put in the .htaccess file for the specific directory. That is (for example):
ServerName YOURSERVERNAMEHERE
<Directory /var/www/>
AllowOverride None
Order deny,allow
# Mixing relative (-Indexes) and absolute (FollowSymLinks) options is a
# syntax error, so FollowSymLinks needs an explicit sign too
Options -Indexes +FollowSymLinks
</Directory>
Do this for every directory whose contents you want to control, and you will have one file in one place to manage all access. In the example above, I did it for the root directory, /var/www.
This option may not be available with outsourced hosting, especially shared hosting. But it is a better option than adding many .htaccess files.
To prevent web access to .ini files, put the following into apache2.conf:
<Files ~ "\.ini$">
Order allow,deny
Deny from all
</Files>
How about a module-conditional .htaccess script like the one used in CodeIgniter? I tried it and it worked well in CodeIgniter apps. Any ideas on using it in other apps?
<IfModule authz_core_module>
# Apache 2.4 syntax
Require all denied
</IfModule>
<IfModule !authz_core_module>
# Apache 2.2 syntax
Deny from all
</IfModule>