I want to protect some folders from normal users, while still being able to see the folder structure myself (from localhost or my IP address). So I do this:
I write a .htaccess file in that folder (e.g. project/cms/js) with the code below:
# no nasty crackers in here!
order deny,allow
deny from all
allow from 192.168.1.7
But this prohibits everyone (including me) from seeing that folder's contents…
How do I protect that folder from other users, but not from myself?
I think you got the allow and deny the wrong way around:
order allow,deny
allow from 192.168.1.7
deny from all
which processes all the allow statements first and the deny statements after that.
I just checked, your above example works fine for me on my Apache 2.
Make sure your IP really is 192.168.1.7. Note that if it's the local machine you're trying to access, your IP will be 127.0.0.1.
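Note that on Apache 2.4 and later, the Order/Allow/Deny directives are deprecated (they are only available through mod_access_compat) in favour of mod_authz_core's Require directives. The 2.4 equivalent of the example above would be:

```apache
# Apache 2.4+ syntax: allow only these addresses, deny everyone else
<RequireAny>
    Require ip 192.168.1.7
    Require ip 127.0.0.1
</RequireAny>
```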
I was wondering if there is an “easy” way to protect a file from being accessed from all domains…
Let's say I want only a few domains to use my script, so they put it in their HTML.
From domain yourdomain.com you write
If your domain is on our “allow” list then you can use it; if not, show an error or just nothing…
Is that possible with PHP?
Or do I have to use .htaccess ?
Probably the most straightforward way is to use a .htaccess file. Note that these directives take IP addresses, not domain names. The code below would allow only 1.1.1.1 and 2.2.2.2; anything else would be denied.
<Limit GET POST>
order deny,allow
deny from all
allow from 1.1.1.1
allow from 2.2.2.2
</Limit>
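If you really want a domain-based check rather than an IP-based one, a common sketch uses mod_rewrite on the Referer header. Bear in mind that the Referer is supplied by the client and trivially spoofed, so treat this as a politeness barrier, not real security; alloweddomain.com and myscript.js are placeholders for your own names:

```apache
# Block requests for the script unless the Referer matches an allowed domain.
# HTTP_REFERER is client-controlled, so this is easily bypassed.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?alloweddomain\.com/ [NC]
RewriteRule ^myscript\.js$ - [F]
```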
I have several folders in my website directory which contain text files called "secure.txt".
For security reasons the URLs of these files are never shown in the web browser, but my website's PHP code searches for these files, which contain sensitive information.
How can I let the PHP code read these files but block access through the URL, so a potential attacker wouldn't be able to read their contents?
Put them outside your document root folder, and place the following .htaccess file in the folder you want to protect. If you don't need access from a particular IP, remove the last line.
order deny,allow
deny from all
allow from 192.168.0
[EDIT:]
To allow php scripts, etc. allow localhost (127.0.0.1)
order deny,allow
deny from all
allow from 127.0.0.1
You should put them in another folder and make the .htaccess deny from all, allow from 127.0.0.1
Old trick for that: prefix the files with <?php die("I'm afraid I can't do that, Jim"); ?>, and call them *.php. On parsing, ignore the prefix.
Edit
Why do this? The rationale is that you avoid a dependency on special webserver configuration, which can be forgotten (when moving to a new server), unavailable (many cheap hosts don't give you .htaccess), not applicable to some webserver software (IIS), etc.
So the reasoning is to trade some computational overhead against flexibility, portability and platform independence.
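A minimal sketch of the trick, assuming the data file is named secure.txt.php and the guard prefix ends with the first `?>` (both names are illustrative):

```php
<?php
// secure.txt.php begins with: <?php die("..."); ?> followed by the real data.
// Requested directly, the die() runs and nothing leaks; read from code, we
// simply skip everything up to and including the closing "?>" of the guard.
$raw  = file_get_contents(__DIR__ . '/secure.txt.php');
$data = substr($raw, strpos($raw, '?>') + 2);
```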
Can you move them out of your website directory altogether? If so, then make sure PHP has access to that directory! (The open_basedir value will need to include it.)
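For instance, in php.ini or a per-vhost setting (the /var/www/private path is just an illustrative name for the out-of-webroot directory):

```ini
; Let PHP read the docroot plus the private directory holding the secure files
open_basedir = "/var/www/html:/var/www/private"
```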
I'd suggest moving the files out of the webroot to be on the safe side
If you use Apache, deny access to all files named secure.txt from within httpd.conf:
<Files secure.txt>
deny from all
</Files>
You may do the same via .htaccess files as well (if your server is configured to allow override access rights from htaccess).
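On Apache 2.4 and later, the same block uses Require instead of the old deny syntax:

```apache
<Files "secure.txt">
    Require all denied
</Files>
```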
But a better solution would be to include the sensitive information into your PHP scripts themselves, or to store it in a database.
I am using a Linux box and PHP. I have many cron tasks scheduled. I want to prevent access to these files if someone tries to access them directly from a browser. How do I do this?
Don't place the files under the web root
Require authentication/authorization
Limit access via IP address
A sample solution using .htaccess:
<Files "cronjobs.php">
Order deny,allow
Allow from allowedmachine.com
Allow from localhost
Deny from all
</Files>
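Independently of the server configuration, the cron script itself can refuse to run when invoked over the web; php_sapi_name() returns "cli" when the script is executed by cron through the PHP command-line binary:

```php
<?php
// Abort unless this script was started from the command line (i.e. by cron).
if (php_sapi_name() !== 'cli') {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}
// ... cron task continues here ...
```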
Perhaps this? Source: https://github.com/h5bp/html5-boilerplate/blob/master/.htaccess
# "-Indexes" will have Apache block users from browsing folders without a
# default document. Usually you should leave this activated, because you
# shouldn't allow everybody to surf through every folder on your server (which
# includes rather private places like CMS system folders).
<IfModule mod_autoindex.c>
Options -Indexes
</IfModule>
I have all of my database credentials in an include file that I wanted to place outside of my webroot.
However, my shared hosting plan does not allow me to place files outside of the webroot. Would I have to encrypt the file in some way to keep my credentials secure?
I had read about a method that produces a kind of fake 404 page, but that doesn't sound very secure to me at all.
I've also taken the step of creating a read-only user account so that if my account is compromised then at least nothing can be overwritten or dropped, but I obviously want to be as secure as I can given the limitations.
You can't.
The best you can do is create a PHP file that will be interpreted by the hosting service:
<?php
$DB_USER = 'your_user';
$DB_PASS = 'your_pass';
$DB_INSTANCE = 'your_instance';
When someone accesses your file from a web browser, they won't see anything. When you need the file, just include it.
You could also add a .htaccess rule so that no one using a web browser can access your file.
Someone who has read access on the same physical host as you will, sadly, still be able to access this file, and there is no way to prevent that.
If the server is running Apache and you are allowed to override the directives, this can be achieved by creating a .htaccess file in the webroot with the following lines. Be sure to replace <FILENAME> (including the <>) with the name of the file you want to deny access to.
#Deny access to the .htaccess file
<Files ~ "^\.htaccess">
Order allow,deny
Deny from all
</Files>
#Deny the database file
<Files ~ "^<FILENAME>$">
Order allow,deny
Deny from all
</Files>
This can apply not only to WordPress, but to any blog platform that can be installed on a server or shared host. So what do you do, via PHP code, plugins, or any other method, to secure your installation?
Limit access to FTP. To do this, upload a .ftpaccess file to the server and paste in the following code:
<Limit ALL>
Deny from all
Allow from Your.IP.Address
</Limit>
Also, limit access to the most important files (like wp-login.php) by uploading a .htaccess file to the same folder:
<Files "wp-login.php">
Order Deny,Allow
Deny from all
Allow from Your.IP.Address
</Files>
I suppose you could modify the .htaccess file to deny permission to /wp-admin* if the IP address doesn't match yours.
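A sketch of that idea: drop a .htaccess into the wp-admin folder itself (192.168.1.7 stands in for your own IP address):

```apache
# wp-admin/.htaccess: only the listed IP may enter
Order Deny,Allow
Deny from all
Allow from 192.168.1.7
```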
Being on a shared host is a real limitation: it prevents you from installing a web application firewall like ModSecurity.
Here is a list of steps you can take to harden your WordPress installation.