I have a script on "domain 1" which performs a simplexml_load_file on an XML file on "domain 2" like so:
$url = 'http://domain2.com/file.xml';
$xml = simplexml_load_file($url);
$xmlEmail = $xml->xpath("//users/user/email");
$userEmail = implode(", ", (array) $xmlEmail);
However, this XML file contains private info about users, so I don't want it to be directly accessible. Domain 2 will also differ from user to user, so I'm afraid I can't filter by specific domains or IPs.
Is it possible to set permissions that make the file inaccessible to the public, but still readable via the simplexml call? Any alternative methods, perhaps?
Set up HTTP basic authentication for your web server on "domain 2", then change the URL for the XML file like this:
$url = 'http://my_username:my_password@domain2.com/file.xml';
$xml = simplexml_load_file($url);
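If the credentials contain characters that are awkward in a URL, an alternative is to send the Authorization header through a stream context, which SimpleXML picks up via libxml. A sketch, using the same placeholder hostname and credentials:

<?php
// Build a stream context that adds HTTP basic auth to outgoing requests.
$context = stream_context_create([
    'http' => [
        'header' => 'Authorization: Basic ' . base64_encode('my_username:my_password'),
    ],
]);

// Tell libxml (and therefore simplexml_load_file) to use that context.
libxml_set_streams_context($context);

$xml = simplexml_load_file('http://domain2.com/file.xml');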
mod_auth_basic for Apache
Add the following to your .htaccess file, which resides in the same folder as file.xml:
<Files file.xml>
Order Deny,Allow
Deny from all
Allow from domain1.com
</Files>
This tells the web server to deny access to this file unless the request comes from domain1.com, the host your script runs on.
UPDATE:
According to what the question owner said, this XML file will be accessed from different domains. In this case the only reasonable option I can think of is a password. galymzhan already provided the answer, but I will try to extend it into a working solution.
Add the following to your .htaccess file, which resides in the same folder as file.xml:
.htaccess
<Files file.xml>
AuthType Basic
AuthName "Protected Access"
AuthUserFile /full/path/to/.htpasswd
Require valid-user
</Files>
Please note, it's not required to have shell access to the server: you can upload the .htpasswd file via FTP and tighten its permissions (the web server still needs read access to it for authentication to work).
More info about password protection, and some examples, can be found here.
Related
I have a plugin which is using an xml file located in the plugin folder.
example.com/wp-content/plugins/myplugin/myxml.xml
I want to deny access to the file for users, but not to the plugin. If I type the URL I can read the file. I used the following in the .htaccess inside my plugin's folder:
<Files ~ "\.xml$">
Order Allow,Deny
Deny from All
</Files>
I get the 403 error, but now the plugin cannot read the file either. I used Options -Indexes as well.
How can I fix this?
<Files ~ "\.xml$">
Order Deny,Allow
Deny from All
Allow from localhost
</Files>
This will only work if you place it in the site's main .htaccess. The file is then not accessible from outside, but remains accessible to WordPress.
The recommended solution for this issue is to set proper file permissions and user/group ownership, so that the application can access the file but public users can't.
For more information, see the Linux file permissions documentation.
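For instance, a minimal sketch (the path is a placeholder; on most setups you would set this over SSH or FTP rather than in code):

<?php
// Owner (your account) may read and write, group (e.g. the web server
// account) may read, everyone else gets nothing.
chmod('/var/www/example.com/wp-content/plugins/myplugin/myxml.xml', 0640);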
There are a couple of ways to go about this:
Load the file from the filesystem and not over the network if possible (see the sketch at the end of this answer).
Use access control as #Jamie_D has suggested. His code might not work if example.com doesn't resolve to localhost (check your /etc/hosts). If the file has to be accessed over the public internet, use your public IP.
For reference, here is the documentation for mod_access.
Access can be controlled based on the client hostname, IP address, or
other characteristics of the client request, as captured in
environment variables.
And you could also use authentication for that file.
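To illustrate the first option, a minimal sketch, assuming the plugin itself loads the XML with SimpleXML (plugin_dir_path() is WordPress's helper for the plugin's own folder):

<?php
// Read the file through the filesystem instead of over HTTP,
// so the .htaccess rules never apply to the plugin's own access.
$path = plugin_dir_path(__FILE__) . 'myxml.xml';
$xml  = simplexml_load_file($path);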
I want to be able to prevent people from accessing a file on a server, such as a document, if they were to link to it directly via the URL. This is for security purposes, so that documents on the site can't simply be stumbled upon and downloaded...
What is the best approach for this?
I've tried using .htaccess to deny access to .doc and .txt files, for example, but you can still download the files; it just prevents you from accessing the directory, which isn't what I want.
<Files ~ "\.(doc|txt)$">
order allow,deny
deny from all
</Files>
Put it in a directory outside the public space and serve it via a custom PHP page which requires login, or whatever you prefer:
echo file_get_contents('/var/www/example.com/file.txt');
should work, I guess.
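Fleshed out slightly, a sketch assuming your application marks logged-in users in $_SESSION (the session key and path are placeholders):

<?php
session_start();

// Only users your application has logged in may fetch the file.
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Access denied');
}

// The file lives outside the public space, so it cannot be linked to directly.
echo file_get_contents('/var/www/example.com/file.txt');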
Try putting this in your .htaccess
<FilesMatch "\.(doc|txt)$">
Order deny,allow
Deny from all
</FilesMatch>
The best thing to do is to not put it in the web server's document root in the first place.
You can, in your .htaccess, redirect all requests for files in that folder to a special PHP page that only allows logged-in users to download the file and denies everyone else.
It's also a good idea to put the target file itself in a folder above public_html.
I have all of my database credentials within an include file that I wanted to place outside of my web root.
However, my shared hosting plan does not allow me to place files outside of the webroot. Would I have to look at encrypting my file in some way to make sure my credentials are secure?
I had read about a method of producing a kind of fake 404 page, but that doesn't sound very secure to me at all.
I've also taken the step of creating a read-only user account so that if my account is compromised then at least nothing can be overwritten or dropped, but I obviously want to be as secure as I can given the limitations.
You can't.
The best that is possible is to create a PHP file which will be interpreted by the hosting service:
<?php
$DB_USER = 'your_user';
$DB_PASS = 'your_pass';
$DB_INSTANCE = 'your_instance';
When someone accesses your file from a web browser, they won't see anything. When you need the file, just include it.
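For example, a sketch (the file name and connection details are placeholders):

<?php
// Including the file defines $DB_USER, $DB_PASS and $DB_INSTANCE here.
require __DIR__ . '/db_credentials.php';

$pdo = new PDO("mysql:host=localhost;dbname=$DB_INSTANCE", $DB_USER, $DB_PASS);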
You could also add some .htaccess rules (probably) so no one using a web browser will be able to access your file.
Sadly, someone who has read access on the same physical host as you will still be able to access this file, and there is no way to prevent that.
If the server is running Apache and you are allowed to override the directives, this can be achieved by creating a .htaccess file in the webroot with the following lines. Be sure to replace <FILENAME> (including the <>) with the name of the file you would like to deny access to.
#Deny access to the .htaccess file
<Files ~ "^\.htaccess">
Order allow,deny
Deny from all
</Files>
#Deny the database file
<Files ~ "^\<FILENAME>$">
Order allow,deny
Deny from all
</Files>
How can I protect a folder which contains uploaded files?
I have a folder containing all the files uploaded by me. If a user tries to change the URL or path to show all the files, I want the browser to redirect him to another page. For example:
www.tet.php/folder/text.doc
If the user tries to enter (www.tet.php/folder) to show all files, redirect him automatically to www.tet.php.
Or can anyone please tell me a tricky way to make /folder/ disappear?
I don't know PHP, but here is one solution I am thinking about (I don't know whether PHP can handle it or not):
You can put your "UsersUploads" folder outside the website directory, so if your website lives at "c:\website\example.com" you can put it at "c:\UsersUploads". That way your web server has no control over this folder and its files, while your website code still has access to the directory as a normal physical path.
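In PHP that simply means using the physical path. A sketch with the Windows-style path from above:

<?php
// The folder sits outside the web root, but a script can still reach it
// through its physical path.
foreach (glob('C:/UsersUploads/*') as $file) {
    echo basename($file), "\n";
}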
If you use Apache, you have 2 solutions:
Move your uploaded_files folder out of your DocumentRoot (the root of the folders that are accessible from the web).
Use an .htaccess file in that folder to block access to this folder.
A little example of an .htaccess file using authentication to access the folder:
AuthName "Protected administration page"
AuthType Basic
AuthUserFile "/var/www/uploadedfiles/.htpasswd"
Require valid-user
If you use IIS, you just have to deny access to that folder for everyone through your IIS administration console. You can also deny all access except for certain IP addresses or address ranges.
Just to expand on #Clement's answer to include the part about redirecting to a page on failure:
AuthName "Page d'administration protégée"
AuthType Basic
AuthUserFile "/var/www/uploadedfiles/.htpasswd"
Require valid-user
ErrorDocument 403 http://www.urlToRedirectTo.com
Also, .htpasswd should be placed outside of the web root, and should simply contain username:password, where the password is hashed. You can easily find utilities online to create these files for you.
What is the most secure way to password-protect admin files/folders?
I'm on Apache/PHP.
The most secure way is to keep it off the internet altogether ;-)
But irony aside, I'd suggest using .htaccess. It's simple and requires no programming effort from you.
http://www.htpasswdgenerator.com/apache/htaccess.html#8
An alternative to the htaccess method is to put the files that should be protected outside the web-root - somewhere where a typical HTTP request can't reach them - and have PHP relay them back to the client as needed.
This is useful in situations where you need more control over the process than Apache gives you. Like, say: if you wanted to integrate this with your PHP application's member functionality, allowing members that have already logged in to access the files while denying access to others.
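A sketch of such a relay, assuming a $_SESSION flag set by your login code and a placeholder storage directory; basename() guards against path traversal:

<?php
session_start();
if (empty($_SESSION['member'])) {
    http_response_code(403);
    exit('Members only');
}

// basename() strips any directory parts, so "../" tricks can't escape the folder.
$name = basename(isset($_GET['file']) ? $_GET['file'] : '');
$path = '/var/www/protected/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);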
Create a .htaccess and .htpasswd with one of the 10,000 .htaccess generators out there, and use the htpasswd utility included in most distros to add users to the .htpasswd.
Securing admin folder with HTTP Authentication (.htpasswd & .htaccess)
Navigate to http://aspirine.org/htpasswd_en.html to generate a username and password in encrypted form.
Eg:
username: User_name
password: Mypassword
The result will depend on your selected hashing algorithm.
Eg.:
User_name:TX9D66ksKUR0o
Save this in the ".htpasswd" file.
Create the ".htpasswd" file on your web server outside the /public_html directory, preferably one directory above it in the /home folder. It stores the username and password in encrypted form for the HTTP authentication.
Add the following code to the .htaccess file inside the /admin folder on your server. Do not forget to put the correct path to the .htpasswd file in the following code snippet:
AuthType Basic
AuthName "Your_Name"
AuthUserFile /path/to/.htpasswd
Require valid-user
AuthName "Authorisation Required"
require valid-user
# IP
# order deny,allow
# deny from all
# allow from xxx.xx.xx.xxx