I have searched all around but couldn't find an answer. All I want to know is: I have a folder called temp, like
->public_html
-->temp
Now, how do I restrict a user from accessing it from outside the server, so that it produces an error when someone includes it in their PHP script? I want to do this through PHP, not through .htaccess or Apache. I tried chmod, but that restricts the whole website, and only from the local server. I have also tried the constant trick, but if someone knows the constant name, they can still extract the data.
You can't include a remote PHP file's source. If they have allow_url_fopen and allow_url_include enabled and use include('http://yoursite/temp/yourfile.php'), then what gets included is the output of that PHP file, not the PHP source itself.
So when you have a php file with the following contents:
<?php
$password = "secret";
echo "Test";
?>
And someone includes that file remotely, all they get back is "Test", which isn't even valid PHP syntax. They won't be able to see the contents of the file, only what it outputs. The file runs on the remote server (in this case, yours); whoever includes it gets the output after execution on that server.
So you don't have to do anything like if (!defined('SOME_MAGICAL_CONSTANT')) die("Go away!"); that's just plain silly, but unfortunately I've seen it all over the web.
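To see this concretely without setting up a second server, output buffering can simulate what a remote includer receives; a minimal sketch (the file and variable names are illustrative):

```php
<?php
// What secret.php contains vs. what a remote include() receives.
// Simulated here with output buffering: the includer only ever
// sees the rendered output, never the source.
ob_start();
$password = "secret";  // executes on *your* server, never leaves it
echo "Test";
$rendered = ob_get_clean();

var_dump($rendered);                      // string(4) "Test"
var_dump(strpos($rendered, '$password')); // bool(false): no source leaks
```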
.htaccess (or the Apache config) is the only way I know of; I don't think it's possible with PHP. Note that a <Directory> block like the one below belongs in httpd.conf; in a .htaccess file inside temp/ itself you would use just the two inner lines:
<Directory /public_html/temp>
order allow,deny
deny from all
</Directory>
Put an empty "index.html" file inside your "temp" folder. This stops users from seeing a directory listing of that folder, though it does not block access to files whose names are already known.
As for including it in a script, people have to know which files are in the folder before they can use them in a script.
I've had a good look around here for articles relating to this and nothing I see seems to work. Let me start by explaining my situation.
I have a load of files and folders (directories) inside a .htpasswd/.htaccess-protected folder. These sit on a Windows server running XAMPP's Apache.
On this server (and within this folder) I have a single .php page which pulls in database records & assets from within the folder & sub-folders. The assets are organised, but all over the place.
I have a Linux server with a PHP file, and I am trying to embed that single PHP page, ideally using an iframe (as it's easy to format). The issue is that it still asks me for the credentials to log in to the .htaccess-protected site.
I tried creating a PHP file above the password-protected directory to load the inner PHP file using file_get_contents(), but that still asked me for the password.
I tried moving the file outside of the directory, but because all the assets are in the directory, it again asks for the login credentials...
WHAT DOES WORK
I tried editing the .htaccess to allow my server's IP, but this didn't work, since the iframe that loads it runs in the browser. Adding my device's public IP works, which is a nice proof of concept. So I'm wondering: is it possible to have something server-side fetch the content, rather than an iframe which renders & loads browser-side?
Alternatively, any workarounds I have missed?
Thanks
UPDATE 1
If I echo file_get_contents('local_file_path'), I just get a screen of junk:
$fixture) { # check if today $today = date("Y-m-d"); $fixturedate = $fixture['date']; if ($fixturedate == $today) { $this_fixture = ['title'=>$fixture['title'], 'hometeam'=>$fixture['hometeam'], 'homegoals'=>$fixture['hometeamgoals'], 'homecards'=>$fixture['hometeamcards'], 'awayteam'=>$fixture['awayteam'], 'awaygoals'=>$fixture['awayteamgoals'], 'awaycards'=>$fixture['awayteamcards'], 'progress'=>$fixture['progress']]; array_push($game_array, $this_fixture); } } } ?>
# " . date("G:i") . ""; ?>
No fixtures today..."; } echo ''; include "../connection.php"; //Connect to Database # GET Logos $lsql = "SELECT `fixture_list`.*,`logos`.* FROM `fixture_list` JOIN
If I do require 'local_file_path', it's better, but none of the file paths match up, as they're all relative in the original document.
I have a few Ideas you can try:
Option 1
AuthUserFile /var/www/mysite/.htpasswd
AuthName "Please Log In"
AuthType Basic
require valid-user
Order allow,deny
Allow from 127.0.0.1
satisfy any
I've never personally tried this, so you may have to use the server's outbound IP instead. The idea is to allow access from a specific IP (localhost) without the login.
Option 2
You could include the file by filesystem path (not URL):
If I do require 'local_file_path', it's better, but none of the file paths match up, as they're all relative in the original document.
It's possible to set the working directory of PHP using chdir().
http://php.net/manual/en/function.chdir.php
bool chdir ( string $directory )
Changes PHP's current directory to directory.
set_include_path() may also be an option.
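Here is a self-contained sketch of what chdir() buys you in this situation; the temp paths are generated just for the demo, and in practice the wrapper would chdir() into the protected directory before requiring the page:

```php
<?php
// Demo: chdir() changes the directory that relative paths resolve against,
// which is what lets a wrapper script satisfy a file's relative includes.
$tmp = sys_get_temp_dir() . '/chdir_demo_' . getmypid();
mkdir($tmp);
file_put_contents("$tmp/page.php", "<?php echo basename(getcwd());");

$old = getcwd();
chdir($tmp);           // relative lookups now resolve against $tmp
ob_start();
require 'page.php';    // found via the new working directory
$out = ob_get_clean();
chdir($old);

var_dump($out === basename($tmp)); // bool(true)

// cleanup
unlink("$tmp/page.php");
rmdir($tmp);
```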
Option 3
You could use cURL and log in. The example I found uses the command line, but it should also work using PHP's cURL extension.
$output = shell_exec('curl -A "Mozilla" -L "http://user:password@domain.com/api/someapi.php"');
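The same request through PHP's cURL extension might look like the sketch below; the URL, credentials, and the fetch_protected helper name are all placeholders of mine, not from the question:

```php
<?php
// Sketch: fetch a .htpasswd-protected URL with HTTP Basic auth.
// URL and credentials passed in are placeholders.
function fetch_protected($url, $user, $pass)
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,           // return body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,           // follow redirects, like curl -L
        CURLOPT_USERAGENT      => 'Mozilla',      // like curl -A "Mozilla"
        CURLOPT_HTTPAUTH       => CURLAUTH_BASIC, // .htpasswd = HTTP Basic auth
        CURLOPT_USERPWD        => "$user:$pass",
    ]);
    $body = curl_exec($ch);                       // false on failure
    curl_close($ch);
    return $body;
}

// $html = fetch_protected('http://domain.com/api/someapi.php', 'user', 'password');
```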
This may also work with file_get_contents() by adding the user and password to the URL; however, the credentials will probably be recorded in the server access logs, which may or may not matter to you.
It's always preferable to access files on your own server by path: it's much faster than doing a network request, and less visible in the logs.
I'm sure there are a few other tricks.
I have a small page that contains the connection data to my MySQL DB, which I include in other pages that require it. The small code is:
[connect_DB.php]
----------------
<?php
define("HOST", "localhost");
define("USER", "myUser");
define("PASSWORD", "myPassword");
define("DATABASE", "members");
$mysqli = new mysqli(HOST, USER, PASSWORD, DATABASE);
?>
A friend of mine proved to me that he could download the .php file, and he did it in less than two minutes. That means he got the login info for my MySQL server. I was wondering whether there is another way of connecting to the database without putting the password in a file. He recommended that I use SSL or TLS, or simply configure httpd.conf or .htaccess to disallow "exterior" access to the file. Is that correct or achievable?
I am actually testing this on WampServer (Windows 7), and I cannot create a .htaccess file because Windows tells me to enter a name for the file (which I am already doing! :( ) every time I try to name it that way.
I understand that this may be a duplicate question, but believe me, I read a lot of them and I still don't understand what I should do. Thank you in advance.
I am pretty sure that your hosting provider is not parsing the PHP files, because it would not be possible to download the source if it passed through the interpreter.
Make sure you have PHP installed, configured, and activated; contact your provider's support in case of questions. The easiest way to test this is to upload a file:
test.php
<?php phpinfo(); ?>
To protect a file, it's enough to put your database config PHP file outside of the public_html, html, or htdocs directory (one of these is most likely your document root); you can still include() it from PHP there.
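A tiny self-contained demonstration of the idea (the paths are generated just for the demo): the config file sits one level above a simulated document root, yet require works fine by filesystem path:

```php
<?php
// Simulate: config outside the document root, index.php inside it.
$root = sys_get_temp_dir() . '/site_' . getmypid();
mkdir("$root/public_html", 0777, true);
file_put_contents("$root/db_config.php", "<?php \$db_host = 'localhost';");

// From inside public_html, reach one level above the web root.
// Apache would never serve db_config.php, but PHP reads it from disk:
require "$root/public_html/../db_config.php";
var_dump($db_host); // string(9) "localhost"

// cleanup
unlink("$root/db_config.php");
rmdir("$root/public_html");
rmdir($root);
```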
Another solution is to protect the file via .htaccess: put something like this inside the file and upload it to your document root:
<Files db_config.php>
deny from all
</Files>
To add a little more security, you can protect db_config.php by adding this at the top:
if (!defined('IN_MY_PROJECT')) {
die('forbidden');
}
and put this at the top of your index.php:
define('IN_MY_PROJECT', true);
Most likely your server is not set up correctly: the PHP engine is not running, so it's just sending the file contents back as text/html instead of parsing it with PHP. This is the only way a tool like the one you mentioned in the comments could access the file. Such a tool would also not find the file unless there were a link to it somewhere, so perhaps you have directory indexes enabled too.
To test that you have PHP enabled, simply make a file and put in it:
<?php phpinfo(); ?>
If it displays your PHP info then I'm stumped; if not, it's proof your server is misconfigured for PHP.
Once that's fixed, it is good practice to put sensitive files outside of your web root. It's also not a good idea to store your values in constants, as a function like get_defined_constants() will expose the sensitive values. Just put them directly in the constructor arguments.
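For example, here is a version of the connect snippet from the question with no constants at all; the host and credentials are placeholders and this is a sketch, not a drop-in:

```php
<?php
// db_config.php — kept outside the web root.
// The credentials exist only as constructor arguments, so
// get_defined_constants() has nothing sensitive to report.
$mysqli = new mysqli('localhost', 'myUser', 'myPassword', 'members');
if ($mysqli->connect_error) {
    die('Connection failed: ' . $mysqli->connect_error);
}
```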
If you want to block access to a directory so that only PHP can access it, make a .htaccess file in it containing deny from all; this will cause the server to send a 403 Forbidden.
I'm having trouble getting my php page to only run when it is requested by the server itself.
This is what I have right now:
if ($_SERVER['SERVER_ADDR'] == $_SERVER['REMOTE_ADDR']) {
//process page
} else {
$this->redirect('http://' . $_SERVER['HTTP_HOST'] . '/404');
}
However, when I curl it, it doesn't give any errors or return anything at all. If I remove the check, it spits out the HTML as expected.
I tried echoing both of those values and got 192.168.1.186 and 192.168.1.225 respectively. I do realize they are different (this is being run by the server itself), but how can I fix it? This code was from this S.O. answer.
The title of your question lets me provide an answer that doesn't quite match the body of your question.
My response to the title is that putting your entire script inside a giant if statement seems both insecure and unmaintainable.
You would need a guard like this if it were possible for other computers to run your script, say by accessing it from the web.
But if the server is the only machine that should run the script, why not just put it somewhere only the server can access? For instance, one directory level above the web-accessible directory, or in a child directory with 700 permissions. Or use .htaccess to limit access.
That seems both safer and more maintainable.
It is easier and better to use your server configuration to limit file access. You could for instance use a .htaccess file in your specific directory with these contents:
order deny,allow
deny from all
allow from 127.0.0.1
This denies all traffic except from 127.0.0.1 (localhost) and returns a 403 error when someone tries to access these files from another computer.
I'm working on an installer for a project of mine and the installer will create a configuration file.
I have it working 99.99% fine, but I want that file to contain a check so a hacker can't access it directly. That check uses the $_SERVER superglobal, which gets interpolated by PHP every time the installer writes the file, so it breaks the logic I'm going for.
Does anyone know how I can get the superglobal to stay intact, without it being parsed, or should I rethink my logic and add the check elsewhere?
for those who may want to see the code, here it is:
#Disable direct access.
if(!strcasecmp(basename($_SERVER['SCRIPT_NAME']),basename(__FILE__)) || !defined('accessed')){
die('<strong>No direct access is allowed for this file.</strong>');
}
Assuming you are using Apache (or any .htaccess compatible server), you just have to create a .htaccess file in the folder holding your configuration file, containing the following:
<Files config.php>
deny from all
</Files>
It will prevent any access to this file through an HTTP request.
See using .htaccess files for details.
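As for the interpolation problem in the question itself: if the installer writes the guard using a double-quoted string or heredoc, PHP expands $_SERVER at generation time. A nowdoc (a heredoc with a quoted identifier) keeps the text verbatim; a sketch:

```php
<?php
// Nowdoc (note the quotes around 'GUARD'): nothing is interpolated,
// so $_SERVER and __FILE__ are written out literally by the installer.
$guard = <<<'GUARD'
<?php
#Disable direct access.
if (!strcasecmp(basename($_SERVER['SCRIPT_NAME']), basename(__FILE__)) || !defined('accessed')) {
    die('No direct access is allowed for this file.');
}
GUARD;

var_dump(strpos($guard, '$_SERVER') !== false); // bool(true): kept verbatim
```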
Don't use in-script or .htaccess protections - just write the file somewhere outside of the document root. If you don't want something to become available, don't make it available.
Putting it in the document root is like your bank hanging sacks of money in the front window with "do not steal" written on them.
I am separating some XHTML from PHP by putting the XHTML into a separate file and then using PHP's include() function within the PHP script.
This works perfectly fine; however, users can still access the .html file directly if they know the address. They can't really do much with it, but I would rather it not show.
I've seen some scripts in the past use some form of referrer check. Is that what I would do to add some basic (notice I said 'basic') restriction to prevent the file from being viewed directly?
Thanks!
Clarification: I forgot to mention that I want to do this within PHP, with no web-server configuration (moving files out of the document root, configuring the web server to disallow access, etc.). I think the most logical choice here is the define() constant check; that's actually what I'd seen in other scripts and had forgotten, as I outlined in my post. I realize this is probably not the best solution, but given that the accessible HTML file is of no particular value, the define() check should suffice.
If you currently place all your files (like index.php) in /something/public_html/, you will want to move them to /something/. That way users cannot access the files directly.
/public_html/ is called your document root. That folder is mapped to example.com, and the website basically starts there. If you move the files above where the website starts, no one can access them via a browser.
As Ignacio said, this will not work with include if safe mode is turned on.
Other methods are to place something at the top of the file thats says
if(!defined("RUNNING_SCRIPT"))
die("No Direct Access Allowed");
and then in your PHP files put
define("RUNNING_SCRIPT", true);
If RUNNING_SCRIPT is not defined, they are accessing the file directly, and the check stops the page from loading. This only works, though, if PHP runs on the .html files.
You could also use a .htaccess file to disallow access to that folder.
Just move it outside of the document root. This will not work if PHP is in Safe Mode though.
Change your webserver configuration to disallow access to that file?
No, do something like this:
index.php:
<?php
define('ALLOW_INCLUDE', true);
include('other.php');
?>
other.php:
<?php
if (defined('ALLOW_INCLUDE') === false) die('no direct access!');
// your code
?>
It's a good idea to place this as the first line.
You can also use .htaccess or drop in an index.html page as fallbacks.
<?php defined('SOME_CONSTANT_GLOBAL_TO_YOUR_APP') or die('Access denied.'); ?>
Maybe Apache access control?
http://httpd.apache.org/docs/2.2/howto/access.html