Currently I have a bit of a 'different' set-up. My main files are on server1; this server simply delivers the content through PHP and MySQL. But that's just the front end. In the back, on server2 (my home server), are a lot of different scripts doing various things that couldn't be done on server1, since it's a shared host and thus resources are limited.
This setup works great. If server2 loses power or something, the site won't be updated, but what's there is still available, and it can just catch up once back online. But here's the problem: all scripts on the home server are wide open for everyone to execute. As an example, my database-syncing script:
Server1 detects its database hasn't been synced up with the one on server2, so it initiates the syncing script.
<?php
//This initiates the script on server2, which then dumps its database into a .sql file
file_get_contents('http://server2.x.com/backup/backup_mysql.php');
//This reads out and saves said database file locally for processing
$myresult2 = file_get_contents('http://server2.x.com/backup/backups/db-backup.sql');
file_put_contents_atomic("backups/db-backup.sql", $myresult2);
//This will delete the backup file from server2
$deleteurl = 'http://server2.x.com/backup/backup_mysql.php?delete=true';
$myresult3 = file_get_contents($deleteurl);
//This initiates bigdump for processing the sql file
include_once('bigdump.php');
?>
As you can see, this opens up some obvious security flaws. *backup_mysql.php* can be used by anyone who knows the address of server2, and even once that's fixed, someone who monitors the /backup folder can retrieve the SQL backup before my script deletes it again.
How do I prevent all this from happening?
You can use Apache2 webserver directives to deny access to certain locations from all IPs except your own.
You should look into using HTTP authentication to prevent access to everything published by server2. This way you'd be able to lock everyone else out of server2 with minimum hassle.
With HTTP auth in place, your file_get_contents calls would need to change to include the credentials, for example
file_get_contents('http://user:pass@server2.example.com/data.php');
If you are worried that someone might sniff the credentials from the network, then you can also move to HTTPS. Since both the server and its only user will be your own applications, you can create the certificates yourself and make your scripts accept them as valid.
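As a rough sketch of what that could look like in PHP (the endpoint, credentials and certificate path below are placeholders, not anything from the question):
<?php
// Placeholder credentials for the HTTP Basic auth you set up on server2
$credentials = base64_encode('backupuser:secret');
$context = stream_context_create(array(
    'http' => array(
        // Attach the Basic auth header to the request
        'header' => "Authorization: Basic $credentials\r\n",
    ),
    'ssl' => array(
        // Trust only the self-signed certificate you generated for server2
        'cafile'      => '/path/to/server2-selfsigned.pem',
        'verify_peer' => true,
    ),
));
// Same idea as the original call, now over HTTPS with authentication
$dump = file_get_contents('https://server2.example.com/backup/backups/db-backup.sql', false, $context);
?>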
You can protect your files or folders through .htaccess:
Password protecting your pages with htaccess
Here's another solution besides .htaccess:
server1/index.php
server1/config.php
server2/index.php
Let's say you don't want people accessing config.php directly: use the define() function. Define a unique constant and check whether it's defined in the config.php file before setting any variables / methods.
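A minimal sketch of that idea (the constant name and settings are just placeholders):
index.php:
<?php
// Mark that this request came in through the real entry point
define('IN_APP', true);
require 'config.php';
?>
config.php:
<?php
// Bail out if the constant isn't set, i.e. the file was requested directly
if (!defined('IN_APP')) {
    die('Direct access not permitted');
}
$dbHost = 'localhost'; // ...settings go here...
?>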
Related
I need to run a localhost php file (without uploading it to my website).
I have a php file called myfile.php on my website. Here is the code.
<?php
include "local_file.php";
?>
Is there any way to make this program work such that I store local_file.php on my local server (localhost), and still access it using the above code (myfile.php), which is stored on my website?
You can use PHP's eval() function and load code as plain text through HTTP (I assume that you have an HTTP server installed, a public IP address, and the relevant ports forwarded on your router).
$code = file_get_contents("http://your_public_ip/local_code.txt");
eval($code);
Remember to use your public IP and to store the PHP code in a file with a different extension, so that the code won't be executed on localhost.
Generally, I strongly discourage executing code from a remote location, as it is a big security hole. Consider whether you really need to execute the included code. If you just want to include HTML code or some data, use file_get_contents() and print it using echo.
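For that data-only case, the sketch is just a fetch-and-print (the URL is a placeholder):
<?php
// Fetch plain HTML/data from the local machine and output it, without ever executing it
echo file_get_contents("http://your_public_ip/local_data.html");
?>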
More: http://www.php.net/manual/en/function.eval.php
I believe you can; you have to disable a security feature in PHP that allows only local includes. Then you will need to have a static IP or dynamic DNS set up to point to your local machine. Your local machine will then need to be set up to share the file. I don't recommend this, though, as it opens a lot of potential security holes. Here is some more info:
including php file from another server with php
The security setting you are looking for is: allow_url_include
http://www.php.net/manual/en/filesystem.configuration.php#ini.allow-url-include
If you are going to go this route I would highly recommend blocking all connections to your localhost other than a single whitelisted IP address.
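If you have to do that blocking in PHP itself rather than in the web server or firewall, a simple sketch (the IP address is a placeholder) could be:
<?php
// Hypothetical whitelist: the only IP allowed to request this script
$allowed_ip = '203.0.113.10';
if ($_SERVER['REMOTE_ADDR'] !== $allowed_ip) {
    header('HTTP/1.0 403 Forbidden');
    exit('Access denied');
}
// ...the rest of the shared file...
?>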
I've got a file on my site called dbSettings.php with the lines:
<?php
$host = "localhost";
$dbName = "database";
$user = "user";
$pwd = "pass";
$db = new mysqli($host, $user, $pwd, $dbName);
?>
I include this file in my main page with the require_once() function. Is there ANY way for someone who's reaching this page through the server (my domain) to get hold of the values stored in these variables? Is this a secure way to keep the database settings?
This is a common way to keep database credentials in PHP applications. Generally a config file would keep those settings; some good practices are:
Set proper file permissions on the file, for example:
chmod the file to 640 instead of 600, keep the file owned by your user, and change the group to the webserver's group. This way, the webserver can only read the file, not modify it
Move the file out of the webroot so it's not directly accessible by others
Only give needed database privileges to that database user
If the user just needs to access one database, only give privileges for that database and its data; don't give structure- or administration-related privileges if they're not needed
If possible, protect it with .htaccess:
<files dbSettings.php>
order allow,deny
deny from all
</files>
If your webserver is configured properly, web clients should not be able to view the settings. When a .php file is accessed through the webserver, it executes the script, and returns what it prints to the client. The source code of the script is not sent to the client.
You can make things more secure by putting files like this outside your webroot. Then clients can't even address them, they can only be accessed by being required from another script.
From what I know of PHP security, there are a couple ways for this information to leak:
If dbSettings.php can somehow be loaded by a client without the server processing the page. This would obviously give out the details, since they can just look at the file. Generally speaking, if your server is correctly configured and you have no backup copies of the file lying around, this shouldn't be an issue.
If dbSettings.php is loaded directly by the client, you want to make sure no errors can leak the information (so, for instance, the mysqli creation can't fail and leak the credentials). Since your app isn't initialized, it is possible for errors you regularly mask to appear.
As for securely storing the information, I don't believe there is a much better solution than to do as Barmar suggests, and put the configuration files somewhere the web server doesn't actually serve files. PHP can still generally access these files, but the web server won't give them out to clients. Depending upon your hosts, this may or may not be possible.
Make sure that your dbSettings.php is not accessible through a URL. PHP must be able to read this file, but it should not be accessible by typing its address into the browser (by guessing, luck, etc.). If you have a properly configured web server you should be fine, because the PHP will be processed and a blank page will be returned to the user.
Only if you misconfigure your web server so that it does not process PHP pages but returns them to the user will the file be available for others to download. So again, your settings must not be accessible from outside.
Besides that, you should never echo, dump, etc. those values from your PHP code on a production server, and you should also catch any errors from the MySQL driver so they are not printed out to the user.
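As a sketch of that last point, using the variable names from the question and assuming the mysqli driver, you can make the driver throw exceptions and log them instead of letting them reach the browser:
<?php
// Have mysqli throw exceptions instead of printing warnings that may contain credentials
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
try {
    $db = new mysqli($host, $user, $pwd, $dbName);
} catch (mysqli_sql_exception $e) {
    // Log the real error server-side; show the visitor nothing sensitive
    error_log($e->getMessage());
    exit('Database connection failed.');
}
?>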
The most important security measure, as outlined by previous answers, is proper webserver configuration.
As you are creating a configuration file that should never be directly accessed, you can safely place it outside the server root, or in a directory for which the server is configured to not serve files. As long as the file has proper permissions, you can include() it without the webserver's help.
If for whatever reason you choose to stick with having the configuration file in an accessible location, you can use a define() trick to prevent unauthorised execution. This is done by defining a constant in the file that includes the config file, prior to the actual include(), then checking for the existence of the constant in the config file itself.
example.php:
<?php
define ('my_const', 1);
include ('config.php');
echo $super_secret_data;
?>
config.php:
<?php
if (!defined ('my_const'))
die ();
$super_secret_data = 42;
?>
Of course, the only security benefit here is that executable code will not run without the constant if the page is accessed directly, and that its data will not be accessible by files that do not define the const (which is pretty bad security through obscurity, as you can probably still call file_get_contents() on the file).
This may be a really stupid question... I started worrying last night that there might be some way to view PHP files on a server via a browser or some other means on a client machine.
My worry is, I have an include file that contains the database username and password. If there were a way to put the address of this file into a browser or some other system and see the code itself then it would be an issue for obvious reasons.
Is this a legitimate concern?
If so how do people go about preventing this?
Not if your server is configured right. I think discussion on how that is done belongs on serverfault.
To add on to the other answers:
If you use a file extension like .inc there's indeed a higher risk. Can you open the file directly in your browser?
The most important advice is missing:
Only the files that should be accessed by a browser, should be in a publicly accessible location. All the other code (and configuration) should be in a completely separate directory.
For example
root
- webroot
- includes
- config
Only 'webroot' is exposed by your webserver (apache). Webroot can for example contain a single index.php, along with all your assets (javascript, css, images).
Any code index.php needs to load comes from 'includes' and all the configuration from 'config'. There's no way a user could ever directly access anything from those 2 directories, provided this is done correctly.
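A sketch of how webroot/index.php could pull everything in from those non-public directories (the file names under includes/ and config/ are made up for the example):
<?php
// webroot/index.php - the only script the webserver serves directly.
// 'config' and 'includes' live one level above the webroot, so they can never be requested over HTTP.
require_once dirname(__DIR__) . '/config/config.php';
require_once dirname(__DIR__) . '/includes/functions.php';
// ...build and output the page using the loaded configuration and helpers...
?>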
This depends on the file extension you have given the include file.
If the extension is one that is known and executed by the web server, it will be protected. If you browse to the file, the server will try to execute the code rather than just returning it as plain text.
If the extension is not known by the web server it will serve it as plain data, so anyone (who can guess the file name) can browse to the file and see the source code.
A Directory Traversal Vulnerability can be used to obtain files off of the remote machine. Alternatively, you can use MySQL-based SQL injection to read files using load_file(). You can also test your system with w3af's urlfuzzer, which will look for "backup files" such as index.php.zip. Also make sure that all files have .php extensions; a .inc file can be viewed by the public. I would also disable Apache directory listings.
Normally there should be no way to view the PHP files remotely... it would be absolutely pointless. This completely depends on what web server you are using and how it's set up, though.
Having looked around I can see that it is possible to protect a directory via the .htaccess by adding these lines:
Order allow,deny
Deny from all
This apparently protects the directory so that only local, non-web access is possible.
This allows me to keep my includes in a subdirectory of the main site directory which is good for organisation and it can be used on the projects where I do not have access to folders outside the web root.
Does anyone else use this method?
Just for good measure I've put the directory permissions to execute only.
And the include extension is PHP as suggested by others.
I would like to ensure that any scripts that are trying to "include" my database connection file are located under my own domain. I don't want a hacker to include the database connection file to their malicious script and gain access to my database that way. My connection file's name is pretty easy to guess, it's called "connect.php". So without renaming it and taking the security through obscurity route, how can I protect it by making sure all connection requests are made by scripts residing under my own domain name? How can this be checked using PHP?
Generally speaking, if someone tries to include a file on your domain, they will see the results of the execution of that file. What do you see when you load the connect.php script in your web browser? That's what they'll see as well if they try to include a remote file.
That said, it's generally a good idea to keep important files inaccessible from the outside of your public web space. So, if your website is /var/www/yoursite/ then keep your connect.php in /some/dev/dir/yoursite and include the files from your pages using require_once '/some/dev/dir/yoursite/connect.php';
thetaiko's answer addresses the fundamental issues here - but if anyone else has access to run code on the server (i.e. it's a shared server) then access to the file will depend on how the server is configured.
There are lots of ways that access might be constrained - e.g. suPHP, open_basedir, multiple chrooted servers. The only way to find out what's going on for sure is to cast yourself in the role of the hacker and see if you can access files outside your designated area.
C.
What do you mean by including your connection file? If a script does include "connect.php" then they can see the source code of the file, so whatever security measures you add to that file will be pointless, as it will be like:
if($notFromHostname)
{
echo "DONT LOOK AT THIS";
die();
}
define('DB_PASS',"myPassword");
...
And the "hacker" will clearly be able to see your password. You are probably better off using something like iptables to deny hosts that are not from a specific domain.
Are you on a shared server and don't want other users of the same server instance to be able to get at your files? That'd be up to your server provider, then, to provide some sort of chroot or virtual system to keep your things in. For Apache, mod_suid can accomplish this nicely, and each vhost gets its own userid and permissions set.
If you want external users to not be able to get at your files, then unless you've badly munged your code or the server's badly misconfigured, all they'll get when they visit http://yourserver.com/connect.php is a blank page.
No other user than yourself should have access to your PHP files in any way, as Felix mentioned. However, this is how you'd check in PHP:
if($_SERVER['SERVER_NAME'] != "example.com")
die("I've been kidnapped!");
I have a PHP script, PayPal eStores/dl_paycart, but it has the PayPal eStores "settings.php" Security Bypass Vulnerability.
I would like to know if I can prevent direct access to a php include file.
Would this help?
defined( '_paycart' ) or die( 'Access to this directory is not permitted' );
Thank you
I would STRONGLY recommend finding some new script. Any sort of blocking is just sticking a finger in the dam; it isn't a permanent solution and eventually it's going to break.
If you really want to use it, check out htaccess files, particularly "Order Allow,Deny" and "Deny from All"
The problem is that if someone is able to use "include" and read the code contents, variables, and the like, that means that they are already operating on the same server and, to be a bit crude, you're boned if they try to screw with you.
On the other hand, if you're looking to prevent outside access to the file from a remote server, then the include call can only retrieve the values which would be displayed to any external site (and if the question is, "Can I prevent external sites from even loading this file remotely", the answer is "through server configurations in httpd.conf and .htaccess files").
The long and the short, however, is that this is not something which can really be fixed with PHP, this is a server security issue.
The fact that the script has a .php extension offers some protection - any http or https call for that file will go through the web server which is going to execute the php before serving the request.
I would recommend moving the script to a directory under your public web directory and putting an .htaccess file in that directory that either blocks all requests or requires a password to access it. Then include the script when needed from scripts in your public directory. See Apache's .htaccess Tutorial.
Probably the most secure way is something like this
$allowed_files = array("/paths/", "/that/", "/are/", "/allowed/");
if(!in_array($_SERVER['PHP_SELF'], $allowed_files))
{
die("Not Allowed");
}
Fill the array with the files that you would like to have access. (You might have to echo $_SERVER['PHP_SELF'] on each page you want to allow and copy and paste the value in.) This will check to make sure that the file being executed is one of the allowed pages. If it isn't, the script will die.
I believe $_SERVER values might be able to be changed, but probably won't be. This file can still be read using fopen or file_get_contents, and if someone reads it, they will know what to change.
But I would forewarn, it is not 'completely secure', because there isn't really a way to make something 'completely' secure.