How to make .PHP file only accessible to the server? - php

I created a cron job through the GoDaddy control center.
The cron job is in the folder "cron jobs".
I don't want anyone to be able to run it, how should I set the permissions of the folder so that it can't be publicly opened but it still can be used for the cron job?
Will unchecking Public > Read be enough to prevent anyone from running it?

Just put the files outside of the webroot/document root folder.

In the Apache server config, add this (note that <Location> blocks are not valid inside .htaccess):
<Location /cronjobs>
order deny,allow
deny from all
allow from 127.0.0.1
</Location>
I included allow from 127.0.0.1 so it can be run from the server, i.e. so the cron can still run.
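On Apache 2.4 and later the Order/Deny/Allow directives are deprecated; a 2.4-style sketch of the same block (assuming the same /cronjobs path, and that localhost access is all the cron needs) would be:

```apache
<Location "/cronjobs">
    Require local
</Location>
```

Require local matches requests from 127.0.0.1 and ::1, so the cron job can still reach the script.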

Another possible solution, if the file is meant to be used exclusively as an include() and never run standalone by a user who enters its URL:
Place this code at the top of the file you want to block direct calls to.
if(basename($_SERVER['PHP_SELF']) == 'blockedFile.php')
{
header('Location: ./index.php');
exit();
}
PHP checks whether the file's name is the one being run directly. If blockedFile.php were included in index.php with include(), then basename($_SERVER['PHP_SELF']) would equal index.php. If it were accessed standalone, it would equal blockedFile.php and the user would be sent back to the index page.

Put it in a directory, and in that directory create a file called .htaccess with this inside:
<FilesMatch "\.php$">
order deny,allow
deny from all
</FilesMatch>
Now only the server can access PHP files inside that directory, e.g. via include or require.
This is useful for keeping your MySQL password safe: you can put the connection function inside a PHP file in this "protected" directory and include it in your scripts.

One option that you have is to use the $_SERVER values to see if it is a web request or a cli request.
See http://php.net/manual/en/reserved.variables.server.php
I would look at checking to see if the $_SERVER['argv'] value is set at the start of your script(s). If it's not set then exit the script.
Alternatively you can check to see if $_SERVER['SERVER_ADDR'] is set, which would mean it's being executed by the webserver.
Note that I don't have a GoDaddy account handy to test this, so ensure you verify before going live.
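A minimal sketch of that check, assuming the script should only ever run from cron/CLI (the 403 response and message are illustrative, not from the original answer):

```php
<?php
// Bail out unless the script was started from the command line,
// which is how cron invokes PHP.
function running_from_cli(): bool
{
    // cron runs scripts through the PHP CLI binary, so the SAPI name
    // is 'cli'; web requests report apache2handler, fpm-fcgi, etc.
    return php_sapi_name() === 'cli' || isset($_SERVER['argv']);
}

if (!running_from_cli()) {
    header('HTTP/1.1 403 Forbidden');
    exit('This script may only be run by the server.');
}

echo "OK: invoked from the command line / cron\n";
```

Run from the shell this prints the OK line; fetched through the webserver it exits with a 403.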

securing outward-facing website db configs

I'm adding some database usage to a public-facing site, and I wanted input on what the most secure way to store MySQL connection information might be. I've come up with a few options:
First I could store the config in another directory, and just set the PHP include path to look for that dir.
Second, I know there are some files that apache won't serve to browsers, I could use one of these types of files.
Third, I could store encrypted files on the server, and decrypt them with PHP.
Any help would be much appreciated.
Storing the config outside of Apache's document root is a must.
You can configure Apache to disallow access to any files with .htaccess.
In the config folder, add a .htaccess with the following:
order allow,deny
deny from all
If you don't want to use .htaccess, as @johua k mentions, instead add
<Directory /home/www/public/config>
order allow,deny
deny from all
</Directory>
to your apache config.
This will prevent any files in that folder from being served to anyone, which is fine: since PHP doesn't care about .htaccess, you can still just
include('config/db.php')
If you configure your PHP scripts properly, they should never appear in plain text.
so a file like
define('mysql_password', 'pass')
would never display that text.
If you are worried about a shared hosting environment and another user having access to read this file, then you should evaluate the security of the Linux installation and the host. Other users should not have any browsing access to your files. From the web, files served as PHP should never return their source.
You can explicitly tell apache not to serve the files ever, so they would only be include() or require() able.
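As a sketch, the kind of config file you would keep in that protected location might look like this (all names and values below are hypothetical placeholders):

```php
<?php
// db.php -- kept outside the web root, or in the .htaccess-protected folder.
define('DB_HOST', 'localhost');
define('DB_USER', 'appuser');
define('DB_PASS', 'secret');

// A public-facing script would then pull the credentials in with e.g.:
//   require __DIR__ . '/protected/db.php';
// and connect using the constants:
printf("connecting to %s as %s\n", DB_HOST, DB_USER);
```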

Make php files hidden from outside world

My PHP website has multiple PHP files; some of them are for the user interface and some are helper files (files which communicate with the database and with each other to return a result). Now I need users to be unable to execute the helper files from their direct URLs.
e.g. mydomain.com/login.php ---------- (interface file, must be accessible to the user)
mydomain.com/login_handle.php ---- (helper file, must not be accessible to the user)
So what I need is: the user can execute and browse mydomain.com/login.php but must not be able to execute mydomain.com/login_handle.php, while login.php and login_handle.php keep communicating and can access each other. Thanks,
Edit: Sorry but I'm using shared hosting and there is no folder other than public_html.
The first things I would attempt:
Move the included files outside of the document root
Move the included files inside another folder and protect it using .htaccess. Alternatively, rename your include files to end with .inc and create a rule based on that.
Make sure the included files don't output anything; this is not really secure, but if your file only contains functions, class definitions, etc. without producing any output, requesting it directly would just show an empty page.
The hackish approach for this can be accomplished by using constants:
index.php
<?php
define('MY_CONSTANT', '123');
include('helper.php');
helper.php
<?php
if (!defined('MY_CONSTANT')) { exit; }
// we were called from another file
// proceed
Edit
The number 2 approach from above can be done by:
Create a folder underneath public_html, e.g. includes/
Move all the files that should be included only into this folder
Add the following .htaccess inside:
<FilesMatch "\.php$">
Order allow,deny
Deny from all
</FilesMatch>
Try to use .htaccess.
Instead of 127.0.0.1 below, you need to put your server's IP address.
<Files login_handle.php>
Order Allow,Deny
Deny from all
Allow from 127.0.0.1
</Files>

Map only accessible for webserver and not for other users

On my site I use a lot of includes; most of them should only be accessible to the webserver and not to the rest of the world. So if I include "../include_map/file.php" in a page on my site, it should not be possible for other users in the world to request it via a URL ("website.com/include_map/file.php"). Is there a way to protect the folder with the include files so that only the webserver can include them?
PHP can include files from anywhere on the server's hard drive, including non-public directories.
For example, if your htdocs is located in /var/www/domain/htdocs/, you can also include files located in /var/www/domain/include_map, while the webserver won't be allowed to serve files from there (if configured properly).
You can then test by trying to access the file at www.yourdomain.com/../include_map/file.php.
If you can still access it like this, your webserver's configuration needs some attention to prevent others from reading your logs and other things.
Another way is to deny access to the directory via .htaccess or the Apache config. PHP can still include the files, while users can't access them from the internet.
In the Apache config you would do something like:
<Directory /include_map>
Order Deny,Allow
Deny from all
</Directory>
In a .htaccess file you could write
Order Deny,Allow
Deny from all
The .htaccess file should be located in the directory you want to secure. Consult your server provider to find out which way is best for you. As stated in the comments, you have to find out first whether .htaccess is an option for you.
You could do as zuloo said. If you want this to work under any condition you could use a constant for this.
The file including:
define('IS_APP', true);
require_once('some/file/to/include.php');
// your code here
The included file:
if(!defined('IS_APP')) {
die('No direct access');
}
// your code here

How to protect files from outside?

I've made a very small CMS myself. After login a session is set.
The CMS includes certain images, php pages, etc.
These pages may also include forms to add data to the database.
Now the problem is that you can actually use an address to get to the page which shows the form, e.g.:
domain.com/mycms/includes/addpage.php
How would you suggest to protect this?
NOTE: when I am logged in everything must work, just from outside it may not show the form. I could check if the session exists but I wonder if there are better and easier ways.
First of all, if you are including PHP files, you really should not place them inside your public web root.
If this is not possible, an alternative approach would be to define a constant in your index.php (assuming you use this as the main entry point) and check whether this constant is set in every included file, in order to prevent direct access to these files.
For example:
// index.php:
define('INDEX_LOADED', true);
// /includes/addpage.php:
if (!defined('INDEX_LOADED')) die('no direct access allowed');
Aim to put your files in
domain.com/private/includes/addpage.php
and then from your page do something like
include('../private/includes/addpage.php');
I always use the extension .inc.php for PHP files that should not be accessed from outside. Then I deny that extension from being served. For Apache you can do this in the .htaccess file in the main directory:
<Files ~ "\.inc\.php$">
Order allow,deny
Deny from all
</Files>
Also if you use some framework or you have a class (or include) directory you can deny access to the whole directory like this (apache):
<Location ~ "^/(classes|framework)">
Order allow,deny
Deny from all
</Location>
Other web servers have other ways to forbid access to files. If you want it universal and portable, Aron Rotteveel's suggestion is the best.
You can leave files that contain only class declarations unprotected: if they are run from outside, no code will execute. Make sure that the php.ini setting display_errors is off for the host.
If it is necessary to keep private files inside a public folder, you can protect them with CHMOD permissions like 700.

Custom Log File with Correct Permissions

I have a processing file for my website's payments. It works just fine, but what I would like to do is log all the requests to this page so that if anything throws an error, the raw data is saved and I can process the transaction manually. The processing file uses fopen to write to the log file in another directory.
What I have right now is a separate folder on my root directory with permissions 755. Then a log file inside with permissions 777. The processing file that writes to the log file, in PHP if that matters, is set to 777.
This works right now, but the log file is publicly available. I know I can be doing this better and that the permissions aren't correct. How can I do this better?
Put the log file outside the document root. The PHP script that writes to it will still be able to get to it (via the full path) but Apache won't be able to serve it.
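A sketch of that write, assuming a hypothetical location like /home/user/logs/payments.log outside the document root (the system temp directory stands in here so the snippet is runnable):

```php
<?php
// Append the raw request data to a log file Apache cannot serve.
// sys_get_temp_dir() is a stand-in for a directory outside the web root.
$logFile = sys_get_temp_dir() . '/payments.log';
$entry   = date('c') . ' ' . json_encode($_POST) . PHP_EOL;
file_put_contents($logFile, $entry, FILE_APPEND | LOCK_EX);
echo "logged\n";
```

FILE_APPEND keeps earlier entries and LOCK_EX avoids interleaved writes if two requests log at once.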
I came across this whilst searching for the answer myself. I don't believe there is a simple "permissions fix" to do what you want, and perhaps the safest way is to put the log files outside of the public_html directory.
However this can be a nuisance sometimes, especially if you want to e.g. catch a PayPal IPN dump in a log file but not have it publicly accessible.
In such cases, you can use .htaccess file directives to allow write from script, but deny reading from public access.
For example, this works for me (Apache .htaccess in root public_html folder);
<FilesMatch "mycustom\.log">
Order allow,deny
Deny from all
</FilesMatch>
and if you have multiple logs you want to protect, use it like this, pipe-separated:
<FilesMatch "mycustom\.log|ipn_errors\.log">
Order allow,deny
Deny from all
</FilesMatch>
It is worth noting that the above directives are deprecated as of Apache 2.4, and you may wish to consider using more current directives instead: https://httpd.apache.org/docs/2.4/howto/access.html
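Since the answer itself notes the deprecation, a 2.4-style equivalent of the FilesMatch blocks above (same hypothetical log names) would be:

```apache
<FilesMatch "mycustom\.log|ipn_errors\.log">
    Require all denied
</FilesMatch>
```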
Hope that helps you!
