I've got a malware-infected site hosted on Linux hosting. All PHP files now start with these lines:
<?php
$md5 = "ad05c6aaf5c532ec96ad32a608566374";
$wp_salt = array( ... );
$wp_add_filter = create_function( ... );
$wp_add_filter( ... );
?>
How can I clean it up with bash/sed or something?
You should restore your backup.
FILES="*.php"
for f in $FILES
do
cat $f | grep -v 'wp_salt|wp_add_filter|wp_add_filter' > $f.clean
mv $f.clean $f
done
Just a warning: the wp_add_filter() recursively evaluates encoded PHP code, which in turn calls another script that is encoded and evaluated. This larger script not only injects malicious code throughout your site but appears to collect credentials and execute other attacks. You should not only clean your site, but also make sure the flaw is fixed and any credentials that might have been exposed are changed. In the end, it appears to be a WordPress security issue, but I've not confirmed this. I've added some comments on this over at http://www.php-beginners.com/solve-wordpress-malware-script-attack-fix.html, which includes a clean-up script and more information on how to decode the malicious script.
You can do it with PHP (fopen, str_replace and fwrite). There shouldn't be any encoding problems.
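For example, here's a minimal sketch along those lines (the regex assumes the injected block looks exactly like the one quoted in the question, and the path is a placeholder; test it on copies first):
<?php
// Strip the injected block from every .php file in a directory.
// Assumes the block runs from the $md5 assignment to the first ?> after
// the $wp_add_filter(...) call, as quoted in the question.
foreach (glob('/path/to/site/*.php') as $file) {
    $code  = file_get_contents($file);
    $clean = preg_replace(
        '/<\?php\s+\$md5\s*=.*?\$wp_add_filter\(.*?\);\s*\?>/s',
        '',
        $code,
        1
    );
    if ($clean !== null && $clean !== $code) {
        file_put_contents($file, $clean);
        echo "cleaned: $file\n";
    }
}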
I just got hit with this on a very full hosting account, every web file full of PHP?!
After much digging and post reading everywhere, I came across this guy's cleaner code (see http://www.php-beginners.com/solve-wordpress-malware-script-attack-fix.html) and tried it on a couple of the least important sites first.
So far so good. Pretty much ready to dig in and use it account-wide to try and wipe this right off.
The virus/malware seems to be called "!SShell v. 1.0 shadow edition!" and infected my hosting account today. Along with the cleaner at http://www.php-beginners.com/solve-wordpress-malware-script-attack-fix.html, you actually need to discover the folder containing the shell file that gives the hackers full access to your server files, and also discover "wp-thumb-creator.php", the file that does all the PHP injection. I've posted more about this at my blog: http://www.marinbezhanov.com/web-development/6/malware-alert-september-2011-sshell-v.1.0/
I have thousands of HTML pages which are handled as PHP.
Inside each page is a line:
<? file_get_contents("http://www.something.com/get_html.php?id=something"); ?>
for some reason, suddenly this line has been slowing down the server. When the page loads, it waits around 15 seconds at this line before proceeding.
The answer here works, namely,
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
file_get_contents("http://www.something.com/somepage.html", false, $context);
which "tells the remote web server to close the connection when the download is complete".
However, this would require rewriting all the thousands of files. Is there a way to do the same thing from the get_html.php script?
This would be a lot easier than rewriting all the pages. I tried sending
header("Connection: close"); in that script, but no cigar.
To summarize, I am looking for that same answer, but adapted to a remote, server-side solution.
You could easily do a find/replace in files in a certain directory with most editors. However, I would suggest you start caching results instead of hitting your own or a foreign server on every request.
Is the remote server outside of your local network? If not, you could query the database (or whatever else the data comes from) directly from your scripts, without an HTTP call. Otherwise, you could cache your results in Memcached or in files for some period of time. How much memory caching requires depends on the size and variety of your data.
These are only two examples of how to get faster response times. There are many approaches.
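For instance, a minimal file-based cache sketch (the helper name and the TTL are made up for illustration):
<?php
// Minimal file-based cache; cached_get() and the 300-second TTL are illustrative.
function cached_get($url, $ttl = 300) {
    $cacheFile = sys_get_temp_dir() . '/cache_' . md5($url);
    // Serve the cached copy while it is fresher than $ttl seconds.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }
    $data = file_get_contents($url);
    if ($data !== false) {
        file_put_contents($cacheFile, $data);
        return $data;
    }
    // Fetch failed: fall back to a stale copy if one exists.
    return is_file($cacheFile) ? file_get_contents($cacheFile) : '';
}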
You may try the following:
http://www.php.net/manual/en/function.override-function.php
(override_function is provided by the APD extension, though, and I don't know if you can change your server configuration.)
Here are a couple of things for you to try. Try using cURL to make the request and see if it is still hanging up. Also, try fetching a different page on your site to see if it is also slow. These tests will help determine if it's that particular page or the connection that's hanging up. If another page is slow also, then modifying the 'get_html.php' page probably won't be much help.
To elaborate on Elias' answer: if the fix can easily be made with a find/replace, you can use something like this from the command line on *nix:
perl -pi -w -e 's/search/replace/g;' *.php
-e executes the following line of code.
-i edits the file in place.
-w enables warnings.
-p loops over the input, printing each line after the script runs.
You'd have to test this out on a few files before doing all of them, but more specifically, you can use this to very quickly do a find/replace for all of your files:
perl -pi -w -e 's/file_get_contents\("(http:\/\/www\.something\.com\/[^"]+)"\);/\$context = stream_context_create(array("http" => array("header" => "Connection: close\\r\\n"))); file_get_contents("$1", false, \$context);/g;' *.php
We have a Java IRC application where users are allowed to execute arbitrary PHP and get the result. Here is one example of what this is used for:
btc: <php>$btc = json_decode(file_get_contents('https://btc-e.com/api/2/1/ticker'), true); $ticker = $btc['ticker']; echo "Current BTC Ticker: High: $".$ticker['high']." Low: $".$ticker['low']." Average: $" . $ticker['avg'];
We also have a python setup, but we like PHP because PHP does not require newlines in the code anywhere. (Because this is IRC, we cannot give it newlines unless we exec a web-loaded .py file)
The issue is how to prevent people from trying to exploit the system, such as in:
<php>echo readfile("/etc/passwd");
Which would, clearly, read out the passwd file for all to see.
We also ran into this problem after we tried to block readfile():
<php>$rf = readfile; echo $rf("/etc/passwd");
How should we go about securing this system? (The full code is on github, for any interested: https://github.com/clone1018/Shocky)
As an aside, no real sensitive information is being exposed, as the whole thing is in a VM, so it isn't a "timebomb" or anything. We still want to lock it down though.
That sounds like plugging one hole in a colander. Filesystem security should be handled by the OS, not the application. And as far as /etc/passwd goes, the OS is already securing it.
Here's the first line of my /etc/passwd - yes, I'm going to post it publicly:
root:x:0:0:root:/root:/bin/bash
Usually, passwords aren't actually stored in /etc/passwd. User information is, but the passwords are replaced with x, with the real password only available to the root user.
However, you should lock down PHP to some degree. You can change many PHP options at runtime with ini_set, including open_basedir: http://www.php.net/manual/en/ini.core.php#ini.open-basedir
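For instance, a minimal sketch (the sandbox path is an assumption; note that open_basedir can be tightened at runtime but never relaxed):
<?php
// Restrict all filesystem functions to a sandbox directory before
// eval'ing user-supplied code. '/var/sandbox' is a made-up path.
ini_set('open_basedir', '/var/sandbox');

// Reads outside the sandbox now fail with a warning instead of leaking data:
var_dump(@file_get_contents('/etc/passwd')); // bool(false)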
If you only want to restrict file reading, maybe this can help:
http://www.php.net/manual/en/ini.core.php#ini.open-basedir
If you are using an old version of PHP (< 5.4), you can consider using PHP safe mode:
http://php.net/manual/en/ini.sect.safe-mode.php
Set the following variables for safe mode to restrict PHP:
safe_mode_exec_dir
disable_functions = readfile,system
and many others.
Also, the user won't be able to read any file whose uid is different, e.g. /etc/passwd.
Be advised that safe mode is deprecated and removed from the latest versions of PHP.
Is there a way to view the PHP error logs or Apache error logs in a web browser?
I find it inconvenient to ssh into multiple servers and run a "tail" command to follow the error logs. Is there some tool (preferably open source) that shows me the error logs online (streaming or non-streaming)?
Thanks
A simple PHP script to read the log and print it:
<?php
// tail returns the last few lines of the log, one array element per line
exec('tail /var/log/apache2/error.log', $error_logs);
foreach ($error_logs as $error_log) {
    // escape each line in case it contains HTML
    echo "<br />" . htmlspecialchars($error_log);
}
?>
You can embed the output in your HTML as per your requirements. The best part is that tail only loads the latest lines, so it won't put much load on your server.
You can change the tail invocation to give the output you want, e.g.:
tail -n 100 myfile.txt   # prints the last 100 lines
See "What commercial and open source competitors are there to Splunk?", and I would recommend https://github.com/tobi/clarity.
Simple and easy tool.
Since everyone is suggesting clarity, I would also like to mention tailon. I wrote tailon as a more modern and secure alternative to clarity. It's still in its early stages of development, but the functionality you need is there. You may also use wtee, if you're only interested in following a single log file.
You could make a script that reads the error logs from Apache:
$apache_errorlog = file_get_contents('/var/log/apache2/error.log');
If that doesn't work (for permission reasons, say), try getting the contents with the PHP functions exec or shell_exec and the command 'cat /var/log/apache2/error.log'.
EDIT: If you have multiple servers (I guess with web servers on them), you can put such a script on each machine; when you make a request to that script (over a secured connection), you get the logs from that server.
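A hedged sketch of that idea (the token, log path, and line count are all placeholders): each server exposes a tiny endpoint, and a central page fetches from each one.
<?php
// logs.php, deployed on each web server. The token check is a stand-in
// for whatever authentication you prefer (.htpasswd, VPN, IP whitelist).
$token = isset($_GET['token']) ? $_GET['token'] : '';
if ($token !== 'CHANGE-ME') {
    http_response_code(403); // reject requests without the shared secret
    exit;
}
header('Content-Type: text/plain');
// stream the last 100 lines of the error log back to the caller
passthru('tail -n 100 /var/log/apache2/error.log');
A central page can then fetch each server's logs.php (over HTTPS, with the token) using file_get_contents() and print the combined output.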
I recommend LogHappens: https://loghappens.com. It allows you to view the error log in the browser.
LogHappens supports various web server log formats; it comes with parsers for Apache and CakePHP, and you can write your own.
You can find it here: https://github.com/qijianjun/logHappens
It's open source and free. I forked it and did some work to make it behave better in a dev or public environment, namely:
Support for an access token, so one can't reach the site without the token in config.php
Support for IP whitelists, for security and privacy
Support for configuring the interval between AJAX requests
Support for loading static files locally (for a local dev environment)
I've found this solution: https://code.google.com/p/php-tail/
It's working perfectly. I only needed to change the file size check (around lines 56-59), because I was getting an error at first:
if ($maxLength > $this->maxSizeToLoad) {
    $maxLength = $this->maxSizeToLoad;
    // return json_encode(array("size" => $fsize, "data" => array("ERROR: PHPTail attempted to load more (".round(($maxLength / 1048576), 2)."MB) than the maximum size (".round(($this->maxSizeToLoad / 1048576), 2)."MB) of bytes into memory. You should lower the defaultUpdateTime to prevent this from happening.")));
}
And I've added a default size (around line 125), but it's not needed:
lastSize = <?php echo filesize($this->log) ?: 1000; ?>; // fall back to 1000 if filesize() returns false
I know this question is a bit old, but (along with the lack of good choices) it gave me the idea to create this tiny (open source) web app. https://github.com/ToX82/logHappens. It can be used online, but I'd use an .htpasswd as a basic login system. I hope it helps.
WARNING: This is a possible exploit. Do not run directly on your server if you're not sure what to do with this.
http://pastehtml.com/view/1b1m2r6.txt
I believe this was uploaded via an insecure upload script. How do I decode and uncompress this code? Running it in the browser might execute it as a shell script, open up a port or something.
I can do a base64 decode online, but I couldn't really decompress it.
So there's a string. It's gzipped and base64 encoded, and the code decodes the base64 and then uncompresses it.
When that's done, I'm left with this:
<? eval(base64_decode('...')); ?>
Another layer of base64, which is 720440 bytes long.
Now, base64 decoding that, we have 506961 bytes of exploit code.
I'm still examining the code, and will update this answer when I have more understanding. The code is huge.
Still reading through the code, and the (very well-done) exploit exposes these tools to the hacker:
TCP backdoor setup
unauthorised shell access
reading of all htpasswd, htaccess, password and configuration files
log wiping
MySQL access (read, write)
append code to all files matching a name pattern (mass exploit)
RFI/LFI scanner
UDP flooding
kernel information
This is probably a professional PHP-based server-wide exploit toolkit, and seeing as it's got a nice HTML interface and the whole lot, it could be easily used by a pro hacker, or even a script kiddie.
This exploit is called c99shell (thanks Yi Jiang) and it turns out to have been quite popular, being talked about and running for a few years already. There are many results on Google for this exploit.
Looking at Delan's decoded source, it appears to be a full-fledged backdoor providing a web interface that can be used to control the server in various ways. Telling fragments from the source:
echo '<center>Are you sure you want to install an IP:Port proxy on this
website/server?<br />
or
<b>Mass Code Injection:</b><br><br>
Use this to add PHP to the end of every .php page in the directory specified.
or
echo "<br><b>UDP Flood</b><br>Completed with $pakits (" .
round(($pakits*65)/1024, 2) . " MB) packets averaging ".
round($pakits/$exec_time, 2) . " packets per second \n";
or
if (!$fp) {echo "Can't get /etc/passwd for password-list.";}
I'd advise you to scrub that server and reinstall everything from scratch.
I know Delan Azabani has done this, but just so you actually know how he got the data out:
Just in case you're wondering how to decompress this, use base64 -d filename > output to decode base64 strings, and gunzip file.name.gz to decompress gzipped data.
The trick is in recognising whether what you've got is base64 or gzip and decoding the right bits.
This way it goes absolutely nowhere near a JS parser or PHP parser.
First, replace the eval with an echo to see what code it would execute if you'd let it.
Send the output of that script to another file, say, test2.php.
In that file, do the same trick again. Run it, and it will output the complete malicious program (it's quite a beast), ~4k lines of hacker's delight.
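If you want to script that trick, here's a minimal sketch (file names are placeholders; run the result from the CLI, never from a public web server):
<?php
// Swap eval( for echo ( so that running the file prints the payload
// instead of executing it. 'suspicious.php' is a placeholder name.
$code = file_get_contents('suspicious.php');
file_put_contents('print-layer1.php', str_replace('eval(', 'echo (', $code));
// then run: php print-layer1.php > test2.php
// and apply the same swap to test2.php for the next layer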
This is code for a PHP shell.
To decode it, replace eval("?>". with print( and run it:
php5 file.php > file2.php
Then replace the next eval with print and run that in a browser: http://localhost/file2.php
I work at a small PHP shop, and I recently proposed that we move away from using our NAS as a shared code base and start using Subversion for source control.
I've figured out how to make sure our dev server gets updated with every commit to our development branch, and I know how to merge into trunk and have that update our staging server, because we have direct access to both of these. My biggest question is how to write a script that will update the production server, which we often only have FTP access to. I don't want to upload the entire site every time. Is there any way to write a script that is smart enough to upload only what has changed when we execute it? (We don't want it automatically deploying to production; we want to run it manually.)
Does my question even make sense?
Basically, your issue is that you can't use subversion on the production server. What you need to do is keep, on a separate (ideally identically configured) server a copy of your production checkout, and copy that through whatever method to the production server. You could think of this as your staging server, actually, since it will also be useful for doing final tests on releases before rolling them out.
As far as the copy goes, if your provider supports rsync, you're set. If you have only FTP, you'll have to find some method of doing the equivalent of rsync via FTP. This is not the first time anybody's had that need; a web search will help you out there. But if you can't find anything, drop me a note and I'll look around myself a little further.
EDIT: Hope the author doesn't mind me adding this, but I think it belongs here. To do something approximately similar to rsync with FTP, look at weex: http://weex.sourceforge.net/. It's a wrapper around command-line FTP that uses a local mirror to keep track of what's on the remote server, so that it can send only changed files. Works great for me.
It doesn't sound like SVN plays well with FTP, but if you have HTTP access, that may prove sufficient to push changes using svnsync. That's how we push changes to our production servers: we use svnsync to keep a read-only mirror of the repository available.
I use the following solution. Just install the SVN client on your web server, and hook this up at a privately accessible URL:
<?php
// make sure you have a robot account that can't commit ;)
$username = Settings::Load()->Get('svn', 'username');
$password = Settings::Load()->Get('svn', 'password');
$repos    = Settings::Load()->Get('svn', 'repository');
echo '<h1>updating from svn</h1><pre>';
// for security, define an array of folders that you do want to be synced from svn; the rest are skipped
$svnfolders = array('includes/', 'plugins/', 'images/', 'templates/', 'index.php' => 'index.php');
$svnfiles = $svnfolders; // by default, sync everything in the whitelist
if (!empty($_GET['justthisone']) && array_search($_GET['justthisone'], $svnfolders) !== false) {
    // you can also update just one of the above by passing it in $_GET
    $svnfiles = array($_GET['justthisone']);
}
foreach ($svnfiles as $targetlocation) {
    echo system("svn export --username={$username} --password={$password} {$repos}{$targetlocation} " . dirname(__FILE__) . "/../{$targetlocation} --force");
}
die("</pre><h1>Done!</h1>");
I'm going to make an assumption here and say you are using a post-commit hook to do your merging/updating of your staging server. This may work, but I would strongly recommend you look into a Continuous Integration solution. The following are some that I am aware of:
Xinc - http://code.google.com/p/xinc/ (PHP Specific)
CruiseControl - http://cruisecontrol.sourceforge.net/ (Wildly popular.)
PHP integration made possible with http://phpundercontrol.org/about.html
Hudson - https://hudson.dev.java.net/ (appears to be Java-based, but allows for plugins/extensions)
LFTP is capable of synchronizing directories over ftp.
Just an idea:
You could keep a working copy of your project on a host you have access to and where Subversion is installed. This single revision reflects the production server's version.
You could then write a PHP script that updates this working copy over svn and finds all files that have changed since the update started. These files you can upload.
Such a script could look like this:
$path = realpath('/path/to/production/mirror');
chdir($path);
$start = time();
shell_exec('svn up'); // files changed by the update get a fresh ctime
$list = array();
$i = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);
foreach ($i as $node) {
    if ($node->isFile() && $node->getCTime() > $start) {
        $list[] = $node->getPathname();
    }
    // directories should also be handled
}
$conn = ftp_connect( ... );
// and so on
Just as it came to my mind.
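To round that idea off, here's a hedged sketch of the upload half (host, credentials, and remote docroot are placeholders, and it assumes the remote directories already exist):
<?php
// Upload every changed file collected in $list above.
$conn = ftp_connect('ftp.example.com');   // placeholder host
ftp_login($conn, 'user', 'secret');       // placeholder credentials
ftp_pasv($conn, true);                    // passive mode is friendlier to firewalls

$local_root  = realpath('/path/to/production/mirror');
$remote_root = '/httpdocs';               // placeholder remote docroot

foreach ($list as $file) {
    // map the local path to its remote counterpart
    $remote = $remote_root . substr($file, strlen($local_root));
    ftp_put($conn, $remote, $file, FTP_BINARY);
}
ftp_close($conn);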
I think this will help you:
https://github.com/midhundevasia/deploy
It works well on Windows.