For the past few hours, I've been getting multiple requests to our website from various IPs every second (maybe 4 or 5 requests per second).
The website's usual traffic is about 3 to 5 requests per minute.
The requests are very random, for example:
/gtalczp/197zbcylgxpoaj-26228e-dtmlnaibx/
/109/jxwhezsivr/10445_xwvpfdyzhea.cgi
/nouyaku.html
/index.php/43e3133-pmuwbfgoedakvxs/
/keyword_list/s_index=L
The site's indexing in Google is now all in Japanese characters and messed up.
I have tried blocking the IPs (via .htaccess) that make all these random requests, but every time a new IP makes a new request. How can I stop all of these requests? Can I use an .htaccess rule that allows only the links that actually exist on the site?
EDIT: Our site is running the latest version of WordPress, with custom-built features. If this was some kind of hack, how can I find the infected files/database tables?
EDIT 2: These look like legitimate Google bots, but why are they trying to access these random links which don't exist...
This traffic is coming from automated security scanners. They scan blocks of IP ranges used by AWS, DigitalOcean, etc., looking for known security bugs on web servers.
Can you stop it? Sort of.
One quick way to catch the low-hanging fruit is to put a /password.txt at the root of the web server. Every scanner on the planet will probe for that. Block any IP that accesses it; you can use Fail2Ban for this.
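As a rough illustration, here is a minimal PHP sketch of that honeypot; the rewrite rule, log path and log format are assumptions, and you would pair the log with a matching Fail2Ban filter:
<?php
// honeypot.php -- serve this for /password.txt, e.g. via a rewrite rule:
//   RewriteRule ^password\.txt$ /honeypot.php [L]
// Append each hit to a log file that Fail2Ban (or a cron job) can watch.
$ip = $_SERVER['REMOTE_ADDR'];
file_put_contents(
    '/var/log/honeypot.log',
    date('c') . " honeypot-hit ip=$ip\n",
    FILE_APPEND | LOCK_EX
);
header('HTTP/1.1 404 Not Found'); // look uninteresting to the scanner
?>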
You can also rate-limit access to your web server. If a client is requesting pages very quickly, it's likely a scanner, in which case ban the IP. But it could also be a search engine spider, in which case banning it will likely hurt your SEO.
Requests for slugs containing Japanese keywords like nouyaku, together with Google indexing your pages in Japanese, might well indicate the Japanese Keyword Hack. This Google article provides an explanation and some general fixes and preventive measures: https://developers.google.com/web/fundamentals/security/hacked/fixing_the_japanese_keyword_hack
Fixing WordPress hacks is already covered elsewhere: you will find numerous questions and answers about this on Stack Overflow or via Google.
.htaccess: Google's article advises replacing your .htaccess. A useful start would be adding and tweaking Jeff Starr's 6G "Firewall" or 7G (beta) code.
The rate of requests is DDoS-like, so it makes sense to cater for this at the same time (e.g. mod_evasive, Fail2ban, and ModSecurity); google "protecting Apache from DDoS attacks".
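For example, a hedged mod_evasive configuration sketch (the thresholds are assumptions to tune for your own traffic, and the module name varies by build, e.g. mod_evasive20.c vs mod_evasive24.c):
<IfModule mod_evasive20.c>
    # max requests for the same page per page interval
    DOSPageCount 5
    # max requests for the whole site per site interval
    DOSSiteCount 50
    # interval lengths, in seconds
    DOSPageInterval 1
    DOSSiteInterval 1
    # how long (seconds) an offending IP stays blocked
    DOSBlockingPeriod 60
</IfModule>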
DDoS, brute force and WordPress: stopping dodgy requests before any PHP/WordPress code or SQL runs will massively reduce server load. If there is no need for the public to log in to WordPress, use .htaccess to password-protect wp-login.php, and maybe also the wp-admin folder (the latter may cause problems on some sites).
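For example, a minimal .htaccess sketch for HTTP Basic Auth on the login script (the .htpasswd path is an assumption; create the file with the htpasswd tool):
# password-protect the WordPress login script
<Files "wp-login.php">
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /path/to/.htpasswd
    Require valid-user
</Files>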
Denial of Service Attacks
Denial of service attacks are a common threat to consider when hosting a website on the internet. Although most security vulnerabilities can be prevented by avoiding dangerous coding habits and techniques, DoS protection requires a different approach.
Lazy DOS Attacks
Specifically, I am not so worried about sophisticated spammers on the net with expensive equipment (maybe superior to my own) who will accept payment to take down sites. I can't imagine anyone would pay that amount of money just to take my website down for a bit. What I am more worried about are 'lazy attacks' which would be likely to come from clever teens looking to test their destructive powers on unsuspecting sites, specifically those who have heard about my site and projects through hackathons and events.
These 'lazy attacks' are extremely cheap and easy to carry out, and can take the form of a simple browser script or flood program that sends an overwhelming number of requests to my site's servers. A powerful system (like my dedicated server) can only be taken down by a weak system (like a light-resource script) through exploiting weaknesses in the more powerful system and using its power to work against itself.
In particular, I mean the case where the same user continuously makes requests that cost the server far more to process than they cost the client to send. If the ratio of server resources spent to client resources spent per request exceeds the ratio of server to client resources available, then my site could be taken down very easily.
// Conceptual pseudocode, not runnable: the attack succeeds when the
// per-request cost ratio exceeds the capacity ratio.
if (serverCostPerRequest / clientCostPerRequest > serverCapacity / clientCapacity)
{
    ServerDown();
}
Securing the Request-Return Policy in Apache
I am not asking how to make my Apache site DoS-proof, but rather how to make it less vulnerable to cheap, lazy DoS attacks by capping the number of requests one client can send in a set period of time before being treated as an attacker. This applies especially to PHP scripts and other pages that require more resources to deliver content to the client.
What is the best approach (a MySQL ban table, Apache settings, a honeypot, etc.) to securing a relatively small site against cheap DoS attacks?
I was thinking about web security when this thought popped into my head.
Say there's a jerk who hates me and knows how to program, and I am managing a nice website/blog with a considerable amount of traffic. That jerk then creates a program that automatically requests my website over and over again.
So if I am hosting my website with a shared hosting provider, then obviously my website will stop responding.
This type of attack may not be common, but if someone attempts something like that on my website, I must do something about it. I don't think popular CMSs like WordPress or Drupal do anything about this type of attack.
My assumption is:
If a user requests more than x times (let's say 50) in 1 minute, block that user (stop responding).
My questions are:
Is my assumption OK? If not, what should I do about it?
Do websites like Google, Facebook, YouTube, etc. do something about this type of attack?
What you are facing is a DoS (Denial of Service) attack, where one system keeps sending packets to your web server until it becomes unresponsive.
You mentioned a single jerk; what if the same jerk had many friends? Then you have a DDoS (Distributed DoS) attack, and, well, that can't really be prevented.
A quick fix from the Apache docs for DoS, but not for DDoS:
All network servers can be subject to denial of service attacks that attempt to prevent responses to clients by tying up the resources of the server. It is not possible to prevent such attacks entirely, but you can do certain things to mitigate the problems that they create.
Often the most effective anti-DoS tool will be a firewall or other operating-system configurations. For example, most firewalls can be configured to restrict the number of simultaneous connections from any individual IP address or network, thus preventing a range of simple attacks. Of course this is no help against Distributed Denial of Service attacks (DDoS).
Source
The issue is partly one of rejecting bad traffic, and partly one of improving the performance of your own code.
Being hit with excess traffic by malicious intent is called a Denial of Service attack. The idea is to hit the site with traffic to the point that the server can't cope with the load, stops responding, and thus no-one can get through and the site goes off-line.
But you can also be hit with too much traffic simply because your site becomes popular. This can easily happen overnight and without warning, for example if someone posts a link to your site on another popular site. This traffic might actually be genuine and wanted (hundreds of extra sales! yay!), but can have the same effect on your server if you're not prepared for it.
As others have said, it is important to configure your web server to cope with high traffic volumes; I'll let the other answers speak for themselves on this, and it is an important point, but there are things you can do in your own code to improve things too.
One of the main reasons that a server fails to cope with increased load is because of the processing time taken by the request.
Your web server will only have the ability to handle a certain number of requests at once, but the key word here is "simultaneous", and the key to reducing the number of simultaneous requests is to reduce the time it takes for your program to run.
Imagine your server can handle ten simultaneous requests, and your page takes one second to load.
If you get up to ten requests per second, everything will work seamlessly, because the server can cope with it. But if you go just slightly over that, the eleventh request will either fail or have to wait until the other ten have finished. It will then run, but will eat into the next second's ten requests. By the time ten seconds have gone by, you're a whole second down on your response time, and it keeps getting worse as long as the requests keep pouring in at the same level. It doesn't take long for the server to get overwhelmed, even when it's only a fraction over its capacity.
Now imagine the same page could be optimised to take less time, let's say half a second. Your same server can now cope with 20 requests per second, simply because the PHP code is quicker. It will also be easier for it to recover from excess traffic levels. And because the PHP code takes less time to run, there is less chance of any two given requests being simultaneous anyway.
In short, the server's capacity to cope with high traffic volumes increases enormously as you reduce the time taken to process a request.
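To put the same point as arithmetic: maximum sustainable throughput = simultaneous requests the server can handle ÷ seconds per request. With the numbers above, 10 ÷ 1.0 s = 10 requests per second, but 10 ÷ 0.5 s = 20 requests per second.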
So this is the key to a site surviving a surge of high traffic: Make it run faster.
Caching: CMSs like Drupal and WordPress have caching built in. Make sure it's enabled. For even better performance, consider a server-level cache system like Varnish. For a CMS-type site where you don't change the page content much, this is the single biggest thing you can do to improve your performance (see the sketch after this list).
Optimise your code: while you can't be expected to fix performance issues in third-party software like Drupal, you can analyse the performance of your own code, if you have any. Custom Drupal modules, maybe? Use a profiler tool to find your bottlenecks. Very often, this kind of analysis can reveal that a single bottleneck is responsible for 90% of the page load time. Don't bother with optimising the small stuff, but if you can find and fix one or two big bottlenecks like this, it can have a dramatic effect.
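As a rough sketch of the caching idea above, for custom code outside a CMS (the cache directory, TTL and page body here are placeholders, not a recommendation):
<?php
// A minimal file-based full-page cache. Real systems (Varnish, CMS caches)
// are far more robust; this only illustrates why caching cuts request time.
$key  = md5($_SERVER['REQUEST_URI']);
$file = sys_get_temp_dir() . "/pagecache_$key.html";
$ttl  = 60; // seconds a cached copy stays fresh

if (is_file($file) && time() - filemtime($file) < $ttl) {
    readfile($file); // serve the cached copy without running the app at all
    exit;
}

ob_start();
// ... generate the page as usual ...
echo "expensive page content\n";
file_put_contents($file, ob_get_contents(), LOCK_EX);
ob_end_flush();
?>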
Hope that helps.
These types of attacks are called (D)DoS (Distributed Denial of Service) attacks and are usually prevented by the web server hosting your PHP application. Since Apache is used the most, I found an article you might find interesting: http://www.linuxforu.com/2011/04/securing-apache-part-8-dos-ddos-attacks/.
The article states that Apache has multiple mods available specifically created to prevent (D)DoS attacks. These still need to be installed and configured to match your needs.
I do believe that Facebook, Google, etc. have their own similar implementations to prevent DoS attacks. I know for a fact that the Google search engine presents a CAPTCHA if a lot of search requests are coming from the same network.
It is not wise to prevent DoS within a PHP script, because the PHP interpreter still needs to be started whenever a request is made, which causes a lot of overhead. By using the web server for this you will have less overhead.
EDIT:
As stated in another answer, it is also possible to prevent common DoS attacks by configuring the server's firewall. Checking for attacks with firewall rules happens before the web server is even hit, so there is even less overhead. Furthermore, you can detect attacks on other ports as well (such as port scans). I believe a combination of the two works best, as they complement each other.
In my opinion, the best way to prevent DoS is to set up the firewall at the lowest level: at the entry to the server. By setting some network firewall rules with iptables, you can drop packets from senders that are hitting your server too hard.
This will be more efficient than passing the requests through to PHP and Apache, since those need (relatively) a lot of processing to do the checking, and that load may take your website down even while you are detecting your attacker(s).
You can check on this topic for more information: https://serverfault.com/questions/410604/iptables-rules-to-counter-the-most-common-dos-attacks
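In that spirit, a couple of hedged iptables examples (the port and thresholds are assumptions; test carefully before deploying):
# cap simultaneous connections per source IP on port 80
iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j REJECT
# drop a source that opens more than 20 new connections within one second
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --set
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --update --seconds 1 --hitcount 20 -j DROP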
DDoS (Distributed Denial of Service Attacks) are generally blocked on a server level right?
Is there a way to block it on a PHP level, or at least reduce it?
If not, what is the fastest and most common way to stop DDoS attacks?
DDOS is a family of attacks which overwhelm key systems in the datacenter including:
The hosting center's network connection to the internet
The hosting center's internal network and routers
Your firewall and load balancers
Your web servers, application servers and database.
Before you start on building your DDOS defence, consider what the worst-case value-at-risk is. For a non-critical, free-to-use service for a small community, the total value at risk might be peanuts. For a paid-for, public-facing, mission-critical system for an established multi-billion dollar business, the value might be the worth of the company. In this latter case, you shouldn't be using StackExchange :) Anyway, to defend against DDOS, you need a defence in-depth approach:
Work with your hosting center to understand the services they offer, including IP and port filtering at their network connections to the internet and the firewall services they offer. This is critical: many sites are pulled from the internet by the hosting company while it deals with the data-center-wide disruption caused by a DDoS against one customer. Also, during a DDoS attack you will be working very closely with the hosting center's staff, so know their emergency numbers and be on good terms with them :) They should be able to block off whole international regions, completely block specific services or network protocols, apply other broad-spectrum defensive measures, or alternatively allow only whitelisted IPs (depending on your business model).
While on the subject of the hosting center: use a Content Delivery Network (CDN) to distribute (mainly static) services close to your end users and hide your real servers from the DDoS architects. The full CDN is too big for a DDoS to take out all nodes in all countries; if the DDoS is focused on one country, at least other users are still OK.
Keep all your systems and software packages updated with the latest security patches - and I mean all of them:
Managed switches - yup these sometimes need updating
Routers
Firewalls
Load balancers
Operating systems
Web servers
Languages and their libraries
Ensure that you have a good firewall or security appliance set up and regularly reviewed by a qualified security expert. Strong rules on the firewall are a good defence against many simple attacks. It's also useful to be able to manage bandwidth available for each open service.
Have good network monitoring tools in place - this can help you understand:
That you're under attack rather than simply being under heavy load
Where the attack is coming from (which may include countries you don't normally do business with) and
What the attack actually is (ports, services, protocols, IPs and packet contents)
The attack might simply be heavy use of legitimate web site services (eg hitting 'legal' URIs running queries or inserting/updating/deleting data) - thousands or millions of requests coming from tens to millions of different IP addresses will bring a site to its knees. Alternatively, some services might be so expensive to run that only a few requests cause a DOS - think a really expensive report. So you need good application level monitoring of what is going on:
Which services have been invoked and what arguments/data are sent (i.e. logging in your application)
Which users are doing the invoking and from which IPs (i.e. logging in your application)
What queries and inserts/updates/deletes the DB is performing
Load average, CPU utilization, disk i/o, network traffic on all computers (and VMs) in your system
Making sure that all this information is easily retrievable and that you can correlate logs from different computers and services (i.e. ensure all computers are time synchronized using ntp).
Sensible constraints and limits in your application. For example, you might:
Use a QoS feature in the load balancer to send all anonymous sessions to separate application servers in your cluster, while logged-on users use another set. This prevents an application-level anonymous DDOS taking out valuable customers
Using a strong CAPTCHA to protect anonymous services
Session timeouts
Have a session-limit or rate-limit on certain types of request like reports. Ensure that you can turn off anonymous access if necessary
Ensure that a user has a limit to the number of concurrent sessions (to prevent a hacked account logging on a million times)
Have different database application users for different services (eg transactional use vs. reporting use) and use database resource management to prevent one type of web request from overwhelming all others
If possible make these constraints dynamic, or at least configurable. This way, while you are under attack, you can set aggressive temporary limits in place ('throttling' the attack), such as only one session per user, and no anonymous access. This is certainly not great for your customers, but a lot better than having no service at all.
Last, but not least, write a DOS Response Plan document and get this internally reviewed by all relevant parties: Business, Management, the SW dev team, the IT team and a security expert. The process of writing the document will cause you and your team to think through the issues and help you to be prepared if the worst should happen at 3am on your day off. The document should cover (among other things):
What is at risk, and the cost to the business
Measures taken to protect the assets
How an attack is detected
The planned response and escalation procedure
Processes to keep the system and this document up-to-date
So, preamble aside, here are some specific answers:
DDOS are generally blocked on a server level, right?
Not really - most of the worst DDOS attacks are low-level (at the IP packet level) and are handled by routing rules, firewalls, and security devices developed to handle DDOS attacks.
Is there a way to block it on a PHP level, or at least reduce it?
Some DDOS attacks are aimed at the application itself, sending valid URIs and HTTP requests. When the rate of requests goes up, your server(s) begin to struggle and you will have an SLA outage. In this case, there are things you can do at the PHP level:
Application level monitoring: Ensure each service/page logs requests in a way that you can see what is going on (so you can take actions to mitigate the attack). Some ideas:
Have a log format that you can easily load into a log tool (or Excel or similar), and parse with command-line tools (grep, sed, awk). Remember that a DDOS will generate millions of lines of log. You will likely need to slice'n'dice your logs (especially with respect to URI, time, IP and user) to work out what is going on, and need to generate data such as:
What URIs are being accessed
What URIs are failing at a high rate (a likely indicator of the specific URIs the attackers are attacking)
Which users are accessing the service
How many IPs are each user accessing the service from
What URIs are anonymous users accessing
What arguments are being used for a given service
Audit a specific user's actions
Log the IP address of each request. DON'T reverse DNS this - ironically the cost of doing this makes a DDOS easier for the attackers
Log the whole URI and HTTP method, eg "GET http://example.com/path/to/service?arg1=ddos"
Log the User ID if present
Log important HTTP arguments
Sensible rate limits: you might implement limits on how many requests a given IP or user can make in a given time period (see the sketch after this list). Could a legitimate customer make more than 10 requests per second? Can anonymous users access expensive reports at all?
CAPTCHA for anonymous access: Implement a CAPTCHA for all anonymous requests to verify that the user is a person, not a DDOS bot.
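For illustration, a hedged PHP sketch of the rate-limit idea from the list above, using APCu (requires the APCu extension, 5.1.12+ for the TTL argument of apcu_inc; the limit and window are assumptions to tune):
<?php
// Count requests per client IP in a fixed window and reject the excess.
$ip     = $_SERVER['REMOTE_ADDR'];
$limit  = 10; // max requests per window
$window = 60; // window length in seconds

// apcu_inc() creates the counter with the TTL on first use, then increments it.
$count = apcu_inc('ratelimit_' . $ip, 1, $success, $window);
if ($count !== false && $count > $limit) {
    header('HTTP/1.1 429 Too Many Requests');
    header('Retry-After: ' . $window);
    exit;
}
?>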
What's the fastest and most common way to stop DDOS attacks?
The fastest is probably to give in to the blackmail, although this might not be desirable.
Otherwise, the first thing to do is contact your hosting and/or CDN provider and work with them (if they haven't contacted you already asking what the hell is going on...). When a DDoS occurs, it will likely collaterally affect other customers of the hosting provider, and the provider may be under considerable pressure to shut down your site simply to protect their resources. Be prepared to share your logs (any and all information) with the provider; these logs, combined with their network monitors, may together provide enough information to block/mitigate the attack.
If you are expecting a DDOS, it's a very good idea to qualify your hosting provider on the level of protection they can provide. They should have DDOS experience and tools to mitigate it - understand their tools, processes and escalation procedures. Also ask about what support the hosting provider has from their upstream providers. These services might mean more up-front or monthly cost, but treat this as an insurance policy.
While under attack, you will need to grab your logs and mine them - try and work out the pattern of the attack. You should consider switching off anonymous access and throttling the services under attack (i.e. decrease the application's rate limit for the service).
If you're lucky and you have a small, fixed customer base, you might be able to determine your valid customers' IP addresses. If this is the case, you might switch to a whitelist approach for a short while. Make sure all your customers know this is going on, so they can call if they need to access from a new IP :)
Doug McClean has some great advice at: https://stackoverflow.com/a/1029613/1395668
Regarding the PHP part of the question:
Although I wouldn't rely on PHP for this, it could be implemented, but you need to consider all of these possibilities and more:
Attacker may change IP for each request
Attacker may pass parameter(s) in the URI that the target site simply ignores (making each URI look unique)
Attacker may restart the session before expiry
...
A simple sketch:
<?php
// Assumes the session has already been started.
$uri = md5($_SERVER['REQUEST_URI']);
$exp = 3; // minimum seconds allowed between repeat requests to the same URI

if (isset($_SESSION['ddos'])) {
    list($_uri, $_time) = explode('|', $_SESSION['ddos']);
    // Reject only a repeat request for the same URI inside the time window
    // (the original version also rejected the very first request).
    if ($_uri === $uri && time() - (int) $_time < $exp) {
        header('HTTP/1.1 503 Service Unavailable');
        die;
    }
}

// Remember this request for the next comparison.
$_SESSION['ddos'] = $uri . '|' . time();
?>
The PHP level is too late in the request chain.
Putting your Apache server behind an open-source appliance may be a good option for you.
http://tengine.taobao.org/ has some documentation and source code for modules aimed at DDoS prevention. It is an extension of nginx, so you can easily set it up as a reverse proxy for your Apache instance.
See http://blog.zhuzhaoyuan.com/2012/01/a-mechanism-to-help-write-web-application-firewalls-for-nginx/ for how to fight hash-collision DoS attacks.
Almost forgot: http://www.cloudflare.com is one of the top free web application firewalls. They have free and paid plans, and it will save you from DDoS attacks. We use it for a lot of our high-traffic sites, just for its caching capabilities. It is awesome!
You cannot do this at the PHP level. DDoS is a kind of attack that sends too many requests to your web server; the web server will reject requests before it ever calls your PHP script.
If you are using Apache, here is some tips from Apache:
http://httpd.apache.org/docs/trunk/misc/security_tips.html
DDoS is best handled by very expensive, purpose-built network appliances. Hosts are generally not good at doing DDoS protection because they are subject to relatively low performance, state exhaustion, limited bandwidth, etc. Use of iptables, apache mods, and similar services can help in some situations if you have no access to DDoS mitigation hardware or a DDoS mitigation service, but it is far from ideal and still leaves you at risk of attack.
Do NOT use PHP-based protection; it's horrible and will hardly have an impact at all! Configure your web server to rate-limit requests instead, for example in nginx using the limit_req module (http://nginx.org/en/docs/http/ngx_http_limit_req_module.html).
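For example, a hedged limit_req sketch for nginx.conf (the zone name, rate and burst values are assumptions; see the linked module documentation):
http {
    # one 10 MB zone keyed by client IP, allowing 10 requests/second
    limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;
    server {
        location / {
            # allow short bursts of 20 extra requests, then reject with errors
            limit_req zone=perip burst=20 nodelay;
        }
    }
}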
I would also recommend using CloudFlare to combat layer-4 attacks (though not layer-7 attacks, unless you're willing to pay).
There are modules you can use in Apache for DoS/DDoS protection.
Good start here
http://www.debianadmin.com/how-to-protect-apache-against-dosddos-or-brute-force-attacks.html
If you're on LEMP, you can check here.
http://nginx.org/en/docs/http/ngx_http_limit_conn_module.html
These are good inexpensive starting points.
How about something like this on PHP side:
//if user does not change IP, then ban the IP when more than 10 requests per second are detected in 1 second
$limitps = 10;
if (!isset($_SESSION['first_request'])){
$_SESSION['requests'] = 0;
$_SESSION['first_request'] = $_SERVER['REQUEST_TIME'];
}
$_SESSION['requests']++;
if ($_SESSION['requests']>=10 && strtotime($_SERVER['REQUEST_TIME'])-strtotime($_SESSION['first_request'])<=1){
//write the IP to a banned_ips.log file and configure your server to retrieve the banned ips from there - now you will be handling this IP outside of PHP
$_SESSION['banip']==1;
}elseif(strtotime($_SERVER['REQUEST_TIME'])-strtotime($_SESSION['first_request']) > 2){
$_SESSION['requests'] = 0;
$_SESSION['first_request'] = $_SERVER['REQUEST_TIME'];
}
if ($_SESSION['banip']==1) {
header('HTTP/1.1 503 Service Unavailable');
die;
}
DDoS is generally blocked at the server level, so please enable DDoS protection at your server level. Check the notes below on DDoS protection.
Apache HTTP Server configuration settings that can help prevent DDOS problems:
The RequestReadTimeout directive allows you to limit the time a client may take to send its request.
Allow 10 seconds to receive the request including the headers and 30 seconds for receiving the request body:
RequestReadTimeout header=10 body=30
Allow at least 10 seconds to receive the request body. If the client sends data, increase the timeout by 1 second for every 1000 bytes received, with no upper limit for the timeout (except for the limit given indirectly by LimitRequestBody):
RequestReadTimeout body=10,MinRate=1000
Allow at least 10 seconds to receive the request headers, increasing the timeout by 1 second for every 500 bytes received, but allow no more than 30 seconds for the headers in total:
RequestReadTimeout header=10-30,MinRate=500
Usually a server should have both header and body timeouts configured, for example:
RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
The KeepAliveTimeout directive may also be lowered on sites that are subject to DoS attacks. Some sites even turn off keepalives completely via KeepAlive Off, which of course has other performance drawbacks.
The values of various timeout-related directives provided by other modules should be checked.
The directives LimitRequestBody, LimitRequestFields, LimitRequestFieldSize, LimitRequestLine, and LimitXMLRequestBody should be carefully configured to limit resource consumption triggered by client input.
Tune the MaxRequestWorkers directive to allow the server to handle the maximum number of simultaneous connections without running out of resources.
Anti-DDoS steps:
The very first important thing is to identify the DDoS attack. The earlier you identify it, the better for your server.
Keep more than enough bandwidth available for your server. This won't prevent a DDoS attack, but it will take longer to overwhelm you, buying you some extra time to act.
If you own your own web server, you can defend at the network perimeter: rate-limit your router, add filters to drop packets from identified attack sources, time out half-open connections more aggressively, and set lower SYN, ICMP and UDP flood drop thresholds.
If you don't know much about these things, contact your hosting provider quickly; they can do their best to prevent the DDoS attack.
There are also specialised DDoS mitigation services provided by Cloudflare and many other companies that can help you mitigate attacks; many companies offer inexpensive DDoS and DoS protection.