DDoS attack on /wp-admin/admin-ajax.php - php

Was tempted to post this on ServerFault.
Cloudflare is enabled with SSL. I got 250k requests in the past 2 hours; 192k were cached, the others weren't.
The page that's causing the issue for me is "/wp-admin/admin-ajax.php".
The CPU is spiking like crazy here. I've been manually adding IPs to the server's firewall, but it's not good enough since the IPs keep changing. So the site is down because it keeps crashing/timing out.
I'm on a dedicated server; is there anything else I can do besides Cloudflare? I have cPanel installed and can add extensions.
Thanks.

Technically the admin-ajax.php file is supposed to be publicly available, but who puts a file like that in an admin directory? sigh.
In addition to using fail2ban (which is a great suggestion), you could try one or more of the following, in no particular order.
Look at the user agents used in the attack; if there is a commonality, use Cloudflare's User-Agent Blocking.
Use Cloudflare's JavaScript challenge on the firewall to require visitors from the attacking countries to complete a challenge before being allowed through.
Block access to the file (temporarily?) using .htaccess (see the sketch after this list).
Use a Cloudflare page rule to redirect traffic to that URI to another (e.g. to www.yoursite.com).
Use Cloudflare Access to protect your entire /wp-admin directory.
Use Cloudflare's zone lockdown to restrict access to that URI to specific IP addresses.
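For the .htaccess option, here's a minimal sketch (Apache 2.4 syntax; the IP address is a placeholder for wherever your own requests legitimately originate, and note that blocking admin-ajax.php outright will also break any front-end features that use it):

    # Emergency block: deny admin-ajax.php to everyone except a trusted IP.
    # 203.0.113.10 is a placeholder. Requires Apache 2.4 "Require" syntax.
    <Files "admin-ajax.php">
        Require ip 203.0.113.10
    </Files>

This can go in a .htaccess file inside /wp-admin/, or in the site root's .htaccess since <Files> matches by filename.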

Related

Captcha on nginx server, brute-force attack

I have a Synology DSM 6.2 server running NGINX, with a WordPress website on it. The website is hosted on port 80; the server admin page is hosted on port 5000.
Someone is trying to hack the server admin account through the login screen on port 5000, something like a brute-force attack.
I want to put a captcha (like Cloudflare's) on this login page, but I don't have access to the login page code, so I can't put Google reCAPTCHA scripts into the header, for example.
So the question is: is it possible to create some kind of redirect logic in NGINX like "when a user tries to access port 5000, check cookies for a token; if the token is not found, redirect to (or somehow show) a captcha; if verification succeeds, redirect back to port 5000"?
Maybe there is some kind of plugin, or a better way to solve this problem?
PS Sorry for the poor English and incomprehensible explanations.
Multiple options:
Change to HTTPS (mandatory)
Activate IP blocking for attacks, and expand to whole ranges where necessary
Install a Zyxel USG and add rules to block by geography
I sometimes see 6-7 attacks per second, and at least 1-2 per minute at all times; the firewall blocks almost everything. If you don't need access to your site from Russia, China, Ukraine, etc., this is the most effective approach.
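As for the cookie-token redirect asked about in the question, something like it can be sketched in NGINX. This is a hypothetical fragment: the cookie name, challenge URL, and ports are placeholders, it assumes the real DSM login can be moved behind the proxy to port 5001, and DSM's bundled nginx config is not always freely editable:

    # Visitors without an "authgate" cookie are bounced to a challenge page;
    # everyone else is proxied through to the real login.
    server {
        listen 5000;

        location / {
            # $cookie_authgate is empty when the cookie is missing
            if ($cookie_authgate = "") {
                return 302 http://example.com/challenge;
            }
            proxy_pass http://127.0.0.1:5001;  # real DSM login, moved off :5000
        }
    }

The challenge page would verify the captcha, set the cookie (ideally a signed token rather than a fixed value), and redirect back to port 5000.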

Disallow Robots in API URL using PHP

I know I can disallow robots using robots.txt, but a few search engines do not follow it. I have an API where my users send transactional info to insert/update/delete etc. using my API request parameters. But when I look at my logs, there are huge numbers of hits to my .php page. I googled for a way to block them from my PHP API page and found nothing.
Hence I landed on SO to get help from the experts: is there any way I can block/disallow search engine robots from accessing my base API URL?
The main approaches that I know of for dealing with bots that ignore robots.txt are to either:
Blacklist them via your firewall or server
Only allow whitelisted users to access your API
However, you should ask yourself whether they're having any impact on your website. If they're not spamming you with requests (which would be a DDoS attack) then you can probably safely ignore them and filter them out of your logs if you need to analyse real traffic.
If you're running a service that people use and you don't want it to be wide open to spam then here's a few more options on how to limit usage:
Restrict access to your API to just your users by assigning them an API token (see the sketch after this list)
Rate limit your API (either via the server and/or via your application)
Read the User-Agent (UA) of your visitors; a lot of bots will mention that they're bots or have fake UAs, while the malicious ones will pretend to be real users
Implement more advanced measures such as limiting access to a region if a lot of requests suddenly come from there in a short period of time
Use a DDoS protection service such as Cloudflare
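As a minimal sketch of the token option above (the header name and token store are placeholder choices, not a drop-in design):

    <?php
    // api-gate.php - hypothetical check at the top of the API endpoint.
    // Tokens would normally be issued per user and kept in a database.
    $validTokens = ['example-secret-token' => 'customer-1'];

    // Clients send their token in an X-Api-Token request header.
    $token = $_SERVER['HTTP_X_API_TOKEN'] ?? '';

    if (!isset($validTokens[$token])) {
        http_response_code(403);  // unknown or missing token: reject
        exit('Forbidden');
    }

    // ...continue with the normal insert/update/delete handling...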
There's no perfect solution, and each option involves trade-offs. If you're worried about DDoS, you could start by looking into your server's capabilities; for example, here's an introduction to how NGINX can control traffic: https://www.nginx.com/blog/rate-limiting-nginx/
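The core of that guide is the limit_req directive; a hypothetical fragment (the zone name, path, and rates are placeholders):

    # In the http{} block: track clients by IP, allow 5 requests/second.
    limit_req_zone $binary_remote_addr zone=api:10m rate=5r/s;

    server {
        location /api.php {
            # queue up to 10 excess requests, then answer 503
            limit_req zone=api burst=10;
            # ...normal PHP handling (fastcgi_pass etc.) goes here...
        }
    }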
In a nutshell, any IP hitting your site can be a bot so you should defend by imposing limits and analysing behaviour, since there's no way to know for sure who is a malicious visitor and who isn't until they start using your service.

Website Administration Location + PHP CURL

I'm building an online dating website at the moment.
There needs to be an admin backend to the site to approve users/photos etc.
I can add this admin part of the site/login etc to the same domain.
eg: www.domainname.com/admin
Or, from my experience with PHP cURL, I can put this site on a different domain and cURL the requests through.
Question: is it more secure to put the admin code/site on a completely different domain, or does it really not matter if it sits on the same domain? Hacking/security is the real point of this.
thx
Technically it might be more secure if you ran it from a different server and hosted it on a subdomain using a different IP/vhost, or used a proxy module for your webserver (see Apache mod_proxy) to proxy requests from yourdomain.com/admin to admin.otherdomain.com, enforcing additional IP or access control on the proxied URL using .htaccess or equivalent (sketched below).
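A hypothetical Apache fragment for that setup (mod_proxy and mod_proxy_http enabled; the host name and IP are placeholders):

    # Restrict /admin by IP and proxy it to the separate admin host.
    # An https backend additionally needs "SSLProxyEngine On".
    <Location "/admin">
        Require ip 203.0.113.10
        ProxyPass        "https://admin.otherdomain.com/"
        ProxyPassReverse "https://admin.otherdomain.com/"
    </Location>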
Of course, if those other domains are web accessible, then they are only as secure as the users and passwords that use them.
For corporate applications, you may want to make the admin interface accessible from a VPN connection, but I don't know if that applies to you.
If there is a vulnerability on your public webserver that allows someone to get shell access, then it may make it slightly more difficult to get administrative access since they don't have the code for the administration portion.
In other words, it can provide additional security depending on the lengths you go to, but is not necessarily a solid solution.
Using something like cURL is a possibility, but you'd have far less troubleshooting to do with a more conventional method like a proxy or a subdomain on another server.

Using htaccess to track visitors?

I have a demo server where I put samples of my apps, I send potential customers links to those apps. Is it possible to use htaccess to track visitors, without adding tracking capability to the apps themselves? The data I'm interested in are:
date and time of page visit
ip of visitor
url of the page visited
referrer
post and get (query string) data if any
That depends entirely on your webserver and what options it provides for .htaccess overrides.
For Apache, the access log records exactly what you are looking for (see http://httpd.apache.org/docs/current/logs.html#accesslog), but it is not configurable via .htaccess.
No, it's impossible to use the .htaccess file for this, because it's merely a configuration file, not an executable one.
However, you can use another web server capability: log files.
Everything you're asking for is already stored in the access log, in almost the same format you listed here.
An important note: unlike Google Analytics or any other third-party or scripting solution, web server logs are the only reliable and exact source of tracking data, containing every request made to your site.
The best way is to use Google Analytics.
You will get everything you need and much, much more.
I know this thread has been quiet for a while, but is it not possible to use the prepend directive, which prepends a script to every request, to track site/page visits?
I haven't got the code (I tried something similar, though it was not successful), but I used the prepend directive to prepend a script that "switches on" gzip for all site visits. I am sure the same can be implemented for logging (for those of us on cheap shared servers!). Come on coders, do us all a favour and reveal the secret!
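For what it's worth, the directive being remembered here is PHP's auto_prepend_file, which can be set from .htaccess on hosts running mod_php that allow the override (php_value auto_prepend_file /path/to/visit-logger.php). A minimal logging sketch, with placeholder paths:

    <?php
    // visit-logger.php - appends one tab-separated line per request.
    $line = sprintf(
        "%s\t%s\t%s\t%s\t%s\n",
        date('c'),                         // date and time of the visit
        $_SERVER['REMOTE_ADDR'] ?? '-',    // visitor IP
        $_SERVER['REQUEST_URI'] ?? '-',    // URL visited, incl. query string
        $_SERVER['HTTP_REFERER'] ?? '-',   // referrer, if the browser sent one
        json_encode($_POST)                // POST data, if any
    );
    file_put_contents('/path/to/visits.log', $line, FILE_APPEND | LOCK_EX);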

What is https and SSL? How do they work? How can they be used in PHP?

I know the general definition, but I need more details on how to implement them in general and in PHP specifically, and what exactly are the features I gain from them?
SSL stands for "Secure Sockets Layer", and it's a method of encrypting HTTP communication (among other things). It encrypts the traffic between a web browser and a server, making it possible to send secure data without fear of eavesdropping.
SSL is a web-server level technology, and has nothing to do with PHP. You can enable any web server with SSL, whether it has PHP on it or not, and you don't have to write any special PHP code in order to make your PHP pages show up over SSL.
There are many, many guides to be found on the internet about how to set up SSL for whatever webserver you might be using. It's a broad subject. You could start here for Apache.
Some webservers are configured to mirror the whole site, so you can get every page over HTTP or HTTPS, depending on what you prefer or how the web browser sends visitors around. HTTPS is secure, but a bit slower, and it puts more strain on your hardware.
So you might implement your site and shop as usual, but decide to put everything from the cart to the checkout, payment, and so on under HTTPS. To accomplish this, all links to the shopping cart are absolute and prefixed with https:// instead of http://. Now, if people click on the shopping cart icon, they're transferred to the secure version, and because all links from there on are relative again, they stay there.
But! They might replace the https with http manually, or land on the unencrypted version via a malicious link, etc.
In this case, you might want to check whether your script was called over HTTPS ($_SERVER['HTTPS'], in PHP) and deny execution if not (good practice), or issue a redirect to the secure site.
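A minimal sketch of that check, redirecting plain-HTTP requests to the same URL over HTTPS:

    <?php
    // Most servers set $_SERVER['HTTPS'] for TLS requests (IIS sets "off").
    if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
        $url = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
        header('Location: ' . $url, true, 301);
        exit;
    }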
On a side note: HTTPS no longer uses SSL exclusively; TLS (the successor to SSL, see RFC 2818) is more modern.
Rule of thumb: users can have the choice between HTTP and HTTPS in noncritical areas, but should be forced onto HTTPS for the critical parts of your site (login/cart/payment/...) to prevent malicious attacks.
