I have a Synology DSM 6.2 server running NGINX, with a WordPress website on it. The website is served on port :80 and the DSM management page on port :5000.
Someone is trying to break into the server's admin account through the login screen on port :5000, apparently with a brute-force attack.
I want to put a captcha (like Cloudflare's) in front of this login page, but I don't have access to the login page's code, so I can't, for example, add Google reCAPTCHA scripts to the header.
So the question is: is it possible to create some kind of redirect logic in NGINX, like "when a user tries to access port :5000, check the cookies for a token; if the token is not found, redirect to (or somehow show) a captcha; if verification succeeds, redirect back to port :5000"?
Maybe there is some kind of plugin or better way to solve this problem?
PS Sorry for the poor English and incomprehensible explanations.
Multiple options:
Change to HTTPS (mandatory).
Activate IP blocking for attacks, and expand it to whole ranges where necessary.
Install a Zyxel USG and add rules to block by geography.
I sometimes see 6-7 attacks per second, and at least 1-2 per minute. The firewall blocks almost everything. If you don't need access to your site from Russia, China, Ukraine, etc., geo-blocking is the most effective approach.
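If you do want the cookie-gate logic from the question, something like the following nginx sketch could work (the port, cookie name, and captcha page are placeholders; DSM tends to regenerate its own nginx config, so manual edits may not survive updates, and a bare cookie check can be forged unless the token is also validated server-side):

    # Gate in front of the DSM login: requests without a token cookie are
    # sent to a captcha page; verified visitors are proxied through to :5000.
    server {
        listen 5001;                      # public port; keep :5000 firewalled

        location / {
            # "captcha_token" is a placeholder; your captcha page would set
            # this cookie after a successful verification.
            if ($cookie_captcha_token = "") {
                return 302 /captcha.html;
            }
            proxy_pass http://127.0.0.1:5000;
        }

        location = /captcha.html {
            root /var/www/captcha;        # hypothetical path to the challenge page
        }
    }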
I was implementing an API (without JWT) where user login is not needed. So I chose Basic authentication, but now anyone can steal the credentials from the browser's Network tab, because they are visible in the request.
After a lot of research, I think it's better to whitelist domains (75 total in my system). I tried this in CodeIgniter 3, but I can't find a way to get the calling domain name via the Referer or Origin header.
The remaining option is to whitelist the domains at the server level. I have a Plesk installation on Ubuntu, but I don't know the best way to whitelist domains there; I have never done this kind of server work before.
I hope I have explained the question better!
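For reference, CodeIgniter 3 exposes those headers through its input class; a minimal sketch, inside a controller (the whitelist entries are placeholders, and these headers can be forged by non-browser clients, so this only stops casual misuse):

    <?php
    // Read the calling domain from the Origin header, falling back to Referer,
    // then check it against the whitelist.
    $origin = $this->input->server('HTTP_ORIGIN')
           ?: $this->input->server('HTTP_REFERER');
    $domain = $origin ? parse_url($origin, PHP_URL_HOST) : null;

    $whitelist = ['app1.example.com', 'app2.example.com']; // hypothetical entries

    if ($domain === null || ! in_array($domain, $whitelist, true)) {
        show_error('Domain not allowed', 403);
    }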
I was tempted to post this on ServerFault.
Cloudflare is enabled with SSL. I got 250k requests in the past 2 hours; 192k were cached, the rest weren't.
The page that's causing the issue for me is "/wp-admin/admin-ajax.php".
The CPU is spiking like crazy. I've been manually adding IPs to the server's firewall, but that's not enough since the IPs keep changing, so the site is down because it keeps crashing or timing out.
I'm on a dedicated server; is there anything else I can do besides Cloudflare? I have cPanel installed and can add extensions.
Thanks.
Technically the admin-ajax.php file is supposed to be publicly available, but who puts a file like that in an admin directory? sigh.
In addition to using fail2ban (which is a great suggestion), you could try one or more of the following, in no particular order.
Look at the user agents used in the attack; if there is a commonality, use Cloudflare's User-Agent Blocking.
Use Cloudflare's JavaScript challenge on the firewall to require visitors from the attacking countries to complete a JavaScript captcha before being allowed through.
Block access to the file (temporarily?) using .htaccess; see the sketch after this list.
Use a Cloudflare page rule to redirect traffic for that URI somewhere else (e.g. to www.yoursite.com).
Use Cloudflare Access to protect your entire /wp-admin directory.
Use Cloudflare's zone lockdown to restrict access to that URI to specific IP addresses.
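For the .htaccess option above, a minimal sketch that denies admin-ajax.php to everyone except one known address (203.0.113.10 is a placeholder for your own IP; Apache 2.4 syntax):

    # In the WordPress root .htaccess. Note: blocking this file can break
    # front-end AJAX used by some themes and plugins, hence "temporarily".
    <Files "admin-ajax.php">
        Require ip 203.0.113.10
    </Files>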
I have set up a small Raspberry Pi home automation system from scratch, along with some IP cams to monitor my house.
All of it is controlled by a small webpage (with relay switches and video feeds) that I have set up, and it works very well within my home network.
I was thinking of setting up a DynDNS account (as I don't have a static IP) in order to access all of this remotely over 4G.
Obviously I don't want to just open a port to my web page and make it accessible online.
How do I make this secure? Here are my thoughts:
Do I set up a quick Joomla! site inside my network that I'll have to log in to every time? (Or something similar; I'm just familiar with Joomla!)
Is there some way to password-protect the website with .htaccess? Is it safe? Could you point me to a guide?
Is there some way to restrict access to only my cellphone's 4G MAC address?
Is there some way to set up a VPN or other "tunnel" between my phone and home? (I wouldn't want it to apply to all of my phone's traffic, though.)
Do I have this all wrong, and there's some other awesome way to do what I need?
Please keep in mind that I would appreciate simplicity and ease of connection every time I access the website; i.e., I wouldn't want to log in every time I need to open my garage while driving up to the house.
Lastly, I was thinking of posting this on some other Stack site, but I ended up here; if you think there is a more suitable community, please let me know.
Thanks in advance
Password-protect it. Be aware, however, that the IP cameras themselves can be hacked; find vulnerabilities in their software and patch them, and then port forwarding to the cameras should be reasonably safe. You can use Hamachi for a tunnel, but it requires a computer at home to run it. You can also restrict access by IP address. There are guides for all of this on the Internet; Google is your best option.
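For the .htaccess option, a minimal sketch (paths are placeholders; Basic auth sends the credentials with every request, so only use it over HTTPS):

    # .htaccess in the webpage's directory. Create the password file first:
    #   htpasswd -c /etc/apache2/.htpasswd youruser
    AuthType Basic
    AuthName "Home Automation"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user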
I run a computer lab for grade schoolers (3-14 y.o.) and would like to create a desktop/dashboard page consisting of a number of iframes, each pointing at a different external website (for which we have created individual accounts for each child); when a kid logs in to the dashboard, a script will log her in to those websites, so she does not have to.
I have 1 server and 20 workstations; I'll refer to them as 'myserver' and 'mybrowser'(s) respectively. All of these are behind the same router (dynamic IP).
A kid gets on a 'mybrowser' workstation, fires up Firefox, and runs desktop.php (hosted on 'myserver'), getting a login screen (for 'myserver'):
'mybrowser' ---http---> 'myserver'
Once logged in, 'myserver' retrieves a username and password stored in its database and runs a cURL script to send them to an 'external web server':
'mybrowser' ---http---> 'myserver' ---curl---> 'external web server'
SUCCESSFUL. Well, so I thought.
It turns out cURL, being run on 'myserver', logs in 'myserver' instead of 'mybrowser'.
The session inside the iframe, after refresh, is still NOT logged in. Now I know.
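For illustration, the cURL login step looks roughly like this (a minimal sketch; the URL and form field names are hypothetical):

    <?php
    // The crucial detail: CURLOPT_COOKIEJAR stores the session cookie in a
    // file on 'myserver', so the logged-in session never reaches 'mybrowser'.
    $user = 'kid01';   // placeholder credentials from the database
    $pass = 'secret';

    $ch = curl_init('https://external-site.example/login');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => ['username' => $user, 'password' => $pass],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_COOKIEJAR      => '/tmp/myserver-cookies.txt',
        CURLOPT_COOKIEFILE     => '/tmp/myserver-cookies.txt',
    ]);
    curl_exec($ch);
    curl_close($ch);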
Then I thought of capturing the cookies from 'myserver' and setting them in 'mybrowser', so that 'mybrowser' could then browse (within the iframe) as a logged-in user. After all, we (all the 'mybrowsers') are behind the same router as 'myserver', and thus share the same IP address.
In other words, I only need 'myserver' to log a user in to several external websites all at once, and once that's done, hand control back over to the individual users' browsers.
I hope the answer won't resort to using cURL to display and control the external websites for the whole session; aside from being a drag, that would lead to other sticky issues.
I get the sense that this is not permitted due to security issues, but what if all the 'mybrowsers' and 'myserver' are behind the same router? Assuming there's a way to copy the login cookies from 'myserver' to the 'mybrowsers', would the 'external web server' know that a request came from a different machine?
Can this be done?
Thanks.
The problem you are facing relates to the security principles of cookies. You cannot set cookies for other domains, which means that myserver cannot set a cookie for facebook.com, for example.
You could run an HTTP proxy on your server so that all queries pass through it, with some kind of URL translation (e.g. facebook.com => facebook.myserver). This allows you to set cookies for the clients (since you're serving facebook.myserver), and to translate the cookies you receive from the clients and feed them to the third-party websites.
An example of a non-transparent proxy that you could begin with: http://www.phpmyproxy.com/
Transparent proxies (in which URLs remain "correct" / untranslated) might be worth considering too. Squid is a pretty popular one. Can't say how easy this would be, though.
After all that, you'll still need to build a local script for myserver that takes care of the login process, but at least a proxy should make it all possible.
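To make the cookie-translation idea concrete, here is a very simplified sketch (hypothetical hostnames; a real proxy would also forward the request method, body, and client cookies, and rewrite links in the returned HTML):

    <?php
    // This script answers for facebook.myserver and forwards to the real site,
    // rewriting Set-Cookie headers on the way back.
    $upstream = 'www.facebook.com';
    $proxy    = 'facebook.myserver';

    $ch = curl_init('https://' . $upstream . $_SERVER['REQUEST_URI']);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => true,  // keep headers so Set-Cookie can be rewritten
    ]);
    $response   = curl_exec($ch);
    $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    curl_close($ch);

    // Re-issue upstream cookies under the proxy's own hostname so the browser
    // (which is talking to facebook.myserver) will accept them.
    foreach (explode("\r\n", substr($response, 0, $headerSize)) as $line) {
        if (stripos($line, 'Set-Cookie:') === 0) {
            header(str_ireplace($upstream, $proxy, $line), false);
        }
    }
    echo substr($response, $headerSize);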
If you have any say in the login process itself, it might be easier to set up all the services to use OpenID or a similar login service; Stack Overflow and its sister sites are a prime example of how easy logging in across multiple sites can be.
I'm building an online dating website at the moment.
There needs to be an admin backend to the site to approve users/photos etc.
I can add this admin part of the site (login etc.) to the same domain,
e.g. www.domainname.com/admin.
Or, from my experience with PHP cURL, I could put this part on a different domain and cURL the requests through.
Question: is it more secure to put the admin code/site on a completely different domain, or does it really not matter if it sits on the same domain? Hacking/security is the real point of this question.
thx
Technically it might be more secure to run it on a different server, hosted on a subdomain with a different IP/vhost. Alternatively, use a proxy module for your web server (see Apache mod_proxy) to proxy requests from yourdomain.com/admin to admin.otherdomain.com, and enforce additional IP or access control on the proxied URL using .htaccess or equivalent.
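As a sketch of the mod_proxy route (hostnames and the IP are placeholders; this goes in the vhost for www.domainname.com, since ProxyPass isn't allowed in .htaccess):

    # Requires mod_proxy and mod_proxy_http.
    SSLProxyEngine On
    ProxyPass        /admin https://admin.otherdomain.com/
    ProxyPassReverse /admin https://admin.otherdomain.com/

    # Optional extra gate: only allow a known IP through the proxied path.
    <Location /admin>
        Require ip 203.0.113.10
    </Location>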
Of course, if those other domains are web accessible, then they are only as secure as the users and passwords that use them.
For corporate applications, you may want to make the admin interface accessible from a VPN connection, but I don't know if that applies to you.
If there is a vulnerability on your public web server that lets someone get shell access, a separate admin server makes it slightly more difficult for them to gain administrative access, since they don't have the code for the administration portion.
In other words, it can provide additional security depending on the lengths you go to, but is not necessarily a solid solution.
Using something like cURL is a possibility, but you'd have far less troubleshooting to do with a more conventional method like a proxy or a subdomain on another server.