I have a site that is attacked all the time; it runs Joomla with extensions.
I am trying to figure out which exploit the attackers are using, and I have decided to block those requests.
I am using the code below:
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)xqgu=(.*)
RewriteRule ^(.*)$ - [F]
RewriteCond %{QUERY_STRING} (^|&)fck=(.*)
RewriteRule ^(.*)$ - [F]
but it's not working, as I can still access the site at
site.com/index.php?fck=you
Can I block all GET requests that have any parameter after index.php
when they are not coming from my IP,
like
site.com/index.php?fck=uxsw
site.com/index.php?xqgu=otzd
site.com/index.php?some=thing
but still allow GET on
site.com/index.php
order deny,allow
deny from all
allow from <your ip>
or
<RequireAll>
Require ip xx.xx.xx.xx yy.yy.yy.yy
</RequireAll>
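If the goal is exactly that (index.php with any query string is forbidden unless the request comes from your own address), a mod_rewrite sketch could look like the block below. The IP is a placeholder, and note that if Joomla's SEF rules internally rewrite pretty URLs to index.php?... these conditions will catch those too, so test carefully:
RewriteEngine On
# let requests from my own address through (placeholder IP)
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.10$
# forbid index.php whenever it carries any query string at all
RewriteCond %{QUERY_STRING} .
RewriteRule ^index\.php$ - [F]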
Related
I have a WordPress site that has recently been under a serious DDoS attack on wp-login.php. I have renamed wp-login.php to a new mysitename-login.php and created a new empty file named wp-login.php. I have joined Cloudflare, but I still see the attacks in access_log. I have tried mod_evasive, but it also kills Googlebot.
Right now I am manually adding the attackers to my .htaccess like this:
<Limit GET POST>
order allow,deny
deny from 108.162.253.180
deny from 173.245.48.134
deny from 173.245.49.187
deny from 173.245.51.180
deny from 173.245.54.66
deny from 108.162.219.
deny from 109.239.235.
allow from all
</Limit>
And I have an idea: generate the .htaccess dynamically,
in the current wp-login.php:
$ip = $_SERVER['REMOTE_ADDR'];
// INSERT INTO ip_table (ip) VALUES ($ip);
// ip is a unique index
$html = "<Limit GET POST>\n";
// SELECT * FROM ip_table and append "deny from $ip\n" for every stored address
$html .= "allow from all\n</Limit>\n";
$html .= <<<TXT
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
TXT;
file_put_contents('/var/www/html/.htaccess', $html);
But I am afraid that if something goes wrong during the file_put_contents call, the .htaccess will end up corrupted and my site will break too...
Is there a better way to build a blacklist from the robots that access wp-login.php, without any risk of breaking the site?
Thanks.
Instead of creating a blacklist, why not make a whitelist? This wouldn't work if you allow all users to log in to WordPress, for example if you're using a membership plugin, but if only you and a few select admins log in, then just get everyone's IP address and add those to your .htaccess file like this:
## Prevent anyone not on my ip whitelist from accessing wp admin
RewriteCond %{REQUEST_URI} ^(/wp-admin|/wp-login.php).*$
RewriteCond %{REMOTE_ADDR} !=111.111.111.111
RewriteCond %{REMOTE_ADDR} !=222.222.222.222
RewriteCond %{REMOTE_ADDR} !=333.333.333.333
RewriteRule ^.*$ / [R=301,L]
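If the server runs Apache 2.4, the same whitelist idea can also be expressed with Require instead of mod_rewrite, applied for example to the renamed login file from the question (the IPs are placeholders):
# Apache 2.4+: only these addresses may reach the real login script
<Files "mysitename-login.php">
Require ip 111.111.111.111 222.222.222.222
</Files>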
What about using mod_evasive for Apache? That way you can easily block all IPs that try to hit a certain URL too often within a short period of time.
You could block all IPs that hit your fake login page as well.
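For reference, mod_evasive is configured in the server config rather than in .htaccess; a typical sketch looks like the block below. The thresholds are only examples, and DOSWhitelist takes IP patterns, so "not killing Googlebot" means whitelisting its published crawl ranges (the range shown is only illustrative):
<IfModule mod_evasive20.c>
# more than 5 hits on the same page within 1 second blocks the client
DOSPageCount 5
DOSPageInterval 1
# more than 100 hits on the whole site within 1 second blocks the client
DOSSiteCount 100
DOSSiteInterval 1
# keep the offender blocked for 10 minutes
DOSBlockingPeriod 600
# never block these addresses (your own IP, known crawler ranges, etc.)
DOSWhitelist 127.0.0.1
DOSWhitelist 66.249.64.*
</IfModule>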
in my site "www.website.com" i have a folder with forbidden access using this .htaccess :
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
RewriteRule .* - [F,L]
Now I want to allow access to specific PHP files, via Ajax (POST), only from my site "www.website.com". I have modified the .htaccess like this:
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://.*website.com [NC]
RewriteRule \.(php)$ - [NC,F,L]
But with this code I have access to all PHP files. I need access only to *Add.php and *Edit.php. These files are in several subfolders. What should I do? Is this the right way to do it?
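For the record, a narrower version of the rules above would exempt only the Add/Edit scripts from the blanket block and keep the referer check just for them; a sketch (keeping in mind, as point 6 of the answer below warns, that the Referer header can be spoofed):
RewriteEngine On
RewriteBase /
# anything in a sub-folder that is not an *Add.php or *Edit.php file stays forbidden
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
RewriteCond %{REQUEST_URI} !(Add|Edit)\.php$ [NC]
RewriteRule .* - [F,L]
# *Add.php / *Edit.php still require a referer from the site itself
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(.*\.)?website\.com [NC]
RewriteRule (Add|Edit)\.php$ - [NC,F,L]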
Well, after a lot of searching, reading and testing, I conclude that the best way to secure a website that works with Ajax is:
1 - Have only one file, if possible, like router.php, that "routes" according to the POST/GET navigation variables, using includes of files that live in sub-folders.
2 - Besides session-based authentication, you could also implement Basic HTTP authentication and/or HTTPS (SSL) to secure user credentials during login. If you are not using HTTPS, you should use field or form encryption, because on the wire everything travels in plaintext. I have found this useful: http://www.itsyndicate.ca/jquery/
3 - In every POST, use token-based authentication with a token that is derived from the user credentials, sent along with the request, then re-calculated on the server and compared.
4 - I have tried a lot of combinations to get by with only one .htaccess in the document root, but there was always something missing, mis-configured or not working as I expected. So I found it simpler to use one .htaccess in every sub-folder instead of one in the root. Depending on what the sub-folder contains, its .htaccess should look like this:
### No Direct Access to All files ###
<IfModule mod_authz_host.c>
Order Deny,Allow
Deny from all
# Allow from 127.0.0.1
</IfModule>
### One of the alternative ways with mod_rewrite ###
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
RewriteRule .* - [F,L]
</IfModule>
### permit ONLY filetypes in a pattern ###
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} ^(.+)\.(css|js|gif|png|jpe?g|woff)$
RewriteRule .? - [S=1]
RewriteRule ^(.+)$ - [F,L,NC]
</IfModule>
I prefer THE_REQUEST because it does not include any additional headers sent by the browser, its value has not been unescaped (decoded), unlike most other variables, and there is no point in spoofing it. I use the skip flag [S=1] because I prefer to say "if this matches, skip the next rule"; in every other case the rule that denies access applies.
5 - For extra security you can add code inside the PHP files themselves, implementing one of the methods described in this article:
Prevent direct access to a php include file
6 - Also, if you are forbidding image hotlinking (and not only that) as described here, beware that the Referer header can be spoofed!
I want anything.domain.co.uk to go to a PHP script which then uses the 'anything' from the URL within the HTML, to create different headers for example.
This is the script in my .php file for this:
<?php echo array_shift(explode(".",$_SERVER['HTTP_HOST']));?>
I have used the rules below, but they obviously create a loop.
How do I get anything.domain.co.uk to pick up the PHP script, still have anything.domain.co.uk in the URL, and also have www.domain.co.uk go to the 'normal' index page?
Options +FollowSymlinks -MultiViews
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !www.domain.co.uk$ [NC]
RewriteCond %{HTTP_HOST} ^(www.)?([a-z0-9-]+).domain.co.uk [NC]
RewriteRule (.*) http://%2.domain.co.uk/$1 [R=301,L]
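One way to avoid the loop is an internal rewrite with no R flag, so the address bar keeps the subdomain. A sketch, assuming the handling script is called subdomain.php (a hypothetical name) and that wildcard DNS plus a ServerAlias *.domain.co.uk are already in place:
Options +FollowSymlinks -MultiViews
RewriteEngine On
RewriteBase /
# www (or the bare domain) keeps serving the normal index page
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.co\.uk$ [NC]
RewriteRule ^ - [L]
# any other subdomain is routed internally to the script, which reads
# the 'anything' part from HTTP_HOST as in the PHP snippet above
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !^/subdomain\.php$
RewriteRule ^ subdomain.php [L]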
You can use the Location directive to match the script you want to target and then use Allow from to limit access to certain domains:
<Location /script.php>
Order Deny,Allow
Deny from all
Allow from anything.domain.co.uk
</Location>
I have a page running on phpBB where I want to disable registrations from certain countries. I've ended up with this:
<Files "ucp.php">
Order Allow,Deny
Allow from all
SetEnvIf GEOIP_COUNTRY_CODE {country} BlockCountry
Deny from env=BlockCountry
</Files>
As you can see, I'm using GeoIP to detect the country. The problem is that this code also stops already-registered users in those countries from logging in, whereas I only want to block the registration part, which is ucp.php?mode=register.
Using ucp.php?mode=register there, however, doesn't work even with backslashes, so I don't know how to make it work.
Thanks for your help.
You could do something like this in your .htaccess
RewriteEngine on
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^(CA|US|MX)$
RewriteCond %{QUERY_STRING} ^(.*)mode=register(.*)$ [NC]
RewriteRule ^ucp.php$ deny_page_for_other_countries.php [L]
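If a plain 403 is enough rather than a separate page, the same conditions can simply end in the F flag (the country codes are just the example set from above):
RewriteEngine On
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^(CA|US|MX)$
RewriteCond %{QUERY_STRING} (^|&)mode=register(&|$) [NC]
RewriteRule ^ucp\.php$ - [F]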
I am sorry, I have tried for hours to get this working, but I haven't made progress...
I want to make it so that if a user on my site types user.site.com they will be taken to site.com/user, but the URL will still show user.site.com. How can I do this? With .htaccess? Server files?
Almost there, Ken.
RewriteEngine On
RewriteCond %{HTTP_HOST} !www.site.com$ [NC]
RewriteCond %{HTTP_HOST} ^([a-z0-9-_]+).site.com [NC]
RewriteRule (.*) %1/$1 [QSA,L]
%1 = what's before .site.com
$1 = what you got after the /
If you have test.site.com/foo.php, you would get /test/foo.php.
If you just want test, just forget about the $1.
QSA = query string append,
L = Last.
You should read the mod_rewrite URL in phihag's post.
Use mod_rewrite:
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !www.site.com$ [NC]
RewriteCond %{HTTP_HOST} ^([a-z0-9-_]+).site.com [NC]
RewriteRule (.*) %1/$1 [QSA,L]
If you want to link to resources, either use full (http://site.com/user/static/x.css) or relative (static/x.css) URLs. Absolute URLs (/user/static/x.css) will need to be crafted differently when this rule is in effect.
To keep the original address in the address bar you will need a reverse proxy rather than a redirect. Redirecting tells the browser to send a second request to the server with a different address; a reverse proxy tells the server to fetch another page and serve it without telling the browser (which is what you want, I believe). A reverse proxy is achieved with the [P] flag in mod_rewrite.
Make sure mod_rewrite, mod_proxy and mod_proxy_http are loaded and put the directives
<Proxy *>
Order deny,allow
Allow from all
</Proxy>
RewriteEngine on
RewriteRule ^/(.*) http://site.com/user/$1 [P,L]
into your virtual host configuration for user.site.com, or into .htaccess if you do not have root privileges. This will proxy all pages from the subdomain to the main domain's folder. If you only want to proxy the index page, use RewriteRule ^/ http://site.com/user [P] instead.
I assume you are using http and not https; if not, it gets a little more complex...