I have made a page that I want to protect with an IP address block.
Everybody who visits example.com should be redirected to example.com/block.php, unless their IP address is known to me. I want to be able to add a lot of IP addresses to a list somewhere. After I add someone's IP address to the list, when they visit example.com again, they should be redirected to example.com/index.php.
Is this possible? I have seen a lot of questions that look the same, but nothing that really answers my question.
There are several ways to do this. Since you have the htaccess tag on your question, I assume you're looking for an .htaccess file solution.
If you're not using Apache 2.4 (or higher), you can use mod_authz_host to handle all of the blocking. Something like this:
# this makes it so blocked IPs get shown the block.php page
ErrorDocument 403 /block.php
Order Deny,Allow
Deny from all
Allow from 1.2.3.4
Allow from 1.2.3.5
Allow from 5.6.7.8
# etc.
Here, you block everything and keep a list of "Allow from" lines. You can shorten them if you want to allow an entire subnet, e.g.
Allow from 1.2.3
will allow an IP starting with 1.2.3.
You can also use mod_rewrite for this:
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.4$
RewriteCond %{REMOTE_ADDR} !^1\.2\.3\.5$
RewriteCond %{REMOTE_ADDR} !^5\.6\.7\.8$
RewriteCond %{REQUEST_URI} !^/block\.php$
RewriteRule ^ /block.php [L]
Here, you have a bit more flexibility since you can use a regular expression to match the IP. So you can do stuff like:
RewriteCond %{REMOTE_ADDR} !^1\.2\.[3-6]\.
So this allows all IPs starting with: 1.2.3, 1.2.4, 1.2.5, and 1.2.6.
You can also put these directives into the server/vhost config file; they'll work a bit faster there than in the .htaccess file.
How about this?
<?php
$allowed = array(
    '192.0.2.10',    // replace these placeholder addresses with your own list
    '192.0.2.11',
    '198.51.100.7',
);
if (!in_array($_SERVER['REMOTE_ADDR'], $allowed)) {
    header("Location: /block.php");
    exit();
}
// continue with code ...
?>
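Since the question asks for a list you can keep adding to, here's a minimal sketch that loads the whitelist from a plain text file instead of hard-coding it; the allowed_ips.txt filename and its location next to index.php are assumptions, not anything PHP requires:
<?php
// allowed_ips.txt is a hypothetical file: one IP per line, '#' starts a comment line.
$allowed = array();
if (is_readable(__DIR__ . '/allowed_ips.txt')) {
    foreach (file(__DIR__ . '/allowed_ips.txt') as $line) {
        $line = trim($line);
        if ($line !== '' && $line[0] !== '#') {
            $allowed[] = $line;
        }
    }
}
if (!in_array($_SERVER['REMOTE_ADDR'], $allowed)) {
    header("Location: /block.php");
    exit();
}
// continue with your normal index.php code ...
?>
That way, adding someone just means appending a line to the text file.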
Hope it works!
Related
I have a Windows client application that consumes a PHP page hosted on a shared commercial web server.
In this PHP page I am returning encrypted JSON. The page also has a piece of code to keep track of which IPs are visiting it, and I have noticed that there is a spyder/Nutch-2 crawler visiting this page.
I am wondering how it is possible that a crawler could find a page that is not published in any search engine. Is there a way to block crawlers from visiting this specific page?
Should I use a .htaccess file to configure it?
You can forbid specific crawlers by doing the following:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (spyder/Nutch-2) [NC]
# To block multiple agents at once:
#RewriteCond %{HTTP_USER_AGENT} (spyder/Nutch-2|baidu|google|...) [NC]
RewriteRule .* - [R=403,L]
That crawler can change its agent name, so this may not be a complete solution. If needed, you can block the crawler by its IP address instead:
Order Deny,Allow
Deny from x.x.x.x
However, that bot can also change its IP address. This means you need to keep an eye on your access logs, decide which agents should be blocked, and add them to the list manually.
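If it helps with that manual tracking, here is a rough PHP sketch of a request logger you could drop at the top of the page; the visitors.log path is just an assumption, and the file should live outside the web root:
<?php
// Append one line per request so suspicious agents and IPs can be reviewed later.
$entry = sprintf(
    "%s\t%s\t%s\n",
    date('c'),
    $_SERVER['REMOTE_ADDR'],
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-'
);
// /path/to/visitors.log is a hypothetical location; keep it outside the document root.
file_put_contents('/path/to/visitors.log', $entry, FILE_APPEND | LOCK_EX);
?>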
You can indeed use a .htaccess file. robots.txt is another option, but some crawlers will ignore it.
You can also block specific user agent strings. (They differ from crawler to crawler)
robots.txt:
User-agent: *
Disallow: /
This example tells all robots to stay out of the website.
You can also block specific directories:
Disallow: /demo/
More information about robots.txt
You can ban a particular IP address with the .htaccess file:
Order Deny,Allow
Deny from xxx.xx.xx.xx
where xxx.xx.xx.xx represents the IP address you want to block
Close. It would be better to use a robots.txt file. The page linked goes through why you would want to set one up and how to do so. In summary:
It avoids wasting server resources as the spiders and bots run the scripts on the page.
It can save bandwidth.
It removes clutter from the webstats.
You can fine-tune it to exclude only certain robots.
One caveat I should mention: some spiders are coded to disregard the robots.txt file and will even examine it to see what you don't want them to visit. However, spiders from legitimate sources will obey the robots.txt directives.
You could use .htaccess, or another option would be to use PHP code. At the top of the PHP code, simply put something like this:
if (strpos($_SERVER['HTTP_USER_AGENT'], 'spyder/Nutch-2') !== false) {
    die();
}
// rest of code here
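If you'd rather send the bot an explicit 403 instead of an empty page, a small variation on the same check (still only matching the literal spyder/Nutch-2 string):
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (strpos($agent, 'spyder/Nutch-2') !== false) {
    header('HTTP/1.1 403 Forbidden'); // tell the bot it is blocked
    exit();
}
// rest of code here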
Working on a dev site for a client, we want to deny all public access to it, but allow easy whitelisting for when we're out of the office for meetings or working from home (dynamic IP).
What we want to happen is to have a form that writes your IP address to the .htaccess file, along with a comment above it stating who this is or who authorized it, etc.
Without going into a bunch of details, a simple password won't work in our case; having people monitor email accounts for requests, having clients obtain their own IP addresses, things like this just won't fly.
What would be nice is allowing these added IP addresses in .htaccess to expire. I figure complicated logic like that won't fly in .htaccess itself, so it would need to be managed by third-party software, unless anyone has any other ideas?
I recommend using Apache's RewriteMap directive. Please note that RewriteMap has to be placed in httpd.conf (or a vhost config), NOT in the .htaccess file. You can use it in several simple ways.
Plain text file
The plain text version allows you to have a .txt file that holds the IP addresses. I added a line for a comment. This approach doesn't allow automatic expiration.
httpd.conf
RewriteEngine on
RewriteMap ipmap txt:/path/to/whitelist.txt
RewriteCond ${ipmap:%{REMOTE_ADDR}} !^allow$ [NC]
RewriteRule .* - [F,L]
whitelist.txt
# Chris London added this 2013/06/14
127.0.0.1 allow
123.45.67.89 allow # Some other comment
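Since the question mentions a form that writes the visitor's IP plus an authorizing comment, here is a hedged PHP sketch of what that form handler might append to whitelist.txt; the name field and the file path are assumptions, and the web server user needs write access to the file:
<?php
// Hypothetical handler for the whitelist form; the path must match the RewriteMap above.
$whitelist = '/path/to/whitelist.txt';
$who = isset($_POST['name']) ? preg_replace('/[^\w .\-]/', '', $_POST['name']) : 'unknown';
$ip  = $_SERVER['REMOTE_ADDR'];

// One comment line naming who authorized the entry, then the map entry itself.
$entry  = '# ' . $who . ' added this ' . date('Y/m/d') . "\n";
$entry .= $ip . " allow\n";
file_put_contents($whitelist, $entry, FILE_APPEND | LOCK_EX);
?>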
Custom Program
With RewriteMap you can actually have it run an external program, but this one comes with some caveats. I personally haven't used this method, especially with a PHP script. To make it work with a PHP script, I believe it has to run indefinitely, reading from stdin and writing to stdout.
RewriteEngine on
RewriteLock /path/to/rewrite.lock
RewriteMap ipmap prg:/path/to/executable.php
RewriteCond ${ipmap:%{REMOTE_ADDR}} !^allow$ [NC]
RewriteRule .* - [F,L]
executable.php
#!/usr/bin/php
<?php
$in  = fopen('php://stdin', 'r');
$out = fopen('php://stdout', 'w');
while ($ip = fgets($in)) {
    $ip = trim($ip); // strip the trailing newline before comparing
    // TODO add better logic
    if ($ip == '127.0.0.1') {
        fwrite($out, "allow\n");
    } else {
        fwrite($out, "deny\n");
    }
    fflush($out); // avoid buffering so httpd gets an immediate answer
}
fclose($in);
fclose($out);
Keep your rewrite map program as simple as possible. If the program hangs, it will cause httpd to wait indefinitely for a response from the map, which will, in turn, cause httpd to stop responding to requests.
Be sure to turn off buffering in your program. Buffered I/O will cause httpd to wait for the output, and so it will hang.
Remember that there is only one copy of the program, started at server startup. All requests will need to go through this one bottleneck. This can cause significant slowdowns if many requests must go through this process, or if the script itself is very slow.
DB Query
I also haven't used this one yet, but it looks pretty neat. mod_dbd will need to be configured to point at the right database for this to work. You have a SQL statement that fetches the IP addresses, and you can add a filter for the expiration date.
RewriteEngine on
RewriteMap ipmap "dbd:SELECT ipaddress FROM rewrite WHERE expiration < TIME() and ipaddress = %s"
RewriteCond ${ipmap:%{REMOTE_ADDR}} !^%{REMOTE_ADDR}$ [NC]
RewriteRule .* - [F,L]
There are a couple of other types out there, but these seem to be the best fit for you. Like I said before, I haven't used the Custom Program or the DB Query approaches, so I may have said something wrong. Hopefully another user on here will catch my mistakes so these will all work for you.
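As a companion to the DB map, here is a rough sketch of a form handler that inserts the visitor's IP with an expiration date; the table and column names follow the SQL above, but the PDO DSN, the credentials, and the 7-day window are assumptions:
<?php
// Assumed PDO DSN and credentials; the 'rewrite' table matches the RewriteMap query above.
$pdo = new PDO('mysql:host=localhost;dbname=devsite', 'dbuser', 'dbpass');
$stmt = $pdo->prepare('INSERT INTO rewrite (ipaddress, expiration) VALUES (:ip, :exp)');
$stmt->execute(array(
    ':ip'  => $_SERVER['REMOTE_ADDR'],
    ':exp' => date('Y-m-d H:i:s', strtotime('+7 days')), // entry expires after a week
));
?>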
What I'm trying to achieve is to use htaccess to only allow requests coming in from the same server, but do so by using available variables and not specify the IP.
The goal is to be able to run cron jobs and ajax requests to the files in the respective folder, but return a 404 page if tried to access directly.
Here's what I have so far:
Options -MultiViews +FollowSymLinks
RewriteEngine On
RewriteCond %{REMOTE_ADDR} !%{SERVER_ADDR} [NC]
RewriteRule ^(.*)$ /error404.html [L,R=404]
This works fine for AJAX. It works for cron jobs too if the server happens to use the same outgoing IP, but if the server's outgoing IP is different from the IP of the site, it will obviously fail and return 404 because %{REMOTE_ADDR} is different from %{SERVER_ADDR}.
One solution would be to see what the outgoing IP is for that server and add it as another exception. However, I'm looking for a more reusable solution. I tried using regex to match only the first part of the IPs, but I'm having no luck with that and don't really know how to go about it. Basically, with regex, what I'm trying to achieve is this:
let's assume:
%{REMOTE_ADDR} = 192.322.122.50
%{SERVER_ADDR} = 192.322.122.1
These are the two variables I need to find a valid comparison expression for. The expression should return true if the first part of the IPs is identical.
Another way would be to specify the range that is allowed, but I don't "know" what the wanted range is. I know it's the first part of the SERVER_ADDR variable, but I don't know how to tell the server what I mean :D
Hopefully I wasn't too confusing. Ultimately, what I'm looking for is a way to determine whether a request is coming from the same server as the site it targets, and it has to be achieved through the .htaccess file. Why? Because the protected folder also contains files other than PHP scripts, so the alternative would be to serve all of those dynamically and use PHP for all the conditions. A plain .htaccess rule would be much more elegant. I just hope there is a way to do this.
This isn't tested, but you may give it a try if you want:
# note that this only works if both server and visitor are using IPv4 addresses
RewriteCond %{SERVER_ADDR} ^([0-9]{1,3})\.([0-9]{1,3})\.([0-9]{1,3})\.([0-9]{1,3})$
RewriteCond %{REMOTE_ADDR} !^%1\.%2\.%3\.([0-9]{1,3})$
RewriteRule ^.*$ error404.html [R=404,L]
Let me know if this kind of stuff works, but don't shoot me if it doesn't :)
Here's the deal: I've got multiple domains, and I would like one of the domains to point to the regular base of the site, but the other domains to point to a subsection of the site without modifying the URL in the address bar. The idea is that each of the extra domains is branded for its section.
Example:
www.domain.com Top level of site
www.domain2.com points to www.domain.com/abc
www.domain3.com points to www.domain.com/def
etc.
Also note that in the example 'abc' and 'def' are not real directories on the filesystem, but are virtual areas defined in a database.
I've spent quite a bit of time looking around and couldn't find anything that fits this. I do not want to use domain cloaking because it uses frames. Regular redirects obviously are easy to point to the right thing, but they change the url in the address bar.
Is there any way to accomplish this?
Edit:
I've added the alias as Mark suggested below, but can anyone shed light on how I could then use mod_rewrite to make this work?
First, your DNS records need to point to the IP of the web server.
Second, you have to set up Apache to serve multiple domains from the same VirtualHost. Use the ServerAlias directive in your VirtualHost definition. For example:
ServerName www.domain.com
ServerAlias domain.com www.domain2.com domain2.com www.domain3.com domain3.com
If you've done that correctly, you should see the exact same content at each domain, without any redirection.
What you do next is an architectural decision. You can use mod_rewrite to hard-code URL routing rules based on domain, or you can use PHP to do the routing, based on the value of $_SERVER['HTTP_HOST'].
A mod_rewrite solution might involve adding something like this to your .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} domain2\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/abc/
RewriteRule ^(.*)$ /abc/$1 [L]
RewriteCond %{HTTP_HOST} domain3\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/def/
RewriteRule ^(.*)$ /def/$1 [L]
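If you go the PHP routing route instead, here's a minimal sketch of host-based dispatch; the host-to-section map mirrors the example above, and how you render a section from your database is left as a placeholder:
<?php
// Map each extra domain to its virtual section; unknown hosts fall back to the top level.
$host = strtolower($_SERVER['HTTP_HOST']);
$sections = array(
    'www.domain2.com' => 'abc',
    'domain2.com'     => 'abc',
    'www.domain3.com' => 'def',
    'domain3.com'     => 'def',
);
$section = isset($sections[$host]) ? $sections[$host] : '';
// From here, look $section up in your database and render the branded content;
// an empty $section means the request came in on the main domain.
?>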
This is a high-level overview. Please comment on my answer if you need details on a particular part of the process.
In my project I have to make a subdomain, i.e.
if the user name is XXX when they register, a subdomain will be created, like XXX.example.com.
How do I do it?
I will use PHP for scripting.
I found a script that seems to do exactly that: create a subdomain on your server on demand.
It probably needs a little bit of tweaking to work on your particular control panel, but the reviews are quite positive as far as I can tell.
Link
Have you considered using htaccess and url rewriting?
Found this code that may help you:
# Rewrite <subdomain>.example.com/<path> to example.com/<subdomain>/<path>
#
# Skip rewrite if no hostname or if subdomain is www
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www\. [NC]
# Extract (required) subdomain (%1), and first path element (%3), discard port number if present (%2)
RewriteCond %{HTTP_HOST}<>%{REQUEST_URI} ^([^.]+)\.example\.com(:80)?<>/([^/]*) [NC]
# Rewrite only when subdomain not equal to first path element (prevents mod_rewrite recursion)
RewriteCond %1<>%3 !^(.*)<>\1$ [NC]
# Rewrite to /subdomain/path
RewriteRule ^(.*) /%1/$1 [L]
Source (Post #6)
This might be a little more complex than you think.
I suggest doing some reading on mod_rewrite and .htaccess.
You could start here:
htaccess Tutorial
Modrewrite tutorial
Subdomain Modrewrite Example
EDIT: Or just go with one of the nice examples provided by my fellow SO users. ;)
As long as this is for non-SSL sites, then by far the easiest way is not to bother - just use a wildcard DNS domain and vhost, then map any domain-specific behaviour in your PHP code. If you need SSL sites, then it's a lot more complicated - you need to have a separate IP address/port for each certificate - and wildcard certs can be very expensive.
If you're wanting to set up some sort of hosting package then it's a bit more involved - how you go about this depends on what web server and DNS server you are using.
Assuming (again, no SSL) Apache on Unix/POSIX/Linux with BIND, then, again, I'd go with a wildcard DNS entry, then:
1) create a base dir for the website, optionally populate this with a default set of files
2) add a vhost definition in its own file in /etc/httpd/conf.d named as XXX.conf
3) send a kill -HUP to the HTTPD process (causes it to read the new config files without having to do a full restart).
One thing to note is that you really shouldn't allow the httpd process direct write access to its own config files - and you definitely don't want to give it root privileges. A safer solution would be to create a CLI script that performs this using the username as an argument, then make it setuid and invoke it from the script run by the httpd process.
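Purely to illustrate that CLI-script idea, here is a hedged sketch; the directory layout, the conf.d path, and the apachectl call are assumptions about a typical Apache/Linux setup, not a drop-in tool:
#!/usr/bin/php
<?php
// Usage: create_vhost.php <username> -- meant to be run as a privileged CLI script.
if ($argc < 2 || !preg_match('/^[a-z0-9]+$/', $argv[1])) {
    fwrite(STDERR, "usage: create_vhost.php <username>\n");
    exit(1);
}
$user    = $argv[1];
$docroot = "/var/www/vhosts/$user";          // assumed directory layout
$conf    = "/etc/httpd/conf.d/$user.conf";   // assumed conf.d location

if (!is_dir($docroot)) {
    mkdir($docroot, 0755, true);             // optionally copy a default set of files here
}
$vhost = "<VirtualHost *:80>\n"
       . "    ServerName $user.example.com\n"
       . "    DocumentRoot $docroot\n"
       . "</VirtualHost>\n";
file_put_contents($conf, $vhost);

exec('apachectl graceful');                  // have httpd re-read its config files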
C.
The best way is to use a wildcard in your DNS server:
www.example.com. IN A 1.2.3.4
*.example.com. IN A 1.2.3.4
This way, no subdomain has to be created: they all point to the same IP by default.
In your PHP code, you just have to get $_SERVER["HTTP_HOST"] and take the first part:
$hostParts = explode('.', $_SERVER["HTTP_HOST"]);
$user = $hostParts[0];
First, you need to make sure you have a wildcard domain set up in DNS, and make sure your web server (Apache?) directs all queries for that wildcard domain to your PHP file.
Then in PHP you can look at $_SERVER["HTTP_HOST"] to see which subdomain was used for that particular request.
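For example, a small sketch of that check; the handling of www, the assumption that the base domain has two labels, and the allowed username pattern are guesses about your registration rules:
<?php
// Take the left-most label of the Host header as the candidate username.
$parts = explode('.', strtolower($_SERVER['HTTP_HOST']));
$sub   = $parts[0];

if ($sub === 'www' || count($parts) < 3) {
    $user = null;      // no user subdomain: show the normal site
} elseif (preg_match('/^[a-z0-9_-]+$/', $sub)) {
    $user = $sub;      // still look this up in your database before trusting it
} else {
    $user = null;
}
?>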
Since you will make subdomains when a user registers, try this as your .htaccess file:
Options +FollowSymlinks
RewriteEngine on
RewriteRule ^\.htaccess$ - [F]
RewriteCond %{HTTP_HOST} !^www\.domain\.com [NC]
RewriteCond %{HTTP_HOST} ^([^.]+)\.domain\.com [NC]
RewriteRule ^(.*)$ /$1/%1 [L]
Make a function in a controller which will take the value of the subdomain and display whatever is necessary, like:
public function show($domain)
{
    // Your code goes here
}
When a user requests xxx.domain.com/controller/show, it will be rewritten to domain.com/controller/show/xxx. If you want xxx.domain.com itself to map to domain.com/controller/show/xxx, just edit the .htaccess file accordingly.