I tried to make an IP-based deny list which stores IP addresses in MySQL. Basically, I try to do a header redirect if the user's IP is in the array, but it won't work. What is wrong?
$ip_array = array();
$ip_ban_query = mysql_query("SELECT ip FROM banned_ips");
while ($deny = mysql_fetch_assoc($ip_ban_query)) {
    $add_ip = $deny['ip'];
    $ip_array[] = $add_ip;
}
if (in_array($_SERVER['REMOTE_ADDR'], $ip_array)) {
    header("Location: http://www.google.com/");
    exit();
}
We can greatly simplify your code here; reducing complexity almost always flushes out bugs. :)
// There are several different methods to accomplish this, and you really
// should be using prepared statements here, but both are out of scope of this question.
$ipBanQuery = sprintf("SELECT ip FROM banned_ips WHERE ip = '%s'", mysql_real_escape_string($_SERVER['REMOTE_ADDR']));
$result = mysql_query($ipBanQuery);
if (mysql_num_rows($result)) {
    header('Location: http://www.google.com/');
    exit();
}
It also depends on where you're calling this code. Be extra-sure that this is being called before any output to the browser - stray spaces, HTML, or other debugging info will prevent any additional headers from being sent. Check your webserver's error log to see if there's something wonky going on.
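For what it's worth, the prepared-statement version hinted at in the comment might look roughly like this with mysqli (a sketch, not a drop-in; it assumes an open $mysqli connection and the same banned_ips table):

// Sketch only: assumes an open mysqli connection in $mysqli.
$stmt = $mysqli->prepare('SELECT ip FROM banned_ips WHERE ip = ?');
$stmt->bind_param('s', $_SERVER['REMOTE_ADDR']);
$stmt->execute();
$stmt->store_result();
if ($stmt->num_rows > 0) {
    header('Location: http://www.google.com/');
    exit();
}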
I have written a script to redirect the users who visit my website.
http://localhost/ghi/red.php?go=http://www.google.com
When there's a URL like the one above, my script grabs the go variable's value and checks whether it's in my database table as a trusted site; if so, it redirects to the site. The redirection should also take place for URLs under a trusted site:
as an example, even if the "go" variable has a value like www.google.com/images, the redirection should take place if www.google.com is in the trusted sites table.
I do that by using PHP's strrpos() function (its indexOf equivalent), as below:
$pos = strrpos($trusted_sites, $go_value);
This works fine, but there is a problem that I accidentally came across:
even if the go variable has a value like www.google.comqwsdad, it still redirects the user to www.google.com.
This is a serious bug; any help on how to avoid redirecting to wrong URLs would be highly appreciated.
If you want to restrict such redirects to a whitelist of sites, first build the whitelist in an array. Then you can compare the host from $_GET['go'] against it using in_array(). Consider this example:
// sample: http://localhost/ghi/red.php?go=http://www.google.com/images
if (isset($_GET['go'])) {
    $go = $_GET['go'];
    $url = parse_url($go);
    $go = $url['host'];
    $scheme = $url['scheme'];
    $certified_sites = array('www.imdb.com', 'www.tomshardware.com', 'www.stackoverflow.com', 'www.tizag.com', 'www.google.com');
    if (in_array($go, $certified_sites)) {
        header("Location: $scheme://$go");
        exit;
    } else {
        // not a trusted site, so do not redirect
    }
}
The "correct" way is to us an array of sites (or even a database), then use in_array.
<?php
$trusted_sites = array("http://www.google.com", "http://www.yahoo.com");
if (in_array("http://www.google.com", $trusted_sites)) {
    print "Ok\n";
} else {
    print "Bad site\n";
}
A quick way of cheating, which I use from time to time, is to make sure you have a separator (e.g. a space) as the first and last character of your $trusted_sites, then add the separator to the beginning and end of your $go_value.
<?php
$trusted_sites = "http://www.google.com http://www.yahoo.com";
$go = "http://www.google.com";
if (strpos(" $trusted_sites ", " $go ") === false) {
    print "Bad site\n";
} else {
    print "Ok\n";
}
In this example, I've added the separator (a space) to the beginning and end of both variables, inside the strpos(); in the case of $trusted_sites, I could have put them in the initial declaration instead.
I have never coded PHP before and I really need this very simple script, so let me explain what I need.
A user comes to my website via an affiliate link, and when the redirects finish
the URL will look like this:
http://website.com/lp/index.html?sub=1&customer_id=1039be6e23b4420c3e1063dc44a04d
Now I have a download link on my website.
On download click:
check the database for a duplicate IP address; if it's not a duplicate,
capture sub="" and customer_id="" from the address bar,
save them to the database along with the IP address (this is for tracking),
and redirect immediately to the download link.
If the IP is not a duplicate:
http://dl.website.com/download/downloadpop.aspx?id={Customer_id}
If it's a duplicate:
http://dl.website.com/download/downloadpop.aspx?id=beenbefore
Thank you so much!
You probably want header() for the redirect:
http://php.net/manual/en/function.header.php
And mysql_connect, mysql_query, etc. for DB stuff:
http://php.net/manual/en/book.mysql.php
You can extract GET params from $_GET:
http://php.net/manual/en/reserved.variables.get.php
Note that any call to header() must take place before other output (see the example on the page linked.)
This is rather broad, and impossible to answer properly without knowing any details about your database structure, but here's how the basics of this would work:
<?php
$sub = $_GET['sub'];
$customer_id = $_GET['customer_id'];
$ip = $_SERVER['REMOTE_ADDR'];

$db = mysql_connect(...) or die(mysql_error());

$quoted_sub = mysql_real_escape_string($sub);
$quoted_customer_id = mysql_real_escape_string($customer_id);
$quoted_ip = mysql_real_escape_string($ip);

$sql = "SELECT count(*) AS cnt FROM yourtable WHERE ip_address = '$quoted_ip'";
$result = mysql_query($sql) or die(mysql_error());
$row = mysql_fetch_assoc($result);
if ($row['cnt'] == 0) {
    $enc = urlencode($customer_id);
    /* ... IP isn't in the database, so do the insert stuff ... */
    header("Location: http://dl.website.com/download/downloadpop.aspx?id=$enc");
} else {
    header("Location: http://dl.website.com/download/downloadpop.aspx?id=beenbefore");
}
exit();
First off, you can access these variables via $_GET.
Next, you INSERT them into a database using PDO.
Finally, you can redirect someone with the appropriate header():
header('Location: http://dl.website.com/download/downloadpop.aspx?id=beenbefore');
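A rough sketch of those steps together with PDO (the table and column names here are made up; adjust them to your schema, and the connection details are assumptions):

<?php
// Hypothetical table/column names; connection details are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=tracking', 'user', 'pass');

$stmt = $pdo->prepare('INSERT INTO visits (sub, customer_id, ip_address) VALUES (?, ?, ?)');
$stmt->execute(array($_GET['sub'], $_GET['customer_id'], $_SERVER['REMOTE_ADDR']));

header('Location: http://dl.website.com/download/downloadpop.aspx?id=' . urlencode($_GET['customer_id']));
exit;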
Hi, I just want your opinions about this code I found on a website for telling real search spiders from spammers. Is it good? And do you have any recommendations for other scripts or methods for this subject?
<?php
$ua = $_SERVER['HTTP_USER_AGENT'];
// NOTE: each hostname pattern must line up with the spider at the same index.
$spiders = array('googlebot', 'msnbot', 'yahoo');
$pattern = array("/\.google\.com$/", "/search\.live\.com$/", "/\.yahoo\.com$/");
for ($i = 0; $i < count($spiders) and $i < count($pattern); $i++) {
    if (stristr($ua, $spiders[$i])) {
        // The UA claims to be this search engine's bot; verify the claim.
        $ip = $_SERVER['REMOTE_ADDR'];
        $hostname = gethostbyaddr($ip);
        if (!preg_match($pattern[$i], $hostname)) {
            // The hostname does not belong to the claimed search engine,
            // but the UA already said it was that engine's bot.
            // So it's a spammer.
            echo "spammer";
            exit;
        } else {
            // Now we have a hit that half-passes the check. One last go:
            $real_ip = gethostbyname($hostname);
            if ($ip != $real_ip) {
                // spammer!
                echo "Please leave now, spammer";
                break;
            } else {
                // real bot
            }
        }
    } else {
        echo "hello user";
    }
}
Note: I tested it with a user agent switcher and it worked perfectly, but I'm not sure if it will work in the real world. So what do you think?
What would keep a spammer from simply giving an entirely correct user agent string?
I think this is fairly pointless. You would have to at least compare IP ranges (or their name servers) as well in order to get reliable results. This is possible for Google:
Google Webmaster Central: How to verify Googlebot
but even if you test for Google and Bing this way, a spambot can enter your site simply by sending a browser user-agent. Therefore, it is ultimately impossible to detect a spam-bot. They are a reality, and there is no good way to keep them out of a web site.
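For reference, the verification Google describes is a forward-confirmed reverse DNS check; a rough sketch (the helper name here is made up):

// Sketch: only trust a "Googlebot" UA if reverse DNS points into
// googlebot.com/google.com AND the forward lookup maps back to the IP.
function looksLikeRealGooglebot($ip) {
    $host = gethostbyaddr($ip);
    if (!preg_match('/\.(googlebot|google)\.com$/', $host)) {
        return false; // hostname not under Google's domains
    }
    return gethostbyname($host) === $ip; // forward-confirm
}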
You can also use .htaccess so that things like this are prevented, as in this tutorial:
http://perishablepress.com/press/2007/06/28/ultimate-htaccess-blacklist/
I have a client whose domain seems to be getting hit pretty hard by what appears to be a DDoS. In the logs it's normal-looking user agents with random IPs, but they're flipping through pages too fast to be human. They also don't appear to be requesting any images. I can't seem to find any pattern, and my suspicion is it's a fleet of Windows zombies.
The client has had issues in the past with spam attacks; they even had to point MX at Postini to get the 6.7 GB/day of junk to stop server-side.
I want to set up a bot trap in a directory disallowed by robots.txt... I've just never attempted anything like this before, and I'm hoping someone out there has creative ideas for trapping bots!
EDIT: I already have plenty of ideas for catching one... it's what to do with it once it lands in the trap.
You can set up a PHP script whose URL is explicitly forbidden by robots.txt. In that script, you can pull the source IP of the suspected bot hitting you (via $_SERVER['REMOTE_ADDR']), and then add that IP to a database blacklist table.
Then, in your main app, you can check the source IP, do a lookup for that IP in your blacklist table, and if you find it, throw a 403 page instead. (Perhaps with a message like, "We've detected abuse coming from your IP, if you feel this is in error, contact us at ...")
On the upside, you get automatic blacklisting of bad bots. On the downside, it's not terribly efficient, and it can be dangerous. (One person innocently checking that page out of curiosity can result in the ban of a large swath of users.)
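Roughly, the trap plus the check could look like this (a sketch using PDO; the connection details and the blacklist table are assumptions, not a finished implementation):

<?php
// trap.php - the URL of this script is disallowed in robots.txt.
// Hypothetical connection details and table name.
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$stmt = $pdo->prepare('INSERT IGNORE INTO blacklist (ip) VALUES (?)');
$stmt->execute(array($_SERVER['REMOTE_ADDR'])); // INSERT IGNORE assumes a unique index on ip

// Then near the top of every normal page:
$stmt = $pdo->prepare('SELECT 1 FROM blacklist WHERE ip = ?');
$stmt->execute(array($_SERVER['REMOTE_ADDR']));
if ($stmt->fetchColumn()) {
    header('HTTP/1.0 403 Forbidden');
    exit("We've detected abuse coming from your IP. If you feel this is in error, contact us.");
}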
Edit: Alternatively (or additionally, I suppose) you can fairly simply add a GeoIP check to your app, and reject hits based on country of origin.
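If you go the GeoIP route, a sketch using the PECL geoip extension (assuming it's installed; the country codes below are placeholders):

// Requires the PECL geoip extension.
$country = geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);
$blocked = array('XX', 'YY'); // placeholder ISO country codes
if ($country !== false && in_array($country, $blocked)) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}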
What you can do is get another box (a kind of sacrificial lamb) that is not on the same pipe as your main host, then have it host a page which redirects to itself (but with a randomized page name in the URL). This could get the bot stuck in an infinite loop, tying up the CPU and bandwidth on your sacrificial lamb but not on your main box.
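A sketch of such a self-redirecting page (it assumes a rewrite rule that routes /trap/*.html back to this script):

<?php
// Sketch: every hit bounces to a fresh randomized URL served by this
// same script, so a naive bot chases redirects forever.
$random = md5(uniqid(mt_rand(), true));
header("Location: /trap/" . $random . ".html");
exit;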
I tend to think this is a problem better solved with network security more so than coding, but I see the logic in your approach/question.
There are a number of questions and discussions about this on Server Fault which may be worth investigating.
https://serverfault.com/search?q=block+bots
Well, I must say I'm kinda disappointed; I was hoping for some creative ideas. I did find the ideal solution here: http://www.kloth.net/internet/bottrap.php
<html>
<head><title> </title></head>
<body>
<p>There is nothing here to see. So what are you doing here ?</p>
<p>Go home.</p>
<?php
/* whitelist: end processing and exit */
if (preg_match("/10\.22\.33\.44/", $_SERVER['REMOTE_ADDR'])) { exit; }
if (preg_match("/Super Tool/", $_SERVER['HTTP_USER_AGENT'])) { exit; }
/* end of whitelist */
$badbot = 0;
/* scan the blacklist.dat file for addresses of SPAM robots
   to prevent filling it up with duplicates */
$filename = "../blacklist.dat";
$fp = fopen($filename, "r") or die("Error opening file ... <br>\n");
while ($line = fgets($fp, 255)) {
    $u = explode(" ", $line);
    $u0 = $u[0];
    if (preg_match("/$u0/", $_SERVER['REMOTE_ADDR'])) { $badbot++; }
}
fclose($fp);
if ($badbot == 0) { /* we just saw a new bad bot not yet listed! */
    /* send a mail to the hostmaster */
    $tmestamp = time();
    $datum = date("Y-m-d (D) H:i:s", $tmestamp);
    $from = "badbot-watch@domain.tld";
    $to = "hostmaster@domain.tld";
    $subject = "domain-tld alert: bad robot";
    $msg = "A bad robot hit {$_SERVER['REQUEST_URI']} $datum\n";
    $msg .= "address is {$_SERVER['REMOTE_ADDR']}, agent is {$_SERVER['HTTP_USER_AGENT']}\n";
    mail($to, $subject, $msg, "From: $from");
    /* append bad bot address data to the blacklist log file: */
    $fp = fopen($filename, 'a+');
    fwrite($fp, "{$_SERVER['REMOTE_ADDR']} - - [$datum] \"{$_SERVER['REQUEST_METHOD']} {$_SERVER['REQUEST_URI']} {$_SERVER['SERVER_PROTOCOL']}\" {$_SERVER['HTTP_REFERER']} {$_SERVER['HTTP_USER_AGENT']}\n");
    fclose($fp);
}
?>
</body>
</html>
Then to protect pages, throw <?php include($_SERVER['DOCUMENT_ROOT'] . "/blacklist.php"); ?> on the first line of every page. blacklist.php contains:
<?php
$badbot = 0;
/* look for the IP address in the blacklist file */
$filename = "../blacklist.dat";
$fp = fopen($filename, "r") or die("Error opening file ... <br>\n");
while ($line = fgets($fp, 255)) {
    $u = explode(" ", $line);
    $u0 = $u[0];
    if (preg_match("/$u0/", $_SERVER['REMOTE_ADDR'])) { $badbot++; }
}
fclose($fp);
if ($badbot > 0) { /* this is a bad bot, reject it */
    sleep(12);
    print ("<html><head>\n");
    print ("<title>Site unavailable, sorry</title>\n");
    print ("</head><body>\n");
    print ("<center><h1>Welcome ...</h1></center>\n");
    print ("<p><center>Unfortunately, due to abuse, this site is temporarily not available ...</center></p>\n");
    print ("<p><center>If you feel this is in error, send a mail to the hostmaster at this site,<br>
    if you are an anti-social ill-behaving SPAM-bot, then just go away.</center></p>\n");
    print ("</body></html>\n");
    exit;
}
?>
I plan to take Scott Chamberlain's advice, and to be safe I plan to implement a Captcha on the script. If the user answers correctly, it'll just die or redirect back to the site root. Just for fun, I'm throwing the trap in a directory named /admin/ and of course adding Disallow: /admin/ to robots.txt.
EDIT: In addition I am redirecting the bot ignoring the rules to this page: http://www.seastory.us/bot_this.htm
You could first take a look at where the IPs are coming from. My guess is that they are all coming from one country like China or Nigeria, in which case you could set up something in .htaccess to disallow all IPs from those countries. As for creating a trap for bots, I haven't the slightest idea.
I'm using the following snippet to redirect an array of IP addresses. I was wondering how I would go about adding an entire range/block of IP addresses to my disallowed array...
<?php // Let's redirect certain IP addresses to a "Page Not Found"
$disallowed = array("76.105.99.106");
$ip = $_SERVER['REMOTE_ADDR'];
if (in_array($ip, $disallowed)) {
    header("Location: http://google.com");
    exit;
}
?>
I tried using "76.105.99.*", "76.105.99", "76.105.99.0-76.105.99.255" without any luck.
I need to use PHP rather than mod_rewrite and .htaccess for other reasons.
Here's an example of how you could check a particular network/mask combination:
$network = ip2long("76.105.99.0");
$mask = ip2long("255.255.255.0");
$remote = ip2long($_SERVER['REMOTE_ADDR']);
if (($remote & $mask) == $network) {
    header("Location: http://example.com");
    exit;
}
This is better than using a string-based match, as you can also test masks that don't align on an octet boundary, e.g. a /20 block of IPs.
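For instance, a /20 containing the example address might be checked like this (the network and mask are chosen purely for illustration):

// 76.105.96.0/20 covers 76.105.96.0 through 76.105.111.255
$network = ip2long("76.105.96.0");
$mask = ip2long("255.255.240.0"); // a /20 mask (no octet alignment)
$remote = ip2long($_SERVER['REMOTE_ADDR']);
if (($remote & $mask) == $network) {
    header("Location: http://example.com");
    exit;
}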
Try the substr function:
$ip = '76.105.99.';
if (substr($_SERVER['REMOTE_ADDR'], 0, strlen($ip)) === $ip) {
    // deny access
}
You can approach the problem in a different way.
If you want to ban 76.105.99.* you could do:
// Check for position 0 so the block only matches as a prefix,
// not anywhere inside the address (e.g. 176.105.99.x).
if (strpos($_SERVER['REMOTE_ADDR'], "76.105.99.") === 0) {
    header('Location: http://google.com');
    exit;
}
Who exactly are you interested in blocking? You can use PHP or apache to block (or allow) a bunch of specific IP addresses.
If you are interested in blocking people from an entire country for example, then there are tools that give you the IP addresses you need to block. Unfortunately, it's not as simple as just specifying a range.
Check out http://www.blockacountry.com/ which generates a bunch of ip addresses you can stick in your .htaccess to block whole countries.
What you need to do is have a test to see whether a particular address lies inside a particular address range as defined by CIDR notation.
So for instance, you need to be able to say
is 192.168.1.5
inside
192.168.1.0/24
That function is easy to write, assuming you have some basic tools to do CIDR work.
Assuming you are on a 32-bit system, this class may help: http://snipplr.com/view/15557/cidr-class-for-ipv4/
Pay attention to the IPisWithinCIDR function.
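If you don't want to pull in the whole class, a minimal self-contained version of that test might look like this (IPv4 only; the function name is mine, not the class's):

// Minimal sketch of a CIDR membership test (IPv4 only).
function ipWithinCidr($ip, $cidr) {
    list($subnet, $bits) = explode('/', $cidr);
    $mask = -1 << (32 - (int)$bits); // e.g. /24 -> 0xFFFFFF00
    return (ip2long($ip) & $mask) === (ip2long($subnet) & $mask);
}

var_dump(ipWithinCidr('192.168.1.5', '192.168.1.0/24')); // bool(true)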
It would be better to do this in Apache (or whatever other web server you use).
I believe that you'll need to create a for loop to add each IP address (within the range) to your array. In PHP:
for ($i = 0; $i <= 255; $i++) {
    $disallowed[] = "76.105.99." . $i;
}
$blocked_ip_range_array = array('109.237.108.0', '109.238.0.0');
for ($i = 0; $i < count($blocked_ip_range_array); $i++) {
    $network = ip2long($blocked_ip_range_array[$i]);
    $blipr = explode(".", $blocked_ip_range_array[$i]);
    // If the third octet is 0, treat the entry as a /16 block; otherwise a /24.
    if ($blipr[2] == '0') {
        $mask = ip2long("255.255.0.0");
    } else {
        $mask = ip2long("255.255.255.0");
    }
    $remote = ip2long($_SERVER['REMOTE_ADDR']);
    if (($remote & $mask) == $network) {
        header("Location: http://xurcun.info");
        exit;
    }
}
Below is a URL showing something rather similar to what Mr. Dixon and Ameer are discussing:
http://www.blackdog.ie/blog/blocking-ip-ranges-with-php/
Hope this helps.
Respectfully,
Wil