How can I redirect multiple bad/adult keywords in a URL?
I was trying to do it with .htaccess, but I don't think that approach scales to many keywords.
example:
http://example.com/mp3/sex-with-me/page/1 redirect to http://example.com/mp3/with-me/page/1
http://example.com/video/selena-gomes-porn/page/1 redirect to http://example.com/video/selena-gomes/page/1
Some code from my .htaccess file:
RewriteRule ^mp3/(.*)-sex/page/(.*)?$ http://site.com/mp3/$1/page/$2 [R=301,L]
RewriteRule ^mp3/(.*)-sex-(.*)/page/(.*)?$ http://site.com/mp3/$1-$2/page/$3 [R=301,L]
RewriteRule ^mp3/sex-(.*)/page/(.*)?$ http://site.com/mp3/$1/page/$2 [R=301,L]
RewriteRule ^video/(.*)-porn/page/(.*)?$ http://site.com/video/$1/page/$2 [R=301,L]
RewriteRule ^video/(.*)-porn-(.*)/page/(.*)?$ http://site.com/video/$1-$2/page/$3 [R=301,L]
RewriteRule ^video/porn-(.*)/page/(.*)?$ http://site.com/video/$1/page/$2 [R=301,L]
Is it possible to do this with PHP?
It would be possible to do in PHP, sure... I'm just not sure it's a good idea. You could redirect every request to a PHP page, explode $_SERVER['REQUEST_URI'], run some regexes, then continue on to the page.
I see a few big issues. Firstly, why are you changing the URI but still allowing access to the site? If this is your own site, you should be filtering these words before the creation of a URI. If this is supposed to be a proxy, then why allow access to a site with flagged words in the URI? (Especially because there are many better ways to deny access to inappropriate material than PHP filtering based on the URI.)
Secondly, what do you do about the town of Middlesex, or athlete Mr. Gay? What if a dyke breaks along the Mississippi River? If someone is writing about e-readers (the Nook, for example), you could have a problem too. I could also circumvent your filter by adding hyphens or other junk characters. Basically, filtering based on the contents of the URI is very problematic and not likely to work well.
If you want to do it in PHP, it'd probably be something like this:
<?php
$uri_component = explode('/', $_SERVER['REQUEST_URI']);
foreach ($uri_component as $fragment) {
    // flag any path fragment that matches the blacklist pattern
    if (preg_match('/regex/', $fragment)) echo "BAD WORD";
}
?>
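To cut down the false positives described above, one option is to match bad words only as whole hyphen-separated tokens, so "middlesex" survives while "sex-with-me" is caught. A minimal sketch (the word list and slug are just examples):
<?php
// match a bad word only when it is a complete hyphen-separated token
$badWords = array('sex', 'porn');
$pattern = '/(^|-)(' . implode('|', array_map('preg_quote', $badWords)) . ')(-|$)/i';

$slug = 'sex-with-me';
// drop the token, collapse any doubled hyphens, trim stray ones
$clean = trim(preg_replace('/--+/', '-', preg_replace($pattern, '$1$3', $slug)), '-');
echo $clean; // prints "with-me"; "middlesex" would pass through untouched
?>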
If your site has a single entry point, for example index.php, you can define an array of bad words, search for those words in the URL, and redirect to the corrected URL if any are found.
In index.php:
<?php
$badWords = array('sex', 'prOn', 'etc');
$uriParts = explode('/', $_SERVER["REQUEST_URI"]);
// build one regex out of all the bad words
$preg = '/' . implode('|', $badWords) . '/is';
foreach ($uriParts as $k => $part)
{
    $uriParts[$k] = preg_replace($preg, '', $part);
}
// if bad words were found, the rebuilt URI differs from the original.
// REQUEST_URI keeps its leading slash through explode/implode, and the
// parentheses around the assignment matter: without them $uri would get
// the boolean result of the comparison instead of the cleaned URI
if (($uri = implode('/', $uriParts)) != $_SERVER["REQUEST_URI"])
{
    $newUrl = 'http://' . $_SERVER["SERVER_NAME"] . $uri;
    // redirect the user to the good URL with HTTP code 301
    header("HTTP/1.1 301 Moved Permanently");
    header('Location: ' . $newUrl);
    exit();
}
If your site has multiple entry points, for example mp3.php, video.php, and so on, you can save the code above in a file called bad_words_guard.php :) and include it in every entry file:
<?php
require_once('path/to/bad_words_guard.php');
...
I'm working with a client who has a site with many subdomains representing different areas covered in his locksmith business. He picks up a lot of traffic from directory websites, and wants to use his domain only as the link on these websites. When someone clicks it, he wants them to be redirected based on a keyword in the referring URL.
For example, a referring Yell URL could be
yell.com/ucs/UcsSearchAction.do?keywords=locksmith&location=Watford%2C+Hertfordshire&scrambleSeed=1311994593
The client wants .htaccess or something similar to pick out the keyword 'Watford' from that URL and redirect to watford.hisbusiness.com accordingly.
This isn't something I've done before and I'm baffled. Research found no clues.
You can check HTTP_REFERER to grab information from the referring URL.
RewriteEngine on
RewriteCond %{HTTP_REFERER} yell\.com/.*\?.*\&location\=(\w+)\%2C\+(\w+)
RewriteRule ^$ http://${lower:%1}.hisbusiness.com/ [R=302,L]
The ${lower:%1} is used to make Watford lowercase. In order for this to work, you'll need to add the following to your httpd.conf or virtual host configuration file:
RewriteMap lower int:tolower
Note: The rule in place above is designed for the domain root (hisbusiness.com) only - that is to say that a request to hisbusiness.com/something won't trigger the redirect. If you'd like it to check for the URI as well, use the following rule instead:
RewriteRule ^(.*)$ http://${lower:%1}.hisbusiness.com/$1 [R=302,L]
To make the redirect permanent and cached by browsers/search-engines, change 302 to 301.
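Putting it together, the virtual-host side might look roughly like this (a sketch; the RewriteCond/RewriteRule lines above can stay in the site's .htaccess):
<VirtualHost *:80>
    ServerName hisbusiness.com
    # the map must be declared at server or vhost level,
    # but can then be referenced from .htaccess rules
    RewriteMap lower int:tolower
</VirtualHost>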
Use a header() redirect in PHP with your required conditions:
if ($condition1) {
    header("Location: http://mywebsite1.com");
    exit;
} elseif ($condition2) {
    header("Location: http://mywebsite2.com");
    exit;
} else {
    header("Location: http://mywebsite3.com");
    exit;
}
You can use stristr() in the if condition.
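For example, a rough sketch that pulls the town out of the referring URL (the location parameter name comes from the example Yell URL above; adjust to taste):
<?php
// grab the referring URL, if the browser sent one
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

// only act on yell.com referrals that carry a location parameter
if (stristr($referer, 'yell.com') && preg_match('/[?&]location=([^,%&]+)/i', $referer, $m)) {
    $town = strtolower(urldecode($m[1])); // "Watford" -> "watford"
    header('Location: http://' . $town . '.hisbusiness.com/');
    exit;
}
// otherwise fall through to the normal landing page
?>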
So, I have some PHP script pages that are accessed through AJAX, and others through POST or GET, which are used to send emails and access the database. Although I know that a search engine probably won't have any interest in these pages, I do not want it to even know that they exist.
I want a solid way to separate the pages that should be seen by a search engine and the ones that shouldn't.
I've seen Matt Cutts' video (https://www.youtube.com/watch?v=nM2VDkXPt0I) in which he explains that the best way to prevent a page from being viewed by Google is by using .htaccess with password protection... The problem is that my script pages must be accessed by users.
I'd like to know if there is a solution that only involves .htaccess, since in this video Matt Cutts explains that noindex and robots.txt are not very effective.
So the solution must follow the rules:
Use only .htaccess (or something that works, but with no exceptions)
No HTML tags, because of the specific response I'm getting in .responseText (these pages don't even have HTML, just PHP)
Allow single page restriction (not full directories for example)
Allow user access
I've searched a lot and seen many solutions out there, but nothing that works for me. So, any ideas?
Create a directory for your AJAX pages and then set the .htaccess to block Google from accessing it.
For blocking the whole directory:
RewriteEngine On
# Googlebot's UA string only contains "Googlebot", so match it unanchored and case-insensitively
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^ajax/ - [F,L]
For single page redirects:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^([^/\.]+)/?$ yourpage.php [L]
Just in case you want to redirect multiple files (as I assume you do):
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^(file1|file2|file3|file4)\.html$ http://www.yoursite.com [R=301,NC,L]
Hope this helps.
Note that this must be uploaded to the parent directory and not the ajax folder.
Editing for a different solution: as you seem keen on single file redirects, you could return a PHP 301 redirect if a search engine bot enters your site:
function bot_detected() {
    if (isset($_SERVER['HTTP_USER_AGENT']) && preg_match('/bot|crawl|slurp|spider/i', $_SERVER['HTTP_USER_AGENT'])) {
        return TRUE;
    } else {
        return FALSE;
    }
}

if (bot_detected()) {
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: http://www.yourwebsite.com");
    exit;
}
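You could then drop that snippet at the very top of each AJAX-only script, before any output is sent; for example (the guard file name is just an assumption):
<?php
require_once 'bot_guard.php'; // hypothetical file holding the check above

// ...normal AJAX/POST handling for real users continues here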
Is there any way to redirect every page on a website to another website?
Actually what I mean is that, I own two websites eg :
1.com
2.com
2.com is my main website. When I add a page to 2.com (e.g. 2.com/index.html), 1.com should pick it up and create 1.com/index.html containing redirecting code that points to 2.com/index.html.
Can I do this ?
Is there any way to do this by php ?
Actually, what I need is a script that automatically creates, on my 1st site, files matching those added to my 2nd site. So can I do this with PHP and MySQL, or any other scripting or programming language?
If you own both domains, you could just point them both at your website using a DNS A record (or whatever) and then simply use a server alias (ServerAlias) as outlined on apache.org. If the user then visits either domain, he will still see the original domain that he visited.
Another way would be using a rewrite rule as described by this blog:
RewriteCond %{HTTP_HOST} ^www\.2\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^2\.com$ [NC]
RewriteRule ^(.*)$ http://1.com/$1 [R=301,L]
Then your users would always see 1.com in their address bar.
Impossible to do with PHP alone, since PHP code is executed only when its own file is requested, not when any file on the server is requested.
Possible with .htaccess:
RewriteRule (.*) http://www.newdomain.com/ [R=301,L]
This redirects every page on your old domain to www.newdomain.com.
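If you'd rather carry the old path over instead of sending everything to the homepage, a common variant is:
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]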
See this post for more methods about redirecting.
// Put this script on 1.com and it will redirect to 2.com, or vice versa
<?php
header('Location: http://2.com/index.html');
exit();
?>
If I did not understand your question correctly, let me know and I will help you as best I can.
// Super hack time
<?php
// 1.com
$files = scandir('.'); // not recursive, note that
// log.txt holds one already-sent filename per line
$sent = file('log.txt', FILE_IGNORE_NEW_LINES);
$files = array_diff($files, array('.', '..', 'log.txt'));
$notsent = array_diff($files, $sent);
foreach ($notsent as $file) {
    $contents = file_get_contents($file);
    // Use curl to post to 2.com receiving script http://davidwalsh.name/execute-http-post-php-curl
    file_put_contents('log.txt', $file . "\n", FILE_APPEND);
}
?>
Disclaimer: I have not tested this, but it is the most direct way to do what I think you want. Again, I really don't know why you would want to do this.
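For completeness, the curl POST mentioned in the comment might look roughly like this (the receiving script's name on 2.com is an assumption):
<?php
// sketch only: push one file's name and contents to a hypothetical
// receive.php on 2.com, which would write out the redirect stub
function push_to_mirror($file, $contents) {
    $ch = curl_init('http://2.com/receive.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('name' => $file, 'body' => $contents));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
?>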
The header() answer above can only be used before any HTML has been output. If you're looking for something that is easier to implement, use this:
<script>window.location = 'http://google.com';</script>
I'm not sure if I completely understood your question.
With PHP
header('Location: http://2.com');
With HTML
<meta http-equiv="refresh" content="2;url=http://2.com">
Now that you've provided more information:
Add a CNAME record to the DNS of 1.com with the value of 2.com
I would prefer to set up an Nginx web server on 1.com and configure it as a proxy, so that 2.com actually handles all requests. That way you avoid replicating the whole of 2.com on 1.com, and at the same time the user's browser is never redirected to 2.com, as it would be if you used a Location header.
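A minimal sketch of that proxy configuration (everything here is illustrative):
# proxy every request for 1.com through to 2.com,
# so the address bar keeps showing 1.com
server {
    listen 80;
    server_name 1.com www.1.com;

    location / {
        proxy_pass http://2.com;
        proxy_set_header Host 2.com; # make 2.com's virtual host match
    }
}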
OK, so I'm rewriting some page URLs for a custom PHP cart.
I've got the following rules:
RewriteRule ^category/([0-9]+)/([a-z-]+)$ /store.php?cat=$1 [L]
RewriteRule ^product/([0-9]+)/([a-z-]+)$ /product.php?id=$1 [L]
These will allow me to use a url structure like example.com/product/23/product-slug.
That part is working alright. I'm wondering what options I have for the other direction; redirecting requests for the OLD URL to the NEW URL. So for example, when someone goes to /product.php?id=2 I want to redirect to /product/2/slug.
Any idea how to get this done?
I tried a simple redirect, but it would not work:
Redirect 301 ^/store\.php\?cat=16$ http://www.example.com/category/16/category-slug
Redirect only takes a URL prefix, not a regex (e.g. /store.php or /store).
You need to try RedirectMatch:
RedirectMatch 301 ^/store\.php\?cat=16$ http://www.example.com/category/16/category-slug
Also, is it supposed to start with a /? I'm not sure (your RewriteRule entries above start with no slash, for example). Note, too, that mod_alias doesn't examine the query string, so to match ?cat=16 you may ultimately need mod_rewrite with a RewriteCond on %{QUERY_STRING}.
I solved this a different way: with a modification to the store.php file.
I looked at output from print_r($_SERVER) after pinging both the normal and rewritten URLs. I found that $_SERVER['SCRIPT_URL'] contains "/store.php" when the normal URL is hit, and it contains my rewritten path when the rewritten URL is hit.
This means I can do a simple test and redirect appropriately:
if ($_SERVER['SCRIPT_URL'] == "/store.php") {
    // run some code that will generate the url
    $rewrittenURL = generateURL();
    // then add:
    header("HTTP/1.0 301 Moved Permanently"); // tell spiders this is permanent
    header("Location: $rewrittenURL");
    exit; // stop rendering the old page
}
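generateURL() above is whatever builds the pretty URL; one hypothetical shape for this cart, assuming the slug can be looked up by category id:
<?php
// hypothetical helper: build /category/<id>/<slug> from the query string;
// lookup_category_slug() stands in for the cart's own DB lookup
function generateURL() {
    $cat = (int) $_GET['cat'];
    $slug = lookup_category_slug($cat); // e.g. "category-slug" for cat 16
    return "/category/" . $cat . "/" . $slug;
}
?>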
I have a URL such as http://www.domain.com/index.php?p=register. I want to redirect that to use HTTPS (SSL) with .htaccess, but only on this, and a couple of other pages (the login page, etc), but not the entire site. The URLs don't point to directories, but are used to dynamically include different files.
Can someone give me a pointer or an example of how to get a single page redirect to HTTPS please?
Thanks.
Not htaccess, but another way could be to use PHP to redirect:
<?php
$redirectlist = array('register', 'login', 'myaccount');
// $_SERVER['HTTPS'] may be unset on plain HTTP (or 'off' on some servers)
$notSecure = empty($_SERVER['HTTPS']) || strtolower($_SERVER['HTTPS']) == 'off';
if (isset($_GET['p']) && in_array($_GET['p'], $redirectlist) && $notSecure) {
    exit(header("Location: https://{$_SERVER['SERVER_NAME']}{$_SERVER['REQUEST_URI']}"));
}
?>
The only reason I mention this is that, in some cases, it may be easier to maintain than a separate .htaccess. You would need to put this in place in your PHP content before any output is sent (see header()).
Something like this should work:
RewriteEngine on
RewriteCond %{SERVER_PORT} !^443$
RewriteCond %{QUERY_STRING} (^|&)p=(register|login|or|other|protected|page)($|&)
RewriteRule (.*) https://www.domain.com/index.php [R=301,QSA,L]
Some explanation:
Check whether the server port is different from 443 (the standard for secure connections), to ensure we only redirect non-secure connections
The query string (everything after ?) has to match the pattern: it must include the p variable with one of the values from the pipe-separated list
Redirect everything to the secure domain, sending a 301 response status, appending the whole query string, and marking this as the last rule, so any rules below it will not be processed (since this is a redirect, we don't want to take any other actions)
If you have the option to follow the PHP method, I would recommend it (or the same approach in any other dynamic language). You should avoid the .htaccess-only approach, since links to images, JS, and other content on the page may still be loaded over non-SSL, and modern browsers will show a mixed-content warning, which can make the money spent on SSL look like a whitewash.