Is there any way to redirect every page on a website to another website?
What I mean is that I own two websites, e.g.:
1.com
2.com
2.com is my main website. When I add a page to 2.com (e.g. 2.com/index.html), 1.com should pick it up and create 1.com/index.html containing code that redirects to 2.com/index.html.
Can I do this? Is there any way to do it with PHP?
What I need is a script that automatically creates, on my 1st site, the files that are added to my 2nd site. So can I do this with PHP and MySQL, or any other scripting or programming language?
If you own both domains, you could point them both at your server using a DNS A record (or whatever) and then simply use a server alias (ServerAlias) as outlined on apache.org. If the user then visits either domain, they will still see the original domain they visited.
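A minimal sketch of such a virtual host, assuming Apache (the domain names and document root are placeholders):
<VirtualHost *:80>
    # Both domains are served from the same document root
    ServerName 2.com
    ServerAlias www.2.com 1.com www.1.com
    DocumentRoot /var/www/2.com
</VirtualHost>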
Another way would be to use a rewrite rule, as described in this blog:
RewriteCond %{HTTP_HOST} ^www\.2\.com$ [NC,OR]
RewriteCond %{HTTP_HOST} ^2\.com$ [NC]
RewriteRule ^(.*)$ http://1.com/$1 [R=301,L]
Then your users would always see 1.com in their address bar.
Impossible to do with PHP alone, since PHP code is only executed when the requested file itself runs, not for every file on the server.
Possible with .htaccess:
RewriteRule (.*) http://www.newdomain.com/ [R=301,L]
This redirects every page on your old domain to www.newdomain.com.
See this post for more methods about redirecting.
// Put this script on 1.com and it will redirect to 2.com, or vice versa
<?php
header('Location: http://2.com/index.html');
exit();
?>
If I did not understand your question correctly, let me know and I will help you as best I can.
// Super hack time
<?php
// 1.com
$files = scandir('.'); // not recursive, note that
$files = array_diff($files, array('.', '..', 'log.txt'));
$sent = is_file('log.txt') ? file('log.txt', FILE_IGNORE_NEW_LINES) : array(); // names already pushed
$notsent = array_diff($files, $sent);
foreach ($notsent as $file) {
    $contents = file_get_contents($file);
    // Use curl to post $contents to the 2.com receiving script: http://davidwalsh.name/execute-http-post-php-curl
    file_put_contents('log.txt', $file . "\n", FILE_APPEND);
}
?>
Disclaimer: Have not tested, but it is the most direct way to do what I think you want. Again I really don't know why you would want to do this.
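For the curl step referenced in the comment above, a rough sketch, assuming 2.com exposes a receiving script (receive.php here is a hypothetical name):
<?php
// Hypothetical helper: POST a file name and its contents to a receiving script on 2.com
function push_to_2com($file, $contents) {
    $ch = curl_init('http://2.com/receive.php'); // the receiving script URL is an assumption
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, array('name' => $file, 'contents' => $contents));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
?>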
The header() answer above can only be used before any HTML has been output. If you're looking for something that is easier to implement, use this:
<script>window.location = 'http://google.com';</script>
I'm not sure if I completely understood your question.
With PHP
header('Location: http://2.com');
With HTML
<meta http-equiv="refresh" content="2;url=http://2.com">
Now that you've provided more information:
Add a CNAME record to the DNS of 1.com with the value of 2.com
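For example, a hypothetical zone-file entry (note that a CNAME cannot sit at the bare apex of 1.com, only on a host name such as www):
www.1.com.    3600    IN    CNAME    2.com.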
I would prefer to set up the Nginx web server on 1.com and configure it as a proxy, so that 2.com actually handles all requests. That way you avoid replicating the whole of 2.com on 1.com, and at the same time the user's browser is not redirected away from 1.com, as it would be if you used a Location header.
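A minimal sketch of such a server block, assuming Nginx is installed on 1.com (domain names are placeholders):
server {
    listen 80;
    server_name 1.com www.1.com;

    location / {
        # Every request is handed over to 2.com; the visitor keeps seeing 1.com
        proxy_pass http://2.com;
        proxy_set_header Host 2.com;
    }
}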
Related
Accessing the site without the "www." part in the browser, as in "site.com", will still echo "www.site.com" from $_SERVER['SERVER_NAME'].
This might seem like no big deal: just determine it on the front end when making your AJAX calls, using something like location.href, right? Or simply strip "www." from SERVER_NAME. Those work, but they don't address the primary issue. Should server-side PHP code have to rely on JavaScript, which might be disabled in the browser? Shouldn't the back end be able to determine the yes-www vs no-www address on its own? In general it just sounds like awkward practice.
Additionally, when you have <form action = "URL">, let alone PHP that generates code which must make calls back to the server, a solution based on $_SERVER['SERVER_NAME'] will produce cross-domain security errors.
So -- the question is -- is there a way in PHP to automatically determine whether the "www." part was or wasn't entered into the browser's address bar as part of the domain name?
Yes this is entirely possible.
<?php
//Will be true if the host name starts with www., and false if not.
$host_has_www = (strpos($_SERVER['HTTP_HOST'], 'www.') === 0);
if ($host_has_www) {
//Do something
}
?>
To force www, edit .htaccess:
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
Then every request will always have www. A similar method can be used to remove www, as sketched below.
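For the opposite direction, a sketch of removing www (not tested against your setup):
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]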
I'm working with a client who has a site with many subdomains representing different areas covered in his locksmith business. He picks up a lot of traffic from directory websites, and wants to use his domain only as the link on these websites. When someone clicks it, he wants them to be redirected based on a keyword in the referring URL.
For example, a referring Yell URL could be
yell.com/ucs/UcsSearchAction.do?keywords=locksmith&location=Watford%2C+Hertfordshire&scrambleSeed=1311994593
The client wants .htaccess or something similar to pick out the keyword 'Watford' from that URL and redirect to watford.hisbusiness.com accordingly.
This isn't something I've done before and I'm baffled. My research turned up no clues.
You can check HTTP_REFERER to grab information from the referring URL.
RewriteEngine on
RewriteCond %{HTTP_REFERER} yell\.com/.*\?.*\&location\=(\w+)\%2C\+(\w+)
RewriteRule ^$ http://${lower:%1}.hisbusiness.com/ [R=302,L]
The ${lower:%1} is used to make Watford lowercase. In order for this to work, you'll need to add the following to your httpd.conf or virtual host configuration file:
RewriteMap lower int:tolower
Note: The rule in place above is designed for the domain root (hisbusiness.com) only - that is to say that a request to hisbusiness.com/something won't trigger the redirect. If you'd like it to check for the URI as well, use the following rule instead:
RewriteRule ^(.*)$ http://${lower:%1}.hisbusiness.com/$1 [R=302,L]
To make the redirect permanent and cached by browsers/search-engines, change 302 to 301.
Use header() in PHP with your required conditions:
if (condition 1) {
    header("Location: http://mywebsite1.com");
} elseif (condition 2) {
    header("Location: http://mywebsite2.com");
} else {
    header("Location: http://mywebsite3.com");
}
exit();
You can use stristr() in the if condition.
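For example, a sketch of such a condition with stristr(), assuming you are matching keywords in the referring URL (the keywords and target domains are just placeholders):
<?php
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (stristr($referer, 'watford')) {
    header("Location: http://mywebsite1.com");
} elseif (stristr($referer, 'hertfordshire')) {
    header("Location: http://mywebsite2.com");
} else {
    header("Location: http://mywebsite3.com");
}
exit();
?>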
So, I have some PHP script pages that are accessed through AJAX, and others through POST or GET; they are used to send emails and access the database. Although I know a search engine probably won't have any interest in these pages, I do not want it to even know that they exist.
I want a solid way to separate the pages that should be seen by a search engine from the ones that shouldn't.
I've seen Matt Cutts' video (https://www.youtube.com/watch?v=nM2VDkXPt0I) in which he explains that the best way to prevent a page from being viewed by Google is to use .htaccess with password protection... The problem is that my script pages must be accessible to users.
I'd like to know if there is a solution that only involves .htaccess, since in this video Matt Cutts explains that noindex and robots.txt are not very effective.
So the solution must follow the rules:
Use only .htaccess (or something that works, but with no exceptions)
No HTML tags because of the specific response I'm getting in .responseText (these pages don't even have HTML, just PHP)
Allow single page restriction (not full directories for example)
Allow user access
I've searched a lot and seen many solutions out there, but nothing that works for me. Any ideas?
Create a directory for your AJAX pages and then set up the .htaccess to block Google from accessing it.
To block the whole directory:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^ajax/ - [F,L]
To block a single page:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^yourpage\.php$ - [F,L]
Just in case you want to redirect multiple files (as I assume you do):
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^(file1|file2|file3|file4)\.html$ http://www.yoursite.com [R=301,NC,L]
Hope this helps.
Note that this must be uploaded to the parent directory and not the ajax folder.
Editing to add a different solution: since you seem keen on single-file restrictions, you could return a 301 redirect from PHP when a search-engine bot requests the page:
function bot_detected() {
    if (isset($_SERVER['HTTP_USER_AGENT']) && preg_match('/bot|crawl|slurp|spider/i', $_SERVER['HTTP_USER_AGENT'])) {
        return TRUE;
    } else {
        return FALSE;
    }
}
if (bot_detected()) {
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: http://www.yourwebsite.com");
    exit();
}
My company uses Xerox DocuShare for document management. We are consolidating two DocuShare servers into one. Assuming users have a lot of DocuShare pages bookmarked in their browsers, is it possible to place a PHP file in the root folder which will receive all these requests and perform a redirect?
For example
http://old-server/docushare/dsweb/View/Collection-xxxx
would get redirected to
http://new-server/docushare/dsweb/View/Collection-yyyy
The Collection-xxxx to Collection-yyyy mapping would probably come from a file we intend to generate as part of the conversion.
I did take a look at
http://php.net/manual/en/function.header.php
but that works at the level of a single URL, whereas I am looking to convert all requests on the old path.
Thanks.
In my opinion, the simplest way is to put an .htaccess file in the root of your document root:
RewriteEngine on
Options +FollowSymLinks
RewriteCond %{HTTP_HOST} ^old-server [NC]
RewriteRule ^docushare/dsweb/View/(.*)$ http://new-server/docushare/dsweb/View/$1 [R=301,L]
For more inspiration check this page
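If you also need to map Collection-xxxx to Collection-yyyy from the file you plan to generate, one possibility (an untested sketch; note that RewriteMap only works in the server or virtual-host configuration, not in .htaccess, and the map path/format are assumptions) would be:
# in httpd.conf / the virtual host on old-server;
# collections.map is your generated file, one "Collection-xxxx Collection-yyyy" pair per line
RewriteEngine on
RewriteMap collections txt:/etc/apache2/collections.map
RewriteRule ^/docushare/dsweb/View/(Collection-[^/]+)$ http://new-server/docushare/dsweb/View/${collections:$1|$1} [R=301,L]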
The PHP way
In the front controller, or whatever the web server hits first, there would be a condition using the $_SERVER variable, similar to this:
if ($_SERVER['SERVER_NAME'] == 'old-server')
{
    $redirectionPath = 'http://' . str_replace('old-server', 'new-server', $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI']);
    header(sprintf('Location: %s', $redirectionPath), true, 301);
    exit();
}
This is the ugly way and you should not use it unless you have no other choice. Not to mention my blindly written code ;)
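And if the PHP route also has to handle the Collection-xxxx to Collection-yyyy lookup, a rough sketch (the map-file path and its comma-separated format are assumptions):
<?php
// Hypothetical map file: one "Collection-xxxx,Collection-yyyy" pair per line
$map = array();
foreach (file('/path/to/collections.csv', FILE_IGNORE_NEW_LINES) as $line) {
    list($old, $new) = explode(',', $line);
    $map[$old] = $new;
}
$old = basename($_SERVER['REQUEST_URI']); // e.g. Collection-xxxx
if (isset($map[$old])) {
    header('Location: http://new-server/docushare/dsweb/View/' . $map[$old], true, 301);
    exit();
}
?>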
I don't know exactly what situation you're in, but I think the .htaccess solution solves the issue you are experiencing.
I have a URL such as http://www.domain.com/index.php?p=register. I want to redirect that to use HTTPS (SSL) with .htaccess, but only on this, and a couple of other pages (the login page, etc), but not the entire site. The URLs don't point to directories, but are used to dynamically include different files.
Can someone give me a pointer or an example of how to get a single page redirect to HTTPS please?
Thanks.
Not htaccess, but another way could be to use PHP to redirect:
<?php
$redirectlist = array('register', 'login', 'myaccount');
if (isset($_GET['p']) && in_array($_GET['p'], $redirectlist)
    && (empty($_SERVER['HTTPS']) || strtolower($_SERVER['HTTPS']) != 'on')) {
    header("Location: https://{$_SERVER['SERVER_NAME']}{$_SERVER['REQUEST_URI']}");
    exit();
}
?>
The only reason I mention this is that, in some cases, this may be easier to maintain than a separate htaccess. You would need to put this in place in your PHP content before any text was outputted (see header()).
Something like this should work:
RewriteEngine on
RewriteCond %{SERVER_PORT} !^443$
RewriteCond %{QUERY_STRING} (^|&)p=(register|login|or|other|protected|page)($|&)
RewriteRule (.*) https://www.domain.com/index.php [R=301,QSA,L]
Some explanation:
Check if the server port is different from 443 (the standard port for secure connections), to ensure we only redirect connections that are not already secure
The query string (everything after ?) has to match the pattern: it must include the p variable with one of the values from the pipe-separated list
Redirect everything to the secure domain, sending a 301 response status, appending the whole query string, and marking it as the last rule so any rules below this one will not be processed (since this is a redirect, we don't want to take any other actions)
If you have the option to follow the PHP method, I would recommend that (or doing the same in any other dynamic language). Be careful using .htaccess for this: if links to images, JS and other content on the page stay non-SSL, modern browsers will show a mixed-content warning, which rather defeats the point of paying for SSL.