Four websites share the same codebase, but each has a unique domain. Some functions differ depending on the domain, and the same applies to CSS, JavaScript and image files.
What is the preferred way to find out in PHP which domain was requested? Could anyone point out advantages and disadvantages of the proposals below, or make a better proposal?
Solution 1: Point all 4 domains to the same index.php, and get the domain via $_SERVER['SERVER_NAME']
Solution 2: Like solution 1, one index.php for all domains, but pass the domain as a query parameter via mod_rewrite:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{HTTP_HOST} ^(.*) [NC]
RewriteRule ^ index.php?site=%1 [QSA,L]
Solution 3: Point the 4 domains to 4 different index.php files, perhaps in 4 different public folders, and define the domain at the top of each index.php.
This may sound trivial or like it doesn't matter much, but I want to be sure to pick the best solution.
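Solution 1, sketched out; the domain names, site keys and config layout below are placeholders, not anything from the actual setup:

```php
<?php
// index.php — shared entry point for all domains (Solution 1 sketch).
// Domain names and site keys here are placeholders.

// Normalize the requested host: lowercase, strip any port and a "www." prefix.
function site_key($host)
{
    $host = strtolower(preg_replace('/:\d+$/', '', (string) $host));
    return preg_replace('/^www\./', '', $host);
}

$sites = [
    'example-one.com' => 'site1',
    'example-two.com' => 'site2',
];

$host = $_SERVER['SERVER_NAME'] ?? 'example-one.com';
$site = $sites[site_key($host)] ?? 'site1';   // fall back to a default site

// From here on, load the per-site config (CSS/JS paths, templates, …):
// require __DIR__ . "/config/$site.php";
```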
We've done this before, with a framework we built for building websites that supports serving any number of sites from the same installed codebase. We do this as follows:
Have all domains point to the same index.php file within the same DocumentRoot.
Have all requests for all pages point to that index.php using .htaccess and rewrite rules.
Then the index.php file sets up the environment. We have a database table that links the result of $_SERVER['SERVER_NAME'] to the site being requested, with a one-to-many relationship, so a request to example.com and www.example.com can reference a single site while example2.com references a different site.

Once index.php determines the site and page being accessed, it grabs the relevant information for that request from the database and creates an associative array called $site[] that stores all of the information for the request being made. This array is then used to determine which assets to read in, like a specific template from the CRM, and which site- and page-specific assets (CSS, JS, etc.) to include.
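The host-to-site lookup described above could be sketched like this; the table and column names (sites, site_domains, domain, site_id) are invented for illustration and the real schema will differ:

```php
<?php
// Sketch of a host-to-site lookup backed by a one-to-many domain table.
// Table and column names are illustrative, not from the original framework.

function lookup_site(PDO $db, $host)
{
    // One site can own many domains (example.com, www.example.com, …).
    $stmt = $db->prepare(
        'SELECT s.* FROM sites s
         JOIN site_domains d ON d.site_id = s.id
         WHERE d.domain = ?'
    );
    $stmt->execute([strtolower($host)]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    return $row === false ? null : $row;
}

// In index.php:
// $site = lookup_site($db, $_SERVER['SERVER_NAME']);
// if ($site === null) { http_response_code(404); exit; }
```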
I read the other threads about this topic but I couldn't find anything that I felt applied in my case. It seems pretty basic, though. I have a website built in PHP on an Apache server. At the moment all traffic goes over HTTP. The people who paid for the site now want to move it to HTTPS. They bought a certificate and the web hosts will install it. What changes do I need to make for the site to work over HTTPS, besides changing the redirects within the code?
I also found this link, which seems pretty helpful, but I think maybe it's too complex?
Thank you,
Alex
You should change your resource links (such as external JavaScript references like jQuery) where there are hard-coded http://domain.name/some-link-here paths to just //domain.name/some-link-here. This prevents the browser from complaining about mixed content.
For links that are on the same domain, you could use absolute/relative URLs.
After that you can place an .htaccess rule so that any URL accessed on the domain automatically redirects to the HTTPS version. Place the following lines as the first rule in the file.
.htaccess code:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
The .htaccess rule will also take care of any hard-coded links (towards the same domain/site) that you might have missed.
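On the PHP side, code that builds absolute URLs may need to become scheme-aware; a small sketch (the X-Forwarded-Proto check is a common reverse-proxy convention, not something every host provides):

```php
<?php
// Helpers for code that builds absolute URLs after an HTTPS switch.

function is_https(array $server)
{
    if (!empty($server['HTTPS']) && $server['HTTPS'] !== 'off') {
        return true;
    }
    // Behind a reverse proxy or load balancer, if yours sets this header:
    return ($server['HTTP_X_FORWARDED_PROTO'] ?? '') === 'https';
}

function base_url(array $server)
{
    $scheme = is_https($server) ? 'https' : 'http';
    return $scheme . '://' . ($server['HTTP_HOST'] ?? 'localhost');
}

// Example: base_url($_SERVER) . '/css/site.css'
```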
I have a system that handles multiple domains under one file structure, and each domain needs a different sitemap and robots.txt.
For the sitemap I have set up a redirect and I know it works well. I would like to confirm that the same is possible with robots.txt?
I have added a rewrite rule in .htaccess that sends the request to a PHP page. On this PHP page I detect which domain the user is on and print the correct content with a plain-text header.
Is this allowed?
Extra info:
I have a CodeIgniter application that is used by domainA and domainB. While domainA should see the robots for domainA, domainB should see the robots for domainB. If I were to create robots.txt in the root of the site, both domainA and domainB would have access to it, so I have created a separate PHP page that serves the correct robots for domainA and domainB.
In .htaccess I have a rewrite rule similar to:
RewriteRule ^robots\.txt$ func/getRobots [L]
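The endpoint behind that rewrite rule might look roughly like this plain-PHP sketch; the robots/<domain>.txt file layout is an assumption for illustration:

```php
<?php
// Serve a per-domain robots.txt. The robots/<domain>.txt layout is invented.

function robots_for($host)
{
    $host = strtolower(preg_replace('/^www\./', '', (string) $host));
    $file = __DIR__ . '/robots/' . basename($host) . '.txt';
    if (is_readable($file)) {
        return file_get_contents($file);
    }
    // Safe default when no per-domain file exists: allow everything.
    return "User-agent: *\nDisallow:\n";
}

header('Content-Type: text/plain');
echo robots_for($_SERVER['HTTP_HOST'] ?? '');
```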
After looking around I found that other people do this:
http://www.aberdeencloud.com/docs/guides/how-use-different-robotstxt-development-and-live-environments
.htaccess and robots.txt rewrite url how to
https://webmasters.stackexchange.com/questions/61654/redirect-google-crawler-to-different-robots-txt-via-htaccess
I just want to be sure that it wouldn't damage SEO side of the system.
You damage your SEO when search engines cannot access your robots.txt at all.
Deviating from standard practice for the location of your robots.txt is a risk: some search engines may handle your case well, while others may find it suspicious.
In one of the links you mention, a solution is suggested that avoids rewriting the URL. I would stick to that.
I am trying to make a site where users, after registration, get a personalized page on a subdomain. However, I want the same main-site code to be used, with the displayed content generated depending on which subdomain the user visits, not copies of the code for each subdomain.
If that is unclear: what I want is similar to how blog hosting sites give each blog its own subdomain, settings and all, without a fresh installation of the actual blog script.
I have PHP and MySQL knowledge, and I use CodeIgniter as my PHP framework. What I do not know is how to achieve this effect without duplicating files.
Try putting this in your .htaccess file.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^([^.]+)\.example\.com$ [NC]
RewriteCond %1 !^www$ [NC]
RewriteRule ^ http://example.com/profile?id=%1 [R=301,L]
NOTE: Not sure if this'll work. Wrote it on the spot.
Make sure your profile system loads the profile from the id passed as a GET parameter to /profile.
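An alternative to a visible redirect is to serve every subdomain from the same codebase and read the user name out of the Host header in PHP. This sketch assumes a wildcard DNS record (*.example.com) and a matching ServerAlias so all subdomains reach the same vhost; "example.com" is a placeholder:

```php
<?php
// Extract the per-user subdomain from the Host header.
// Assumes wildcard DNS (*.example.com) pointing at this vhost.

function subdomain_of($host, $base = 'example.com')
{
    $host   = strtolower(preg_replace('/:\d+$/', '', (string) $host));
    $suffix = '.' . $base;
    if ($host === $base || substr($host, -strlen($suffix)) !== $suffix) {
        return null;                      // bare domain or foreign host
    }
    $sub = substr($host, 0, -strlen($suffix));
    return $sub === 'www' ? null : $sub;  // treat www as the main site
}

// $user = subdomain_of($_SERVER['HTTP_HOST'] ?? '');
// if ($user !== null) { /* load $user's profile from the database */ }
```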
I want to build an e-commerce platform like www.bigcommerce.com and www.buildabazaar.com. I have finished the backend coding. I have added a unique identification number for each customer to differentiate their products and choices. When a customer registers, a folder with his unique ID is created on my domain, where all his images and styles are saved. But I am stuck on this: when a customer buys a plan from me and changes his name servers, his site should be displayed on his own domain.
I don't know how these things work. Could someone please suggest how to go about it?
You need to use Apache's mod_rewrite and define a rewrite rule per domain to map it to the correct folder.
Something like this:
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^/?(.*)$ /customer/private/folder/$1 [L]
I cannot say for sure it will work perfectly for your setup, so you should read up on mod_rewrite. It is very powerful and should handle your situation.
You must use mod_rewrite to do that (virtually), not by creating a real folder on the file system!
Edit: assuming you are using Apache as the web server.
I have been looking for a while now to find a solution to accomplish the following system.
I would like to build a PHP system on, let's say, domainA. On domainA I will allow an administrator to 'create' a new website. This website consists only of pieces of text, which are all stored in a database. This part I know how to do.
However, I would now like to make it possible for visitors who surf to domainB to be invisibly redirected to, for example, domainA.com/gateway.php?refdomain=domainB&page=xxx or something similar.
I have a vague idea this should be done with .htaccess, but I don't really know the easiest way to do it. If there are POST or GET requests on domainB, these should keep working. Also, images linked as http://www.domainB.com/test.gif should be invisibly loaded from www.domainA.com.
I also know there are some CMS systems (e.g. Drupal) which offer this feature, so it is possible; I just don't know how.
Thank you for any advice that might point me in the right direction,
kind regards,
Digits
Are you hosting both of these on the same machine? If so, something like VirtualHosts in Apache could solve this for you.
mod_alias and mod_rewrite might also be of some use to you.
Basically, you'll want to point all your domains at the same directory (perhaps using a wildcard in your vhosts) and then set up URL rewriting; look at this question for an example. The rules can live in a .htaccess file or in the Apache configuration.
All requests that come in will go to the same gateway.php, and you can extract the current domain and request using, for example, $_SERVER['REDIRECT_QUERY_STRING'], $_SERVER['REQUEST_URI'] and $_SERVER['SERVER_NAME']. See $_SERVER. Your gateway.php can then serve the correct files.
If you use a CMS like Drupal, you should be able to assign these using the Portal Alias. By using the alias you will be able to assign different domains to point to different "sites" that are created.
OK, here's a really simple example:
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www\.domainA\.com
RewriteRule (.*) http://www.domainA.com/gateway.php?realpath=$1 [L,QSA]
You could then parse "realpath" in your gateway script using parse_url and take the appropriate actions.
You could make your rewrite rules more complex, with separate rules for images, etc., if you wanted to.
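Put together, the gateway script behind that rewrite rule might start like this sketch; the asset folder and the render_page() helper are hypothetical:

```php
<?php
// gateway.php sketch: recover the requested domain from the Host header and
// the original path from ?realpath=…, then decide whether to serve a static
// asset or render a page from the database. Names here are illustrative.

function classify_request($realpath)
{
    // parse_url() guards against stray query/fragment parts in realpath.
    $path = parse_url('/' . ltrim((string) $realpath, '/'), PHP_URL_PATH);
    $ext  = strtolower((string) pathinfo($path, PATHINFO_EXTENSION));
    $static = in_array($ext, ['gif', 'jpg', 'png', 'css', 'js'], true);
    return ['path' => $path, 'static' => $static];
}

// $domain  = strtolower($_SERVER['HTTP_HOST'] ?? '');
// $request = classify_request($_GET['realpath'] ?? '');
// if ($request['static']) {
//     readfile(__DIR__ . '/assets/' . basename($request['path']));
// } else {
//     render_page($domain, $request['path']);  // hypothetical DB-backed renderer
// }
```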
You could also use a redirect header (e.g. header('Location: …') in PHP), though that makes the redirect visible to the visitor rather than invisible.