I work as a web developer, creating custom websites for customers. My .htaccess files are generally very simple; I usually only do two things:
Create a custom ErrorDocument for 404 errors
Redirect the /upload directory to /admin/upload to simplify some shared code between the public website and the admin system
I have everything working okay, other than one slight annoyance I'm wondering if anything can be done about. I'm using PHP, and I try to design the websites in a modular fashion, so that it doesn't matter whether the code is hosted at "example.com/" (the real domain), "localhost/exampleproject2" (on my development machine), or "mycompanywebsite.com/example.com" (where we host a "beta" version for the customer to view when we get to that point in development).
From the PHP side of things, I have no issues here. But as far as Apache is concerned, I have to create a slightly different .htaccess file for each of those environments:
For example.com (production):
ErrorDocument 404 /404.php
Redirect 301 /upload /admin/upload
For localhost/exampleproject2 (development):
ErrorDocument 404 /exampleproject2/404.php
Redirect 301 /exampleproject2/upload /exampleproject2/admin/upload
For mycompanywebsite.com/example.com (beta):
ErrorDocument 404 /example.com/404.php
Redirect 301 /example.com/upload /example.com/admin/upload
Obviously this is not a huge deal, but it is a bit annoying, mostly when I or someone else transfers files over FTP, accidentally overwrites the .htaccess from one environment with its counterpart from another, and the site breaks. It also means I can't include .htaccess in my git repository without possibly breaking things.
From my testing, relative URLs don't seem to work at all. I think Apache just doesn't support relative URLs in .htaccess files for Redirects and ErrorDocuments; at least that's the way it seems. I get a server error when I try.
I'd like a way to have a single .htaccess file I can share between all 3 environments and include in the git repository. If anyone knows a good way to accomplish this, that would be awesome. Solutions don't have to use .htaccess; I'm open to any solution that makes my code less dependent on absolute paths or URLs.
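One direction worth noting: the redirect half can be made prefix-independent with mod_rewrite, because in per-directory (.htaccess) context the pattern is matched against the path relative to the directory containing the file, and a relative substitution gets that prefix added back. A rough sketch, assuming mod_rewrite is enabled (ErrorDocument, as far as I know, still requires an absolute URL-path, so that part stays environment-specific):
RewriteEngine On
# The per-directory prefix is stripped before matching and re-added to the
# relative substitution, so the same rule works under /, /exampleproject2/
# and /example.com/.
RewriteRule ^upload(/.*)?$ admin/upload$1 [R=301,L]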
Related
I have found a lot of information on how to redirect example.com to www.example.com. There are two methods: DNS and .htaccess. The problem is that most answers do not say what type of hosting they assume, and different people seem to recommend different options. I know .htaccess is a no-brainer for shared hosting.
Does anybody know which method is better for a VPS?
In order of least to most advisable places to issue an HTTP 30X:
DNS
Well, it's impossible to perform an HTTP redirect with DNS, so this is right out.
Application [eg: in PHP]
You can do it here, but really you want to do as little host-based muckery in the application as possible, mostly for the sake of separation of concerns.
.htaccess
You can totally do it here, but you probably shouldn't use .htaccess at all if you can avoid it.
Server/Vhost Config
Yes, this is where you should do it. Rewrites and redirects are parsed once at server start, and handled by highly-efficient code.
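For illustration, a minimal sketch of what that looks like at the vhost level (hostnames are placeholders), sending the bare domain to www:
<VirtualHost *:80>
    ServerName example.com
    # Send everything on the bare domain to the www host
    Redirect permanent / http://www.example.com/
</VirtualHost>
<VirtualHost *:80>
    ServerName www.example.com
    # ...normal site configuration...
</VirtualHost>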
Very Late Edit:
I'd like to add that performing redirects in application code is not necessarily a bad thing, but for simple www. and other host/infrastructure-level redirects the simplicity of doing it in the server config can't be beat.
eg: foo.com to www.foo.com and :80 to :443
However, if you're also doing application redirects you need a bulletproof division between which redirects are handled by the server and which are handled by the application. Otherwise you can descend into a circle of hell where the application and server disagree on how to redirect.
If that division is anything less than absolute then you're probably better off simply letting the application manage redirects entirely.
Questions asking for suggestions like this are frowned upon, but: your DNS should resolve both the www and non-www hostnames. Then, at your server, you can choose to respond to both, or 301 redirect one to the other via .htaccess.
I'm developing a new website on a server that uses ~ for user directories, so the new site (the old one is still online while I develop) is hosted under domain.com/~cms/. Is that a problem? I've never seen that before ... I know it's not from outer space, but it is kind of uncommon, isn't it?
I need to know if there is a proper solution via an .htaccess redirect/rewrite to send all requests to that subdirectory BUT leave some old URIs like domain.com/dir1 or domain.com/dir2 untouched ...
Is that possible? I know some basic .htaccess stuff, but since the live page is still up I don't want to fool around on this server without being sure :)
Thx!!
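For context, the usual shape of such a rule set, placed in the document root's .htaccess, looks roughly like the sketch below; /dir1, /dir2 and /~cms/ stand in for the real paths, and this assumes mod_rewrite is available:
RewriteEngine On
# Leave the old directories untouched
RewriteCond %{REQUEST_URI} !^/dir1(/|$)
RewriteCond %{REQUEST_URI} !^/dir2(/|$)
# Don't touch requests that already point at the new site
RewriteCond %{REQUEST_URI} !^/~cms/
# Internally map everything else into the new site's directory
RewriteRule ^(.*)$ /~cms/$1 [L]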
I read the other threads about this topic but I couldn't find anything that I felt applied in my case. It seems pretty basic, though. I have a website built in PHP running on an Apache server. At the moment all the traffic goes over HTTP. The people who paid for the site now want to move it to HTTPS. They bought a certificate and the web server hosts will install it. What changes do I need to make for it to work via HTTPS, besides changing the redirects within the code?
I also found this link, which seems pretty helpful, but I think maybe it's too complex?
Thank you,
Alex
You should change your resource links (such as external JavaScript references like jQuery) wherever the site hard-codes http://domain.name/some-link-here, to the protocol-relative form //domain.name/some-link-here. This will prevent the browser from complaining about mixed content.
For links on the same domain, you can use root-relative or relative URLs.
After that you can add a rule to .htaccess so that any URL accessed on the domain automatically redirects to the HTTPS version. Place the following lines as the first rule in the file.
.htaccess code:
RewriteEngine On
# Only act on requests that did not arrive over HTTPS
RewriteCond %{HTTPS} off
# Permanently redirect to the same host and path over HTTPS
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
The .htaccess rule will also take care of any hard-coded links (towards the same domain/site) in your site that you might have missed.
I have a system that handles multiple domains under one file structure, and each domain needs a different sitemap and robots.txt.
For the sitemap I have set up a redirect and I know it works well. I would like to confirm that the same is possible with robots.txt.
I have added a rewrite rule in .htaccess that sends the request to a PHP page. On this PHP page I detect which domain the user is on and print out the correct content with a text header.
Is this allowed?
Extra info:
I have a CodeIgniter application that is used by domainA and domainB. While domainA should see the robots for domainA, domainB should see the robots for domainB. If I created a single robots.txt in the root of the site, both domainA and domainB would be served the same file, so I have created a separate PHP page to give out the correct robots for domainA and domainB.
In .htaccess I have a rewrite rule similar to:
RewriteRule ^robots\.txt$ func/getRobots [L]
After looking around I found that other people do this:
http://www.aberdeencloud.com/docs/guides/how-use-different-robotstxt-development-and-live-environments
.htaccess and robots.txt rewrite url how to
https://webmasters.stackexchange.com/questions/61654/redirect-google-crawler-to-different-robots-txt-via-htaccess
I just want to be sure that it won't damage the SEO side of the system.
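For reference, a common variant of this setup keys the rewrite on the Host header and points /robots.txt at a static per-domain file instead of a PHP controller. A rough sketch, with hypothetical filenames robots-domainA.txt and robots-domainB.txt in the document root:
RewriteEngine On
# Serve a different static file depending on which domain was requested
RewriteCond %{HTTP_HOST} ^(www\.)?domainA\.com$ [NC]
RewriteRule ^robots\.txt$ robots-domainA.txt [L]
RewriteCond %{HTTP_HOST} ^(www\.)?domainB\.com$ [NC]
RewriteRule ^robots\.txt$ robots-domainB.txt [L]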
You damage your SEO when search engines are not capable of accessing your robots.txt at all.
Not following standard practice for the location of your robots.txt means taking a risk. Some search engines may handle your case well; others may frown on it and find it suspicious.
In one of the links you mention, a solution is suggested that avoids rewriting the URL. I would stick to that.
So my site is ready and it's going live in 2 days. The last thing on the to-do list is:
deny access to /inc/, /classes/ and /sources/. So, as I always do, I investigated, and found that none of the big websites deny access to their private folders.
I've trawled around the web and found no one mentioning this method of denying access by saying 'this directory doesn't exist'.
Is it safe to show 404 error pages for directories?
And if it is safe... how can I do it?
And if not, what is the best way to secure directories?
The best way to prevent access is to actually move those directories outside the document root, so there's no direct path through the web to those files. Some hosters don't allow this, in which case there are 2 options: (1) go to another hoster; there are plenty of proper ones; or (2) do it the hard way: give annoying crackers as little information as possible and go for the standard 404, which you can set up in .htaccess files.
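If moving them above the document root really isn't possible, a minimal sketch of the 404 approach in the site's root .htaccess, assuming the directories are /inc, /classes and /sources as in the question:
# Answer with 404 instead of 403 so the directories appear not to exist
RedirectMatch 404 ^/(inc|classes|sources)(/|$)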