I want to block search engines like Google and Yahoo from crawling user subdomains like user.example.com. How can I do it?
Use a robots.txt file on your web server.
In the subdomain's document root, put a robots.txt file that looks like this:
User-agent: *
Disallow: /
Most sites explicitly specify something in robots.txt, because that is the standard way to tell search engines what you don't want crawled.
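If all the user subdomains point at the same document root as the main site, one way to serve a blocking robots.txt only to those hosts is an Apache rewrite rule in the root .htaccess (a minimal sketch, assuming mod_rewrite is available; example.com and the robots-blocked.txt filename are placeholders):

RewriteEngine On
# Any host other than the main domain gets the blocking file instead of robots.txt
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-blocked.txt [L]

Here robots-blocked.txt would contain exactly the two lines above (User-agent: * / Disallow: /), so every compliant crawler skips the whole subdomain.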
Related
Two HTML web pages generated using PHP.
One script that creates a footer which is referenced by both web pages.
Does a crawler consider the footer HTML content once or twice?
Crawlers don't see or care about server-side code. They only see the content output for a requested URL, just like any other user-agent.
On our website, we use page names with GET parameters for URLs, for example "page.php?index=43". We want to use rewrite rules in the .htaccess file so that we can type "/pages/some-page-title" for the same effect. However, we also want to keep our Google rankings from the previous URLs. Is there a way we can achieve that?
If you make a proper redirect, with a 301 return code, from the old URLs to the new URLs, the ranking should not be affected.
https://support.google.com/webmasters/answer/83105?hl=en
However, this is every SEOer's nightmare, because "what if..." :) You just have to trust Google's promise that rankings will be kept when you create proper, working redirects.
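A minimal .htaccess sketch of both halves, assuming Apache mod_rewrite; the numeric-only /pages/43 form is a simplification, since mapping a text slug like "some-page-title" back to an index would need a lookup in the application:

RewriteEngine On
# 301 the old query-string URLs to the new path form (matching THE_REQUEST avoids a redirect loop)
RewriteCond %{THE_REQUEST} \s/page\.php\?index=([0-9]+)\s [NC]
RewriteRule ^ /pages/%1 [R=301,L]
# Internally serve the new URLs from the existing script
RewriteRule ^pages/([0-9]+)$ page.php?index=$1 [L]

With both rules in place, old links keep working via the 301 and the new URLs are what Google indexes going forward.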
I have a relatively straightforward task, but one I need help with so I don't mess up the site's SEO.
I have a current set of products which are assigned a category: /products/internal/$1 - $1 being any individual product.
However, what I need to do is 301 redirect everything from /products/ (with the "s") to /product/ (without it), so:
/products/internal/$1 to /product/internal/$1
As ever, I've read a few other threads, but I'm unsure what applies when you know the beginning structure of the URL.
RedirectMatch 301 /products/internal/(.*) /product/internal/$1
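Placed in the root .htaccess (or the vhost configuration), this uses mod_alias: the (.*) captures everything after /products/internal/ and $1 re-inserts it after /product/internal/, while the 301 status marks the redirect as permanent so search engines carry the old URLs' rankings over.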
I have coded a web directory that uses GET to generate all pages of information from a database. Is it possible for these pages to be indexed individually by a search engine?
Yes. If you can link directly to a page, then a search engine can index it (unless you take other steps to explicitly exclude it).
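If some of those database-driven pages aren't reachable through normal links, one common way to hand them to crawlers is an XML sitemap that lists every URL (a minimal sketch; example.com, page.php and the id parameter are placeholders for whatever the directory actually uses):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/page.php?id=1</loc></url>
  <url><loc>http://example.com/page.php?id=2</loc></url>
</urlset>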
Hello, I need to find all websites that contain the word "tomer" in their URL. I need this for some copyright issues concerning my company.
For example, when I search Google for "tomer", it should return only "tomercompany.com", "anothertomercompany.com", etc. How can I do that? Any ideas will be appreciated, thanks.
Use the inurl: operator
If you include inurl: in your query, Google will restrict the results to documents containing that word in the URL.
https://www.google.no/#q=inurl:tomer