Do the spiders indexing your website (Googlebot...) have a "culture"? - php

This is an SEO question:
I have the choice to display a page's title according to the culture of the visitor.
If the visitor is English:
<title>
<?php if ($sf_user->getCulture() == 'en') : ?>
Hello, this is an english website
<?php else: ?>
Bonjour, ceci est un site français
<?php endif ?>
</title>
Do bots/spiders have a culture?
Does that mean that on Google UK my website's page will be:
"Hello, this is...."
and on Google France it will be:
"Bonjour...."
Thank you
EDIT:
Anyone visiting my website will see it in English, except visitors from France, Belgium, and maybe Canada. This can be done because getCulture() returns the browser's accepted and preferred languages.
EDIT2:
When a user opens my website, the culture is set based on HTTP_ACCEPT_LANGUAGE:
<?php
// Pick the best match between the browser's Accept-Language header
// and the cultures the site supports, then store it in the session
$culture = $request->getPreferredCulture(array('en', 'fr'));
$this->getUser()->setCulture($culture);
$this->getUser()->isFirstRequest(false); ?>

Please see Working with multi-regional websites on the official Google Webmaster blog. The best way to handle multiple languages is not to dynamically return different languages, but rather to have distinct domains or distinct URLs for each language. If you want to give visitors a single landing page, consider having that page redirect to the language-specific page. Also, to maximize crawling, consider having links that let a user easily switch to the different language versions of the same page.
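Once each language lives at its own URL, you can also advertise the variants to crawlers with rel="alternate" hreflang links. Something like this (untested; the URLs are placeholder examples):

<?php
// Placeholder per-language URLs for the current page; adapt to your routing
$alternates = array(
    'en' => 'http://www.example.com/en/hello',
    'fr' => 'http://www.example.com/fr/bonjour',
);
// Each link tag tells crawlers where the other language versions of this page live
foreach ($alternates as $lang => $url) {
    printf('<link rel="alternate" hreflang="%s" href="%s" />' . "\n", $lang, $url);
}
?>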

A bot views the page in the default localization you've set up, since it's not logged in. (How would your page know which country a visitor comes from? You might be able to hack something in using a geo-IP lookup, if you wanted.)
How does your site appear to non-registered visitors?
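If you did want to try that geo-IP hack, a rough sketch using the PECL geoip extension might look like this; note it only changes what human visitors see, since a bot still gets your default:

<?php
// Rough geo-IP sketch; requires the PECL geoip extension
$culture = 'en'; // default for bots and unknown locations
if (function_exists('geoip_country_code_by_name')) {
    // Returns a two-letter country code, or false if the address can't be resolved
    $country = geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);
    // Serve French to France, Belgium and Canada, per the asker's EDIT above
    if (in_array($country, array('FR', 'BE', 'CA'), true)) {
        $culture = 'fr';
    }
}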

The googlebot indexes the language it finds on your site without any login or registration. Therefore if the default view of your site is English, you'll only have English content in the Google index. This post gives more background on how sites are crawled.
The key is to provide links on your site that the bot can follow which will lead it to your content in all its various languages.
In answer to your question: no, the googlebot does not have a culture, since culture is determined by your application and a user's preference within it.
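For the links the bot can follow, plain anchors to each language version are enough, for example (URLs are placeholders):

<a href="/en/hello">English</a> | <a href="/fr/bonjour">Français</a>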

Related

Google search doesn't show Arabic home page search results in a multilanguage website built with Yii1

I have built a multilanguage website using the Yii1 PHP framework; it supports both Arabic and English. Every URL on the site has the form www.example.com/lang/(title_of_page OR something like a slug for articles/news),
except the home page, which has the same URL for both English and Arabic: www.example.com. The user can change the language, so the language of the site changes and the page reloads in the other language, but it keeps the same URL.
Problem: the Arabic home page doesn't appear in Google Arabic search results, but the English one does.
I have used an online XML-sitemap tool to make a sitemap file from the website's URLs, but I found that none of the Arabic URLs could be crawled.
Does this problem appear because I have the same home page URL for every language, or could there be another reason?
I'm no SEO master, but the cause may be that the site's language depends on cookies, and I don't know how Google handles that.
I searched a bit for official information and found this page from Google, which states:
Keep the content for each language on separate URLs. Don’t use cookies
to show translated versions of the page. Consider cross-linking each
language version of a page. That way, a French user who lands on the
German version of your page can get to the right language version with
a single click.
So the answer is simple: don't use the same URL when changing language on the home page. I don't know your site or what its main language is, but I think you should serve the primary language at the www.example.com URL and secondary languages under a base URL like www.example.com/lang/.
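In Yii1, that URL scheme can be sketched with urlManager rules, roughly like this (the patterns and routes are assumptions, not your actual config):

// config/main.php — rough sketch; adjust routes to your controllers
'urlManager' => array(
    'urlFormat' => 'path',
    'showScriptName' => false,
    'rules' => array(
        ''                      => 'site/index', // primary-language home: www.example.com
        '<lang:ar>'             => 'site/index', // Arabic home: www.example.com/ar
        '<lang:(en|ar)>/<slug>' => 'site/page',  // inner pages keep their language prefix
    ),
),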

Let facebook choose the right language with my share link

I have a multi-language website (en_US + fr_FR) that I made using CakePHP.
The first time a user comes to my website, the language is defined using $_SERVER['HTTP_ACCEPT_LANGUAGE'], or the default language (en_US) if $_SERVER['HTTP_ACCEPT_LANGUAGE'] is not set.
So the problem is that my Open Graph tags differ depending on the language, but the Facebook scraper doesn't send any $_SERVER['HTTP_ACCEPT_LANGUAGE'] value... so it will always get the default language (en_US), even if the user who clicked the share link was French.
So my question is: how can I force Facebook to choose a different language when scraping my webpage?
I already took a look at this URL: https://developers.facebook.com/docs/opengraph/guides/internationalization/ but I didn't understand any of it...
Please help!
(Actually, I'll have to do the same thing with the Google+ share link too... but Facebook is more important right now.)
EDIT:
My webpage URL doesn't change when the language changes. It's always http://my-website.com/
EDIT 2:
I think I managed to make it work.
The answer really was in the doc (https://developers.facebook.com/docs/opengraph/guides/internationalization/), but it was really hard to understand.
But now, how can I change my language on Facebook to check that it really works (because right now it always gives me the French translation instead of the default language)?
Actually, I had to add a way to set the website language using the URL :
http://my-website.com/en/ or http://my-website.com/?lang=en
It is much easier this way to share my website in different languages, AND it is a recommendation from Google.
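For reference, the doc's mechanism (as I understand it) is to list alternate locales with og:locale:alternate; Facebook then re-scrapes the page once per alternate locale, passing an fb_locale query parameter, so you can serve translated tags on those requests. A rough sketch with placeholder values:

<?php
// Facebook re-requests the page with ?fb_locale=fr_FR for each alternate listed
$fbLocale = isset($_GET['fb_locale']) ? $_GET['fb_locale'] : 'en_US';
$locale   = ($fbLocale === 'fr_FR') ? 'fr_FR' : 'en_US';
$titles   = array('en_US' => 'My website', 'fr_FR' => 'Mon site web'); // placeholders
?>
<meta property="og:locale" content="<?php echo $locale; ?>" />
<meta property="og:locale:alternate" content="fr_FR" />
<meta property="og:title" content="<?php echo htmlspecialchars($titles[$locale]); ?>" />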

Multi-Lingual Site using Cookies

I'm a newbie to PHP/JavaScript/jQuery, etc. How do I implement cookies on a multilingual/multiregional (wordpress.org) website laid out with the architecture explained below?
http://domain.com/ takes users to a page that asks which country they are in; they have the option of four different countries: Canada, UK, Australia, and US. If, for example, they select Canada, they are taken to http://ca.domain.com/, which asks for a language: English, French, or Spanish, with a dropdown for other languages below. If they click English, they are taken to the final result, a WordPress blog at http://ca.domain.com/en/
How may I set cookies so that the next time they visit http://domain.com/ (or even jump to http://ca.domain.com/) they are immediately sent to the WordPress blog they saw last time? Also, would these cookies work if the user is a mobile client (not the same one set on their desktop)? And can I also make a link back to the language and region selection process?
Thanks in advance!
By the way, I am using WP Multisite for the countries, and I may set up some sort of PHP translation at the end of the subdomain multisite URLs.
Extension to question: Remember language selection - then redirect to home page in subsequent visits
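One way to sketch the cookie flow described above is plain PHP run before WordPress routes the request (the cookie and parameter names here are assumptions):

<?php
// Sketch: remember the chosen country + language in a domain-wide cookie,
// then send returning visitors straight to "their" blog
$countries = array('ca', 'uk', 'au', 'us');
$languages = array('en', 'fr', 'es');

if (isset($_GET['region'], $_GET['lang'])
    && in_array($_GET['region'], $countries, true)
    && in_array($_GET['lang'], $languages, true)) {
    // '.domain.com' makes the cookie readable on every country subdomain;
    // mobile browsers handle cookies the same way, so mobile clients work too
    setcookie('blog', $_GET['region'] . '/' . $_GET['lang'],
              time() + 31536000, '/', '.domain.com');
    header(sprintf('Location: http://%s.domain.com/%s/', $_GET['region'], $_GET['lang']));
    exit;
}

// A "?choose=1" link brings users back to the region/language selection screens
if (isset($_COOKIE['blog']) && !isset($_GET['choose'])) {
    $parts = explode('/', $_COOKIE['blog'], 2);
    if (count($parts) === 2
        && in_array($parts[0], $countries, true)
        && in_array($parts[1], $languages, true)) {
        header(sprintf('Location: http://%s.domain.com/%s/', $parts[0], $parts[1]));
        exit;
    }
}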

Same Google +1 button likes count for same page in different languages

I have a website that is available in two languages, English and Portuguese.
The website is configured so that the Google +1 button's like count is the same no matter which language the reader is viewing the site in. But this gives me a problem, because I must choose only one URL for the Google +1 button, which will be in only one of the available languages, Portuguese or English, not both. Examples of the URLs I use to configure the Google +1 button URL (href) are below:
Portuguese URL: www.website.net/the-page
English URL: www.website.net/en/the-page
With this, when the user clicks the Google +1 button, she ends up sharing the page in the language that was configured in the Google +1 button, which may not be the user's preferred language.
To make things clearer, these are currently the possible scenarios for liking my website with the Google +1 button:
1. Google +1 configured with the English URL version: Brazilian users would share the post/page in English (BAD!), and American users would share the post in English (OK).
2. Google +1 configured with the Portuguese URL version: Brazilian users would share the post/page in Portuguese (good), but American users would share the post in Portuguese (not OK).
How can I (if it is possible) make the Google +1 button's like count the same for both languages and still let the user share the page in his/her own preferred language?
I don't think there is any way to have the +1 button share different URLs but keep the same count, so you need some way of determining the language to display that is not based on the URL. The best way to do this is to examine the Accept-Language header of the HTTP request and serve up the Portuguese page if the language is pt, and the English version otherwise. Something like the following (untested) code on www.website.net/the-page:
// Accept-Language looks like "pt-BR,pt;q=0.9,...", so test its prefix only
$header = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : '';
if (strncasecmp($header, 'pt', 2) !== 0) {
    header('Location: /en/the-page');
    exit; // stop the script after the redirect
}
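Accept-Language usually carries several weighted languages (e.g. "pt-BR,pt;q=0.9,en;q=0.8"), so a more careful sketch picks the highest-ranked language you support (the function name and supported list are assumptions, untested in the spirit of the above):

<?php
// Pick the highest-ranked supported language from the Accept-Language header
function preferred_language(array $supported, $default = 'en')
{
    $header = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : '';
    $bestLang = $default;
    $bestQ    = 0.0;
    foreach (explode(',', $header) as $part) {
        $pieces = explode(';q=', trim($part));          // "pt;q=0.9" -> array("pt", "0.9")
        $lang   = strtolower(substr($pieces[0], 0, 2)); // "pt-BR"    -> "pt"
        $q      = isset($pieces[1]) ? (float) $pieces[1] : 1.0;
        if ($q > $bestQ && in_array($lang, $supported, true)) {
            $bestLang = $lang;
            $bestQ    = $q;
        }
    }
    return $bestLang;
}

if (preferred_language(array('en', 'pt')) !== 'pt') {
    header('Location: /en/the-page');
    exit;
}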

One address (www.domain.com) for the whole website? Do you recommend it for SEO?

Hello, I'm planning to develop a communication platform fully in Ajax and long-polling.
There will be no full page reloading!
So the website address would always be www.domain.com.
Do you recommend that for SEO?
Forget about SEO; what about your visitors? Will people be able to bookmark a page on your site and get back to where they want to be? Will they be able to email a link to their friends to show them something?
It's not just Google that likes to have direct URLs to visit. Those direct URLs are vital for SEO, but they're also important for your human visitors.
Google has a full specification on how to make ajax-powered sites like this crawlable.
The trick is to update window.location.hash with an escaped fragment whenever you want specific content to be linkable, and treated as its own page, without having to reload. For example, Twitter rewrites their URIs from http://twitter.com/user to http://twitter.com/#!/user.
From an SEO standpoint these are both valid, and each will be regarded as its own separate page. They can be directly linked to, and be used in browser history navigation. If you update your meta-data (keywords, description etc.) and sitemaps accordingly, SEO will be the least of your worries.
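Under that scheme (since deprecated by Google), the crawler rewrites /#!user into /?_escaped_fragment_=user, so the server can answer with a static HTML snapshot. A minimal sketch:

<?php
// Googlebot requests /?_escaped_fragment_=user for the pretty URL /#!user
if (isset($_GET['_escaped_fragment_'])) {
    $page = preg_replace('/[^a-z0-9_-]/i', '', $_GET['_escaped_fragment_']);
    // Serve the same markup your Ajax call would have injected, as plain HTML;
    // this placeholder just echoes the page name back
    echo '<html><body><h1>' . htmlspecialchars($page) . '</h1></body></html>';
    exit;
}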
As long as you can generate a fully qualified link for each page, you should be fine if you generate a sitemap including those links and submit it to Google.
If you look at Twitter and FB, they use #! in the URL so Google still crawls the pages.
If it's mostly using Ajax for content population, loading and state changes, then it's probably a bad model for SEO purposes anyway. Somewhat of a moot point by nature, no?
