Showing my website in a user's language - PHP

I have a small search engine site and I was wondering if there was any way of displaying my site in the user's language. I am looking for an inventive and quick way that can also reside on just one URL.
I hope you can understand my question.

You could use the HTTP header "Accept-Language" to detect which languages the user has chosen as their preferred ones in their browser.
In PHP, this will be available (if sent by the browser) in $_SERVER, which is an array that contains (amongst other things) HTTP headers sent by the client.
This specific header should be available as $_SERVER['HTTP_ACCEPT_LANGUAGE'].
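For example, here is a minimal sketch that picks the best supported match from that header (the supported-language list and the fallback are made-up assumptions, and the parsing simply trusts the browser's preference ordering rather than sorting by q-values):

<?php
// Languages this example site actually has translations for (assumption).
$supported = array('en', 'fr', 'de');
$lang = 'en'; // fallback

if (isset($_SERVER['HTTP_ACCEPT_LANGUAGE'])) {
    // The header looks like: "fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7"
    foreach (explode(',', $_SERVER['HTTP_ACCEPT_LANGUAGE']) as $part) {
        // Strip the q-value and the region, keep the primary tag ("fr-FR" -> "fr").
        $tag = strtolower(trim(explode(';', $part)[0]));
        $primary = substr($tag, 0, 2);
        if (in_array($primary, $supported, true)) {
            $lang = $primary;
            break; // browsers list languages in preference order
        }
    }
}
// $lang now holds the language code to load translations for.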

I am assuming you already have different versions of the site in various languages. Most sites seem to just ask the user what their language is and then save that in a cookie. You can probably guess a user's language using an IP-to-location tool.
You are probably more interested in this though: http://techpatterns.com/downloads/php_language_detection.php. This PHP script allows you to detect the user's language based on info sent from their browser. It might not be completely accurate though, so you should always have an option to switch the language.
If you don't have translations of your page, you can redirect users to a Google Translate page.

There is a really easy solution for this. Just use Google's Translate Element JS add-on. You drop the JS on the page and Google takes care of the rest.
http://translate.google.com/translate_tools
The only downside is that users cannot fully interact with the site using this. By that I mean they cannot input something in their own language and have you receive the input in yours. Also, searches will have to be done in the site's native language. So really this just depends on what you are trying to accomplish here.

You could use a script which checks for a language cookie.
If the language cookie is set, you can use its value to load the right language vars;
if not, you find out the user's current language in whatever way you prefer. I think there are lots of ways; I don't know which is the best.
Additionally, you would place a form somewhere on the site where the user can click a language, and pass that by POST to a script which then sets a cookie, or overwrites the current cookie if there is one already.
This method obviously works with one URL for all your languages, which I think is quite nice about it...
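A minimal sketch of that cookie approach (the whitelist, cookie name, and per-language include files are made up for illustration; note that setcookie() must run before any output):

<?php
$supported = array('en', 'fr');

// 1. An explicit choice from the language form takes priority.
if (isset($_POST['lang']) && in_array($_POST['lang'], $supported, true)) {
    setcookie('lang', $_POST['lang'], time() + 365 * 24 * 3600, '/');
    $lang = $_POST['lang'];
// 2. Otherwise reuse the cookie from a previous visit.
} elseif (isset($_COOKIE['lang']) && in_array($_COOKIE['lang'], $supported, true)) {
    $lang = $_COOKIE['lang'];
// 3. Otherwise fall back to whatever detection you prefer (Accept-Language, GeoIP, ...).
} else {
    $lang = 'en';
}

// Load the right language vars, e.g. lang/en.php defines the translated strings.
require __DIR__ . "/lang/{$lang}.php";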


Share login between PHP and ASP Classic (VBScript)

I'm trying to update/maintain an older web site that was initially written in Classic ASP/VBScript, and later had PHP pages added. I'd like to set it up so that PHP handles the login, but then that logged-in state can be shared between PHP and ASP/VBScript. Note that the pages and languages are fairly intermingled -- somebody spending time on the site might come across several different pages in each language, in no particular order.
(Eventually I expect it to be completely rewritten in PHP, but I have to eat this elephant one bite at a time; and for now I'm simply trying to improve security.)
Let's assume I've successfully logged in and validated the user in PHP using something like phpass. How do I tell the ASP/VBScript page they just pulled up that they're logged in? How can I best do this securely?
(And thank you for any help!)
You cannot share sessions across Classic ASP/VBScript and PHP as they create/use them differently. My solution isn't that secure but would work:
Log the user in via one of the languages (say PHP)
Pass the initial session variable to a URL and get ASP to look at the query string and then create another session for ASP there.
That would deal with it...although not that secure!
The best answer I've been able to find for this issue was the following. It is specific to sharing a login between Classic ASP and ASP.NET, but the methodology is exactly the same:
As you probably already know, Classic ASP and ASP.NET cannot share the same session state, so you do need to have a mechanism to log from one into the other.
What I would do is: when someone logs in, create a unique GUID that you save in the database for that user. When you jump from one site to the other, pass that GUID into the query string. When you try to auto-log them into the other site, look up that GUID and see if it's attached to anyone. If it is, log them in.
This way you aren't passing anything that a user could guess or decrypt.
Additionally, it's smart to add a timestamp to the database; the GUID should only be valid for a second or two. Log in on the PHP end, then flip over to ASP and check the GUID.
Not totally secure, but appears to be about as secure as I'm going to find.
source: https://stackoverflow.com/a/921575/339440
Edit to add: per comments, also record the user's IP address to the database and compare it on the ASP side. No teleporting allowed!
CORRECTION: In this case "GUID" is a misnomer. What you need here is a random string of characters, not a GUID. A GUID is a semi-random construct with one of a handful of specific formats, and is not applicable here.
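Putting the pieces together, a hedged PHP-side sketch of the handoff (the sso_tokens table, its columns, and the $pdo connection are assumptions made up for this example; random_bytes() requires PHP 7+):

<?php
// After a successful phpass login: create a one-time handoff token.
$token = bin2hex(random_bytes(32)); // random string, not a GUID (see the correction above)

$stmt = $pdo->prepare(
    'INSERT INTO sso_tokens (token, user_id, ip, created_at) VALUES (?, ?, ?, NOW())'
);
$stmt->execute(array($token, $userId, $_SERVER['REMOTE_ADDR']));

// Send the user to the Classic ASP side with the token in the query string.
header('Location: /legacy/login.asp?token=' . urlencode($token));
exit;

On the ASP side you would look up the token, check that created_at is only a second or two old and that the stored IP matches the requester's, log the user in, and delete the row so the token cannot be replayed.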

How to determine whether real users are browsing my site or just crawlers, in PHP

I want to know whether a user is actually looking at my site (I know the page may just be loaded by the browser and displayed, without a human actually looking at it).
I know two methods that will work:
JavaScript.
If the page is loaded by a browser, it will run the JS code automatically, unless the browser forbids it. Then use AJAX to call back to the server.
A 1×1 transparent image in the HTML.
Use the img to call back to the server.
Does anyone know the pitfalls of these methods, or a better method?
Also, I don't know how to detect a 0×0 or 1×1 iframe, which could be used to defeat the above methods.
A bot can access a browser, e.g. http://browsershots.org
The bot can request that 1x1 image.
In short, there is no real way to tell. Best you could do is use a CAPTCHA, but then it degrades the experience for humans.
Just use a CAPTCHA where required (user sign up, etc).
The image way seems better, as JavaScript might be turned off by normal users as well. Robots generally don't load images, so this should indeed work. Nonetheless, if you're just looking to filter out a known set of robots (say Google and Yahoo), you can simply check the HTTP User-Agent header, as those robots will actually identify themselves as robots.
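As an illustration of the image variant, here is a rough sketch of a PHP endpoint that serves the 1×1 pixel and logs the hit (the file name and log format are arbitrary):

<?php
// pixel.php - serve a 1x1 transparent GIF and record that it was requested.
$line = sprintf("%s\t%s\t%s\n",
    date('c'),
    $_SERVER['REMOTE_ADDR'],
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-');
file_put_contents(__DIR__ . '/pixel.log', $line, FILE_APPEND | LOCK_EX);

header('Content-Type: image/gif');
header('Cache-Control: no-store'); // make repeat views hit the server again
// A 43-byte transparent 1x1 GIF.
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');

You would embed it with something like <img src="pixel.php" width="1" height="1" alt="">.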
You can create a Google Webmasters account; it tells you how to configure your site for bots and also shows you how a robot will read your website.
I agree with others here, this is really tough - generally nice crawlers will identify themselves as crawlers, so using the User-Agent is a pretty good way to filter out those guys. A good source for user agent strings can be found at http://www.useragentstring.com. I've used Chris Schuld's PHP script (http://chrisschuld.com/projects/browser-php-detecting-a-users-browser-from-php/) to good effect in the past.
You can also filter these guys at the server level using the Apache config or .htaccess file, but I've found that to be a losing battle keeping up with it.
However, if you watch your server logs you'll see lots of suspect activity with valid (browser) user-agents or funky user-agents so this will only work so far. You can play the blacklist/whitelist IP game, but that will get old fast.
Lots of crawlers do load images (i.e. Google image search), so I don't think that will work all the time.
Very few crawlers will have JavaScript engines, so that is probably a good way to differentiate them. And let's face it, how many users actually turn off JavaScript these days? I've seen the stats on that, but I think those stats are very skewed by the sheer number of crawlers/bots out there that don't identify themselves. However, a caveat is that I have seen that the Google bot does run JavaScript now.
So, bottom line, its tough. I'd go with a hybrid strategy for sure - if you filter using user-agent, images, IP and javascript I'm sure you'll get most bots, but expect some to get through despite that.
Another idea: you could always use a known JavaScript browser quirk to test whether the reported user agent (if it's a browser) really is that browser.
"Nice" robots like those from google or yahoo will usually respect a robots.txt file. Filtering by useragent might also help.
But in the end - if someone wants to gain automated access it will be very hard to prevent that; you should be sure it is worth the effort.
Inspect the User-Agent header of the http request.
Crawlers should set this to something other than a known browser.
Here are the Googlebot headers: http://code.google.com/intl/nl-NL/web/controlcrawlindex/docs/crawlers.html
In PHP you can get the user agent with:
$Uagent=$_SERVER['HTTP_USER_AGENT'];
Then you just compare it with the known headers. As a tip, preg_match() could be handy to do this all in a few lines of code.
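For example (a sketch; the pattern below covers only a few well-known crawlers and is by no means exhaustive):

<?php
$Uagent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

// Case-insensitive match against a few known crawler signatures.
$isBot = (bool) preg_match('/googlebot|bingbot|slurp|duckduckbot|baiduspider/i', $Uagent);

if ($isBot) {
    // Treat as a crawler: skip analytics, serve the plain page, etc.
}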

How do bookmarks work?

I'm interested in how bookmarks work for social network sites like Facebook, for example. When you look at someone's profile it's
www.facebook.com/customname
or, if they didn't make one yet, it's
www.facebook.com/generatedname
Is there a GET request somewhere I'm missing? Is
www.facebook.com/profile.php?key=
hidden in the URL? But how does the server know to interpret the URL to look for someone's profile page? How does it work? Thanks!
Yes, the request is usually hidden using rewrite engines such as mod_rewrite.
As such something like facebook.com/customname is rewritten to facebook.com/profile.php?key=customname, which then internally looks up the correct profile page from the database.
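A minimal .htaccess sketch of such a rule (profile.php and the key parameter mirror the example above; this is an illustration, not Facebook's actual setup):

RewriteEngine On
# Don't rewrite requests for files or directories that actually exist.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Internally map /customname to profile.php?key=customname.
RewriteRule ^([a-zA-Z0-9_.-]+)$ profile.php?key=$1 [L,QSA]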
There is a solution called mod_rewrite, which translates the URL visited by the user (and visible to the user) into the path of the script (along with all the parameters).
Example: when you visit e.g. http://www.facebook.com/ben, the server may actually translate it into www.facebook.com/profile.php?name=ben without you noticing it (because it happens on the server side).
That is how it is done.
But there is still another, loosely related solution that happens on the client side (within the user's browser, not on the server). This solution is called pushState, and it is an HTML5 feature (HTML5 is the new standard supporting application-like behaviours in modern browsers).
Just look at this demonstration (it allows you to change the URL and go back and forth, but if you type the visited URL directly you will see that there is nothing on the server). To do something similar, you will need to learn JavaScript (the language of scripts executed on the browser's side).
As an alternative to pushState, some pages (like Twitter and, as far as I recall, Facebook) use solutions based on the location hash (the part of the URL after #), which lets them maintain compatibility with some deprecated browsers, like IE7 etc.
Maybe this is far more than your question asked for, but you should now be pretty well informed about how the URL visible to the user may differ from what is really invoked.
If you have any additional questions, let me know.
They probably use .htaccess or a similar mechanism to redirect all requests to a single entry file. That file starts processing the request and can also check whether there is an account for the customname that was specified in the URL.
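A rough sketch of such an entry file in PHP (getUserByName() is a hypothetical lookup helper, and the key parameter assumes a rewrite rule like the one above):

<?php
// profile.php - single entry point for vanity URLs.
$name = isset($_GET['key']) ? $_GET['key'] : '';

// Look the custom name up in the database; getUserByName() is made up here.
$user = getUserByName($name);

if ($user === null) {
    http_response_code(404);
    echo 'No such profile.';
    exit;
}

// ...otherwise render the profile page for $user.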

2 IP addresses for 1 website

I have a website that is entirely in French. I have been planning an English version for a few months. Everything works fine: people can switch their language and the session handles their language.
The problem is that I have now bought a domain that is different from the French one, and I guess it will need to point to the same sub-domain on the host.
I guess that I will need to write some code to check the domain name: if the user is reaching the server through the English one, switch the session to the English language, otherwise use French. Am I wrong or not?
I think I will proceed that way, but many other parts of the website might be completely different, like ads and images. Is the best way to handle a multilanguage website to make many comparisons of the language for these ads and images? Or to make a duplicate of the website on another sub-domain and link the new domain to the new folder? (I really think that duplicating will be worse in the long run.)
Any advice would be appreciated.
If what you mean is a new domain name, point it to the same server as your first domain, and do the language checking (or whatever is required) in the PHP script:
if ($_SERVER["HTTP_HOST"] == "my_first_domain_name.fr")
{ // use french site }
elseif ($_SERVER["HTTP_HOST"] == "my_second_domain_name.fr")
{ // use english site }
You could also think about a solution that splits French content into a directory named /fr and English content into /en.
Every site that I have built to support multiple languages detected the user's language and then stored it in their session information. How you detect their language is up to you (from their IP, defaulting to a language, etc.), but make sure you provide the user an easy way to change languages. Then, based on the session information, we would update the site copy (i.e., put up a different translation), experience (e.g., only show products or news stories from that locale), etc.
Having multiple copies of the site on different [sub-]domains is a viable option, though one I don't like: you will have to support and release to all those different domains.
You could also set the session variable if the user comes from your new domain. Just have both domains point to the same place.
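A sketch of that idea (the domain names are placeholders):

<?php
session_start();

// Set the language from the domain only on the first visit; after that the
// session (and any explicit user choice) wins.
if (!isset($_SESSION['lang'])) {
    $_SESSION['lang'] = ($_SERVER['HTTP_HOST'] === 'example.co.uk') ? 'en' : 'fr';
}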
You should aim to duplicate as little as possible; duplicating your site will lead to maintainability problems in the future.
You can point both domain names at the same server IP, and have conditional server-side code to determine which content is served to the user.
In PHP, the variable $_SERVER['SERVER_NAME'] is populated with the server name from the client's HTTP request (e.g. 'google.com').
If people access the same PHP script via different domain names, you could use this value to decide which content to present (e.g. have an HTML template, with the relevant content populated from the database according to the site).
In terms of advertisements, you could do the same. Something like Google ads will likely take care of this for you.
More generally, we're talking about virtualhosts here. There are lots of different ways to achieve what you're after, and methods vary according to the specifics of the problem, platform, hosting constraints etc.
A lot of sites base the default language choice (and advertisements, currencies used, etc.) on GeoIP, falling back to some default.
There are a lot of ways to cut this cookie. Note that since sessions are controlled by cookies (by default at least), your users will get different sessions depending on which domain they request.

Changing web content based on browser type

I'm writing a web application and I'd like to work out what type of browser/OS the request is coming from, and customise the returned content accordingly. So if someone visits the site from an iPhone/Android, they get a more streamlined experience, or if it's a desktop, they get the full version. I will pretty much take a completely different path, rather than try to mix the content together.
What is the recommended approach for this in ASP.NET/IIS and PHP? Is there a single place I can catch incoming HTTP requests, make a decision, then redirect? Or is this usually done on a page-by-page basis? Any gotchas I should look out for?
Edit: A good point was made to make sure there is a link to the full version on the reduced version. That's a good point, but it raises the problem that once the user makes this choice, all future redirections now have to point to the full version. I'd really rather be doing all of this in one place.
Cheers,
Shane
ASP.NET has a built-in browser detection mechanism. It's driven by a fully extensible collection of XML files (*.browser) that contain regular expressions for matching the incoming User-Agent string and associated properties for the matched agents.
You can access the properties from the Request.Browser object; you can also tag control properties based on browser specifics.
There's a bunch of info on the Web about this -- I also cover it in detail in my book: Ultra-Fast ASP.NET.
Not a direct answer, but it's worth checking out CSS media types. You can specify the handheld type to streamline the page for phones and other small-screened devices.
http://www.w3.org/TR/CSS21/media.html
You could take a look at the User-Agent header in the HTTP request and redirect accordingly.
In PHP that would be $_SERVER['HTTP_USER_AGENT'].
You should however watch out that you don't write a lot of duplicate code when doing this.
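A hedged PHP sketch of that idea, including a cookie so a user's "full version" choice survives later requests (the user-agent pattern is rough and the names are made up; this would run before any output, in a common include or front controller):

<?php
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
// Rough mobile check; real-world patterns are far more involved.
$isMobile = (bool) preg_match('/iphone|ipod|android|blackberry|mobile/i', $ua);

// Honour an explicit "full site" choice so the user isn't redirected forever.
if (isset($_GET['full'])) {
    setcookie('full_site', '1', time() + 30 * 24 * 3600, '/');
} elseif ($isMobile && empty($_COOKIE['full_site'])) {
    header('Location: /mobile/');
    exit;
}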
For ASP.NET applications you can check out the Global.asax file and Session_BeginRequest event.
You should probably look at Conditional Comments:
http://msdn.microsoft.com/en-us/library/ms537512%28VS.85%29.aspx
