I have a problem with UTF-8 strings in PHP on my Debian server.
Update in details
I've done a little more testing and the situation is now more specific. I updated the title and details to better fit the situation. Thanks for the responses, and sorry that the problem wasn't described clearly. The following script works fine on my local Windows machine but not on my Debian server:
<?php
header("Content-Type: text/html; charset=UTF-8");
$string = '<html><head></head><body>UTF-8: ÄÖÜ<br /></body></html>';
$document = new DOMDocument();
@$document->loadHTML($string);
echo $document->saveHTML();
echo $string;
As expected on my local machine the output is:
UTF-8: ÄÖÜ
UTF-8: ÄÖÜ
On my server the output is:
UTF-8: ÄÖÜ
UTF-8: ÄÖÜ
I wrote the script in Notepad++ in UTF-8 without BOM and transferred it over SSH. As noticed by guido, the string itself is properly UTF-8 encoded. There seems to be a problem with PHP DOM or maybe libxml, and the reason must be some setting, since it is machine-dependent.
Original question
I work locally with XAMPP on Windows and everything is fine. But when I deploy my project to the server, UTF-8 strings get all messed up. In fact, when I upload this test script
echo utf8_encode('UTF-8 test: ÄÖÜ');
I get "ÃÃÃ". Also when I connect with putty to the server I cannot write umlauts (ÄÖÜ) correctly in the shell. I have no idea if this issue is even PHP related.
Check your Apache AddDefaultCharset setting.
On standard Debian Apache installations, the setting can be modified in /etc/apache2/conf.d/charset.
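On a default Debian install that file contains little more than a single directive; a minimal sketch (the exact path may differ between releases):
# /etc/apache2/conf.d/charset
# Either force UTF-8 for every response:
AddDefaultCharset UTF-8
# ...or disable the default so headers/meta tags set by your scripts take effect:
# AddDefaultCharset Off
Remember to reload Apache after changing it.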
Please verify that your file is byte-for-byte the same as on your local machine. FTP transfer in text mode could have messed it up. You may want to try binary mode.
EDIT: answer for updated question:
<?php
header("Content-Type: text/html; charset=UTF-8");
$string = '<html><head>'
.'<meta http-equiv="content-type" content="text/html; charset=utf-8">'
.'</head><body>UTF-8: ÄÖÜ<br /></body></html>';
$document = new DOMDocument();
@$document->loadHTML($string);
echo $document->saveHTML();
echo $string;
?>
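Another workaround sometimes used with DOMDocument (an alternative sketch, not part of the original answer) is to hint the encoding before parsing by converting the multibyte characters to entities:
<?php
header("Content-Type: text/html; charset=UTF-8");
$string = '<html><head></head><body>UTF-8: ÄÖÜ<br /></body></html>';
$document = new DOMDocument();
// Entities are plain ASCII, so libxml cannot misinterpret the bytes
@$document->loadHTML(mb_convert_encoding($string, 'HTML-ENTITIES', 'UTF-8'));
echo $document->saveHTML();
?>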
I suspect your input string may be already UTF-8. Try:
setlocale(LC_CTYPE, 'de_DE.UTF-8');
$s = "UTF-8 test: ÄÖÜ";
if (mb_detect_encoding($s, "UTF-8") == "UTF-8") {
    echo "No need to encode";
} else {
    $s = utf8_encode($s);
    echo "Encoded string $s";
}
Are you explicitly sending a content-type header? If you omit it, it's likely that Apache is sending one for you. If the file is served with a Latin-1 encoding (by Apache) and the browser reads it as such, then your UTF-8 characters will be malformed.
Try this:
<?php
echo "Drop some UTF-8 characters here.";
Then this:
<?php
header("Content-Type: text/html; charset=UTF-8");
echo "Drop some UTF-8 characters here.";
The second should work, if the first doesn't. You may also want to save the file as a UTF-8-encoded file, if it's not already.
If your database characters are messed up, try setting the (My)SQL connection encoding.
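A minimal sketch of doing that with mysqli (the credentials are placeholders):
<?php
$db = new mysqli('localhost', 'user', 'password', 'mydb');
// Tell MySQL that this connection sends and expects UTF-8
$db->set_charset('utf8');
// With the older mysql_* API the equivalent is: mysql_query("SET NAMES 'utf8'");
?>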
Try changing the default charset on the server in your php.ini file:
default_charset = "UTF-8"
Also, make sure you are sending out the proper Content-Type headers as UTF-8.
In my experience with UTF-8, if you properly configure the PHP mbstring module and use the mbstring functions, and also make sure your database connection uses UTF-8, then you won't have any problems.
The DB part can be done for MySQL with the query "SET NAMES 'utf8'".
I usually start an output buffer and let mbstring handle the buffer. This is what I use in production websites and it is a very solid approach. Then send the buffer when you have finished rendering your content.
Let me know if you would like the sample code for that; a minimal sketch is below.
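Roughly (a sketch, assuming the mbstring extension is enabled and nothing has been echoed yet):
<?php
// Make mbstring treat internal strings and HTTP output as UTF-8
mb_internal_encoding('UTF-8');
mb_http_output('UTF-8');
// Route all echoed output through mbstring's output handler
ob_start('mb_output_handler');

header('Content-Type: text/html; charset=UTF-8');
echo 'UTF-8 test: ÄÖÜ';

// Send the buffer once the content is fully rendered
ob_end_flush();
?>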
Another easy trick to see whether the wrong headers are being sent out by PHP or the web server is to use the View → Encoding menu in your browser and check whether it is UTF-8. If it's not, and switching it to UTF-8 makes everything look OK, then it is a problem with your headers or content type. If it is already UTF-8 and the text is still messed up, then something is going wrong in your code or DB connection. If you are using MySQL, make sure the tables and columns involved are also UTF-8.
The cause of the problem was an old version of libxml (2.6.32) on the server. On the development machine it was 2.7.3. I upgraded libxml to an unstable package, resulting in version 2.7.8. The problems are now gone.
Related
I'm on PHP 5.5.8 and saw that I was getting weird data; it turns out that when I decode HTML entities I get some kind of corrupted characters.
echo html_entity_decode('&eacute;');
Displays ├® in my terminal and Ã© in a browser, when it should be é. I've used html_entity_decode('&eacute;', ENT_QUOTES, 'UTF-8') and defined my default charset to be UTF-8 as well. The thing is, I've tried it on another server and it worked fine, but on my local environment it's failing, so it's probably something to do with some settings, but I don't know where to look. Can anyone help?
It seems that you have a mismatch of encodings. Check your php.ini: if default_charset there is not utf-8, that will mess things up.
You can also set it at run time with ini_set:
ini_set ( 'default_charset', 'utf-8' );
I've been searching for a long time, but without any helpful result.
I'm developing a PHP project using Eclipse on an Ubuntu 11.04 VM. Everything works fine, and I never needed to look at the file encoding. But after deploying the project to my server, all content was shown with the wrong encoding. After a manual conversion to UTF-8 with Notepad++ my problems were solved.
Now I want to change it in my Ubuntu VM, too. And there's the problem. I've checked the preferences in Eclipse, but every property is set to UTF-8: general content types, workspace, project settings, everything ...
If I check the encoding in the terminal, it says "test_new.dat: text/plain; charset=us-ascii". All files are saved in ASCII format. If I create a new file in the terminal (with "touch"), it's the same.
Then I've tried to convert the files with iconv:
iconv -f US-ASCII -t UTF8 -o test.dat test_new.dat
But the encoding doesn't change. PHP files in particular seem to be resistant. I have some *.ini files in my project for which the conversion works?!
Any idea what to do?
Here are my locale settings of Ubuntu:
LANG=de_DE.UTF-8
LANGUAGE=de_DE:en
LC_CTYPE="de_DE.UTF-8"
LC_NUMERIC="de_DE.UTF-8"
LC_TIME="de_DE.UTF-8"
LC_COLLATE="de_DE.UTF-8"
LC_MONETARY="de_DE.UTF-8"
LC_MESSAGES="de_DE.UTF-8"
LC_PAPER="de_DE.UTF-8"
LC_NAME="de_DE.UTF-8"
LC_ADDRESS="de_DE.UTF-8"
LC_TELEPHONE="de_DE.UTF-8"
LC_MEASUREMENT="de_DE.UTF-8"
LC_IDENTIFICATION="de_DE.UTF-8"
LC_ALL=
I was also wondering about character encoding and found something that might be useful here.
When I create a new empty .txt file on my Ubuntu 12.04 and ask for its character encoding with "file -bi filename.txt", it shows me charset=binary. After opening it, writing something like "haha" inside, and saving with "save as" while explicitly choosing UTF-8 as the character encoding, it strangely did not report charset=UTF-8 when I asked again, but returned charset=us-ascii. That already seemed strange. It got even stranger when I did the whole thing again, this time including some German-specific characters (ä in this case) in the file, and saved again (this time without "save as", I just pressed save). Now it said charset=UTF-8.
It therefore seems that at least gedit checks the file and downgrades from UTF-8 to us-ascii if there is no need for UTF-8, since the file can be encoded as us-ascii.
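You can see the same effect from the terminal (a sketch; the exact output of file may vary by distribution):
printf 'haha\n' > test.txt
file -bi test.txt    # text/plain; charset=us-ascii  (plain ASCII is enough)
printf 'hähä\n' > test.txt
file -bi test.txt    # text/plain; charset=utf-8     (the ä forces real UTF-8)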
Hope this helped a bit, even though it is not PHP-related.
Greetings
UTF-8 is compatible with ASCII. An ASCII text file is therefore also valid UTF-8, and a conversion from ASCII to UTF-8 is a no-op.
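You can verify that in the terminal (a sketch using od to dump the bytes):
printf 'abc' | od -An -tx1                                # 61 62 63
printf 'abc' | iconv -f US-ASCII -t UTF-8 | od -An -tx1   # 61 62 63 -- identical bytes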
I'm trying this code (on my local web server)
<?php
echo 'the word is / думата е '.$_GET['word'];
?>
but I get a corrupted result when I enter ?word=проба
the word is / думата е ����
The document is saved as 'UTF-8 without BOM' and the headers are also UTF-8.
I have tried urlencode() and urldecode(), but the effect was the same.
When I upload it to the web server, it works fine...
What if you try sending an HTTP Content-Type header, to indicate to the browser which encoding/charset your page is generating?
For instance, something like this might help:
header('Content-type: text/html; charset=UTF-8');
echo 'the word is / думата е '.$_GET['word'];
Of course, this is if you are generating HTML -- you probably are.
Considering there is a configuration setting at the server level that defines which encoding is sent by default, maybe the default encoding on your server is OK -- while the one on your local server is not.
Sending such a header yourself would solve the problem: it would make sure the encoding is always set properly.
I suppose you are using the Apache web server.
There is a common problem with the Apache configuration: a line with "AddDefaultCharset" in the config should be commented out (add # at the beginning of the line, or replace the line with "AddDefaultCharset Off"), because it "overrides any encoding given in the files in meta http-equiv or xml encoding tags".
In my current installation (Apache 2 on Ubuntu Linux) the line is found in "/etc/apache2/conf.d/charset", but in other (Linux/Unix) setups it can be in "/etc/apache2/httpd.conf" or "/etc/apache/httpd.conf" (if you are using Apache 1). If you don't find it in these files, you can search for it with "cd /etc/apache2 ; grep -r AddDefaultCharset *" (for Apache 2 on Unix/Linux).
Take a look at Changing the server encoding. An excellent read!
Cheers!
If you receive $_GET from AJAX, make sure that your blablabla.js file is UTF-8 encoded. You can also use iconv("cp1251","utf8",$_GET['word']); to display $_GET['word'] in UTF-8.
I just had this issue, and it sometimes happens if you filter the GET variable with htmlentities(). It seems that this function converts Cyrillic characters into weird stuff.
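A sketch of what probably happens (assuming a PHP version before 5.4, where htmlentities() defaults to ISO-8859-1 when no charset is given):
<?php
header('Content-Type: text/html; charset=UTF-8');
$word = 'проба'; // stands in for $_GET['word']
// Without a charset, old PHP reads the UTF-8 bytes as ISO-8859-1 and
// turns each byte into an entity -> "weird stuff" in the output
echo htmlentities($word), '<br />';
// With the charset given, the Cyrillic characters pass through untouched
echo htmlentities($word, ENT_QUOTES, 'UTF-8'), '<br />';
// htmlspecialchars() is usually enough for escaping and also leaves non-ASCII alone
echo htmlspecialchars($word, ENT_QUOTES, 'UTF-8');
?>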
I've got a very strange problem. I have one PHP website running on two servers: one on Apache (Linux) and one on IIS (Windows). The Linux server is just for a demo; IIS is the actual host I need to use. Even with exactly the same code and database, there are no  characters on the Linux server, but on IIS I get  characters everywhere. I checked all the meta tags, they're utf-8. The database collation is utf-8 also. The MySQL database does contain those  characters, but somehow on Linux, when we fetch the content from the database, they don't show. It only happens on IIS. Can anyone point out how I can resolve this? Thank you.
I had a similar issue a while ago; there are some useful comments and information here - it's PHP, but I believe the theory would be the same: Question 386378
You also need to specify UTF-8 in the HTTP headers. With PHP:
<?php
header('Content-Type: text/plain; charset=utf-8');
?>
With Apache:
AddDefaultCharset UTF-8
The Apache setting can be placed in an .htaccess file.
I checked all the meta tags, they're utf-8.
The browser doesn't interpret the meta tag. It's only a fallback when no HTTP headers are present. Right-click and select "View Page Info" to see which encoding the browser actually interprets the page in.
The database collation is utf-8 also.
Collation is irrelevant for display of characters. The charset matters however. So does the connection charset.
Try inspecting the HTML responses directly using something like Fiddler or Firebug. Check whether the responses from IIS/Apache (which should be returning exactly the same text) have:
Different data
Different headers
Pay particular attention to the Content-Type header, which should say what character encoding (UTF-8, ISO-8859-1/Latin-1, etc.) the returned text is in.
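If you prefer the command line to Fiddler/Firebug, a quick sketch with curl shows the same thing (the URLs are placeholders):
curl -sI http://your-iis-server/page.php | grep -i '^content-type'
curl -sI http://your-linux-server/page.php | grep -i '^content-type'
# e.g. "Content-Type: text/html; charset=utf-8" vs. plain "Content-Type: text/html"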
We have developed a PHP-MySQL application in two languages - English and Gujarati. The Gujarati language contains symbols that need Unicode UTF-8 encoding for proper display.
The application runs perfectly on my Windows-based localhost and on my Linux-based testing server on the web.
But when I transfer the application to the client's web server (Linux, Red Hat based), the Gujarati characters are not displayed properly.
I have checked the raw data in both databases (on my web server and on the client's web server) - it is exactly the same. However, the display is different. On my server the fonts are displayed perfectly, but when I access the client's copy of the app, the display of the Gujarati font is all wrong.
Since I am testing both installation instances from the same machine and the same browser, the issue is not one of browser incompatibility or of the code. I believe there is some server setting that needs to be changed, which I am missing.
Can you help please?
Thanks
Vinayak
UPDATE
Thanks. We have tried the Apache and PHP settings suggested by the SO community members, but the problem remains.
To break down the issue, we have looked at the different stages the data passes through.
The two databases (at the client's end and at our end) are identical. There is no difference between them.
The next step in this app is that a query is run which retrieves the data, creates an XML file and saves it.
This XML file is then displayed using a PHP page.
We have identified that the problem is a difference in the XML file being created. The code used for creating the XML file is below:
function fnCreateTestXML($testid)
{
    $objQuery = new clsQuery();
    $objTest = new clsTest();
    $setnames = $objQuery->fnSelectRecords("tbl_testsets", "setnumber", "");
    $queryresultstests = $objQuery->fnSelectRecords("tbl_tests", "", "");
    if ($queryresultstests)
    {
        foreach ($queryresultstests as $queryresulttest)
        {
            foreach ($setnames as $setname)
            {
                // Creating root node "test" and setting its attribute
                $doc = new DomDocument('1.0', 'utf-8');
                $root = $doc->createElement('test');
                $root = $doc->appendChild($root);
                // and so on
                // Saving the XML to disk
                $xml_create = $doc->saveXML();
                $testname = "testsxml.xml";
                $xml_string = $doc->save($testname);
            }
        }
    }
}
Any ideas??
Thanks
Vinayak
The answer almost certainly lies in the headers being sent with the web pages. To diagnose issues like this, I've found it useful to install the Firefox add-on "Live HTTP Headers".
Install that addon, then turn it on and reload a page from the client's webserver, and from your own.
What you'll probably see is that the page served by your webserver has the header:
Content-Type: text/html; charset=UTF-8
Whereas when served by the client's webserver it says:
Content-Type: text/html
The way I would recommend fixing this is to ensure that you explicitly set the header to specify UTF-8 in every page of your application. This insulates your application from future configuration changes on the client's end.
To do this, call
header('Content-type: text/html; charset=utf-8');
on each page before sending any data.
Since you've stated that it works in your development environments but not on your client's, you might want to check the client's Apache AddDefaultCharset setting and set it to UTF-8, if it isn't already. (Assuming they're using Apache.)
I tend to make sure the following steps are checked (a combined sketch follows the list):
PHP Header is sent in UTF-8
Meta tag is set to UTF-8 (Content-Type)
Storage is set to UTF-8
Server output is set to UTF-8
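A combined sketch of those steps in a single page (assuming mysqli; the names are placeholders):
<?php
// 1. PHP header sent as UTF-8
header('Content-Type: text/html; charset=UTF-8');

// 3./4. Storage and output: make the database connection talk UTF-8
$db = new mysqli('localhost', 'user', 'password', 'mydb');
$db->set_charset('utf8');

// 2. Meta tag (Content-Type) as a fallback for the header
echo '<html><head>'
   . '<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">'
   . '</head><body>UTF-8: ÄÖÜ</body></html>';
?>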
Make sure your PHP code files are saved as UTF-8 without BOM (a Byte Order Mark would be output before any headers).
Make sure that the response headers are correct - the Content-Type should have UTF-8 in it.
Check the character set settings on the DB instance on the client's machine:
SHOW VARIABLES LIKE 'character_set%';
SHOW VARIABLES LIKE 'collation%';
Before executing any text fetching queries, try executing:
SET NAMES utf8
SET CHARACTER SET utf8
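For instance, right after connecting (a sketch, assuming mysqli and placeholder credentials):
<?php
$db = new mysqli('client-db-host', 'user', 'password', 'appdb');
$db->query("SET NAMES utf8");
$db->query("SET CHARACTER SET utf8");
// ...then run the text-fetching queries as usual
?>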
If that works, you might want to add the following to the my.cnf on the client's machine:
[mysqld]
collation_server=utf8_unicode_ci
character_set_server=utf8
default-character-set=utf8
default-collation=utf8_general_ci
collation-server=utf8_general_ci
Please use this meta tag: <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
Make sure you use this code in PHP: mysql_query("SET character_set_results = 'utf8'");