I have this statement, which works on the first run:
UPDATE accounts SET username = CONVERT(CAST(CONVERT(username USING latin1) AS BINARY) USING utf8)
This converts latin1 characters into UTF-8 Chinese characters in MySQL.
Running it a second time, the characters become garbled.
How do I add a WHERE condition so that it only converts username from latin1 to utf8 when the value is still in latin1?
What is the CHARACTER SET for the column? If it is still latin1, you are asking for a lot of trouble.
A table declared to be latin1, and containing latin1 bytes can be converted to utf8 via
ALTER TABLE tbl CONVERT TO CHARACTER SET utf8;
After that (and assuming you have SET NAMES set correctly), text will be converted to utf8 as it is inserted. No need for a flag; no need for a second pass; etc.
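If you still want to run the one-shot UPDATE from the question and make it safe to re-run, here is one heuristic sketch (not foolproof; it assumes the already-fixed values contain characters that do not exist in latin1, such as Chinese, so test it on a copy first): skip any row whose current value would not survive a round trip through latin1.
UPDATE accounts
SET username = CONVERT(CAST(CONVERT(username USING latin1) AS BINARY) USING utf8)
WHERE CONVERT(CONVERT(username USING latin1) USING utf8) = BINARY username;
Rows that already hold real Chinese text lose characters (they become '?') when forced into latin1, so the round-trip comparison fails and those rows are left alone; rows that still hold Mojibake pass the check and get converted. Note that already-correct accented Western text would also pass the check, which is why this is only a heuristic.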
Related
I have a Laravel 4.2 project with utf8 fields in the database, but the data stored in these fields looks like Ø±Ø§.
In any PHP file (outside Laravel), those characters render correctly after selecting the data from the database once I use SET NAMES 'utf8'. I want to do the same in Laravel (even a non-database solution would do).
Here is what I have tried:
make sure all files are utf8
make sure that in config files charset and collation are utf8
use PDO::MYSQL_ATTR_INIT_COMMAND => "SET NAMES utf8" in config files
I also tried to use Blade::setEchoFormat('e(utf8_encode(%s))'); but did not know how to use it in the correct way.
Any help would be appreciated.
Do not use iconv. Do not use utf8_encode.
You have Mojibake.
Were you expecting را instead of Ø±Ø§? Is this the HEX that is in the table: D8B1D8A7?
This is the classic case of:
The bytes you have in the client are correctly encoded in utf8 (good).
You connected with SET NAMES latin1 (or set_charset('latin1') or ...), probably by default. (It should have been utf8.)
The column in the tables may or may not have been CHARACTER SET utf8, but it should have been that.
If you need to fix the data it takes a "2-step ALTER", something like
ALTER TABLE Tbl MODIFY COLUMN col VARBINARY(...) ...;
ALTER TABLE Tbl MODIFY COLUMN col VARCHAR(...) ... CHARACTER SET utf8 ...;
where the lengths are big enough and the other "..." have whatever else (NOT NULL, etc) was already on the column.
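For example, with a hypothetical users table whose name column is VARCHAR(100) NOT NULL (adjust the names, lengths, and attributes to your own schema), the two steps could look like this:
-- Step 1: drop the character set without touching the stored bytes
ALTER TABLE users MODIFY COLUMN name VARBINARY(300) NOT NULL;
-- Step 2: reinterpret those same bytes as utf8
ALTER TABLE users MODIFY COLUMN name VARCHAR(100) CHARACTER SET utf8 NOT NULL;
VARBINARY(300) is used because 100 utf8 characters can occupy up to 300 bytes.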
OK, here is how I got it to work.
My problem was the opposite of what I thought.
All I did was remove the utf8 settings from the database.php config file and comment out the line $connection->prepare($names)->execute(); in the MysqlConnector.php file,
and the right words are now shown.
Thanks to all, and if anyone knows a better solution, please share it with me.
Set the CHARACTER SET to utf8 and the COLLATE to utf8_general_ci for your database, table, and column:
ALTER DATABASE db_name CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE table_name CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE table_name MODIFY column_name VARCHAR(255) CHARACTER SET utf8 COLLATE utf8_general_ci;
I am converting a spreadsheet to a database using PHPExcel, and a cell value happens to contain Russian. If I run mb_detect_encoding() I am told the text is UTF-8, and if I send a UTF-8 header then I see the correct Russian characters.
However, if I compile it into a string (with only addslashes involved in the process) and insert it into the table, I see lots of ????. I have set the table character set to utf8mb4 and the collation to utf8mb4_general_ci. I have also run $this->db->query("SET NAMES 'utf8mb4'"); on my DB connection.
I run PDO query() with my multi-part insert and get the ???s, but if I output the query to the screen I get ÐŸÐ¾Ñ, which would be valid UTF-8. Why would this not be stored correctly in the database?
I have kept this question rather than deleting it so someone may find the answer helpful.
The reason I was struggling is that SQLyog doesn't show you the column charset by default. There is an option labelled "Hide language options" on the Alter Table view; turning it off reveals that when SQLyog creates a table it uses the default server charset rather than the charset you define for the table. I'm not sure if that is the intended behaviour, but the solution is simply to turn on the column charset settings and check that they match what you are expecting.
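If you want to double-check the real column character sets independently of any GUI, they can be read straight from information_schema (substitute your own database and table names):
SELECT COLUMN_NAME, CHARACTER_SET_NAME, COLLATION_NAME
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'your_db' AND TABLE_NAME = 'your_table';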
ÐŸÐ¾ is Mojibake for По. Probably...
The bytes you have in the client are correctly encoded in utf8 (good).
You connected with SET NAMES latin1 (or set_charset('latin1') or ...), probably by default. (It should have been utf8.)
The column in the tables may or may not have been CHARACTER SET utf8, but it should have been that.
The question marks imply...
you had utf8-encoded data (good)
SET NAMES latin1 was in effect (default, but wrong)
the column was declared CHARACTER SET latin1 (default, but wrong)
One way to help diagnose the problem(s) is to run
SELECT col, HEX(col) FROM tbl WHERE ...
For По, the hex should be D09FD0BE. Each Cyrillic character, in utf8, is hex D0xx or D1xx.
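As a rough guide to reading that diagnostic (table and column names are placeholders):
SELECT col, HEX(col) FROM tbl LIMIT 10;
-- D09FD0BE            -> 'По' stored correctly as utf8
-- 3F3F                -> the data has already been reduced to question marks
-- C390C5B8C390C2BE    -> 'ÐŸÐ¾', i.e. double-encoded Mojibake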
I am trying to import data from an XML file into a MySQL DB using PHP. I am able to get the code to work just fine, but when I look at the data in the DB there are special characters. For example, when I look at the XML in my browser it shows up as "outdoors in good weather…" but in the DB it appears as "outdoors in good weatherâ€¦".
I've cycled through all the different collations for that field in my DB but it does not seem to help much. Sometimes it shows up with the characters mentioned above and other times as ???.
I have also tried to sync up the data with the following code in my PHP
$mysqli->query("SET NAMES 'utf8' COLLATE 'utf8_general_ci'");
But, again I have had no luck.
Thank you for reading this and for your help!
Akshay
You need to change the character set to UTF-8, along with your collation:
ALTER TABLE tablename CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
What you are seeing is a Unicode ellipsis character (…) being converted into another character set, which is probably Latin1. That is why it looks garbled.
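If the garbling persists after the ALTER, it is worth confirming what the connection itself is using; a quick check, run over the same connection your PHP code uses, is:
SHOW VARIABLES LIKE 'character_set%';
-- character_set_client, character_set_connection and character_set_results
-- should all report utf8 (or utf8mb4) if the SET NAMES took effect.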
I have these Chinese characters which I want to insert into the database. The browser output is correct, but when the data is inserted into MySQL it ends up in a different format, something like this.
My Chinese characters:
图片库,高清图片大全,图库
and what gets inserted into the database is
å›¾ç‰‡åº“,é«˜æ¸…å›¾ç‰‡å¤§å…¨,å›¾åº“
I have tried several things, like setting utf8:
mysql_set_charset("utf8", $connection);
then changed the collation from Swedish to utf8 general,
then checked the details with this command, which proved that my character set is correctly set to utf8:
SHOW CREATE TABLE tablea;
CREATE TABLE `tablea` (
`id` int(12) NOT NULL AUTO_INCREMENT,
`tiya` text CHARACTER SET utf8 NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=MyISAM AUTO_INCREMENT=10 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
Can anyone suggest how to solve the Chinese character issue? My insert is this:
mysqli_query ($con,"INSERT INTO tablea (tiya) VALUES ('$tit')");
Your entire environment must be utf-8-capable. The file encoding of your PHP file must be utf-8. The web server (Apache?) must serve utf-8. Your MySQL table must have utf-8 as its charset. Your connection(s) must be utf-8.
If one link in the chain is missing, you get the above results. Additionally, if you tried to fiddle around with character conversion in between, things are probably mixed up entirely. Once the above chain is set up properly, no character conversion is needed at all.
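One way to see which link is broken is to look at the raw bytes that actually reached the table, for example (assuming the sample text is in the most recently inserted row):
SELECT tiya, HEX(tiya) FROM tablea ORDER BY id DESC LIMIT 1;
-- Correctly stored 图 begins with hex E59BBE; if you instead see sequences
-- starting with C3A5..., the text was double-encoded on the way in.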
I need to convert data from an old database to a new one. The old database used the latin1_swedish_ci collation and has content in a Cyrillic language, like this:
<p>ÐрхиепиÑкоп охридÑки и ми...
This content, with utf-8 encoding on the page, looks like this:
<p>Архиепископ охридски и митрополит скопски ...
Which is fine. Now I need to convert all of this data into native UTF-8 content. I have no experience with this; any suggestions?
Thanks
You can try this:
ALTER TABLE <tablename> CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci;
And note that this will affect existing column collations as well. If you want to change the default collation to utf8, you must change the database collation. After that, all new tables will be utf8.
From the manual,
ALTER TABLE t MODIFY col1 CHAR(50) CHARACTER SET utf8;
However, if you have characters that cannot be converted then you will lose that data. First make a backup and try it there.
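A minimal way to take that backup before experimenting (the table name is just a placeholder) is to copy the table first:
CREATE TABLE mytable_backup LIKE mytable;
INSERT INTO mytable_backup SELECT * FROM mytable;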