I have a MySQL table where I store addresses, including Norwegian addresses.
CREATE TABLE IF NOT EXISTS `addresses` (
`id` int(11) unsigned NOT NULL AUTO_INCREMENT,
`street1` varchar(50) COLLATE utf8_danish_ci NOT NULL,
`street2` varchar(50) COLLATE utf8_danish_ci DEFAULT NULL,
`zipcode` varchar(10) COLLATE utf8_danish_ci NOT NULL,
`city` varchar(30) COLLATE utf8_danish_ci NOT NULL,
PRIMARY KEY (`id`),
KEY `index_city` (`city`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_danish_ci;
Now, this table was fine until I screwed up and accidentally set all cities = 'test'. Luckily I had another table called helper_zipcode. This table contains all zip codes and cities for Norway.
So I updated addresses table with data from helper_zipcode.
Unfortunately, in the front end, cities like Bodø now show up as Bod�.
All æ, ø, å are now shown as � � � (but they look fine in the DB).
I'm using HTML 5, so my header looks like this:
<!DOCTYPE HTML>
<head>
<meta charset="utf-8" />
(...)
This is not the first time I've struggled with Unicode.
What is the secret to storing Unicode characters (from Europe) in the DB and displaying them the same way when they are retrieved?
From the MySQL docs, a user comment posted by lorenz pressler on May 2, 2006:
If you get data via PHP from your MySQL DB (everything UTF-8) but still get '?' for some special characters in your browser (<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />), try this: after mysql_connect() and mysql_select_db(), add this line:
mysql_query("SET NAMES utf8");
Worked for me. I tried utf8_encode first, but that only worked for äüöéè... and so on, not for Cyrillic and other characters.
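Put together, that suggestion amounts to something like the sketch below (legacy mysql_* API; host, credentials, and database name are placeholders, not values from the question):
$con = mysql_connect('localhost', 'user', 'secret');   // hypothetical connection details
mysql_select_db('mydatabase', $con);
mysql_query("SET NAMES utf8", $con);                    // make the connection talk UTF-8
// subsequent SELECTs should now return æ, ø, å (and Bodø) intact
$result = mysql_query("SELECT city FROM addresses", $con);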
Is your problem with storing the data in MySQL, or with retrieving the stored data using PHP?
Before your first query you need to run mysql_query("SET NAMES utf8");
What happens if you change your browser encoding from auto-detect to UTF-8 or Unicode?
What I'm trying to determine is whether it's the database or the web browser that's wrong.
Alternatively, if you have a database tool for your MySQL database, does that show the right or wrong characters?
I am currently working on a project that is translated into 18 languages such as Russian, German, Swedish, and Chinese. I have some issues with sorting country names in different languages. For example, country names in French are sorted like this:
- États-Unis
- Éthiopie
- Afghanistan
I don't have this issue on my local server using MAMP.
My database's character set is configured as utf8 and the collation is utf8_unicode_ci. I have exactly the same configuration on the remote server.
I created a my.cnf file on my local server with the following directives in order to correctly display special characters:
[mysqld]
skip-character-set-client-handshake
collation_server=utf8_unicode_ci
character_set_server=utf8
On the remote server, the my.cnf file does not contain these lines. When I tried to add them, MySQL no longer recognised special characters, as if it were interpreting them as latin1.
I checked collation_database and all the character_set variables, but they are all set to utf8 / utf8_unicode_ci.
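For reference, these settings can be compared on both servers straight from any SQL client:
SHOW VARIABLES LIKE 'character_set%';
SHOW VARIABLES LIKE 'collation%';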
Here is the SQL code for the creation of the table:
CREATE TABLE esth_countries (
country_id varchar(2) COLLATE utf8_unicode_ci NOT NULL,
name varchar(100) COLLATE utf8_unicode_ci NOT NULL,
region varchar(40) COLLATE utf8_unicode_ci NOT NULL,
language_id varchar(2) COLLATE utf8_unicode_ci NOT NULL,
PRIMARY KEY (country_id,language_id),
KEY language_id (language_id)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
Special characters are correctly displayed on my remote server. The only problem concerns sorting with the ORDER BY clause.
It seems like there is something wrong with the remote server's configuration, but I can't figure out what.
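As a sanity check (not a fix for the server configuration), forcing the collation directly in the query should give the expected French ordering if the stored data itself is intact. This sketch assumes the esth_countries table above and 'fr' as a valid language_id:
SELECT name
FROM esth_countries
WHERE language_id = 'fr'
ORDER BY name COLLATE utf8_unicode_ci;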
I have an old database which I must use. The problem is that the old data (mostly text) is stored as cp1252 (latin1_general_ci) and shows up as ?????? on the page. I then converted the whole database and the table to a UTF-8 collation like this:
ALTER DATABASE databasename CHARACTER SET utf8 COLLATE utf8_unicode_ci;
ALTER TABLE tablename CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci;
But the problem with the old records remains. I know that the queries above only change the field collations. My question is: is there a way to show those ????? records properly on the web page now?
1) Create dump
mysqldump --default-character-set=latin1 --skip-set-charset mydatabase mytable > ./mytable.sql
2) In mytable.sql, replace latin1 with utf8:
CREATE TABLE `test` (
`id` int(10) unsigned NOT NULL,
`name` char(255) NOT NULL default '',
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
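After the replacement, the same definition should read:
CREATE TABLE `test` (
`id` int(10) unsigned NOT NULL,
`name` char(255) NOT NULL default '',
PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;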
3) Import DB
mysql --user=login -p --database=mydatabase < ./mytable.sql
mysqldump — A Database Backup Program
I've recently started using Laravel for a project I'm working on, and I'm currently having problems displaying data from my database in the correct character encoding.
My current system consists of a separate script responsible for populating the database with data, while the Laravel project is responsible for displaying the data. The view that is used is set to display all text as UTF-8, which works, as I've successfully printed special characters in the view. Text from the database is not printed as UTF-8, and will not print special characters the right way. I've tried using both Eloquent models and DB::select(), but they both show the same poor result.
charset in database.php is set to utf8 while collation is set to utf8_unicode_ci.
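For reference, the relevant part of the connection config (a sketch of a typical Laravel app/config/database.php entry; host, database, and credentials below are placeholders, not real values):
'mysql' => array(
    'driver'    => 'mysql',
    'host'      => 'localhost',    // placeholder
    'database'  => 'mydatabase',   // placeholder
    'username'  => 'user',         // placeholder
    'password'  => 'secret',       // placeholder
    'charset'   => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix'    => '',
),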
The database table:
CREATE TABLE `RssFeedItem` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`feedId` smallint(5) unsigned NOT NULL,
`title` varchar(250) COLLATE utf8_unicode_ci NOT NULL,
`url` varchar(250) COLLATE utf8_unicode_ci NOT NULL,
`created_at` datetime NOT NULL,
`updated_at` datetime NOT NULL,
`text` mediumtext COLLATE utf8_unicode_ci,
`textSha1` varchar(250) COLLATE utf8_unicode_ci DEFAULT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `url` (`url`),
KEY `feedId` (`feedId`),
CONSTRAINT `RssFeedItem_ibfk_1` FOREIGN KEY (`feedId`) REFERENCES `RssFeed` (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=6370 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
I've also set up a test page in order to see if the problem could be my database setup, but the test page prints everything just fine. The test page uses PDO to select all data and prints it on a simple HTML page.
Does anyone know what the problem might be? I've tried searching around with no luck besides this link, but I haven't found anything that might help me.
I did eventually end up solving this myself. The problem was caused by the separate script responsible for populating my database with data. This was solved by running a query with SET NAMES utf8 before inserting data into the database. The original data was pulled out, and then sent back in after running said query.
The reason it worked outside Laravel was simply that said query wasn't executed on my test page. If I ran the query before retrieving the data, it came out with the wrong encoding, because the query stated that the data was encoded as utf8 when it really wasn't.
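A minimal sketch of what the fixed populate script now does (PDO, with hypothetical connection details and feed data; feedId 1 assumes a matching row exists in RssFeed):
<?php
// charset=utf8 in the DSN has the same effect as running "SET NAMES utf8"
// on the connection (PHP >= 5.3.6); both are shown here for clarity.
$pdo = new PDO('mysql:host=localhost;dbname=mydatabase;charset=utf8', 'user', 'secret');
$pdo->exec("SET NAMES utf8");

$stmt = $pdo->prepare(
    "INSERT INTO RssFeedItem (feedId, title, url, created_at, updated_at)
     VALUES (?, ?, ?, NOW(), NOW())"
);
$stmt->execute(array(1, 'Økonomi, æøå test title', 'http://example.com/item/1'));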
I have the following table:
CREATE TABLE IF NOT EXISTS `applications` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(45) COLLATE utf8_unicode_ci DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
I want to store the value "España" in the "name" field.
I have a PHP file (encoded in UTF-8) with a form to save that. When I save "España" using the PHP file and I read from MySQL with PHP, I see the data OK.
But if I go to phpMyAdmin or MySQL Query Browser, I see this: "España"
If I save it from phpMyAdmin (with the encoding set to UTF-8) or MySQL Query Browser, it looks OK in those two tools, but I see "Espa�a" from PHP.
I don't understand why.
In bytes:
If it is saved from PHP I see: C3 83 C2 B1 (for ñ)
If it is saved from MQB or PMA I see: C3 B1 (for ñ)
Run mysql_set_charset() before executing queries on the open connection.
mysql_set_charset("UTF-8");
The problem was the PHP MySQL client. It uses latin1 as the encoding for the connection; you can see this with:
echo mysql_client_encoding($con);
or
print_r(mysql_fetch_assoc(mysql_query("show variables like 'char%';")));
There are two ways to solve this:
mysql_set_charset("utf8");
or
mysql_query("SET CHARACTER SET 'UTF8'", $con); // data send by the server
mysql_query("SET NAMES 'UTF8'", $con); // data send by the client
I have a MySQL table which I created as latin1, and it is all that way. How can I make a table that is all latin1 except for one column, which I need to be able to accept Chinese characters?
Also, what's the best structure for a column with Chinese characters?
ALTER TABLE your_table
  MODIFY COLUMN chinese_column VARCHAR(255)
  CHARACTER SET utf8 COLLATE utf8_general_ci; -- or any relevant utf8 collation
Details can be found here.
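The same idea at table-creation time, as a sketch with hypothetical table and column names: the table default stays latin1 and only the one column is declared utf8.
CREATE TABLE products (
  id INT UNSIGNED NOT NULL AUTO_INCREMENT,
  name VARCHAR(100) NOT NULL,                        -- inherits the table default (latin1)
  chinese_name VARCHAR(255)
      CHARACTER SET utf8 COLLATE utf8_general_ci,    -- only this column stores Chinese text
  PRIMARY KEY (id)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;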