Wrong character encoding with database output using laravel - php

I've recently started using Laravel for a project I'm working on, and I'm currently having problems displaying data from my database in the correct character encoding.
My current system consists of a separate script responsible for populating the database with data, while the Laravel project is responsible for displaying it. The view is set to display all text as UTF-8, and this works: I've successfully printed special characters hard-coded in the view. Text from the database, however, is not printed as UTF-8, and special characters come out wrong. I've tried both Eloquent models and DB::select(), but both show the same poor result.
charset in database.php is set to utf8 while collation is set to utf8_unicode_ci.
The database table:
CREATE TABLE `RssFeedItem` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`feedId` smallint(5) unsigned NOT NULL,
`title` varchar(250) COLLATE utf8_unicode_ci NOT NULL,
`url` varchar(250) COLLATE utf8_unicode_ci NOT NULL,
`created_at` datetime NOT NULL,
`updated_at` datetime NOT NULL,
`text` mediumtext COLLATE utf8_unicode_ci,
`textSha1` varchar(250) COLLATE utf8_unicode_ci DEFAULT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `url` (`url`),
KEY `feedId` (`feedId`),
CONSTRAINT `RssFeedItem_ibfk_1` FOREIGN KEY (`feedId`) REFERENCES `RssFeed` (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=6370 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
I've also set up a test page in order to see if the problem could be my database setup, but the test page prints everything just fine. The test page uses PDO to select all data, and prints it on a simple html page.
Does anyone know what the problem might be? I've tried searching around with no luck besides this link, but I haven't found anything that might help me.

I did eventually solve this myself. The problem was caused by the separate script responsible for populating my database with data. It was solved by running a SET NAMES utf8 query on the connection before inserting data into the database. The original data was pulled out and re-inserted after running said query.
The reason it worked outside Laravel was simply that this query wasn't executed on my test page. If I ran the query before retrieving the data, it came out with the wrong encoding, because the query stated that the data was encoded as UTF-8 when it really wasn't.
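For what it's worth, with PDO (which Laravel uses under the hood) the same effect can be had by declaring the charset directly in the DSN, so no separate SET NAMES query is needed. A minimal sketch; host, database name, credentials, and the sample row are placeholders:

```php
<?php
// Declaring charset in the DSN makes MySQL treat this connection as utf8,
// equivalent to running SET NAMES utf8 right after connecting.
// Host, database name, and credentials below are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8',
    'user',
    'password',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Data inserted over this connection is interpreted as utf8,
// so special characters survive the round trip.
$stmt = $pdo->prepare(
    'INSERT INTO RssFeedItem (feedId, title, url, created_at, updated_at)
     VALUES (?, ?, ?, NOW(), NOW())'
);
$stmt->execute([1, 'Søme tïtle', 'http://example.com/item']);
```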

Related

broken UTF8 characters in MySQL using PHP

I have noticed that some names not spelled with the standard UK/US alphabet are getting mangled between my inserting a record and what actually shows up in the database. I have done quite a bit of reading about the collation type, which is what I thought was causing the issue, but I'm not sure whether that's the case or whether I'm still doing something wrong, as the problem persists.
Below is an example of a record I am creating, along with my database structure. As you can see, the last_name value contains "ö"; when I look up the record, the last_name comes back garbled rather than as "Körner".
CREATE TABLE `data` (
`id` bigint(20) NOT NULL,
`profile_id` int(11) NOT NULL,
`first_name` varchar(100) NOT NULL,
`last_name` varchar(100) NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
ALTER TABLE `data`
ADD PRIMARY KEY (`id`),
ADD UNIQUE KEY `profile_id` (`profile_id`);
ALTER TABLE `data`
MODIFY `id` bigint(20) NOT NULL AUTO_INCREMENT;
INSERT IGNORE INTO data (profile_id, first_name, last_name) VALUES (1, 'Brent', 'Körner');
The collation on the last_name field is set to utf8_general_ci, which I understood, or should I say thought, would sort this issue out.
This seems to be something I am doing wrong / missing with PHP, as the INSERT query saves fine when I execute it within phpMyAdmin.
It seems the issue was down to PHP in the end: I wasn't setting the connection charset.
For mysql
mysql_set_charset('utf8');
For mysqli (the link argument is required here):
mysqli_set_charset($link, 'utf8');
ref https://akrabat.com/utf8-php-and-mysql/
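To make the fix concrete, here is a minimal sketch of the INSERT from the question with the connection charset set first; the host and credentials are placeholders:

```php
<?php
// Placeholders for host/user/password/database.
$link = mysqli_connect('localhost', 'user', 'password', 'mydb');

// Declare the connection charset BEFORE running any queries,
// so 'Körner' is transmitted and stored as utf8 instead of being mangled.
mysqli_set_charset($link, 'utf8');

$stmt = mysqli_prepare(
    $link,
    'INSERT IGNORE INTO data (profile_id, first_name, last_name) VALUES (?, ?, ?)'
);
$profileId = 1;
$firstName = 'Brent';
$lastName  = 'Körner';
mysqli_stmt_bind_param($stmt, 'iss', $profileId, $firstName, $lastName);
mysqli_stmt_execute($stmt);
```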

MySQL: #1293 - Incorrect table definition

I designed a database and table on my local server using MySQL 5.6.17 and exported them into a *.sql file.
When I try to import the *.sql file into the live server (MySQL 5.1.36), I get the error below:
#1293 - Incorrect table definition; there can be only one TIMESTAMP column with CURRENT_TIMESTAMP in DEFAULT or ON UPDATE clause
I understood the issue through this link.
Is there any way to import the local server's *.sql file into the live server without updating the MySQL version?
TABLE:
CREATE TABLE `currency` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`currency_name` varchar(30) CHARACTER SET utf8 COLLATE utf8_unicode_ci DEFAULT NULL,
`country` varchar(20) CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
`currency` varchar(5) CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
`created_by` int(11) NOT NULL,
`created_on` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
`status` int(1) NOT NULL DEFAULT '1',
`modified_by` int(11) DEFAULT NULL,
`modified_on` timestamp NOT NULL ON UPDATE CURRENT_TIMESTAMP,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=22 DEFAULT CHARSET=utf8;
NOTE:- Windows Server Running WAMP(Local Server) and QNAP(Live Server).
There is no direct way to do it, since this is not a valid table definition for any version prior to MySQL Server 5.6.5.
You could edit the dump file by hand or with a tool like sed or perl to modify the offending lines, or you could change the table definition on the source server... but then your application, which presumably expects this behavior, isn't going to work properly.
You could also modify the table definition to make it valid for 5.1, with only one automatic timestamp, and use triggers to get the rest of the desired behavior.
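A sketch of that trigger workaround for 5.1, using the column names from the table above: only created_on keeps its automatic default, and a trigger maintains modified_on (the trigger name is arbitrary):

```sql
-- Valid on MySQL 5.1: only created_on uses CURRENT_TIMESTAMP directly.
ALTER TABLE `currency`
  MODIFY `modified_on` timestamp NULL DEFAULT NULL;

-- Emulate ON UPDATE CURRENT_TIMESTAMP with a trigger.
CREATE TRIGGER `currency_before_update`
BEFORE UPDATE ON `currency`
FOR EACH ROW
SET NEW.modified_on = NOW();
```

Note that modified_on becomes nullable here; if you need it NOT NULL on 5.1 you would also have to set it from a BEFORE INSERT trigger.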
The best course, of course, is one of these:
update the 5.1 server to 5.5 and then to 5.6, or
when developing for a 5.1 server, always use 5.1 in the development environment, so you don't get into situations like this, relying on newer features that aren't compatible with older deployments. Remember, though, that some pretty significant improvements happened in version 5.6.

How to convert latin1_swedish_ci data into utf8_general_ci?

I have a MySQL database in which all the table fields have the collation
latin1_swedish_ci
It already has almost 1,000 records stored, and now I want to convert all this data into
utf8_general_ci
so that I can display content in any language. I have already altered the field collations to utf8_general_ci, but this does not convert the old records into utf8_general_ci.
One funny thing: CONVERT TO CHARSET and CONVERT()/CAST(), as suggested by Anshu, will work fine if the text in the table is in the right encoding.
If for some reason a latin1 column contains utf8 text, CONVERT() and CAST() will not be able to help. I had "messed up" my database with that setup, so I spent a bit more time solving this.
To fix this, in addition to the character set conversion, several extra steps are required.
The "hard" way is to recreate the database from a dump that is converted via the console.
The "simple" way is to convert row by row or table by table:
INSERT INTO UTF8_TABLE (UTF8_FIELD)
SELECT convert(cast(convert(LATIN1_FIELD using latin1) as binary) using utf8)
FROM LATIN1_TABLE;
Basically, both cases process the string back to the original symbols and then to the right encoding; that won't happen with a simple CONVERT(field USING encoding) command.
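The same repair can be done in PHP if the double-encoded ("mojibake") strings have already been fetched. This sketch (it requires the mbstring extension) reverses the bad latin1-to-utf8 conversion, which is what the nested CONVERT/CAST above does on the SQL side:

```php
<?php
// A latin1 column that really held utf8 bytes comes out double-encoded:
// "Körner" is fetched as "KÃ¶rner". Converting the string from utf8 back
// to latin1 recovers the original utf8 byte sequence.
function fix_double_encoding(string $mojibake): string
{
    return mb_convert_encoding($mojibake, 'ISO-8859-1', 'UTF-8');
}

echo fix_double_encoding("KÃ¶rner"); // prints "Körner"
```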
Export your table.
Drop the table.
Open the export file in the editor.
Edit it manually where the table structure is created.
old query:
CREATE TABLE `message` (
`message_id` int(11) NOT NULL,
`message_thread_id` int(11) NOT NULL,
`message_from` int(11) NOT NULL,
`message_to` int(11) NOT NULL,
`message_text` longtext NOT NULL,
`message_time` varchar(50) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
new query (suppose you want to change the message_text field):
CREATE TABLE `message` (
`message_id` int(11) NOT NULL,
`message_thread_id` int(11) NOT NULL,
`message_from` int(11) NOT NULL,
`message_to` int(11) NOT NULL,
`message_text` longtext CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci NOT NULL,
`message_time` varchar(50) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Save the file and import it back into the database.

Complete multi language website in php

I need an idea on how to make a completely multilanguage website. I have come across several approaches; some use an XML file containing translated template strings. That works if I only want the main template translated, but the content needs to be translated as well.
For example, when I have a new entry in English, it should be translated into 4 other languages. Most attributes are common.
What I have done so far is create a table for the main website template with the attributes:
lang, tag, value
In my template it does a match on lang and tag.
What is the best way to translate the rest of the website (dynamic PHP pages using MySQL)?
You need a table for languages as below:
CREATE TABLE `language` (
`langid` tinyint(3) unsigned NOT NULL AUTO_INCREMENT,
`language` varchar(35) CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
PRIMARY KEY (`langid`)
) ENGINE=InnoDB
Then for example you have a table for posts as below:
CREATE TABLE `post` (
`postid` int unsigned NOT NULL AUTO_INCREMENT,
`langid` tinyint(3) unsigned NOT NULL,
`content` TEXT CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
`title` varchar(35) CHARACTER SET utf8 COLLATE utf8_unicode_ci NOT NULL,
PRIMARY KEY (`postid`)
) ENGINE=InnoDB
In the post table you need a key like langid, which refers to the specific language in the language table. In your dashboard you will have something like a set of textboxes, each referring to a specific language.
You should also have another table for the site's menus, with a langid foreign key there as well. You should be well on your way.
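Reading content back is then a join keyed on the requested language. A minimal sketch, assuming an existing $pdo connection and the two tables above:

```php
<?php
// Assumes an existing PDO connection in $pdo.
// Fetch all posts stored in the visitor's language.
function posts_for_language(PDO $pdo, string $language): array
{
    $sql = 'SELECT p.postid, p.title, p.content
            FROM post p
            JOIN language l ON l.langid = p.langid
            WHERE l.language = ?';
    $stmt = $pdo->prepare($sql);
    $stmt->execute([$language]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

$posts = posts_for_language($pdo, 'English');
```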
Look into the gettext extension: http://php.net/manual/en/book.gettext.php
Then use a program like Poedit or simplepo to do the actual editing of the language files.
IMO, this is the best way I have found for a multilingual site.
You can also look into the Zend_Translate module.
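A minimal gettext sketch; it assumes the gettext extension is enabled, the locale is installed on the server, and a compiled messages.mo exists under ./locale/de_DE.UTF-8/LC_MESSAGES/:

```php
<?php
// Pick the locale; it must be installed on the server.
$locale = 'de_DE.UTF-8';
putenv('LC_ALL=' . $locale);
setlocale(LC_ALL, $locale);

// Point gettext at ./locale/<locale>/LC_MESSAGES/messages.mo
bindtextdomain('messages', __DIR__ . '/locale');
textdomain('messages');

// _() returns the translation, or the original string if none is found.
echo _('Welcome to our site');
```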

How to insert images into the database and fetch it from the database and display using PHP?

How do I insert images into the database, then fetch them from the database and display them using PHP?
I have tried many times; as I am a beginner at PHP, please help me out.
You can use a BLOB field for storing binary data, but be aware that performance may suffer. It's almost always better to store the image as a file on disk and store only the file name in the database. To fetch the image from the db, you need to read the whole image into memory, which is a waste of resources; it's better to offload image handling to the web server, which can stream the file to the client.
As much as this is not a recommended practice, if you have your heart set on doing this, there are a pile of tutorials online which will walk you through how to do this:
Uploading Files To MySQL Database
Upload a File and write to MySQL
Upload Files to MySQL using PHP Tutorial
Uploading files into a MySQL database using PHP
You can create a table like so:
CREATE TABLE `image` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`filename` varchar(128) COLLATE utf8_unicode_ci NOT NULL,
`mime_string` varchar(128) COLLATE utf8_unicode_ci NOT NULL,
`data` longblob NOT NULL,
`data_size` int(11) NOT NULL,
`hash` varchar(64) COLLATE utf8_unicode_ci NOT NULL,
`compressed` tinyint(4) NOT NULL,
`remote_id` int(11) DEFAULT NULL,
`created_at` datetime DEFAULT NULL,
`updated_at` datetime DEFAULT NULL,
PRIMARY KEY (`id`),
KEY `image_I_1` (`hash`),
KEY `image_I_2` (`remote_id`)
) ENGINE=InnoDB AUTO_INCREMENT=125 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci ROW_FORMAT=COMPACT
When you write the file back to the output, you'll have to set the HTTP Content-Type header to match the stored mime type:
header('Content-Type: image/png');
This is why you store the mime type along with the data.
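Putting those pieces together, a serving script might look like this sketch; the table and column names come from the CREATE TABLE above, while the $pdo connection and the id query parameter are assumptions:

```php
<?php
// image.php?id=123 — streams one stored image row back to the browser.
// Assumes an existing PDO connection in $pdo.
$id = (int) ($_GET['id'] ?? 0);

$stmt = $pdo->prepare('SELECT mime_string, data FROM image WHERE id = ?');
$stmt->execute([$id]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row === false) {
    http_response_code(404);
    exit;
}

// Send the stored mime type so the browser renders the blob correctly.
header('Content-Type: ' . $row['mime_string']);
header('Content-Length: ' . strlen($row['data']));
echo $row['data'];
```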
