Japanese language in sqlite db - php

I'm trying to create an SQLite db with PHP (Zend Framework).
I have the following code:
$sqlite = Zend_Db::factory("pdo_sqlite", array("dbname" => "sqlite.db"));
$sqlite->query('CREATE TABLE IF NOT EXISTS newTable (word TEXT PRIMARY KEY, description TEXT);');
$sqlite->query("INSERT INTO newTable VALUES (\"".$word."\", \"".$description."\");");
Everything is fine and I can create the db with the text I need. But when I use Chinese or Japanese text, the resulting db reads slowly. I tried reading it through PHP, through Sqliteman (an app for editing SQLite), and through an application on Android.
What should I change to obtain the same result as with English, Cyrillic, or any other non-ideographic script?
Thank you for your time!

Question closed. The problem was the very large number of tables.


Compare same values stored with different encodings

This question is not a duplicate of "PHP string comparison between two different types of encoding" because my question requires a SQL solution, not a PHP solution.
Context ► There's a museum with two databases that share the same charset and collation (ENGINE=INNODB CHARSET=utf8 COLLATE=utf8_unicode_ci), used by two different PHP systems. Each PHP system stores the same data in a different way; for example, one stores "Niños" where the other stores "Niños".
There are tons of data already stored that way and both systems are working fine, so I can't change the PHP encoding or the databases'. One system handles the sales from the box office, the other handles the sales from the website.
The problem ► I need to compare the right column (tipo_boleto_tipo) with the left column (tipo) in order to get the value of another column in the left table, but I'm getting no results because the same values are stored differently: when I search for "Niños" ("children" in Spanish), nothing is found because it was stored as "Niños". I tried doing it in PHP with utf8_encode and utf8_decode, but that is unacceptably slow, so I think it's better to do it in SQL only. This data will feed a unified sales report (box office and internet) over variable periods of time and has to compare hundreds of thousands of rows; that's why it is so slow in PHP.
The question ► Is there anything like utf8_encode or utf8_decode in MySQL that would let me match the equivalent values of both columns? Any other suggestion is welcome.
This is my current query (it returns no results):
SELECT boleteria.tipos_boletos.genero        -- desired column (database.table.column)
FROM boleteria.tipos_boletos                 -- the database with the weird chars
INNER JOIN venta_en_linea.ventas_detalle     -- the database with the proper chars
ON venta_en_linea.ventas_detalle.tipo_boleto_tipo = boleteria.tipos_boletos.tipo
WHERE venta_en_linea.ventas_detalle.evento_id = '1'
AND venta_en_linea.ventas_detalle.tipo_boleto_tipo = 'Niños'
The condition ON venta_en_linea.ventas_detalle.tipo_boleto_tipo = boleteria.tipos_boletos.tipo is never going to match because the two stored values differ ("Niños" vs "Niños").
It appears the application which writes to the boleteria database is not storing correct UTF-8. The database column character set refers to how MySQL interprets strings, but your application can still write in other character sets.
I can't tell from your example exactly what the incorrect character set is, but assuming it's Latin-1 you can convert it to latin1 (to make it "correct"), then convert it back to "actual" utf8:
SELECT 1
FROM tipos_boletos, ventas_detalle
WHERE CONVERT(CAST(CONVERT(tipo USING latin1) AS binary) USING utf8)
= tipo_boleto_tipo COLLATE utf8_unicode_ci
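Applied to your original query, that conversion goes into the join condition. A hedged sketch, untested against your actual schema:
SELECT boleteria.tipos_boletos.genero
FROM boleteria.tipos_boletos
INNER JOIN venta_en_linea.ventas_detalle
ON venta_en_linea.ventas_detalle.tipo_boleto_tipo =
   CONVERT(CAST(CONVERT(boleteria.tipos_boletos.tipo USING latin1) AS binary) USING utf8)
   COLLATE utf8_unicode_ci
WHERE venta_en_linea.ventas_detalle.evento_id = '1'
AND venta_en_linea.ventas_detalle.tipo_boleto_tipo = 'Niños'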
I've seen this all too often in PHP applications that weren't written carefully from the start to use UTF-8 strings. If the conversion is too slow and you have to run it frequently, and you have no opportunity to fix the application that writes the data incorrectly, you can add a new column to the tipos_boletos table plus a trigger that converts on the fly as records are added or edited.
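A minimal sketch of that trigger approach, assuming a hypothetical new column named tipo_utf8 (add a matching BEFORE UPDATE trigger if rows get edited):
ALTER TABLE tipos_boletos ADD COLUMN tipo_utf8 VARCHAR(255) CHARACTER SET utf8 COLLATE utf8_unicode_ci;
-- backfill the existing rows once
UPDATE tipos_boletos
SET tipo_utf8 = CONVERT(CAST(CONVERT(tipo USING latin1) AS binary) USING utf8);
-- keep the column in sync for new records
CREATE TRIGGER tipos_boletos_bi BEFORE INSERT ON tipos_boletos
FOR EACH ROW
SET NEW.tipo_utf8 = CONVERT(CAST(CONVERT(NEW.tipo USING latin1) AS binary) USING utf8);
The join can then compare tipo_utf8 to tipo_boleto_tipo directly, with no per-query conversion cost.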

MySQL - Load GTFS Data

I was wondering if anyone has had any success loading GTFS data into a MySQL database. I've looked all over the place for a good tutorial but I can't find anything helpful.
I succeeded in importing GTFS files into MySQL.
Step 1: Create a database
CREATE DATABASE gtfs
DEFAULT CHARACTER SET utf8
DEFAULT COLLATE utf8_general_ci;
Step 2: Create tables
For instance, create the table stops for stops.txt:
-- stop_id,stop_code,stop_name,stop_lat,stop_lon,location_type,parent_station,wheelchair_boarding
CREATE TABLE `stops` (
  stop_id VARCHAR(255) NOT NULL PRIMARY KEY,
  stop_code VARCHAR(255),
  stop_name VARCHAR(255),
  stop_lat DECIMAL(8,6),
  stop_lon DECIMAL(9,6), -- longitude runs to +/-180, so it needs one more integer digit than latitude
  location_type INT(2),
  parent_station VARCHAR(255),
  wheelchair_boarding INT(2),
  stop_desc VARCHAR(255),
  zone_id VARCHAR(255)
);
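The other GTFS files follow the same pattern. For instance, a sketch for trips.txt (the exact column list varies by feed, so check the header row of your file first):
-- route_id,service_id,trip_id,trip_headsign,direction_id,block_id,shape_id
CREATE TABLE `trips` (
  trip_id VARCHAR(255) NOT NULL PRIMARY KEY,
  route_id VARCHAR(255),
  service_id VARCHAR(255),
  trip_headsign VARCHAR(255),
  direction_id INT(2),
  block_id VARCHAR(255),
  shape_id VARCHAR(255)
);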
Step 3: Load local data
For instance, load the local file stops.txt into the table stops:
LOAD DATA LOCAL INFILE 'stops.txt' INTO TABLE stops FIELDS TERMINATED BY ',' IGNORE 1 LINES;
The complete source code with an example is on GitHub (here); adapt it slightly for your purposes.
Have you tried this one:
https://code.google.com/p/gtfsdb/
The website says:
GTFS (General Transit Feed Specification) Database
Python code that will load GTFS data into a relational database, and Sql/Geo-Alchemy ORM bindings to the GTFS tables in the gtfsdb.
The gtfsdb project's focus is on making GTFS data available in a programmatic context for software developers. The need for the gtfsdb project comes from the fact that a lot of developers start out a GTFS-related effort by first building some amount of code to read GTFS data (whether that's an in-memory loader, a database loader, etc...); gtfsdb can hopefully reduce the need for such drudgery, and give developers a starting point beyond the first step of dealing with GTFS in .csv file format.
I actually used the following link as a base and converted it to a script; it worked like a charm:
http://steffen.im/php-gtfs-to-mysql/
In my case, I created the table structures first.
Then I loaded the data via the following command in the MySQL command console:
load data local infile '/media/sf_Downloads/google_transit/stops.txt' into table stop fields terminated by ',' enclosed by '"' lines terminated by '\n' ignore 1 rows;
This loads stops.txt into the stop table; the ignore 1 rows clause skips the first row (the header) of stops.txt.
'/media/...' is just the path where I extracted the GTFS data.
I loaded the GTFS data into a SQLite database through the following commands:
Create a new SQLite database named test.sqlite3
sqlite3 test.sqlite3
Set the mode to csv
sqlite> .mode csv
Import each file into a corresponding table with the import command .import FILE TABLE
sqlite> .import agency.txt agency
sqlite> .import calendar.txt calendar
...
sqlite> .import stops.txt stops
Now your database should be loaded.
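A quick sanity check on the import (row counts should match the line counts of the .txt files, minus the header):
sqlite> SELECT COUNT(*) FROM stops;
sqlite> SELECT * FROM stops LIMIT 5;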
I was looking for something similar and still haven't found an easy way to do it. I am going to try loading GTFS into MySQL using Python.
According to Google Developers, a GTFS feed is basically a zip file containing text files. You could store it in a BLOB-type field in your MySQL database, but this isn't recommended. Store it as a file on your server's disk instead, and keep the file's name/path in a regular text/varchar field in your database.
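For example, a minimal sketch of that file-path approach (table and column names are made up for illustration):
CREATE TABLE gtfs_feeds (
  id INT AUTO_INCREMENT PRIMARY KEY,
  feed_name VARCHAR(100),
  feed_path VARCHAR(255) NOT NULL, -- where the zip lives on disk
  imported_at DATETIME
);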

Can I have specific selectable values for a column in MySQL

Basically, just as the title says: in MySQL, can I specify a fixed set of values for a column so that a choice can be selected when a record is added?
I am only starting to understand MySQL, and I know that you can do this in MS Access. I'm not sure, but do I have to put something special in the 'Length/Values' column when designing the table? If I can do it, where do I specify which values are selectable?
If not, is there a more efficient workaround than creating a new table and relating the specified values as a foreign key?
Cheers
MS Access combines a few functionalities into a single program:
A database engine (MS Jet)
A programming environment with a programming language (VBA)
A table editor
A table data editor
A form editor
A report editor
A query editor
... more
MySQL is a database engine only. So there is no natural "Select Box" to input data. This would need to come from your programming environment or form generator.
That said, there is support for such a data type: use ENUM - e.g. CREATE TABLE test (salutation ENUM('Mr.', 'Mrs.'));
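For example, values outside the list are rejected:
CREATE TABLE test (salutation ENUM('Mr.', 'Mrs.'));
INSERT INTO test (salutation) VALUES ('Mr.');  -- accepted
INSERT INTO test (salutation) VALUES ('Dr.');  -- error in strict SQL mode; stored as '' with a warning otherwise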

MySQL database normalization (taken from Excel)

I have imported a SQL database from an Excel sheet, so it's a little bit messy. For example, there's a product field with VARCHAR values such as product8. I would like to grep through these data with a regex, capture the id (8 in this example), and alter the column data types. As of now I would start preg_matching the long and hard PHP way, but I'm curious how database normalization is done right using SQL commands. Thanks in advance for your support.
You can use string functions to pull the IDs:
SELECT RIGHT(product, LENGTH(product) - 7) AS productID FROM table
This pulls the numbers (everything after the 7-character prefix product); then you can do whatever you need with them.
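To actually normalize the column, you could copy the extracted number into a proper integer column; a hedged sketch, with the table name kept as the placeholder from above:
-- add an integer column and fill it from the 'productN' strings
ALTER TABLE `table` ADD COLUMN product_id INT UNSIGNED;
UPDATE `table` SET product_id = CAST(RIGHT(product, LENGTH(product) - 7) AS UNSIGNED);
-- once verified, the old text column can be dropped:
-- ALTER TABLE `table` DROP COLUMN product;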

Zend_Db_Table insert issue with large content

I've been using Zend Framework for quite a while, but now I have an issue that's got me puzzled. I have a table (MySQL) with just 2 columns (id and msg). The id field is an auto-increment value and the msg field is a longtext type.
The following code should work just fine for inserting a record into the table:
$t = new Zend_Db_Table('table1');
$id = $t->insert(array('msg' => $content));
Whenever $content is just a plain string value like 'blah', it works just fine, like it should.
But the field is not a longtext type for nothing: the content will be quite large.
So now I try to place about 14kb of data in $content: no exception occurs and the database insert seems to have worked, but only the first 6kb is stored. The rest of the data is just gone?!
When I use the old-fashioned mysql_connect, mysql_query, etc. routines, it all works like a charm. So it really seems to be a Zend Framework issue...
Has anybody else experienced this?
Configuration
Zend Framework v1.11.10
Zend_Db_Table is configured with the PDO_MYSQL adapter
MySQL database table with two columns: id (autoincrement) and msg (longtext)
Issue
Attempting to INSERT 14kb of UTF-8-encoded HTML into the longtext column
Zend_Db_Table truncates the data at 6kb
Tests
mysql_query can INSERT the same data without issue
Zend_Db_Table can SELECT all the data without issue
setting error_reporting(-1) reveals 'no errors, warnings or notices'
mediumblob works fine
Isolating the issue
Change the column to a mediumtext. Does the insert work now?
Have you checked what the query actually looks like? See comment above from namesnik.
How many bytes get written on the failed inserts? Is it consistent?
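One way to answer that last question directly in MySQL: LENGTH() counts bytes while CHAR_LENGTH() counts characters, so comparing them on the most recent row shows exactly how much was stored and whether multi-byte encoding is involved:
SELECT id,
       LENGTH(msg)      AS bytes_stored,
       CHAR_LENGTH(msg) AS chars_stored
FROM table1
ORDER BY id DESC
LIMIT 1;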
