MySQL select spatial field value and convert to WKT with PHP - php

If I select a spatial field value from MySQL without converting it to text in the query, I get an "unreadable" string in PHP. Which function can I use to convert this to WKT?
Example:
SELECT AsText(polygon_field) FROM Table; // gives a nice WKT string.
SELECT polygon_field FROM Table; // gives an unreadable (binary?) string.
Due to limitations in the framework we use, it would be great if that string could be converted to WKT using PHP. Any ideas on what function to use? I can't seem to find anything, because all the examples I find rely on the MySQL function AsText :-(

The result is MySQL's internal storage format for spatial columns (see the manual on fetching spatial data). This format is neither WKB (well-known binary) nor WKT (well-known text).
I would not expect any other software to be able to parse those values, especially since MySQL provides the helper functions AsText and AsBinary, which are easy to use (if one does not use a hindering framework).
You can create a view that exposes the AsText output of the column in place of the column itself. This only helps you when selecting, but maybe that is all you need.
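A minimal sketch of that approach in PHP, assuming a hypothetical geo_table with an id and the polygon_field column from the question; the view simply wraps the AsText() call so the framework reads plain WKT strings:

<?php
// Sketch: a one-time view wraps AsText(), so the framework can select plain WKT.
// The table name (geo_table) and connection details are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');

$pdo->exec(
    'CREATE OR REPLACE VIEW geo_table_wkt AS
     SELECT id, AsText(polygon_field) AS polygon_field
     FROM geo_table'
);

// The framework can now query the view like a normal table and get WKT back.
$stmt = $pdo->query('SELECT id, polygon_field FROM geo_table_wkt');
foreach ($stmt as $row) {
    echo $row['polygon_field'], PHP_EOL; // e.g. POLYGON((0 0,4 0,4 4,0 4,0 0))
}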

Related

Updating Serialized value in mysql

I want to update a serialized value in the meta_value column of a MySQL table, but JSON is hard to select with SQL. How can I achieve this?
a:2:{i:0;a:2:{s:19:"sevent_speaker_name";s:8:"John Doe";s:18:"sevent_speaker_img";a:1:{i:0;s:5:"72921";}}i:1;a:2:{s:19:"sevent_speaker_name";s:10:"John Smith";s:18:"sevent_speaker_img";a:1:{i:0;s:5:"72922";}}}
Here is the value. I want to replace all sevent with elevent, but how can I do it?
Can I use LIKE? Or does it first need to be unserialized?
This encoded information probably comes from WordPress or some other PHP framework's database; it's not JSON. Note that strings are encoded by storing the string length and the string contents:
s:19:"sevent_speaker_name"
You can use MySQL's REPLACE function to replace sevent with elevent, but you must be careful to update the length values (the number after s:) as well, or WordPress/PHP won't be able to read the data back.
It's possible to write a MySQL query to update the specific example given above, but it's difficult to write a query that substitutes all strings generically; a dedicated tool can do that work for you.
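Since hand-adjusting every length is error-prone, a safer generic route is to let PHP redo the serialization. A rough sketch, assuming a WordPress-style wp_postmeta table and placeholder connection details:

<?php
// Sketch: replace "sevent" with "elevent" inside serialized meta_value rows by
// unserializing, editing, and re-serializing, so the s:<length> prefixes stay correct.
// Table name and connection details are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=wordpress;charset=utf8', 'user', 'pass');

$select = $pdo->query("SELECT meta_id, meta_value FROM wp_postmeta WHERE meta_value LIKE '%sevent%'");
$update = $pdo->prepare('UPDATE wp_postmeta SET meta_value = :value WHERE meta_id = :id');

// Recursively rename keys and rewrite string values containing "sevent".
function replaceSevent($data) {
    if (is_array($data)) {
        $result = [];
        foreach ($data as $key => $value) {
            $newKey = is_string($key) ? str_replace('sevent', 'elevent', $key) : $key;
            $result[$newKey] = replaceSevent($value);
        }
        return $result;
    }
    return is_string($data) ? str_replace('sevent', 'elevent', $data) : $data;
}

foreach ($select as $row) {
    $data = @unserialize($row['meta_value']);
    if ($data === false) {
        continue; // not valid serialized data; skip rather than corrupt it
    }
    $update->execute([
        ':value' => serialize(replaceSevent($data)),
        ':id'    => $row['meta_id'],
    ]);
}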

Is the MySQL JSON data type bad for performance for data retrieval?

Let's say I have a MySQL JSON column called custom_properties on a media table.
An example of the JSON data stored in the custom_properties column could be:
{
"company_id": 1,
"uploaded_by": "Name",
"document_type": "Policy",
"policy_signed_date": "04/04/2018"
}
In my PHP Laravel app I would do something like this:
$media = Media::where('custom_properties->company_id', Auth::user()->company_id)->orderBy('created_at', 'DESC')->get();
This would fetch all media items belonging to company 1.
My question is: let's say we have 1 million media records; would this be a bad way to fetch records in terms of performance? Can anyone shed some light on how MySQL indexes JSON data types?
Would the performance be significantly better if we used separate, joined tables and indexed the columns instead? I'd like to know what the actual performance difference would be.
From the MySQL official docs:
JSON documents stored in JSON columns are converted to an internal format that permits quick read access to document elements. When the server later must read a JSON value stored in this binary format, the value need not be parsed from a text representation. The binary format is structured to enable the server to look up subobjects or nested values directly by key or array index without reading all values before or after them in the document.
When they say "quick read access" they mean "better than if you stored JSON in a TEXT column."
It's still bad performance compared to an indexed lookup.
Using JSON_EXTRACT() or the -> operator is the same as searching on any other expression in MySQL, in that it causes the query to do a table-scan.
If you want better performance, you must create an index on the field you search for. That requires defining a generated column before you can make an index.
See https://mysqlserverteam.com/indexing-json-documents-via-virtual-columns/
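For illustration, here is roughly what that setup could look like as a Laravel-style migration, assuming MySQL 5.7.8+; the media table and custom_properties column come from the question, while the company_id column name and index name are placeholders:

<?php
// Sketch: add an indexed generated column for the JSON company_id key,
// so lookups by company can use an index instead of scanning the JSON on every row.

use Illuminate\Database\Migrations\Migration;
use Illuminate\Support\Facades\DB;

class AddCompanyIdToMediaTable extends Migration
{
    public function up()
    {
        // Materialise the JSON key as a virtual column and index it.
        DB::statement(
            "ALTER TABLE media
                ADD COLUMN company_id INT
                    AS (JSON_UNQUOTE(JSON_EXTRACT(custom_properties, '$.company_id'))) VIRTUAL,
                ADD INDEX media_company_id_index (company_id)"
        );
    }

    public function down()
    {
        // Dropping the column also removes its index.
        DB::statement('ALTER TABLE media DROP COLUMN company_id');
    }
}

With the generated column in place, the lookup can filter on it directly, e.g. Media::where('company_id', Auth::user()->company_id), and EXPLAIN should confirm that the index is used rather than a table scan.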

Search MySQL records based on a column that holds a JSON string

I want to query records from a MySQL table using a column that holds a JSON string. As I am working on an old project, the MySQL version used is 5.0, so I can't use the MySQL 5.7 JSON functions such as JSON_CONTAINS.
select * from cpd_company where operation_data = '{"opening_status":"open", "days": ["Mon"]}';
operation_data column contains such json string:
{"days":["Mon","Fri"],"open_hour":"12:00 AM","close_hour":"12:30 PM","operating_type":"daily","opening_status":"open","journey_type":"Departure"}
This is not working because the column is of the TEXT datatype and the supplied string doesn't equal the full stored string.
Can someone help? Thanks
Maybe you should try LIKE '%{$needle}%' and other string functions, in case you do not want to reorganize your database structure?
As far as I can see from your example, you could split your query to obtain the needed result:
1) the first query selects entries containing the "opening_status":"open" needle,
2) the second selects entries containing the "Mon" needle,
3) then you find the intersection of the two result sets.
This will only work if there is a single "Mon" entry, but I believe you can find a workaround using MySQL string functions in case there are several "Mon"s in your JSON string.
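A rough sketch of that idea in PHP, with placeholder connection details; combining both LIKE conditions with AND gives the intersection of the two result sets, and it only works because the stored JSON always uses the exact "key":"value" formatting shown in the question:

<?php
// Sketch: emulate a JSON search on MySQL 5.0 with plain LIKE string matching.
// Connection details are placeholders; table and column names are from the question.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'user', 'pass');

$sql = 'SELECT *
        FROM cpd_company
        WHERE operation_data LIKE :status
          AND operation_data LIKE :day';

$stmt = $pdo->prepare($sql);
$stmt->execute([
    ':status' => '%"opening_status":"open"%',
    ':day'    => '%"Mon"%',   // may also match "Mon" appearing elsewhere in the JSON
]);

$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);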

Unserializing PHP array in SQL

One of the former developers in an old system I'm working on used PHP serialize() to flatten one array of data into one column of a MySQL table.
I need to write a complex query that involves this column. Is there a way to unserialize the data (even if it's into a comma separated string) within the SQL query or do I need to pass the results back to PHP?
Serialized PHP data is just a string, so for "simple" searches, you can use basic MySQL string operations.
e.g. if the serialized array has a name field and you need to match against that (note that the quotes around the value are part of the serialized format):
... WHERE LOCATE(CONCAT('s:', LENGTH('foo'), ':"foo"'), serializeddatafield) > 0
which would produce
WHERE LOCATE('s:3:"foo"', serializeddatafield) > 0
But then, this would find ANY strings whose value is foo anywhere in the data, and not necessarily in the particular array field you want.
So you're probably better off yanking out the whole field, deserializing it in PHP, and doing the search there. You could use the above technique to minimize the total number of rows you'd have to yank and parse, however.
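For example, a sketch of that combined approach with placeholder table and column names (sometable, serializeddatafield) and an assumed name key: the LOCATE test narrows down the candidate rows, and PHP then verifies the exact array field after unserializing.

<?php
// Sketch: prefilter rows in SQL, then check the precise array field in PHP.
// Table/column names, the 'name' key, and connection details are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'user', 'pass');

$needle   = 'foo';
$fragment = serialize($needle); // 's:3:"foo";' - how the value appears inside the blob

// Coarse filter: only fetch rows whose blob contains the serialized fragment anywhere.
$stmt = $pdo->prepare(
    'SELECT id, serializeddatafield
     FROM sometable
     WHERE LOCATE(:fragment, serializeddatafield) > 0'
);
$stmt->execute([':fragment' => $fragment]);

$matches = [];
foreach ($stmt as $row) {
    $data = @unserialize($row['serializeddatafield']);
    // Precise check: the value must sit in the "name" field, not just anywhere in the blob.
    if (is_array($data) && isset($data['name']) && $data['name'] === $needle) {
        $matches[] = $row['id'];
    }
}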

Break up data from one column into multiple columns in a new table (MySQL)

I have a table full of data, where one column holds a different entry for each row, formatted like this:
A:some text|B:some other text|C:some more text|
I want to separate those strings of text into two columns in a new table. So the new table should have one column for A, B, C, etc., and the other column should hold the rest of the text for the respective rows.
And there is another value (a DATETIME value) in a separate column of the first table that I would like to copy into a third column for each of the separated entries.
Let me know if this needs clarification; I know it's kind of confusing and I'm pretty fuzzy with MySQL. Thanks!
MySQL supports SUBSTRING; together with LOCATE you could probably whip up something nice, based on the pipe symbol you seem to use as a separator.
http://dev.mysql.com/doc/refman/5.1/en/string-functions.html#function_locate
http://dev.mysql.com/doc/refman/5.1/en/string-functions.html#function_substring
In most cases I prefer to write "converters" in another language rather than doing it directly in the database; however, in this situation it doesn't look like that much data, so it might work fine.
I think you would be better off writing a simple script in VBScript, PHP, or any other scripting language of your choice. All scripting languages provide string manipulation and date formatting functions. Database queries won't let you handle the "unexpected".
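A rough sketch of such a script in PHP, with hypothetical table and column names (old_table with data and created_at columns, new_table with label, content, and created_at), since the question does not give the real schema:

<?php
// Sketch: split 'A:some text|B:some other text|' style rows into a new table.
// Table and column names are hypothetical; adjust to the real schema.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'user', 'pass');

$rows   = $pdo->query('SELECT data, created_at FROM old_table');
$insert = $pdo->prepare(
    'INSERT INTO new_table (label, content, created_at) VALUES (:label, :content, :created_at)'
);

foreach ($rows as $row) {
    // Entries are pipe-separated; the trailing pipe produces an empty piece we skip.
    foreach (explode('|', $row['data']) as $entry) {
        if ($entry === '') {
            continue;
        }
        // Split only on the first colon: "A:some text" -> label "A", content "some text".
        $parts = explode(':', $entry, 2);
        if (count($parts) !== 2) {
            continue; // skip malformed entries instead of failing mid-run
        }
        [$label, $content] = $parts;
        $insert->execute([
            ':label'      => $label,
            ':content'    => $content,
            ':created_at' => $row['created_at'],
        ]);
    }
}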
