Zend_Db_Table insert issue with large content - php

I've been using Zend Framework for quite a while, but now I have an issue that has me puzzled. I have a MySQL table with just two columns (id and msg). The id field is an auto-increment value and the msg field is a longtext type.
The following code should work just fine for inserting a record into the table:
$t = new Zend_Db_Table('table1');
$id = $t->insert(array('msg' => $content));
Whenever $content is just a plain string value like 'blah', it works just fine, like it should.
But the field is not a longtext for no reason: the content will be quite large.
So now I try to place about 14 KB of data in $content. No exception occurs and the insert seems to have worked, but only the first 6 KB is stored. The rest of the data is just gone?!
When I try the old-fashioned mysql_connect, mysql_query, etc. routines instead, it all works like a charm. So it really seems to be a Zend Framework issue...
Has anybody else experienced this?

Configuration
Zend Framework v1.11.10
Zend_Db_Table is configured with the PDO_MYSQL adapter
MySQL table with two columns: id (auto increment) and msg (longtext)
Issue
Attempting to INSERT 14 KB of UTF-8-encoded HTML into the longtext column
Zend_Db_Table truncates the data at 6 KB
Tests
mysql_query can INSERT the same data without issue
Zend_Db_Table can SELECT all the data without issue
setting error_reporting(-1) reveals 'no errors, warnings or notices'
mediumblob works fine
Isolating the issue
Change the column to a mediumtext. Does the insert work now?
Have you checked what the query actually looks like? See the comment above from namesnik, and the profiler sketch below.
How many bytes get written on the failed inserts? Is it consistent?
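To see exactly what Zend_Db hands to PDO, you can enable the adapter's query profiler and dump the last profile. A minimal sketch, assuming the table uses the default adapter registered for Zend_Db_Table and the app's usual bootstrap has loaded the Zend classes:

$db = Zend_Db_Table::getDefaultAdapter();
$db->getProfiler()->setEnabled(true);

$t = new Zend_Db_Table('table1');
$id = $t->insert(array('msg' => $content));

// Compare what was actually bound against what you passed in.
$profile = $db->getProfiler()->getLastQueryProfile();
echo $profile->getQuery(), PHP_EOL;       // the prepared INSERT statement
var_dump($profile->getQueryParams());     // the bound values
var_dump(strlen($content));               // should match the bound msg length

If the bound parameter is already truncated, the problem is on the PHP side before the query runs; if it is intact, look at the driver or the server.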

Related

PHP query not returning data from new SQL column (MAMP)

I needed to simply add a new column to my DB table during development to accommodate a data change; however, my query, when run from my PHP script, is not returning the column or the data within said new column. My query is as straightforward as it gets: SELECT * FROM time_table ORDER BY date DESC. It returns all previously existing columns from time_table, which leads me to believe there is a caching issue somewhere. I am using MAMP for local development, if that helps.
Thanks in advance.
Looks like the problem was a fault of my own.
Moral of the story: Do not forget to update your model(s) after making structure changes to the DB.
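Since the framework behind the "model(s)" isn't named, a quick way to separate a database problem from a model-caching problem is a bare PDO query using the question's own SQL. A sketch, with hypothetical MAMP connection details:

// Bypass the model layer entirely and ask MySQL directly.
$pdo = new PDO('mysql:host=localhost;port=8889;dbname=app', 'root', 'root'); // hypothetical MAMP credentials
$row = $pdo->query('SELECT * FROM time_table ORDER BY date DESC LIMIT 1')->fetch(PDO::FETCH_ASSOC);
var_dump(array_keys($row)); // if the new column is listed here, the staleness is in the model layer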

Laravel PHPUnit failing on ALTER TABLE using SQLite

I have a migration which I made at the beginning of my project, basically adding a TEXT column called 'description' which is set to NOT NULL.
Now several months down the track I need to change that to allow null.
I can't use Laravel 5.5's change() function, as I have an enum in my column list and it bugs out, so I need to run it as a raw query in a migration, like so:
DB::statement('ALTER TABLE `galleries` MODIFY `description` TEXT NULL;');
When I do a php artisan migrate against my local MySQL database it all works great, BUT when I try to run my test suite, it all breaks.
I'm using SQLite for my test suite, and the error I'm getting is as follows:
PDOException: SQLSTATE[HY000]: General error: 1 near "MODIFY": syntax error
If anyone else has come up against this issue and fixed it, I would love to hear how you did it.
Thanks
SQLite only allows you to rename the table or add a column. The ALTER TABLE statement cannot change or remove columns.
In order to change or remove a column in SQLite, you need to create a new table with the desired schema, copy the data from the original table to the new table, delete the original table, and then rename the new table to the original name.
This is all abstracted out for you by Laravel and DBAL, so your best bet may be to get help with figuring out the issue with your enum column (though that would be a separate question).
You can read more about altering tables in the SQLite documentation.
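For illustration, that rebuild can be written as raw statements in the migration and branched by driver, so MySQL keeps the one-line MODIFY. A sketch, where the galleries column list (just id and description here) is a placeholder for the real schema:

use Illuminate\Support\Facades\DB;

if (DB::getDriverName() === 'sqlite') {
    // SQLite: rebuild the table, since MODIFY is not supported.
    DB::statement('CREATE TABLE galleries_tmp (id INTEGER PRIMARY KEY, description TEXT NULL)');
    DB::statement('INSERT INTO galleries_tmp (id, description) SELECT id, description FROM galleries');
    DB::statement('DROP TABLE galleries');
    DB::statement('ALTER TABLE galleries_tmp RENAME TO galleries');
} else {
    DB::statement('ALTER TABLE `galleries` MODIFY `description` TEXT NULL;');
}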

MySQL Data too long

Context: I'm using Doctrine to store an array as longtext in a MySQL column. I received Notice: unserialize(): Error at offset 250 of 255 bytes, so I did some backtracking and realized the serialized string was truncated, supposedly because it is too big for a longtext column. I really doubt that is the case; this string is nowhere near 4 GB.
Someone on this question suggested taking a look at SET max_allowed_packet, but mine is 32M.
a:15:{i:0;s:7:"4144965";i:1;s:7:"4144968";i:2;s:7:"4673331";i:3;s:7:"4673539";i:4;s:7:"4673540";i:5;s:7:"4673541";i:6;s:7:"5138026";i:7;s:7:"5140255";i:8;s:7:"5140256";i:9;s:7:"5140257";i:10;s:7:"5140258";i:11;s:7:"5152925";i:12;s:7:"5152926";i:13;s:7:"51
Mysql table collation: utf8_unicode_ci
Any help would be greatly appreciated !!
Full Error
Operation failed: There was an error while applying the SQL script to the database.
ERROR 1406: 1406: Data too long for column 'numLotCommuns' at row 1
SQL Statement:
UPDATE `db`.`table` SET `numLotCommuns`='a:15:{i:0;s:7:\"4144965\";i:1;s:7:\"4144968\";i:2;s:7:\"4673331\";i:3;s:7:\"4673539\";i:4;s:7:\"4673540\";i:5;s:7:\"4673541\";i:6;s:7:\"5138026\";i:7;s:7:\"5140255\";i:8;s:7:\"5140256\";i:9;s:7:\"5140257\";i:10;s:7:\"5140258\";i:11;s:7:\"5152925\";i:12;s:7:\"5152926\";i:13;s:7:\"51}' WHERE `id`='14574'
The column was a tinytext...
The only logical explanation I can see is that either the default was tinytext when I created the table in an earlier version of Doctrine,
OR
I changed the column's type in the Doctrine annotations at some point and the update didn't fully convert the type.
Bottom line: check your types even though you use an ORM.
Your column must have been defined as varchar(250).
You need to convert it to longtext first.
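Whatever the original cause, the fix is to widen the column and make the Doctrine mapping agree with it. A sketch using the names from the error message, assuming Doctrine 2 annotations (the surrounding entity class is not shown in the question):

// Doctrine's "text" type maps to LONGTEXT on the MySQL platform, so the
// schema tool generates the wide column; or run the ALTER by hand:
//   ALTER TABLE `table` MODIFY `numLotCommuns` LONGTEXT;

/** @Column(type="text") */
protected $numLotCommuns;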

ID cannot be null (Auto Increment)

I'm using an INSERT ON DUPLICATE KEY statement for my website. It's for creating news items, so I figured I could use the same MySQL command for both creating and updating news items.
However, when I use the following:
INSERT INTO table (id,title,content) VALUES(NULL,"Test","Test");
Instead of creating a new auto-increment value, it throws an error. The command works on my main development server but not on my laptop. Both MySQL versions are the same; the only difference is that MySQL was installed manually on my server and via WAMP on my laptop.
Are there any MySQL Variables that could be causing this?
I would suggest using INSERT INTO table (title,content) VALUES("Test","Test");
This will create a new row in the table with a new incremented ID.
Managed to solve it as best as I can.
I checked my code and found that when the POST'd ID was empty, my code was wrapping it in quotation marks, inserting '' instead of NULL. I've now changed it so that it puts NULL without quotation marks, so my query now looks like:
INSERT INTO table (id,title,content) VALUES(NULL,"test","Test")
ON DUPLICATE KEY UPDATE title=VALUES(title), content=VALUES(content);
That now works.
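That also answers the question about MySQL variables: with a strict sql_mode (e.g. STRICT_TRANS_TABLES), inserting the quoted empty string '' into an INT column is an error, while a lax server silently coerces it to 0 and the auto-increment kicks in. A quick way to compare the two servers (sketch, with hypothetical credentials):

// Print sql_mode on each server; the strict one is the machine rejecting ''.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // hypothetical credentials
echo $pdo->query('SELECT @@sql_mode')->fetchColumn(), PHP_EOL;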
I think you should write the query like this:
INSERT INTO table (title,content) VALUES("Test","Test");
If it still doesn't work, check whether the id column is set to auto-increment.

Should I be able to update a MySQL enum type with a text value?

I'm trying to update the value in a MySQL enum field from PHP via Doctrine (5.3 and 1.2 respectively).
I get an error when I try to do this:
$q = Doctrine_Query::create()
->update('StMessages')
->set('status','new')
->where('message_id = ?',$msg_id);
I get an SQL state error telling me that the column 'new' does not exist. If I enter 3 instead of 'new' (presumably the internal index of the 'new' value), the query works. In fact, it happens in an SQL client too, so perhaps this is a quirk of this version of MySQL? It's 5.1.45.
Anyone know if this is how MySQL is supposed to treat enums or if this is more likely a Doctrine issue? I have 'use_native_enum' set to true.
Given a table
create table test (
enumfield enum('a','b','c')
);
you should be able to do
update test set enumfield='a';
which is the whole point of the enum field: not having to mess with indexes and whatnot.
What's the exact definition of your enum field?
I think this was actually an issue relating to the quoting of strings when updating the field's values. I'm not sure if Doctrine or I was at fault.
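For the record, the quoting behaviour has a known shape in Doctrine 1.x: set()'s second argument is treated as a raw SQL expression, so a bare new is parsed as an identifier. Binding the value as a parameter makes Doctrine quote it as a string; a sketch against the question's model:

$q = Doctrine_Query::create()
    ->update('StMessages')
    ->set('status', '?', 'new')   // bind as a parameter instead of inlining
    ->where('message_id = ?', $msg_id);
$rows = $q->execute();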
