I am exporting data into a PDF using TCPDF. Everything works fine until I add a certain column (long text format) to the table. Whenever I add it, the table doesn't show up. When I run the SQL query, all the data shows up fine.
Is it possible there's a character or characters in the data field itself that are causing the table to become corrupted?
Also, I can't for the life of me figure out how to show more than 256 characters in a cell.
If anyone can help I'd really appreciate it.
Well, until I find a better option, I am running this against each of my comment fields:
UPDATE TABLE_NAME set COLUMN_NAME = replace(COLUMN_NAME, '’', '`');
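For what it's worth, the root cause is usually encoding rather than length: TCPDF expects UTF-8 input by default, so a Windows-1252 smart quote ('’') in the data can silently break the rendered table. Below is a minimal sketch of converting the data before output; the $rows array, the 'comment' column, and the Windows-1252 source encoding are assumptions about your setup:
require_once 'tcpdf/tcpdf.php';

$pdf = new TCPDF(); // defaults to Unicode / UTF-8
$pdf->SetFont('helvetica', '', 10);
$pdf->AddPage();

$html = '<table border="1">';
foreach ($rows as $row) { // $rows: the result set from your SQL query
    // convert from Windows-1252 (or whatever your DB returns) to UTF-8,
    // dropping anything unconvertible instead of corrupting the output
    $comment = iconv('Windows-1252', 'UTF-8//IGNORE', $row['comment']);
    $html .= '<tr><td>' . htmlspecialchars($comment) . '</td></tr>';
}
$html .= '</table>';

$pdf->writeHTML($html);
$pdf->Output('export.pdf', 'I');
On the 256-character point: Cell() draws a single line, but MultiCell() and writeHTML() wrap text of any length.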
I have products stored in a MySQL database. It's a WordPress website, but my data is stored in custom tables. I need to search for products, and I'm currently facing some performance issues that I hope someone can help me with or point me in the right direction on.
Since I receive a file (*.csv) once a day to update all my products (add, update, or remove products), I have a process to read the file and populate/update the tables. In this process, I added a step to filter the data and replace any special character with its plain equivalent (for example, replace 'á' with 'a').
Currently, I have a table (products_search) related to the products table (products) and built from it; I use this table for searches. When the user searches for something, I apply the same special-character replacement to the input, so the search runs directly against that table.
The problem: searching in "text" columns is slow, even after adding an index on that column. I currently search like this:
select * from products_search
where description like '%search_word_1%'
or description like '%search_word_2%' ...
If I get a result, I take the ID, join back to the products table, and fetch all the info I need to show the user.
Solution looked for: I'm looking for a way to search the products_search table with better performance. The WordPress search engine, as I understand it, works only on the "posts" table. Is there any way to do a quicker search, perhaps using a plugin, or by changing the way the search is done?
Thanks to all
I think we need to revise the nightly loading in order to make the index creation more efficient.
I'm assuming:
The data in the CSV file replaces the existing data.
You are willing to use FULLTEXT for searching.
Then do:
CREATE TABLE new_data (...) ENGINE=InnoDB;
LOAD DATA INFILE '...' INTO TABLE new_data ...;
Cleanse the data in new_data.
ALTER TABLE new_data ADD FULLTEXT(...); The column(s) to index here either already exist or are added during the CREATE or cleansing steps.
RENAME TABLE real_data TO old_data, new_data TO real_data;
DROP TABLE old_data;
Note that this has essentially zero downtime for real_data so you can continue to do SELECTs.
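Putting those steps together as one nightly script (a sketch; the file path, column list, and CSV options are placeholders you'd adapt to your feed):
CREATE TABLE new_data (
    id INT NOT NULL,
    description TEXT,
    -- ... the rest of the product columns ...
    PRIMARY KEY (id)
) ENGINE=InnoDB;

LOAD DATA INFILE '/path/to/products.csv'
    INTO TABLE new_data
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip the header row

-- cleanse new_data here (strip accents, trim, etc.)

ALTER TABLE new_data ADD FULLTEXT(description);

RENAME TABLE real_data TO old_data, new_data TO real_data;
DROP TABLE old_data;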
You have not explained how you spray the single CSV file into wp_posts and wp_postmeta. That sounds like a nightmare buried inside my step 3.
FULLTEXT is immensely faster than futzing with wp_postmeta. (I don't know if there is an existing way or plugin to achieve such.)
With `FULLTEXT(description)`, your snippet of code would use
WHERE MATCH(description) AGAINST ('word1 word2' IN BOOLEAN MODE)
instead of the very slow LIKE with a leading wildcard.
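From PHP, building the boolean-mode string might look like this (a sketch; $mysqli, $userInput, and the operator-stripping rule are assumptions, not your code):
// require every word: strip boolean-mode operators, then prefix each with +
$words = preg_split('/\s+/', trim($userInput), -1, PREG_SPLIT_NO_EMPTY);
$required = array();
foreach ($words as $w) {
    $required[] = '+' . preg_replace('/[+\-<>()~*"@]/', '', $w);
}
$search = implode(' ', $required);

$stmt = $mysqli->prepare(
    'SELECT id FROM products_search WHERE MATCH(description) AGAINST (? IN BOOLEAN MODE)'
);
$stmt->bind_param('s', $search);
$stmt->execute();
$ids = $stmt->get_result()->fetch_all(MYSQLI_ASSOC); // IDs to join back to products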
If you must use wp_postmeta, I recommend https://wordpress.org/plugins/index-wp-mysql-for-speed/
I have spent three days trying to get my form to submit correctly. I ended up installing CodeIgniter and Grocery CRUD again, but it's always the same problem.
If I type a URL inside the input to update it in the database, it will not work:
http://example.cc
but if I add a return (an empty line) before it, it submits correctly:
(NEW LINE)
http://example.cc
Meanwhile, I'm unable to update a column with HTML if it contains certain tags such as
<input>
Just click on project properties and try to update the YouTube video URL, or try to change the paypal_form.
Edit: What I find really strange is that I can update the description column of the table that came with the example (click on the products link, try to put in the code below, and it works), but not for project properties -> description.
Here's an example line of code that, if I type it, stops the form from submitting:
<input>
Below is the database I'm trying to edit through Grocery CRUD.
So what can cause this problem? My table and the example's table have the same column types; only the number of columns and their names differ.
And below is the code I'm using in my controller to produce the table:
$crud = new grocery_CRUD();
$crud->set_theme('datatables');
$crud->set_table('uf_object_properties');
$crud->set_subject('Property');
$crud->required_fields('value');
$crud->columns('property_name', 'property_value');
$output = $crud->render();
$this->load->view('myview.php', $output);
Grocery CRUD will strip any tags from a text input to make the content safe. If you want to add HTML, change the datatype of your MySQL column to "text" and use the HTML editor. There is probably a workaround, but you would need to get into the library code.
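If I remember the API correctly (treat the method name as something to verify against your Grocery CRUD version), forcing the field to render with the editor looks roughly like this once the column is TEXT in MySQL:
$crud->set_table('uf_object_properties');
// with the column changed to TEXT, Grocery CRUD should offer the HTML
// editor for it; change_field_type can force how the field is rendered
$crud->change_field_type('property_value', 'text');
$output = $crud->render();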
OK, after struggling a lot with this, it turned out to be a bug in Grocery CRUD.
Simply put, if the column name contains the letter V or something like that, it won't work properly with HTML tags or special characters.
I started deleting columns one by one from the example that worked for me, to get to the table structure I wanted; after removing all the extra columns it still worked. Then I renamed them column by column, and once I renamed the column 'productDescription' to 'propertyValue', which is the name I wanted, it stopped working. So I started deleting letter by letter and found out the problem was the column name.
Examples of column names that produce this problem:
propertyValue
propertyVal
propertyV
propertyValeur
So I just replaced the second word, 'Value', with another word, and the problem was solved.
But seriously, this is a very confusing bug that needs to be fixed sooner or later.
EDIT:
The names above don't cause Grocery CRUD to stop working entirely; they just make it unable to submit content that contains, for example, <input> or a link such as http://exmple.cc.
Thanks all for the help
Our client has sent us a CSV file of data that I need to import into a specific table in our PostgreSQL 8.3.9 database. The database uses UTF-8 character encoding; our CMS allows multiple languages, such as French, which are entered into the database via the CMS. One particular facility lets the client upload images to the server and then enter "alt" tags for them in French. However, because a bulk update is required, we have been sent a CSV to feed into a particular table - the image alt tags, in French.
The CSV has some special characters such as "é" - e.g.
"Bottes Adaptées Amora Cuir Faux-Croco Fauve Photo d'Ensemble"
The images themselves are hosted in two places - one is a CDN, and one is a local database backup plus local (web) server file backup. I am using a PHP script to read the CSV file and do the needful so that the "alt" tags are updated in both places - our web database and the CDN.
However, when I read the CSV (using PHP), the character does not "come out" as expected.
The data is coming out as "Bottes Adapt�es Amora Cuir Faux-Croco Fauve Photo d'Ensemble".
I don't think this has anything to do with the database; it has something to do with my PHP file reading the CSV data. Even if I print the data as it is read, the special character above does not print as expected - it prints as if the special character is not recognised. Other characters print fine.
Here is the code I'm using (note: some special custom functions are used here to interact with the database, but they can be ignored). The CSV file is made up of {column 1} for the image name and {column 2} for the ALT tag.
$handle = fopen($conn->getIncludePath() . "cronjobs/GIB_img_alt_tags_fr.csv", "r");
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    // check whether the image exists ($conn->query is one of our custom DB wrapper functions)
    $result = $conn->query("SELECT imageid, image_fileref FROM table1 WHERE image_fileref = '" . $data[0] . "'");
    if ($conn->Numrows($result)) { // if rows were found
        $row = $conn->fetchArray($result);
        // printing the data from $row here
    }
}
fclose($handle);
You've still omitted key information - when asking for help with an UPDATE don't delete the UPDATE statement from the code - and your description of the problem is very confused, but there's some hint of what's going on.
Mismatched encodings
It's highly likely that your PHP connection has a client_encoding set to something other than UTF-8. If you're sending UTF-8 data down the connection without conversion, the connection's client_encoding must be UTF-8.
To confirm, run SHOW client_encoding as a SQL statement from PHP and print the result. Add SET client_encoding = 'UTF-8' to your code before importing the CSV and see if that helps. Assuming, of course, that the CSV file is really UTF-8 encoded. If it isn't, you need to either transcode it to UTF-8 or find out what encoding it is in and SET client_encoding to that.
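From PHP, that check-and-set could look like this (pg_set_client_encoding is the built-in helper; 'UTF8' is PostgreSQL's spelling of the encoding name):
// confirm what the connection currently uses
$enc = pg_fetch_result(pg_query($conn, 'SHOW client_encoding'), 0, 0);
echo "client_encoding is: $enc\n";

// switch the connection to UTF-8 before touching the data
pg_set_client_encoding($conn, 'UTF8');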
Read The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!) and the PostgreSQL manual on character set support.
Better approach
The approach you're taking is unnecessarily slow and inefficient, anyway. You should be:
Opening a transaction
Creating a temporary table in the database with the same structure as the CSV file.
Using pg_copy_from to load the CSV into the temp table, with appropriate options to specify the CSV format.
Merging the contents of the temporary table into the destination table with an INSERT then an UPDATE, e.g.:
INSERT INTO table1 (image_fileref, ... other fields ...)
SELECT n.image_fileref, ... other fields ...
FROM the_temp_table n
WHERE NOT EXISTS (SELECT 1 from table1 o WHERE o.image_fileref = n.image_fileref);
UPDATE table1 o
SET .... data to update ....
FROM the_temp_table n
WHERE o.image_fileref = n.image_fileref;
Committing the transaction
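A sketch of the first few steps in PHP. Note that pg_copy_from takes pre-split delimited lines rather than raw CSV, so the rows are re-assembled with tabs; this assumes the fields themselves contain no tabs or newlines, and the column names are illustrative:
pg_query($conn, 'BEGIN');

// temp table matching the CSV layout; vanishes at commit
pg_query($conn, 'CREATE TEMP TABLE the_temp_table
                 (image_fileref text, alt_tag text) ON COMMIT DROP');

// parse the CSV properly, then feed tab-delimited lines to COPY
$rows = array();
$handle = fopen('cronjobs/GIB_img_alt_tags_fr.csv', 'r');
while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
    $rows[] = implode("\t", $data) . "\n";
}
fclose($handle);
pg_copy_from($conn, 'the_temp_table', $rows, "\t");

// ... run the INSERT / UPDATE merge shown above here ...

pg_query($conn, 'COMMIT');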
The INSERT may be more efficiently written as a left outer join with an IS NULL filter to exclude matching rows. It depends on the data. Try it.
I probably could've written a faster CTE-based version, but you didn't say what version of Pg you were using, so I didn't know if your server supported CTEs.
Since you left out the UPDATE I can't be more specific about the UPDATE or INSERT statements. If you'd provided the schema for table1 or even just your INSERT or UPDATE I could've said more. Without sample data I haven't been able to run the statements to check them, and I didn't feel like making up some dummy data, so the above is untested. As it is, completing the code is left as a learning exercise. I will not be updating this answer with fully-written-out statements, you get to work that out.
I have a tab delimited text file with the first row being label headings that are also tab delimited, for example:
Name ID Money
Tom 239482 $2093984
Barry 293984 $92938
The only problem is that there are 30-some columns instead of 3, so I'd rather not type out the whole (name VARCHAR(50), ...) list if it's avoidable.
How would I go about writing a function in PHP that creates the table from scratch from the text file, say taking $file_path and $table_name as parameters? Do I have to write out all the column names again, telling MySQL what type they are, and chop off the top row, or is there a more elegant solution when the names are already there?
You would somehow need to map a column type to each column in your file. You could do this by adding that data to your text file. For instance:
Name|varchar(32) ID|int(8) Money|int(10)
Tom 239482 $2093984
Barry 293984 $92938
or something similar. Then write a function that gets the column names and column types from the first line, and the data to fill the table with from all the other rows. You might also want to add a way to name the given table, etc. However, this would probably be as much work (if not more) as creating SQL queries from your text file: add a CREATE TABLE statement at the top and INSERT statements for each line. With search and replace this could be done very fast.
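A rough sketch of that idea, assuming the pipe-separated "label|type" header format above, a tab-delimited file, and a $mysqli connection (the function name is illustrative, and edge cases like pipes inside labels are ignored):
function createTableFromFile(mysqli $mysqli, $file_path, $table_name)
{
    $handle = fopen($file_path, 'r');
    $headers = fgetcsv($handle, 0, "\t"); // e.g. array("Name|varchar(32)", "ID|int(8)", ...)
    fclose($handle);

    $cols = array();
    foreach ($headers as $h) {
        list($name, $type) = explode('|', trim($h), 2); // split label from declared type
        $cols[] = "`$name` $type";
    }

    $mysqli->query("CREATE TABLE `$table_name` (" . implode(', ', $cols) . ")");
}
The remaining rows could then be loaded with the same fgetcsv loop feeding INSERT statements.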
Even if you could find a way to do this, how would you determine the column type? I guess there would be some way to determine the type of the columns through checking for certain attributes (int, string, etc). And then you'd need to handle weird columns like Money, which might be seen as a string because of the dollar sign, but should almost certainly be stored as an integer.
Unless you plan on using this function quite a bit, I wouldn't bother spending time cobbling it together. Just fat finger the table creation. (Ctrl-C, Ctrl-V is your friend)
I am having a problem inserting a long text (around 9,000 characters) with an INSERT query in PHP. I have already tried changing the column type (TEXT, MEDIUMTEXT, LONGTEXT), even though the TEXT type should be enough for 9,000 characters.
I also tested with text free of any special characters or quotes.
I print my query and it looks OK, so I copy and paste it into phpMyAdmin and the row inserts correctly. So the problem comes when I try to INSERT from my PHP class.
I tested with a smaller text and that seems to work fine.
I really need to get this solved. If anyone has the solution please let me know.
Thanks!
I haven't yet found what the problem is with inserting my long texts, but I have found a workaround. It is not very clean, but at least it will work until I find the real problem. In case anyone has the same issue, this is what I did:
Split the text into pieces of 1,000 characters, do my INSERT, and then UPDATE the text field in the database, appending the pieces of text. So, the code:
$textArray = str_split($text, 1000); // split into 1000-character chunks
foreach ($textArray as $t) {
    $model = new Article_Model_UpdateText($id, $t); // appends one chunk per call
}
The query in Article_Model_UpdateText looks like this:
"UPDATE mg_articles SET text = CONCAT(text, '".$text."') WHERE idArticle = ".$id.";";
Hope this helps someone, thanks for all your replies.
Try the BLOB or LONGBLOB datatype in MySQL.
That should do the job.
My crystal ball suggests the issue may be related to max_allowed_packet:
http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_max_allowed_packet
(But it's just a blind shot.)
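To check whether that's the culprit, compare the variable with the size of your INSERT statement (the 16 MB value below is just an example):
SHOW VARIABLES LIKE 'max_allowed_packet';

-- raise it if the INSERT is bigger; needs the SUPER privilege and applies
-- to new connections, or set max_allowed_packet = 16M under [mysqld] in my.cnf
SET GLOBAL max_allowed_packet = 16 * 1024 * 1024;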
You should use mysqli_real_escape_string when storing long texts. It will look like this:
$variable = mysqli_real_escape_string($conn, $longText); // $conn: your mysqli connection link
Now you can store $variable in your database with an INSERT query; you should be storing it in a LONGTEXT field in the database.
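Alternatively, a prepared statement sidesteps escaping and quoting altogether, which also rules out quoting bugs as the cause of the copy-paste-works-but-PHP-fails symptom described above. A minimal sketch with mysqli, reusing the table from the workaround:
$stmt = $mysqli->prepare('INSERT INTO mg_articles (idArticle, text) VALUES (?, ?)');
$stmt->bind_param('is', $id, $longText); // 'i' = integer, 's' = string; no manual quoting needed
$stmt->execute();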