MySQL LOAD DATA LOCAL INFILE & special characters (& PHP)

I am trying to understand how to write a query to import my goodreads.com book list (csv) into my own database. I'm using PHP.
$query = "load data local infile 'uploads/filename.csv' into table books
RETURN 1 LINES
FIELDS TERMINATED BY ','
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(Title, Author, ISBN)";
This results in data in the wrong columns, and I can't get the ISBN no matter what I try; I'm guessing the equals (=) sign is causing problems.
Here's the first two lines of the CSV:
Book Id,Title,Author,Author l-f,Additional Authors,ISBN,ISBN13,My Rating,Average Rating,Publisher,Binding,Number of Pages,Year Published,Original Publication Year,Date Read,Date Added,Bookshelves,Bookshelves with positions,Exclusive Shelf,My Review,Spoiler,Private Notes,Read Count,Recommended For,Recommended By,Owned Copies,Original Purchase Date,Original Purchase Location,Condition,Condition Description,BCID
11487807,"Throne of the Crescent Moon (The Crescent Moon Kingdoms, #1)","Saladin Ahmed","Ahmed, Saladin","",="0756407117",="9780756407117",0,"3.87","DAW","Hardcover","288",2012,2012,,2012/02/19,"to-read","to-read (#51)","to-read",,"",,,,,0,,,,,
What I want to do is import only the Title, Author, and ISBN for all books into a table that has the following fields: id (auto), title, author, isbn. I also want to exclude the first row, because I don't want the column headers, but every time I have tried that it fails.
What's wrong with my query?

If you want to use the native MySQL CSV parser, the simplest route is to load into a table that matches the column layout of the CSV file and drop the unnecessary columns once the import is finished. Alternatively, LOAD DATA lets you read unwanted fields into throwaway user variables, so only the columns you care about land in the table.
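A sketch of the user-variable approach, assuming the target columns are named title, author and isbn as described in the question. IGNORE 1 LINES skips the header row, each of the other 28 CSV fields goes into a throwaway @variable, and the SET clause strips the ="..." wrapper that Goodreads puts around ISBNs (note that LOAD DATA LOCAL also requires local_infile to be enabled on both the server and the client):
LOAD DATA LOCAL INFILE 'uploads/filename.csv' INTO TABLE books
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@book_id, title, author, @author_lf, @addl_authors, @isbn, @isbn13,
 @my_rating, @avg_rating, @publisher, @binding, @pages, @year_pub,
 @orig_pub_year, @date_read, @date_added, @shelves, @shelves_pos,
 @excl_shelf, @review, @spoiler, @notes, @read_count, @rec_for,
 @rec_by, @owned, @purchase_date, @purchase_loc, @condition,
 @cond_desc, @bcid)
SET isbn = REPLACE(REPLACE(@isbn, '"', ''), '=', '');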

Related

LOAD DATA INFILE with additional columns not working

Txt file:
942,Nike%27s+cities,2469,2,2,164
316,just+me,820,1,1,286
827,RES+4+GOLD,1523,1,1,214
1171,KnightBlood,1550,1,1,211
1172,athens,2095,2,2,177
The above file contains alliances from a game I'm trying to make a tool for. The format is the following:
id, name, points, towns, members, rank
In my database I'm storing 3 additional columns:
id, name, points, towns, members, rank, world_server, world_id, date
I've looked around and tried multiple different options/queries but I can't seem to get it to work. My current query, in PHP, is:
"LOAD DATA LOCAL INFILE /path/file.txt
INTO TABLE alliance
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
SET world_server = $world[server], world_id = $world[id], date = $curDate"
I get that this is probably a duplicate question, but I have searched through Stack Overflow, the MySQL docs, etc. for multiple days now and I don't see what's wrong with my query. I am not getting any error messages either, which doesn't help; my database just stays empty.
Hope someone can help,
A desperate student.
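For reference, two things stand out in the query as posted: the file name in LOAD DATA must be a quoted string literal, and the string values interpolated into the SET clause need quoting and escaping. A minimal corrected sketch with mysqli error reporting switched on, so failures are no longer silent (connection details are assumed):
<?php
// Throw exceptions on MySQL errors instead of failing silently.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
$db = new mysqli('localhost', 'user', 'pass', 'game'); // assumed credentials

$server  = $db->real_escape_string($world['server']);
$worldId = (int) $world['id'];
$date    = $db->real_escape_string($curDate);

// The file path is now quoted, the six file fields are mapped explicitly,
// and `rank` is backquoted because it is a reserved word in MySQL 8.
// LOAD DATA LOCAL also requires local_infile to be enabled on both sides.
$db->query("LOAD DATA LOCAL INFILE '/path/file.txt'
    INTO TABLE alliance
    FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
    (id, name, points, towns, members, `rank`)
    SET world_server = '$server', world_id = $worldId, date = '$date'");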

Export large table using a SELECT query into client machine

I have a table with over a million rows. The PHP export script with headers and AJAX that I normally use to let users export to CSV cannot handle this many rows and times out, so I am looking for an alternative.
After a few days of digging I collated the script below from the internet. It downloads wonderfully, but only to the MySQL folder on the local server.
What I am looking for is a PHP/MySQL script that lets users download large tables as CSVs through the PHP user interface itself.
SELECT 'a', 'b', 'c'
UNION ALL
SELECT a, b, c INTO OUTFILE '2026.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM table;
You need to fetch from the DB with paginated results:
SELECT * FROM a LIMIT 0, 100;
Put the result of this query in a variable, then write it out as described in Export to CSV via PHP.
After you write the first 100 rows to the CSV, fetch rows 100 to 200, reopen the CSV, append them, and so on.
Then finally send the CSV to the user.
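A minimal sketch of that loop, assuming a table named big_table with columns a, b and c (all names and credentials are placeholders). It streams the CSV straight to the browser in 1000-row pages, so PHP never holds the whole result set in memory:
<?php
set_time_limit(0); // large exports can exceed the default execution time limit

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass'); // assumed credentials
$out = fopen('php://output', 'w');
fputcsv($out, ['a', 'b', 'c']); // header row

$offset = 0;
$pageSize = 1000;
do {
    $rows = $pdo->query("SELECT a, b, c FROM big_table LIMIT $offset, $pageSize")
                ->fetchAll(PDO::FETCH_NUM);
    foreach ($rows as $row) {
        fputcsv($out, $row); // quotes and escapes each field as needed
    }
    $offset += $pageSize;
} while (count($rows) === $pageSize);

fclose($out);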

Export MySql table by query selecting few rows ordered descending

My issue is this:
I have a table in my database with more than 1 million rows. Sometimes I need an SQL dump of the table on my PC, and exporting the whole table takes very long.
Actually I only need the exported table to contain, for example, the last 5000 rows.
So is there a way to export a MySQL table by selecting the last X rows?
I know some ways to do it with terminal commands, but I need a pure MySQL query if that is possible.
Thanks
If I understand well, you could try using the INTO OUTFILE functionality provided by MySQL. Of course, I don't know your current query, but you can easily replace my structure with yours:
SELECT *
FROM table_name
ORDER BY id DESC
LIMIT 5000
INTO OUTFILE '/tmp/your_dump_table.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
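Two caveats: the INTO OUTFILE clause has to come at the end of the statement, after ORDER BY and LIMIT, and the file is written on the database server's filesystem (subject to the server's secure_file_priv setting), so with a remote server the dump still does not land on your PC.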
Since the user @Hayk Abrahamyan has expressed a preference for exporting the dump as a .sql file, let's analyze a valid alternative:
We can run the query from phpMyAdmin or (surely a better solution) the MySQL Workbench SQL editor console and save it by pressing the Export button.
As the .sql output file you will get something like the structure below:
/*
-- Query: SELECT * FROM mytodo.todos
ORDER BY id DESC
LIMIT 5000
-- Date: 2018-01-07 13:15
*/
INSERT INTO `todos` (`id`,`description`,`completed`) VALUES (3,'Eat something',1);
INSERT INTO `todos` (`id`,`description`,`completed`) VALUES (2,'Buy something',1);
INSERT INTO `todos` (`id`,`description`,`completed`) VALUES (1,'Go to the store',0);

Output Database records at data insertion automatically

I was wondering if there is a kind of code where, when data is inserted into the database, a text file is updated automatically. I have a website with food menu items and want to output food items to one text file, drinks to another text file, and dessert items to a third. Is it possible?
SELECT tblnum, food_name FROM clients
INTO OUTFILE '/tmp/food_orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
You should use RethinkDB to achieve this.
https://www.rethinkdb.com/
With RethinkDB you can push changes to a script that processes the given JSON.
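If the inserts already go through PHP, a simpler route that stays within the question's stack is to rewrite the relevant text file right after each INSERT. A minimal sketch, assuming a menu_items table with name and category columns ('food', 'drinks' or 'dessert'); table name, columns, output path and credentials are all assumptions:
<?php
$pdo = new PDO('mysql:host=localhost;dbname=menu', 'user', 'pass');

function addMenuItem(PDO $pdo, string $name, string $category): void
{
    $stmt = $pdo->prepare('INSERT INTO menu_items (name, category) VALUES (?, ?)');
    $stmt->execute([$name, $category]);

    // Regenerate the per-category text file so it always mirrors the table.
    $stmt = $pdo->prepare('SELECT name FROM menu_items WHERE category = ?');
    $stmt->execute([$category]);
    $names = $stmt->fetchAll(PDO::FETCH_COLUMN);
    file_put_contents("/var/www/menus/$category.txt", implode("\n", $names) . "\n");
}

addMenuItem($pdo, 'Lemonade', 'drinks');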

How To Iterate UTF-8 String To Mysql

Could somebody please help me figure out how to import this raw txt data into MySQL? The format is
user id | item id | rating | timestamp
and I want to insert this data into my table in MySQL (using phpMyAdmin). Let's say the table is named Rating, with the structure user_id (int), item_id (int), rating (int), timestamp (int).
So, I want to know how to insert this data into my table. I'm fine with PHP, or with any easier way to do this.
If you want to generate raw SQL queries, you can do so using find and replace in your text editor (Notepad++, by the look of it). I'm guessing that your delimiters are tabs.
Replace all tab characters with a comma. Nothing needs quoting, as all of your fields are integers.
Replace all newline characters with the end of one SQL statement and the start of the next.
Execute these commands in regular expression mode:
Columns
Find: \t
Replace: ,
Rows
Find: \r\n (if that doesn't find anything, look for \n)
Replace: );\r\nINSERT INTO Rating (user_id, item_id, rating, timestamp) VALUES (
On the first row, insert the text INSERT INTO Rating (user_id, item_id, rating, timestamp) VALUES ( to make the row a valid SQL statement.
On the last row, remove any trailing portions of SQL query after the last semicolon.
Copy and paste this into phpMyAdmin and it should be all good.
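Alternatively, since the question mentions PHP is fine, the same import can be done in a few lines that read the tab-delimited file and insert each row with a prepared statement. A minimal sketch (the file name ratings.txt and the connection details are assumptions):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO Rating (user_id, item_id, rating, timestamp) VALUES (?, ?, ?, ?)'
);

$fh = fopen('ratings.txt', 'r');
while (($line = fgets($fh)) !== false) {
    $fields = explode("\t", trim($line)); // tab-delimited: user, item, rating, timestamp
    if (count($fields) === 4) {
        $stmt->execute($fields);
    }
}
fclose($fh);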
The simplest way I have found for doing something similar is to use Excel. Import the text file into a new document; judging by the look of it, the columns should be easy to separate, as they appear to be tab-delimited. Once you have the required columns, set up a string concatenation to include the values, kind of like:
=CONCATENATE("INSERT INTO Rating SET user_id='",A1,"', item_id='",B1,"', rating='",C1,"', timestamp='",D1,"';")
Then fill the formula down for all rows, and copy and paste the result into your SQL client.
You can use Toad for MySQL: with its import wizard you create a table with the same structure (user id | item id | rating | timestamp) as your file, and after importing all the data you export the SQL inserts of your new table.