I am trying to write a script that imports a CSV file, parses it, and loads it into a MySQL database. I came across this article, but it seems to be written for MS SQL. Does anyone know of a tutorial for doing this in MySQL, or better still a library or script that can do it?
Thanks :-)
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Using the LOAD DATA INFILE SQL statement
Example:
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
If you are looking for a script or library, I'd refer you to the link below, where you can find:
PHP script to import csv data into mysql
This is a simple script that will allow you to import CSV data into your database. This comes in handy because you can simply edit the appropriate fields, upload it along with the CSV file, call it from the web, and it will do the rest.
It allows you to specify the delimiter in the CSV file, whether it is a comma, a tab, etc. It also allows you to choose the line separator and to save the output to a file (known as an SQL data dump).
It also permits you to include an empty field at the beginning of each row, which is usually an auto increment integer primary key.
This script is useful mainly if you don't have phpMyAdmin, don't want the hassle of logging in and prefer a few-clicks solution, or simply prefer the command prompt.
Just make sure the table is already created before trying to dump the data.
Please post a comment if you find any bugs.
I am using PHPSpreadsheet to take some spreadsheets a user can upload, add a column with certain values, save the file as CSV, and use the following query to import the csv file:
LOAD DATA LOCAL INFILE '{$file}'
INTO TABLE {$table}
FIELDS TERMINATED by ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
Alternatively I can do something like:
foreach ($rows as $row) {
    // INSERT $row INTO table
}
The spreadsheets will all have the same columns/data-types.
What would be the most efficient way to do this? Going from Xlsx -> CSV -> MySQL Import seems like I am adding extra steps.
MySQL's direct CSV import is usually the fastest option; however, it is not without limitations. One is that the import is all or nothing: you won't know how far along it is until it's done, and as some imports can take hours, even days, you may have no idea where it's at. On an InnoDB table the entire insert operation takes place atomically for performance reasons, which means none of it is visible until fully committed.
Another is that the file must be present on the server. The LOCAL option is a quirky feature of the mysql command-line tool and probably doesn't work in your database driver unless emulated.
Inserting row-by-row with a CSV parser is almost always slower. If you must go that route, be sure to prepare an INSERT statement once and re-use it in the loop, or do a "multi-INSERT" with as many rows as you can fit in your maximum statement size.
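For illustration, a minimal sketch of the prepare-once approach using PDO; the connection details, table, and column names are placeholders:

<?php
// Placeholder connection; enable exceptions so failures surface.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Prepare once, execute many times; the surrounding transaction means
// InnoDB commits once at the end instead of once per row.
$stmt = $pdo->prepare('INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, ?)');

$pdo->beginTransaction();
$fh = fopen('/path/to/file.csv', 'r');
fgetcsv($fh); // skip the header row
while (($row = fgetcsv($fh)) !== false) {
    $stmt->execute($row);
}
fclose($fh);
$pdo->commit();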
My server can be PHP or Node.js. I'd prefer staying in PHP, but if it can't be done there, or is better done in Node, I'll appreciate that answer too.
I want to generate and export a very big CSV file. The problem is that I can't load all the data from MySQL at once, so currently I limit the data to an amount that doesn't crash the app, but it's slow and most of the data won't be exported.
I thought of these solutions to generate and export the CSV file:
Send 1..N calls to my server; each call will generate Part1...PartN of the CSV and the browser will append them [won't the browser crash?]
In one request, stream the data to the browser, but then how does the browser start downloading the file?
I don't mind if the client has to wait a while; I just want a good way to export a ~200 MB CSV from the data I have in MySQL.
Thanks.
This is a nice solution using MySQL's SELECT ... INTO OUTFILE syntax:
SELECT *
INTO OUTFILE '/tmp/dbexport.csv'
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
FROM tables;
Have a look at this answer too.
You could also consider using a .tsv (tab-separated file) if the exact format makes no difference to you:
mysql your_database -e "select * from tables" -B > export.tsv
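If you would rather stream straight to the browser (the second option in the question), here is a rough PHP sketch; the query and filename are placeholders, and the key parts are the Content-Disposition header, which makes the browser start the download, and the unbuffered query, which keeps memory flat:

<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass', [
    // Fetch rows as they arrive instead of buffering the whole result set.
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false,
]);

// These headers tell the browser to start downloading a file immediately.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$out = fopen('php://output', 'w');
$stmt = $pdo->query('SELECT * FROM tables'); // placeholder query

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);
}
fclose($out);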
I have created a table in my db and filled in all the records using a CSV file.
I need to do this weekly to keep the table updated.
I want to upload the new records into the same table via CSV, without disturbing the old ones.
[I have to pick up the data from a remote host and upload it locally to my server; I don't have access to the remote db]
Kindly guide me.
You can upload records from a CSV into a table VERY quickly using the load data infile syntax (http://dev.mysql.com/doc/refman/5.1/en/load-data.html)
The syntax is pretty simple, but also flexible. This is an example:
LOAD DATA INFILE 'data.txt' INTO TABLE table2
FIELDS TERMINATED BY '\t';
You can kick these off from a console or via code.
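For example, a hedged sketch of kicking it off from PHP via PDO; note that the LOCAL variant has to be enabled explicitly in the driver, and the file path and table name are placeholders:

<?php
// MYSQL_ATTR_LOCAL_INFILE enables the LOCAL variant in the PDO driver;
// without it, LOAD DATA LOCAL INFILE is typically refused.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,
]);

// Placeholder file path and table name; \\t reaches MySQL as \t (tab).
$pdo->exec("LOAD DATA LOCAL INFILE '/path/to/data.txt'
            INTO TABLE table2
            FIELDS TERMINATED BY '\\t'");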
This will append to the table, not replace it, so if you don't truncate it first, it should work like a charm.
You can of course also load the data manually by parsing the CSV file in your code and creating an insert statement for each line of the file, but if the format is fixed already, this will be quicker and more efficient.
Edit: It appends the data. By default, no database will delete data from a table unless you specifically tell it to; an INSERT statement is effectively what you're thinking of as an append.
I want to export data from a MySQL table to Excel, or some similar alternative, using PHP.
Does anyone have any code or suggestions?
You can get MySQL to export to a CSV file which any version of Excel will open.
SELECT *
INTO OUTFILE '/path/to/folder/result.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM example_table;
You can still easily customize it by selecting columns, etc.
http://dev.mysql.com/doc/refman/5.1/en/select-into.html
Of course, if you have phpMyAdmin on your server, there is an export option that will do this for you.
I usually write to a file using something like fputcsv, which Excel will then open easily. #jleagle's method is also a good one when you want a straight dump of the table.
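A minimal fputcsv sketch, assuming $rows is an array of associative arrays already fetched from MySQL:

<?php
// $rows is assumed to have been fetched already, e.g. via PDO::FETCH_ASSOC.
$fh = fopen('export.csv', 'w');

// Optional header row taken from the first record's keys.
fputcsv($fh, array_keys($rows[0]));

foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);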
If I just want a manual export, I use a program called Navicat, which I also use for managing databases when not working at the command line.
If you're looking to do it yourself every now and then, you could just use phpMyAdmin and use its export functions rather than coding your own.
Transforming your table to a CSV file is probably your safest bet, as Excel can natively handle CSV as if it were an Excel file. You can always do a "Save As" once in Excel and save it to an .xlsx file. Here is a similar posting that has PHP snippets on transforming a table to CSV.
I just downloaded this CSV of Wikipedia infoboxes from DBpedia. However, I have no idea how to use it :-S I want to import all this data into a database but am not sure how to take it from here. I downloaded it from http://wiki.dbpedia.org/Downloads32#infoboxes
I'm working in PHP.
Just for the record - this CSV file is around 1.8 GB. I'm actually going through all this trouble just to get a select set of infoboxes from a select set of articles from Wikipedia. I would do it manually, except I need the infoboxes for over 10,000 entries, which includes countries and cities. I'm just looking for an easy way to do this and frankly have exhausted all my options :(
To import CSV data into MySQL you can use a LOAD DATA INFILE statement, e.g.
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
Sometimes such data needs a little massaging first; it's not tricky to write a script in Perl or similar to parse the file line by line and spit out SQL statements.
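As a rough PHP equivalent (the table and column names are made up), something like this reads the CSV a line at a time and prints INSERT statements you can pipe into mysql:

<?php
// Hypothetical target table; adjust the column list to your schema.
$fh = fopen('/importfile.csv', 'r');

while (($fields = fgetcsv($fh)) !== false) {
    // Quote each value; addslashes is a crude stand-in for real escaping.
    $values = array_map(fn ($v) => "'" . addslashes($v) . "'", $fields);
    echo 'INSERT INTO test_table (field1, field2, field3) VALUES ('
        . implode(', ', $values) . ");\n";
}
fclose($fh);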
If you want to massage the data before importing it, you could take a look at my CSV stream editor, CSVfix - it's FOSS. It can also generate SQL INSERT statements for your database if for some reason your database's bulk loading of CSV data doesn't suit you.