I want to export data from a MySQL table to Excel (or some similar alternative) using PHP.
Does anyone have any code or suggestions?
You can get MySQL to export to a CSV file which any version of Excel will open.
SELECT *
INTO OUTFILE '/path/to/folder/result.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM example_table
You can still easily customize it by selecting specific columns, etc.
http://dev.mysql.com/doc/refman/5.1/en/select-into.html
Of course, if you have phpMyAdmin on your server, there is an export option that will do this for you.
I usually write to a file using something like fputcsv, which Excel will then open easily. #jleagle's method is also a good one when you want a straight dump of the table.
If I'm just wanting a manual export I also use a program called Navicat which I use for managing databases when not working with the command line.
If you're looking to do it yourself every now and then, you could just use phpMyAdmin and use its export functions rather than coding your own.
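A minimal sketch of the fputcsv approach mentioned above (the query, table, and column names are hypothetical placeholders):

```php
<?php
// Build a CSV string from an array of rows using fputcsv
// and an in-memory stream.
function rowsToCsv(array $rows): string
{
    $fh = fopen('php://temp', 'r+');
    foreach ($rows as $row) {
        fputcsv($fh, $row); // handles quoting and escaping for us
    }
    rewind($fh);
    $csv = stream_get_contents($fh);
    fclose($fh);
    return $csv;
}

// Hypothetical usage with PDO; connection details and the
// table/column names are placeholders, not from the question.
// $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// $stmt = $pdo->query('SELECT id, name, email FROM example_table');
// file_put_contents('export.csv', rowsToCsv($stmt->fetchAll(PDO::FETCH_NUM)));
```

Excel will open the resulting `export.csv` directly.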
Transforming your table to a CSV file is probably your safest bet, as Excel can natively handle CSV as if it were an Excel file. You can always do a "Save As" once in Excel and save it to an .xlsx file. Here is a similar posting that has PHP snippets on transforming a table to CSV.
Related
My server can be PHP or nodeJS, prefer staying in PHP but if it can't be done/better in node i'll appreciate the answer.
I want to generate and export a very big CSV file, the problem is that I can't load all the data from MySQL once, so currently I limit my data to some amount which doesn't crash the app, but it's slow and most data won't be exported.
I thought of this solutions to generate and export the CSV file:
Send 1..N calls to my server, each call will generate Part1...PartN CSV and the browser will append them [won't the browser crash?]
In one request, stream the data to the browser, but then how does the browser start downloading the file?
I don't mind if the client waits a long time; I just want a good way to export a ~200 MB CSV from the data I have in MySQL.
Thanks.
This is a nice solution using MySQL's INTO OUTFILE syntax:
SELECT *
INTO OUTFILE '/tmp/dbexport.csv'
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
FROM tables;
Have a look at this answer too.
You could also consider using a .tsv (tab-separated values) file, if the format makes no difference to you:
mysql your_database -e "select * from tables" -B > export.tsv
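For the streaming option raised in the question, here is a hedged PHP sketch. The idea is to write each row straight to the output stream so memory stays flat; the connection details, query, and table name are hypothetical placeholders:

```php
<?php
// Write rows to an already-open stream one at a time,
// so even a ~200 MB export never sits fully in memory.
function streamCsv(iterable $rows, $out): void
{
    foreach ($rows as $row) {
        fputcsv($out, $row);
    }
}

// Hypothetical usage: the Content-Disposition header makes the
// browser start downloading as soon as the first bytes arrive,
// and an unbuffered query keeps MySQL from loading all rows at once.
// $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass',
//     [PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false]);
// header('Content-Type: text/csv');
// header('Content-Disposition: attachment; filename="export.csv"');
// $stmt = $pdo->query('SELECT id, name, email FROM big_table');
// streamCsv($stmt, fopen('php://output', 'w'));
```

This answers the "how does the browser start downloading" part: the headers are sent before any data, so the download begins immediately and the rest of the file arrives as the server produces it.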
I have a txt file with around 0.8 million rows in it. I want to import it into SQL. I have tried converting it into CSV as well, but Excel does not allow 0.8 million rows at a time.
Try a simple LOAD DATA command. Assuming your example data is in sample.txt and you have the necessary permissions, it can be run from the MySQL client as:
LOAD DATA INFILE 'sample.txt' INTO TABLE sqlTable FIELDS TERMINATED BY ',' ENCLOSED BY '"'
Use mysqlimport
Something like
mysqlimport --columns='co_no,pd_ch' my_db sample.txt
You'll probably have to play around with it to get it working.
Using MySQL Workbench will help you a lot. If you need to modify the file first, use a scripting language such as Python. Excel is not a good option for that much data.
I am trying to parse a large SQL file to a CSV file. I have considered using fread in PHP but can't figure out if SQL is separated by lines... because I am assuming that fread loads the data into RAM, and that would not work.
Any ideas on how to quickly convert SQL to CSV? (Also, I am running on a different machine than my DB is on, so I can't export as CSV, unfortunately.)
"Large" - what does it mean to you.
You can save to a file on the server (the machine the DB is running on) and then compress/download it.
Example:
SELECT name,lastname,age FROM profile
The query returns three columns of the MySQL table. Now, to redirect/print it into a file:
SELECT name,lastname,age FROM profile INTO OUTFILE '/tmp/userdata.txt'
This will write the data into the file given in the statement above.
To output the data in CSV format, add more options to the query, as follows:
SELECT name,lastname,age FROM profile INTO OUTFILE '/tmp/userdata.txt'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
Install MySQL on your local machine. Now you can import the SQL file, then freely export as CSV or whatever you want.
I am trying to write a script that imports a CSV file, parses it, and then imports it into a MySQL db. I came across this article, but it seems to be written for MS SQL. Does anyone know of a tutorial for doing this in MySQL, or better still, a library or script that can do it?
Thanks :-)
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Using the LOAD DATA INFILE SQL statement
Example :
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
If you are looking for a script/library, I'd refer you to the link below. There you can find:
PHP script to import csv data into mysql
This is a simple script that will allow you to import CSV data into your database. This comes in handy because you can simply edit the appropriate fields, upload it along with the CSV file, call it from the web, and it will do the rest.
It allows you to specify the delimiter in the CSV file, whether it is a comma, a tab, etc. It also allows you to choose the line separator, and allows you to save the output to a file (known as a data SQL dump).
It also permits you to include an empty field at the beginning of each row, which is usually an auto increment integer primary key.
This script is useful mainly if you don’t have phpmyadmin, or you don’t want the hassle of logging in and prefer a few clicks solution, or you simply are a command prompt guy.
Just make sure the table is already created before trying to dump the data.
Kindly post your comments if you got any bug report.
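A minimal sketch of such an import script, assuming hypothetical table/column names and using fgetcsv with prepared statements:

```php
<?php
// Parse a CSV file into an array of rows using fgetcsv.
function parseCsv(string $path): array
{
    $rows = [];
    $fh = fopen($path, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        $rows[] = $row;
    }
    fclose($fh);
    return $rows;
}

// Hypothetical usage: insert each row with a prepared statement.
// The table and column names are placeholders; the table must
// already exist, as the answer above notes.
// $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// $stmt = $pdo->prepare(
//     'INSERT INTO test_table (field1, field2, field3) VALUES (?, ?, ?)'
// );
// foreach (parseCsv('/importfile.csv') as $row) {
//     $stmt->execute($row);
// }
```

Prepared statements handle quoting for you, which avoids the escaping pitfalls of building INSERT strings by hand.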
I just downloaded this CSV of Wikipedia infoboxes from DBpedia. However, I have no idea how to use it :-S I want to import all this data into a database but am not sure how to take it from here. I downloaded it from http://wiki.dbpedia.org/Downloads32#infoboxes
I'm working in Php
Just for the record - this CSV file is around 1.8 GB. I'm actually going through all this trouble just to get a select set of infoboxes from a select set of articles from Wikipedia. I would do it manually, except I need the infoboxes for over 10,000 entries, which includes countries and cities. I'm just looking for an easy way to do this and frankly am running out of options :(
To import CSV data into MySQL you can use a LOAD DATA INFILE statement, e.g.
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
Sometimes such data might need a little massaging; it's not tricky to write a script in Perl or similar to parse the file line by line and spit out SQL statements.
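As a sketch of that line-by-line approach in PHP (the table name and file path are hypothetical), reading one row at a time so even a 1.8 GB file never has to fit in RAM:

```php
<?php
// Turn one CSV row into an INSERT statement with basic escaping.
// Note: addslashes is a rough approximation for illustration only;
// for real imports, prefer prepared statements or LOAD DATA INFILE.
function rowToInsert(string $table, array $row): string
{
    $values = array_map(
        fn($v) => "'" . addslashes($v) . "'",
        $row
    );
    return "INSERT INTO $table VALUES (" . implode(', ', $values) . ");";
}

// Hypothetical usage: stream the big file row by row and emit SQL.
// $in = fopen('/path/to/infoboxes.csv', 'r');
// while (($row = fgetcsv($in)) !== false) {
//     echo rowToInsert('infoboxes', $row), "\n";
// }
// fclose($in);
```

The emitted statements can then be piped straight into the `mysql` client on the database machine.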
If you want to massage the data before importing it, you could take a look at my CSV stream editor, CSVfix - it's FOSS. It can also generate SQL INSERT statements for your database if for some reason your database's bulk loading of CSV data doesn't suit you.