MySQL: INTO OUTFILE in csv generates invalid csv format - php

I am trying to export a SQL query result to a CSV file. It works, but the output CSV data looks strange and broken.
MySQL Table Download Link:
https://www.dropbox.com/s/vtr215bcxqo3wsy/data.sql?dl=0
CSV Generated by sql query:
Download Link to original Generated CSV File: https://www.dropbox.com/s/fnjf7ycmh08hd22/data.csv?dl=0
I am using following code:
$query = <<<EOL
SELECT * FROM data ORDER BY FN ASC limit 3
INTO OUTFILE 'folder/data.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
EOL;
$result = mysql_query($query);
Why does the CSV output look so weird and unacceptable?
If I try the same code on some other table, everything works like a charm, so what's wrong?

See final answer below
It looks like PHP's heredoc is turning your \n into a literal newline before the query ever reaches MySQL, and that is throwing the extra characters in random places.
Instead try a double backslash followed by an n (\\n), so MySQL receives the two-character escape sequence, and see what happens:
$query = <<<EOL
SELECT * FROM data ORDER BY FN ASC limit 3
INTO OUTFILE 'folder/data.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
EOL;
$result = mysql_query($query);
EDIT
Final Answer
Another observation: I noticed that in your PROP_TYPE field, there are \r\n characters. Is there any way you can filter them out in your query using the REPLACE() function?
I know you are looking for a solution that is SQL based, and this is a hard issue because of the massive amount of data. Hope this leads you to the correct solution.
As you mentioned, using update data set PROP_TYPE = replace(PROP_TYPE, '"','') fixed the issue.
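For context, the corruption comes from the enclosure character appearing inside the data itself. A small Python sketch (with made-up values) shows how a CSV writer that doubles embedded quotes keeps a row intact, which is the behavior standard CSV readers expect:

```python
import csv
import io

# Hypothetical row: the PROP_TYPE value itself contains double quotes,
# as in the question ("Value 1", "Value 2", ...).
row = ['2015-10-06', '"Value 1"', 'Sold']

# A CSV writer escapes an embedded quote by doubling it, so the line
# still parses back into the original three fields.
buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_ALL).writerow(row)
line = buf.getvalue().strip()
print(line)  # "2015-10-06","""Value 1""","Sold"

# MySQL's INTO OUTFILE escapes with a backslash instead ("\"Value 1\""),
# which standard CSV readers do not understand -- hence the "weird" file.
parsed = next(csv.reader(io.StringIO(line)))
print(parsed == row)  # True
```

Stripping the stray quotes with REPLACE(), as in the accepted fix above, sidesteps the problem entirely.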

Consider simply using PHP to connect to MySQL, run query, then output to csv.
<?php
$host="localhost";
$username="user";
$password="password";
$database="dbName";
# open connection
try {
    $dbh = new PDO("mysql:host=$host;dbname=$database", $username, $password);
} catch (PDOException $e) {
    echo $e->getMessage();
}
$sql = "SELECT * FROM data ORDER BY FN ASC limit 3;";
$STH = $dbh->query($sql);
$STH->setFetchMode(PDO::FETCH_ASSOC);
# open the csv file once, then write one row per fetch
$fs = fopen("folder/data.csv", "a");
while ($row = $STH->fetch()) {
    fputcsv($fs, $row);
}
fclose($fs);
# close connection
$dbh = null;
?>

Finally I fixed my issue.
Actually @Terry is right: there was an issue with the PROP_TYPE field in the table.
The PROP_TYPE field had double quotes (") in its values, and that was causing the issue.
For example
PROP_TYPE
"Value 1"
"Value 2" ....
So first of all I had to remove the extra double quotes using update data set PROP_TYPE = replace(PROP_TYPE, '"',''), and now my issue is fixed.
Thanks all of you for your efforts.
I really appreciate.


How can I import a CSV file into a MySQL table? I would like the first row of data to be used as the column names.
I read How do I import CSV file into a MySQL table?, but the only answer given was to use a GUI rather than a shell.
Instead of writing a script to pull in information from a CSV file, you can link MySQL directly to it and upload the information using the following SQL syntax.
To import an Excel file into MySQL, first export it as a CSV file. Remove the CSV headers from the generated CSV file, along with any empty data that Excel may have put at the end of the file.
You can then import it into a MySQL table by running:
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
as read on: Import CSV file directly into MySQL
EDIT
For your case, you'll need to write an interpreter first, for finding the first row, and assigning them as column names.
EDIT-2
From MySQL docs on LOAD DATA syntax:
The IGNORE number LINES option can be used to ignore lines at the
start of the file. For example, you can use IGNORE 1 LINES to skip
over an initial header line containing column names:
LOAD DATA INFILE '/tmp/test.txt' INTO TABLE test IGNORE 1 LINES;
Therefore, you can use the following statement:
LOAD DATA LOCAL INFILE 'uniq.csv'
INTO TABLE tblUniq
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(uniqName, uniqCity, uniqComments)
Here's a simple PHP command line script that will do what you need:
<?php
$host = 'localhost';
$user = 'root';
$pass = '';
$database = 'database';
$db = mysqli_connect($host, $user, $pass) or die ("could not connect to mysql");
mysqli_select_db($db, $database) or die ("no database");
/********************************************************************************/
// Parameters: filename.csv table_name
$argv = $_SERVER['argv'];
if ($argv[1]) {
    $file = $argv[1];
} else {
    echo "Please provide a file name\n";
    exit;
}
if ($argv[2]) {
    $table = $argv[2];
} else {
    $table = pathinfo($file);
    $table = $table['filename'];
}
/********************************************************************************/
// Get the first row to create the column headings
$fp = fopen($file, 'r');
$frow = fgetcsv($fp);
$columns = '';
foreach ($frow as $column) {
    if ($columns) $columns .= ', ';
    $columns .= "`$column` varchar(250)";
}
$create = "create table if not exists $table ($columns);";
mysqli_query($db, $create) or die(mysqli_error($db));
/********************************************************************************/
// Import the data into the newly created table.
$file = $_SERVER['PWD'].'/'.$file;
$q = "LOAD DATA INFILE '$file' INTO TABLE $table FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' ignore 1 lines;";
mysqli_query($db, $q) or die(mysqli_error($db));
?>
It will create a table based on the first row and import the remaining rows into it. Here is the command line syntax:
php csv_import.php csv_file.csv table_name
If you have the ability to install phpMyAdmin, there is an import section where you can import CSV files into your database. There is even a checkbox to indicate that the first line of the file contains the table column names (if this is unchecked, the first line will become part of the data).
First create a table in the database with same numbers of columns that are in the csv file.
Then use following query
LOAD DATA INFILE 'D:/Projects/testImport.csv' INTO TABLE cardinfo
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
If you start mysql as "mysql -u <user> -p --local-infile", it will work fine.
To load data from text file or csv file the command is
load data local infile 'file-name.csv'
into table table-name
fields terminated by '' enclosed by '' lines terminated by '\n' (column-name);
In the above command, in my case there was only one column to load, so there is nothing between the quotes for "terminated by" and "enclosed by"; otherwise you can enter the separating character, e.g. , (comma), ", ;, or anything else.
For people who are using MySQL version 5 and above:
Before loading the file into MySQL, make sure the two lines below are added inside /etc/mysql/my.cnf.
to edit my.cnf command is
sudo vi /etc/mysql/my.cnf
[mysqld]
local-infile
[mysql]
local-infile
I wrote some code to do this; I'll put in a few snippets:
$dir = getcwd(); // Get current working directory where this .php script lives
$fileList = scandir($dir); // scan the directory where this .php lives and make array of file names
Then get the CSV headers so you can tell mysql how to import (note: make sure your mysql columns exactly match the csv columns):
//extract headers from .csv for use in import command
$lines = file($path);
$headers = str_replace("\"", "`", array_shift($lines));
$headers = str_replace("\n", "", $headers);
Then send your query to the mysql server:
mysqli_query($cons, '
LOAD DATA LOCAL INFILE "'.$path.'"
INTO TABLE '.$dbTable.'
FIELDS TERMINATED by \',\' ENCLOSED BY \'"\'
LINES TERMINATED BY \'\n\'
IGNORE 1 LINES
('.$headers.')
;
') or die(mysqli_error($cons));
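The two str_replace() calls above just turn the quoted CSV header line into a backticked MySQL column list. The same step in Python, for a hypothetical header row:

```python
# Hypothetical first line of the CSV file
header_line = '"uniqName","uniqCity","uniqComments"\n'

# Swap the CSV double quotes for MySQL backticks and drop the newline,
# mirroring the str_replace() calls in the PHP snippet above.
columns = header_line.replace('"', '`').replace('\n', '')
print(columns)  # `uniqName`,`uniqCity`,`uniqComments`
```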
I wrestled with this for some time. The problem lies not in how to load the data, but how to construct the table to hold it. You must generate a DDL statement to build the table before importing the data.
Particularly difficult if the table has a large number of columns.
Here's a python script that (almost) does the job:
#!/usr/bin/python
import sys
import csv

# get file name (and hence table name) from command line
# exit with usage if no suitable argument
if len(sys.argv) < 2:
    sys.exit('Usage: ' + sys.argv[0] + ': input CSV filename')
ifile = sys.argv[1]

# read the header row and build one TEXT column per field
with open(ifile + '.csv') as inputfile:
    reader = csv.DictReader(inputfile)
    columns = ',\n'.join('`' + name + '` TEXT' for name in reader.fieldnames)

# emit the standard invocation
print('create table `' + ifile + '` (\n' + columns + '\n)')

Joining the column declarations avoids a dangling comma after the final field, which the mySQL parser won't tolerate.
Of course it also has the problem that it uses the TEXT data type for every field. If the table has several hundred columns, then VARCHAR(64) will make the table too large.
This also seems to break at the maximum column count for mySQL. That's when it's time to move to Hive or HBase if you are able.
Here's how I did it in Python using csv and the MySQL Connector:
import csv
import mysql.connector
credentials = dict(user='...', password='...', database='...', host='...')
connection = mysql.connector.connect(**credentials)
cursor = connection.cursor(prepared=True)
stream = open('filename.csv', newline='')  # use 'rb' on Python 2
csv_file = csv.DictReader(stream, skipinitialspace=True)
query = 'CREATE TABLE t ('
query += ','.join('`{}` VARCHAR(255)'.format(column) for column in csv_file.fieldnames)
query += ')'
cursor.execute(query)
for row in csv_file:
query = 'INSERT INTO t SET '
query += ','.join('`{}` = ?'.format(column) for column in row.keys())
cursor.execute(query, list(row.values()))
stream.close()
cursor.close()
connection.close()
Key points
Use prepared statements for the INSERT
Open the CSV file in text mode with newline='' (on Python 2, open it in 'rb' binary)
Some CSV files may need tweaking, such as the skipinitialspace option.
If 255 isn't wide enough you'll get errors on INSERT and have to start over.
Adjust column types, e.g. ALTER TABLE t MODIFY `Amount` DECIMAL(11,2);
Add a primary key, e.g. ALTER TABLE t ADD `id` INT PRIMARY KEY AUTO_INCREMENT;
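If guessing at VARCHAR(255) worries you, one option is to scan the file first and measure the widest value per column before writing the CREATE TABLE. A sketch with inline sample data standing in for the real file:

```python
import csv
import io

# Inline sample standing in for the real CSV file
sample = io.StringIO('name,amount\nalice,10.50\nbob,1234.00\n')
reader = csv.DictReader(sample)

# Track the longest value seen in each column
widths = {}
for row in reader:
    for column, value in row.items():
        widths[column] = max(widths.get(column, 0), len(value))

print(widths)  # {'name': 5, 'amount': 7}
```

The measured widths (plus some headroom) can then feed directly into the VARCHAR sizes of the CREATE TABLE statement.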
Import CSV Files into mysql table
LOAD DATA LOCAL INFILE 'd:\\Site.csv' INTO TABLE `siteurl` FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\r\n';
Character Escape Sequence
\0 An ASCII NUL (0x00) character
\b A backspace character
\n A newline (linefeed) character
\r A carriage return character
\t A tab character.
\Z ASCII 26 (Control+Z)
\N NULL
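Whether to write LINES TERMINATED BY '\n' or '\r\n' depends on how the file was produced. A quick way to check a sample of the file's bytes (a sketch; the helper name is made up):

```python
def detect_line_ending(data: bytes) -> str:
    """Return the LINES TERMINATED BY value for a sample of CSV bytes."""
    if b'\r\n' in data:
        return r'\r\n'  # Windows/Excel-style endings
    if b'\r' in data:
        return r'\r'    # classic Mac-style endings
    return r'\n'        # Unix-style endings

print(detect_line_ending(b'a,b\r\n1,2\r\n'))  # \r\n
print(detect_line_ending(b'a,b\n1,2\n'))      # \n
```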
See also: http://www.webslessons.com/2014/02/import-csv-files-using-php-and-mysql.html
Use TablePlus application:
Right-Click on the table name from the right panel
Choose Import... > From CSV
Choose CSV file
Review column matching and hit Import
All done!
As others have mentioned, the load data local infile works just fine. I tried the php script that Hawkee posted, but it didn't work for me. Rather than debugging it, here's what I did:
1) Copy/paste the header row of the CSV file into a txt file and edit it with Emacs. Add a comma and CR between each field to get each on its own line.
2) Save that file as FieldList.txt.
3) Edit the file to include definitions for each field (most were varchar, but quite a few were int(x)). Add "create table *tablename* (" to the beginning of the file and ")" to the end. Save it as CreateTable.sql.
4) Start the mysql client with input from the Createtable.sql file to create the table.
5) Start the mysql client, copy/paste in most of the 'LOAD DATA INFILE' command substituting my table name and csv file name. Paste in the FieldList.txt file. Be sure to include the 'IGNORE 1 LINES' before pasting in the field list.
It sounds like a lot of work, but it's easy with Emacs...
So I attempted to use the script given by Hawkee, but some of the commands are outdated: mysql_* is deprecated and needs to be replaced by mysqli_*. After doing some troubleshooting I wrote the following script, and it is working nicely.
Please note: the following code assumes that you are entering floats. I used this script to import percentiles from the WHO for stats related to growth.
use -drop (before the file name) if you want to drop the table
<?php
//This script is for importing the percentile values.
//Written by Daniel Pflieger # GrowlingFlea Software
$host = 'localhost';
$user = 'root';
$pass = '';
$database = '';
//options. This is what we need so the user can specify whether or not to drop the table
$short_options = "d::";
$options = getopt($short_options);
//check if the flag "-drop" is entered by the end user.
if (!empty($options) && $options['d'] != "rop"){
echo "The only available argument is -drop \n";
exit;
} else if (!empty($options)){
$dropTable = true;
} else {
$dropTable = false;
}
//we use mysqli_* since this is required with newer versions of php
$db = mysqli_connect($host, $user, $pass, $database);
// argv changes if the drop flag is used. here we read in the name of the .csv file we want to import
if (isset($argv[1]) && empty($options) ) {
$file = $argv[1];
} else if (isset($argv[2]) && $options['d'] == "rop" ) {
$file = $argv[2];
}
//we call the table name the name of the file. Since this script was used to import who growth chart info
//I appended the '_birth_to_5yrs' to the string. You probably want to remove this and add something that
//makes sense to you
$table = pathinfo($file);
$table = "who_" . $table['filename'] . "_birth_to_5yrs";
$table = str_replace('-', '_', $table);
// We read the first line of the .csv file. It is assumed that these are the headers.
$fp = fopen($file, 'r');
$frow = fgetcsv($fp);
$columns = '';
//we get the header names and for this purpose we make every value 'float'. If you are unsure of
//the datatype you can probably use varchar(250).
foreach($frow as $column) {
$columns .= "`" .$column . "` float,";
}
//drop the table to prevent data issues, if that is what the end user selects
if ($dropTable) {
mysqli_query($db, "drop table if exists $table");
}
// here we form the create statement and we create the table.
// we use mysqli_real_escape_string to make sure we don't damage the DB
$create = "create table if not exists $table ($columns);";
$create = str_replace(',)', ')', $create);
$create = mysqli_real_escape_string($db, $create);
mysqli_query($db, $create);
// We read the values line-by-line in the .csv file and insert them into the table until we are done.
while ($frow = fgetcsv($fp)){
$insert = implode(", ", $frow);
$insert = "Insert into $table VALUES ( $insert )";
$insert = mysqli_real_escape_string($db, $insert);
$insert = mysqli_query($db, $insert);
}
An example of how to run the script:
php ../git/growlingflea-dev-tools/importCSV.php -drop wfh-female-percentiles-expanded-tables.csv
I have googled many ways to import CSV into MySQL, including "load data infile", MySQL Workbench, etc.
When I use the MySQL Workbench import button, you first need to create the empty table on your own and set each column type yourself. Note: you have to add an ID column at the end as primary key, not null, and auto_increment, otherwise the import button will not be visible later. However, when I started loading the CSV file, nothing loaded; it seems like a bug. I gave up.
Luckily, the easiest way I have found so far is to use Oracle's MySQL for Excel. You can download it here: mysql for excel
This is what you are going to do:
Open the CSV file in Excel; on the Data tab, find the MySQL for Excel button.
Select all data and click "Export to MySQL".
Note: set an ID column as the primary key.
When finished, go to MySQL Workbench to alter the table;
for example, a currency type should be decimal(19,4) for large amounts or decimal(10,2) for regular use.
Other field types may be set to varchar(255).

Is this the best format of CSV file for importing into a MySQL database

I have data in the following format (sample data, there are many rows):
"Rec Open Date","number 1","number 2","Data Volume (Bytes)","Device Manufacturer","Device Model","Product Description"
"2015-10-06","0427","70060","137765","Samsung Korea","Samsung SM-G900I","$39 option"
"2015-10-06","7592","55620","0","Apple Inc","Apple iPhone 6 (A1586)","some text #16"
...
what I want to know is, what is the best format/practice for importing this into mysql?
Some specific questions are:
Should the date be "2015-10-06"
Should columns 2, 3, and 4 be in string format with double quotes e.g. "0427"
For the column headers, should I remove all the spaces and the brackets
Anything else
Maybe my data would look better like this before importing it into my database:
Replace all spaces with underscore
remove brackets
turn columns 2, 3, and 4 into values by removing the double quotes
which would look like this:
"Rec_Open_Date","number_1","number_2","Data_Volume_Bytes","Device_Manufacturer","Device_Model","Product_Description"
"2015-10-06",0427,70060,137765,"Samsung Korea","Samsung SM-G900I","$39 option"
"2015-10-06",7592,55620,0,"Apple Inc","Apple iPhone 6 (A1586)","some text #16"
...
Again just looking for best practice out there.
The next question will be is there a parser that can do all this, maybe in bash or other equivalent?
You can do the job with mysql load data infile from the command line as described here: http://dev.mysql.com/doc/refman/5.7/en/load-data.html
If you have control over the format of the csv file, any values that will load into numeric database types should not have quotes. The date values are fine for mysql DATE type as you have shown them. If you are stuck with that csv format, you can use the load data options to convert input values as described here: LOAD DATA INFILE easily convert YYYYMMDD to YYYY-MM-DD? The example is about date conversion, but many other conversions are possible.
Also note that, if your db table is already defined, you can skip the header row so no need to worry about the headings matching column names and spaces, etc.
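As for a parser that can do the clean-up the question asks about, here is a short Python sketch (the column names are taken from the question's sample data) that rewrites the header row along the suggested lines:

```python
import csv
import io

# Inline stand-in for the sample data from the question
raw = io.StringIO(
    '"Rec Open Date","number 1","Data Volume (Bytes)"\n'
    '"2015-10-06","0427","137765"\n'
)
reader = csv.reader(raw)

# Rewrite the header: spaces -> underscores, brackets removed
header = [h.replace(' ', '_').replace('(', '').replace(')', '')
          for h in next(reader)]
print(header)  # ['Rec_Open_Date', 'number_1', 'Data_Volume_Bytes']
```

Note that values like "0427" are best left quoted as strings, since converting them to numbers would drop the leading zero.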
1. One way to do this is by importing the CSV file into MySQL using a tool like phpMyAdmin.
2. You can try the code below; it is a PHP script that loads the file into MySQL.
<?php
$databasehost = "localhost";
$databasename = "test";
$databasetable = "sample";
$databaseusername="test";
$databasepassword = "";
$fieldseparator = ",";
$lineseparator = "\n";
$csvfile = "filename.csv";
if(!file_exists($csvfile)) {
die("File not found. Make sure you specified the correct path.");
}
try {
$pdo = new PDO("mysql:host=$databasehost;dbname=$databasename",
$databaseusername, $databasepassword,
array(
PDO::MYSQL_ATTR_LOCAL_INFILE => true,
PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION
)
);
} catch (PDOException $e) {
die("database connection failed: ".$e->getMessage());
}
$affectedRows = $pdo->exec("
    LOAD DATA LOCAL INFILE ".$pdo->quote($csvfile)." INTO TABLE `$databasetable`
    FIELDS TERMINATED BY ".$pdo->quote($fieldseparator)."
    LINES TERMINATED BY ".$pdo->quote($lineseparator));
echo "Loaded a total of $affectedRows records from this csv file.\n";
?>
3. You can use any framework or tool to do so.

mysqli_info and mysqli_num_rows not working

I have written PHP which will successfully connect to MySQL database and upload CSV to MySQL.
I am trying to get the results of the query using mysqli_info(), but I get no result from this function.
I also tried mysqli_num_rows() but this also returns blank. Everything else works correctly without any errors. When I attempt to run mysqli_info(), there is simply no result, nor any error.
$mysqli_link = mysqli_connect($mysqli_hostname, $mysqli_user,
$mysqli_password, $mysqli_database);
$workingfile = "/arb/path/to/file";
$tablename = "`table_name`";
$query = "LOAD DATA INFILE '".$workingfile."' INTO TABLE ".$tablename.
" FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\\r\\n';";
$result = mysqli_query($mysqli_link, $query);
//deletes temporary file from server
unlink($workingfile);
echo "CSV uploaded successfully to database.<br /><br />";
$info = mysqli_info($mysqli_link); // returns empty
//$info = "".mysqli_num_rows($result); // does not work either, returns empty
echo "Result: ".$info;
mysqli_close($mysqli_link);
This is my first post, so please let me know if I post anything incorrectly or needs editing.
Thank you,
RamzGT
If you have an auto-increment column and you try to populate it with a value from your CSV file, that value is probably already in use in your table, and that's why the operation could not be completed. You should remove the values for the auto-increment column from your CSV file and try again.

PHP script doesn't import CSV into MySQL

# download the file off the internet
$file = file_get_contents("http://localhost/sample.csv");
$filepath = "C:/xampp/htdocs/test/file/sample.csv";
file_put_contents($filepath, $file);
# load the data into sample table
$pt1 = "LOAD DATA LOCAL INFILE ";
$pt2 = "'/C:/xampp/htdocs/test/file/sample.csv' INTO TABLE sample ";
$pt3 = "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' ";
$pt4 = "LINES TERMINATED BY '\r\n' ";
$pt5 = "(col1,col2,col3,col4)";
$sqlquerynew = $pt1.$pt2.$pt3.$pt4.$pt5;
mysql_query($sqlquerynew);
This piece of code works on non-CSV input (I tested it with a text file instead).
Before this part runs, I have to create a table. The table is created, but no data is loaded. The file stated in the path exists.
What could be the problem?
Thanks
This is a sample csv I found online
"REVIEW_DATE","AUTHOR","ISBN","DISCOUNTED_PRICE"
"1985/01/21","Douglas Adams",0345391802,5.95
"1990/01/12","Douglas Hofstadter",0465026567,9.95
....... etc
Two problems
$pt2 = "'/C:/xampp/htdocs/test/file/sample.csv' INTO TABLE sample ";
Remove the / in front of C.
and secondly,
$pt5 = "(col1,col2,col3,col4)";
Make sure you have the right names for the columns. If you want to import all columns, just remove it. It is also a good idea to remove the header row in your case: if you remove $pt5 the load will succeed, but the header row will be added to the table as data.

mysql:connection OK but response error

this is my code:
$Line = mysql_real_escape_string(postVar("showline"));
$Model = mysql_real_escape_string(postVar("showmodel"));
$NIK = mysql_real_escape_string(postVar("showNIK"));
$sql ="SELECT NIK,Line,Model FROM inspection_report";
$sql.="WHERE NIK='".$NIK."' AND Model LIKE '%".$Model."%' AND Line='".$Line."'";
$sql.="ORDER BY Inspection_datetime DESC LIMIT 0 , 30";
$dbc=mysql_connect(_SRV, _ACCID, _PWD) or die(_ERROR15.": ".mysql_error());
mysql_select_db("qdbase") or die(_ERROR17.": ".mysql_error());
$res=mysql_query($sql) or _doError(_ERROR30 . ' (<small>' . htmlspecialchars($sql) . '</small>): ' . mysql_error() ); // submit SQL to MySQL and error trap.
$num=mysql_affected_rows();
$objJSON=new mysql2json();
print(trim($objJSON->getJSON($res,$num,'aaData',false)));
mysql_free_result($res);
Firebug shows that the connection to the processing page is OK, but the response shows an error.
Where is my mistake?
I am assuming that is PHP.
Add the command echo $sql; after your lines above. I bet your query is malformed, i.e. no space between the end of the FROM clause and the WHERE. Same with ORDER BY. Happens all the time ;)
What Jason has said is good and will show you where the error is, which looks like a lack of spaces in the line breaks. Add a space before WHERE and another before ORDER
I have found it a lot easier to write and read my SQL statements by declaring the SQL string within a single set of quotes, as in:
$sql ="SELECT NIK,Line,Model FROM inspection_report
WHERE NIK='$NIK' AND Model LIKE '%$Model%' AND Line='$Line'
ORDER BY Inspection_datetime DESC LIMIT 0 , 30";
This method will also solve your problem with missing spaces between lines.
As stated in other answers, you're lacking spaces in your query:
$sql = "SELECT .... inspection_report";
$sql .= "WHERE NIK=..."
etc...
will generate a query string:
SELECT ... inspection_reportWHERE NIK=...
^^--- problem is here
Notice the lack of a space before the WHERE clause. You have to either modify your string concatenation statements to explicitly include the space:
$sql = "SELECT ... inspection_report";
$sql .= " WHERE NIK=..."
^---notice the space here
or use alternative syntax to build the string. For multi-line string assignments, it's generally always preferable to use HEREDOCs, unless you need to concatenate function call results or constants into the string:
$sql = <<<EOL
SELECT ... inspection report
WHERE NIK=...
EOL;
PHP will honor the line breaks inside the heredoc, and MySQL will silently treat them as spaces, preserving the integrity of your query.
