I use the "load data infile" statement to load multiple .csv files in one table with a foreach loop. This works, only I have one problem. The columns in de .csv's are not the same. For example I have this:
CSV1:
id,name,ean,description
CSV2:
id,name,image,ean,description
And my MySQL table is:
id,name,ean,description
So because I want to import multiple csv's through a loop, I have a problem with CSV2 because of the image column. If possible, I want to match the name of each csv column with the name of the table column, so in this example [image] is not imported. I could use an @ignore variable, but because every .csv is different, this doesn't work. I have this:
$sql = "
LOAD DATA LOCAL INFILE '$value[path]'
INTO TABLE db1.shoptest
FIELDS TERMINATED BY '$value[seperator]'
OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
(id, name, ean, description)
;";
Is it possible to match the csv column names with the table column names and skip all other columns in the .csv?
Thanks so much for helping me out!
Kind regards,
Mark
I think I understood the question. You want to get the column headers from the first row of the CSV, then build a string to use as the col_name_or_user_var argument of LOAD DATA, and use @ignore-style user variable assignments as needed.
id, name, state, ean, favorite_color, description
becomes...
(id, name, @bunk, ean, @bunk, description)
$whiteList = ['id', 'name', 'ean', 'description']; // columns that exist in the table
$columnHeader = 'id, name, state, ean, favorite_color, description'; // header row from the CSV
$columnHeader = explode(',', $columnHeader);
foreach ($columnHeader as $k => $v) {
    // replace any column not in the table with a throwaway user variable
    if (! in_array(trim($v), $whiteList)) $columnHeader[$k] = '@bunk';
}
echo implode(', ', $columnHeader);
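For example, the header row could be read straight from each file and turned into that column list before running LOAD DATA. A rough sketch only: the IGNORE 1 LINES clause assumes every CSV starts with a header row, and $value['path'] / $value['seperator'] follow the array shape used in the question.

// Sketch: build the LOAD DATA column list from the CSV header row.
$whiteList = ['id', 'name', 'ean', 'description'];

$handle = fopen($value['path'], 'r');
$header = fgetcsv($handle, 0, $value['seperator']); // first row = column names
fclose($handle);

$columns = array_map(function ($col) use ($whiteList) {
    // columns that are not in the table go into a throwaway user variable
    return in_array(trim($col), $whiteList) ? trim($col) : '@bunk';
}, $header);

$sql = "
LOAD DATA LOCAL INFILE '{$value['path']}'
INTO TABLE db1.shoptest
FIELDS TERMINATED BY '{$value['seperator']}'
OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(" . implode(', ', $columns) . ")
;";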
Related
Is the database query faster if I insert multiple rows at once:
like
INSERT....
UNION
INSERT....
UNION
(I need to insert like 2-3000 rows)
INSERT statements that use VALUES syntax can insert multiple rows. To do this, include multiple lists of column values, each enclosed within parentheses and separated by commas.
Example:
INSERT INTO tbl_name
(a,b,c)
VALUES
(1,2,3),
(4,5,6),
(7,8,9);
Source
If you have your data in a text file, you can use LOAD DATA INFILE.
When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements.
Optimizing INSERT Statements
You can find more tips on how to speed up your insert statements at the link above.
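If you build the multi-row statement from PHP, a prepared statement keeps the values escaped. This is only a minimal sketch, assuming $pdo is an open PDO connection; tbl_name and the columns a, b, c are just illustrative names:

// Sketch: batch 2-3000 rows into one multi-row INSERT using ? placeholders.
$rows = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]; // ...up to a few thousand rows

// one "(?,?,?)" group per row, comma separated
$placeholders = rtrim(str_repeat('(?,?,?),', count($rows)), ',');

$stmt = $pdo->prepare("INSERT INTO tbl_name (a, b, c) VALUES $placeholders");
$stmt->execute(array_merge(...$rows)); // flatten the rows into one flat parameter list

For several thousand rows you may want to split the batch into chunks so the final statement stays under MySQL's max_allowed_packet limit.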
Just use a SELECT statement to get the values of the chosen columns for many rows and put them into the columns of another table in one go. As an example, the columns "size" and "price" of the two tables "test_b" and "test_c" get filled with the "size" and "price" columns of table "test_a".
BEGIN;
INSERT INTO test_b (size, price)
SELECT size, price
FROM test_a;
INSERT INTO test_c (size, price)
SELECT size, price
FROM test_a;
COMMIT;
The statements are wrapped in BEGIN and COMMIT so that they only take effect when both have succeeded; otherwise everything up to that point is rolled back.
Here is a PHP solution ready for use with an n:m (many-to-many relationship) table:
// get data
$table_1 = get_table_1_rows();
$table_2_fk_id = 123;
$query_values = [];
// prepare first part of the query (before values)
$query = "INSERT INTO `table` (
`table_1_fk_id`,
`table_2_fk_id`,
`insert_date`
) VALUES ";
// loop over table 1 to get all foreign keys and put them in an array
foreach($table_1 as $row) {
    $query_values[] = "(".$row["table_1_pk_id"].", $table_2_fk_id, NOW())";
}
// Implode the query values array with a comma and execute the query.
$db->query($query . implode(',', $query_values));
EDIT: After @john's comment I decided to enhance this answer with a more efficient solution:
divides the query into multiple smaller queries
uses rtrim() to delete the last comma instead of implode()
// limit of query size (lines inserted per query)
$query_values = "";
$limit = 100;
$i = 0;
$table_1 = get_table_1_rows();
$table_2_fk_id = 123;
$query = "INSERT INTO `table` (
`table_1_fk_id`,
`table_2_fk_id`,
`insert_date`
) VALUES ";
foreach($table_1 as $key => $row) {
    $query_values .= "(".$row["table_1_pk_id"].", $table_2_fk_id, NOW()),";
    $i++;
    // entire table parsed or line limit reached:
    // -> execute and purge query_values
    if($key === array_key_last($table_1) || $i % $limit === 0) {
        $db->query($query . rtrim($query_values, ','));
        $query_values = "";
    }
}
// db table name / blog_post / menu / site_title
// Insert into Table (column names separated with comma)
$sql = "INSERT INTO product_cate (site_title, sub_title)
VALUES ('$site_title', '$sub_title')";
// db table name / blog_post / menu / site_title
// Insert into Table (column names separated with comma)
$sql = "INSERT INTO menu (menu_title, sub_menu)
VALUES ('$menu_title', '$sub_menu', )";
// db table name / blog_post / menu / site_title
// Insert into Table (column names separated with comma)
$sql = "INSERT INTO blog_post (post_title, post_des, post_img)
VALUES ('$post_title ', '$post_des', '$post_img')";
Sorry if this has been asked before, but I couldn't find anything that would relate to my case here on SE.
I am trying to import a CSV file into my MySQL database table, with both the table and the CSV having the exact same number and order of columns, except that the table's ID column is missing from the CSV file.
What I want to achieve is to import the CSV into the table while generating an ID number that automatically increases with each record. This does not seem possible, as the CSV always seems to want to insert its data into the first column of the table, but in my case I need it to go into the second column.
How do I approach this, and is there any reference code I can study? I am currently working off the PDO approach below but am having the above-mentioned difficulties.
PHP
<?php
$databasehost = "localhost";
$databasename = "test";
$databasetable = "sample";
$databaseusername="test";
$databasepassword = "";
$fieldseparator = ",";
$lineseparator = "\n";
$csvfile = "filename.csv";
if(!file_exists($csvfile)) {
die("File not found. Make sure you specified the correct path.");
}
try {
$pdo = new PDO("mysql:host=$databasehost;dbname=$databasename",
$databaseusername, $databasepassword,
array(
PDO::MYSQL_ATTR_LOCAL_INFILE => true,
PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION
)
);
} catch (PDOException $e) {
die("database connection failed: ".$e->getMessage());
}
$affectedRows = $pdo->exec("
LOAD DATA LOCAL INFILE ".$pdo->quote($csvfile)." INTO TABLE `$databasetable`
FIELDS TERMINATED BY ".$pdo->quote($fieldseparator)."
LINES TERMINATED BY ".$pdo->quote($lineseparator));
echo "Loaded a total of $affectedRows records from this csv file.\n";
?>
Thank you
You can have MySQL set values for certain columns during import. If your id field is set to auto-increment, you can set it to NULL during import and MySQL will then assign incrementing values to it.
LOAD DATA LOCAL INFILE ".$pdo->quote($csvfile)." INTO TABLE `$databasetable`
FIELDS TERMINATED BY ".$pdo->quote($fieldseparator)."
LINES TERMINATED BY ".$pdo->quote($lineseparator)."
SET id=null;
EDIT - In case the ID column is not present in the CSV:
The col1, col2, col3,... are the names of the actual columns in the DB table (without the id column).
LOAD DATA LOCAL INFILE ".$pdo->quote($csvfile)." INTO TABLE `$databasetable`
FIELDS TERMINATED BY ".$pdo->quote($fieldseparator)."
LINES TERMINATED BY ".$pdo->quote($lineseparator)."
(col1, col2, col3,...)
SET id=null;
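Put together, the whole exec() call from the question would then look roughly like this. This is a sketch only; col1, col2, col3 are placeholders for the real non-id columns of your table, not something taken from your schema.

// Sketch: full exec() call with an explicit column list and SET id = NULL.
$affectedRows = $pdo->exec("
    LOAD DATA LOCAL INFILE ".$pdo->quote($csvfile)." INTO TABLE `$databasetable`
    FIELDS TERMINATED BY ".$pdo->quote($fieldseparator)."
    LINES TERMINATED BY ".$pdo->quote($lineseparator)."
    (col1, col2, col3)
    SET id = NULL");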
The AUTO_INCREMENT attribute can be used to generate a unique identity for new rows. Most versions of MySQL and its storage engines support this. You need not worry about the ID: your cron job can just insert the other fields and AUTO_INCREMENT will take care of the id itself.
No value was specified for the AUTO_INCREMENT column, so MySQL assigned sequence numbers automatically. You can also explicitly assign 0 to the column to generate sequence numbers, unless the NO_AUTO_VALUE_ON_ZERO SQL mode is enabled. If the column is declared NOT NULL, it is also possible to assign NULL to the column to generate sequence numbers. When you insert any other value into an AUTO_INCREMENT column, the column is set to that value and the sequence is reset so that the next automatically generated value follows sequentially from the largest column value.
You can retrieve the most recent automatically generated AUTO_INCREMENT value with the LAST_INSERT_ID() SQL function or the mysql_insert_id() C API function. These functions are connection-specific, so their return values are not affected by another connection which is also performing inserts.
See the example on the official documentation page: https://dev.mysql.com/doc/refman/5.7/en/example-auto-increment.html
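A minimal sketch of that behaviour, loosely following the linked manual page; the animals table is only illustrative and $pdo is assumed to be the connection from the question.

// Sketch: AUTO_INCREMENT assigns ids when none (or NULL/0) is supplied.
$pdo->exec("CREATE TABLE IF NOT EXISTS animals (
    id MEDIUMINT NOT NULL AUTO_INCREMENT,
    name CHAR(30) NOT NULL,
    PRIMARY KEY (id)
)");

// No id given: MySQL assigns the next sequence values automatically.
$pdo->exec("INSERT INTO animals (name) VALUES ('dog'), ('cat'), ('penguin')");
echo $pdo->lastInsertId(); // id generated for the first row of this insert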
As you want to recreate the table over and over and want to manipulate the data from the CSV, try this:
// You have to create the TABLE if it does not exist
$pdo->exec("TRUNCATE TABLE sample"); // No need to drop the table if columns don't change.
$csvContent = file_get_contents($csvfile); // Raw data from file
$lines = explode("\n", $csvContent); // The standard line separator is a newline
// Now you have each line separated
for($i = 0; $i < count($lines); $i++) {
    $col = explode(";", $lines[$i]); // Would be a comma in your case
    // Now you have each column separated
    $pdo->exec("INSERT INTO sample (id, col1, col2, col3 ... coln) VALUES (NULL, '".$col[0]."', '".$col[1]."', '".$col[2]."' ... '".$col[n]."')");
}
This way you can dig into your data and, besides setting an AUTO_INCREMENT ID, you can validate what is coming from the CSV and correct or prevent import errors.
I have unknown keys and values to import into the database from a CSV.
My code is:
while($data = fgetcsv($handle,1000,",",'"'))
{
$data=array_map('addslashes',$data); // apply addslashes() to all values
$data=array_combine($csv_fields,$data); // csv fields assoc (key=>value)
$data=array_intersect_key($data,$tbl_fields); // discard redundant
$tbl_fields_str=implode("`,`",array_keys($data));
$tbl_vals_str=implode("','",array_values($data));
$q="INSERT INTO `cmid` (`cmid`,`$tbl_fields_str`) VALUES ('$cmidtrenutni','$tbl_vals_str') ON DUPLICATE KEY UPDATE (`$tbl_fields_str`) VALUES ('$tbl_vals_str')";
$conn->query($q);
}
I need to insert and, if the row already exists, update.
I tried the code above but it doesn't work.
I found something like http://dev.mysql.com/doc/refman/5.5/en/insert-on-duplicate.html
But that doesn't help in my case because my table doesn't have fixed fields; the keys and values are different on every input.
Is there any solution for how to do this?
This is your query:
INSERT INTO `cmid` (`cmid`, `$tbl_fields_str`)
VALUES ('$cmidtrenutni', '$tbl_vals_str')
ON DUPLICATE KEY UPDATE (`$tbl_fields_str`) VALUES ('$tbl_vals_str');
The problem is the UPDATE part. You need to split the values so that it looks like this:
INSERT INTO `cmid` (`cmid`, `$tbl_fields_str`)
VALUES ('$cmidtrenutni', '$tbl_vals_str')
ON DUPLICATE KEY UPDATE
col1 = newcol1val,
col2 = newcol2val,
. . .
The shorthand that you are using is not valid syntax.
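Because your columns differ on every input, the UPDATE list has to be built dynamically too. A rough sketch reusing the variables from your loop; it keeps the original string concatenation, so the same escaping caveats apply.

// Sketch: build one "col = VALUES(col)" pair for every CSV column in this row.
$updates = [];
foreach (array_keys($data) as $col) {
    $updates[] = "`$col` = VALUES(`$col`)";
}

$q = "INSERT INTO `cmid` (`cmid`, `$tbl_fields_str`)
      VALUES ('$cmidtrenutni', '$tbl_vals_str')
      ON DUPLICATE KEY UPDATE " . implode(', ', $updates);
$conn->query($q);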
To import from a CSV file, take a look at the LOAD DATA INFILE statement or the mysqlimport utility.
Try this one:
$q="INSERT INTO `cmid` (`cmid`,`$tbl_fields_str`) VALUES ('$cmidtrenutni','$tbl_vals_str') ON DUPLICATE KEY UPDATE cmid=cmid";
But I prefer using INSERT IGNORE for your problem:
$q="INSERT IGNORE INTO `cmid` (`cmid`,`$tbl_fields_str`) VALUES ('$cmidtrenutni','$tbl_vals_str')";
You cannot have "unknown" columns in a MySQL database. If you want to store key-value pairs in a MySQL table, you should have a table with two columns: one named "key" and the other "value". Add an extra column "cmid" to group your pairs.
This table should have a primary key on the "cmid" and "key" columns.
Then you should insert values with a query like:
$sqlVals = "";
foreach ($data as $key => $val) {
$sqlVals .= "($cmidtrenutni, $key, $val),";
}
$sqlVals = substr($sqlVals, 0, -1); //remove last comma.
$query = "REPLACE INTO `myTable` (`cmid`, `key`,`value`) VALUES $sqlVals";
Suppose I have a very large array of information for a user:
$user=array(
"name"=>"john",
"ip"=>"xx.xx.xx.xx",
"email"=>"john#something.com",
//lots more values
);
Let's also suppose that this information needs to go into more than one table. For instance, the username needs to go into the users table, the address needs to go into a details table, etc.
Now, I use a certain self-made function to insert into my tables that matches array keys to column names and array values to the values being inserted. Something similar to this:
function insert_sql($table, array $values){
    global $dbc;
    $sql = "INSERT INTO `$table` (".implode(", ", array_keys($values)).") VALUES (".implode(", ", array_values($values)).")";
    $dbc->prepare($sql)->execute();
    return $dbc->lastInsertId();
}
//I don't actually use this function, just trying to show you what is being accomplished.
The problem is that my function uses all the keys and all the values, so when I just need certain parts of the array put into multiple tables, it doesn't work.
The question is:
How do I make an INSERT statement ignore a column if it doesn't exist? So if I insert name, email, and address into table users, but the table doesn't have an address column, I need it to insert the row with the name and email and simply ignore the fact that the address column is not there.
EDIT: The other option is to make an array with the columns of a table and use it to filter the values array, although I am not really sure how to set this up.
Find the given table's column names:
SELECT
column_name
FROM
information_schema.columns
WHERE
table_name = 'tablename'
And then just whitelist your keys in the $values array.
Example:
function insert_sql($table, array $values){
global $connection;
$query = "SELECT column_name FROM information_schema.columns WHERE table_name = :tablename";
/* @var $stmt PDOStatement */
$stmt = $connection->prepare($query);
$stmt->execute(array(
'tablename' => $table
));
$columns = array_flip($stmt->fetchAll(PDO::FETCH_COLUMN, 0));
$values = array_intersect_key($values, $columns);
var_dump($values);
}
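From there, the filtered $values can drive the actual insert. A possible continuation of the function above, sketched with named placeholders; it assumes the array keys are plain column names that are valid as placeholder names.

// Sketch: continuing inside insert_sql() after the array_intersect_key() filter above.
$columnList   = '`' . implode('`, `', array_keys($values)) . '`';
$placeholders = ':' . implode(', :', array_keys($values));

$stmt = $connection->prepare("INSERT INTO `$table` ($columnList) VALUES ($placeholders)");
$stmt->execute($values); // keys become the named parameters

return $connection->lastInsertId();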
How do I make an INSERT statement ignore a column if it doesn't exist? So if I insert name, email, address into table users, but this table doesn't have an address column, I need it to insert the row with the name and email but simply ignore the fact that the address column is not there.
You can't.
Instead, you should map your data to the appropriate tables with separate inserts.
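For example, if insert_sql() is extended to do the filtered insert as sketched above, the split is just two calls; the table names users and details are taken from the question's description.

// Sketch: each call filters $user down to the columns that really exist in that table.
$userId    = insert_sql('users', $user);    // keeps e.g. name, email
$detailsId = insert_sql('details', $user);  // keeps e.g. address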