I am stuck with a peculiar issue here. I have a script that basically imports a CSV file into a database using fgetcsv() in PHP. There is no problem doing this at all, and I am able to update old entries as well using MySQL's ON DUPLICATE KEY UPDATE syntax (I am in no way a MySQL expert, hence me asking here).
Here is that part of the code:
$handle = fopen($file,"r");
fgetcsv($handle, 1000, ",");//skip first row since they are headers
while(($fileop = fgetcsv($handle, 1000, ",")) !== false) //read line by line into $fileop
{
//read array values into vars
$item1 = $fileop[0];
$item2 = $fileop[1];
$key = $fileop[2];
// and a couple more
// now INSERT / UPDATE data in MySQL table
$sql = mysql_query("INSERT INTO table (item1,item2,key)
VALUES ('$item1','$item2','$key')
ON DUPLICATE KEY UPDATE item1='$item1',item2='$item2'");
}
This all works fine. What I am stuck with is the fact that some entries may have been removed from the actual CSV (as in, the key may no longer exist). What I would like to do is remove the entries from the MySQL table that are no longer present in the CSV.
Meaning: if $key is gone from the CSV, also remove that row from the database table. I suppose I would do it before I run the INSERT / UPDATE query on the MySQL table?
I would appreciate any help, guys.
Just keep an account of your keys.
Save every $key in an array in your while loop, and at the end run a query that says
DELETE FROM `table` WHERE `key` NOT IN (listofcommaseparatedkeysgoeshere)
$arrayThatYouNeedToTest = array();
$handle = fopen($file, "r");
fgetcsv($handle, 1000, ","); // skip the first row, since it holds the headers
while (($fileop = fgetcsv($handle, 1000, ",")) !== false) // read line by line into $fileop
{
    // read array values into vars
    $item1 = $fileop[0];
    $item2 = $fileop[1];
    $key = $fileop[2];
    // and a couple more
    // now INSERT / UPDATE data in the MySQL table
    $sql = mysql_query("INSERT INTO `table` (item1, item2, `key`)
        VALUES ('$item1','$item2','$key')
        ON DUPLICATE KEY UPDATE item1='$item1', item2='$item2'");
    // remember every key we have seen in this import
    $arrayThatYouNeedToTest[] = $key;
}
$stringThatYouNeedToInspect = implode(",", $arrayThatYouNeedToTest);
$queryYouREALLYneedToCheckFirst = "DELETE FROM `table` WHERE `key` NOT IN (".$stringThatYouNeedToInspect.")";
//$result = mysql_query($queryYouREALLYneedToCheckFirst);
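One caveat with the query above: a bare NOT IN (1,2,3) list only works for numeric keys. If your keys are strings, each one has to be quoted and escaped before imploding; a minimal sketch, using the same old mysql_* API as the rest of this code:
// quote and escape every key first (assumes string keys and an open mysql_* connection)
$quotedKeys = array_map(function ($k) {
    return "'" . mysql_real_escape_string($k) . "'";
}, $arrayThatYouNeedToTest);
$stringThatYouNeedToInspect = implode(",", $quotedKeys);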
I do something very similar to this with an affiliate website that has just under 500,000 products.
In your database, simply add another column named "update_flag" or something similar, with a default of 0. As you add items from the CSV file, set update_flag to 1. In your ON DUPLICATE KEY UPDATE clause, set the field to 2. I also went and added two other fields: "date_added" and "date_updated".
After your import is complete, you can count the old items (to be deleted), the newly added items, and those that have been updated. You can then simply delete from the table where update_flag = 0, as sketched below.
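A minimal sketch of that flow, hedged: the table name products, the extra columns, and the mysql_* style are all placeholders standing in for your own schema:
// before the import: reset every flag to 0
mysql_query("UPDATE products SET update_flag = 0");

// inside the CSV loop: 1 marks a fresh insert, 2 marks an updated row
mysql_query("INSERT INTO products (item1, item2, `key`, update_flag, date_added)
    VALUES ('$item1','$item2','$key', 1, NOW())
    ON DUPLICATE KEY UPDATE item1='$item1', item2='$item2',
        update_flag = 2, date_updated = NOW()");

// after the import: anything still flagged 0 was not in this CSV
mysql_query("DELETE FROM products WHERE update_flag = 0");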
I hope this helps.
I have a SQL database with 5 tables and I also have 5 CSV files, one for each of those tables.
I am struggling to create a PHP script that can read each file and then upload the data into the correct table.
How can I go about this?
URL - http://php.net/manual/en/function.fgetcsv.php
<?php
$filename = "test.csv";
// open the CSV file in read mode
if (($handle = fopen($filename, "r")) !== FALSE) {
    // loop through each row and do the stuff
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // $data will have all the related columns
        // make sure you use an INSERT query here
    }
    fclose($handle);
}
If you want to write to specific tables, there are many ways to do so.
Create 5 files, where the while loop in each file uses that file's respective table name.
E.g., suppose you want to insert the customers.csv file. Then in the while loop you may use the following:
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    // INSERT INTO customers (col1, col2) VALUES (val1, val2);
}
If you would rather keep it all in one file, you can switch on the file name as follows:
$filename = 'customers.csv';
switch ($filename) {
    case 'customers.csv':
        while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
            // INSERT INTO customers ...
        }
        break;
    case 'products.csv':
        while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
            // INSERT INTO products ...
        }
        break;
}
I asked a question yesterday that was unclear and I've now expanded it slightly. In short, this current project calls for a simple web interface where the user can upload a CSV file (this web page is created already). I've modified my PHP for a test file, but my situation calls for something different. Every day, the user will upload 1 to 5 different CSV reports. These reports have about 110 fields/columns, though not all fields will be filled in every report.

I've created a database with 5 tables, each table covering different fields out of the 110. For instance, one table holds info on the water meters (25 fields) and another table holds info for the tests done on the meters (45 fields). I'm having a hard time finding a way to take the CSV, once uploaded, and split the data into the different tables. I've heard of putting the whole CSV into one table and splitting from there with INSERT statements, but I have questions with that:
Is there a way to put a CSV with 110 fields into one table without creating the fields first? Or would I have to create 110 fields in MySQL Workbench and then create a variable for each in PHP?
If not, would I be able to declare variables from the table dump so that the right data then goes into its correct table?
I'm not as familiar with CSVs in terms of uploading like this; usually I'm just pulling a CSV from a folder with a known file name, so that's where my confusion is coming from. Here is the PHP I've used as a simple test with only 10 columns. This was done to make sure the CSV upload works, which it does.
<?php
$server = "localhost";
$user = "root";
$pw = "root";
$db = "uwstest";
$connect = mysqli_connect($server, $user, $pw, $db);
if ($connect->connect_error) {
    die("Connection failed: " . $connect->connect_error); // was $conn, an undefined variable
}
if (isset($_POST['submit'])) {
    $file = $_FILES['file']['tmp_name'];
    $handle = fopen($file, "r");
    $c = 0;
    while (($filesop = fgetcsv($handle, 1000, ",")) !== false) {
        $one = $filesop[0];
        $two = $filesop[1];
        $three = $filesop[2];
        $four = $filesop[3];
        $five = $filesop[4];
        $six = $filesop[5];
        $seven = $filesop[6];
        $eight = $filesop[7];
        $nine = $filesop[8];
        $ten = $filesop[9];
        $sql = "INSERT INTO staging (One, Two, Three, Four, Five, Six, Seven, Eight, Nine, Ten) VALUES ('$one','$two','$three','$four','$five','$six','$seven','$eight','$nine','$ten')";
        // the query must run inside the loop; outside it, only the last row would be inserted
        if ($connect->query($sql) === TRUE) {
            $c++;
        } else {
            echo "Error: " . $sql . "<br>" . $connect->error;
        }
    }
    echo "Your database has imported successfully";
}
?>
Depending on the CSV size, you might want to consider using MySQL's native CSV import (LOAD DATA INFILE), since it runs 10x-100x faster.
If you do insist on importing row by row, then you can do something like this with PDO (or adapt it to mysqli).
If you want to match columns, then either store your CSV as an associative array, or parse the first row and store it in an array like $cols.
In this case, $result is an associative array that stores a row of the CSV as column_name => column_value:
$cols = implode(',', array_keys($result));   // e.g. "name,age,sex"
$vals = ':' . str_replace(",", ",:", $cols); // e.g. ":name,:age,:sex"
$inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");
foreach ($result as $k => $v) {
    // re-key each value to its placeholder name, normalising the encoding
    $result[':' . $k] = utf8_encode($v);
    if (is_null($v))
        $result[':' . $k] = null;
    unset($result[$k]);
}
$inserter->execute($result);
Hope this helps.
I suggest going with PDO, just to avoid all kinds of weirdness that you may encounter in CSV data.
This is how I would create the columns/vals:
$is_first = true;
$cols = '';
$vals = '';
$cols_array = array();
while (($csv = fgetcsv($handle)) !== false) {
    if ($is_first) {
        // the first row holds the column names; build the query and prepare it once
        $cols_array = $csv;
        $cols = implode(',', $csv);
        $vals = ':' . str_replace(",", ",:", $cols);
        $inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");
        $is_first = false;
        continue;
    }
    // re-key the row by column name (the original looped over $result, which was never populated)
    $result = array();
    foreach ($csv as $k => $v) {
        $result[':' . $cols_array[$k]] = is_null($v) ? null : utf8_encode($v);
    }
    $inserter->execute($result);
}
Here is the code that I use for CSV imports:
$file = 'data/data.csv';
$handle = fopen($file, "r");
$path = realpath(dirname(__FILE__));
$full_path = $path . "/../../$file";
// read the header row to get the column list
$headers = fgetcsv($handle, 10000, ",");
fclose($handle);
$alt_query = 'LOAD DATA LOCAL INFILE \'' . $full_path . '\' INTO TABLE mytable
    FIELDS TERMINATED BY \',\'
    ENCLOSED BY \'\"\'
    LINES TERMINATED BY \'\r\n\'
    IGNORE 1 LINES
    (' . implode(',', $headers) . ')';
echo exec("mysql -e \"USE mydb;$alt_query;\"", $output, $code);
Assuming the relation between the tables and the CSV is arbitrary but uniform from one upload to the next, you just need to establish that correspondence (array index -> table column) once.
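A minimal sketch of that one-time mapping, hedged: the table names, column names, $handle and the PDO connection $pdo are all hypothetical stand-ins:
// hypothetical map: CSV column index => [table, column]
$map = array(
    0 => array('meters', 'serial_no'),
    1 => array('meters', 'location'),
    2 => array('tests',  'result'),
);
while (($data = fgetcsv($handle, 1000, ",")) !== false) {
    $rows = array(); // per-table column => value sets for this CSV line
    foreach ($map as $i => $target) {
        list($table, $column) = $target;
        $rows[$table][$column] = $data[$i];
    }
    // one INSERT per table per line
    foreach ($rows as $table => $fields) {
        $cols = implode(',', array_keys($fields));
        $vals = ':' . implode(',:', array_keys($fields));
        $stmt = $pdo->prepare("INSERT INTO $table ($cols) VALUES ($vals)");
        $stmt->execute($fields); // PDO accepts the keys without the leading colon
    }
}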
I want to upload a CSV file into a MySQL database in PHP. I am using:
$handle = fopen($_FILES['filename']['tmp_name'], "r");
while (($data = fgetcsv($handle,1000,',','"')) !== FALSE) {
// insert into database...
}
fclose($handle);
But it inserts only the first row of the file.
EDIT:
When I try to print_r($data) within the while loop, it also gives me only one row.
I'm looping through a CSV to insert/update the name field of some records in a table. The script is meant to insert the record and, if it already exists, only update the name field.
It's taking quite some time for larger CSV files, so I was wondering if this code could be modified into a multi-row INSERT query with an ON DUPLICATE KEY UPDATE clause that only updates the name field of the record.
The CSV does NOT contain all the fields for the table, only the ones for the primary key and the name. For that reason, REPLACE will not work for this case.
$row = 0; // was used before being initialised
if (($handle = fopen($_FILES['csv']['tmp_name'], "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $title = 'Import: ' . date('d-m-Y') . ' ' . $row;
        # CHECK IF ALREADY EXISTS
        $explode = explode('-', $data[0]);
        $areacode = $explode[0];
        $exchange = $explode[1];
        $number = $explode[2];
        $update = "INSERT INTO " . TBLPREFIX . "numbers SET
            area_code = " . $areacode . ",
            exchange = " . $exchange . ",
            number = " . $number . ",
            status = 1,
            name = '" . escape($data[1]) . "'
            ON DUPLICATE KEY UPDATE name = '" . escape($data[1]) . "'";
        mysql_query($update) or die(mysql_error());
        $row++;
    }
    fclose($handle);
    $content .= success($row . ' numbers have been imported.');
}
Open a transaction before you start inserting and commit it after you are done. This way the database can optimize the write operation on disk, because it takes place in a separate memory space; without a transaction, every single query is committed at once and becomes effective for every other query immediately. A sketch follows below.
At least I hope you are using InnoDB as the storage engine; MyISAM does not support transactions and has other significant drawbacks. You should avoid it if possible.
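A minimal sketch of that idea on top of the question's loop, using plain SQL statements through the same mysql_* calls (this assumes the numbers table is InnoDB):
// start one transaction for the whole import...
mysql_query("START TRANSACTION") or die(mysql_error());
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    // ... build and run the INSERT ... ON DUPLICATE KEY UPDATE from above ...
    mysql_query($update) or die(mysql_error());
}
// ...and commit once after the loop, so all rows are flushed together
mysql_query("COMMIT") or die(mysql_error());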
I would like to create an upload page in PHP and import the uploaded CSV file's data into multiple tables. I tried searching here, but I can't seem to find anything that imports from one CSV into multiple tables. Any help here is greatly appreciated. Thank you.
As another variant of what was proposed above, you can read your CSV line by line and explode each line into fields. Each field will then correspond to one variable.
$handle = fopen("/my/file.csv", "r"); // opening CSV file for reading
if ($handle) { // if file successfully opened
while (($CSVrecord = fgets($handle, 4096)) !== false) { // iterating through each line of our CSV
list($field1, $field2, $field3, $field4) = explode(',', $CSVrecord); // exploding CSV record (line) to the variables (fields)
// and here you can easily compose SQL queries and map you data to the tables you need using simple variables
}
fclose($handle); // closing file handler
}
If you have access to phpMyAdmin, you can upload the CSV there, then copy it over to each desired table.
In response to your comment that some data is going to one table and other data to another table, here is a simple example.
Table1 has 3 fields: name, age and sex. Table2 has 2 fields: haircolour, shoesize. So your CSV could be laid out like:
john smith,32,m,blonde,11
jane doe,29,f,red,4
anders anderson,56,m,grey,9
For the next step you will be using the function fgetcsv. This will break each line of the csv into an array that you can then use to build your SQL statements:
if (($handle = fopen($mycsvfile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // this loops through each line of your CSV, putting the values into array elements
        $sql1 = "INSERT INTO table1 (`name`, `age`, `sex`) values ('".$data[0]."', '".$data[1]."', '".$data[2]."')";
        $sql2 = "INSERT INTO table2 (`haircolour`, `shoesize`) values ('".$data[3]."', '".$data[4]."')";
    }
    fclose($handle);
}
Please note that this does not take any SQL security such as validation into account, but that is basically how it will work.
The problem, it seems to me, is differentiating which field is for which table.
If you send a header like
table.field, table.field, table.field
and then split the header, you'll get all tables and fields.
Could that be a way to go?
All the best.
PS: because of your comment...
A CSV file has (or can have) a first line with field names in it. When there is a need to copy CSV data into more than one table, you can use a workaround to find out which field belongs to which table.
user.username, user.lastname, blog.comment, blog.title
"sam" , "Manson" , "this is a comment", "and I am a title"
Now, when reading the CSV data, you can work over the first line, splitting each title at the dot to find out which tables are used and also the fields.
With this method you are able to copy CSV data to more than one table.
But it means you have to code it first :(
To split the field names:
// only the first line holds the field names; note that preg_split()
// needs the string to split as its second argument ($firstLine here is hypothetical)
$topfields = preg_split('/[,;\t]/', $firstLine);
foreach ($topfields as $t => $f) {
    // $f is "table.field"; split it at the dot
    list($table, $field) = explode('.', $f);
}
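A slightly fuller sketch of that workaround, hedged: $handle and the table/field names are hypothetical, and each data row is regrouped per table before building the INSERTs:
// read the header line and split the "table.field" pairs into a per-column map
$topfields = preg_split('/[,;\t]/', trim(fgets($handle)));
$targets = array();
foreach ($topfields as $i => $f) {
    list($table, $field) = explode('.', trim($f));
    $targets[$i] = array($table, $field);
}
// regroup every data row by table, then build one INSERT per table
while (($data = fgetcsv($handle, 1000, ",")) !== false) {
    $perTable = array();
    foreach ($data as $i => $value) {
        list($table, $field) = $targets[$i];
        $perTable[$table][$field] = $value;
    }
    foreach ($perTable as $table => $fields) {
        $sql = "INSERT INTO `$table` (`" . implode('`,`', array_keys($fields)) . "`)
            VALUES ('" . implode("','", array_map('addslashes', $fields)) . "')";
        // run $sql with your DB layer of choice, e.g. mysqli_query($connect, $sql)
    }
}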
if (($handle = fopen($mycsvfile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // this loops through each line of your CSV, putting the values into array elements
        $sql1 = "INSERT INTO table1 (`name`, `age`, `sex`) values ('".$data[0]."', '".$data[1]."', '".$data[2]."')";
        $sql2 = "INSERT INTO table2 (`haircolour`, `shoesize`) values ('".$data[3]."', '".$data[4]."')";
    }
    fclose($handle);
}
In the above code you build two INSERT queries; how are they actually run?
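One hedged way to run them, assuming a mysqli connection named $connect (not shown in the original snippet) and placing the calls inside the same while loop:
if (!mysqli_query($connect, $sql1)) {
    echo "table1 insert failed: " . mysqli_error($connect);
}
if (!mysqli_query($connect, $sql2)) {
    echo "table2 insert failed: " . mysqli_error($connect);
}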