PHP MySQL LOAD DATA - exclude duplicates

Currently I'm using the following to import a CSV file into a table.
$query = <<<eof
LOAD DATA LOCAL INFILE 'list.csv'
INTO TABLE pupils
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
( -- Field List -- )
eof;
if ($conn->query($query) === TRUE) {
echo "Data Imported successfully";
} else {
echo "Error importing data: " . $conn->error;
}
-- Field List -- is a comma-separated list of fields.
This works well and imports the data from the CSV file into the table.
I've got a new CSV to import, and some of its entries will already exist in the database. During the import, is it possible to check whether an entry already exists?
My database has a field called remote ip which will be unique for each entry. Is there any way to check whether it exists and, if it does, skip that row of the CSV?
Thanks

No, LOAD DATA INFILE does exactly what it says and will not check your existing rows for you. You will have to deal with the duplicates separately, either by cleaning them up after the import or by letting a unique key reject them during the load.
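That said, if the remote ip column can carry a UNIQUE index, MySQL can discard the duplicates during the load itself: LOAD DATA accepts an IGNORE modifier that silently skips any row whose unique key value already exists. A minimal sketch, assuming the column is actually named remote_ip, that it is not indexed yet, and that the existing data contains no duplicates (all assumptions); $conn is the mysqli connection from the question:
// Sketch, not drop-in code: remote_ip is assumed to be the real column name.
$conn->query("ALTER TABLE pupils ADD UNIQUE KEY uq_remote_ip (remote_ip)");

$query = <<<eof
LOAD DATA LOCAL INFILE 'list.csv'
IGNORE
INTO TABLE pupils
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
( -- Field List -- )
eof;

if ($conn->query($query) === TRUE) {
    // $conn->info reports the Records / Skipped counts for the load,
    // so you can see how many duplicate rows were dropped.
    echo "Data imported successfully: " . $conn->info;
} else {
    echo "Error importing data: " . $conn->error;
}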

Related

How to import data from a CSV file to a database

I have a CSV file which has 13 columns, each of which holds a lot of data, and what I need is to be able to extract this data from the .csv file into a MySQL database.
Firstly, I need help with creating the right table structure for the columns of the CSV file, as I'm quite new to MySQL and I wasn't too sure what attributes to assign to each column.
Here is the structure of the csv file...
Columns
pid, start_time, end_time, epoch_start, epoch_end, complete_title, media_type, masterband, service, brand_pid, is_clip, categories, tags
Data under columns
p00547jm (pid), 1003394820 (start_time), 1003999620 (end_time), 2001-10-18T08:47:00 (epoch_start), 2001-10-25T08:47:00 (epoch_end), in_our_time:_democracy (complete_title), audio (media_type), bbc_radio_four (masterband), bbc_radio_four (service), b006qykl (brand_pid),0 (is_clip), [9100005:1:factual.9200041:2:arts_culture_and_the_media.9200055:2:history] (categories), [democracy.history.philosophy.plato.ancient_greece] (tags)
Create your table first and then use
LOAD DATA LOCAL INFILE '/home/dummy/dummy.csv' INTO TABLE tablename
FIELDS TERMINATED BY ',';
to insert the data from the file into your new table.
Also, make sure that when you log in to MySQL you use
mysql -uusername -ppassword --local-infile
As for creating a table:
We can give the syntax but we cannot write the whole query for you. You can try it yourself as you have the necessary data with you.
create table table_name (col1 datatype(size) NOT NULL PRIMARY KEY,
col2 datatype(size), ...);
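For the 13 columns listed in the question, a starting-point sketch might look like the statement below; every data type is a guess (INT for the Unix timestamps, DATETIME for the ISO dates, VARCHAR/TEXT elsewhere) and the table name programmes is made up, so adjust both once you know your data:
create table programmes (
    pid varchar(16) NOT NULL PRIMARY KEY,
    start_time int unsigned,
    end_time int unsigned,
    epoch_start datetime,
    epoch_end datetime,
    complete_title varchar(255),
    media_type varchar(32),
    masterband varchar(64),
    service varchar(64),
    brand_pid varchar(16),
    is_clip tinyint(1),
    categories text,
    tags text
);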
The following script (from "Create mysql table directly from CSV file using the CSV Storage engine?") will generate the CREATE TABLE query for you; $1 will become the table name:
#!/bin/sh
# pass in the file name as an argument: ./mktable filename.csv
echo "create table $1 ( "
head -1 $1 | sed -e 's/,/ varchar(255),\n/g'
echo " varchar(255) );"
Then you can read the file and put your insert logic inside the while loop (see "Import CSV into MySQL but ignore header row"):
//open the csv file for reading
$handle = fopen($file_path, 'r');
// read the first line and ignore it
fgets($handle);
while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
// do your thing
}
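As for the "do your thing" part, here is a sketch of one way to fill it in, assuming the illustrative programmes table above; the mysqli connection details are placeholders and the blanket 's' binding is a simplification (MySQL will coerce the numeric columns):
<?php
// Sketch only: connection details are placeholders.
$mysqli = new mysqli('localhost', 'username', 'password', 'dbname');

$handle = fopen('/path/to/file.csv', 'r');
fgets($handle); // read the first line (the header) and ignore it

// 13 CSV columns -> 13 placeholders; one prepared statement reused per row.
$stmt = $mysqli->prepare(
    "INSERT INTO programmes
     (pid, start_time, end_time, epoch_start, epoch_end, complete_title,
      media_type, masterband, service, brand_pid, is_clip, categories, tags)
     VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)"
);

while (($data = fgetcsv($handle)) !== FALSE) {
    // Bind all 13 values as strings (argument unpacking needs PHP 5.6+).
    $stmt->bind_param(str_repeat('s', 13), ...$data);
    $stmt->execute();
}
fclose($handle);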

Building an application to transform CSV files

I have a rough but complete working CSV transformer. The way my current system works is that it imports the CSV file into an SQL database table with static column names and then exports only the specific (needed) columns. This works great, but it is tied to one type of CSV file because the column names are pre-defined. I'm wondering how I can make it universal: instead of inserting into column1, column2, column3, I want to insert into Spreadsheet Column1, Spreadsheet Column2, Spreadsheet Column3, and so on. How would I go about pulling the column names from the CSV file and creating a new table in the database whose column names are those from the first row of the CSV file?
The current system:
Client uploads CSV file.
A table is created with predefined column names (column 1, column 2, column 3)
Using LOAD DATA INFILE -> PHP scripts will insert the information from the CSV file into the recently created table.
The next query that is run simply takes the specific columns out of the table and exports them to a final CSV file.
The system that would be ideal:
Client uploads CSV file.
PHP scripts read the CSV file and take only the first row (the column names); from those column names, a new table is created.
PHP scripts now use LOAD DATA INFILE.
The rest is the same as current system.
Current code:
import.php
include("/inc/database.php");
include("/inc/functions.php");
include("/inc/data.php");
if($_SERVER['REQUEST_METHOD'] == 'POST'){
$string = random_string(7);
$new_file_name = 'report_'. $string .'.csv';
$themove = move_uploaded_file($_FILES['csv']['tmp_name'], 'C:/xampp/htdocs/uploads/'.$new_file_name);
mysql_query("CREATE TABLE report_". $string ."(". $colNames .")") or die(mysql_error());
$sql = "LOAD DATA INFILE '/xampp/htdocs/uploads/report_". $string .".csv'
INTO TABLE report_". $string ."
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(". $insertColNames .")";
$query = mysql_query($sql) or die(mysql_error());
header('Location: download.php?dlname='.$string.'');
}
data.php (shortened; in reality there are about 200 columns going in and twenty to thirty coming out)
<?php
$colNames = "Web_Site_Member_ID text,
Master_Member_ID text,
API_GUID text,
Constituent_ID text";
$insertColNames = "Web_Site_Member_ID,
Master_Member_ID,
API_GUID,
Constituent_ID";
$exportNames = "Web_Site_Member_ID, Date_Membership_Expires, Membership, Member_Type_Code";
?>
functions.php just includes the block of code for generating a random string/file name.
For CSV file reading please look at the fgetcsv() function. You should easily be able to extract a row of data and access each individual field in the resulting array for your column header definitions.
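A rough sketch of that approach (not the asker's code): read the header row with fgetcsv(), build a CREATE TABLE statement from those names, then let LOAD DATA skip the header with IGNORE 1 LINES. It uses mysqli rather than the deprecated mysql_* functions, and the connection details, file path and the blanket TEXT column type are all assumptions:
<?php
// Sketch only: connection details, file path and column type are assumptions.
$mysqli = new mysqli('localhost', 'username', 'password', 'dbname');

$csvPath = 'C:/xampp/htdocs/uploads/report_example.csv';
$table   = 'report_example';

// Read just the first row to get the spreadsheet's own column names.
$fh      = fopen($csvPath, 'r');
$headers = fgetcsv($fh);
fclose($fh);

// Build "`Column Name` TEXT" definitions; backticks allow spaces in names.
$defs = array();
foreach ($headers as $name) {
    $defs[] = '`' . str_replace('`', '', trim($name)) . '` TEXT';
}
$mysqli->query("CREATE TABLE `$table` (" . implode(', ', $defs) . ")");

// IGNORE 1 LINES keeps the header row itself out of the data.
// Note: LOCAL requires local_infile to be enabled on client and server.
$mysqli->query(
    "LOAD DATA LOCAL INFILE '" . $mysqli->real_escape_string($csvPath) . "'
     INTO TABLE `$table`
     FIELDS TERMINATED BY ','
     LINES TERMINATED BY '\n'
     IGNORE 1 LINES"
);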

Import .CSV into MySQL DB

I have a script that I am using to import .CSV data into a MySQL DB. The first field of data (the email address) still has quote marks around it when I pull it into MySQL. For the other fields, the quotes are stripped out.
My CSV lines looks like this:
"email4#email.com"," Karla","Smith"
"email5#email.com"," Carl","Nichols"
The email addresses still have quotes in MySQL. The first and last name are fine.
Any suggestions?
<?php
$conn = mysql_connect('host','username','password');
mysql_select_db('db-name');
mysql_query("TRUNCATE TABLE contacts") or die(mysql_error());
mysql_query("LOAD DATA LOCAL INFILE 'New Member Weekly Report for Marketing.csv'
INTO TABLE contacts
Fields terminated by ',' ENCLOSED BY '\"'
LINES terminated by '\r'(
contact_email
,contact_first
,contact_last)")
or die("Import Error: " . mysql_error());
?>
On a little bit of googling, I found this:
Instead of writing a script to pull in information from a CSV file, you can link MySQL directly to it and upload the information using the following SQL syntax.
To import an Excel file into MySQL, first export it as a CSV file. Remove the CSV headers from the generated CSV file along with empty data that Excel may have put at the end of the CSV file.
You can then import it into a MySQL table by running:
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
The fields here are the actual tblUniq table fields that the data needs to sit in. The ENCLOSED BY and LINES TERMINATED BY clauses are optional and can help if you have columns enclosed with double quotes, as in Excel exports, etc.
Source: http://www.tech-recipes.com/rx/2345/import_csv_file_directly_into_mysql/
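One thing worth double-checking in the original script, offered as a guess rather than a definite diagnosis: LINES TERMINATED BY '\r' only matches a file that uses bare carriage returns. If the export actually uses Windows-style \r\n line endings, every record after the first begins with a leftover \n, the opening quote is then no longer the first character of the first field, and MySQL leaves those quotes in place, which matches the symptom described. A sketch with the terminator adjusted (the \r\n ending is an assumption about the file, and mysqli stands in for the old mysql_* calls):
<?php
// Sketch only: the \r\n line ending is an assumption about the export;
// connection details are placeholders.
$mysqli = new mysqli('host', 'username', 'password', 'db-name');

$mysqli->query("TRUNCATE TABLE contacts");

$sql = "LOAD DATA LOCAL INFILE 'New Member Weekly Report for Marketing.csv'
        INTO TABLE contacts
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\r\n'
        (contact_email, contact_first, contact_last)";

if (!$mysqli->query($sql)) {
    die('Import Error: ' . $mysqli->error);
}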

Importing CSV with odd rows into MySQL

I'm faced with a problematic CSV file that I have to import to MySQL.
Either through the use of PHP and then insert commands, or straight through MySQL's load data infile.
I have attached a partial screenshot of how the data within the file looks:
The values I need to insert are below "ACC1000" so I have to start at line 5 and make my way through the file of about 5500 lines.
It's not possible to skip to each next line because for some Accounts there are multiple payments as shown below.
I have been trying to get to the next row by scanning the rows for the occurrence of "ACC"
if (strpos($data[$c], 'ACC') !== FALSE){
echo "Yep ";
} else {
echo "Nope ";
}
I know it's crude, but I really don't know where to start.
If you have a (foreign key) constraint defined in your target table such that records with a blank value in the type column will be rejected, you could use MySQL's LOAD DATA INFILE to read the first column into a user variable (which is carried forward into subsequent records) and apply its IGNORE keyword to skip those "records" that fail the FK constraint:
LOAD DATA INFILE '/path/to/file.csv'
IGNORE
INTO TABLE my_table
CHARACTER SET utf8
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 4 LINES
(@a, type, date, terms, due_date, class, aging, balance)
SET account_no = @account_no := IF(@a = '', @account_no, @a)
There are several approaches you could take.
1) You could go with @Jorge Campos' suggestion and read the file line by line, using PHP code to skip the lines you don't need and insert the ones you want into MySQL. A potential disadvantage of this approach, if you have a very large file, is that you will either have to run a bunch of little queries or build up one larger one, and it could take some time to run.
2) You could process the file and remove any rows/columns that you don't need, leaving the file in a format that can be inserted directly into MySQL via the command line or whatever.
Based on which approach you decide to take, either myself or the community can provide code samples if you need them.
This snippet should get you going in the right direction:
$file = '/path/to/something.csv';
if( ! $fh = fopen($file, 'r') ) { die('bad file'); }
if( ! $headers = fgetcsv($fh) ) { die('bad data'); }
while($line = fgetcsv($fh)) {
echo var_export($line, true) . "\n";
if( preg_match('/^ACC/', $line[0]) ) { echo "record begin\n"; }
}
fclose($fh);
http://php.net/manual/en/function.fgetcsv.php
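If you take the PHP route, here is a sketch of how the "carry the account number forward" idea might look in code; the table and column names are borrowed from the LOAD DATA answer above, and the assumption that real payment rows always have a value in the type column is exactly that, an assumption:
<?php
// Sketch only: connection details are placeholders; table/column names
// are taken from the LOAD DATA example above and may need adjusting.
$mysqli = new mysqli('localhost', 'username', 'password', 'dbname');
$stmt   = $mysqli->prepare(
    "INSERT INTO my_table
     (account_no, type, date, terms, due_date, class, aging, balance)
     VALUES (?, ?, ?, ?, ?, ?, ?, ?)"
);

$fh = fopen('/path/to/something.csv', 'r');
for ($i = 0; $i < 4; $i++) { fgetcsv($fh); } // the data starts at line 5

$account = null;
while (($line = fgetcsv($fh)) !== false) {
    if (preg_match('/^ACC/', (string) $line[0])) {
        $account = $line[0];                  // carry the account forward
    }
    if ($account === null || count($line) < 8 || $line[1] === '') {
        continue;                             // skip layout/blank rows
    }
    $stmt->bind_param('ssssssss', $account, $line[1], $line[2], $line[3],
                      $line[4], $line[5], $line[6], $line[7]);
    $stmt->execute();
}
fclose($fh);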

Importing CSV data into MySQL table

I have a CSV file which contains data separated by tabs. I need to import the data into a MySQL table which consists of two columns. The first CSV column should go into the first column of the table, and similarly for the second.
<?php
$con=mysql_connect("localhost","root","");
mysql_select_db("translation",$con);
$open=fopen("EH_excel.txt","r");
while(($get=fgetcsv($open,1000,","))!==false) {
mysql_query("insert into dictionary(english,croatian)
values('".$get[0]."','".$get[1]."')");
}
fclose($open); echo "Import Done.";
?>
Can anybody help me?
Since what you have is a tab-delimited file, this is the way to import it in SQL:
LOAD DATA LOCAL INFILE 'sample.txt' INTO TABLE sample
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY ''
ESCAPED BY ''
LINES TERMINATED BY '\n';
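If you would rather keep the PHP loop from the question, note that fgetcsv() is being told the delimiter is ',' even though the file is tab-separated. A sketch of the same loop with a tab delimiter and a prepared statement in place of the old mysql_* calls (the connection details are copied from the question):
<?php
// Sketch only: connection details come from the question's own script.
$mysqli = new mysqli('localhost', 'root', '', 'translation');
$stmt   = $mysqli->prepare(
    "INSERT INTO dictionary (english, croatian) VALUES (?, ?)"
);

$open = fopen('EH_excel.txt', 'r');
// "\t" tells fgetcsv() that fields are separated by tabs, not commas.
while (($get = fgetcsv($open, 1000, "\t")) !== false) {
    $stmt->bind_param('ss', $get[0], $get[1]);
    $stmt->execute();
}
fclose($open);
echo "Import Done.";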
