How to import data from CSV file to database [duplicate] - php

This question already has answers here:
Importing CSV data using PHP/MySQL
(6 answers)
Closed 5 years ago.
I have a CSV file with 13 columns, each containing a large amount of data, and I need to extract this data from the .csv file into a MySQL database.
Firstly, I need help creating a table with a column for each of the CSV file's columns, as I'm quite new to MySQL and I wasn't sure what data types to assign to each column.
Here is the structure of the csv file...
Columns
pid, start_time, end_time, epoch_start, epoch_end, complete_title, media_type, masterband, service, brand_pid, is_clip, categories, tags
Data under columns
p00547jm (pid), 1003394820 (start_time), 1003999620 (end_time), 2001-10-18T08:47:00 (epoch_start), 2001-10-25T08:47:00 (epoch_end), in_our_time:_democracy (complete_title), audio (media_type), bbc_radio_four (masterband), bbc_radio_four (service), b006qykl (brand_pid),0 (is_clip), [9100005:1:factual.9200041:2:arts_culture_and_the_media.9200055:2:history] (categories), [democracy.history.philosophy.plato.ancient_greece] (tags)

Create your table first and then use
LOAD DATA LOCAL INFILE '/home/dummy/dummy.csv' INTO TABLE tablename
FIELDS TERMINATED BY ',';
to insert the data from the file into your new table.
Also, make sure that when you log in to MySQL you use
mysql -uusername -ppassword --local-infile
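If the server still rejects LOAD DATA LOCAL, the server-side setting may also need to be enabled. This is an extra step not mentioned in the original answer and it assumes you have sufficient privileges on the server:
SET GLOBAL local_infile = 1;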
As for creating a table:
We can give you the syntax, but we cannot write the whole query for you. You can try it yourself, since you have the necessary data.
create table table_name (col1 datatype(size) NOT NULL PRIMARY KEY,
col2 datatype(size), ...);
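As a starting point for the 13 columns listed in the question, here is a sketch; the table name programmes and all types and sizes are assumptions, so adjust them to your actual data (for example, the epoch_* values look like datetimes and the *_time values like Unix timestamps):
CREATE TABLE programmes (
    pid            VARCHAR(16) NOT NULL PRIMARY KEY,
    start_time     INT UNSIGNED,
    end_time       INT UNSIGNED,
    epoch_start    DATETIME,
    epoch_end      DATETIME,
    complete_title VARCHAR(255),
    media_type     VARCHAR(32),
    masterband     VARCHAR(64),
    service        VARCHAR(64),
    brand_pid      VARCHAR(16),
    is_clip        TINYINT(1),
    categories     TEXT,
    tags           TEXT
);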

This will generate the CREATE TABLE query; $1 (the file name passed as the first argument) becomes the table name.
(From: Create MySQL table directly from CSV file using the CSV Storage engine?)
#!/bin/sh
# Pass in the file name as an argument: ./mktable filename.csv
# Emits a CREATE TABLE statement with every column typed as varchar(255).
echo "create table $1 ( "
head -1 "$1" | sed -e 's/,/ varchar(255),\n/g'
echo " varchar(255) );"
Then you can read the file and put your insert logic in the while loop
(From: Import CSV into MySQL but ignore header row)
//open the csv file for reading
$handle = fopen($file_path, 'r');
// read the first line and ignore it
fgets($handle);
while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
// do your thing
}
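A minimal sketch of what the "do your thing" part could look like, assuming the 13-column table from the question (here called programmes, an illustrative name) and a mysqli connection in $mysqli:
$handle = fopen($file_path, 'r');
fgets($handle); // read the first line (headers) and ignore it

$stmt = $mysqli->prepare(
    'INSERT INTO programmes (pid, start_time, end_time, epoch_start, epoch_end, complete_title,
        media_type, masterband, service, brand_pid, is_clip, categories, tags)
     VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)'
);

while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
    if (count($data) !== 13) {
        continue; // skip blank or malformed lines
    }
    // bind the 13 CSV fields in order and insert the row
    $stmt->bind_param(str_repeat('s', 13), ...$data);
    $stmt->execute();
}
fclose($handle);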

Related

phpMyAdmin has no "Ignore duplicate rows" option for importing CSV files [duplicate]

This question already has answers here:
how to skip duplicate records when importing in phpmyadmin
(2 answers)
Closed 8 years ago.
I have a CSV file that I successfully imported into a database table.
Over time I will update this CSV file with new data, so if I import the same file again (which will contain rows that have already been inserted into the table), is there a way to avoid adding duplicates?
While researching I came across an option called "Ignore duplicate rows", however this option is not present in my import options.
I am using the phpMyAdmin that is packaged with XAMPP (version information: 4.2.7.1; latest stable version: 4.2.8).
Add a unique key to the database column(s) that you think should be unique.
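For example, assuming the duplicates should be detected by an email column (the table and column names here are placeholders, not taken from the question):
ALTER TABLE your_table ADD UNIQUE KEY uniq_email (email);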
Here is the code I wrote for uploading CSV files; it stops re-inserting rows whose key values are already present by using ON DUPLICATE KEY UPDATE. Maybe it will help you.
<?php
$conn = mysql_connect("localhost", "root", "");
$db = mysql_select_db("dbname");
$csv_file = "/path/to/file";
$savefile = "filename.csv";
$handle = fopen($csv_file, "r");
$i = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) { // 2nd param is the maximum line length
    if ($i > 0) { // skip the header row
        $import = "INSERT INTO table_name (col0, col1, col2, col3)
                   VALUES ('" . addslashes($data[0]) . "','" . addslashes($data[1]) . "','" . addslashes($data[2]) . "','" . addslashes($data[3]) . "')
                   ON DUPLICATE KEY UPDATE col2 = '" . addslashes($data[2]) . "'";
        mysql_query($import) or die(mysql_error());
    }
    $i = 1; // after the first row, every following row is inserted
}
?>
cheers

Building an application to transform CSV files

I have a rough but complete, working CSV transformer. The way my current system works is that it imports the CSV file into an SQL database table with static column names and exports only specific (needed) columns. This system works great, but it is specific to one type of CSV file because the column names are pre-defined. I'm wondering how I can make this universal: instead of having it insert column1, column2, column3, I want it to insert Spreadsheet Column1, Spreadsheet Column2, Spreadsheet Column3, etc. How would I go about pulling the column names from the CSV file and creating a new table in the database whose column names are those from the first row of the CSV file?
The current system:
Client uploads CSV file.
A table is created with predefined column names (column 1, column 2, column 3)
Using LOAD DATA INFILE, PHP scripts insert the information from the CSV file into the recently created table.
The next query that is run simply takes only specific columns out of the table and exports them to a final CSV file.
The system that would be ideal:
Client uploads CSV file.
PHP scripts read the CSV file and take only the first row (the column names); from these column names, a new table is created.
PHP scripts now use LOAD DATA INFILE.
The rest is the same as current system.
Current code:
import.php
include("/inc/database.php");
include("/inc/functions.php");
include("/inc/data.php");
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
    $string = random_string(7);
    $new_file_name = 'report_' . $string . '.csv';
    $themove = move_uploaded_file($_FILES['csv']['tmp_name'], 'C:/xampp/htdocs/uploads/' . $new_file_name);
    mysql_query("CREATE TABLE report_" . $string . " (" . $colNames . ")") or die(mysql_error());
    $sql = "LOAD DATA INFILE '/xampp/htdocs/uploads/report_" . $string . ".csv'
        INTO TABLE report_" . $string . "
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n'
        (" . $insertColNames . ")";
    $query = mysql_query($sql) or die(mysql_error());
    header('Location: download.php?dlname=' . $string . '');
}
data.php (most of this is shortened; in reality there are about 200 columns going in and twenty to thirty coming out)
<?php
$colNames = "Web_Site_Member_ID text,
Master_Member_ID text,
API_GUID text,
Constituent_ID text";
$insertColNames = "Web_Site_Member_ID,
Master_Member_ID,
API_GUID,
Constituent_ID";
$exportNames = "Web_Site_Member_ID, Date_Membership_Expires, Membership, Member_Type_Code";
?>
functions.php just includes the block of code for generating a random string/file name.
For CSV file reading please look at the fgetcsv() function. You should easily be able to extract a row of data and access each individual field in the resulting array for your column header definitions.
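A minimal sketch of that idea, assuming the first row of the uploaded CSV holds the column names and that every column can initially be created as TEXT (the sanitizing step and the generated CREATE TABLE are illustrative, not part of the original code):
// Read the header row of the uploaded CSV and build a CREATE TABLE statement from it.
$handle = fopen('C:/xampp/htdocs/uploads/' . $new_file_name, 'r');
$headers = fgetcsv($handle, 0, ','); // 0 = no line-length limit
fclose($handle);

$columns = array();
foreach ($headers as $header) {
    // sanitize the header so it is safe to use as a column name
    $name = preg_replace('/[^A-Za-z0-9_]/', '_', trim($header));
    $columns[] = "`" . $name . "` TEXT";
}

mysql_query("CREATE TABLE report_" . $string . " (" . implode(', ', $columns) . ")") or die(mysql_error());

// Then run LOAD DATA INFILE ... IGNORE 1 LINES into the new table, as in the current system.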

Optimizing code for inserting 27000*2 keys from a plain text file into the DB

I need to insert data from a plain text file, exploding each line into 2 parts and then inserting them into the database. I'm doing it this way, but can this program be optimized for speed?
The file has around 27,000 lines of entries.
DB structure [unique key (ext,info)]
ext [varchar]
info [varchar]
code:
$string = file_get_contents('list.txt');
$file_list = explode("\n", $string);
$entry = 0;
$db = new mysqli('localhost', 'root', '', 'file_type');
$sql = $db->prepare('INSERT INTO info (ext, info) VALUES (?, ?)');
$j = count($file_list);
for ($i = 0; $i < $j; $i++) {
    $data = explode(' ', $file_list[$i], 2);
    $sql->bind_param('ss', $data[0], $data[1]);
    $sql->execute();
    $entry++;
}
$sql->close();
echo $entry . ' entries inserted!<hr>';
If you are sure that the file contains unique pairs of ext/info, you can try disabling keys for the import:
ALTER TABLE `info` DISABLE KEYS;
And after import:
ALTER TABLE `info` ENABLE KEYS;
This way unique index will be rebuild once for all records, not every time something is inserted.
To increase speed even more, you should change the format of this file to be CSV-compatible and use MySQL's LOAD DATA to avoid parsing every line in PHP.
When there are many items to insert, you usually put all the data in a CSV file, create a temporary table with columns matching the CSV, do a LOAD DATA [LOCAL] INFILE, and then move that data into the destination table. As far as I can see you don't need much additional processing, so you can even treat your input file as a CSV without any extra trouble.
$db->query('CREATE TEMPORARY TABLE _tmp_info (ext VARCHAR(255), info VARCHAR(255))');
$db->query("LOAD DATA LOCAL INFILE '{$filename}' INTO TABLE _tmp_info
    FIELDS TERMINATED BY ' '
    LINES TERMINATED BY '\n'"); // $filename = 'list.txt' in your case; query() is used since $db is mysqli
$db->query('INSERT INTO info (ext, info) SELECT t.ext, t.info FROM _tmp_info t');
You can run a COUNT(*) on the temporary table after that to show how many records were loaded.
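For instance, a small sketch reusing the mysqli connection $db from the question:
$result = $db->query('SELECT COUNT(*) AS cnt FROM _tmp_info');
$row = $result->fetch_assoc();
echo $row['cnt'] . ' rows were loaded into the temporary table';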
If you have a large file that you want to read in, I would not use file_get_contents. Using it forces the interpreter to store the entire contents in memory at once, which is wasteful.
The following is a typical line-by-line reading snippet:
$file_handle = fopen("myfile", "r");
while (!feof($file_handle)) {
$line = fgets($file_handle);
echo $line;
}
fclose($file_handle);
This is different in that, at any single instant, the only part of the file kept in memory is a single line (not the entire contents of the file), which in your case will probably lower the run-time memory footprint of your script. You can use the same loop to perform your INSERT operation, as sketched below.
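A sketch of that combination, reusing the table, column names, and prepared statement from the question (the blank-line and missing-field guards are additions):
$db = new mysqli('localhost', 'root', '', 'file_type');
$stmt = $db->prepare('INSERT INTO info (ext, info) VALUES (?, ?)');

$file_handle = fopen('list.txt', 'r');
$entry = 0;
while (!feof($file_handle)) {
    $line = trim(fgets($file_handle));
    $data = explode(' ', $line, 2);
    if ($line === '' || count($data) < 2) {
        continue; // skip blank or malformed lines
    }
    $stmt->bind_param('ss', $data[0], $data[1]);
    $stmt->execute();
    $entry++;
}
fclose($file_handle);
$stmt->close();
echo $entry . ' entries inserted!';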
If you can, use something like Talend. It's an ETL program, simple and free (it also has a paid version).
Here is the magic solution [3 seconds vs 240 seconds]: disable the keys, wrap the inserts in a single transaction, then re-enable the keys.
$db->query('ALTER TABLE info DISABLE KEYS');
$db->autocommit(FALSE);
// ... run the prepared INSERTs here ...
$db->commit();
$db->query('ALTER TABLE info ENABLE KEYS');

How to take a current snapshot of a MySQL table and store it in a CSV file (after creating it)?

I have a large database table, approximately 5GB, and I want to get a current snapshot of it using "SELECT * FROM MyTableName". I am using PDO in PHP to interact with the database, so I prepare the query and then execute it:
// Execute the prepared query
$result->execute();
$resultCollection = $result->fetchAll(PDO::FETCH_ASSOC);
This is not an efficient way, as a lot of memory is used to store the data (approximately 5GB) in the associative array.
My final goal is to collect the data returned by the SELECT query into a CSV file and put that CSV file at an FTP location from where the client can get it.
Another option I thought of was:
SELECT * INTO OUTFILE "c:/mydata.csv"
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY "\n"
FROM my_table;
But I am not sure whether this would work, because a cron job initiates the complete process and there is no CSV file yet. So for this approach the PHP script will have to:
Create a CSV file.
Run a SELECT query on the database.
Store the SELECT query result in the CSV file.
What would be the best or most efficient way to do this kind of task? Any suggestions?
You can use the PHP function fputcsv() (see the PHP manual) to write single lines of CSV into a file. To avoid the memory problem, instead of fetching the whole result set at once, just execute the query and iterate over the result:
$fp = fopen('file.csv', 'w');
$result->execute();
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
// and here you can simply export every row to a file:
fputcsv($fp, $row);
}
fclose($fp);
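One additional point, assuming the PDO MySQL driver: by default it buffers the entire result set on the client, so for a table this size you may also want an unbuffered query while streaming rows into the CSV. A sketch (the DSN, credentials, and file names are placeholders):
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
// Stream rows from the server instead of buffering the full result set client-side.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$fp = fopen('file.csv', 'w');
$result = $pdo->query('SELECT * FROM MyTableName');
$first = true;
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
    if ($first) {
        fputcsv($fp, array_keys($row)); // write the header row once
        $first = false;
    }
    fputcsv($fp, $row);
}
fclose($fp);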

PHP parse csv list and insert each record into wordpress users database

I've got an Excel file containing a list of records with firstname, lastname, email_address, transaction_id.
I'd like to create a script that reads in the Excel data (it can be CSV or whatever export format is easiest to work with) and inserts each record as a user in my WordPress database with the role of "member".
You can explode each line on the delimiter (like ,), or you can use fopen and fgetcsv to parse line by line. Overall, the general procedure is not difficult. For example:
$fileResource = fopen('/path/to/file.csv', 'r');
// note: fgetcsv's 2nd parameter is the maximum line length and the 3rd is the delimiter
while (($data = fgetcsv($fileResource, 1000, ',')) !== false)
{
    // $data is a numerically indexed array
    // do stuff with $data
}
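A rough sketch of the "do stuff" part for this particular question, assuming the CSV columns are firstname, lastname, email_address, transaction_id in that order, that a "member" role already exists, and that the script runs with WordPress loaded (for example via wp-load.php) so wp_insert_user() is available; the file paths are placeholders:
require_once '/path/to/wordpress/wp-load.php'; // bootstrap WordPress outside a theme/plugin

$fileResource = fopen('/path/to/file.csv', 'r');
while (($data = fgetcsv($fileResource, 1000, ',')) !== false) {
    list($firstname, $lastname, $email, $transaction_id) = $data;

    if (email_exists($email)) {
        continue; // skip records that are already registered
    }

    $user_id = wp_insert_user(array(
        'user_login' => $email,
        'user_email' => $email,
        'user_pass'  => wp_generate_password(),
        'first_name' => $firstname,
        'last_name'  => $lastname,
        'role'       => 'member',
    ));

    if (!is_wp_error($user_id)) {
        // store the transaction id as user meta so it is not lost
        update_user_meta($user_id, 'transaction_id', $transaction_id);
    }
}
fclose($fileResource);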
