Import CSV file directly into MySQL database [duplicate] - php

Possible Duplicate:
Import CSV to mysql
Right, I need some help with this:
I am trying to import a .csv file into a MySQL database using PHP, rather than doing it manually through phpMyAdmin.
This is the code I have at the moment:
if ($_REQUEST['func'] == "iid") {
    $db->conn = new mysqli(DB_SERVER, DB_USER, DB_PASSWORD, DB_NAME) or
        die('There was a problem connecting to the database.');
    $csv = $_POST['csv-file'];
    $path = $csv;
    $row = 1;
    if (($handle = fopen($path, "r")) !== FALSE) {
        while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
            $row++;
            $data_entries[] = $data;
        }
        fclose($handle);
    }
    // this you'll have to expand
    foreach ($data_entries as $line) {
        $sql = $db->conn->prepare('INSERT INTO `bd_results`');
        $db->execute($line);
    }
}
However I get the following error:
Fatal error: Call to undefined method stdClass::execute() in /homepages/19/d372249701/htdocs/business-sites/bowlplex-doubles-new/admin/scores.php on line 44
For reference I am using this code taken from: Here
I am not well versed in the $db->conn business; I'm used to mysql_connect()! So any help would be appreciated.

Try this simple one.
if (($handle = fopen("google.csv", "r")) !== FALSE) {
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
$db->conn->query("INSERT INTO values('" . implode('\',\'', $data) . "');");
}
fclose($handle);
}
Your script might also need to add quotes to the CSV values if they don't have them already. If you need to deal with quotes and escaping in your CSV files, I recommend you look at my blog post at http://www.fusionswift.com/2012/07/php-import-csv-to-mysql/
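Note that interpolating raw CSV fields into SQL like the snippet above breaks on embedded quotes and is open to SQL injection. A minimal hardening sketch, still assuming the `bd_results` table from the question:

if (($handle = fopen("google.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // escape each field so embedded quotes can't break the statement
        $escaped = array_map(array($db->conn, 'real_escape_string'), $data);
        $db->conn->query("INSERT INTO bd_results VALUES ('" . implode("','", $escaped) . "')");
    }
    fclose($handle);
}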

$stmt = $db->conn->prepare('INSERT INTO `bd_results` (field1, field2) VALUES (?, ?)');
$stmt->bind_param('ss', $field1Value, $field2Value);
foreach ($data_entries as $line) {
    list($field1Value, $field2Value) = $line;
    $stmt->execute();
}
Where $field1Value is the first CSV column, $field2Value is the second CSV column, and both are of type string, specified as such by the 'ss' type string passed to the bind_param() method.
Basically you prepare the query once in its entirety and bind your variables to it; then, each time the variables hold the desired values, you run the query with the execute() method.
This is how you use prepared statements. Personally I'd go with Mahn's suggestion though and avoid using prepared statements for such a task unless you need to process the data while on it.
mysqli_stmt::bind_param
mysqli_stmt::execute
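Putting it together for the original question, a minimal end-to-end sketch, assuming `bd_results` has two string columns (field1 and field2 are placeholders for the real column names, as above):

$db->conn = new mysqli(DB_SERVER, DB_USER, DB_PASSWORD, DB_NAME);
if ($db->conn->connect_error) {
    die('There was a problem connecting to the database.');
}

// prepare once, bind once; the variables are bound by reference
$stmt = $db->conn->prepare('INSERT INTO `bd_results` (field1, field2) VALUES (?, ?)');
$stmt->bind_param('ss', $field1Value, $field2Value);

if (($handle = fopen($_POST['csv-file'], 'r')) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ';')) !== FALSE) {
        list($field1Value, $field2Value) = $data; // reassigning the bound variables is enough
        $stmt->execute();
    }
    fclose($handle);
}
$stmt->close();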

Related

SQL import CSV file with PHP

I'm trying to import a pretty big CSV file into my database (locally).
The file is 230 MB and it's about 8.8 million lines.
The problem isn't opening the CSV or not knowing how to import it:
the file opens, about 500,000 lines get imported, and then it quits and throws no error or timeout or anything; I just get to see my webpage.
this is the code:
try {
    $conn = new PDO("mysql:host=$servername;dbname=adresses_database", $username, $password);
    // set the PDO error mode to exception
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo "Connected successfully";
    $row = 1;
    if (($handle = fopen("bagadres.csv", "c+")) !== FALSE) {
        while (($data = fgetcsv($handle, '', ";")) !== FALSE) {
            if (!isset($write_position)) { // move the line to previous position, except the first line
                $write_position = 0;
                $num = count($data); // $num is 15
                $row++; // i dont need this?
                $stmt = $conn->prepare("INSERT INTO adresses (openbareruimte, huisnummer, huisletter, huisnummertoevoeging, postcode, woonplaats, gemeente, provincie, object_id, object_type, nevenadres, x, y, lon, lat) VALUES (:openbareruimte, :huisnummer, :huisletter, :huisnummertoevoeging, :postcode, :woonplaats, :gemeente, :provincie, :object_id, :object_type, :nevenadres, :x, :y, :lon, :lat)");
                $stmt->bindParam(':openbareruimte', $data[0]);
                $stmt->bindParam(':huisnummer', $data[1]);
                $stmt->bindParam(':huisletter', $data[2]);
                $stmt->bindParam(':huisnummertoevoeging', $data[3]);
                $stmt->bindParam(':postcode', $data[4]);
                $stmt->bindParam(':woonplaats', $data[5]);
                $stmt->bindParam(':gemeente', $data[6]);
                $stmt->bindParam(':provincie', $data[7]);
                $stmt->bindParam(':object_id', $data[8]);
                $stmt->bindParam(':object_type', $data[9]);
                $stmt->bindParam(':nevenadres', $data[10]);
                $stmt->bindParam(':x', $data[11]);
                $stmt->bindParam(':y', $data[12]);
                $stmt->bindParam(':lon', $data[13]);
                $stmt->bindParam(':lat', $data[14]);
                $stmt->execute();
            } else {
                $read_position = ftell($handle); // get actual line
                fseek($handle, $write_position); // move to previous position
                fputs($handle, $line); // put actual line in previous position
                fseek($handle, $read_position); // return to actual position
                $write_position += strlen($line); // set write position to the next loop
            }
            fflush($handle); // write any pending change to file
            ftruncate($handle, $write_position); // drop the repeated last line
            flock($handle, LOCK_UN);
        }
        fclose($handle);
    }
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage();
}
$conn = null;
I came this far looking for help on Stack Overflow and in the PHP manual, and I also checked whether it was a MySQL error, but I cannot figure this out.
(For any suggestions about MySQL settings: I'm using Linux Mint 18.)
I would strongly recommend that you use MySQL's LOAD DATA INFILE, which is probably the fastest and most efficient way to get CSV data into a MySQL table. The command for your setup would look something like this:
LOAD DATA INFILE 'bagadres.csv'
INTO TABLE adresses
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
If your fields are not enclosed by quotes, or are enclosed by something other than quotes, then remove or modify the ENCLOSED BY clause. FIELDS TERMINATED BY ';' matches the semicolon delimiter your fgetcsv() call uses. Also, IGNORE 1 ROWS will ignore the first row, which makes sense assuming the first line of your file is a header row (i.e. not actual data but column labels).
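If you'd rather issue the statement from PHP than from the mysql client, here is a sketch against the question's PDO connection. It uses the LOCAL variant, which reads the file from the client side and must be enabled on both the connection and the server (some hosts disable it, as the last question on this page found):

// LOCAL INFILE must be enabled on the connection (and allowed by the server)
$conn = new PDO("mysql:host=$servername;dbname=adresses_database", $username, $password,
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true));

// semicolon-delimited to match the fgetcsv() call in the question
$sql = "LOAD DATA LOCAL INFILE " . $conn->quote('bagadres.csv') . "
        INTO TABLE adresses
        FIELDS TERMINATED BY ';'
        ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 ROWS";
$rows = $conn->exec($sql);
echo "Imported $rows rows";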

PHP ODBC execute is trying to open a file

For some reason this code is causing odbc_execute(); to attempt to open a file...
$file = fopen('somefile.csv', 'r');
fgetcsv($file); // Skip the first line
$data = [];
while (($line = fgetcsv($file)) != false) {
    $data[] = $line;
}
fclose($file);
try {
    $conn = odbc_connect("Teradata", "User", "Pass");
    odbc_autocommit($conn, false);
    odbc_exec($conn, 'DELETE FROM table');
    foreach ($data as &$test) {
        $stmt = odbc_prepare($conn, 'INSERT INTO table (experiment_code, experiment_name, variant_code, variant_name, version_number, report_start_date, report_end_date, status, transaction_date, experiment_test_id, test_manager, product_manager, pod, created_date, last_updated_date) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)');
        odbc_execute($stmt, $test);
    }
    odbc_commit($conn);
    $result = odbc_exec($conn, 'SELECT * FROM table');
    odbc_result_all($result);
} catch (Exception $e) {
    odbc_rollback($conn);
    echo $e->getMessage();
}
Here is a snippet of the CSV file...
H1225,Some random text,H1225:001.000,Control,3,02/06/2014,03/31/2014,Completed,,HMVT-1225,Some random name,Some random name,Checkout,03/31/2014 16:54,02/06/2014 16:38
H1225,Some random text,H1225:001.000,Control,3,02/06/2014,03/31/2014,Completed,,HMVT-1225,Some random name,Some random name,Checkout,03/31/2014 16:54,02/06/2014 16:38
And here is the type of error I am getting...
Warning: odbc_execute(): Can't open file Control in C:\wamp\www\HEXinput\assets\php\dumpCSV.php on line 19
I get multiple versions of the same error, just with a different file name each time. The file name seems to be coming from column 3 (0-based). Another weird thing is that it actually does insert some lines correctly.
The final error I get is...
Fatal error: Maximum execution time of 120 seconds exceeded in C:\wamp\www\HEXinput\assets\php\dumpCSV.php on line 27
I am using Teradata's ODBC drivers, version 15, on Windows 7 64-bit.
What could be causing this?
Turns out that some of the fields in the CSV file had single quotes in them, which broke the query.
Simple but annoying oversight.
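The underlying quirk: odbc_execute() treats any parameter that both starts and ends with a single quote as the name of a file to read, which is why values like 'Control' from column 3 showed up as file paths. If the wrapping quotes are not meaningful data, a sketch that strips them before executing (whether trimming is safe is an assumption about this data set):

foreach ($data as $row) {
    // odbc_execute() reads any parameter that starts AND ends with a single
    // quote as a filename; strip the wrapping quotes so the value is sent as data
    $params = array_map(function ($value) {
        return trim($value, "'");
    }, $row);
    odbc_execute($stmt, $params);
}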

Why is this PDO Insert executing twice?

This has been inserting each line from the CSV into the database twice, and now today three times. Nothing else I put in the loop happens more than it should.
$file_handle = fopen("uploads/numbers.csv", "r");
$stmt = $db->prepare("INSERT INTO database
    (firstname,lastname,phonenumber) VALUES
    (:field1,:field2,:field3)");
while (($line_of_data = fgetcsv($file_handle, 1000, ",")) !== FALSE) {
    $stmt->execute(array(':field1' => $line_of_data[0], ':field2' => $line_of_data[1], ':field3' => $line_of_data[2]));
}
Set up a proper primary key on the table. Either (firstname, lastname) or (firstname, lastname, phonenumber), depending on the usage. Ta-da, no more duplicates.
I'm going to assume James was right about the columns given that the CSV contains data that already exists in the database, but either way, a primary key will prevent duplicates.
If you use the (firstname, lastname) key and you want the script to be able to update the phone number, you can use REPLACE instead of INSERT, as sketched below.
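As a sketch, assuming the table really is named `database` as in the question and that (firstname, lastname) is unique enough for your data:

// one-time schema change; pick the key that matches how the data is used
$db->exec("ALTER TABLE `database` ADD PRIMARY KEY (firstname, lastname)");

// REPLACE deletes any existing row with the same key, then inserts the new one
$stmt = $db->prepare("REPLACE INTO `database`
    (firstname,lastname,phonenumber) VALUES
    (:field1,:field2,:field3)");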
Your check is here:
while (($line_of_data = fgetcsv($file_handle, 1000, ",")) !== FALSE)
That condition only tells you that fgetcsv() hasn't reached the end of the file. A blank line in the file does not return FALSE: fgetcsv() returns array(null) for it, so the loop body still runs and the statement executes with empty values. Skip blank lines explicitly:
while (($line_of_data = fgetcsv($file_handle, 1000, ",")) !== FALSE) {
    if ($line_of_data !== array(null)) { // skip blank lines
        $stmt->execute(array(':field1' => $line_of_data[0], ':field2' => $line_of_data[1], ':field3' => $line_of_data[2]));
    }
}
Note that you can't trim() the return value of fgetcsv() to make this check, since it's an array rather than a string; comparing against array(null) is the documented way to detect a blank line.

fputcsv formatting wrong when result is from PDO

I am trying to import data from a DB via PDO and output the results to a CSV file. I am able to output to the screen correctly, but the formatting in the CSV is wild: double names and no '\n'.
<?php
require_once('auth.php');
$conn = new PDO("mysql:host=localhost;dbname=$dbname", $username, $pw);
if (($handle = fopen("nameList2.txt", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, " ")) !== FALSE) {
        $firstname = $data[0];
        $lastname = $data[1];
        $stmt = $conn->prepare("SELECT * FROM list WHERE FName = :firstname AND LName = :lastname");
        $stmt->bindParam(':firstname', $firstname);
        $stmt->bindParam(':lastname', $lastname);
        $stmt->execute();
        $result = $stmt->fetchAll();
        //var_dump($firstname);
        //var_dump($lastname);
        //var_dump($result);
        $fp = fopen('file.csv', 'w');
        foreach ($result as $chunk) {
            echo $chunk[4]." ".$chunk[6]." ".$chunk[7]." ".$chunk[10]." ".$chunk[11]."".$chunk[12]." ".$chunk[13]." ".$chunk[18]." ".$chunk[19]." ".$chunk[20]."<br />";
            fputcsv($fp, $chunk);
        }
        fclose($fp);
    }
    fclose($handle);
    //fclose($fp);
}
?>
You are feeding fputcsv bad data, so it's giving you bad output. Specifically, fetchAll retrieves each row as an array with both numeric and string keys, so each value appears twice.
Fix this by setting the fetch mode appropriately, for example
$result = $stmt->fetchAll(PDO::FETCH_NUM);
It's unclear what the problem with the line endings is -- you don't say, and I can't tell from the screenshot. What is certain is that fputcsv writes a single line feed as the line termination character. While the vast majority of programs will correctly detect and handle these Unix-style line endings, there are some others (e.g. Notepad) that won't.
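Putting that fix into the question's loop, a sketch that also opens file.csv once instead of re-opening (and truncating) it for every name read from nameList2.txt:

$fp = fopen('file.csv', 'w'); // open the output once, not per name
if (($handle = fopen("nameList2.txt", "r")) !== FALSE) {
    $stmt = $conn->prepare("SELECT * FROM list WHERE FName = :firstname AND LName = :lastname");
    while (($data = fgetcsv($handle, 1000, " ")) !== FALSE) {
        $stmt->execute(array(':firstname' => $data[0], ':lastname' => $data[1]));
        // FETCH_NUM returns each row once, with numeric keys only
        foreach ($stmt->fetchAll(PDO::FETCH_NUM) as $chunk) {
            fputcsv($fp, $chunk);
        }
    }
    fclose($handle);
}
fclose($fp);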
Your problem with double names is that you aren't using the fetchAll() method right:
you get the names twice in $result, once under numeric keys and once under string keys.
Use that:
$result = $stmt->fetchAll(PDO::FETCH_ASSOC);
To fix the problem with \n try
ini_set('auto_detect_line_endings', true);

CSV PDO Insert Loop?

I have recently asked how to insert a CSV into a MySQL database. It was suggested that I use LOAD DATA LOCAL INFILE; however, it turns out that this is disabled on my host, so it's no longer an option. Back to PHP loops...
I'm having an issue looping through the results of a temp upload, since I'm mapping the values to an array on insert. With multiple lines, this causes the same entry to be inserted twice (the first line's values), as the array values are explicitly defined.
It's inserting 1, 2, 3, 4 and then 1, 2, 3, 4. I want it to insert 1, 2, 3, 4 and then 5, 6, 7, 8 of the array.
What's the solution (aside from hacky for's and row++)?
Thanks in advance.
$handle = fopen($_FILES['csv']['tmp_name'], "r");
$sql = "INSERT INTO tbl (col1, col2, col3, col4) VALUES (?, ?, ?, ?)";
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $query = $db->prepare($sql);
    if ($query->execute(array($data[0], $data[1], $data[2], $data[3]))) return true;
    else return false;
}
The only thing I can think of is that your loop is only executing once, but you run the loop twice. (You have a "return" statement in your loop.)
The following should work:
function loadCsv($db, $filename) {
    $fp = fopen($filename, 'rb');
    $sql = 'INSERT INTO tbl (col1, col2, col3, col4) VALUES (?,?,?,?)';
    $pstmt = $db->prepare($sql);
    while (FALSE !== ($data = fgetcsv($fp, 1000))) {
        $cols = array_slice($data, 0, 4);
        $pstmt->execute($cols);
    }
    $pstmt->closeCursor();
    fclose($fp);
}
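Called against the temp upload from the question:

loadCsv($db, $_FILES['csv']['tmp_name']);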
For maximum compatibility and performance, I recommend connecting to PDO with a function like this connect_PDO function.
First of all, you only need to prepare the query once (that's one of the two main advantages of using prepared statements, with injection prevention being the other one), so put the call to prepare() before the while loop, not inside it.
Outside of that, I see no reason why the code you've posted would behave the way you claim it does, unless your data is duplicated in your CSV file.
The issue was with the return statement. Removing the return instantly fixed the issue.
Unfortunately the user who posted this answer has since removed it.
Thanks everyone for your suggestion and help with this!
