I am trying to import data from a database via PDO and output the results to a CSV file. I can output to the screen correctly, but the formatting in the CSV is wild: double names and no '\n'.
<?php
require_once('auth.php');
$conn = new PDO("mysql:host=localhost;dbname=$dbname", $username, $pw);
if (($handle = fopen("nameList2.txt", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, " ")) !== FALSE) {
        $firstname = $data[0];
        $lastname = $data[1];
        $stmt = $conn->prepare("SELECT * FROM list WHERE FName = :firstname AND LName = :lastname");
        $stmt->bindParam(':firstname', $firstname);
        $stmt->bindParam(':lastname', $lastname);
        $stmt->execute();
        $result = $stmt->fetchAll();
        //var_dump($firstname);
        //var_dump($lastname);
        //var_dump($result);
        $fp = fopen('file.csv', 'w');
        foreach ($result as $chunk) {
            echo $chunk[4]." ".$chunk[6]." ".$chunk[7]." ".$chunk[10]." ".$chunk[11]."".$chunk[12]." ".$chunk[13]." ".$chunk[18]." ".$chunk[19]." ".$chunk[20]."<br />";
            fputcsv($fp, $chunk);
        }
        fclose($fp);
    }
    fclose($handle);
    //fclose($fp);
}
?>
You are feeding fputcsv bad data, so it's giving you bad output. Specifically, fetchAll retrieves each row as an array with both numeric and string keys, so each value appears twice.
Fix this by setting the fetch mode appropriately, for example
$result = $stmt->fetchAll(PDO::FETCH_NUM);
It's unclear what the problem with the line endings is -- you don't say, and I can't tell from the screenshot. What is certain is that fputcsv writes a single line feed as the line termination character. While the vast majority of programs will correctly detect and handle these Unix-style line endings, there are some others (e.g. Notepad) that won't.
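To see the effect in isolation (a standalone sketch, not your exact schema): a row fetched in the default mode carries every value twice, once under its numeric index and once under its column name, and fputcsv writes them all. A FETCH_NUM-shaped row comes out clean:

```php
<?php
// Simulated rows: what PDO's default fetch mode (FETCH_BOTH) returns
// versus PDO::FETCH_NUM, written to an in-memory stream with fputcsv.
$both = [0 => 'Alice', 'FName' => 'Alice', 1 => 'Smith', 'LName' => 'Smith'];
$num  = ['Alice', 'Smith'];

$fp = fopen('php://temp', 'w+');
fputcsv($fp, $both); // Alice,Alice,Smith,Smith -- every value doubled
fputcsv($fp, $num);  // Alice,Smith
rewind($fp);
echo stream_get_contents($fp);
fclose($fp);
```

Note also that in your script fopen('file.csv', 'w') sits inside the while loop, so file.csv is truncated and rewritten for every name read from nameList2.txt; moving the fopen/fclose pair outside the loop keeps all rows.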
Your problem with double names is because you aren't using fetchAll() correctly:
you get the names twice in $result.
Use this:
$result = $stmt->fetchAll(PDO::FETCH_ASSOC);
To fix the problem with \n, try
ini_set('auto_detect_line_endings', true);
Related
I'm trying to import a pretty big CSV file into my database (locally).
The file is 230MB, about 8.8 million lines.
The problem I have isn't opening the CSV or not knowing how to import it:
the file opens, about 500,000 lines are imported, and then it quits and throws no error or timeout or anything; I just get to see my webpage.
This is the code:
try {
    $conn = new PDO("mysql:host=$servername;dbname=adresses_database", $username, $password);
    // set the PDO error mode to exception
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    echo "Connected successfully";
    $row = 1;
    if (($handle = fopen("bagadres.csv", "c+")) !== FALSE) {
        while (($data = fgetcsv($handle, '', ";")) !== FALSE) {
            if (!isset($write_position)) { // move the line to previous position, except the first line
                $write_position = 0;
                $num = count($data); // $num is 15
                $row++; //i dont need this?
                $stmt = $conn->prepare("INSERT INTO adresses (openbareruimte, huisnummer, huisletter, huisnummertoevoeging, postcode, woonplaats, gemeente, provincie, object_id, object_type, nevenadres, x, y, lon, lat) VALUES (:openbareruimte, :huisnummer, :huisletter, :huisnummertoevoeging, :postcode, :woonplaats, :gemeente, :provincie, :object_id, :object_type, :nevenadres, :x, :y, :lon, :lat)");
                $stmt->bindParam(':openbareruimte', $data[0]);
                $stmt->bindParam(':huisnummer', $data[1]);
                $stmt->bindParam(':huisletter', $data[2]);
                $stmt->bindParam(':huisnummertoevoeging', $data[3]);
                $stmt->bindParam(':postcode', $data[4]);
                $stmt->bindParam(':woonplaats', $data[5]);
                $stmt->bindParam(':gemeente', $data[6]);
                $stmt->bindParam(':provincie', $data[7]);
                $stmt->bindParam(':object_id', $data[8]);
                $stmt->bindParam(':object_type', $data[9]);
                $stmt->bindParam(':nevenadres', $data[10]);
                $stmt->bindParam(':x', $data[11]);
                $stmt->bindParam(':y', $data[12]);
                $stmt->bindParam(':lon', $data[13]);
                $stmt->bindParam(':lat', $data[14]);
                $stmt->execute();
            } else {
                $read_position = ftell($handle); // get actual line
                fseek($handle, $write_position); // move to previous position
                fputs($handle, $line); // put actual line in previous position
                fseek($handle, $read_position); // return to actual position
                $write_position += strlen($line); // set write position to the next loop
            }
            fflush($handle); // write any pending change to file
            ftruncate($handle, $write_position); // drop the repeated last line
            flock($handle, LOCK_UN);
        }
        fclose($handle);
    }
} catch (PDOException $e) {
    echo "Connection failed: " . $e->getMessage();
}
$conn = null;
I came this far looking for help on Stack Overflow and in the PHP manual, and I also checked whether it was a MySQL error,
but I cannot figure this out.
(For any suggestions about MySQL settings: I'm using Linux Mint 18.)
I would strongly recommend that you use MySQL's LOAD DATA INFILE, which is probably the fastest and most efficient way to get CSV data into a MySQL table. The command for your setup would look something like this:
LOAD DATA INFILE 'bagadres.csv'
INTO TABLE adresses
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
If your fields are not enclosed by quotes, or are enclosed by something other than quotes, then remove or modify the ENCLOSED BY clause. Also, IGNORE 1 ROWS will ignore the first row, which would make sense assuming that the first line of your file were a header row (i.e. not actual data but column labels).
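If you want to drive this from PHP rather than the mysql client, the same statement can be issued through your PDO connection. A sketch, untested against a live server: it assumes local_infile is enabled on both client and server, and the ';' delimiter matches what your fgetcsv call uses; importAddresses is just an illustrative name.

```php
<?php
// Hypothetical helper: load bagadres.csv via LOAD DATA LOCAL INFILE.
// Returns the number of rows loaded, or false on failure.
function importAddresses(PDO $conn)
{
    return $conn->exec(
        "LOAD DATA LOCAL INFILE 'bagadres.csv'
         INTO TABLE adresses
         FIELDS TERMINATED BY ';'
         OPTIONALLY ENCLOSED BY '\"'
         LINES TERMINATED BY '\\n'
         IGNORE 1 ROWS"
    );
}

// Usage: the connection must opt in to LOCAL INFILE, e.g.
// $conn = new PDO("mysql:host=$servername;dbname=adresses_database",
//                 $username, $password,
//                 [PDO::MYSQL_ATTR_LOCAL_INFILE => true]);
// echo importAddresses($conn) . " rows imported";
```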
I am trying to use array_map to sanitize an array, which I have created from a CSV file. Here's my code:
if (isset($_FILES['csv']['size'])) {
    if ($_FILES['csv']['size'] > 0 && $_FILES['csv']['size'] != NULL) {
        //Clear existing qty_csv table
        mysqli_query($conn, 'TRUNCATE TABLE qty_csv');
        $row_count = 0;
        //get the csv file
        $filename = $_FILES['csv']['tmp_name'];
        $handle = fopen($filename, "r");
        $delimiter = ',';
        $unescapedArray = array();
        $data = csv_to_array($filename, $delimiter);

        function array_map_callback($a)
        {
            global $conn;
            return mysqli_real_escape_string($conn, $a);
        }

        $data2 = array_map('array_map_callback', $data);
Whenever I run my bit of code I get the warning:
Warning: mysqli_real_escape_string() expects parameter 2 to be string, array given in C:\xampp\htdocs\
Why does this happen, and how can I fix it?
This is the structure of the original data:
part_code varchar(20)
part_descr varchar(255)
part_location varchar(20)
part_qty_in_stock int(11)
reorder_level int(11)
reorder_qty int(11)
part_price decimal(6,2)
This is what people in the comments were talking about with prepared statements. The statement is pre-loaded with ? placeholders, and then each of the placeholders is bound to a variable.
So file() gives us each line of the file in an array element which we can easily loop through with foreach. Within the loop, we use str_getcsv() to turn each CSV line into an array (though if you prefer to roll your own, be my guest) and execute the prepared statement.
Every time the statement is executed, the bound variable value is checked and placed into the statement. The overhead of setting up the database is only done once, resulting in a lot less overhead. Plus you get the bonus of not needing to escape strings; MySQL does it for you.
Of course for production code you'd want to include checks to make sure statement preparation, variable binding, and execution don't throw any errors. Also you didn't include a CSV sample, so you may have to allow for any non-standard separators or terminators in str_getcsv().
//assuming you have up here something like this:
$conn = new mysqli($host, $user, $pass, $dbase);

if (!empty($_FILES['csv']['size'])) {
    //Clear existing qty_csv table
    $conn->query('TRUNCATE TABLE qty_csv');

    //get the csv file
    $filename = $_FILES['csv']['tmp_name'];
    $data = file($filename, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $row_count = count($data);

    $query = "INSERT INTO qty_csv (part_code, part_descr, part_location, part_qty_in_stock, reorder_level, reorder_qty, part_price) VALUES (?,?,?,?,?,?,?)";
    $stmt = $conn->prepare($query);
    //bind once to named variables; bind_param binds by reference, so
    //reassigning these variables inside the loop updates the statement
    $stmt->bind_param("sssiiid", $code, $descr, $location, $qty, $level, $reorderQty, $price);

    foreach ($data as $row) {
        list($code, $descr, $location, $qty, $level, $reorderQty, $price) = str_getcsv($row);
        $stmt->execute();
    }
}
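As for the error checks mentioned above, the cheapest way to add them wholesale (a supplement, not part of the original answer) is to switch the mysqli driver into exception mode before connecting, so every failed prepare/bind/execute throws instead of silently returning false:

```php
<?php
// Make mysqli throw mysqli_sql_exception on any failure instead of
// returning false; call this once before creating the connection.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

// Then wrap the import in a try/catch:
// try { ...prepare/bind_param/execute as above... }
// catch (mysqli_sql_exception $e) { error_log($e->getMessage()); }
```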
The error is caused by an item in $data which is not a string. Do a var_dump to see what's inside $data before passing it to array_map. Or you could do something like:
function array_map_callback($a)
{
    global $conn;
    if (is_array($a)) {
        foreach ($a as $idx => $item) {
            $a[$idx] = mysqli_real_escape_string($conn, $item);
        }
        return $a;
    } else {
        return mysqli_real_escape_string($conn, $a);
    }
}
But this is just a possible solution; it may be better to find out why you have a non-string in your $data array and make sure it doesn't get there.
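For what it's worth, the non-strings here are almost certainly the rows themselves: a CSV-to-array helper typically returns an array of row arrays, so the outer array_map hands each whole row to the callback. A sketch of mapping over both levels (escapeRows and the injectable $escape callable are illustrative names, not from the question; with a live connection you would pass a closure around mysqli_real_escape_string):

```php
<?php
// Escape every scalar in an array of CSV rows. $escape is any
// string-escaping callable; in the question it would wrap
// mysqli_real_escape_string($conn, $value).
function escapeRows(array $rows, callable $escape): array
{
    return array_map(
        fn(array $row) => array_map($escape, $row),
        $rows
    );
}

// Stub escaper for demonstration (addslashes instead of a DB handle):
$clean = escapeRows([["O'Neil", '10'], ['bolt', '5']], 'addslashes');
// $clean[0][0] === "O\\'Neil"
```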
I have a serious question about importing data from CSV to a database.
Import script:
if (file_exists('temp.csv')) {
    require "connection.php";
    $handle = fopen("temp.csv", "r");
    try {
        $import = $db->prepare("INSERT INTO adherence(
            dateandtime,
            lastname,
            firstname,
            paidtime,
            approvedtime,
            notadhering) VALUES(
            ?,?,?,?,?,?)");
        $i = 0;
        while (($data = fgetcsv($handle, 1000, ",", "'")) !== FALSE) {
            if ($i > 0) {
                $data = str_replace('"', '', $data);
                $myDate = date("Y/m/d", strtotime(str_replace('/', '-', $data[0])));
                $import->bindParam(1, $myDate, PDO::PARAM_STR);
                $import->bindParam(2, $data[1], PDO::PARAM_STR);
                $import->bindParam(3, $data[2], PDO::PARAM_STR);
                $import->bindParam(4, $data[3], PDO::PARAM_STR);
                $import->bindParam(5, $data[4], PDO::PARAM_STR);
                $import->bindParam(6, $data[5], PDO::PARAM_STR);
                $import->execute();
            }
            $i++;
        }
        fclose($handle);
    } catch (PDOException $e) {
        echo $e->getMessage();
    }
}
The problem is, I need some sort of conditional logic to check whether a row already exists in the database before importing, and skip it if it exists. How do I handle this kind of thing?
Basically you have two different ways to approach it.
1. Via the RDBMS: use a unique index on your table. Once you insert a duplicate, you'll encounter an error, which can be properly displayed/logged/whatever.
2. Via application logic: search for the item with a proper SELECT statement before inserting. If you find a match, don't insert it.
Example:
$sth = $db->prepare("SELECT yourfields FROM yourtable WHERE yourcondition = :cond");
$sth->bindParam(':cond',$yourvariable, PDO::PARAM_STR);
$sth->execute();
if ($sth->rowCount() > 0) {
    // results - don't insert
} else {
    // place your insert terms here
}
In most cases, coders implement the first way, since it reduces traffic between the application and the RDBMS and makes your data model more robust. If that's an issue for you, try the second way.
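A minimal sketch of the first approach (the columns chosen for the unique key are an assumption; pick whatever combination genuinely identifies a row in your data): add a unique index once, then let INSERT IGNORE skip duplicates during the import:

```php
<?php
// Hypothetical helper: insert one CSV row, silently skipping duplicates.
// One-time setup, e.g. in a migration:
//   ALTER TABLE adherence
//     ADD UNIQUE KEY uq_adherence (dateandtime, lastname, firstname);
function importRow(PDO $db, array $row): void
{
    static $stmt = null;
    if ($stmt === null) {
        $stmt = $db->prepare(
            "INSERT IGNORE INTO adherence
             (dateandtime, lastname, firstname, paidtime, approvedtime, notadhering)
             VALUES (?,?,?,?,?,?)"
        );
    }
    $stmt->execute($row); // a duplicate is skipped, not an error
}
```

INSERT ... ON DUPLICATE KEY UPDATE works the same way if you would rather refresh the existing row than skip it.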
So I'm using PHP to take the contents of a CSV file, put it into a string array, and then use SQL to add it to a database on an IBM iSeries.
However, PHP keeps trying to treat the contents of the string (which contains special characters like "*" and "-") like a mathematical computation.
How do I prevent this?
Here is the code in question:
if (($handle = fopen($_FILES['uploadcsv']['tmp_name'], "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $length = count($data);
        $s_data = implode(',', $data);
        if ($length > $maxcol)
        {
            // echo $length;
            // die;
            $uploadMsg = "Data Error: Not ($maxcol) Columns: ($s_data) <br>";
        }
        else
        {
            if ($data[0] <> '')
            {
                $recda[0] = trim($data[0]); // qty = 1 roll
                // Prepare the SQL statement (possibly faster, safer, better practice)
                $insertsql = "INSERT INTO MIKELIB/PALLETS (PALLET)
                              VALUES($recda[0]) with nc";
                $stmt = db2_prepare($db2conn, $insertsql);
                //$result = db2_exec($db2conn, "Insert into file ...$data[0]"
                $result = db2_execute($stmt, $data[0]);
                if (!$result)
                {
                    $uploadMsg .= "Result code: " . $result . "Data Error: " . db2_stmt_error() . " msg: " . db2_stmt_errormsg() . "data: ($s_data)<br>";
                }
                else
                {
                    $s_data = implode(',', $recda);
                    $uploadMsg .= "Added row ($s_data)<br>";
                }
            }
        }
    }
    fclose($handle);
}
Here is an example output of the error "Result code: Data Error: 42604 msg: Numeric constant 5D09C not valid. SQLCODE=-103data: (A2501-0044*970*5D09C*034)"
Actually, it's your database that is parsing your data as math.
Take a look at this line:
$insertsql = "INSERT INTO MIKELIB/PALLETS (PALLET)
VALUES($recda[0]) with nc";
$stmt = db2_prepare($db2conn, $insertsql);
You're putting the values directly into the query, so if the query has math, or invalid symbols, it'll break your query.
What you should do is:
$insertsql = "INSERT INTO MIKELIB/PALLETS (PALLET)
              VALUES(?) with nc";
$stmt = db2_prepare($db2conn, $insertsql);
$recda0 = $recda[0];
db2_bind_param($stmt, 1, "recda0", DB2_PARAM_IN);
$result = db2_execute($stmt);
That way, there's nothing in $recda[0] that will break the query or be parsed as part of the query.
Joesph, try modifying your SQL to treat that value as a string by wrapping it in single quotes.
$insertsql = "INSERT INTO MIKELIB/PALLETS (PALLET)
VALUES('$recda[0]') with nc";
You may also need to consider escaping single quotes in the string if there is a possibility it will contain any.
I get the impression that you may be trying to load values into multiple columns per row. That won't work in SQL. You have to specify each column.
I know DB2 for i, but not PHP, so I'll attempt to build on David's answer as a template.
$insertsql = "INSERT INTO MYLIB/MYTABLE (cola, colb, colc)
              VALUES(?,?,?) with nc";
$stmt = db2_prepare($db2conn, $insertsql);
$vala = $recda[0];
$valb = $recda[1];
$valc = $recda[2];
db2_bind_param($stmt, 1, "vala", DB2_PARAM_IN);
db2_bind_param($stmt, 2, "valb", DB2_PARAM_IN);
db2_bind_param($stmt, 3, "valc", DB2_PARAM_IN);
$result = db2_execute($stmt);
You may need additional PHP code, perhaps to make sure each value is appropriate for its column, and might perhaps need to detect missing values and load a null or default value, depending on your table definition. But I'll leave that to those who know PHP.
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Import CSV to mysql
Right, I need some help with this:
I am trying to import a .csv file into a MySQL database using PHP, rather than doing it manually through phpMyAdmin.
This is the code I have at the moment:
if ($_REQUEST['func'] == "iid") {
    $db->conn = new mysqli(DB_SERVER, DB_USER, DB_PASSWORD, DB_NAME) or
        die('There was a problem connecting to the database.');
    $csv = $_POST['csv-file'];
    $path = $csv;
    $row = 1;
    if (($handle = fopen($path, "r")) !== FALSE) {
        while (($data = fgetcsv($handle, 1000, ";")) !== FALSE) {
            $row++;
            $data_entries[] = $data;
        }
        fclose($handle);
    }
    // this you'll have to expand
    foreach ($data_entries as $line) {
        $sql = $db->conn->prepare('INSERT INTO `bd_results`');
        $db->execute($line);
    }
}
However I get the following error:
Fatal error: Call to undefined method stdClass::execute() in /homepages/19/d372249701/htdocs/business-sites/bowlplex-doubles-new/admin/scores.php on line 44
For reference I am using this code taken from: Here
I am not well versed in the $db->conn business; I'm used to mysql_connect! So any help would be appreciated.
Try this simple one.
if (($handle = fopen("google.csv", "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $db->conn->query("INSERT INTO bd_results VALUES ('" . implode("','", $data) . "')");
    }
    fclose($handle);
}
Your script might also need to add quotes to the CSV values if they don't have quotes already. If you'll be needing to deal with quotes and all in your CSV files, I recommend you look at my blog post at http://www.fusionswift.com/2012/07/php-import-csv-to-mysql/
foreach ($data_entries as $line) {
    $stmt = $db->conn->prepare('INSERT INTO `bd_results` (field1, field2) VALUES (?, ?)');
    $stmt->bind_param('ss', $field1Value, $field2Value);
    list($field1Value, $field2Value) = $line;
    $stmt->execute();
}
Where $field1Value is the first CSV column and $field2Value the second, both of type string, as specified in the bind_param() call.
Basically you prepare the query in its entirety, then assign variables to it, and once the variables have the desired values you execute the query using the execute() method.
This is how you use prepared statements. Personally I'd go with Mahn's suggestion, though, and avoid prepared statements for such a task unless you need to process the data while you're at it.
mysqli_stmt::bind_param
mysqli_stmt::execute