Recursive MySQL Query - PHP

How can I implement recursive MySQL queries? I have been searching for resources on this, but they have not been very helpful.
I am trying to implement logic similar to the following.
public function initiateInserts()
{
    // Open large CSV file (min 100K rows) for parsing.
    $this->fin = fopen($file, 'r') or die('Cannot open file');
    // Parse the large CSV file to get data and initiate insertion into the schema.
    $query = "";
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = $query + "INSERT INTO dt_table (id, code, connectid, connectcode)
                 VALUES (" + $data[0] + ", " + $data[1] + ", " + $data[2] + ", " + $data[3] + ")";
    }
    $stmt = $this->prepare($query);
    // Execute the statement
    $stmt->execute();
    $this->checkForErrors($stmt);
}
#Author: Numenor
Error Message: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '0' at line 1
This approach inspired me to look for a recursive MySQL query approach.
Here is the approach I was using earlier.
Current code:
public function initiateInserts()
{
    // Open large CSV file (min 100K rows) for parsing.
    $this->fin = fopen($file, 'r') or die('Cannot open file');
    // Parse the large CSV file to get data and initiate insertion into the schema.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
                  VALUES (:id, :code, :connectid, :connectcode)";
        $stmt = $this->prepare($query);
        // Then, for each line: bind the parameters
        $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
        $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
        $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
        $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
        // Execute the statement
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
Updated code:
public function initiateInserts()
{
    // Open large CSV file (min 100K rows) for parsing.
    $this->fin = fopen($file, 'r') or die('Cannot open file');
    // Prepare the insertion query once.
    $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
              VALUES (:id, :code, :connectid, :connectcode)";
    $stmt = $this->prepare($query);
    // Then bind the parameters
    $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
    $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
    $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
    $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
    // Loop through the CSV file and execute the prepared insert, but this is
    // not working and no data is being populated into the database.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        // Execute the statement
        list($id, $code, $connid, $conncode) = $data;
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
This is my main question, for which I am looking for suggestions!

There's nothing recursive in that code snippet.
The wrong operator is used to concatenate the strings: in PHP it's . (dot), not +
You'd have to use something like mysqli::multi_query() to execute more than one statement with a single function call and the statements would have to be separated by a delimiter character (by default a semicolon)
Since you're already using prepare() and execute() why not simply make it a parametrized prepared statement and then assign the values in each iteration of the loop and execute the statement? (Exactly what is $this and what type of object does $this->prepare() return?)
edit, and by the way: $this->prepare() indicates that your class extends a database class, and it also holds a file descriptor $this->fin. This has a certain code smell. My guess is that your class uses/has a database/datasink object and a file/datasource, but is not a database+readfile class. Only extend a class if your derived class truly is that kind of thing (an is-a relationship, not has-a).
edit: a simple example
class Foo {
    protected $pdo;
    public function __construct(PDO $pdo) {
        $this->pdo = $pdo;
    }
    public function initiateInserts($file)
    {
        $query = '
            INSERT INTO
                dt_table_tmp
                (id, code, connectid, connectcode)
            VALUES
                (:id, :code, :connid, :conncode)
        ';
        $stmt = $this->pdo->prepare($query);
        $stmt->bindParam(':id', $id);
        $stmt->bindParam(':code', $code);
        $stmt->bindParam(':connid', $connid);
        $stmt->bindParam(':conncode', $conncode);
        $fin = fopen($file, 'r') or die('Cannot open file');
        while ( false !== ($data = fgetcsv($fin, 5000, ";")) ) {
            list($id, $code, $connid, $conncode) = $data;
            $stmt->execute();
        }
    }
}
$pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// set up a demo table and some test data
$pdo->exec('CREATE TEMPORARY TABLE dt_table_tmp (id int, code int, connectid int, connectcode int)');
$sourcepath = 'sample.data.tmp';
$fh = fopen($sourcepath, 'wb') or die('!fopen(w)');
for ($i = 0; $i < 10000; $i++) {
    fputcsv($fh, array($i, $i % 4, $i % 100, $i % 3), ';');
}
fclose($fh);
unset($fh);
// test script
$foo = new Foo($pdo);
$foo->initiateInserts($sourcepath);
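One more thing worth trying with 100K+ rows: by default each execute() commits on its own, so wrapping the whole import in a single transaction usually speeds it up considerably. A minimal sketch, reusing the $pdo, $foo and $sourcepath from the test script above:
// Hedged sketch: one transaction around the whole import, so MySQL
// does not have to flush to disk after every single INSERT.
$pdo->beginTransaction();
try {
    $foo->initiateInserts($sourcepath);
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack(); // discard the half-finished import
    throw $e;
}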

A few tips on speeding up MySQL data import:
check whether your data really requires parsing; sometimes LOAD DATA works just fine for CSV
if possible, create an SQL file first via PHP and then execute it with the mysql command-line client
use multivalue inserts
disable keys before inserting
A multivalue insert statement looks something like this:
INSERT INTO users(name, age) VALUES
("Sam", 13),
("Joe", 14),
("Bill", 33);
This is much faster than three distinct insert statements (a PHP sketch for building such statements follows the key-disabling example below).
Disabling keys is important to prevent indexing each time you're executing an INSERT:
ALTER TABLE whatever DISABLE KEYS;
INSERT INTO whatever .....
INSERT INTO whatever .....
INSERT INTO whatever .....
ALTER TABLE whatever ENABLE KEYS;
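Putting those last two tips together in PHP terms, here is a minimal sketch. It is not the original poster's code; it assumes a plain PDO connection in $pdo, a ';'-delimited CSV, and the dt_table layout from the question. Rows are flushed in batches of 500 so the query string never grows without bound:
// Hedged sketch: multivalue INSERTs built in batches of 500 rows.
// Assumes $pdo is an open PDO connection and a ';'-delimited CSV file.
$pdo->exec('ALTER TABLE dt_table DISABLE KEYS');
$fin = fopen('data.csv', 'r') or die('Cannot open file');
$rows = array();
while (($data = fgetcsv($fin, 5000, ';')) !== false) {
    $rows[] = sprintf('(%s, %s, %s, %s)',
        $pdo->quote($data[0]), $pdo->quote($data[1]),
        $pdo->quote($data[2]), $pdo->quote($data[3]));
    if (count($rows) >= 500) {
        $pdo->exec('INSERT INTO dt_table (id, code, connectid, connectcode) VALUES '
            . implode(', ', $rows));
        $rows = array();
    }
}
if ($rows) { // flush the final, partial batch
    $pdo->exec('INSERT INTO dt_table (id, code, connectid, connectcode) VALUES '
        . implode(', ', $rows));
}
fclose($fin);
$pdo->exec('ALTER TABLE dt_table ENABLE KEYS');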
Further reading: http://dev.mysql.com/doc/refman/5.1/en/insert-speed.html

Inspired by this question, I would say you should do something similar. If you really have that much data, then a bulk import is the most appropriate approach, and you already have the data in a file.
Have a look at the LOAD DATA INFILE command.
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed. The file name must be given as a literal string.
If you are interested in the speed differences then read Speed of INSERT Statements.
E.g. you can do this:
$query = "LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
          FIELDS TERMINATED BY ';'
          LINES TERMINATED BY '\r\n'
          IGNORE 1 LINES";
This will also skip the first line, assuming that it only contains the column names.
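Executing that from PHP could look like the sketch below. One hedge: as written, LOAD DATA INFILE reads a file on the database server; if the file sits on the web/client machine you need the LOAD DATA LOCAL INFILE variant, and with PDO the PDO::MYSQL_ATTR_LOCAL_INFILE option must be enabled when connecting:
// Hedged sketch: run the LOAD DATA statement through PDO.
// MYSQL_ATTR_LOCAL_INFILE is only needed for the LOCAL variant.
$pdo = new PDO("mysql:host=localhost;dbname=test", 'user', 'pass',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true));
$imported = $pdo->exec($query); // $query is the LOAD DATA string from above
echo $imported . " rows imported\n";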

Related

insert sqlite row from php array

So I am attempting to write a generic SQLite insert that can be used no matter how many items a row has. This is for a single row; it assumes all columns other than ID (which is set to an auto-incrementing integer) are assigned, and bindParam must be used. I have been attempting it like so:
Table Quarterbacks
ID---firstName---lastName
public static function insert($table, $values)
{
    $pdo = new PDO('sqlite:testTable.sqlite');
    $inputString = implode(",", $values);
    $statement = $pdo->prepare("insert into $table values (:value)");
    $statement->bindParam(':value', $inputString);
    $statement->execute();
}

$new = array("Steve", "Young");
Query::insert("Quarterbacks", $new);
The idea is that the table will add a new row, increment the ID, and add Steve Young. But I get the generic error that the prepared statement is false. I know my PDO is connecting to the database, as other test methods work. There are a lot of array-related threads out there, but they seem much more complicated than what I'm trying to do. I'm pretty sure it has to do with PDO treating the single string as invalid, but it also won't take an array of strings for the values.
Edit: I'm starting to lean towards a compromise like bash, i.e. providing a large but not infinite number of function parameters. I'm also open to the ...$ variadic style, but I feel like that ends up with the same problem as now.
I was able to get this to work
$name = "('daunte' , 'culpepper')";
$cats = "('firstName', 'lastName')";
$statement = $pdo->prepare("insert into $table" .$cats ." values" .$name);
$statement->execute();
But not this
$name = "('reggie' , 'wayne')";
$cats = "('firstName', 'lastName')";
$statement = $pdo->prepare("insert into $table:cats values:name");
$statement->bindParam(':cats', $cats);
$statement->bindParam(':name', $name);
$statement->execute();
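For what it's worth, the second version cannot work: a bound placeholder can only stand in for a single value, never for a column list or a whole parenthesized row, so PDO ends up treating your entire string as one value and the statement no longer matches the table. A generic insert therefore has to build the placeholder list itself. A minimal sketch of a replacement for the insert method above; note that $table and $columns cannot be bound, so they must come from trusted code, never from user input:
// Hedged sketch: build the column list and one '?' placeholder per value.
public static function insert($table, array $columns, array $values)
{
    $pdo = new PDO('sqlite:testTable.sqlite');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $cols = implode(', ', $columns);                            // identifiers: trusted input only!
    $marks = implode(', ', array_fill(0, count($values), '?')); // one '?' per value
    $statement = $pdo->prepare("insert into $table ($cols) values ($marks)");
    $statement->execute($values); // positional binding: one value per '?'
}

// usage, mirroring the example above:
Query::insert('Quarterbacks', array('firstName', 'lastName'), array('Steve', 'Young'));
If you really must use bindParam/bindValue as stated, a loop over $values with 1-based positional indexes ($statement->bindValue($i + 1, $value)) does the same job.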

Bulk insert using PDO and PHP variable containing all database values

I am new to PHP and am trying to update deprecated mysql_* code to PDO.
Consider that the variable $insert contains all the values to bulk insert, such as:
('82817cf5-52be-4ee4-953c-d3f4ed1459b0','1','EM3X001P.1a','04.03.10.42.00.02'),
('82817cf5-52be-4ee4-953c-d3f4ed1459b0','2','EM3X001P.2a','04.03.10.33.00.02'),
...etc 13k lines to insert
Here is the deprecated code:
mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db("IPXTools") or die(mysql_error());
if ($insert != '')
{
    $insert = "INSERT INTO IPXTools.MSSWireList (ID,Record,VlookupNode,HostWireLocation) VALUES " . $insert;
    $insert .= "ON DUPLICATE KEY UPDATE Record=VALUES(Record),VlookupNode=VALUES(VlookupNode),HostWireLocation=VALUES(HostWireLocation)";
    mysql_query($insert) or die(mysql_error());
    $insert = '';
}
And here is the new code:
try
{
    $conn = new PDO("mysql:host=$servername;dbname=$dbname", $username, $password);
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION); // set the PDO error mode to exception
    // prepare sql and bind parameters
    $stmt = $conn->prepare("INSERT INTO IPXTools.MSSWireList (ID, Record, VlookupNode, HostWireLocation)
                            VALUES (:ID, :Record, :VlookupNode, :HostWireLocation)");
    $stmt->bindParam(':ID', $ID);
    $stmt->bindParam(':Record', $Record);
    $stmt->bindParam(':VlookupNode', $VlookupNode);
    $stmt->bindParam(':HostWireLocation', $HostWireLocation);
    // insert a row
    // loop through all values inside the $insert variable??????? how?
    $stmt->execute();
}
catch (PDOException $e)
{
    echo "Error: " . $e->getMessage();
}
$conn = null;
During my research I found an excellent post:
PDO Prepared Inserts multiple rows in single query
One method says I would have to change my $insert variable to include all the field names.
And the other method says I don't have to do that. I am looking at Chris M.'s suggestion:
The Accepted Answer by Herbert Balagtas works well when the $data array is small. With larger $data arrays the array_merge function becomes prohibitively slow. My test file to create the $data array has 28 cols and is about 80,000 lines. The final script took 41s to complete
but I didn't understand what he is doing, and I am trying to adapt my code to his. The PHP syntax is new to me, so I am struggling with handling the arrays, etc.
I guess the starting point would be the variable $insert which contains all the database values I need.
Do I need to modify my $insert variable to include the field names?
Or could I just use its content, extract the values (how?), and include them in a loop statement? (That would probably execute my 13k rows one at a time.)
Thank you
If you have 13k records to insert, it is better for performance not to execute a prepared statement once per row. Just generate one SQL query in a format like this:
INSERT INTO IPXTools.MSSWireList
(ID, Record, VlookupNode, HostWireLocation)
VALUES
('id1', 'r1', 'node1', 'location1'),
('id2', 'r2', 'node2', 'location2'),
...
('id13000', 'r13000', 'node13000', 'location13000');
To do that, you can reuse the manner of your legacy code. Your try block will look like this:
try
{
    $conn = new PDO("mysql:host=$servername;dbname=$dbname", $username, $password);
    $conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $count = $conn->exec($insert); // exec() returns the number of affected rows
}
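One caveat: a single statement carrying 13k rows can exceed MySQL's max_allowed_packet and fail. If that happens, split the work into chunks. A hedged sketch, which assumes the source rows are available as a PHP array $rows of 4-element arrays rather than as the pre-built $insert string (that array is an assumption, not part of the question):
// Hedged sketch: chunked multivalue inserts, 1000 rows per statement.
// $conn is the PDO connection from the try block above.
foreach (array_chunk($rows, 1000) as $chunk) {
    $values = array();
    foreach ($chunk as $r) {
        // quote() escapes each value; $r = array(ID, Record, VlookupNode, HostWireLocation)
        $values[] = '(' . implode(',', array_map(array($conn, 'quote'), $r)) . ')';
    }
    $conn->exec('INSERT INTO IPXTools.MSSWireList (ID, Record, VlookupNode, HostWireLocation) VALUES '
        . implode(',', $values)
        . ' ON DUPLICATE KEY UPDATE Record=VALUES(Record), VlookupNode=VALUES(VlookupNode), HostWireLocation=VALUES(HostWireLocation)');
}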

import csv to mysql without load data infile

I have to import a CSV into a MySQL database.
I can't use LOAD DATA INFILE because it's disabled on the web server.
Are there any other ways to do so?
If you have a scripting language available, you can loop through the CSV lines and have it generate SQL code:
PHP example:
<?php
$lines = file('file.csv');
foreach ($lines as $line) {
    // separates each cell by the delimiter "," (watch for delimiters inside cells, escaped or not)
    $cell = explode(",", $line);
    $sql = "INSERT INTO table (col1,col2,col3) VALUES (";
    $sql .= "'" . $cell[0] . "','" . $cell[1] . "','" . $cell[2] . "');";
    echo $sql;
}
?>
Loop through the file and insert using a prepared query. Prepared queries should be quicker too, since the DB doesn't have to recompile every SQL string you send it. That will be more noticeable when you have thousands and thousands of lines.
<?php
// assume $db is a PDO connection
$stmt = $db->prepare('INSERT INTO table (col1, col2, col3) VALUES (?, ?, ?)');
// read file contents to an array
$lines = file('file.csv');
// insert each line
foreach ($lines as $line) {
    // see the manual to specify $separator, $enclosure, or $escape
    $cols = str_getcsv($line);
    $stmt->execute($cols);
}
That'll work. Since we're using file(), the script can consume a lot of memory if your CSV file is HUGE. To make better use of resources, do the following to keep only one line in memory at a time:
<?php
// assume $db is a PDO connection
$stmt = $db->prepare('INSERT INTO table (col1, col2, col3) VALUES (?, ?, ?)');
$handle = fopen('test.csv', 'r');
while (($cols = fgetcsv($handle)) !== false) {
    $stmt->execute($cols);
}

Inserting multiple rows in a table using PHP

I am trying to insert multiple rows into a MySQL DB using PHP and an HTML form. I know basic PHP and have searched many examples on different forums and created one script; however, it doesn't seem to be working. Can anybody help with this? Here is my script:
include_once 'include.php';
foreach ($_POST['vsr'] as $row => $vsr) {
    $vsr = mysql_real_escape_string($vsr);
    $ofice = mysql_real_escape_string($_POST['ofice'][$row]);
    $date = mysql_real_escape_string($_POST['date'][$row]);
    $type = mysql_real_escape_string($_POST['type'][$row]);
    $qty = mysql_real_escape_string($_POST['qty'][$row]);
    $uprice = mysql_real_escape_string($_POST['uprice'][$row]);
    $tprice = mysql_real_escape_string($_POST['tprice'][$row]);
}
$sql .= "INSERT INTO maint_track (`vsr`, `ofice`, `date`, `type`, `qty`, `uprice`, `tprice`)
         VALUES ('$vsr','$ofice','$date','$type','$qty','$uprice','$tprice')";
$result = mysql_query($sql, $con);
if (!$result) {
    die('Error: ' . mysql_error());
} else {
    echo "$row record added";
}
MySQL can insert multiple rows in a single query. I left your code as close as possible to the original. Keep in mind that if you have a lot of data, this could create a query larger than what MySQL will accept.
include_once 'include.php';
$parts = array();
foreach ($_POST['vsr'] as $row => $vsr) {
    $vsr = mysql_real_escape_string($vsr);
    $ofice = mysql_real_escape_string($_POST['ofice'][$row]);
    $date = mysql_real_escape_string($_POST['date'][$row]);
    $type = mysql_real_escape_string($_POST['type'][$row]);
    $qty = mysql_real_escape_string($_POST['qty'][$row]);
    $uprice = mysql_real_escape_string($_POST['uprice'][$row]);
    $tprice = mysql_real_escape_string($_POST['tprice'][$row]);
    $parts[] = "('$vsr','$ofice','$date','$type','$qty','$uprice','$tprice')";
}
$sql = "INSERT INTO maint_track (`vsr`, `ofice`, `date`, `type`, `qty`, `uprice`, `tprice`)
        VALUES " . implode(', ', $parts);
$result = mysql_query($sql, $con);
Please try this code. Since it is a foreach loop and the values change dynamically, you can move the SQL insert query inside the loop; it will then insert each row with its dynamic values. Please check the code below and let me know if you have any concerns.
include_once 'include.php';
foreach ($_POST['vsr'] as $row => $vsr) {
    $vsr = mysql_real_escape_string($vsr);
    $ofice = mysql_real_escape_string($_POST['ofice'][$row]);
    $date = mysql_real_escape_string($_POST['date'][$row]);
    $type = mysql_real_escape_string($_POST['type'][$row]);
    $qty = mysql_real_escape_string($_POST['qty'][$row]);
    $uprice = mysql_real_escape_string($_POST['uprice'][$row]);
    $tprice = mysql_real_escape_string($_POST['tprice'][$row]);
    $sql = "INSERT INTO maint_track (`vsr`, `ofice`, `date`, `type`, `qty`, `uprice`, `tprice`)
            VALUES ('$vsr','$ofice','$date','$type','$qty','$uprice','$tprice')";
    $result = mysql_query($sql, $con);
    if (!$result)
    {
        die('Error: ' . mysql_error());
    }
    else
    {
        echo "$row record added";
    }
}
I would prefer a more modern approach that creates one prepared statement and binds parameters, then executes within a loop. This provides stable/secure insert queries and avoids making so many escaping calls.
Code:
// switch the procedural connection to object-oriented syntax
$stmt = $con->prepare('INSERT INTO maint_track (`vsr`,`ofice`,`date`,`type`,`qty`,`uprice`,`tprice`)
                       VALUES (?,?,?,?,?,?,?)'); // use ?s as placeholders to declare where the values will be inserted into the query
$stmt->bind_param("sssssss", $vsr, $ofice, $date, $type, $qty, $uprice, $tprice); // assign the value types and variable names to be used when looping
foreach ($_POST['vsr'] as $rowIndex => $vsr) {
    /*
    If you want to conditionally abort/disqualify a row...
    if (true) {
        continue;
    }
    */
    $ofice = $_POST['ofice'][$rowIndex];
    $date = $_POST['date'][$rowIndex];
    $type = $_POST['type'][$rowIndex];
    $qty = $_POST['qty'][$rowIndex];
    $uprice = $_POST['uprice'][$rowIndex];
    $tprice = $_POST['tprice'][$rowIndex];
    echo "<div>Row# {$rowIndex} " . ($stmt->execute() ? 'added' : 'failed') . "</div>";
}
To deny the insertion of a row, use the conditional continue that is commented in my snippet -- of course, write your logic where true is (anywhere before the execute call inside the loop will work).
To adjust submitted values, overwrite the iterated variables (e.g. $vsr, $ofice, etc) before the execute call.
If you'd like to enjoy greater data type specificity, you can replace s (string) with i (integer) or d (double/float) as required.

Store array contents in database

I have a php function getContactList():
$res = getContactList(trim($_POST['username']), trim($_POST['password']));
which returns this array:
$contactList[] = array('name' => $name, 'email' =>$email);
I need to store the contents of this array into a MySQL database.
Can I somehow store the entire contents of the array in the database in one go?
The number of array items will be more than 500 each time, so I want to avoid the usual practice of calling INSERT inside a for loop, as that would take a long time to execute.
Note: I need to store the results in separate columns, one for name and another for email. With 500 items in the array I need 500 rows inserted, one name-email pair per row.
$values = array();
// the above array stores strings such as:
// ('username', 'user#domain.com')
// ('o\'brien', 'baaz#domain.com')
// etc.
foreach ($contactList as $i => $contact) {
    $values[] = sprintf(
        "('%s', '%s')",
        mysql_real_escape_string($contact['name'], $an_open_mysql_connection_identifier),
        mysql_real_escape_string($contact['email'], $an_open_mysql_connection_identifier)
    );
}
$query = sprintf(
    "INSERT INTO that_table (name, email) VALUES %s",
    implode(",", $values)
);
If you are trying to insert many rows, i.e. run the same query many times, use a prepared statement.
Using PDO, you can do this:
// prepare statement
$stmt = $dbh->prepare("INSERT INTO contacts (email, name) VALUES (:email, :name);");
// execute it a few times
foreach ($contactList as $contact) {
    $stmt->bindValue(':email', $contact['email'], PDO::PARAM_STR);
    $stmt->bindValue(':name', $contact['name'], PDO::PARAM_STR);
    $stmt->execute();
}
PDO will also take care of proper string escaping.
Use the standard PHP serialize function:
$serData_str = serialize($contactList);
then save that string in the DB. After reading the data back from the DB, simply unserialize it:
$myData_arr = unserialize($serContactList);
(Note that this stores the whole array in a single column, so it will not give you the separate name and email columns required above.)
In the loop you can build up a single insert statement and execute it after the loop; the statement will include all the content. It would look like:
INSERT INTO employee (name) VALUES ('Arrayelement[1]'), ('Arrayelement[2]'), ('Arrayelement[3]'), ('Arrayelement[4]');
