I'm trying to load data from a few hundred text files into a database. I believe MySQL is exiting the loop without inserting all the rows. Can anyone suggest, with PHP code, how to insert blocks of 1,000 rows at a time until the end of the data is reached?
$filenames_array = array(); // populated with the paths of the text files
foreach ($filenames_array as $filename) {
    $file_array = file($filename);
    $file_value = $file_array[0];
    $new_array = explode(",", $file_value);
    $length = count($new_array);
    for ($i = 0; $i < $length; $i++) {
        $sql = "INSERT INTO `names`
                (`id`, `name`)
                VALUES
                ('',
                 '" . $new_array[$i] . "'
                )";
        $result = mysql_query($sql) or die(mysql_error());
        echo $i . ' Row Inserted<br />';
    }
}
You're probably running too many separate INSERT statements, one query per row. Look into PDO and prepared statements, or use multi-row SQL syntax like this:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
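For example, a minimal sketch of batching 1,000 rows per statement with a PDO prepared statement (assuming a PDO connection in $pdo and the $new_array of names from the question; `id` is taken to be AUTO_INCREMENT, so it can be omitted):
<?php
// Hypothetical sketch: insert names in batches of 1000 rows per statement.
$chunks = array_chunk($new_array, 1000);
foreach ($chunks as $chunk) {
    // Build one (?) placeholder per value: (?),(?),(?)...
    $placeholders = implode(',', array_fill(0, count($chunk), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO `names` (`name`) VALUES " . $placeholders);
    // Positional binding handles quoting, so quotes inside names are safe.
    $stmt->execute($chunk);
}
?>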
Is it possible that one of the entries you're trying to insert contains a single quote (')? In that case an error would occur and the loop wouldn't finish. You should always escape the values you insert into the database with mysql_real_escape_string to prevent problems like that, and to make sure you're not vulnerable to SQL injection:
$sql = "INSERT INTO `names`
(`id`, `name`)
VALUES
('',
'" . mysql_real_escape_string($new_array[$i]) . "'
)";
Why not combine every txt file into one big text file, and read it line by line? See the examples here http://php.net/manual/en/function.fgets.php
Mainly:
<?php
$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
?>
Is it possible to insert multiple rows with a delay (a sleep of a few seconds) between them? For example, here I'm going to insert the values (1,2) into my User table. Then, after 5 seconds, it will insert (3,4) into the same table, wait another 5 seconds, and finally insert (5,6).
INSERT INTO User (col1, col2)
VALUES (1, 2), (3, 4), (5, 6)
Any suggestions are really appreciated!
<?php
$array = array(1, 2, 3, 4);
foreach ($array as $data) {
    mysqli_query($conn, "INSERT INTO ... SET data='" . $data . "'");
    sleep(1); // pause between inserts
}
?>
Something like that? I did not understand the question very well.
I found the solution. Maybe it's a bit messy, but this is what I came up with and it solved my problem. Basically, I just need to use prepare if I want to reuse the statement.
$stmt = $dbh->prepare("INSERT INTO user (col1, col2) VALUES ('1','2')");
$stmt2 = $dbh->prepare("INSERT INTO user (col1, col2) VALUES ('3','4')");
$stmt3 = $dbh->prepare("INSERT INTO user (col1, col2) VALUES ('5','6')");
$stmt->execute();
sleep(5);
$stmt2->execute();
sleep(5);
$stmt3->execute();
That's all. Thanks for those who tried to solve this.
What I do is build the query by concatenation, like this. If you want multiple inserts, you can call the function in a loop; either remove the sleep from the function and put it in the loop outside, or leave it there if the call is not in a thread.
<?php
static function insert_table_db($anArray) {
    $dbconn = mysqli_connect(DB_HOST, DB_USER, DB_PASSWORD, DB_DATABASE)
        or die('MySQL connection failed!' . mysqli_connect_error());
    mysqli_set_charset($dbconn, "utf8");
    $query = "INSERT INTO `table`(`col1`, `col2`, `col3`, `col4`, `col5`) VALUES ";
    $i = 1;
    foreach ($anArray as $item) {
        $query = $query . "(" . $item[0] . "," . $item[1] . "," . $item[2] . "," . $item[3] . ",'" . $item[4] . "')";
        // Terminate the statement after the last row; otherwise separate rows with a comma
        if (count($anArray) == $i) {
            $query = $query . ";";
        } else {
            $query = $query . ",";
        }
        $i++;
    }
    // Run query
    if ($dbconn->query($query) === true) {
        $dbconn->close();
    } else {
        echo "Database Error: " . $dbconn->error;
        $dbconn->close();
    }
    sleep(5);
}
?>
UPDATE: Sorry if some variable names don't make sense; I stripped this out of a library I have built, and there are some extra things I didn't delete ;D.
I have the following code, which works fine, but I need a way to batch process it and insert it all in one go.
for ($i = 0; $i < sizeof($my_array); $i++) {
    $sql = "INSERT INTO exclude_resource (id,resource_id,config_id)
            VALUES(DEFAULT, '$my_array[$i]', '$insert_id')";
    $command = $connection->createCommand($sql);
    $result = $command->execute();
}
Your query should resemble something like this (see the MySQL docs):
INSERT INTO table (field1, field2, field3)
VALUES
(value1a, value2a, value3a),
(value1b, value2b, value3b),
(value1c, value2c, value3c),
...
So put the values in an array, join them with commas, and execute the resulting query. Incorporated in PHP:
$values = array();
for ($i = 0; $i < sizeof($my_array); $i++) {
    $values[] = "(DEFAULT, '$my_array[$i]', '$insert_id')";
}
$sql = "INSERT INTO exclude_resource (id,resource_id,config_id) VALUES " .
    join(',', $values);
$command = $connection->createCommand($sql);
$result = $command->execute();
$insert_id needs to have a value, but that's the same as in your code snippet. If you have more than 5k or 10k rows to insert, you should run an INSERT in between and reset the array.
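For example, a rough sketch of that flush-and-reset pattern, assuming the same Yii-style $connection and a batch size of 5,000:
$values = array();
foreach ($my_array as $resource_id) {
    $values[] = "(DEFAULT, '$resource_id', '$insert_id')";
    // Flush every 5000 rows so no single statement grows too large.
    if (count($values) == 5000) {
        $sql = 'INSERT INTO exclude_resource (id,resource_id,config_id) VALUES ' . join(',', $values);
        $connection->createCommand($sql)->execute();
        $values = array();
    }
}
if (count($values) > 0) { // insert the leftover rows
    $sql = 'INSERT INTO exclude_resource (id,resource_id,config_id) VALUES ' . join(',', $values);
    $connection->createCommand($sql)->execute();
}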
So, I've got a few txt files, each containing around 400,000 lines. Each line is a word that I need to add to my database, if it isn't in there already. Currently my code for checking/adding every word is:
$sql = mysql_sql("SELECT `id` FROM `word_list` WHERE `word`='{$word}' LIMIT 1");
$num = mysql_num($sql);
if($num == '0'){
$length = strlen($word);
$timestamp = time();
#mysql_sql("INSERT INTO `word_list` (`word`, `length`, `timestamp`) VALUES ('{$word}', '{$length}', '{$timestamp}')");
}
and the functions being called are:
function mysql_sql($sql){
global $db;
$result = $db->query($sql);
return $result;
}
function mysql_num($result){
return $result->num_rows;
}
I'm looking for a better way to insert each word into the database.
Any ideas would be greatly appreciated.
I can think of some ways to do this.
First, if you have access to the MySQL server's file system you can use LOAD DATA INFILE to create a new table, then do an insert from that new table into your word_list table. This will most likely be your fastest option.
Second (if you don't have access to the MySQL server's file system), put a primary key or unique index on word_list.word. Then get rid of your SELECT query and use INSERT IGNORE INTO word_list .... That lets MySQL skip the duplicate items automatically, without a separate query/insert round trip for each word.
Third, if your table uses a storage engine that supports transactions (InnoDB, not MyISAM), issue a BEGIN; statement before you start your insert loop. Then, every couple of hundred rows, issue COMMIT; BEGIN;, and at the end issue a final COMMIT;. This wraps your operations in multi-row transactions and will speed things up a lot.
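A rough sketch of the second and third options combined, assuming a UNIQUE index on word_list.word, an InnoDB table, and a hypothetical $words array holding the words from one file ($db here is the mysqli handle from the question's functions):
$db->query("BEGIN");
$i = 0;
foreach ($words as $word) {
    $length = strlen($word);
    $timestamp = time();
    $word = $db->real_escape_string($word);
    // INSERT IGNORE silently skips words already in the table,
    // so the per-word SELECT is no longer needed.
    $db->query("INSERT IGNORE INTO `word_list` (`word`, `length`, `timestamp`)
                VALUES ('$word', '$length', '$timestamp')");
    // Commit every few hundred rows to keep transactions a manageable size.
    if (++$i % 500 == 0) {
        $db->query("COMMIT");
        $db->query("BEGIN");
    }
}
$db->query("COMMIT");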
Try out this code. It first builds one query containing all your values, so you run the INSERT only ONCE, not again and again for every row:
$values = array();
foreach ($words as $word) { // loop over the words read from the file
    $sql = mysql_sql("SELECT `id` FROM `word_list` WHERE `word`='{$word}' LIMIT 1");
    $num = mysql_num($sql);
    if ($num == '0') {
        $length = strlen($word);
        $timestamp = time();
        $values[] = "('$word', '$length', '$timestamp')";
    }
}
$insert_query = "INSERT INTO `word_list` (`word`, `length`, `timestamp`) VALUES ";
$insert_query .= implode(', ', $values);
@mysql_sql($insert_query);
I have the following script, which uploads the data from a CSV file to my database. A problem occurs, though, when one of the fields in the CSV contains an apostrophe (').
Sample data from CSV:
"12345","John","Smith","john.smith#gmail.com","Company Name"
"12346","Joe","Blogg","joe.blogg#gmail.com","Company's Name"
Code I'm using:
<?php
$link = mysql_connect("localhost", "######", "######") or die("Could not connect: " . mysql_error());
$db = mysql_select_db("######") or die(mysql_error());
$row = 1;
$handle = fopen("file.csv", "r");
while ($data = fgetcsv($handle, 1000, ",")) {
    $query = "INSERT INTO suppliers(`regid`, `firstname`, `lastname`, `email`, `company`) VALUES('" . $data[0] . "', '" . $data[1] . "', '" . $data[2] . "', '" . $data[3] . "', '" . $data[4] . "') ON DUPLICATE KEY UPDATE regid='" . $data[0] . "', firstname='" . $data[1] . "', lastname='" . $data[2] . "', email='" . $data[3] . "', company='" . $data[4] . "'";
    $result = mysql_query($query) or die("Invalid query: " . mysql_error() . __LINE__ . __FILE__);
    $row++;
}
fclose($handle);
?>
Can anyone suggest a solution to get around this?
Many thanks
The best solution would be to upgrade to PDO or mysqli, and make use of their parametrized queries.
If you can't, you should escape the data before inserting it into queries:
while ($data = fgetcsv($handle, 1000, ",")) {
    $data = array_map('mysql_real_escape_string', $data);
    $query = "INSERT INTO suppliers(`regid`, `firstname`, `lastname`, `email`, `company`) VALUES('" . $data[0] . "', '" . $data[1] . "', '" . $data[2] . "', '" . $data[3] . "', '" . $data[4] . "') ON DUPLICATE KEY UPDATE regid='" . $data[0] . "', firstname='" . $data[1] . "', lastname='" . $data[2] . "', email='" . $data[3] . "', company='" . $data[4] . "'";
    $result = mysql_query($query) or die("Invalid query: " . mysql_error() . __LINE__ . __FILE__);
    $row++;
}
You should always use mysql_real_escape_string on any user-supplied data, both to protect against SQL injection and to deal with syntax problems like this.
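For reference, a minimal sketch of the PDO version mentioned above (the DSN and credential placeholders are hypothetical; the columns match the query in the question):
$pdo = new PDO('mysql:host=localhost;dbname=######;charset=utf8', '######', '######');
$stmt = $pdo->prepare(
    "INSERT INTO suppliers (`regid`, `firstname`, `lastname`, `email`, `company`)
     VALUES (?, ?, ?, ?, ?)
     ON DUPLICATE KEY UPDATE firstname = VALUES(firstname), lastname = VALUES(lastname),
         email = VALUES(email), company = VALUES(company)");
while ($data = fgetcsv($handle, 1000, ",")) {
    // Bound parameters are quoted by the driver, so apostrophes in
    // fields like "Company's Name" are handled automatically.
    $stmt->execute($data);
}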
I would use LOAD DATA for this.
LOAD DATA LOCAL INFILE 'file.csv' REPLACE INTO TABLE suppliers
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
(regid, firstname, lastname, email, company);
No need to fopen(), fgetcsv(), or execute INSERT so many times. No need to worry about apostrophes or other special characters.
If you want to carry on using mysql_query as described, I'd suggest running $data through array_map to escape the single quotes. Something like:
$data = array_map(function ($string) {
    return str_replace("'", "\'", $string);
}, $data);
The best way to do this is to use the LOAD DATA INFILE statement to bulk load the CSV into MySQL:
LOAD DATA INFILE 'csvfile' INTO TABLE suppliers FIELDS TERMINATED BY ',' ENCLOSED BY '"'
The added advantage is that you don't need to escape your strings at all; LOAD DATA does it for you. Let me know if you have further questions.
I have a text file to read with around 10,000 points, one comma-separated pair per line:
x1,y1
x2,y2
x3,y3
...
(10,000 lines in total.) I read them in a loop in PHP, store them in an array, then loop again and insert one row at a time into my database. It takes a really long time. Is there any way I can insert the whole array at once?
for ($i=0; $i<10000; $i++)
{
$sql = '
INSERT INTO `firefly`.`FreeFormPoly`
(`markedObjectID`, `order`, `x`, `y`)
VALUES
('.$markedObjectsID.', '.$order.', '.$valuesx[i].','.$valuesy[i].')';
$db->query($sql, $markedObjectsID, $order, $values[1], $values[0]);
}
Try using a multi-row INSERT statement: generate one INSERT and submit the entire statement.
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
SO:
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES';
for ($i = 0; $i < 10000; $i++) {
    if ($i != 0) $sql .= ',';
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . ')';
}
$db->query($sql);
I would do something like this:
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES';
for ($i = 0; $i < $length; $i++) {
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . '),';
}
$sql = substr($sql, 0, -1);
$db->query($sql);
Explanation:
The syntax to enter multiple records is
INSERT INTO TABLE_NAME VALUES(VAL1, VAL2, ....), (...), (...);
In the SQL you are concatenating (val1,val2,val3), every time the loop executes, hence you get an extra , in the last position, and substr() trims it off.
More preferably, I would do:
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . '),';
}
$sql = substr($sql, 0, -1);
$result = mysqli_query($db, $sql)
    or die('Error in querying the database');
You should be able to speed it up considerably by sending BEGIN TRANSACTION before the loop and COMMIT after it. I once had to insert 14,000 data points into SQLite and it took 20 minutes, but when I loaded the data inside a transaction it completed in 0.3 seconds.
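For instance, a minimal sketch against the asker's loop, under the assumption that the same $db wrapper accepts raw SQL statements:
$db->query('BEGIN');
for ($i = 0; $i < 10000; $i++) {
    $sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`)
            VALUES (' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . ')';
    $db->query($sql);
}
$db->query('COMMIT'); // all 10000 rows are flushed in a single commit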
First off, you can use prepared statements to reduce the network overhead. Assuming PDO:
$stmt = $db->prepare('INSERT INTO mytable (foo, bar, baz) VALUES (:foo, :bar, :baz)');
for ($i = 0; $i < $length; ++$i) {
    $stmt->execute(array(
        ':foo' => $data[$i]['foo'],
        ':bar' => $data[$i]['bar'],
        ':baz' => $data[$i]['baz'],
    ));
}
Second, you could wrap the whole loop inside $db->beginTransaction() and $db->commit().
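Put together, a minimal sketch of both suggestions (same hypothetical table and column names as above):
$db->beginTransaction();
$stmt = $db->prepare('INSERT INTO mytable (foo, bar, baz) VALUES (:foo, :bar, :baz)');
for ($i = 0; $i < $length; ++$i) {
    $stmt->execute(array(
        ':foo' => $data[$i]['foo'],
        ':bar' => $data[$i]['bar'],
        ':baz' => $data[$i]['baz'],
    ));
}
$db->commit(); // one flush to disk instead of one per row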