I'm writing C++ code that performs 30000*30000 MySQL INSERT queries.
Example
for (i = 0; i < 30000 * 30000; i++) {
    // do the MySQL insert: call a function that runs one INSERT (perhaps just inserting the value of i)
}
I am using mysqlstart() and mysqlclose() helper functions:
MYSQL *conn_ptr; /* global connection handle */

void mysqlstart()
{
    conn_ptr = mysql_init(NULL);
    if (!conn_ptr)
    {
        fprintf(stderr, "mysql_init failed\n");
        return;
    }
    /* mysql_real_connect() returns NULL on failure, so check it */
    if (!mysql_real_connect(conn_ptr, "localhost", "root", "nlpgroup", "testdb", 0, NULL, 0))
    {
        fprintf(stderr, "mysql_real_connect failed: %s\n", mysql_error(conn_ptr));
    }
}

void mysqlclose()
{
    mysql_close(conn_ptr);
}
I call the functions like this:

mysqlstart();
// the for loop with the inserts
mysqlclose();
That worked fine at the beginning, but after a few days I got the error: MySQL server has gone away.
I found some suggested fixes on the internet, such as changing max_allowed_packet or other settings,
but I don't know what settings are suitable for 30000*30000 inserts.
Also, is there something I can change in my code, or another way to speed up the queries?
You can insert multiple rows at once in this way
http://dev.mysql.com/doc/refman/5.5/en/insert.html
INSERT statements that use VALUES syntax can insert multiple rows. To
do this, include multiple lists of column values, each enclosed within
parentheses and separated by commas. Example:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
Related
I have a function that is designed to copy a product with all its attributes with the help of SQL queries. My problem is returning new_product_id to PHP after completion.
If I run the SQL script in phpMyAdmin, everything works.
If I run the SQL script with the PHP function, everything works.
What I need help with is how to assign the MySQL session variable @new_product_id from the last query to a PHP variable that I want to return.
----- sql query ------
CREATE TEMPORARY TABLE tmptable SELECT * FROM product WHERE id='19' AND site_id='1';
UPDATE tmptable SET id = 0,parent_id='19',status_id='1',name_internal=concat('NEW ',name_internal);
INSERT INTO product SELECT * FROM tmptable;
SET @new_product_id = LAST_INSERT_ID();
DROP TABLE tmptable;
CREATE TEMPORARY TABLE tmptable SELECT * FROM product_abcd WHERE product_id='19' AND site_id='1';
UPDATE tmptable SET product_id = @new_product_id,id=0;
INSERT INTO product_abcd SELECT * FROM tmptable;
DROP TABLE tmptable;
CREATE TEMPORARY TABLE tmptable SELECT * FROM product_efgh WHERE product_id='19' AND site_id='1';
UPDATE tmptable SET product_id = @new_product_id,id=0;
INSERT INTO product_efgh SELECT * FROM tmptable;
DROP TABLE tmptable;
(more INSERT statements of the same pattern here)
SELECT @new_product_id AS new_product_id;
----- sql query ------
----- php function (not complete)------
This function works and makes a new copy of the product. The code below is not complete, but it works, so please focus only on the multi_query part.
//return 0 for fail or new product_id (!=0) for success
public function copyProduct($data){
    $res = 0;
    //if something, build the sql-query as
    $sql = "sql from above";
    //if we have a query to run
    if (!empty($sql)) {
        //this is a multi query, use the correct function
        if ($this->connect()->multi_query($sql) === TRUE) {
            //loop it
            while ($this->connect()->more_results()) {
                $result = $this->connect()->next_result();
            }//while more results
        }//if multiquery ok
    }//if sql not empty
    return $res;
}//end function copy
----- php function (not complete)------
The above code works: I get a nice copy of the product, with
result = 0 for fail and
result = 1 for success.
How I would like it to work is:
result = 0 for fail and
result = new_product_id for success,
so I can redirect the user to the newly created product and save them one click.
The results from the query are the same from phpMyAdmin as from PHP (all good so far, no incorrect queries at this time):

MySQL returned an empty result set (no rows) (create temporary table)
1 row affected (update tmp table)
1 row inserted (insert into product)
MySQL returned an empty result set (SET @new_product_id)
MySQL returned an empty result set (drop table tmptable)
MySQL returned an empty result set (create temporary table)
x rows affected (update tmp table)
x rows affected (insert into table)
MySQL returned an empty result set (drop table tmptable)
MySQL returned an empty result set (create temporary table)
.... N ....
last query: "Showing rows 0 - 0 (1 total)" (SELECT @new_product_id)
new_product_id = 25
What have I tried?
I placed the SELECT of the variable as my final query. I thought it was smart to only check the last query and assign the variable there, but I failed, because PHP mysqli fetch_assoc is not possible on a non-object.
The next attempt was not so bright: I know I have 16 results from MySQL and I only need the result from one of them, but anyway, I placed this inside the multi_query loop:
----- php function (not complete)------
This function works for making a new copy of the product, but does NOT work for assigning new_product_id.
//return 0 for fail or new product_id (!=0) for success
public function copyProduct($data){
    $res = 0;
    //if something, build the sql-query as
    $sql = "sql from above";
    //if we have a query to run
    if (!empty($sql)) {
        //this is a multi query, use the correct function
        if ($this->connect()->multi_query($sql) === TRUE) {
            //loop it
            while ($this->connect()->more_results()) {
                //insert, update and drop will return false even if the sql is ok; that would be sufficient for us now
                if ($result = $this->connect()->store_result()) {
                    $row = $result->fetch_assoc();
                    if (isset($row["new_product_id"])) {
                        //new return value of the newly copied product
                        $res = $row["new_product_id"];
                    }
                    $result->free();
                }
                $result = $this->connect()->next_result();
            }//while more results
        }//if multiquery ok
    }//if sql not empty
    return $res;
}//end function copy
----- php function (not complete)------
Other questions on Stack Overflow recommended sending multiple normal queries; this seems like a bad solution when multi_query exists.
Checking the PHP documentation for multi_query did me no good: I can't understand how it works, and as many others have pointed out, the documentation seems like a copy from another function.
Remember that multi_query() sends a clump of SQL queries to the MySQL server but waits for the execution of only the first one. If you want to execute SQL using multi_query() and get only the result of the last query, ignoring the previous ones, then you need to perform a blocking loop and buffer the results into a PHP array. Iterate over all results, waiting for MySQL to process each query, and once MySQL responds that there are no more results you can keep the last fetched result.
For example, consider this function. It sends a bunch of concatenated SQL queries to the MySQL server and then waits for MySQL to process each query one by one. Every result is fetched into PHP array and the last available array is returned from the function.
function executeMultiQueryAndGetOnlyLastResult(mysqli $mysqli): array {
    $mysqli->multi_query('
        SELECT "a";
        SELECT 2;
        SELECT "val";
    ');
    $values = [];
    do {
        $result = $mysqli->use_result();
        if ($result) {
            // process the results here
            $values = $result->fetch_all();
            $result->free();
        }
    } while ($mysqli->next_result()); // next_result() will block and wait for the next query to finish on the MySQL server
    $mysqli->store_result(); // needed to fetch the error as an exception
    return $values;
}
Obviously it would be much easier to send each query separately to MySQL instead. multi_query() is very complicated and has very limited use. It can be useful if you have a number of SQL queries which you cannot execute separately via PHP, but most of the time you should be using prepared statements and send each query separately.
Another one bites the dust: I gave up and defined an array of SQL queries from 0 to 14 and ran them with mysqli->query() instead. Thank you all for your comments and your time.
You could try using multi_query() for all the queries in your operation except the last one, the SELECT that returns the id you want. Then run that SELECT as a single query.
This is a robust solution to your problem: @-variables belong to MySQL connections and persist for the lifetimes of those connections.
And it makes for clean and predictable operation of your software. When you need a result set returned to your program, use a single query.
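A minimal sketch of that approach, assuming a mysqli connection in $mysqli and the statements from the question (minus the final SELECT) concatenated in $sql:

//run everything except the final SELECT in one multi_query call
$mysqli->multi_query($sql);
//drain every result set so the connection is ready for the next query
do {
    if ($result = $mysqli->store_result()) {
        $result->free();
    }
} while ($mysqli->next_result());
//@new_product_id persists on this connection, so fetch it with a plain single query
$result = $mysqli->query('SELECT @new_product_id AS new_product_id');
$row = $result->fetch_assoc();
$res = $row['new_product_id'];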
I need to execute my function inside a query like this
$re = $bddp->prepare("SELECT * FROM `shop`, `hours` WHERE isOpen('`hours`.`day1`') = true");
$re->execute();
hours.day1 is a varchar with Monday's opening hours, like this: "10:00-14:00".
The isOpen function tests whether the shop is open or not and returns true or false.
The question is: how can I pass hours.day1 as a variable into the isOpen function in WHERE isOpen('hours.day1')?
Is it not possible to use PDO prepare or execute for this?
In an SQL query you can only use MySQL native functions, stored functions/procedures, and user-defined functions (UDFs); you cannot call a PHP function from inside the query.
I think you would not have this problem if the table structure were right (start and end times in separate columns). Then you would be able to achieve your goal with just a few conditions in the WHERE clause.
If the amount of data (row count) is not big, you can just select all rows and do the validation on the PHP side, as sketched below.
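A minimal sketch of the PHP-side variant; the isOpenNow() helper and the join columns are hypothetical, and $bddp is the PDO connection from the question:

//hypothetical PHP-side check for "10:00-14:00" style ranges
function isOpenNow($range) {
    list($start, $end) = explode('-', $range);
    $now = date('H:i');
    //zero-padded HH:MM strings compare correctly as strings
    return $now >= $start && $now <= $end;
}

$re = $bddp->prepare("SELECT * FROM `shop` JOIN `hours` ON `hours`.`shop_id` = `shop`.`id`");
$re->execute();
$openShops = array();
foreach ($re->fetchAll(PDO::FETCH_ASSOC) as $row) {
    if (isOpenNow($row['day1'])) {
        $openShops[] = $row;
    }
}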
I have to INSERT thousands of records.
I use mysqli::multi_query() in a loop and want to 'multi_query' blocks of n queries (where n is a parameter).
The first INSERT goes OK; the second one goes wrong, because I have to consume the results like this:
while ($mysqli->more_results())
{
    $mysqli->next_result();
    if ($res = $mysqli->store_result())
    {
        $res->free();
    }
}
The problem is that this check is slow.
The question is: how can I optimize this bulk INSERT by making the result handling faster?
If you are inserting into a single table, just use batch mode in INSERT.
Example:
INSERT INTO TABLE (field1,field2,field3) VALUES (value1,value2,value3),(value4,value5,value6)
Use for or foreach in PHP to build the query and then simply use mysqli_query, as sketched below. Judging by the errors you mentioned in the comment section, you are violating key constraints, meaning you are trying to insert the same value twice.
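A minimal sketch of building such a batch (the table and column names are placeholders, and $mysqli is assumed to be an open mysqli connection):

//collect one "(...)" group per row, escaping each value
$values = array();
foreach ($rows as $row) { //$rows: array of array(field1, field2, field3)
    $values[] = "('" . $mysqli->real_escape_string($row[0]) . "','"
              . $mysqli->real_escape_string($row[1]) . "','"
              . $mysqli->real_escape_string($row[2]) . "')";
}
//one single INSERT for the whole batch
$mysqli->query("INSERT INTO TABLE_NAME (field1,field2,field3) VALUES " . implode(',', $values));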
If you are inserting into different tables, you can use the multi_query interface.
Prepare one query string like Insert1;Insert2;
and use multi_query like this:
$con->multi_query($query);
while ($con->more_results()) {
    $con->use_result();
    $con->next_result();
}
I'm downloading large sets of data via an XML Query through PHP with the following scenario:
- Query for records 1-1000 and download all parts (1000 parts is roughly 4.5 MB of text), then store those in memory while I query the next 1001-2000, and so on (up to potentially 400k records).
I'm wondering whether it would be better to write these entries to a text file rather than storing them in memory, and then, once the complete download is done, insert them all into the DB, or to write them to the DB as they come in.
Any suggestions would be greatly appreciated.
Cheers
You can run a query like this:
INSERT INTO table (id, text)
VALUES (null, 'foo'), (null, 'bar'), ..., (null, 'value no 1000');
Doing this, you'll do the whole thing in one shot, and the parser will be called only once. The best you can do is measure it, for example with MySQL's BENCHMARK function: compare 1000 runs of a query that inserts 1000 records against 1,000,000 inserts of one record each.
(Sorry about the previous answer; I misunderstood the question.)
I think you should write them to the database as soon as you receive them, as sketched below. This will save memory, and you won't have to execute a 400-times-slower query at the end. You will need a mechanism to deal with any problems that may occur in this process, like a disconnection after 399k results.
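A rough sketch of that idea; getNextBatch(), the downloads table and the retry logic are hypothetical placeholders:

$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
//write each batch as soon as it arrives, reconnecting if the server drops us
while ($batch = getNextBatch()) { //hypothetical: returns up to 1000 records, or false when done
    $values = array();
    foreach ($batch as $record) {
        $values[] = "('" . $mysqli->real_escape_string($record) . "')";
    }
    $sql = 'INSERT INTO downloads (part) VALUES ' . implode(',', $values);
    if (!$mysqli->query($sql) && $mysqli->errno === 2006) { //2006 = server has gone away
        $mysqli = new mysqli('localhost', 'user', 'pass', 'db'); //reconnect and retry once
        $mysqli->query($sql);
    }
}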
In my experience it would be better to download everything in a temporary area and then, when you are sure that everything went well, to move the data (or the files) in place.
As you are using a database you may want to dump everything into a table, something like this code:
$error = false;
while (($row = getNextRow($db)) && !$error) {
    $sql = "insert into temptable (`key`, `value`) values ('$row[0]', '$row[1]')";
    if (mysql_query($sql)) {
        echo '#';
    } else {
        $error = true;
    }
}
if (!$error) {
    $sql = "insert into myTable select * from temptable";
    if (mysql_query($sql)) {
        echo 'Finished';
    } else {
        echo 'Error';
    }
}
Alternatively, if you know the table well, you can add a "new" flag field for newly inserted rows and update everything when you are finished, as in the sketch below.
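A minimal sketch of the flag-field idea; the is_new column and the table layout are hypothetical:

//assumes myTable has an extra TINYINT column `is_new` defaulting to 0
//1. insert incoming rows flagged as new
mysql_query("insert into myTable (`value`, is_new) values ('$value', 1)");
//2. when the whole download has finished successfully, clear the flag
mysql_query("update myTable set is_new = 0");
//if something went wrong instead, throw the partial data away:
//mysql_query("delete from myTable where is_new = 1");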
I often have large arrays, or large amounts of dynamic data in PHP that I need to run MySQL queries to handle.
Is there a better way to run many processes like INSERT or UPDATE without looping through the information to be INSERT-ed or UPDATE-ed?
Example (I didn't use a prepared statement for brevity's sake):
$myArray = array('apple','orange','grape');
foreach ($myArray as $arrayFruit) {
    $query = "INSERT INTO `Fruits` (`FruitName`) VALUES ('" . $arrayFruit . "')";
    mysql_query($query, $connection);
}
OPTION 1
You can actually send multiple queries at once. Note that the old mysql_query() function refuses query strings containing multiple statements, so this requires mysqli::multi_query():

$queries = '';
foreach ($rows as $row) {
    $queries .= "INSERT....;"; //notice the semicolon
}
$mysqli->multi_query($queries);
This would save on your processing.
OPTION 2
If your insert is that simple for the same table, you can do multiple inserts in ONE query
$fruits = "('".implode("'), ('", $fruitsArray)."')";
mysql_query("INSERT INTO Fruits (Fruit) VALUES $fruits", $connection);
The query ends up looking something like this:
$query = "INSERT INTO Fruits (Fruit)
VALUES
('Apple'),
('Pear'),
('Banana')";
This is probably the way you want to go.
If you have the mysqli class, you can iterate over the values to insert using a prepared statement.
$sth = $dbh->prepare("INSERT INTO Fruits (Fruit) VALUES (?)");
foreach ($fruits as $fruit) {
    $sth->reset(); // make sure we are fresh from the previous iteration
    $sth->bind_param('s', $fruit); // bind one or more variables to the query
    $sth->execute(); // execute the query
}
One thing to note about your original solution, compared to the implosion method of jerebear (which I have used before, and love), is that it is easier to read. The implosion takes more programmer brain cycles to understand, which can be more expensive than processor cycles. Premature optimisation, blah, blah, blah... :)
One thing to note about jerebear's answer with multiple VALUE-blocks in one INSERT:
It can be rather dangerous for really large amounts of data, because most DBMSs have an upper limit on the size of the commands they can handle. If you exceed that with too many VALUE-blocks, your insert will fail. On MySQL, for example, the limit (max_allowed_packet) is usually 1 MB, AFAIK.
So you should figure out what the maximum size is (ideally at runtime; it may be available from the database metadata), and make sure you don't exceed it by spreading your lists of values over several INSERTs, as in the sketch below.
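A minimal sketch of checking that limit at runtime and chunking accordingly (assuming a mysqli connection in $mysqli and the Fruits table from above):

//read the server's packet limit at runtime
$row = $mysqli->query("SELECT @@max_allowed_packet AS p")->fetch_assoc();
$maxPacket = (int)$row['p'];

//build VALUE-blocks, flushing whenever the next block would exceed the limit
$prefix = "INSERT INTO Fruits (Fruit) VALUES ";
$sql = $prefix;
foreach ($fruits as $fruit) {
    $block = "('" . $mysqli->real_escape_string($fruit) . "'),";
    if ($sql !== $prefix && strlen($sql) + strlen($block) >= $maxPacket) {
        $mysqli->query(rtrim($sql, ',')); //flush the current chunk
        $sql = $prefix;
    }
    $sql .= $block;
}
if ($sql !== $prefix) {
    $mysqli->query(rtrim($sql, ',')); //flush the final chunk
}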
I was inspired by jerebear's answer to build something like his second option for one of my current projects. Because of the sheer volume of records I couldn't save and handle all the data at once. So I built this to do imports. You add your data, and then call a method when each record is done. After a certain, configurable, number of records the data in memory will be saved with a mass insert like jerebear's second option.
// CREATE TABLE example ( Id INT, Field1 INT, Field2 INT, Field3 INT);
$import=new DataImport($dbh, 'example', 'Id, Field1, Field2, Field3');
foreach ($whatever as $row) {
// add data in the order of your column definition
$import->addValue($Id);
$import->addValue($Field1);
$import->addValue($Field2);
$import->addValue($Field3);
$import->nextRow();
}
$import->lastRow();