PHP script with MySQL query timing out

I am having issues running a PHP script that inserts data into MySQL. The error I get is "504 Gateway Time-out (nginx)". By the time the page hits this timeout, 10,102 rows of data have been entered into the database. I'm planning to insert 160,000 rows in one run of the script.
I have made my code more efficient by using a prepared statement for the SQL.
The SQL is also set up in this structure:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
I have read the SO questions PHP script times out and How to keep a php script from timing out because of a long mysql query.
I have tried adding the following to the start of my code, but it doesn't seem to make a difference:
set_time_limit(0);
ignore_user_abort(1);
Can anyone show me how to split the dataset into chunks and insert the data chunk by chunk?
The section of code that inserts into MySQL is shown below.
// prepare and bind
$stmt = $link->prepare("INSERT INTO MyGuests (`eventID`,`location`,`date`,`barcode`,`runner`,`time`,`Run Points`,`Volunteer Points`,`Gender`, `Gender pos`) VALUES (?,?,?,?,?,?,?,?,?,?)");
$stmt->bind_param("isssssiisi", $eventID, $location, $date, $barcode, $runner, $time, $runpoints, $volpoints, $gender, $genderpos);
// set parameters and execute
for ($x = 0; $x < count($array_runner); $x++) {
    $eventID = null;
    $barcode = $array_barcode[$x];
    $runner = $array_runner[$x];
    $time = $array_time[$x];
    $runpoints = $array_score[$x];
    $volpoints = ' ';
    $gender = $array_gender[$x];
    $genderpos = $array_gender_pos[$x];
    $stmt->execute();
}
$stmt->close();
$link->close();
I am new to working with MySQL and am looking for some guidance with this problem.

set_time_limit() resets the timeout counter when it is executed. It does not change max_execution_time in php.ini, so to make it have any useful effect you would have to call it inside the loop.
// prepare and bind
$stmt = $link->prepare("INSERT INTO MyGuests (`eventID`,`location`,`date`,`barcode`,`runner`,`time`,`Run Points`,`Volunteer Points`,`Gender`, `Gender pos`) VALUES (?,?,?,?,?,?,?,?,?,?)");
$stmt->bind_param("isssssiisi", $eventID, $location, $date, $barcode, $runner, $time, $runpoints, $volpoints, $gender, $genderpos);
// set parameters and execute
for ($x = 0; $x < count($array_runner); $x++) {
    $eventID = null;
    $barcode = $array_barcode[$x];
    $runner = $array_runner[$x];
    $time = $array_time[$x];
    $runpoints = $array_score[$x];
    $volpoints = ' ';
    $gender = $array_gender[$x];
    $genderpos = $array_gender_pos[$x];
    $stmt->execute();
    // every 5000 times through the loop, reset the timeout
    if ($x % 5000 == 0) {
        set_time_limit(30);
    }
}
$stmt->close();
$link->close();
Of course you can play with the value 5000 so it does the reset less often.
From the Manual:
When called, set_time_limit() restarts the timeout counter from zero. In other words, if the timeout is the default 30 seconds, and 25 seconds into script execution a call such as set_time_limit(20) is made, the script will run for a total of 45 seconds before timing out.

If you run a single-row query inside a loop over such a large number of rows, it will definitely get stuck.
The best approach I can suggest is to collect all the data to be inserted into one PHP string and then fire a single query to insert it.
Let me elaborate:
$data_to_insert = ''; // will contain all the data to be inserted
$count = 1;
$eventID = 'NULL'; // the SQL NULL literal, since it is null for all rows
for ($x = 0; $x < count($array_runner); $x++)
{
    if ($count == 1) // checking if it is the first value to be inserted
    {
        $data_to_insert .= "(";
        $count = 2;
    }
    else // second value onwards
    {
        $data_to_insert .= ",(";
    }
    $data_to_insert .= $eventID . ",";
    $data_to_insert .= "'" . $array_barcode[$x] . "',";
    $data_to_insert .= "'" . $array_runner[$x] . "'";
    $data_to_insert .= ")";
}
// at the end, $data_to_insert should look like this:
// $data_to_insert = (eventid1, 'barcode1', 'runner1'),(eventid2, 'barcode2', 'runner2') and so on...
Then fire the query:
mysqli_query($link, "INSERT INTO MyGuests (`eventID`,`barcode`,`runner`) VALUES " . $data_to_insert);
// which would look like
// INSERT INTO MyGuests (`eventID`,`barcode`,`runner`) VALUES (eventid1, 'barcode1', 'runner1'),(eventid2, 'barcode2', 'runner2')
Note: there might be some syntax errors in my code, but you get the logic here.
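To answer the chunking part of the question directly, here is a sketch that combines the multi-row VALUES approach with the prepared statement the question already uses. It assumes the same $link connection and $array_* inputs from the question; only three of the columns are shown for brevity, and the argument-unpacking call to bind_param assumes PHP 5.6 or later.
<?php
// A sketch: insert in chunks of 1000 rows, one multi-row prepared statement per chunk.
$chunkSize = 1000;
$total = count($array_runner);

for ($offset = 0; $offset < $total; $offset += $chunkSize) {
    $rows = min($chunkSize, $total - $offset);

    // Build "(NULL,?,?)" once per row in this chunk; eventID stays NULL.
    $placeholders = implode(',', array_fill(0, $rows, '(NULL,?,?)'));
    $stmt = $link->prepare("INSERT INTO MyGuests (`eventID`,`barcode`,`runner`) VALUES $placeholders");

    // Flatten this chunk's values into one flat list: barcode1, runner1, barcode2, runner2, ...
    $types = str_repeat('ss', $rows);
    $params = array();
    for ($x = $offset; $x < $offset + $rows; $x++) {
        $params[] = $array_barcode[$x];
        $params[] = $array_runner[$x];
    }
    $stmt->bind_param($types, ...$params); // argument unpacking needs PHP 5.6+
    $stmt->execute();
    $stmt->close();

    set_time_limit(30); // reset the timer after every chunk, as in the first answer
}
With 160,000 rows this issues 160 statements instead of 160,000, which should keep the request well clear of nginx's gateway timeout.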

Related

More than 100 rows not inserting in PostgreSQL using PHP through form

I'm trying to insert more than 100 rows into a PostgreSQL database using PHP in a loop, and I am not getting any errors.
The data gets inserted when I try to add around 50 to 60 records, but when there are around 100 records or more, nothing is inserted.
Below is the code I tried. Please go through it and help me solve this issue.
Thanks in advance.
<?php
$userid = $_SESSION['user_id'];
$array = '';
$resultAgain = '';
if (isset($_POST['save'])) {
    // $sponsorship_id = $_POST['sponsorid'];
    $resultAgain = array();
    $resultAgain = $_SESSION['arr_rows'];
    for ($i = 0; $i < count($resultAgain); ++$i) {
        $recieptid = $resultAgain[$i]['recieptid'];
        $childid = $resultAgain[$i]['childid'];
        $openingbalance_fee = $resultAgain[$i]['openingbalance_fee'];
        $openingbalance_familyhelp = $resultAgain[$i]['openingbalance_familyhelp'];
        $mayreciept = $resultAgain[$i]['mayreciept'];
        $december_reciept = $_POST['decreciept' . $resultAgain[$i]['presentclass']];
        $adminfees = $_POST['adminfees' . $resultAgain[$i]['presentclass']];
        $schoolfee = $_POST['schoolfee' . $resultAgain[$i]['presentclass']];
        $familyhelp = $resultAgain[$i]['family_help'];
        $year = $_POST['yearName'];
        $submit = $_POST['save'];
        // call the insert function
        $sql1 = "SELECT fn_reciept_insert($childid,
            '$openingbalance_fee',
            '$openingbalance_familyhelp',
            '$mayreciept',
            '$december_reciept',
            '$adminfees',
            '$familyhelp',
            '$schoolfee',
            '$year',
            $userid,
            localtimestamp,
            $userid,
            localtimestamp)";
        $result1 = pg_query($dbconn, $sql1);
        if (!$result1) {
            echo '<script>alertMX("Data Not Updated")</script>';
        } else {
            echo '<script>alertMX("Data inserted Successfully")</script>';
        }
    }
}
?>
First, make sure you are closing that DB connection at the end of your script. Then, if you keep having problems, try increasing max_connections: https://wiki.postgresql.org/wiki/Tuning_Your_PostgreSQL_Server#max_connections
ALTER SYSTEM SET max_connections TO 300;
and restart postgresql server.
Also, the problem may not be the amount of data you try to INSERT. You are creating the query from unescaped data. What does that mean? If one of the variables you put into that big query string contains a ' character, the query string gets messed up, and that is a big security problem (read about SQL injection). The bigger the amount of data, the bigger the chance that a ' will appear. You should escape your data.
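As a sketch of what that looks like here, the query in the loop above could pass the values as parameters with pg_query_params() instead of interpolating them. fn_reciept_insert and the variables are from the question; the parameter order is an assumption based on the original string.
// PostgreSQL binds each $N parameter itself, so a stray ' in the data
// can no longer break the statement or inject SQL.
$sql1 = 'SELECT fn_reciept_insert($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, localtimestamp, $11, localtimestamp)';
$result1 = pg_query_params($dbconn, $sql1, array(
    $childid,
    $openingbalance_fee,
    $openingbalance_familyhelp,
    $mayreciept,
    $december_reciept,
    $adminfees,
    $familyhelp,
    $schoolfee,
    $year,
    $userid,
    $userid,
));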

Multiple row inserts as fast as possible

I've seen multiple threads discussing this, but the answers always reach totally different conclusions. In particular, I wonder whether it is really necessary to create my own prepared statement (with the right number of placeholders) in order to insert everything as a single query. I expected that when I use beginTransaction and endTransaction before and after my for loop, PDO/PHP would hold the data until everything is collected and send it as a single query once the server hits the endTransaction line.
How would I need to rewrite such a for-loop insert with multiple inserts in order to reach the best performance? (It usually has between 1 and 300 rows, but it could also reach 2000 rows.)
for ($i = 0; $i < $baseCount; $i++)
{
    $thLevel = $bases[$i]["ThLevel"];
    $gold = $bases[$i]["Gold"];
    $elixir = $bases[$i]["Elixir"];
    $darkElixir = $bases[$i]["DarkElixir"];
    $dateFound = $bases[$i]["TimeFound"];
    $query = $db->prepare("INSERT INTO bot_attacks_searchresults (attack_id, available_gold, available_elixir, available_dark_elixir, date_found, opponent_townhall_level)
        VALUES (:attack_id, :available_gold, :available_elixir, :available_dark_elixir, :date_found, :opponent_townhall_level)");
    $query->bindValue(':attack_id', $attackId);
    $query->bindValue(':available_gold', $gold);
    $query->bindValue(':available_elixir', $elixir);
    $query->bindValue(':available_dark_elixir', $darkElixir);
    $query->bindValue(':date_found', $dateFound);
    $query->bindValue(':opponent_townhall_level', $thLevel);
    $query->execute();
}
1. Prepare the statement once. MySQL lexes it once, so any subsequent call to the query will be quick since it's already lexed and just needs parameters.
2. Start the transaction before the loop. This is done so your hard drive can write down all the rows in one input/output operation; by default, 1 insert query = 1 I/O of the hard drive.
3. Create the loop, bind your parameters there and call $query->execute();
4. Exit the loop and commit() the transaction.
Full code:
$db->beginTransaction();
$query = $db->prepare("INSERT INTO bot_attacks_searchresults (attack_id, available_gold, available_elixir, available_dark_elixir, date_found, opponent_townhall_level)
    VALUES (:attack_id, :available_gold, :available_elixir, :available_dark_elixir, :date_found, :opponent_townhall_level)");
for ($i = 0; $i < $baseCount; $i++)
{
    $thLevel = $bases[$i]["ThLevel"];
    $gold = $bases[$i]["Gold"];
    $elixir = $bases[$i]["Elixir"];
    $darkElixir = $bases[$i]["DarkElixir"];
    $dateFound = $bases[$i]["TimeFound"];
    $query->bindValue(':attack_id', $attackId);
    $query->bindValue(':available_gold', $gold);
    $query->bindValue(':available_elixir', $elixir);
    $query->bindValue(':available_dark_elixir', $darkElixir);
    $query->bindValue(':date_found', $dateFound);
    $query->bindValue(':opponent_townhall_level', $thLevel);
    $query->execute();
}
$db->commit();
Here's a very crude proof of concept:
<?php
$values = array();
for ($i = 0; $i < 10; $i++)
{
    $values[] = "($i)";
}
$values = implode(',', $values);
$query = "INSERT INTO my_table VALUES $values";
echo $query;
?>
outputs INSERT INTO my_table VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)
You would need to restructure this slightly to work with prepare (PHP is not my forte), but the principle is the same; i.e. you build the query inside the loop, but execute it only once.
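For instance, with PDO the same proof of concept could be restructured like this (a sketch, assuming a $db PDO handle and the one-column my_table from the example above):
<?php
// Build one "(?)" placeholder per row, prepare once, execute once.
$values = range(0, 9);
$placeholders = implode(',', array_fill(0, count($values), '(?)'));
$query = $db->prepare("INSERT INTO my_table VALUES $placeholders");
$query->execute($values); // positional parameters fill the (?) slots in order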

Why does my PHP loop break every time?

I am a beginner in PHP. I am trying to insert a large number of items into a database using a for loop and mysqli_query commands.
My pseudo-code is something like this:
for ($i = 0; $i < 50000; $i++) {
    mysqli_query($con, "INSERT INTO users (name, age) VALUES ('$name','$age');");
}
My code works for a few hundred iterations and then, poof: Fatal error: Maximum execution time of 30 seconds exceeded in xxx on line xxx
For me, the only solution so far is to manually increment my counter each time the loop breaks.
Is there any other solution? Please help!
Don't do a query for every record, one at a time. You can do multiple inserts with one statement. The example below does 50 at a time. You can probably safely increase that to 100 or even 500.
$counter = 0;
$query = 'INSERT INTO users (name, age) VALUES ';
$sql = '';
for ($i = 0; $i < 50000; $i++) {
    $counter++;
    $sql .= "('$name','$age'),";
    // execute the query every 50 records
    if ($counter % 50 === 0) {
        $sql = rtrim($sql, ',');
        mysqli_query($con, $query . $sql);
        $sql = '';
    }
}
// insert any leftover rows when the total is not a multiple of 50
if ($sql !== '') {
    mysqli_query($con, $query . rtrim($sql, ','));
}
Wrap it in a try/catch:
for ($i = 0; $i < 50000; $i++) {
    try {
        mysqli_query($con, "INSERT INTO users (name, age) VALUES ('$name','$age');");
    } catch (Exception $e) {
        // You can log $e here if you want.
        // It would also probably be good to print out your query
        // so you can go back and try it again later.
    }
}
The most helpful answer was the comment from Jay Blanchard:
Try the set_time_limit() function. Calling set_time_limit(0) will
remove any time limits for execution of the script.
I have inserted set_time_limit(0) at the beginning of the page, and my PHP script now works perfectly!

Why does str_shuffle always generate similar patterns?

I am trying to generate 1500 authentication codes using the following code:
<?php
include "../include/top.php";
set_time_limit(0);
ini_set("memory_limit", "-1");
$end = 0;
while ($end < 1500)
{
    // generate an authentication code
    $string = "ABCDEFGHJKLMNPQRSTUVWXYZ123456789";
    $string = substr(str_shuffle($string), 5, 8);
    // check whether the generated code already exists
    $query = "select count(*) from auth where code = '$string'";
    $stmt = prepare($query);
    execute($stmt);
    $bind = mysqli_stmt_bind_result($stmt, $count);
    check_bind_result($bind);
    mysqli_stmt_fetch($stmt);
    mysqli_stmt_free_result($stmt);
    // if the generated code does not already exist, insert it into the database table
    if ($count == 0)
    {
        echo $string . "<br>";
        $query = "insert into auth (Code) values ('$string')";
        $stmt = prepare($query);
        execute($stmt);
        $end++;
    }
}
?>
It generated and inserted 1024 codes into the database and printed 667 codes in the browser within 15 seconds; then the browser continued loading without inserting or printing further codes until I closed the browser window half an hour later.
After that, when opening any web page served by WAMP, the browser keeps loading and never shows the content. That is, after running this script I need to restart WAMP before I can open any web pages.
I have tried this many times.
Why does the script not generate 1500 codes, and why does it always stop when it reaches the 667/1024 counts?
UPDATE
As an experiment, I added an ELSE clause to the IF condition that prints "Code Already Exist", and ran the script against an empty (truncated) copy of the same table. It printed and inserted 1024 codes and after that printed "Code Already Exist" continuously (around 700,000+ times within 5 minutes, and counting). Running the script again with the table holding those 1024 rows, it doesn't print or insert even a single code; instead it prints "Code Already Exist" indefinitely.
Another thing I observed is that the very first 1024 iterations of the WHILE loop pass the IF condition (if the table is empty), and all subsequent iterations fail it.
I don't think the randomiser in str_shuffle is up to this.
If I run it once, I get 1024 unique codes and then it just generates duplicates. If I then restart it, it will generate another 976 unique codes, giving a total of 2000 codes in the database.
I therefore assume that the randomiser used by str_shuffle() needs a reset to accomplish the generation of the required 1500 unique codes.
Try this minor modification; it will at least stop execution after 15000 failed attempts at generating a unique code.
Basically, I think you have to come up with a much better randomisation mechanism.
<?php
include "../include/top.php";
set_time_limit(0);
ini_set("memory_limit", "-1");
$end = 0;
$dups = 0;
while ($end < 1500 && $dups < 15000)
{
    // generate an authentication code
    $string = "ABCDEFGHJKLMNPQRSTUVWXYZ123456789";
    $string = substr(str_shuffle($string), 5, 8);
    // check whether the generated code already exists
    $query = "select count(*) from auth where code = '$string'";
    $stmt = prepare($query);
    execute($stmt);
    $bind = mysqli_stmt_bind_result($stmt, $count);
    check_bind_result($bind);
    mysqli_stmt_fetch($stmt);
    mysqli_stmt_free_result($stmt);
    // if the generated code does not already exist, insert it into the database table
    if ($count == 0) {
        echo $string . "<br>";
        $query = "insert into auth (Code) values ('$string')";
        $stmt = prepare($query);
        execute($stmt);
        $end++;
    } else {
        $dups++;
        echo "DUPLICATE for $string, Dup Count = $dups<br>";
    }
}
?>
Why you need to restart: you set the PHP timeout to zero, so the script never times out.
I don't see any specific coding errors. The use of str_shuffle for creating an authentication code is peculiar because it prevents duplicated letters, which gives a much smaller range of possible values. So it may just be repeating the patterns.
Try something like this instead, so that you are shuffling the already-shuffled string:
$origstring = str_shuffle("ABCDEFGHJKLMNPQRSTUVWXYZ123456789");
while ($end < 1500 && $dups < 15000)
{
    $origstring = str_shuffle($origstring);
    $string = substr($origstring, 5, 8);
Or, use something like this to generate the string, which allows duplicate characters and so creates a much larger range of possible values:
$characters = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890';
$code = '';
for ($i = 0; $i < 8; $i++)
{
    $code .= $characters[mt_rand(0, 35)];
}
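On PHP 7 and later, a variation of the same duplicate-allowing idea using random_int() (my addition, not part of the original answer) avoids seeding concerns entirely:
$characters = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890';
$code = '';
for ($i = 0; $i < 8; $i++) {
    // random_int() draws from a CSPRNG, so it needs no seeding or resetting
    $code .= $characters[random_int(0, strlen($characters) - 1)];
}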
You have to fine-tune some variables in your php.ini configuration file; find these:
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 600
And, not mandatory, you can change those also:
suhosin.post.max_vars = 5000
suhosin.request.max_vars = 5000
After modification, restart your web server.

Creating a very large MySQL Database from PHP Script

Please bear with me on this question.
I'm looking to create a relatively large MySQL database that I want to use to do some performance testing. I'm using Ubuntu 11.04 by the way.
I want to create about 6 tables, each with about 50 million records. Each table will have about 10 columns. The data would just be random data.
However, I'm not sure how I can go about doing this. Do I use PHP and loop INSERT queries (bound to timeout)? Or if that is inefficient, is there a way I can do this via some command line utility or shell script?
I'd really appreciate some guidance.
Thanks in advance.
mysqlimport is what you want. Check this for full information. It's command-line and very fast.
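A typical invocation looks something like this (a sketch; mysqlimport derives the table name from the file's basename, and the database name, credentials, and path here are placeholders):
mysqlimport --local --user=root --password mydatabase /tmp/my_table.txt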
Command-line mode usually has the timeouts disabled, as that's a protection against taking down a webserver, which doesn't apply at the command line.
You can do it from PHP, though generating "random" data will be costly. How random does this information have to be? You can easily read from /dev/random and get "garbage", but it blocks when there isn't enough entropy available to make good garbage; the non-blocking source is /dev/urandom.
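For example, a quick way to pull hex-encoded garbage for test rows (a sketch; the 10-byte read length is arbitrary):
// Read raw bytes from the kernel's non-blocking pool and hex-encode them
$fh = fopen('/dev/urandom', 'rb');
$testString = bin2hex(fread($fh, 10)); // 20 hex characters of "garbage"
fclose($fh);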
Just make sure that you have keys disabled on the tables, as keeping those up-to-date will be a major drag on your insert operations. You can add/enable the keys AFTER you've got your data set populated.
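For MyISAM tables that looks like the following sketch (InnoDB does not honor DISABLE KEYS, so there you would drop secondary indexes and re-add them after the load):
ALTER TABLE my_big_table DISABLE KEYS;
-- ... run all the bulk INSERTs ...
ALTER TABLE my_big_table ENABLE KEYS;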
If you do want to go the PHP way, you could do something like this:
<?php
// Edit the following
$millionsOfRows = 2;
$InsertBatchSize = 1000;
$table = 'ATable';
$RandStrLength = 10;
$timeOut = 0; // set 0 for no timeout
$columns = array('col1', 'col2', 'etc');
// MySQL settings
$username = "root";
$password = "";
$database = "ADatabase";
$server = "localhost";
// Don't edit below
$letters = range('a', 'z');
$rows = $millionsOfRows * 1000000;
$colCount = count($columns);
$valueArray = array();
$con = @mysql_connect($server, $username, $password) or die('Error accessing database: ' . mysql_error());
@mysql_select_db($database) or die('Couldn\'t connect to database: ' . mysql_error());
set_time_limit($timeOut);
for ($i = 0; $i < $rows; $i++)
{
    $values = array();
    for ($k = 0; $k < $colCount; $k++)
        $values[] = RandomString();
    $valueArray[] = "('" . implode("', '", $values) . "')";
    // flush the batch once it is full, so the value list never grows unbounded
    if (count($valueArray) == $InsertBatchSize)
    {
        echo "--" . (($i + 1) / $InsertBatchSize) . "--";
        $sql = "INSERT INTO `$table` (`" . implode('`,`', $columns) . "`) VALUES " . implode(',', $valueArray);
        mysql_query($sql);
        echo $sql . "<BR/><BR/>";
        $valueArray = array();
    }
}
// flush any remaining rows from the last partial batch
if (count($valueArray) > 0)
    mysql_query("INSERT INTO `$table` (`" . implode('`,`', $columns) . "`) VALUES " . implode(',', $valueArray));
mysql_close($con);

function RandomString()
{
    global $RandStrLength, $letters;
    $str = "";
    for ($i = 0; $i < $RandStrLength; $i++)
        $str .= $letters[rand(0, 25)];
    return $str;
}
Of course you could just use an existing dataset, like the Northwind database.
All you need to do is launch your script from the command line like this:
php -q generator.php
It can then be a simple PHP file like this:
<?php
$fid = fopen("query.sql", "w");
fputs($fid, "create table a (id int not null auto_increment primary key, b int, c int);\n");
for ($i = 0; $i < 50000000; $i++) {
    fputs($fid, "insert into a (b,c) values (" . rand(0, 1000) . ", " . rand(0, 1000) . ");\n");
}
fclose($fid);
exec("mysql -u$user -p$password $db < query.sql");
It is probably fastest to run multiple inserts in one query, such as:
INSERT INTO `test` VALUES
(1,2,3,4,5,6,7,8,9,0),
(1,2,3,4,5,6,7,8,9,0),
.....
(1,2,3,4,5,6,7,8,9,0)
I created a PHP script to do this. First I tried to construct a query that would hold 1 million inserts, but it failed. Then I tried with 100 thousand and it failed again. 50 thousand didn't work either. My next try was 10,000 and it works fine. I guess I was hitting the transfer limit from PHP to MySQL (probably max_allowed_packet). Here is the code:
<?php
set_time_limit(0);
ini_set('memory_limit', -1);
define('NUM_INSERTS_IN_QUERY', 10000);
define('NUM_QUERIES', 100);
// build the queries
$time = microtime(true);
$queries = array();
for ($i = 0; $i < NUM_QUERIES; $i++) {
    $queries[$i] = 'INSERT INTO `test` VALUES ';
    for ($j = 0; $j < NUM_INSERTS_IN_QUERY; $j++) {
        $queries[$i] .= '(1,2,3,4,5,6,7,8,9,0),';
    }
    $queries[$i] = rtrim($queries[$i], ',');
}
echo "Building query took " . (microtime(true) - $time) . " seconds\n";
mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db('store') or die(mysql_error());
mysql_query('DELETE FROM `test`') or die(mysql_error());
// execute the queries
$time = microtime(true);
for ($i = 0; $i < NUM_QUERIES; $i++) {
    mysql_query($queries[$i]) or die(mysql_error());
    // verify all rows were inserted
    if (mysql_affected_rows() != NUM_INSERTS_IN_QUERY) {
        echo "ERROR: on run $i not all rows inserted (" . mysql_affected_rows() . ")\n";
        exit;
    }
}
echo "Executing query took " . (microtime(true) - $time) . " seconds\n";
$result = mysql_query('SELECT count(*) FROM `test`') or die(mysql_error());
$row = mysql_fetch_row($result);
echo "Total number of rows in table: {$row[0]}\n";
echo "Total memory used in bytes: " . memory_get_usage() . "\n";
?>
The results on my Win 7 dev machine are:
Building query took 0.30241012573242 seconds
Executing query took 5.6592788696289 seconds
Total number of rows in table: 1000000
Total memory used in bytes: 22396560
So 1 million inserts took about five and a half seconds. Then I ran it with these settings:
define('NUM_INSERTS_IN_QUERY', 1);
define('NUM_QUERIES', 1000000);
which basically does one insert per query. The results are:
Building query took 1.6551470756531 seconds
Executing query took 77.895285844803 seconds
Total number of rows in table: 1000000
Total memory used in bytes: 140579784
Then I tried creating a file with one insert per query, as suggested by @jancha. My code is slightly modified:
$fid = fopen("query.sql", "w");
fputs($fid, "use store;\n");
for ($i = 0; $i < 1000000; $i++) {
    fputs($fid, "insert into `test` values (1,2,3,4,5,6,7,8,9,0);\n");
}
fclose($fid);
$time = microtime(true);
exec("mysql -uroot < query.sql");
echo "Executing query took " . (microtime(true) - $time) . " seconds\n";
The result is:
Executing query took 79.207592964172 seconds
The same as executing the queries through PHP. So the fastest way is probably to do multiple inserts in one query, and it shouldn't be a problem to use PHP to do the work.
Do I use PHP and loop INSERT queries (bound to timeout)?
Certainly, running long-duration scripts via a webserver-mediated request is not a good idea. But PHP can be compiled to run from the command line; in fact, most distributions of PHP come bundled with this.
There are lots of things you can do to make this run more efficiently; exactly which ones will vary depending on how you are populating the data set (e.g. once only vs. lots of batch additions). However, for a single load, you might want to have a look at the output of mysqldump (note the disabling/enabling of indexes and the multi-row insert lines) and recreate that in PHP rather than connecting directly to the database from PHP.
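For reference, the relevant parts of a mysqldump file look roughly like this (a sketch; the exact version comments vary, and my_table is a placeholder):
/*!40000 ALTER TABLE `my_table` DISABLE KEYS */;
INSERT INTO `my_table` VALUES (1,'a'),(2,'b'),(3,'c');
/*!40000 ALTER TABLE `my_table` ENABLE KEYS */;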
I see no point in this question and, especially, in raising a bounty for it.
As they say, "the best is the enemy of the good".
You asked this question ten days ago.
If you'd just gone with whatever code you had, you'd have your tables already and would even be done with your tests. You lose so much time in vain; it's beyond my understanding.
As for the method you've been asking for (just to keep away all these self-appointed moderators), here are some statements as food for thought:
MySQL's own methods are considered more effective in general.
MySQL can insert all data from one table into another using the INSERT ... SELECT syntax, so you would need to run only about 30 queries to get your 50 million records (see the sketch below).
MySQL can surely copy whole tables as well.
Keep in mind that there should be no indexes at the time of table creation.
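A sketch of that doubling trick: seed the table with one row, then run the statement below repeatedly; each run doubles the row count, so about 26 repetitions take one row past 50 million (the table and column names are placeholders).
INSERT INTO my_table (b, c) SELECT b, c FROM my_table;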
I just want to point you to http://www.mysqldumper.net/, which is a tool that allows you to back up and restore big databases with PHP.
The script has some mechanisms to circumvent the maximum execution time of PHP, so imo it's worth a look.
This is not a solution for generating data, but a great one for importing/exporting.
