Multiple tabs, multiple while loops - PHP

I'm running multiple PHP scripts that each contain a while loop. This loop inserts into and reads from a MySQL database.
It is a long-running process, so it takes up to 2 hours.
What I need to do is open the script in multiple tabs of the same browser.
When I do this, I can't open more than 6 tabs; any tab beyond the 6th just keeps loading and shows nothing.
When I switch to another browser it works, but once I reach 6 tabs there, it does the same thing.
Code:
<?php
ini_set('memory_limit', -1);
ob_implicit_flush(true);
set_time_limit(0);

$sqlselect = "SELECT * FROM old_Users WHERE age < 18";
$content2 = mysqli_query($conn, $sqlselect);
while ($row = mysqli_fetch_assoc($content2)) {
    $sql = "INSERT INTO New_Table_Users (first_name, last_name, ID)
            VALUES ('" . $row["firstname"] . "','" . $row["lastname"] . "','" . $row["idd"] . "')";
    mysqli_query($conn, $sql);
}
?>
The problem is not RAM or CPU, because whenever I open a new browser it works fine, but when I try to open the 7th tab it just keeps loading...
So to open 12 tabs I would need 2 browsers, each with 6 tabs open...
Any help would be really appreciated.

Dagon's solution is the best, but in case you need to process the data in PHP and still insert at a fast pace:
Use PDO (sorry, I like neither mysqli nor while loops) to do it faster than you breathe. This inserts all the data with very few queries (batches). It could even be done with a single insert for everything.
WARNING: this technique is fast, but know your limits. It needs RAM; otherwise lower the number of rows per insert.
Depending on the size of what you are inserting, limit the number of rows per insert according to your RAM capacity. With 3 columns, as you have (very, very few), batches of 10,000 sound reasonable. Try various sizes to see how your database and server handle it.
ini_set('memory_limit', -1);
set_time_limit(0);

$table = 'New_Table_Users'; // target table name
$nb_max_insert = 10000;     // maximum number of rows per INSERT
$age = 18;                  // age parameter

$stmt = $conn->prepare("SELECT * FROM old_Users WHERE age < ?");
$stmt->bindParam(1, $age, PDO::PARAM_INT); // bind the parameter
try {
    $stmt->execute();
    $result = $stmt->fetchAll(PDO::FETCH_ASSOC);
} catch (PDOException $e) {
    var_dump('error main');
}

if (count($result) !== 0) {
    $data = array(); // extract only the needed columns (yes, you are using * in your query, mheeeee)
    foreach ($result as $el) {
        $row = array(
            'first_name' => $el['first_name'],
            'last_name'  => $el['last_name'],
            'ID'         => $el['ID'],
        );
        $data[] = $row;
    }

    $batches = array_chunk($data, $nb_max_insert); // split data into batches
    foreach ($batches as $batch) {
        $question_marks = array(); // placeholder groups for this batch
        $insert_values  = array(); // flat list of values for this batch
        foreach ($batch as $d) {
            $question_marks[] = '(' . implode(',', array_fill(0, count($d), '?')) . ')';
            $insert_values = array_merge($insert_values, array_values($d)); // what to insert
        }
        $sql = "INSERT INTO $table (" . implode(",", array_keys($row)) . ") VALUES " . implode(',', $question_marks); // build the query
        $stmt = $conn->prepare($sql);
        try {
            $stmt->execute($insert_values);
        } catch (PDOException $e) {
            var_dump('error batch');
        }
    }
}
Note: I am using this to insert millions of rows into huge tables, across PHP 7 pthreads (12 CPUs x 20 cores), reaching the limits of the server with 1024 async connections on 3x 12 GB, RAID x4, 1 TB SSDs. So I guess it should work for you too...

Related

Querying from database and writing the result to text file

I have a query that selects everything from a database table and writes the result to a text file. If the state is small (say, a maximum of 200k rows), the code works and writes the data to the text file. The problem arises when I have a state that returns 2M rows when queried; there's also the fact that the table has 64 columns.
Here's a part of the code:
// create and open file
$file = "file2.txt";
$fOpen = fopen($file, "a"); // open file for writing, append mode

$qry = "SELECT * FROM tbl_two WHERE STE='48'";
$res = mysqli_query($con, $qry);

if (!$res) {
    echo "No data record" . "<br/>";
    exit;
}

$num_res = mysqli_num_rows($res);

for ($i = 0; $i < $num_res; $i++) {
    $row = mysqli_fetch_assoc($res);
    $STATE = (trim($row['STATE']) === "") ? " " : $row['STATE'];
    $CTY   = (trim($row['CTY'])   === "") ? " " : $row['CTY'];
    $ST    = (trim($row['ST'])    === "") ? " " : $row['ST'];
    $BLK   = (trim($row['BLK'])   === "") ? " " : $row['BLK'];
    ....
    ....
    // 64th column
    $data = "$STATE$CTY$ST$BLK(to the 64th variable)\r\n";
    fwrite($fOpen, $data);
}
fclose($fOpen);
I tried putting a limit on the query:
$qry = "SELECT * FROM tbl_two WHERE STE='48' LIMIT 200000";
The problem is, it only writes up to the 200k-th line and doesn't write the remaining 1.8M lines.
If I don't put a limit on the query, it encounters the error Out of memory .... TIA for any kind suggestions.
First, you need to use an unbuffered query to fetch the data. From the PHP manual on buffered and unbuffered queries:
Queries are using the buffered mode by default. This means that query results are immediately transferred from the MySQL Server to PHP and then are kept in the memory of the PHP process.
Unbuffered MySQL queries execute the query and then return a resource while the data is still waiting on the MySQL server for being fetched. This uses less memory on the PHP-side, but can increase the load on the server. Unless the full result set was fetched from the server no further queries can be sent over the same connection. Unbuffered queries can also be referred to as "use result".
NOTE: buffered queries should be used in cases where you expect only a limited result set or need to know the amount of returned rows before reading all rows. Unbuffered mode should be used when you expect larger results.
Also, don't build the whole result into an array first; write each value directly to the file inside your while loop.
$pdo = new PDO("mysql:host=localhost;dbname=world", 'my_user', 'my_pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$uresult = $pdo->query("SELECT * FROM tbl_two WHERE STE='48' LIMIT 200000");
if ($uresult) {
    $lineno = 0;
    while ($row = $uresult->fetch(PDO::FETCH_ASSOC)) {
        echo $row['Name'] . PHP_EOL;
        // write the value to the text file here
        $lineno++;
    }
}
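For completeness, a minimal sketch that streams the unbuffered result straight into the text file. The column names STATE, CTY, ST, BLK are taken from the question; extend the list to all 64 columns for the real table:
<?php
// Sketch only: unbuffered PDO query streamed straight into the text file.
$pdo = new PDO("mysql:host=localhost;dbname=world", 'my_user', 'my_pass');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$fOpen = fopen("file2.txt", "a");
$uresult = $pdo->query("SELECT * FROM tbl_two WHERE STE='48'");

while ($row = $uresult->fetch(PDO::FETCH_ASSOC)) {
    // Replace empty values with a single space, as in the question.
    $fields = array();
    foreach (array('STATE', 'CTY', 'ST', 'BLK') as $col) { // ...extend to all 64 columns
        $fields[] = (trim($row[$col]) === "") ? " " : $row[$col];
    }
    fwrite($fOpen, implode('', $fields) . "\r\n");
}
fclose($fOpen);
?>
Because the result is unbuffered, each row is written as it arrives instead of the whole 2M-row result being held in PHP memory first.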

Drupal 7 database API very slow compared to PHP mysqli_connect()

I am trying to loop over some data coming to me from a SOAP request and insert the records into a custom table in my Drupal install.
At first I created a custom module and used standard mysqli_connect() syntax to connect to the database and loop through the records and insert them. This was working great and fetched and inserted my remote data in about 2 seconds without a hitch.
I then remembered that Drupal has a database API (I am fairly new to Drupal) so I decided to do it right and use the API instead. I converted my code to how I think I should be doing it per the API docs, but now the process takes more like 5 or 6 seconds and sometimes even randomly hangs and doesn't complete at all and I get weird Session errors. The records end up inserting fine, but it just takes forever.
I'm wondering if I am doing it wrong. I would also like to wrap the inserts in a transaction, because I will first be deleting ALL of the records in the destination table and then inserting the new data, and since I am deleting first, I want to be able to roll back if the inserts fail for whatever reason.
I did not add transaction code to my original PHP only code, but did try to attempt it with the Drupal API, although completely removing the transaction/try/catch code doesn't seem to affect the speed or issues at all.
Anyway here is my original code:
$data = simplexml_load_string($jobsXml);
$connection = mysqli_connect("localhost","user","pass","database");
if (mysqli_connect_errno($connection))
{
echo "Failed to connect to MySQL: " . mysqli_connect_error();
exit();
}
// delete * current jobs
mysqli_query($connection,'TRUNCATE TABLE jobs;');
$recordsInserted = 0;
foreach ($data->NewDataSet->Table as $item) {
//escape and cleanup some fields
$image = str_replace('http://www.example.com/public/images/job_headers/', '', $item->job_image_file);
$specialty_description = mysqli_real_escape_string($connection, $item->specialty_description);
$job_board_title = mysqli_real_escape_string($connection, $item->job_board_title);
$job_board_subtitle = mysqli_real_escape_string($connection, $item->job_board_subtitle);
$job_state_code = ($item->job_country_code == 'NZ') ? 'NZ' : $item->job_state_code;
$sql = "
INSERT INTO jobs (
job_number,
specialty,
specialty_description,
division_code,
job_type,
job_type_description,
job_state_code,
job_country_code,
job_location_display,
job_board_type,
job_image_file,
job_board_title,
job_board_subtitle
) VALUES (
$item->job_number,
'$item->specialty',
'$specialty_description',
'$item->division_code',
'$item->job_type',
'$item->job_type_description',
'$job_state_code',
'$item->job_country_code',
'$item->job_location_display',
'$item->job_board_type',
'$image',
'$job_board_title',
'$job_board_subtitle'
)
";
if (!mysqli_query($connection,$sql))
{
die('Error: ' . mysqli_error($connection) . $sql);
}
$recordsInserted++;
}
mysqli_close($connection);
echo $recordsInserted . ' records inserted';
And this is my Drupal code. Can anyone tell me if I am doing this wrong, or not in the most efficient way?
$data = simplexml_load_string($jobsXml);
// The transaction opens here.
$txn = db_transaction();
// delete all current jobs
$records_deleted = db_delete('jobs')
->execute();
$records_inserted = 0;
try {
$records = array();
foreach ($data->NewDataSet->Table as $item) {
$records[] = array(
'job_number' => $item->job_number,
'specialty' => $item->specialty,
'specialty_description' => $item->specialty_description,
'division_code' => $item->division_code,
'job_type' => $item->job_type,
'job_type_description' => $item->job_type_description,
'job_state_code' => ($item->job_country_code == 'NZ') ? 'NZ' : $item->job_state_code,
'job_country_code' => $item->job_country_code,
'job_location_display' => $item->job_location_display,
'job_board_type' => $item->job_board_type,
'job_image_file' => str_replace('http://www.example.com/public/images/job_headers/', '', $item->job_image_file),
'job_board_title' => $item->job_board_title,
'job_board_subtitle' => $item->job_board_subtitle,
);
$records_inserted++;
}
$fields = array(
'job_number',
'specialty',
'specialty_description',
'division_code',
'job_type',
'job_type_description',
'job_state_code',
'job_country_code',
'job_location_display',
'job_board_type',
'job_image_file',
'job_board_title',
'job_board_subtitle'
);
$query = db_insert('jobs')
->fields($fields);
foreach ($records as $record) {
$query->values($record);
}
$query->execute();
} catch (Exception $e) {
// Something went wrong somewhere, so roll back now.
$txn->rollback();
// Log the exception to watchdog.
watchdog_exception('Job Import', $e);
echo $e;
}
echo $records_deleted . ' records deleted<br>';
echo $records_inserted . ' records inserted';
How big is the dataset you are trying to insert? If the dataset is very large, you might run into query size issues. Try looping over the records and inserting each one individually, like you did with plain PHP.
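A middle ground, if a single multi-row db_insert() ever hits query-size limits, is to insert in smaller chunks while keeping the transaction. This is only a sketch against the Drupal 7 database API, reusing the $records and $fields arrays built in the question; the chunk size of 500 is an assumption to tune:
// Sketch: chunked inserts inside one transaction (Drupal 7 database API).
// Assumes $records and $fields are built exactly as in the question.
$txn = db_transaction();
try {
  db_delete('jobs')->execute();
  foreach (array_chunk($records, 500) as $chunk) {
    $query = db_insert('jobs')->fields($fields);
    foreach ($chunk as $record) {
      $query->values($record);
    }
    $query->execute(); // one INSERT per 500 rows instead of one giant statement
  }
} catch (Exception $e) {
  // Something went wrong somewhere, so roll back now.
  $txn->rollback();
  watchdog_exception('Job Import', $e);
}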

Efficient php code to insert array data into mysql table?

So I have a flatfile db in the format of
username:$SHA$1010101010101010$010110010101010010101010100101010101001010:255.255.255.255:1342078265214
Each record is on a new line... about 5000+ lines. I want to import it into a MySQL table. Normally I'd do this using phpMyAdmin and "file import", but now I want to automate the process by using PHP to download the db via FTP, then clean up the existing table data and upload the updated db.
id (AUTO_INCREMENT) | username | password | ip | lastlogin
The script I've got below works for the most part, although PHP generates an error:
"PHP Fatal error: Maximum execution time of 30 seconds exceeded". I believe I could just increase this time, but on a remote server I doubt I'll be allowed to, so I need to find a better way of doing this.
Only about 1000 records get inserted into the database before that timeout...
The code I'm using is below. I will say right now that I'm not a pro in PHP and this was mainly gathered up and cobbled together. I'm looking for some help to make it more efficient, as I've heard that doing inserts like this is just bad. And it sounds bad as well: a lot of disk scratching when I run this script on my local PC... I mean, why does it want to kill the HDD for such a seemingly simple task?
<?php
require ('Connections/local.php');
$wx = array_map('trim',file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();
foreach($wx as $i => $line) {
$tmp = array_filter(explode(':',$line));
$username[$i] = $tmp[0];
$password[$i] = $tmp[1];
$ip[$i] = $tmp[2];
$lastlogin[$i] = $tmp[3];
mysql_query("INSERT INTO authdb (username,password,ip,lastlogin) VALUES('$username[$i]', '$password[$i]', '$ip[$i]', '$lastlogin[$i]') ") or die(mysql_error());
}
?>
Try this, with bound parameters and PDO.
<?php
require ('Connections/local.php');

$wx = array_map('trim', file("auths.db"));

$username  = array();
$password  = array();
$ip        = array();
$lastlogin = array();

try {
    // $dbHost, $database, $dbUsername, $dbPassword: your connection settings (e.g. from Connections/local.php)
    $dbh = new PDO("mysql:host=$dbHost;dbname=$database", $dbUsername, $dbPassword);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
} catch (PDOException $e) {
    echo 'ERROR: ' . $e->getMessage();
}

$mysql_query = "INSERT INTO authdb (username,password,ip,lastlogin) VALUES(:username, :password, :ip, :lastlogin)";
$statement = $dbh->prepare($mysql_query);

foreach ($wx as $i => $line) {
    set_time_limit(0);
    $tmp = array_filter(explode(':', $line));

    $username[$i]  = $tmp[0];
    $password[$i]  = $tmp[1];
    $ip[$i]        = $tmp[2];
    $lastlogin[$i] = $tmp[3];

    $params = array(":username"  => $username[$i],
                    ":password"  => $password[$i],
                    ":ip"        => $ip[$i],
                    ":lastlogin" => $lastlogin[$i]);

    $statement->execute($params);
}
?>
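If the row-by-row prepared inserts above are still too slow for 5000+ lines, wrapping the loop in a single transaction usually helps a lot, since the work is committed to disk once rather than per row. A minimal sketch on top of the code above (same $dbh, $statement and $wx):
// Sketch: the same loop, but inside one transaction.
$dbh->beginTransaction();
try {
    foreach ($wx as $line) {
        $tmp = array_filter(explode(':', $line));
        $statement->execute(array(
            ":username"  => $tmp[0],
            ":password"  => $tmp[1],
            ":ip"        => $tmp[2],
            ":lastlogin" => $tmp[3],
        ));
    }
    $dbh->commit();       // one commit for the whole file
} catch (PDOException $e) {
    $dbh->rollBack();     // undo everything if any line fails
    echo 'ERROR: ' . $e->getMessage();
}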
Instead of sending queries to the server one by one in the form
insert into table (x,y,z) values (1,2,3)
You should use extended insert syntax, as in:
insert into table (x,y,z) values (1,2,3),(4,5,6),(7,8,9),...
This will increase insert performance by miles. However, you need to be careful about how many rows you insert in one statement, since there is a limit on how large a single SQL statement can be. So I'd say start with packs of 100 rows and see how it goes, then adjust the pack size accordingly. Chances are your insert time will go down to something like 5 seconds, putting it well under the max_execution_time limit.
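As a rough sketch of what that could look like for the data in this question, using PDO placeholders so the values are still escaped. The $dbh connection is assumed to exist as in the previous answer, and the pack size of 100 is just the starting point suggested above:
// Sketch: extended inserts in packs of 100 rows, with PDO placeholders.
$lines = array_map('trim', file("auths.db"));
$rows = array();
foreach ($lines as $line) {
    $parts = explode(':', $line);
    if (count($parts) >= 4) {
        $rows[] = array($parts[0], $parts[1], $parts[2], $parts[3]); // username, password, ip, lastlogin
    }
}

foreach (array_chunk($rows, 100) as $chunk) {
    // one "(?,?,?,?)" group per row in this pack
    $placeholders = implode(',', array_fill(0, count($chunk), '(?,?,?,?)'));
    $sql = "INSERT INTO authdb (username,password,ip,lastlogin) VALUES " . $placeholders;

    $values = array();
    foreach ($chunk as $row) {
        $values = array_merge($values, $row); // flatten to match the placeholders
    }
    $dbh->prepare($sql)->execute($values);
}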

Creating a very large MySQL Database from PHP Script

Please bear with me on this question.
I'm looking to create a relatively large MySQL database that I want to use to do some performance testing. I'm using Ubuntu 11.04 by the way.
I want to create about 6 tables, each with about 50 million records. Each table will have about 10 columns. The data would just be random data.
However, I'm not sure how I can go about doing this. Do I use PHP and loop INSERT queries (bound to timeout)? Or if that is inefficient, is there a way I can do this via some command line utility or shell script?
I'd really appreciate some guidance.
Thanks in advance.
mysqlimport is what you want. Check this for full information. It's command-line and very fast.
Command-line mode usually has the timeouts disabled; that is a protection against taking down a web server, which doesn't apply at the command line.
You can do it from PHP, though generating "random" data will be costly. How random does this information have to be? You can easily read from /dev/urandom and get "garbage", but it's not a source of "good" randomness (you'd want /dev/random for that, but it will block if there isn't enough entropy available to make good garbage).
Just make sure that you have keys disabled on the tables, as keeping those up-to-date will be a major drag on your insert operations. You can add/enable the keys AFTER you've got your data set populated.
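For example, on a MyISAM table you can toggle the non-unique indexes around the bulk load (a sketch only, reusing the $con connection and the ATable table from the script below; on InnoDB this statement is ignored, so there you would drop and re-create secondary indexes instead):
// Sketch: rebuild non-unique indexes once, after the bulk load (MyISAM).
mysql_query("ALTER TABLE `ATable` DISABLE KEYS", $con);

// ... run the batched INSERT loop from the script below ...

mysql_query("ALTER TABLE `ATable` ENABLE KEYS", $con); // indexes are rebuilt here in one go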
If you do want to go the php way, you could do something like this:
<?php
//Edit Following
$millionsOfRows = 2;
$InsertBatchSize = 1000;
$table = 'ATable';
$RandStrLength = 10;
$timeOut = 0; //set 0 for no timeout
$columns = array('col1','col2','etc');
//Mysql Settings
$username = "root";
$password = "";
$database = "ADatabase";
$server = "localhost";
//Don't edit below
$letters = range('a','z');
$rows = $millionsOfRows * 1000000;
$colCount = count($columns);
$valueArray = array();
$con = #mysql_connect($server, $username, $password) or die('Error accessing database: '.mysql_error());
#mysql_select_db($database) or die ('Couldn\'t connect to database: '.mysql_error());
set_time_limit($timeOut);
for ($i = 0; $i < $rows; $i++)
{
    $values = array();
    for ($k = 0; $k < $colCount; $k++)
        $values[] = RandomString();
    $valueArray[] = "('" . implode("', '", $values) . "')";

    if (count($valueArray) == $InsertBatchSize)
    {
        echo "--" . ($i + 1) / $InsertBatchSize . "--";
        $sql = "INSERT INTO `$table` (`" . implode('`,`', $columns) . "`) VALUES " . implode(',', $valueArray);
        mysql_query($sql);
        echo $sql . "<BR/><BR/>";
        $valueArray = array();
    }
}
// flush any leftover rows from the last, partial batch
if (count($valueArray) > 0)
{
    $sql = "INSERT INTO `$table` (`" . implode('`,`', $columns) . "`) VALUES " . implode(',', $valueArray);
    mysql_query($sql);
}
mysql_close($con);

function RandomString()
{
    global $RandStrLength, $letters;
    $str = "";
    for ($i = 0; $i < $RandStrLength; $i++)
        $str .= $letters[rand(0, 25)];
    return $str;
}
Of course you could just use a created dataset, like the NorthWind Database.
All you need to do is launch your script from the command line like this:
php -q generator.php
It can then be a simple PHP file like this:
<?php
$fid = fopen("query.sql", "w");
fputs($fid, "create table a (id int not null auto_increment primary key, b int, c, int);\n");
for ($i = 0; $i < 50000000; $i++){
fputs($fid, "insert into table a (b,c) values (" . rand(0,1000) . ", " . rand(0,1000) . ")\n");
}
fclose($fid);
exec("mysql -u$user -p$password $db < query.sql");
Probably it is fastest to run multiple inserts in one query as:
INSERT INTO `test` VALUES
(1,2,3,4,5,6,7,8,9,0),
(1,2,3,4,5,6,7,8,9,0),
.....
(1,2,3,4,5,6,7,8,9,0)
I created a PHP script to do this. First I tried to construct a query that would hold 1 million inserts, but it failed. Then I tried with 100 thousand and it failed again. 50 thousand didn't work either. My next try was 10,000, and it works fine. I guess I am hitting the transfer limit from PHP to MySQL. Here is the code:
<?php
set_time_limit(0);
ini_set('memory_limit', -1);
define('NUM_INSERTS_IN_QUERY', 10000);
define('NUM_QUERIES', 100);
// build query
$time = microtime(true);
$queries = array();
for($i = 0; $i < NUM_QUERIES; $i++){
$queries[$i] = 'INSERT INTO `test` VALUES ';
for($j = 0; $j < NUM_INSERTS_IN_QUERY; $j++){
$queries[$i] .= '(1,2,3,4,5,6,7,8,9,0),';
}
$queries[$i] = rtrim($queries[$i], ',');
}
echo "Building query took " . (microtime(true) - $time) . " seconds\n";
mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db('store') or die(mysql_error());
mysql_query('DELETE FROM `test`') or die(mysql_error());
// execute the query
$time = microtime(true);
for($i = 0; $i < NUM_QUERIES; $i++){
mysql_query($queries[$i]) or die(mysql_error());
// verify all rows inserted
if(mysql_affected_rows() != NUM_INSERTS_IN_QUERY){
echo "ERROR: on run $i not all rows inserted (" . mysql_affected_rows() . ")\n";
exit;
}
}
echo "Executing query took " . (microtime(true) - $time) . " seconds\n";
$result = mysql_query('SELECT count(*) FROM `test`') or die(mysql_error());
$row = mysql_fetch_row($result);
echo "Total number of rows in table: {$row[0]}\n";
echo "Total memory used in bytes: " . memory_get_usage() . "\n";
?>
The results on my Win 7 dev machine are:
Building query took 0.30241012573242 seconds
Executing query took 5.6592788696289 seconds
Total number of rows in table: 1000000
Total memory used in bytes: 22396560
So for 1 million inserts it took five and a half seconds. Then I ran it with these settings:
define('NUM_INSERTS_IN_QUERY', 1);
define('NUM_QUERIES', 1000000);
which is basically doing one insert per query. The results are:
Building query took 1.6551470756531 seconds
Executing query took 77.895285844803 seconds
Total number of rows in table: 1000000
Total memory used in bytes: 140579784
Then I tried to create a file with one insert per query in it, as suggested by @jancha. My code is slightly modified:
$fid = fopen("query.sql", "w");
fputs($fid, "use store;");
for($i = 0; $i < 1000000; $i++){
fputs($fid, "insert into `test` values (1,2,3,4,5,6,7,8,9,0);\n");
}
fclose($fid);
$time = microtime(true);
exec("mysql -uroot < query.sql");
echo "Executing query took " . (microtime(true) - $time) . " seconds\n";
The result is:
Executing query took 79.207592964172 seconds
Same as executing the queries through PHP one by one. So the fastest way is probably to do multiple inserts in one query, and it shouldn't be a problem to use PHP to do the work.
Do I use PHP and loop INSERT queries (bound to timeout)
Certainly, running long-duration scripts via a webserver-mediated request is not a good idea. But PHP can be compiled to run from the command line - in fact, most distributions of PHP come bundled with this.
There are lots of things you can do to make this run more efficiently; exactly which ones will vary depending on how you are populating the data set (e.g. once only vs. lots of batch additions). However, for a single load, you might want to have a look at the output of mysqldump (note the disabling/enabling of indexes and the multi-row insert lines) and recreate that in PHP rather than connecting directly to the database from PHP.
I see no point in this question, and especially in raising a bounty for it.
As they say, "the best is the enemy of the good."
You asked this question ten days ago.
If you'd just gone with whatever code you had, you'd have your tables already and would even be done with your tests. But you lose so much time just in vain. It's beyond my understanding.
As for the method you've been asking for (just to keep away all these self-appointed moderators), here are some statements as food for thought:
MySQL's own methods are considered more effective in general.
MySQL can insert all data from one table into another using the INSERT ... SELECT syntax, so you would need to run only about 30 queries to get your 50 million records (see the sketch after this list).
And sure, MySQL can copy whole tables as well.
Keep in mind that there should be no indexes at the time of table creation.
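As a sketch of that INSERT ... SELECT approach: seed a small table once, then let MySQL double it in place until it is big enough. The database and table names here ('store', gen, columns b and c) are placeholders rather than anything from the scripts above; each pass roughly doubles the row count, so it takes about 26 queries to go from one seed row past 50 million:
// Sketch: grow a seeded table by repeatedly copying it into itself.
mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db('store') or die(mysql_error());

mysql_query('CREATE TABLE IF NOT EXISTS gen (id INT NOT NULL AUTO_INCREMENT PRIMARY KEY, b INT, c INT)') or die(mysql_error());
mysql_query('INSERT INTO gen (b, c) VALUES (1, 2)') or die(mysql_error()); // seed row

do {
    // each pass roughly doubles the table; id is auto_increment, so only b and c are copied
    mysql_query('INSERT INTO gen (b, c) SELECT b, c FROM gen') or die(mysql_error());
    $row = mysql_fetch_row(mysql_query('SELECT COUNT(*) FROM gen'));
} while ($row[0] < 50000000);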
I just want to point you to http://www.mysqldumper.net/ which is a tool that allows you to backup and restore big databases with PHP.
The script has some mechanisms to circumvent the maximum execution time of PHP -> imo worth a look.
This is not a solution for generating data, but a great one for importing / exporting.

maximum execution time of 30 seconds exceeded php

When I run my script, I receive the following error before all rows of data are processed:
maximum execution time of 30 seconds exceeded
After researching the problem, I should be able to extend the max_execution_time, which should resolve the problem.
But being in my PHP programming infancy, I would like to know if there is a more optimal way of writing my script below, so I do not have to rely on "get out of jail" cards.
The script is:
1 Taking a CSV file
2 Cherry picking some columns
3 Trying to insert 10k rows of CSV data into a MySQL table
In my head I think I should be able to insert in chunks, but that is so far beyond my skill set that I do not even know how to write the first line :\
Many thanks in advance
<?php
function processCSV()
{
global $uploadFile;
include 'dbConnection.inc.php';
dbConnection("xx","xx","xx");
$rowCounter = 0;
$loadLocationCsvUrl = fopen($uploadFile,"r");
if ($loadLocationCsvUrl <> false)
{
while ($locationFile = fgetcsv($loadLocationCsvUrl, ','))
{
$officeId = $locationFile[2];
$country = $locationFile[9];
$country = trim($country);
$country = htmlspecialchars($country);
$open = $locationFile[4];
$open = trim($open);
$open = htmlspecialchars($open);
$insString = "insert into countrytable set officeId='$officeId', countryname='$country', status='$open'";
switch($country)
{
case $country <> 'Country':
if (!mysql_query($insString))
{
echo "<p>error " . mysql_error() . "</p>";
}
break;
}
$rowCounter++;
}
echo "$rowCounter inserted.";
}
fclose($loadLocationCsvUrl);
}
processCSV();
?>
First, in 2011 you do not use mysql_query. You use mysqli or PDO and prepared statements. Then you do not need to figure out how to escape strings for SQL. You used htmlspecialchars, which is totally wrong for this purpose. Next, you could use a transaction to speed up many inserts. MySQL also supports multi-row inserts.
But the best bet would be to use the CSV storage engine. Read here: http://dev.mysql.com/doc/refman/5.0/en/csv-storage-engine.html. You can instantly load everything into SQL and then manipulate it there as you wish. The article also shows the LOAD DATA INFILE command.
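As an illustration of the prepared-statement plus transaction route, here is a sketch only, reusing the column picks and the $uploadFile variable from the question; the PDO connection details are placeholders:
// Sketch: PDO prepared statement + one transaction for the whole CSV.
$pdo = new PDO("mysql:host=localhost;dbname=mydb", "user", "pass"); // placeholder credentials
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare(
    "INSERT INTO countrytable (officeId, countryname, status) VALUES (?, ?, ?)"
);

$fh = fopen($uploadFile, "r");
$rowCounter = 0;

$pdo->beginTransaction();
while (($locationFile = fgetcsv($fh, 0, ',')) !== false) {
    $country = trim($locationFile[9]);
    if ($country === 'Country') {   // skip the header row, as in the question
        continue;
    }
    $stmt->execute(array($locationFile[2], $country, trim($locationFile[4])));
    $rowCounter++;
}
$pdo->commit();
fclose($fh);

echo "$rowCounter inserted.";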
Well, you could create a single query like this.
$query = "INSERT INTO countrytable (officeId, countryname, status) VALUES ";
$entries = array();
while ($locationFile = fgetcsv($loadLocationCsvUrl, ',')) {
// your code
$entries[] = "('$officeId', '$country', '$open')";
}
$query .= implode(', ', $entries);
mysql_query($query);
But this depends on how long your query will be and what the server limit is set to.
But as you can read in other posts, there are better ways for your requirements. I just thought I should share an approach along the lines you were already thinking about.
You can try calling the following function before inserting. This will set the time limit to unlimited instead of the default 30 seconds.
set_time_limit( 0 );
