Efficient PHP code to insert array data into a MySQL table?

So I have a flatfile db in the format of
username:$SHA$1010101010101010$010110010101010010101010100101010101001010:255.255.255.255:1342078265214
Each record is on a new line... about 5000+ lines. I want to import it into a MySQL table. Normally I'd do this using phpMyAdmin and "file import", but now I want to automate the process by using PHP to download the db via FTP, clean out the existing table data, and upload the updated db. The table layout is:
id (AUTO_INCREMENT) | username | password | ip | lastlogin
The script I've got below mostly works, although PHP generates an error:
"PHP Fatal error: Maximum execution time of 30 seconds exceeded". I believe I could just increase this limit, but I doubt I'll be allowed to on the remote server, so I need to find a better way of doing this.
Only about 1000 records get inserted into the database before that timeout...
The code I'm using is below. I'll say right now that I'm not a pro in PHP and this was mainly gathered up and cobbled together. I'm looking for some help to make it more efficient, as I've heard that doing inserts one row at a time like this is just bad. It certainly sounds bad too, as there's a lot of disk scratching when I run this script on my local PC. I mean, why should such a seemingly simple task hammer the HDD?
<?php
require('Connections/local.php');

$wx = array_map('trim', file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();

foreach ($wx as $i => $line) {
    $tmp = array_filter(explode(':', $line));
    $username[$i] = $tmp[0];
    $password[$i] = $tmp[1];
    $ip[$i] = $tmp[2];
    $lastlogin[$i] = $tmp[3];
    mysql_query("INSERT INTO authdb (username,password,ip,lastlogin) VALUES('$username[$i]', '$password[$i]', '$ip[$i]', '$lastlogin[$i]')") or die(mysql_error());
}
?>

Try this, with bound parameters and PDO.
<?php
require('Connections/local.php');

set_time_limit(0); // lift the 30-second limit once, outside the loop

$wx = array_map('trim', file("auths.db"));

try {
    // use a dedicated $dbHost for the DSN so it doesn't collide with
    // the per-record ip values parsed below
    $dbh = new PDO("mysql:host=$dbHost;dbname=$database", $dbUsername, $dbPassword);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
} catch (PDOException $e) {
    echo 'ERROR: ' . $e->getMessage();
}

$sql = "INSERT INTO authdb (username,password,ip,lastlogin) VALUES(:username, :password, :ip, :lastlogin)";
$statement = $dbh->prepare($sql);

foreach ($wx as $line) {
    $tmp = array_filter(explode(':', $line));
    $params = array(":username"  => $tmp[0],
                    ":password"  => $tmp[1],
                    ":ip"        => $tmp[2],
                    ":lastlogin" => $tmp[3]);
    $statement->execute($params);
}
?>

Instead of sending queries to the server one by one in the form
insert into table (x,y,z) values (1,2,3)
you should use the extended insert syntax, as in:
insert into table (x,y,z) values (1,2,3),(4,5,6),(7,8,9),...
This will increase insert performance enormously. However, you need to be careful about how many rows you insert in one statement, since there is a limit to how large a single SQL statement can be. So I'd say start with batches of 100 rows and see how it goes, then adjust the batch size accordingly. Chances are your insert time will drop to something like 5 seconds, putting it well under the max_execution_time limit.
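For instance, a minimal sketch of that batching in PHP, reusing the $dbh PDO handle and the auths.db format from the answer above (the batch size of 100 is just a starting point):
$batch = array();
foreach (array_map('trim', file("auths.db")) as $line) {
    list($user, $pass, $ip, $last) = explode(':', $line);
    // quote each value, since a multi-row insert can't easily bind
    // one placeholder per field without generating the placeholders too
    $batch[] = '(' . $dbh->quote($user) . ',' . $dbh->quote($pass) . ','
             . $dbh->quote($ip) . ',' . $dbh->quote($last) . ')';
    if (count($batch) == 100) {
        $dbh->exec("INSERT INTO authdb (username,password,ip,lastlogin) VALUES " . implode(',', $batch));
        $batch = array();
    }
}
if ($batch) { // flush the remainder
    $dbh->exec("INSERT INTO authdb (username,password,ip,lastlogin) VALUES " . implode(',', $batch));
}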

Related

PHP/SQL - Uploading CSV to database - Different sheets daily/Multiple Tables

I asked a question yesterday that was unclear, so I've now expanded it slightly. In short, this project calls for a simple web interface where the user can upload a CSV file (this web page is created already). I've modified my PHP for a test file, but my situation calls for something different: every day, the user will upload 1 to 5 different CSV reports. These reports have about 110 fields/columns, though not all fields will be filled in every report.
I've created a database with 5 tables, each table covering different fields out of the 110. For instance, one table holds info on the water meters (25 fields) and another holds info on the tests done on the meters (45 fields). I'm having a hard time finding a way to take the CSV, once uploaded, and split the data into the different tables. I've heard of putting the whole CSV into one table and splitting it from there with INSERT statements, but I have questions about that:
Is there a way to put a CSV with 110 fields into one table without having the fields created in advance? Or would I have to create 110 fields in MySQL Workbench and then create a variable for each in PHP?
If not, would I be able to declare variables from the table dump so that the right data then goes into its correct table?
I'm not as familiar with CSVs uploaded like this (usually I'm just pulling a CSV from a folder with a known file name), so that's where my confusion comes from. Here is the PHP I've used as a simple test with only 10 columns. This was done to make sure the CSV upload works, which it does.
<?php
$server = "localhost";
$user = "root";
$pw = "root";
$db = "uwstest";

$connect = mysqli_connect($server, $user, $pw, $db);
if ($connect->connect_error) {
    die("Connection failed: " . $connect->connect_error);
}

if (isset($_POST['submit'])) {
    $file = $_FILES['file']['tmp_name'];
    $handle = fopen($file, "r");
    while (($filesop = fgetcsv($handle, 1000, ",")) !== false) {
        $one = $filesop[0];
        $two = $filesop[1];
        $three = $filesop[2];
        $four = $filesop[3];
        $five = $filesop[4];
        $six = $filesop[5];
        $seven = $filesop[6];
        $eight = $filesop[7];
        $nine = $filesop[8];
        $ten = $filesop[9];
        $sql = "INSERT INTO staging (One, Two, Three, Four, Five, Six, Seven, Eight, Nine, Ten) VALUES ('$one','$two','$three','$four','$five','$six','$seven','$eight','$nine','$ten')";
        // run the insert inside the loop so every row is written, not just the last
        if ($connect->query($sql) !== TRUE) {
            echo "Error: " . $sql . "<br>" . $connect->error;
        }
    }
    echo "Your database has imported successfully";
}
?>
Depending on the CSV size, you might want to consider using MySQL's native CSV import (LOAD DATA INFILE), since it runs 10x-100x faster; there's an example of it at the end of this answer.
If you do insist on importing row by row, then you can do something like this with PDO (or adapt it to mysqli).
If you want to match columns, either store your CSV rows as associative arrays, or parse the first row and store it in an array like $cols.
In this case, $result is an associative array that stores a row of the CSV as column_name => column_value:
$cols = implode(',', array_keys($result));
$vals = ':' . str_replace(',', ',:', $cols);
$inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");

foreach ($result as $k => $v) {
    $result[':' . $k] = is_null($v) ? null : utf8_encode($v);
    unset($result[$k]);
}
$inserter->execute($result);
hope this helps.
I suggest going with PDO, just to avoid all kinds of weirdness that you may encounter in the CSV's data.
This is how I would create the columns/vals:
$is_first = true;
$cols = '';
$vals = '';
$cols_array = array();

while (($csv = fgetcsv($handle)) !== false) {
    if ($is_first) {
        // first row holds the column names
        $cols_array = $csv;
        $cols = implode(',', $csv);
        $vals = ':' . str_replace(',', ',:', $cols);
        $is_first = false;
        continue;
    }
    // build the bound-parameter array for this row from the csv values
    $result = array();
    foreach ($csv as $k => $v) {
        $result[':' . $cols_array[$k]] = is_null($v) ? null : utf8_encode($v);
    }
    $inserter->execute($result);
}
Here is the code that I use for CSV imports:
$file = 'data/data.csv';
$handle = fopen($file, "r");
$path = realpath(dirname(__FILE__));
$full_path = $path . "/../../$file";

$headers = fgetcsv($handle, 10000, ","); // first line holds the column names

$alt_query = 'LOAD DATA LOCAL INFILE \'' . $full_path . '\' INTO TABLE mytable
    FIELDS TERMINATED BY \',\'
    ENCLOSED BY \'\"\'
    LINES TERMINATED BY \'\r\n\'
    IGNORE 1 LINES
    (' . implode(',', $headers) . ')';
echo exec("mysql -e \"USE mydb;$alt_query;\"", $output, $code);
Assuming the relation between the tables and the CSV columns is arbitrary but uniform from now on, you just need to establish that correspondence (array index -> table column) once, as in the sketch below.
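A minimal sketch of that idea, with hypothetical table and column names; the array keys are CSV column indexes, chosen once by hand:
// one mapping per target table: CSV index => column name
$map = array(
    'meters' => array(0 => 'meter_id', 3 => 'serial_no', 7 => 'install_date'),
    'tests'  => array(0 => 'meter_id', 12 => 'test_date', 13 => 'result'),
);
while (($csv = fgetcsv($handle)) !== false) {
    foreach ($map as $table => $columns) {
        // in real code, prepare these statements once outside the loop
        $cols = implode(',', $columns);
        $vals = rtrim(str_repeat('?,', count($columns)), ',');
        $stmt = $pdo->prepare("INSERT INTO $table ($cols) VALUES ($vals)");
        // pull only the mapped indexes out of this row, in mapping order
        $row = array();
        foreach (array_keys($columns) as $idx) {
            $row[] = $csv[$idx];
        }
        $stmt->execute($row);
    }
}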

More than 100 rows not inserting into PostgreSQL using PHP through a form

I'm trying to insert more than 100 rows into a PostgreSQL database using PHP in a loop, and I am not getting any errors.
The data gets inserted when I try to add around 50 to 60 records, but when it's around 100 records and above, nothing gets inserted.
Below is the code I tried. Please go through it and help me solve this issue.
Thanks in advance.
<?php
$userid = $_SESSION['user_id'];
$array = '';
$resultAgain = '';
if (isset($_POST['save'])) {
    // $sponsorship_id = $_POST['sponsorid'];
    $resultAgain = $_SESSION['arr_rows'];
    for ($i = 0; $i < count($resultAgain); ++$i) {
        $recieptid = $resultAgain[$i]['recieptid'];
        $childid = $resultAgain[$i]['childid'];
        $openingbalance_fee = $resultAgain[$i]['openingbalance_fee'];
        $openingbalance_familyhelp = $resultAgain[$i]['openingbalance_familyhelp'];
        $mayreciept = $resultAgain[$i]['mayreciept'];
        $december_reciept = $_POST['decreciept' . $resultAgain[$i]['presentclass']];
        $adminfees = $_POST['adminfees' . $resultAgain[$i]['presentclass']];
        $schoolfee = $_POST['schoolfee' . $resultAgain[$i]['presentclass']];
        $familyhelp = $resultAgain[$i]['family_help'];
        $year = $_POST['yearName'];
        $submit = $_POST['save'];
        // call insert function
        $sql1 = "SELECT fn_reciept_insert($childid,
            '$openingbalance_fee',
            '$openingbalance_familyhelp',
            '$mayreciept',
            '$december_reciept',
            '$adminfees',
            '$familyhelp',
            '$schoolfee',
            '$year',
            $userid,
            localtimestamp,
            $userid,
            localtimestamp)";
        $result1 = pg_query($dbconn, $sql1);
        if (!$result1) {
            echo '<script>alertMX("Data Not Updated")</script>';
        } else {
            echo '<script>alertMX("Data inserted Successfully")</script>';
        }
    }
}
?>
First, make sure you are closing that DB connection at the end of your script. Then, if you keep having problems, try increasing your max_connections: https://wiki.postgresql.org/wiki/Tuning_Your_PostgreSQL_Server#max_connections
ALTER SYSTEM SET max_connections TO 300;
and restart the PostgreSQL server.
Also, the problem may not be in the amount of data you try to INSERT. You are creating the query using unescaped data. What does that mean? If one of the variables you put into that big query string contains a ' character, the query string gets messed up, and that is also a big security problem (read about SQL injection). The bigger the amount of data, the bigger the chance a ' will appear. You should escape your data, or better, pass it as parameters.
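A minimal sketch of the parameterized version with pg_query_params, reusing the question's fn_reciept_insert call and $dbconn handle:
// values travel separately from the SQL text, so quotes in the data
// can no longer break the statement or inject SQL
$sql1 = 'SELECT fn_reciept_insert($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, localtimestamp, $11, localtimestamp)';
$result1 = pg_query_params($dbconn, $sql1, array(
    $childid, $openingbalance_fee, $openingbalance_familyhelp,
    $mayreciept, $december_reciept, $adminfees, $familyhelp,
    $schoolfee, $year, $userid, $userid));
if (!$result1) {
    echo '<script>alertMX("Data Not Updated")</script>';
}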

Multiple tabs, Multiple while

I'm running multiple PHP scripts that each have a while loop which inserts into and reads from a MySQL database.
It is a long-running process, so it takes up to 2 hours.
What I need to do is open the script in multiple tabs in the same browser.
When I do this, I can't open more than 6 tabs; any tab beyond the 6th just keeps loading and shows nothing.
When going to another browser it works, but when I reach 6 tabs there, it does the same.
Code :
<?php
ini_set('memory_limit', -1);
ob_implicit_flush(TRUE);
set_time_limit(0);

$sqlselect = "SELECT * FROM old_Users WHERE age < 18";
$content2 = mysqli_query($conn, $sqlselect);
while ($row = mysqli_fetch_assoc($content2)) {
    $sql = "INSERT INTO New_Table_Users (`first_name`,`last_name`,`ID`) VALUES ('" . $row["firstname"] . "','" . $row["lastname"] . "','" . $row["idd"] . "')";
    mysqli_query($conn, $sql);
}
?>
The problem is not RAM or CPU, because whenever I open a new browser it works fine; but when I try to open the 7th tab it just keeps loading...
So to open 12 tabs I would need 2 browsers, each with 6 tabs open...
Any help would be really appreciated.
Dagon's solution is the best, but in case you need to process stuff in PHP and still be able to insert at a fast pace:
Using PDO (sorry, I don't like mysqli, nor while) to do it faster than you breathe. This will insert all the data with very few queries (batches). It could even be done with only one insert for all of it.
WARNING: this technique is fast, but know your limits. It needs RAM, or you must lower the number of simultaneous inserts.
Depending on the size of what you are inserting, limit the number of rows per insert according to your RAM capacity. With 3 params, as you have (very, very few), batches of 10000 inserts sound reasonable. Try various sizes and see how your database and server handle it.
ini_set('memory_limit', -1);
set_time_limit(0);

$table = 'New_Table_Users'; // target table name
$nb_max_insert = 10000;     // maximum number of rows per INSERT
$age = 18;                  // age parameter

$stmt = $conn->prepare("SELECT * FROM old_Users WHERE age < ?");
$stmt->bindParam(1, $age, PDO::PARAM_INT); // prepare binder
try {
    $stmt->execute();
    $result = $stmt->fetchAll(PDO::FETCH_ASSOC);
} catch (PDOException $e) {
    var_dump('error main');
}

if (count($result) !== 0) {
    $data = array(); // extract only the needed columns; yes, the SELECT uses *. mheeeee
    foreach ($result as $el) {
        $row = array(
            'first_name' => $el['first_name'],
            'last_name'  => $el['last_name'],
            'ID'         => $el['ID'],
        );
        array_push($data, $row);
    }
    $batches = array_chunk($data, $nb_max_insert); // split data into batches
    foreach ($batches as $batch) {
        $question_marks = array(); // reset the accumulators for each batch
        $insert_values = array();
        foreach ($batch as $d) {
            // one "(?,?,?)" placeholder group per row
            $question_marks[] = '(' . implode(',', array_fill(0, count($d), '?')) . ')';
            $insert_values = array_merge($insert_values, array_values($d)); // what to insert
        }
        $sql = "INSERT INTO $table (" . implode(',', array_keys($row)) . ') VALUES ' . implode(',', $question_marks); // concat the query
        $stmt = $conn->prepare($sql);
        try {
            $stmt->execute($insert_values);
        } catch (PDOException $e) {
            var_dump('error batch');
        }
    }
}
Note: I am using this to insert millions of rows into huge tables, across PHP7 pthreads (12 CPU x 20 cores), reaching the server's limit of 1024 async connections with 3X 12Go RAID X4 SSD 1to. So I guess it should work for you too...

Unable to pass large array to mySQL database with PHP

I currently have a relatively large HTML form (100+ fields). I want to take the data from that form and upload it to a MySQL database when the user hits submit. I have created the PHP code below and have been slowly adding fields and testing whether the inserts succeed. Everything was working through $skilled_nursing, but when I added the next set of values, I am no longer successfully creating database entries. All of my echo commands are displayed and I am not getting failures in my error log, but the data is not arriving in the database.
Can anyone see what is going wrong? I have checked multiple times for spelling errors, but I haven't found any. I am wondering if I am somehow timing out the connection, or if I am trying to stick too many values into the execute command.
<?php
echo 'started ok';
// configuration
$dbtype = "mysql";
$dbhost = "localhost";
$dbname = "dbname";
$dbuser = "dbuser";
$dbpass = "userpass";
echo 'variables assigned ok';
// database connection
$conn = new PDO("mysql:host=$dbhost;dbname=$dbname",$dbuser,$dbpass);
echo 'connection established';
// new data
$facility_name = $_POST['facility_name'];
$facility_street = $_POST['facility_street'];
$facility_county = $_POST['facility_county'];
$facility_city = $_POST['facility_city'];
$facility_state = $_POST['facility_state'];
$facility_zipcode = $_POST['facility_zipcode'];
$facility_phone = $_POST['facility_phone'];
$facility_fax = $_POST['facility_fax'];
$facility_licensetype = $_POST['facility_licensetype'];
$facility_licensenumber = $_POST['facility_licensenumber'];
$facility_email = $_POST['facility_email'];
$facility_administrator = $_POST['facility_administrator'];
$skilled_nursing = $_POST['skilled_nursing'];
$independent_living = $_POST['independent_living'];
$assisted_living = $_POST['assisted_living'];
$memory_care = $_POST['memory_care'];
$facility_type_other = $_POST['facility_type_other'];
$care_ratio = $_POST['care_ratio'];
$nurse_ratio = $_POST['nurse_ratio'];
// query
$sql = "INSERT INTO Facilities (facility_name, facility_street, facility_county, facility_city, facility_state, facility_zipcode, facility_phone, facility_fax, facility_licensetype, facility_licensenumber, facility_email, facility_administrator, skilled_nursing, independent_living, assisted_living, memory_care, facility_type_other, care_ratio, nurse_ratio) VALUES (:facility_name, :facility_street, :facility_county, :facility_city, :facility_state, :facility_zipcode, :facility_phone, :facility_fax, :facility_licensetype, :facility_licensenumber, :facility_email, :facility_administrator, :skilled_nursing, :independent_living, :assisted_living, :memory_care, :facility_type_other, :care_ratio, :nurse_ratio)";
$q = $conn->prepare($sql);
$q->execute(array(':facility_state'=>$facility_name,
':facility_street'=>$facility_street,
':facility_county'=>$facility_county,
':facility_city'=>$facility_city,
':facility_state'=>$facility_state,
':facility_name'=>$facility_name,
':facility_zipcode'=>$facility_zipcode,
':facility_phone'=>$facility_phone,
':facility_fax'=>$facility_fax,
':facility_licensetype'=>$facility_licensetype,
':facility_licensenumber'=>$facility_licensenumber,
':facility_email'=>$facility_email,
':facility_administrator'=>$facility_administrator,
':skilled_nursing'=>$skilled_nursing,
':independent_living'=>$independent_living,
':assisted_living'=>$assisted_living,
':memory_care'=>$memory_care,
':facility_type_other'=>$facility_type_other,
':care_ratio'=>$care_ratio,
':nurse_ratio'=>$nurse_ratio));
echo 'query parsed';
?>
This doesn't exactly answer what's going wrong with your code, but it might help solve it.
I would do this a bit differently. You say that you have a lot of fields, so your code is likely to get very long and repetitive. Since it looks like your form field names already correspond to your table columns, I would do something more like this (not tested):
// get a list of column names that exist in the table
$sql = "SELECT column_name FROM information_schema.columns WHERE table_name = 'Facilities'";
$q = $conn->prepare($sql);
$q->execute();
$columns = $q->fetchAll(PDO::FETCH_COLUMN, 0);

// if a field is passed in that doesn't exist in the table, remove it
foreach ($_POST as $key => $value) {
    if (!in_array($key, $columns)) {
        unset($_POST[$key]);
    }
}

$cols = array_keys($_POST);
$sql = "INSERT INTO Facilities (" . implode(", ", $cols) . ") VALUES (:" . implode(", :", $cols) . ")";
$q = $conn->prepare($sql);

// array_walk() can't rewrite array keys, so build the parameter array explicitly
$params = array();
foreach ($_POST as $key => $value) {
    $params[":{$key}"] = $value;
}
$q->execute($params);
This way, you could have 10, 100, or 1000 fields and this code won't have to change at all. You also reduce the chance of typos, because there's only one place where each column name is specified. And you don't have to worry about SQL injection through the column names, because you check that a column exists before allowing it into your query.
This does, of course, assume that all fields passed in via $_POST correspond to column names in your table. If that isn't the case, it may be easiest to store the field values that aren't columns in separate variables and unset() them from the $_POST array.
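For example (csrf_token is a hypothetical non-column field here):
// keep the value around, but don't let it reach the INSERT
$csrf_token = $_POST['csrf_token'];
unset($_POST['csrf_token']);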

Creating a very large MySQL Database from PHP Script

Please bear with me on this question.
I'm looking to create a relatively large MySQL database that I want to use to do some performance testing. I'm using Ubuntu 11.04 by the way.
I want to create about 6 tables, each with about 50 million records. Each table will have about 10 columns. The data would just be random data.
However, I'm not sure how I can go about doing this. Do I use PHP and loop INSERT queries (bound to timeout)? Or if that is inefficient, is there a way I can do this via some command line utility or shell script?
I'd really appreciate some guidance.
Thanks in advance.
mysqlimport is what you want. Check this for full information. It's command-line and very fast.
Command-line mode usually has the timeouts disabled, since that's a protection against taking down a webserver, which doesn't apply at the command line.
You can do it from PHP, though generating "random" data will be costly. How random does this information have to be? You can easily read from /dev/urandom and get "garbage" (/dev/random is a source of better randomness, but it blocks when there isn't enough entropy available to make good garbage).
Just make sure that you have keys disabled on the tables, as keeping those up to date will be a major drag on your insert operations. You can add/enable the keys AFTER you've got your data set populated.
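For instance, a sketch of wrapping the load (note this is MyISAM behaviour; DISABLE KEYS only affects non-unique indexes, and InnoDB mostly ignores it, so there you'd drop and re-add the indexes instead):
mysql_query('ALTER TABLE `ATable` DISABLE KEYS') or die(mysql_error());
// ... run all the batched INSERTs here ...
mysql_query('ALTER TABLE `ATable` ENABLE KEYS') or die(mysql_error());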
If you do want to go the PHP way, you could do something like this:
<?php
// Edit following
$millionsOfRows = 2;
$InsertBatchSize = 1000;
$table = 'ATable';
$RandStrLength = 10;
$timeOut = 0; // set 0 for no timeout
$columns = array('col1', 'col2', 'etc');

// MySQL settings
$username = "root";
$password = "";
$database = "ADatabase";
$server = "localhost";

// Don't edit below
$letters = range('a', 'z');
$rows = $millionsOfRows * 1000000;
$colCount = count($columns);
$valueArray = array();

$con = @mysql_connect($server, $username, $password) or die('Error accessing database: ' . mysql_error());
@mysql_select_db($database) or die('Couldn\'t select database: ' . mysql_error());
set_time_limit($timeOut);

for ($i = 0; $i < $rows; $i++) {
    $values = array();
    for ($k = 0; $k < $colCount; $k++) {
        $values[] = RandomString();
    }
    $valueArray[] = "('" . implode("', '", $values) . "')";

    // flush a full batch
    if (count($valueArray) == $InsertBatchSize) {
        echo "--" . ($i + 1) / $InsertBatchSize . "--"; // progress marker
        $sql = "INSERT INTO `$table` (`" . implode('`,`', $columns) . "`) VALUES " . implode(',', $valueArray);
        mysql_query($sql) or die(mysql_error());
        $valueArray = array();
    }
}
// flush any remaining rows
if (count($valueArray)) {
    $sql = "INSERT INTO `$table` (`" . implode('`,`', $columns) . "`) VALUES " . implode(',', $valueArray);
    mysql_query($sql) or die(mysql_error());
}
mysql_close($con);

function RandomString()
{
    global $RandStrLength, $letters;
    $str = "";
    for ($i = 0; $i < $RandStrLength; $i++) {
        $str .= $letters[rand(0, 25)];
    }
    return $str;
}
Of course you could just use a created dataset, like the NorthWind Database.
All you need to do is launch your script from the command line like this:
php -q generator.php
It can then be a simple PHP file like this:
<?php
$fid = fopen("query.sql", "w");
fputs($fid, "create table a (id int not null auto_increment primary key, b int, c int);\n");
for ($i = 0; $i < 50000000; $i++) {
    fputs($fid, "insert into a (b,c) values (" . rand(0, 1000) . ", " . rand(0, 1000) . ");\n");
}
fclose($fid);
exec("mysql -u$user -p$password $db < query.sql");
It is probably fastest to run multiple inserts in one query, as in:
INSERT INTO `test` VALUES
(1,2,3,4,5,6,7,8,9,0),
(1,2,3,4,5,6,7,8,9,0),
.....
(1,2,3,4,5,6,7,8,9,0)
I created a PHP script to do this. First I tried to construct a query that would hold 1 million inserts, but it failed. Then I tried with 100 thousand and it failed again. 50 thousand didn't work either. My next try was with 10 000, and it works fine. I guess I was hitting the transfer limit from PHP to MySQL (probably max_allowed_packet). Here is the code:
<?php
set_time_limit(0);
ini_set('memory_limit', -1);

define('NUM_INSERTS_IN_QUERY', 10000);
define('NUM_QUERIES', 100);

// build query
$time = microtime(true);
$queries = array();
for ($i = 0; $i < NUM_QUERIES; $i++) {
    $queries[$i] = 'INSERT INTO `test` VALUES ';
    for ($j = 0; $j < NUM_INSERTS_IN_QUERY; $j++) {
        $queries[$i] .= '(1,2,3,4,5,6,7,8,9,0),';
    }
    $queries[$i] = rtrim($queries[$i], ',');
}
echo "Building query took " . (microtime(true) - $time) . " seconds\n";

mysql_connect('localhost', 'root', '') or die(mysql_error());
mysql_select_db('store') or die(mysql_error());
mysql_query('DELETE FROM `test`') or die(mysql_error());

// execute the query
$time = microtime(true);
for ($i = 0; $i < NUM_QUERIES; $i++) {
    mysql_query($queries[$i]) or die(mysql_error());
    // verify all rows inserted
    if (mysql_affected_rows() != NUM_INSERTS_IN_QUERY) {
        echo "ERROR: on run $i not all rows inserted (" . mysql_affected_rows() . ")\n";
        exit;
    }
}
echo "Executing query took " . (microtime(true) - $time) . " seconds\n";

$result = mysql_query('SELECT count(*) FROM `test`') or die(mysql_error());
$row = mysql_fetch_row($result);
echo "Total number of rows in table: {$row[0]}\n";
echo "Total memory used in bytes: " . memory_get_usage() . "\n";
?>
The results on my Win 7 dev machine are:
Building query took 0.30241012573242 seconds
Executing query took 5.6592788696289 seconds
Total number of rows in table: 1000000
Total memory used in bytes: 22396560
So 1 million inserts took about five and a half seconds. Then I ran it with these settings:
define('NUM_INSERTS_IN_QUERY', 1);
define('NUM_QUERIES', 1000000);
which basically does one insert per query. The results are:
Building query took 1.6551470756531 seconds
Executing query took 77.895285844803 seconds
Total number of rows in table: 1000000
Total memory used in bytes: 140579784
Then I tried to create a file with one insert per query in it, as suggested by @jancha. My code is slightly modified:
$fid = fopen("query.sql", "w");
fputs($fid, "use store;\n");
for ($i = 0; $i < 1000000; $i++) {
    fputs($fid, "insert into `test` values (1,2,3,4,5,6,7,8,9,0);\n");
}
fclose($fid);

$time = microtime(true);
exec("mysql -uroot < query.sql");
echo "Executing query took " . (microtime(true) - $time) . " seconds\n";
The result is:
Executing query took 79.207592964172 seconds
Same as executing the queries through PHP. So, the fastest way is probably to do multiple inserts in one query, and it shouldn't be a problem to use PHP to do the work.
Do I use PHP and loop INSERT queries (bound to timeout)?
Certainly, running long-duration scripts via a webserver-mediated request is not a good idea. But PHP can be compiled to run from the command line; in fact, most distributions of PHP come bundled with this.
There are lots of things you can do to make this run more efficiently; exactly which ones will vary depending on how you are populating the data set (e.g. once only vs. lots of batch additions). However, for a single load, you might want to have a look at the output of mysqldump (note the disabling/enabling of indexes and the multi-row insert lines) and recreate this in PHP rather than connecting directly to the database from PHP.
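A rough sketch of that approach, combining a generated dump file with disabled keys and multi-row inserts (the table, database, and row values are placeholders reused from the benchmark above):
$fid = fopen("bulk.sql", "w");
fputs($fid, "ALTER TABLE `test` DISABLE KEYS;\n");
for ($i = 0; $i < 1000; $i++) {
    // one extended-insert line per batch of rows, mysqldump-style
    fputs($fid, "INSERT INTO `test` VALUES (1,2,3,4,5,6,7,8,9,0),(1,2,3,4,5,6,7,8,9,0),(1,2,3,4,5,6,7,8,9,0);\n");
}
fputs($fid, "ALTER TABLE `test` ENABLE KEYS;\n");
fclose($fid);
exec("mysql -uroot store < bulk.sql");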
I see no point in this question, and especially in raising a bounty for it.
As they say, "the best is the enemy of the good".
You asked this question ten days ago.
If you'd just gone with whatever code you had, you'd have your tables already and would even be done with your tests. You lose so much time in vain; it's beyond my understanding.
As for the method you've been asking for (just to keep away all these self-appointed moderators), here are some statements as food for thought:
MySQL's own methods are considered more effective in general.
MySQL can insert all data from one table into another using the INSERT ... SELECT syntax, so you would need to run only about 30 queries to get your 50 mil records.
And of course MySQL can copy whole tables as well.
Keep in mind that there should be no indexes at the time of table creation.
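A minimal sketch of that doubling trick, assuming a hypothetical, index-free table big(b int, c int): each INSERT ... SELECT pass doubles the row count, so one seed row plus 26 passes yields 2^26, roughly 67 million rows.
mysql_query("INSERT INTO big (b, c) VALUES (1, 2)") or die(mysql_error());
for ($pass = 0; $pass < 26; $pass++) {
    // the table is selected into itself, doubling its size each pass
    mysql_query("INSERT INTO big (b, c) SELECT b, c FROM big") or die(mysql_error());
}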
I just want to point you to http://www.mysqldumper.net/, which is a tool that allows you to backup and restore big databases with PHP.
The script has some mechanisms to circumvent PHP's maximum execution time, so it's worth a look imo.
This is not a solution for generating data, but a great one for importing/exporting.
