Fast inserting into array/SQL - PHP

I have this code:
$i = -1;
$random_string = array();
while (sizeof($random_string) < 1600000) {
    $i++;
    $zmienna = generatePassword();
    if (!in_array($zmienna, $random_string))
        $random_string[$i] = $zmienna;
    else
        continue;
}
//print_r($random_string);
foreach ($random_string as $value) {
    $sql = "INSERT INTO `kody`(`kod`) VALUES ('$value')";
    mysql_query($sql, $con);
}
But it will take many hours to insert this into the database, or even to build the array. Does anyone know how to improve this code?

Well, in_array() is rather expensive. Use a hash instead of a simple array, and then you can use isset() instead of in_array().
Also, don't use things like sizeof() and count() as loop conditions. Instead, just use a simple for ($i = 0; $i < 1600000; ++$i) { ... } loop.
Depending on your web host permissions, another significant optimization would be to use fputcsv() to write your array to disk and then make use of MySQL's LOAD DATA INFILE to load the contents into your database, instead of generating 1.6 million queries.
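A rough sketch of that file-based approach, assuming the `kody` table from the question and a server/user that is allowed to use LOAD DATA LOCAL INFILE (the temporary file name is illustrative):
// Write the generated codes to a temporary CSV file, then bulk-load it.
// Assumes $random_string already holds the unique codes.
$file = tempnam(sys_get_temp_dir(), 'kody_');
$fp = fopen($file, 'w');
foreach ($random_string as $value) {
    fputcsv($fp, array($value));
}
fclose($fp);

$sql = "LOAD DATA LOCAL INFILE '" . addslashes($file) . "'
        INTO TABLE `kody`
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        (`kod`)";
mysql_query($sql, $con) or die(mysql_error());
unlink($file);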

Yes, use one query to insert all of them at once with an SQL multi-insert:
$values = "('" . implode( "'), ('", $random_string) . "')";
$sql="INSERT INTO `kody`(`kod`) VALUES " . $values;
mysql_query($sql,$con);
As drrcknlsn very correctly points out, in_array() is inefficient, as it performs a linear O(n) search on the array. Here is how you can fix that (which is a hash implementation):
while (sizeof($random_string) < 1600000) {
    $i++;
    $zmienna = generatePassword();
    if (!isset($random_string[$zmienna]))
        $random_string[$zmienna] = $zmienna;
    else
        continue;
}
Now, you can use the above code to generate a single SQL query, and this should run much, much faster.
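For example, combining the two ideas (a sketch only; it assumes generatePassword() produces characters that are safe to quote directly):
// Build one multi-row INSERT from the keys of the hash built above.
$values = "('" . implode("'), ('", array_keys($random_string)) . "')";
mysql_query("INSERT INTO `kody`(`kod`) VALUES " . $values, $con) or die(mysql_error());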

The problem is probably that it's trying to update the INDEX after each insert. Try using transactions: the INDEX will then only be updated once, when COMMIT is called. This will also let you ROLLBACK if something goes wrong.
mysql_query("SET AUTOCOMMIT=0");
mysql_query("START TRANSACTION");
foreach($random_string as $value)
{
$sql="INSERT INTO `kody`(`kod`) VALUES ('$value')";
mysql_query($sql,$con);
}
mysql_query("COMMIT");

Related

GROUP_CONCAT inside an INSERT statement

I am not really sure if this is possible, or if there is any alternative way to do this.
The following code takes multiple inputs from a user and inserts them into different rows; I don't want it like that. I am looking for a way to insert them into the same row.
<?php $test = $_POST['test'];
foreach ($test as $a => $b) { ?>
    <?php echo $test[$a]; ?>
    <?php
    if (!$error) {
        $query = "INSERT INTO test_tbl(test) VALUES('$test[$a]')";
        $output = mysql_query($query);
        if ($output) {
            $errTyp = "success";
            $errMSG = "Update Posted";
?>
I am aware there is a GROUP_CONCAT function, but I can't seem to get it to work in INSERT statements; from researching I found out it doesn't really work with INSERT, only with SELECT. So is there any way of sorting out this mess?
Here is my attempt with GROUP_CONCAT (obviously I receive a big error):
$query = "INSERT INTO testing(item) VALUES(GROUP_CONCAT($test[$a]))";
PS: I know this is completely against normalization standards, but I am doing it because a customer requested it.
If I understood correctly, you are inserting in a foreach loop, which inserts each $test[$a] separately each time the loop runs. Alternatively, you can put all the $test[$a] values in an array and then insert that array into one row (which is also much faster than multiple insert queries).
Here is a way to do it:
$all_test_values = array();
foreach ($test as $a => $b) {
    $all_test_values[] = $test[$a];
}
$comma_separated = implode(",", $all_test_values);
if (!$error) {
    $query = "INSERT INTO test_tbl(test) VALUES('$comma_separated')";
    $output = mysql_query($query);
}
PS: the mysql extension is deprecated; use mysqli instead.
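A minimal mysqli sketch of the same insert, using the $comma_separated value built above (the connection parameters are placeholders):
// Connection parameters are placeholders; the table and column come from the question.
$mysqli = new mysqli('localhost', 'user', 'pass', 'database');
$stmt = $mysqli->prepare("INSERT INTO test_tbl(test) VALUES (?)");
$stmt->bind_param('s', $comma_separated);
$stmt->execute();
$stmt->close();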

SQL statement inside loop with PHP, good idea?

I ran into the following question while writing a PHP script. I need to store the first two integers from an array of variable length into a database table, remove them, and repeat this until the array is empty. I could do it with a while loop, but I read that you should avoid writing SQL statements inside a loop because of the performance hit.
A simplified example:
while (count($array) > 0) {
    if ($sql = $db_connect->prepare("INSERT INTO table (number1, number2) VALUES (?,?)")) {
        $sql->bind_param('ii', $array[0], $array[1]);
        $sql->execute();
        $sql->close();
    }
    array_shift($array);
    array_shift($array);
}
Is this the best way, and if not, what's a better approach?
You can do something like this, which is also much faster:
Pseudo-code:
$stack = array();
while (count($array) > 0) {
    array_push($stack, "(" . $array[0] . ", " . $array[1] . ")");
    array_shift($array);
    array_shift($array);
}
if ($sql = $db_connect->prepare("INSERT INTO table (number1, number2)
                                 VALUES " . implode(',', $stack))) {
    $sql->execute();
    $sql->close();
}
The only issue here is that it's not a "MySQL-safe" insert; you will need to fix that (a sketch of one fix follows).
This generates an array that holds the values and then inserts them all at once in a single query, which needs less MySQL time.
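Since the question's values are integers, one way to make it safe is simply to cast them (a sketch; for strings you would use a prepared statement instead):
$stack = array();
while (count($array) > 0) {
    // casting to int makes the interpolated values safe to embed here
    array_push($stack, "(" . (int) $array[0] . ", " . (int) $array[1] . ")");
    array_shift($array);
    array_shift($array);
}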
Whether you run them one by one or in an array, an INSERT statement is not going to make a noticeable performance hit, from my experience.
The database connection is only opened once, so it is not a huge issue. I guess if you are doing some insane amount of queries, it could be.
I think as long as your loop condition is safe (it will break in time) and you get something out of it, it's OK.
You would be better off writing a bulk insert statement; fewer hits on MySQL:
$sql = "INSERT INTO table(number1, number2) VALUES";
$params = array();
foreach( $array as $item ) {
$sql .= "(?,?),\n";
$params[] = $item;
}
$sql = rtrim( $sql, ",\n" ) . ';';
$sql = $db_connect->prepare( $sql );
foreach( $params as $param ) {
$sql->bind_param( 'ii', $param[ 0 ], $param[ 1 ] );
}
$sql->execute();
$sql->close();
In ColdFusion you can put your loop inside the query instead of the other way around. I'm not a PHP programmer, but my general belief is that most things that can be done in language A can also be done in language B. This code shows the concept; you should be able to figure out a PHP version.
<cfquery>
    insert into mytable
    (field1, field2)
    select null, null
    from SomeSmallTable
    where 1=2
    <cfloop from="1" to="#arrayLen(myArray)#" index="i">
        union
        select <cfqueryparam value="#myArray[i][1]#">
             , <cfqueryparam value="#myArray[i][2]#">
        from SomeSmallTable
    </cfloop>
</cfquery>
When I've looked at this approach myself, I've found it to be faster than a query inside a loop with Oracle and SQL Server. I found it to be slower with RedBrick.
There is a limitation with this approach: SQL Server has a maximum number of parameters it will accept and a maximum query length. Other DB engines might as well; I just haven't run into them yet.
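A rough PHP rendering of the same "loop inside the query" idea, using the table and columns from the simplified example above (a sketch only; argument unpacking for bind_param requires PHP 5.6+):
// Build one INSERT ... SELECT ... UNION ALL statement with placeholders.
$selects = array();
$params = array();
foreach (array_chunk($array, 2) as $pair) {
    $selects[] = "SELECT ?, ?";
    $params[] = $pair[0];
    $params[] = $pair[1];
}
$sql = "INSERT INTO table (number1, number2) " . implode(" UNION ALL ", $selects);
if ($stmt = $db_connect->prepare($sql)) {
    $stmt->bind_param(str_repeat('i', count($params)), ...$params);
    $stmt->execute();
    $stmt->close();
}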

php request of mysql query timeout

I'm trying to run a long MySQL query, then process and update the rows found:
$query = 'SELECT tvshows.id_show, tvshows.actors FROM tvshows where tvshows.actors is not NULL';
$result = mysql_query($query);
$total = mysql_num_rows($result);
echo $total;
while ($db_row = mysql_fetch_assoc($result))
{
    //process row
}
but after 60 seconds it gives me a request timeout. I have tried adding this to my PHP code:
set_time_limit(400);
but it's the same. What can I do?
EDIT:
only the query:
$query = 'SELECT tvshows.id_show, tvshows.actors FROM tvshows where tvshows.actors is not NULL';
takes 2-3 seconds to perform, so I think the problem is in the PHP part, where I iterate over all the results to insert or update rows. How can I change the timeout?
EDIT:
here is the complete code; I don't think the problem is in the code...
$query = 'SELECT tvshows.id_show, tvshows.actors FROM tvshows where tvshows.actors is not NULL';
$result = mysql_query($query);
$total = mysql_num_rows($result);
echo $total;
while ($db_row = mysql_fetch_assoc($result)) {
    //print $db_row['id_show']."-".$db_row['actors']."<BR>";
    $explode = explode("|", $db_row['actors']);
    foreach ($explode as $value) {
        if ($value != "") {
            $checkactor = mysql_query(sprintf("SELECT id_actor,name FROM actors WHERE name = '%s'", mysql_real_escape_string($value))) or die(mysql_error());
            if (mysql_num_rows($checkactor) != 0) {
                $actorrow = mysql_fetch_row($checkactor);
                $checkrole = mysql_query(sprintf("SELECT id_show,id_actor FROM actor_role WHERE id_show = %d AND id_actor = %d", $db_row['id_show'], $actorrow[0])) or die(mysql_error());
                if (mysql_num_rows($checkrole) == 0) {
                    $insertactorrole = mysql_query(sprintf("INSERT INTO actor_role (id_show, id_actor) VALUES (%d, %d)", $db_row['id_show'], $actorrow[0])) or die(mysql_error());
                }
            } else {
                $insertactor = mysql_query(sprintf("INSERT INTO actors (name) VALUES ('%s')", mysql_real_escape_string($value))) or die(mysql_error());
                $insertactorrole = mysql_query(sprintf("INSERT INTO actor_role (id_show, id_actor, role) VALUES (%d, %d,'')", $db_row['id_show'], mysql_insert_id())) or die(mysql_error());
            }
        }
    }
}
You should definitely try what #rid suggested: execute the query on the server and check the results/duration to debug. If the query is not a simple one, construct it as you would in your PHP script, but only echo the SQL command instead of executing it, and paste that into the MySQL command line on the server or whichever tool you use.
If you have shell access, use the top command after running the above script again, and see if the MySQL daemon is spiking in resource usage, to check whether it really is the cause.
Can you also try a simpler query in place of the longer one? Like just a simple SELECT count(*) FROM tvshows and see if that also takes a long time to return a value?
Hope these suggestions help.
There are so many problems with your code.
Don't store multiple values in a single column. Your actors column is pipe-delimited text. This is a big no-no.
Use JOINs instead of additional queries. You can (or could, if the above weren't true) get all of this data in a single query.
All of your code can be done in a single query on the server. As I see it, it takes no input from the user and produces no output. It just updates a table. Why do this in PHP? Learn about INSERT...SELECT....
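For instance, the final "add missing actor_role rows" step could be collapsed into one INSERT ... SELECT. This is only a sketch: it assumes the pipe-delimited actors column has already been normalised into a hypothetical helper table show_actor(id_show, actor_name).
// show_actor(id_show, actor_name) is a hypothetical, already-normalised helper table;
// actors and actor_role come from the question.
$sql = "INSERT INTO actor_role (id_show, id_actor, role)
        SELECT sa.id_show, a.id_actor, ''
        FROM show_actor sa
        JOIN actors a ON a.name = sa.actor_name
        LEFT JOIN actor_role ar
               ON ar.id_show = sa.id_show AND ar.id_actor = a.id_actor
        WHERE ar.id_show IS NULL";
mysql_query($sql) or die(mysql_error());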
Here are some resources to get you started (from Googling, but hopefully they'll be good enough):
http://www.sitepoint.com/understanding-sql-joins-mysql-database/
http://dev.mysql.com/doc/refman/5.1/en/join.html
http://dev.mysql.com/doc/refman/5.1/en/insert-select.html
What is Normalisation (or Normalization)?
Let me know if you have any further questions.

Apache crashes in a loop of string concatenation

I have a script that opens a huge XLSX file and reads 3000 rows of data, saving it to a two-dimensional array. Of all places for Apache to crash, it does so in a simple loop that builds a MySQL query. I know this because if I remove the following lines from my application, it runs without issue:
$query = "INSERT INTO `map.lmds.dots` VALUES";
foreach($data as $i => $row)
{
$id = $row["Abonnementsid"];
$eier = $row["Eier"];
$status = $row["Status"];
if($i !== 0) $query .= "\n,";
$query .= "('$id', '$eier', '$status', '0', '0')";
}
echo $query;
I can't see a thing wrong with the code.
I'm using PHPExcel and dBug.php
Why is this script crashing Apache?
EDIT: Perhaps I should elaborate on what I mean by crash. I mean a classic Windows "Program has stopped working" dialog.
EDIT: Another attempt inspired by one of the answers. Apache still crashes:
$query = "INSERT INTO `map.lmds.dots` VALUES";
$records = array();
foreach($data as $i => &$row)
{
$id = $row["Abonnementsid"];
$eier = $row["Eier"];
$status = $row["Status"];
$records[] = "('$id', '$eier', '$status', '0', '0')";
}
echo $query . implode(",", $records);
EDIT: I have narrowed it down further. As soon as I add a foreach loop, Apache crashes.
foreach($data as $i => $row) {};
Like the other respondents have said, this is most likely a memory issue; you should check both your Apache error logs and your PHP error logs for more info.
Assuming this is a memory problem, I suggest you change your code so that you execute multiple INSERT statements inside the foreach loop rather than storing the whole thing in a big string and sending it to the database all at once. Of course, this means you're making 3000+ calls to the database rather than just one, and I'd expect that to be a bit slower. You can mitigate this by using a prepared statement, which should be a bit more efficient (see the sketch below). If this is still too slow, try changing your loop so that you only call the database every N times round the loop.
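A rough sketch of that per-row prepared-statement variant, assuming a mysqli connection in $db (the original script uses the old mysql extension, which has no prepared statements):
// $db is a hypothetical mysqli connection; the table and column keys come from the question.
$stmt = $db->prepare("INSERT INTO `map.lmds.dots` VALUES (?, ?, ?, '0', '0')");
foreach ($data as $row) {
    $stmt->bind_param('sss', $row["Abonnementsid"], $row["Eier"], $row["Status"]);
    $stmt->execute();
}
$stmt->close();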
The amount of string concatenation and the amount of string data involved could be too much to handle at once during the permitted execution time.
You could try to just collect the values in an array and put them together at the end:
$query = "INSERT INTO `map.lmds.dots` VALUES";
$records = array();
foreach($data as $i => $row) {
$records[] = "('".mysql_real_escape_string($row["Abonnementsid"])."', '".mysql_real_escape_string($row["Eier"])."', '".mysql_real_escape_string($row["Status"])."', '0', '0')";
}
$query .= implode("\n,", $records);
Or insert the records in chunks:
$query = "INSERT INTO `map.lmds.dots` VALUES";
$records = array();
foreach($data as $i => $row) {
$records[] = "('".mysql_real_escape_string($row["Abonnementsid"])."', '".mysql_real_escape_string($row["Eier"])."', '".mysql_real_escape_string($row["Status"])."', '0', '0')";
if ($i % 1000 === 999) {
mysql_query($query . implode("\n,", $records));
$records = array();
}
}
if (!empty($records)) {
mysql_query($query . implode("\n,", $records));
}
Also try using a reference in the foreach, to avoid an internal copy of the array being made:
foreach ($data as $i => &$row) {
    // …
}
This does sound like a memory issue. Maybe it has nothing to do with the loop building the SQL query; it could be related to reading the "very" large file before that, with the loop then pushing memory usage over the limit. Did you try freeing up memory after reading the file?
You can use memory_get_peak_usage() and memory_get_usage() to get some more info about consumed memory.
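For example, a minimal sketch:
// Log current and peak memory around the expensive steps (e.g. right after the XLSX read).
error_log(sprintf("after XLSX read: current %d bytes, peak %d bytes",
    memory_get_usage(true), memory_get_peak_usage(true)));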
If that doesn't solve your issue, install a debugger like Xdebug or Zend Debugger and do some profiling.
Alright, it turns out that updating PHP from 5.3.1 to 5.3.5 made the problem go away. I still have no idea as to what made PHP crash in the first place, but I suppose my PHP could simply have been broken and in need of a reinstall.

PHP foreach insert statement issue with arrays

Hey guys, I'm currently learning PHP and I need to do this:
$connection = mysql_open();
$likes = array();
foreach ($likes as $like)
{
    $insert3 = "insert into ProfileInterests " .
               "values ('$id', '$like', null)";
    $result3 = @ mysql_query($insert3, $connection)
        or showerror();
}
mysql_close($connection)
    or showerror();
For some reason this does not work =/ and I don't know why. $likes is an array which came from user input. I need it to insert into the table multiple times until all of the things in the array are in.
EDIT: I fixed the issue where I was closing the connection in my foreach loop. mysql_open is my own function, by the way.
Any ideas?
For one, $likes is an empty array in your example; I am assuming you fix that in the code you actually run.
The second thing is that you close the MySQL connection the first time the loop runs, which would prevent subsequent MySQL queries from running.
There's no such function as mysql_open; you may need mysql_connect.
Also, the $likes variable is empty, so no foreach iterations will execute.
You close the connection within the foreach loop.
Here is properly formatted code to insert data; you can use this.
// DATABASE CONNECTION
$conn = mysql_connect(HOST, USER, PASS);
$link = mysql_select_db(DATABASE_NAME, $conn);

// Function to insert data; here $tableName is the name of the table and
// $valuesArray is an array of user input (column => value).
function insertData($tableName, $valuesArray) {
    $sqlInsert = "";
    $sqlValues = "";
    $arrayKeys = array_keys($valuesArray);
    for ($i = 0; $i < count($arrayKeys); $i++)
    {
        $sqlInsert .= $arrayKeys[$i] . ",";
        $sqlValues .= '"' . $valuesArray[$arrayKeys[$i]] . '",';
    }
    if ($sqlInsert != "")
    {
        $sqlInsert = substr($sqlInsert, 0, strlen($sqlInsert) - 1);
        $sqlValues = substr($sqlValues, 0, strlen($sqlValues) - 1);
    }
    $sSql = "INSERT INTO $tableName ($sqlInsert) VALUES ($sqlValues)";
    $inser_general_result = mysql_query($sSql) or die(mysql_error());
    $lastID = mysql_insert_id();
    $_false = "0";
    $_true = "1";
    if (mysql_affected_rows() == '0')
    {
        return $_false;
    }
    else
    {
        return $lastID;
    }
}
// End of function
While many PHP newbies (myself included) begin working with databases through good ole' mysql_connect/query/etc., I can't help but suggest that you look into PDO (PHP Data Objects). Depending on your prior knowledge and programming background, there may be a steeper learning curve. However, it's much more powerful and extensible; I use PDO for all my production database code now.
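A minimal PDO sketch of the insert from the question (the DSN and credentials are placeholders):
// DSN and credentials are placeholders; the table and values come from the question.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $pdo->prepare("INSERT INTO ProfileInterests VALUES (?, ?, NULL)");
foreach ($likes as $like) {
    $stmt->execute(array($id, $like));
}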
