I run into a memory limit error using fetchAll, so I'm trying to use fetch instead, but I can't find a way to do it. Any suggestions? Where/how should I use a while loop instead of the foreach?
Here is the original code:
// We get all the data from the table
$Qselect = $pdo->prepare('SELECT * FROM '.$table_to_export);
$Qselect->execute();
$results = $Qselect->fetchAll(PDO::FETCH_ASSOC); // Here is the problem
$countRmain = $Qselect->rowCount();
// We get the column names
$Qdescribe = $pdo->prepare('DESCRIBE '.$table_to_export);
$Qdescribe->execute();
$limit = $Qdescribe->rowCount()-1; // Number of columns in the table
$table_fields = $Qdescribe->fetchAll(PDO::FETCH_COLUMN); // No problem here
foreach($table_fields as $key => $fields){
    $outputCsv .= trim($fields).';';
}
// We remove the ; at the end
$outputCsv = rtrim($outputCsv, ';');
// Data
$outputCsv .= "\n";
if($countRmain > 0){
    foreach($results as $row){
        $column = 0;
        foreach ($row as $key => $val){
            if (is_null($val)){
                $outputCsv .= 'NULL;'; // If the field is NULL, we write NULL
            }
            else {
                $outputCsv .= $val.';'; // We add the value to the output
            }
            if ($column == $limit)
                $outputCsv .= "\n";
            $column++;
        }
    }
}
else
    exit('No data to export');
I tried wrapping the inner loop in while($results = $Qselect->fetch()){, but that takes a really long time (10 min for 50,000 rows).
PS: if I increase the PHP memory_limit, it works with fetchAll, but I don't want that solution.
Try this one:
1. Comment out the fetchAll() line; you no longer need the $results array.
2. Replace the outer foreach($results as $row){ with:
while($row = $Qselect->fetch(PDO::FETCH_ASSOC)){
3. Skip (or rework) the $countRmain > 0 check.
The idea is simple: you still execute the query as before, but you don't load the whole result set into an array; you load the records step by step. That shouldn't be slow even for 1M rows, because you are swapping one loop over the rows for another loop over the same rows.
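Putting those steps together, a minimal sketch of the rewritten data loop (reusing $Qselect, $limit and $outputCsv from the question):
while ($row = $Qselect->fetch(PDO::FETCH_ASSOC)) { // one row in memory at a time
    $column = 0;
    foreach ($row as $val) {
        $outputCsv .= is_null($val) ? 'NULL;' : $val . ';'; // same NULL handling as before
        if ($column == $limit) {
            $outputCsv .= "\n"; // end of the CSV row
        }
        $column++;
    }
}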
If you still have timing problems, try reorganizing/optimizing your code.
I have also encountered this problem.
Increase the following settings so that your page execution will not stop:
max_input_time
memory_limit
max_execution_time
or you can use
while($row = $Qselect->fetch(PDO::FETCH_ASSOC)){
in place of fetchAll.
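Those settings can be raised in php.ini or at runtime near the top of the script; a minimal sketch (the values here are arbitrary examples, not recommendations):
ini_set('memory_limit', '512M');   // example value
ini_set('max_input_time', '300');  // example value, in seconds
set_time_limit(0);                 // lifts the max_execution_time cap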
I am dealing with 700 rows of data in my Excel file, and in one column I add this entry:
foreach($data as $k => $v){
    $users->getCell('A'.$k)->setValue($v['Username']);
    $users->setCellValueExplicit('B'.$k,
        '=INDEX(\'Feed\'!H2:H'.$lastRow.',MATCH(A'.$k.',\'Feed\'!G2:G'.$lastRow.',0))',
        PHPExcel_Cell_DataType::TYPE_FORMULA);
}
$users stands for a spreadsheet.
I see that writing 700 cells with the above setCellValueExplicit() takes more than 2 minutes to process; if I omit that line, the same machine processes it in 4 seconds.
Two minutes may be OK, but what if I have 2000 cells? Is there any way this can be optimized for speed?
PS: =VLOOKUP is just as slow as the above function.
Update
The whole idea of the script:
read a CSV file (13 columns and at least 100 rows), write it into a spreadsheet, create a new spreadsheet ($users), read two columns, sort them based on one column, and write them to the $users spreadsheet.
Read the columns:
$data = array();
for ($i = 1; $i <= $lastRow; $i++) {
    $user = $Feed->getCell('G'.$i)->getValue();
    $number = $Feed->getCell('H'.$i)->getValue();
    $row = array('User' => $user, 'Number' => $number);
    array_push($data, $row);
}
Sort the data
function cmpb($a, $b){
    // decide which number is greater, or 0 if both are the same
    if($a['Number'] > $b['Number']){
        $cmpb = -1;
    }elseif($a['Number'] < $b['Number']){
        $cmpb = 1;
    }else{
        $cmpb = 0;
    }
    // if the numbers are the same, compare the user names
    if($cmpb == 0){
        $cmpb = strcasecmp($a['User'], $b['User']);
    }
    return $cmpb;
}
usort($data, 'cmpb');
Write data
foreach($data as $k => $v){
    $users->getCell('A'.$k)->setValue($v['Username']);
    $users->getCell("B{$k}")->setValueExplicit("=INDEX('Feed'!H2:H{$lastRow},MATCH(A{$k},'Feed'!G2:G{$lastRow},0))",
        PHPExcel_Cell_DataType::TYPE_FORMULA);
}
and also unset the data to free memory:
unset($data);
So if I comment out the line with setValueExplicit, everything becomes smoother.
Looking at PHPExcel's source code, this is the PHPExcel_Worksheet::setCellValueExplicitByColumnAndRow() function:
public function setCellValueExplicitByColumnAndRow($pColumn = 0, $pRow = 1, $pValue = null, $pDataType = PHPExcel_Cell_DataType::TYPE_STRING)
{
    return $this->getCell(PHPExcel_Cell::stringFromColumnIndex($pColumn) . $pRow)->setValueExplicit($pValue, $pDataType);
}
For the data type you're using, PHPExcel_Cell_DataType::TYPE_FORMULA, the PHPExcel_Cell::setValueExplicit function just executes:
case PHPExcel_Cell_DataType::TYPE_FORMULA:
    $this->_value = (string)$pValue;
    break;
I can't find a logical explanation for the hold-up in the execution of that particular instruction. Try replacing it with the following and let me know if there is any improvement:
$users ->getCell("B{$k}")->setValueExplicit("=INDEX('Feed'!H2:H{$lastRow},MATCH(A{$k},'Feed'!G2:G{$lastRow},0))", PHPExcel_Cell_DataType::TYPE_FORMULA);
As a last resort, my advice would be to time the execution of that instruction to find the bottleneck.
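For example, a rough way to time just the write loop with plain microtime() (nothing PHPExcel-specific; note that the $data rows built earlier use the 'User' key):
$start = microtime(true);
foreach ($data as $k => $v) {
    $users->getCell('A'.$k)->setValue($v['User']); // 'User' is the key used when $data was built
    $users->getCell("B{$k}")->setValueExplicit("=INDEX('Feed'!H2:H{$lastRow},MATCH(A{$k},'Feed'!G2:G{$lastRow},0))",
        PHPExcel_Cell_DataType::TYPE_FORMULA);
}
echo 'Write loop took ' . round(microtime(true) - $start, 3) . " seconds\n";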
I have an array that can hold several thousand items (usually around 5000).
I need to split this array into batches of hundreds, process each batch, and then continue with the rest of the items.
So far I process the whole array at once, which is slow.
My code is
foreach($result as $p){
    $sqlQuery = mysql_query("INSERT INTO message_details(contact, message_id) VALUES('$p', $message_last_id)");
    $last_id = mysql_insert_id();
    $xmlString .= "<gsm messageId=\"$last_id\">".$p."</gsm>";
    $cnt++;
}
How can I process the items in the array in hundreds, e.g. 100, then 200, then 300, etc.?
Best Regards,
Nicos
Maybe you might try it this way:
First select the last ID of message_details:
$sqlQuery = mysql_query("SELECT %last_id_col_name% FROM message_details ORDER BY %last_id_col_name% DESC LIMIT 1");
then:
$sInserts = '';
foreach($result as $p){
    $sInserts .= "('{$p}', {$message_last_id}),";
}
// Remove the last "," from the insert string
$sInserts = substr($sInserts, 0, -1);
// Insert everything with one query
$sqlQuery = mysql_query("INSERT INTO message_details(contact, message_id) VALUES ".$sInserts);
Then select all entries from that table where the ID is greater than the one you selected first, and write to
$xmlString .= '<gsm messageId="'.$last_id.'">'.$p.'</gsm>';
If you do it that way, you only need 3 DB queries instead of thousands.
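A rough end-to-end sketch of that three-query approach; the auto-increment column name used here (message_details_id) is a placeholder for whatever your ID column is actually called:
// 1. Remember the highest existing ID
$res = mysql_query("SELECT message_details_id FROM message_details ORDER BY message_details_id DESC LIMIT 1");
$lastKnownId = ($r = mysql_fetch_row($res)) ? (int)$r[0] : 0;
// 2. Insert all contacts with one multi-row statement
$sInserts = '';
foreach ($result as $p) {
    $sInserts .= "('" . mysql_real_escape_string($p) . "', {$message_last_id}),";
}
mysql_query("INSERT INTO message_details(contact, message_id) VALUES " . substr($sInserts, 0, -1));
// 3. Read the new rows back with their generated IDs and build the XML
$res = mysql_query("SELECT message_details_id, contact FROM message_details WHERE message_details_id > {$lastKnownId}");
while ($r = mysql_fetch_assoc($res)) {
    $xmlString .= '<gsm messageId="' . $r['message_details_id'] . '">' . $r['contact'] . '</gsm>';
}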
foreach($result as $p){
    $sqlQuery = mysql_query("INSERT INTO message_details(contact, message_id) VALUES('$p', $message_last_id)");
    $last_id = mysql_insert_id();
    $xmlString .= "<gsm messageId=\"$last_id\">".$p."</gsm>";
    $cnt++;
    if ($cnt % 1000 == 0) { usleep(100); } // pause briefly after every 1000 inserts
}
You can use usleep() or sleep() depending on what you want. I've used this before; try it and see whether it helps.
If you have to do it in code, you can use the PHP function array_chunk() to split your array into arrays of 100 elements each; see the array_chunk documentation. But as pointed out, this is unlikely to be the bottleneck.
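For completeness, a minimal sketch of the array_chunk() approach applied to the question's loop (batch size 100, as asked; the per-batch work is up to you):
foreach (array_chunk($result, 100) as $batch) {
    foreach ($batch as $p) {
        mysql_query("INSERT INTO message_details(contact, message_id) VALUES('$p', $message_last_id)");
        $last_id = mysql_insert_id();
        $xmlString .= "<gsm messageId=\"$last_id\">" . $p . "</gsm>";
        $cnt++;
    }
    // per-batch work goes here: sleep, flush output, log progress, etc.
}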
How can I get the number of rows from a SELECT statement result set using the Advantage Database PHP Extension?
I ended up writing my own function that works similarly to mysql_num_rows():
function my_num_rows($result) {
    ob_start(); // begin capturing the output that ads_result_all would print
    $number = (int) ads_result_all($result);
    ob_end_clean(); // close and discard the output buffer
    ads_fetch_row($result, 0); // reset the result set pointer to the beginning
    if ($number >= 0){
        return $number;
    } else {
        return FALSE;
    }
}
It could also be rewritten to count the rows using ads_fetch_row, but this was easier for what I needed. With large result sets, ads_result_all could be slower.
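Usage would look something like this (assuming $rStmt is a statement handle returned by ads_do(), as in the answer below):
$rStmt = ads_do($rConn, "SELECT id, name FROM table1");
$rows = my_num_rows($rStmt);
echo ($rows !== FALSE) ? "Rows: " . $rows . "\n" : "Could not determine row count\n";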
You will have to count the rows as they are fetched (see KB item 070618-1888), or you can execute a second query with the COUNT() scalar (I suggest excluding the ORDER BY if possible).
Here is an example of counting as you go:
$rStmt = ads_do($rConn, "select id, name from table1");
$RowCount = 0;
while (ads_fetch_row($rStmt))
{
    $id = ads_result($rStmt, "id");
    $name = ads_result($rStmt, "name");
    echo $id . "\t" . $name . "\n";
    $RowCount++;
}
echo "RowCount: " . $RowCount . "\n";
As far as I know, in version 12 you have the function ads_num_rows(); please see the official documentation for usage.
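Assuming it takes the statement handle like the other ads_* functions (I haven't verified the exact signature, so check the documentation first):
// assumed usage of ads_num_rows(); verify against the Advantage v12 docs
$rStmt = ads_do($rConn, "SELECT id, name FROM table1");
echo "RowCount: " . ads_num_rows($rStmt) . "\n";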
I'm having some issues returning values from a server with PHP + MySQL.
This is my code:
$result = mysql_query("SELECT * FROM Nicknames", $con);
if (mysql_real_escape_string($_POST['Create']) == "NICKNAME") {
    $output;
    while ($row = mysql_fetch_assoc($result)) {
        if ($row['Taken'] == '0') $output = $output . $row['Nickname'] . ",";
    }
    echo substr($output, 0, -1);
}
If I add break; in the while loop, it works perfectly and I get just one row of my table.
If instead I try to return all 3000 rows, I just get an empty answer from the server.
If the table has 10 rows, it works.
I was wondering whether it is about the number of rows, or possibly about special characters.
Thanks
UPDATE
It works up to 1330 rows; if I try to get more, I get an empty result:
$counter = 0;
while ($row = mysql_fetch_assoc($result)) {
    if ($row['Taken'] == '0') $output = $output . $row['Nickname'] . ",";
    if ($counter == 1330) break;
    $counter++;
}
echo substr($output, 0, -1);
Somewhere in the middle of your table's rows there may be an invalid character.
Since you know which row it stops working at, try running the SELECT with different ORDER BY clauses to determine whether this is the case. :)
Have you tried showing error messages?
error_reporting(E_ALL|E_STRICT);
ini_set('display_errors', true);
It is possible that $output is filling the memory_limit before reaching your echo, and if display_errors is false, you won't see the out-of-memory error.
Maybe it's too obvious, but have you checked the max_execution_time value in your php.ini?
Maybe the script is dying before finishing the 3000-row fetch.
Try setting
set_time_limit(0);
at the beginning of the script to avoid this problem.
The bare $output; after the if condition should be $output = "";
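In other words, initialize the variable before the loop; a minimal sketch of the corrected block:
if (mysql_real_escape_string($_POST['Create']) == "NICKNAME") {
    $output = ''; // initialize, so concatenation doesn't start from an undefined variable
    while ($row = mysql_fetch_assoc($result)) {
        if ($row['Taken'] == '0') {
            $output .= $row['Nickname'] . ',';
        }
    }
    echo substr($output, 0, -1);
}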
I'm trying to convert some MySQL queries to MySQLi, but I'm having an issue. Below is the part of the script I'm having trouble with; it turns a query result into a CSV:
$columns = (($___mysqli_tmp = mysqli_num_fields($result)) ? $___mysqli_tmp : false);
// Build a header row using the mysql field names
$rowe = mysqli_fetch_assoc($result);
$acolumns = array_keys($rowe);
$csvstring = '"=""' . implode('""","=""', $acolumns) . '"""';
$header_row = $csvstring;
// Below was used for MySQL, Above was added for MySQLi
//$header_row = '';
//for ($i = 0; $i < $columns; $i++) {
// $column_title = $file["csv_contain"] . stripslashes(mysql_field_name($result, $i)) . $file["csv_contain"];
// $column_title .= ($i < $columns-1) ? $file["csv_separate"] : '';
// $header_row .= $column_title;
// }
$csv_file .= $header_row . $file["csv_end_row"]; // add header row to CSV file
// Build the data rows by walking through the results array one row at a time
$data_rows = '';
while ($row = mysqli_fetch_array($result)) {
    for ($i = 0; $i < $columns; $i++) {
        // clean up the data; strip slashes; replace double quotes with two single quotes
        $data_rows .= $file["csv_contain"] . $file["csv_equ"] . $file["csv_contain"] . $file["csv_contain"] . preg_replace('/'.$file["csv_contain"].'/', $file["csv_contain"].$file["csv_contain"], stripslashes($row[$i])) . $file["csv_contain"] . $file["csv_contain"] . $file["csv_contain"];
        $data_rows .= ($i < $columns-1) ? $file["csv_separate"] : '';
    }
    $data_rows .= $this->csv_end_row; // add data row to CSV file
}
$csv_file .= $data_rows; // add the data rows to CSV file
if ($this->debugFlag) {
    echo "Step 4 (repeats for each attachment): CSV file built. \n\n";
}
// Return the completed file
return $csv_file;
The problem I'm having is building the header row of column titles: MySQLi doesn't have mysql_field_name(), so I'm fetching the column titles with mysqli_fetch_assoc() and then implode()-ing the array, adding the separators etc. for the CSV.
This works, but when I produce the CSV I lose the first data row whenever the header is active; when I remove the header part of the script and leave the header null, I get all the data rows and a blank header (as expected). So I must be missing something when joining my header array to $csv_file.
Can anyone point me in the right direction?
Many thanks,
Ben
A third alternative is to refactor the loop body as a function, then also call this function on the first row before entering the loop. You can use fputcsv as this function.
$csv_stream = fopen('php://temp', 'r+');
if ($row = $result->fetch_assoc()) {
    fputcsv($csv_stream, array_keys($row)); // header from the first row's keys
    fputcsv($csv_stream, $row);             // ...then the first row's data, so it isn't lost
    while ($row = $result->fetch_row()) {
        fputcsv($csv_stream, $row);
    }
    fseek($csv_stream, 0);
}
$csv_data = stream_get_contents($csv_stream);
if ($this->debugFlag) {
    echo "Step 4 (repeats for each attachment): CSV file built. \n\n";
}
// Return the completed file
return $csv_data;
This basically does the same thing as a do ... while loop, which would make more sense to use here. I bring up this alternative to present the loop-body refactoring technique, which can be used when a different kind of loop doesn't make sense.
Best of all would be to use both mysqli_result::fetch_fields and fputcsv:
$csv_stream = fopen('php://temp', 'r+');
$fields = $result->fetch_fields();
foreach ($fields as &$field) {
    $field = $field->name; // reduce each field-metadata object to its name
}
fputcsv($csv_stream, $fields);
while ($row = $result->fetch_row()) {
    fputcsv($csv_stream, $row);
}
fseek($csv_stream, 0);
$csv_data = stream_get_contents($csv_stream);
if ($this->debugFlag) {
    echo "Step 4 (repeats for each attachment): CSV file built. \n\n";
}
// Return the completed file
return $csv_data;
If you can require PHP to be at least version 5.3, you can replace the foreach that generates the header line with a call to array_map. There admittedly isn't much advantage to this; I just find the functional approach more interesting.
fputcsv($csv_stream,
    array_map(function($field) { return $field->name; },
        $result->fetch_fields()));
As you observe, you're using the first row to obtain the field names but then not using the data from the row. Evidently, you need to change your code so that you get both of those things.
There are a number of ways you might do this. The most appropriate one is to use mysqli_fetch_fields() instead to get the field metadata from the result object.
http://www.php.net/manual/en/mysqli-result.fetch-fields.php
Alternatively, you could make the loop lower down in the code a do... while instead of a while.
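A rough sketch of that do ... while variant, reusing the first row that was already fetched for the header (the row-building details are elided, as in the original loop):
$rowe = mysqli_fetch_assoc($result);
$header_row = '"=""' . implode('""","=""', array_keys($rowe)) . '"""';
$csv_file .= $header_row . $file["csv_end_row"];
do {
    // ... build one data row from $rowe, as in the existing for loop ...
    // note: fetch_assoc gives string keys, so use array_values($rowe) if you index columns numerically
} while ($rowe = mysqli_fetch_assoc($result));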