I made a method on my class to fetch all the results of a given SQL statement and store them in an array, and it works just fine. Now, after some months in production, I came across this error:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 3503272 bytes) in C:\xxpathxx\class.DGC_Gerais.php on line 0
As I began to inspect, I tracked the error to mysql_free_result(), but even after commenting out that line it still doesn't work.
Here's the _fetch method:
PUBLIC static function _fetch($sql){
    $q = mysql_query($sql);
    if(mysql_num_rows($q) > 0):
        for($a=0 ; $linha = mysql_fetch_row($q) ; $a++){ // forEach row.
            foreach($linha as $chave => $valor): // forEach field.
                $result[$a][$chave] = $valor;
            endforeach;
        }
        // mysql_free_result($q);
        return $result;
    else:
        return false;
    endif;
}
That code is extremely convoluted and can be simplified to:
public static function _fetch($sql) {
    $q = mysql_query($sql);
    if (mysql_num_rows($q) == 0) {
        return false;
    }
    $result = array();
    while ($linha = mysql_fetch_row($q)) {
        $result[] = $linha;
    }
    return $result;
}
It does exactly the same thing, without the double loop.
The problem is that you're fetching all that data from the database and storing it in $result, which means it all has to fit in memory at once. PHP limits the amount of memory available to a script by default, and you're simply exceeding that limit; mysql_free_result() has nothing to do with it. First try to fetch less data, or to process each row inside that while loop without storing everything in an array.
If that doesn't help, carefully turn up the memory limit with ini_set('memory_limit', $limit).
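For example, here is a minimal sketch of the row-by-row approach (process_rows(), the callback and the query are hypothetical, standing in for whatever you actually do with each record):
function process_rows($sql, $callback) {
    $q = mysql_query($sql);
    if (!$q) {
        return false;
    }
    while ($linha = mysql_fetch_assoc($q)) {
        $callback($linha); // handle one row, then let it go
    }
    mysql_free_result($q);
    return true;
}
// Hypothetical usage: echo each row instead of storing them all.
process_rows('SELECT * FROM some_table', function ($row) {
    echo implode(' | ', $row), "\n";
});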
Related
I have the PHP code below to generate a set of unique random 12-digit numbers (the set size ranges from 100,000 to a million) and save them in the db. I first fetch the existing codes from the MySQL db (right now there are already a million of them), flip the array, and generate new codes. Later I use array_diff_key and array_keys on $random and $existingRandom to get the new codes, which are then saved back to the db.
// total labels is the number of new codes to generate
//$totalLabels is fetched as user input and could range from 100000 to a million
$codeObject = new Codes();
//fetch existing codes from db
$codesfromDB = $codeObject->getAllCodes();
$existingRandom = $random = array();
$existingRandom = $random = array_flip($codesfromDB);
$existingCount = count($random); //The codes you already have
do {
    $random[mt_rand(100000000000,999999999999)] = 1;
} while ((count($random)-$existingCount) < $totalLabels);
$newCodes = array_diff_key($random,$existingRandom);
$newCodes = array_keys($newCodes);
The issue I am facing is that the array_flip function is running out of memory and crashing my program with this error:
"Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 72 bytes)"
My questions are below:
1) Can someone help me understand why array_flip is running out of memory? The memory limit in the php.ini file is 256M. Please show me the calculation of the memory used by the function if possible. (Also, if array_flip passes, array_diff_key and array_keys run out of memory.)
2) How do I optimize the code so that the memory used stays under the limit? I even tried to break the array_flip operation into smaller chunks, but even that runs out of memory:
$size = 5000;
$array_chunk = array_chunk($codesfromDB, $size);
foreach($array_chunk as $values){
    $existingRandom[] = $random[] = array_flip($values);
}
3) Is what I am doing optimal? Would it be fair to further increase the memory limit in the php.ini file? What are the things to keep in mind while doing that?
Here is my query as well, to fetch the existing codes from the db, if needed:
$sql = "SELECT codes FROM code";
$stmt = $this->db->prepare($sql);
$stmt->execute();
$result = $stmt->fetchAll(PDO::FETCH_COLUMN, 0);
return $result;
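A rough back-of-the-envelope for question 1, assuming a 64-bit PHP 5 build where each array element costs somewhere around 100–150 bytes (the exact figure varies by version and value type):
// $codesfromDB from fetchAll():     ~1,000,000 elements  -> roughly 100-150 MB
// array_flip($codesfromDB) result:  ~1,000,000 elements  -> another 100-150 MB,
//                                   built while $codesfromDB is still in memory
// The first write to $random inside the do/while then breaks copy-on-write and
// duplicates the flipped array yet again.
// So peak usage can pass the 256M limit during array_flip itself, which matches the error.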
I have a problem with an error about the maximum size of arrays in PHP: $ids hits a limit and I don't know how to resolve it.
I know array size is limited only by the amount of memory the server has, but I get an "out of memory" error and I can't change php.ini.
I need to optimize these functions. Any ideas?
function delete_ScormByIdPlataforma($idPlatforma)
{
    if ($this->getIdScormVarsToDelete($idPlatforma) != 0)
    {
        $ids = $this->getIdScormVarsToDelete($idPlatforma);
        $this->db->where_in('ID_dispatch', $ids);
        $this->db->delete('scormvars');
    }//else
        //log_message('error', 'No se han encontrado scorms a borrar'.$this->db->_error_message().' - '.$this->db->last_query());
}

function getIdScormVarsToDelete($idPlataforma)
{
    $this->db->select('s.ID_dispatch');
    $this->db->from('scormvars as s');
    $this->db->join('dispatch as d', 's.ID_dispatch = d.ID_dispatch', 'INNER');
    $this->db->join('licencias as l', 'd.ID_licencia = l.ID_licencia', 'INNER');
    $this->db->where('l.id_plataforma', $idPlataforma);
    $query = $this->db->get();
    if($query)
    {
        if($query->num_rows() > 0){
            foreach ($query->result() as $fila){
                $data[] = $fila->ID_dispatch;
            }
            return array_unique($data);
        }
    }
    else
    {
        //log_message('error', 'No se han encontrado Dispatch a borrar'.$this->db->_error_message().' - '.$this->db->last_query());
        return 0;
    }
}
It's about the way you pass data and the way you write your code. If there are a lot of joins or some bigger piece of work, you can move it into a stored procedure (SP).
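For instance, the whole fetch-the-IDs-then-delete round trip could be pushed down to MySQL as a single statement instead of pulling every ID_dispatch into a PHP array first. This is only a sketch of that idea (a plain DELETE ... JOIN rather than a stored procedure), reusing the table and column names from the question:
function delete_ScormByIdPlataforma($idPlataforma)
{
    // Let MySQL resolve the joins and delete in one go; no IDs are loaded into PHP.
    $sql = "DELETE s FROM scormvars AS s
            INNER JOIN dispatch  AS d ON s.ID_dispatch = d.ID_dispatch
            INNER JOIN licencias AS l ON d.ID_licencia = l.ID_licencia
            WHERE l.id_plataforma = ?";
    $this->db->query($sql, array($idPlataforma));
}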
And you can check your query speed by enabling the Profiler in your __construct:
$this->output->enable_profiler(TRUE);
And in your code
$this->benchmark->mark('my_mark_start'); # any name ending in _start, e.g. generate_start
$data['some_name'] = $this->model_name->modelFunctionName();
$this->benchmark->mark('my_mark_end'); # the matching _end, e.g. generate_end
This will show how long your code took to process the data.
You don't need to edit the php.ini file to increase the memory limit. You can set a new memory limit at runtime inside your script, using ini_set():
<?php
ini_set('memory_limit', '256M');
// From here on, the memory limit will be 256M.
I have a big table in my MySQL database. I want to go over one of its columns and pass each value to a function to see if it exists in another table, and if not, create it there.
However, I always face either a memory exhausted or execution time error.
//Get my table
$records = DB::table($table)->get();
//Check to see if it fits my condition
foreach($records as $record){
    Check_for_criteria($record['columnB']);
}
However, when I do that, I get a memory exhausted error.
So I tried with a for statement
//Get min and max id
$min = \DB::table($table)->min('id');
$max = \DB::table($table)->max('id');
//for loop to avoid memory problem
for($i = $min; $i <= $max; $i++){
    $record = \DB::table($table)->where('id', $i)->first();
    //Convert to an array for the purpose of the check_for_criteria function
    $record = get_object_vars($record);
    Check_for_criteria($record['columnB']);
}
But going this way, I got a maximum execution time error.
FYI, the check_for_criteria function is something like:
function check_for_criteria($record){
    $user = User::where('record', $record)->first();
    if(is_null($user)){
        $nuser = new User;
        $nuser->number = $record;
        $nuser->save();
    }
}
I know I could use ini_set('memory_limit', -1);, but I would rather find a way to limit my memory usage, or at least spread it out somehow.
Should I run these operations in background when traffic is low? Any other suggestion?
I solved my problem by limiting my request to distinct values in ColumnB.
//Get my table
$records = DB::table($table)->distinct()->select('ColumnB')->get();
//Check to see if it fits my condition
foreach($records as $record){
    Check_for_criteria($record['columnB']);
}
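For reference, the Laravel query builder can also spread the memory out by walking the table in fixed-size batches with chunk(); this is only a sketch reusing the names from the question, untested against the original schema:
// Only one batch of 500 rows is held in memory at a time.
\DB::table($table)->orderBy('id')->chunk(500, function ($records) {
    foreach ($records as $record) {
        Check_for_criteria($record->columnB);
    }
});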
I'm learning PHP MVC and in my display model I got this fatal error:
Fatal error: Maximum execution time of 30 seconds exceeded in C:\xampp\htdocs\kowab\app\models\display.php on line 36
line 36 is $data = mysql_fetch_array($sql);
To remove this error you have to increase max_execution_time in your php.ini and then restart Apache.
Or you can add ini_set('max_execution_time', x) at the top of your script.
But you should think about optimizing your query and code first.
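For example, a minimal sketch of the runtime override (pick a value that fits your workload):
<?php
// Raise the limit for this script only; the value is in seconds, 0 means no limit.
ini_set('max_execution_time', 120);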
Are you following the Arabic tutorials by Ali Hamdi?
I experienced the same thing and I made my else statement of the display class this way:
else
{
    $num = mysql_num_rows($sql);
    while ($num > 0)
    {
        $data = mysql_fetch_array($sql);
        $num--;
    }
}
return $data;
}
}
?>
It didn't solve the problem, but it brought back the form at least. So I continued watching the rest of the tutorials and following along, so that I can address that part later. I have written to him and am awaiting his response; as soon as he replies, I'll get back to you with the solution.
Increase your execution time by making this the first line of your code:
set_time_limit($x);
$x should be the maximum time in seconds for running the script. A value of 0 will let the script run infinitely.
http://us1.php.net/set_time_limit
NOTE: It is weird that you hit a 30 second time limit on line 36, so you probably have a problem with your code that we can't identify, because you haven't posted it.
You can increase that time by looking for max_execution_time in php.ini, but before that you need to know what is causing this issue. Check your query; there might be some loop, or it may return a huge amount of data.
set_time_limit($seconds);
Per the docs. If you pass a value of 0 for $seconds there will be no time limit.
Here is my model:
// display
class display extends awebarts {
    public function __construct($tablename) {
        $this->tablename = $tablename;
        $this->connectToDb();
        $this->getData();
        $this->close();
    }
    function getData() {
        $query = "SELECT * FROM $this->tablename ORDER BY `ID` DESC LIMIT 1";
        if(!$sql = mysql_query($query))
        {
            throw new Exception (mysql_error());
        }
        else {
            $num = mysql_num_rows($sql);
            while($num > 0)
            {
                $data = mysql_fetch_array($sql);
            }
        }
        return $data;
    }
}
?>
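For what it's worth, the 30-second timeout in this model comes from the while($num > 0) loop in getData(): $num is never decremented, so the loop never ends. A minimal sketch of a terminating version, keeping the rest of the class exactly as posted:
function getData() {
    $query = "SELECT * FROM $this->tablename ORDER BY `ID` DESC LIMIT 1";
    $sql = mysql_query($query);
    if (!$sql) {
        throw new Exception(mysql_error());
    }
    $data = false;
    // mysql_fetch_array() returns false after the last row, so this loop terminates.
    while ($row = mysql_fetch_array($sql)) {
        $data = $row;
    }
    return $data;
}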
Ever since developing my first MySQL project about 7 years ago, I've been using the same set of simple functions for accessing the database (although I have recently put these into a Database class).
As the projects I develop have become more complex, there are many more records in the database and, as a result, greater likelihood of memory issues.
I'm getting the PHP error Allowed memory size of 67108864 bytes exhausted when looping through a MySQL result set and was wondering whether there was a better way to achieve the flexibility I have without the high memory usage.
My function looks like this:
function get_resultset($query) {
    $resultset = array();
    if (!($result = mysql_unbuffered_query($query))) {
        $men = mysql_errno();
        $mem = mysql_error();
        echo ('<h4>' . $query . ' ' . $men . ' ' . $mem . '</h4>');
        exit;
    } else {
        $xx = 0;
        while ( $row = mysql_fetch_array($result) ) {
            $resultset[$xx] = $row;
            $xx++;
        }
        mysql_free_result($result);
        return $resultset;
    }
}
I can then write a query and use the function to get all results, e.g.
$query = 'SELECT * FROM `members`';
$resultset = get_resultset($query);
I can then loop through the $resultset and display the results, e.g.
$total_results = count($resultset);
for($i = 0; $i < $total_results; $i++) {
    $record = $resultset[$i];
    $firstname = $record['firstname'];
    $lastname = $record['lastname'];
    // etc, etc display in a table, or whatever
}
Is there a better way of looping through results while still having access to each record's properties for displaying the result list?
I've been searching around for people having similar issues and the answers given don't seem to suit my situation or are a little vague.
Your problem is that you're creating an array, filling it up with every row in your result set, and then returning this huge array from the function. I suppose the reason no mysql_* function does this for you is that it's extremely inefficient.
You should not fill up an array with everything you get. You should step through the results, just like you do when filling the array, but instead of storing anything, process each row and move on to the next one, so that the memory it used gets a chance to be freed.
If you use the mysql_* or mysqli_* functions, you should return the resource, then step through it right there where you're using it, the same way you're stepping through it to fill the array. If you use PDO, then you can return the PDOStatement and use PDOStatement::fetch() to step through it.
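A minimal sketch of the PDO variant (the $pdo connection and the get_result_statement() name are assumptions, not part of your existing Database class):
function get_result_statement(PDO $pdo, $query) {
    $stmt = $pdo->query($query); // returns a PDOStatement; no PHP array is built
    if ($stmt === false) {
        throw new RuntimeException(implode(' ', $pdo->errorInfo()));
    }
    return $stmt;
}
// Usage: step through the rows right where you display them.
$stmt = get_result_statement($pdo, 'SELECT * FROM `members`');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $firstname = $row['firstname'];
    $lastname = $row['lastname'];
    // etc, etc display in a table, or whatever
    // only this one row is held in a PHP variable at a time
}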