I have a problem with an error: maximum size of arrays in PHP. $ids has a maximum and I don't know how to resolve it.
Array size is limited only by the amount of memory your server has. I'll get an "out of memory" error and I cannot change php.ini.
I need to optimize these functions. Any ideas?
function delete_ScormByIdPlataforma($idPlatforma)
{
    if ($this->getIdScormVarsToDelete($idPlatforma) != 0)
    {
        $ids = $this->getIdScormVarsToDelete($idPlatforma);
        $this->db->where_in('ID_dispatch', $ids);
        $this->db->delete('scormvars');
    }//else
    //log_message('error', 'No se han encontrado scorms a borrar'.$this->db->_error_message().' - '.$this->db->last_query());
}
function getIdScormVarsToDelete($idPlataforma)
{
    $this->db->select('s.ID_dispatch');
    $this->db->from('scormvars as s');
    $this->db->join('dispatch as d', 's.ID_dispatch = d.ID_dispatch', 'INNER');
    $this->db->join('licencias as l', 'd.ID_licencia = l.ID_licencia', 'INNER');
    $this->db->where('l.id_plataforma', $idPlataforma);
    $query = $this->db->get();
    if ($query)
    {
        if ($query->num_rows() > 0) {
            foreach ($query->result() as $fila) {
                $data[] = $fila->ID_dispatch;
            }
            return array_unique($data);
        }
    }
    else
    {
        //log_message('error', 'No se han encontrado Dispatch a borrar'.$this->db->_error_message().' - '.$this->db->last_query());
        return 0;
    }
}
It comes down to how you pass data and how you write your code. If there are a lot of joins or a heavy function like this one, you can move the work into a stored procedure (SP).
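Another way to attack the memory problem specifically is to collapse the fetch-then-delete round trip into a single multi-table DELETE, so the list of IDs is never built in PHP at all. A minimal sketch, assuming MySQL and the table/column names from the question:
function delete_ScormByIdPlataforma($idPlataforma)
{
    // the join does the filtering server-side; no $ids array is ever created
    $sql = "DELETE s FROM scormvars AS s
            INNER JOIN dispatch  AS d ON s.ID_dispatch = d.ID_dispatch
            INNER JOIN licencias AS l ON d.ID_licencia = l.ID_licencia
            WHERE l.id_plataforma = ?";
    $this->db->query($sql, array($idPlataforma));
}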
You can also check your query speed. Enable the Profiler in your __construct:
$this->output->enable_profiler(TRUE);
And in your code
$this->benchmark->mark('my_mark_start'); # the mark can have any name ending in _start
$data['some_name'] = $this->model_name->modelFunctionName();
$this->benchmark->mark('my_mark_end'); # matching name ending in _end
This will show how long your code took to process the data.
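If you want to print the timing yourself instead of reading it from the profiler output, CodeIgniter's Benchmark class can report the span between the two marks:
// seconds elapsed between the two marks
echo $this->benchmark->elapsed_time('my_mark_start', 'my_mark_end');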
You don't need to edit the php.ini file to increase the memory limit. You can set a new memory limit at runtime inside your script, using ini_set():
<?php
ini_set('memory_limit', '256M');
// From here on, the memory limit will be 256M.
Related
I have a table with 100,000+ rows, and I want to select all of them in Doctrine and perform some action on each row. In Symfony2 with Doctrine I try this query:
$query = $this->getDefaultEntityManager()
->getRepository('AppBundle:Contractor')
->createQueryBuilder('c')
->getQuery()->iterate();
foreach ($query as $contractor) {
// doing something
}
but then I get a memory leak, because I think it loads all the data into memory.
I have more experience with ADOdb; in that library, when I do this:
$result = $ADOdbObject->Execute('SELECT * FROM contractors');
while ($arrRow = $result->fetchRow()) {
// do some action
}
I do not get any memory leak.
So how can I select all the data from the table without a memory leak, using Doctrine in Symfony2?
Question EDIT
When I remove the foreach and just call iterate(), I still get a memory leak:
$query = $this->getDefaultEntityManager()
->getRepository('AppBundle:Contractor')
->createQueryBuilder('c')
->getQuery()->iterate();
The normal approach is to use iterate().
$q = $this->getDefaultEntityManager()->createQuery('select c from AppBundle:Contractor c');
$iterableResult = $q->iterate();
foreach ($iterableResult as $row) {
// do something
}
However, as the Doctrine documentation says, this can still result in errors:
Results may be fully buffered by the database client/connection, allocating additional memory not visible to the PHP process. For large sets this may easily kill the process for no apparent reason.
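The buffering it refers to happens in the MySQL client layer. For reference, with plain PDO it can be switched off like this (a minimal sketch, assuming the pdo_mysql driver):
// with buffered queries disabled, rows stream from the server
// instead of being copied into client memory first
$pdo = new PDO($dsn, $user, $password);
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);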
The easiest approach to this would be to simply create smaller queries with offsets and limits.
//get the count of the whole query first
$em = $this->getDefaultEntityManager();
$count = $em->createQueryBuilder()
    ->select('COUNT(c)')
    ->from('AppBundle:Contractor', 'c')
    ->getQuery()->getSingleScalarResult();
//let's say we go in steps of 1000 to avoid the memory problem
$limit = 1000;
$offset = 0;
//loop every 1000 > create a query > loop the result > repeat
while ($offset < $count) {
    $result = $em->createQueryBuilder()
        ->select('c')
        ->from('AppBundle:Contractor', 'c')
        ->setMaxResults($limit)
        ->setFirstResult($offset)
        ->getQuery()->getResult();
    foreach ($result as $contractor) {
        // do something
    }
    $offset += $limit;
}
With heavy datasets like this, it will most likely exceed the maximum execution time, which is 30 seconds by default, so make sure to raise max_execution_time in your php.ini or call set_time_limit() in the script. If you just want to update all datasets with a known pattern, you should consider writing one big update query instead of looping and editing the result in PHP.
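For instance, a single DQL UPDATE runs entirely inside the database without hydrating any entities. A sketch, where the status field and its value are made up for illustration:
$this->getDefaultEntityManager()
    ->createQuery('UPDATE AppBundle:Contractor c SET c.status = :status')
    ->setParameter('status', 'processed') // hypothetical field and value
    ->execute();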
Try using this approach:
foreach ($query as $contractor) {
    // doing something
    $this->getDefaultEntityManager()->detach($contractor); // stop tracking the entity
    unset($contractor); // tell the GC the object is not in use anymore
}
Hope this helps.
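A common variant is to clear the EntityManager in batches rather than detaching row by row. A sketch, with an arbitrary batch size of 1000 (iterate() wraps each entity in a one-element array):
$em = $this->getDefaultEntityManager();
$i = 0;
foreach ($query as $row) {
    $contractor = $row[0];
    // doing something
    if (++$i % 1000 === 0) {
        $em->clear(); // detaches every entity loaded so far
    }
}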
If you really need to get all the records, I'd suggest you use the database_connection service directly. Look at its interface and choose a method that won't load all the data into memory (and won't map the records to your entities).
You could use something like this (assuming the code is in a controller):
$db = $this->get('database_connection');
$query = 'select * from <your_table>';
$sth = $db->prepare($query);
$sth->execute();
while($row = $sth->fetch()) {
// some stuff
}
This may not be what you need, because you might want entity objects after handling the whole collection. But maybe you don't need the objects at all. Either way, think about it.
I am using Yii 1.1.14 with PHP 5.3 on CentOS 6, and I am using CDbCommand to fetch data from a very large table; the result set is ~90,000 records over 10 columns. I am exporting it to a CSV file, and the file size is about 15 MB.
The script always crashed without any error message, and only after some research did I figure out that I needed to raise memory_limit in php.ini for the script to execute successfully.
The only problem is that for a successful execution I had to raise the memory limit to 512 MB(!), which is a lot, and if 10 users execute the same script my server will not respond very well...
I was wondering if anyone knows of a way to reduce memory consumption on SQL queries with Yii.
I know I can split the query into multiple queries using limits and offsets, but it just doesn't seem logical that a 15 MB result should consume 512 MB.
Here is the code:
set_time_limit(0);
$connection = new CDbConnection($dsn,$username,$password);
$command = $connection->createCommand('SELECT * FROM TEST_DATA');
$result = $command->queryAll(); //this is where the script crashes
print_r($result);
Any ideas would be greatly appreciated!
Thanks,
Instead of using queryAll, which returns all the rows in a single array (the real memory problem is there), you should simply use a foreach loop (take a look at CDbDataReader), e.g.:
$command = $connection->createCommand('SELECT * FROM TEST_DATA');
$rows = $command->query();
foreach ($rows as $row)
{
    // process one row at a time here
}
EDIT: Using LIMIT
$count = Yii::app()->db->createCommand('SELECT COUNT(*) FROM TEST_DATA')->queryScalar();
$maxRows = 1000;
$maxPages = ceil($count / $maxRows);
for ($i = 0; $i < $maxPages; $i++)
{
    $offset = $i * $maxRows;
    $rows = $connection->createCommand("SELECT * FROM TEST_DATA LIMIT $offset,$maxRows")->query();
    foreach ($rows as $row)
    {
        // Here your code
    }
}
I have a big table in my MySQL database. I want to go over one of its columns and pass each value to a function to check whether it exists in another table, creating it there if it does not.
However, I always hit either a memory exhausted or an execution time error.
//Get my table
$records = DB::table($table)->get();
//Check to see if it fits my condition
foreach ($records as $record) {
    Check_for_criteria($record['columnB']);
}
However, when I do that, I get a memory exhausted error.
So I tried with a for statement:
//Get min and max id
$min = \DB::table($table)->min('id');
$max = \DB::table($table)->max('id');
//for loop to avoid memory problem
for($i = $min; $i<=$max; $i++){
$record = \DB::table($table)->where('id',$i)->first();
//Convert to an array for the check_for_criteria function
$record= get_object_vars($record);
Check_for_criteria($record['columnB']);
}
But going this way, I got a maximum execution time error.
FYI, the check_for_criteria function is something like:
function check_for_criteria($record) {
    $user = User::where('record', $record)->first();
    if (is_null($user)) {
        $nuser = new User;
        $nuser->number = $record;
        $nuser->save();
    }
}
I know I could ini_set('memory_limit', -1); but I would rather find a way to limit my memory usage, or at least spread it out somehow.
Should I run these operations in background when traffic is low? Any other suggestion?
I solved my problem by limiting my request to distinct values in ColumnB.
//Get my table
$records = DB::table($table)->distinct()->select('ColumnB')->get();
//Check to see if it fits my condition
foreach ($records as $record) {
    Check_for_criteria($record['columnB']);
}
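Another way to bound memory here is the query builder's chunk() method, which pages through the table and hands you one batch at a time. A sketch, assuming the numeric id column from the for-loop version (rows come back as stdClass objects):
\DB::table($table)->orderBy('id')->chunk(1000, function ($records) {
    // each call of this closure receives at most 1000 rows
    foreach ($records as $record) {
        Check_for_criteria($record->columnB);
    }
});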
I made a method on my class that fetches and stores in an array all the results of the desired SQL statement, and it works just fine. Now, after some months in production, I came across this error:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 3503272 bytes) in C:\xxpathxx\class.DGC_Gerais.php on line 0
As I began to inspect, I tracked the error to mysql_free_result(), but even after commenting out that line it still doesn't work.
Here's the _fetch method:
public static function _fetch($sql){
    $q = mysql_query($sql);
    if(mysql_num_rows($q) > 0):
        for($a = 0; $linha = mysql_fetch_row($q); $a++){ // forEach row.
            foreach($linha as $chave => $valor): // forEach field.
                $result[$a][$chave] = $valor;
            endforeach;
        }
        // mysql_free_result($q);
        return $result;
    else:
        return false;
    endif;
}
That code is extremely convoluted and can be simplified to:
public static function _fetch($sql) {
$q = mysql_query($sql);
if (mysql_num_rows($q) == 0) {
return false;
}
$result = array();
while ($linha = mysql_fetch_row($q)) {
$result[] = $linha;
}
return $result;
}
It does exactly the same thing, without the nested loops.
The problem is that you're fetching all that data from the database and storing it in $result, which means it all has to be held in memory. PHP limits the amount of memory available to a script by default, and you're simply exceeding that limit; mysql_free_result() has nothing to do with it. First try to fetch less data, or process the data inside the while loop without storing everything in an array.
If that doesn't help, carefully turn up the memory limit with ini_set('memory_limit', $limit).
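A streaming version of the loop might look like this (a sketch; process_row() stands in for whatever per-row work you actually need):
$q = mysql_query($sql);
while ($linha = mysql_fetch_row($q)) {
    process_row($linha); // hypothetical handler; nothing accumulates in an array
}
mysql_free_result($q);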
I'm learning PHP MVC, and in my display model I got this fatal error:
Fatal error: Maximum execution time of 30 seconds exceeded in C:\xampp\htdocs\kowab\app\models\display.php on line 36
Line 36 is $data = mysql_fetch_array($sql);
To remove this error you have to increase max_execution_time in your php.ini and then restart Apache.
Or you can add ini_set('max_execution_time', $x) at the top of your script.
But you should think about optimizing your query and code first.
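For example (the 120 here is an arbitrary value):
// give this one script two minutes instead of the default 30 seconds
ini_set('max_execution_time', 120);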
Are you watching the Arabic man's tutorials? (Ali Hamdi)
I experienced the same thing, and I wrote the else statement of the display class this way:
else
{
    $num = mysql_num_rows($sql);
    while ($num > 0)
    {
        $data = mysql_fetch_array($sql);
        $num--;
    }
}
return $data;
}
}
?>
It didn't solve the problem, but at least it brought the form back. So I continued watching the rest of the tutorials and following along, so that I can address that part later. I have written to him and am awaiting his response; as soon as he replies, I'll get back to you with the solution.
Increase your execution time by making this your first line of code:
set_time_limit($x);
$x should be the maximum time in seconds for running the script. A value of 0 will let the script run infinitely.
http://us1.php.net/set_time_limit
NOTE: It is weird that you hit a 30 second time limit on line 36, so you probably have a problem with your code that we can't identify, because you haven't posted it.
You can increase the time by raising max_execution_time in php.ini, but before that you need to know what caused the issue. Check your query; there might be a loop somewhere, or it may return a huge amount of data.
set_time_limit($seconds);
Per the docs. If you pass a value of 0 for $seconds there will be no time limit.
Here is my model:
// display
class display extends awebarts {
    public function __construct($tablename) {
        $this->tablename = $tablename;
        $this->connectToDb();
        $this->getData();
        $this->close();
    }
    function getData() {
        $query = "SELECT * FROM $this->tablename ORDER BY `ID` DESC LIMIT 1";
        if (!$sql = mysql_query($query))
        {
            throw new Exception(mysql_error());
        }
        else {
            $num = mysql_num_rows($sql);
            while ($num > 0)
            {
                // note: $num is never decremented, so this loop never ends;
                // that is what triggers the 30-second execution limit
                $data = mysql_fetch_array($sql);
            }
        }
        return $data;
    }
}
?>