I'm running into an issue when trying to insert ~20K records into my DB. Even though I'm echoing inside my foreach loop, I'm not getting any output on the command line. Instead, I get an error after inserting ~9440 records:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried
to allocate 91 bytes) in
/Users/me/Sites/Laravel/database/connection.php on line 293
Here is my code (tried using both Eloquent and Fluent):
<?php
class Process_Controller extends Base_Controller
{
    public function action_migrate()
    {
        $properties = DB::table('raw_properties')->get('id');
        $total = count($properties);

        foreach ($properties as $x => $p) {
            $r = RawProperty::find($p->id);
            $count = $x + 1;
            $prop_details = array(
                'column' => $r->field,
                // Total of 21 fields
            );
            DB::table('properties')->insert($prop_details);
            echo "Created #$count of $total\n";
        }
    }
}
The accepted answer fixes the symptom rather than the problem. The problem is that the Laravel query log (kept in memory) is eating all your RAM when you execute such a large number of queries. See the answer here: https://stackoverflow.com/a/18776710/221745
Or, in brief, turn off query logging via:
DB::disableQueryLog()
before executing 20k queries.
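Applied to the controller in the question, a minimal sketch:

public function action_migrate()
{
    // stop Laravel from buffering every executed query in memory
    DB::disableQueryLog();

    // ... the same fetch/insert loop as above ...
}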
This error means your PHP script has exhausted the memory allocated to it.
You need to increase memory_limit using the ini_set function,
e.g. ini_set('memory_limit', '256M'); (note that the 134217728 bytes in the error message is already 128M, so the limit has to be raised beyond that).
I called DB::disableQueryLog() and continued to get the error. I ended up pausing Telescope's query recording instead. You can do this from the Telescope web interface > Queries > click the pause icon.
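Recording can also be stopped in code; a sketch assuming a laravel/telescope version that exposes stopRecording()/startRecording() on its facade:

use Laravel\Telescope\Telescope;

// assumption: available on recent laravel/telescope releases
Telescope::stopRecording();
// ... run the bulk queries ...
Telescope::startRecording();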
Related
I have a very simple query to get records from the database:
\DB::table("table")->get();
When I try to get more than ±145000 records from the database I am getting:
500 server error.
Code like this:
\DB::table("table")->take(14500)->get();
does work, however. When I try to get more than 15k, I get the error immediately, without any loading or further information.
I cannot get any more info from the logs either. An odd thing is that when I run the same code in tinker, I can get all the records. (It behaves the same with Eloquent.)
If you check your error log, you will most likely see something along the lines of:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
It would be better to chunk your results instead of loading them all into memory at once:
\DB::table("table")->chunk(500, function($results) {
foreach($results as $result) {
do your thing
}
});
I have a problem that has been driving me nuts for some time, and I can't find a solution to it :-(
From time to time I get this error message on our WordPress/WooCommerce site:
Fatal error: Allowed memory size of 201326592 bytes exhausted (tried to allocate 1048576 bytes) in xxxxx/wp-includes/wp-db.php on line 1842
After a refresh, the page loads normally.
I checked the referenced lines in wp-db.php, and it is this code:
$num_rows = 0;
if ( $this->use_mysqli && $this->result instanceof mysqli_result ) {
    while ( $row = mysqli_fetch_object( $this->result ) ) {
        $this->last_result[ $num_rows ] = $row;
        $num_rows++;
    }
}
While searching the net, I found some suggested fixes, like checking MySQLi, raising the PHP memory limit, and optimising the database (deleting orphaned post-meta and WP's Automatic Database Optimizing), but none of them worked. The host says memory is not a problem given the server's load, and that it is a coding issue.
Any idea how I can find the problem that needs fixing?
I tried the ini_set('memory_limit', '-1'); hack to make the PHP memory unlimited (I know it is not good practice), but it didn't help.
Any idea how this issue can be traced down, i.e. which code makes the mess?
Try using the function mysqli_result::fetch_all
and iterating over the result yourself, to avoid a possible infinite loop ... that might solve your problem ^^
if ( $this->use_mysqli && $this->result instanceof mysqli_result ) {
    $results = mysqli_fetch_all( $this->result );
    foreach ( $results as $r ) {
        // do stuff
    }
}
I am creating a website in the CMS/framework called ProcessWire, and I want to fill a few arrays with values collected from database results (called pages in ProcessWire).
What happens is that I get a memory error. I know I could raise the memory limit, but that doesn't feel right: the number of pages will only keep growing, so the load will get bigger and I might end up increasing the memory limit again and again. I would rather pick batches from the big pile, process them, and then pick the next batch.
This is the part that causes the crash:
for ($i = 0; $i <= $count_log; $i += 100) {
    $all_log_files = $pages->find("template=logs, start=$i, limit=100");
    foreach ($all_log_files as $log_file) {
        $u++;
        if (empty($actions["$log_file->action"])) {
            $actions[$log_file->action] = 1;
        }
        if (empty($ips["$log_file->ip"])) {
            $ips[$log_file->ip] = 1;
        }
        if (empty($values["$log_file->value"])) {
            $values[$log_file->value] = 1;
        }
        if (empty($results["$log_file->result"])) {
            $results[$log_file->result] = 1;
        }
    }
}
As you can see, I already tried to make "batches", but I failed, since I still get the error...
What can I do to fix this, or is there nothing I can do and should I just raise the memory limit?
EDIT:
The specific error:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 6 bytes) in /xxx/xxx/xxx/xxx/wire/core/DatabaseQuery.php on line 47
You're having a memory issue because the data that you're trying to store in memory all at once is too big. You can either increase the memory limit to hold all the data, or store less data and change your algorithm to work with batches of the data.
For example, if you have a function that uses the values stored in $results, you can change that function to work just as well with only part of the results, call it once per batch, and empty the array after each iteration finishes.
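A rough sketch of that idea applied to the loop above; process_batch() is a hypothetical helper standing in for whatever uses the arrays, and $pages->uncacheAll() assumes ProcessWire keeps loaded pages cached in memory:

for ($i = 0; $i <= $count_log; $i += 100) {
    $batch = $pages->find("template=logs, start=$i, limit=100");
    foreach ($batch as $log_file) {
        $actions[$log_file->action] = 1;
        $ips[$log_file->ip] = 1;
        $values[$log_file->value] = 1;
        $results[$log_file->result] = 1;
    }
    // hypothetical helper: do the real work with this batch's data
    process_batch($actions, $ips, $values, $results);
    // drop per-batch data before the next find()
    $actions = $ips = $values = $results = array();
    // assumption: this releases ProcessWire's in-memory page cache
    $pages->uncacheAll();
}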
I am trying to grab an entire channel's video feed (all of the video data) and store it in my MySQL database for use in an application I am currently working on. I am not the most experienced with the YouTube API. The code I am working with is the following:
public function printVideoFeed($count)
{
    $this->startIndex($count);
    $data = $this->yt->getVideoFeed($this->query);

    foreach ($data as $video) {
        echo $count . ' - ' . $video->getVideoTitle() . '<br/>';
        $count++;
    }

    // check if there are more videos
    try {
        $nextFeed = $data->getNextFeed();
    } catch (Zend_Gdata_App_Exception $e) {
        echo $e->getMessage() . '<br/>';
    }

    if ($nextFeed) {
        $this->printVideoFeed($count);
    }
}
The error I am getting is:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 36 bytes) in C:\Program Files\ZendFrameworkCli\library\Zend\Gdata\App\Base.php on line 431
This is one of a few errors I am getting while trying to grab upwards of 3000 videos. My question is: how can I stop the memory usage from growing while printVideoFeed keeps calling itself? If there is a way to make it break out of the loop but restart if there are still videos left, that would be awesome. I've been looking, but this question is a hard one to google (to get the results I'm looking for).
Have you tried using iteration instead of recursion? I can imagine that PHP keeps the variables declared in the function, especially $data, until the function returns. Alternatively, you could call unset($data); before starting the recursion.
Also: are you sure you have no infinite loop? Maybe you need to call startIndex() again before calling getNextFeed()?
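A sketch of an iterative version, reusing the method's own calls (getNextFeed() is assumed to throw once there are no more pages, matching the original catch):

public function printVideoFeed($count)
{
    $this->startIndex($count);
    $data = $this->yt->getVideoFeed($this->query);

    while ($data) {
        foreach ($data as $video) {
            echo $count . ' - ' . $video->getVideoTitle() . '<br/>';
            $count++;
        }
        try {
            $next = $data->getNextFeed();
        } catch (Zend_Gdata_App_Exception $e) {
            echo $e->getMessage() . '<br/>';
            $next = null; // no more pages
        }
        unset($data); // release the current feed before loading the next page
        $data = $next;
    }
}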
I keep seeing a memory exhausted error
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted
in my log file.
This error comes randomly, even when the server is very lightly loaded, and is not reproducible on localhost. I have a VPS 4 server from HostGator with loads of memory. The PHP config allows up to 256 MB.
The code is below
function func_select_array($qry)
{
    $i = 0;
    $data = array();
    $qry_result = mysql_query($qry);
    if ($qry_result) {
        while ($row = mysql_fetch_assoc($qry_result)) {
            $data[$i] = $row;
            $i++;
        }
        return $data;
    } else {
        return 2;
    }
}
function func_check_rule_out_bid($auc_id, $bid_amount, $return_freq, $recheck)
{
    $bid_qry = "select * from tbl_bid where ubaid='".$auc_id."' and ubf='1' order by uba desc limit 0,10";
    $bid_array = func_select_array($bid_qry);
}
The table tbl_bid has 2800 records.
I get the memory exhausted error in the while loop inside the func_select_array function.
I cannot imagine this query needing 256M+. It does not appear to be a PHP problem, but something in MySQL. Please help...
Memory exhaustion can be very difficult to debug because the error only tells you where the memory finally ran out, not where the bulk of it was used.
You could use a memory profiler, such as the one in Xdebug, to find where all that memory went.
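If a profiler is not an option, a cruder sketch using PHP's built-in memory_get_usage() inside the suspect loop can at least show where the growth happens:

while ($row = mysql_fetch_assoc($qry_result)) {
    $data[$i] = $row;
    $i++;
    if ($i % 500 === 0) {
        // log memory growth every 500 rows to see where it balloons
        error_log("rows: $i, mem: " . memory_get_usage(true) . " bytes");
    }
}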
You should run EXPLAIN on your query, to make sure you are not doing some heavy query (at first look it doesn't seem so).
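For instance, a quick sketch of running EXPLAIN from PHP against the query in the question (the auction id '123' is a placeholder):

$explain = mysql_query("EXPLAIN select * from tbl_bid where ubaid='123' and ubf='1' order by uba desc limit 0,10");
while ($row = mysql_fetch_assoc($explain)) {
    print_r($row); // shows the access type, possible keys, and rows examined
}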
Second, maybe you are fetching some BLOB or TEXT field that is too heavy, and that data is not in your local database, which is why you cannot reproduce this in your local environment.
Also make sure you are not running into an infinite loop. Could that be it?
Trace the PHP errors, and try/catch this block to get better information about the error:
try {
    while ($row = mysql_fetch_assoc($qry_result)) {
        $data[$i] = $row;
        $i++;
    }
} catch (Exception $e) {
    debug_print_backtrace();
    var_dump($e->getMessage());
}