Laravel cannot get more than 15k records - php

I have a very simple query to get records from the database:
\DB::table("table")->get();
When I try to get more than ±15000 records from the database I get a
500 server error.
Code like this:
\DB::table("table")->take(14500)->get();
does work, though. As soon as I ask for more than 15k records, the error appears immediately, with no loading and no further information.
I cannot get any more information from the logs either. Oddly enough, when I run the same code in Tinker I can fetch all the records. (It behaves the same with Eloquent.)

If you check your error log you will most likely see something along the lines of:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)
It would be better to chunk your results instead of loading them all into memory at once:
\DB::table("table")->chunk(500, function ($results) {
    foreach ($results as $result) {
        // do your thing
    }
});
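If you only need a single pass over the rows and are on Laravel 5.2 or newer, the query builder's cursor() method is another option; this is a minimal sketch, not something from the original answer:

// cursor() hydrates one row at a time via a generator instead of
// buffering the whole result set in a collection.
foreach (\DB::table("table")->cursor() as $row) {
    // do your thing with $row
}

Note that the database driver may still buffer the full result set on its side unless the connection is configured as unbuffered.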

Related

WordPress Database Memory Error

I'm running a large WordPress multisite install that, for each site, runs a number of database queries to display information in the respective blog. The data queries aren't too heavy; however, I often see this in my error log:
PHP Fatal error: Allowed memory size of 1572864000 bytes exhausted (tried to allocate 97 bytes) in /home/********/public_html/wp-includes/wp-db.php on line 1775
When this occurs I believe the page being called (that causes the error) stops loading and the user has to reload to access the information. I've been through every page being called and all load on their own without any issue.
Looking at the relevant line in the wp-db.php file, this is the line that causes the error:
preg_match( '/^\s*(create|alter|truncate|drop)\s/i', $query ) ) {
$return_val = $this->result;
i.e. when a database query is being executed. Something is obviously going quite wrong here, as I've already tried raising the PHP memory limit. Does anyone know how I would go about identifying what is causing this error so I can fix it?
Put the following line of code in your wp-config.php file:
define( 'WP_MEMORY_LIMIT', '2000M' ); // Value must be greater than the current value
Please let me know if you need further help.
Thanks!
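If you also want to find out which requests are actually eating the memory, one option is to log peak memory usage at the end of every request. This is only a sketch of the idea, not part of the answer above: the file name is arbitrary, and it relies on the standard shutdown hook and PHP's memory_get_peak_usage().

<?php
// wp-content/mu-plugins/log-peak-memory.php (hypothetical must-use plugin)
add_action( 'shutdown', function () {
    $peak_mb = round( memory_get_peak_usage( true ) / 1048576, 1 );
    $uri     = isset( $_SERVER['REQUEST_URI'] ) ? $_SERVER['REQUEST_URI'] : 'cli';
    error_log( "[peak memory] {$peak_mb} MB for {$uri}" );
} );

Watching the log for a while should show which URLs come close to the limit, which narrows down the queries worth investigating.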

Optimization db queries - scaling big data - Laravel 5

I've been working on a web app that uses Laravel 5. It's running on localhost (XAMPP) on a Windows 8.1 PC. 4GB RAM, 2.67GHz processor, pretty simple.
The table I'm querying most of the time contains a lot of rows (10,000 give or take) - so many that a route that simply does:
return User::all();
Running this just returns a white screen. Sometimes the Chrome console reports a 500 (Internal Server Error).
Echoes or prints made before the query are shown but nothing after that is executed. Querying another model (whose table only has 2 rows) returns the data correctly.
Which leads me to conclude that my server isn't scaling well for this amount of data. I'm trying to fix this by doing:
User::all()->chunk(200, function($chunkOfTickets){ /*some code*/});
which I expected would split the data into chunks to make it easier on the server. This doesn't work, however, because Eloquent is first fetching all the data (and breaking because it can't handle it) and only then dividing it into chunks.
Thanks for reading.
EDIT: I just tested over and over, requesting increasingly greater amounts of data. The limit is approximately 26000 rows (at 27000 an out-of-memory error is returned).
As stated in the comments, the PHP log shows the following. Apparently I was requesting so much memory that it crashed before Laravel could show the error message:
[01-Jul-2015 17:27:51 UTC] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 8376445 bytes) in C:\xampp\htdocs\gamescaffold\vendor\laravel\framework\src\Illuminate\Support\Collection.php on line 791
Extra Edit
Is there a way I can split the reply from the DB into chunks? Like:
User::chunk(200)->all(); /*which obviously does not work*/
If I write a seemingly complex query against the database directly through phpMyAdmin, it returns 37035 rows in 0.0045 seconds. (I suspect there are some under-the-hood optimizations by XAMPP or something, though.)
Note: I'm posting this as an answer because it involves some code. I guess I've totally missed your point in the original question because I thought that you were trying to return the whole result set to the client. If I'm still missing something, please leave a comment and I'll delete this answer.
So I want to take a set of objects, do stuff with them and save them back to the DB
That's an easy one!
$chunkSize = 100; // or whatever your memory allows
$totalUsers = User::count();
$chunks = floor($totalUsers / $chunkSize);

for ($chunk = 0; $chunk <= $chunks; $chunk++) {
    $offset = $chunk * $chunkSize;
    $users = User::skip($offset)->take($chunkSize)->get();

    foreach ($users as $user) {
        // do something
        $user->save();
    }
}
If it takes too long, you'll probably hit a timeout when you trigger this loop over HTTP, so you should probably run it from the console.
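For example, here is a minimal sketch of wrapping that loop in an Artisan command so it can run from the console. The class name and signature are made up for illustration, and the $signature/handle() style assumes Laravel 5.1 or later (5.0 used $name and fire() instead):

<?php
// app/Console/Commands/ProcessUsers.php (hypothetical)
namespace App\Console\Commands;

use App\User;
use Illuminate\Console\Command;

class ProcessUsers extends Command
{
    protected $signature = 'users:process';
    protected $description = 'Process all users in memory-friendly batches';

    public function handle()
    {
        $chunkSize = 100;
        $totalUsers = User::count();
        $chunks = floor($totalUsers / $chunkSize);

        for ($chunk = 0; $chunk <= $chunks; $chunk++) {
            $users = User::skip($chunk * $chunkSize)->take($chunkSize)->get();

            foreach ($users as $user) {
                // do something
                $user->save();
            }
        }
    }
}

After registering the class in app/Console/Kernel.php's $commands array, you can run it with php artisan users:process without worrying about HTTP timeouts.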
You can also let the query builder do the chunking for you:
DB::table("table")
    ->select('column1', 'column2')
    ->orderBy('column2', 'asc')
    ->chunk(70000, function ($users) {
        foreach ($users as $row) {
            // To do with data
        }
    });
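On the Extra Edit in the question: User::all()->chunk(200, ...) runs out of memory because all() executes the query first and chunk() only splits the already-loaded collection. Calling chunk() on the model (or query builder) itself, as above, issues one paged query per batch, so only one batch is hydrated at a time. A minimal Eloquent sketch of what the edit was asking for:

// One limited/offset query per batch of 200; the callback receives
// a collection of at most 200 User models at a time.
User::chunk(200, function ($users) {
    foreach ($users as $user) {
        // some code
    }
});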

For loop causes memory exhausted

I am creating a website in the CMS/framework called ProcessWire, and I want to fill a few arrays with values collected from database results (called pages in ProcessWire).
What happens is that I get a memory error. I know I could raise the memory limit, but that doesn't feel right, because the number of pages will only keep growing, so the load will get bigger and I might end up increasing the memory limit again and again. Instead I would like to pick batches from the big pile, process them, and then pick the next batch.
This is the part that causes the crash:
for ($i = 0; $i <= $count_log; $i += 100)
{
    $all_log_files = $pages->find("template=logs, start=$i, limit=100");

    foreach ($all_log_files as $log_file)
    {
        $u++;

        if (empty($actions["$log_file->action"]))
        {
            $actions[$log_file->action] = 1;
        }
        if (empty($ips["$log_file->ip"]))
        {
            $ips[$log_file->ip] = 1;
        }
        if (empty($values["$log_file->value"]))
        {
            $values[$log_file->value] = 1;
        }
        if (empty($results["$log_file->result"]))
        {
            $results[$log_file->result] = 1;
        }
    }
}
As you can see, I already tried to make "batches", but I failed, since I still get the error...
What can I do to fix this, or is there nothing I can do and should I just raise the memory limit?
EDIT:
The specific error:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 6 bytes) in /xxx/xxx/xxx/xxx/wire/core/DatabaseQuery.php on line 47
You're having a memory issue because the data that you're trying to store in memory all at once is too big. You can either increase the memory limit to hold all the data or store less data and change your algorithms to work with batches of the data.
For example, if you have a function that uses the values stored in $results, then you can try to change this function to work just as well with only part of the results, and call the function for each iteration, then empty the array after the iteration finishes.
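A rough sketch of that idea applied to the loop above. handleBatch() is a hypothetical helper standing in for whatever per-batch aggregation is needed, and $pages->uncacheAll() (ProcessWire's call for clearing its in-memory page cache) is what usually stops the memory from growing across batches:

for ($i = 0; $i <= $count_log; $i += 100)
{
    $batch = $pages->find("template=logs, start=$i, limit=100");

    // Hypothetical helper: record whatever this batch contributes to
    // $actions/$ips/$values/$results, then discard the batch data.
    handleBatch($batch);

    // Release the pages loaded for this batch before fetching the next one.
    $pages->uncacheAll();
}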

Laravel DB Insert Error: Allowed Memory Size Exhausted

I'm running into an issue when trying to insert ~20K records into my DB. I notice that even though I'm echoing inside my foreach loop, I'm not getting anything output on the command line. Instead, I get an error after inserting ~9440 records relating to...
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 91 bytes) in /Users/me/Sites/Laravel/database/connection.php on line 293
Here is my code (tried using both Eloquent and Fluent):
<?php
class Process_Controller extends Base_Controller
{
    public function action_migrate()
    {
        $properties = DB::table('raw_properties')->get('id');
        $total = count($properties);

        foreach ($properties as $x => $p) {
            $r = RawProperty::find($p->id);
            $count = $x + 1;
            $prop_details = array(
                'column' => $r->field,
                // Total of 21 fields
            );

            DB::table('properties')->insert($prop_details);
            echo "Created #$count of $total\n";
        }
    }
}
The accepted answer fixes the symptom rather than the problem. The problem is that the Laravel query log (kept in memory) is eating all your RAM when you execute such a large number of queries. See the answer here: https://stackoverflow.com/a/18776710/221745
Or, in brief, turn off query logging via:
DB::disableQueryLog()
before executing 20k queries.
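In the controller from the question, that simply means making the call before the loop; a sketch (only the relevant lines shown):

public function action_migrate()
{
    // Stop Laravel from keeping every executed query in its in-memory log.
    DB::disableQueryLog();

    $properties = DB::table('raw_properties')->get('id');
    $total = count($properties);

    foreach ($properties as $x => $p) {
        // ... same insert loop as above ...
    }
}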
This error means that your PHP script has exhausted its memory limit because not enough memory was allocated to it.
You need to increase memory_limit using the ini_set function,
e.g. ini_set('memory_limit','128M');
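Note that 134217728 bytes is exactly 128M, so setting the limit to 128M again would change nothing; the new value has to be higher than the current one, for example:
ini_set('memory_limit', '256M'); // or '-1' to lift the limit entirely while debugging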
I used DB::disableQueryLog() and continued to get the error. I ended up pausing Telescope's query recording. You can do this from the Telescope web interface > Queries > click the Pause icon.
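If you would rather switch it off outside the UI: assuming a reasonably recent Telescope release, the enabled flag in config/telescope.php reads the TELESCOPE_ENABLED environment variable, so you can disable recording in the environment that runs the bulk insert.

// config/telescope.php (excerpt; key name as shipped in recent releases)
'enabled' => env('TELESCOPE_ENABLED', true),

// then put TELESCOPE_ENABLED=false in that environment's .env file.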

Grabbing data using the YouTube API over 1000 results and without using the Max Memory

I am trying to grab an entire channel's video feed (all of the videos' data) and store it in my MySQL database for use in an application I am currently working on. I am not the most experienced with the YouTube API. The code I am working with is the following:
public function printVideoFeed($count)
{
    $this->startIndex($count);
    $data = $this->yt->getVideoFeed($this->query);

    foreach ($data as $video)
    {
        echo $count . ' - ' . $video->getVideoTitle() . '<br/>';
        $count++;
    }

    // check if there are more videos
    try {
        $nextFeed = $data->getNextFeed();
    } catch (Zend_Gdata_App_Exception $e)
    {
        echo $e->getMessage() . '<br/>';
    }

    if ($nextFeed)
    {
        $this->printVideoFeed($count);
    }
}
The error I am getting is:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 36 bytes) in C:\Program Files\ZendFrameworkCli\library\Zend\Gdata\App\Base.php on line 431
This is one of a few errors I am getting while trying to grab upwards of 3000 videos. My question is: how can I stop the memory usage from growing every time the printVideoFeed method runs again? If there is a way to break out of the loop but restart if there are still videos left, that would be awesome. I've been looking, but this is a hard question to Google (to get the results I'm looking for).
Have you tried using iteration instead of recursion? I can imagine that PHP might keep the variables declared in the function, especially $data, until the function returns. Alternatively, you could call unset($data); before starting the recursion.
Also: Are you sure you have no infinite loop? Maybe you need to call startIndex() again before calling getNextFeed()?
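A rough sketch of the iterative version, reusing the calls from the question and assuming, as the question's catch block suggests, that getNextFeed() throws a Zend_Gdata_App_Exception once there are no more pages:

public function printVideoFeed($count)
{
    $this->startIndex($count);
    $feed = $this->yt->getVideoFeed($this->query);

    while ($feed !== null) {
        foreach ($feed as $video) {
            echo $count . ' - ' . $video->getVideoTitle() . '<br/>';
            $count++;
        }

        try {
            $next = $feed->getNextFeed();
        } catch (Zend_Gdata_App_Exception $e) {
            echo $e->getMessage() . '<br/>';
            $next = null;
        }

        unset($feed); // release the current page before holding on to the next one
        $feed = $next;
    }
}

Because each page replaces the previous one instead of stacking up in recursive call frames, memory usage should stay roughly constant per page rather than growing with the number of videos.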
