PHP Laravel memory limit error

I am currently migrating from one database to another. The project is on Laravel, so I am writing a Laravel command for this. One table has about 700,000 records. I created a function that uses LIMIT and transactions to optimize the query, but I am still getting an out-of-memory error from PHP.
Here is my code:
ini_set('memory_limit', '750M'); // at beginning of file

$circuit_c = DB::connection('legacy')->select('SELECT COUNT(*) FROM tbl_info');
$count = (array) $circuit_c[0];
$counc = $count['COUNT(*)'];
$max = 1000;
$pages = ceil($counc / $max);
for ($i = 1; $i <= $pages; $i++) {
    $offset = ($i - 1) * $max;
    $infos = DB::connection('legacy')->select('SELECT * FROM tbl_info LIMIT ' . $offset . ', ' . $max);
    DB::connection('mysql')->transaction(function() use ($infos) {
        foreach ($infos as $info) {
            $validator = Validator::make($data = (array) $info, Info::$rules);
            if ($validator->passes()) {
                if ($info->record_type == 'C') {
                    $b_user_new = Info::create($data);
                    unset($b_user_new);
                }
            }
            unset($info);
            unset($validator);
        }
    });
    unset($infos);
}
The error is this:
user#lenovo /var/www/info $ php artisan migratedata
PHP Fatal error: Allowed memory size of 786432000 bytes exhausted (tried to allocate 32 bytes) in /var/www/info/vendor/laravel/framework/src/Illuminate/Database/Grammar.php on line 75
The error shows up after importing about 50,000 records.

There is a kind of "memory leak" here. You need to find out which variable is hogging all of this memory. Try this function to debug and see which variable keeps growing:
function sizeofvar($var) {
    $start_memory = memory_get_usage();
    $tmp = unserialize(serialize($var)); // force a deep copy of $var
    return memory_get_usage() - $start_memory;
}
Once you know which variable is taking all the memory, you can start implementing appropriate measures.
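A minimal usage sketch of that helper (plain PHP; the reported byte counts will vary by PHP version and platform, but the relative sizes are what matters):

```php
// Approximate a variable's memory footprint by measuring the cost of a deep copy.
function sizeofvar($var) {
    $start_memory = memory_get_usage();
    $tmp = unserialize(serialize($var)); // force a deep copy of $var
    return memory_get_usage() - $start_memory;
}

// Call it on each suspect inside the migration loop and watch which one grows.
$small = range(1, 10);
$large = range(1, 100000);

echo sizeofvar($small) . "\n";
echo sizeofvar($large) . "\n"; // orders of magnitude larger than $small
```

Print these numbers once per loop iteration; the variable whose size climbs steadily across iterations is the one being retained.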

Found the answer: by default, Laravel keeps a log of every executed query in memory, so just call DB::connection()->disableQueryLog();
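To illustrate the mechanism behind that fix, here is a simplified pure-PHP sketch (the FakeConnection class is invented for illustration, not Laravel's actual implementation): each executed query appends an entry to an in-memory array on the connection, so a 700,000-row migration accumulates 700,000 entries unless logging is disabled.

```php
// Hypothetical stand-in for a DB connection that logs queries in memory.
class FakeConnection {
    private $logEnabled = true;
    private $queryLog = [];

    public function disableQueryLog() {
        $this->logEnabled = false;
    }

    public function select($sql) {
        if ($this->logEnabled) {
            // One entry per query, never released -- this is what exhausts memory.
            $this->queryLog[] = ['query' => $sql, 'time' => 0.0];
        }
        return []; // stub result set
    }

    public function logSize() {
        return count($this->queryLog);
    }
}

$conn = new FakeConnection();
for ($i = 0; $i < 1000; $i++) {
    $conn->select("SELECT * FROM tbl_info LIMIT $i, 1000");
}
echo $conn->logSize() . "\n"; // 1000 entries retained

$conn2 = new FakeConnection();
$conn2->disableQueryLog();
for ($i = 0; $i < 1000; $i++) {
    $conn2->select("SELECT * FROM tbl_info LIMIT $i, 1000");
}
echo $conn2->logSize() . "\n"; // 0 -- nothing accumulates
```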

Related

How to create a log file in PHP

I have two functions, each parsing an Excel file; the file sizes are 673 KB and 131 KB. The functions have the same code, only their names differ.
One function reads the data from its Excel file, while the other one returns:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 72 bytes)
I have other Excel files much bigger than these ones, and their parsing functions work fine.
I want to create a log file to record every action done inside the system, but I have no idea how to do it. On the other hand, I found a solution on Stack Overflow by @Lawrence Cherone, which seems to have worked fine and resolved the problem for others.
But this is the first time I am creating a log file, so I don't know where to start. Do I create a new file and put it in my project? How do I execute it, and how can I see the reason for the error in my function? Or do I add the logging call inside the function where I get the error, as in @Lawrence Cherone's proposed solution?
The following is the code of my function; can you guide me on how to create this log file to debug it?
public function parseEquipment($filePath = null) {
    set_time_limit(0);
    $listEquipement = [];
    $count = 0;
    $chunkSize = 1024;
    $objReader = PHPExcel_IOFactory::createReader(PHPExcel_IOFactory::identify($filePath));
    $spreadsheetInfo = $objReader->listWorksheetInfo($filePath);
    $chunkFilter = new \Floose\Parse\ChunkReadFilter();
    $objReader->setReadFilter($chunkFilter);
    $objReader->setReadDataOnly(true);
    $chunkFilter->setRows(0, 1);
    $objPHPExcel = $objReader->load($filePath);
    $totalRows = $spreadsheetInfo[0]['totalRows'];
    for ($startRow = 1; $startRow <= $totalRows; $startRow += $chunkSize) {
        $chunkFilter->setRows($startRow, $chunkSize);
        $objPHPExcel = $objReader->load($filePath);
        $sheetData = $objPHPExcel->getActiveSheet()->toArray(null, null, true, false);
        $startIndex = ($startRow == 1) ? $startRow : $startRow - 1;
        if (!empty($sheetData) && $startRow < $totalRows) {
            $dataToAnalyse = array_slice($sheetData, $startIndex, $chunkSize);
            if ($dataToAnalyse[1][0] == NULL) {
                break;
            }
            for ($i = 0; $i < $chunkSize; $i++) {
                if ($dataToAnalyse[$i]['0'] != NULL) {
                    $listEquipement[] = new Article($dataToAnalyse[$i]['3'], $dataToAnalyse[$i]['4'], $dataToAnalyse[$i]['2']);
                    $count++;
                }
            }
        }
        $objPHPExcel->disconnectWorksheets();
        unset($objPHPExcel, $sheetData);
    }
    return $listEquipement;
}
You can write to a custom log file with PHP's built-in error_log(); message type 3 appends to the file you name:
error_log("You messed up! " . $my_message, 3, "/var/tmp/custom-errors.log");
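A minimal sketch of a reusable file logger in plain PHP (the function name and log path are illustrative; wrap the parsing code in a try/catch and log from there):

```php
// Append a timestamped line to a log file; the file is created on first write.
function logToFile($message, $file = '/tmp/app-debug.log') {
    $line = '[' . date('Y-m-d H:i:s') . '] ' . $message . PHP_EOL;
    file_put_contents($file, $line, FILE_APPEND | LOCK_EX);
}

// Usage around the parsing function (hypothetical placement):
try {
    logToFile('parseEquipment started');
    // ... PHPExcel chunk-reading code goes here ...
    logToFile('parseEquipment finished');
} catch (Exception $e) {
    logToFile('parseEquipment failed: ' . $e->getMessage());
    throw $e;
}
```

Note that a fatal "allowed memory size exhausted" error is not a catchable exception, so for that specific error you still need the error_log() line above together with the server's error log; the try/catch handles ordinary exceptions thrown by the parser.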

FATAL ERROR: allocated bytes on simple php script

I am currently learning PHP, and am struggling to write the results of a function to an array. After a Google search, it seems that the function I have created has a bug that makes it build an array that is too large:
Fatal Error: Allowed Memory Size of 134217728 Bytes Exhausted (CodeIgniter + XML-RPC)
Like I said, I am currently learning PHP, and the resources for a custom-written script are fairly limited, so I am asking if anybody would be kind enough to give me a push in the right direction as to where my code is buggy...
function multiples($number) {
    echo "$number entered";
    $arr = array();
    for ($i = 1; $i = $number; $i++) {
        if ($i % 3 == 0) {
            $arr[] = $i;
        }
        else if ($i % 5 == 0) {
            $arr[] = $i;
        }
        else {}
    }
}
This returns the following when accessing the file through XAMPP, '$number' being the argument for the function...
'$number' entered
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 36 bytes) in C:\xampp\htdocs\assessment\one.php on line 12
for ($i = 1; $i <= $number; $i++) {
    if ($i % 3 == 0) {
        $arr[] = $i;
    }
    else if ($i % 5 == 0) {
        $arr[] = $i;
    }
}
Use <= instead of =. A single = assigns $number to $i, so the loop condition is always truthy and the loop never terminates, growing $arr until memory runs out; <= is a comparison, so the loop stops once $i exceeds $number. (Note that == would also be wrong here: the body would only run while $i is exactly equal to $number.) Also add return $arr; at the end of the function, since it currently returns nothing.
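Putting both fixes together, a corrected version of the function might look like this (a sketch; note the added return statement):

```php
function multiples($number) {
    $arr = array();
    for ($i = 1; $i <= $number; $i++) { // <= terminates the loop; a bare = never would
        if ($i % 3 == 0 || $i % 5 == 0) {
            $arr[] = $i; // collect multiples of 3 or 5
        }
    }
    return $arr; // the original function was missing this
}

print_r(multiples(10)); // multiples of 3 or 5 up to 10: 3, 5, 6, 9, 10
```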

GMP handle overflow

I have the following PHP script, which I call via the CLI:
#!/usr/bin/php
<?php
$max_stellen = 10;
for ($base = 2; $base <= 62; $base++) {
    for ($power = 2; $power <= 10; $power++) {
        $result = array();
        $max_base = gmp_pow($base, $max_stellen);
        $x = gmp_init(0);
        while (gmp_cmp($x, $max_base) == -1) {
            $val = gmp_strval($x, $base);
            $i = strlen($val);
            $left = gmp_pow($x, $power);
            $right = gmp_pow($base, $i);
            $mod = gmp_mod($left, $right);
            if (gmp_cmp($mod, $x) == 0) {
                $result[] = $val;
            }
            unset($left);
            unset($right);
            unset($mod);
            $x = gmp_add($x, 1); // !!! line 30
        }
        unset($x);
        unset($max_base);
        $res2 = array();
        foreach ($result as &$r) {
            $root = substr($r, -1);
            $res2[$root][] = $r;
        }
        unset($result);
        foreach ($res2 as $root => &$r) {
            echo "X^${power}_${base}($root) = {" . implode(', ', $r) . "}\n";
        }
        unset($res2);
        echo "\n";
    }
}
After a short time (at base=6, power=9), I get the following error message:
PHP Warning: gmp_add(): -2147483648 is not a valid GMP integer resource in ... on line 30
If I manually run the code with base=6 and power=9, it works, so the error only happens after the loop has been running for a while.
The error message suggests that GMP handles are being allocated but not freed once they are no longer used, so after a short time the handle counter runs out of integer range. But how do I free GMP handles? I have already tried unset() everywhere, and functions like gmp_free() or gmp_destroy() do not exist.
Update
I reported the issue to PHP, since I believe this is not expected behavior: https://bugs.php.net/bug.php?id=69702

out of memory issue with PHP script

My PHP script is running out of memory. I have put memory_get_usage() inside the loops and found that it runs out of memory after about 6 MB.
I would like to explain my script clearly and get suggestions from you.
I am not sure whether my script is running out of memory because of problems with the code.
So, here is the algorithm for my script:
1. Read the input JSON data into an array (approximately 200 lines of data).
2. Calculate the sum of the array ($sum).
3. Initialize three variables: $starting, $ending, and $compare (initialize $compare = $starting).
4. Start with $starting = $vmin, calculate the scaling factor, scale the original array, and calculate its sum. That new sum is $ending.
5. Add 10 to $starting until the difference between $ending and $compare is under 50.
6. Repeat all the steps until $vmin reaches $vmax, with an increment of $vinc.
The above algorithm might look insane, but it finds an optimal solution for a supply-chain optimization problem.
I would like you to check whether my weak coding skills are the reason for the memory outage. If so, please suggest changes to my script.
Here is my code
$vmax = 10000;
$vmin = 1000;
$vinc = 2000;

// decode the json data
$fp = fopen("data_save3.json", "r");
$inp = file_get_contents("data_save3.json");
fclose($fp);
$inp = json_decode($inp);

// calculate the array sum
foreach ($inp as $i => $element) {
    foreach ($element as $j => $sub_element) {
        $sum += $inp[$i][$j];
    }
}

// start at vmin and increment it until vmax
for (; $vmin <= $vmax;) {
    $starting = $vmin;
    $compare = $starting;
    // calculate scaling factor
    $scale = $starting / $sum;
    // calculate the scaled array
    $sum2 = 0;
    $inp_info2 = $inp;
    $ending = newscale($inp_info2, $scale, $sum2);
    $optimal = getClosest($starting, $ending, $compare, $sum);
    echo $optimal . PHP_EOL;
    $vmin = $vmin + $vinc;
}

// function to find the closest value
$inp_info1 = $inp;
function getClosest($starting, $compare, $ending, $sum, $inp) {
    global $sum2;
    global $compare;
    global $inp_info1;
    global $starting, $ending, $scale1, $sum, $inp, $inp_info2, $inp_info3, $rowsum2, $rowsum3, $rowsum4, $rowsum5;
    if (abs($ending - $compare) < 50) {
        return $starting;
    } else {
        $starting = $starting + 10;
        $scale1 = $starting / $sum;
        $inp_info1 = $inp;
        $sum2 = 0;
        $ending = newscale($inp_info1, $scale1, $sum2);
        return getClosest($starting);
    }
}

// array scaling function
function newscale($array, $scale, $sum2) {
    global $sum2, $scale;
    foreach ($array as $i => $element) {
        foreach ($element as $j => $sub_element) {
            $array[$i][$j] *= $scale;
            $array[$i][$j] = truncate($array[$i][$j]);
            $sum2 += $array[$i][$j];
        }
    }
    return $sum2;
}

// truncate function
function truncate($num, $digits = 0) {
    $shift = pow(10, $digits);
    return ((floor($num * $shift)) / $shift);
}

memory leak in Zend_Db_Table_Row?

This is the code I have:
<?php
$start = memory_get_usage();
$table = new Zend_Db_Table('user');
for ($i = 0; $i < 5; $i++) {
    $row = $table->createRow();
    $row->name = 'Test ' . $i;
    $row->save();
    unset($row);
    echo (memory_get_usage() - $start) . "\n";
}
This is what I see:
90664
93384
96056
98728
101400
Isn't that a memory leak? When I have 500 objects to insert into the DB in one script, I run out of memory. Can anyone help?
If you get a memory error when you insert 500 rows instead of 5, it really is a leak (though it could also be some caching). If memory usage climbs up and down instead, it is normal: the garbage collector is freeing the memory again.
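One way to see the two cases side by side in plain PHP (stdClass objects are a hypothetical stand-in for the Zend_Db_Table_Row objects; real framework rows may hold extra internal references that keep them alive after unset()):

```php
// Case 1: the only reference is dropped each iteration, so memory is
// reclaimed immediately and the measured usage stays roughly flat.
$start = memory_get_usage();
for ($i = 0; $i < 1000; $i++) {
    $row = new stdClass();
    $row->name = 'Test ' . str_repeat('x', 1000);
    unset($row);
}
$flat = memory_get_usage() - $start;

// Case 2: every object stays referenced from an array, so usage grows
// on each iteration -- the pattern of a real leak.
$kept = array();
for ($i = 0; $i < 1000; $i++) {
    $row = new stdClass();
    $row->name = 'Test ' . str_repeat('x', 1000);
    $kept[] = $row;
}
$growing = memory_get_usage() - $start;

echo "after unset: $flat bytes; keeping references: $growing bytes\n";
```

If the numbers printed inside the real insert loop look like case 2, something (the row, the table, or a cache inside the adapter) is still holding a reference to each saved row.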
