This is the code I have:
<?php
$start = memory_get_usage();
$table = new Zend_Db_Table('user');
for ($i = 0; $i < 5; $i++) {
    $row = $table->createRow();
    $row->name = 'Test ' . $i;
    $row->save();
    unset($row);
    echo (memory_get_usage() - $start) . "\n";
}
This is what I see:
90664
93384
96056
98728
101400
Isn't this a memory leak? When I have 500 objects to insert into the DB in one script, I run out of memory. Can anyone help?
If you get a memory error when you insert 500 rows instead of 5, it really is a leak (though it could be some caching, too). If memory usage climbs and falls instead, it is normal: the garbage collector is freeing the memory again.
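One possible culprit in Zend Framework 1 (only a guess here, since your adapter configuration isn't shown and the profiler is off by default) is the Zend_Db profiler: when enabled, it records every executed query for the lifetime of the script, so memory grows with each save(). A minimal sketch of turning it off:
// Sketch: disable the Zend_Db profiler so query records are not
// accumulated in memory across the insert loop.
$db = Zend_Db_Table::getDefaultAdapter();
$db->getProfiler()->setEnabled(false);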
I am currently migrating data from one database to another. The project is on Laravel, so I am writing a Laravel command for this. One table has about 700,000 records. I wrote the function with LIMIT and transactions to optimize the query, but I am still getting an out-of-memory error from PHP.
Here is my code:
ini_set('memory_limit', '750M'); // at beginning of file

$circuit_c = DB::connection('legacy')->select('SELECT COUNT(*) FROM tbl_info');
$count = (array) $circuit_c[0];
$counc = $count['COUNT(*)'];
$max = 1000;
$pages = ceil($counc / $max);
for ($i = 1; $i < ($pages + 1); $i++) {
    $offset = (($i - 1) * $max);
    $start = ($offset == 0 ? 0 : ($offset + 1));
    $infos = DB::connection('legacy')->select('SELECT * from tbl_info LIMIT ' . $offset . ', ' . $max);
    DB::connection('mysql')->transaction(function() use ($infos) {
        foreach ($infos as $info) {
            $validator = Validator::make($data = (array) $info, Info::$rules);
            if ($validator->passes()) {
                if ($info->record_type == 'C') {
                    $b_user_new = Info::create($data);
                    unset($b_user_new);
                }
            }
            unset($info);
            unset($validator);
        }
    });
    unset($infos);
}
This is the error:
user#lenovo /var/www/info $ php artisan migratedata
PHP Fatal error: Allowed memory size of 786432000 bytes exhausted (tried to allocate 32 bytes) in /var/www/info/vendor/laravel/framework/src/Illuminate/Database/Grammar.php on line 75
The error shows up after importing about 50,000 records.
There is a kind of "memory leak" in here. You need to find out which variable is hogging all of this memory. Try this function to debug and see which variable keeps growing:
function sizeofvar($var) {
    // Roughly measure how many bytes a deep copy of $var occupies.
    $start_memory = memory_get_usage();
    $tmp = unserialize(serialize($var));
    return memory_get_usage() - $start_memory;
}
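For example (hypothetical usage, reusing $infos from the loop above), you could print the footprint of a suspect variable on each page:
// Hypothetical usage: track the approximate size of $infos per page.
echo 'infos: ' . sizeofvar($infos) . " bytes\n";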
Once you know which variable is taking all the memory, you can start implementing appropriate measures.
Found the answer: Laravel keeps a log of every executed query, so just call DB::connection()->disableQueryLog();
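Since the migration above reads from one connection and writes to another, it is likely worth disabling the log on both before the loop. A minimal sketch (Laravel 4.x, where the query log is on by default; connection names taken from the question):
// Sketch: turn off the in-memory query log on both connections so it
// does not grow by one entry per statement during the migration.
DB::connection('legacy')->disableQueryLog();
DB::connection('mysql')->disableQueryLog();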
I'm using clustered Redis. All I want to do is add a new value to a limited set.
This is my code:
$redis->watch($keyMem);
$count = $redis->scard($keyMem);
if ($count < $limit) {
    $redis->multi()
        ->sadd($keyMem, $value)
        ->exec();
}
and I get:
"cannot use 'watch' over clusters of connections."
I couldn't find any solution, so I coded my own lock:
$keyLock = $keyMem . "lock";
$start_time = microtime(true);
while (true) {
    if ($redis->setnx($keyLock, "1")) {
        $count = $redis->scard($keyMem);
        if ($count < $limit) {
            $r = $redis->sadd($keyMem, $value);
        }
        $redis->del($keyLock);
        break;
    } else {
        // Assume the lock is stale after 50 ms and clear it.
        if (microtime(true) - $start_time > 0.05) {
            $redis->del($keyLock);
        }
    }
}
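A safer variant of this lock (a sketch, assuming the phpredis extension, whose set() accepts NX and PX options) lets the lock expire on its own, so a crashed client cannot leave it stuck:
// Sketch: acquire a lock that auto-expires after 50 ms; only the holder
// proceeds, and the expiry replaces manual stale-lock cleanup.
if ($redis->set($keyLock, "1", array('nx', 'px' => 50))) {
    if ($redis->scard($keyMem) < $limit) {
        $redis->sadd($keyMem, $value);
    }
    $redis->del($keyLock);
}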
I am trying to construct a PHP script that will page through an API. The API returns ~25,197 XML records. I am able to pass a start_offset and an end_offset to the API, which will return a subset of the results.
The challenge I am having is that the for loop does not capture the remaining records that fall outside the blocks of 1000.
For example, the current for loop processes the records in blocks of 1000 (0-1000, 1001-2000, 2001-3000, etc.), but I am not able to get the final block, 25,000 to 26,000. The for loop stops processing at 24,000-25,000, which leaves me with 197 unprocessed XML results.
<?php
// Set start and offset parameters
$start_offset = 0;
$end_offset = 0;
$items_per_page = 1000;
$number = 0;
$counter = -2;
for ($count = 0; $count <= 100; $count++) {
    $counter++;
    // Validate that the counter is not null
    if ($number != null) {
        echo "\n";
        echo file_get_contents($static_url . "/sc_vuln_query-compliance.php?start=$start_offset&end=$end_offset&seq=$counter");
    }
    // Initialize the start and end offset variables
    $end_offset = $number += $items_per_page;
    $start_offset = $number - $items_per_page + 1;
    // We want to start at record 0, reset start_offset back to 0 instead of 1
    if ($start_offset == 1) {
        $start_offset = $number - $items_per_page;
    }
    // We are at the end of the total records, display the remaining
    if ($number > $total_xml_records) {
        $counter = $counter + 1;
        $padding = $end_offset + $items_per_page;
        echo "\n";
        echo file_get_contents($static_url . "/sc_vuln_query-compliance.php?start=$start_offset&end=$padding&seq=$counter");
        break;
    }
}
?>
Your code is buggy at least at this line: $padding = $end_offset + $items_per_page; - here you raise the end for your last loop by another 1000 and therefore request ?start=25001&end=27000&seq=25.
Try $padding = $total_xml_records; instead; this will get you ?start=25001&end=25197&seq=25.
In any case, your code is more complicated than it needs to be. Try this:
$total_xml_records = 25197; // index 0 .. 25196
$items_per_page = 1000;
$offset = 0;
$counter = 0;
while ($offset < $total_xml_records) {
    echo "\n";
    echo $static_url . "/sc_vuln_query-compliance.php?start=" . $offset . "&end=" . min($offset + $items_per_page - 1, $total_xml_records - 1) . "&seq=" . ($counter++);
    $offset += $items_per_page;
}
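For example, with 25,197 records and pages of 1000, this loop issues 26 requests (seq 0 through 25); the final one is ?start=25000&end=25196&seq=25, which covers the trailing 197 records the original loop missed.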
I have a question about memory allocation in a PHP 5.3 script.
Imagine you have 2 static classes (MyData and Test) like these:
class MyData {
    private static $data = null;

    public static function getData() {
        if (self::$data == null)
            self::$data = array(1, 2, 3, 4, 5);
        return self::$data;
    }
}

class Test {
    private static $test_data = null;

    public static function getTestData1() {
        if (self::$test_data == null) {
            self::$test_data = MyData::getData();
            self::$test_data[] = 6;
        }
        return self::$test_data;
    }

    public static function getTestData2() {
        $test = MyData::getData();
        $test[] = 6;
        return $test;
    }
}
And a simple test.php script:
for ($i = 0; $i < 200000; $i++) {
    echo "Pre-data1 Test:\n\t" . memory_get_usage(true) . "\n";
    Test::getTestData1();
    echo "Post-data1 Test:\n\t" . memory_get_usage(true) . "\n";
}

for ($i = 0; $i < 200000; $i++) {
    echo "Pre-data2 Test:\n\t" . memory_get_usage(true) . "\n";
    Test::getTestData2();
    echo "Post-data2 Test:\n\t" . memory_get_usage(true) . "\n";
}
I would expect the call to Test::getTestData1() to allocate memory for two static variables, while Test::getTestData2() destroys $test (the copy of the static variable) on function return, so the second call should be less memory-expensive.
But when I run the test.php script, memory_get_usage shows the same values for Test::getTestData1() and Test::getTestData2().
Why?
You are testing memory usage the wrong way.
Use memory_get_usage(false) to get the memory that is actually used by the script.
memory_get_usage(true) returns the memory allocated from the system, which grows in larger chunks and will look the same for small scripts.
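A minimal sketch of the difference: the real figure grows with each allocation, while the system figure only jumps when PHP grabs another chunk from the OS:
// Compare real usage (false) with system-allocated memory (true)
// before and after building a sizable array.
printf("before: used=%d allocated=%d\n", memory_get_usage(false), memory_get_usage(true));
$a = array();
for ($i = 0; $i < 100000; $i++) {
    $a[] = $i;
}
printf("after:  used=%d allocated=%d\n", memory_get_usage(false), memory_get_usage(true));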
This is extremely basic, I'm sure, but I have never used PHP before and I'm finding it hard. The code I am using gives me an error and I'm unsure how to fix it.
<?php
$query = 'dogs';
$searches = 100; // number of results
$start = 0;
$pos = 1;
while ($start < $searches) {
    $data = getPage('http://www.google.com/search?start=' . $start . '&q=' . urlencode($query));
    preg_match_all("/\<li class\=g\>\<h3 class\=\"r\"\>\<a href\=\"([^\<\>]*)\" class\=l\>/", $data, $matches);
    for ($x = 0; $x < count($matches[1]); $x++) {
        echo '<p>' . $pos . ' ' . ($matches[1][$x]) . '</p>';
        $pos++;
    }
    $start += 10;
}
?>
Error:
Call to undefined function getPage() on line 11
Any help?
There is no "getPage" function in PHP (unless you defined it).
It looks like the function file_get_contents() is what your going for.
Is the getPage() function being defined somewhere else that you're not including?
My guess is you want to use file_get_contents() instead.
There's no getPage() function in your code or in PHP's standard library; you have to define one in order to call it.
See file_get_contents() or fopen().
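For illustration, here is one hypothetical way to define such a helper (a sketch; the User-Agent header is an assumption, and Google may still block scripted requests):
// Hypothetical getPage() helper: fetch a URL with file_get_contents(),
// sending a browser-like User-Agent header, and return the body.
function getPage($url) {
    $context = stream_context_create(array(
        'http' => array('header' => "User-Agent: Mozilla/5.0\r\n"),
    ));
    return file_get_contents($url, false, $context);
}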