I'm working with an XML string variable that normally occupies about 1.4 MB, and this is what happens to me.
When this part of the script is executed:
echo memory_get_usage()." - ";
$aux_array['T3']=substr($xml, $array_min["T3"], strpos($xml, "</second>", $contador)-$array_min["T3"]);
print_r(memory_get_usage());
The display is
5059720 - 5059896
But when this part is executed:
echo memory_get_usage()." - ";
$aux_array['US']=substr($xml, $array_min["US"], strpos($xml, "</second>", $contador)-$array_min["US"]);
print_r(memory_get_usage());
The display is
5059896 - 6417152
To me, both statements are the same, but memory_get_usage() doesn't lie.
The exact output is:
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted
(tried to allocate 1305035 bytes) in
/var/www/html/devel/soap/index.php on line 138
This happens because a while loop triggers an allocation of that size many times.
Can you figure out what the problem is?
The problem is that you are holding ~268 MB of data in your PHP script, even though your input XML file is only 1.4 MB. Clear/remove variables you no longer need to free up some memory while your script is running. Or change the way you work with the XML file: instead of loading the whole file at once, use an XMLReader, which walks over the XML nodes one at a time, as sketched below.
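A minimal sketch of that streaming approach, assuming a file name and the <second> element taken from the question (the real node names may differ):
$reader = new XMLReader();
$reader->open('input.xml'); // assumed file name
while ($reader->read())
{
    // Only react to the element we care about instead of keeping the whole string
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'second')
    {
        $fragment = $reader->readOuterXml(); // just this node, not the whole document
        // ... process $fragment ...
    }
}
$reader->close();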
Using php.exe 5.2.17.17 on Windows 7, this:
include_once('simple_html_dom.php');
function fix($setlink)
{
$setaddr = $setlink->href;
$filename="..\\".urldecode($setaddr);
$set=file_get_contents($filename);
$setstr = str_get_html($set);
// Do stuff requiring whole file
unset($set);
unset($setstr);
}
$setindexpath = "..\index.htm";
foreach(file_get_html($setindexpath)->find('a.setlink') as $setlink)
{
fix($setlink);
}
(relying on external data files) fails thus:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in [snip]\simple_html_dom.php on line 620
"function fix" is a suggestion from the answer to a similar question here. unset() is wishful thinking :-)
How can I avoid the continuing memory consumption from strings that are no longer needed on the next loop iteration, without defacing the code too much, and while still providing the whole file as a string?
Try $setstr->clear(); before unset($setstr);
See http://simplehtmldom.sourceforge.net/manual_faq.htm#memory_leak
Side note: $setstr seems to be a misnomer; it's not a string but the DOM representation of the HTML doc.
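A minimal sketch of fix() with that clear() call added (everything else as in the question):
function fix($setlink)
{
    $setaddr = $setlink->href;
    $filename = "..\\".urldecode($setaddr);
    $set = file_get_contents($filename);
    $setstr = str_get_html($set);
    // Do stuff requiring whole file
    $setstr->clear(); // let simple_html_dom release its internal node references
    unset($setstr);
    unset($set);
}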
I created a local web application to display and interact with data that I have in a big JSON file (around 250 MB). I have several functions to display the data in different ways, but currently each one starts by reading and parsing the JSON file:
$string = file_get_contents("myFile.json");
$json_a = json_decode($string, true);
The thing is that this is quite slow (about 4 seconds) because the file is big. So I would like to parse the JSON file once and for all and store the parsed data in memory so that each function could use it:
session_start();
$string = file_get_contents("myFile.json");
$_SESSION['json_a'] = json_decode($string, true);
and then use $_SESSION['json_a'] in my functions.
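For illustration, a function would then consume the cached copy roughly like this (the function name and the loop body are made up):
function displayData()
{
    $json_a = $_SESSION['json_a']; // already-decoded array, no file read or json_decode()
    foreach ($json_a as $key => $entry)
    {
        // ... display $entry ...
    }
}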
But I get an error when a function accesses this $_SESSION['json_a'] variable:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 195799278 bytes)
I assume it is because the file is too big... but why does it crash when the variable is used and not when it is built? And why does my first solution (parsing on every request) work if the data is too big for memory?
And finally, my real question: how can I optimise this? (I know an SQL database would be much better, but I happen to have JSON data.)
I am receiving the following error on a script when it reaches the bind section of this code.
Error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 4294967296 bytes)
Code:
$result = $dbc->prepare('SELECT make, line, model, name, year, image, processor_family, processor_speeds, shipped_ram, max_ram, ram_type, video_card, video_memory, screen_type, screen_size, hard_drives, internal_drives, external_drives, audio_card, speakers, microphone, networking, external_ports, internal_ports, description, driver_link, manufacturer_link FROM laptop_archive where name=?');
$result->bind_param('s', $name);
$result->execute();
$result->bind_result($make, $line, $model, $name, $year, $image, $processor_family, $processor_speeds, $shipped_ram, $max_ram, $ram_type, $video_card, $video_memory, $screen_type, $screen_size, $hard_drive, $internal_drives, $external_drives, $audio_card, $speakers, $microphone, $networking, $external_ports, $internal_ports, $description, $driver_link, $manufacturer_link);
The portion of the database it is attempting to access has a number of fields to retrieve, although none of them contain a large amount of data; the majority are around 20 characters and the description is around 300 characters.
I have searched and found a few answers, although none of them have worked for me. The code is running on a VPS with 512 MB RAM and a memory limit of 256 MB as set in php.ini.
The amount of memory to be allocated is 4 GB, which seems extremely excessive for what is in the database. Am I missing something stupid here?
Normally this shouldn't take much memory, because you haven't fetched the data yet.
$res = $mysqli->query("SELECT * FROM laptop_archive WHERE name=[REPLACE THIS] LIMIT 1");
$row = $res->fetch_assoc();
Please try this one :)
The issue appears to be caused by the use of LONGTEXT in the database; after changing the type to VARCHAR the issue went away.
I had this error two days ago. From what I have learned, the problem lies in the size of the column you are selecting from. In my case I was selecting a PDF and had the column as a LONGBLOB, which is 4294967296 bytes; the server was allocating that much to grab the file regardless of the actual file size. I had to change the column to a MEDIUMBLOB and it works fine. In your case it looks like the culprit is the image column, so I would change that to a MEDIUMBLOB and it should work fine. Otherwise you would have to configure your server to allow bigger allocations.
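For example, a one-off change along these lines (a sketch using the question's $dbc connection, assuming the column is currently a LONGBLOB; adjust the column name and target type to the actual schema):
// shrink the column so the driver no longer pre-allocates 4 GB per row when binding
$dbc->query('ALTER TABLE laptop_archive MODIFY image MEDIUMBLOB');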
I have been running a loop where I inserted 50,000 data sets into a MySQL database. Suddenly the server on my hosted space shows this message:
Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 32 bytes) in /srv/www/htdocs/server/html/email.php on line 14
I know what it means, but how can I reset all the memory so it can be used again?
Note:
I do not want to increase my memory, like with
ini_set('memory_limit', '-1');
but want to completely release it.
When you have a shared system from a provider, in most cases you cannot set the memory limit yourself. If every one of the 100 people on the server set a higher memory limit, the provider would have a problem.
So they give you a fixed size, in your case 20 MB, and disable changes to the memory_limit setting.
20 MB is not much, and when you try to insert a lot of datasets you need some memory. That is why the message says it tried to allocate 32 bytes beyond the allowed 20971520 bytes.
I think with 20 MB there is no really good solution to this problem. Perhaps you can try to move some logic into the database and create stored procedures.
I know this is an old thread, but I found it so I think it's worth posting for others.
I came across this error whilst trying to carry out a bulk system update to a single field in every record.
The records were read into an array, processed and updated back to the database.
I believe the error was caused by too much data stored in the array on the server.
To get over the problem, I split the task up to handle 1000 records at a time.
i.e. (code tested):
$r_count = 1; $start_id = 0; $end_id = 1000;

while ($r_count > 0)
{
    // Fetch the next batch of up to 1000 records
    $strSQL = "SELECT * from spAddbook
               WHERE id > '$start_id' AND id <= '$end_id'";
    $result = mysql_query($strSQL, $link) or die("Couldn't read because ".mysql_error());
    $r_count = mysql_num_rows($result);
    echo "<br>$r_count found<br>";

    if (mysql_num_rows($result))
    {
        while ($qry = mysql_fetch_array($result))
        {
            // Read the data and carry out processing
        }
    }
    // Move the id window on to the next batch
    $start_id += 1000; $end_id += 1000;
}
echo "<br><br>All Done";
I hope this helps. It worked for me.