Avoiding wasteful memory consumption - php

Using php.exe 5.2.17.17 on Windows 7, this:
include_once('simple_html_dom.php');
function fix($setlink)
{
    $setaddr = $setlink->href;
    $filename = "..\\" . urldecode($setaddr);
    $set = file_get_contents($filename);
    $setstr = str_get_html($set);
    // Do stuff requiring whole file
    unset($set);
    unset($setstr);
}
$setindexpath = "..\index.htm";
foreach (file_get_html($setindexpath)->find('a.setlink') as $setlink)
{
    fix($setlink);
}
(relying on external data files) fails thus:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes) in [snip]\simple_html_dom.php on line 620
"function fix" is a suggestion from the answer to a similar question here. unset() is wishful thinking :-)
How can I stop the strings that are no longer needed on the next loop iteration from continuing to consume memory, without defacing the code too much and while still providing the whole file as a string?

Try $setstr->clear(); before unset($setstr);
See http://simplehtmldom.sourceforge.net/manual_faq.htm#memory_leak
Side note: $setstr seems to be a misnomer; it's not a string but the DOM representation of the HTML document.
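For illustration, a minimal sketch of how the suggested clear() call might fit into fix() (the rest of the function is unchanged from the question):

function fix($setlink)
{
    $setaddr = $setlink->href;
    $filename = "..\\" . urldecode($setaddr);
    $set = file_get_contents($filename);
    $setstr = str_get_html($set);
    // Do stuff requiring whole file
    $setstr->clear();   // releases the circular node references simple_html_dom keeps internally
    unset($setstr);
    unset($set);
}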

Related

Allocate memory failure

Working with an XML string variable that normally occupies about 1.4 MB, the following happens to me.
When this part of the script is executed:
echo memory_get_usage()." - ";
$aux_array['T3']=substr($xml, $array_min["T3"], strpos($xml, "</second>", $contador)-$array_min["T3"]);
print_r(memory_get_usage());
The display is
5059720 - 5059896
But when this part is executed:
echo memory_get_usage()." - ";
$aux_array['US']=substr($xml, $array_min["US"], strpos($xml, "</second>", $contador)-$array_min["US"]);
print_r(memory_get_usage());
the display is
5059896 - 6417152
To me, both statements are the same, but memory_get_usage() doesn't lie.
the exact output is:
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted
(tried to allocate 1305035 bytes) in
/var/www/html/devel/soap/index.php on line 138
That happens because a while loop allocates that amount many times.
Can you figure out what the problem is?
The problem is that you are holding ~268MB of data in your PHP script, even though your input XML file is only 1.4MB big. Clear/remove variables you don't need anymore to clean up some memory while your PHP script is running. Or change the way you work with the XML file. Instead of loading the whole XML file at once you might want to use an XMLReader which walks over the XML nodes instead.
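A minimal sketch of that streaming approach, assuming the element of interest is the <second> tag seen in the snippets above (the file name here is made up):

$reader = new XMLReader();
$reader->open('data.xml');   // hypothetical file name
while ($reader->read()) {
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'second') {
        $fragment = $reader->readOuterXml();   // only this one node is held in memory
        // process $fragment here, then let it go out of scope
    }
}
$reader->close();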

Make an array out of contents of a very large file

I have a 16GB file. I'm trying to make an array out of it, split at line breaks. Right now I'm using file_get_contents with preg_split, as follows.
$list = preg_split('/$\R?^/m', file_get_contents("file.txt"));
However, I get the following error:
Fatal error: Allowed memory size of 10695475200 bytes exhausted (tried to allocate 268435456 bytes) in /var/www/html/mysite/script.php on line 35
I don't want to use too much memory. I know you can buffer it with fopen, but I'm not sure how to create an array from the contents of the file with a line break as the delimiter.
The question does not address how I would make an array from the contents of the file using preg_split, as I do above.
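A minimal sketch of the fopen/fgets route, processing each line as it is read instead of collecting them all (a 16GB file won't fit into one array no matter how it is split):

$fp = fopen("file.txt", "r");
if ($fp) {
    while (($line = fgets($fp)) !== false) {
        $line = rtrim($line, "\r\n");
        // handle one line here instead of appending it to $list
    }
    fclose($fp);
}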

Attempting to parse large file to array in PHP/Wordpress. Keep getting memory error

I am trying to get a lot of words into a database as individual entries.
The file is a 17MB text file of comma separated words.
The PHP is:
$file = file_get_contents(plugins_url('/cipher.txt', __FILE__));
$words = explode(',', $file);
foreach ($words as $word) {
    $wpdb->query("INSERT IGNORE INTO cipher_words (word) VALUES (\"" . $word . "\")");
}
I keep running into a memory error similar to:
[20-Feb-2016 15:26:26 UTC] PHP Fatal error: Out of memory (allocated 247726080) (tried to allocate 16777216 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
[20-Feb-2016 15:26:29 UTC] PHP Fatal error: Out of memory (allocated 139460608) (tried to allocate 8388608 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
[20-Feb-2016 15:26:29 UTC] PHP Fatal error: Out of memory (allocated 247726080) (tried to allocate 16777216 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
Is there a better way to handle such a large array? Something maybe asynchronous?
Would a CSV work better, or would it just meet the same limit?
I have tried increasing PHP limits and WP limits to no avail.
Edit: The question marked as a duplicate does not get a memory error. Or any error at all.
Here is a quick Python script if you want to give that a try; it usually handles larger amounts of data better than PHP.
import string
import mysql.connector
# Database connection
cnx = mysql.connector.connect(user='admin', password='password', database='python')
cursor = cnx.cursor()
# Get the file contents
with open('cipher.txt', 'r') as content_file:
    content = content_file.read()
# 'Explode' the string
content_array = string.split(content, ',')
# Foreach item, insert into db
for x in content_array:
    query = "INSERT IGNORE INTO cipher_words (word) VALUES ('" + x + "');"
    cursor.execute(query)
# Make sure data is committed to the database
cnx.commit()
# Close database connections
cursor.close()
cnx.close()
Are you open to increasing your PHP memory limit just for the import?
You can try putting this at the top of your PHP file.
ini_set('memory_limit', '128M');
Or you can change it globally in php.ini. (\xampp\php\php.ini)
memory_limit = 128M
You'll probably need to restart Apache after changing the global memory limit.
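If raising the limit isn't enough, a rough sketch that reads the word list piece by piece with stream_get_line() instead of file_get_contents() (this assumes a plain local path via plugin_dir_path() rather than the URL from plugins_url(), and reuses the table name from the question):

$fp = fopen(plugin_dir_path(__FILE__) . 'cipher.txt', 'r');
if ($fp) {
    // read up to the next comma each time, so the whole file never sits in memory
    while (($word = stream_get_line($fp, 1024, ',')) !== false) {
        $wpdb->query($wpdb->prepare("INSERT IGNORE INTO cipher_words (word) VALUES (%s)", trim($word)));
    }
    fclose($fp);
}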

php-excel-reader Allowed memory size of 134217728 bytes exhausted

require_once 'Excel/reader.php';
$data = new Spreadsheet_Excel_Reader();
$data->read('Senator.xls');
I get the following error in my error.log
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
The weird thing is, this works perfectly fine on my development instance. But not in production. What differences should I be looking for?
note: both envs have memory_limit=128M
Probably one server has a 64 bit processor. The GetInt4d bit shift doesn't work with 64 bit processors.
Use this hack to ensure a correct result of the <<24 block on both 32- and 64-bit systems. Replace the code of the GetInt4d() function (location: Excel/olereader.inc, line 27) with the following:
$_or_24 = ord($data[$pos+3]);
if ($_or_24>=128) $_ord_24 = -abs((256-$_or_24) << 24); else $_ord_24 = ($_or_24&127) << 24;
return ord($data[$pos]) | (ord($data[$pos+1]) << 8) | (ord($data[$pos+2]) << 16) | $_ord_24;
It depends on which line is giving the error; mine is
PHP Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 32 bytes) in [] on line 1688
It is the function
addcell($row, $col, $string, $info=null) {
in particular the foreach loop. As I understand it, it's info about the cell color offset or something similar, so I commented out the loop and now it uses much less memory.
If you don't need that code to run, you can comment it out and try.

Memory exhausted error for json_parse with PHP

I have the following code:
<?php
$FILE="giant-data-barf.txt";
$fp = fopen($FILE,'r');
//read everything into data
$data = fread($fp, filesize($FILE));
fclose($fp);
$data_arr = json_decode($data);
var_dump($data_arr);
?>
The file giant-data-barf.txt is, as its name suggests, a huge file (it's 5.4 MB right now, but it could go up to several GB).
When I execute this script, I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in ........./data.php on line 12
I looked at possible solutions, and saw this:
ini_set('memory_limit','16M');
and my question is, is there a limit to how big I should set my memory? Or is there a better way of solving this problem?
THIS IS A VERY BAD IDEA. That said, you'll need to set
ini_set('memory_limit',filesize($FILE) + SOME_OVERHEAD_AMOUNT);
because you're reading the entire thing into memory. You may very well have to set the memory limit to two times the size of the file, since you also want to json_decode() it.
NOTE THAT ON A WEB SERVER THIS WILL CONSUME MASSIVE AMOUNTS OF MEMORY AND YOU SHOULD NOT DO THIS IF THE FILE WILL BE MANY GIGABYTES AS YOU SAID!!!!
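As a rough sketch of the limit bump described above (the doubling and the extra margin are guesses at decode overhead, not measured values):

$FILE = "giant-data-barf.txt";
$overhead = 32 * 1024 * 1024;                                        // hypothetical safety margin
ini_set('memory_limit', (string)(filesize($FILE) * 2 + $overhead));  // raw string plus decoded copy
$data = file_get_contents($FILE);
$data_arr = json_decode($data);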
Is it really a giant JSON blob? You should look at converting it to a database or another format that supports random or row-level access before parsing it with PHP.
I've given all my servers a memory_limit of 100M... haven't run into trouble yet.
I would consider splitting up that file somehow, or getting rid of it and using a database.
