Memory limit while indexing solrfal in TYPO3 - php

I use TYPO3 7.6 with solr 6.1.3 and solrfal 4.1.0. Now I get a PHP memory limit error every time I run the solrfal scheduler task. It is stuck at 57%. I debugged and deleted the last file it tried to index, but the error was thrown again with the next file.
I got the error:
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 464255211 bytes) in /var/www/web1/htdocs/leipzig.ihk.de/typo3conf/ext/solr/Classes/ExtractingQuery.php on line 104
On this line, file_get_contents() throws the error. The file is only 90KB. Does anybody have an idea?

I'd recommend not uploading such a large file.

I would try reducing the number of items per run, or increasing the memory limit.

You need to increase memory_limit in your php.ini file.

Found the error. In /typo3conf/ext/solrfal/Classes/Queue/ItemRepository.php on line 154, the first "merge_id" was empty. I don't know how that happens. I wrapped the line in an if statement and now it works again:
if ($mergeId['merge_id']) {
    $rows[] = $mergeId['merge_id'];
}
Another solution would be to add merge_id > 0 to the WHERE statement.

The previously proposed solution does not fix the problem. To me it looks like the merge_id is not set in the database, and therefore all items get merged.
By default the merge_id has the following format:
'type/fileUid/languageUid/rootPageId'
If there are items in the file index queue without a merge_id, you should clear the file index queue and fill it again.

Related

Allocate memory failure

Working on an XML string variable that normally occupies 1.4 MB, this happens to me.
When this part of the script is executed:
echo memory_get_usage()." - ";
$aux_array['T3']=substr($xml, $array_min["T3"], strpos($xml, "</second>", $contador)-$array_min["T3"]);
print_r(memory_get_usage());
The display is
5059720 - 5059896
But with this part:
echo memory_get_usage()." - ";
$aux_array['US']=substr($xml, $array_min["US"], strpos($xml, "</second>", $contador)-$array_min["US"]);
print_r(memory_get_usage());
the display is
5059896 - 6417152
To me both statements are the same, but memory_get_usage() doesn't lie.
The exact output is:
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted
(tried to allocate 1305035 bytes) in
/var/www/html/devel/soap/index.php on line 138
because a while loop allocates that size many times.
Can you figure out what the problem is?
The problem is that you are holding ~268MB of data in your PHP script, even though your input XML file is only 1.4MB big. Clear/remove variables you don't need anymore to clean up some memory while your PHP script is running. Or change the way you work with the XML file. Instead of loading the whole XML file at once you might want to use an XMLReader which walks over the XML nodes instead.
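Following the XMLReader suggestion, here is a minimal sketch. The element and attribute names are assumptions based on the `</second>` tag in the question; only the current node is ever expanded, not the whole document.

```php
<?php
// Stream over <second> elements instead of running substr() on the whole blob.
// Element name "second" and attribute "country" are illustrative assumptions.
$xml = '<root><second country="T3">foo</second><second country="US">bar</second></root>';

$reader = new XMLReader();
$reader->XML($xml); // for a file on disk, use $reader->open($path) instead

$aux_array = [];
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'second') {
        // readString() returns just this node's text content
        $aux_array[$reader->getAttribute('country')] = $reader->readString();
    }
}
$reader->close();

print_r($aux_array);
```

With `$reader->open($path)` the file is pulled through a stream, so memory stays flat no matter how large the XML grows.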

Attempting to parse large file to array in PHP/Wordpress. Keep getting memory error

I am trying to get a lot of words into a database as individual entries.
The file is a 17MB text file of comma separated words.
The PHP is:
$file = file_get_contents(plugins_url('/cipher.txt', __FILE__));
$words = explode(',', $file);
foreach ($words as $word) {
    $wpdb->query("INSERT IGNORE INTO cipher_words (word) VALUES (\"" . $word . "\")");
}
I keep running into a memory error similar to:
[20-Feb-2016 15:26:26 UTC] PHP Fatal error: Out of memory (allocated 247726080) (tried to allocate 16777216 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
[20-Feb-2016 15:26:29 UTC] PHP Fatal error: Out of memory (allocated 139460608) (tried to allocate 8388608 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
[20-Feb-2016 15:26:29 UTC] PHP Fatal error: Out of memory (allocated 247726080) (tried to allocate 16777216 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
Is there a better way to handle such a large array? Something maybe asynchronous?
Would a CSV work better, or would it just meet the same limit?
I have tried increasing PHP limits and WP limits to no avail.
Edit: The question marked as a duplicate does not get a memory error. Or any error at all.
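One way to stay under the limit in PHP itself is to stream the file instead of using file_get_contents(), and batch the inserts. A sketch; the helper name, the batch callback (e.g. one multi-row INSERT IGNORE via $wpdb), and the batch size are illustrative assumptions:

```php
<?php
// Stream a comma-separated word list and hand the words to $insertBatch in
// chunks, so neither the file nor the word array is ever fully in memory.
function import_words(string $path, callable $insertBatch, int $batchSize = 500): int
{
    $fh = fopen($path, 'r');
    $batch = [];
    $total = 0;
    // stream_get_line reads up to the next comma without loading the whole file
    while (($word = stream_get_line($fh, 4096, ',')) !== false) {
        $word = trim($word);
        if ($word === '') {
            continue;
        }
        $batch[] = $word;
        if (count($batch) >= $batchSize) {
            $insertBatch($batch); // e.g. build one multi-row INSERT IGNORE here
            $total += count($batch);
            $batch = [];
        }
    }
    if ($batch) {
        $insertBatch($batch);
        $total += count($batch);
    }
    fclose($fh);
    return $total;
}
```

Batching also cuts the number of round-trips to MySQL, which is usually the slow part of a 17MB import.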
Here is a quick Python script if you want to give that a try; it usually handles larger amounts of data better than PHP.
import mysql.connector

# Database connection
cnx = mysql.connector.connect(user='admin', password='password', database='python')
cursor = cnx.cursor()

# Get the file contents
with open('cipher.txt', 'r') as content_file:
    content = content_file.read()

# 'Explode' the string
content_array = content.split(',')

# For each item, insert into db (parameterized to avoid quoting problems)
for x in content_array:
    cursor.execute("INSERT IGNORE INTO cipher_words (word) VALUES (%s)", (x,))

# Make sure data is committed to the database
cnx.commit()

# Close database connections
cursor.close()
cnx.close()
Are you open to increasing your PHP memory limit just for the import?
You can try putting this in the top of your php file.
ini_set('memory_limit', '128M');
Or you can change it globally in php.ini. (\xampp\php\php.ini)
memory_limit = 128M
You'll probably need to restart apache after changing the global memory limit.

Filling Array Allowed memory size of

I have about 150,000 records in my table and I want to fill them into an array and then search in it, but because the data is too large I am getting
PHP Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 39 bytes) in /home//.....// on line 21
I am trying to fill my array via this code:
$myArray = array();
while ($row = mysql_fetch_array($myQueryName)) {
    $myArray[$row['WordName']] = $row;
    echo $row['WordName'];
}
To avoid that problem I think I can split the array filling into two or three parts, like:
0 - 50,000: array 1
50,000 - 100,000: array 2
100,000 - 150,000: array 3
and then search in these three arrays.
How can I solve it?
I totally advise against using the PHP script to search the data; use the database engine for that, it's optimized for it.
If you want to change the memory limit that PHP uses for a script, you can set memory_limit in php.ini (on Linux, php.ini is located in /etc/php/).
memory_limit = 128M ; this will be global
If you want it per script, use ini_set() at the beginning:
ini_set("memory_limit", "128M");
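To illustrate letting the database engine do the search: a sketch using an in-memory SQLite table purely for illustration; with MySQL the same prepared-statement pattern applies via PDO or mysqli.

```php
<?php
// Look up one word with an indexed query instead of loading 150,000 rows
// into a PHP array. Table and column names follow the question.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE words (WordName TEXT PRIMARY KEY)');

$insert = $pdo->prepare('INSERT INTO words (WordName) VALUES (?)');
foreach (['alpha', 'beta', 'gamma'] as $w) {
    $insert->execute([$w]);
}

// Only the matching row travels to PHP, not the whole table.
$stmt = $pdo->prepare('SELECT WordName FROM words WHERE WordName = ?');
$stmt->execute(['beta']);
$found = $stmt->fetchColumn();
var_dump($found);
```

The PRIMARY KEY (or any index on WordName) makes this a lookup rather than a scan, so memory use in PHP stays constant no matter how many rows the table holds.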

Allowed memory size exhausted error < how to reset?

I have been running a loop where I inserted 50,000 data sets into a MySQL database. Suddenly the server on my hosted space shows this message:
Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 32 bytes) in /srv/www/htdocs/server/html/email.php on line 14
I know what it means, but how can I reset all the memory so it can be used again?
Note:
I do not want to increase my memory, like with
ini_set('memory_limit', '-1');
but want to completely release it.
When you have a shared system from a provider, in most cases you cannot set the memory limit yourself. If every one of 100 people on the server set a higher memory limit, the provider would have a problem.
So they give you a fixed size, in your case 20MB, and disable the memory_limit function.
20MB is not much, and when you try to insert a lot of data sets you need some memory. That is why the message reports 32 bytes out of the allowed 20971520 bytes.
I think with 20MB there is no really good solution to this problem. Perhaps you can try to move some logic into the database and write stored procedures.
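Since the limit itself cannot be raised here, memory has to be released while the loop runs. A minimal sketch of freeing each batch with unset() so it does not accumulate; all names and sizes are illustrative:

```php
<?php
// Build the data in batches and unset() each batch after it is flushed,
// so memory is reclaimed instead of growing with every iteration.
$before = memory_get_usage();

$rows = [];
for ($i = 0; $i < 50000; $i++) {
    $rows[] = str_repeat('x', 32); // stand-in for one data set
    if (count($rows) === 1000) {
        // ... insert the batch into MySQL here ...
        unset($rows); // release the batch before building the next one
        $rows = [];
    }
}
unset($rows);

$after = memory_get_usage();
// $after ends up near $before: each batch was freed, not accumulated
```

Without the unset()/reset inside the loop, all 50,000 entries would sit in $rows at once, which is exactly what exhausts a 20MB limit.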
I know this is an old thread, but I found it so I think it's worth posting for others.
I came across this error whilst trying to carry out a bulk system update to a single field in every record.
The records were read into an array, processed and updated back to the database.
I believe the error was caused by too much data stored in the array on the server.
To get over the problem, I split the task up to handle 1000 records at a time.
i.e. (code tested):
$r_count = 1;
$start_id = 0;
$end_id = 1000;
while ($r_count > 0) {
    $strSQL = "SELECT * FROM spAddbook
               WHERE id > '$start_id' AND id <= '$end_id'";
    $result = mysql_query($strSQL, $link) or die("Couldn't read because " . mysql_error());
    $r_count = mysql_num_rows($result);
    echo "<br>$r_count found<br>";
    if (mysql_num_rows($result)) {
        while ($qry = mysql_fetch_array($result)) {
            // Read the data and carry out processing
        }
    }
    $start_id += 1000;
    $end_id += 1000;
}
echo "<br><br>All done";
I hope this helps. It worked for me.

Memory exhausted error for json_parse with PHP

I have the following code:
<?php
$FILE="giant-data-barf.txt";
$fp = fopen($FILE,'r');
//read everything into data
$data = fread($fp, filesize($FILE));
fclose($fp);
$data_arr = json_decode($data);
var_dump($data_arr);
?>
The file giant-data-barf.txt is, as its name suggests, a huge file (it's 5.4MB right now, but it could go up to several GB).
When I execute this script, I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in ........./data.php on line 12
I looked at possible solutions, and saw this:
ini_set('memory_limit','16M');
and my question is, is there a limit to how big I should set my memory? Or is there a better way of solving this problem?
THIS IS A VERY BAD IDEA. That said, you'll need to set
ini_set('memory_limit', filesize($FILE) + SOME_OVERHEAD_AMOUNT);
because you're reading the entire thing into memory. You may well have to set the memory limit to twice the size of the file, since you also want to json_decode it.
NOTE THAT ON A WEB SERVER THIS WILL CONSUME MASSIVE AMOUNTS OF MEMORY, AND YOU SHOULD NOT DO THIS IF THE FILE WILL BE MANY GIGABYTES AS YOU SAID!
Is it really a giant JSON blob? You should look at converting this to a database or other format which you can use random or row access with before parsing with PHP.
I've given all my servers a memory_limit of 100M and haven't run into trouble yet.
I would consider splitting up that file somehow, or getting rid of it and using a database.
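If the producer of the data can be changed as suggested, one stdlib-only option is newline-delimited JSON: one record per line, decoded individually, so memory stays flat regardless of file size. A sketch; the helper name is illustrative:

```php
<?php
// Process a newline-delimited JSON file one record at a time.
// Each line must be a complete JSON document on its own.
function process_ndjson(string $path, callable $handle): int
{
    $fh = fopen($path, 'r');
    $count = 0;
    while (($line = fgets($fh)) !== false) {
        $line = trim($line);
        if ($line === '') {
            continue;
        }
        $handle(json_decode($line, true)); // only one record in memory at a time
        $count++;
    }
    fclose($fh);
    return $count;
}
```

Unlike one giant json_decode(), this never needs more memory than the largest single record, no matter whether the file is 5.4MB or several GB.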
