I have about 150,000 records in my table. I want to load them into an array and then search in it, but because there is so much data I am getting
PHP Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 39 bytes) in /home//.....// on line 21
I am trying to fill the array with this code:
$myArray = array();
while ($row = mysql_fetch_array($myQueryName)) {
    $myArray[$row['WordName']] = $row;
    echo $row['WordName'];
}
To avoid that problem, I think I can split the array filling into two or three parts, like:
0 - 50,000: array 1
50,000 - 100,000: array 2
100,000 - 150,000: array 3
and then search in these three arrays.
How can I solve this?
I strongly advise against using a PHP script to search through the data; use the database engine to search for it, it is optimized for that.
If you want to change the memory limit that PHP uses for a script, you can set memory_limit in php.ini (on Linux, php.ini is usually located in /etc/php/).
memory_limit = 128M ; this applies globally
If you want to set it per script, call ini_set() at the beginning of the script:
ini_set("memory_limit","128M");
I have run into a problem exporting a PDF document with laravel-pdf.
I am using this library: https://github.com/niklasravnsborg/laravel-pdf
I would like to export a large amount of data to PDF (around ~10,000 rows).
I run into the memory limit and max execution time when I try to export it.
I have tried increasing the memory limit and max execution time in php.ini many times, but it didn't work, and I don't think that is really the correct solution anyway (I have already increased the memory limit up to 2000MB; it worked for around ~5,000 rows but took more than 3 minutes).
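For reference, the limits I raised correspond roughly to the values below (shown as per-script overrides; I set the equivalents in php.ini):
// Rough per-script equivalents of the php.ini changes I tried (values match the errors below)
ini_set('memory_limit', '2048M'); // the ~2 GB limit that still gets exhausted
set_time_limit(320);              // the 320-second limit from the first error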
Here is my code:
return PDF::loadView('report.member_report_pdf', ['data' => $huge_data], [], [
    'format' => 'A2-L',
])->stream('report.pdf');
Here are the errors I got:
[04-Oct-2017 19:04:43 Asia/Bangkok] PHP Fatal error: Maximum execution time of 320 seconds exceeded in /Applications/XAMPP/xamppfiles/htdocs/tab_member/vendor/dompdf/dompdf/src/Frame.php on line 325
[04-Oct-2017 19:15:15 Asia/Bangkok] PHP Fatal error: Allowed memory size of 2147483648 bytes exhausted (tried to allocate 72 bytes) in /Applications/XAMPP/xamppfiles/htdocs/tab_member/vendor/mpdf/mpdf/mpdf.php on line 24005
I use TYPO3 7.6 with solr 6.1.3 and solrfal 4.1.0. Now I get a PHP memory limit error every time I run the solrfal scheduler task. It is still at 57%. I debugged and deleted the last file it tried to index, but the error was also thrown with the next file.
I got the error:
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 464255211 bytes) in /var/www/web1/htdocs/leipzig.ihk.de/typo3conf/ext/solr/Classes/ExtractingQuery.php on line 104
file_get_contents() throws the error on this line. The file is only 90KB. Does anybody have an idea?
I'd recommend not uploading such a large file.
I would try reducing the number of items per run, or increasing the memory limit.
You need to increase memory_limit in your php.ini file.
Found the error. In /typo3conf/ext/solrfal/Classes/Queue/ItemRepository.php on line 154, the first "merge_id" was empty (however that happens). I wrapped the line in an if statement and now it works again:
if ($mergeId['merge_id']) {
    $rows[] = $mergeId['merge_id'];
}
Another solution would be to add merge_id > 0 to the WHERE clause.
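Purely as an illustration of that alternative (the actual query in ItemRepository.php will look different, and the table name below is a guess):
// Hypothetical sketch using the TYPO3 7 DatabaseConnection API:
// only select rows that actually have a merge_id set.
$rows = $GLOBALS['TYPO3_DB']->exec_SELECTgetRows(
    'merge_id',
    'tx_solrfal_domain_model_queueitem', // assumed table name, not taken from solrfal
    "merge_id != ''"
);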
The solution proposed above (wrapping the line in an if statement) does not fix the problem. To me this looks like the merge_id is not set in the database, and therefore all items get merged.
By default, the merge_id has the following format:
'type/fileUid/languageUid/rootPageId'
If there are items in the file index queue without a merge_id, you should clear the file index queue and fill it again.
I have a 16GB file. I'm trying to make an array that splits it on line breaks. Right now I'm using file_get_contents with preg_split, as follows:
$list = preg_split('/$\R?^/m', file_get_contents("file.txt"));
However, I get the following error:
Fatal error: Allowed memory size of 10695475200 bytes exhausted (tried to allocate 268435456 bytes) in /var/www/html/mysite/script.php on line 35
I don't want to use too much memory. I know you can read the file incrementally with fopen, but I'm not sure how to create an array from the contents of the file with a line break as the delimiter.
The question does not address how I would make an array from the contents of the file using preg_split, similar to what I do above.
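One way to avoid loading the whole file at once is to read it line by line with fopen()/fgets(). A minimal sketch; note that for a 16GB file the resulting array itself will not fit in memory either, so in practice you would process each line inside the loop instead of collecting them:
// Read the file line by line instead of file_get_contents() + preg_split().
$list = array();
$fp = fopen('file.txt', 'r');
while (($line = fgets($fp)) !== false) {
    $list[] = rtrim($line, "\r\n"); // strip the line break, like splitting on /\R/ would
}
fclose($fp);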
I am receiving the following error on a script when it reaches the bind section of this code.
Error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 4294967296 bytes)
Code:
$result = $dbc->prepare('SELECT make, line, model, name, year, image, processor_family, processor_speeds, shipped_ram, max_ram, ram_type, video_card, video_memory, screen_type, screen_size, hard_drives, internal_drives, external_drives, audio_card, speakers, microphone, networking, external_ports, internal_ports, description, driver_link, manufacturer_link FROM laptop_archive where name=?');
$result->bind_param('s', $name);
$result->execute();
$result->bind_result($make, $line, $model, $name, $year, $image, $processor_family, $processor_speeds, $shipped_ram, $max_ram, $ram_type, $video_card, $video_memory, $screen_type, $screen_size, $hard_drive, $internal_drives, $external_drives, $audio_card, $speakers, $microphone, $networking, $external_ports, $internal_ports, $description, $driver_link, $manufacturer_link);
The table it is attempting to access has a number of fields to retrieve, although none of them contain a large amount of data; most are around 20 characters, and the description is around 300 characters.
I have searched and found a few answers, although none of them worked for me. The code is running on a VPS with 512MB RAM and a memory limit of 256MB set in php.ini.
The amount of memory to be allocated is 4GB, which seems extremely excessive for what is in the database. Am I missing something stupid here?
Normally this shouldn't take a huge amount of memory, because you haven't fetched the data yet.
$res = $mysqli->query("SELECT * FROM laptop_archive where name=[REPLACE THIS] limit 1");
$row = $res->fetch_assoc();
Please try this one :)
The issue appears to be caused by the use of LONGTEXT in the database; after changing the type to VARCHAR, the issue has gone away.
I had this error two days ago. From what I have learned about it, the problem lies in the maximum size of the column you are selecting from. In my case I was selecting a PDF, and the column was a LONGBLOB, which can hold up to 4294967296 bytes. The server was allocating that much to grab the file regardless of the actual file size, so I had to change the column to a MEDIUMBLOB and it works fine. In your case it looks like it would be the image column; I would change that to a MEDIUMBLOB and it should work fine. Otherwise you would have to configure your server to allow bigger allocations.
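If changing the column type is not an option, a possible workaround (assuming the mysqlnd driver is installed) is to fetch the row via get_result() instead of bind_result(), so no buffer is preallocated for the column's maximum length. A rough sketch based on the question's query, with the long column list shortened:
// Sketch only: same prepared query as in the question, but fetched with get_result().
$result = $dbc->prepare('SELECT make, line, model, name, year, image, description FROM laptop_archive where name=?');
$result->bind_param('s', $name);
$result->execute();
$row = $result->get_result()->fetch_assoc(); // requires mysqlnd; buffers only the actual data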
I have the following code:
<?php
$FILE="giant-data-barf.txt";
$fp = fopen($FILE,'r');
//read everything into data
$data = fread($fp, filesize($FILE));
fclose($fp);
$data_arr = json_decode($data);
var_dump($data_arr);
?>
The file giant-data-barf.txt is, as its name suggests, a huge file (it's 5.4MB right now, but it could grow to several GB).
When I execute this script, I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in ........./data.php on line 12
I looked at possible solutions, and saw this:
ini_set('memory_limit','16M');
and my question is, is there a limit to how big I should set my memory? Or is there a better way of solving this problem?
THIS IS A VERY BAD IDEA. That said, you'll need to set
ini_set('memory_limit', filesize($FILE) + SOME_OVERHEAD_AMOUNT);
because you're reading the entire thing into memory. You may very well have to set the memory limit to two times the size of the file, since you also want to json_decode it.
NOTE THAT ON A WEB SERVER THIS WILL CONSUME MASSIVE AMOUNTS OF MEMORY AND YOU SHOULD NOT DO THIS IF THE FILE WILL BE MANY GIGABYTES AS YOU SAID!!!!
Is it really a giant JSON blob? You should look at converting it to a database or another format that allows random or row access before parsing it with PHP.
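If a database is not an option, another possibility is to convert the file to one JSON document per line (NDJSON) and decode it line by line, so only one record is in memory at a time. A minimal sketch, assuming the data can be restructured that way (the file name here is hypothetical):
// Assumes giant-data-barf.ndjson contains one JSON object per line.
$fp = fopen('giant-data-barf.ndjson', 'r');
while (($line = fgets($fp)) !== false) {
    $record = json_decode($line, true);
    if ($record !== null) {
        // process one record at a time instead of keeping the whole structure in memory
    }
}
fclose($fp);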
I've given all my servers a memory_limit of 100M and haven't run into trouble yet.
I would consider splitting up that file somehow, or getting rid of it and using a database.