Memory Error in PHP Using SQL Query

I have a PHP page with a search box that searches media in a fairly large database. I have made it so that if you type only the first 3 characters (e.g. V75), all the V75 tapes are shown (V75000, V75001, etc.). However, when I search just V7 it gives me this error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 47 bytes) on line 68
try {
    $query->execute(); // LINE 66
    $result = $query->fetchAll(PDO::FETCH_ASSOC); // LINE 68
} catch (Exception $e) { // LINE 69
    die("Can't fetch rows."); // LINE 70
}
What do I need to change for it to display the V7 tapes as well?
The approximate number of 'V7' tapes is 255,000.

Note that even if you raise the PHP memory limit to allow for this many results, sending more than 256 MB to the user for a single search query will make the request terribly slow, if it doesn't time out outright.
Use some sort of pagination and limit your queries; a sketch follows.
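A minimal sketch of that with PDO, assuming a tapes table with a label column (both names, and the $pdo connection, are placeholders for the real schema):

$page    = max(1, (int) (isset($_GET['page']) ? $_GET['page'] : 1));
$perPage = 50;

// Only ever fetch one page of matches, no matter how many tapes start with V7.
$query = $pdo->prepare('SELECT label FROM tapes WHERE label LIKE :term LIMIT :limit OFFSET :offset');
$query->bindValue(':term', $searchTerm . '%', PDO::PARAM_STR);
$query->bindValue(':limit', $perPage, PDO::PARAM_INT);
$query->bindValue(':offset', ($page - 1) * $perPage, PDO::PARAM_INT);
$query->execute();
$result = $query->fetchAll(PDO::FETCH_ASSOC); // at most 50 rows in memory

A COUNT(*) query with the same WHERE clause gives the total needed for rendering page links.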

Related

Laravel PDF export of a huge amount of data

I have found a problem when exporting a PDF document with laravel-pdf.
I am using this library - https://github.com/niklasravnsborg/laravel-pdf
I would like to export a huge amount of data to PDF (around ~10,000 rows).
I hit the memory limit and the max execution time when I tried to export them.
I have tried increasing the memory limit and max execution time in php.ini many times, but it didn't work, and I don't think that is really the correct solution anyway (I have already increased the memory limit up to 2000 MB; that worked for around ~5,000 rows but took more than 3 minutes).
Here is my code, simplified.
return PDF::loadView('report.member_report_pdf', ['data' => $huge_data], [], [
    'format' => 'A2-L',
])->stream('report.pdf');
Here are the errors I got.
[04-Oct-2017 19:04:43 Asia/Bangkok] PHP Fatal error: Maximum execution time of 320 seconds exceeded in /Applications/XAMPP/xamppfiles/htdocs/tab_member/vendor/dompdf/dompdf/src/Frame.php on line 325
[04-Oct-2017 19:15:15 Asia/Bangkok] PHP Fatal error: Allowed memory size of 2147483648 bytes exhausted (tried to allocate 72 bytes) in /Applications/XAMPP/xamppfiles/htdocs/tab_member/vendor/mpdf/mpdf/mpdf.php on line 24005
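One hedged workaround, not from the thread itself: render the export in chunks and write several smaller PDFs instead of one giant document, so neither mPDF's memory use nor the render time grows with the full 10,000 rows. This assumes the package's save() method, which laravel-pdf documents alongside stream():

// Sketch only: chunk size and file naming are arbitrary illustration choices.
foreach (array_chunk($huge_data, 1000) as $i => $chunk) {
    PDF::loadView('report.member_report_pdf', ['data' => $chunk], [], [
        'format' => 'A2-L',
    ])->save(storage_path("reports/member_report_part_{$i}.pdf"));
}

Running this from a queued job rather than a web request also sidesteps the max execution time.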

Memory limit while indexing solrfal in TYPO3

I use TYPO3 7.6 with solr 6.1.3 and solrfal 4.1.0. Now I get a PHP memory limit error every time I try to run the solrfal scheduler task; it is stuck at 57%. I debugged and deleted the last file it tried to index, but the error was thrown again with the next file.
I got the error:
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 464255211 bytes) in /var/www/web1/htdocs/leipzig.ihk.de/typo3conf/ext/solr/Classes/ExtractingQuery.php on line 104
file_get_contents() throws the error on that line. The file is only 90 KB. Does anybody have an idea?
I'd recommend not uploading such a large file.
I would try reducing the number of items per run, or increasing the memory limit.
You need to increase memory_limit in your php.ini file.
Found the error. In /typo3conf/ext/solrfal/Classes/Queue/ItemRepository.php on line 154, the first "merge_id" was empty. However that happens, I wrapped the line in an if statement and now it works again:
if ($mergeId['merge_id']) {
    $rows[] = $mergeId['merge_id'];
}
Another solution would be to add a merge_id > 0 condition to the WHERE statement, as sketched below.
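For illustration only (the real query lives inside solrfal's ItemRepository, and the table name here is hypothetical), the shape of that variant would be:

// Hypothetical sketch - only the added merge_id guard in the WHERE clause matters.
$query = 'SELECT merge_id FROM tx_solrfal_queue_item'
       . ' WHERE merge_id IS NOT NULL AND merge_id != ""';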
The previously proposed solution does not fix the problem. To me it looks like the merge_id is not set in the database, and therefore all items get merged.
By default the merge_id has the following format:
'type/fileUid/languageUid/rootPageId'
If there are items in the file index queue without a merge_id, you should clear the file index queue and fill it again.

Attempting to parse a large file into an array in PHP/WordPress. Keep getting a memory error

I am trying to get a lot of words into a database as individual entries.
The file is a 17 MB text file of comma-separated words.
The PHP is:
$file = file_get_contents(plugins_url('/cipher.txt', __FILE__));
$words = explode(',', $file);
foreach ($words as $word) {
    $wpdb->query("INSERT IGNORE INTO cipher_words (word) VALUES (\"" . $word . "\")");
}
I keep running into a memory error similar to:
[20-Feb-2016 15:26:26 UTC] PHP Fatal error: Out of memory (allocated 247726080) (tried to allocate 16777216 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
[20-Feb-2016 15:26:29 UTC] PHP Fatal error: Out of memory (allocated 139460608) (tried to allocate 8388608 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
[20-Feb-2016 15:26:29 UTC] PHP Fatal error: Out of memory (allocated 247726080) (tried to allocate 16777216 bytes) in C:\xampp\htdocs\testsite\wp-content\plugins\joshscipher\cipher.php on line 26
Is there a better way to handle such a large array? Maybe something asynchronous?
Would a CSV work better, or would it just meet the same limit?
I have tried increasing PHP limits and WP limits to no avail.
Edit: The question marked as a duplicate does not get a memory error. Or any error at all.
Here is a quick Python script if you want to give that a try; it usually handles larger amounts of data better than PHP.
import mysql.connector

# Database connection
cnx = mysql.connector.connect(user='admin', password='password', database='python')
cursor = cnx.cursor()

# Get the file contents
with open('cipher.txt', 'r') as content_file:
    content = content_file.read()

# 'Explode' the string
content_array = content.split(',')

# For each item, insert into db (parameterized to avoid quoting problems)
for x in content_array:
    cursor.execute("INSERT IGNORE INTO cipher_words (word) VALUES (%s)", (x,))

# Make sure data is committed to the database
cnx.commit()

# Close database connections
cursor.close()
cnx.close()
Are you open to increasing your PHP memory limit just for the import?
You can try putting this at the top of your PHP file:
ini_set('memory_limit', '128M');
Or you can change it globally in php.ini. (\xampp\php\php.ini)
memory_limit = 128M
You'll probably need to restart apache after changing the global memory limit.
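If raising the limit still isn't enough, a streaming approach never holds the whole 17 MB file (or the resulting array) in memory at once. A minimal sketch, assuming cipher.txt sits in the plugin directory and the cipher_words table already exists:

// Read the comma-separated file word by word instead of all at once.
// plugin_dir_path() gives a filesystem path, avoiding the HTTP round-trip
// that file_get_contents(plugins_url(...)) makes.
$handle = fopen(plugin_dir_path(__FILE__) . 'cipher.txt', 'r');
if ($handle) {
    while (($word = stream_get_line($handle, 8192, ',')) !== false) {
        $word = trim($word);
        if ($word !== '') {
            $wpdb->query($wpdb->prepare(
                "INSERT IGNORE INTO cipher_words (word) VALUES (%s)",
                $word
            ));
        }
    }
    fclose($handle);
}

Batching a few hundred words per INSERT would cut query overhead further, but even this row-at-a-time version keeps memory usage flat.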

Filling an array: Allowed memory size exhausted

I have about 150,000 records in my table. I want to load them into an array and then search in it, but because there is so much data I am getting:
PHP Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 39 bytes) in /home//.....// on line 21
I am trying to fill my array with this code:
$myArray = array();
while ($row = mysql_fetch_array($myQueryName)) {
    $myArray[$row['WordName']] = $row;
    echo $row['WordName'];
}
To avoid that problem, I think I can split the array filling into two or three parts, like:
0 - 50,000 in array 1
50,000 - 100,000 in array 2
100,000 - 150,000 in array 3
and then search in these 3 arrays.
How can I solve it?
I strongly advise against using the PHP script to search the data; use the database engine to search for it, it's optimized for that.
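A hedged sketch of that idea, using PDO rather than the deprecated mysql_* functions (the words table name, $pdo connection, and $searchTerm are placeholders):

// Let MySQL do the lookup instead of holding 150,000 rows in PHP memory.
$stmt = $pdo->prepare('SELECT * FROM words WHERE WordName = :name');
$stmt->execute(array(':name' => $searchTerm));
$row = $stmt->fetch(PDO::FETCH_ASSOC); // only the matching row is loaded

An index on WordName keeps this lookup fast even at 150,000 rows.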
If you want to change the memory limit PHP uses for a script, you can set memory_limit in php.ini (on Linux, php.ini is located in /etc/php/).
memory_limit = 128M ; this will be global
If you want it per script, call ini_set() at the beginning:
ini_set("memory_limit","128M");

Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 4294967296 bytes)

I am receiving the following error on a script when it reaches the bind section of this code.
Error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 4294967296 bytes)
Code:
$result = $dbc->prepare('SELECT make, line, model, name, year, image, processor_family, processor_speeds, shipped_ram, max_ram, ram_type, video_card, video_memory, screen_type, screen_size, hard_drives, internal_drives, external_drives, audio_card, speakers, microphone, networking, external_ports, internal_ports, description, driver_link, manufacturer_link FROM laptop_archive where name=?');
$result->bind_param('s', $name);
$result->execute();
$result->bind_result($make, $line, $model, $name, $year, $image, $processor_family, $processor_speeds, $shipped_ram, $max_ram, $ram_type, $video_card, $video_memory, $screen_type, $screen_size, $hard_drive, $internal_drives, $external_drives, $audio_card, $speakers, $microphone, $networking, $external_ports, $internal_ports, $description, $driver_link, $manufacturer_link);
The table it is querying has a number of fields to retrieve, although none of them contain a large amount of data; the majority are around 20 characters, and the description is around 300 characters.
I have searched around, which revealed a few answers, although none of them have worked for me. The code is running on a VPS with 512 MB RAM and a memory limit of 256 MB as set in php.ini.
The amount of memory to be allocated is 4 GB, which seems extremely excessive for what is in the database. Am I missing something stupid here?
Normally it shouldn't take huge amounts of memory, because you didn't fetch the data yet.
$res = $mysqli->query("SELECT * FROM laptop_archive WHERE name = '[REPLACE THIS]' LIMIT 1");
$row = $res->fetch_assoc();
Please try this one :)
The issue appears to be caused by the use of LONGTEXT in the database; after changing the type to VARCHAR, the issue has gone away.
I had this error two days ago. From what I have learned about it, the problem lies in the size of the column you're selecting from. In my case I was selecting a PDF, and the column was a LONGBLOB, whose maximum size is 4294967296 bytes; the server was allocating that much to grab the file regardless of the file's actual size. I had to change the column to a MEDIUMBLOB and it works fine. In your case it looks like it would be the image, so I would change that to a MEDIUMBLOB and it should work fine. Otherwise you would have to configure your server to allow bigger allocations.
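If changing the column type isn't an option, a hedged alternative is mysqli_stmt::get_result(), available with the mysqlnd driver: it fetches rows as arrays, so bind_result() never pre-allocates a buffer sized to the column's theoretical maximum. A sketch with a shortened column list:

// Assumes mysqlnd; no output variables are bound, so no 4 GB LONGTEXT/LONGBLOB
// buffer is reserved up front.
$stmt = $dbc->prepare('SELECT make, line, model, description FROM laptop_archive WHERE name = ?');
$stmt->bind_param('s', $name);
$stmt->execute();
$result = $stmt->get_result(); // requires the mysqlnd driver
$row = $result->fetch_assoc();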
