I have been running a loop that inserts 50,000 records into a MySQL database. Suddenly the server on my hosted space shows this message:
Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 32 bytes) in /srv/www/htdocs/server/html/email.php on line 14
I know what it means, but how can I reset all the memory so it can be used again?
Note:
I do not want to increase the memory limit, for example with
ini_set('memory_limit', '-1');
but want to completely release it.
On shared hosting you usually cannot raise the memory limit yourself. If every one of the 100 people on the server set a higher memory limit, the provider would have a problem. So the provider gives you a fixed size, in your case 20 MB, and disables changing memory_limit.
20 MB is not much, and when you try to insert a lot of datasets you need some memory. That is why the message says 32 more bytes could not be allocated out of the allowed 20971520 bytes.
I think with 20 MB there is no really good solution to this problem. Perhaps you can move some of the logic into the database and write stored procedures.
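That said, if the memory grows while the loop runs, the usual cause is that data accumulates in PHP between iterations. A rough sketch of a memory-flat insert loop (credentials, table and column names are placeholders, not taken from your code), assuming the source rows come from a file rather than one big array:
<?php
// Sketch only -- credentials, table and column names are placeholders.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$stmt = $db->prepare('INSERT INTO my_table (col_a, col_b) VALUES (?, ?)');

$fh = fopen('data.csv', 'r');                    // hypothetical source, one record per line
while (($row = fgetcsv($fh)) !== false) {
    $stmt->bind_param('ss', $row[0], $row[1]);
    $stmt->execute();
    // $row is overwritten on the next iteration, so memory stays flat.
    // The thing to avoid is collecting rows or results into a growing array.
}
fclose($fh);
$stmt->close();
If your loop instead builds up arrays, strings or result sets across all 50,000 iterations, unset() them (or let them go out of scope) as soon as each record has been inserted.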
I know this is an old thread, but I found it so I think it's worth posting for others.
I came across this error whilst trying to carry out a bulk system update to a single field in every record.
The records were read into an array, processed and updated back to the database.
I believe the error was caused by too much data stored in the array on the server.
To get over the problem, I split the task up to handle 1000 records at a time.
i.e. (code tested):
$r_count = 1;
$start_id = 0;
$end_id = 1000;

while ($r_count > 0)
{
    $strSQL = "SELECT * FROM spAddbook
               WHERE id > '$start_id' AND id <= '$end_id'";
    $result = mysql_query($strSQL, $link) or die("Couldn't read because " . mysql_error());
    $r_count = mysql_num_rows($result);
    echo "<br>$r_count found<br>";

    if (mysql_num_rows($result))
    {
        while ($qry = mysql_fetch_array($result))
        {
            // Read the data and carry out processing
        }
    }

    $start_id += 1000;
    $end_id += 1000;
}
echo "<br><br>All Done";
I hope this helps. It worked for me.
I use TYPO3 7.6 with solr 6.1.3 and solrfal 4.1.0. Now I get a PHP memory limit error every time I try to run the solrfal scheduler task. It is still stuck at 57%. I debugged it and deleted the last file it tries to index, but the error was also thrown with the next file.
I got the error:
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 464255211 bytes) in /var/www/web1/htdocs/leipzig.ihk.de/typo3conf/ext/solr/Classes/ExtractingQuery.php on line 104
On this line, file_get_contents() throws the error. The file is only 90 KB. Does anybody have an idea?
I'd recommend not uploading such a large file.
I would try reducing the number of items per run, or increasing the memory limit.
You need to increase memory_limit in your php.ini file.
Got the error. In /typo3conf/ext/solrfal/Classes/Queue/ItemRepository.php on line 154, the first "merge_id" was empty, however that happens. I wrapped the line in an if statement and now it works again:
if ($mergeId['merge_id']) {
    $rows[] = $mergeId['merge_id'];
}
Another solution would be to add merge_id > 0 to the WHERE clause.
The previously proposed solution does not fix the problem. To me this looks like the merge_id is not set in the database, and therefore all items get merged.
By default the merge_id has the following format:
'type/fileUid/languageUid/rootPageId'
If there are items in the file index queue without a merge_id, you should clear the file index queue and fill it again.
I am trying to display users on a map using the Google Maps API. Now that the user count has increased to 12000, I get a memory exhaustion error. For the time being I increased the memory limit from 128M to 256M, but I am sure the same error will come back when it's 25000 users.
The error is:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 30720 bytes) in /var/www/html/digitalebox_v2_20150904_100/protected/extensions/bootstrap/widgets/TbBaseMenu.php on line 193
Fatal error: Class declarations may not be nested in /var/www/html/yii-1.1.14.f0fee9/framework/collections/CListIterator.php on line 20
My code:
$criteria = new CDbCriteria();
$criteria->condition = 't.date BETWEEN :from_date AND :to_date';
$modelUsers = User::model()->findAll($criteria);
foreach ($modelUsers as $modelUser) {
    $fullAddress = $modelUser->address1 . ', ' . $modelUser->zip . ' ' . $modelUser->city . ', ' . Country::model()->findByPk($modelUser->countryCode)->countryName;
}
When $modelUsers has 12000 records, this memory problem appears because there are 12000 user objects in memory.
What should I do to prevent this kind of issue in the future? What is the minimum required memory size for a PHP application to run?
When you call findAll() it loads all records at once, so you end up with a memory error. Especially for such situations, Yii has CDataProviderIterator. It allows iteration over large data sets without holding the entire set in memory.
$dataProvider = new CActiveDataProvider("User");
$iterator = new CDataProviderIterator($dataProvider);

foreach ($iterator as $user) {
    $fullAddress = $user->address1 . ', ' . $user->zip . ' ' . $user->city . ', ' . Country::model()->findByPk($user->countryCode)->countryName;
}
I'd solve this problem in a completely different way than suggested. I'd ditch the whole model concept and query MySQL for the addresses directly. You can have MySQL return the already concatenated address, so you avoid concatenating it in PHP - that avoids wasting memory.
The next step would be using PDO and issuing an unbuffered query. This means the PHP process will not hold all 12,000 records in its memory - that's how you avoid exhausting the memory.
The final step is outputting the result - as you loop through the unbuffered query, you output a single row (or 10 rows) at a time.
What happens this way is that you trade CPU for RAM. Since you don't know how much RAM you will need, the best approach is to use as little as possible.
Using unbuffered queries and flushing PHP's output buffer seems like the only viable way to go for your use case, seeing you can't avoid outputting a lot of data.
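A minimal sketch of that approach (table, column and date values are assumptions based on the model code above, not your actual schema):
<?php
// Sketch only -- table/column names and dates are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8', 'user', 'pass');

// Unbuffered query: rows are streamed from MySQL instead of being held in PHP memory.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

// Let MySQL build the full address so PHP never concatenates 12,000 strings.
$sql = "SELECT CONCAT(u.address1, ', ', u.zip, ' ', u.city, ', ', c.countryName) AS fullAddress
        FROM user u
        JOIN country c ON c.code = u.countryCode
        WHERE u.date BETWEEN :from_date AND :to_date";

$stmt = $pdo->prepare($sql);
$stmt->execute([':from_date' => '2015-01-01', ':to_date' => '2015-12-31']);

while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    echo $row['fullAddress'], "<br>\n";   // output each row immediately, keep nothing around
}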
Perhaps increase the memory a bit more and see if that corrects the issue - try setting it to 512M or 1024M. I will tell you from my own experience: if you are trying to load that number of Google Maps markers, it will probably crash the map anyway.
Increase the value of memory_limit in php.ini. Then restart Apache or whatever PHP is running under. Keep in mind that what you add here needs to be taken away from other programs running on the same machine (such as MySQL and its innodb_buffer_pool_size).
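For example (the value is illustrative and has to fit within the RAM the machine actually has free):
; php.ini -- illustrative value, not a recommendation for every setup
memory_limit = 256M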
I am receiving the following error on a script when it reaches the bind section of this code.
Error:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 4294967296 bytes)
Code:
$result = $dbc->prepare('SELECT make, line, model, name, year, image, processor_family, processor_speeds, shipped_ram, max_ram, ram_type, video_card, video_memory, screen_type, screen_size, hard_drives, internal_drives, external_drives, audio_card, speakers, microphone, networking, external_ports, internal_ports, description, driver_link, manufacturer_link FROM laptop_archive where name=?');
$result->bind_param('s', $name);
$result->execute();
$result->bind_result($make, $line, $model, $name, $year, $image, $processor_family, $processor_speeds, $shipped_ram, $max_ram, $ram_type, $video_card, $video_memory, $screen_type, $screen_size, $hard_drive, $internal_drives, $external_drives, $audio_card, $speakers, $microphone, $networking, $external_ports, $internal_ports, $description, $driver_link, $manufacturer_link);
The portion of the database that it is attempting to access has a number of fields to retrieve although none of them contain a large amount of data with the majority being around 20 characters and the description at around 300 characters.
I have had a search which reveals a few answers although none of them have worked for me, the code is running on a VPS with 512MB RAM and has a memory limit of 256MB as set in php.ini.
The amount of memory to be allocated is 4 GB, which seems extremely excessive for what is in the database. Am I missing something stupid here?
In normal circumstances it shouldn't take any huge amount of memory, because you didn't fetch the data yet.
$res = $mysqli->query("SELECT * FROM laptop_archive WHERE name=[REPLACE THIS] LIMIT 1");
$row = $res->fetch_assoc();
Please try this one :)
The issue appears to be caused by the use of LONGTEXT in the database, after changing the type to VARCHAR the issue has gone away.
I had this error two days ago. From what I have learned about it, the problem lies in the size of the column you are selecting from. In my case I was selecting a PDF, and the column was a LONGBLOB, whose maximum size is 4294967296 bytes. The server was allocating that much to grab the file regardless of the actual size of the file, so I changed the column to MEDIUMBLOB and it works fine. In your case it looks like it would be the image column: change that to MEDIUMBLOB and it should work fine. Otherwise you would have to configure your server to allow bigger allocations.
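Something else that is often reported to help with mysqli prepared statements and LONG* columns is calling store_result() before bind_result(), so the client buffers the actual rows instead of reserving the theoretical column maximum. A sketch based on the code above (column list trimmed for brevity; treat this as something to try, not a guaranteed fix):
<?php
// Sketch only -- trimmed column list; $dbc and $name come from the code above.
$result = $dbc->prepare('SELECT make, line, model, description FROM laptop_archive WHERE name = ?');
$result->bind_param('s', $name);
$result->execute();
$result->store_result();                                   // buffer the result set first
$result->bind_result($make, $line, $model, $description);  // then bind
$result->fetch();
$result->free_result();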
Using fgetcsv, can I somehow do a destructive read where rows I've read and processed would be discarded so if I don't make it through the whole file in the first pass, I can come back and pick up where I left off before the script timed out?
Additional Details:
I'm getting a daily product feed from a vendor that comes across as a 200 MB .gz file. When I unpack it, it turns into a 1.5 GB .csv with nearly 500,000 rows and 20-25 fields. I need to read this information into a MySQL db, ideally with PHP so I can schedule a cron job to run the script at my web hosting provider every day.
I have a hard timeout on the server set to 180 seconds by the hosting provider, and max memory utilization limit of 128mb for any single script. These limits cannot be changed by me.
My idea was to grab the information from the .csv using the fgetcsv function, but because of the 3 minute timeout I'm expecting to have to take multiple passes at the file. It would be nice to whittle away at the file as I process it, so I wouldn't need to spend cycles skipping over rows that were already processed in a previous pass.
From your problem description it really sounds like you need to switch hosts. Processing a 2 GB file with a hard time limit is not a very constructive environment. Having said that, deleting read lines from the file is even less constructive, since you would have to rewrite the entire 2 GB to disk minus the part you have already read, which is incredibly expensive.
Assuming you save how many rows you have already processed, you can skip rows like this:
$alreadyProcessed = 42; // for example
$i = 0;
while ($row = fgetcsv($fileHandle)) {
    if ($i++ < $alreadyProcessed) {
        continue;
    }
    ...
}
However, this means you're reading the entire 2 GB file from the beginning each time you go through it, which in itself already takes a while and you'll be able to process fewer and fewer rows each time you start again.
The best solution here is to remember the current position of the file pointer, for which ftell is the function you're looking for:
$lastPosition = file_get_contents('last_position.txt');
$fh = fopen('my.csv', 'r');
fseek($fh, $lastPosition);
while ($row = fgetcsv($fh)) {
    ...
    file_put_contents('last_position.txt', ftell($fh));
}
This allows you to jump right back to the last position you were at and continue reading. You obviously want to add a lot of error handling here, so you're never in an inconsistent state no matter which point your script is interrupted at.
You can avoid the timeout and memory error to some extent by reading the file like a stream: read it line by line and insert each line into the database (or process it accordingly). That way only a single line is held in memory on each iteration. Note: don't try to load the huge CSV file into an array; that really would consume a lot of memory.
if (($handle = fopen("yourHugeCSV.csv", 'r')) !== false)
{
    // Get the first row (header)
    $header = fgetcsv($handle);

    // Loop through the file line by line
    while (($data = fgetcsv($handle)) !== false)
    {
        // Process your data
        unset($data);
    }
    fclose($handle);
}
I think a better solution (it would be phenomenally inefficient to continuously rewind and rewrite an open file stream) would be to track the file position of each record read (using ftell) and store it with the data you've read - then if you have to resume, just fseek to the last position.
You could try loading the file directly using MySQL's own file-loading facility (LOAD DATA INFILE), which will likely be a lot faster, although I've had problems with this in the past and ended up writing my own PHP code.
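Roughly what that could look like via PDO, as a sketch only - the table name, column list and CSV layout here are placeholders, and LOCAL INFILE has to be enabled on both the server and the client:
<?php
// Sketch: let MySQL bulk-load the CSV instead of looping over rows in PHP.
// Table, columns and CSV format below are assumptions, not the asker's schema.
$pdo = new PDO(
    'mysql:host=localhost;dbname=shop;charset=utf8',
    'user',
    'pass',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true]    // required for LOAD DATA LOCAL INFILE
);

$sql = "LOAD DATA LOCAL INFILE '/path/to/product_feed.csv'
        INTO TABLE products
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES
        (sku, name, price, stock)";           // placeholder column list

$rows = $pdo->exec($sql);                     // returns the number of loaded rows
echo "Loaded $rows rows\n";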
I have a hard timeout on the server set to 180 seconds by the hosting provider, and max memory utilization limit of 128mb for any single script. These limits cannot be changed by me.
What have you tried?
The memory can be limited by other means than the php.ini file, but I can't imagine how anyone could actually prevent you from using a different execution time (even if ini_set is disabled, from the command line you could run php -d max_execution_time=3000 /your/script.php or php -c /path/to/custom/inifile /your/script.php).
Unless you are trying to fit the entire data file into memory, there should be no issue with a memory limit of 128 MB.
I have the following code:
<?php
$FILE="giant-data-barf.txt";
$fp = fopen($FILE,'r');
//read everything into data
$data = fread($fp, filesize($FILE));
fclose($fp);
$data_arr = json_decode($data);
var_dump($data_arr);
?>
The file giant-data-barf.txt is, as its name suggests, a huge file (it's 5.4 MB right now, but it could go up to several GB).
When I execute this script, I get the following error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in ........./data.php on line 12
I looked at possible solutions, and saw this:
ini_set('memory_limit','16M');
and my question is, is there a limit to how big I should set my memory? Or is there a better way of solving this problem?
THIS IS A VERY BAD IDEA, that said, you'll need to set
ini_set('memory_limit',filesize($FILE) + SOME_OVERHEAD_AMOUNT);
because you're reading the entire thing into memory. You may very well have to set the memory limit to two times the size of the file, since you also want to json_decode it.
NOTE THAT ON A WEB SERVER THIS WILL CONSUME MASSIVE AMOUNTS OF MEMORY AND YOU SHOULD NOT DO THIS IF THE FILE WILL BE MANY GIGABYTES AS YOU SAID!!!!
Is it really a giant JSON blob? You should look at converting this to a database, or another format that gives you random or row-by-row access, before parsing it with PHP.
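For instance, if the producer of the file could supply newline-delimited JSON (one object per line) - an assumption, not something stated in the question - you could process it line by line with roughly constant memory:
<?php
// Sketch that assumes the data comes as newline-delimited JSON (one object per line);
// it does NOT work on a single giant JSON array.
$fp = fopen('giant-data-barf.ndjson', 'r');   // hypothetical NDJSON version of the file
while (($line = fgets($fp)) !== false) {
    $record = json_decode($line, true);
    if ($record === null) {
        continue;                             // skip blank or invalid lines
    }
    // process or insert $record here; only one record is in memory at a time
}
fclose($fp);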
I've given all my servers a memory_limit of 100M... didn't run into trouble yet.
I would consider splitting up that file somehow, or get rid of it and use a database