So I'm trying to cache an array in a file and use it somewhere else.
import.php
// Above code is to get each line in CSV and put in it in an array
// (1 line is 1 multidimensional array) - $csv
$export = var_export($csv, true);
$content = "<?php \$data=" . $export . ";?>";
$target_path1 = "/var/www/html/Samples/test/";
file_put_contents($target_path1 . "recordset.php", $content);
somewhere.php
ini_set('memory_limit','-1');
include_once("/var/www/html/Samples/test/recordset.php");
print_r($data);
Now, I've included recordset.php in somewhere.php to use the array stored in it. It works fine when the uploaded CSV file has 5,000 lines, but if I try to upload a CSV with 50,000 lines, for example, I get a fatal error:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 79691776 bytes)
How can I fix this, or is there a more convenient way to achieve what I want? Speaking of performance, should I consider the CPU of the server? I've already overridden the memory limit and set it to -1 in somewhere.php.
There are 2 ways to fix this:
You need to increase the memory (RAM) on the server, as memory_limit can only use memory that is actually available on the server, and it seems you have very little RAM available for PHP.
To check the total RAM on a Linux server:
<?php
$fh = fopen('/proc/meminfo', 'r');
$mem = 0;
while ($line = fgets($fh)) {
    $pieces = array();
    if (preg_match('/^MemTotal:\s+(\d+)\skB$/', $line, $pieces)) {
        $mem = $pieces[1];
        break;
    }
}
fclose($fh);
echo "$mem kB RAM found";
?>
Source: get server ram with php
You should parse your CSV file in chunks and release the occupied memory with unset() after each chunk is processed.
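A rough sketch of what that chunked parsing could look like, assuming the upload is read with fgetcsv(); the file path, chunk size, and the commented-out processChunk() callback are placeholders rather than part of the original code:
// Read the CSV in fixed-size chunks instead of building one huge array.
$handle    = fopen('/var/www/html/Samples/test/upload.csv', 'r'); // placeholder path
$chunk     = array();
$chunkSize = 1000;

while (($row = fgetcsv($handle)) !== false) {
    $chunk[] = $row;
    if (count($chunk) >= $chunkSize) {
        // processChunk($chunk);  // hypothetical: write or process these rows here
        unset($chunk);            // release the memory used by this chunk
        $chunk = array();
    }
}
if (!empty($chunk)) {
    // processChunk($chunk);      // handle the final partial chunk
}
fclose($handle);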
Related
In my script I wanted to clear array elements to free the memory held by data that is no longer used.
I found myself in a strange situation where using unset() causes:
( ! ) Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 16777224 bytes) in
.../models/Persons.php on line 60
This is the part of the code that causes the problem:
$chunks_count = count($this->xml_records_chunk['fnames']) - 1;
for ($num = 0; $num <= $chunks_count; $num++) {
    $chunks_count = count($this->xml_records_chunk['fnames']) - 1;
    $not_last = ($num < $chunks_count ? ',' : '');
    $new_records .= '(' . $this->xml_records_chunk['fnames'][$chunks_count] . ','
        . $this->xml_records_chunk['lnames'][$chunks_count] . ' , '
        . $this->xml_records_chunk['dobs'][$chunks_count] . ' , '
        . $this->xml_records_chunk['phones'][$chunks_count] . ' )' . $not_last;
    unset($this->xml_records_chunk['fnames'][$chunks_count]);
    unset($this->xml_records_chunk['lnames'][$chunks_count]);
    unset($this->xml_records_chunk['dobs'][$chunks_count]);
    unset($this->xml_records_chunk['phones'][$chunks_count]);
}
The script works just fine without unset.
Now the questions are:
Why does unset cause memory exhaustion?
What is the correct way to unset unused array elements in this case?
I've already checked this for example:
What's better at freeing memory with PHP: unset() or $var = null
OK, null indeed works a bit differently: with it, the script dies on line 61 (the 3rd unset).
It is a good question why your unset breaks the memory. But you access $this->xml_records_chunk in your code, so the array with all of its elements already exists and the complete memory is already allocated.
I think in that case you don't need to clean up the array and memory, because that memory is already allocated. The GC is not that bad: if your script doesn't use the variable anymore, it is cleaned up.
In your case I would suggest changing your array structure so that the chunk index comes first, something like this:
$this->xml_records_chunk[$chunks_count]['phones']
Then you have the following structure:
$this->xml_records_chunk[$chunks_count] = [
    'phones',
    '...',
    '...'
]
Then you can clean up the complete record with a single unset:
unset($this->xml_records_chunk[$chunks_count])
That should cause fewer problems, and perhaps you could look at the Iterator interface to iterate over and delete your data.
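A rough sketch of that restructured layout, with $records standing in for $this->xml_records_chunk and purely illustrative field values:
// Each record lives in one sub-array keyed by the chunk index.
$records = array(
    0 => array('fnames' => 'John', 'lnames' => 'Doe', 'dobs' => '1980-01-01', 'phones' => '555-0100'),
    1 => array('fnames' => 'Jane', 'lnames' => 'Roe', 'dobs' => '1985-05-05', 'phones' => '555-0101'),
);

$new_records = '';
foreach ($records as $num => $record) {
    $new_records .= "('{$record['fnames']}','{$record['lnames']}','{$record['dobs']}','{$record['phones']}'),";
    unset($records[$num]);   // one unset() frees the whole record at once
}
$new_records = rtrim($new_records, ',');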
I am comparing two CSV files in PHP. I am only interested in knowing that the column names are the same and not interested in the data (at least for now).
This is a generic script that handles any CSV file I choose to upload. The file I am uploading is compared against a set sample file (so I can only upload the file if a sample has been provided). This sample file only contains a few lines of data so is not large by any stretch. The file I am uploading can range from 500kb to about 10mb (the one I am uploading is 7,827,180 bytes).
Everything has been fine until today when I started getting this message:
"Fatal error: Out of memory (allocated 524288) (tried to allocate 7835372 bytes) in C:\xampp\htdocs\errcoaching\app\handlers\file_upload_parse.php on line 8" (line 8 refers to line 2 in my sample (the first line inside my function).
function check_csv($f_a, $f_b){
    $csv_upload = array_map("str_getcsv", file($f_a, FILE_SKIP_EMPTY_LINES))[0]; // This is line 8
    $csv_sample = array_map("str_getcsv", file($f_b, FILE_SKIP_EMPTY_LINES))[0];
    $match = 'true';
    foreach ($csv_sample as $key => $value) {
        if ($value != $csv_upload[$key]) {
            $match = 'false';
            break 1;
        }
    }
    return $match;
}
You are reading the whole file into an array and then discarding all but the first line.
Instead you can use http://php.net/manual/en/function.fgetcsv.php to read only the first line:
$handle_a = fopen($f_a, "r");
$handle_b = fopen($f_b, "r");
$csv_upload = fgetcsv($handle_a);
$csv_sample = fgetcsv($handle_b);
//the rest of your code
fclose($handle_a);
fclose($handle_b);
You should probably also handle file read errors.
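For example, a minimal sketch of that error handling around the same fgetcsv() calls; the return values are illustrative only:
$handle_a = fopen($f_a, "r");
$handle_b = fopen($f_b, "r");
if ($handle_a === false || $handle_b === false) {
    return 'error';               // illustrative: could not open one of the files
}

$csv_upload = fgetcsv($handle_a);
$csv_sample = fgetcsv($handle_b);
if ($csv_upload === false || $csv_sample === false) {
    fclose($handle_a);
    fclose($handle_b);
    return 'error';               // illustrative: could not read a header row
}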
I have a memory problem with an xlsx file of about 95,500 rows and 28 columns.
To handle such a big file (more than 10 MB of xlsx) I wrote the code below, but when I execute it and call the load method I get a memory exhausted error, even with only one row read! (I have assigned only 128 MB to the PHP interpreter.)
Please consider that:
Currently I try to read only one single row and still receive the memory exhausted error (see $chunkFilter->setRows(1,1);)
After solving this problem of reading the first line, I need to read all the other lines to load their content into a database table
If you think there is another library or solution, please consider that I prefer PHP as the language because it is the main language used for this application, but I can accept other solutions in other languages (like Go)
Please don't simply suggest increasing the memory of the PHP process. I already know that this is possible, but this code runs on a shared VPS with only 512 MB of RAM in total, and I need to keep memory usage as low as possible
Is there a solution? Please find below the code that I use:
/** Define a Read Filter class implementing PHPExcel_Reader_IReadFilter to read the file in "chunks" */
class chunkReadFilter implements PHPExcel_Reader_IReadFilter {
    private $_startRow = 0;
    private $_endRow = 0;

    /** Set the list of rows that we want to read */
    public function setRows($startRow, $chunkSize) {
        $this->_startRow = $startRow;
        $this->_endRow = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '') {
        // Only read the heading row, and the rows that are configured in $this->_startRow and $this->_endRow
        if (($row == 1) || ($row >= $this->_startRow && $row < $this->_endRow)) {
            return true;
        }
        return false;
    }
}
function loadXLSFile($inputFile) {
    // Initiate cache
    $cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_sqlite3;
    if (!PHPExcel_Settings::setCacheStorageMethod($cacheMethod)) {
        echo date('H:i:s'), " Unable to set Cell Caching using ", $cacheMethod,
            " method, reverting to memory", EOL;
    }
    $inputFileType = PHPExcel_IOFactory::identify($inputFile);
    $objReader = PHPExcel_IOFactory::createReader($inputFileType);
    $chunkFilter = new chunkReadFilter();
    // Tell the Read Filter the limits on which rows we want to read this iteration
    $chunkFilter->setRows(1, 1);
    // Tell the Reader that we want to use the Read Filter that we've instantiated
    $objReader->setReadFilter($chunkFilter);
    $objReader->setReadDataOnly(true);
    $objPHPExcel = $objReader->load($inputFile);
}
UPDATE
Below is the error returned, as requested by pamelus:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 112 bytes) in /vendor/phpoffice/phpexcel/Classes/PHPExcel/Reader/Excel2007.php on line 471
PHP Stack trace:
PHP 1. {main}() dataimport.php:0
PHP 2. loadFileToDb($inputFile = *uninitialized*, $tabletoupdate = *uninitialized*) dataimport.php:373
PHP 3. PHPExcel_Reader_Excel2007->load($pFilename = *uninitialized*) dataimport.php:231
Given the low memory limit you have, I can suggest an alternative to PHPExcel that would solve your problem once and for all: Spout. It only requires 10 MB of memory, so you should be good!
Your loadXLSFile() function would become:
use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

function loadXLSFile($inputFile) {
    $reader = ReaderFactory::create(Type::XLSX);
    $reader->open($inputFile);
    foreach ($reader->getSheetIterator() as $sheet) {
        foreach ($sheet->getRowIterator() as $row) {
            // $row is the first row of the sheet. Do something with it
            break; // you won't read any other rows
        }
        break; // if you only want to read the first sheet
    }
    $reader->close();
}
It's that simple! No need for caching, filters, or other optimizations :)
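If you later need to read all the remaining rows to load them into a database table (the second point in the question), the same iterator streams them one row at a time. A rough sketch, assuming the same Spout API as above; importXLSFile() is just a hypothetical name:
use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

function importXLSFile($inputFile) {
    $reader = ReaderFactory::create(Type::XLSX);
    $reader->open($inputFile);
    foreach ($reader->getSheetIterator() as $sheet) {
        foreach ($sheet->getRowIterator() as $row) {
            // $row holds the cell values of one row; insert it into the
            // database here (e.g. with a prepared statement) instead of
            // accumulating rows in memory
        }
        break; // only the first sheet is needed
    }
    $reader->close();
}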
I have a PHP script that splits a large file and inserts the data into PostgreSQL. This import has worked before on PHP 5.3, PostgreSQL 8.3, and Mac OS X 10.5.8. I have now moved everything over to a new Mac Pro with plenty of RAM (16 GB), Mac OS X 10.9.2, PHP 5.5.8, and PostgreSQL 9.3.
The problem is reading the large import file, a tab-separated file of over 181 MB. I have tried to increase PHP's memory limit up to 2 GB (!) with no more success, so I guess the problem must be in the code that reads the text file and splits it. I get this error:
PHP Fatal error: Allowed memory size of 2097152000 bytes exhausted (tried to allocate 72 bytes) in /Library/FileMaker Server/Data/Scripts/getGBIFdata.php on line 20
Is there a better way to do this? I read the file and split the lines, then split each line again by \t (tab). The error occurs on this line:
$arr = explode("\t", $line);
Here is my code:
<?php
## I have tried everything here, memory_limit in php.ini is 256M
ini_set("memory_limit", "1000M");

$db = pg_connect('host=127.0.0.1 dbname=My_DB_Name user=Username password=Pass');

### Set error verbosity
pg_set_error_verbosity($db, PGSQL_ERRORS_VERBOSE);

### Empty DB
$result = pg_query("TRUNCATE TABLE My_DB_Name");

$fcontents = file('///Library/FileMaker\ Server/Data/Documents/EXPORT/export_file.tab');
for ($i = 0; $i < sizeof($fcontents); $i++) {
    $line = trim($fcontents[$i]);
    $arr = explode("\t", $line);
    $query = "insert into My_DB_Name(
        field1, field2 etc.... )
        values (
        '{$arr[0]}','{$arr[1]}','{$arr[2]}','{$arr[3]}', etc........
        )";
    $result = pg_query($query); echo "\n Lines:" . $i;
    pg_send_query($db, $query);
    $res1 = pg_get_result($db);
}

## Update geometry column
$sql = pg_query("
    update darwincore2 set punkt_geom=
    ST_SetSRID(ST_MakePoint(My_DB_Name.longitude, darwincore2.latitude), 4326);
");
?>
I think the problem is that you're using the file() function, which reads the whole file into memory at once. Try reading it line by line using fopen and fgets:
$fp = fopen($filename, "r");
while (($line = fgets($fp)) !== false) {
    // ... insert $line into the db ...
}
fclose($fp);
You can also import a file directly with the COPY command (http://www.postgresql.org/docs/9.2/static/sql-copy.html)
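For instance, a rough sketch of streaming the tab-separated file into PostgreSQL with COPY ... FROM STDIN via pg_put_line(); the table and column names are placeholders, and $db is the connection from pg_connect():
// Stream the file into the table without building any PHP array.
pg_query($db, "COPY my_table (field1, field2, field3) FROM STDIN");

$fp = fopen('/path/to/export_file.tab', 'r');  // placeholder path
while (($line = fgets($fp)) !== false) {
    pg_put_line($db, $line);                   // lines are already tab-separated
}
fclose($fp);

pg_put_line($db, "\\.\n");                     // end-of-data marker
pg_end_copy($db);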
This can be caused by the code itself, e.g. an infinite loop, processing a large amount of data, or even database queries.
You should check your code; there might be an infinite loop or a similar issue.
For one of my projects I need to import a very huge text file (~950 MB). I'm using Symfony2 & Doctrine 2 for the project.
My problem is that I get errors like:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 24 bytes)
The error even occurs if I increase the memory limit to 1GB.
I tried to analyze the problem using XDebug and KCacheGrind (as part of PHPEdit), but I don't really understand the values :(
I'm looking for a tool or a method (quick & simple, since I don't have much time) to find out why memory is allocated and not freed again.
Edit
To clear some things up here is my code:
$handle = fopen($geonameBasePath . 'allCountries.txt', 'r');

$i = 0;
$batchSize = 100;

if ($handle) {
    while (($buffer = fgets($handle, 16384)) !== false) {
        if ($buffer[0] == '#') // skip comments
            continue;

        // split parts
        $parts = explode("\t", $buffer);

        if ($parts[6] != 'P')
            continue;

        if ($i % $batchSize == 0) {
            echo 'Flush & Clear' . PHP_EOL;
            $em->flush();
            $em->clear();
        }

        $entity = $em->getRepository('MyApplicationBundle:City')->findOneByGeonameId($parts[0]);
        if ($entity !== null) {
            $i++;
            continue;
        }

        // create city object
        $city = new City();
        $city->setGeonameId($parts[0]);
        $city->setName($parts[1]);
        $city->setInternationalName($parts[2]);
        $city->setLatitude($parts[4]);
        $city->setLongitude($parts[5]);
        $city->setCountry($em->getRepository('MyApplicationBundle:Country')->findOneByIsoCode($parts[8]));

        $em->persist($city);

        unset($city);
        unset($entity);
        unset($parts);
        unset($buffer);

        echo $i . PHP_EOL;
        $i++;
    }
}

fclose($handle);
Things I have tried, but nothing helped:
Adding second parameter to fgets
Increasing memory_limit
Unsetting vars
Increasing the memory limit is not going to be enough. When importing files like that, you should buffer the reading.
$f = fopen('yourfile', 'r');
while (($data = fread($f, 4096)) !== false && $data !== '') {
    // Do your stuff using the read $data
}
fclose($f);
Update:
When working with an ORM, you have to understand that nothing is actually inserted into the database until the flush call. All those objects are stored by the ORM, tagged as "to be inserted"; only when the flush call is made does the ORM go through the collection and start inserting.
Solution 1: Flush often. And clear.
Solution 2: Don't use the ORM. Go for plain SQL commands. They will take up far less memory than the object + ORM solution.
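As an illustration of Solution 2, a rough sketch using plain PDO prepared statements instead of Doctrine; the DSN, table, and column names are assumptions, and the column indexes follow the loop shown in the question:
// Batch-insert the parsed lines with a single prepared statement.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$stmt = $pdo->prepare(
    'INSERT INTO city (geoname_id, name, international_name, latitude, longitude) VALUES (?, ?, ?, ?, ?)'
);

$handle = fopen('allCountries.txt', 'r');
while (($buffer = fgets($handle, 16384)) !== false) {
    if ($buffer[0] == '#') {
        continue;                                   // skip comments, as in the original loop
    }
    $parts = explode("\t", $buffer);
    if ($parts[6] != 'P') {
        continue;                                   // same filter as the original loop
    }
    $stmt->execute(array($parts[0], $parts[1], $parts[2], $parts[4], $parts[5]));
}
fclose($handle);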
33554432 bytes is 32 MB.
Change the memory limit in php.ini, for example to 75 MB:
memory_limit = 75M
and restart the server.
Instead of simply reading the whole file, you should read it line by line and process the data each time you read a line. Do NOT try to fit everything in memory; you will fail. The reason is that while you can fit the text file itself into RAM, you will not also be able to hold the data as PHP objects/variables at the same time, since PHP needs a much larger amount of memory for each of them.
What I instead suggest is (see the sketch after this list):
a) read a new line,
b) parse the data in the line
c) create the new object to store in the database
d) go back to step a, unset()ting the old object first or reusing its memory
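A generic sketch of steps a) to d), keeping only one line in memory at a time; the file name and the insert step are placeholders:
$fh = fopen('allCountries.txt', 'r');       // a) open once, read line by line
while (($line = fgets($fh)) !== false) {
    $parts = explode("\t", $line);          // b) parse the data in the line
    // c) build the object / run the INSERT for this row here
    unset($parts);                          // d) release it before reading the next line
}
fclose($fh);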