I have an ICS file that is 2 GB in size, and I want to parse the ICS data from it, but PHP cannot read such a big file: I get a fatal "Out of memory" error even though I have set ini_set('memory_limit', '-1').
So I want to somehow break or split the big ICS file into smaller files, or find some other way to read the data from such a big ICS file.
I have some small files that all work fine, and I can extract data from them, but the 2 GB ICS file is the one that matters most for me to extract/parse.
Thanks in advance
The usual way to handle an out-of-memory error is to allocate more RAM to PHP in php.ini, but because you have a 2 GB file that is not a viable option unless your system has a lot of memory. The real issue is that you are reading the file the wrong way. Rather than reading the whole file and saving it to a variable, which costs RAM equivalent to the size of the file, you can parse it line by line or in fixed-size chunks, depending on the format you are working with. Here is a basic example you can build on:
<?php
$handle = fopen("fileName", "r");
if ($handle) {
    // Read one line at a time so only a single line is ever in memory.
    while (($line = fgets($handle)) !== false) {
        // process line here
    }
    fclose($handle);
} else {
    // handle file read error
}
?>
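Since ICS is a line-oriented format, you can extend this pattern to collect one VEVENT at a time and hand it off for processing, so memory use stays bounded no matter how large the file is. A minimal sketch, assuming a hypothetical processEvent callback and a placeholder file name:

<?php
// Stream a huge ICS file and pass one VEVENT block at a time to a callback,
// so only a single event is ever held in memory.
function streamEvents($path, callable $processEvent)
{
    $handle = fopen($path, "r");
    if (!$handle) {
        throw new RuntimeException("Cannot open $path");
    }
    $event = null;
    while (($line = fgets($handle)) !== false) {
        $line = rtrim($line, "\r\n");
        if ($line === "BEGIN:VEVENT") {
            $event = array();            // start collecting a new event
        } elseif ($line === "END:VEVENT") {
            $processEvent($event);       // hand the finished event over
            $event = null;               // and free it immediately
        } elseif ($event !== null) {
            $event[] = $line;
        }
    }
    fclose($handle);
}

// Example usage: count the events in the file.
$count = 0;
streamEvents("huge.ics", function ($lines) use (&$count) {
    $count++;
});
echo $count . " events\n";
?>

Note that real ICS files fold long lines (a folded continuation line starts with a space or tab), so a production parser would need to unfold those before interpreting properties.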
Hope this helps you.
When using force_download to download a zip file, my code works for a zip file that is 268 MB (31 MP3 files) but not for one that is 287 MB (32 MP3 files); the only difference is one extra MP3 file added to the zip. The download attempts to start, appears to keep starting over a couple of times, and then shows as failed, with Chrome indicating that the zip file is incomplete. Windows reports that the downloaded zip file, which is only 61 KB, is invalid when trying to open it.
The zip file is created and the MP3 files are added to it by another area of code.
I have increased memory_limit up to 1024M, but it makes no difference.
Below is the code I want to get working:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = base_url()."uploads/zipped/".$lastbasket;
$fileContent = file_get_contents($zipdlpath);
force_download($lastbasket, $fileContent);
I have also tried using the following code:
$this->load->helper("download");
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;
force_download($zipdlpath, NULL);
Providing a direct link to the zip file works fine (so I know the issue isn't with the zip file itself), but the force_download function in the controller appears to have an issue with larger files. Or is there a setting I am missing somewhere that is enforcing a limit somehow?
PHP 7.1.33
CodeIgniter 3.1.9
Try increasing the memory limit by adding this code:
ini_set('memory_limit','1024M');
Increase the memory limit and use fopen()/fread().
Try this:
$lastbasket = "uniquefilename.zip";
$zipdlpath = FCPATH."uploads/zipped/".$lastbasket;

if (is_file($zipdlpath))
{
    // Send the download headers ourselves instead of calling force_download(),
    // which reads the whole file into memory.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="'.$lastbasket.'"');
    header('Content-Length: '.filesize($zipdlpath));

    // Stream the file in 1 MB chunks so memory use stays constant.
    $chunkSize = 1024 * 1024;
    $handle = fopen($zipdlpath, 'rb');
    while (!feof($handle))
    {
        $buffer = fread($handle, $chunkSize);
        echo $buffer;
        ob_flush();
        flush();
    }
    fclose($handle);
    exit;
}
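One more thing worth noting: streaming a few hundred MB through PHP can easily exceed the default 30-second execution limit, so you may also need to lift the time limit before the loop (a small addition, not part of the original snippet):

set_time_limit(0); // allow the download to run longer than max_execution_time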
I've had success with the following custom download helper; maybe it will work for you.
Ref Link - https://github.com/bcit-ci/CodeIgniter/wiki/Download-helper-for-large-files
I have an issue with the PHP function xml_parse: it does not work with huge files. I have an XML file about 10 MB in size.
The problem is that I have the old XML-RPC library from Zend, and it also sets up other things (element handlers and case folding).
$parser_resource = xml_parser_create('utf-8');
xml_parser_set_option($parser_resource, XML_OPTION_CASE_FOLDING, true);
xml_set_element_handler($parser_resource, 'XML_RPC_se', 'XML_RPC_ee');
xml_set_character_data_handler($parser_resource, 'XML_RPC_cd');
if (!xml_parse($parser_resource, $data, 1)) {
    // ends here with the 10 MB file
}
Elsewhere I simply use simplexml_load_file with the LIBXML_PARSEHUGE option, but in this case I don't know what I can do.
The best outcome would be if xml_parse had some parameter for huge files too.
Thank you for your advice.
Error is:
XML error: No memory at line ...
The chunk of the file you pass to the parser may be too large.
If you read the file with fread, as in
while ($data = fread($fp, 1024*1024)) {...}
use a smaller chunk length (in my case it had to be smaller than 10 MB), e.g. 1 MB, and put the xml_parse call inside the while loop.
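Putting it together with the handlers from the question, a minimal sketch of feeding the parser in 1 MB chunks might look like this (the file name is a placeholder; XML_RPC_se, XML_RPC_ee and XML_RPC_cd are the question's existing handler functions):

$parser_resource = xml_parser_create('utf-8');
xml_parser_set_option($parser_resource, XML_OPTION_CASE_FOLDING, true);
xml_set_element_handler($parser_resource, 'XML_RPC_se', 'XML_RPC_ee');
xml_set_character_data_handler($parser_resource, 'XML_RPC_cd');

$fp = fopen('huge.xml', 'r');
while ($data = fread($fp, 1024 * 1024)) {
    // Feed one chunk at a time; is_final stays false inside the loop.
    if (!xml_parse($parser_resource, $data, false)) {
        die(sprintf("XML error: %s at line %d",
            xml_error_string(xml_get_error_code($parser_resource)),
            xml_get_current_line_number($parser_resource)));
    }
}
// Signal the end of the document with a final empty chunk.
xml_parse($parser_resource, '', true);
fclose($fp);
xml_parser_free($parser_resource);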
I have a file plain.cache which is a little over 10 MB, and I made a gzcompressed file gz.cache out of the original plain.cache. Then I made two separate pages, each loading one of the two cache files, and was somewhat surprised that the load speed of both pages was almost the same. So my question is: am I right to conclude that the gzcompressed file does not benefit the page load speed in any way? I would conclude that the gzuncompress I use in the gz.php file produces exactly the same string as when I read it from the plain file. Given all these statements, the general question is how I can (if it is done this way at all) increase the load speed by compressing the file with gzcompress.
The code of the files is as follows:
_makeCache.php, in which I make the gzcompressed version of the plain.cache file:
$str = file_get_contents("plain.cache");
$strCompressed = gzcompress($str, 9);
$file = "gz.cache";
$fp = fopen($file, "w");
fwrite($fp, $strCompressed);
fclose($fp);
plain.php:
echo file_get_contents("plain.cache");
gz.php:
echo gzuncompress(file_get_contents("gz.cache"));
Your HTTP server is most likely compressing plain.cache automatically on the fly, using gzip as well, and the client decompresses it transparently, so you should see almost no difference.
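You can verify that by looking at the Content-Encoding header of the response. A quick sketch (the URL is a placeholder; passing a context to get_headers() requires PHP 7.1+):

$context = stream_context_create(array('http' => array('header' => "Accept-Encoding: gzip\r\n")));
$headers = get_headers('http://example.com/plain.cache', true, $context);
echo isset($headers['Content-Encoding']) ? $headers['Content-Encoding'] : 'no Content-Encoding header';

And if you do want the pre-compressed cache to save work, you can serve it as-is and let the browser decompress it. One caveat: gzcompress() produces zlib-format data, while the Content-Encoding: gzip header expects gzip format, so the cache would need to be written with gzencode() instead. A sketch under that assumption:

if (strpos(isset($_SERVER['HTTP_ACCEPT_ENCODING']) ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '', 'gzip') !== false) {
    header('Content-Encoding: gzip');
    readfile('gz.cache'); // cache written with gzencode(), not gzcompress()
} else {
    echo file_get_contents('plain.cache');
}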
I have an interesting problem. I need to show a progress bar for a file that a PHP script is downloading asynchronously. I thought the best way to do it is, before the download starts, to have the script create a txt file containing the file name and the original file size.
Now we have an AJAX function calling a PHP script that is meant to check the size of the local file. I have two main problems:
the files are bigger than 2 GB, so the filesize() function is out of business;
I tried to find a different way to determine the local file size, like this:
function getSize($filename) {
    // Seek to the end of the file and read the position back as the size.
    $a = fopen($filename, 'r');
    fseek($a, 0, SEEK_END);
    $filesize = ftell($a);
    fclose($a);
    return $filesize;
}
Unfortunately, the second way gives me a ton of errors; I assume I cannot open a file that is currently being downloaded.
Is there any way to check the size of a file that is still downloading, when the file will be bigger than 2 GB?
Any help is greatly appreciated.
I found the solution by using the exec() function:
exec("ls -s -k /path/to/your/file/".$file_name,$out);
Just switch to an OS and PHP build that support 64-bit computing, and you can keep using filesize().
From the filesize() manual:
Return Values
Returns the size of the file in bytes, or FALSE (and generates an error of level E_WARNING) in case of an error.
Note: Because PHP's integer type is signed and many platforms use 32bit integers, some filesystem functions may return unexpected results for files which are larger than 2GB.
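A quick way to check which kind of build you are on:

// PHP_INT_SIZE is 8 on a 64-bit build and 4 on a 32-bit one.
echo PHP_INT_SIZE === 8
    ? "64-bit integers: filesize() is safe past 2 GB\n"
    : "32-bit integers: filesize() overflows past 2 GB\n";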
I'm trying to let users upload files to my website, but unfortunately some of the files seem to end up corrupt when I read them. I've tried both images and HTML files; all the images come through corrupt (the HTML files come through fine).
To upload the files I'm using a standard HTML form and the PHP $_FILES array. I'm then using the following code to read the contents of the file:
$filename = $_FILES['varname']['tmp_name'];
$handle = fopen($filename, "r");
$contents = fread($handle, filesize($filename));
fclose($handle);
Unfortunately the value of $contents is now slightly different from the file I uploaded (here's a snippet from the top of the file):
Original file:
ˇÿˇ·ExifII*ˇÏDucky<ˇÓAdobed¿ˇ€Ñ
New file:
ˇÿˇ· Exif II* ˇÏ Ducky < ˇÓ Adobe d¿ ˇ€ Ñ
As you can see, there's a difference in the spacing. Any ideas what could be causing this? Am I handling the file read incorrectly for binary files? It seems odd that it's fine for any text files I upload.
Thanks!
I usually output files like this:
header("Content-Disposition: attachment; filename=\"$fileName\"");
readfile("$HOME_DIR/uploads/$fileName");
exit();
Anyway, to debug your problem, you should first work out which phase is failing: upload or download? To check, go to your web server and download the file via FTP, then open it in a binary editor. If it is already corrupt, you need to investigate the upload phase; otherwise it's the other way around.
How do you print $contents? Are you sure the problem is with reading the file?
I guess this may be a problem with PRINTING the file to the output. Try printing it in a binary-safe way, something like:
$data = unpack("C*", $contents);
foreach ($data as $v)
{
    echo $v, ' ';
}
and compare that with a binary dump of the original file...
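If you want to automate that comparison, a small sketch along the same lines (both file paths are placeholders) could dump the first bytes of each copy:

// Print the first $n bytes of a file as decimal values, like the loop above.
function dumpBytes($path, $n = 32)
{
    $data = unpack("C*", file_get_contents($path, false, null, 0, $n));
    return implode(' ', $data);
}
echo dumpBytes('original.jpg'), "\n";
echo dumpBytes('uploaded.jpg'), "\n";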