I am working with files and folders within CakePHP, and everything works fine, in the way I want it to. However, when zipping files I get the following error message:
Error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 240047685 bytes)
Zipping smaller files is fine! I have even zipped files of around 10 MB without any issues, but zipping files that are larger in size seems to be a problem.
I have added the following to my .htaccess file and made a php.ini file, as I thought that might be the issue:
php_value upload_max_filesize 640000000M
php_value post_max_size 640000000M
php_value max_execution_time 30000000
php_value max_input_time 30000000
Then I found some posts pointing out that PHP has a 4 GB file limit. Well, even if that is the case, why does my zip fail on this file, which is only about 245 MB?
public function ZippingMyData() {
    $UserStartPath = '/data-files/tmp/';
    $MyFileData = $this->data['ZipData']; // the files selected from a form!
    foreach ($MyFileData as $DataKey => $DataValue) {
        $files = array($UserStartPath.$DataValue);
        $zipname = 'file.zip';
        $zip = new ZipArchive();
        $zip_name = time().".zip"; // zip name
        $zip->open($zip_name, ZipArchive::CREATE);
        foreach ($files as $file) {
            $path = $file;
            if (file_exists($path)) {
                $zip->addFromString(basename($path), file_get_contents($path));
            } else {
                echo "file does not exist";
            }
        } // end of foreach loop for $files
    } // end of foreach for $MyFileData
    $this->set('ZipName', $zip_name);
    $this->set('ZipFiles', $MyFileData);
    $zip->close();
    copy($zip_name, $UserStartPath.$zip_name);
    unlink($zip_name); // after copy, remove the temp file
    $this->render('/Pages/download');
} // end of function
Any ideas where I am going wrong? I will state that this is NOT my code; I found bits of it in other posts and changed it to fit the needs of my project!
All help most welcome...
Thanks
Glenn.
I think that ZipArchive loads your file into memory, so you have to increase the memory_limit setting in php.ini.
To avoid consuming all the memory of your server and degrading performance when your file is big, a better (but far from the best) solution would be:
public function ZippingMyData() {
    $UserStartPath = '/data-files/tmp/';
    $MyFileData = $this->data['ZipData']; // the files selected from a form!
    foreach ($MyFileData as $DataKey => $DataValue) {
        $files = array($UserStartPath.$DataValue);
        $zip_name = time().".zip"; // zip name
        // Instead of a foreach you can put all the files in a single command:
        // /usr/bin/zip $UserStartPath$zip_name $files[0] $files[1] and so on
        foreach ($files as $file) {
            $path = $file;
            if (file_exists($path)) {
                // -j stores just the file name (like basename()); escapeshellarg()
                // guards against spaces and shell metacharacters in the paths
                exec("/usr/bin/zip -j " . escapeshellarg($UserStartPath.$zip_name) . " " . escapeshellarg($path));
            } else {
                echo "file does not exist";
            }
        } // end of foreach loop for $files
    } // end of foreach for $MyFileData
    $this->render('/Pages/download');
} // end of function
or similar (depends on your server). This solution has only two limits: disk space and zip's own limitations.
I apologize for the poor quality of my code and for any errors.
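If you would rather stay in PHP, another option is a sketch like the one below (function and variable names are illustrative, not from the post): ZipArchive::addFile() only records the path, and the data is streamed from disk when close() writes the archive, so the whole file never has to sit in a PHP string the way it does with addFromString().

```php
<?php
// Minimal sketch: build a zip without reading whole files into memory.
// zipWithoutSlurping() and its parameters are illustrative names.
function zipWithoutSlurping(array $sourceFiles, string $zipPath): bool {
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }
    foreach ($sourceFiles as $file) {
        if (file_exists($file)) {
            // addFile() records the path only; the contents are read from
            // disk when close() writes the archive
            $zip->addFile($file, basename($file));
        }
    }
    return $zip->close();
}
```

This keeps peak memory roughly constant regardless of file size, at the cost of the source files having to remain in place until close() returns.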
I am using the Slim Framework to upload files. I need to upload 10 files of 500 MB in a single request. How can I accomplish this? I am using the following code:
$uploadedFiles = $request->getUploadedFiles();
foreach ($uploadedFiles['aws_file'] as $uploadedFile) {
    if ($uploadedFile->getError() === UPLOAD_ERR_OK) {
        $uploadFileName = $uploadedFile->getClientFilename();
        $fileDetails = pathinfo($uploadedFile->getClientFilename());
        $fileName = explode('_', $fileDetails['filename']);
        if (count($fileName) == 3) {
            $orgIdArray[] = $fileName[1];
        }
    } else {
        $responseObj->status = 'error';
        $responseObj->message = 'Error in file or file is empty';
        $responseObj->errorFileList[] = $uploadedFile->getError();
    }
}
I am getting a memory issue.
Increase your memory_limit PHP setting.
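For example, in php.ini (the values below are purely illustrative; size them for your server and workload rather than copying them, and keep post_max_size at least as large as upload_max_filesize):

```ini
; php.ini - illustrative values only
memory_limit         = 512M
upload_max_filesize  = 500M
post_max_size        = 512M   ; must cover the files plus the rest of the POST body
max_execution_time   = 300
max_input_time       = 300
```

After editing php.ini, restart the web server or PHP-FPM so the new values take effect.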
I am currently working on a tool made with PHP (quite a newbie with this technology...) which should generate zip files with a set of files inside. This set of files can be:
Basic files (multiple formats)
Full directories (added into the resulting zip as a new zipped file: a ZIP inside the final ZIP)
The thing is that when the zip file contains simple files it is downloaded properly, but when it contains the "full directory zip file" the resulting ZIP file gets corrupted...
Below is the code I am currently using (sorry if it's a bit messy, but it is the first time I have worked with PHP...)
function ZipFiles($fileArr, $id) {
    $destination = "{$_SERVER['DOCUMENT_ROOT']}/wmdmngtools/tempFiles/WMDConfigFiles_".$id.".zip";
    $valid_files = array();
    // if files were passed in...
    if (is_array($fileArr)) {
        // cycle through each file
        foreach ($fileArr as $file) {
            if (is_dir($file)) {
                // If the path is a folder we zip it and put it in $valid_files[]
                $resultingZipPath = "{$_SERVER['DOCUMENT_ROOT']}/wmdmngtools/tempFiles/".basename($file)."_FOLDER.zip";
                ZipFolder($file, $resultingZipPath);
                // store only the file name: the addFile() loop below prepends
                // the tempFiles path again, so a full path here would break
                $valid_files[] = basename($resultingZipPath);
            } else {
                // If the path is not a folder then we make sure the file exists
                if (file_exists("{$_SERVER['DOCUMENT_ROOT']}/wmdmngtools/tempFiles/".$file)) {
                    $valid_files[] = $file;
                }
            }
        }
    }
    // if we have good files...
    if (count($valid_files)) {
        // create the archive
        $zip = new ZipArchive();
        if ($zip->open($destination, ZipArchive::CREATE) !== true) {
            return false;
        }
        // add the files
        foreach ($valid_files as $file) {
            $zip->addFile("{$_SERVER['DOCUMENT_ROOT']}/wmdmngtools/tempFiles/".$file, $file);
        }
        $zip->close();
        return $destination;
    } else {
        return "";
    }
}
function ZipFolder($source, $destination) {
    // Initialize archive object
    $folderZip = new ZipArchive();
    $folderZip->open($destination, ZipArchive::CREATE | ZipArchive::OVERWRITE);
    // Create recursive directory iterator
    /** @var SplFileInfo[] $files */
    $files = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($source),
        RecursiveIteratorIterator::LEAVES_ONLY
    );
    foreach ($files as $name => $file) {
        // Skip directories (they are added automatically)
        if (!$file->isDir()) {
            // Get real and relative path for the current file
            $filePath = $file->getRealPath();
            $relativePath = substr($filePath, strlen($source) + 1);
            // Add the current file to the archive
            $folderZip->addFile($filePath, $relativePath);
        }
    }
    // The zip archive is only written when the object is closed
    $folderZip->close();
}
In it we can see two functions:
ZipFiles: the main function, called with a list/array (the files and folders that will be added into the final ZIP) and an ID parameter which is simply used for generating different file names (can be ignored).
ZipFolder: this function is called for each of the folders (not files) in the above-mentioned list/array in order to zip that folder and create a zip file to add to the final file (based on what I found in How to zip a whole folder using PHP).
I have tried many of the things mentioned in that post, like closing all files or avoiding empty zips inside the zip, but nothing worked...
Zip inside zip (php)
Maybe I missed something (most probably :) ) but I am running out of ideas, so any help/guidance would be appreciated.
In case more info is needed please let me know and I will post it.
Finally found the issue. It seems that the file was generated properly, but when downloading it from PHP there was a problem when the size was bigger than a certain number.
This was due to a wrong definition of the message length in the header definition:
header("Cache-Control: public");
header("Content-Description: File Transfer");
header("Content-Length: ".filesize($zippedFile));
header("Content-Disposition: attachment; filename=".$zippedFile);
header("Content-type: application/zip");
header("Content-Transfer-Encoding: binary");
Even though I guess it may not be correct practice, I removed the Content-Length entry and now I get the correct file regardless of its size.
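A common cause of this symptom is stray output (template whitespace, PHP notices, a BOM) being sent alongside the file, so the body no longer matches the declared Content-Length. A sketch of a download handler that keeps Content-Length but clears the output buffers first (sendZip() is an illustrative name; $zippedFile comes from the snippet above):

```php
<?php
// Sketch: send a zip for download while keeping Content-Length.
// Clearing the output buffers first drops any stray bytes that would
// otherwise corrupt the stream or invalidate the declared length.
function sendZip(string $zippedFile): void {
    while (ob_get_level() > 0) {
        ob_end_clean(); // discard buffered output (whitespace, notices, BOM)
    }
    header("Content-Type: application/zip");
    header("Content-Disposition: attachment; filename=".basename($zippedFile));
    header("Content-Length: ".filesize($zippedFile));
    readfile($zippedFile); // streams the file without loading it into memory
    exit; // stop the framework from appending anything after the body
}
```

With the buffers cleared and nothing emitted after readfile(), the Content-Length header can usually stay in place.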
I have to zip search results containing a maximum of 10,000 files, with a total size of well over 1 GB.
I create a zip archive and, in a for loop, read every file with fread and add the resulting file to the archive.
I never finish adding files because of this error:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 1257723 bytes)
but I don't think adding 1 GB or more to the memory_limit value in php.ini is a solution, because memory resources are limited.
Because the zip file stays in memory until it is closed (or so I read in another question), I devised a way to create a series of 50 MB zip files to avoid exhausting memory. But even though the script creates another zip file, it stops with the same PHP fatal error on the same file (the 174th).
Why?
Am I doing something wrong?
Any help will be appreciated.
Here is a code snippet of the file creation:
$zip = new ZipArchive();
$nomeZipFile = "../tmp/" . $title . ".zip";
for ($x = 0; $x < count($risultato); $x++) {
    $numFiles = 0;
    $dir = '../tmp';
    if (file_exists($nomeZipFile)) {
        if (filesize($nomeZipFile) > 52428800) {
            // count the files already in the temp dir
            if ($handle = opendir($dir)) {
                while (($file = readdir($handle)) !== false) {
                    if (!in_array($file, array('.', '..')) && !is_dir($dir . $file))
                        $numFiles++;
                }
            }
            $nomeZipFile = '../tmp/' . $title . $numFiles . '.zip';
            $res = $zip->open($nomeZipFile, ZipArchive::CREATE);
        } else {
            $res = $zip->open($nomeZipFile, ZipArchive::CREATE);
        }
    } else {
        $res = $zip->open($nomeZipFile, ZipArchive::CREATE);
    }
    ...
    // adding a file
    $fileDownload = "";
    $fDownload = fopen($kt_response->message, "r"); // the file is downloaded through a web service
    while (!feof($fDownload)) { $fileDownload .= fread($fDownload, 1024); flush(); }
    $zip->addFromString($filename, $fileDownload);
    ....
    $zip->close();
}
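One likely contributor in snippets like the one above is building each downloaded file as a PHP string before handing it to addFromString(). A sketch of an alternative (all names illustrative, not from the post): stream each download straight to a temp file, let ZipArchive::addFile() read it from disk, and close the archive after each batch so buffered entries are flushed:

```php
<?php
// Sketch with illustrative names: add many files to a zip while keeping
// memory bounded, by spooling each source to disk and using addFile().
function addFilesInBatches(array $urls, string $zipPath, int $batchSize = 100): void {
    $zip = new ZipArchive();
    foreach (array_chunk($urls, $batchSize) as $batch) {
        // reopening per batch flushes buffered entries to disk at each close()
        $zip->open($zipPath, ZipArchive::CREATE);
        foreach ($batch as $url) {
            $tmp = tempnam(sys_get_temp_dir(), 'dl');
            // copy the source straight to disk instead of into a string
            $src = fopen($url, 'r');
            $dst = fopen($tmp, 'w');
            stream_copy_to_stream($src, $dst);
            fclose($src);
            fclose($dst);
            // entry names collide if two sources share a basename; fine for a sketch
            $zip->addFile($tmp, basename($url));
        }
        $zip->close(); // the temp files must outlive close(); clean them up afterwards
    }
}
```

The peak memory cost is then one stream-copy buffer plus ZipArchive's own bookkeeping, instead of the full size of every pending file.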
I'm working on extracting a zip archive with PHP. The structure of the archive is seven folders, each of which contains on the order of 10,000 files, each around 1 kB.
My code is pretty simple and uses the ZipArchive class:
$zip = new ZipArchive();
$result = $zip->open($filename);
if ($result === true) {
$zip->extractTo($tmpdir);
$zip->close();
}
The problem I'm having, though, is that the extraction seems to halt. The first folder is fully extracted, but only about half of the second one is. None of the other five are extracted at all.
I also tried using this code, which breaks it into chunks of 10 kB at a time, but got the exact same result:
$archive = zip_open($filename);
while ($entry = zip_read($archive)) {
    $size = zip_entry_filesize($entry);
    $name = zip_entry_name($entry);
    if (substr($name, -1) == '/') {
        if (!file_exists($tmpdir . $name)) mkdir($tmpdir . $name);
    } else {
        $unzipped = fopen($tmpdir . $name, 'wb');
        while ($size > 0) {
            $chunkSize = ($size > 10240) ? 10240 : $size;
            $size -= $chunkSize;
            $chunk = zip_entry_read($entry, $chunkSize);
            if ($chunk !== false) fwrite($unzipped, $chunk);
        }
        fclose($unzipped);
    }
}
I've also tried increasing the PHP memory limit from 512 MB to 1024 MB, but again got the same result. Everything unzipped is around 100 MB, so I wouldn't expect it to be a memory issue anyway.
Probably it's your max execution time... disable the limit completely by setting it to 0, or set a suitable value:
ini_set('max_execution_time', 10000);
...don't set it to 0 in production use...
If you don't have access to ini_set() because of the disable_functions directive, you may have to edit the value in your php.ini directly.
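Since ini_set() fails silently in that situation, a small sketch that raises the limit and verifies it actually took effect may help:

```php
<?php
// Sketch: raise max_execution_time and confirm the change stuck,
// since ini_set() returns false (or is a no-op) when it is disallowed.
$ok = @ini_set('max_execution_time', 10000);
if ($ok === false || ini_get('max_execution_time') != 10000) {
    // fall back to editing php.ini if the runtime change was rejected
    error_log('max_execution_time could not be changed at runtime; edit php.ini');
}
```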
Here is the code that I use to upload files and unzip them into a directory.
But the problem is that it seems to be very slow on files greater than 5 MB.
I don't think it has to do with the network, because it is on the localhost machine.
Do I need to edit any parameter in the php.ini file or Apache, or is there any other workaround?
$target_path = "../somepath/";
$target_path .= JRequest::getVar('uploadedDirectory')."/";
$target_Main_path = JRequest::getVar('uploadedDirectory')."/";
$folderName = $_FILES['uploadedfile']['name'];
$target_path = $target_path . basename($folderName);
// upload the file (note: no trailing slash, the destination must be a file path)
if (move_uploaded_file($_FILES['uploadedfile']['tmp_name'], $target_path)) {
    echo "The file " . basename($_FILES['uploadedfile']['name']) . " has been uploaded";
} else {
    echo "There was an error uploading the file, please try again!";
}
$zip = new ZipArchive;
$res = $zip->open($target_path);
$target_path = str_replace(".zip", "", $target_path);
echo $target_path;
if ($res === TRUE) {
    $zip->extractTo($target_path."/");
    $zip->close();
    echo "ok";
} else {
    echo "failed";
}
When handling file uploads there are a lot of factors to consider. These include the PHP settings:
max_input_time
max_execution_time
upload_max_filesize
post_max_size
Execution time can affect the upload if, for example, you have a slow upload speed resulting in a timeout.
File size can cause problems if the upload is larger than upload_max_filesize; likewise if your file size plus the rest of the POST data exceeds post_max_size.
Zip uses a lot of memory in the archive/extraction process and could easily exceed the memory allocated to PHP, so that would be worth checking as well.
I would start with this page and note some of the comments.
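A quick way to see what you are actually running with is to print the current values of those settings (ini_get() returns the raw configured value as a string):

```php
<?php
// Print the current values of the settings that commonly break large uploads.
$settings = array('max_input_time', 'max_execution_time',
                  'upload_max_filesize', 'post_max_size', 'memory_limit');
foreach ($settings as $name) {
    echo $name, ' = ', ini_get($name), PHP_EOL;
}
```

Keep in mind that the CLI and the web server often use different php.ini files, so run this through the same SAPI that handles the uploads.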