I'm using PHP's ZipArchive class to create a zip file of a directory. The code works fine for almost all directories except two, which happen to be the largest ones (the largest directory currently contains 51 files with a total of 175 MB). When I run the code for these directories, a temporary file ('filename.zip.[RandomLettersAndNumbers]', e.g. 'filename.zip.riaab4') is created with a size of 67,108,864 bytes and the script throws an internal server error (500).
What I've tried so far (most of it is still visible in the source):
Increased memory_limit
Increased max_execution_time
Checked the PHP error log: it is empty
Tried to find the exact line where the error occurs: the code is executed up to $zip_archive->close();
Source:
//Temporary (debugging) START
ini_set("log_errors", 1);
ini_set("error_log", "php-error.log");
error_log("Hello errors.");
ini_set('memory_limit','512M');
set_time_limit(180);
//phpinfo();
//Temporary (debugging) END
//Get selected directory
// --> Code removed
$result['directory'] = 'directory1';
//Create .zip file
$zip_archive = new ZipArchive();
$ZIP = 'files/'.$result['directory'].'.zip';
if($zip_archive->open($ZIP, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE) {
echo 'Error! Error while creating ZIP Archive.';
die;
}
foreach(new DirectoryIterator('files/'.$result['directory']) as $fileInfo) {
if($fileInfo->isDot())
continue;
if(!$fileInfo->isFile()) //only add regular files
continue;
$zip_archive->addFile($fileInfo->getPathname(), $fileInfo->getFilename());
}
//Temporary (debugging) START
echo '<br><br><br>';
var_dump($zip_archive->close());
// ## The following lines are not executed!! ##
echo '<br><br><br>';
var_dump($zip_archive->getStatusString());
echo '<br><br><br>';
//Temporary (debugging) END
if($zip_archive->numFiles > 0 && $zip_archive->close() == true)
$FILE = $ZIP;
else {
}
Am I missing something? If there's anything else I can try, please let me know. Thanks.
EDIT: I've got a more specific question: Why do the temporary zip files of the affected directories all have a file size of exactly 67,108,864 bytes? Is this related to a maximum file size set by PHP/the server, or can it be explained by the zip standard/ZipArchive class?
Try closing the ZIP archive and reopening it after every 5 or so files are added, as in the sketch below. I assume this works around a memory limit issue.
Credit to MatthewS for this solution.
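A minimal sketch of that approach (the batch size of 5 comes from the answer; everything else mirrors the question's code and is illustrative, not tested against the original setup):
<?php
$zip = new ZipArchive();
$zipPath = 'files/'.$result['directory'].'.zip';
//Create the archive once, then reopen it (without OVERWRITE) between batches
if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE) {
    die('Error while creating ZIP archive.');
}
$count = 0;
foreach (new DirectoryIterator('files/'.$result['directory']) as $fileInfo) {
    if (!$fileInfo->isFile())
        continue;
    $zip->addFile($fileInfo->getPathname(), $fileInfo->getFilename());
    //Flush the pending entries to disk every 5 files
    if (++$count % 5 === 0) {
        $zip->close();
        if ($zip->open($zipPath) !== TRUE) { //reopen for appending
            die('Error while reopening ZIP archive.');
        }
    }
}
$zip->close();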
Related
I am having trouble appending to my log file. If my log file is over 50 MB, I want to make a new log file; if not, append to the previous one. I am not sure if I am using FILE_APPEND correctly. Would it be best to use fopen and fwrite?
Here is my code below:
//this is the file to test size to determine whether to append to it or start a new log file
if ($latestFile != '') {
$latestFileSize = filesize($latestFile);//file size in bytes (1 MB = 1,000,000 bytes, matching the threshold below)
}
if ($latestFileSize != 0) {
$fileSizeThreshold = 50 * 1000000;//Threshold for log file size limit in bytes (50 MB)
if ($latestFileSize > $fileSizeThreshold) {
//The latest results file deletion log file is over 50 MB in size, so create a new one
file_put_contents("c:\\sites\\{$logFileName}", $log);
} else {
//2. Troubleshoot why the FILE_APPEND is not working and fix it. What is the normal behavior of the FILE_APPEND?
//The latest results file deletion log file is NOT over 50 MB in size, so append log entry to latest one
file_put_contents("c:\\sites\\{$latestFileName}", $log, FILE_APPEND);
}
} else {
//This is the first file in the log directory. Create it.
file_put_contents("c:\\sites\\{$logFileName}", $log);
}
closedir($handle);
}//end if $handle
Are you sure $latestFileSize is not always == 0?
There is usually nothing wrong with file_put_contents; troubleshoot the logic first. Make sure execution actually reaches the file_put_contents call with FILE_APPEND, because the way you use it is perfectly fine.
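One way to check is to initialize $latestFileSize and trace which branch runs (a restructured sketch; the initialization, the elseif, and the error_log trace lines are additions, the rest follows the question's code):
<?php
$latestFileSize = 0; //initialize so the size check is well-defined when no file exists
if ($latestFile != '') {
    $latestFileSize = filesize($latestFile); //bytes
}
$fileSizeThreshold = 50 * 1000000; //50 MB
if ($latestFileSize > $fileSizeThreshold) {
    error_log("branch: rotate to new file"); //temporary trace
    file_put_contents("c:\\sites\\{$logFileName}", $log);
} elseif ($latestFileSize > 0) {
    error_log("branch: append"); //temporary trace
    file_put_contents("c:\\sites\\{$latestFileName}", $log, FILE_APPEND);
} else {
    error_log("branch: first file"); //temporary trace
    file_put_contents("c:\\sites\\{$logFileName}", $log);
}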
I have a problem extracting a zip file with PHP.
I have tried many scripts shared on the web, but none of them work.
The last script I tried is this one:
<?php
$zip = new ZipArchive;
$res = $zip->open('data.zip');
if ($res === TRUE) {
$zip->extractTo('/extract/');
$zip->close();
echo 'woot!';
} else {
echo 'doh!';
}
?>
I always get the else condition when the script runs.
I've tried replacing the data.zip and the /extract/ path with the complete paths http://localhost/basedata/data.zip and http://localhost/basedata/extract/, but I still get the else condition. Can anyone help me?
Here is my whole script and the zip file:
http://www.mediafire.com/?c49c3xdxjlm58ey
You should check which error code open() returns (http://www.php.net/manual/en/ziparchive.open.php); that will give you some help.
The error codes are:
ZipArchive::ER_EXISTS - 10
ZipArchive::ER_INCONS - 21
ZipArchive::ER_INVAL - 18
ZipArchive::ER_MEMORY - 14
ZipArchive::ER_NOENT - 9
ZipArchive::ER_NOZIP - 19
ZipArchive::ER_OPEN - 11
ZipArchive::ER_READ - 5
ZipArchive::ER_SEEK - 4
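A small sketch of that check (the error-message mapping is illustrative, not exhaustive):
<?php
$zip = new ZipArchive();
$res = $zip->open('data.zip');
if ($res === TRUE) {
    $zip->extractTo('/extract/');
    $zip->close();
    echo 'woot!';
} else {
    //$res holds one of the ZipArchive::ER_* codes listed above
    $messages = array(
        ZipArchive::ER_NOENT => 'No such file',
        ZipArchive::ER_NOZIP => 'Not a zip archive',
        ZipArchive::ER_OPEN  => 'Cannot open file',
        ZipArchive::ER_READ  => 'Read error',
    );
    echo 'doh! open() returned code '.$res;
    if (isset($messages[$res])) {
        echo ' ('.$messages[$res].')';
    }
}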
ZipArchive uses file paths, not URLs.
For example, if your web server's document root is /srv/www, specify /srv/www/basedata/data.zip (or wherever the file is on the server's local file system), not http://localhost/basedata/data.zip.
If the .ZIP file is on a remote computer, change the script to download it first before extracting it, though this does not seem to be your case.
Furthermore, the user the PHP script runs as needs read permission on the zip file and write permission on the destination directory for the extracted files.
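For example, a sketch assuming the script lives in the basedata directory under the document root (__DIR__ resolves to the directory of the running script):
<?php
//Use local filesystem paths, not URLs
$zip = new ZipArchive();
if ($zip->open(__DIR__.'/data.zip') === TRUE) { //e.g. /srv/www/basedata/data.zip
    $zip->extractTo(__DIR__.'/extract/'); //must be writable by the PHP user
    $zip->close();
    echo 'woot!';
} else {
    echo 'doh!';
}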
You are declaring your $zip (ZipArchive) without parentheses:
<?php
$zip = new ZipArchive;
That form is actually valid PHP, but it is clearer to write the parentheses explicitly and, more importantly, to compare the return value of open() strictly against TRUE, since its error codes are truthy integers:
<?php
$zip = new ZipArchive();
if ($zip->open('data.zip') === TRUE) {
I am trying to unzip one file that has two CSV files in it. Every variation of my code has the same result. Using the code I have currently, it gets only one file partially out and then hangs. The unzipped file shows 38,480 KB and gets stuck toward the end of row 214,410. The true file in the archive is 38,487 KB and has a total of 214,442 rows. Any idea what could be causing this to hang at the last minute? I am doing all my testing with XAMPP on localhost on a Windows 7 machine. This is PHP, and it is the only code in the file. The required zip file is in the same folder.
<?php
ini_set('memory_limit', '-1');
$zip = new ZipArchive;
if ($zip->open('ItemExport.zip') === TRUE) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $filename = $zip->getNameIndex($i);
        $zip->extractTo('.', array($filename));
    }
    $zip->close();
} else {
    echo 'failed';
}
?>
Thanks in advance for any help!
EDIT:
The file shows up almost immediately in the correct directory, and a few seconds later it reaches the file size of 38,480 KB. At that point it doesn't do anything else. After waiting on it for more than long enough (5-10 minutes+), I opened the file. It is "locked for editing by 'another user'", as it is still being held by the HTTP process. The writing to the CSV file just stops mid-word, mid-sentence on column 9 of 11 in row 214,442.
I am trying to build a small daemon in PHP that analyzes the log files on a Linux system (e.g. follows the syslog).
I have managed to open the file via fopen and continuously read it with stream_get_line. My problem starts when the monitored file is deleted and recreated (e.g. when rotating logs). The program then does not read anything anymore, even if the file grows larger than before.
Is there an elegant solution for this? stream_get_meta_data does not help, and using tail -f on the command line shows the same problem.
EDIT: added sample code. I tried to boil the code down to a minimum to illustrate what I am looking for:
<?php
$break = FALSE;
$handle = fopen('./testlog.txt', 'r');
do {
    $line = stream_get_line($handle, 100, "\n");
    if (!empty($line)) {
        // do something
        echo $line;
    }
    while (feof($handle)) {
        sleep(5);
        $line = stream_get_line($handle, 100, "\n");
        if (!empty($line)) {
            // do something
            echo $line;
        }
        // a commenter on php.net indicated it is possible
        // with tcp streams to distinguish empty and lost
        // does NOT work here --> need somefunction($handle)
        if ($line !== FALSE && $line === '') $break = TRUE;
    }
} while (!$break);
fclose($handle);
?>
When log files are rotated, the original file is renamed or copied, then deleted, and a new file with the same name is created. The new file may have the same name as the original, but it has a different inode. Inodes (a dumbed-down description follows) are like hidden incremental index numbers for your files. You can rename a file or move it, and it takes its inode with it. Once that original log file is deleted, you can't keep reading a file of the same name through the same file handle, because the inode has changed. Your best bet is to detect the failure and attempt to open the new file.
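A minimal sketch of that detection, assuming a POSIX system where stat() and fstat() report inode numbers (the poll interval and buffer size are illustrative):
<?php
$path = './testlog.txt';
$handle = fopen($path, 'r');
$inode = fstat($handle)['ino']; //inode of the file we actually hold open
while (true) {
    $line = stream_get_line($handle, 4096, "\n");
    if ($line !== false && $line !== '') {
        echo $line, "\n"; //do something with the line
        continue;
    }
    sleep(5); //at EOF: wait for new data
    //If the path now resolves to a different inode, the log was rotated
    clearstatcache(true, $path);
    $current = @stat($path);
    if ($current === false || $current['ino'] !== $inode) {
        fclose($handle);
        $handle = fopen($path, 'r'); //reopen the new file from the start
        $inode = fstat($handle)['ino'];
    }
}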
I have a project where the user can add multiple files to a cart and then zip them into a single file and download all of them at once.
It seems though that as soon as the cart gets bigger than 10 MB, it cannot create the zip file.
I thought this might be because of upload_max_filesize, but that is set to 200M, and the same goes for post_max_size.
What would be limiting this?
Here is my code.
It creates a random file name, then checks the cart loop and adds each file.
// create object
$zip = new ZipArchive();
$filename = "loggedin/user_files/ABX-DOWNLOAD-".rand(1, 1000000).".zip";
while (file_exists($filename)):
$filename = "loggedin/user_files/ABX-DOWNLOAD-".rand(1, 1000000).".zip";
endwhile;
// open archive
if ($zip->open('../'.$filename, ZipArchive::CREATE) !== TRUE) {
die ("Could not open archive");
}
// add files
//echo $all_audio;
foreach($audio_ids as $audio_id){
if($audio_id != ""){
$audio = Audio::find_by_id($audio_id);
$selected_category = Categories::find_by_id($audio->category_id);
$selected_sub_category = Sub_Categories::find_by_id($audio->sub_category_id);
$selected_sub_sub_category = Sub_Sub_Categories::find_by_id($audio->sub_sub_category_id);
$f = "uploads/mp3/".$selected_category->category_name."/".$selected_sub_category->sub_category_name."/".$selected_sub_sub_category->sub_sub_category_name."/".$audio->media_path;
$zip->addFile($f) or die ("ERROR: Could not add file: $f");
}
}
// close and save archive
$zip->close() or die("ERROR: COULD NOT CLOSE");
If the archive is being created in memory, you can increase the PHP memory limit through php.ini or other means, or alternatively write the file directly to disk while it is being created.
You can also stream the file to the user as it is being created. See php rendering large zip file - memory limit reached
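For example, raising the limit at runtime (the value is illustrative; the same setting can go in php.ini):
<?php
ini_set('memory_limit', '512M');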
Are you using the addFile() method?
If so, try replacing that call with addFromString() + file_get_contents().
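A sketch of that substitution, using the $f path from the question (storing the entry under basename($f) is a choice made here, not something from the answer):
<?php
//Instead of letting ZipArchive read the file from disk lazily at close():
//$zip->addFile($f);
//read the contents yourself and hand them to the archive:
$contents = file_get_contents($f);
if ($contents === false) {
    die("ERROR: Could not read file: $f");
}
$zip->addFromString(basename($f), $contents) or die("ERROR: Could not add file: $f");
Note that this reads each file fully into memory, so it trades addFile()'s deferred reads for immediate memory use; it mainly helps surface read errors at add time rather than at close().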
Well, I narrowed it down: a pathname was corrupted and was making the zip error out because the file did not exist. Looks like I need some better error checking.