PHP ZipArchive extractTo Hanging Issue

I am trying to unzip an archive that contains two CSV files. Every variation of my code has the same result: with the code I currently have, it gets only one file partially out and then hangs. The extracted file shows as 38,480 KB and gets stuck toward the end of row 214,410; the true file in the archive is 38,487 KB and has a total of 214,442 rows. Any idea what could be causing this to hang at the last minute? I am doing all my testing with XAMPP on localhost on a Windows 7 machine. This is PHP, and it is the only code in the file; the zip file it requires is in the same folder with it.
<?php
ini_set('memory_limit', '-1');

$zip = new ZipArchive;
if ($zip->open('ItemExport.zip') === TRUE) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $filename = $zip->getNameIndex($i);
        $zip->extractTo('.', array($filename));
    }
    $zip->close();
} else {
    echo 'failed';
}
?>
Thanks in advance for any help!
EDIT:
The file shows up almost immediately in the correct directory, and a few seconds later it reaches a file size of 38,480 KB. At that point it doesn't do anything else. After waiting on it for MORE than long enough (5-10 minutes+) I opened the file. It is "locked for editing by 'another user'", as it is still being held by the HTTP process. The writing to the CSV file just stops mid-word, mid-sentence, on column 9 of 11 in row 214,442.
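Since extractTo() seems to stall at the tail end of the entry, one variation worth testing (only a sketch, not a confirmed fix; it assumes the same ItemExport.zip sitting next to the script) is to stream each entry with getStream() and copy it out in chunks instead of relying on extractTo():

<?php
// Sketch: copy each entry out manually via a read stream instead of extractTo().
// Assumes ItemExport.zip is in the same folder as this script, as in the question.
ini_set('memory_limit', '-1');

$zip = new ZipArchive;
if ($zip->open('ItemExport.zip') === TRUE) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $name = $zip->getNameIndex($i);
        $in = $zip->getStream($name);
        if ($in === false) {
            continue; // skip entries that cannot be streamed
        }
        $out = fopen('./' . basename($name), 'wb');
        while (!feof($in)) {
            fwrite($out, fread($in, 8192)); // copy in 8 KB chunks
        }
        fclose($in);
        fclose($out);
    }
    $zip->close();
} else {
    echo 'failed';
}
?>

If this version also stops at the same byte count, the problem is more likely in the archive itself or the environment than in extractTo().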

Related

PHP cannot see many files included in a zip file

I'm reading files that are located inside a .zip archive (in this case an XBRL taxonomy) using PHP 7 and its ZipArchive functions, but a whole lot of files that I am sure are inside are simply skipped, ignored as if they did not exist.
This is the result of running zipinfo www.eba.europa.eu.zip on the file
https://www.dropbox.com/s/336njdmfg8uaho8/output-zipinfo.txt?dl=0
This is the result of reading the content of the zip using the following code:
$zip = new \ZipArchive();
$zip->open("www.eba.europa.eu.zip");
for ($i = 0; $i < $zip->numFiles; $i++) {
    echo 'Filename: ' . $zip->getNameIndex($i) . PHP_EOL;
}
https://www.dropbox.com/s/7njkp5i92d68fxs/output-ziparchive.txt?dl=0
As you can see, all the files with finrep in their name are simply not present in the second listing.
What could it be? Missing permissions on something? A file size/number limit? Sorry for the Dropbox links, but the logs are both quite big considering the number of files.
Thanks in advance for the help!
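One way to narrow this down is to capture the exact return code from open() and compare the entry count ZipArchive reports with the count from zipinfo. A minimal diagnostic sketch (the archive name is the one from the question):

$zip = new \ZipArchive();
$res = $zip->open("www.eba.europa.eu.zip");
if ($res !== TRUE) {
    // open() returns an integer error code on failure rather than throwing
    die('open() failed with error code ' . $res);
}
echo 'Entries reported by ZipArchive: ' . $zip->numFiles . PHP_EOL;
echo 'Status: ' . $zip->getStatusString() . PHP_EOL;
$zip->close();

If numFiles already disagrees with zipinfo, the entries are being dropped when the archive is parsed rather than during iteration.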

PHP: Error (500) when creating a zip file with ZipArchive

I'm using PHP's ZipArchive class to create a zip file of a directory. The code works fine for almost all directories except two, which happen to be the largest ones (the largest directory currently contains 51 files with a total of 175 MB). When I run the code for these directories, a temporary file ('filename.zip.[RandomLettersAndNumbers]', e.g. 'filename.zip.riaab4') is created with a size of 67,108,864 bytes and the script throws an internal server error (500).
What I've tried so far (most of it is still visible in the source):
Increase memory_limit
Increase max_execution_time
Check the PHP error log: the error log is empty
Try to find the exact line where the error occurs: the code is executed up to $zip_archive->close();
Source:
//Temporary (debugging) START
ini_set("log_errors", 1);
ini_set("error_log", "php-error.log");
error_log("Hello errors.");
ini_set('memory_limit', '512M');
set_time_limit(180);
//phpinfo();
//Temporary (debugging) END

//Get selected directory
// --> Code removed
$result['directory'] = 'directory1';

//Create .zip file
$zip_archive = new ZipArchive();
$ZIP = 'files/' . $result['directory'] . '.zip';

if ($zip_archive->open($ZIP, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE) {
    echo 'Error! Error while creating ZIP Archive.';
    die;
}

foreach (new DirectoryIterator('files/' . $result['directory']) as $fileInfo) {
    if ($fileInfo->isDot())
        continue;
    if (file_exists($fileInfo->getPath()))
        $zip_archive->addFile('files/' . $result['directory'] . '/' . $fileInfo->getFilename(), $fileInfo->getFilename());
}

//Temporary (debugging) START
echo '<br><br><br>';
var_dump($zip_archive->close());
// ## The following lines are not executed!! ##
echo '<br><br><br>';
var_dump($zip_archive->getStatusString());
echo '<br><br><br>';
//Temporary (debugging) END

if ($zip_archive->numFiles > 0 && $zip_archive->close() == true)
    $FILE = $ZIP;
else {
}
Am I missing something? If there's anything else I can try please let me know. Thanks.
EDIT: I've got a more specific question: Why do the temporary zip files of the affected directories all have a file size of exactly 67,108,864 bytes (64 MiB)? Is this related to a maximum file size set by PHP/the server, or can it be explained by the zip standard/the ZipArchive class?
Try closing the ZIP archive and reopening it after every 5 or so files are added to the archive. I assume this solution solves a memory limit issue.
Credit to MatthewS for this solution.
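A rough sketch of that workaround, closing and reopening the archive after every five files (the batch size comes from the suggestion above; the paths and directory name follow the question's layout):

$result['directory'] = 'directory1'; // as in the question
$ZIP = 'files/' . $result['directory'] . '.zip';

$zip_archive = new ZipArchive();
if ($zip_archive->open($ZIP, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE) {
    die('Error! Error while creating ZIP Archive.');
}

$added = 0;
foreach (new DirectoryIterator('files/' . $result['directory']) as $fileInfo) {
    if ($fileInfo->isDot()) {
        continue;
    }
    $zip_archive->addFile($fileInfo->getPathname(), $fileInfo->getFilename());
    $added++;
    if ($added % 5 === 0) {
        // Flush what has been queued so far, then reopen and keep adding.
        $zip_archive->close();
        $zip_archive->open($ZIP); // no CREATE|OVERWRITE here, so existing entries are kept
    }
}
$zip_archive->close();

Closing periodically forces ZipArchive to write the queued files to disk in smaller batches instead of holding everything until one final close().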

Unzip The Zip Archive With PHP

I have a problem extracting a zip file with PHP.
I have tried many scripts shared on the web, but none of them work.
The last script I tried is this one:
<?php
$zip = new ZipArchive;
$res = $zip->open('data.zip');
if ($res === TRUE) {
    $zip->extractTo('/extract/');
    $zip->close();
    echo 'woot!';
} else {
    echo 'doh!';
}
?>
I always end up in the else branch when the script runs.
I've tried replacing data.zip and the /extract/ path with the complete paths http://localhost/basedata/data.zip and http://localhost/basedata/extract/, but I still get the else branch. Can anyone help me?
Here is my whole script and the zip file:
http://www.mediafire.com/?c49c3xdxjlm58ey
You should check which error code open() returns (http://www.php.net/manual/en/ziparchive.open.php); that will give you some help.
The error codes are these:
ZIPARCHIVE::ER_EXISTS - 10
ZIPARCHIVE::ER_INCONS - 21
ZIPARCHIVE::ER_INVAL - 18
ZIPARCHIVE::ER_MEMORY - 14
ZIPARCHIVE::ER_NOENT - 9
ZIPARCHIVE::ER_NOZIP - 19
ZIPARCHIVE::ER_OPEN - 11
ZIPARCHIVE::ER_READ - 5
ZIPARCHIVE::ER_SEEK - 4
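A short sketch of checking that return value (data.zip is the archive from the question):

$zip = new ZipArchive();
$res = $zip->open('data.zip');
if ($res !== TRUE) {
    // $res holds one of the ZipArchive::ER_* codes listed above
    echo 'open() failed with error code ' . $res;
} else {
    $zip->extractTo('/extract/');
    $zip->close();
    echo 'woot!';
}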
ZipArchive uses file paths, not URLs.
For example, if your web server's document root is /srv/www, specify /srv/www/basedata/data.zip (or wherever the file is on the server's local file system), not http://localhost/basedata/data.zip.
If the .ZIP file is on a remote computer, change the script to download it first before extracting it, though this does not seem to be your case.
Furthermore, the user the PHP script runs as needs to have read permission for the Zip file and write permission for the destination for extracted files.
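For example, a sketch that builds file-system paths relative to the script (assuming data.zip sits next to it and a writable extract/ directory exists alongside it):

$zip = new ZipArchive();
if ($zip->open(__DIR__ . '/data.zip') === TRUE) {
    $zip->extractTo(__DIR__ . '/extract/');
    $zip->close();
    echo 'woot!';
} else {
    echo 'doh!';
}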
You are declaring your $zip (ZipArchive) incorrectly:
<?php
$zip = new ZipArchive;
is missing the parentheses:
<?php
$zip = new ZipArchive();
if ($zip->open('data.zip')) {

PHP ZipArchive fails to extract CSV files properly

A real head scratcher this one - any help would be gratefully received.
I have been using the zipArchive library to extract csv files from a zip.
Oddly, it will only extract 40 files properly. Files with an index of 40 or greater appear as empty files; files 0-39 extract perfectly.
This is the case regardless of the combination of files and the size of the files. I have tried removing the 39th file and the 40th file from the zip and the problem just moves. No matter what combination of files I use, it extracts 40 files properly and then just dies.
Thanks to this forum, I have tried using shell_exec with exactly the same outcome.
I have also tried extracting the files one at a time, using a zip with only the CSV files, and zips with multiple different file types. Always only 40 are extracted.
This is such a suspiciously round number that it must surely be a setting somewhere that I cannot find, or otherwise a bug.
For what it is worth, the unzipping code is below:
$zip = new ZipArchive;
if ($zip->open('Directory/zipname.zip') == TRUE) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $filename = $zip->getNameIndex($i);
        if (substr(strrchr($filename, '.'), 1, 3) == "csv") {
            $zip->extractTo('Directory/', $filename);
        }
    }
}
I have also tried the following which uses a different method with the same results :-(
$zip2 = new ZipArchive;
if ($zip2->open('Directory/zipname.zip') == TRUE) {
    for ($i = 0; $i < $zip2->numFiles; $i++) {
        $filename = $zip2->getNameIndex($i);
        if (substr(strrchr($filename, '.'), 1, 3) == "csv") {
            $content = $zip2->getFromIndex($i);
            $thefile = fopen("directory/filename", "w");
            fwrite($thefile, $content);
            fclose($thefile);
        }
    }
}
FINALLY found the answer. Thanks to all who tried to help.
For others suffering in the same way: the problem was solved by increasing the server disk allocation. I was on a rather old plan, which had served well until the advent of a new national database increased the amount of storage needed tenfold.
A measly 100MB allowance meant that the server would only do so much before spitting the dummy.
Interestingly, a similar problem occurred with trying other file operations - it seemed to be limited to 40 file operations per script, regardless of the size of each file.
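Since the root cause was disk space rather than code, a quick pre-flight check can save a lot of head scratching. A sketch (the 200 MB threshold is illustrative; the directory is the one used in the question's code):

// Hypothetical pre-flight check: refuse to extract if less than ~200 MB is free.
$free = disk_free_space('Directory/');
if ($free !== false && $free < 200 * 1024 * 1024) {
    die('Not enough free disk space to extract the archive.');
}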

Renaming a 900kb pdf file takes long time

I am trying to rename() a 900 KiB PDF file in PHP. It is taking a long time to rename it for some reason. I thought it should be instant.
This is on a CentOS server. While the file is being renamed I can get its properties, and it seems like rename() is copying the file and then replacing the old file with the new, renamed one.
The old name and new name paths are in the same directory.
Has anyone stumbled upon this issue before?
Code:
//If the file exists, change its name and then return the path
$pieces = explode("#", $filename);
$newName = $pieces[0] . ' ' . $pieces[2];
rename($uidPath . $filename, $uidPath . $newName);

if (preg_match('/pdf/', $pieces[2])) {
    $result['status'] = '1';
    $result['path'] = 'path to file';
} else {
    $result['status'] = '1';
    $result['path'] = 'path to file';
}
PHP is for some reason very slow to release the file lock on fclose(), so if you are writing to the file prior to moving it you might have to wait for a bit. I've had this very problem with a low-priority background job, so I didn't really look into why this happens or what I can do to prevent it - I just added a 1 second sleep between fclose() and rename().
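A sketch of that workaround (the 1-second pause is the value mentioned above; $uidPath, $filename and $newName are the variables from the question, while $contents is just an illustrative placeholder):

// Finish writing, pause briefly so PHP releases the handle, then rename.
$handle = fopen($uidPath . $filename, 'wb');
fwrite($handle, $contents); // $contents stands in for whatever was being written
fclose($handle);
sleep(1); // give PHP a moment to release the file lock before renaming
rename($uidPath . $filename, $uidPath . $newName);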
