Is there any on-server unzip lib for PHP?

We are using dUnzip2 in our script to unzip files before download, write a license file into them, then re-zip them with zip.lib and serve the result. But dUnzip2 iterates over the entries with:
foreach ($f as $file_row => $file)
which works fine for small files but causes memory-limit issues on files bigger than 10 MB. For large files it should use something like:
for ($n = 1; $n < count($f); $n++) {
    $file = $f[$n];
}
We have to keep increasing the memory limit on the server just for that library. The script itself is HUGE and, to be honest, I would not dare take on the task of modifying it.
So do you know any other unzip library that would do the same job as dUnzip2, or a better solution?

Why not use PHP's built-in Zip support: http://www.php.net/manual/en/zip.examples.php
This obviously assumes the extension is enabled (it usually is); run phpinfo() to check.
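For example, a minimal sketch with the built-in ZipArchive class; all paths here are placeholders, and the re-zip step just mirrors what dUnzip2 + zip.lib do in the question:

// Minimal sketch using the built-in ZipArchive extension; the paths are placeholders.
$zip = new ZipArchive;
if ($zip->open('/path/to/original.zip') === TRUE) {
    $zip->extractTo('/tmp/work/'); // unpack into a working directory
    $zip->close();
}

// ... write the license file into /tmp/work/ here ...

$out = new ZipArchive;
if ($out->open('/path/to/serve.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) === TRUE) {
    $out->addFile('/tmp/work/somefile.txt', 'somefile.txt'); // one addFile() call per file
    $out->close();
}

ZipArchive works against the archive on disk rather than holding every entry in a PHP array, which appears to be what bites dUnzip2 on the 10 MB+ files.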

Related

PHP ZipArchive extractTo() stalls with large files

I am working on a script that uses the extractTo method of ZipArchive to unzip some pretty large files (some are over 10 GB). Everything works fine for small files, but while testing with files that are ~4 GB I noticed that the unzip works up to a point and then stops actually unzipping, while the PHP script appears to keep running. No errors or exceptions are thrown. From a terminal I sit in the folder and keep typing ls -la to watch the size of the extracted file grow; it does so for a while, then stops while the script continues to load (watched via browser and via top). The script then runs for the specified timeout period (I set it to 3600) and throws a time-out error. The box is running CentOS 6.6 with 16 GB of RAM, plenty of processing power, and plenty of disc space. It seems to stop at 5011800064 bytes unzipped. Here are some select bits of my code:
set_time_limit(1200);
ini_set('memory_limit', '1024M');

$zip = new ZipArchive;
$res = $zip->open($zipPath);
if ($res === TRUE) {
    $extres = $zip->extractTo(
        $unzipPath,
        $filesToExtract
    );
    $zip->close();
}
Any help would be greatly appreciated. I am also curious to know whether the extractTo() function tries to load the whole zip into memory. I have scoured the PHP documentation and cannot find anything relevant; all of the related posts either have no answers or are not specific in their explanation of the problem.
Edit: Just to confirm, I have over 20G free and setting different memory limits for the script doesn't change the number of bytes unzipped.
Update: I have scoured httpd.conf and php.ini and cannot find any settings that would prevent the unzip operation from completing.
A traditional .zip archive is limited in size to 4GB:
The maximum size for both the archive file and the individual files inside it is 4,294,967,295 bytes (2^32 − 1 bytes, or 4 GiB minus 1 byte) for standard .ZIP, and 18,446,744,073,709,551,615 bytes (2^64 − 1 bytes, or 16 EiB minus 1 byte) for ZIP64.
To decompress larger archives that use ZIP64 in PHP, you'll need PHP 5.6, which uses libzip 0.11.2.
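If you suspect the classic 4 GiB boundary, a quick diagnostic sketch (reusing $zipPath from the question) is to list the per-entry sizes and see whether the stall lines up with an entry or offset past that limit:

// Diagnostic sketch: list entry sizes to see whether the archive or any single
// entry crosses the classic 4 GiB ZIP limit. $zipPath is the variable from above.
$zip = new ZipArchive;
if ($zip->open($zipPath) === TRUE) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $st = $zip->statIndex($i);
        printf("%s\t%d bytes uncompressed\n", $st['name'], $st['size']);
    }
    $zip->close();
}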

Zipping large directory on Windows Server using PHP

I want to zip a large folder of 50K files on Windows Server. I'm currently using this code:
include_once("CreateZipFile.inc.php");

$createZipFile = new CreateZipFile;
$directoryToZip = "repository";
$outputDir = ".";
$zipName = "CreateZipFileWithPHP.zip";

define("ZIP_DIR", 1);
if (ZIP_DIR) {
    // Code to zip a directory and all its files/subdirectories
    $createZipFile->zipDirectory($directoryToZip, $outputDir);
} else {
    // ?
}
$fd = fopen($zipName, "wb");
$out = fwrite($fd, $createZipFile->getZippedfile());
fclose($fd);
$createZipFile->forceDownload($zipName);
#unlink($zipName);
Everything works fine up to around 2K image files, but that is not enough: I need to zip at least 50K images. Beyond that, my script fails with this error:
Fatal error: Maximum execution time of 360 seconds exceeded in C:\xampp\htdocs\filemanager\CreateZipFile.inc.php on line 92
Line 92 is:
$newOffset = strlen(implode("", $this->compressedData));
I'm looking for any solution that can handle such a huge number of files. I currently use XAMPP on Windows Server 2008 Standard. Would it be possible to build the zip in smaller parts, perhaps using a system command or an external tool to pack them, and then send the result to the browser for download?
http://pastebin.com/iHfT6x69 for CreateZipFile.inc.php
Try this to increase the execution time:
ini_set('max_execution_time', 500);
500 is the number of seconds; change it to whatever you like.
Do you need a smaller file, or a quickly served file?
For fast serving without compression and without memory problems, you could try using a system command with external zip software (such as gzip) and turning the compression off.
The files would probably get huge, but they would be served fast as one file.
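If switching away from CreateZipFile is an option, a hedged alternative sketch with the built-in ZipArchive follows; it references the files on disk instead of buffering all compressed data in a PHP string, and it reuses $directoryToZip and $zipName from the question:

// Hedged sketch using the built-in ZipArchive. addFile() only records a reference;
// the files are read and compressed when close() is called, so nothing is kept in
// a growing PHP string the way CreateZipFile's $this->compressedData is.
set_time_limit(0); // no execution-time cap while building the archive

$zip = new ZipArchive;
if ($zip->open($zipName, ZipArchive::CREATE | ZipArchive::OVERWRITE) === TRUE) {
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($directoryToZip, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        if ($file->isFile()) {
            // store each file under its path relative to the zipped directory
            $local = substr($file->getPathname(), strlen($directoryToZip) + 1);
            $zip->addFile($file->getPathname(), $local);
        }
    }
    $zip->close(); // the archive is actually written to disk here
}

With tens of thousands of entries you may still need a generous max_execution_time, and you may have to close and re-open the archive every few thousand files if the process runs into the operating system's open-file limit.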

Download many files from a server in one archive with PHP

I need to build an application that lets the user download a lot of pictures from the server, selected by some criteria, as a single archive. I tried PHP with pclzip.lib and with the ZipStream library found on the web, but neither method works: with pclzip I can only compress about 3 MB, and with ZipStream the download fails in the middle at around 25 MB. I don't know how many files the user will need to download, but right now I have around 700 files totalling 75 MB. Is there another way to make this application work, or has anyone had similar problems with these libraries and solved them? Some more specifics: the remote server runs IIS with PHP. Thanks, and sorry for my English.
After some time I found a way, using
ini_set('max_execution_time', 1200);
ini_set("memory_limit","500M");
and the ZipStream library. I also changed the PHP memory limit on the server, because my file is really big.
I've used pclzip before and this code works:
if ($this->extension == 'zip') {
    $this->archive = new PclZip($orgPath);
    if ($this->archive->extract(PCLZIP_OPT_PATH, $newPath) == 0) {
        die("Error : " . $this->archive->errorInfo(true));
    }
}
But I think that's not your problem. It sounds like not enough RAM for PHP or not enough time.
If that's your problem, you can fix it with
set_time_limit(0); // 0 = no time limit
ini_set('memory_limit', '300M'); // Increases memory for PHP to 300M
That works for me on Apache (I'm not sure whether ini_set() works on your system, though).
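If it is the download itself that dies around 25 MB, one hedged approach is to build the archive on disk first (with PclZip, ZipArchive, or whatever works) and then stream it out with readfile(), which does not pull the whole file into PHP memory; $zipFile below is a placeholder:

// Hedged sketch: serve an already-built zip without buffering it in PHP memory.
$zipFile = '/tmp/pictures.zip'; // placeholder for the archive built beforehand

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="pictures.zip"');
header('Content-Length: ' . filesize($zipFile));

// Drop any output buffers so nothing accumulates before reaching the client.
while (ob_get_level() > 0) {
    ob_end_clean();
}

readfile($zipFile); // streams the file out in chunks internally
exit;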

Read content of 12000 files from another FTP server

What I would like to write: a PHP script to find a certain string in loads of files.
Is it possible to read the contents of thousands of text files on another FTP server without actually downloading them (ftp_get)?
If not, would downloading them once (skipping files that already exist, re-downloading those whose size differs) and then searching for the string
be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick and you do not need to save the file on your server.
<?php
$find = 'mytext'; // text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); // source files

foreach ($files as $file) {
    $data = file_get_contents($file);
    if (strpos($data, $find) !== FALSE) {
        echo "found in $file" . PHP_EOL;
    }
}
?>
[EDIT]: If the files are accessible only by FTP:
In that case, you have to use FTP URLs like this:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served to just download or update all of the files, then search through them for the string.
The best approach depends on how you will use it.
If you are going to be deleting the files after you have searched them, then you may want to also keep track of which ones you searched, and their file date information, so that later, when you go to search again, you won't waste time searching files that haven't changed since the last time you checked them.
When you are dealing with so many files, try to cache any information that will help your program to be more efficient next time it runs.
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents(), do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password@ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, you might want to check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if (!$dir) {
    die;
}
while (($file = readdir($dir)) !== false) {
    echo htmlspecialchars($file) . '<br />';
}
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. But if that is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client : SSH in PHP.
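For the PECL ssh2 route, a hedged sketch of reading one remote file over SFTP might look like this (host, credentials, path, and the search string are all placeholders):

// Hedged sketch using the PECL ssh2 extension; host, credentials and paths are placeholders.
$conn = ssh2_connect('example.com', 22);
if ($conn && ssh2_auth_password($conn, 'user', 'password')) {
    $sftp = ssh2_sftp($conn);
    // Read a remote file through the ssh2.sftp:// stream wrapper
    // (the SFTP resource is cast to an int to build the wrapper path).
    $data = file_get_contents('ssh2.sftp://' . intval($sftp) . '/path/to/remote/file');
    if ($data !== false && strpos($data, 'mytext') !== false) {
        echo 'found' . PHP_EOL;
    }
}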
With FTP you'll always have to download to check.
I do not know what kind of bandwidth you have or how big the files are, but this might be an interesting use case for running the job from the cloud, such as Amazon EC2 or Google Apps (if you can download the files within the time limit).
In the EC2 case you would spin up the server for an hour to check for updates in the files and shut it down again afterwards. This will cost a couple of bucks per month and saves you from potentially having to upgrade your line or hosting contract.
If this is a regular task then it might be worth using a simple queue system so you can run multiple processes at once (which will hugely increase speed). This would involve a few steps:
Get a list of all files on the remote server.
Put the list into a queue (you can use memcached for a basic message-queuing system).
Use a separate script to get the next item from the queue.
The processing script would contain simple functionality (in a do-while loop):
ftp_connect
do {
    $item     = next item from the queue
    $contents = file_get_contents($item);
    preg_match(/* pattern */, $contents);
} while (true);
ftp_close
You could then in theory fork off multiple processes through the command line without needing to worry about race conditions.
This method is probably best suited to cron/batch processing, but it might work in this situation too.
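A hedged sketch of one such worker, assuming the file URLs were pre-loaded into Memcached under keys file:0 .. file:N-1 with a 'cursor' counter seeded to 0 (all of these key names are made up for the example):

// Hedged worker sketch; assumes the queue was pre-loaded into Memcached as
// keys file:0 .. file:N-1 plus a 'cursor' counter seeded to 0.
$find = 'mytext';

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

while (true) {
    $i = $mc->increment('cursor');        // atomically claim the next index
    if ($i === false) {
        break;                            // cursor key missing, nothing to do
    }
    $url = $mc->get('file:' . ($i - 1));  // e.g. an ftp:// URL from the list
    if ($url === false) {
        break;                            // ran past the end of the list
    }
    $data = @file_get_contents($url);
    if ($data !== false && strpos($data, $find) !== false) {
        echo "found in $url" . PHP_EOL;
    }
}

Because increment() is atomic, several of these workers can run side by side without claiming the same file twice.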

How to show file download progress in PHP Shell Scripting?

I have a PHP shell script that downloads a large file. It would be a bit of a luxury to be able to see the progress of the download in the shell while it's happening; does anyone have any idea how this can be achieved (or can at least point me in the right direction)?
Thanks!
You could try polling the size of the downloaded file as it's being written and comparing it with the size of the file you requested.
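A rough sketch of that polling, run from a second process while the downloader is writing the file ($localPath and $expectedBytes are placeholders):

// Hedged polling sketch; $localPath and $expectedBytes are placeholders.
$localPath = '/tmp/feed.xml';
$expectedBytes = 123456789;

while (true) {
    clearstatcache(true, $localPath);      // filesize() results are cached otherwise
    $done = file_exists($localPath) ? filesize($localPath) : 0;
    printf("\r%6.2f%%", 100 * $done / $expectedBytes);
    if ($done >= $expectedBytes) {
        break;
    }
    sleep(1);
}
echo PHP_EOL;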
This seemed to work (unexpectedly!)
echo "wget '$feedURL'\n";
$execute = "wget -O ".$filePath." '$feedURL'\n";
$systemOutput = shell_exec($execute);
$systemOutput = str_replace( "\n", "\n\t", $systemOutput);
echo "\t$systemOutput\n";
Read the response headers to get the size of the file (if that information is available), then keep track of how much you have downloaded; that gives you your percentage.
How you do that depends on which libraries/functions you are using.
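For example, a hedged sketch that performs the download in PHP itself, takes the total from the Content-Length header, and rewrites one progress line in the shell ($feedURL and $filePath reuse the names from the wget answer above):

// Hedged sketch: chunked download with a progress line; $feedURL and $filePath
// reuse the variable names from the wget example above.
$headers = get_headers($feedURL, 1);
$total   = isset($headers['Content-Length']) ? (int) $headers['Content-Length'] : 0;

$in   = fopen($feedURL, 'rb');
$out  = fopen($filePath, 'wb');
$done = 0;

while (!feof($in)) {
    $chunk = fread($in, 8192);
    fwrite($out, $chunk);
    $done += strlen($chunk);
    if ($total > 0) {
        printf("\r%6.2f%% (%d/%d bytes)", 100 * $done / $total, $done, $total);
    }
}

fclose($in);
fclose($out);
echo PHP_EOL;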
