Generate a tar.gz stream on the fly with PHP

I want to send a large packed file to a browser without temporarily storing the file on the local disk.
I want something like this:
set_time_limit(0);
ignore_user_abort(true);
header(...);
$streamer = new TarGZStream(PHP_STDOUT);
for ($i = 0; $i < 100 && connection_status() == CONNECTION_NORMAL; $i++) {
    $fileContent = generateFileContent($i);
    ...
    $streamer->appendContent($fileContent, $filename1);
    $fileHandle = getAFileHandle($i);
    ...
    $streamer->appendFile($fileHandle, $filename2);
}
cleanupFoo();
What I need:
Streaming a tar.gz (or ZIP) to a stream handle or to stdout
Adding some data from a stream handle
Adding some data from an existing local file
How can I implement the above scenario? Which classes can be used? Is there a library for that?
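For illustration, here is a minimal sketch of how such a streamer can be built without any library, using PHP's incremental zlib API (deflate_init/deflate_add, PHP >= 7.0) and hand-built ustar headers. tarHeader(), emit() and addString() are helpers invented for this sketch, and generateFileContent() is the function from the pseudocode above:
<?php
// Build a 512-byte ustar header for a regular file.
function tarHeader($name, $size)
{
    $header = pack(
        'a100a8a8a8a12a12a8a1a100a6a2a32a32a8a8a155a12',
        $name,                     // file name
        sprintf('%07o', 0644),     // mode
        sprintf('%07o', 0),        // uid
        sprintf('%07o', 0),        // gid
        sprintf('%011o', $size),   // size, octal
        sprintf('%011o', time()),  // mtime, octal
        '        ',                // checksum placeholder: 8 spaces
        '0',                       // typeflag: regular file
        '', 'ustar', '00',         // linkname, magic, version
        '', '', '', '', '', ''     // uname, gname, devmajor, devminor, prefix, pad
    );
    // The checksum is the byte sum of the header with the checksum field
    // counted as spaces, stored as 6 octal digits + NUL + space.
    $sum = array_sum(unpack('C*', $header));
    return substr_replace($header, sprintf('%06o', $sum) . "\0 ", 148, 8);
}
// Compress a chunk and push it to the client immediately.
function emit($gz, $bytes)
{
    echo deflate_add($gz, $bytes, ZLIB_SYNC_FLUSH);
    flush();
}
// Append one in-memory "file": header, content, padding to a 512-byte boundary.
function addString($gz, $name, $content)
{
    emit($gz, tarHeader($name, strlen($content)));
    emit($gz, $content . str_repeat("\0", (512 - strlen($content) % 512) % 512));
}
set_time_limit(0);
ignore_user_abort(true);
header('Content-Type: application/gzip');
header('Content-Disposition: attachment; filename="archive.tar.gz"');
$gz = deflate_init(ZLIB_ENCODING_GZIP);
for ($i = 0; $i < 100 && connection_status() == CONNECTION_NORMAL; $i++) {
    addString($gz, "file$i.txt", generateFileContent($i));
}
// End-of-archive marker: two 512-byte zero blocks, then finalize the gzip stream.
echo deflate_add($gz, str_repeat("\0", 1024), ZLIB_FINISH);
An appendFile() equivalent would fread() the handle in chunks and emit() each one, padding at the end. For ZIP output, an existing streaming library such as maennchen/ZipStream-PHP already offers this kind of API, so you do not have to get the header details right yourself.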

Related

(PHP) Unzip and rename CSV files?

I want to download different feeds from some publishers. The problem is that, first, they are gzipped (.gz) and, second, they are not in the right format. You can download one of the feeds and check it out. They do not have any file extension, so I'm forced to add the .csv myself.
My question now is: how can I unzip those files from the different URLs?
I already know how to rename them, but how do I unzip them?
I already searched for it and found this one:
//This input should be from somewhere else, hard-coded in this example
$file_name = '2013-07-16.dump.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
    // Read buffer-size bytes
    // Both fwrite and gzread are binary-safe
    fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
But with those feeds it doesn't work...
Here are two example files: file one | file two
Do you have an idea? - Would be very grateful!
Greetings!
This works on Windows 10 + PHP 7.1.4.
The following code has the same effect.
ob_start();
readgzfile($file_name);
file_put_contents($out_file_name, ob_get_contents());
ob_end_clean();
Or you can try the gzip command-line tool to decompress the file and then process it; see PHP's Program execution Functions.
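A minimal sketch of that approach; the paths and the .csv rename are assumptions based on the question, and gzip's -k flag (keep the original file) needs gzip >= 1.6:
$src = '2013-07-16.dump.gz';
// Decompress with the system's gzip: -d decompress, -k keep the .gz
system('gzip -dk ' . escapeshellarg($src));
// The feed has no extension, so add .csv ourselves
rename('2013-07-16.dump', '2013-07-16.csv');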

Decompressing an LZO stream in PHP

I have a number of LZO-compressed log files on Amazon S3, which I want to read from PHP. The AWS SDK provides a nice StreamWrapper for reading these files efficiently, but since the files are compressed, I need to decompress the content before I can process it.
I have installed the PHP-LZO extension which allows me to do lzo_decompress($data), but since I'm dealing with a stream rather than the full file contents, I assume I'll need to consume the string one LZO compressed block at a time. In other words, I want to do something like:
$s3 = S3Client::factory( $myAwsCredentials );
$s3->registerStreamWrapper();
$stream = fopen("s3://my_bucket/my_logfile", 'r');
$compressed_data = '';
while (!feof($stream)) {
    $compressed_data .= fread($stream, 1024);
    // TODO: determine if we have a full LZO block yet
    if (contains_full_lzo_block($compressed_data)) {
        // TODO: extract the LZO block
        $lzo_block = get_lzo_block($compressed_data);
        $input = lzo_decompress($lzo_block);
        // ...... and do stuff to the decompressed input
    }
}
fclose($stream);
The two TODOs are where I'm unsure what to do:
Inspecting the data stream to determine whether I have a full LZO block yet
Extracting this block for decompression
Since the compression was done by Amazon (s3distCp) I don't have control over the block size, so I'll probably need to inspect the incoming stream to determine how big the blocks are -- is this a correct assumption?
(ideally, I'd use a custom StreamFilter directly on the stream, but I haven't been able to find anyone who has done that before)
OK, executing a command via PHP can be done in many different ways, for example:
$command = 'gunzip -c ' . escapeshellarg('/path/src') . ' > ' . escapeshellarg('/path/dest');
system($command);
(Note that gunzip -c writes to stdout, so the output has to be redirected; escapeshellcmd would escape the >, which is why escapeshellarg is used on the paths instead.)
or also
shell_exec('gunzip -c /path/src > /path/dest');
will do the job.
Now it's a matter of what command to execute. Under Linux there's a nice command-line tool called lzop which extracts or compresses .lzo files.
You can use it via something like:
lzop -dN sources.lzo
So your final code might be something as easy as the following, once the object is on the local disk (lzop itself cannot read s3:// URLs):
shell_exec('lzop -dN /tmp/my_logfile.lzo');
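A hedged sketch of the full round trip, reusing the AWS stream wrapper from the question to stage the object locally; the bucket, key, and temp path are placeholders:
$s3 = S3Client::factory($myAwsCredentials);
$s3->registerStreamWrapper();
// Stage the S3 object on the local disk, since lzop reads local files
$tmp = '/tmp/my_logfile.lzo';
$in  = fopen('s3://my_bucket/my_logfile', 'r');
$out = fopen($tmp, 'w');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
// -d decompress, -N restore the name stored in the archive
shell_exec('lzop -dN ' . escapeshellarg($tmp));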

stream audio/video files from gridFS on the browser

I have been trying to read an audio file from MongoDB which I have stored using GridFS. I can download the file to the system and play it from there, but I want to stream those audio/video files straight from the DB and play them in the browser. Is there any way to do that without downloading the file to the system first? Any help would be good.
The PHP GridFS support has a MongoGridFSFile::getResource() function that gives you the stream as a resource, so the whole file is never loaded into memory. Combined with fread/echo or stream_copy_to_stream you can send it to the client piece by piece. With stream_copy_to_stream, you can simply copy the GridFSFile stream's resource to the STDOUT stream:
<?php
$m = new MongoClient;
$images = $m->my_db->getGridFS('images');
$image = $images->findOne('mongo.png');
header('Content-type: image/png;');
$stream = $image->getResource();
stream_copy_to_stream( $stream, STDOUT );
?>
Alternatively, you can use fseek() on the returned $stream resource to only send back parts of the stream to the client. Combined with HTTP Range requests, you can do this pretty efficiently.
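A rough sketch of that idea, assuming the $image and $stream variables from the snippet above and handling only a single, simple "bytes=start-end" range header:
$size = $image->getSize();
if (isset($_SERVER['HTTP_RANGE'])
        && preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    $end = ($m[2] !== '') ? (int) $m[2] : $size - 1;
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
    header('Content-Length: ' . ($end - $start + 1));
    fseek($stream, $start);             // jump to the requested offset
    echo fread($stream, $end - $start + 1);
} else {
    header('Content-Length: ' . $size);
    stream_copy_to_stream($stream, STDOUT);
}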
If the STDOUT-based recipe fails, for example with NginX and php-fpm because STDOUT is not available under FPM, you can use
fpassthru($stream);
instead of
stream_copy_to_stream( $stream, STDOUT );
So a complete solution looks like:
function img($nr)
{
    $mongo = new MongoClient();
    $img = $mongo->ai->getGridFS('img')->findOne(array('metadata.nr' => $nr));
    if (!$img) {
        err("not found"); // err() is the application's own error helper
    }
    header('X-Accel-Buffering: no');
    header("Content-type: " . $img->file["contentType"]);
    header("Content-length: " . $img->getSize());
    fpassthru($img->getResource());
    exit(0);
}
FYI, in this example:
The file is not accessed by its filename; instead it is looked up by a number stored in the metadata. Hint: you can set a unique index to ensure that no number is used twice (a one-liner for this follows the list).
The Content-Type is read from GridFS too, so you do not need to hardcode it.
NginX caching is switched off to enable streaming.
This way you can even handle other things like video or HTML pages. If you want to enable NginX caching, perhaps only output X-Accel-Buffering for bigger sizes.
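With the legacy driver used above, that unique index could be created like this (img.files is the files collection GridFS creates for the 'img' prefix):
$mongo->ai->selectCollection('img.files')->ensureIndex(
    array('metadata.nr' => 1),   // index the lookup number
    array('unique' => true)      // reject a second file with the same number
);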

Secure delete with PHP 5.3.x

Does anyone know a good PHP solution to delete, or better, wipe a file from a Linux system?
Scenario:
The file is encrypted and saved; when a download is requested, the file is copied to a temporary folder and decrypted. This is already working.
But how do I remove the file from the temporary location after sending it to the user?
In my mind I have the following options:
Open the file via "fopen" and write 0s and 1s into it (probably very slow)
Save the file to Memcache instead of the hard disk (could be a problem with my hoster)
Use some 3rd-party tool on the command line or as a cronjob (could be a problem to install)
Call "shred" via exec/system/passthru
Goal: delete the file from the hard disk without the possibility of recovery (wipe/overwrite).
Arguably the best approach is never to save the file in its decrypted state in the first place.
Rather, use stream filters to decrypt it on the fly and send it directly to the end user.
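On PHP 5.3 that could look roughly like this, using the mcrypt stream filters; the algorithm ('rijndael-128'), mode, $key and $iv are assumptions and must match how the file was encrypted:
$fh = fopen('/path/to/encrypted/file', 'rb');
// Decrypt transparently while reading; the plaintext never touches the disk
stream_filter_append($fh, 'mdecrypt.rijndael-128', STREAM_FILTER_READ,
    array('key' => $key, 'iv' => $iv, 'mode' => 'cbc'));
header('Content-Type: application/octet-stream');
fpassthru($fh); // decrypted bytes go straight to the client
fclose($fh);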
Update
Your option 1 is actually not too bad if you consider this code:
$filename = '/path/to/file';
$size = filesize($filename);
$src = fopen('/dev/zero', 'rb');
// 'r+b' overwrites the existing blocks in place; 'wb' would truncate
// the file first, and the zeros could land on freshly allocated blocks
$dest = fopen($filename, 'r+b');
stream_copy_to_stream($src, $dest, $size);
fclose($src);
fclose($dest);
You could choose /dev/urandom as well, but that will be slow.
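The "shred" option from the question's list is also a one-liner; -u unlinks the file after overwriting it:
exec('shred -u ' . escapeshellarg($filename));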

How can I unzip a .gz file with PHP?

I'm using CodeIgniter and I can't figure out how to unzip files!
PHP itself has a number of functions for dealing with gzip files.
If you want to create a new, uncompressed file, it would be something like this.
Note: This doesn't check if the target file exists first, doesn't delete the input file, or do any error checking. You really should fix those before using this in production code.
// This input should be from somewhere else, hard-coded in this example
$file_name = 'file.txt.gz';
// Raising this value may increase performance
$buffer_size = 4096; // read 4kb at a time
$out_file_name = str_replace('.gz', '', $file_name);
// Open our files (in binary mode)
$file = gzopen($file_name, 'rb');
$out_file = fopen($out_file_name, 'wb');
// Keep repeating until the end of the input file
while (!gzeof($file)) {
    // Read buffer-size bytes
    // Both fwrite and gzread are binary-safe
    fwrite($out_file, gzread($file, $buffer_size));
}
// Files are done, close files
fclose($out_file);
gzclose($file);
Note: This deals with gzip only. It doesn't deal with tar.
gzopen is way too much work. This is more intuitive:
$zipped = file_get_contents("foo.gz");
$unzipped = gzdecode($zipped);
This also works on HTTP responses when the server is sending gzipped data.
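For example (the URL is a placeholder; note that the http:// wrapper does not decode a gzipped body for you):
$ctx = stream_context_create(array('http' => array('header' => "Accept-Encoding: gzip\r\n")));
$raw = file_get_contents('http://example.com/data.gz', false, $ctx);
$unzipped = gzdecode($raw);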
If you have access to system():
system("gunzip file.sql.gz");
Use the functions implemented by the Zlib Compression extension.
This snippet shows how to use some of the functions made available from the extension:
// open file for reading
$zp = gzopen($filename, "r");
// read 3 chars
echo gzread($zp, 3);
// output until end of the file and close it.
gzpassthru($zp);
gzclose($zp);
Download the Unzip library, then include or autoload it:
$this->load->library('unzip');
