Amazon S3 StreamWrapper fread Issue in PHP

I am using the Amazon S3 API and have set the client to read as a stream. Using file_get_contents("s3://{bucket}/{key}") works fine for me and reads the full data for the file (I am using a video file and testing on my local system). However, I am trying to optimize the memory used by the script, and so I am trying to read and return the data chunk by chunk, as below:
$stream = @fopen("s3://{bucket}/{key}", 'r');
$buffer = 1024;
while (!feof($stream)) {
    echo @fread($stream, $buffer);
    flush();
}
This is not working on my local system. I am just wondering what the issue with this technique might be. From searching, I found that it is also a very widely used technique, so if anybody can suggest what might be wrong here, or another approach I should try, that would be very helpful. Thanks.

OK, I finally got the solution. Somehow, some other output was being added to the output buffer. I had to put:
ob_get_clean();
header('Content-Type: video/quicktime');
before streaming, to discard anything that had already been added to the buffer. Now it's working fine.
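For completeness, here is a minimal sketch of the whole working approach, assuming the AWS SDK for PHP v3 and its S3 stream wrapper ($bucket and $key are placeholders for your own values):
require 'vendor/autoload.php'; // Composer autoloader for the AWS SDK
$client = new Aws\S3\S3Client([
    'region'  => 'us-east-1', // assumption: use your bucket's region
    'version' => 'latest',
]);
$client->registerStreamWrapper(); // makes s3:// URLs readable via fopen()
ob_get_clean();                          // discard any stray buffered output
header('Content-Type: video/quicktime');
$stream = fopen("s3://{$bucket}/{$key}", 'r');
while (!feof($stream)) {
    echo fread($stream, 1024);
    flush();                             // push each chunk to the client
}
fclose($stream);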
Thanks Mark Baker for your valuable support through the debugging process.

Related

Create a zip file but provide it as a download without saving it on server [duplicate]

I am trying to generate an archive on-the-fly in PHP and send it to the user immediately (without saving it). I figured that there would be no need to create a file on disk, as the data I'm sending isn't persistent anyway; however, upon searching the web, I couldn't find out how. I also don't care about the file format.
So, the question is:
Is it possible to create and manipulate a file archive in memory within a php script without creating a tempfile along the way?
I had the same problem, but finally found a somewhat obscure solution and decided to share it here.
I came across the great zip.lib.php/unzip.lib.php scripts which come with phpMyAdmin and are located in the "libraries" directory.
Using zip.lib.php worked like a charm for me:
require_once(LIBS_DIR . 'zip.lib.php');
...
//create the zip
$zip = new zipfile();
//add files to the zip, passing file contents, not actual files
$zip->addFile($file_content, $file_name);
...
//prepare the proper content type
header("Content-type: application/octet-stream");
header("Content-Disposition: attachment; filename=my_archive.zip");
header("Content-Description: Files of an applicant");
//get the zip content and send it back to the browser
echo $zip->file();
This script allows downloading of a zip, without the need of having the files as real files or saving the zip itself as a file.
It is a shame that this functionality is not part of a more generic PHP library.
Here is a link to the zip.lib.php file from the phpmyadmin source:
https://github.com/phpmyadmin/phpmyadmin/blob/RELEASE_4_5_5_1/libraries/zip.lib.php
UPDATE:
Make sure you remove the following check from the beginning of zip.lib.php as otherwise the script just terminates:
if (! defined('PHPMYADMIN')) {
    exit;
}
UPDATE:
This code is available on the CodeIgniter project as well:
https://github.com/patricksavalle/CodeIgniter/blob/439ac3a87a448ae6c2cbae0890c9f672efcae32d/system/helpers/zip_helper.php
What are you using to generate the archive? You might be able to use the streams php://temp or php://memory to read and write to/from the archive.
See http://php.net/manual/en/wrappers.php.php
Regarding your comment that php://temp works for you except when you close it, try keeping it open, flushing the output, then rewind it back to 0 and read it.
Look here for more examples: http://us.php.net/manual/en/function.tmpfile.php
Also research output buffering and capturing: http://us.php.net/manual/en/function.ob-start.php
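As a concrete sketch of the keep-open-and-rewind pattern mentioned above ($archive_bytes is a placeholder for whatever archive data your generator produces):
$fp = fopen('php://temp', 'r+'); // memory-backed stream, spills to a temp file only if large
fwrite($fp, $archive_bytes);
rewind($fp);    // seek back to offset 0 before reading
fpassthru($fp); // send the stream contents to the output
fclose($fp);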
You need to use ZipArchive::addFromString - if you use addFile() the file is not actually added until you go to close it. (Horrible bug IMHO, what if you are trying to move files into a zip and you delete them before you close the zip...)
The addFromString() method adds it to the archive immediately.
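A minimal sketch of that approach, assuming $files is an array of name => content pairs; note that ZipArchive needs a backing path, so this uses a throwaway temp file:
$tmp = tempnam(sys_get_temp_dir(), 'zip');
$zip = new ZipArchive();
$zip->open($tmp, ZipArchive::OVERWRITE);
foreach ($files as $name => $content) {
    $zip->addFromString($name, $content); // added to the archive immediately
}
$zip->close();
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=my_archive.zip');
readfile($tmp);
unlink($tmp); // clean up the temp file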
Is there really a performance issue here, or does it just offend your sense of rightness? A lot of processes write temporary files and delete them, and often they never hit the disk due to caching.
A tempfile is automatically deleted when closed. That's its nature.
There are only two ways I can think of to create a zip file in memory and serve it, and both are probably more trouble than they are worth:
Use a RAM disk.
Modify the ZipArchive class to add a method that does everything the close() method does, except actually close the file. (Or add a leave-open parameter to close().)
This might not even be possible depending on the underlying C libraries.

Broadcast stream with PHP within localhost

Maybe I'm asking the impossible, but I want to clone a stream multiple times, a sort of multicast emulation. The idea is to write a 1300-byte buffer into a .sock file every 0.002 seconds (instead of using IP:port, to avoid the overhead) and then to read the same .sock file from other scripts multiple times.
Doing it through a regular file is not workable. It works only within the same script that generates the buffer file and then echoes it; the other scripts misread it badly.
This works perfectly in the script that generates the chunks:
$handle = @fopen($url, 'rb');
$buffer = 1300;
while (1) {
    $chunck = fread($handle, $buffer);
    $handle2 = fopen('/var/tmp/stream_chunck.tmp', 'w');
    fwrite($handle2, $chunck);
    fclose($handle2);
    readfile('/var/tmp/stream_chunck.tmp');
}
BUT the output of another script that reads the chunks:
while (1) {
    readfile('/var/tmp/stream_chunck.tmp');
}
is messy. I don't know how to synchronize the reading of the chunks, and I thought that sockets might work a miracle.
It works only within the same script that generates the buffer file and then echoes it. The other scripts will misread it badly.
Using a single file without any sort of flow control shouldn't be a problem - tail -F does just that. The disadvantage is that the data will just accumulate indefinitely on the filesystem as long as a single client has an open file handle (even if you truncate the file).
But if you're writing chunks, then write each chunk to a different file (using an atomic write mechanism, as sketched after the reader loop below), and then everyone can read it by polling for available files:
do {
    while (!file_exists("$dir/$prefix.$current_chunk")) {
        clearstatcache();
        usleep(1000);
    }
    process(file_get_contents("$dir/$prefix.$current_chunk"));
    $current_chunk++;
} while (!$finished);
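For the writer side, a hedged sketch of the atomic write mechanism mentioned above, assuming $handle is the source stream from the question: each chunk is written under a temporary name and then renamed, so readers never see a partially written chunk.
$chunk_no = 0;
while (($chunk = fread($handle, 1300)) !== false && $chunk !== '') {
    $tmp = "$dir/$prefix.tmp";
    file_put_contents($tmp, $chunk);
    rename($tmp, "$dir/$prefix.$chunk_no"); // rename() is atomic on the same filesystem
    $chunk_no++;
}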
Equally, you could do this with a database - which should have slightly lower overhead for the polling, and simplifies the garbage collection of old chunks.
But this is all about how to make your solution workable - it doesn't really address the problem you are trying to solve. If we knew what you were trying to achieve then we might be able to advise on a more appropriate solution - e.g. if it's a chat application, video broadcast, something else....
I suspect a more appropriate solution would be a multi-processing, single-memory-model server - and when we're talking about PHP (which doesn't really do threading very well) that means an event-based/asynchronous server. There's a bit more involved than simply calling socket_select(), but there are some good scripts available which do most of the complicated stuff for you.
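To illustrate the shape of such a server, here is a deliberately minimal single-process sketch built on stream_select(); the port is arbitrary and next_chunk() is a hypothetical helper returning the next 1300-byte chunk:
$server  = stream_socket_server('tcp://127.0.0.1:8100', $errno, $errstr);
$clients = [];
while (true) {
    // Wait briefly for new subscribers.
    $read   = array_merge([$server], $clients);
    $write  = null;
    $except = null;
    if (stream_select($read, $write, $except, 0, 200000) > 0 && in_array($server, $read, true)) {
        $clients[] = stream_socket_accept($server); // accept a new subscriber
    }
    // Broadcast the next chunk to every connected client.
    $chunk = next_chunk(); // hypothetical: returns the next 1300-byte chunk
    foreach ($clients as $i => $c) {
        if (@fwrite($c, $chunk) === false) {
            fclose($c);          // drop disconnected clients
            unset($clients[$i]);
        }
    }
}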

Limiting download speed in PHP web frameworks

I'm going to implement a website in PHP using a web framework. I have never implemented a website in PHP, but I don't have problems learning new languages.
What I would like to know is whether frameworks like Zend or CakePHP can be used to create a page which lets you download files at a given rate (e.g. 50 KB/s)?
Thank you.
Your server should deal with this issue, not PHP.
In case you have Apache see here:
Apache Module mod_ratelimit
Webserver Bandwidth Limiting in Apache
For Lighttpd see here:
http://www.debianhelp.co.uk/ssllighttpd.htm
TrafficShaping
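For instance, with the Apache mod_ratelimit module linked above (Apache 2.4), the limit can be set declaratively; a minimal sketch, using a placeholder path and the 50 KB/s figure from the question:
<Location "/downloads">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 50
</Location>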
As far as I know, limiting the download speed is not part of the CakePHP core, but it is very easy to implement: simply extend the MediaView class and add that feature to it.
Everything you can do with plain PHP is possible in these frameworks, too. The trick is to do it without violating the rules of the framework (e.g. the MVC pattern).
In CakePHP it is absolutely possible to create a controller action which outputs a binary file with all the needed headers. In your controller action you can then limit the download speed with standard PHP.
PHP is not a very good language for limiting download speed, in my opinion. I have never done it, but I would do it this way:
header('Content-Type: image/jpeg');
header('Content-Disposition: attachment; filename="image.jpg"');
$f = file('my_image_or_file_to_download.jpg');
foreach ($f as $line) {
    echo $line;
    flush();
    usleep(10000); // change the sleep time to adjust the download speed
}
You had better use some Apache modules if you have the option.
Sorry for my English.
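A variation on the same idea that reads fixed-size binary chunks with fread() instead of lines (a sketch with placeholder filenames; 1 KB chunks with a 20 ms pause give roughly the 50 KB/s from the question):
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="file.bin"');
$fp = fopen('file_to_download.bin', 'rb');
while (!feof($fp)) {
    echo fread($fp, 1024); // 1 KB per chunk
    flush();
    usleep(20000);         // 20 ms pause -> about 50 KB/s
}
fclose($fp);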
If the framework doesn't provide that feature, use an existing library, e.g. bandwidth-throttle/bandwidth-throttle:
use bandwidthThrottle\BandwidthThrottle;

require 'vendor/autoload.php'; // Composer autoloader

$in  = fopen(__DIR__ . "/resources/video.mpg", "r");
$out = fopen("php://output", "w");

$throttle = new BandwidthThrottle();
$throttle->setRate(100, BandwidthThrottle::KIBIBYTES); // Set limit to 100KiB/s
$throttle->throttle($out);

stream_copy_to_stream($in, $out);

Is PHP file_get_contents() enough for downloading movies?

Is file_get_contents() enough for downloading remote movie files located on a server?
I just think that perhaps storing large movie files in a string is harmful, according to the PHP docs.
Or do I need to use cURL? I don't know cURL.
UPDATE: these are big movie files, around 200 MB each.
file_get_contents() is a problem because it's going to load the entire file into memory in one go. If you have enough memory to support the operation (taking into account that if this is a web server, you may have multiple hits that generate this behavior simultaneously, and therefore each need that much memory), then file_get_contents() should be fine. However, it's not the right way to do it - you should use a library specifically intended for these sort of operations. As mentioned by others, cURL will do the trick, or wget. You might also have good luck using fopen('http://someurl', 'r') and reading blocks from the file and then dumping them straight to a local file that's been opened for write privileges.
As @mopoke suggested, it could depend on the size of the file. For a small movie it may suffice. In general, though, I think cURL would be a better fit. You have much more flexibility with it than with file_get_contents().
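For illustration, a minimal sketch of the cURL approach (URL and filename are placeholders), streaming the response straight into a local file so it never has to fit in memory:
$fp = fopen('test.avi', 'wb');
$ch = curl_init('http://somewhere/test.avi');
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body directly to $fp
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_exec($ch);
curl_close($ch);
fclose($fp);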
For the best performance you may find it makes sense to just use a standard Unix utility like wget. You should be able to call it with system("wget ...") or exec().
http://www.php.net/manual/en/function.system.php
You can read a few bytes at a time using fread():
$src = "http://somewhere/test.avi";
$dst = "test.avi";
$f = fopen($src, 'rb');
$o = fopen($dst, 'wb');
while (!feof($f)) {
    if (fwrite($o, fread($f, 2048)) === FALSE) {
        return 1;
    }
}
fclose($f);
fclose($o);

