How to stream video from a thread in PHP?

Hey, I am trying to write a web page that streams live video from my local server, and I want to use threads. I tried file_get_contents() and got gibberish rendered in my web page. This is the thread; does anybody know a different method for presenting video within the thread?
class stream extends Thread {
    public function run() {
        $file = "/var/www/html/movie.mp4"; // The media file's location
        $f = fopen($file, 'rb');           // Open the file in binary mode
        $chunkSize = 8192;                 // The size of each chunk to output
        // Start outputting the data
        while (true) {
            fpassthru($f);
            echo file_get_contents('/var/www/html/movie.mp4');
            //echo fread($f, $chunkSize);
            //$data = fread($f, $chunkSize);
            //echo $data;
            flush();
        }
    }
}
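For reference, the gibberish usually just means the raw MP4 bytes are being rendered as text, because no Content-Type header is sent before the output. A minimal sketch of serving the file in chunks with a proper video header, independent of pthreads (same file path as above; everything else is an assumption, not the asker's code):
<?php
// Sketch: send the MP4 with a video Content-Type so the browser
// plays it instead of printing raw bytes.
$file = '/var/www/html/movie.mp4';
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($file));
$f = fopen($file, 'rb');
while (!feof($f)) {
    echo fread($f, 8192); // one 8 KB chunk at a time
    flush();              // push the chunk to the client
}
fclose($f);
A <video> element pointing at such a script (or at the file directly) then handles playback on the page.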

Related

How to check if PHP download was canceled?

I need to log total downloads of a specific file. The download function is working fine, but I can't determine whether the user canceled (clicking "Cancel" on the browser dialog) or the connection was aborted later.
I understand it's not simple to know when a file download has finished, so I'm trying to get at this in two ways. Neither works:
Get the total bytes sent, to compare later against the total file size: this way the $bytes_sent var is always set to the total file size, no matter whether the user clicks the cancel button of the download dialog or cancels the download later.
Trigger the connection_aborted() function: I have not found a way to make this function fire and set my session var...
(I'm not sure whether the fact that I'm working with sessions is relevant.)
I appreciate your help :)
<?php
if (is_file($filepath)) {
    $handle = fopen($filepath, "r");
    header("Content-Type: $mime_type");
    header("Content-Length: " . filesize($filepath));
    header("Content-Disposition: attachment; filename=" . $name);
    while (!feof($handle)) {
        ignore_user_abort(true);
        set_time_limit(0);
        $data = fread($handle, filesize($filepath));
        print $data;
        $_SESSION['download'] = 'Successful download';
        // Always set to the total file length, even when a large download
        // is canceled before it finishes:
        $bytes_sent = ftell($handle);
        flush();
        ob_flush();
        // Can't trigger connection aborted, in any case:
        if (connection_aborted()) {
            $_SESSION['download'] = 'Canceled download';
        }
    }
}
PHP Version 5.3.29
You need to read the file in small chunks, rather than reading it all at once.
$chunk_size = 1000;
ignore_user_abort(true); // keep the script alive after the abort so we can record it
$canceled = false;
$bytes_sent = 0;
while ($chunk = fread($handle, $chunk_size)) {
    print $chunk;
    ob_flush();
    flush();
    $bytes_sent += strlen($chunk);
    if (connection_aborted()) {
        $canceled = true;
        break;
    }
}
$_SESSION['download'] = $canceled ? "Download canceled" : "Download successful";
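Note that connection_aborted() can only report the abort after PHP has actually tried to send output to the client, which is why each chunk is printed and flushed before the check.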

Php send 0 bytes excel file through echo

I am creating a very large table in a single string $output with data from a MySQL database. At the end of the PHP script, I send two headers:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
The problem occurs when the large table has more than about 55000 rows. When this happens, the file sent is 0 bytes and the user opens an empty file. I am sending the file after the headers like this:
echo $output;
When the table does not have too many rows, the sent file works. How can I send the file so that the size of the string $output doesn't matter?
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
Move these lines to the top of the page or script, before the HTML table; headers have to be sent before any output, and then it will start working.
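In other words (a minimal illustration, not part of the original answer), the header() calls have to run before anything is echoed:
<?php
// Headers first, before any echo/print or raw HTML output:
header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");

// ...build $output from the MySQL rows...
echo $output; // the body comes after the headers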
The solution found in another question worked for this. The file is too large to send at once, so the solution is to send it part by part to the user. Instead of using the readfile() function, use the customized readfile_chunked() function:
define('CHUNK_SIZE', 1024 * 1024); // Size (in bytes) of each chunk
// Read a file and output its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered, like readfile() does
    }
    return $status;
}
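A usage sketch under the question's setup (the temp-file step is an assumption, since readfile_chunked() reads from a file while the question builds $output in memory):
// Dump the generated table to a temp file, then stream it chunk by chunk
// instead of echoing $output in one go.
$tmp = tempnam(sys_get_temp_dir(), 'xls');
file_put_contents($tmp, $output);

header("Content-type: application/msexcel");
header("Content-Disposition: attachment; filename=action.xls");
header("Content-Length: " . filesize($tmp));

readfile_chunked($tmp);
unlink($tmp);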

Download a file with php and polymer

I'm having some trouble with this one. I have found some helpful scripts on the web and have been modifying them for my needs. However, I can't seem to download a file. It responds with the contents of the file but doesn't actually download it. I am using Polymer 1.0+ on the client side and PHP on the server side. The client-side code to download a file is as follows:
<!-- THIS IS THE HTML SIDE -->
<iron-ajax
    id="ajaxDownloadItem"
    url="../../../dropFilesBackend/index.php/main/DownloadItem"
    method="GET"
    handle-as="document"
    last-response="{{downloadResponse}}"
    on-response="ajaxDownloadItemResponse">
</iron-ajax>
// THIS IS THE JAVASCRIPT THAT CALLS THE "iron-ajax" ELEMENT
downloadItem: function(e) {
    this.$.ajaxDownloadItem.params = {
        "FILENAME": this.selectedItem.FILENAME,
        "PATH": this.folder
    };
    this.$.ajaxDownloadItem.generateRequest();
},
The server-side code is as follows (the URL is different because I do some URL rewriting to get to the correct script):
function actionDownloadItem() {
    valRequestMethodGet();
    $username = $_SESSION['USERNAME'];
    if (validateLoggedIn($username)) {
        $itemName = arrayGet($_GET, "FILENAME");
        $path = arrayGet($_GET, "PATH");
        $username = $_SESSION['USERNAME'];
        $downloadItem = CoreFilePath();
        $downloadItem .= "/" . $_SESSION['USERNAME'] . "" . $path . "" . $itemName;
        DownloadFile($downloadItem);
    } else {
        echo "Not Logged In.";
    }
}
function DownloadFile($filePath) {
    //ignore_user_abort(true);
    set_time_limit(0); // disable the time limit for this script
    //touch($filePath);
    //chmod($filePath, 0775);
    if ($fd = fopen($filePath, "r")) {
        $fsize = filesize($filePath);                // this returns 12
        $path_parts = pathinfo($filePath);           // basename = textfile.txt
        $ext = strtolower($path_parts["extension"]); // this returns txt
        $header = headerMimeType($ext);              // this returns text/plain
        header('Content-Disposition: attachment; filename="' . $path_parts["basename"] . '"'); // use 'attachment' to force a file download
        header("Content-Type: $header");
        header("Content-Length: $fsize");
        header("Cache-Control: private"); // use this to open files directly
        while (!feof($fd)) {
            $buffer = fread($fd, 2048);
            echo $buffer;
        }
        fclose($fd); // close inside the if, so we never call fclose() on false
    }
}
Any help on this one would be greatly appreciated.
First you will need a file handle:
$pathToSave = '/home/something/something.txt';
$writeHandle = fopen($pathToSave, 'wb');
Then, while you are reading the download, write to the file instead of echoing:
fwrite($writeHandle, fread($fd, 2048));
Finally, after the writing is finished, close the handle:
fclose($writeHandle);
I have left out error checking; you should implement your own.
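Put together, a sketch under the answer's assumptions (the loop mirrors the one in DownloadFile() above):
$pathToSave = '/home/something/something.txt';
$writeHandle = fopen($pathToSave, 'wb');
while (!feof($fd)) {
    // write each 2 KB chunk to disk instead of echoing it into the response
    fwrite($writeHandle, fread($fd, 2048));
}
fclose($writeHandle);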

How to handle simultaneous reading and writing

I have a server that grabs MP3 audio buffers from a source and writes them into a file via PHP. It truncates the beginning of the file so the file size never exceeds 2 MB. At the same time, a client is streaming the MP3 by seeking to the end and reading whether there is any new data. The problem is that when the file gets truncated, the position the client was reading at changes.
This is the client side that streams the audio:
$handle = fopen('cool.mp3', "r");
$err = fseek($handle, 0, SEEK_END);
while (file_exists($file_lock)) { // cool.mp3.lock means the stream is still going
    $data = fread($handle, 1024);
    echo $data;
    ob_flush();
    flush();
}
I use this on the server to write data as I get it:
$data = "audio frames....";
clearstatcache();
$file = 'cool.mp3';
if(filesize($file) > 1024*200){ //2 MB
ftruncatestart($file, 1024*25); //Trim Down By Deleting Front
}
file_put_contents($file, $data, FILE_APPEND);
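ftruncatestart() is not a PHP built-in, and the question does not show it; a minimal sketch of what such a helper could look like (an assumption, not the asker's code):
// Hypothetical helper: drop $bytes from the front of $file by rewriting it.
// Rewriting moves every remaining byte forward, so a reader holding an open
// handle keeps its old offset, which is exactly the problem described above.
function ftruncatestart($file, $bytes) {
    $contents = file_get_contents($file);
    file_put_contents($file, substr($contents, $bytes), LOCK_EX);
}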

getimagesize() limiting file size for remote URL

I could use getimagesize() to validate an image, but the problem is: what if a mischievous user puts up a link to a 10GB random file? It would whack my production server's bandwidth. How do I limit the file size getimagesize() fetches? (e.g. 5MB max image size)
PS: I did research before asking.
You can download the file separately, imposing a maximum size you wish to download:
function mygetimagesize($url, $max_size = -1)
{
    // create a temporary file to store the data from $url
    if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
        return false;
    }
    // open input and output
    if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
        unlink($tmpfname);
        return false;
    }
    // copy at most $max_size bytes
    stream_copy_to_stream($in, $out, $max_size);
    // close input and output file
    fclose($in);
    fclose($out);
    // retrieve image information
    $info = getimagesize($tmpfname);
    // get rid of the temporary file
    unlink($tmpfname);
    return $info;
}
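Usage might look like this (the URL and limit are placeholders):
// Copy at most 5 MB; getimagesize() then only ever sees the truncated file.
$info = mygetimagesize('https://example.com/photo.jpg', 5 * 1024 * 1024);
if ($info === false) {
    // download failed, or the (possibly truncated) data is not readable as an image
}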
You don't want to do something like getimagesize('http://example.com') to begin with, since this downloads the image once, checks the size, then discards the downloaded image data. That's a real waste of bandwidth.
So, separate the download process from the checking of the image size. For example, use fopen to open the image URL, read it little by little, and write it to a temporary file, keeping count of how much you have read. Once you cross 5MB and are still not finished reading, stop and reject the image.
You could try to read the HTTP Content-Length header before starting the actual download to weed out obviously large files, but you cannot rely on it, since it can be spoofed or omitted.
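For instance, a sketch of that pre-check (only useful for rejecting early, never as the real limit):
// Ask only for the headers (HEAD) so no body is transferred.
stream_context_set_default(array('http' => array('method' => 'HEAD')));
$headers = get_headers($url, 1); // 1 = return an associative array
if (isset($headers['Content-Length'])
    && (int) $headers['Content-Length'] > 5 * 1024 * 1024) {
    // claimed size exceeds the 5 MB limit; skip the download entirely
}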
Here is an example; you will need to make some changes to fit your requirements.
function getimagesize_limit($url, $limit)
{
    global $phpbb_root_path; // this snippet comes from a phpBB codebase
    $tmpfilename = tempnam($phpbb_root_path . 'store/', unique_id() . '-'); // unique_id() is a phpBB helper
    $fp = fopen($url, 'r');
    if (!$fp) return false;
    $tmpfile = fopen($tmpfilename, 'w');
    $size = 0;
    while (!feof($fp) && $size < $limit)
    {
        $content = fread($fp, 8192);
        $size += strlen($content); // count the bytes actually read, not the buffer size
        fwrite($tmpfile, $content);
    }
    fclose($fp);
    fclose($tmpfile);
    $is = getimagesize($tmpfilename);
    unlink($tmpfilename);
    return $is;
}
