Opening MP4 via PHP results in full download before playback

I'm trying to feed an MP4 file to the Flash player via PHP, and the video downloads completely before playback starts.
$src = '/var/www/user/data/www/domain.com/video.mp4';
if (file_exists($src) && is_readable($src)) {
    header('Content-Type: video/mp4');
    header('Content-Length: ' . filesize($src));
    readfile($src);
} else {
    die('error');
}
I've tried curl with similar results. Any ideas what's causing this delay?

Most likely your Flash player is hoping you'll handle HTTP Range requests so it can get started faster on the playback.
The HTML5/Flash audio player jPlayer has a section in their developer guide about this. Scroll to the part about Byte-Range Requests:
Your server must enable Range requests. This is easy to check for by seeing if your server's response includes the Accept-Ranges header.
Also note that they offer a PHP solution for handling Range requests if you have to use PHP instead of a direct download.
smartReadFile.php
https://groups.google.com/forum/#!msg/jplayer/nSM2UmnSKKA/bC-l3k0pCPMJ
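For reference, here is a minimal sketch of what honoring a single byte range can look like in PHP (a simplified take on what smartReadFile.php does; it handles only one bytes=start-end range and skips validation of out-of-range values):
<?php
$src  = '/var/www/user/data/www/domain.com/video.mp4';
$size = filesize($src);
$start = 0;
$end   = $size - 1;

header('Accept-Ranges: bytes');
header('Content-Type: video/mp4');

if (isset($_SERVER['HTTP_RANGE']) && preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    // Serve only the requested slice with a 206 response
    $start = (int)$m[1];
    if ($m[2] !== '') {
        $end = (int)$m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Length: ' . ($end - $start + 1));

// Stream the requested slice in small chunks
$fp = fopen($src, 'rb');
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $chunk = fread($fp, min(8192, $remaining));
    echo $chunk;
    flush();
    $remaining -= strlen($chunk);
}
fclose($fp);
exit;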

Another option would be to have Apache send the file itself, instead of reading it in PHP and dumping it to the output, by using X-Sendfile.
First make sure the mod_xsendfile module is installed and enabled, then alter your output code to be:
$filename = basename($src); // name shown to the client
header('X-Sendfile: ' . $src);
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="' . $filename . '"');
exit;
This is normally faster than doing it via PHP.
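For completeness, the Apache side needs the mod_xsendfile directives enabled. A minimal sketch, with an example path (adjust to your own document root):
# In the vhost or .htaccess
XSendFile On
XSendFilePath /var/www/user/data/www/domain.com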

Related

Is readfile the best solution to download external files?

I need to fetch a remote file and serve it to the user without saving it to my server's disk (to hide the original URL). I've found a lot of posts about downloading external files with various functions like file_get_contents or readfile. This is what I'm currently using:
function startDownload($url){
    if ($this->url_exists($url)) {
        // get filename from url
        $name = $this->getFileName($url);
        // flush and then clear any existing output buffer
        ob_end_flush();
        ob_clean();
        // set headers
        header('Content-Type: application/octet-stream');
        header('Content-Transfer-Encoding: Binary');
        header('Content-Disposition: attachment; filename="' . $name . '"');
        // send file to client; the exit afterwards is important
        readfile($url);
        exit;
    } else {
        JFactory::getApplication()->enqueueMessage(JText::_('URL_NOT_FOUND'), 'error');
    }
}
And that works, but there is a problem: for a 200 MB file it takes ~10 seconds before the download starts in the client's browser. I think that's because readfile first downloads the whole file into my server's buffer and only then hands it to the user. Is that right?
And is it possible to make it faster? For example, can the download start before the fetch has finished, or is that technically impossible?
In fact I don't know whether this method is optimised or not. Any technical advice would be appreciated.
Note:
I know this function should be changed for big files; that's not my concern right now.
I'm considering buying an external server in the same datacenter to make this download faster.
The goal is for the [file server] to be separate from the [online shop].
I tested the curl method mentioned by @LawrenceCherone. It worked nicely, but when I moved it into my project the result was the same as readfile (a white screen for a few seconds).
So I suspected the readfile() function. I moved my previous code into a standalone PHP file, and the result was surprising: the download started immediately.
So my guess wasn't right, and the problem was not related to the readfile function.
After a little searching I found a minor modification. I added the line below:
while (ob_get_level()) ob_end_clean();
before the:
readfile($url);
And now the download starts before the whole file has been fetched on my server.
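Since curl was already tested here, below is a minimal sketch of streaming a remote file to the client as it arrives, using a curl write callback instead of readfile. The URL and filename are placeholders:
<?php
$url  = 'https://files.example.com/big-file.zip'; // hypothetical source
$name = 'big-file.zip';

// make sure nothing is buffered between us and the client
while (ob_get_level()) ob_end_clean();

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
// echo each chunk as soon as curl receives it, so the client
// starts downloading before the remote fetch has finished
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
    echo $chunk;
    flush();
    return strlen($chunk); // tell curl the chunk was handled
});
curl_exec($ch);
curl_close($ch);
exit;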

Problems embedding PDF file sent with XSendFile in a webpage

I'd like to embed a PDF file in a webpage. I need to produce the PDF dynamically so I can authenticate the user first, so I'm using XSendFile on Apache. The PHP file I have works fine when I visit it directly in a browser, with the PDF file immediately offered for download. Here is the code I'm using (courtesy of http://www.brighterlamp.com/2010/10/send-files-faster-better-with-php-mod_xsendfile/):
// Get a list of loaded Apache modules
$modules = apache_get_modules();

if (in_array('mod_xsendfile', $modules)) {
    // Use XSendFile if possible
    header('X-Sendfile: ' . $pathToFile);
    header('Content-Type: ' . $documentMIME);
    header('Content-Disposition: attachment; filename="' . $actualFilename . '"');
    exit;
} else {
    // Otherwise, use the traditional PHP way..
    header('Content-Type: ' . $documentMIME);
    header('Content-Disposition: attachment; filename="' . $actualFilename . '"');
    #ob_end_clean();
    #ob_end_flush();
    readfile($pathToFile);
    exit;
}
So far so good. Now I want to embed this PDF in a webpage using an object tag, e.g.:
<object data="dynamicpdf.php" type="application/pdf">
    <p>PDF embed failed</p>
</object>
But this fails. If I switch the data attribute to a static PDF file then it works fine.
Any ideas what is going wrong?
Is iframing the PDF an option for you?
Like <iframe src="dynamicpdf.php">
The Content-Disposition: attachment header forces the download. Remove it (or switch it to inline) if you want the PDF rendered in the page.
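A minimal sketch of the header pair for the embedded case (the filename is just an example):
header('Content-Type: application/pdf');
// inline asks the browser to render the PDF instead of downloading it
header('Content-Disposition: inline; filename="report.pdf"');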
General advice:
I would not use functions like apache_get_modules that assume a specific webserver environment.
What if you switch away from mod_php or Apache in the future? Your code will break.
Instead, I would deliver the file in a streamed PHP response, which is more memory-efficient than output-buffering the whole PDF into RAM and then sending it.
By streaming the PDF out with PHP you would also have only one implementation, and it would be about as fast as X-Sendfile:
Streaming a large file using PHP
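A minimal sketch of that kind of streamed delivery, reading the file in fixed-size chunks rather than loading it all into memory ($pathToFile, $documentMIME and $actualFilename as in the question's code):
header('Content-Type: ' . $documentMIME);
header('Content-Disposition: inline; filename="' . $actualFilename . '"');
header('Content-Length: ' . filesize($pathToFile));

// stream the file in 8 KB chunks so memory use stays flat
$fp = fopen($pathToFile, 'rb');
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush();
}
fclose($fp);
exit;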

PDF Generation Results in ERR_INVALID_RESPONSE in Chrome

When generating a PDF programmatically (via PHP), the rendered PDF displays fine in both Firefox and Safari, but Chrome returns an ERR_INVALID_RESPONSE. It is a valid PDF: it can be opened locally with Adobe Reader/Preview once saved from one of the working browsers, and will even open in Chrome once the PDF has been saved from a different browser.
The PDF file is being read through file_get_contents(), given a current timestamp and then passed to the browser. A workaround would involve saving the file to a temporary spot and redirecting the user (for Chrome, at least), but this is not ideal.
I've researched it and have only been able to find bug reports dating from 2008.
I have an inkling it's a header error. After the PDF is generated, the following headers are sent to the browser (again working fine in FF, Safari and IE):
header('Content-type:application/pdf');
header("HTTP/1.1 200 OK");
I've also tried adding the following headers after searching on Stack Overflow, but to no avail:
header("Content-Transfer-Encoding: binary");
header('Accept-Ranges: bytes');
Are there missing headers that Chrome requires? Does anyone have experience with getting dynamically generated PDFs to display in Chrome?
EDIT: One of my more salient questions is what could cause this to work fine locally in Chrome but not in a server environment.
In my case I had to add these two parameters to the header call because WordPress was sending a 404 code, as it didn't recognize the URL of my PHP function:
header("Content-type: application/pdf", true, 200);
as stated in this answer on wordpress.stackexchange.
This forces the header to replace (2nd param, true) the 404 status code generated by WordPress, which does not recognize the custom URL, and to set 200 OK (3rd param, 200).
So it ended up being something like this:
$pdf_name = "test.pdf";
$pdf_file = "/absolute/path/to/my/pdfs/on/my/server/{$pdf_name}";
header('Content-type: application/pdf',true,200);
header("Content-Disposition: attachment; filename={$pdf_name}");
header('Cache-Control: public');
readfile($pdf_file);
exit();
Try this:
<?php
$filename = '/physical/path/to/file.pdf';

header('Content-Type: application/pdf');
// display the PDF inline under the file's own name
header('Content-Disposition: inline; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($filename));

readfile($filename);
exit;
?>
Make sure the header code above is called before any output of the PHP script is sent to the browser.
I want to thank everyone for their answers.
It turns out this was not related to the headers. After attempting to change/remove headers in various ways (detecting encoding, trying with and without Content-Length, etc.) we decided to dig deeper into the httpd logs to see if anything was resolving differently for Chrome.
It turns out that mod_sec on our server was flagging the request (only from Chrome, for some reason) as an attempted file injection attack and was returning a 403 Forbidden response. Chrome displayed this as the ERR_INVALID_RESPONSE rather than a 403.
The hostname of the CDN was being passed in the request (we had ample checking at the endpoint to ensure the file was indeed an allowed resource), which is what tripped the rule; we now build the URL out on the server instead.

How to force download a file in PHP with multi-parallel download? [duplicate]

I'm trying to download multiple files using header() in a while loop, but only one file gets downloaded. Why?
while ($row = mysql_fetch_array($sql)) {
    header('Content-Type: text/x-vcard');
    header('Content-Disposition: attachment; filename=' . $row['name'] . '.vcf');
}
You can only transfer one file from the server side at a time. Typical workarounds are:
tar/zip them up into one file on the server side.
use JavaScript to window.open multiple files for download.
This is not possible. The HTTP protocol does not have support for downloading multiple files. The most common workaround is to put the files in a zip archive for the client to download.
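A minimal sketch of that zip-archive workaround, assuming the query returns a name column and a hypothetical vcard column holding each card's contents:
<?php
$zipPath = tempnam(sys_get_temp_dir(), 'vcf');

$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

// add one .vcf entry per row
while ($row = mysql_fetch_array($sql)) {
    $zip->addFromString($row['name'] . '.vcf', $row['vcard']);
}
$zip->close();

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="contacts.zip"');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
unlink($zipPath); // clean up the temporary archive
exit;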
Headers can only be set once, before any data is output. On the first loop iteration you set the headers; once output has been sent, you cannot set headers again on later iterations.
Please read the header() documentation on php.net.

How to display an image returned as a bytes in browser without prompting download?

I have written the following PHP function but still get the prompt to download the file:
function navigateToBytes($contentType, $bytes){
    header('Content-Type: ' . $contentType);
    //header('Content-Transfer-Encoding: binary'); // UPDATE: as pointed out this is not needed, though it does not solve the problem
    header('Content-Length: ' . strlen($bytes));

    ob_clean();
    flush();

    echo $bytes;
}
An example of calling the function:
navigateToBytes('image/jpeg', $bytes); // UPDATE: turns out this does work, using image/tiff for tiff images is when the browser does not display the image
where $bytes are the bytes as read from the file.
Apologies all - it turns out I was having the problem because the images I was testing with were TIFFs (with the Content-Type correctly set to image/tiff); when I used a JPEG the browser displayed the image!
Ultimately it is up to the browser to decide whether it can display the Content-Type you are sending.
For the record, the only header I needed to change was Content-Type; I should set Content-Length too, unless I set Transfer-Encoding: chunked.
Try the HTTP header "Content-Disposition: inline"; however, some browsers may try to save the user from seeing binary data. Here is a random blog article on that HTTP header:
http://dotanything.wordpress.com/2008/05/30/content-disposition-attachment-vs-inline/
That seems like correct behavior to me. The browser is a viewport for humans to view things in. Humans, by and large, don't want to view binary data. What do you think should happen?
Random Advice: If there's a site that's doing what you want to do, use curl to sniff the headers they're sending.
curl -I http://example.com/path/to/binary/file/that/displays/in/browser
and then use the exact same headers in your own script.
As a start, get rid of things that do not exist in HTTP (Content-Transfer-Encoding).
Then get an HTTP tracing tool, such as the Live HTTP headers plugin for Firefox, and compare "your" headers with those received for a working image.
In doubt, post the HTTP trace here.
