How to Increase Speed of Downloading an MP3 - PHP

I'm using the following code to force download some mp3 files that are stored on my server. This works fine, but it takes over 1 minute to download one mp3 file, even for a file that is 2.5 MB. Something seems wrong for it to take that long. Any ideas what I can do to make this download a lot faster?
$fullfilename=$_GET['file'];
$filename=basename($fullfilename);
header('Content-type: audio/mpeg');
header("Content-Disposition: attachment; filename=\"{$filename}\"");
header("Content-Length: " . filesize($filename));
header('Content-Transfer-Encoding: binary');
ob_clean();
flush();
readfile($fullfilename);
exit;

It depends on the internet connection between the server and your browser. PHP cannot do anything about it.
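That said, if the bottleneck is actually output buffering or compression on the server rather than the network, streaming the file in chunks and flushing each chunk can help rule that out. A rough sketch, assuming the requested file name has already been validated against a whitelist (that check is left out here):
$fullfilename = $_GET['file']; // should be validated before use
$filename = basename($fullfilename);
header('Content-Type: audio/mpeg');
header("Content-Disposition: attachment; filename=\"{$filename}\"");
header('Content-Length: ' . filesize($fullfilename));
// discard any open output buffers so each chunk goes out as soon as it is read
while (ob_get_level()) { ob_end_clean(); }
$handle = fopen($fullfilename, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush(); // push the chunk to the client immediately
}
fclose($handle);
exit;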

Related

PHP: How to start serving a file that is being written, before it is completely written?

I want to "pipe" a file through PHP.
My PHP script calls a bash script through "shell_exec". The bash script downloads a file and prints its data. The PHP script then continues to send the file as response:
if (!file_exists("./$filename")) shell_exec("./get-file $file_id \"$filename\"");
if (file_exists($filename)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($filename));
    readfile($filename);
    unlink($filename);
    exit;
}
This works okay. But is there some way to "pipe" the file while it's being downloaded? Let's say the file is 500 MB. Instead of waiting for the file to finish downloading on the server and then serving it to the client, is there some way to send the content as it's being downloaded? The reason is obvious: I want to minimize the total time it takes for the whole process to complete. If what I ask is possible, it could cut the total time roughly in half. Thank you.
Edit: I found this, but the solution is based on curl. In my case the file is being downloaded by an external script, so I want to run a loop in which the PHP script checks whether new bytes have been written to the file and sends them to the client without closing the connection. Note that the file is not text-based; it's an audio file.
Edit: Now I'm closer: I modified the external script to output the file on stdout, effectively feeding its data to shell_exec. Now, how can I send data before shell_exec has finished? I guess I'll have to use something like ob_flush(), but some help would be appreciated. Thanks again.
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
// This sends the data AFTER the exec finishes. I want to send them as they arrive, real-time
echo shell_exec("get-file $file_id");
die();
OK, I found the answer in other posts.
I just use passthru() instead of shell_exec() and it seems to be working great.
Thanks, everybody! <3
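For reference, a minimal sketch of the passthru() variant, assuming get-file writes the raw file to stdout and $file_id has already been sanitised:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
// passthru() forwards the command's stdout to the client chunk by chunk,
// instead of collecting all of it into a string the way shell_exec() does
passthru('./get-file ' . escapeshellarg($file_id));
exit;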

PHP script only partial image download

When a user runs this PHP script, it is supposed to just download the specified image file, but for some reason it only downloads it partially. It starts the download, then stalls at 1,083 KB / 1,382 KB. The download stops after a few seconds, and when I open the image, the bottom portion is grey/missing. This script worked fine on my old server, but on the new server it won't. I tried changing PHP settings and versions, but no luck.
<?php
if (file_exists("test-download-image.jpg")) {
header("Pragma: no-cache");
header('Expires: 0');
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="test-download-image.jpg"');
header('Content-Length: ' . filesize("test-download-image.jpg"));
readfile("test-download-image.jpg");
}
?>
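For what it's worth, a common thing to check in this situation is whether an output buffer or compression layer on the new server is truncating the response. A hedged sketch that discards any open buffers before sending the binary data (not confirmed as the fix here):
<?php
$file = "test-download-image.jpg";
if (file_exists($file)) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    header('Content-Length: ' . filesize($file));
    // discard any buffered output (stray whitespace, notices, ...) before the image bytes
    while (ob_get_level()) { ob_end_clean(); }
    readfile($file);
    exit;
}
?>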

One Time Download Links and large files

I'm using this script:
http://www.webvamp.co.uk/blog/coding/creating-one-time-download-links/
to allow users to download files (one time). Everything works fine with small files. Now I'm trying to do the same with a larger file (1.2 GB). Instead of forcing the user to download the file, the script shows the relative path to the file! Is there any way to modify the script, or is it a fault of the server configuration?
Thanks for help!
Looking at the code, I think it fails on large files due to a memory limitation. The script reads the whole file into memory via file_get_contents() before sending it. I suspect files larger than 1 GB will cause memory problems.
Try replacing the following lines in the download.php script:
//get the file content
$strFile = file_get_contents($strDownload);
//set the headers to force a download
header("Content-type: application/force-download");
header("Content-Disposition: attachment;
filename=\"".str_replace(" ", "_", $arrCheck['file'])."\"");
//echo the file to the user
echo $strFile;
to:
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.basename($strDownload));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($strDownload));
ob_clean();
flush();
readfile($strDownload);
This may help.
Note from the manual: readfile() will not present any memory issues, even when sending large files, on its own.

Webpages not served during large file transfers (IIS7.5 + PHP)

For some reason, our web server does not respond while it's serving large files.
We use the Windows platform because we need to remotely call Win32 applications in order to generate the file that is to be served. This file is served through PHP's fpassthru() function, using this code:
if (file_exists($file)) {
    $handle = @fopen($file, "rb");
    header('Content-Description: File Transfer');
    header('Content-Type: video/mp4');
    if ($stream == 0) {
        header('Content-Disposition: attachment; filename='.basename($filename.".mp4"));
    }
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_end_clean();
    fpassthru($handle);
    exit;
}
These files are often over 1 GB in size and take a while to transfer, but during this time the web server will not serve any pages. My Firefox indicates it's 'connecting' but nothing else. Note that somebody else is transferring this file, not me, so it's a different IP and a different session.
Any clue where to look? Obviously, it's intolerable to have to wait 5 minutes for a website.
Thanks in advance!
This is commonly caused when you do not close the session before you begin sending the file data. This is because the session cache file can only be opened by one PHP process at a time, therefore the download is effectively blocking all other PHP processes at session_start().
The solution is to call session_write_close() to commit the session data to disk and close the file handle before you start outputting the file data.
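A minimal sketch of that ordering, with the file-serving part reduced to the essentials (using the question's $file variable):
session_start();
// ... authenticate the user against $_SESSION here ...

// release the session lock BEFORE streaming, so other requests
// from the same session are not blocked for the whole download
session_write_close();

header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($file));
if (ob_get_level()) { ob_end_clean(); }
$handle = fopen($file, 'rb');
fpassthru($handle);
exit;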

php, file download

I am using the simple file downloading script:
if (file_exists($file)) {
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename='.basename($file));
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($file));
    ob_clean();
    flush();
    readfile($file);
    exit;
}
It works on my local server for files up to 200 MB.
When I try this code on my website, it downloads 173 KB instead of the 200 MB file.
I checked everything and wrote some custom code (using ob functions and fread instead of readfile), but I can't download big files.
Thank you for your answers.
I am using Apache 2.2 and PHP 5.3.
All PHP settings for dealing with big files are okay (execution times, memory limits, ...).
One issue I have with the code above is that you have no control over the output stream; you're letting PHP handle it without knowing exactly what is going on in the background.
What you should do is set up an output system that you can control and replicate across servers.
For example:
if (file_exists($file))
{
    if (FALSE !== ($handler = fopen($file, 'r')))
    {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename='.basename($file));
        header('Content-Transfer-Encoding: chunked'); //changed to chunked
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        //header('Content-Length: ' . filesize($file)); //Remove
        //Send the content in chunks; test feof() because fread() returns "" (not false) at EOF
        while (!feof($handler))
        {
            echo fread($handler, 4096);
            flush(); // push each chunk out straight away
        }
        fclose($handler);
    }
    exit;
}
echo "<h1>Content error</h1><p>The file does not exist!</p>";
This is only basic, but give it a go!
Also read my reply here: file_get_contents => PHP Fatal error: Allowed memory exhausted
It seems readfile() can have issues with long files. As @Khez asked, it could be that the script is running for too long. A quick Google search turned up a couple of examples of chunking the file:
http://teddy.fr/blog/how-serve-big-files-through-php
http://www.php.net/manual/en/function.readfile.php#99406
One solution for certain scenarios is to use the PHP script to decide which file to download and from where, but instead of sending the file directly from PHP, return a redirect to the client that contains a direct link, which is then handled by the web server alone.
This could be done in at least two ways: either the PHP script copies the file into a "download zone" (which, for example, might be cleaned of old files regularly by some other background/service script), or you expose the real permanent location to the clients.
There are, of course, drawbacks, as with every solution. In the first one, depending on the client requesting the file (curl, wget, a GUI browser), the redirect may not be followed; in the other one, the files are very exposed to the outside world and can be read at any time without the (access) control of the PHP script.
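A rough sketch of the redirect variant, assuming a hypothetical copy_to_download_zone() helper that places the file under a web-served directory and returns its public URL:
// decide which file the user may download (auth checks, lookups, ...)
$downloadUrl = copy_to_download_zone($fileId); // hypothetical helper

// hand the actual transfer over to the web server
header('Location: ' . $downloadUrl);
exit;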
Have you made sure your script can run long enough and has enough memory?
Do you really need output buffering?
The real solution is to avoid using a PHP script just to send a file to the client; it's overkill, and your web server is better suited for the task.
Presumably you have a reason for sending the files through PHP; perhaps users must authenticate first? If that is the case, then you should use X-Accel-Redirect (if you're using nginx) or X-Sendfile (previously X-LIGHTTPD-send-file) if you're using lighttpd.
If you're using Apache, I've found a few references to mod_xsendfile, but I've never used it personally, and I doubt it's installed for you if you have managed hosting.
If these solutions are untenable I apologise, but I really need more information on the actual problem: Why are you sending these files through PHP in the first place?
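As an illustration, a minimal sketch of the X-Sendfile approach with Apache's mod_xsendfile, using the question's $file variable; the access check is a hypothetical placeholder, and nginx users would send X-Accel-Redirect with an internal location instead:
// authenticate first; only then let the web server stream the file
if (!user_may_download($file)) { // hypothetical access check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
// mod_xsendfile intercepts this header and serves the file itself,
// so PHP never has to read the file contents
header('X-Sendfile: ' . $file);
exit;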
