I wrote a PHP script that helps limit the speed and the number of connections when downloading files. I used fopen() and fseek(), something like this:
$f = fopen($file, 'rb');
if ($f) {
    fseek($f, $start); // $start extracted from $_SERVER['HTTP_RANGE']
    while (!feof($f)) {
        echo fread($f, $speed); // $speed is bytes per second
        ob_flush(); // flush PHP's output buffer first...
        flush();    // ...then push the data to the client
        sleep(1);
    }
    fclose($f);
}
The download process may take several hours to complete. Will the whole file be held in memory until the end of the download, and how can I optimize this?
No, fread() streams the data through an internal buffer (8 KB by default), so only a very small part of the file actually resides in memory at any time.
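If you want to convince yourself, memory usage should stay flat no matter how large the file is. A minimal sketch, assuming a large local file in $file:
<?php
$f = fopen($file, 'rb');
while (!feof($f)) {
    fread($f, 8192); // read and discard one buffer's worth
    error_log(memory_get_usage(true)); // should print a roughly constant value
}
fclose($f);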
Related
These two pieces of code both read a file, so what's the main difference?
1. First code:
$handle = fopen($file, 'r');
$data = fread($handle, filesize($file));
2. Second code:
readfile($file);
There's a significant difference between fread() and readfile().
First, readfile() does a number of things that fread() does not. readfile() opens the file for reading, reads it, and then prints it to the output buffer all in one go. fread() only does one of those things: it reads bytes from a given file handle.
Additionally, readfile() has some performance benefits that fread() does not. For example, where available it can take advantage of memory-mapped I/O instead of slower buffered disk reads. This significantly increases the performance of reading the file, since it delegates the work away from PHP userland and towards operating-system calls.
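To make the first point concrete, here is roughly what readfile() amounts to when spelled out with fread() -- a sketch of the behaviour, not the actual implementation:
<?php
function readfile_equivalent($file) {
    $handle = fopen($file, 'rb');
    $bytes = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, 8192); // read one buffer-sized piece
        echo $chunk;                   // print it to the output buffer
        $bytes += strlen($chunk);
    }
    fclose($handle);
    return $bytes; // readfile() likewise returns the number of bytes read
}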
Errata
I previously noted that readfile() could run without PHP (this is corrected below).
For truly large files (think several gigabytes, like media files or large archive backups), you may want to consider delegating the reading of the file away from PHP entirely with an X-Sendfile header to your webserver instead, so that you don't keep a PHP worker tied up for the length of a download that could potentially take hours.
So you could do something like this instead of readfile():
<?php
/* process some things in php here */
header("X-Sendfile: /path/to/file");
exit; // don't need to keep PHP busy for this
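In practice you will usually also want to tell the browser what it is receiving. A slightly fuller sketch, assuming Apache with mod_xsendfile enabled (the filename is hypothetical):
<?php
/* authentication, logging, etc. happen in PHP here */
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"'); // hypothetical name
header('X-Sendfile: /path/to/file');
exit; // the webserver streams the file; the PHP worker is freed immediately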
Reading the docs: readfile() reads the whole content and writes it to the output buffer, while fread() puts the content into the variable $data:
$data = fread($handle, filesize($file));
I'm downloading a large file like this:
$fd = fopen($url, "r");
while (!feof($fd)) {
    echo fread($fd, 4096);
    ob_flush();
    flush();
}
But I have one problem: the file downloads only up to 11.6 MB and then stops...
Where is the problem? I'm using ob_flush() and flush(), so I think it should work.
Thanks.
You don't need the fread() loop if you just want to output a remote file. You can use:
readfile($url);
That's it. However, the script you showed should work as well. The cause must be on the remote server's side.
If the download takes long, you should consider setting the execution time to unlimited:
set_time_limit(0);
... at the top of your script.
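Putting both suggestions together, a minimal sketch:
<?php
set_time_limit(0); // don't let max_execution_time cut the download short
readfile($url);    // stream the remote file straight to the client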
I want to send an external MP4 file in chunks of 1 MB each to a user. With each chunk I update a database entry to keep track of the download progress. I use fread() to read the file in chunks. Here is the stripped down code:
$filehandle = fopen($file, 'r');
while (!feof($filehandle)) {
    $buffer = fread($filehandle, 1024 * 1024);
    // do some database stuff
    echo $buffer;
    ob_flush();
    flush();
}
However, when I check the chunk size at some iteration inside the while loop, with
$chunk_length = strlen($buffer);
die("$chunk_length");
I never get the desired chunk size; it fluctuates somewhere around 7000-8000 bytes, nowhere near 1024*1024 bytes.
When I decrease the chunk size to a smaller number, for example 1024 bytes, it works as expected.
According to the PHP fread() manual:
"When reading from anything that is not a regular local file, such as
streams returned when reading remote files or from popen() and
fsockopen(), reading will stop after a packet is available."
In this case I opened a remote file. Apparently, this makes fread() stop not at the specified length, but as soon as the first packet arrives.
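If you really need fixed-size chunks from a network stream, the usual workaround is to keep calling fread() until the requested length has accumulated. A minimal sketch (the helper name is made up):
<?php
function fread_exact($handle, $length) {
    $data = '';
    while (!feof($handle) && strlen($data) < $length) {
        // fread() may return less than requested on network streams,
        // so keep reading until the chunk is full or the stream ends
        $data .= fread($handle, $length - strlen($data));
    }
    return $data;
}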
I wanted to keep track of the download of an external file.
If you want to do this (or keep track of an upload), use cURL instead:
$curl_handle = curl_init($url); // the external file to track
curl_setopt($curl_handle, CURLOPT_NOPROGRESS, false);
curl_setopt($curl_handle, CURLOPT_PROGRESSFUNCTION, 'callbackFunction');
curl_exec($curl_handle);
curl_close($curl_handle);
// Note: since PHP 5.5 the callback also receives the cURL handle as its first argument
function callbackFunction($curl_handle, $download_size, $downloaded, $upload_size, $uploaded) {
    // do stuff with the parameters
}
Is it possible to use PHP's readfile() function on a remote file whose size is unknown and increasing? Here is the scenario:
I'm developing a script which downloads a video from a third-party website and simultaneously transcodes it into MP3 format. This MP3 is then transferred to the user via readfile().
The query used for the above process is like this:
wget -q -O- "VideoURLHere" | ffmpeg -i - "Output.mp3" > /dev/null 2>&1 &
So the file is fetched and encoded at the same time.
Now, while the above process is in progress, I begin sending the output MP3 to the user via readfile(). The problem is that the encoding takes some time, so depending on the user's download speed, readfile() reaches a premature EOF before the whole file is encoded, and the user receives partial/incomplete content.
My first attempt to fix this was to apply a speed limit to the user's download, but this is not foolproof, as encoding time and speed vary with load, and it still led to partial downloads.
So, is there a way to implement this so that I can serve the download simultaneously with the encoding and still guarantee that the complete file reaches the end user?
Any help is appreciated.
EDIT:
In response to Peter: I'm actually using fread() (see readfile_chunked() below):
<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        //usleep(120000); // used to impose an artificial speed limit
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does
    }
    return $status;
}
readfile_chunked($linkToMp3);
?>
This still does not guarantee complete downloads, since depending on the user's download speed and the encoding speed, EOF may still be reached prematurely.
Also, in response to theJeztah's comment: I'm trying to achieve this without making the user wait, so that's not an option.
Since you are dealing with streams, you probably should use stream handling functions :). passthru comes to mind, although this will only work if the download | transcode command is started in your script.
If it is started externally, take a look at stream_get_contents.
Libevent as mentioned by Evert seems like the general solution where you have to use a file as a buffer. However in your case, you could do it all inline in your script without using a file as a buffer:
<?php
header("Content-Type: audio/mpeg");
passthru("wget -q -O- http://localhost/test.avi | ffmpeg -i - -f mp3 -");
?>
I don't think there's any way of being notified about there being new data, short of something like inotify.
I suggest that if you hit EOF, you start polling the modification time of the file (using clearstatcache() between calls) every 200 ms or so. When you find the file size has increased, you can reopen the file, seek to the last position and continue.
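A minimal sketch of that polling approach; the path and the termination check are assumptions, and a real version needs a reliable way to know when the encoder has finished:
<?php
$filename = '/tmp/Output.mp3'; // hypothetical path of the growing file
$position = 0;
while (true) {
    clearstatcache(true, $filename); // drop PHP's cached stat info for this file
    $size = filesize($filename);
    if ($size > $position) {
        $fh = fopen($filename, 'rb');
        fseek($fh, $position); // continue where we left off
        echo fread($fh, $size - $position);
        $position = ftell($fh);
        fclose($fh);
        ob_flush();
        flush();
    } elseif (transcoding_finished()) { // hypothetical check, e.g. a pid or marker file
        break;
    } else {
        usleep(200000); // nothing new yet; poll again in ~200 ms
    }
}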
I can highly recommend using libevent for applications like this.
It works perfectly for cases like this.
The PHP documentation is a bit sparse for this, but you should be able to find more solid examples around the web.
This is the final download page of my website, where the general public can download government documents. On the server, my code reads the to-be-downloaded file and sends it to the client browser in a loop:
$fp = fopen($file, "rb");
while (!feof($fp)) {
    echo fread($fp, 65536);
    flush(); // this is essential for large downloads
}
fclose($fp);
exit;
I want to send the file very slowly -- can I use the sleep function (or something like it) within this loop, and for how long at most, without causing the user's browser to time out?
That way the user gets sufficient time to read the ads displayed on the page while waiting for the file download to finish.
Also, I'm not proficient with the PHP environment.
(Please forgive me for the morality/immorality of this.)
Try this approach: http://bytes.com/topic/php/answers/341922-using-php-limit-download-speed
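The linked thread boils down to the same pattern as the throttling loop in the first question above: send a fixed number of bytes, then sleep for a second. A minimal sketch, with a hypothetical rate of 8 KB/s:
<?php
set_time_limit(0); // the download may legitimately take a long time
$rate = 8 * 1024;  // bytes per second (hypothetical value)
$fp = fopen($file, "rb");
while (!feof($fp)) {
    echo fread($fp, $rate); // one second's worth of data
    ob_flush();
    flush();
    sleep(1); // browsers generally don't time out as long as data keeps arriving
}
fclose($fp);
exit;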
You can use bandwidth sharing if you're willing to do this at the Apache level.
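One option at that level (an assumption; the answer doesn't name a specific module) is mod_ratelimit, which ships with Apache 2.4:
# vhost or httpd.conf snippet, assuming mod_ratelimit is loaded
<Location "/downloads">
    # throttle responses in this location to roughly 50 KiB/s
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 50
</Location>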