I'm trying to write an API that accepts PUT, PATCH, and DELETE request methods, and I was able to do so, but I ran into an issue where reading from php://input is extremely slow. Parsing a request containing a 970 KB GIF file takes 10-12 seconds, and from the logging I added, all of that time is spent reading from php://input:
$stream = fopen('php://input', 'r');
// Log stream opened
$raw_data = "";
while (!feof($stream)) {
    $raw_data .= fread($stream, $headers['content-length']);
}
// Log stream read
fclose($stream);
The time between those two log entries, as I said, is 10-12 seconds. What am I doing wrong? Is there a PHP setting somewhere I need to change?
Thanks
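For reference, a common alternative is to read php://input in fixed-size pieces (or let PHP slurp the stream in one call) rather than passing the full Content-Length to every fread() call; this is just a sketch of that pattern, not a guaranteed fix for the slowdown:
$stream = fopen('php://input', 'r');
$raw_data = '';
while (!feof($stream)) {
    // Read fixed 8 KB pieces instead of asking for the whole
    // Content-Length on every fread() call.
    $raw_data .= fread($stream, 8192);
}
fclose($stream);

// Alternatively, let PHP do the copying internally:
// $raw_data = stream_get_contents(fopen('php://input', 'r'));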
Please look at the PHP code below. It is from a download script:
while (ob_get_level() > 0) {
    ob_end_clean();
}
set_time_limit(0);
ignore_user_abort(true);
$file = fopen(MATIN_FILE_PATH, "rb"); // the main file
$chunksize = 2*1024*1024;
while (!feof($file)) {
    echo @fread($file, $chunksize);
    flush();
    if (connection_status() == 1) { // if client aborted
        @fclose($file);
        exit;
    }
}
@fclose($file);
exit;
In this code, you see that I send 2 MB per chunk.
Imagine a client with a download speed of 100 kB/s.
After a lot of debugging, I found out that when the client has downloaded each 2 MB chunk, the write completes and the while loop goes on to the next iteration. So what is PHP doing during that time? Is it waiting for the user to completely download the 2 MB and only then sending another 2 MB? If so, wouldn't it be better to send 10 MB or 50 MB per chunk?
Thanks for any detailed guide.
Imagine you have 10 simultaneous client requests downloading some file with this script, and you have set a 50 MB chunk size. For each of those requests a new PHP process will be invoked, each of them demanding 50 MB of your server's memory to process fread($file, 50*1024*1024). So you will have 500 MB of memory consumed.
If, as you suggested, a client's speed is 100 kB/s, then the probability that you have 100 simultaneous connections is not so low, and 100 concurrent requests is already 5 GB of RAM. Do you have that much, and do you really need to spend it all on this?
You cannot make the user download the file faster than his actual download speed, so the chunk size does not significantly matter here. Reducing the number of loop iterations will not help speed things up either; I have not tested this, but I think the loop executes faster than the I/O operations with the remote client anyway. So the only thing you should really be concerned with is making your server work reliably.
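To make the memory argument concrete, here is a sketch of the same loop with a small fixed chunk, so each request buffers at most roughly the chunk size of file data at a time; the 64 KB figure is an arbitrary assumption, not a recommendation:
// Sketch: stream the file in small fixed pieces so a request never
// holds more than ~64 KB of file data in memory, regardless of file size.
$chunksize = 64 * 1024; // 64 KB per iteration (arbitrary, adjust as needed)
$file = fopen(MATIN_FILE_PATH, "rb");
while (!feof($file)) {
    echo fread($file, $chunksize);
    flush();
    if (connection_aborted()) { // stop early if the client went away
        break;
    }
}
fclose($file);
exit;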
I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are split into 2 MB chunks, and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side, as I'm unable to write the contents of the upload stream to disk. The system always ends up creating a 0-byte file. The idea is to append the chunks to the already-uploaded file as they arrive.
The project is built on Laravel 4, so I'm not sure whether Laravel reads the php://input stream and does something with it. Since php://input can only be read once, that could mean that by the time my controller actually tries to read the stream, it is already empty.
The controller looks as follows:
public function upload()
{
    $filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
    file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
AFAIK fopen returns a file pointer (a resource), not the data itself, so it is probably not what you want as the data parameter for file_put_contents.
Can you try this workaround instead of your file_put_contents?
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to turn off multipart/form-data and use file_get_contents("php://input") to read the contents, then pass the result to file_put_contents() like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
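For context, here is a sketch of what the adjusted controller could look like; the ?filename query parameter and the raw (non-multipart) chunk body are my assumptions, not something from the original code:
public function upload()
{
    // Assumes the client sends each chunk as a raw request body
    // (no multipart/form-data) and passes the target name separately,
    // e.g. as a ?filename=... query parameter (hypothetical).
    $filename = Config::get('tms.upload_path') . basename(Input::get('filename'));

    // Append this chunk's raw body to the file built up so far.
    file_put_contents($filename, file_get_contents('php://input'), FILE_APPEND);
}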
Is it possible to use PHP readfile function on a remote file whose size is unknown and is increasing in size? Here is the scenario:
I'm developing a script which downloads a video from a third-party website and simultaneously transcodes the video into MP3 format. This MP3 is then transferred to the user via readfile.
The query used for the above process is like this:
wget -q -O- "VideoURLHere" | ffmpeg -i - "Output.mp3" > /dev/null 2>&1 &
So the file is fetched and encoded at the same time.
Now, while the above process is in progress, I begin sending the output MP3 to the user via readfile. The problem is that the encoding process takes some time, so depending on the user's download speed, readfile reaches an assumed EOF before the whole file has been encoded, resulting in the user receiving partial content/incomplete files.
My first attempt to fix this was to apply a speed limit on the user's download, but this is not foolproof, as the encoding time and speed vary with load, and it still led to partial downloads.
So is there a way to implement this so that I can serve the download simultaneously with the encoding and also guarantee that the complete file is sent to the end user?
Any help is appreciated.
EDIT:
In response to Peter, I'm actually using fread (see the readfile_chunked function below):
<?php
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1*(1024*1024); // how many bytes per chunk
    $totChunk = 0;
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        //usleep(120000); //Used to impose an artificial speed limit
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
readfile_chunked($linkToMp3);
?>
This still does not guarantee complete downloads, since depending on the user's download speed and the encoding speed, EOF may still be reached prematurely.
Also, in response to theJeztah's comment, I'm trying to achieve this without making the user wait, so that's not an option.
Since you are dealing with streams, you probably should use stream handling functions :). passthru comes to mind, although this will only work if the download | transcode command is started in your script.
If it is started externally, take a look at stream_get_contents.
Libevent as mentioned by Evert seems like the general solution where you have to use a file as a buffer. However in your case, you could do it all inline in your script without using a file as a buffer:
<?php
header("Content-Type: audio/mpeg");
passthru("wget -q -O- http://localhost/test.avi | ffmpeg -i - -f mp3 -");
?>
I don't think there's any way of being notified about new data, short of something like inotify.
I suggest that if you hit EOF, you start polling the modification time of the file (using clearstatcache() between calls) every 200 ms or so. When you find the file size has increased, you can reopen the file, seek to the last position and continue.
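As an illustration of that suggestion, here is a rough sketch; the $outputMp3 path and the encoding_finished() check are placeholders I'm assuming, since you need some way to know the ffmpeg job has finished (a pid check, a marker file, etc.):
// Sketch of the poll-and-reopen idea: stream what exists now, then watch
// the file's size and continue from the last position as it grows.
$path = $outputMp3; // path to the growing MP3 (assumed variable)
$pos  = 0;

while (true) {
    clearstatcache(true, $path);
    $size = filesize($path);

    if ($size > $pos) {
        // New data has appeared: reopen, seek to where we stopped, send the rest.
        $fh = fopen($path, 'rb');
        fseek($fh, $pos);
        while (!feof($fh)) {
            echo fread($fh, 64 * 1024);
            flush();
        }
        $pos = ftell($fh);
        fclose($fh);
    } elseif (encoding_finished()) { // placeholder: detect that ffmpeg is done
        break; // no new data and the encoder has finished
    } else {
        usleep(200000); // wait ~200 ms before polling again
    }
}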
I can highly recommend using libevent for applications like this.
It works perfectly for cases like this.
The PHP documentation is a bit sparse for this, but you should be able to find more solid examples around the web.
I have a PHP socket client which transfers an image (BMP) to a socket server:
$host = "127.0.0.1";
$port = 8000;
$timeout = 30;
$socket = fsockopen($host, $port, $errnum, $errstr, $timeout);
$bmp = file_get_contents("C:/Image.bmp");
$bytesWritten = fwrite($socket, $bmp);
fclose($socket);
The transferred image is always corrupted and only partially streamed, and I get the error message:
Fatal error: Maximum execution time of 60 seconds exceeded
I'm transferring from localhost to localhost ;) and I have an ASP.NET app which does the same thing in milliseconds! So why not PHP? Why does it take so long?
I think it has something to do with file_get_contents creating one large blob in memory. Instead of that, is there a way to use something like a FileStream in PHP?
Any idea how to transfer the file without corrupting it?
file_get_contents returns a string. I think you want to use fread instead.
Example:
$filename = "c:\\files\\somepic.gif";
$handle = fopen($filename, "rb");
$contents = fread($handle, filesize($filename));
fclose($handle);
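If the goal is also to avoid holding the whole image in memory and to survive the 60-second limit, a chunked copy from the file handle to the socket is a common pattern; a minimal sketch using the same host, port, and path as in the question:
set_time_limit(0); // avoid the 60-second limit while the transfer runs

$socket = fsockopen("127.0.0.1", 8000, $errnum, $errstr, 30);
$handle = fopen("C:/Image.bmp", "rb");

// Copy the file to the socket in 8 KB pieces instead of one big string.
while (!feof($handle)) {
    $chunk = fread($handle, 8192);
    $written = 0;
    // fwrite() on a socket may write fewer bytes than requested,
    // so loop until the whole chunk has gone out.
    while ($written < strlen($chunk)) {
        $n = fwrite($socket, substr($chunk, $written));
        if ($n === false) {
            break 2; // write error: stop sending
        }
        $written += $n;
    }
}

fclose($handle);
fclose($socket);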
This is the final download page of my website, where the general public can download government documents. On the server, my code reads the to-be-downloaded file and sends it to the client browser in a loop:
$fp = fopen($file, "rb");
while (!feof($fp))
{
    echo fread($fp, 65536);
    flush(); // this is essential for large downloads
}
fclose($fp);
exit;
I want to send the file very slowly -- that is, can I use the sleep function (or something like it) within this loop, and how long can the delay be, at most, without causing the user's browser to time out?
The idea is that the user gets sufficient time to read the ads displayed on the page while he/she waits for the file download to finish.
Also, I'm not proficient with the PHP environment.
(Please forgive me for the morality/immorality of this.)
Try this approach: http://bytes.com/topic/php/answers/341922-using-php-limit-download-speed
You can use bandwidth sharing if you're willing to do this at the Apache level.
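If you prefer to keep the throttling in PHP rather than at the Apache level, the usual pattern is to sleep between chunks so each iteration takes roughly chunk_size / target_rate seconds; a minimal sketch (the ~32 KB/s target is an arbitrary assumption):
$fp = fopen($file, "rb");
$chunk = 32 * 1024; // bytes sent per iteration
$delay = 1;         // seconds to sleep per chunk => ~32 KB/s (assumed target)

set_time_limit(0);  // the script runs for the whole slow download
while (!feof($fp)) {
    echo fread($fp, $chunk);
    flush();        // push the chunk to the client before sleeping
    sleep($delay);  // data keeps arriving every second, which is well
                    // within typical browser/proxy timeouts
}
fclose($fp);
exit;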