PHP socket programming: transferred image is corrupted

I have a PHP socket client which transfers an image (BMP) to a socket server:
$host="127.0.0.1" ;
$port=8000;
$timeout=30;
$socket=fsockopen($host,$port,$errnum,$errstr,$timeout) ;
$bmp=file_get_contents("C:/Image.bmp");
$bytesWritten = fwrite($socket, $bmp);
fclose($socket);
The transferred image is always corrupted and only partially streamed, and I get this error message:
Fatal error: Maximum execution time of 60 seconds exceeded
I'm transferring from localhost to localhost ;) and I have an ASP.NET app which does the same thing in milliseconds! So why not PHP? Why does it take so long?
I think it has something to do with file_get_contents, which creates a large BLOB. Instead of that, is there a way to use something like a FileStream in PHP?
Any idea how to transfer the file without corrupting it?

file_get_contents returns a string. I think you want to use fread instead.
Example:
$filename = "c:\\files\\somepic.gif";
$handle = fopen($filename, "rb");
$contents = fread($handle, filesize($filename));
fclose($handle);
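For the original socket question, a chunked copy avoids holding the whole BMP in memory and makes partial writes visible. This is only a sketch reusing the host, port and path from the question, not code from the thread:
$host = "127.0.0.1";
$port = 8000;
$timeout = 30;
set_time_limit(0);                               // lift the 60-second execution limit for big transfers
$socket = fsockopen($host, $port, $errnum, $errstr, $timeout);
$file = fopen("C:/Image.bmp", "rb");             // "rb": binary mode matters on Windows
while (!feof($file)) {
    $chunk = fread($file, 8192);                 // small chunk, so memory stays flat
    if ($chunk === false || $chunk === '') {
        break;                                   // read error or nothing left to send
    }
    $sent = 0;
    while ($sent < strlen($chunk)) {
        $written = fwrite($socket, substr($chunk, $sent));
        if ($written === false || $written === 0) {
            break 2;                             // socket error or closed connection: stop both loops
        }
        $sent += $written;                       // fwrite may write less than requested
    }
}
fclose($file);
fclose($socket);
Checking the return value of fwrite is what catches the half-streamed case: on a socket, a single fwrite is not guaranteed to send the whole buffer.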

Related

Memory problems with php://input

I have an API endpoint that can receive a POST body (JSON/XML or even raw binary data) as its payload, which should be written immediately to a file on the filesystem.
For backwards compatibility reasons, it cannot be a multipart/form-data.
It works with no problems for body content up to a certain size (around 2.3GB with an 8GB script memory limit).
I've tried all of the following:
both with and without setting the buffer sizes
$filename = '/tmp/test_big_file.bin';
$input = fopen('php://input', 'rb');
$output = fopen($filename, 'wb');
stream_set_read_buffer($input, 4096);
stream_set_write_buffer($output, 4096);
stream_copy_to_stream($input, $output);
fclose($input);
fclose($output);
and
$filename = '/tmp/test_big_file.bin';
file_put_contents($filename, file_get_contents('php://input'));
and
$filename = '/tmp/test_big_file.bin';
$input = fopen('php://input', 'rb');
$output = fopen($filename, 'wb');
while (!feof($input)) {
    fwrite($output, fread($input, 8192), 8192);
}
fclose($input);
fclose($output);
Unfortunately, none of them works. At some point, I always get the same error:
PHP Fatal error: Allowed memory size of 8589934592 bytes exhausted (tried to allocate 2475803056 bytes) in Unknown on line 0
Also, disabling enable_post_data_reading makes no difference, and all the relevant php.ini post/memory sizes are set to 8GB.
I'm using php-fpm.
Watching what happens to memory with free -mt, I can see that the memory used increases slowly at the beginning, then faster after a while, up to the point where no free memory is left, hence the error.
In the temp directory, the data is not stream-copied directly to the target file; instead it is written to a temporary file named php7NARsX (or another random string) which is not deleted after the script crashes, so at the following free -mt check the available memory is 2.3GB less.
Now my questions:
Why is the stream not copied directly from php://input to the output instead of being loaded into memory? (Using php://temp as the output stream leads to the same error.)
Why is PHP using so much memory? I'm sending a 3GB payload, so why does it need more than 8GB?
Of course, any working solution will be much appreciated. Thank you!
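No answer is shown here for this question. Purely as a diagnostic sketch (the path and chunk size come from the question, the logging is an addition), copying in fixed-size chunks while recording peak memory can show where the growth actually happens:
$filename = '/tmp/test_big_file.bin';
$input = fopen('php://input', 'rb');
$output = fopen($filename, 'wb');
$copied = 0;
while (!feof($input)) {
    $chunk = fread($input, 8192);
    if ($chunk === false || $chunk === '') {
        break;
    }
    fwrite($output, $chunk);
    $copied += strlen($chunk);
    if ($copied % (256 * 1024 * 1024) < 8192) {  // log roughly every 256MB
        error_log(sprintf('copied %d bytes, peak memory %d bytes',
            $copied, memory_get_peak_usage(true)));
    }
}
fclose($input);
fclose($output);
If peak memory climbs even though the loop only ever holds 8KB, that would point at the request body being buffered before the script runs rather than at the copy loop itself.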

php://input reading is extremely slow

I'm trying to write an API that accepts the PUT, PATCH and DELETE request methods, and I was able to do so, but I ran into an issue where reading from php://input is extremely slow. Parsing a request with a 970k GIF file takes 10-12 seconds and, from the logging I added, all that time is spent reading from php://input.
$stream = fopen('php://input', 'r');
// Log stream opened
$raw_data = "";
while (!feof($stream)) {
    $raw_data .= fread($stream, $headers['content-length']);
}
// Log stream read
fclose($stream);
The time between those two log entries, as I said, is 10-12 seconds. What am I doing wrong? Is there a PHP setting somewhere I need to change?
Thanks
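No accepted answer appears in this excerpt. As a sketch only, two common alternatives are to drain the stream in one call with stream_get_contents, or to keep the loop but read fixed-size chunks instead of asking for Content-Length bytes on every iteration:
$stream = fopen('php://input', 'r');
// Option 1: let PHP drain the stream in one call
$raw_data = stream_get_contents($stream);
// Option 2: keep the loop, but with a fixed chunk size
// $raw_data = '';
// while (!feof($stream)) {
//     $raw_data .= fread($stream, 8192);
// }
fclose($stream);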

PHP input stream returning 0 data - Laravel

I'm trying to get chunked uploads working on a form in my Laravel 4 project. The client-side bit works so far: the uploads are split into 2MB chunks, and data is being sent from the browser. There's even a handy progress bar in place to show the upload progress.
The problem is on the PHP side, as I'm unable to write the contents of the upload stream to disk. The system always ends up with a 0 byte file created. The idea is to append the chunks to the already uploaded file as they arrive.
The project is built on Laravel 4, so I'm not sure if Laravel reads the php://input stream and does something with it. Since php://input can only be read once, it possibly means that by the time my controller actually tries to read the stream, it is already empty.
The controller looks as follows:
public function upload()
{
$filename = Config::get('tms.upload_path') . Input::file('file')->getClientOriginalName();
file_put_contents($filename, fopen('php://input', 'r'), FILE_APPEND);
}
The file is being created, but its length always remains at 0 bytes. Any ideas how I can coax the contents of the php://input stream out of the system?
AFAIK fopen returns a file pointer, not a stream, so it is probably not good as a parameter for file_put_contents.
Can you try this workaround instead of your file_put_contents?
$putdata = fopen("php://input", "r");
$fp = fopen($filename, "a");
while ($data = fread($putdata, 1024))
fwrite($fp, $data);
fclose($fp);
fclose($putdata);
The answer to this is simple: I needed to stop sending the upload as multipart/form-data and use file_get_contents("php://input") to read the contents, then pass the result to file_put_contents() like so:
file_put_contents($filename, file_get_contents("php://input"), FILE_APPEND);
This works and fixes my problems.
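A memory-friendlier variant of the same idea (a sketch, not from the thread, using $filename from the controller above) appends the raw body to the file without building the whole string in PHP first:
$in = fopen('php://input', 'rb');
$out = fopen($filename, 'ab');       // 'a' opens for appending, like FILE_APPEND
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);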

Send part of an FTP stream to php://output

I have a PHP server that serves audio files by streaming them from an FTP server that is not publicly available.
After sending the appropriate headers, I just stream the file to the client using ftp_get like this:
ftp_get($conn, 'php://output', $file, FTP_BINARY);
For reasons that have to do with Range headers, I must now offer to send only a part of this stream:
$start = 300; // First byte to stream
$stop = 499; // Last byte to stream (the Content-Length is then $stop-$start+1)
I can do it by downloading the entire content temporarily to a file or to memory, then sending the desired part to the output. But since the files are large, that solution will cause a delay for the client, who has to wait for the file to first be downloaded to the PHP server before it even starts downloading to the client.
Question:
How can I start streaming to php://output from the FTP server as soon as the first $start bytes have been discarded, and stop streaming when I've reached the $stop byte?
Instead of using PHP's FTP extension (e.g. ftp_get), it is possible to open a stream using PHP's built-in FTP wrapper.
The following code would stream parts of an FTP-file to php://output:
$start = 300; // First byte to stream
$stop = 499; // Last byte to stream
$url = "ftp://username:password#server.com/path/file.mp3";
$ctx = stream_context_create(array('ftp' => array('resume_pos' => $start)));
$fin = fopen($url, 'r', false, $ctx);
$fout = fopen('php://output', 'w');
stream_copy_to_stream($fin, $fout, $stop-$start+1);
fclose($fin);
fclose($fout);
While stream_copy_to_stream has an $offset parameter, using it resulted in an error because the stream was not seekable. Using the context option resume_pos worked fine, however.
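For completeness, a partial response also needs matching HTTP headers. This is not part of the original answer, just a sketch of the Range bookkeeping around the copy, reusing $start and $stop from above:
$length = $stop - $start + 1;
header('HTTP/1.1 206 Partial Content');
header('Accept-Ranges: bytes');
header("Content-Range: bytes $start-$stop/*");   // replace * with the real file size if it is known
header("Content-Length: $length");
// ...then run the stream_copy_to_stream() snippet from the answer above.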

Cannot resume downloads bigger than 300M

I am working on a PHP program to download files.
The script request looks like: http://localhost/download.php?file=abc.zip
I use a script mentioned in Resumable downloads when using PHP to send the file?
It definitely works for files under 300M, for either multi-threaded or single-threaded downloads, but when I try to download a file over 300M, I get a problem with single-threaded downloading: only about 250M of data gets downloaded, then it seems like the HTTP connection is broken. It does not break at the break-point. Why?
Debugging the script, I pinpointed where it broke:
$max_bf_size = 10240;
$pf = fopen("$file_path", "rb");
fseek($pf, $offset);
while (1)
{
    $rd_length = $length < $max_bf_size ? $length : $max_bf_size;
    $data = fread($pf, $rd_length);
    print $data;
    $length = $length - $rd_length;
    if ($length <= 0)
    {
        //__break-point__
        break;
    }
}
It seems like every requested document can only get about 250M of data buffered to echo or print. But it works when I use multiple threads to download a file.
fread() will read up to the number of bytes you ask for, so you are doing some unnecessary work calculating the number of bytes to read. I don't know what you mean by single-thread and multi-thread downloading. Do you know about readfile() to just dump an entire file? I assume you need to read a portion of the file starting at $offset up to $length bytes, correct?
I'd also check my web server (Apache?) configuration and ISP limits if applicable; your maximum response size or time may be throttled.
Try this:
define('MAX_BUF_SIZE', 10240);
$pf = fopen($file_path, 'rb');
fseek($pf, $offset);
while (!feof($pf)) {
    $data = fread($pf, MAX_BUF_SIZE);
    if ($data === false) {
        break;
    }
    print $data;
}
fclose($pf);
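If only $length bytes starting at $offset should be sent, as the question implies, a byte counter keeps the loop from running to end of file. A sketch, assuming $offset and $length have already been validated against the file size:
define('MAX_BUF_SIZE', 10240);
$pf = fopen($file_path, 'rb');
fseek($pf, $offset);
$remaining = $length;                        // bytes still owed to the client
while ($remaining > 0 && !feof($pf)) {
    $data = fread($pf, min(MAX_BUF_SIZE, $remaining));
    if ($data === false || $data === '') {
        break;
    }
    print $data;
    flush();                                 // push the chunk out instead of letting it pile up in buffers
    $remaining -= strlen($data);
}
fclose($pf);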
