PHP fsockopen client doesn't receive sent data

I have the following (stripped-down) piece of code:
function curl_request_async($url, $params)
{
    $post_params = array();
    foreach ($params as $key => $val) {
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    fwrite($fp, "POST " . $parts['path'] . " HTTP/1.1\r\n"); // $type in the full version; POST here
    fwrite($fp, "Host: " . $parts['host'] . "\r\n");
    fwrite($fp, "Content-Type: application/x-www-form-urlencoded\r\n");
    fwrite($fp, "Content-Length: " . strlen($post_string) . "\r\n");
    fwrite($fp, "Connection: Close\r\n\r\n");
    $bytes_written = fwrite($fp, $post_string);
    var_dump($bytes_written, strlen($post_string));
    // fread($fp, 1);
    // fflush($fp);
    fclose($fp);
}
The problem with this code is that I found no evidence the request ever reached the server. The line var_dump($bytes_written, strlen($post_string)); output int(493) int(493), so all the data should have been sent, yet the server never received it.
If I uncomment fread($fp, 1); it works without a problem. That could be a working solution, but it doesn't seem to make sense. There has to be a better way!
My question is two-fold: why does fread($fp, 1); fix my problem, and is there a better solution?

Your problem is probably that the server-side code is also written in PHP, and ignore_user_abort is not true by default (see http://php.net/manual/en/misc.configuration.php#ini.ignore-user-abort). When your client closes the connection, the server stops executing its PHP code. That is why fread($fp, 1) fixes your problem: the connection doesn't close before PHP starts writing a response.
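If that is the cause, the fix belongs in the server-side script rather than the client. A minimal sketch, assuming the receiving script is plain PHP:
<?php
// keep executing even if the client disconnects before a response is sent
ignore_user_abort(true);
// ... process $_POST as usual ...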
You can use the following code to run a test server and check whether the client is actually connecting:
<?php
error_reporting(E_ALL);
ini_set('display_errors', 1);
$sck = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
if ($sck === FALSE) {
    die('socket_create failed!');
}
if (!socket_set_block($sck)) {
    die("socket_set_block failed!");
}
if (!socket_bind($sck, '0.0.0.0', 1337)) {
    die("FAILED to bind to port 1337");
}
if (!socket_listen($sck, 0)) {
    die("socket_listen failed!");
}
while ((print('listening for connections!' . PHP_EOL)) && false !== ($conn = socket_accept($sck))) {
    echo "new connection!" . PHP_EOL;
    $fullFile = ''; // reset per connection so data doesn't accumulate across requests
    while (false !== ($buffi = socket_recv($conn, $buff, 1024, MSG_WAITALL))) {
        if ($buffi === 0) {
            // socket_recv's way of saying that the connection closed, apparently.
            // The docs say it should return false, but it just keeps returning
            // int(0) -- at least on Windows 7 x64 SP1.
            break;
        }
        $fullFile .= $buff;
        echo "received " . strlen($fullFile) . " bytes..." . PHP_EOL;
    }
    echo "all bytes received (I guess; TODO: confirm with socket_last_error)." . PHP_EOL;
    var_dump($fullFile);
    socket_close($conn);
    echo "done!" . PHP_EOL;
}
die("should never reach this code...");
It makes a netcat-style server listening on http://127.0.0.1:1337.
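For example, pointing the stripped-down function from the question at it (the path and parameters here are just placeholders):
curl_request_async('http://127.0.0.1:1337/test', array('foo' => 'bar'));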

fread needs two parameters: a resource and the number of bytes to read.
Right now you are only reading 1 byte: fread($fp, 1);
If you want to read the complete response, loop until everything has been read:
while (!feof($fp)) {
    echo fread($fp, 128);
}
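If you just want the whole remaining stream in one call, stream_get_contents() does the same thing:
$response = stream_get_contents($fp); // reads until EOF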

Related

Read to the end of an XML response when the server isn't specifying the end of file?

I'm writing a script that communicates with a server via XML. I can tell I'm making successful requests to the server's API because I can see in a log on the server that it's receiving them; however, I'm having a hard time receiving the response (XML). I do not own the server and unfortunately cannot modify any of the programs sending the response.
I don't think the server is specifying the end of the file, so doing a while (!feof($fp)) { ... } hangs. And unfortunately I don't think I have any way (to my knowledge) of determining the size of the response before reading it.
What I am doing and what I have attempted:
function postXMLSocket($server, $path, $port, $xmlDocument) {
    $contentLength = strlen($xmlDocument);
    $result = '';
    // Handling error case in else statement below
    if ($fp = @fsockopen($server, $port, $errno, $errstr, 30)) {
        $out = "POST ".$path." HTTP/1.0\r\n";
        $out .= "Host: ".$server."\r\n";
        $out .= "Content-Type: text/xml\r\n";
        $out .= "Content-Length: ".$contentLength."\r\n";
        $out .= "Connection: close\r\n";
        $out .= "\r\n"; // all headers sent
        $out .= $xmlDocument;
        fwrite($fp, $out);
        // ATTEMPT 5: Read until we have a valid XML doc -- hangs
        // libxml_use_internal_errors(true);
        // do {
        //     $result .= fgets($fp, 128);
        //     $xmlTest = simplexml_load_string($result);
        // } while ($xmlTest === false);
        // ATTEMPT 4: Read X # of lines -- works but I can't know how many lines the response will be
        // for ($i = 0; $i < 10; $i++) {
        //     $result .= fgets($fp, 128);
        // }
        // ATTEMPT 3: Read until the lines being read are empty -- hangs
        // do {
        //     $lineRead = fgets($fp, 500);
        //     $result .= $lineRead;
        // } while (strlen($lineRead) > 0);
        // ATTEMPT 2: Read the whole file w/ fread -- only reads part of file
        // $result = fread($fp, 8192);
        // ATTEMPT 1: Read to the EOF -- hangs
        // while (!feof($fp)) {
        //     $result .= fgets($fp, 128);
        // }
        fclose($fp);
    } else {
        // Could not connect to socket
        return false;
    }
    return $result;
}
Attempt descriptions:
1) First I just tried reading lines until reaching the end of the file. This keeps hanging and resulting in a time out and I think it's because the server isn't marking the end of the XML file it's responding with, so it's getting caught in an infinite loop.
2) Second, I tried to read the response as one whole file. This worked and I got something back, but it was incomplete (the response seems to be quite large). I don't have any way of knowing how big the response will be before reading it, so I don't think this is an option.
3) Next I tried reading until fgets is returning an empty string. I made the assumption it would do this if it's reading lines after passing the end of the file, but this hangs as well.
4) For this attempt I just tried to read a hardcoded number of lines (10 in this case), but this has similar problems to attempt 2 above where I can't accurately know how many lines the response will have until after reading it.
5) This is where I thought I was getting clever. I know the response will be XML, and will be contained in a <Response> node. Therefore I thought I could get away with reading until the $result variable contained a valid XML string, however this seems to hang as well.
Using a higher level approach to HTTP requests will probably help you. Try this:
$stringWithSomeXml = "your payload xml here";
postXml("www.google.com", "/path/on/server", 80, $stringWithSomeXml);

function postXml($server, $path, $port, $xmlPayload)
{
    $ch = curl_init();
    $path = ltrim($path, "/");
    if ($port == 80) {
        $url = "http://{$server}/{$path}";
    } else {
        $url = "http://{$server}:{$port}/{$path}";
    }
    echo "\n$url\n";
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HTTPHEADER, [
        "Content-type: application/xml",
        "Content-Length: ".strlen($xmlPayload)
    ]);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xmlPayload);
    curl_setopt($ch, CURLOPT_POST, 1);
    $result = curl_exec($ch);
    echo "length: " . strlen($result) . "\n";
    echo "content: " . $result . "\n";
    curl_close($ch);
    return $result;
}
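If cURL is not available, the usual way to avoid the feof() hang is to parse the Content-Length header yourself and then read exactly that many body bytes. A rough sketch to replace the commented attempts inside postXMLSocket(), assuming the server sends a Content-Length header (a chunked response would need extra handling):
$contentLength = 0;
// read the response headers up to the blank line that terminates them
while (($line = fgets($fp, 1024)) !== false && rtrim($line) !== '') {
    if (stripos($line, 'Content-Length:') === 0) {
        $contentLength = (int) trim(substr($line, strlen('Content-Length:')));
    }
}
// then read exactly $contentLength bytes of body, in chunks
while ($contentLength > 0 && ($chunk = fread($fp, min(4096, $contentLength))) !== false && $chunk !== '') {
    $result .= $chunk;
    $contentLength -= strlen($chunk);
}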

How to get the response from URL without cURL?

I am working on Twitter login integration with a website. I don't have cURL installed on my server and I am not allowed to install it.
The Twitter login code itself works fine, but the request_token step uses cURL to send the callback URL and fetch the token response. I want to get the response from that URL without using cURL in PHP. Is it possible?
Curl code now used:
$response = curl_exec($ci);
The above response I need without using Curl.
You don't necessarily have to use cURL; there are many ways. One of them is file_get_contents(), which takes a URL string (note that $ci in your code is a cURL handle, not a URL):
$response = file_get_contents($url);
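Note that the request_token step is a POST, so a plain file_get_contents($url) would send a GET. You can pass a stream context to make it a POST; a minimal sketch (the URL and payload here are placeholders, and the OAuth signing headers Twitter requires are omitted):
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => http_build_query(array('oauth_callback' => 'http://example.com/callback')),
    )
));
$response = file_get_contents('https://api.twitter.com/oauth/request_token', false, $context);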
Edit:
You can also use fsockopen; here is an example from PHP.net:
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>

Parameters not posted when using fsockopen and fwrite

I am trying to post parameters from PHP to another server. When I build the URL manually and open it in the browser it works fine, but when I try it from my PHP script it doesn't: the file I am accessing is reached, but the parameter is not posted.
I guess the problem has to do with how I define and post the parameter ($post_data .= "?companyid=banane";). What is my problem and how do I solve it?
<?php
$fp = fsockopen("192.168.1.102", 80, $errno, $errstr, 30);
error_log("write done");
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $post_data = "GET /cgi-bin/new_instance.pl HTTP/1.1\r\n";
    $post_data .= "Host: 192.168.1.102\r\n";
    $post_data .= "Connection: Close\r\n\r\n";
    $post_data .= "?companyid=banane";
    error_log("OUT - - - ".$post_data);
    fwrite($fp, $post_data);
    error_log("write done");
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
Or am I using the wrong approach? I'm thinking, if this is the correct approach then I should be able to find some good examples when googling around.
Try this as the first line instead:
GET /cgi-bin/new_instance.pl?companyid=banane HTTP/1.1\r\n
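So the request block becomes (query string in the request line, nothing after the final blank line):
$post_data = "GET /cgi-bin/new_instance.pl?companyid=banane HTTP/1.1\r\n";
$post_data .= "Host: 192.168.1.102\r\n";
$post_data .= "Connection: Close\r\n\r\n";
fwrite($fp, $post_data);
If you actually want to POST the parameter rather than pass it in the query string, it goes in the body after the blank line, along with Content-Type and Content-Length headers.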

PHP - Downloading very large files with fsockopen(), fgets() and feof()

I have a simple download function in a class that might be dealing with files of many hundreds of megabytes at a time from an Amazon Web Services bucket. The whole file cannot be loaded into memory at once, so it must be streamed directly to a file pointer. This is my understanding as this is the first time I've dealt with this issue and I'm picking things up as I go along.
I've ended up with this, based on a 4 KB file buffer which simple testing showed was a good size:
$fs = fsockopen($host, 80, $errno, $errstr, 30);
if (!$fs) {
    $this->writeDebugInfo("FAILED ", $errstr . '(' . $errno . ')');
} else {
    $out = "GET $file HTTP/1.1\r\n";
    $out .= "Host: $host\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fs, $out);
    $fm = fopen($temp_file_name, "w");
    stream_set_timeout($fs, 30);
    while (!feof($fs) && ($debug = fgets($fs)) != "\r\n"); // ignore headers
    while (!feof($fs)) {
        $contents = fgets($fs, 4096);
        fwrite($fm, $contents);
        $info = stream_get_meta_data($fs);
        if ($info['timed_out']) {
            break;
        }
    }
    fclose($fm);
    fclose($fs);
    if ($info['timed_out']) {
        // Delete temp file if fails
        unlink($temp_file_name);
        $this->writeDebugInfo("FAILED - Connection timed out: ", $temp_file_name);
    } else {
        // Move temp file if succeeds
        $media_file_name = str_replace('temp/', 'media/', $temp_file_name);
        rename($temp_file_name, $media_file_name);
        $this->writeDebugInfo("SUCCESS: ", $media_file_name);
    }
}
In testing it's fine. However I have got into a conversation with someone who is saying that I am not understanding how fgets() and feof() work together, and he's mentioning chunked encoding as a more efficient method.
Is the code generally OK, or am I missing something vital here? What is the benefit that chunked encoding will give me?
Your solution seems fine to me; however, I have a few comments.
1) Don't create the HTTP request yourself. Instead use something like cURL. This is more foolproof and will support a wider range of responses the server might reply with. Additionally, cURL can be set up to write directly to a file, saving you doing it yourself.
2) Using fgets may be a problem if you are reading binary data. fgets reads to the end of a line, and with binary data this may corrupt your download. Instead I suggest fread($fs, 4096); which will handle both text and binary data, as in the sketch below.
3) Chunked encoding is a way for a webserver to send you the response in multiple chunks. I don't think this is very useful to you. However, a better encoding the webserver might support is gzip, which lets it compress the response on the fly. If you use a library like cURL, it will tell the server it supports gzip and automatically decompress the response for you.
I hope this helps.
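Concretely, the fgets-to-fread swap in your download loop would look like this (same 4 KB buffer, everything else unchanged):
while (!feof($fs)) {
    $contents = fread($fs, 4096); // byte-count based, so binary-safe; fgets stops at newlines
    fwrite($fm, $contents);
}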
Don't deal with sockets; simplify your code and use the cURL library (PHP cURL), like this:
$url = 'http://' . $host . '/' . $file;
// create a new cURL resource
$fh = fopen($temp_file_name, "w");
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FILE, $fh); // stream the body straight to the file handle
//curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// fetch the URL and write it to the file
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
fclose($fh);
And the final result in case it helps anyone else. I also wrapped the whole thing in a retry loop to decrease the risk of a completely failed download, but it does increase the use of resources:
$download_attempt = 0;
do {
    $info = array('timed_out' => false); // keeps the loop condition defined if fopen() fails
    $fs = fopen('http://' . $host . $file, "rb");
    if (!$fs) {
        $this->writeDebugInfo("FAILED ", 'could not open http://' . $host . $file);
    } else {
        $fm = fopen($temp_file_name, "w");
        stream_set_timeout($fs, 30);
        while (!feof($fs)) {
            $contents = fread($fs, 4096); // Buffered download
            fwrite($fm, $contents);
            $info = stream_get_meta_data($fs);
            if ($info['timed_out']) {
                break;
            }
        }
        fclose($fm);
        fclose($fs);
        if ($info['timed_out']) {
            // Delete temp file if the download timed out
            unlink($temp_file_name);
            $this->writeDebugInfo("FAILED on attempt " . $download_attempt . " - Connection timed out: ", $temp_file_name);
            $download_attempt++;
            if ($download_attempt < 5) {
                $this->writeDebugInfo("RETRYING: ", $temp_file_name);
            }
        } else {
            // Move the temp file into place on success
            $media_file_name = str_replace('temp/', 'media/', $temp_file_name);
            rename($temp_file_name, $media_file_name);
            $this->newDownload = true;
            $this->writeDebugInfo("SUCCESS: ", $media_file_name);
        }
    }
} while ($download_attempt < 5 && $info['timed_out']);

Prevent timeout in PHP

I am working on a PHP script that makes an API call to an external site. However, if this site is unavailable or the request times out, I would like my function to return false.
I have found the following, but I am not sure how to implement it in my script, since I use "file_get_contents" to retrieve the content of the external call.
Limit execution time of an function or command PHP
$fp = fsockopen("www.example.com", 80);
if (!$fp) {
    echo "Unable to open\n";
} else {
    fwrite($fp, "GET / HTTP/1.0\r\n\r\n");
    stream_set_timeout($fp, 2);
    $res = fread($fp, 2000);
    $info = stream_get_meta_data($fp);
    fclose($fp);
    if ($info['timed_out']) {
        echo 'Connection timed out!';
    } else {
        echo $res;
    }
}
(From: http://php.net/manual/en/function.stream-set-timeout.php)
How would you address such an issue? Thanks!
I'd recommend using the cURL family of PHP functions. You can then set the timeout using curl_setopt():
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2); // two-second timeout
This will cause the curl_exec() function to return FALSE after the timeout.
In general, using cURL is better than any of the file reading functions; it's more dependable, has more options and is not regarded as a security threat. Many sysadmins disable remote file reading, so using cURL will make your code more portable and secure.
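A minimal sketch of the full call (the function name and URL are illustrative, not from the question):
function fetchWithCurlTimeout($url, $timeoutSeconds = 2)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeoutSeconds); // limit the connect phase
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeoutSeconds);        // limit the whole request
    $result = curl_exec($ch); // FALSE on timeout or any other failure
    curl_close($ch);
    return $result;
}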
<?php
$fp = fsockopen("www.example.com", 80);
if (!$fp) {
    echo "Unable to open\n";
} else {
    stream_set_timeout($fp, 2); // stream resource, number of seconds till timeout
    // GET YOUR FILE CONTENTS
}
?>
From the PHP manual for file_get_contents (comments):
<?php
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1
    )
));
file_get_contents("http://example.com/", false, $ctx);
?>
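Wrapped into a helper that returns false on timeout or failure, as the question asks (the function name and timeout are placeholders):
function fetchWithStreamTimeout($url, $timeoutSeconds = 2)
{
    $ctx = stream_context_create(array(
        'http' => array('timeout' => $timeoutSeconds)
    ));
    // file_get_contents() returns FALSE on failure; @ suppresses the warning on timeout
    return @file_get_contents($url, false, $ctx);
}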
<?php
// the fifth argument to fsockopen() is the connect timeout (here 4 seconds)
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 4);
if ($fp) {
    // stream_set_timeout() limits subsequent reads and writes (here 2 seconds)
    stream_set_timeout($fp, 2);
}
