Quick and reliable HTTP call from PHP script

I am adding some external HTTP calls (for internal status monitoring) to a large PHP application which is already very complex and prone to errors. The HTTP call should be made quickly and without raising errors/exceptions. It is okay for HTTP calls to fail.
My first thought was to use cURL, but it is not installed on the server. cURL would have let me suppress errors, set timeouts, and prevent unnecessary blocking if the status server is unreachable or slow.
I know of several built-in PHP functions that can make an HTTP request (and these are enabled on the server): file(), file_get_contents(), http_get(). I can prefix a call with @ to suppress errors, but if the monitoring server is unreachable the request will still hang the script for several seconds. Is there a way to set a timeout?

You can set a timeout, as the documentation and comments for file_get_contents() say:
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1
    )
));
file_get_contents("http://example.com/", false, $ctx);
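Putting the pieces from the question together, a minimal fire-and-forget sketch might look like this. The endpoint URL and the 1-second timeout are placeholders; also note the http wrapper's 'timeout' context option governs the read timeout, while the connect phase may still fall back to default_socket_timeout depending on PHP version.

```php
<?php
// Best-effort status ping: short timeout, errors suppressed.
// It is explicitly okay for this call to fail.
function ping_status_server(string $url): bool
{
    $ctx = stream_context_create(array(
        'http' => array(
            'method'  => 'GET',
            'timeout' => 1, // give up after ~1 second of silence
        ),
    ));
    // @ suppresses the warning if the server is down; failure returns false.
    $result = @file_get_contents($url, false, $ctx);
    return $result !== false;
}

// Hypothetical monitoring endpoint; the return value is deliberately ignored.
ping_status_server('http://127.0.0.1:8125/ping');
```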

Related

PHP SOAP awfully slow: Response only after reaching fastcgi_read_timeout

The constructor of the SoapClient in my WSDL web service is awfully slow; I get the response only after the fastcgi_read_timeout value in my Nginx config is reached. It seems as if the remote server is not closing the connection. I also need to set it to a minimum of 15 seconds, otherwise I get no response at all.
I already read similar posts here on SO, especially this one:
PHP SoapClient constructor very slow and its linked threads, but I still cannot find the actual cause of the problem.
This is the part which takes 15+ seconds:
$client = new SoapClient("https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl");
It seems to be slow only when called from my PHP script, because the file opens instantly when accessed from any of the following locations:
wget from my server which is running the script
SoapUI or Postman (but I don't know whether they had cached it)
opening the URL in a browser
Ports 80 and 443 in the firewall are open. Following the suggestion from another thread, I found two workarounds:
Loading the wsdl from a local file => fast
Enabling the wsdl cache and using the remote URL => fast
But still I'd like to know why it doesn't work with the original URL.
It seems as if the web service does not close the connection, or in other words, I get the response only after reaching the timeout set in my server config. I tried setting keepalive_timeout 15; in my Nginx config, but it does not work.
Is there any SOAP/PHP parameter which forces the server to close the connection?
I was able to reproduce the problem, and found the solution (it works, though maybe it's not the best) in the accepted answer of a question linked from the question you referenced:
PHP: SoapClient constructor is very slow (takes 3 minutes)
As per the answer, you can adjust the HTTP headers using the stream_context option.
$client = new SoapClient("https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl", array(
    'stream_context' => stream_context_create(
        array(
            'http' => array(
                'protocol_version' => '1.0',
                'header' => 'Connection: Close'
            )
        )
    )
));
More information on the stream_context option is documented at http://php.net/manual/en/soapclient.soapclient.php
I tested this using PHP 5.6.11-1ubuntu3.1 (cli)

set fopen() deadline with Wordpress on google-app-engine

I have WordPress 3.8 running on Google App Engine. Everything works fine except the PayPal return page with the s2Member® plugin. I think it's related to an fopen() or URL-fetch error.
The Server Scan by s2Member® (http://www.s2member.com/kb/server-scanner) in my application reports the following issue:
[ERROR] cURL Extension / Or fopen() URL One or more HTTP connection
tests failed against localhost. Cannot connect to self over HTTP —
possible DNS resolution issue. Can't connect to:
http://foto-box.appspot.com
In order to run s2Member®, your installation of PHP needs one of the
following...
Either the cURL extension for remote communication via PHP (plus the OpenSSL extension for PHP).
Or, set: allow_url_fopen = on in your php.ini file (and enable the OpenSSL extension for PHP).
The app-engine Log report is:
PHP Warning: file_get_contents(http://foto-box.appspot.com): failed
to open stream: Request deadline exceeded in
/base/data/home/apps/s~foto-box/3.372404596384852247/wordpress/s2-server-scanner.php
on line 1002
I know there is no cURL on App Engine, but fopen() should work by default.
How exactly do I modify the deadline to figure out whether that is the problem?
Where do I have to include
deadline=60
or
$options = ["http" => ["timeout" => 60]];
$context = stream_context_create($options);
$data = file_get_contents("http://foo.bar", false, $context);
in my WordPress or App Engine files to increase the timeout? php.ini, index.php, or wp-config.php?
I had a look at the script: you can change the timeout on line 1000. It is currently 5 seconds; change it to something like 30 seconds.
if(is_resource($_fopen_test_resource = stream_context_create(array('http' => array('timeout' => 5.0, 'ignore_errors' => FALSE)))))
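A sketch of what that line could look like after the change, in isolation. The 30-second value is only an example; the variable name is reproduced from the line quoted above:

```php
<?php
// The scanner's timeout context, raised from 5.0 to 30.0 seconds.
// 30.0 is an arbitrary example value; pick whatever deadline suits App Engine.
$_fopen_test_resource = stream_context_create(array(
    'http' => array(
        'timeout'       => 30.0,  // was 5.0
        'ignore_errors' => FALSE,
    ),
));
var_dump(is_resource($_fopen_test_resource)); // the scanner tests this
```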
P.S. It might be a good idea not to run arbitrary scripts that you download from the internet. Just saying.

Proper and fast way to TELNET in PHP. Sockets or cURL

Almost all examples of TELNET implementations in PHP use sockets (fsockopen). That does not work for me because it takes an unacceptable amount of time (~60 seconds).
I have tried fsockopen for other purposes and found it slow compared to cURL.
Question #1: Why are sockets that slow?
Update: I found that we need to use the stream_set_timeout() function, which lets us control the socket execution time. I'm curious how to set a proper timeout, or how to make it "stop waiting" once the response is received.
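The stream_set_timeout() approach described above could be sketched like this. The host, port, command, and the 5 s / 2 s timeouts are all placeholder assumptions:

```php
<?php
// Sketch: fsockopen with an explicit connect timeout plus a read timeout.
function telnet_query(string $host, int $port, string $cmd): ?string
{
    // 5-second connect timeout; failure returns null instead of hanging
    $fp = @fsockopen($host, $port, $errno, $errstr, 5);
    if ($fp === false) {
        return null;
    }
    // 2-second read timeout: fgets() stops waiting after 2 s of silence
    stream_set_timeout($fp, 2);

    fwrite($fp, $cmd . "\r\n");

    $response = '';
    while (($line = fgets($fp)) !== false) {
        $response .= $line;
        $meta = stream_get_meta_data($fp);
        if ($meta['timed_out']) {
            break; // stop waiting instead of blocking forever
        }
    }
    fclose($fp);
    return $response;
}

// Usage (placeholder host and command):
// $out = telnet_query('192.0.2.10', 23, 'show status');
```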
I can't get the same thing implemented with cURL. Where should I put the commands that I need to send to telnet? Is CURLOPT_CUSTOMREQUEST the proper option? I'm doing something like this:
class TELNETcURL {
    public $errno;
    public $errstr;
    private $curl_handle;
    private $curl_options = array(
        CURLOPT_URL => "telnet://XXX.XXX.XXX.XXX:<port>",
        CURLOPT_TIMEOUT => 40,
        CURLOPT_RETURNTRANSFER => TRUE,
        CURLOPT_HEADER => FALSE,
        CURLOPT_PROTOCOLS => CURLPROTO_TELNET
    );

    function __construct() {
        $this->curl_handle = curl_init();
        curl_setopt_array($this->curl_handle, $this->curl_options);
    }

    public function exec_cmd($query) {
        curl_setopt($this->curl_handle, CURLOPT_CUSTOMREQUEST, $query . "\r\n");
        $output = curl_exec($this->curl_handle);
        return $output;
    }

    function __destruct() {
        curl_close($this->curl_handle);
    }
}
And then something similar to this:
$telnet = new TELNETcURL();
print_r($telnet->exec_cmd("<TELNET commands go here>"));
I am getting "Max execution time exceeded 30 seconds" on the curl_exec() call.
Question #2: What is wrong with the cURL implementation?
What you need to do is use non-blocking I/O and then poll for the response. What you are doing now is waiting for a response that never comes, hence the timeout.
Personally, I've written a lot of socket apps in PHP and they work great. I detest cURL as buggy, cumbersome, and highly insecure; just read its bug list and you will be appalled.
Go read the excellent PHP manual, complete with many examples of how to do polled I/O; it even gives you an example telnet server and client.
Sockets are not slow; sockets are the basis of communication. cURL uses sockets to open a connection to the remote server. Everything works on sockets (I think).
I don't think you can use cURL for a telnet service. Well, that's not entirely true: I guess you can connect and send a single command. cURL was designed with the HTTP protocol in mind, which is stateless (you open a connection, send a request, wait for a reply, and then close the connection).
Sockets are the only option.
I am getting "Max execution time exceeded 30 seconds" on curl_exec command.
My guess is that the remote server is the culprit. Check whether it works using a regular terminal client, or increase max_execution_time in php.ini.
UPDATE
It seems it is possible to use cURL for telnet; check this:
http://www.cs.sunysb.edu/documentation/curl/
But I still think you are better off using sockets.
Use pfsockopen() instead of fsockopen(); it's much faster and keeps the connection alive the whole way.

Detecting no available server with PHP and Gearman

I'm currently making use of Gearman with PHP using the standard bindings (docs here). All is functioning fine, but I have one small issue: I am unable to detect when a call to GearmanClient::addServer (docs here) is "successful", by which I mean...
The issue is that adding the server attempts no socket I/O, meaning the server may not actually exist or be operational. As a result, subsequent calls (in the scenario where the server does not in fact exist) fail and raise PHP warnings.
Is there any way, or what is the best way, to confirm that the Gearman Daemon is operational on the server before or after adding it?
I would like to achieve this so that I can reliably handle scenarios in which Gearman has died, or the server is uncontactable.
Many thanks.
We first tried this by manually calling fsockopen on the host and port passed to addServer, but it turns out that this can leave a lot of hanging connections as the Gearman server expects something to happen over that socket.
We use a monitor script to check the status of the daemon and its workers — something similar to this perl script on Google Groups. We modified the script to restart the daemon if it was not running.
If this does not appeal, have a look at the Gearman protocol (specifically the "Administrative Protocol" section, referenced in the above thread) and use the status command. This gives you information on the status of jobs and workers, and it also means you can make a socket connection to the daemon without leaving it hanging.
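A minimal sketch of that idea, assuming the default gearmand port 4730 and 2-second timeouts (both are assumptions); it sends the plain-text status command and closes the socket cleanly:

```php
<?php
// Probe a gearmand admin port with the text-based "status" command.
function gearman_is_alive(string $host, int $port = 4730): bool
{
    $fp = @fsockopen($host, $port, $errno, $errstr, 2);
    if ($fp === false) {
        return false; // daemon unreachable
    }
    stream_set_timeout($fp, 2);
    fwrite($fp, "status\n");
    // The admin protocol answers one line per function, terminated by ".";
    // getting any line back at all means the daemon is up.
    $line = fgets($fp);
    fclose($fp); // close cleanly so no connection is left hanging
    return $line !== false;
}

// Usage (placeholder host):
// var_dump(gearman_is_alive('10.0.0.1'));
```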
You can use this library: https://github.com/necromant2005/gearman-stats
It has no external dependencies.
$adapter = new \TweeGearmanStat\Queue\Gearman(array(
    'h1' => array('host' => '10.0.0.1', 'port' => 4730, 'timeout' => 1),
    'h2' => array('host' => '10.0.0.2', 'port' => 4730, 'timeout' => 1),
));
$status = $adapter->status();
var_dump($status);

file_get_contents returns empty string

I hesitated to ask this question because it looks weird.
But anyway.
Just in case someone has already run into the same problem...
The filesystem functions (fopen, file, file_get_contents) behave very strangely with the http:// wrapper:
It seemingly works: no errors are raised and fopen() returns a resource.
It returns no data for known-working URLs (e.g. http://google.com/): file() returns an empty array, file_get_contents() returns an empty string, and fread() returns false.
For intentionally wrong URLs (e.g. http://goog973jd23le.com/) it behaves exactly the same, save for a short [supposedly DNS-lookup] delay, after which I get no error (though I should!) but an empty string.
allow_url_fopen is turned on.
curl (both the command-line tool and the PHP extension) works fine, all other utilities and applications work fine, and local files open fine.
That known error seems inapplicable because in my case it fails for every URL and host.
php-fpm 5.2.11
Linux version 2.6.35.6-48.fc14.i686 (mockbuild#x86-18.phx2.fedoraproject.org)
I fixed this issue on my server (running PHP 5.3.3 on Fedora 14) by removing the --with-curlwrapper from the PHP configuration and rebuilding it.
Sounds like a bug. But just for posterity, here are a few things you might want to debug.
allow_url_fopen: already tested
PHP under Apache might behave differently from PHP-CLI, which would hint at chroot/SELinux/FastCGI/etc. security restrictions
local firewall: unlikely since curl works
user-agent blocking: this is quite common actually, websites block crawlers and unknown clients
transparent proxy from your ISP, which either mangles or blocks requests (a PHP user agent, or the absence of one, could be interpreted as malware)
PHP stream wrapper problems
Anyway, first let's prove that PHP's stream handlers are functional:
<?php
if (!file_get_contents("data:,ok")) {
die("Houston, we have a stream wrapper problem.");
}
Then try to see whether PHP makes real HTTP requests at all. First open netcat on the console:
nc -l 8000
And debug with just:
<?php
print file_get_contents("http://localhost:8000/hello");
And from here you can try to communicate with PHP and see if anything returns when you vary the response. Enter an invalid response first into netcat. If no error is thrown, your PHP package is borked.
(You might also try communicating over a "tcp://.." handle then.)
Next up is experimenting with the http stream wrapper parameters. Use http://example.com/ literally; it is known to work and never blocks user agents.
$context = stream_context_create(array("http" => array(
    "method" => "GET",
    "header" => "Accept: xml/*, text/*, */*\r\n",
    "ignore_errors" => false,
    "timeout" => 50,
)));
print file_get_contents("http://www.example.com/", false, $context, 0, 1000);
I think ignore_errors is very relevant here. But check out http://www.php.net/manual/en/context.http.php and specifically try setting protocol_version to 1.1 (you will get a chunked and misinterpreted response, but at least we'll see if anything returns).
If even this remains unsuccessful, then try to hack the http wrapper.
<?php
ini_set("user_agent" , "Mozilla/3.0\r\nAccept: */*\r\nX-Padding: Foo");
This will not only set the User-Agent but also inject extra headers. If there is a processing issue with constructing the request within the http stream wrapper, then this could eventually catch it.
Otherwise try disabling any Zend extensions, Suhosin, Xdebug, APC, and other core modules. There could be interference. Failing that, this is potentially an issue specific to the Fedora package; try a newer version and see whether it persists on your system.
When you use the http stream wrapper, PHP creates an array called $http_response_header after file_get_contents() (or any of the other f-family functions) is called. It contains useful info on the state of the response. Could you do a var_dump() of this array and see whether it gives you any more info on the response?
It's a really weird error you're getting. The only thing I can think of is that something else on the server is blocking HTTP requests from PHP, but then I can't see why cURL would still be OK...
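A small sketch of that check, wrapped in a helper (the helper name is made up). On failure the wrapper never sets the variable, so we guard against it being undefined:

```php
<?php
// Capture $http_response_header after an http-wrapper call.
// The variable is populated in the scope where file_get_contents() was
// called, and only when the request actually went out.
function response_headers(string $url): array
{
    $body = @file_get_contents($url);
    return isset($http_response_header) ? $http_response_header : array();
}

// An unreachable URL simply yields an empty array:
var_dump(response_headers('http://127.0.0.1:1/')); // prints array(0) {}
```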
Is http stream registered in your PHP installation? Look for "Registered PHP Streams" in your phpinfo() output. Mine says "https, ftps, compress.zlib, compress.bzip2, php, file, glob, data, http, ftp, phar, zip".
If there is no http, set allow_url_fopen to on in your php.ini.
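A quick runtime version of that check, mirroring the "Registered PHP Streams" line in phpinfo():

```php
<?php
// List the registered stream wrappers and warn if http is missing.
$wrappers = stream_get_wrappers();
sort($wrappers);
echo "Registered PHP streams: " . implode(', ', $wrappers) . "\n";

if (!in_array('http', $wrappers, true)) {
    echo "http wrapper missing; set allow_url_fopen = On in php.ini\n";
}
```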
My problem was solved by dealing with SSL:
$arrContextOptions = array(
    "ssl" => array(
        "verify_peer" => false,
        "verify_peer_name" => false,
    ),
);
$context = stream_context_create($arrContextOptions);
$jsonContent = file_get_contents("https://www.yoursite.com", false, $context);
What does a test with fsockopen tell you?
Is the test isolated from other code?
I had the same issue on Windows after installing XAMPP 1.7.7. Eventually I managed to solve it by adding the following line to php.ini (while keeping allow_url_fopen = On):
extension=php_openssl.dll
Use http://pear.php.net/reference/PHP_Compat-latest/__filesource/fsource_PHP_Compat__PHP_Compat-1.6.0a2CompatFunctionfile_get_contents.php.html, rename it, and test whether the error occurs with this rewritten function.
