Strange timeout behaviour using php soap client - php

I'm trying to make a proxy-like page that forwards an AJAX request to a SOAP server.
The browser sends 2 requests to the same page (i.e. server.php with different query string) every 10 seconds.
The server makes a soap call to the soap server depending on the query string.
All is working fine.
Then I put a sleep(40) in the SOAP server to simulate a slow response, and I also put a timeout on the caller to abort the call after a few seconds.
server.php: Pseudo code:
$timeout = 10;
ini_set("default_socket_timeout", $timeout);
$id = $_GET['id'];
$wsdl = 'http://soapserver/wsdl';
$client = new SoapClient($wsdl,array('connection_timeout'=> $timeout));
print($client->getQuote($id));
If the browser sends an ajax request to http://myserver/server.php?id=IBM
the request stops after the timeout I set.
If I try to make a second call before the first one finishes, the second one does not respect the timeout.
i.e.
Request:
GET http://myserver/server.php?id=IBM
and after 1 second
GET http://myserver/server.php?id=AAP
Response:
after 10 seconds:
No data
after 20 seconds:
No data
I also tried to not use PHP SOAP and use curl instead but I got the same results.
I also tried to open 3 tabs on my browser and call:
http://myserver/server.php?id=IBM
http://myserver/server.php?id=AAP
http://myserver/server.php?id=MSX
The first one stops after 10 seconds, the second after 20 seconds and the third after 30 seconds.
Is this normal behaviour, or am I missing something?
Thanks in advance

You are probably starting sessions, and session_start() blocks a second call to it until the first request has 'freed' the session (in other words: has finished and will not write any more data to it). For time-consuming requests, don't start a session if you don't need one, and if you DO need one, read all the data you need and then call session_write_close() BEFORE doing the time-consuming work. If you need to write to the session afterwards, just call session_start() again.
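A minimal sketch of server.php with the lock released early, based on the pseudo-code in the question (the WSDL URL and getQuote() call are taken from there; the session key is illustrative):

```php
<?php
// Release the session lock before the slow SOAP call, so a second
// request to server.php is not serialized behind this one.
session_start();
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null; // read what you need first
session_write_close();           // frees the lock for concurrent requests

$timeout = 10;
ini_set('default_socket_timeout', $timeout);

$id = $_GET['id'];
$client = new SoapClient('http://soapserver/wsdl',
    array('connection_timeout' => $timeout));
print $client->getQuote($id);

// If you need to write to the session afterwards, reopen it:
// session_start();
// $_SESSION['last_quote_id'] = $id;
```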

Related

PHP NATS Client disconnects after some idle time

I have been using the repejota/phpnats library to develop a NATS client that subscribes to a particular channel. But after connecting, receiving a few messages, and then sitting idle for about 30 seconds, it disconnects itself without any error. However, my Node.js client works fine against the same NATS server.
Here is how I am subscribing...
$c->subscribe(
    'foo',
    function ($message) {
        echo $message->getBody();
    }
);
$c->wait();
Any suggestions/help???
Thanks!
Was this just the default PHP timeout killing it off?
Maybe something like this:
ini_set('max_execution_time', 180); // gives about 3 minutes for example
By default, PHP scripts can't run forever, as PHP should rather be considered stateless. This is by design: the default life span is 30 seconds (hosts usually extend that to 180 secs, but that's irrelevant here). You can extend that time yourself by setting max_execution_time to any value (with 0 meaning "forever"), but that's not recommended unless you know you want it. Otherwise, a commonly used approach is to make the script invoke itself (i.e. via a GET request), often passing some parameters so the invoked script can resume where the caller finished.
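The self-invoking approach from that last sentence could be sketched like this (everything here is illustrative, not from any library):

```php
<?php
// Each request processes one chunk, then redirects to itself with the
// next offset, so no single request runs anywhere near max_execution_time.
$items  = range(1, 1000);                       // stand-in for the real work queue
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunk  = 100;

foreach (array_slice($items, $offset, $chunk) as $item) {
    // ... time-consuming work for $item ...
}

if ($offset + $chunk < count($items)) {
    // hand the remaining work to a fresh request
    header('Location: ' . $_SERVER['PHP_SELF'] . '?offset=' . ($offset + $chunk));
    exit;
}
```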
$options = new ConnectionOptions();
$options->setHost('127.0.0.1')->setPort(4222);
$client = new Connection($options);
$client->connect(-1);
You need to pass -1 as the parameter to connect().
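Putting both suggestions together (this assumes the repejota/phpnats library is installed via Composer; the class names follow its Nats namespace as used in the snippets above):

```php
<?php
require 'vendor/autoload.php';

// Lift PHP's execution limit so the long-running subscriber isn't killed,
// and pass -1 to connect() so the client waits on the socket indefinitely.
ini_set('max_execution_time', 0);   // 0 = no limit; use deliberately

$options = new \Nats\ConnectionOptions();
$options->setHost('127.0.0.1')->setPort(4222);

$client = new \Nats\Connection($options);
$client->connect(-1);               // -1: no socket timeout while idle

$client->subscribe('foo', function ($message) {
    echo $message->getBody();
});
$client->wait();
```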

High traffic connection between Php and Redis

I have a PHP backend that works with Redis.
But when traffic increased to more than 2000 requests per second, I started receiving an error:
99 - Cannot assign requested address
All sockets are in TIME_WAIT.
Connecting example:
$this->_socket = @stream_socket_client(
    'tcp://' . $this->hostname . ':' . $this->port,
    $errorNumber,
    $errorDescription,
    ini_get('default_socket_timeout'),
    STREAM_CLIENT_CONNECT | STREAM_CLIENT_PERSISTENT
);
I found a solution: http://redis4you.com/articles.php?id=012&name=redis
But I can't set /proc/sys/net/ipv4/tcp_tw_recycle to 1, because I don't want to lose packets on the network between the application and Redis.
PHP creates a new socket for each incoming API request.
Any ideas?
I don't know your whole design, but here something you could do :
Create a PHP page that always run (with a while(true) loop)
This page would wait for content from your initial page (where the socket code was before)
Using the pipelining technique, you would send all requests using the same socket.
Only thing missing is how to pass data from the initial page to this new page.
For that last part I see multiple solutions (not sure if they all work though) :
Using APC to store the data from the initial page and read it back from the new one.
Creating a SESSION in the new page, which would then have two modes: Processing and Submitting. You should then call this page via your local server from inside the initial page.
In both solutions, one instance of this new page must be running locally so that the 'Processing/Waiting' mode is active.
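A rough sketch of the pipelining idea over one persistent socket (it speaks the simple inline form of the Redis protocol; a real implementation would use proper RESP reply parsing, and the host/port are placeholders):

```php
<?php
// One persistent socket for many commands: several commands are written
// in a single batch and the replies read back in order, so a high request
// rate no longer means one fresh TCP connection per command.
$sock = @stream_socket_client(
    'tcp://127.0.0.1:6379',
    $errno,
    $errstr,
    ini_get('default_socket_timeout'),
    STREAM_CLIENT_CONNECT | STREAM_CLIENT_PERSISTENT
);

// Pipeline three commands in one write (inline command form).
fwrite($sock, "PING\r\nPING\r\nPING\r\n");

// Read exactly one reply line per command sent.
for ($i = 0; $i < 3; $i++) {
    echo trim(fgets($sock)), "\n";   // "+PONG" for each PING
}
```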
Fixed the problem.
I enabled TCP reuse, set the socket TIME_WAIT interval to 10 seconds, and had PHP work with the socket in persistent mode:
STREAM_CLIENT_CONNECT | STREAM_CLIENT_PERSISTENT
So even at 2000 requests per second it uses no more than 61 sockets.

Timeout a PHP function after 5 seconds

I am making an API call to a service and need to timeout the call after 5 seconds and consider it a "fail" then proceed with the code. If it times out I want to save this to a $timeoutResult variable and then pass that all the way back to the javascript (I can do this part).
I'm just not sure how to do a timed function in PHP. I've seen the documentation on set_time_limit(5) but I'm not sure how to do it?
For example:
$response = $api_calls->apiCall($endpoint, $data);
If this takes >5 seconds, I want it to quit / consider the call a "fail" and then proceed to my error handling further down the code.
I'm not sure how to stop the execution of THIS function by considering it a fail and proceeding.
Would something like this work?
set_time_limit(5);
$response = $api_calls->apiCall($endpoint, $data);
set_time_limit(0);
This way I set a timeout (which begins when this function inside a function is being called), it tries to execute, and if it finishes it then sets the time out back to infinity?
My cURL settings in apiCall() has a standard timeout of 10 seconds, but for this one particular call I need it to timeout after 5 seconds and then display an error if it times out.
You've not shown the code which actually makes the api call!
While it's possible to set a watchdog timer (SIGALRM), this is only an option on a POSIX system and only when running under the CLI SAPI.
You mention that the code uses curl. curl has lots of options for controlling timeouts - _CONNECTTIMEOUT[_MS], _LOW_SPEED_LIMIT, _LOW_SPEED_TIME and _TIMEOUT[_MS], all documented in the manual.
I added an additional parameter to my apiCall() function which accepts an array.
I then looped through this array using
if (isset($extra_curl_options) && count($extra_curl_options) > 0) {
    foreach ($extra_curl_options as $k => $v) {
        $http_request->setOption(constant($k), $v);
    }
}
This will allow me to pass in multiple curl options to the apiCall for the future.
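For reference, a per-call 5-second cap with plain curl might look like this (the option names are standard curl constants; the endpoint URL is a placeholder):

```php
<?php
// Cap this one request at 5 seconds total, independent of the
// default 10-second timeout used for other calls.
$ch = curl_init('https://api.example.com/endpoint'); // placeholder URL
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_CONNECTTIMEOUT => 2,   // seconds allowed to establish the connection
    CURLOPT_TIMEOUT        => 5,   // seconds allowed for the whole transfer
));

$response = curl_exec($ch);
if ($response === false && curl_errno($ch) === 28) { // 28 = operation timed out
    $timeoutResult = 'fail';       // the flag the question wants to propagate
}
curl_close($ch);
```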

Setting a socket timeout in PHP

I'm writing a script in PHP that uses PEAR's Net_Socket. I want to query servers to see if they have any current information. I send a command and then use $socket->readLine() to get the response. However, if there is no response, my script just waits forever. Is there any way to either tell the socket to close after a specific amount of time, or to wrap the whole function in a timeout so that, if it hasn't returned by then, it halts execution?
On the same page you linked is a link to setTimeout(): https://pear.php.net/manual/en/package.networking.net-socket.settimeout.php
Try calling $socket->setTimeout( $seconds, $microseconds ); just before calling readLine().
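A sketch of that suggestion (the host, port, and STATUS command are placeholders; per the linked PEAR docs, the second setTimeout() argument is the sub-second part, and failures come back as PEAR_Error objects):

```php
<?php
require_once 'Net/Socket.php';           // PEAR Net_Socket

$socket = new Net_Socket();
$socket->connect('example.com', 1234);   // placeholder server

// Give readLine() at most 5 seconds before it returns with an error.
$socket->setTimeout(5, 0);
$socket->writeLine('STATUS');            // placeholder query command
$line = $socket->readLine();

if (PEAR::isError($line)) {
    // timed out (or other socket error): treat as "no current information"
}
```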

PHP - set time limit effectively

I have a simple script that redirects to the mobile version of a website if it detects that the user is browsing on a mobile phone. It uses the Tera-WURFL webservice to accomplish that, and it will be hosted separately from Tera-WURFL itself. I want to protect it in case of Tera-WURFL hosting downtime. In other words, if my script takes more than a second to run, stop executing it and just redirect to the regular website. How can I do this effectively (so that the CPU is not overly burdened by the script)?
EDIT: It looks like the TeraWurflRemoteClient class has a timeout property. Read below. Now I need to find out how to include it in my script so that it redirects to the regular website in case of this timeout.
Here is the script:
// Instantiate a new TeraWurflRemoteClient object
$wurflObj = new TeraWurflRemoteClient('http://my-Tera-WURFL-install.pl/webservicep.php');
// Define which capabilities you want to test for. Full list: http://wurfl.sourceforge.net/help_doc.php#product_info
$capabilities = array("product_info");
// Define the response format (XML or JSON)
$data_format = TeraWurflRemoteClient::$FORMAT_JSON;
// Call the remote service (the first parameter is the User Agent - leave it as null to let TeraWurflRemoteClient find the user agent from the server global variable)
$wurflObj->getCapabilitiesFromAgent(null, $capabilities, $data_format);
// Use the results to serve the appropriate interface
if ($wurflObj->getDeviceCapability("is_tablet") || !$wurflObj->getDeviceCapability("is_wireless_device") || $_GET["ver"] == "desktop") {
    header('Location: http://website.pl/'); // default index file
} else {
    header('Location: http://m.website.pl/'); // where to go
}
?>
And here is the source of TeraWurflRemoteClient.php that is being included. It has an optional timeout argument, as mentioned in the documentation:
// The timeout in seconds to wait for the server to respond before giving up
$timeout = 1;
The TeraWurflRemoteClient class has a timeout property, and it is 1 second by default, as I see in the documentation.
So this script won't execute for longer than a second.
Try achieving this by setting a very short timeout on the HTTP request to TeraWurfl inside their class, so that if the response doesn't come back in like 2-3 secs, consider the check to be false and show the full website.
The place to look for setting a shorter timeout might vary depending on the transport you use to make your HTTP request. Like in Curl you can set the timeout for the HTTP request.
After this, reset your HTTP request timeout back to what it was so that you don't affect any other code.
Also I found this while researching on it, you might want to give it a read, though I would say stay away from forking unless you are very well aware of how things work.
And just now Adelf posted that TeraWurflRemoteClient class has a timeout of 1 sec by default, so that solves your problem but I will post my answer anyway.
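Combining these points, the protected script could look roughly like this (it assumes failures surface as exceptions; adapt the error check to however TeraWurflRemoteClient actually reports errors, and note the class names and URLs come from the question's code):

```php
<?php
// Keep the webservice call on a short leash and fall back to the full
// site if it is slow or down.
ini_set('default_socket_timeout', 1);   // caps the client's HTTP request

$mobile = false;                        // safe default: desktop site
try {
    $wurflObj = new TeraWurflRemoteClient('http://my-Tera-WURFL-install.pl/webservicep.php');
    $wurflObj->getCapabilitiesFromAgent(null, array("product_info"),
        TeraWurflRemoteClient::$FORMAT_JSON);
    $mobile = $wurflObj->getDeviceCapability("is_wireless_device")
           && !$wurflObj->getDeviceCapability("is_tablet")
           && $_GET["ver"] != "desktop";
} catch (Exception $e) {
    // webservice unreachable or too slow: serve the regular website
}

header('Location: ' . ($mobile ? 'http://m.website.pl/' : 'http://website.pl/'));
```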
