Setting a socket timeout in PHP

I'm writing a script in PHP that uses PEAR's Net_Socket. I want to query servers to see if they have any current information. I send a command and then use $socket->readLine() to get the response. However, if there is no response, my script just waits forever. Is there any way to tell the socket to close after a specific amount of time, or to wrap the whole call in a timeout so that it halts if it hasn't returned in time?

On the same page you linked there is a link to setTimeout(): https://pear.php.net/manual/en/package.networking.net-socket.settimeout.php
Try calling $socket->setTimeout($seconds, $microseconds); just before calling readLine().
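A minimal sketch of the whole flow, with a hypothetical host, port, and command (anything not in the Net_Socket docs below is an assumption):
<?php
require_once 'Net/Socket.php';

$socket = new Net_Socket();
$result = $socket->connect('example.com', 1234);   // host and port are assumptions
if (PEAR::isError($result)) {
    die($result->getMessage());
}

// give up on blocking reads after 5 seconds
$socket->setTimeout(5, 0);

$socket->writeLine('STATUS');   // 'STATUS' is a made-up command
$line = $socket->readLine();
if (PEAR::isError($line) || $line === '') {
    // depending on the Net_Socket version, a timed-out read surfaces
    // as a PEAR_Error or as an empty read
    echo "No response before the timeout\n";
}
$socket->disconnect();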

Related

PHP NATS Client disconnects after some idle time

I have been using the repejota/phpnats library to develop a NATS client that subscribes to a particular channel. But after connecting and receiving a few messages, some 30 seconds of idle time causes it to disconnect itself without any interruption from me. My Node.js client works fine against the same NATS server, however.
Here is how I am subscribing...
$c->subscribe(
    'foo',
    function ($message) {
        echo $message->getBody();
    }
);
$c->wait();
Any suggestions or help? Thanks!
Was this just the default PHP timeout killing it off?
Maybe something like this:
ini_set('max_execution_time', 180); // three minutes, for example
By default, PHP scripts can't live forever; PHP is meant to be treated as stateless and short-lived. This is by design, and the default life span is 30 seconds (hosts usually extend that to 180 seconds, but that's not really relevant). You can extend the limit yourself by setting max_execution_time to any value (0 meaning "forever"), but that's not recommended unless you know you need it. Otherwise, a commonly used approach is to have the script invoke itself (e.g., via a GET request), passing parameters so the new invocation resumes where the previous one finished; a sketch of that pattern follows.
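A rough sketch of the self-reinvoking pattern; the script name, the offset parameter, and the chunk helpers are all assumptions:
<?php
// worker.php - do one bounded chunk of work, then hand off to a fresh request
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$offset = processNextChunk($offset);   // hypothetical: one bounded unit of work

if (!workIsDone($offset)) {            // hypothetical completion check
    // fire-and-forget: open the next request and close without waiting
    $fp = fsockopen('localhost', 80, $errno, $errstr, 5);
    if ($fp) {
        fwrite($fp, "GET /worker.php?offset={$offset} HTTP/1.1\r\n"
                  . "Host: localhost\r\nConnection: Close\r\n\r\n");
        fclose($fp);
    }
}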
$options = new ConnectionOptions();
$options->setHost('127.0.0.1')->setPort(4222);
$client = new Connection($options);
$client->connect(-1);
You need to pass -1 as the connect parameter.

PHP exec - echo output line by line during progress

I'm trying to find a way to echo the output of an exec call and flush it to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then converts the file, if it is not the appropriate file type, using FFMPEG. I am doing this on a Windows machine. Currently my command looks like this:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec($cmd, $output);
I need something like this:
while ($output) {
    print_r($output);
    ob_flush();
    flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly; it just doesn't update the page while converting. I'd like to have some output so the person knows what's going on.
I've set the timeout:
set_time_limit(10 * 60); // 10 minute timeout
and would be very grateful if someone could point me in the right direction. I've looked at a number of solutions on Stack Overflow that come close, but none seem to have worked.
Since the exec call is a blocking call, you have no way of using buffers to get status.
Instead, you could redirect the output of the system call to a log file, then let the client query the server for progress updates; the server can parse the last lines of the log file and send information about the current progress back to the client.
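A rough sketch of that idea, with both scripts shown in one block for brevity (the file names, paths, and the Windows "start /B" launch incantation are all assumptions):
<?php
// convert.php - start ffmpeg in the background, output going to a log file
$log = "..\\..\\uploads\\{$filename}.log";
$cmd = "ffmpeg.exe -i ..\\..\\uploads\\{$filename} ..\\..\\uploads\\{$filename}.m4v > {$log} 2>&1";
pclose(popen("start /B " . $cmd, "r"));

// progress.php - the browser polls this and shows the last line of the log
$lines = @file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
header('Content-Type: text/plain');
echo $lines ? end($lines) : 'No progress yet';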
exec() is a blocking call and will NOT return control to PHP until the external program has terminated. That means you cannot dump the output line by line, because PHP is suspended while the external app is running.
For what you want, you need proc_open(), which gives you pipes you can read from in a loop, e.g.:
$descriptors = array(1 => array('pipe', 'w'));   // capture child stdout
$proc = proc_open($cmd, $descriptors, $pipes);
while ($line = fgets($pipes[1])) {   // the 2>&1 in $cmd sends stderr here too
    print($line);
    flush();
}
proc_close($proc);
There are two problems with this approach:
The first is that, as #Marc B notes, exec() will block until the external program has finished, so you'll have to devise some other way of measuring progress.
The second is that using ob_flush() this way amounts to holding the connection between server and client open and dribbling the data out a little at a time. This is not something the HTTP protocol was designed for, and while it might work sometimes, it's not going to work consistently: different browsers and different servers will time out differently. The better way is via AJAX calls: using JavaScript's setTimeout() (or setInterval()), call the server periodically and have the server send back a progress report.

Strange timeout behaviour using php soap client

I'm trying to make a proxy-like page that forwards an AJAX request to a SOAP server.
The browser sends 2 requests to the same page (i.e. server.php with different query string) every 10 seconds.
The server makes a soap call to the soap server depending on the query string.
All is working fine.
Then I put a sleep (40 seconds) in the SOAP server to simulate a slow response, and I also put a timeout on the caller to abort the call after a few seconds.
server.php (pseudo code):
$timeout = 10;
ini_set("default_socket_timeout", $timeout);
$id = $_GET['id'];
$wsdl = 'http://soapserver/wsdl';
$client = new SoapClient($wsdl, array('connection_timeout' => $timeout));
print($client->getQuote($id));
If the browser sends an AJAX request to http://myserver/server.php?id=IBM, the request stops after the timeout I set. But if I make a second call before the first one stops, the second one does not respect the timeout.
i.e.
Request:
GET http://myserver/server.php?id=IBM
and after 1 second
GET http://myserver/server.php?id=AAP
Response:
after 10 seconds:
No data
after 20 seconds:
No data
I also tried to not use PHP SOAP and use curl instead but I got the same results.
I also tried to open 3 tabs on my browser and call:
http://myserver/server.php?id=IBM
http://myserver/server.php?id=AAP
http://myserver/server.php?id=MSX
The first one stops after 10 seconds, the second after 20 seconds and the third after 30 seconds.
Is this normal behaviour, or am I missing something?
Thanks in advance
You are probably starting sessions, and session_start() blocks a second call to it until the other request has 'freed' the session (in other words: has finished and will not write any more data to it). For time-consuming requests, don't start a session if you don't need one; and if you DO need one, get all the data you need, then call session_write_close() BEFORE doing the time-consuming work. If you need to write to the session afterwards, just call session_start() again.
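Applied to the question's server.php, the fix would look roughly like this (assuming the session is only read, not written, during the slow call):
<?php
session_start();
$id = $_GET['id'];
// read whatever you need from $_SESSION here, then release the lock
session_write_close();

$timeout = 10;
ini_set("default_socket_timeout", $timeout);
$client = new SoapClient('http://soapserver/wsdl',
                         array('connection_timeout' => $timeout));
print($client->getQuote($id));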

Can a PHP script abort itself or detect abortion?

PHP scripts can continue executing after the HTTP page request, so how do I finally stop it executing when I'm done with it?
Also, is there an event to detect when the OS is about to forcibly abort the script? Or how do I keep an internal timer to predict max_execution_time?
exit()/die() will stop a PHP script.
To know when to stop the script, use microtime() as a timer and store the maximum execution time as a constant (or fetch it from php.ini with ini_get('max_execution_time')).
You can also look at the connection handling functions, such as connection_aborted() and connection_status().
But what is the problem you're trying to solve?
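For example, a sketch of polling connection_aborted() inside a long loop; the work and save helpers are hypothetical:
<?php
ignore_user_abort(true);          // keep running after an abort so we can clean up
while ($row = fetchNextRow()) {   // fetchNextRow() is hypothetical
    process($row);                // hypothetical
    echo '.';
    flush();                      // output must actually be sent for PHP to notice
    if (connection_aborted()) {
        saveProgress();           // hypothetical: persist state, then stop
        exit;
    }
}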
You can use register_shutdown_function to register a function to be called at the end of script execution. You can use this to detect when the script has been terminated and perform a certain set of actions.
To have a callback at the moment your request is shutting down, use register_shutdown_function('myfunction').
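A minimal sketch; the shutdown callback runs whether the script ends normally, via exit()/die(), or on a fatal error or timeout:
<?php
register_shutdown_function(function () {
    $error = error_get_last();
    if ($error !== null) {
        // e.g. "Maximum execution time of 30 seconds exceeded"
        error_log('script terminated: ' . $error['message']);
    }
});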
Like most POSIX environments, PHP supports signal handlers. You can register your own handler for the SIGTERM signal.
declare(ticks = 1);   // needed so pending signals are actually delivered

function my_abort_handler($signo) {
    echo "Aborted";
}

pcntl_signal(SIGTERM, "my_abort_handler");
You may want to take a look at pcntl_alarm(), which allows a script to send a signal to itself. The manual also contains samples of catching the kill signals that can be sent by the OS. And die(), indeed.
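A sketch of the pcntl_alarm() approach, where the script arranges to receive SIGALRM from itself (the 25-second deadline is an arbitrary example):
<?php
declare(ticks = 1);   // let pending signals be delivered during execution

pcntl_signal(SIGALRM, function ($signo) {
    echo "Deadline reached, shutting down\n";
    exit;
});
pcntl_alarm(25);      // deliver SIGALRM to this process 25 seconds from now

// ... long-running work goes here ...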
Well, you could store $start = microtime(true) at startup, which returns a timestamp. Then just keep checking microtime(true) and subtract $start from it to get the number of seconds since execution began.
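A sketch of that self-imposed deadline, with the work loop left hypothetical:
<?php
$start = microtime(true);
$limit = (int) ini_get('max_execution_time');   // 0 means no limit

while (moreWorkToDo()) {                        // moreWorkToDo() is hypothetical
    doOneUnitOfWork();                          // hypothetical
    // stop 5 seconds before the hard limit so there is time to clean up
    if ($limit > 0 && (microtime(true) - $start) > ($limit - 5)) {
        break;
    }
}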
But no, you can't "catch" the script as it's terminating because the request ran too long. You could try to do some last-minute work in a shutdown handler, but I'm not sure whether PHP will honor that.
It looks like there used to be a function that did exactly what you want, connection_timeout(), but it was deprecated and removed. I don't know whether there is any kind of replacement for it, however.

Abandon Long Processes in PHP (But let them complete)

I have an HTML form that submits to a PHP page which initiates a script. The script can take anywhere from 3 seconds to 30 seconds to run - the user doesn't need to be around for this script to complete.
Is it possible to initiate a PHP script, immediately print "Thanks" to the user (or whatever) and let them go on their merry way while your script continues to work?
In my particular case, I am sending form-data to a php script that then posts the data to numerous other locations. Waiting for all of the posts to succeed is not in my interest at the moment. I would just like to let the script run, allow the user to go and do whatever else they like, and that's it.
Place your long-running work in another PHP script, for example:
background.php:
<?php
sleep(10);
file_put_contents('foo.txt', time());
foreground.php:
<?php
$unused_but_required = array();
proc_close(proc_open('php background.php &', array(), $unused_but_required));
echo "Done";
You'll see "Done" immediately, and the file will get written 10 seconds later.
I think proc_close() returns immediately because we've given proc_open() no pipes and no file descriptors.
In the script you can set:
<?php
ignore_user_abort(true);
That way the script will not terminate when the user leaves the page. However, be very careful when combining this with
set_time_limit(0);
since then the script could execute forever.
You can use set_time_limit() and ignore_user_abort(), but generally speaking I would recommend that you put the job in a queue and use an asynchronous script to process it. It's a much simpler and more durable design.
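A bare-bones sketch of that design using a directory of job files (the paths and the posting helper are assumptions); the web request only enqueues, and a cron-driven worker does the slow posting:
<?php
// enqueue.php - called from the form; returns immediately
file_put_contents('queue/' . uniqid('job_', true) . '.json',
                  json_encode($_POST), LOCK_EX);
echo 'Thanks';

// worker.php - run periodically from cron; drains the queue
foreach (glob('queue/job_*.json') as $jobFile) {
    $data = json_decode(file_get_contents($jobFile), true);
    postToRemoteServices($data);   // hypothetical: the slow part
    unlink($jobFile);
}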
You could try flush() and the related output-buffering functions to immediately send whatever is in the buffer to the browser.
There's an API wrapper around pcntl_fork() called php_fork.
But also, this question was on the Daily WTF... don't pound a nail with a glass bottle.
I ended up with the following.
<?php
// Ignore user requests to abort
ignore_user_abort(true);

// Maximum execution time in seconds
set_time_limit(30);

header("Content-Length: 0");
flush();

/*
    Loooooooong process
*/
?>
