Abandon Long Processes in PHP (But let them complete)

I have an HTML form that submits to a PHP page which initiates a script. The script can take anywhere from 3 seconds to 30 seconds to run - the user doesn't need to be around for this script to complete.
Is it possible to initiate a PHP script, immediately print "Thanks" to the user (or whatever) and let them go on their merry way while your script continues to work?
In my particular case, I am sending form-data to a php script that then posts the data to numerous other locations. Waiting for all of the posts to succeed is not in my interest at the moment. I would just like to let the script run, allow the user to go and do whatever else they like, and that's it.

Place your long-running work in another PHP script, for example
background.php:
<?php
sleep(10);
file_put_contents('foo.txt', mktime());
foreground.php:
<?php
$unused_but_required = array();
proc_close(proc_open("php background.php &", array(), $unused_but_required));
echo "Done";
You'll see "Done" immediately, and the file will get written 10 seconds later.
I think proc_close works because we're giving proc_open no pipes and no file descriptors.
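If you need to hand data to the background script, shell arguments work; a minimal sketch (the id and its meaning are invented here, and escapeshellarg keeps user data from breaking out of the command):
<?php
$id = 42; // hypothetical value the worker needs
$unused_but_required = array();
proc_close(proc_open(
    'php background.php ' . escapeshellarg($id) . ' > /dev/null 2>&1 &',
    array(),
    $unused_but_required
));
// background.php would read the value from $argv[1]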

In the script you can set:
<?php
ignore_user_abort(true);
That way the script will not terminate when the user leaves the page. However, be very careful when combining this with
set_time_limit(0);
since then the script could execute forever.

You can use set_time_limit and ignore_user_abort, but generally speaking, I would recommend that you put the job in a queue and use an asynchronous script to process it. It's a much simpler and more durable design.
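A minimal sketch of that queue design, assuming a jobs table whose columns (id, payload, status) are invented here for illustration:
<?php
// enqueue.php -- in the web request: store the job and return immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute(array(json_encode($_POST)));
echo "Thanks";

<?php
// worker.php -- run from cron or as a long-lived CLI process.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
while (true) {
    $job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' LIMIT 1")
               ->fetch();
    if (!$job) {
        sleep(5); // nothing queued yet; check again shortly
        continue;
    }
    post_to_destinations(json_decode($job['payload'], true)); // hypothetical worker function
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
        ->execute(array($job['id']));
}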

You could try the flush and related output buffer functions to send whatever is in the buffer to the browser immediately:
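For example, a minimal sketch (whether the client really sees it right away also depends on the web server and any compression in between):
<?php
echo "Thanks! Your submission is being processed.";
if (ob_get_level() > 0) {
    ob_flush(); // empty PHP's own output buffer, if one is active
}
flush();        // push the buffered bytes out to the client
// ...continue with the long-running work here...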

There's an API wrapper around pcntl_fork() called php_fork.
But also, this question was on the Daily WTF... don't pound a nail with a glass bottle.

I ended up with the following.
<?php
// Ignore User-Requests to Abort
ignore_user_abort(true);
// Maximum Execution Time In Seconds
set_time_limit(30);
header("Content-Length: 0");
flush();
/*
Loooooooong process
*/
?>

Related

PHP exec - echo output line by line during progress

I'm trying to find a way in which I can echo out the output of an exec call, and then flush that to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then converts the file if it is not the appropriate file type using FFMPEG. I am doing this on a windows machine. Currently my command looks like so:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec( $cmd, $output);
I need something like this:
while( $output ) {
print_r( $output);
ob_flush(); flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly; it just doesn't update the page while converting. I'd like to have some output so the person knows what's going on.
I've set the timeout
set_time_limit( 10 * 60 ); // 10 minute time out
and would be very grateful if someone could put me in the right direction. I've looked at a number of solutions which come close on Stack Overflow, but none seem to have worked.
Since the exec call is a blocking call, you have no way of using buffers to get status.
Instead you could redirect the output in the system call to a log file. Let the client query the server for progress update in which case the server could parse the last lines of the log file to get information about current progress and send it back to the client.
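One way that suggestion might look on Windows (the start /B trick detaches the process so the launching script returns immediately; filenames here are invented):
<?php
// start.php -- kick off the conversion with all output sent to a log file.
$cmd = 'ffmpeg.exe -i input.avi output.m4v > convert.log 2>&1';
pclose(popen('start /B cmd /C "' . $cmd . '"', 'r'));
echo "conversion started";

<?php
// progress.php -- polled by the client; report the tail of the log.
$lines = file('convert.log');
echo implode('', array_slice($lines, -5)); // last 5 lines of ffmpeg output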
exec() is a blocking call, and will NOT return control to PHP until the external program has terminated. That means you cannot do anything to dump the output on a line-by-line basis because PHP is suspended while the external app is running.
For what you want, you need to use proc_open, which gives you pipes you can read from in a loop, e.g.
$descriptors = array(1 => array('pipe', 'w')); // capture the command's stdout
$proc = proc_open('.....', $descriptors, $pipes);
while ($line = fgets($pipes[1])) {
    print($line);
    flush();
}
proc_close($proc);
There are two problems with this approach:
The first is, as #Marc B notes, that exec will block until it's finished. You'll have to devise some way of measuring progress.
The second is that using ob_flush() in this way amounts to holding the connection between server & client open and dribbling the data out a little at a time. This is not something that the HTTP protocol was designed for and while it might work sometimes, it's not going to work consistently - different browsers and different servers will time out differently. The better way to do it is via AJAX calls: using Javascript's setTimeout() function (or setInterval()), make a call to the server periodically and have the server send back a progress report.
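The server side of that AJAX approach can be very small; a sketch, assuming the worker writes its percentage to a file with the invented name progress.txt (the client would call this with setInterval() and update the page):
<?php
// progress.php -- returns the current progress as JSON.
header('Content-Type: application/json');
$pct = is_file('progress.txt') ? (int) file_get_contents('progress.txt') : 0;
echo json_encode(array('progress' => $pct));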

PHP - echo before exec()

Good day!
I am having some issues with getting the echo statement to output before the execution of the exec()
<?php
if (isset($_POST['ipaddress'])) {
    $escaped_command = escapeshellcmd($_POST['ipaddress']);
    if (filter_var($escaped_command, FILTER_VALIDATE_IP)) {
        echo "Gleaning ARP information, please wait..";
        $command = exec('sudo /sbin/getarp.exp');
The echo statement is being output after the execution of the $command. The execution time can be anywhere from 15-30 seconds depending on how large the ARP table on the remote router is. Is there an order of operations that I am not aware of? It appears that all the statements within the if statement are executed in parallel and not line by line as I had assumed.
I would rather not have a solution handed to me, but some documentation links that would lead me to finding one. I have searched what I could, but was not able to find a viable solution.
Any help would be appreciated.
Thanks.
This is happening because the script will run in its entirety before any result/output is sent to the browser.
In PHP there is a concept of "output buffering".
Whenever you output something (e.g. using echo, print, etc.) the text is thrown into a buffer. This buffer is only sent at certain times (at the end of the request, for instance, or when the buffer is full).
In order to empty the buffer (to "flush" it) you need to do it manually. The flush() function will do this. Sometimes you also need to call ob_flush() (this is if you have opened custom output buffers yourself). It is generally a good idea to just call both functions and be done with it:
echo 'Wait a few seconds...';
flush(); ob_flush();
sleep(3);
echo ' aaand we are done!';
See Output Buffering Control for more information on output buffering in PHP.
This is probably an issue with the output buffer. PHP buffers output and writes it to the browser in chunks. Try adding a call to ob_flush() between the echo and the exec(); this will force PHP to write the current contents of the buffer to the browser.
By default, PHP does not send any of the output until the script has finished running completely. There is a solution; however, I hear it is a little browser-dependent. I would test it on different systems and browsers to see if it is working:
ob_implicit_flush(true);
Put that before any of your echo/print commands and anything printed should show up in the browser right away.
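Applied to the question's code, that might look like this (a sketch; the buffer-draining loop is an addition, since any open output buffer would otherwise hold the text back):
<?php
ob_implicit_flush(true);      // flush automatically after every piece of output
while (ob_get_level() > 0) {
    ob_end_flush();           // close buffers that would delay the echo
}
echo "Gleaning ARP information, please wait..";
$command = exec('sudo /sbin/getarp.exp'); // blocks for 15-30 seconds
echo " done.";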
A more universal approach would be to integrate your page with asynchronous JavaScript, a technique commonly referred to as "AJAX". It is a little more difficult because it requires several interacting scripts, some client-side and some server-side. However, AJAX is the de facto way to do things like this on the web.

Catch Command Line Input, Exit If Input = x; Else Continue

I have a CLI script that runs for days. It processes batches, each of which takes around 7 minutes. Sometimes I need to stop the script, but I need to stop it only once a batch has been processed, which is a 2 second sleep I have put in. Is there any way I can catch input at any stage of the script's execution, so that if that input = x the script stops at the end of the next batch, and otherwise continues?
I have come across:
$handle = fopen ("php://stdin","r");
$line = fgets($handle);
but this blocks and waits for input.
I don't think you're going to get it the way you are thinking. You can catch stdin, but I don't think it will do you much good in terms of stopping the script. If I were running this on the CLI all the time but wanted to pause it for a certain amount of time, there are many things you could do, but this is probably how I would tackle it as a "quick fix".
Restructure your PHP code a tiny bit and put the batching process inside a function if it's not already. Then you can create an infinite loop using while. Then I would have it check for the existence of a pause file after each batch process. If the file exists, don't start the next batch, basically pausing it. If it doesn't exist, proceed on as normal.
So for example.
You php file could look like this little example.
<?php
// path to pause file
$filename = "/root/pause";

while (1) {
    if (!file_exists($filename)) {
        batch();
    }
}

function batch() {
    // batch processing
    echo "batching\n";
    // fake processing using a usleep pause
    usleep(3000000);
}
?>
Then when you want to pause the script, just create the file pause, and when the current batch completes it will stop.
So to create the file on Linux, cd to the directory in the script and run the command
touch pause
or you can use the full path like touch /path/to/pause. Just make sure it's in the same directory as in your script. When you are done, just delete the file rm -f pause and it will resume processing the batches.
Note that when it's paused and just looping without processing, it can cause a little jump in CPU usage; a way around that is shown in the variation below.
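A variation on the sketch above (reusing its $filename and batch()) that sleeps while the pause file exists instead of spinning:
<?php
$filename = "/root/pause";

while (1) {
    if (file_exists($filename)) {
        sleep(5); // paused; check again in 5 seconds without burning CPU
        continue;
    }
    batch();
}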
Long term you can look at this little example to get you going in that direction.
http://www.phpmysqlitutorials.com/2013/05/08/php-standard-input-and-loops-on-the-command-line/
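If you do want real keyboard control instead of a pause file, STDIN can be switched to non-blocking mode, so the check between batches returns immediately instead of waiting for input; a sketch using stream_set_blocking (batch() stands in for your processing):
<?php
$stdin = fopen('php://stdin', 'r');
stream_set_blocking($stdin, false); // fgets() now returns at once if nothing was typed

while (true) {
    batch();                        // process one batch
    $line = fgets($stdin);          // false when no input is waiting
    if ($line !== false && trim($line) === 'x') {
        echo "Stopping after this batch.\n";
        break;
    }
}
fclose($stdin);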

PHP Async Execution

Scenario is as follows:
Call to a specified URL including the Id of a known SearchDefinition should create a new Search record in a db and return the new Search.Id.
Before returning the Id, I need to spawn a new process / start async execution of a PHP file which takes in the new Search.Id and does the searching.
The UI then polls a 3rd PHP script to get status of the search (2nd script keeps updating search record in the Db).
This gives me a problem around spawning the 2nd PHP script in an async manner.
I'm going to be running this on a 3rd party server so have little control over permissions. As such, I'd prefer to avoid a cron job/similar polling for new Search records (and I don't really like polling if I can avoid it). I'm not a great fan of having to use a web server for work which is not web-related but to avoid permissions issues it may be required.
This seems to leave me 2 options:
Calling the 1st script returns the Id and closes the connection but continues executing and actually does the search (i.e. stick script 2 at the end of script 1, but close the response at the point where they join)
Launch a second PHP script in an asynchronous manner.
I'm not sure how either of the above could be accomplished. The first still feels nasty.
If it's necessary to use cURL or similar to fake a web call, I'll do it, but I was hoping for some kind of convenient multi-threading approach where I simply spawn a new thread, point it at the appropriate function, and permissions would be inherited from the caller (i.e. the web server user).
I'd rather use option 1. This would also keep related functionality closer to each other.
Here is a hint on how to send something to the user and then close the connection and continue executing:
(by tom ********* at gmail dot com, source: http://www.php.net/manual/en/features.connection-handling.php#93441)
<?php
ob_end_clean();
header("Connection: close\r\n");
header("Content-Encoding: none\r\n");
ignore_user_abort(true); // optional
ob_start();
echo ('Text user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
ob_end_clean();
//do processing here
sleep(5);
echo('Text user will never see');
//do some processing
?>
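If you instead go the "fake a web call" route from option 2, cURL isn't required; a fire-and-forget socket that never reads the response will do. A sketch (the host, path and id parameter are invented, and the worker script should call ignore_user_abort(true) itself, since the connection is dropped right away):
<?php
$fp = fsockopen('localhost', 80, $errno, $errstr, 5);
if ($fp) {
    fwrite($fp, "GET /search_worker.php?id=123 HTTP/1.1\r\n" .
                "Host: localhost\r\n" .
                "Connection: Close\r\n\r\n");
    fclose($fp); // don't wait for a response; the worker keeps running
}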
swoole: asynchronous & concurrent extension.
https://github.com/matyhtf/swoole
event-driven
full asynchronous non-blocking
multi-thread reactor
multi-process worker
millisecond timer
async MySQL
async task
async read/write file system
async dns lookup

About exec() function and time

I'll show some code, first:
echo exec("compile\\save.exe Untitled.c tmpUntitled.c");
I have a program named save.exe and I want to know if it has already stopped.
If it stopped, OK... do something...
If not, there may be an error, or a loop...
Now I also want a way to control the time the program uses, and put a limit on it ("time limit exceeded", something like this...)
Anyone have a suggestion?
Edit[1]: save.exe is a program written in C, and it takes two parameters: source and destination.
popen just doesn't execute save.exe; I say this because it no longer generates the destination file (with exec it happens).
exec() will suspend your script while the exec'd program is running. Control will be returned only when the external program terminates. If you want to see if the program's hung up or waiting for input or somesuch, you use popen(), which returns a filehandle from which you can read the program's output. At that point you can do a polling loop to see if there's been any output:
$app = popen("your shell command here", "r"); // "r" = read the program's output
while($output = fgets($app)) {
    // handle output
    sleep(5); // sleep 5 seconds
}
pclose($app);
If you want to send something to the app as input, then you have to use proc_open(), which allows bi-directional communication between your script and the external program.
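Putting the two together, proc_open() also lets you poll whether the program is still running and kill it after a deadline, which covers the time-limit part of the question. A sketch (the 10 second limit is an invented number):
<?php
$pipes = array();
$proc = proc_open('compile\\save.exe Untitled.c tmpUntitled.c', array(), $pipes);
$deadline = time() + 10;

while (true) {
    $status = proc_get_status($proc);
    if (!$status['running']) {
        echo "finished, exit code: " . $status['exitcode'] . "\n";
        break;
    }
    if (time() > $deadline) {
        proc_terminate($proc); // time limit exceeded; kill the program
        echo "time limit exceeded\n";
        break;
    }
    usleep(200000); // poll five times per second
}
proc_close($proc);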
