I have a PHP file that takes a couple of parameters from the URL and then runs an exec command, whose results I want to wait for and display. The command takes about 20-30 seconds to finish. It never completes, because the page just gets an nginx 502 Bad Gateway error (it times out). Instead of extending nginx's timeout, which is bad practice since it leaves a connection hanging that long, how can I run PHP's exec in the background and then have the result returned on the page after it completes?
Or is there a better way to accomplish this without using php?
Have the PHP trigger an exec of a script that is forked so it runs in the background (the command ends with &). The page should then return some JS that periodically polls the server via AJAX requests to check the original script's status. The script should write its STDOUT and STDERR to unique file(s) so that the polling script can check the status.
Edit: If you need to know the script's exit code, wrap it in another script:
#!/bin/bash
# wrapper.sh: run the real script in the background and record its exit code
uniqId=$1
yourscript "$uniqId" &
myPid=$!
echo $myPid > "$uniqId.pid"       # record the child's PID for the poller
wait $myPid
echo $? > "$uniqId.returned"      # record the exit code once it finishes
Call it like ./wrapper.sh someIdUniqueToClient &. Untested, but you get the gist.
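For the polling side, a minimal PHP status endpoint might look like this (a sketch only; the file names mirror the wrapper's .pid/.returned convention, and the JSON shape is an assumption):
<?php
// status.php - polled via AJAX with the same unique id passed to wrapper.sh
$uniqId = basename($_GET['id']);   // basename() guards against path traversal
if (file_exists($uniqId . '.returned')) {
    // The wrapper has written the exit code, so the script is done.
    echo json_encode([
        'done'     => true,
        'exitCode' => (int) file_get_contents($uniqId . '.returned'),
    ]);
} else {
    echo json_encode(['done' => false]);
}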
Related
This might be a very stupid question, so please bear with me.
I have a PHP script that makes API calls to Shopify.
The entire point of this php script is to print out statements for each customer.
Now it has to run through about 200 customers.
This entire process takes about 15 minutes.
Ordinarily this runs on a monthly basis with a cron job.
But I need to be able to run it manually as well. I just want the page to kick off the execution and have everything run in the background, with my browser or internet connection playing NO role in whether the execution completes.
The cron job runs header_php.php?run=monthly
Is there any way I can run it manually, make sure it gets a 200 response from the page, and then close my browser tab and ensure that Apache does the rest?
I would be executing it via an AJAX call as well.
Another thing: once each statement is done being processed, the script outputs it to PDF and emails it to the customer. So there's no feedback required from the page when it runs.
Easily doable with simple HTTP headers.
1. Start output buffering
2. Output the response, if any
3. Send Content-Length and Connection: close headers
4. Flush and end output buffers
5. The browser receives the HTTP response
6. Continue the time-consuming processing
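A minimal sketch of those steps (the response body and the run_monthly_statements() call are placeholders for your own code):
<?php
ob_start();                                    // 1. start output buffering
echo 'OK';                                     // 2. the response, if any
header('Content-Length: ' . ob_get_length()); // 3. tell the browser how much to read
header('Connection: close');
ob_end_flush();                                // 4. flush and end output buffers
flush();                                       // 5. the browser now has its response
run_monthly_statements();                      // 6. continue the long processing
Under PHP-FPM, fastcgi_finish_request() achieves the same effect in a single call.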
This SO answer nails it (the comments are helpful as well).
You can have your script call another script that runs the job in the background.
shell_exec('nohup /usr/bin/php /dir/to/your/script.php > /dev/null 2>/dev/null &');
Then you don't need to wait for the job to finish.
On Linux:
<?php
exec($command . ' > /dev/null 2>&1 &');
?>
On Windows:
<?php
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
?>
where $command would be something like:
php -f $path/to/filename
If you put that in a page, you can then call it whenever you want and it will spawn a background process, without requiring the browser to wait for any response.
Is it possible to skip waiting for a response of a shell execution command?
The command executes aria2c and places a file in the download queue, so I do not have to wait for the response. Otherwise I run into an "Internal Server Error" because the max execution time is reached.
You should use exec() and redirect the output to a file or to /dev/null; it will then run in the background. From the PHP manual:
If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
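Applied to the aria2c case, that might look like the following (the URL is a placeholder; redirecting both streams and backgrounding with & lets PHP return immediately):
<?php
// Hypothetical example: queue a download without waiting for it.
$url = escapeshellarg('https://example.com/file.iso');
exec("aria2c $url > /dev/null 2>&1 &");
echo 'Download queued.';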
How would I execute a shell script from PHP while giving constant/live feedback to the browser?
I understand from the system function documentation:
The system() call also tries to automatically flush the web server's
output buffer after each line of output if PHP is running as a server
module.
I'm not clear on what they mean by running it as a 'server module'.
Example PHP code:
<?php
system('/var/lib/script_test.sh');
Example shell code:
#!/bin/bash
echo "Start..."
for i in {1..10}
do
echo "$i..."
sleep 1
done
echo "Done."
What this does: It will wait about 10 seconds and then flush to the output buffer.
What I want this to do: Flush to the output buffer after each line of output.
This can be done using popen(), which gives you a handle to the stdout of whatever process you open. Chunks of data can be sent to the client using ob_flush(); the data can be displayed using an XHR.
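A rough sketch of that approach, reusing the script path from the example above (buffering behavior still depends on your web server configuration):
<?php
header('Content-Type: text/plain');
$handle = popen('/var/lib/script_test.sh 2>&1', 'r');
while (!feof($handle)) {
    echo fgets($handle);   // forward each line to the client
    if (ob_get_level() > 0) {
        ob_flush();        // flush PHP's output buffer, as mentioned above
    }
    flush();               // and the web server's buffer
}
pclose($handle);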
One option is to have the shell script write to a file at each step, saying how far it has got. On your web page, use an ajax call every X seconds/minutes. The ajax call calls a PHP script which reads the status file and returns the status or completed steps.
The advantage of this approach is that the live information is available to multiple visitors, rather than just the one who actually initiated the shell script. Obviously that may or may not be desirable depending on your needs.
The disadvantage, of course, is that the longer the ajax interval, the more out of date the update will be.
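The PHP side of that polling can stay tiny (the status file path is an assumption; the shell script would overwrite it at each step):
<?php
// progress.php - returns whatever the shell script last wrote.
$file = '/tmp/script_test.status';
echo file_exists($file) ? file_get_contents($file) : 'not started';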
I need to run a Java program in the background.
process.php contains
shell_exec("php php_cli.php")
php_cli.php contains
shell_exec("java -jar BiForce.jar settings.ini > log.txt");
I am calling process.php asynchronously using AJAX.
When I click the link on the webpage that calls the AJAX function (to run process.php), the page shows "loading", and when I click other links at the same time it does not respond.
The Java program takes about 24 hours to finish executing, so the user will not wait until the execution ends.
The problem is that the browser keeps loading and does not go to other pages when a link is clicked.
I also tried system(), but had the same problem...
Help will be greatly appreciated.
shell_exec() waits for the command to finish, so that's what your script is doing.
If your command doesn't take any time to run, then your script won't wait either.
You can call another PHP script from your original one, without waiting for it to finish:
$processId = shell_exec(
"nohup " . // Runs a command, ignoring hangup signals.
"nice " . // "Adjusted niceness" :) Read nice --help
"/usr/bin/php -c " . // Path to your PHP executable.
"/path/to/php.ini -f " . // Path to your PHP config.
"/var/www/php_cli.php " . // Path to the script you want to execute.
"action=generate > /process.log " . // Log file.
"& echo $!" // Make sure it returns only the process id.
);
It is then possible to detect whether or not the script is finished by using this command:
exec('ps ' . $processId, $processState);
// exec() stores the command's output in $processState (the second
// parameter is passed by reference), one array element per line.
// The first line is the ps header; a second line appears only while
// the process is still running.
if (count($processState) < 2) {
    // Process has ended.
}
You could call the command from the displayed page itself, appending an & at the end:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
This way the process is launched on the background.
Also, there is no need (unless required by your application) to create a process.php which itself calls PHP via a shell exec. You could achieve the same functionality via an include of the other file.
As in normal shell scripting, you can use the ampersand to background the process:
shell_exec("java -jar BiForce.jar settings.ini > log.txt &");
See Asynchronous shell exec in PHP.
First, you might want to redesign this concept. I am not sure exactly what these programs do, but clearly this can lead to potential problems...
This is what I suggest you do, instead of starting external processes via PHP:
Your ajax call creates (or reuses) a file in some temporary directory (probably using the user session to generate the file name)
some data is written to the file, and the request ends
Your jar is launched separately, and runs indefinitely
At regular intervals, the Java program scans the temporary directory for new files, or checks whether an existing file has been modified
it parses the file and executes the 24-hour-long process, or adjusts any previous execution if necessary
Along the same lines, you could even use sockets, or any other means, to communicate with that Java program.
The advantage of having the Java program running all the time, instead of starting a new process each time, is that system resources can be reused over the lifetime of the application; for example, if your program uses DB connections, or any data, cache, etc.
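On the PHP side, the ajax endpoint then reduces to dropping a job file for the Java process to pick up (directory, file naming, and payload format are all assumptions):
<?php
// Hypothetical job-drop endpoint: the long-running Java process watches
// this directory and picks up new .job files.
session_start();
$jobFile = sys_get_temp_dir() . '/biforce_' . session_id() . '.job';
file_put_contents($jobFile, json_encode(['settings' => 'settings.ini']));
echo 'Job queued.';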
I'm building a spider which will traverse various sites and mine them for data.
Since I need to get each page separately this could take a VERY long time (maybe 100 pages).
I've already set set_time_limit to 2 minutes per page, but it seems like Apache will kill the script after 5 minutes no matter what.
This isn't usually a problem, since this will run from cron or something similar which does not have this time limit. However, I would also like the admins to be able to start a fetch manually via an HTTP interface.
It is not important that Apache is kept alive for the full duration; I'm going to use AJAX to trigger a fetch and check back once in a while, also with AJAX.
My problem is how to start the fetch from within a PHP script without the fetch being terminated when the script calling it dies.
Maybe I could use system('script.php &') but I'm not sure it will do the trick.
Any other ideas?
$cmd = "php myscript.php $params > /dev/null 2>/dev/null &";
# when we call this particular command, the rest of the script
# will keep executing, not waiting for a response
shell_exec($cmd);
What this does is send all the STDOUT and STDERR output to /dev/null, and your script keeps executing. Even if the 'parent' script finishes before myscript.php, myscript.php will finish executing.
If you don't want to use exec(), you can use a built-in PHP function!
ignore_user_abort(true);
This will tell the script to keep running even if the connection between the browser and the server is dropped ;)
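A minimal sketch combining it with set_time_limit(), so neither the client disconnect nor PHP's own timer kills the run:
<?php
ignore_user_abort(true);   // keep running if the browser disconnects
set_time_limit(0);         // lift PHP's max_execution_time for this request
// ... start the long fetch here ...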