Do not wait for a response of PHP shell_exec - php

Is it possible to skip waiting for the response of a shell execution command?
The command runs aria2c and places a file in the download queue, so I do not need to wait for the response. Otherwise I run into an "Internal Server Error" because the max execution time is reached.

You should use exec() and redirect the output to a file or to /dev/null; it will then run in the background.
If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
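For example, a minimal sketch of that pattern (the URL and aria2c invocation here are placeholders, not from the original question):
<?php
// Queue the download and return immediately: redirecting output and
// appending '&' backgrounds the process, so exec() does not block.
$url = 'http://example.com/file.iso';                           // placeholder URL
exec('aria2c ' . escapeshellarg($url) . ' > /dev/null 2>&1 &');
// the script continues here without waiting for aria2c to finish
?>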

Related

Apache hangs after 200 seconds not receiving output

I have a PHP script that hangs in the browser when it runs for over 200 seconds. This PHP script calls shell_exec on a bash script and waits for its output, then echoes that output to the browser.
Everything works fine when:
the script runs for < 200 seconds
the script is run from the command line as a local user, NOT apache
It does not work when:
the script is run from the browser (apache)
the script runs for > 200 seconds
The operations of the bash script execute fine in all cases, and I can see the output when I write it to the console... but it will just hang in the browser (no output) even after the script has "finished".
I understand that set_time_limit will not have an effect here because system commands do not count towards script execution time.
REF: Max Execution Time and System Calls
Below is a screenshot of my Apache timeout settings. To be honest, I am unsure whether they contribute to this issue, whether using shell_exec (and thus not outputting/echoing anything for some time) has an effect on them, or what would happen if an Apache timeout occurred.
OTHER NOTES
It may be worth noting that after the script has been left "hanging" for some time, it runs again without prompting.

Execute PHP and let Apache do the rest in the background

This might be a very stupid question, so please bear with me.
I have a PHP script that makes API calls to Shopify.
The entire point of this PHP script is to print out statements for each customer.
It has to run through about 200 customers.
This entire process takes about 15 minutes.
Ordinarily this runs on a monthly basis with a cron job.
But I need to be able to run it manually as well. I just want the page to start executing and have everything run in the background, with my browser or internet connection playing NO role in whether the execution completes.
The cron job runs header_php.php?run=monthly
Is there any way I can run it manually, make sure it gets a 200 response from the page, and then close my browser tab and ensure that Apache does the rest?
I would be executing it via an AJAX call as well.
Another thing: once each statement is done being processed, the script outputs it to PDF and emails it to the customer, so no feedback from the page is required while it runs.
Easily doable with simple HTTP headers; a sketch follows these steps.
Start output buffering
Output the response, if any
Send Content-length and Connection: close headers
Flush and end output buffers
The browser receives HTTP response
Continue time consuming processing
This SO answer nails it (the comments are helpful as well).
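A minimal sketch of those steps, assuming no output has been sent yet (generate_statements() is a hypothetical stand-in for the long-running work; under PHP-FPM, fastcgi_finish_request() is an alternative not mentioned in the answer):
<?php
ignore_user_abort(true);                        // keep going if the client disconnects
ob_start();                                     // 1. start output buffering
echo 'Statement run started.';                  // 2. output the response, if any
header('Content-Length: ' . ob_get_length());   // 3. send Content-Length...
header('Connection: close');                    //    ...and Connection: close
ob_end_flush();                                 // 4. flush and end output buffers
flush();                                        // 5. browser now has the full response
generate_statements();                          // 6. continue time-consuming processing
?>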
You can have your script call another script that runs the job in the background:
shell_exec('nohup /usr/bin/php /dir/to/your/script.php > /dev/null 2>/dev/null &');
Then you don't need to wait for the job to finish.
On Linux:
<?php
exec($command . ' > /dev/null 2>&1 &');
?>
On Windows:
<?php
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
?>
where $command would be something like
php -f $path/to/filename
If you put that in a page, you can then call it whenever you want; it will spawn a separate process and will not require the browser to wait for any response.

PHP wait for exec to finish and output the result

I have a PHP file that takes a couple of parameters from the URL and then runs an exec command, whose result I want to wait for and display. This exec command takes about 20-30 seconds to finish. It never completes, because the webpage just gets an nginx 502 Bad Gateway error (it times out). Instead of extending the nginx timeout, which is bad practice since it leaves a connection hanging for that long, how can I run PHP's exec in the background and then have its result returned on the page after it completes?
Or is there a better way to accomplish this without using php?
Have PHP trigger an exec of a script that is forked so it runs in the background (the command ends with &). The page should then return some JS that periodically polls the server via AJAX requests to check the original script's status. The script should write its STDOUT and STDERR to unique file(s) so that the polling endpoint can check the status.
Edit: If you need to know the script's exit code, wrap it in another script:
#!/bin/bash
uniqId=$1
yourscript $uniqId &            # run the real job in the background
myPid=$!                        # PID of the backgrounded job
echo $myPid > $uniqId'.pid'     # record the PID for the poller
wait $myPid                     # block until the job exits
echo $? > $uniqId'.returned'    # record the exit code
Call it like ./wrapper.sh someIdUniqueToClient &. Untested, but you get the gist.
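A hypothetical polling endpoint to pair with that wrapper (the file location and JSON shape are assumptions, not part of the original answer):
<?php
// status.php - hit periodically by the AJAX poll; reads the '.returned'
// file written by wrapper.sh above once the job has exited.
$uniqId = basename($_GET['id']);          // strip any path components from the id
$returnedFile = $uniqId . '.returned';    // assumed to be in the working directory
header('Content-Type: application/json');
if (file_exists($returnedFile)) {
    $exitCode = (int) trim(file_get_contents($returnedFile));
    echo json_encode(['done' => true, 'exitCode' => $exitCode]);
} else {
    echo json_encode(['done' => false]);
}
?>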

Call Shell Command from PHP without considering output

I'm trying to write a cronjob which launches multiple processes that I want to run in parallel.
I'm using a foreach loop to call each command, but the command line waits for the output. I don't want it to wait.
Was wondering if anyone ever used any library for this?
Add an ampersand after the command:
$ php task.php &
It will run that instance of PHP in the background and continue.
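In the asker's foreach context, that might look like this (the task list is a placeholder):
<?php
// Launch every task in the background so the loop never blocks on any of them.
$commands = ['php task1.php', 'php task2.php', 'php task3.php']; // placeholders
foreach ($commands as $cmd) {
    exec($cmd . ' > /dev/null 2>&1 &');   // redirect output and background it
}
?>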
If you read the manual on passthru you'll notice it tells you how to avoid this...
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
So you can rely on UNIX file descriptors to redirect the output, for example to /dev/null if you don't care about it, or to some file if you do want to save it; this avoids PHP waiting on the command to finish.
passthru("somecommand > /some/path/to/file");

Run a PHP-script from a PHP-script without blocking

I'm building a spider which will traverse various sites and data mining them.
Since I need to get each page separately, this could take a VERY long time (maybe 100 pages).
I've already set set_time_limit to 2 minutes per page, but it seems like Apache will kill the script after 5 minutes no matter what.
This isn't usually a problem, since it will run from cron or something similar which does not have this time limit. However, I would also like the admins to be able to start a fetch manually via an HTTP interface.
It is not important that Apache is kept alive for the full duration; I'm going to use AJAX to trigger a fetch and check back once in a while with AJAX.
My problem is how to start the fetch from within a PHP-script without the fetch being terminated when the script calling it dies.
Maybe I could use system('script.php &') but I'm not sure it will do the trick.
Any other ideas?
$cmd = "php myscript.php $params > /dev/null 2>/dev/null &";
# when we call this particular command, the rest of the script
# will keep executing, not waiting for a response
shell_exec($cmd);
This sends all of STDOUT and STDERR to /dev/null, and your script keeps executing. Even if the 'parent' script finishes before myscript.php, myscript.php will finish executing.
If you don't want to use exec, you can use a PHP built-in function!
ignore_user_abort(true);
This will tell the script to keep running even if the connection between the browser and the server is dropped ;)
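A minimal sketch of that approach (set_time_limit and start_fetch() are my additions; the original answer only names ignore_user_abort):
<?php
ignore_user_abort(true);    // keep running after the browser disconnects
set_time_limit(0);          // assumption: also lift PHP's execution time limit
start_fetch();              // hypothetical entry point for the long crawl
?>
Note that, unlike the exec-based approaches above, this keeps one Apache/PHP worker busy for the entire run.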
