I am trying to manage a queue of files waiting to be processed by ffmpeg. A page run by CRON works through a database of files waiting to be processed, builds the commands, and sends them to the command line using exec().
However, when the PHP page is run from the command line or by CRON, the exec() call itself runs fine, but control never returns to the PHP page to continue updating the database and running the other functions.
Example:
<?php
$cmd = "ffmpeg inpupt.mpg output.m4v";
exec($cmd . ' 2>&1', $output, $return);
// Page continues... but is never executed
$update = mysql_query("UPDATE.....");
?>
When this page is run from the command line, the command is executed via exec(), but the rest of the page is never run. I think the problem may be that I am running a command via exec() inside a page that is itself run from the command line.
Is it possible to run a PHP page in full from the command line which includes exec()?
Or is there a better way of doing this?
Thank you.
I wrote an article about Running a Background Process from PHP on Linux some time ago:
<?php system( 'sh test.sh >/dev/null &' ); ?>
Notice the & operator at the end. This starts a process that returns control to the shell immediately AND CONTINUES TO RUN in the background.
More examples:
<!--
saving standard output to a file
very important when your process runs in the background,
as this is the only way the process can report errors/success
-->
<?php system( 'sh test.sh >test-out.txt &' ); ?>
<!--
saving standard output and standard error to files
same as above; most programs log errors to standard error, hence it's better to capture both
-->
<?php system( 'sh test.sh >test-out.txt 2>test-err.txt &' ); ?>
Have you tried using CURL instead?
I'm not sure, but that's probably due to the shell constraints of cron processes. If it works as a web page, then use it as a web page: set up a cron job that calls wget on wherever your page is. It will then be called via your web server, which should mimic your tests.
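For example, a crontab entry along those lines might look like this (URL and schedule are placeholders):
# call the page through the web server every 5 minutes (URL is a placeholder)
*/5 * * * * wget -q -O /dev/null http://example.com/process-queue.php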
Related
When executing multiple scripts within PHP using the exec command, is each script run one at a time, one after the other, or are they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they are one after the other before attempting to rewrite my crontabs.
Sure. The only way to run in the background is to add & to the command line arguments, which would put that exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: In your case you shouldn't use &, as it would force all the scripts to run simultaneously.
exec() waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But as a devops person, please, please, please do not run your cron jobs like this! Create entries in the crontab for each, or put them in a shell script and have cron run the script.
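For instance, the three scripts could be chained directly in a crontab entry so that each one starts only after the previous one succeeds (the schedule here is a placeholder):
# && runs each script only if the previous one exited successfully
0 2 * * * /usr/bin/php -q process-duplicates.php && /usr/bin/php -q process-images.php && /usr/bin/php -q process-sitemaps.php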
I have a PHP script which accesses a database table containing information about some large files that need to be downloaded. I intend to run this PHP script as a cron job, on an hourly basis, and each time it runs it should do the following things:
Check if there are files that need to be downloaded
If there are, execute a shell script which issues a wget command and starts downloading the file in the background, and which, when finished, runs a second PHP script that updates the db tables to record completion of the download; also get back the process id of this shell script for later use
Check if there are files currently being downloaded; if there are, check whether their process id is still active (see the sketch below), and if not, adjust the table so we know that an error occurred somewhere in the download
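For the process-id check in the last step, here is a minimal sketch, assuming the POSIX extension is available (the helper name is made up):
// Hypothetical helper: signal 0 sends no signal, but reports
// whether the process with this pid still exists
function isProcessRunning($pid) {
    return posix_kill((int) $pid, 0);
}
// on Linux, file_exists("/proc/$pid") is a common alternative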
The shell script works accordingly: if I run it from the console, everything works fine, and I also get back the process id of the shell script in my PHP file. My problem is that when the originating PHP file exits, the shell script it initiated stops too.
Here's the code I use in php to start the shell script:
function runProcess($command, &$output = array()) {
    // background the command, discard its output, and echo the child's PID
    $command = $command . ' > /dev/null 2>&1 & echo $!';
    echo $command . "<BR>";
    return exec($command, $output);
}
/** excerpt from the class that does the processing */
$pid = runProcess("sh ".self::DOWNLOAD_FILE_SHELL." ".DEFAULT_DIR_WHOME." 1.xml ".$this->Parent->XMLPath, $output);
echo $pid;
My question is as follows: How can I force the shell script to continue running, even when the parent process (the php script) exits?
If you're running in a UNIX-like environment, then the simplest way is to add an ampersand (&) at the end of the command line. This tells the shell to run the command without waiting for it to complete. Try this:
$command = $command . ' > /dev/null 2>&1 & echo $! &';
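If the trailing & alone isn't enough (the child can still be killed by the SIGHUP sent when the parent's session ends), prefixing the command with nohup is a common way to fully detach it; a sketch against the runProcess() helper above:
// nohup shields the child from SIGHUP, so it survives the PHP script's exit
$command = 'nohup ' . $command . ' > /dev/null 2>&1 & echo $!';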
I'm working on a web application that needs to send a lot of HTTP requests and update the table; this blocks PHP from executing. So I thought I might have to write a separate PHP script and run it via my main application. I tried exec(), but the program still waits until the script has finished executing.
exec('php do_job.php');
I even tried redirecting the output to a file as PHP.Net suggests:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
$result = exec('php do_job.php > output.txt &',$output);
But still no success... Further down the same page I came across this:
$command = 'php do_job.php';
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
Still no success... Lastly I tried:
pclose(popen("start /B ". $command, "r"));
What am I doing wrong here?
I'm developing my app on localhost (XAMPP - Windows), later I'll be releasing it on a Linux host. My last resort would be to run the script via CRON jobs. Is this the only way?
By just adding & at the end of command you can run any command in background. Here I'm redirecting o/p to /dev/null to avoid hang(For Linux).
exec('php do_job.php > /dev/null &');
If you want to see the output of the command, you can redirect it to a file instead:
exec('php do_job.php > full_path_to_file &');
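Since you're developing on Windows (XAMPP) but deploying to Linux, one option is a small helper that combines the two techniques from this thread and picks one per OS; a sketch (the helper name is made up):
// Hypothetical helper: "start /B" on Windows, "&" with redirection on Linux
function runInBackground($cmd) {
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
        // start /B launches the command without opening a new console window
        pclose(popen('start /B ' . $cmd, 'r'));
    } else {
        // redirecting output lets PHP return without waiting for the child
        exec($cmd . ' > /dev/null 2>&1 &');
    }
}
runInBackground('php do_job.php');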
I have this set in my PHP script so that it can supposedly run for as long as it needs to, parsing, running MySQL queries, and fetching images for over 100,000 rows.
ignore_user_abort(true);
set_time_limit(0);
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, this script will still be running, but execution just stops... no more output. It's not a zombie process; I checked top. It hasn't hit the memory limit either, and if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on cleaning up zombies on Unix, but it's not a zombie. I know it's not the PHP settings, and it's not running through a webserver; it's run from the command line only, so what gives?
It looks like you haven't detached your process correctly. Currently, if your process's parent dies, your process will die too. If you place your process in the background (creating a real daemon), you won't run into such trouble.
You can execute your PHP this way to really detach it:
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
will redirect standard output (i.e. your echo, print, etc.) to the output.txt file,
2>&1
will redirect error output to standard output, writing it to the same output.txt file, and
&
is the most important thing in your case: it will detach your process to create a real daemon.
Edit: if you're having trouble when disconnecting your shell, the simplest approach is to put your command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way :
bash run.sh &
In such a case, your shell will "think" your program has ended at the end of the shell script, not at the end of the PHP daemon.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts that run continuously for 6+ months. There must be something else going on inside your script body.
I know I should use a comment to answer this, but I don't have enough reputation to do it...
Maybe your process is consuming 100% of the CPU. I had a similar issue with a while loop that never called sleep() or usleep() at the end of the loop.
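For reference, a minimal sketch of the fix (the completion check and interval are made up):
// without a pause, this loop pins a CPU core at 100%
while (!jobIsFinished()) {   // jobIsFinished() is a hypothetical check
    usleep(100000);          // sleep 100 ms per iteration to yield the CPU
}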
I am using phpseclib to SSH to my server and run a Python script. The Python script is an infinite loop, so it runs until you stop it. When I execute python script.py via SSH with phpseclib, it works, but the page just loads forever. It does this because phpseclib does not think it is "done" running the line of code that runs the infinite-loop script, so it hangs on that line. I have tried using exit and die after that line, but of course that didn't work, because it hangs on the line before, the one that executes the command. Does anyone have any ideas on how I can fix this without modifying the Python file? Thanks.
Assuming the command will be run by a shell, you could have it execute this to start it:
nohup python myscript.py > /dev/null 2>&1 &
If you put an & on the end of any shell command it will run in the background and return immediately, that's all you really need.
Something else you could also have done:
$ssh->setTimeout(1);
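Putting the two together, a sketch assuming phpseclib's Net_SSH2 class (host and credentials are placeholders; adjust the class name for your phpseclib version):
$ssh = new Net_SSH2('example.com');
$ssh->login('user', 'password');
// stop waiting for output after one second instead of blocking forever
$ssh->setTimeout(1);
// nohup + redirection + & lets the remote shell return immediately,
// so exec() doesn't block on the infinite loop
$ssh->exec('nohup python myscript.py > /dev/null 2>&1 &');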