I created a cron job that runs every six hours; each run generates a shell script that calls store.php with a different input each time.
* */6 * * * /usr/bin/php /var/www/execute.php > /dev/null &
execute.php creates a test.sh file containing the following commands, and then executes test.sh:
/usr/bin/php /var/www/store.php 'x' > /dev/null &
/usr/bin/php /var/www/store.php 'y' > /dev/null &
It executes successfully and the logic works. But when I checked with the top command in the shell:
$ top
I see all the php processes still running, even after execution has completed.
I want the php code to run only once, but it keeps running in the background. How do I stop it, and how do I make it run only once? I even added exit at the end of store.php.
If you want to end a script, you can use a return statement at the end of the file, or you can simply call exit(). However, the script SHOULD end itself once it gets to the last line. This suggests that the script is never actually reaching the end of the file. I would recommend putting echo statements at various points in the file and confirming that it really does get to the end.
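Separately, note the schedule quoted in the question: `* */6 * * *` matches every minute of every sixth hour, not once every six hours, which by itself would explain a steady stream of php processes. A corrected entry (same command, shown as a sketch) would be:

```shell
# Minute field pinned to 0: fires once at 00:00, 06:00, 12:00, and 18:00.
# The trailing "&" is unnecessary in a crontab; cron does not run jobs interactively.
0 */6 * * * /usr/bin/php /var/www/execute.php > /dev/null 2>&1
```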
Related
When executing multiple scripts from PHP using the exec command, is each script run one at a time, one after the other, or do they run simultaneously?
exec('/usr/bin/php -q process-duplicates.php');
exec('/usr/bin/php -q process-images.php');
exec('/usr/bin/php -q process-sitemaps.php');
Just want to make sure they are one after the other before attempting to rewrite my crontabs.
Sure. The only way to run them in the background would be to add & to the command line, which would put the exec()'d process into the background:
exec("php test.php &");
So you are right, they run one after the other.
NOTE: In your case you shouldn't use &, as it would force all the scripts to run simultaneously.
exec() waits for the script to return; see php.net:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
But as a devops engineer, please, please, please do not run your cron jobs like this! Create crontab entries for each script, or put them in a shell script and have cron run that script.
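The sequential-versus-parallel behavior is easy to check directly in a shell; a minimal sketch, using sleep as a stand-in for the PHP scripts:

```shell
# Sequential (what exec() does without "&"): each command blocks until done.
start=$(date +%s)
sleep 1
sleep 1
echo "sequential took $(( $(date +%s) - start ))s"

# Backgrounded with "&": both run at once; wait collects them.
start=$(date +%s)
sleep 1 &
sleep 1 &
wait
echo "parallel took $(( $(date +%s) - start ))s"
```

The sequential run takes roughly the sum of the sleeps, while the backgrounded run takes roughly the longest single sleep.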
I have a PHP script which accesses a database table containing information about some large files that need to be downloaded. I intend to run this PHP script as an hourly cron job, and each time it runs, it should do the following:
Check if there are files need to be downloaded
If there are, execute a shell script that issues a wget command and starts downloading the file in the background; when the download is ready, the shell script runs a second PHP script, which records the completion in the db tables. Also get back the process id of this shell script, for later use
Check if there are files currently being downloaded; if there are, check whether their process id is still active, and if not, adjust the table so we know an error occurred somewhere in the download
The shell script works correctly: if I run it from the console, everything works fine, and I also get back the process id of the shell script in my PHP file. My problem is that when the originating PHP script exits, the shell script it initiated stops as well.
Here's the code I use in php to start the shell script:
function runProcess($command, &$output = array()) {
    $command = $command . ' > /dev/null 2>&1 & echo $!';
    echo $command . "<BR>";
    return exec($command, $output);
}

/** excerpt from the class that does the processing */
$pid = runProcess("sh ".self::DOWNLOAD_FILE_SHELL." ".DEFAULT_DIR_WHOME." 1.xml ".$this->Parent->XMLPath, $output);
echo $pid;
My question is as follows: How can I force the shell script to continue running, even when the parent process (the php script) exits?
If you're running in a UNIX-like environment, the simplest way is to add an ampersand (&) at the end of the command line. This tells the shell to run the command without waiting for it to complete. Try this:
$command = $command . ' > /dev/null 2>&1 & echo $! &';
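The ampersand alone may not be enough if the parent's exit also takes its children down; fully detaching the child with nohup, and capturing its PID via $!, is a more robust pattern. A minimal sketch, with sleep standing in for the download script:

```shell
# Start a stand-in long-running job detached from this shell,
# discard its output, and capture its PID via $! for later checks.
nohup sleep 30 > /dev/null 2>&1 &
pid=$!
echo "started pid $pid"
# kill -0 sends no signal; it only tests that the process exists.
kill -0 "$pid" && echo "still running"
kill "$pid"   # clean up the demo process
```

The captured PID is exactly what the runProcess() function above needs for the later "is it still active" check.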
I need to make a PHP script that executes another function (or another script) in the background: when I call it, the flow of the script must continue even if the second script is not finished.
I found http://php.net/manual/en/function.sleep.php (the sleep function), but that is not exactly what I need.
Any help?
Edit:
What I want to accomplish:
I need to make a change in my database, then run a PHP script, but I need to make another change in my database 1 second (or whatever lapse) after I ran the PHP script.
Since you apparently don't care about a response from the script you are calling, and if your server allows it, you could call your script as if from a shell and discard its output with > /dev/null 2>/dev/null &:
shell_exec('php yourotherscript.php > /dev/null 2>/dev/null &');
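A minimal shell sketch of the same fire-and-forget pattern, with echo standing in for the second database change:

```shell
echo "main: before"
# The subshell sleeps, then does its work, all in the background;
# the main flow continues immediately.
( sleep 1; echo "background: done" ) > bg.log 2>&1 &
echo "main: after (did not wait)"
wait            # only for the demo, so bg.log is complete before we read it
cat bg.log
rm bg.log
```

The two "main:" lines print immediately; the background line only lands after its own delay.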
I've got a PHP script that does some heavy lifting that I'm trying to fire off as a background script using the following code:
exec("script.php > /dev/null 2> /dev/null &");
What happens:
When I run the code above as part of my web app, the script quits after about a minute.
When I run the code as part of my web app without the final ampersand, the job runs fine, but exec() waits for the script to complete before loading the next page, defeating the purpose as the user stares at an unresponsive page.
When I run the shell command script.php > /dev/null 2> /dev/null & as myself from the console with the final ampersand, the job runs fine.
When I run the shell command from the console as web, the job stops running after about a minute.
I've tried piping my output to logfiles: script.php > /home/public/tmp/output.txt 2> /home/public/tmp/errors.txt &. Output looks normal, I don't get any errors. The script just stops working.
The basic rule seems to be: If run as a foreground process as web or as me, it'll complete. If run as a background process as web, it stops working after about a minute.
I'm not running this as a cron job because my host (NearlyFreeSpeech) can only run cron jobs once an hour, which is longer than I want users to wait when the job only takes a couple of minutes; it might as well fire when users initiate it.
The subscript starts with set_time_limit(60 * 60 * 4); so this shouldn't be a matter of PHP timing out.
set_time_limit does not include shell-execution time.
http://php.net/manual/en/function.set-time-limit.php
Try using one of the code examples in the comments on that page.
I have a PHP script that runs perfectly fine on the command line if I simply run it like this php /path/to/script/script.php.
If I now schedule this very command in cron using crontab -e and add the line:
*/1 * * * * php /path/to/script/script.php 2>&1 >> /var/log/logfile.log
it does get executed every minute as expected, and all the output gets put into the log file just like running it on the command line. But some parts of the script just don't seem to work. Those particular parts are lines like:
system('mkdir /mnt/temp', $retVal);
or
exec('mkdir /mnt/temp');
I have tried every possible thing: running it as root, checking permissions on all affected scripts and folders, using /bin/mkdir instead of mkdir. The return value from system() is 0 when run from the CLI and 1 when run via crontab.
Any suggestions?
I couldn't solve the CLI vs. crontab issue, but the solution that worked for me was to use a bash script inside cron, and that bash script in turn calls the PHP script. This works like a charm under any of the users I need to run the script as.
So I can't say that it is or isn't a permissions issue.
Thanks for all your comments guys
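For reference, the usual culprit here is cron's minimal environment: its PATH is often much shorter than an interactive shell's, so bare command names can fail to resolve. A sketch of the wrapper-script approach described above, with hypothetical placeholder paths, run directly for the demo (in the real setup cron would invoke the wrapper):

```shell
# Build a small wrapper of the kind described, with an explicit PATH,
# then run it the way cron would.
cat > /tmp/run_wrapper.sh <<'EOF'
#!/bin/sh
export PATH=/usr/local/bin:/usr/bin:/bin
mkdir -p /tmp/demo_mnt_temp && echo "mkdir ok"
EOF
chmod +x /tmp/run_wrapper.sh
/tmp/run_wrapper.sh
rmdir /tmp/demo_mnt_temp
rm /tmp/run_wrapper.sh
```

The corresponding crontab entry would then point at the wrapper rather than at the PHP script directly.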