A script of mine has an execution time of more than a minute, so I would like to run it as a background task.
I've read a lot about this on the internet and learned that print shell_exec('/usr/bin/php -q page.php &'); isn't the solution, since the task is still a child of the process. I've tested it with sleep(10) and indeed, the page that should kick off the job waits for 10 seconds.
So, symcbean has written an article ( http://symcbean.blogspot.nl/2010/02/php-and-long-running-processes.html?m=1 ) suggesting the following code:
print `echo /usr/bin/php -q longThing.php | at now`;
But, unfortunately, the script didn't do anything, and after adding 2>&1 I get the following response:
sh: at: command not found
I've searched a lot for a way to solve this issue, but can't find any solution.
You should provide the fully qualified path to the at command, for example /bin/at.
If you're not sure of the path, you can usually type which at at the command line to find it.
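For example, once you know where the binary lives, the original one-liner becomes (a sketch assuming at is at /usr/bin/at; longThing.php is the script from the question above):

print `echo /usr/bin/php -q longThing.php | /usr/bin/at now 2>&1`;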
I have seen this question on here before, so I am sorry for the repetition, but I have still not found an answer to my problem.
I have a bash script that takes a while to run. It needs to be passed variables set by a user on a webpage (don't worry, there will be plenty of validation for security etc.).
I can get the bash file to start, but it dies after maybe 20 seconds when run from the webpage.
When run from the terminal, it runs absolutely fine.
Ok so I have the following:
$bashFile = shell_exec('./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" ');
Now this executes the bash file, but I read up about shell_exec in PHP with nohup and came up with the following:
$bashFile = shell_exec('nohup ./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" >/dev/null 2>&1 &');
But this still died after a short time :(
So I read up about set_time_limit and max_execution_time and set these to something like 10000000 in the php.ini file... yet still no joy :(
I just want to run a bash script without it timing out and exiting. I don't really want to have to put an intermediate step in there, but someone suggested I look at ZeroMQ to "detach the worker from the process", so I may have to go that route.
Many thanks in advance.
Don't try running a script via the browser if it takes more than 60 seconds; instead, run it over SSH or as a cron job.
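Whichever route you take, it is also worth escaping the user-supplied values before they reach the shell. A sketch reusing the variable names from the question (this addresses shell escaping, not the timeout itself):

<?php
// Escape each user-supplied value so it arrives as a single safe argument.
$args = array_map('escapeshellarg', array(
    $coinName, $coinNameAbreviation, $blockReward, $blockSpacing,
    $targetTimespan, $totalCoins, $firstBitAddy, $seedNode,
    $seedName, $headline,
));
// Detach the job and discard its output so PHP does not wait for it.
shell_exec('nohup ./CoinCreationBashFile.sh ' . implode(' ', $args) . ' >/dev/null 2>&1 &');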
On an Apache server I want to run a PHP script as a cron job that starts another PHP file in the background and exits right after starting it, without waiting for that script to complete, as it will take around 60 minutes. How can this be done?
You should know that there are no threads in PHP.
But you can execute programs and detach them easily if you're running on a Unix/Linux system.
$command = "/usr/bin/php '/path/to/your/php/to/execute.php'";
exec("{$command} > /dev/null 2>&1 & echo -n \$!");
This may do the job. Let's explain a bit:
exec($command);
executes /usr/bin/php '/path/to/your/php/to/execute.php': your script is launched, but Apache will wait for the end of the execution before running the next code.
> /dev/null
will redirect standard output (i.e. your echo, print, etc.) to a virtual file (all output written to it is lost).
2>&1
will redirect error output to standard output, writing to the same virtual, non-existing file. This avoids having logs end up in your apache2/error.log, for example.
&
is the most important thing in your case: it will detach the execution of $command, so exec() will immediately release your PHP code to carry on.
echo -n \$!
will give the PID of your detached execution as the response: it will be returned by exec() and lets you work with it (for example, store the PID in a database and kill the process after some time to avoid zombies).
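Put together, a minimal sketch (the script path is illustrative, and posix_kill requires the posix extension):

<?php
$command = "/usr/bin/php '/path/to/your/php/to/execute.php'";
// Launch detached, silence all output, and capture the child's PID.
$pid = (int) exec("{$command} > /dev/null 2>&1 & echo -n \$!");
// Signal 0 sends nothing but reports whether the process still exists,
// so you can store $pid and poll or kill the process later.
if ($pid > 0 && posix_kill($pid, 0)) {
    echo "Process {$pid} is still running\n";
}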
You need to use the "&" symbol to run a program as a background process.
$ php -f file.php &
That will run the command in the background.
You may also write a shell script:
#!/bin/bash
php -f file.php &
And run this script from crontab.
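For example, a crontab entry along these lines (the path and schedule are just an assumption):

# launch the wrapper script every night at 02:00
0 2 * * * /path/to/your/script.sh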
This may not be the best solution to your specific problem. But for the record, there are threads in PHP:
https://github.com/krakjoe/pthreads
I'm assuming you know how to use threads; this is very young code that I wrote myself, but if you have experience with threads, mutexes and the like, you should be able to solve your problem using this extension.
This is clearly a shameless plug of my own project, and if the user doesn't have the access required to install extensions then it won't help him, but many people find Stack Overflow, and it will no doubt solve other problems...
I have this in one PHP file:
echo shell_exec('nohup /usr/bin/php -f '.CRON_DIRECTORY.'testjob.php > /dev/null 2>&1 &');
and in testjob.php I have:
file_put_contents('test.txt',time()); exit;
And it all runs just dandy. However, if I look at the process list, testjob.php is not terminating after it runs.
(Having to post this as an answer instead of a comment, as Stack Overflow still won't let me post comments...)
Works for me. I made testjob.php exactly as described, and another file test.php with just the given line (except I removed CRON_DIRECTORY, because testjob.php was in the same directory for me).
To be sure I was measuring correctly, I added "sleep(5)" at the top of testjob.php, and in another window I have:
watch 'ps a |grep php'
running. This happens:
I run test.php
test.php exits immediately but testjob.php appears in my list
After 5 seconds it disappears.
I wondered if shell might matter, so I switched from bash to sh. Same result.
I also wondered if it might be because your outer script is long-running. So I put "sleep(10)" at the bottom of test.php. Same result (i.e. testjob.php finishes after 5 seconds, test.php finishes 5 seconds after that).
So, unhelpfully, your problem is somewhere other than the code you've posted.
Remove & from the end of your command. This symbol tells nohup to continue running in the background, thus shell_exec is waiting for the task to complete... and waiting... and waiting... till the end of time ;)
I don't even understand why you would perform this command with nohup.
echo shell_exec('/usr/bin/php -f '.CRON_DIRECTORY.'testjob.php > /dev/null 2>&1');
should be enough.
You're executing PHP and making that execution a background task. That means it will run in the background until it is finished. shell_exec will not kill that process or anything similar.
You might want to set an execution limit; PHP CLI has a setting of unlimited by default. See also set_time_limit in the PHP Manual.
So if you wonder why the PHP process does not terminate, you need to debug the script. If that's too complicated and you're unable to find out why the script runs that long, you might just want to terminate the process after some time, e.g. 1 minute.
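As a sketch, testjob.php could also cap its own runtime, since PHP CLI imposes no limit by default (note that set_time_limit counts only PHP's own execution time, not time spent in sleep() or external calls):

<?php
set_time_limit(60); // abort after roughly 60 seconds of PHP execution time
file_put_contents('test.txt', time());
exit;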
I'm trying to get a cron job to run every 5 minutes on my localhost. Using the Cronnix app I entered the following command:
0,5 * * * * root curl http://localhost:8888/site/ > /dev/null
The script runs fine when I visit http://localhost:8888/site/ in my browser. I've read some stuff about getting CI to run on cron, using wget and various other options, but none of it makes a lot of sense.
In another SO post I found the following command:
wget -O - -q -t 1 http://www.example.com/cron/run
What is the "-O - -q -t 1" syntax exactly?
Are there other options?
-O - means the output goes to stdout (-O /dev/null would discard any output). -q means be quiet (don't print out any progress bars), which would otherwise mess up the look of any log files. -t 1 means to try only once; if the connection fails or times out, it will not try again.
See http://linux.die.net/man/1/wget for a full manual on the wget command.
Edit: I just realised you're piping all this to /dev/null anyway, so you may as well either omit the -O parameter or point it at /dev/null and drop the final redirection.
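So a tidier equivalent would be something like (using the URL from the question):

wget -O /dev/null -q -t 1 http://www.example.com/cron/run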
What I always do is use PHP in CLI mode. It seems more efficient to me.
First set up a cron entry like:
*/5 * * * * /usr/bin/php /var/www/html/cronnedscript.php
cronnedscript.php should be placed in your www root folder.
Then edit cronnedscript.php with:
<?php
$_GET["/mycontroller/index"] = null;
require "index.php";
?>
where mycontroller is the CI controller you want to fire.
If you want the controller to be run only by crond, as opposed to public www requests, add the following line to the controller and to cronnedscript.php:
if (isset($_SERVER['REMOTE_ADDR'])) die('Permission denied');
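Putting both pieces together, cronnedscript.php would then look like this (mycontroller is the controller name from the example above):

<?php
// Refuse public www requests; CLI/cron invocations have no REMOTE_ADDR set.
if (isset($_SERVER['REMOTE_ADDR'])) die('Permission denied');
$_GET["/mycontroller/index"] = null;
require "index.php";
?>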
I realize that this is a reference to Drupal; however, they do a very nice job of explaining what each and every parameter in the wget call is.
Drupal Cron Explanation
If you want the more generic explanation, you can find it here.
Try this: save the following as a .bat file in a folder on the C: drive, give the path of that file to the Task Scheduler, then run it.
C:\xampp\php\php-win.exe -f "C:\xampp\htdocs\folder name\index.php" controllername functionname
I have a PHP script I want to run every minute to check whether there are draft news posts that need to be posted. I was using wget for the cron command in cPanel, but I noticed (after a couple of days) that this was creating a blank file in the main directory every single time it ran. Is there something I can do to stop that from happening?
Thanks.
When wget runs, by default it writes its output to a file, from what I remember.
You probably need to use one of wget's options to specify which file it should write its output to, and use /dev/null as the destination file (it's a "special file" that will "eat" everything written to it).
Judging from man wget, the -O or --output-document option would be a good candidate:
-O file
--output-document=file
The documents will not be written to the appropriate files, but all will be concatenated together and written to file.
So, you might need to use something like this:
wget -O /dev/null http://www.example.com/your-script.php
And, by the way, the output of scripts run from the crontab is often redirected to a logfile, which can always help.
Something like this might help with that:
wget -O /dev/null http://www.example.com/your-script.php >> /YOUR_PATH_logfile.log
And you might also want to redirect the error output to another file (which can be useful for debugging, the day something goes wrong):
wget -O /dev/null http://www.example.com/your-script.php >>/YOUR_PATH/log-output.log 2>>/YOUR_PATH/log-errors.log