I have an external PHP script that processes an XML array to insert, update or delete rows in a database. The script lives in a folder called scripts in the root of the project, and I can run it from the terminal with no problems whatsoever; it updates the database accordingly:
php index.php
I have also set up a schedule in Laravel (using October CMS syntax):
public function registerSchedule($schedule)
{
    $schedule->exec(public_path() . '/script/index.php')->everyMinute();
}
This, however, does nothing. I tried running the schedule manually with artisan on the command line:
php artisan schedule:run
And the output is
Running scheduled command: /Users/x/x/x/x/scripts/index.php > '/dev/null' 2>&1 &
Nothing happens in the database, though.
Did you try to generate a new key?
php artisan key:generate
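Independent of the application key, one way to debug a silent $schedule->exec() entry like the one above (a sketch, assuming October exposes Laravel's scheduler API and that the script really lives in a scripts/ folder at the project root) is to call it through the PHP CLI and keep its output instead of discarding it:
public function registerSchedule($schedule)
{
    // Sketch: run the script via the php binary and log whatever it prints,
    // rather than exec'ing the .php file directly with output sent to /dev/null.
    $schedule->exec('php ' . base_path('scripts/index.php'))
             ->everyMinute()
             ->appendOutputTo(storage_path('logs/scripts-schedule.log'));
}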
I have a Laravel app with the Short Schedule package installed and a crontab entry to execute it.
Inside the executed method I have a shell_exec() call with a command in it.
The problem is that when the code runs automatically via the cron, the shell_exec() doesn't work: the output is null.
When I run the method directly with php artisan, or simply run the schedule manually with php artisan, it works.
To have a clear vision of what's going on:
Inside crontab -e I have the following:
* * * * * php /var/www/html/project/artisan short-schedule:run --lifetime=60
The cron executes this method, which has the following code:
$shell_command = '/home/paul/elrondsdk/erdpy --verbose tx new --receiver xxxx --send --pem asdfa.pem --gas-limit 300 --nonce 12';
$output = shell_exec($shell_command);
Log::info('output', [$output]);
If it runs automatically using the cron, the $output is NULL.
If I run the command manually, I get a proper output.
Initially I thought I needed to specify the exact path in the shell_exec because cron doesn't load the same PATH. Unfortunately that did not solve my problem.
Thanks!
I tried running the command manually, and also running php artisan schedule:run and waiting, and it worked!
It fails ONLY when it is run by the cron.
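A hedged debugging sketch (not from the original thread): cron runs jobs with a minimal environment, so merging stderr into the captured output and logging the exit code usually reveals why shell_exec() returns null, for example by switching to exec():
// Same command as above, with 2>&1 appended so stderr is captured too
$shell_command = '/home/paul/elrondsdk/erdpy --verbose tx new --receiver xxxx --send --pem asdfa.pem --gas-limit 300 --nonce 12 2>&1';
exec($shell_command, $outputLines, $exitCode); // exec() also exposes the exit code
Log::info('output', ['exit_code' => $exitCode, 'output' => $outputLines]);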
I am running the command php artisan schedule:run and it works, but I want things to update everyMinute() automatically without triggering the command every time. If I have to trigger it manually, then what is the point of the scheduler?
1. My command file:
public function handle()
{
    $update = Roi::find(12);
    $update->level = 48;
    $update->save();
}
2. Kernel.php:
protected function schedule(Schedule $schedule)
{
    $schedule->command('level:update')->everyTwoMinutes();
}
I am checking for updates via the updated_at timestamp in the database.
That might solve your issue; it's about the way you start the scheduler:
php artisan schedule:run >> /dev/null 2>&1
That way it runs every minute and checks whatever is due in order to execute it.
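For that to happen without you triggering anything, the scheduler itself still has to be invoked by cron every minute; a typical crontab entry (the project path is a placeholder) is:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
Cron then fires schedule:run each minute, and schedule:run decides whether level:update is actually due (here, every two minutes).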
https://gist.github.com/Splode/94bfa9071625e38f7fd76ae210520d94 (Windows is not supported for command-line task scheduling.)
I think running the schedule does not work on Windows locally.
The only way that worked for me was to create a task in Windows Task Scheduler and link it to a .bat file.
Note: the .bat file content is
@ECHO OFF
php path-of-laravel-project\artisan schedule:run
PAUSE
I'm using Heroku to host a Lumen app. I've set up the logging to send messages to stdout:
// bootstrap/app.php
$app->configureMonologUsing(function ($monolog) {
    /** @var Monolog\Logger $monolog */
    $monolog->pushHandler(new \Monolog\Handler\StreamHandler('php://stdout', \Monolog\Logger::DEBUG));
});
I've tested that this works by adding a log message to the schedule function:
protected function schedule(Schedule $schedule)
{
    \Log::info('Running cronjobs.');
    $schedule
        ->command('update:daily')
        ->timezone('America/Los_Angeles')
        ->everyMinute();
}
The Heroku logs show:
Jan 19 09:42:15 service app/scheduler.4140: [2018-01-19 17:42:14] lumen.INFO: Running cronjobs. [] []
Jan 19 09:42:15 service app/scheduler.4140: Running scheduled command: '/app/.heroku/php/bin/php' artisan update:daily > '/dev/null' 2>&1
However the update:daily command has a number of log messages inside it and none of them are showing up.
I thought this might be because of the > '/dev/null' so I tried adding ->sendOutputTo('php://stdout') and ->sendOutputTo(base_path('storage/logs/cronjobs.log')) to the scheduled task but neither works.
Here's my Procfile:
web: vendor/bin/heroku-php-nginx -C heroku-nginx.conf -l storage/logs/cronjobs.log public/
Ethan is right with his observation regarding > '/dev/null'. Since Heroku collects logs from stdout and stderr, scheduled tasks are not being logged correctly.
Luckily there's a simple workaround: log to a file, and output its contents to stdout. Heroku will then be able to log everything, whether using its native logging or an add-on like LogDNA, Papertrail, etc.
Given that you're logging to a single channel in logging.php, into the file storage/logs/laravel.log, replace your scheduler command php artisan schedule:run with the following:
touch ./storage/logs/laravel.log; tail -f ./storage/logs/laravel.log & php artisan schedule:run
What does it do?
Makes sure a log file exists; if it didn't, the following command would fail.
Tails the log file, so its contents are output to stdout.
Starts the Laravel scheduler.
The advantage of this method is that you don't need to change your code, only the configuration and the environment, so if you're running the app on other platforms it won't change the way they work or log.
Laravel uses the > '/dev/null' when you do not call ->sendOutputTo(). I couldn't figure out an appropriate way to use the ->sendOutputTo() function to send to stdout.
What I did to fix this issue was change the ->command() call to a ->call(function () {}) call and execute the same single line of code the terminal command would have run. When Laravel calls a Closure, it does not dump the output to > '/dev/null'.
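A minimal sketch of that approach (hedged: the closure body here simply re-invokes the same artisan command via Artisan::call(), which may differ from what the original answer ran inline):
protected function schedule(Schedule $schedule)
{
    // A closure-based task runs in-process, so the scheduler never appends
    // "> /dev/null" to it the way it does for ->command() / ->exec() entries.
    $schedule->call(function () {
        \Artisan::call('update:daily'); // assumption: re-running the same command so its Log calls hit the stdout handler
    })->everyMinute();
}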
I am trying to schedule a one-time job with the 'at' command. The script contains the following code:
$cmd = 'echo "/usr/bin/php '.$script_dir.$script_name.' '.$args.'"|/usr/bin/at "'.$time.'" 2>&1';
exec($cmd, $output , $exit_code);
When I run this command from the script, it adds the job to the schedule. I can see this from the line job 103 at Thu Sep 3 15:08:00 2015 in the logs ($output contains the same text). But then nothing happens at the specified time, as if at ignores the job. And there are no error messages in the logs.
When I run the same command with the same args from the command line on the server, it schedules the job and then runs it at the specified time.
I found out that when I try to schedule a job via the PHP script, it runs under the apache user. I tried running the following on the command line on the server:
sudo -u apache echo "/usr/bin/php /var/www/pant/data/www/pant.com/scripts/Run.php firstarg secondarg "|/usr/bin/at "16:00 03.09.2015"
It works correctly too. I checked sudoers and added the apache user with NOPASSWD privileges. The script Run.php has execute rights.
at.deny is empty. at.allow does not exist.
So the question is: why does 'at' not run the command given via the PHP script (exec), but does run the same command from the command line? How do I get it to run?
Thanks to all.
I found the answer by chance at stackexchange.com:
The "problem" is that typically PHP is intended to run as a module in a webserver. You may need to install the command-line version of PHP before you can run PHP scripts from the command line.
I'm using Laravel 4.2 on a remote server and I want to execute Laravel commands like php artisan migrate, but I don't know how.
You can SSH to the server and run the command; add your public key to the remote server to do this without a password. I made a bash script which I can then execute manually from the command line or from a program. Let's say I call it myscript.sh, with the following code:
ssh root@127.0.0.1 << EOF
cd /var/www/app/;
php artisan migrate --force; # --force prevents artisan from asking for a yes/no on production
exit;
EOF
Now I can run 'sh myscript.sh' and it will run the migrations on the remote server.
For completeness, here is how you do this from a Windows host using PowerShell remoting (enable it first):
$Username = "{domain}\{domain account}"
$PasswordSS = ConvertTo-SecureString '{domain account password}' -AsPlainText -Force
$Cred = New-Object System.Management.Automation.PSCredential $Username,$PasswordSS
Invoke-Command -ComputerName {server name} -ScriptBlock { cd d:\wwwroot\{website};php artisan migrate } -Credential $Cred
This will return the result to your local machine.
I use this in VSTS release deployments all the time.
The preferred way to do this:
ssh into your server (for example, with username root and server IP 1.1.1.1: ssh root@1.1.1.1).
go to your project folder (for example: cd /var/....)
run the command php artisan migrate.
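Those three steps can also be collapsed into a single command run from your own machine (username, IP and path are placeholders):
ssh root@1.1.1.1 "cd /var/www/app && php artisan migrate"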
Original post (not the best way):
You can set up a cronjob like this:
* * * * * /usr/bin/php /var/www/app/artisan schedule:run
This will run every minute and execute whatever is scheduled (such as the migration).
You can change this to whatever you want.
If you want the command to be executed when you open a URL, use this code:
Artisan::call('migrate');
Hope this works!
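A minimal sketch of that approach (the route URI is hypothetical, and --force is added here so the command does not wait for a confirmation prompt; remove or protect the route once you are done):
Route::get('/run-migrations', function () {
    Artisan::call('migrate', ['--force' => true]); // --force skips the "are you sure?" prompt in production
    return 'migrations finished';
});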
With the Mac terminal:
Step 1
ssh root@your.ip.address (e.g. 181.6.41.221)
Step 2
Enter your password
Step 3
cd /home/admin/web/yourdomain.com/public_html
Step 4: the Laravel command:
php artisan migrate
(without the quotes)
Ask your host for your IP address and password if you don't know them.
The solution to this problem could be to run a PHP script once (on the server), like this:
<?php
// installation.php file
echo exec('php /var/www/laravel-app/artisan migrate:install');
and then you need to visit installation.php in your browser.
After the migration you should remove the installation file, so nobody can execute it again.
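A slightly safer variant of that script (the token check is hypothetical and not part of the original answer) makes it harder for someone who guesses the filename to trigger it before you delete it:
<?php
// installation.php - hypothetical guarded variant of the snippet above
if (!isset($_GET['token']) || $_GET['token'] !== 'some-long-random-string') {
    http_response_code(403);
    exit('Forbidden');
}
echo exec('php /var/www/laravel-app/artisan migrate:install');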