(Laravel 3.2) Artisan task in Cron Job not running?

I am trying to have this Artisan task run as a cron job (host is Bluehost):
php-cli /home3/***/***/artisan task
This works from the command line (over SSH), but not from cron.
I can tell it isn't executing because the task is supposed to add a DB entry, and none appears.
What is wrong there?
EDIT: As there are no errors reported anywhere (either by email or in an output.log file),
I would guess the command executes but fails to do anything.
Could that be because of a database connection issue in the Artisan task?
There is a simple DB::table('...')->insert(...) call in there.
But if the task works from the command line, why not from cron?

I ended up making a PHP page in a subdirectory (something like /tasks_to_execute/task.php) containing the script, with "echo 'task successful!'" at the end, and the cron job on Bluehost just "lynx"es to it.
So, no, I didn't figure out why it didn't work as an Artisan task; I just found a quick way around it. The security issue of having the script as a "public" page doesn't really matter, as the script executes every half hour anyway and just updates a database from a Facebook feed.
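For reference, a cron entry for that kind of workaround might look like the line below. The URL, home directory and half-hour schedule are placeholders based on the description above, not the exact values used:
*/30 * * * * lynx -dump http://example.com/tasks_to_execute/task.php >> /home3/username/task_cron.log 2>&1
Redirecting the output into a log file at least gives you something to read when the job silently fails.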

I faced a similar problem. Try using the full path to the PHP binary, like this:
/usr/local/bin/php /home/mysitename/laravel/artisan schedule:run
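For the original Bluehost setup, the equivalent would be something like the crontab line below. The binary and project paths are assumptions (run which php over SSH, or ask the host where the CLI binary lives), and the output redirect is there so any error actually ends up somewhere you can read:
*/30 * * * * /usr/local/bin/php /home3/username/laravel/artisan task >> /home3/username/artisan_cron.log 2>&1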

Related

How can I run background tasks in GitLab CI/CD?

How can I run a service-based command after the build process in gitlab-ci.yml?
For example, I'd like to run:
php artisan queue:listen --timeout=0 &
The issue is that the build runs perpetually and never finishes, because it waits on the result of this command (which itself never terminates).
Is there any way I can run it as a background task? I tried nohup with no luck.
As mentioned here:
A process started by the Runner, even if you add nohup and & at the end, is marked with a process group ID.
When the job is finished, the Runner sends a kill signal to the whole process group.
So any process started directly from the CI job will be terminated at the job's end.
Using a systemd service (as in this same page) remains an option, if you control the target server.
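For completeness, a minimal systemd unit for that idea might look like the sketch below; the unit name, PHP path and project path are assumptions for illustration:
[Unit]
Description=Laravel queue listener
After=network.target

[Service]
ExecStart=/usr/bin/php /var/www/artisan queue:listen --timeout=0
Restart=always
User=www-data

[Install]
WantedBy=multi-user.target
The CI job then only needs to (re)start the unit, e.g. systemctl restart laravel-queue, instead of launching the listener itself.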
With VonC's help, this is the approach I took.
I use Alpine Linux, so it's slightly different from the link he provided, but it is the same approach.
I created a file in /etc/init.d, made it executable with chmod +x, and gave it the following contents:
#!/sbin/openrc-run
command="php /var/www/artisan queue:listen"
command_args="--timeout=0"
command_background=true
pidfile="/run/${RC_SVCNAME}.pid"
I then ran it with rc-service laravel-queue start within the gitlab-ci configuration file.
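In practice, assuming the init script was saved as /etc/init.d/laravel-queue (the filename is my assumption, matching the start command above), the relevant steps in the CI script boil down to something like:
# make the OpenRC service script executable
chmod +x /etc/init.d/laravel-queue
# start the queue listener as a background service
rc-service laravel-queue start
# optionally confirm it is running
rc-service laravel-queue status
Because OpenRC, not the Runner, now owns the process, it survives the kill signal sent to the job's process group when the CI job finishes.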

How can I run a cron job for a PHP file properly inside of cPanel?

I am trying to run a cron job that automates the cache rebuild function of the Woo Search Box plugin.
The plugin provides this dynamic cron command, which I set to run once a day at 1 AM:
php /home/carit/public_html/index.php 16021417635f7ebe43c604a
I am well experienced with cron jobs inside of cPanel, but I can't figure out what prevents this cron from running. I have tried what the documentation suggests here, and also tried /usr/bin/php instead of php, but it didn't make any difference.
If I run this command in my server's terminal, it works like a charm and the cache is rebuilt. It only seems to fail when I run it via cron.
Does anyone have any suggestions or ideas on why this cron job doesn't work?
Please correct me in the comments in case I forgot to provide any key information about my problem.
Thanks.
It seems that newer CentOS or Ubuntu servers use either /usr/local/bin/php or simply /usr/bin/php. If you have FTP or shell access to the server the cron should run on, check which of those two paths actually holds the PHP binary before falling back to a plain php prefix in the cron command.
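Putting that together, the cron entry in cPanel would end up looking roughly like the line below. The 1 AM schedule and the index.php path come from the question; the log path and the choice of /usr/local/bin/php are assumptions to adjust for your server:
0 1 * * * /usr/local/bin/php /home/carit/public_html/index.php 16021417635f7ebe43c604a >> /home/carit/cache_rebuild.log 2>&1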

Running a laravel cron job (locally, to test) using Task Scheduler in Windows 10

I want to run a Laravel cron job (a scheduled command) on Windows 10 using Task Scheduler. I created a basic task in the scheduler; it shows as running, but no data is added to the database. When I run "php artisan schedule:run" manually it works perfectly. I am using Laravel and Homestead.
I added these two lines while creating the task in the scheduler:
C:\xampp\php\php.exe (why do we have to add this when I don't even use XAMPP anymore? I think this is the part that is causing the issue)
C:\projects\project-name\artisan schedule:run
I would really appreciate it if someone could guide me through this, thanks.
Update your task scheduler command to this:
C:\xampp\php\php.exe C:\projects\project-name\artisan schedule:run
C:\xampp\php\php.exe does not mean you are using XAMPP; we're just using the PHP executable, which happens to live inside your XAMPP folder, because we need it to run the artisan file (found in C:\projects\project-name\) with the parameter schedule:run.
You can add your PHP executable's folder to the PATH environment variable so you can simply write the command as php C:\path\to\artisan schedule:run.
Also, check the Task Scheduler history/logs so you can see what it actually tried to do.
As for your issue: yes, C:\xampp\php\php.exe on its own causes a problem. Try typing just that command in cmd. What happens? It sits there waiting for input. That is also what happens in your scheduled task when php.exe is started without the artisan path as an argument.
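Concretely, in the task's Action settings the split would look roughly like this (the "Start in" value is an assumption; it just makes any relative paths resolve inside the project):
Program/script:  C:\xampp\php\php.exe
Add arguments:   C:\projects\project-name\artisan schedule:run
Start in:        C:\projects\project-name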

php artisan up command not working in laravel?

I have this very strange problem in Laravel. I successfully put my website into maintenance mode via Artisan with this command:
php artisan down
But now I have to put my website back into live mode. I tried:
php artisan up
However, the site isn't going live even though I get a success message. Have you ever faced this issue?
What's the fix?
I'm on:
1. MacBook Pro with MAMP
2. Laravel 5.1
Thanks
The artisan up command simply deletes the storage/framework/down file. Check whether the file still exists after you execute the up command. If it does, it is likely a file permissions/ownership issue; whenever you run the down/up commands, make sure you run them as the same user that runs your application.
To get the site up and running again, remove the storage/framework/down file manually.
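A quick way to check and clear it from the project root (assuming a standard Laravel 5.1 directory layout) is:
# see whether the maintenance flag file is still there
ls -l storage/framework/down
# remove it manually if php artisan up could not (you may need the owning user or sudo)
rm storage/framework/down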

Running an artisan command forever with laravel forge?

Can someone possibly advise how I can keep my custom artisan command running forever as a daemon?
I've seen the many tutorials about queues, but they don't exactly fit. I am trying to implement "subscribe" with PubNub's PHP library, and this seems like the best way, unless I missed something?
Thanks in advance!
If you run the artisan command from the command line, it can already run indefinitely/forever; you don't need to do anything special.
I have an application that has been running one single artisan command for 97 days straight at the moment.
You then just need to make sure it hasn't crashed for some reason, with something like Supervisor or a web monitoring service like Eyewitness.io.
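If you go the Supervisor route, a minimal program definition might look like the sketch below; the program name, command and paths are placeholders for illustration, not something Forge generates for you:
[program:pubnub-subscribe]
command=php /home/forge/mysite.com/artisan yourcommand:abc
autostart=true
autorestart=true
user=forge
redirect_stderr=true
stdout_logfile=/home/forge/mysite.com/storage/logs/pubnub-subscribe.log
After saving it (typically under /etc/supervisor/conf.d/), run supervisorctl reread and supervisorctl update so Supervisor picks it up, keeps the command running, and restarts it if it crashes.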
This will help you run an artisan command forever:
nohup php artisan yourcommand:abc > mylog.log 2>&1 & echo $! >> save_pid.txt
When you want to stop this process, read the PID back from the save_pid.txt file and kill it:
kill $(cat save_pid.txt)
