Can someone possibly advise how I can keep my custom artisan command running forever, as a daemon?
I have seen the many tutorials using queues, but they don't quite fit my case. I am trying to implement "subscribe" with PubNub's PHP library, and this seems like the best way, unless I missed something?
Thanks in advance!
If you run the artisan command from the command line, it can already run indefinitely/forever. You don't need to do anything special.
I have an application that has been running a single artisan command for 97 days straight at the moment.
You then need to make sure it has not crashed for some reason, with something like Supervisor, or a web monitoring service like Eyewitness.io.
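If you go the Supervisor route, a minimal program entry is enough to restart the command whenever it dies. A rough sketch, assuming a hypothetical command name pubnub:subscribe and a project at /var/www/app (both are placeholders to adjust):
[program:pubnub-subscribe]
command=php /var/www/app/artisan pubnub:subscribe
autostart=true
autorestart=true
user=www-data
redirect_stderr=true
stdout_logfile=/var/www/app/storage/logs/pubnub-subscribe.log
After saving it into Supervisor's conf.d directory, load and start it with supervisorctl reread, supervisorctl update and supervisorctl start pubnub-subscribe.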
This will help you run an artisan command forever:
nohup php artisan yourcommand:abc > mylog.log 2>&1 & echo $! >> save_pid.txt
When you want to kill this process, get the PID from the save_pid.txt file and run:
kill <pid>
How can I run a service-based command after the build process in gitlab-ci.yml?
For example, I'd like to run:
php artisan queue:listen --timeout=0 &
The issue is that the build runs perpetually and never finishes, because it waits for the result of this command (which never finishes).
Is there any way I can run it as a background task? I tried nohup with no luck.
As mentioned here:
A process started by the Runner, even if you add nohup and & at the end, is marked with the process group ID.
When the job is finished, the Runner sends a kill signal to the whole process group.
So any process started directly from a CI job will be terminated when the job ends.
Using a systemd service (as described on that same page) remains an option, if you control the target server.
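For illustration, a minimal systemd unit along those lines; the paths, user and unit name are assumptions, not taken from the linked page:
[Unit]
Description=Laravel queue listener
After=network.target

[Service]
User=www-data
WorkingDirectory=/var/www
ExecStart=/usr/bin/php /var/www/artisan queue:listen --timeout=0
Restart=always

[Install]
WantedBy=multi-user.target
Saved as /etc/systemd/system/laravel-queue.service on the target server, it can be enabled and started with systemctl enable --now laravel-queue.service.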
With VonC's help, this is the approach I took.
I use Alpine Linux, so it's slightly different from the link he provided, but it's the same approach.
I created a file in /etc/init.d (named laravel-queue, matching the service name below) and gave it chmod +x permissions.
It has the following contents:
#!/sbin/openrc-run
command="php /var/www/artisan queue:listen"
command_args="--timeout=0"
command_background=true
pidfile="/run/${RC_SVCNAME}.pid"
I then ran it with rc-service laravel-queue start within the gitlab-ci configuration file.
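In gitlab-ci.yml that looks something like the sketch below; the job and stage names are hypothetical, only the rc-service line comes from the answer above:
deploy:
  stage: deploy
  script:
    - rc-service laravel-queue start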
I want to run a Laravel cron job (a scheduled command) on Windows 10 using Task Scheduler. I tried to create a basic task in Task Scheduler; it shows as running, but no data is added to the database. When I run php artisan schedule:run manually it works perfectly. I am using Laravel and Homestead.
I added these two lines while creating the task in Task Scheduler:
C:\xampp\php\php.exe (why do we have to add this when I don't even use XAMPP anymore? I think this is the part that is causing the issue)
C:\projects\project-name\artisan schedule:run
I would really appreciate it if someone could guide me through this. Thanks.
Update your Task Scheduler command to this:
C:\xampp\php\php.exe C:\projects\project-name\artisan schedule:run
C:\xampp\php\php.exe does not mean you are using XAMPP; it is simply the PHP executable, which happens to live inside your XAMPP folder. We need that executable to run the artisan file (found in C:\projects\project-name\) with the parameter schedule:run.
You can add PHP to your PATH environment variable, so you could just write the command as php C:\path\to\artisan schedule:run.
Also, check the Task Scheduler history/logs, so you can see what it actually tried to do.
As for your issue: yes, C:\xampp\php\php.exe on its own causes a problem. Try typing just that command into cmd. What happens? It simply sits there, and that is exactly what is happening in your scheduled task.
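As a cross-check, you can create the equivalent task from the command line with schtasks; this is a sketch using the paths from the question (adjust them to your machine), running the Laravel scheduler every minute as the documentation expects:
schtasks /create /tn "laravel-schedule-run" /sc minute /mo 1 /tr "C:\xampp\php\php.exe C:\projects\project-name\artisan schedule:run"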
I'm trying to use Laravel queues for sending emails with the database driver. I have already configured it and run the migration for the "jobs" table, and when I run this:
Mail::to($user->email)->queue(new CompraRealizadaAdmin(Cart::content(), $monto_descuento, $envio, $user_array, $direccion, $compra));
A record is added to the "jobs" table, but how do I actually process the queue stored in the database table? I understand that to process jobs the moment they are added I need to run php artisan queue:listen, or, to work through everything still on the queue, php artisan queue:work.
But how do I run the command without having to open a terminal and keep it open until it has finished...?
I had the idea of creating a schedule that runs every minute and just executes Artisan::call('queue:work');, but that does not work.
Any ideas?
Depending on your needs, preferences and your target OS, you can use:
supervisord (cross-platform)
upstart / systemd (Linux)
launchd (OS X)
or similar services to manage your queue worker processes.
In fact, the Laravel documentation explains in great detail how to install and configure supervisord for this.
It depends on which OS you are working on. For Ubuntu or Linux you can use Supervisor or nohup.
But be careful: with nohup you have to start the command again every time you reboot your machine.
That's how you can run this command: nohup php artisan queue:work &.
Hope this helps.
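If the reboot caveat is a problem, one common workaround (not mentioned in the answer above) is a crontab @reboot entry; a sketch assuming a hypothetical project path and log file:
@reboot cd /var/www/app && nohup php artisan queue:work >> storage/logs/queue-worker.log 2>&1 &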
I have a command that looks like the following:
php bin/console rabbitmq:multiple-consumer -w run_task
The command above contains an endless while loop; it's meant to be that way because it's a listener that consumes from the queue. Is there a way to run this command in the background so that I don't have to keep 10 terminal tabs open all the time? If not, what is the solution?
For me this question is more OS-specific than PHP-specific. I would solve this by using system tools that can run tasks in the background, e.g. screen (Linux).
If you want a command that does this, you could write a wrapper command using the Symfony Process component that runs the real task inside a screen instance.
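As a plain shell sketch of the screen approach (the session name is arbitrary):
screen -dmS rabbit_consumer php bin/console rabbitmq:multiple-consumer -w run_task
screen -ls lists the running sessions, and screen -r rabbit_consumer reattaches to this one.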
I am trying to have this Artisan task run as a cron job (host is Bluehost):
php-cli /home3/***/***/artisan task
This works from the command line (SSH), but not with cron.
I know it doesn't execute, because it's supposed to add a DB entry and none appears.
What is wrong there?
EDIT: As there are no errors reported anywhere (either by email or in an output.log file),
I would guess the command executes but fails to do anything.
Could that be because of a database connection issue in the Artisan task?
There is a simple DB::table('...')->insert(...) in there.
But if the task works from the command line, why not from cron?
I ended up making a PHP page in a subdirectory (something like /tasks_to_execute/task.php) containing the script, with echo 'task successful!'; at the end, and the cron job on Bluehost just uses lynx to fetch it.
So no, I didn't figure out why it didn't work as an artisan task; I just found a way around it, a quick fix. The security issue of having the script as a "public" page doesn't really matter, since the script runs every half hour anyway and just updates a database from a Facebook feed.
I faced a similar problem. Try using the full path to PHP, like this: /usr/local/bin/php /home/mysitename/laravel/artisan schedule:run
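Written out as a crontab entry (the paths follow the answer above; the log redirection is an addition so cron failures become visible, and the log path is a placeholder):
* * * * * /usr/local/bin/php /home/mysitename/laravel/artisan schedule:run >> /home/mysitename/schedule.log 2>&1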