I have a command that looks like the following:
php bin/console rabbitmq:multiple-consumer -w run_task
the command above has an endless while loop; it's meant to be that way because it's a listener that listens to the queue. Is there a way to run this command in the background so that I don't have to keep 10 terminal tabs open all the time? If not, what is the solution?
For me this question is more OS-specific than PHP-specific. I would solve it with system tools that can run tasks in the background, e.g. screen (Linux).
If you want a command that does this, you can write a wrapper command using the Symfony Process component that runs the real task inside a screen instance.
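For example, a detached screen session started by hand could look roughly like this (the session name is arbitrary):
# start the consumer in a detached screen session
screen -dmS rabbitmq_consumer php bin/console rabbitmq:multiple-consumer -w run_task
# reattach later to check on it
screen -r rabbitmq_consumer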
Related
How can I run a service-based command after the build process in gitlab-ci.yml?
For example, I'd like to run:
php artisan queue:listen --timeout=0 &
The issue is the build runs perpetually and does not finish as it waits for the results of this command (even though this command never finishes).
Is there any way I can run it as a background task? I tried nohup with no luck.
As mentioned here:
A process started by the Runner, even if you add nohup and & at the end, is marked with the process group ID.
When the job is finished, the Runner sends a kill signal to the whole process group.
So any process started directly from a CI job will be terminated when the job ends.
Using a systemd service (as in this same page) remains an option, if you control the target server.
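If you do control the server, a minimal systemd unit could look something like the sketch below; the unit name, paths, and user are assumptions, so adjust them to your deployment:
# /etc/systemd/system/laravel-queue.service (hypothetical name and path)
[Unit]
Description=Laravel queue listener
After=network.target

[Service]
User=www-data
WorkingDirectory=/var/www
ExecStart=/usr/bin/php /var/www/artisan queue:listen --timeout=0
Restart=always

[Install]
WantedBy=multi-user.target
Enable it once with systemctl enable --now laravel-queue; because the worker runs as its own service, it is not part of the CI job's process group and survives the end of the job.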
With VonC's help - this is the approach I took.
I use Alpine Linux so slightly different to the link he provided, but same approach.
I created a file in /etc/init.d (laravel-queue, matching the service name used below) and made it executable with chmod +x.
With the following contents:
#!/sbin/openrc-run
command="php /var/www/artisan queue:listen"
command_args="--timeout=0"
command_background=true
pidfile="/run/${RC_SVCNAME}.pid"
I then ran it with rc-service laravel-queue start within the gitlab-ci configuration file.
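For reference, the corresponding part of the .gitlab-ci.yml was just one extra script line; the job name and the other steps below are placeholders:
deploy:
  stage: deploy
  script:
    # ... your existing build/deploy steps ...
    - rc-service laravel-queue start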
I'm trying to use Laravel queues to send emails using the database driver. I have already configured it and run the migration for the "jobs" table, and when I run this:
Mail::to($user->email)->queue(new CompraRealizadaAdmin(Cart::content(), $monto_descuento, $envio, $user_array, $direccion, $compra));
A record is added to the "jobs" table, but how do I run the queued jobs from the database table? I understand that to process a job the moment it is added I need to run the command php artisan queue:listen, or, if I need to run all the jobs that are still on the queue, I can use php artisan queue:work.
But how do I run the command without having to open a terminal and keep it open until it has finished?
I had the idea of creating a schedule that runs every minute and just executes Artisan::call('queue:work');, but that does not work.
Any ideas?
Depending on your needs, preferences and your target OS you can use
supervisord (cross platform)
upstart / systemd (linux)
launchd (OS X)
or similar services to manage your queue worker processes.
In fact, the Laravel documentation explains in detail how to install and configure supervisord for this.
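A supervisord program entry along the lines of the one in the Laravel docs looks roughly like this; the program name, paths, and user are assumptions for your environment:
[program:laravel-worker]
; command, paths and user below are examples; adjust to your project
command=php /var/www/artisan queue:work --sleep=3 --tries=3
user=www-data
numprocs=1
autostart=true
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/laravel-worker.log
After saving it (typically under /etc/supervisor/conf.d/), run supervisorctl reread and supervisorctl update; supervisord will then keep the worker running and restart it if it dies.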
It depends on which OS you are working on; for Ubuntu or other Linux systems you can use supervisor or nohup.
But be careful: you have to run the nohup command again every time you reboot your machine.
This is how you can run the command: nohup php artisan queue:work &.
Hope this helps
Is there a way to enable something like hot reload when working with single-file Vue components in Laravel? The Laravel docs suggest gulp watch, but doesn't that mean I will have to kill the Laravel dev server each time I make a change to a component? Is there a way to reload when I add (or make a change to) a component without having to stop the server and run the gulp command?
You should only need to issue the gulp watch command once; it will then continue to run. I don't know what system you're on, but if you're using a Bash terminal you can run it as a background task using the ampersand (&). You could also run your development server as a background task, use Homestead, or open two terminal windows.
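For example, in a Bash shell something like this keeps gulp watch running in the background (the log file name is arbitrary):
# & backgrounds the task; nohup keeps it alive if you close the terminal
nohup gulp watch > gulp-watch.log 2>&1 &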
Can someone possibly advise how I can keep my custom artisan command running forever with the daemon?
I saw the many tutorials about queues, but they don't exactly fit. I am trying to implement "subscribe" with PubNub's PHP library, and this seems like the best way, unless I missed something?
Thanks in advance!
If you run the artisan command from the command line, it can already run indefinitely/forever. You don't need to do anything.
I have an application that has been running the one single artisan command for 97 days straight at the moment.
You then need to make sure it has not crashed for some reason, using something like Supervisor or a web monitoring service like Eyewitness.io.
This will help you run an artisan command forever:
nohup php artisan yourcommand:abc > mylog.log 2>&1 & echo $! >> save_pid.txt
When you want to kill this process, get the pid from the save_pid.txt file and run:
kill <pid>
I have a PHP file that calls exec() on a C++ executable. When the .exe has finished running I need to run the PHP file again, and repeat.
I am wondering what is the best way to do this? This .php file has no user interaction and needs to be completely automated.
EDIT: I found another solution that kills the process if it is already running, which covers this issue well. See the posts on this page:
http://php.net/manual/en/function.getmypid.php
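A rough sketch of that pid-file idea, assuming Linux with the POSIX extension enabled (the file paths and the binary name are placeholders):
<?php
// exit early if a previous run is still alive, otherwise record our pid
$pidFile = '/tmp/exec_wrapper.pid';

if (file_exists($pidFile)) {
    $oldPid = (int) file_get_contents($pidFile);
    // signal 0 only tests whether the process exists, it does not kill it
    if ($oldPid > 0 && posix_kill($oldPid, 0)) {
        exit("Already running with pid $oldPid\n");
    }
}

file_put_contents($pidFile, getmypid());

// run the C++ binary and wait for it to finish
exec('/path/to/worker_binary', $output, $exitCode);

unlink($pidFile);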
Here's a simple Linux line that starts up the script in the background and uses the watch command to restart the script whenever it finishes running:
watch -n 0 php /path/to/script.php &
Once you've started it like this, you can use the ps command to list running processes and find the process ID for watch, and then use kill process_id to stop watch.
It's probably not best practice to run commands like this, but it's a quick and easy solution that doesn't require any special access privileges on the system (whereas using cron might), and doesn't involve editing your code (whereas a codeigniter solution will).
I haven't used CodeIgniter before, but there seems to be a solution as described in the wiki.
Depending on how you can access the system (whether you are an admin or not) and on how you plan to update the automated commands, IMHO you could use either solution (a Linux crontab or a CodeIgniter cron script).
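For the crontab route, an entry like the following (the script path and log path are placeholders) would start the script once a minute:
# re-run the wrapper script every minute
* * * * * php /path/to/script.php >> /var/log/script-cron.log 2>&1
If a run can take longer than a minute, pair it with a pid check like the one sketched earlier so runs do not overlap.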