How to make sure a php script keeps running on Windows? - php

For my Linux servers, I use this script to make sure my queue listeners keep running in the background. I tried to get it working on Windows, but without luck so far. I need to get the PID of the started php process, but I can't find a way to do this in Windows.
How can I get the PID of a process started with exec() on Windows, or, how can I make sure my Laravel queue listeners keep running on Windows?

There are several ways to get the PID. First of all, you can use the Task Manager, or run the tasklist command in a terminal.
You can also call exec() with the tasklist command and parse the PID from its output.
Last but not least, you could use the PHP function getmypid(), though note that it returns the PID of the current script, not of a process started with exec().
http://php.net/manual/de/function.getmypid.php
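If you need the PID of the child process itself, proc_open() is usually a better fit than exec(), because proc_get_status() reports the PID directly. A minimal sketch (the child command here just sleeps; substitute your queue listener):

```php
<?php
// Start a background PHP process and capture its PID via proc_open().
$cmd = PHP_BINARY . ' -r "sleep(1);"';
$descriptors = [
    0 => ['pipe', 'r'], // stdin
    1 => ['pipe', 'w'], // stdout
    2 => ['pipe', 'w'], // stderr
];
// On Windows, 'bypass_shell' avoids a cmd.exe wrapper, so the reported
// PID belongs to the php process itself rather than to the shell.
$process = proc_open($cmd, $descriptors, $pipes, null, null, ['bypass_shell' => true]);

if (is_resource($process)) {
    $status = proc_get_status($process);
    echo "PID: {$status['pid']}\n";
    proc_close($process); // waits for the child to exit
}
```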

You could also use https://eyewitness.io to monitor your queues and cron jobs. You'll get an alert if any of them stop working. There is a Laravel 4 package available to help make integration easier.

Related

How can I run a daemon on xampp using PHP?

I have an XML database that I want to manage independently from users on my website. Looking into the matter, it appears that I should write a daemon script to manage my database. That is all fine and dandy, but I feel like I'm opening a can of worms. I wanted to write my daemon script in PHP, so I looked into PCNTL. But I quickly learned that PCNTL is not suited for web servers. So now I am stumped. How can I get a daemon to run on my server? Do I need to learn another language? I only want to write my own scripts, but I feel lost. I would prefer to write my daemon in PHP as I am familiar with the language.
I have been researching everything from PCNTL, CLI, SO questions, numerous articles on daemon processes... etc
I am running PHP 5.6.32 (CLI) on Windows 7, with Apache (XAMPP 5.6.32).
EDIT: I also have windows setup to run PHP from command prompt.
There's nothing wrong with running a PHP daemon, but it isn't the fastest option, especially before PHP 7.0. You can proceed in two ways:
Using cron jobs: if you're on a Unix system, crontab will be fine. You specify the interval, the system automatically executes the script, and the script exits when done.
A true daemon: first set max_execution_time to 0 (infinite) in php.ini, then call set_time_limit(0); once at the start of your daemon. However, if an error is thrown and not caught, the script will exit and you will have to start it again by hand; and be careful with a try...catch inside a while loop, because it can easily turn into an endless loop. Execute the script with php -f daemon.php.
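The daemon approach described above can be sketched as a small long-running loop. This version stops after three ticks so the example terminates; a real daemon would loop forever:

```php
<?php
// daemon.php - minimal long-running loop; start it with: php -f daemon.php
set_time_limit(0); // lift PHP's execution time limit

$ticks = 0;
while (true) {
    // ... do one unit of work here ...
    echo date('H:i:s'), " tick\n";
    if (++$ticks >= 3) {
        break; // remove this in a real daemon
    }
    sleep(2); // avoid busy-waiting between units of work
}
```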

Laravel 5.3 scheduler Runs once on Windows then ends without any further processing

OS: Windows 8.1
Laravel-version: 5.3.15
Hello,
I'm having a bit of trouble getting Laravel's scheduler to run properly on my machine and STAY RUNNING in a way where I can see some sort of output to indicate that it's actually working. Every time I run php artisan schedule:run it calls any listed commands once and then just dies. It doesn't repeat or anything. Example attempt below:
The Code:
The Result When Trying to Run the Scheduler:
What I've tried:
I've tried multiple suggested fixes from the link below, some of which are also referenced throughout Stack Overflow, including adding a batch file, setting PHP's path in my system variables, etc., as suggested in the link below by Kryptonit3:
https://laracasts.com/discuss/channels/general-discussion/running-schedulerun-on-windows
However, nothing works. Even in the "Windows Task Scheduler" tool it will just run once, end any further processing in the CLI and then close the window. There is no actual indication that this scheduler is running in the background indefinitely (Sort-of like a background job only without the queues).
Questions:
1 - With no indication (the echoed "I am Scheduled") how do I know if this is actually working?
2 - Why does this command die after it successfully completes once?
3 - Does my Laravel version have anything to do with the scheduler not running permanently when I try using php artisan schedule:run in the CLI or Windows Task Scheduler?
4 - Mainly what I'm trying to accomplish here is every minute the system would scan a single DB table's field. How do I accomplish this?
5 - Can someone please give me some clarification on this? Or point me in the right direction so that I can get this thing really running indefinitely without it just ending after the first run?
Note: I do not want to use Laravel Forge or any other external service for something so simple. I feel that would be overkill and unnecessary.
After doing some rather serious digging, I found that everything Kryptonit3 said in the step-by-step answer I referenced was correct. However, I did not know I needed to restart my computer after creating a new task with the Windows Task Scheduler; I thought this would happen automatically. Knowing that would have saved me two days of debugging, but regardless, if someone else comes across this problem they will know to reboot the computer.
Reference:
https://laracasts.com/discuss/channels/general-discussion/running-schedulerun-on-windows
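For reference, the recipe from that thread boils down to a batch file that Windows Task Scheduler invokes every minute; schedule:run is designed to be triggered externally like this, executing whatever is due and then exiting. A sketch, with all paths illustrative:

```bat
:: scheduler.bat (adjust the project path for your machine)
cd /d C:\path\to\your\project
php artisan schedule:run 1>NUL 2>&1
```

Then create a task that runs it every minute, for example with schtasks /Create /SC MINUTE /TN "LaravelScheduler" /TR "C:\path\to\scheduler.bat", and reboot afterwards as noted above.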

PHP console command doesn't stop at end of process

I get a strange PHP bug on a PHP 5.6 / Symfony 2.7 project, running on a CentOS6 server through Apache.
I have a Symfony console command running as a service which launches some other console commands every 2 seconds. I use the Symfony Process component to launch the sub-processes and have timeout management.
And everything is done to avoid to launch parallel processes from the main command.
The issue I have is that sometimes the PHP console commands don't stop after finishing their work. If I launch the commands by hand, everything runs correctly on the PHP side, but I don't get the prompt back after the last PHP statement finishes, unless I press Ctrl+C.
The issue happened a lot when the PHP version was 5.5; with PHP 5.6 it (only) happens randomly. When it happens, I can see a lot of stuck php sub-processes, probably launched by the main command.
I just can't find any explanation, since the php commands don't raise any error. The console just gets stuck waiting for something to finish.
Has anybody a possible solution to this issue?

Running Gearman PHP workers infinitely

I've set up Gearman to work with PHP. I'm really new to Gearman and task managing, and the problem I'm having is that when I close the terminal window running the worker, the process stops too. I want the PHP worker script to run forever, but I don't know how to achieve this. Am I missing something from the documentation?
Take a look at Gearman Manager. It's designed to work as a service that you can start / stop. It's installed with install.sh.
/etc/init.d/gearman-manager start
/etc/init.d/gearman-manager stop
In case anybody is interested in a simpler way to handle this: use a shell script to call worker.php in a loop. You can also pass arguments to the PHP CLI (here, -rmethod):
#!/bin/bash
while true
do
php -q /path/to/worker.php -rmethod
sleep 5
done
Another way is to use Supervisord; see "Running Gearman Workers in the Background".
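With Supervisord, a minimal program definition keeps the worker alive and restarts it whenever it exits (paths and the program name are illustrative):

```ini
[program:gearman-worker]
command=php /path/to/worker.php
autostart=true
autorestart=true
stderr_logfile=/var/log/gearman-worker.err.log
```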

Executing multiple simultaneous php scripts from CLI

I have 55 php files that I would like to run simultaneously from the command line. Right now, I am running them in multiple CLI windows using the code:
php Script1.php
I would like to be able to call one single php file that would execute all 55 php files simultaneously. I have been reading about how to make the command line not wait for the output, but I can't seem to make it work.
This thread:
How to run multiple PHP scripts from CLI
suggests putting an & at the end of the command to run the command in the background, but using the xampp CLI this doesn't seem to do anything.
Any ideas greatly appreciated.
Brian
By mentioning XAMPP, I assume you are on Windows. I think what you need is the start command; you probably need start php Script1.php. For more info, run:
start /?|more
Linux
Apart from adding an &, you also need to redirect output somewhere; otherwise your PHP process waits until the other process finishes, because there could be more output. Note that the redirection has to come before the &:
exec('/path/to/program > /dev/null 2>&1 &')
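For the "one single php file" the question asks about, a small launcher can spawn every script without waiting for any of them. A sketch, assuming the scripts are named Script1.php through Script55.php as in the question:

```php
<?php
// launch_all.php - fire off Script1.php .. Script55.php without waiting.
$isWindows = strtoupper(substr(PHP_OS, 0, 3)) === 'WIN';

for ($i = 1; $i <= 55; $i++) {
    $script = "Script{$i}.php";
    if ($isWindows) {
        // "start /B" launches without opening a new window and returns at once
        pclose(popen("start /B php {$script}", 'r'));
    } else {
        // Redirect output *before* backgrounding so exec() returns immediately
        exec("php {$script} > /dev/null 2>&1 &");
    }
}
```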
You could use the php forking mechanism. Read about it here: http://php.net/manual/en/function.pcntl-fork.php
