I am making a PHP CLI application in which I need a key event listener.
Let's say this is my code:
Task to do:
for ($i = 0; $i <= $n; $i++) {
    $message->send($user[$i]);
}
Once I'm done sending messages, I have to keep the connection alive with the following code to receive delivery receipts.
I use $command = fopen("php://stdin", "r"); to receive user commands.
while(true){
$connection->refresh();
}
The connection is automatically kept alive during any activity, but when idle I have to keep the above loop running.
How can I listen for a key press that breaks this loop and executes some function?
PHP was not designed to handle this kind of problem. The magic word would be threads.
The cleanest way I can imagine is to take a look at the pthreads extension. With it, you can use threads like in other languages:
class AsyncOperation extends Thread {
public function __construct($arg){
$this->arg = $arg;
}
public function run(){
if($this->arg){
printf("Hello %s\n", $this->arg);
}
}
}
$thread = new AsyncOperation("World");
if($thread->start())
$thread->join();
The other way would be to run tasks via a queue in a separate script. There are queue servers out there, but it can also be done simply by calling your queue script through shell_exec() and the PHP binary. To avoid waiting until that script is finished, on Linux you need something like ... script.php > /dev/null 2>/dev/null &, and on Windows start /B php ....
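As a rough sketch (assuming a hypothetical worker script named queue_worker.php and that shell access is available), launching it in the background could look like this:

// Hypothetical example: start a worker script without waiting for it to finish
$script = __DIR__ . '/queue_worker.php'; // assumed worker script

if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
    // Windows: "start /B" lets the shell return immediately
    pclose(popen('start /B php ' . escapeshellarg($script), 'r'));
} else {
    // Linux: redirect output and background the process with &
    shell_exec('php ' . escapeshellarg($script) . ' > /dev/null 2>/dev/null &');
}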
Related
I have a problem with terminating processes started from a queue job.
I use the yii2-queue extension to run some long-running system commands that have a total execution time limit controlled by the getTtr() method of RetryableJobInterface. The command may take anywhere from minutes to hours to fully complete, but I need to kill it after it hits the 60-minute mark.
<?php
use Symfony\Component\Process\Process;
use yii\base\BaseObject;
use yii\queue\RetryableJobInterface;
class TailJob extends BaseObject implements RetryableJobInterface
{
public function getTtr()
{
return 10;
}
public function execute($queue)
{
$process = new Process('tail -f /var/log/dpkg.log');
$process->setTimeout(60);
$process->run();
}
public function canRetry($attempt, $error)
{
return false;
}
}
Now, the problem that I face is that even when queue/listen kills the job, the tail command (it's just an example; in production I need to run a different command) keeps running in the background. Is there any way I can force the system to kill the tail command when the job is killed?
Your script needs to keep checking if the timeout was reached; e.g.
while($process->isRunning()) {
$process->checkTimeout();
usleep(200000);
}
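Putting it together inside execute() (a sketch only; it assumes the non-blocking start() is used instead of run(), so the polling loop above can actually execute):

public function execute($queue)
{
    $process = new Process('tail -f /var/log/dpkg.log');
    $process->setTimeout(60 * 60); // 60 minutes
    $process->start(); // non-blocking, unlike run()

    while ($process->isRunning()) {
        try {
            // Stops the child and throws once the timeout is exceeded
            $process->checkTimeout();
        } catch (\Symfony\Component\Process\Exception\ProcessTimedOutException $e) {
            break; // the child has already been stopped by checkTimeout()
        }
        usleep(200000);
    }
}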
Read more about "Process Timeout" here:
https://symfony.com/doc/current/components/process.html
Run the command with a timeout
$process = new Process('timeout 3600 tail -f /var/log/dpkg.log');
This will limit the process to a maximum of 60 minutes (3600 seconds).
If your script kills it first that's fine and if it doesn't then the process will die at the timeout time.
https://linux.die.net/man/1/timeout
Below is what's happening when I run php artisan queue:listen, and my jobs table only has one job.
And this is my code:
public function handle(Xero $xero)
{
$this->getAndCreateXeroSnapshotID();
$this->importInvoices($xero);
$this->importBankTransaction($xero);
$this->importBankStatement($xero);
$this->importBalanceSheet($xero);
$this->importProfitAndLoss($xero);
}
In order for a job to leave the queue, it must reach the end of the handle() function without errors or exceptions.
There must be something breaking inside one or more of your functions.
If an exception is thrown while the job is being processed, the job will automatically be released back onto the queue so it may be attempted again. https://laravel.com/docs/5.8/queues
The same behavior can be achieved with
$this->release()
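For example, a hedged sketch of releasing the job back onto the queue manually (the 60-second delay is an arbitrary choice):

public function handle(Xero $xero)
{
    try {
        $this->importInvoices($xero);
        // ...
    } catch (\Exception $e) {
        // Put the job back on the queue and retry it in 60 seconds
        $this->release(60);
    }
}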
If you can't figure out what is breaking, you can set your job to run only once. If an error is thrown, the job will be considered failed and will be put in the failed jobs queue.
The maximum number of attempts is defined by the --tries switch used
on the queue:work Artisan command. https://laravel.com/docs/5.8/queues
php artisan queue:work --tries=1
If you are using the database queue driver (awesome for debugging), run these commands to create the failed jobs table:
php artisan queue:failed-table
php artisan migrate
Finally, to find out what is wrong with your code, you can catch and log the error.
public function handle(Xero $xero)
{
try{
$this->getAndCreateXeroSnapshotID();
$this->importInvoices($xero);
$this->importBankTransaction($xero);
$this->importBankStatement($xero);
$this->importBalanceSheet($xero);
$this->importProfitAndLoss($xero);
}catch(\Exception $e){
Log::error($e->getMessage());
}
}
You could also set your error log channel to Slack, Bugsnag or whatever; just be sure to check it. Please don't be offended, it's normal to screw up when dealing with Laravel queues. How do you think I got here?
Laravel tries to run the job again and again.
php artisan queue:work --tries=3
The above command will only try to run the job 3 times.
Hope this helps
In my case the problem was the payload: I had declared the variable as private, but it needs to be protected.
class EventJob implements ShouldQueue
{
use InteractsWithQueue, Queueable, SerializesModels;
// payload
protected $command;
// Maximum tries of this event
public $tries = 5;
public function __construct(CommandInterface $command)
{
$this->command = $command;
}
public function handle()
{
$event = I_Event::create([
    'event_type_id' => $this->command->getEventTypeId(),
    'sender_url' => $this->command->getSenderUrl(),
    'sender_ip' => $this->command->getSenderIp()
]);
return $event;
}
}
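For reference, a hedged usage example (assuming $someCommand is a concrete CommandInterface implementation):

// Push the job onto the default queue
dispatch(new EventJob($someCommand));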
The solution that worked for me was to delete the job in the handler once it has been processed.
Consider this example:
class SomeController extends Controller {
    public function uploadProductCsv() {
        // process the file here and push the data into the queue
        Queue::push('SomeController@processFile', $someDataArray);
    }
    public function processFile($job, $data) {
        // logic to process the data
        $job->delete(); // after completing the processing, delete the job
    }
}
Note: this was implemented for Laravel 4.2.
I have the following (simple) lock code for a Laravel 5.3 command:
private $hash = null;
public final function handle() {
try {
$this->hash = md5(serialize([ static::class, $this->arguments(), $this->options() ]));
$this->info("Generated signature ".$this->hash,"v");
if (Redis::exists($this->hash)) {
$this->hash = null;
throw new \Exception("Method ".$this->signature." is already running");
}
Redis::set($this->hash, true);
$this->info("Running method","vv");
$this->runMutuallyExclusiveCommand(); //Actual command is not important here
$this->cleanup();
} catch (\Exception $e) {
$this->error($e->getMessage());
}
}
public function cleanup() {
if (is_string($this->hash)) {
Redis::del($this->hash);
}
}
This works fine if the command is allowed to go through its execution cycle normally (including when a PHP exception is thrown). However, the problem arises when the command is interrupted by other means (e.g. Ctrl-C or the terminal window being closed). In that case the cleanup code is not run and the command is still considered to be "executing", so I need to manually remove the entry from the cache in order to restart it. I have tried running the cleanup code in a __destruct function, but that does not seem to be called either.
My question is: is there a way to set some code to be run when a command is terminated, regardless of how it was terminated?
Short answer is no. When you kill the running process, either by Ctrl-C or just closing the terminal, you terminate it. You would need to have an interrupt in your shell that links to your cleanup code, but that is way out of scope.
There are other options however. Cron jobs can be run at intermittent intervals to perform clean up tasks and other helpful things. You could also create a start up routine that runs prior to your current code. When you execute the start up routine, it could do the cleanup for you, then call your current routine. I believe your best bet is to use a cron job that simply runs at given intervals that then looks for entries in the cache that are no longer appropriate, and then cleans them. Here is a decent site to get you started with cron jobs https://www.linux.com/learn/scheduling-magic-intro-cron-linux
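That said, if the pcntl extension is available in your CLI build (an assumption; it is CLI-only and pcntl_async_signals() needs PHP 7.1+), one possible workaround is to trap the signals at the top of handle() and call the existing cleanup() before exiting. This will not help against SIGKILL or a machine crash, so a periodic cron cleanup is still a good safety net:

// Sketch only: register signal handlers at the start of handle()
if (extension_loaded('pcntl')) {
    pcntl_async_signals(true);

    $terminate = function () {
        $this->cleanup(); // release the Redis lock
        exit(1);
    };

    pcntl_signal(SIGINT, $terminate);  // Ctrl-C
    pcntl_signal(SIGTERM, $terminate); // kill
    pcntl_signal(SIGHUP, $terminate);  // terminal window closed
}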
I just discovered that there is a ConsoleEvents::TERMINATE event in Symfony.
I want to use it to execute some additional process after the command execution (without delaying the command).
But the thing is that I want to execute that process only when a specific command finishes, not for all commands (because I think ConsoleEvents::TERMINATE is fired for every command).
I really don't know how to do that.
Regards.
You can access the command instance from the ConsoleTerminateEvent.
It's almost a copy-paste from the documentation of the Console component. With full Symfony, registering the listener looks a little different, but you should get the idea.
$dispatcher->addListener(
ConsoleEvents::TERMINATE,
function(ConsoleTerminateEvent $event) {
$command = $event->getCommand();
// if it's not the command you want
if (!$command instanceof YourDesiredCommand) {
return;
}
// put your logic here
}
);
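In a full Symfony application, one way to register it (a sketch, assuming a hypothetical YourDesiredCommand class and that autoconfiguration picks up event subscribers) is an event subscriber:

use Symfony\Component\Console\ConsoleEvents;
use Symfony\Component\Console\Event\ConsoleTerminateEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class YourDesiredCommandTerminateSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [ConsoleEvents::TERMINATE => 'onTerminate'];
    }

    public function onTerminate(ConsoleTerminateEvent $event): void
    {
        // Ignore every command except the one we care about
        if (!$event->getCommand() instanceof YourDesiredCommand) {
            return;
        }

        // put your logic here
    }
}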
I am trying to execute some final code after my script gets aborted in PHP. So let's say I have this PHP script:
while(true) {
echo 'loop';
sleep(1);
}
If I execute the script with $ php script.php, it runs until the given execution time.
Now I would like to execute some final code after the script has been aborted, i.e. when:
I hit Ctrl+C
the execution time is over
Is there even a possibility to do some cleanup in those cases?
I tried it with pcntl_signal but with no luck, and also with register_shutdown_function, but that only gets called if the script ends successfully.
UPDATE
I found out (thanks to rch's link) that I can somehow "catch" these events with:
pcntl_signal(SIGTERM, $restartMyself); // kill
pcntl_signal(SIGHUP, $restartMyself); // kill -s HUP or kill -1
pcntl_signal(SIGINT, $restartMyself); // Ctrl-C
But if I extend my code with
$cleanUp = function() {
echo 'clean up';
exit;
};
pcntl_signal(SIGINT, $cleanUp);
the script keeps executing but does not run the code in the $cleanUp closure when I hit Ctrl+C.
The function pcntl_signal() is the answer for the situation when the script is interrupted using Ctrl-C (and by other signals). You have to pay attention to the documentation. It says:
You must use the declare() statement to specify the locations in your program where callbacks are allowed to occur for the signal handler to function properly.
The declare() statement, amongst other things, installs a callback function that handles the dispatching of the signals received since its last call, by calling the function pcntl_signal_dispatch() which in turn calls the signal handlers you installed.
Alternatively, you can call the function pcntl_signal_dispatch() yourself when you consider it's appropriate for the flow of your code (and don't use declare(ticks=1) at all).
This is an example program that uses declare(ticks=1):
declare(ticks=1);
// Install the signal handlers
pcntl_signal(SIGHUP, 'handleSigHup');
pcntl_signal(SIGINT, 'handleSigInt');
pcntl_signal(SIGTERM, 'handleSigTerm');
while(true) {
echo 'loop';
sleep(1);
}
// Reset the signal handlers
pcntl_signal(SIGHUP, SIG_DFL);
pcntl_signal(SIGINT, SIG_DFL);
pcntl_signal(SIGTERM, SIG_DFL);
/**
* SIGHUP: the controlling pseudo or virtual terminal has been closed
*/
function handleSigHup()
{
echo("Caught SIGHUP, terminating.\n");
exit(1);
}
/**
* SIGINT: the user wishes to interrupt the process; this is typically initiated by pressing Control-C
*
* It should be noted that SIGINT is nearly identical to SIGTERM.
*/
function handleSigInt()
{
echo("Caught SIGINT, terminating.\n");
exit(1);
}
/**
* SIGTERM: request process termination
*
* The SIGTERM signal is a generic signal used to cause program termination.
* It is the normal way to politely ask a program to terminate.
* The shell command kill generates SIGTERM by default.
*/
function handleSigTerm()
{
echo("Caught SIGTERM, terminating.\n");
exit(1);
}
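For comparison, a minimal sketch of the alternative mentioned above, calling pcntl_signal_dispatch() explicitly instead of using declare(ticks=1):

// Install the handler without declare(ticks=1)
pcntl_signal(SIGINT, function () {
    echo "Caught SIGINT, terminating.\n";
    exit(1);
});

while (true) {
    echo 'loop';
    sleep(1);
    // Explicitly dispatch any pending signals to the installed handlers
    pcntl_signal_dispatch();
}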
This may have some really useful information; it looks like they're utilizing the same things you've attempted, but seemingly with positive results. Perhaps there's something here you haven't attempted or may have missed.
Automatically Restart PHP Script on Exit
Check this: connection_aborted()
http://php.net/manual/en/function.connection-aborted.php
Here's an example of how to use it to achieve what you want:
http://php.net/manual/en/function.connection-aborted.php#111167