Run a detached endless job in Symfony - php

Symfony: 4.1
PHP: 7.1
I have a working websocket server using Ratchet. The websocket itself works fine, and I can run it from the terminal using a Symfony console command:
php bin/console app:websocket:execute
I'm having trouble getting around the following issues:
You need to dedicate a terminal to running this command
Most webhosting services don't give you access to the terminal
I want admins to be able to toggle the websocket server on and off
Admins aren't required to know what a terminal is
For issue 1, I tried to use this "detaching" cheat, but it doesn't solve issue 2:
php bin/console app:websocket:execute > /dev/null 2>&1 &
In order to tackle all four issues, I have tried using a Process, but the problems with this approach are:
$process->run() - running a process with php bin/console always ends in a timeout
$process->start() - starting the process means it runs asynchronously, but it also means the process is terminated once the request ends, terminating my websocket server too.
Here is an example:
$process = new Process("php bin/console app:websocket:execute");
$process->setWorkingDirectory(getcwd() . "/../");
$process->setTimeout(10);
$process->run();   // Stalls for 10 seconds, then throws a timeout exception
$process->start(); // Doesn't stall, but terminates at the end of the request

// With run(), the check below is never reached because of the timeout exception
if (!$process->isSuccessful()) {
    throw new ProcessFailedException($process);
}
I have also tried creating a console Application and running the command from there, but the same issues as with a Process apply here.
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\BufferedOutput;

$application = new Application($this->kernel);
$application->setAutoExit(false);

$input = new ArrayInput([
    'command' => 'app:websocket:execute',
]);

try {
    $ob = new BufferedOutput();
    $application->run($input, $ob);
    $output = $ob->fetch();
} catch (\Exception $e) {
    return null;
}
As a last resort, I tried a bundle called DtcQueueBundle, because it mentions the following:
Ease of Use
Kickoff background tasks with a line of code or two
Easily add background worker services
Turn any code into background task with a few lines
So I did what they asked: I created a worker and tried to run it as a "background task":
use App\Ratchet\ForumUpdater;
use Ratchet\Http\HttpServer;
use Ratchet\Server\IoServer;
use Ratchet\WebSocket\WsServer;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class SocketWorker extends \Dtc\QueueBundle\Model\Worker
{
    public function execute()
    {
        $server = IoServer::factory(
            new HttpServer(
                new WsServer(
                    new ForumUpdater()
                )
            ),
            8080
        );
        $server->run();

        return "Websocket started";
    }

    public function getName()
    {
        return "websocket-server";
    }
}
Their docs are the absolute worst! I even tried to dig into their code to start jobs from inside my controller, but I couldn't get it to run in a detached manner.
Whatever happens, I believe my command isn't running because it hijacks my PHP thread. I would like to know: is it possible to detach this endless process? Is it even possible to run two PHP instances together? I would think so!
Thanks for any help, sorry for the long post

Related

How to launch a Symfony Command asynchronously during a request using the Symfony Process component?

I'm trying to execute a Symfony command using the Symfony Process component so that it executes asynchronously when an API request comes in.
When I do so, I get the error Exit Code: 127 (Command not found), but when I run the command manually from my console it works like a charm.
This is the call:
public function asyncTriggerExportWitnesses(): bool
{
    $process = new Process(['php /var/www/bin/console app:excel:witness']);
    $process->setTimeout(600);
    $process->setOptions(['create_new_console' => true]);
    $process->start();

    $this->logInfo('pid witness export: ' . $process->getPid());

    if (!$process->isSuccessful()) {
        $this->logError('async witness export failed: ' . $process->getErrorOutput());
        throw new ProcessFailedException($process);
    }

    return true;
}
And this is the error I get:
The command \"'php /var/www/bin/console app:excel:witness'\" failed.
Exit Code: 127(Command not found)
Working directory: /var/www/public
Output:================
Error Output:================
sh: exec: line 1: php /var/www/bin/console app:excel:witness: not found
What is wrong with my usage of the Process component?
Calling it like this doesn't work either:
$process = new Process(['/usr/local/bin/php', '/var/www/bin/console', 'app:excel:witness']);
this results in following error:
The command \"'/usr/local/bin/php' '/var/www/bin/console' 'app:excel:witness'\" failed.
Exit Code: ()
Working directory: /var/www/public
Output:
================
Error Output:
================
First, note that the Process component is not meant to run asynchronously after the parent process dies, so triggering async jobs to run during an API request is not a good use case.
These two comments in the docs about running things asynchronously are very pertinent:
If a Response is sent before a child process had a chance to complete, the server process will be killed (depending on your OS). It means that your task will be stopped right away. Running an asynchronous process is not the same as running a process that survives its parent process.
If you want your process to survive the request/response cycle, you can take advantage of the kernel.terminate event, and run your command synchronously inside this event. Be aware that kernel.terminate is called only if you use PHP-FPM.
Beware also that if you do that, the said PHP-FPM process will not be available to serve any new request until the subprocess is finished. This means you can quickly block your FPM pool if you’re not careful enough. That is why it’s generally way better not to do any fancy things even after the request is sent, but to use a job queue instead.
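To make that concrete, here is a minimal sketch of a kernel.terminate subscriber (this is an illustration, not code from the question; the command name app:excel:witness, the subscriber class name, and the working directory are assumptions):

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\TerminateEvent;
use Symfony\Component\HttpKernel\KernelEvents;
use Symfony\Component\Process\PhpExecutableFinder;
use Symfony\Component\Process\Process;

class RunExportOnTerminateSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [KernelEvents::TERMINATE => 'onTerminate'];
    }

    public function onTerminate(TerminateEvent $event): void
    {
        // Runs synchronously, but only after the response has been sent.
        // The PHP-FPM worker stays busy until the command finishes.
        $php = (new PhpExecutableFinder())->find();

        $process = new Process([$php, 'bin/console', 'app:excel:witness']);
        $process->setWorkingDirectory(\dirname(__DIR__, 2)); // assumed project root
        $process->run();
    }
}

With Symfony's default autoconfiguration the subscriber is registered automatically; in practice you would guard it with some flag (for example a request attribute) so it only fires for the requests that actually need the export.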
If you want to run jobs asynchronously, just store the job "somewhere" (e.g. a database, Redis, a text file, etc.), and have a decoupled consumer go through the "pending jobs" and execute whatever you need without triggering the job within an API request.
This is very easy to implement, but you could also just use Symfony Messenger, which will do it for you: dispatch messages on your API request, and consume them with your job queue consumer.
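As a rough sketch (with assumed class names, not from the question), a minimal Messenger setup could look like this:

use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\Messenger\Handler\MessageHandlerInterface;
use Symfony\Component\Messenger\MessageBusInterface;

// The message: a plain object describing the work to be done.
class ExportWitnessesMessage
{
}

// The handler: executed by the worker started with "php bin/console messenger:consume async".
class ExportWitnessesHandler implements MessageHandlerInterface
{
    public function __invoke(ExportWitnessesMessage $message): void
    {
        // run the actual export here, outside the request/response cycle
    }
}

// In the controller: dispatch the message and return immediately.
public function export(MessageBusInterface $bus): Response
{
    $bus->dispatch(new ExportWitnessesMessage());

    return new Response('', Response::HTTP_ACCEPTED);
}

This assumes the message class is routed to an async transport in config/packages/messenger.yaml; without that routing, the handler would run synchronously inside the request.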
All this being said, your use of Process is also failing because you are mixing sync and async methods.
Your second attempt at calling the command at least succeeds in finding the executable, but it fails because you call isSuccessful() before the job is done.
If you use start() (instead of run()), you cannot simply call isSuccessful() directly, because the job is not finished yet.
Here is how you would execute an async job (although again, this would very rarely be useful during an API request):
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Process\PhpExecutableFinder;
use Symfony\Component\Process\Process;

class ProcessCommand extends Command
{
    protected static $defaultName = 'process_bg';

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $phpBinaryFinder = new PhpExecutableFinder();

        $pr = new Process([$phpBinaryFinder->find(), 'bin/console', 'bg']);
        $pr->setWorkingDirectory(__DIR__ . '/../..');
        $pr->start();

        // Wait until the child process has finished before checking its result
        while ($pr->isRunning()) {
            $output->write('.');
        }

        $output->writeln('');

        if (!$pr->isSuccessful()) {
            $output->writeln('Error!!!');

            return self::FAILURE;
        }

        $output->writeln('Job finished');

        return self::SUCCESS;
    }
}
I like to use exec().
You'd need to add a couple of bits to the end of your command:
Use '2>&1' so the output has somewhere to go. From memory, this is important so that PHP isn't waiting for the output to be returned (or streamed or whatever).
Put '&' on the end to make the command run in the background.
Then it's a good idea to return a 202 (Accepted) rather than 200, because we don't yet know whether it was successful, as the command hasn't completed.
public function runMyCommandIntheBackground(string $projectDir): Response
{
    exec("{$projectDir}/bin/console your:command:name 2>&1 &");

    return new Response('', Response::HTTP_ACCEPTED);
}

Run .sh file using exec Laravel PHP

I am trying to run a .sh file that will import an Excel file into my database. Both files are in the same directory inside the public folder. For some reason the exec command isn't being executed, and no error occurs either.
.sh file code:
IFS=,
while read column1
do
    echo "SQL COMMAND GOES HERE"
done < file.csv | mysql --user='myusername' --password='mypassword' -D databasename;
echo "finish"
In my PHP file I have tried the following:
$content = file_get_contents('folder_name/file_name.sh');
echo exec($content);
And:
shell_exec('sh /folder_name/file_name.sh');
Note: I can directly execute the .sh file from Git Bash, however I want it to be done using a function in a Laravel controller. I'm using Windows.
You can use the Process component of Symfony, which is already included in Laravel: http://symfony.com/doc/current/components/process.html
use Symfony\Component\Process\Process;
use Symfony\Component\Process\Exception\ProcessFailedException;

$process = new Process('sh /folder_name/file_name.sh');
$process->run();

// executes after the command finishes
if (!$process->isSuccessful()) {
    throw new ProcessFailedException($process);
}

echo $process->getOutput();
All of these answers are outdated now; instead use (Symfony 4.2 or higher):
$process = Process::fromShellCommandline('/deploy.sh');
Or
$process = new Process(['/deploy.sh']);
https://symfony.com/doc/current/components/process.html
I know this is a little late, but I can't add a comment (due to being a new member). To fix the Windows issue "'sh' is not recognized as an internal or external command, operable program or batch file.", I had to change the Process call from:
$process = new Process('sh /folder_name/file_name.sh');
to use the following syntax:
$process = new Process('/folder_name/file_name.sh');
The only problem with this is that when uploading to a Linux server it will need to be changed back to call sh.
Hope this helps anyone who hits this issue when following the accepted answer on Windows.
In Symfony 5.2.0, as used by Laravel 8.x (the same applies to Symfony 6.0, used by Laravel 9.x), you need to pass the sh command and your script as separate arguments; in your case:
use Symfony\Component\Process\Process;
$process = new Process(['/usr/bin/sh', 'folder_name/file_name.sh']);
$process->run();
The system will find folder_name/file_name.sh relative to your /public folder (if it is executed from a URL); if you want to use another working directory, specify it as the second Process parameter.
$process = new Process(['/usr/bin/sh', 'folder_name/file_name.sh'], '/var/www/app');
The /usr/bin/sh path can differ between systems, but that is the default location. Type whereis sh to find it.

Start a background symfony/process from symfony/console

I'm attempting to create a cli (symfony/console) that manages long-running child processes (symfony/process) that consume a message queue. I have two commands, Consume and Listen. Consume is a wrapper for Listen so it can be run in the background. Listen is a long running dump of the MQ that shows additional messages as they're added to the message queue.
Problem: When attempting to invoke the cli command for Listen from within Consume, it starts the process and gives me a PID, but then the child process immediately dies. I need to figure out how to get Consume to spin off multiple Listen processes that actually stay running.
In case it's relevant, the OS(es) that this will run on are SLES 12 and Ubuntu 14.04 using PHP 5.5.
Some Code (relevant snippets)
Listen
// Runs as: php mycli.php mq:listen
// Keeps running until you ctrl+C
// On the commandline, I can run this as
// nohup php mycli.php mq:listen 2>&1 > /dev/null &
// and it works fine as a long running process.
protected function execute(InputInterface $input, OutputInterface $output)
{
    $mq = Mq::connect();

    while (true) {
        // read the queue
        $mq->getMessage();
    }
}
Consume
// Runs as: php mycli.php mq:consume --listen
// Goal: Run the mq:listen command in the background
protected function execute(InputInterface $input, OutputInterface $output)
{
    if ($input->getOption('listen')) {
        $process = new Process('php mycli.php mq:listen');
        $process->start();

        $pid = $process->getPid();
        $output->writeln("Worker started with PID: $pid");
    }
}
A task like this would normally be delegated to a task scheduler like Supervisor. It is quite dangerous to leave processes as orphans like this. If your message queue client loses connection but the processes are still running, they effectively turn into zombies.
You need to keep the mq:consume command running for the duration of each sub-process. This is achievable as indicated in the last three examples at: http://symfony.com/blog/new-in-symfony-2-2-process-component-enhancements
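Along those lines, a rough sketch of an mq:consume command that starts several listeners and then stays alive while they run might look like this (assumed code, not taken from the linked blog post; the worker count of 3 is arbitrary):

use Symfony\Component\Process\Process;

protected function execute(InputInterface $input, OutputInterface $output)
{
    $workers = array();

    for ($i = 0; $i < 3; $i++) {
        $process = new Process('php mycli.php mq:listen');
        $process->start();
        $workers[] = $process;

        $output->writeln('Worker started with PID: ' . $process->getPid());
    }

    // Keep the parent command running; when it exits, its children are killed too.
    do {
        $running = false;
        foreach ($workers as $worker) {
            if ($worker->isRunning()) {
                $running = true;
            }
        }
        sleep(1);
    } while ($running);
}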

Get the current task being run when Laravel queue:listen times out

We are running Laravel 4 with supervisord / SQS and we have 30+ different tasks being run using 10 worker processes. All has been going very well, however it seems that certain tasks have started to time out. We get an exception like this:
[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process ""/usr/bin/php5" artisan queue:work --queue="https://sqs.us-east-1.amazonaws.com/xxxx" --delay=0 --memory=128 --sleep=3 --tries=0 --env=development" exceeded the timeout of 180 seconds.
I can catch this exception using this:
App::error(function (Symfony\Component\Process\Exception\ProcessTimedOutException $exception) {
    // caught!
});
However, I can't seem to determine WHICH task is being run when the timeout occurs, and ideally I would also like to access the data which was passed to the task.
I have tried logging the exception object stack trace:
$exception->getTraceAsString()
However, this doesn't get me enough detail about the task that was called.
UPDATE
I have done more research on how the php artisan queue:listen command works. Some references:
Illuminate/Queue/Console/Listen
Illuminate/Queue/Listener
Symfony/Component/Process
Basically, when you call php artisan queue:listen, a SUB-PROCESS (using Symfony/Component/Process) is created which essentially runs the command php artisan queue:work. That sub-process fetches the next job from the queue, runs it, reports when complete, and then the Listener spawns another sub-process to handle the next job.
So, if one of the sub-processes is taking longer than the established timeout limit, the PARENT Listener throws an exception however, the parent instance has no data about the sub-process it created. WITH A SLIGHT EXCEPTION! It appears that the parent Listener DOES handle the sub-process' output. It appears to me that the parent does nothing more than render the sub-process' (worker) output to the console. However, perhaps there is a way to capture this output so that when an exception is thrown, we can log the output and therefore have knowledge about which task was running when the timeout occurred!
I have also noticed that when using supervisord, we are able to specify a stdout_logfile which logs all of the worker output. Right now we are using a single log file for all of our 10 supervisord "programs". We could change this so that each "program" uses its own log file, and then perhaps when the timeout exception is thrown on the parent Listener, we could have it grab the last 10 lines of that log file. That would also give us info on which tasks were being run during the timeout. However, I am not sure how to "inform" the parent Listener which supervisord program it is running under, so it knows which log file to look at!
Looking at the exception class (Symfony\Component\Process\Exception\ProcessTimedOutException) I found the method getProcess(), which returns an instance of Symfony\Component\Process\Process. In there you have getOutput(), which does what its name says.
As you proposed in the comments, you can use this by echoing the class name and parameters in each task and then using the generated output to determine the problematic task. As you said, it's not very elegant, but I can't think of a better way (except maybe tinkering with the Illuminate\Queue\Listener class...).
Here's an example of how you could do it (untested though).
I chose this format for the output:
ClassName:ParametersAsJson;ClassName:ParametersAsJson;
So in a BaseTask I'd do this:
abstract class BaseTask {
    public function fire() {
        echo get_class($this) . ':' . json_encode(func_get_args()) . ';';
    }
}
Unfortunately that means that in every task you will have to call parent::fire:
class Task extends BaseTask {
    public function fire($param1, $param2) {
        parent::fire($param1, $param2);
        // do stuff
    }
}
And finally, the exception handler:
App::error(function (Symfony\Component\Process\Exception\ProcessTimedOutException $exception) {
    $output = $exception->getProcess()->getOutput();

    $tasks = explode(';', $output);
    array_pop($tasks); // remove the empty entry caused by the trailing ";"

    $lastTask = end($tasks);
    $parts = explode(':', $lastTask);

    $className = $parts[0];
    $parameters = json_decode($parts[1]);

    // write details to log
});
Since Laravel 4.1 there is a built-in mechanism for handling failed jobs, where all job details are persisted in the database or available at exception time (or both). Detailed and clear documentation is available on Laravel's website.
To summarise:
Laravel can move failed jobs into a failed_jobs table for later review
You can register an exception handler via Queue::failing, which will receive detailed job information for immediate handling
However, it is questionable whether a timeout is considered a failure in Laravel, so this requires testing as I do not have hands-on experience with queues.
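For illustration, registering such a handler in Laravel 4.1 might look roughly like this (a sketch following the Queue::failing signature from the Laravel docs of that era; untested):

Queue::failing(function ($connection, $job, $data) {
    // $data contains the payload that was passed to the failed job
    Log::error('Queue job failed on connection "' . $connection . '"', $data);
});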
If you use Laravel 4.0, perhaps it would be worthwhile to evaluate an upgrade at least to 4.1 instead of writing complicated code that will become redundant once you really have to upgrade (you will upgrade at some point, right? :) ). The upgrade path seems quite straightforward.
While this does not directly answer your question for Laravel 4.0 it is something you and any future reader may consider.

Launch a Process command from a controller - Symfony 2.1

I have a question about launching a command process from my controller. The command works fine from the command line, and from my controller it works using
MyProcess->run();
but it does not work with
MyProcess->start()
I don't understand why!
I need to launch it in the background (asynchronously), so I need to use start().
// Launch process
$process = new Process('php ../app/console preinscripciones:enviar');
$process->start(); // does not work with start(), but it does if I use run()

// executes after the command finishes
if (!$process->isSuccessful()) {
    throw new \RuntimeException($process->getErrorOutput());
}

echo $process->getOutput();
Thanks in advance!
