I have a class in my Symfony 2.3 project that is doing some http requests and takes some time.
I would like to run this task as a background process, so that the server returns an answer to the client and the background process continues running.
Do you know how to do that in Symfony?
I found the Process Component: http://symfony.com/doc/current/components/process.html but I am not sure if I can run a class method from there.
A simple way to do this is to separate the heavy lifting from the response by using a queue and a symfony command to process the queue.
http://symfony.com/doc/current/components/console/introduction.html
Create a symfony command that processes the jobs added to a queue, then add the work to be done to the queue from your controller. The queue will probably be implemented as a database table of jobs.
That way you can return a success response to the user and run a cron job on the server regularly to process the work you require.
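As a rough sketch of that idea (the command name, the Job entity and its field names are made-up examples, and error handling is omitted), a Symfony 2.3 console command run from cron could look like this:
<?php
use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class ProcessJobQueueCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('app:process-job-queue')
             ->setDescription('Processes pending jobs from the job table');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $em = $this->getContainer()->get('doctrine')->getManager();

        // The controller only inserts a "pending" row and returns immediately;
        // the slow HTTP work happens here, outside of the request.
        $jobs = $em->getRepository('AppBundle:Job')->findBy(array('status' => 'pending'));

        foreach ($jobs as $job) {
            // ...perform the long-running HTTP requests for this job...
            $job->setStatus('done');
        }

        $em->flush();
    }
}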
This is something you could easily do with the enqueue library. First, you can choose from a variety of transports, such as AMQP, STOMP, Redis, Amazon SQS, Filesystem and so on.
Secondly, it's super easy to use. Let's start with the installation:
You have to install the enqueue/enqueue-bundle library and one of the transports. Assuming you choose the filesystem transport, that's the enqueue/fs library:
composer require enqueue/enqueue-bundle enqueue/fs
Now let's see how you can send messages from your POST script:
<?php
use Enqueue\Client\ProducerInterface;
use Symfony\Component\DependencyInjection\Container;
/** @var Container $container */
/** @var ProducerInterface $producer */
$producer = $container->get('enqueue.client.producer');
$producer->sendCommand('a_background_task', 'task_data');
For the consumption, you have to create a processor service and tag it with the enqueue.client.processor tag:
<?php
use Enqueue\Client\CommandSubscriberInterface;
use Enqueue\Psr\PsrContext;
use Enqueue\Psr\PsrMessage;
use Enqueue\Psr\PsrProcessor;
class BackgroundTask implements PsrProcessor, CommandSubscriberInterface
{
    public static function getSubscribedCommand()
    {
        return 'a_background_task';
    }

    public function process(PsrMessage $message, PsrContext $context)
    {
        // do job

        return self::ACK;
    }
}
And run a consumer with a command:
./bin/console enqueue:consume --setup-broker -vvv
On prod you most likely need more than one consumer, and if a consumer process exits it has to be restarted. To address this you need some sort of process manager. There are several options:
http://supervisord.org/ - You need an extra service. It has to be configured properly.
A pure PHP process manager like this. Based on the Symfony Process component and pure PHP code. It can handle process reboots, correct exit on a SIGTERM signal and a lot more.
A php\swoole process manager like this. It requires the swoole PHP extension, but its performance is amazing.
I have an API created with Slim 3.
In it I have some curl executions, such as sending a push notification to a user.
I want to send the response to the requester, then execute curl or any other function.
I read about threads in PHP and used pthreads-polyfill, but it sends the response only after the thread finishes.
Sample tested code:
$app->get('/test', function (Request $request, Response $response) {
PushController::publish("1111111", "HELLO");
$result = 'OK';
$response->getBody()->write($result);
return $response->withStatus(200);
});
I understand what you are trying to do, and threading is not the answer.
One solution is to call a script from the main one, as you mentioned.
A more elegant one, imho, is to call fastcgi_finish_request. It will return the answer to the requester and continue to execute the script. Unfortunately this function is only available with PHP-FPM, which is the industry standard by now but does not necessarily come as the default when you install a LAMP stack.
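A minimal sketch of the idea, outside of any framework specifics (PushController is the class from the question):
echo 'OK';                   // the body the requester should receive
fastcgi_finish_request();    // the response is flushed to the client here (PHP-FPM only)

// everything below still runs, but the client is no longer waiting
PushController::publish("1111111", "HELLO");   // slow curl work continues server-side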
For your requirement, two solutions may suit:
Queue
Cron
Redis can be used as a queuing server. For that you need to install the Redis server on your system. Predis is a PHP client library for Redis. For more details about Redis you can read the official Redis site. Beanstalkd can also be used as a queuing server.
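As a rough sketch with Predis (the queue key name and payload are made up, and Redis is assumed to be on localhost), the request pushes a job and a separate worker script pops it:
<?php
require 'vendor/autoload.php';

$redis = new Predis\Client();

// Producer side, inside the request: enqueue the work and return immediately.
$redis->rpush('queue:push', json_encode(array('user' => '1111111', 'message' => 'HELLO')));

// Worker side, in a long-running CLI script (or one started by cron):
// BLPOP blocks until a job arrives, so the loop is mostly idle.
while (true) {
    list(, $raw) = $redis->blpop('queue:push', 0);
    $job = json_decode($raw, true);
    // ...send the push notification here...
}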
To learn how to create cron jobs, you can refer to the existing Stack Overflow question.
I'll preface this by admitting slight sleep-deprivation.
The setup is as follows:
API Endpoint (Server A) receives an incoming call, and adds this to a specific queue on the RabbitMQ Server (Server B).
RabbitMQ (Server B) is simply a RabbitMQ Queue Server. Nothing more, nothing less.
Laravel Installation (Server C) is our actual Laravel install, which is meant to look for jobs on specific queues and do things with them.
We have a RabbitMQ package in the Laravel install, which allows the use of the regular Laravel Queue mechanics over a RabbitMQ connection.
The issue I've come across is that we can spawn a worker for a queue - but since we're not generating the jobs by passing a $job class (the job content itself is most often a JSON array), the Laravel install has no idea what to do with the job.
So my question revolves mainly around how to approach a scenario like this. I'm thinking that using the Queue-functionality in Laravel won't do what I need it to do. Can you see an approach that I'm missing? Do I really need to spawn a daemon on a non-framework script to handle this?
Your input is much appreciated!
An alternative approach would be a listener on your Laravel application consuming the JSON messages and acting on those.
A queue listener can be created using a package such as https://github.com/bschmitt/laravel-amqp (a generic AMQP bridge for Laravel) or https://github.com/needle-project/laravel-rabbitmq (a bridge more specialised for RabbitMQ).
The queue consumer then reads the JSON payload, saves the payload as appropriate data, then decides what jobs to dispatch as a result within the Laravel application, as handled by the https://github.com/vyuldashev/laravel-queue-rabbitmq package.
The two applications still communicate with plain JSON, and not the Laravel-oriented JSON containing the serialised job class.
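A rough sketch of such a listener, using php-amqplib directly rather than one of the bridge packages above (the host, queue name and the HandleIncomingPayload job class are made-up examples):
<?php
use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('server-b-host', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('incoming-api-calls', false, true, false, false);

$channel->basic_consume('incoming-api-calls', '', false, false, false, false,
    function (AMQPMessage $message) {
        // plain JSON from Server A, no serialised Laravel job class
        $payload = json_decode($message->body, true);

        // hand the data off to the Laravel application as a normal job
        dispatch(new \App\Jobs\HandleIncomingPayload($payload));

        $message->delivery_info['channel']->basic_ack($message->delivery_info['delivery_tag']);
    }
);

while (count($channel->callbacks)) {
    $channel->wait();
}
Wrapped in an Artisan command and kept alive by something like supervisord, this gives you the daemon-like consumer without leaving the framework.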
The solution is indeed to replicate the job class onto the application issuing the job. That code does not need every dependency the job requires to actually run, since the issuing side only serializes the job when pushing it.
I have a specific task in a CakePHP Shell and it's executed by a CRON job. But I want the users to be able to execute it from a web interface (like a button or something like this) whenever he wants.
So, my question is: is it possible to execute a shell from a controller?
Emulate this in a controller:
bin/cake MyShell
I know it was possible in the previous versions of CakePHP, but I didn't find anything related to this in the newest version. And using exec("bin/cake MyShell") seems really dirty to me.
Create a shell object, then call any of its functions that you want to execute:
$myShell = new \App\Shell\MyShell;
$myShell->anyShellFun();
In order to call a shell from your controller, you need to do this in your controller function:
namespace App\Controller;
use App\Controller\AppController;
use Cake\Console\ShellDispatcher; // use this shell dispatcher
/**
* Example Controller
*
*/
class ExampleController extends AppController
{
/**
* Index method
*
* @return \Cake\Network\Response|null
*/
public function index()
{
$shell = new ShellDispatcher(); //create object of your Shell Dispatcher
$output = $shell->run(['cake', 'foo']); //here foo is your shell name
if (0 === $output) {
$this->Flash->success('Success from shell command.');
} else {
$this->Flash->error('Failure from shell command.');
}
return $this->redirect('/');
}
}
Hope this answers your question; if there's any problem, leave a comment.
Thank you.
If you can't mitigate the need to do this as dogmatic suggests, then read on.
So you have a (potentially) long-running job you want to perform and you don't want the user to wait.
As the PHP code your user is executing happens during a request that has been started by Apache, any code that is executed will stall that request until its completion (unless you hit Apache's request timeout).
If the above isn't acceptable for your application then you will need to trigger PHP outwith the Apache request (ie. from the command line).
Usability-wise, at this point it would make sense to notify your user that you are processing data in the background. Anything from a message telling them they can check back later to a spinning progress bar that polls your application over ajax to detect job completion.
The simplest approach is to have a cronjob that executes a PHP script (ie. CakePHP shell) on some interval (at minimum, this is once per minute). Here you can perform such tasks in the background.
Some issues arise with background jobs however. How do you know when they failed? How do you know when you need to retry? What if it doesn't complete within the cron interval.. will a race-condition occur?
The proper, but more complicated setup, would be to use a work/message queue system. They allow you to handle the above issues more gracefully, but generally require you to run a background daemon on a server to catch and handle any incoming jobs.
The way this works is, in your code (when a user registers) you insert a job into the queue. The queue daemon picks up the job instantly (it doesn't run on an interval so it's always waiting) and hands it to a worker process (a CakePHP shell for example). It's instant and - if you tell it - it knows if it worked, it knows if it failed, it can retry if you want and it doesn't accidentally handle the same job twice.
There are a number of these available, such as Beanstalkd, dropr, Gearman, RabbitMQ, etc. There are also a number of CakePHP plugins (of varying age) that can help:
cakephp-queue (MySQL)
CakePHP-Queue-Plugin (MySQL)
CakeResque (Redis)
cakephp-gearman (Gearman)
and others.
I have had experience using CakePHP with both Beanstalkd (+ the PHP Pheanstalk library) and the CakePHP Queue plugin (first one above). I have to credit Beanstalkd (written in C) for being very lightweight, simple and fast. However, with regards to CakePHP development, I found the plugin faster to get up and running because:
The plugin comes with all the PHP code you need to get started. With Beanstalkd, you need to write more code (such as a PHP daemon that polls the queue looking for jobs)
The Beanstalkd server infrastructure becomes more complex. I had to install multiple instances of beanstalkd for dev/test/prod, and install supervisord to look after the processes.
Developing/testing is a bit easier since it's a self-contained CakePHP + MySQL solution. You simply need to type cake queue add user signup and cake queue runworker.
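As a rough sketch of the queue flow described above, using Beanstalkd with the Pheanstalk client (assuming a Pheanstalk 3.x style API; the tube name and payload are made up):
<?php
use Pheanstalk\Pheanstalk;

$pheanstalk = new Pheanstalk('127.0.0.1');

// In the controller, when the user registers: push the job and return at once.
$pheanstalk->useTube('user-signup')->put(json_encode(array('user_id' => 42)));

// In the worker (a CakePHP shell kept alive by supervisord or similar):
$job = $pheanstalk->watch('user-signup')->ignore('default')->reserve();
$data = json_decode($job->getData(), true);
// ...send the welcome email, build reports, etc...
$pheanstalk->delete($job);   // tell Beanstalkd the job is done so it is not retried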
How can some code in a bundle be executed after booting the Symfony2 kernel?
The code must be run before a request is handled or console command is run.
The code must be executed once, even when the kernel handles multiple requests during its lifetime.
The code must be able to access the bundle configuration. It may therefore not be run too early in the process.
The reason I need this is that I need to register a stream wrapper. I need to be able to use the bundle configuration since the stream wrapper definitions are defined in the config.
I tried the following:
Implementing the constructor of the bundle class. (This does not work, not all bundles are initialised at this point)
Creating event listeners for kernel.request and console.command (This will cause the code to be executed multiple times when the kernel handles multiple requests during its lifetime.)
You can override the boot method of your bundle. It is called once per kernel boot, before any request or console command is handled, and the container is already set on the bundle at that point, so the bundle configuration is available.
use Symfony\Component\HttpKernel\Bundle\Bundle;

class MyBundle extends Bundle
{
    public function boot()
    {
        // Runs once per kernel boot; $this->container is available here,
        // so you can read your configuration and register the stream wrapper.
    }
}
You can register one service as an event listener for both kernel.request and console.command. It will be fired from the console and from HTTP requests.
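A rough sketch of that approach (the class name and the injected configuration are made up, and the service wiring is omitted): the listener is tagged as a kernel.event_listener for both kernel.request and console.command, and guards itself so the stream wrapper is only registered once per kernel lifetime:
<?php
class StreamWrapperListener
{
    private $registered = false;
    private $wrapperConfig;

    public function __construct(array $wrapperConfig)
    {
        // injected from the bundle configuration via the service definition
        $this->wrapperConfig = $wrapperConfig;
    }

    // method configured for the kernel.request listener tag
    public function onKernelRequest()
    {
        $this->register();
    }

    // method configured for the console.command listener tag
    public function onConsoleCommand()
    {
        $this->register();
    }

    private function register()
    {
        if ($this->registered) {
            return; // already done during this kernel's lifetime
        }
        // stream_wrapper_register(...) calls based on $this->wrapperConfig go here
        $this->registered = true;
    }
}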
I must implement a RabbitMQ solution for a web service.
Well, I decided to deploy a simple queue, with a producer and one consumer.
My questions are: how do I make consumer.php listen continuously for requests from producer.php? May I append it to my crontab? How do I define a worker process that will run in the background?
How can I send a SOAP call to the consumer?
The
while(count($channel->callbacks)) {
$channel->wait();
}
loop is doing the waiting part. It will run forever, calling the $callback function/class as needed.
In the tutorial, you can replace the $callback (which is a function there) with an array containing the instance of a consumer class and the method to call, i.e. array($consumer, 'processMessage'). The method will receive the message as a parameter.
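A rough sketch of that replacement, based on the php-amqplib tutorial code (the class and queue names are made-up examples):
<?php
use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

class Consumer
{
    public function processMessage(AMQPMessage $message)
    {
        // handle the work described by the message body
        echo 'Received: ', $message->body, PHP_EOL;
        $message->delivery_info['channel']->basic_ack($message->delivery_info['delivery_tag']);
    }
}

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('task_queue', false, true, false, false);

$consumer = new Consumer();

// instead of a closure, pass the object/method pair as the callback
$channel->basic_consume('task_queue', '', false, false, false, false, array($consumer, 'processMessage'));

while (count($channel->callbacks)) {
    $channel->wait();
}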
In the tutorial, receive.php will run indefinitely: you can run it as a daemon (with runit, for instance) if you wish.
If you wish to use a SOAP call, it has nothing to do with RabbitMQ. You can use both if you wish, and they can call the same classes to do the tasks, but you will have to create another layer of code.