I followed this guide: https://symfony.com/doc/current/console/lockable_trait.html and implemented the command lock feature for one of my commands to see how it works. It worked as described, so I was going to implement it for all of my commands. But the issue is that I have about 50 commands, and:
I do not want to spend time adding the necessary code to each command;
I want centralized management of command locking, i.e. an extra option on regular commands that my future management center can use. For now, a pretty simple protected function isLocked() hook on a regular command would be enough to decide whether a command should be lockable (a sketch follows).
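Something like this is what I have in mind. A sketch only: the interface name is hypothetical, and the method is public here so that a listener can call it:

use Symfony\Component\Console\Command\Command;

// Hypothetical marker interface for the future management center.
interface LockableCommandInterface
{
    // Whether this command must be protected by a lock.
    public function isLocked(): bool;
}

class DummyCommand extends Command implements LockableCommandInterface
{
    public function isLocked(): bool
    {
        return true; // this command must never run concurrently
    }
}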
So I went to the source of \Symfony\Component\Console\Command\LockableTrait and, after some time, created the following listener for the console.command event:
use Symfony\Component\Console\Event\ConsoleCommandEvent;
use Symfony\Component\Console\Exception\LogicException;
use Symfony\Component\Lock\Lock;
use Symfony\Component\Lock\LockFactory;
use Symfony\Component\Lock\LockInterface;
use Symfony\Component\Lock\Store\FlockStore;
use Symfony\Component\Lock\Store\SemaphoreStore;
class LockCommandsListener
{
    /**
     * @var array<string, Lock>
     */
    private $commandLocks = [];

    private static function init()
    {
        if (!class_exists(SemaphoreStore::class)) {
            throw new LogicException('To enable the locking feature you must install the symfony/lock component.');
        }
    }

    public function onConsoleCommand(ConsoleCommandEvent $event)
    {
        static::init();
        $name = $event->getCommand()->getName();
        $this->ensureLockNotPlaced($name);
        $lock = $this->createLock($name);
        $this->commandLocks[$name] = $lock;
        if (!$lock->acquire()) {
            $this->disableCommand($event, $name);
        }
    }

    private function disableCommand(ConsoleCommandEvent $event, string $name)
    {
        unset($this->commandLocks[$name]);
        $event->getOutput()->writeln('The command ' . $name . ' is already running');
        $event->disableCommand();
    }

    private function createLock(string $name): LockInterface
    {
        // Prefer semaphores when available and fall back to file locks,
        // mirroring what LockableTrait does internally.
        if (SemaphoreStore::isSupported()) {
            $store = new SemaphoreStore();
        } else {
            $store = new FlockStore();
        }

        return (new LockFactory($store))->createLock($name);
    }

    private function ensureLockNotPlaced(string $name)
    {
        if (isset($this->commandLocks[$name])) {
            throw new LogicException('A lock is already in place.');
        }
    }
}
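To wire it up, the listener still has to be registered for the console.command event. A sketch in services.yaml (the class namespace is illustrative, adjust it to your application):

# config/services.yaml -- a sketch; adjust the namespace to your app
services:
    App\EventListener\LockCommandsListener:
        tags:
            - { name: kernel.event_listener, event: console.command, method: onConsoleCommand }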
I ran some tests and it kind of worked, but I am not sure this is the right way of doing things.
Another problem is that I cannot find the proper exit code to use when I disable a command. Should I just disable it? It seems that an exit code would be a great feature here, especially when it comes to testing this listener (with PHPUnit).
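For what it is worth, there is already a documented code for this case: when disableCommand() is called, Symfony's Application skips the command and returns ConsoleCommandEvent::RETURN_CODE_DISABLED (113). A minimal sketch of asserting it, assuming the ApplicationTester from the test below:

use Symfony\Component\Console\Event\ConsoleCommandEvent;

// After a run that the listener rejected:
$exitCode = $commandTester->getStatusCode();
if (ConsoleCommandEvent::RETURN_CODE_DISABLED === $exitCode) {
    // 113: the application skipped the command because it was disabled
}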
And I also have a problem with the testing itself. How can I run commands in parallel in my test class? For now I have this:
class LockCommandTest extends CommandTest
{
    public function testOneCommandCanBeRun()
    {
        $commandTester = new ApplicationTester($this->application);
        $commandTester->run([
            'command' => 'app:dummy-command',
        ]);

        $output = $commandTester->getDisplay();
        dd($output);
    }
}
This only lets me run my commands one by one. But I would like to run two at once, so that while the first one is running, the second fails (with some exit code).
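One way to force the overlap is to spawn the command as real child processes, so the two runs genuinely contend for the lock. A sketch, assuming the symfony/process component is installed and a bin/console entry point exists (app:dummy-command is the command from the test above):

use Symfony\Component\Process\Process;

public function testSecondRunFailsWhileFirstIsRunning()
{
    $first = new Process(['php', 'bin/console', 'app:dummy-command']);
    $first->start();   // non-blocking: the first run takes the lock
    usleep(500000);    // give the first process time to acquire it

    $second = new Process(['php', 'bin/console', 'app:dummy-command']);
    $second->run();    // blocking: runs while the first one is still alive

    $this->assertNotSame(0, $second->getExitCode());

    $first->wait();    // let the first run finish and release the lock
}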
As for me, the best way to run a background task is via supervisor. Create a config file like:
[program:your_service]
command=/usr/local/bin/php /srv/www/bin/console <your:app:command>
priority=1
numprocs=1
# The process must stay up this many seconds for the start to count as successful.
startsecs=300
autostart=true
autorestart=true
process_name=%(program_name)s_%(process_num)02d
user=root
This is the best way to be sure that your command runs in only one process.
Related
I have multiple supervisors in Horizon, and they work normally. The problem is that I want to interact with them from my own web interface, and by interacting I mean pausing and continuing (unpausing) them.
To do that I want to avoid system-level calls as much as possible (internally, artisan horizon:pause-supervisor sends posix_kill($supervisor->pid, 12)).
I tried to instantiate the supervisor by doing this:
class HorizonManager
{
    private SupervisorRepository $supervisors;
    private MasterSupervisorRepository $masters;
    private WorkloadRepository $workload;
    private RedisJobRepository $jobRepository;
    private QueueManager $queueManager;

    public function __construct(
        MasterSupervisorRepository $masters,
        SupervisorRepository $supervisors,
        WorkloadRepository $workload,
        RedisJobRepository $jobRepository,
        QueueManager $manager
    ) {
        $this->masters = $masters;
        $this->supervisors = $supervisors;
        $this->workload = $workload;
        $this->jobRepository = $jobRepository;
        $this->queueManager = $manager;
    }

    public function pauseSupervisor(string $supervisorName)
    {
        $supervisor = $this->supervisors->find($supervisorName);
        $supervisorOpt = new SupervisorOptions(...$supervisor->options);
        $sup = new Supervisor($supervisorOpt);
        $sup->pause();
        $sup->persist();

        return $this->supervisors->find($supervisorName);
    }
}
In the function's return value the supervisor shows as paused, but it is not really paused (even if I persist the instantiated supervisor, it is still running as a process).
For those interested: I failed to do it by instantiating the supervisor, so instead I send the command via an artisan call:
define('SIGUSR2', 12); // the pcntl signal constant, defined manually when unavailable

Artisan::call('horizon:pause-supervisor', ['name' => $supervisorName]);
$supervisor = $this->supervisors->find($supervisorName);
$supervisor->status = 'paused';
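Resuming should then be the mirror image. A sketch, assuming your Horizon version also ships the horizon:continue-supervisor counterpart (verify it exists before relying on it):

define('SIGCONT', 18); // again, only needed when the pcntl constants are absent

Artisan::call('horizon:continue-supervisor', ['name' => $supervisorName]);
$supervisor = $this->supervisors->find($supervisorName);
$supervisor->status = 'running';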
I have been having a lot of trouble getting the Laravel Excel package to export a large amount of data. I need to export about 80-100k rows, so I implemented the queued export as mentioned in the docs. It works fine when I export a smaller number of rows, but when I try 60-80k rows, it fails every time. While the jobs are being processed, I watch the temp file that is created, and I can see its size increasing. I also watch the jobs in the database (I'm using the database queue driver), and I can see the jobs completing for a while. The jobs seem to take incrementally more time until one fails. I don't get why the first several jobs are quick, and then they start taking more and more time to complete.
I'm using supervisor to manage the queue, so here's my config for that:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=120 --queue=exports,default
autostart=true
autorestart=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/supervisor/worker.log
loglevel=debug
And here is my controller code that creates the export:
(new NewExport($client, $year))
    ->queue('public/exports/' . $name)
    ->allOnQueue('exports')
    ->chain([
        new NotifyUserOfCompletedExport($request->user(), $name),
    ]);
I'm using:
Laravel 5.8,
PHP 7.2,
Postgresql 10.10
I should also mention that I have played around with the chunk size a bit, but in the end I've always run into the same problem. I tried chunk sizes of 500, 2000, 10000 but no luck.
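For reference, in Laravel Excel 3.1 the chunk size can be pinned on the export class itself via the WithCustomChunkSize concern; a minimal sketch (the value is just an example):

use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithCustomChunkSize;

class NewExport implements FromQuery, WithCustomChunkSize
{
    // ...

    public function chunkSize(): int
    {
        return 2000; // each queued job reads and appends this many rows
    }
}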
In the failed_jobs table, the exception is MaxAttemptsExceededException, although I have also gotten InvalidArgumentException: File '/path/to/temp/file' does not exist. I'm not quite sure what else to do. I guess I could make it so it doesn't time out, but that seems like it would just cause more problems. Any help would be appreciated.
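One setting worth checking next to --timeout: with the database driver, a job that is still running when retry_after elapses is released back to the queue and retried, which is a classic source of MaxAttemptsExceededException on long jobs. A sketch of the relevant block (values illustrative):

// config/queue.php -- retry_after should comfortably exceed the worker --timeout
'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'default',
    'retry_after' => 180, // > the 120s --timeout in the supervisor config above
],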
EDIT
Here is the content of my Export Class:
class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison
{
    private $client;
    private $year;

    public function __construct($client, $year)
    {
        $this->year = $year;
        $this->client = $client;
    }

    public function query()
    {
        return $this->getDataQuery();
    }

    public function headings(): array
    {
        $columns = [
            //....
        ];

        return $columns;
    }

    public function map($row): array
    {
        $mapping = [];
        foreach ($row as $key => $value) {
            if (is_bool($value)) {
                $mapping[$key] = $value ? "Yes" : "No";
            } else {
                $mapping[$key] = $value;
            }
        }

        return $mapping;
    }

    private function getDataQuery()
    {
        return \DB::table('my_table')->orderBy('my_field');
    }
}
The NotifyUserOfCompletedExport class just creates a job that emails the logged-in user that the export is finished, with a link to download it.
class NotifyUserOfCompletedExport implements ShouldQueue
{
    use Queueable, SerializesModels;

    public $user;
    public $filename;

    public function __construct(User $user, $filename)
    {
        $this->user = $user;
        $this->filename = $filename;
    }

    public function handle()
    {
        // This just sends the email
        $this->user->notify(new ExportReady($this->filename, $this->user));
    }
}
EDIT 2:
So I read this post, and I verified that eventually my server was just running out of memory. That led to the MaxAttemptsExceededException error. I added more memory to the server, and I am still getting InvalidArgumentException: File '/path/to/temp/file' does not exist after the jobs have completed. It's even weirder because I can see that /path/to/temp/file actually does exist. So I have no idea what is going on here, but it's super frustrating.
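For anyone chasing the temp-file error: newer Laravel Excel 3.1 releases let you control where that temporary file lives and, when workers run on more than one server, share it via a disk. A sketch of the relevant block, assuming the package's published config/excel.php:

// config/excel.php (Laravel Excel 3.1) -- a sketch, see the package docs
'temporary_files' => [
    // Where queued exports append their chunks between jobs:
    'local_path' => storage_path('framework/laravel-excel'),

    // If workers run on several servers, the chunks must share storage
    // (e.g. an S3 disk); otherwise a worker may not find the temp file:
    // 'remote_disk' => 's3',
],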
I'm facing "too many connections" problems with PHPUnit tests for ZF3 and Doctrine, because I'm executing ~200 tests per PHPUnit run.
I've already found some questions and answers on Stack Overflow, but none of them work.
My setup:
ZF2/ZF3, Doctrine 2 and PHPUnit.
I have a base test class for all tests, and the setUp and tearDown functions look like this:
public function setUp()
{
    $this->setApplicationConfig(Bootstrap::getConfig());
    Bootstrap::loadAllFixtures();
    if (!static::$em) {
        echo "init em";
        static::$em = Bootstrap::getEntityManager();
    }
    parent::setUp();
    ....
}

public function tearDown()
{
    parent::tearDown();
    static::$em->flush();
    static::$em->clear();
    static::$em->getConnection()->close();

    $refl = new \ReflectionObject($this);
    foreach ($refl->getProperties() as $prop) {
        if (!$prop->isStatic() && 0 !== strpos($prop->getDeclaringClass()->getName(), 'PHPUnit_')) {
            $prop->setAccessible(true);
            $prop->setValue($this, null);
        }
    }
    gc_collect_cycles();
}
// In Bootstrap:
public static function loadAllFixtures()
{
    static::$em->getConnection()->executeUpdate("SET foreign_key_checks = 0;");
    $loader = new Loader();
    foreach (self::$config['data-fixture'] as $fixtureDir) {
        $loader->loadFromDirectory($fixtureDir);
    }
    $purger = new ORMPurger(static::$em);
    $executor = new ORMExecutor(static::$em, $purger);
    $executor->execute($loader->getFixtures());
    $executor = null;
    $purger = null;
    static::$em->getConnection()->executeUpdate("SET foreign_key_checks = 1;");
    static::$em->flush();
    static::$em->clear();
}
I'm monitoring my local MySQL server with innotop and the number of connections is increasing.
Do you have any ideas what I'm missing?
Thank you,
Alexander
Update 14.02.2017:
I've changed the functions to use static::$em and added the Bootstrap::loadAllFixtures method.
If I add static::$em->close() to the tearDown method, all following tests fail with a message like "EntityManager already closed". The echo "init em"; is only called once, for the first test.
Is there a way to check whether my application opens connections without closing them? My test cases are based on AbstractHttpControllerTestCase.
I came across this problem too. Following the advice in PHPUnit's documentation I had done the following:
final public function getConnection()
{
    if ($this->conn === null) {
        if (self::$pdo == null) {
            // We get the EM from the dependency injection container
            $container = $this->getContainer();
            self::$pdo = $container->get('Doctrine.EntityManager')->getConnection()->getWrappedConnection();
        }
        $this->conn = $this->createDefaultDBConnection(self::$pdo, 'spark_api_docker');
    }

    return $this->conn;
}
While self::$pdo was being shared, the number of threads_connected (watching show status like '%onn%'; on my database) crept up until it reached the limit.
I found two solutions to this:
1) Close the connection after each test
public function tearDown()
{
    parent::tearDown();
    // You'll probably need to get hold of your entity manager another way
    $this->getContainer()->get('Doctrine.EntityManager')->getConnection()->close();
}
Importantly, do not set self::$pdo to null. I had seen this recommended elsewhere, but there is no point in setting it as a static property and then resetting it after each test.
This works by closing connections that are no longer needed. When a test case is finished, its connection remains open until the script ends (i.e. PHPUnit finishes running your tests) unless you have closed it. Since a new connection is created for each test case, the number of connections keeps going up.
2) Run each test in a separate PHP process
This is the sledgehammer approach. It will likely impact the speed of your tests to one degree or another. In your phpunit.xml:
<?xml version="1.0" encoding="UTF-8"?>
<phpunit
    ...
    processIsolation="true"
>
    ...
</phpunit>
Returning to PHPUnit's advice, storing the connection and PDO helps with not creating new connections for each test, but it does not help when you have many test cases. Each test case gets instantiated in the same process, and each creates a new connection.
Your tearDown method looks like it should do the trick. I just do this and have never experienced the issue:
protected function tearDown()
{
    parent::tearDown();
    $this->em->close();
    $this->em = null;
}
What does the Bootstrap::loadAllFixtures method do? Is there a DB connection in there that might be being overlooked?
I'm trying to run a job queue to create a PDF file using SlmQueueBeanstalkd and DOMPDFModule in ZF2.
Here's what I'm doing in my controller:
public function reporteAction()
{
    $job = new TareaReporte();
    $queueManager = $this->serviceLocator->get('SlmQueue\Queue\QueuePluginManager');
    $queue = $queueManager->get('myQueue');
    $queue->push($job);
    ...
}
This is the job:
namespace Application\Job;

use SlmQueue\Job\AbstractJob;
use SlmQueue\Queue\QueueAwareInterface;
use SlmQueue\Queue\QueueInterface;
use DOMPDFModule\View\Model\PdfModel;

class TareaReporte extends AbstractJob implements QueueAwareInterface
{
    protected $queue;

    public function getQueue()
    {
        return $this->queue;
    }

    public function setQueue(QueueInterface $queue)
    {
        $this->queue = $queue;
    }

    public function execute()
    {
        $sm = $this->getQueue()->getJobPluginManager()->getServiceLocator();
        $empresaTable = $sm->get('Application\Model\EmpresaTable');
        $registros = $empresaTable->listadoCompleto();

        $model = new PdfModel(array('registros' => $registros));
        $model->setOption('paperSize', 'letter');
        $model->setOption('paperOrientation', 'portrait');
        $model->setTemplate('empresa/reporte-pdf');

        $output = $sm->get('viewPdfrenderer')->render($model);

        $filename = "/path/to/pdf/file.pdf";
        file_put_contents($filename, $output);
    }
}
The first time it runs, the file is created and the job succeeds. However, on a second run, the job is buried and the file is not created.
It seems to get stuck in an endless cycle when trying to render the model a second time.
I've had a similar issue, and it turned out to be caused by the way ZendPdf\PdfDocument reuses its object factory. Are you using ZendPdf\PdfDocument?
You might need to close the factory correctly.
class MyDocument extends PdfDocument
{
    public function __destruct()
    {
        // Release the shared object factory so a later document starts clean
        $this->_objFactory->close();
    }
}
Try to add this or something similar to the PdfDocument class...
Update: it seems you are not using PdfDocument; however, I suspect the issue is the same. Are you able to generate a second PDF in a normal HTTP request? It is your job to make sure the environment is the same on each run.
If you are unable to overcome this problem, a short-term quick fix would be to set the max_runs configuration for SlmQueue to 1. That way the worker is stopped after each job and thus reset to a vanilla state...
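A sketch of that setting; the exact keys vary between SlmQueue versions, so treat the names below as an assumption to check against your installed release (MaxRunsStrategy is what implements it in later versions):

// module.config.php -- a sketch; key names are version-dependent
return array(
    'slm_queue' => array(
        'worker_strategies' => array(
            'default' => array(
                'SlmQueue\Strategy\MaxRunsStrategy' => array('max_runs' => 1),
            ),
        ),
    ),
);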
Symfony2 enables developers to create their own command-line commands. They can be executed from the command line, but also from a controller. According to the official Symfony2 documentation, it can be done like this:
protected function execute(InputInterface $input, OutputInterface $output)
{
    $command = $this->getApplication()->find('demo:greet');

    $arguments = array(
        ...
    );

    $input = new ArrayInput($arguments);
    $returnCode = $command->run($input, $output);
}
But in this situation we wait for the command to finish its execution and return its exit code.
How can I, from controller, execute command forking it to background without waiting for it to finish execution?
In other words what would be equivalent of
$ nohup php app/console demo:greet &
According to the documentation, it is better to use start() instead of run() if you want to create a background process. The PHP max execution time could kill your process if you create it with run():
"Instead of using run() to execute a process, you can start() it: run() is blocking and waits for the process to finish, start() creates a background process."
According to the documentation I don't think there is such an option: http://api.symfony.com/2.1/Symfony/Component/Console/Application.html
But regarding what you are trying to achieve, I think you should use the process component instead:
use Symfony\Component\Process\Process;

$process = new Process('ls -lsa');
$process->run(function ($type, $buffer) {
    if ('err' === $type) {
        echo 'ERR > ' . $buffer;
    } else {
        echo 'OUT > ' . $buffer;
    }
});
And as mentioned in the documentation "if you want to be able to get some feedback in real-time, just pass an anonymous function to the run() method".
http://symfony.com/doc/master/components/process.html