How to use Models in a Laravel Queue - php

I'm trying to import a mailing list from CSV into my database. I have two models in my Laravel app that are responsible for this: Target and Mailing (one Target has many Mailings).
I'm using the queue system with Beanstalkd. To push my jobs I use:
Queue::push('ImportCSV', array(
    'file' => $file->getClientOriginalName(),
    'target' => $name
));
Then I have the ImportCSV job class:
class ImportCSV
{
    public function fire($job, $data)
    {
        Log::info("Starting to add {$data['target']} to database");

        $target = new Target();
        $target->name = $data['target'];
        $target->save();

        $reader = new \EasyCSV\Reader($data['file']);

        // There must be an Email field in the CSV file
        /*if (!in_array('Email', $reader->getHeaders()))
            throw new Exception("Email field not found", 1);*/

        while ($row = $reader->getRow())
        {
            $mailing = new Mailing();
            $mailing->target()->associate($target);
            $mailing->email = $row['Email'];
            $mailing->save();
        }

        Log::info("Mailing list {$target->name} added to database");

        $job->delete();
    }
}
All the code seems to be working, since I get these messages in my log file:
[2013-09-10 21:03:25] log.INFO: Starting to add TEst to database [] []
[2013-09-10 21:03:25] log.INFO: Mailing list TEst added to database [] []
But no records are added to my database. How should I use models inside a job? I already tested it in a controller, for example, and everything works fine.

Since you don't see other errors, I'm thinking this is an environment issue.
First - environments
Make sure your call to php artisan queue:listen (or queue:work, if applicable) is using the correct environment so the correct database is getting used:
$ php artisan queue:listen --env=YOUR_ENV
Here's a post on setting up queues in Laravel 4 which might be helpful for more information.
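If you're not sure which environment the queue listener ends up in, Laravel 4 detects it in bootstrap/start.php; a minimal sketch (the hostnames below are placeholders for illustration):
// bootstrap/start.php -- the hostnames are placeholders
$env = $app->detectEnvironment(array(
    'local'      => array('your-local-hostname'),
    'production' => array('your-production-hostname'),
));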
Second - namespaces
As you (apparently?) aren't seeing any PHP errors, this is less likely, but another idea:
If your class is namespaced, you may need to use the \ character to get your models, which are in the global namespace.
// From:
$mailing = new Mailing();
// To:
$mailing = new \Mailing();
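Alternatively, you can import the global model classes at the top of the namespaced job file instead of prefixing every call; the namespace below is only an assumption for illustration:
// app/queue/ImportCSV.php -- the namespace is hypothetical
namespace MyApp\Queue;

use Mailing;
use Target;

class ImportCSV
{
    // fire($job, $data) stays exactly as above; `new Target()` and
    // `new Mailing()` now resolve to the global models without a backslash.
}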

Related

How to test semaphores with PHPUnit

I'm using the Symfony Lock component to check whether a class method can be executed:
if ($this->lock->acquire()) {
    $this->execute();
    $this->lock->release();
}
Important: I'm not using the Symfony Framework, only the Lock component
I want to make a test that asserts that the execution is locked when running in multiple threads, but I have not found any documentation on how to achieve this.
Is it a good idea to use pthreads? If not, which is the best way to make this test?
Thank you very much.
Referring to the Lock component documentation:
https://symfony.com/doc/current/components/lock.html#usage
Information on the CommandTester:
https://symfony.com/doc/current/console.html#testing-commands
Solution for the PHPUnit test:
use Symfony\Component\Console\Tester\CommandTester;
use Symfony\Component\Lock\Factory;
use Symfony\Component\Lock\Store\SemaphoreStore;

public function testLockIsSet()
{
    // Create a new semaphore lock with the same ID as the one that would be
    // created if you were running the command / class / process etc.
    $store = new SemaphoreStore();
    $factory = new Factory($store);
    $lock = $factory->createLock('lock-name-used-eg-generate-pdf');

    if ($lock->acquire()) {
        // In my use case I was running multiple commands to see if the lock
        // was working properly
        $commandTester = new CommandTester($this->command);

        // Try and run the command. The lock should already be set.
        $commandTester->execute(
            [
                'command' => $this->command->getName()
            ]
        );

        // You could also use expectException() here for LogicException
        $this->assertContains(
            'The command is already running in locked mode.',
            $commandTester->getDisplay()
        );

        $lock->release();
    }
}
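If you don't have a console command to exercise, a sketch along these lines (assuming the sysvsem extension is available, and using the same Factory / SemaphoreStore classes as above) asserts the contention directly: a second non-blocking acquire() on the same resource should fail while the first lock is held.
use PHPUnit\Framework\TestCase;
use Symfony\Component\Lock\Factory;
use Symfony\Component\Lock\Store\SemaphoreStore;

class LockContentionTest extends TestCase
{
    public function testSecondAcquireFailsWhileLockIsHeld()
    {
        $factory = new Factory(new SemaphoreStore());

        // Two Lock objects for the same resource stand in for two competing processes.
        $first  = $factory->createLock('lock-name-used-eg-generate-pdf');
        $second = $factory->createLock('lock-name-used-eg-generate-pdf');

        $this->assertTrue($first->acquire());
        $this->assertFalse($second->acquire()); // non-blocking acquire fails while held

        $first->release();
        $this->assertTrue($second->acquire()); // and succeeds once the lock is released
        $second->release();
    }
}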

Laravel 5.3 changing logfiles for specific console commands

There are two noisy console commands in my Laravel 5.3 app that I want to keep logs for but would prefer to have them write to a different log file from the rest of the system.
Currently my app writes logs to a file configured in bootstrap/app.php using $app->configureMonologUsing(function($monolog) { ...
Second prize is writing all console commands to another log file, but ideally just these two.
I tried following these instructions (https://blog.muya.co.ke/configure-custom-logging-in-laravel-5/ and https://laracasts.com/discuss/channels/general-discussion/advance-logging-with-laravel-and-monolog) to reroute all console logs to another file but it did not work and just caused weird issues in the rest of the code.
If this is still the preferred method in 5.3 then I will keep trying, but I was wondering if there is a newer method, or a method to only change the file for those two console commands.
There are two approaches you could take.
First, you could use Log::useFiles or Log::useDailyFiles, as suggested here.
Log::useDailyFiles(storage_path().'/logs/name-of-log.log');
Log::info('info to log');
The downside of this approach is that everything will still be logged in your default log file, because the default Monolog handler is set up before your code runs.
Second, to avoid having everything in your default log, you could overwrite the default logging class. An example of this is given here. You could have a specific log file for, let's say, Log::info(), and all the other logs could be written to your default file. The obvious downside of this approach is that it requires more work and code maintenance.
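For the original goal of redirecting only the two noisy commands, one option might be to branch inside the configureMonologUsing() callback you already have in bootstrap/app.php. A minimal sketch (the command names are hypothetical):
// bootstrap/app.php -- the command names are placeholders
$app->configureMonologUsing(function ($monolog) {
    $logFile = storage_path('logs/laravel.log');

    // When artisan is running one of the noisy commands, write to a separate file.
    $noisyCommands = ['noisy:command-one', 'noisy:command-two'];
    if (php_sapi_name() === 'cli' && isset($_SERVER['argv'][1])
        && in_array($_SERVER['argv'][1], $noisyCommands)) {
        $logFile = storage_path('logs/console.log');
    }

    $monolog->pushHandler(new \Monolog\Handler\StreamHandler($logFile, \Monolog\Logger::DEBUG));
});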
This is possible but first you need to remove existing handlers.
Monolog already has some logging handlers set, so you need to get rid of those with $monolog->popHandler(). Then, using Wistar's suggestion, a simple way of adding a new log is with $log->useFiles('/var/log/nginx/ds.console.log', $level='info');.
public function fire(Writer $log)
{
    $monolog = $log->getMonolog();
    $monolog->popHandler();

    $log->useFiles('/var/log/nginx/ds.console.log', $level='info');
    $log->useFiles('/var/log/nginx/ds.console.log', $level='error');
    ...
For multiple handlers
If you have more than one log handler set (if, for example, you are using Sentry) you may need to pop more than one before the handlers are clear. If you want to keep a handler, you need to loop through all of them and then re-add the ones you want to keep.
$monolog->popHandler() will throw an exception if you try to pop a non-existent handler, so you have to jump through some hoops to get it working.
public function fire(Writer $log)
{
    $monolog = $log->getMonolog();

    $handlers = $monolog->getHandlers();
    $numberOfHandlers = count($handlers);
    $saveHandlers = [];

    for ($idx = 0; $idx < $numberOfHandlers; $idx++)
    {
        $handler = $monolog->popHandler();
        if (get_class($handler) !== 'Monolog\Handler\StreamHandler')
        {
            $saveHandlers[] = $handler;
        }
    }

    foreach ($saveHandlers as $handler)
    {
        $monolog->pushHandler($handler);
    }

    $log->useFiles('/var/log/nginx/ds.console.log', $level='info');
    $log->useFiles('/var/log/nginx/ds.console.log', $level='error');
    ...
For more control over the log file, instead of $log->useFiles() you can use something like this:
$logStreamHandler = new \Monolog\Handler\StreamHandler('/var/log/nginx/ds.console.log');
$pid = getmypid();
$logFormat = "%datetime% $pid [%level_name%]: %message%\n";
$formatter = new \Monolog\Formatter\LineFormatter($logFormat, null, true);
$logStreamHandler->setFormatter($formatter);
$monolog->pushHandler($logStreamHandler);

Calling console command security:check from controller action produces Lock file not found response

(Symfony3)
I'm toying with the idea of setting up some simple cron tasks to generate security reports for our project managers so that they can schedule upgrade time for developers (vs. me forgetting to run them manually).
As a very basic check, I'll simply run...
php bin/console security:check
...to see what composer has to say about vulnerabilities. Ultimately I'd like to roll this output into an email or post it to a slack channel or basecamp job when the cron is run.
Problem
When I run the command via the terminal it works great. Running the command inside a controller always returns the response "Lock file does not exist." I'm assuming this is in reference to the composer.lock file at the root of the project. I can confirm that this file does in fact exist.
Following is the controller I'm currently using, which is adapted from this:
http://symfony.com/doc/current/console/command_in_controller.html
<?php

namespace Treetop1500\SecurityReportBundle\Controller;

use Symfony\Bundle\FrameworkBundle\Controller\Controller;
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\BufferedOutput;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\HttpKernel\Exception\UnauthorizedHttpException;

class DefaultController extends Controller
{
    public function indexAction($key)
    {
        if ($key != $this->getParameter('easy_cron_key')) {
            throw new UnauthorizedHttpException("You are not authorized to access this page.");
        }

        $kernel = $this->get('kernel');
        $application = new Application($kernel);
        $application->setAutoExit(false);

        $input = new ArrayInput(array(
            'command' => 'security:check'
        ));

        // You can use NullOutput() if you don't need the output
        $output = new BufferedOutput();
        $application->run($input, $output);

        // return the output, don't use if you used NullOutput()
        $content = $output->fetch();

        // return new Response(""), if you used NullOutput()
        return new Response($content);
    }
}
$content always has the value "Lock file does not exist."
I realize there are probably better tools and ways to do this, however I would really like to understand why this is the response generated in this controller action. Thank you for taking a look!
The command looks for composer.lock in the current working directory, which for a web request is generally not the project root. Pass the absolute path to the composer.lock file, like this:
php bin/console security:check /path/to/another/composer.lock
So in your example, that would be:
$input = new ArrayInput([
    'command' => 'security:check',
    'lockfile' => '/path/to/another/composer.lock'
]);
Read more: SecurityCheckerCommand from SensioLabs. The optional argument is lockfile, which is checked by the SecurityChecker. On line 46 they look for the composer.lock file (the default argument) and throw an exception when it is not found.
P.S. Earlier I typed the wrong parameters into the array. I checked the Symfony documentation (How to Call Other Commands) and fixed the answer.
The solution to this is to pass the lockfile argument to the ArrayInput object like this:
$lockfile = $this->get('kernel')->getRootDir()."/../composer.lock";
$input = new ArrayInput(array('command'=>'security:check','lockfile'=>$lockfile));

"Nested" Tasks called within the Action in Symfony 1.4 giving error

I have a bunch of symfony tasks to send different kinds of emails. For example, I have a sendMailConfirmationTask, sendMailAlertContactTask, sendMailBlogTask etc...
My goal is to have one "master" class, sendMailBaseTask, that will execute the right task based on the given parameter.
So as an example, when I execute "php symfony sendMail:base --object=confirmation", it creates an instance of sendMailBaseTask, and then the following line makes the call to the right task:
$this->runTask($prefix . $task_name, array(), $options); // $prefix . $task_name = "sendMail:confirmation" for example
Through the CLI, both methods work fine; I can trigger my confirmation email in either of these ways:
php symfony sendMail:base --object=confirmation --to=some#email.com
php symfony sendMailConfirmation --to=some#email.com
Where it becomes tricky is when I want to run my task within an sfAction. I'd like to run my task whenever someone registers on my application, for example. So here's the piece of code I've tried, without luck:
chdir(sfConfig::get('sf_root_dir'));
$task = new sendMailBaseTask($configuration->getEventDispatcher(), new sfFormatter());
$rc = $task->run(array(), array('object' => 'confirmation', 'to' => $email, 'hash' => $hash));
chdir($current_dir);
This gives the following error : "Unable to create a task as no command application is associated with this task yet."
But, if instead of creating a sendMailBaseTask instance I create a sendMailConfirmationTask, it works fine.
I could do it that way, but that's not the way I want it working, so if anyone has a clue... Thanks!
It seems that you need to launch your task in a different way.
If we take a look at how Doctrine handles this case with the doctrine:build task, it does this:
if (self::BUILD_MODEL == (self::BUILD_MODEL & $mode))
{
    $task = new sfDoctrineBuildModelTask($this->dispatcher, $this->formatter);
    $task->setCommandApplication($this->commandApplication);
    $task->setConfiguration($this->configuration);
    $ret = $task->run();

    if ($ret)
    {
        return $ret;
    }
}
This is launched when you type:
php symfony doctrine:build --model
This means your sendMailBaseTask shouldn't use runTask but something like this:
$task = new sendMailConfirmationTask($this->dispatcher, $this->formatter);
$task->setCommandApplication($this->commandApplication);
$task->setConfiguration($this->configuration);
$ret = $task->run(array(), $options);
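Putting that together, the execute() method of sendMailBaseTask might look roughly like this (a sketch only; the option-to-class mapping and the guards are assumptions, and the class names follow the question):
// A sketch of sendMailBaseTask::execute(); the mapping and guards are assumptions.
protected function execute($arguments = array(), $options = array())
{
    $taskClasses = array(
        'confirmation' => 'sendMailConfirmationTask',
        'alertContact' => 'sendMailAlertContactTask',
        'blog'         => 'sendMailBlogTask',
    );

    $class = $taskClasses[$options['object']];
    $task = new $class($this->dispatcher, $this->formatter);

    // Only forward the command application / configuration when they exist,
    // so the base task also works when instantiated from an sfAction.
    if ($this->commandApplication)
    {
        $task->setCommandApplication($this->commandApplication);
    }
    if ($this->configuration)
    {
        $task->setConfiguration($this->configuration);
    }

    return $task->run(array(), $options);
}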

autoloader when executing php from linux bash

I'm currently working on some sort of upload with automatic video conversion. At the moment I am executing a PHP script via a shell command after the upload is finished, so the user doesn't have to wait until the conversion is completed. Like so:
protected function _runConversionScript() {
    if (!exec("php -f '" . $this->_conversionScript . "' > /dev/null &"))
        return true;
    return false;
}
Now, in my conversion script file I am using functions from another class, UploadFunctions, to update the status in the database (like started, converted, finished...). The problem is that this UploadFunctions class inherits from another class, Controller, where for example the database connection gets established. Currently I am using spl_autoload to search specific directories for the files needed (for example controller.php), but because the conversion script is out of context with the whole autoloader setup, it doesn't recognize the Controller class and throws a fatal PHP error.
Here is some code from the conversion script:
require_once('uploadfunctions.php');

$upload_func = new UploadFunctions();

// we want to make sure we only process videos that haven't already
// been or are being processed
$where = array(
    'status' => 'queued'
);

$videos = $upload_func->getVideos($where);

foreach ($videos as $video) {
    // update database to show that these videos are being processed
    $update = array(
        'id' => $video['id'],
        'status' => 'started'
    );
    // execute update
    $upload_func->updateVideo($update);
    .........
Am I doing this completely wrong, or is there a better way to accomplish this? If you need more code or information, please let me know!
Thanks a lot
Here is my spl_autoload code:
<?php
spl_autoload_register('autoloader');

function autoloader($class_name) {
    $class_name = strtolower($class_name);

    $pos = strpos($class_name, 'twig');
    if ($pos !== false) {
        return false;
    }

    $possibilities = array(
        '..'.DIRECTORY_SEPARATOR.'globals'.DIRECTORY_SEPARATOR.$class_name.'.php',
        'controller'.DIRECTORY_SEPARATOR.$class_name.'.php',
        '..'.DIRECTORY_SEPARATOR.'libs'.DIRECTORY_SEPARATOR.$class_name.'.php',
        'local'.DIRECTORY_SEPARATOR.$class_name.'.php'
    );

    foreach ($possibilities as $file) {
        if (class_exists($class_name) != true) {
            if (file_exists($file)) {
                include_once($file);
            }
        }
    }
}
?>
I have my project divided into subfolders which represent the functionality, for example upload, myaccount and gallery. In every subfolder there are also two other folders: controller and local. Controller contains the class controlling this part (upload for example) and local is the folder where I am putting the local classes which are needed. The controller class gets called from the index.php which is located in the sub-project folder. "libs" and "globals" are just project-wide classes, like database, user and so on.
This is an example of my folder structure:
www/index.php // main site
www/upload/index.php // calls the controller for upload and initializes the spl_autoload
www/upload/controller/indexcontroller.php // functionality for the upload
www/upload/local/processVideo.php // this is the conversion script.
I am fairly new to the spl_autoload function. In my opinion spl_autoload is not getting called when my script is invoked with "php -f processVideo.php", is it?
PHP resolves relative paths from the current working directory (the directory the PHP binary was invoked from), not from the script's location.
I suggest you use the __DIR__ constant to avoid that behavior:
http://php.net/manual/en/language.constants.predefined.php
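Applied to the autoloader above, that could look roughly like this (a sketch assuming the autoloader file lives in the sub-project folder, e.g. www/upload/):
// Build the candidate paths from the autoloader's own location instead of
// the current working directory (paths follow the folder structure above).
$possibilities = array(
    __DIR__.DIRECTORY_SEPARATOR.'..'.DIRECTORY_SEPARATOR.'globals'.DIRECTORY_SEPARATOR.$class_name.'.php',
    __DIR__.DIRECTORY_SEPARATOR.'controller'.DIRECTORY_SEPARATOR.$class_name.'.php',
    __DIR__.DIRECTORY_SEPARATOR.'..'.DIRECTORY_SEPARATOR.'libs'.DIRECTORY_SEPARATOR.$class_name.'.php',
    __DIR__.DIRECTORY_SEPARATOR.'local'.DIRECTORY_SEPARATOR.$class_name.'.php'
);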
I was actually able to resolve the issue. I had to include the spl_autoload_register function inside the conversion script so that it was able to locate the files. This was an issue because the conversion script is not built into my framework, so it isn't able to load the classes from the framework autoloader.
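In other words, the standalone conversion script has to pull in the autoloader itself before it touches any framework classes; a minimal sketch, where the autoloader file name and location are assumptions:
// processVideo.php -- the autoload.php path is an assumption for illustration
require_once __DIR__ . DIRECTORY_SEPARATOR . '..' . DIRECTORY_SEPARATOR . 'autoload.php';

require_once __DIR__ . DIRECTORY_SEPARATOR . 'uploadfunctions.php';
$upload_func = new UploadFunctions();
// ... rest of the conversion script unchanged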
