I'm using Symfony 2.7.
Instead of using Doctrine, I've built a service called RawSQLManager that runs all my raw SQL statements.
Randomly, when I request the same URL multiple times, I get a critical error in my dev.log file. I don't understand how this is possible.
At the beginning of my project I wrote this function:
public function getFolderInfo($id_folder)
{
    $sql = 'SELECT zs.ID, zs.Nom, zs.Principale, zs.Archive, zs.DateHeureModification
            FROM Zones_Stockages zs
            WHERE zs.ID = :id_folder;';
    $params['id_folder'] = $id_folder;

    $stmt = $this->conn->prepare($sql);
    $stmt->execute($params);

    return $stmt->fetch();
}
Now I no longer use this function; it has been deleted and nothing in my whole project calls it.
But here is my log:
[2016-01-05 10:16:09] doctrine.DEBUG: SELECT t0.Principale AS Principale_1, t0.Archive AS Archive_2, t0.Nom AS Nom_3, t0.DateHeureModification AS DateHeureModification_4, t0.ID AS ID_5, t0.Abonnements_ID AS Abonnements_ID_6 FROM Zones_Stockages t0 WHERE t0.ID = ? ["7385"] []
As you can see, that's essentially the same statement, and nowhere in my project is there a similar statement.
[EDIT]: There is no Abonnements_ID in my initial statement.
I've cleared my cache with the following :
app/console doctrine:cache:clear-metadata
app/console doctrine:cache:clear-query
app/console doctrine:cache:clear-result
I even deleted my cache folder, but this statement is still executed.
This weird query runs on every request I make to the server, and it is sometimes followed by a critical error on my Abonnements entity. I don't understand what it has to do with the previous query, or how to start debugging what's wrong with this entity.
[2016-01-05 10:16:09] request.CRITICAL: Uncaught PHP Exception Symfony\Component\Debug\Exception\ContextErrorException: "Warning: rename(C:\wamp\www\app\cache\dev/doctrine/orm/Proxies\__CG__AppBundleEntityAbonnements.php.568b89d923b5a9.13683491,C:\wamp\www\app\cache\dev/doctrine/orm/Proxies\__CG__AppBundleEntityAbonnements.php): " at C:\wamp\www\vendor\doctrine\common\lib\Doctrine\Common\Proxy\ProxyGenerator.php line 306 {"exception":"[object] (Symfony\\Component\\Debug\\Exception\\ContextErrorException(code: 0): Warning: rename(C:\\wamp\\www\\app\\cache\\dev/doctrine/orm/Proxies\\__CG__AppBundleEntityAbonnements.php.568b89d923b5a9.13683491,C:\\wamp\\www\\app\\cache\\dev/doctrine/orm/Proxies\\__CG__AppBundleEntityAbonnements.php): at C:\\wamp\\www\\vendor\\doctrine\\common\\lib\\Doctrine\\Common\\Proxy\\ProxyGenerator.php:306)"} []
So, two issues in a row:
A call to a ghost statement.
A critical error that happens randomly on the same URL, with the same parameters.
What did I miss?
About the "ghost statement", the reason can only be that the code is somehow still present in the application, or that some cache is in place (for example OPcache or something similar in the web server).
About the "critical error", the line of code that triggers it is rename($tmpFileName, $fileName); inside the code that creates Doctrine proxies for entities. Since this is just a file rename, this could be a permission issue in your cache directory.
The problem is that Doctrine's proxy class generation code doesn't handle concurrent requests very well. It works on Unix-like systems, but not on Windows, where you can't simply rename over a file that is open.
See the configuration of the DoctrineBundle. You'll most likely have auto_generate_proxy_classes set to "%kernel.debug%" (the default in the Symfony Standard Edition).
Try changing auto_generate_proxy_classes to false. You'll now have to clear the cache manually whenever you change your entities, but that error should be gone.
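For reference, the relevant setting lives under the doctrine section of the main config file (the path below assumes a standard Symfony 2.7 layout):

```yaml
# app/config/config.yml
doctrine:
    orm:
        # Default is "%kernel.debug%", which regenerates proxies on the fly
        # in the dev environment and triggers the rename() on Windows.
        auto_generate_proxy_classes: false
```

With this set to false, proxies are generated once during cache warmup (app/console cache:warmup), so re-run that command after every entity change.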
I created a service inside my module with the name of an existing core service (prestashop.adapter.data_provider.product). It successfully replaces it, as seen in the php ./bin/console debug:container output.
In YAML:
prestashop.adapter.data_provider.product:
    class: PrestaShop\Module\MyModule\Adapter\Product\ProductDataProvider
The problem now is that I get 500 errors on some back-office pages. These errors are type errors like:
Type error: Argument 4 passed to
PrestaShopBundle\Model\Product\AdminModelAdapter::__construct() must
be an instance of
PrestaShop\PrestaShop\Adapter\Product\ProductDataProvider, instance of
PrestaShop\Module\MyModule\Adapter\Product\ProductDataProvider given,
called in....
I understand now that, whenever a constructor has an argument type-hinted as ProductDataProvider, the app tries to inject my service but finds a different use statement in the class (i.e. in PrestaShopBundle\Model\Product\AdminModelAdapter).
These errors can be fixed by replacing the use statement in each affected file, but as you may know, touching core files must be avoided.
Is there a way of overriding an existing service while also making the override work across the whole app, "bypassing" the use statements that reference the old class?
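One common workaround (a sketch, not verified against this PrestaShop version) is to have the module's class extend the core one: PHP type-hints accept subclasses, so every constructor expecting the core ProductDataProvider will also accept the override, and no core use statement needs to change. The stand-in classes below only illustrate the mechanism:

```php
<?php
// Stand-in for the core class
// (PrestaShop\PrestaShop\Adapter\Product\ProductDataProvider in the real app).
class CoreProductDataProvider
{
    public function getName(): string { return 'core'; }
}

// Stand-in for the module's override: because it extends the core class,
// any type-hint written against the core class also accepts it.
class ModuleProductDataProvider extends CoreProductDataProvider
{
    public function getName(): string { return 'module'; }
}

// Stand-in for AdminModelAdapter's constructor type-hint.
function describe(CoreProductDataProvider $provider): string
{
    return $provider->getName();
}

echo describe(new ModuleProductDataProvider()), "\n"; // no TypeError; prints "module"
```

In the real module this would mean declaring class ProductDataProvider extends \PrestaShop\PrestaShop\Adapter\Product\ProductDataProvider while keeping the service definition from the question unchanged.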
Using Laravel 5.5, I have a listener called JobEventSubscriber that is using a database queue. It has a method called uploadFileToPartner that is triggered whenever a JobFilesUploaded event is fired.
Here is the code of my subscribe method:
public function subscribe($events)
{
    $events->listen(JobSaved::class, JobEventSubscriber::class . '@syncToCrm');
    $events->listen(JobFilesUploaded::class, JobEventSubscriber::class . '@uploadFileToPartner');
}
Whenever either of these events fire, the database Listener fails with the following error:
ErrorException: call_user_func_array() expects parameter 1 to be a valid callback,
class 'App\Listeners\JobEventSubscriber' does not have a method 'uploadFileToPartner'
in /var/www/html/vendor/laravel/framework/src/Illuminate/Events/CallQueuedListener.php:79
When I change my queue_driver to sync, it works. I also went into Tinker and typed:
>>> use App\Listeners\JobEventSubscriber
>>> $eventSubscriber = app(JobEventSubscriber::class);
=> App\Listeners\JobEventSubscriber {#879
+connection: "database",
}
>>> method_exists($eventSubscriber, 'uploadFileToPartner');
=> true
What is wrong here, such that it cannot find methods that are definitely there?
It may be relevant to mention that I recently updated this app from Laravel 5.4.
Upon reading the docs it says that if you change your code you need to restart your queue process.
Specifically it says:
Remember, queue workers are long-lived processes and store the booted application state in memory. As a result, they will not notice changes in your code base after they have been started. So, during your deployment process, be sure to restart your queue workers.
I changed the names of the methods and then didn't restart the queue worker. So the queue was receiving events with the new names but executing old code, and as a result the method names weren't recognized.
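In practice this means running the restart command as part of every deployment; it signals each worker to exit gracefully after its current job so the process manager (e.g. Supervisor) can boot it with fresh code:

```shell
php artisan queue:restart
```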
I'm creating a console command for my bundle with Symfony 2. This command executes several requests against the database (MySQL). In order to debug my command, I need to know how many SQL queries have been executed during the command's execution and, if possible, see those queries (like the Symfony profiler does).
I have the same problem with AJAX requests: when I make an AJAX request, I can't see how many queries were executed during the request.
You can enable Doctrine logging like this:
// in a controller: $doctrine = $this->getDoctrine();
// in a command:    $doctrine = $this->getContainer()->get('doctrine');
$conn = $doctrine->getConnection();
// $doctrine->getManager() did not work for me
// (resulted in $stack->queries being an empty array)
$stack = new \Doctrine\DBAL\Logging\DebugStack();
$conn->getConfiguration()->setSQLLogger($stack);

// ... do some queries ...

var_dump($stack->queries);
echo count($stack->queries) . ' queries';
See also: http://vvv.tobiassjosten.net/symfony/logging-doctrine-queries-in-symfony2/
To render unto Caesar what is Caesar's, I found it here: Count queries to database in Doctrine2
You can put all this logic into the domain model and treat the command only as an invoker. Then you can use the same domain model from a web controller and diagnose it with the profiler.
Second, you should have an integration test for this, and you can verify the query count and execution time in that test.
I'm writing unit tests for a Symfony 2 / Doctrine 2 application and I'm encountering what looks like a concurrency issue.
The code looks like this:
$obj = new Obj();
$obj->setName('test');
// ... etc ...
$em->persist($obj);
$em->flush();
// ...
$repo = $em->getRepository('Obj');
// ... select the object again into $obj using DQL ...
$this->assertTrue($obj !== null);
At this point $obj is often null, i.e. the DQL query has failed to find it. If I add a breakpoint and pause execution somewhere before executing the DQL, $obj is always found. Otherwise it's usually not found, but occasionally it is.
I have tried wrapping the insertion in a transaction:
$em->getConnection()->beginTransaction();
$obj = new Obj();
$obj->setName('test');
... etc ...
$em->persist($obj);
$em->flush();
$em->getConnection()->commit();
This doesn't seem to help.
I have tried adding a pause between the insert and the DQL query:
sleep(1);
This results in the expected behaviour consistently, hence my conclusion that this is a concurrency issue, or at least something to do with Doctrine not immediately writing through on flush.
Is there any way with Doctrine 2 to force a write to the database to complete? Or an event to listen for? Or am I doing something else wrong here?
Having followed zerkms' advice to enable logging, I quickly tracked the problem down to a fault in my own code. There was no concurrency issue.
On my Mac I did the following:
Edit: /usr/local/etc/my.cnf
Add these lines:
general_log_file = /tmp/query.log
general_log = 1
Then restart MySQL so the change takes effect (my.cnf is MySQL's configuration, so it's the MySQL service that needs restarting):
brew services restart mysql
Next I tailed the file:
tail -f /tmp/query.log
I saw the two queries - an insert and a select - executing in the right order. Running those queries directly in MySQL client revealed the error in my code.
Note: putting the log into /tmp/ ensures it is cleaned up automatically rather than growing forever.
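If you'd rather not edit my.cnf and restart the server at all, the same log can be toggled at runtime from a MySQL session (assuming a user with sufficient privileges):

```shell
mysql -u root -e "SET GLOBAL general_log_file = '/tmp/query.log'; SET GLOBAL general_log = 'ON';"
```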
On a project I am working on, we use Symfony2 console commands to run image converting (using LaTeX and some Imagick). Due to the nature of the project, not all conditions may be met during a console command run, so the execution can fail, to be restarted later by a cron job, but only if the attempt count is not higher than a predefined limit.
We already have logging in our project; we use the Monolog logger. What I basically want is to somehow duplicate everything that goes to the main log file into another log file, created specifically for that console command execution, and only if the attempt limit is reached.
So, if we run the command once and it fails, that's fine and nothing should be created.
But if we run the command for the 10th time, which is the attempt limit, I want to have a separate log file named, say, '/logs/failed_commands//fail.log'. That log file should only have messages for the last failed attempt, not for all the previous ones.
How do I do that? Do I need some combination of a special logger handler (like FingersCrossed) and proper exception handling? Or should I rather create an additional logger instance (and if so, how can I pass it over to dependent services)?
This is a simplified and cleaned-up piece of the command that runs the image converting. The attempt limit is checked within the $this->checkProcessLimit() method:
public function execute(InputInterface $input, OutputInterface $output)
{
    try {
        set_time_limit(0); // lose any time restrictions
        $this->checkingDirectories();
        $this->checkProcessLimit();
        $this->isBuildRunning();
        $this->checkingFiles();
        try {
            $this->startPdfBuilding();
        } catch (InternalProjectException $e) {
            throw PdfBuildingException::failedStartBuilding($this->pressSheet->getId());
        }
    } catch (PdfBuildingException $e) {
        $this->printError($output, $e);
        return;
    }
    try {
        $this->logger->info('Building Image.');
        $this->instantiatePdfBuilder();
        $buildingResult = $this->pdfBuilder->outputPdf();
        $this->generatePreview($buildingResult);
        $this->movePDFs($buildingResult);
        $this->unlinkLockFile();
        $output->writeln('<info>Image successfully built</info>');
    } catch (LaTeXException $e) {
        $this->unlinkLockFile();
        $this->abortPdfBuilding($e->getMessage());
        $this->printError($output, $e);
        return;
    }
}
UPD: It seems that for dumping a batch of log entries I need the BufferHandler bundled with the Monolog logger. But I still need to figure out a way to set it up so it dumps only when the attempt limit (not an error level) is reached.
UPD2: I've managed to make it work, but I don't like the solution.
Since in Symfony2 you have to define loggers in config.yml and rebuild the cache for any configuration change, I had to resort to dynamically adding a handler to the logger. But the logger itself is type-hinted as the Psr\Log\LoggerInterface interface, which has no means of adding handlers. My solution checks whether the injected logger is an instance of Monolog\Logger and then manually adds a BufferHandler to it in the Symfony2 console command's initialize() method.
Then, when it comes to the point where I check the attempt limit, I close the buffer handler and delete the resulting log file if the limit is not yet reached (since BufferHandler has no way of removing/closing itself without flushing all of its contents). If the limit is reached, I just let the log file stay.
This way it works, but it always writes the log, and I have to remove log files when the condition (reached attempt limit) is not met.
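A sketch of that setup (it assumes Monolog is available and the command holds a Monolog\Logger; the file path and property names are illustrative). Note that if your Monolog version provides BufferHandler::clear(), which discards the buffered records without flushing them, the delete-the-file step becomes unnecessary:

```php
use Monolog\Handler\BufferHandler;
use Monolog\Handler\StreamHandler;
use Monolog\Logger;

// In the command's initialize(): route a copy of every record into a buffer
// instead of writing it straight to the per-attempt file.
if ($this->logger instanceof Logger) {
    $this->bufferHandler = new BufferHandler(
        new StreamHandler('/logs/failed_commands/fail.log', Logger::DEBUG) // illustrative path
    );
    $this->logger->pushHandler($this->bufferHandler);
}

// Later, once the attempt count is known:
if ($attempts >= $attemptLimit) {
    $this->bufferHandler->flush(); // write all buffered records to the file
} else {
    $this->bufferHandler->clear(); // discard them; nothing touches the disk
}
```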
I think you must create a custom handler.
With Monolog, you can log to a database (see for example https://github.com/Seldaek/monolog/blob/master/doc/04-extending.md).
That way, it's easy to know how many times an error was raised in the last x days
(something like: "SELECT COUNT(*) FROM monolog WHERE channel = '...' AND time > ...").