Symfony 4 PHPUnit, not truncating tables in SQLite - php

I am trying to set up a simple test in PHPUnit in my Symfony 4.4 app, but it seems that the test database (SQLite) is not being truncated.
It works the first time you run it, but not after that, as the auto-increment id column is not resetting.
It's strange as this was working for me on another project until very recently:
// tests/BaseWebTestCase.php
protected function tearDown(): void
{
    parent::tearDown();

    $purger = new ORMPurger($this->em);
    $purger->setPurgeMode(ORMPurger::PURGE_MODE_TRUNCATE);
    $purger->purge();
}
But maybe a recent update has meant it doesn't work anymore. How can I update this so that the tables are truncated every time my test is run?
When I run the test I get the following error:
There was 1 error:
1) App\Tests\Controller\BlogPostControllerTest::testAUserCanAccessBlogPost
Error: Call to a member function getId() on null
/home/vagrant/code/tests/Controller/BlogPostControllerTest.php:26
ERRORS!
Tests: 1, Assertions: 0, Errors: 1.
This is my test:
// tests/Controller/BlogPostControllerTest.php
public function testAUserCanAccessBlogPost()
{
    $blogPost = $this->em
        ->getRepository(BlogPost::class)
        ->findOneBy([
            'id' => rand(1, 20)
        ]);

    $this->client->request('GET', '/' . $blogPost->getId());

    $this->assertEquals(200, $this->client->getResponse()->getStatusCode());
}
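As an aside (my suggestion, not part of the original post): the "Call to a member function getId() on null" error just means findOneBy() found no row for the randomly chosen id. An explicit guard right after the lookup makes that failure mode readable:

// Fails with a clear message instead of a null-pointer style error
$this->assertNotNull($blogPost, 'No BlogPost found - were fixtures loaded and ids reset?');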
And the fixtures function:
public function load(ObjectManager $manager)
{
    $faker = Factory::create('en_GB');

    // Create blog posts
    for ($i = 0; $i < 20; $i++) {
        $blogPost = new BlogPost();
        $blogPost->setTitle($faker->sentence(3));
        $blogPost->setBody($faker->paragraph);

        $manager->persist($blogPost);
    }

    $manager->flush();
}
If I add a die statement to the test, I am able to log in to the SQLite database, and I can see that the ids for the posts keep incrementing rather than resetting to start at 1, which is causing the error.
UPDATE
I am now able to get this working if I clear the cache:
sudo rm -rf var/cache
The test will work once, but fail if I run a second time, hence the original issue remains.
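For what it's worth, one likely culprit (my reading of the symptoms, not confirmed in the post): SQLite has no TRUNCATE statement, so Doctrine's SQLite platform rewrites PURGE_MODE_TRUNCATE as plain DELETE FROM statements, and those do not reset the auto-increment counters kept in the sqlite_sequence table. A sketch of a workaround in tearDown():

protected function tearDown(): void
{
    parent::tearDown();

    $purger = new ORMPurger($this->em);
    $purger->setPurgeMode(ORMPurger::PURGE_MODE_TRUNCATE);
    $purger->purge();

    // Reset the auto-increment counters that DELETE FROM leaves behind.
    // The sqlite_sequence table only exists once a table uses AUTOINCREMENT.
    $this->em->getConnection()->exec('DELETE FROM sqlite_sequence');
}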

Related

Laravel 9 multiple tag flush not working when flushing individual tags

I've noticed an issue with my Laravel application, and I'm not quite sure if I'm just doing something stupidly wrong or there is a genuine issue.
I'm storing and fetching my cached data with multiple tags on my App\Models\User\User model like below:
public function getSetting(string|bool $key = false): mixed
{
    try {
        return cache()->tags(["user_{$this->id}_cache", 'user_settings'])->remember("user_{$this->id}_settings", now()->addDays(1), function () {
            return $this->settings()->pluck('value', 'key')->toArray();
        });
    } catch (\Exception $e) {
        // Logging errors here
    }
}
This function simply grabs all of the user's settings and returns an array.
I am using two cache tags because I want to cover both scenarios:
The ability to remove all cached items for a specific model (User)
The ability to remove a specific type of cache across all models (Users)
The Laravel cache documentation simply states to pass the tag (or tags as an array) that you want to remove.
So my thinking is that if I want to clear user settings cache for all users, I should be able to run the following
cache()->tags('user_settings')->flush();
and if I want to remove all cache for a specific user, I should be able to run
cache()->tags('user_1_cache')->flush();
But for some reason, only the second example (using user_1_cache) works. If I run the first example and try to clear all the cache tagged user_settings, the function returns true but does not clear the cache.
Am I doing something stupidly wrong or just completely misunderstanding how the cache tags work?
Versions
PHP - 8.1
Laravel - 9.3.8
Cache driver - Redis
I reproduced your scenario here. It's working as stated in the docs.
class User extends Model
{
    protected $fillable = ['id', 'name'];

    public function cacheSettings()
    {
        return cache()->tags([$this->getUserCacheKey(), 'user_settings'])->remember("{$this->id}_settings", now()->addDay(), function () {
            return $this->only('name');
        });
    }

    public function getSettings()
    {
        return cache()->tags([$this->getUserCacheKey(), 'user_settings'])->get("{$this->id}_settings");
    }

    public function getUserCacheKey()
    {
        return "user_{$this->id}_cache";
    }
}
These tests run with no problem:
public function test_cache_flush_all_users()
{
    Cache::clear();

    $alice = new User(['id' => 1, 'name' => 'alice']);
    $john = new User(['id' => 2, 'name' => 'john']);
    $alice->cacheSettings();
    $john->cacheSettings();

    Cache::tags('user_settings')->flush();

    // both deleted
    $this->assertNull($alice->getSettings());
    $this->assertNull($john->getSettings());
}

public function test_cache_flush_specific_user()
{
    Cache::clear();

    $alice = new User(['id' => 1, 'name' => 'alice']);
    $john = new User(['id' => 2, 'name' => 'john']);
    $alice->cacheSettings();
    $john->cacheSettings();

    Cache::tags($alice->getUserCacheKey())->flush();

    // only alice deleted
    $this->assertNull($alice->getSettings());
    $this->assertNotNull($john->getSettings());
}
Since I don't have all the details of your implementation, perhaps this can help you figure out what is causing the issue.
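One thing worth double-checking on your side (an assumption based on how the Redis tagged cache behaves, not something visible in your snippet): reads and writes must use the exact same tag set, because an entry's key is namespaced by the combination of tags it was stored with. Flushing a single one of those tags still invalidates the entry, though:

// Stored with BOTH tags...
cache()->tags(['user_1_cache', 'user_settings'])->put('user_1_settings', ['theme' => 'dark'], 60);

// ...it is NOT reachable through a single tag:
cache()->tags('user_settings')->get('user_1_settings'); // null

// ...but flushing one of its tags still invalidates it:
cache()->tags('user_settings')->flush();
cache()->tags(['user_1_cache', 'user_settings'])->get('user_1_settings'); // null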

Laravel Excel queued export failing

I have been having a lot of trouble getting the Laravel Excel package to export a large amount of data. I need to export about 80-100k rows, so I implemented the queued export as mentioned in the docs. It works fine when I export a smaller number of rows, but when I try to do 60-80k rows, it fails every time. While the jobs are being processed, I watch the temp file that is created, and I can see that the size of the file is increasing. I also watch the jobs in the database (I'm using the database queue driver), and I can see the jobs completing for a while. It seems that the jobs take incrementally more time until the job fails. I don't get why the first several jobs are quick, and then they start taking more and more time to complete.
I'm using supervisor to manage the queue, so here's my config for that:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=120 --queue=exports,default
autostart=true
autorestart=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/supervisor/worker.log
loglevel=debug
And then my controller code to create the export:
(new NewExport($client, $year))->queue('public/exports/' . $name)->allOnQueue('exports')->chain([
    new NotifyUserOfCompletedExport($request->user(), $name),
]);
I'm using:
Laravel 5.8,
PHP 7.2,
PostgreSQL 10.10
I should also mention that I have played around with the chunk size a bit, but in the end I've always run into the same problem. I tried chunk sizes of 500, 2000, 10000 but no luck.
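For reference, a sketch of how the chunk size is set on a queued FromQuery export, via the WithCustomChunkSize concern (based on my understanding of the Laravel Excel 3.1 API; verify against your version):

use Maatwebsite\Excel\Concerns\WithCustomChunkSize;

class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison, WithCustomChunkSize
{
    // Each queued append job processes this many rows of the query
    public function chunkSize(): int
    {
        return 2000;
    }
}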
In the failed_jobs table, the exception is MaxAttemptsExceededException, although I have also gotten InvalidArgumentException File '/path/to/temp/file' does not exist. I'm not quite sure what else to do. I guess I could make it so it doesn't time out, but that seems like it will just cause more problems. Any help would be appreciated.
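One thing that may contribute here (my guess from the supervisor config above, not something confirmed in the post): with --tries=3 --timeout=120, any append job that runs longer than two minutes is killed and eventually surfaces as MaxAttemptsExceededException. Raising the timeout for the export queue, while keeping the connection's retry_after larger than it, would be worth a try:

command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=600 --queue=exports,default

// config/queue.php - retry_after must exceed the worker timeout,
// otherwise a second worker can pick up a job that is still running
'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'default',
    'retry_after' => 630,
],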
EDIT
Here is the content of my Export Class:
class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison
{
    private $client;
    private $year;

    public function __construct($client, $year)
    {
        $this->year = $year;
        $this->client = $client;
    }

    public function query()
    {
        return $this->getDataQuery();
    }

    public function headings(): array
    {
        $columns = [
            //....
        ];
        return $columns;
    }

    public function map($row): array
    {
        $mapping = [];
        foreach ($row as $key => $value) {
            if (is_bool($value)) {
                $mapping[$key] = $value ? "Yes" : "No";
            } else {
                $mapping[$key] = $value;
            }
        }
        return $mapping;
    }

    private function getDataQuery()
    {
        return \DB::table('my_table')->orderBy('my_field');
    }
}
The NotifyUserOfCompletedExport class just queues a job to email the logged-in user that the export is finished, with a link to download it.
class NotifyUserOfCompletedExport implements ShouldQueue
{
    use Queueable, SerializesModels;

    public $user;
    public $filename;

    public function __construct(User $user, $filename)
    {
        $this->user = $user;
        $this->filename = $filename;
    }

    public function handle()
    {
        // This just sends the email
        $this->user->notify(new ExportReady($this->filename, $this->user));
    }
}
EDIT 2:
So I read this post, and I verified that eventually my server was just running out of memory. That led to the MaxAttemptsExceededException error. I added more memory to the server, and I am still getting the InvalidArgumentException File '/path/to/temp/file' does not exist after the jobs have completed. It's even weirder, though, because I can see that /path/to/temp/file actually does exist. So I have no idea what is going on here, but it's super frustrating.
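If the queue is processed by more than one server, each worker appends to its own local temporary file, which would match the "file does not exist" symptom even though the file exists somewhere. Laravel Excel 3.1 can keep its temporary files on a shared disk instead; a sketch of the relevant config, assuming a multi-server setup (the disk name is illustrative):

// config/excel.php
'temporary_files' => [
    'local_path' => storage_path('framework/laravel-excel'),

    // With queued exports handled by several servers, the temp file
    // must live on a disk every worker can reach:
    'remote_disk' => 's3',
],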

Symfony MongoDB can't retrieve new field values

I have added a new field to my Document definition:
/**
 * @MongoDB\Field(type="string")
 */
protected $city;
Then I let Doctrine generate the entities. Newly created records now have the new field "city" with values as expected. However, I can see these values only in the mongo console. In Doctrine output they are always set to "null". The entity entries seem correct:
public function getFirstName()
{
    return $this->firstName;
}

/**
 * Get city
 *
 * @return string $city
 */
public function getCity()
{
    return $this->city;
}
I have a repository:
public function allQuery($cat)
{
    $q = $this->createQueryBuilder()
        ->sort('createdAt', 'DESC');
    if ($cat) {
        $q->field('category.$id')->equals(new \MongoId($cat));
    }
    return $q;
}
And a service:
function addAllPager($perPage = 10, $cat)
{
    return $this->_addPager($this->repo()->allQuery($cat), $perPage);
}
In the controller:
$helper = $this->get('appbundle.test.helper');
$tests = $helper->addAllPager(10, $cat);
The Symfony profiler shows me the query db.Test.find().sort({ "createdAt": -1 }).limit(10).skip(0). Dumped contents of $tests:
#firstName: "John"
#city: null
What am I missing?
EDIT
Cache clearing with php bin/console cache:clear solved the problem.
php bin/console doctrine:mongodb:cache:clear-metadata was not enough. Thank you malarzm.
I know this is 8 months after the question was asked, but I had the same issue and fought with Doctrine for a while. I am using Symfony 3 and I tried php bin/console doctrine:mongodb:cache:clear-metadata with no luck.
I finally ran the command php bin/console cache:clear (or just delete the cache with sudo rm -rf var/cache) and that fixed the issue.

How to test Doctrine Migrations?

I'm working on a project that does NOT have a copy of the production DB in the development environment.
Sometimes we have an issue with DB migrations - they pass on the dev DB but fail in production/testing.
It's often because the dev environment data is loaded from fixtures that use the latest entities, filling all tables properly.
Is there any easy way to make sure Doctrine Migration(s) will pass in production?
Do you have/know any way to write an automatic tests that will make sure data will be migrated properly without downloading the production/testing DB and running the migration manually?
I would like to avoid downloading a production/testing DB to a dev machine so I can check migrations, because that DB contains private data and it can be quite big.
First, you need to create a sample database dump in the state before the migration. For MySQL use mysqldump; for PostgreSQL use pg_dump, e.g.:
mysqldump -u root -p mydatabase > dump-2018-02-20.sql
pg_dump -Upostgres --inserts --encoding utf8 -f dump-2018-02-20.sql mydatabase
Then create an abstract class for all migrations tests (I assume you have configured a separate database for integration testing in config_test.yml):
abstract class DatabaseMigrationTestCase extends WebTestCase {
    /** @var ResettableContainerInterface */
    protected $container;

    /** @var Application */
    private $application;

    protected function setUp() {
        $this->container = self::createClient()->getContainer();
        $kernel = $this->container->get('kernel');
        $this->application = new Application($kernel);
        $this->application->setAutoExit(false);
        $this->application->setCatchExceptions(false);
        $em = $this->container->get(EntityManagerInterface::class);
        $this->executeCommand('doctrine:schema:drop --force');
        $em->getConnection()->exec('DROP TABLE IF EXISTS public.migration_versions');
    }

    protected function loadDump(string $name) {
        $em = $this->container->get(EntityManagerInterface::class);
        $em->getConnection()->exec(file_get_contents(__DIR__ . '/dumps/dump-' . $name . '.sql'));
    }

    protected function executeCommand(string $command): string {
        $input = new StringInput("$command --env=test");
        $output = new BufferedOutput();
        $input->setInteractive(false);
        $returnCode = $this->application->run($input, $output);
        if ($returnCode != 0) {
            throw new \RuntimeException('Failed to execute command. ' . $output->fetch());
        }
        return $output->fetch();
    }

    protected function migrate(string $toVersion = '') {
        $this->executeCommand('doctrine:migrations:migrate ' . $toVersion);
    }
}
Example migration test:
class Version20180222232445_MyMigrationTest extends DatabaseMigrationTestCase {
    /** @before */
    public function prepare() {
        $this->loadDump('2018-02-20');
        $this->migrate('20180222232445');
    }

    public function testMigratedSomeData() {
        $em = $this->container->get(EntityManagerInterface::class);
        $someRow = $em->getConnection()->executeQuery('SELECT * FROM myTable WHERE id = 1')->fetch();
        $this->assertEquals(1, $someRow['id']);
        // check other stuff if it has been migrated correctly
    }
}
I've figured out simple "smoke tests" for Doctrine Migrations.
I have a PHPUnit test performing the following steps:
1. Drop the test DB
2. Create the test DB
3. Load migrations (create schema)
4. Load fixtures (imitate production data)
5. Migrate to some older version
6. Migrate back to the latest version
This way I can test for the major issues, we've had recently.
Example of PHPUnit tests can be found on my blog: http://damiansromek.pl/2015/09/29/how-to-test-doctrine-migrations/
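A minimal sketch of what such a smoke test can look like (the structure is my illustration, not the blog's code; it assumes DoctrineMigrationsBundle and DoctrineFixturesBundle, whose migrate command accepts the prev/latest aliases used below):

use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Bundle\FrameworkBundle\Test\KernelTestCase;
use Symfony\Component\Console\Input\StringInput;
use Symfony\Component\Console\Output\NullOutput;

class MigrationsSmokeTest extends KernelTestCase
{
    public function testMigrationsRunUpAndDown()
    {
        // 1-2. Recreate the test database from scratch
        $this->runCommand('doctrine:database:drop --force --if-exists');
        $this->runCommand('doctrine:database:create');

        // 3. Build the schema by running all migrations
        $this->runCommand('doctrine:migrations:migrate');

        // 4. Imitate production data
        $this->runCommand('doctrine:fixtures:load --append');

        // 5-6. Migrate one version down, then back to the latest
        $this->runCommand('doctrine:migrations:migrate prev');
        $this->runCommand('doctrine:migrations:migrate latest');

        // Reaching this point means every command exited with 0
        $this->addToAssertionCount(1);
    }

    private function runCommand(string $command)
    {
        $application = new Application(self::bootKernel());
        $application->setAutoExit(false);
        $exitCode = $application->run(new StringInput("$command --env=test --no-interaction"), new NullOutput());
        if ($exitCode !== 0) {
            throw new \RuntimeException('Failed to execute command: ' . $command);
        }
    }
}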

Facing an obstacle in fixture creation in SimpleTest

I am a beginner with SimpleTest and facing an issue while creating fixtures. I am using CakePHP 1.3.14 for my application.
I created a fixture with the filename complaint_fixture.php:
class ComplaintFixture extends CakeTestFixture {
    var $name = 'Complaint';
    var $import = array('table' => 'complaints', 'records' => true);

    // do not truncate the complaints table between tests
    public function truncate($db) {
        return null;
    }

    // do not drop the complaints table between tests
    public function drop($db) {
        return null;
    }
}
And a test case named complaint.test.php:
App::import('Model', 'Complaint');

class ComplaintTestCase extends CakeTestCase {
    var $fixtures = array('app.Complaint');

    function setUp($method) {
        parent::setUp();
        $this->Complaint =& ClassRegistry::init('Complaint');
        // load data
        $this->loadFixtures('Complaint');
    }

    function testFixture() {
        $numberOfResults = $this->Complaint->find('count');
        var_dump($numberOfResults);
    }

    /*
    function testupdateComplaintStatus() {
        $result = $this->Complaint->updateComplaintStatus(47, 'ACT');
        $this->assertEqual($result, 1, 'Status updated successfully!');
    }
    */
}
As you can see in the above code, a fixture is created with the name Complaint and the test case then loads that fixture. From what I have read in the developer guide:
- we create a fixture, specifying the field names and a record set
- we load that fixture in the test model class
BUT what I am looking for is to perform CRUD operations on the test data that is inserted into the test database. When I try to do that with the script above, it starts affecting the production database records instead of the test database.
As you can see in the code above, I have even disabled truncate and drop for the test data, yet I am not able to sort out the issue.
Can anyone let me know what I have missed in the above code?
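For what it's worth, in CakePHP 1.3 test cases only run against a separate database when a $test connection is defined in app/config/database.php; without one, fixtures fall back to the default connection, which would explain the production data being touched. A sketch of the missing piece (connection details are illustrative, not from the question):

// app/config/database.php
class DATABASE_CONFIG {
    var $default = array(
        'driver' => 'mysql',
        'host' => 'localhost',
        'login' => 'app_user',
        'password' => 'secret',
        'database' => 'app_production',
    );

    // Without this, CakeTestCase fixtures run against $default
    var $test = array(
        'driver' => 'mysql',
        'host' => 'localhost',
        'login' => 'app_user',
        'password' => 'secret',
        'database' => 'app_test',
    );
}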
