Use fixtures in integration tests - php

I found that CakePHP 3 uses the real DB connection in integration tests, so a line like $this->post('articles/delete/1') deletes the real DB row even when fixtures are provided. Is it possible to use only the data provided by fixtures (the test connection)? Model tests work fine with fixtures on the test connection.

CakePHP performs the following during the course of a fixture based test case:
1. Creates tables for each of the fixtures needed.
2. Populates tables with data, if data is provided in fixture.
3. Runs test methods.
4. Empties the fixture tables.
5. Removes fixture tables from database.
So, as mentioned in CakePHP's documentation, CakePHP empties the fixture tables regardless of whether the test deletes records, saves records, or performs any other operation.
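For reference, here is a minimal sketch of a CakePHP 3 integration test that declares fixtures so requests run against the aliased 'test' connection (the class and fixture names are illustrative, and the exact fixture key format varies slightly between 3.x releases):

namespace App\Test\TestCase\Controller;

use Cake\TestSuite\IntegrationTestCase;

class ArticlesControllerTest extends IntegrationTestCase
{
    // These fixture tables are created and populated on the 'test'
    // connection before each test method and cleaned up afterwards.
    public $fixtures = ['app.articles'];

    public function testDelete()
    {
        // With the fixture in place this hits the test tables,
        // not the rows in the real database.
        $this->post('/articles/delete/1');
        $this->assertResponseSuccess();
    }
}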

I found the reason for my problem.
In my bootstrap.php I had an event listener bound to app events. In the listener's constructor I initialized properties with model objects using TableRegistry::get().
The problem is that TableRegistry remembers which connection to use on the first call, so a later ConnectionManager::alias() has no effect. Because the TableRegistry initialization was in my listener's constructor, it ran before the main app had loaded and before the fixture manager added the connection aliases.
So never call TableRegistry before the app has loaded, or it may cause problems like this.
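To illustrate, a hedged sketch (the listener class and event names are made up, not from my app): resolve tables lazily inside the handler instead of in the constructor that bootstrap.php runs.

use Cake\Event\EventListenerInterface;
use Cake\ORM\TableRegistry;

class ArticleListener implements EventListenerInterface
{
    public function __construct()
    {
        // Too early: this would cache the table with the 'default'
        // connection before the fixture manager aliases it to 'test'.
        // $this->articles = TableRegistry::get('Articles');
    }

    public function implementedEvents()
    {
        return ['Model.Article.published' => 'onPublished'];
    }

    public function onPublished($event)
    {
        // Resolved lazily, so connection aliases added by the
        // fixture manager are respected.
        $articles = TableRegistry::get('Articles');
        // ...
    }
}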


Unique error persisting entity updates to db with Doctrine

I'm using Doctrine in my Symfony project to manage the persistence layer of my application.
Recently I've been having some issues when persisting changes from my entities to my database. The problem is that when I update an entity and save it, the EntityManager sometimes treats my entity as a new object, so instead of performing an update it performs an insert, causing a unique constraint violation in my database.
As the docs say, when updating an object you should only perform these steps:
fetch the object from Doctrine
modify the object
(optional) call persist() on the entity manager
call flush() on the entity manager
Note I added (optional) to the persist() call because, as the docs say, it isn't necessary, since Doctrine is already watching the object for changes.
Now that things are explained, this is the work I do in my code:
$myEntity = $this->myEntityRepository->byId($id);
// make some changes to the entity
$myEntity->setSomething('something');
$this->myEntityRepository->save($myEntity);
Where the save() operation in my repository is as follows:
$this->entityManager->persist($entity);
$this->entityManager->flush();
And the byId() operation:
return $this->entityManager->getRepository(MyEntity::class)->find($id);
As I said, the persist operation should only be needed when persisting new entities, but since Doctrine can differentiate between an already managed entity and a new one, it should be no problem. If I didn't call persist(), then instead of executing an insert and causing a unique violation, it would simply do nothing, as it wouldn't detect any changes to my entities.
The reason I always use the persist() method is because the save() operation in my repository is used with both new entities and updates to existing entities.
As I've seen in another answer, calling merge() instead of persist() should solve the problem, but I don't like that approach: I think it's just a "dirty" workaround, and the method is deprecated in future versions of Doctrine.
So, what am I missing here? Why do I sometimes get a unique error when running the above code? I only have one connection and one entity manager configured in my application.
I'd like to add that the only occurrences of this problem are found in code executed in consumers (async events), not in the API itself, but whenever I receive a new event, a new and fresh connection to the database is created to ensure I don't have overlapping problems with the entity manager used in some previous event.
When talking about consumers I mean that an event is published via RabbitMQ (the event contains the ID of the entity) and then it gets consumed in a separate process from the API, fetching the entity directly with the entity manager repository.
My guess is that between the line where I get my entity from the repository (i.e. I use the find() method) and the line where I save it into the database (i.e. I use the flush() method), the entity manager somehow removes the entity from its UnitOfWork so it treats it as a new entity instead of a managed one.
An issue I recently found in some code was where an entity was passed via a RabbitMQ message and then recreated from scratch, including the id: $x = new Entity(); $x->setId($data['id']); $x->setName($data['name']); # ... etc.
Having an id set does not make the entity 'managed' by Doctrine, however; only reading the original record from the DB and then updating it does. Persisting $x as above will create a new record, ignoring the id that was set.
I would add some assertions on the record that you are fetching, which must pass or die with an error (at least for now), so that you know you are always dealing with a known entity (I'm also presuming that your ->byId($id) call is just a ->find($id) internally).
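As a hedged illustration of that point (the entity, repository and field names are assumptions, not taken from the question), the consumer should look up the managed entity by the id carried in the message and modify that instance:

public function handleMessage(array $data): void
{
    // Fetch the existing, managed entity by the id from the event payload.
    $entity = $this->entityManager->getRepository(MyEntity::class)->find($data['id']);

    if ($entity === null) {
        throw new \RuntimeException(sprintf('MyEntity %s not found', $data['id']));
    }

    // Modify the managed instance; Doctrine's UnitOfWork tracks the change.
    $entity->setName($data['name']);

    // flush() issues an UPDATE rather than an INSERT, because the entity
    // came out of the EntityManager and is already managed.
    $this->entityManager->flush();
}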

Map database views in Doctrine Migrations Bundle

There does not seem to be proper documentation available on how to configure and use database views with the doctrine migrations bundle.
It seems one is not able to map SQL statements that create or update a database view (from SQL provided somewhere) so that they are picked up when migrations:diff and migrations:migrate are run.
If an entity is mapped to a database view with the #table(name="view_name") markup, it ends up causing an error or an attempt to create a new table, instead of it being understood that a database view is in use.
Is there a solution? Am I missing something?
I'm not sure that Doctrine supports views out of the box. As far as I know, you'll have to cheat.
Or:
I think you have to write the migration script yourself. You can generate an empty one and then write the CREATE statements into it.
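For example, a hand-written migration could look roughly like this (the view name and SQL are placeholders, and the AbstractMigration namespace and method signatures depend on your Doctrine Migrations version):

namespace DoctrineMigrations;

use Doctrine\DBAL\Schema\Schema;
use Doctrine\Migrations\AbstractMigration;

final class Version20200101000000 extends AbstractMigration
{
    public function up(Schema $schema): void
    {
        // Raw SQL is simply executed as part of the migration.
        $this->addSql('CREATE VIEW view_name AS SELECT id, name FROM some_table');
    }

    public function down(Schema $schema): void
    {
        $this->addSql('DROP VIEW view_name');
    }
}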
In the repository you integrate native SQL, and you map the result to your entity or a DTO.
https://www.doctrine-project.org/projects/doctrine-orm/en/2.7/reference/native-sql.html
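Following the linked docs, a repository method along these lines can map rows from the view onto an entity (the entity class and view name are placeholders):

use Doctrine\ORM\Query\ResultSetMappingBuilder;

public function findAllFromView(): array
{
    // Build a mapping from the view's columns to the entity's fields.
    $rsm = new ResultSetMappingBuilder($this->getEntityManager());
    $rsm->addRootEntityFromClassMetadata(ViewEntity::class, 'v');

    $sql = 'SELECT ' . $rsm->generateSelectClause() . ' FROM view_name v';

    return $this->getEntityManager()
        ->createNativeQuery($sql, $rsm)
        ->getResult();
}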

What's the right way to unit test a read-only model in Laravel?

I have a mysql view in a Laravel project. I've written some reports against the view.
How do I seed the model in my unit tests? I have the model annotated as readonly, so I can't seed the data the normal way.
Here's my model:
namespace App\Models;

use App\Models\Model;
use MichaelAChrisco\ReadOnly\ReadOnlyTrait;

class FancyView extends Model
{
    use ReadOnlyTrait;

    protected $table = 'really_fancy_view';
}
I have a mysql view that gets created in a migration.
One idea we had was to seed the tables the view uses and then create the view. But is this the right way to test against a MySQL view? My view is created with raw SQL. Will Laravel be able to handle creating a view in a test environment?
I can't find anything online about testing against a view, let alone anything on SO.
What you could do, possibly, is create a folder in your tests with your gold set of data (be it SQL insert files or CSVs) and then read those in and run them against the database before running the tests (you may be able to leverage a migration as well, but I am personally not familiar with them in Laravel).
You can leverage setUpBeforeClass in phpunit to accomplish this. Drop the test table if it exists before you read in the contents of your golden set and insert it into the database. Then use tearDownAfterClass to drop the table again after the tests have occurred.
This pattern can also be used for your proposed second solution, seeding the data into the source tables (only enough data to run the test efficiently) then create the view in the same step.
class DatabaseTest extends TestCase {
    public static function setUpBeforeClass() {
        // make sure you're starting with a fresh state
        static::tearDownAfterClass();
        // 1. seed database tables
        // 2. run view generation query
    }

    public static function tearDownAfterClass() {
        // 1. drop the view
        // 2. truncate seeded data
    }
}
As for which is better, that really depends on what you're trying to test. Are you testing that the database can actually create the view? If so, it makes sense to verify that the view can be created at all. A benefit of this is that if the view somehow broke, your tests would catch it right away.
Performance concerns shouldn't be an issue, because you should only be testing against enough data to verify that your view is being composed accurately.
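If you do create the view itself during setup, a hedged sketch of a migration that builds it from raw SQL might look like this (the SELECT and its tables are placeholders; DB::statement simply passes the SQL through, so the same migration can run against a test database):

use Illuminate\Database\Migrations\Migration;
use Illuminate\Support\Facades\DB;

class CreateReallyFancyView extends Migration
{
    public function up()
    {
        // The view is defined with plain SQL; Laravel does not need to
        // understand it, it only forwards the statement to MySQL.
        DB::statement('
            CREATE VIEW really_fancy_view AS
            SELECT orders.id, orders.total, customers.name
            FROM orders
            JOIN customers ON customers.id = orders.customer_id
        ');
    }

    public function down()
    {
        DB::statement('DROP VIEW IF EXISTS really_fancy_view');
    }
}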
Usually you do not need to unit test something that is fully tested in the library:
https://github.com/michaelachrisco/ReadOnlyTraitLaravel/blob/master/src/ReadOnlyTrait.php
If you were so inclined, you could make the table and populate it with faked data just as jardis suggested.
If you are looking to test views (or full on integration tests) within Laravel, you can use https://laravel.com/docs/5.6/dusk which looks to be a wrapper around Selenium.

PHPUnit testing database with Zend_Test_PHPUnit_DatabaseTestCase and database adapter issue

I have implemented the abstract class from Database testing with Zend Framework and PHPUnit, which extends Zend_Test_PHPUnit_DatabaseTestCase. However, I am having issues with the way my database abstraction layer uses the Zend_Db_Adapter versus how the unit test uses its own database connection via createZendDbConnection.
The code in question is:
public function testSaveInsert()
{
    $hostModel = new MyLibrary_Model_Host();
    $hostModel->setName('PHPUnit - Host Test 1')
        ->setDescription('Host Test Description 1');

    $databaseAPI = new MyLibrary_DatabaseAPI();
    $databaseAPI->addHost($hostModel);

    $hosts = $databaseAPI->getHosts();
    Zend_Debug::dump($hosts);

    // get data from the testing database
    $dataSet = new Zend_Test_PHPUnit_Db_DataSet_QueryDataSet($this->getConnection());
    $dataSet->addTable('HOST', "SELECT host.NAME, host.DESCRIPTION
                                FROM HOST_TABLE host
                                WHERE name like 'PHPUnit -%'");

    // verify expected vs actual
    $this->assertDataSetsMatchXML('hostInsertIntoAssertion.xml', $dataSet);
}
The MyLibrary_DatabaseAPI class has a constructor:
$hostDb = Zend_Registry::get('config')->dbConnectionIds->hostDb;
$this->appDb = Zend_Registry::get($hostDb);
The addHost method takes the host model attributes and converts them to parameters that are passed into the package function, using the database connection set in the registry. I have a flag in the method call to commit (business requirements determine this), so I can choose when to do so.
At the time of the Zend_Debug::dump() the host has been added to the database table; however, when I run the assertDataSetsMatchXML assertion it does not pick up that a host was added, and the test fails.
I suspect the issue is that when I instantiate Zend_Test_PHPUnit...($this->getConnection()) it uses createZendDbConnection, which creates a new database session that isn't aware of the previous one, and because I haven't 'committed' the addHost it isn't aware of it?
If I run the method call and commit, the record is added to the host table and the test passes, but I cannot roll back the results through createZendDbConnection, so the record remains in the table, which I don't want. Essentially I want a silent test: come in, test, and leave no footprint.
Any ideas on how I can resolve this issue? The reasoning behind testing this way is that the database exposes an API to its tables, so I don't CRUD directly in the database. The PHP database API class reflects the calls I can make to that database API, so I want to test each of these methods.
You should allow your classes to have their dependencies injected instead of fetching them from the registry. This pattern is called "dependency injection" and comes in two main flavors: injection via the constructor, or via a setter.
That way, your DatabaseAPI would accept a database connection when instantiated, instead of fetching one from "somewhere", and that database connection can be a mock object instead of the real thing.
The mock can be configured to wait for certain method calls, can check if the parameters are correct, and might even return a defined result. All those mock calls are part of the test.
The biggest benefit: mocks live only in memory; they do not touch any permanent storage such as a database. That makes them much faster than real database access, and they leave no trace behind once their variable is unset or forgotten.
The only places where software really needs to use the underlying infrastructure are the classes that must do the actual work. Fortunately for you, you are using the classes of the Zend Framework, so you can consider them tested and need not test them yourself again (unless you suspect an error).
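A hedged sketch of what that could look like here (the constructor signature and the mocked query() call are assumptions about MyLibrary_DatabaseAPI's internals, not taken from the question):

class MyLibrary_DatabaseAPI
{
    /** @var Zend_Db_Adapter_Abstract */
    private $appDb;

    // Inject the adapter instead of pulling it out of Zend_Registry.
    public function __construct(Zend_Db_Adapter_Abstract $appDb)
    {
        $this->appDb = $appDb;
    }

    // ... addHost(), getHosts(), etc.
}

class MyLibrary_DatabaseAPITest extends PHPUnit_Framework_TestCase
{
    public function testAddHostUsesTheAdapter()
    {
        $adapterMock = $this->getMockBuilder('Zend_Db_Adapter_Pdo_Mysql')
            ->disableOriginalConstructor()
            ->getMock();

        // Expect a single call against the adapter; no real database is
        // touched, so there is nothing to roll back afterwards.
        $adapterMock->expects($this->once())
            ->method('query');

        $hostModel = new MyLibrary_Model_Host();
        $hostModel->setName('PHPUnit - Host Test 1');

        $api = new MyLibrary_DatabaseAPI($adapterMock);
        $api->addHost($hostModel);
    }
}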

Get a list of tables/models in Doctrine 1.2 - Symfony 1.4 and describe?

How can I fetch all the information about all the models or tables I have in the database, using Doctrine 1.2 in Symfony 1.4?
I need to make a demo about capifony/git/migrations, so I want a user to perform the following:
clone the repository
make a change on the template (any text)
change schema.yml and generate the migrations-diff
deploy
So I need to list all the models or tables, each with its columns, in order to demonstrate that the process works.
Doctrine_Connection has a function called getTables(); I assume you can get a list of the tables on that connection by calling it. According to this it returns an array of Doctrine_Table instances.
That class contains an array of its column definitions, which you can retrieve by calling getColumns().
I hope that's enough to get you started.
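A rough sketch based on that, assuming Doctrine_Connection::getTables() and Doctrine_Table::getColumns() behave as described (you could run it from a small Symfony 1.4 task or script):

// Grab the current Doctrine 1.2 connection that Symfony 1.4 has configured.
$connection = Doctrine_Manager::getInstance()->getCurrentConnection();

foreach ($connection->getTables() as $table) {
    // Each entry should be a Doctrine_Table instance.
    echo $table->getTableName() . PHP_EOL;

    foreach ($table->getColumns() as $column => $definition) {
        // $definition holds the column metadata (type, length, and so on).
        echo '  ' . $column . ': ' . $definition['type'] . PHP_EOL;
    }
}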
