What are the current best practices for testing database interaction with Symfony2? I have a simple CRUD setup and I want to make sure my testing is OK. Right now, I have four tests, each one making sure that the create, update, delete and list actions work correctly.
I have two magic methods, __construct and __destruct, on my test case. Inside them, I call exec() with 'php app/console ...' to create the database, create the schema and later drop the database. However, this is painfully slow, and it is repeated for every test case once I have more than one.
How should I proceed when it comes to database testing and isolating such tests?
I think it's best to always start clean to be sure that tests are fully isolated. To do that I simply build the database structure before each test and then fill it with the fixtures needed for that test.
Note that I only build the needed database tables and only insert the needed fixtures. It's a bit faster than loading a big database dump. It's also cleaner, as tests don't share fixtures (which makes them easier to maintain).
I have a base test case class called KernelAwareTest which helps me build the schema. You can find it in this gist: https://gist.github.com/1319290
setUp() boots the Symfony kernel and stores a reference to it in a class property (together with references to the DIC and the entity manager). It also calls generateSchema() to generate the database schema (using Doctrine's Schema Tool).
By default it generates the database structure for all entities known to the entity manager. You can change this behaviour in your test class by overriding the getMetadatas() method.
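A rough sketch of what such a base class can look like (the full version is in the gist above; the AppKernel class name and the PHPUnit base class are assumptions about a typical Symfony2 project):

use Doctrine\ORM\Tools\SchemaTool;

abstract class KernelAwareTest extends \PHPUnit_Framework_TestCase
{
    protected $kernel;
    protected $container;
    protected $entityManager;

    public function setUp()
    {
        // Boot the Symfony kernel and keep references for the tests
        $this->kernel = new \AppKernel('test', true);
        $this->kernel->boot();

        $this->container = $this->kernel->getContainer();
        $this->entityManager = $this->container->get('doctrine.orm.entity_manager');

        $this->generateSchema();
    }

    protected function generateSchema()
    {
        // Build only the tables for the metadata returned by getMetadatas()
        $metadatas = $this->getMetadatas();
        if (!empty($metadatas)) {
            $tool = new SchemaTool($this->entityManager);
            $tool->dropSchema($metadatas);
            $tool->createSchema($metadatas);
        }
    }

    protected function getMetadatas()
    {
        // Default: all entities known to the entity manager;
        // override this in a concrete test class to build only what you need
        return $this->entityManager->getMetadataFactory()->getAllMetadata();
    }

    public function tearDown()
    {
        $this->kernel->shutdown();
    }
}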
P.S.: I tried using an in-memory database (SQLite) but it wasn't perfect. Anyway, it's probably better to run tests against the same database engine you use in production.
Database testing is always slow, as you need to create/drop your schema before/after each test. To avoid unnecessary operations, you could:
create the schema in the setUpBeforeClass() method;
ensure your 4 tests run in one thread with the @depends annotation;
drop the schema in the tearDownAfterClass() method.
The schema will then be created/dropped only once per test case, as sketched below.
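Roughly, that could look like this (only a sketch; the AppKernel bootstrapping is an assumption about your project):

use Doctrine\ORM\Tools\SchemaTool;

class CrudTest extends \PHPUnit_Framework_TestCase
{
    private static $em;

    public static function setUpBeforeClass()
    {
        // Build the schema once for the whole test case
        $kernel = new \AppKernel('test', true);
        $kernel->boot();
        self::$em = $kernel->getContainer()->get('doctrine.orm.entity_manager');

        $tool = new SchemaTool(self::$em);
        $tool->createSchema(self::$em->getMetadataFactory()->getAllMetadata());
    }

    public static function tearDownAfterClass()
    {
        // Drop it once after the last test of the case has run
        $tool = new SchemaTool(self::$em);
        $tool->dropSchema(self::$em->getMetadataFactory()->getAllMetadata());
    }

    public function testCreate()
    {
        // ... create an entity and assert it was persisted
    }

    /**
     * @depends testCreate
     */
    public function testUpdate()
    {
        // ... runs after testCreate within the same class
    }
}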
You can also set up Doctrine to use an in-memory SQLite database (which is very fast):
doctrine:
    dbal:
        driver: pdo_sqlite
        path: :memory:
        memory: true
Anyway, __construct() and __destruct() should never be used in PHPUnit test cases; you should use setUp() and tearDown() instead.
The question is pretty old but still valid today, so here is my experience and how I currently handle it on my Symfony projects.
I started off using an SQLite in-memory database for my tests and rebuilt the DB schema + inserted fixtures before each single test case. This had two major drawbacks:
It was still way too slow :(
On dev & prod I used MySQL, which soon became a problem because SQLite simply does not have all the features needed and sometimes behaves differently.
Using MySQL for the tests and rebuilding the schema + inserting fixtures before each test was simply too slow. So I was looking for alternatives...
I stumbled across this blog post: http://alexandre-salome.fr/blog/Symfony2-Isolation-Of-Tests
The author suggests running tests inside active database transactions and simply rolling back any changes after every single test.
I took this idea and created a bundle for it: https://github.com/dmaicher/doctrine-test-bundle
The bundle is really easy to set up and does not require changing any existing PHP test classes. Internally it changes the Doctrine config to use a custom database connection + driver.
With this bundle you can simply create your database schema + insert fixtures ONCE BEFORE running the whole test suite (I prefer doing this in a custom PHPUnit bootstrap file). Using the PHPUnit listener, all tests will then run inside database transactions.
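For example, a custom tests/bootstrap.php registered in phpunit.xml can look roughly like this (the AppKernel class name and the use of DoctrineFixturesBundle are assumptions about your project):

// tests/bootstrap.php
use Symfony\Bundle\FrameworkBundle\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\NullOutput;

require __DIR__.'/../vendor/autoload.php';

$kernel = new AppKernel('test', true);
$kernel->boot();

$application = new Application($kernel);
$application->setAutoExit(false);

// Recreate the schema and load fixtures ONCE before the whole suite;
// doctrine-test-bundle then rolls back each test's changes afterwards.
$application->run(new ArrayInput(['command' => 'doctrine:schema:drop', '--force' => true]), new NullOutput());
$application->run(new ArrayInput(['command' => 'doctrine:schema:create']), new NullOutput());
$application->run(new ArrayInput(['command' => 'doctrine:fixtures:load', '--no-interaction' => true]), new NullOutput());

$kernel->shutdown();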
I've been using this bundle for a while now and for me it works out perfectly with SQLite, MySQL and PostgreSQL.
For some time now it has also been used on the symfony-demo project.
Testing on a local machine is a pain in the ..., so I started using the CI system buddy.works (there is a free stand-alone version), and for this I needed to resolve this issue on my own.
The result is:
all tests work
tests run on production SQL data
tests run in isolation (not in dev or production), so I can do anything I want with the database
all pushes to git are tested and I get a notification if something is broken
all pushes/pull requests to the deploy branch are automatically uploaded to production
This is my solution:
a second parameters.yml in config with the configuration for tests
on production I make a daily SQL dump
when tests start on CI, this SQL backup is copied via scp to the test machine
to prepare all this I'm using robo.li (my config is below)
/**
 * This is the project's console commands configuration for the Robo task runner.
 *
 * @see http://robo.li/
 */
class RoboFile extends \Robo\Tasks
{
    public function loadDb()
    {
        // Create the test database and import the daily production dump
        $this->taskExecStack()
            ->stopOnFail()
            ->exec("mysql -h mariadb -u root -pqwerty -e 'create database test'")
            ->exec("mysql -h mariadb -u root -pqwerty test < test.sql")
            ->run();
    }

    public function prepareDb()
    {
        // Switch the app to the test parameters file...
        $this->taskExecStack()
            ->stopOnFail()
            ->exec("cp app/config/parameters-test.yml app/config/parameters.yml")
            ->run();

        // ...and point it at the CI database container
        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_host: 127.0.0.1')
            ->to("database_host: 'mariadb'")
            ->run();

        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_user: dbuser')
            ->to("database_user: 'root'")
            ->run();

        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_password: 123')
            ->to("database_password: 'qwerty'")
            ->run();
    }
}
I hope this helps you get an idea of how to organize all this. Using a stand-alone CI is difficult to set up, but it's a really good idea.
Scenario
We want to auto-fill a table with new static content with Doctrine, ideally with something like a fixture class or similar.
We follow the simple development life cycle of development to staging to production, and we are using Doctrine v2.6 with Symfony v3.4. Every release step is executed by a Jenkins job.
For development and staging we use the very useful and simple doctrine-fixtures-bundle to auto-fill our database with test datasets. The database schema is auto-generated by doctrine:schema:update on the basis of our entities.
I've tried to use the fixtures for production as well, but even with doctrine:fixtures:load --fixtures=src/MyBundle/DataFixture/ORM/MyFixture.php it purges the whole database. Then I read something about the --append option to prevent Doctrine from purging the database. But then it would append the datasets in every release process (?). Nevertheless, it also feels like a very bad practice.
What I'm wondering
Is it possible to truncate the table and load table records with static data from a class that can be executed via the command line? Or is there a completely different (and clean) way to handle such a case? Is the Doctrine Migrations bundle the real way to go?
Thanks for your help!
You should create a Command to populate the tables.
https://symfony.com/doc/current/console.html
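A minimal sketch of what such a command could look like on Symfony 3.4, assuming a made-up MyBundle\Entity\Setting entity and setting table (all names here are only for illustration):

namespace MyBundle\Command;

use MyBundle\Entity\Setting;
use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class PopulateSettingsCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('app:populate-settings')
            ->setDescription('Truncates the settings table and re-inserts the static records');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $em = $this->getContainer()->get('doctrine.orm.entity_manager');
        $connection = $em->getConnection();

        // Empty the table first so the command stays idempotent
        $platform = $connection->getDatabasePlatform();
        $connection->executeUpdate($platform->getTruncateTableSQL('setting'));

        // Insert the static content
        foreach (['foo' => 'bar', 'baz' => 'qux'] as $name => $value) {
            $setting = new Setting();
            $setting->setName($name);
            $setting->setValue($value);
            $em->persist($setting);
        }
        $em->flush();

        $output->writeln('Settings table repopulated.');
    }
}

You can then run it from the Jenkins job after each deploy (e.g. bin/console app:populate-settings); if other tables have foreign keys pointing at this one, a DELETE instead of TRUNCATE may be needed.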
I'm using database migrations with the PHP Symfony framework and I noticed that there are a lot of files, and every time I build my project many of these migrations are executed.
What is the best practice for managing migrations? Could I delete them and create a single database dump file for database initialization?
I'm assuming you're using Doctrine Migrations to create these files. On dev you can always regenerate the database using the doctrine:schema:update command.
When setting up a new instance you can also use the doctrine:schema:create command which will also create the latest table definitions for you.
That brings us to your question: do you need to keep all the migrations? As long as you know that the migrations have been executed on all outdated instances, you can safely archive (my preferred option) or delete them, as they will never be called again.
I would prefer to use migrations on production instead of doctrine:schema:update, because you can run into problems during the update process. It is always a better option IMO to keep an eye on what happens to the DB.
In a development environment you can easily use the doctrine:schema:update command, but you should use migrations on production. In some cases the command may break the migrations (it has happened to me a few times).
I am building a complex application and I want to know whether it is safe to use Doctrine migrations in production.
For example, the site has been in use for a year and the company wants to add an extra attribute to the user table.
Should I change it directly in the database, or through Doctrine migrations?
This is one of the intended uses (and benefits) of migrations - to automate the changes to your database quickly and accurately. Yes, they can and in most cases should be used to update your database in production.
Edit: The Symfony2 documentation also clearly explains that this is one of the purposes of migrations.
Of course, the end goal of writing migrations is to be able to use them to reliably update your database structure when you deploy your application. By running the migrations locally (or on a beta server), you can ensure that the migrations work as you expect.
...
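For illustration, a migration that adds the extra user attribute could look roughly like this (the class would normally be generated with doctrine:migrations:diff; the class name, column and SQL here are made up and depend on your schema and database):

use Doctrine\DBAL\Migrations\AbstractMigration;
use Doctrine\DBAL\Schema\Schema;

class Version20180101120000 extends AbstractMigration
{
    public function up(Schema $schema)
    {
        // Add the new attribute to the existing user table
        $this->addSql('ALTER TABLE user ADD nickname VARCHAR(255) DEFAULT NULL');
    }

    public function down(Schema $schema)
    {
        // Keep the change reversible
        $this->addSql('ALTER TABLE user DROP nickname');
    }
}

Running doctrine:migrations:migrate during the production deploy then applies only the versions that have not been executed there yet.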
Yes, it would be safe.
I would just add the extra attribute to the User entity, then run the doctrine:generate:entities command, which should generate the get/set methods. Then update your database using doctrine:schema:update --force, which should add the column to your database table.
I'm trying to unit test the classes I've created, but the majority of the classes deal with the database. I've gotten non-database-related classes to test just fine locally, but I'm stumped when it comes to working with a database, especially remotely. The guide shows using PDO to access a local database that seems to be dumped to an XML file, so it's of little use to me since my database is in the Amazon cloud and I'm using pg_* functions to connect to a Postgres database.
Are there any good examples of a similar situation, or can anyone offer any help? I don't know if I should have a local version of my DB in a file or connect to the remote server. If I have to connect, what do I have to do to make it work?
Conclusion:
The project architect and I did an investigation and we determined it would be best to implement an ORM, since there was no abstraction to the database. Until then, database testing will be on hold. Once in place, I'm sure the PHPUnit manual will make much more sense.
The short answer is to Read The Fine Manual entry on database testing at the PHPUnit manual.
And now the long answer ...
The first thing to remember about unit testing is that it needs to be performed in isolation from all other components. Often, this goal is simplified using inversion of control (IoC) techniques like dependency injection. When your classes explicitly ask for their dependencies in their constructors, it's a simple operation to mock those dependencies so that you can test the remaining code in isolation.
Testing code that interacts with models is a bit different, though. Usually it's not practical or advisable to inject your models into the class in which you need to access them. Your models are generally "dumb" data structures that expose limited or no capabilities. As a result, it's generally acceptable (in terms of testability) to instantiate your models on the fly inside your otherwise-injected classes. Unfortunately, this makes testing database code difficult because, as the PHPUnit documentation notes:
[T]he database is essentially a global input variable to your code
So how do you isolate and test code that interacts with the database if the models aren't directly injected? The simplest way to do this is to utilize test fixtures.
Since you're definitely already using PDO or an ORM library that builds on PDO (right?),
setting up the fixtures is as simple as seeding a basic SQLite database or XML file with data to accommodate your test cases and using that special database connection when you test the code that interacts with the database. You could specify this connection in your PHPUnit bootstrap file, but it's probably more semantically appropriate to set up a PHPUnit Database TestCase.
The generally accepted best practice steps for testing DB code (these are also echoed in the PHPUnit documentation on DB testing):
Set up fixture
Exercise System Under Test
Verify outcome
Teardown
So, to summarize, all you need to do is create a "dummy" database fixture and have your code interact with that known data instead of an actual database you would use in production. This method allows you to successfully isolate the code under test because it deals with known data, and this means you can make specific/testable assertions about the results of your database operations.
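As a rough illustration of those four steps with plain PDO and an in-memory SQLite fixture (UserRepository here is a made-up class standing in for your own data-access code; the DbUnit extension adds more structure on top of the same idea):

class UserRepositoryTest extends \PHPUnit_Framework_TestCase
{
    private $pdo;

    protected function setUp()
    {
        // 1. Set up fixture: a throwaway in-memory database with known data
        $this->pdo = new \PDO('sqlite::memory:');
        $this->pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)');
        $this->pdo->exec("INSERT INTO users (id, email) VALUES (1, 'alice@example.com')");
    }

    public function testFindByEmailReturnsUser()
    {
        // 2. Exercise the system under test with the injected connection
        $repository = new UserRepository($this->pdo);
        $user = $repository->findByEmail('alice@example.com');

        // 3. Verify the outcome against the known fixture data
        $this->assertSame(1, (int) $user['id']);
    }

    protected function tearDown()
    {
        // 4. Teardown: dropping the reference discards the in-memory database
        $this->pdo = null;
    }
}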
UPDATE
Just because it's an extraordinarily useful guide for what not to do in your code if you want to promote testability, I'm adding a link to Misko Hevery's How to Write 3v1L, Untestable Code. It's not involved with database testing in particular, but it's helpful nevertheless. Happy testing!
UPDATE 2
I wanted to respond to the comment about putting off model testing because the existing code base doesn't implement PDO for database access:
Your models don't have to use PDO to implement PHPUnit's DbUnit extension.
It will make your life a bit easier if you use PDO, but you aren't required to do so. Say, for example, you've built your application with PHP's built-in pg_* PostgreSQL functions. PHPUnit still allows you to specify fixtures and it can still rebuild them for each test -- you would simply need to point your connection when performing tests to the same resource the DbUnit extension uses for its fixture.
We are two developers working on a web application based on PHP5, ZF, Doctrine and MySQL5. We each work with a local webserver and a local database. The database schema is defined in a YAML file.
What's the best way to keep our database schemas in sync?
Here's how we do it: whenever developer "A" makes a change, he generates a migration class. Then he commits the migration file, and developer "B" executes the migration class.
But creating a migration class on every DB change is a rather tedious process.
Do you have a better solution?
I don't know how you would do it in the Zend Framework with Doctrine. Here's how I would do it in Symfony with Propel. The exact procedure may vary, but the underlying concept is the same.
I have unit tests on my DAL.
Whenever the schema changes, we check in the YAML and the generated ORM code (you do have source control, don't you?). I have check-ins set to auto mode, meaning I get every check-in instantly.
If the schema changes don't affect my work, I just ignore them. But if the schema changes break my work, I rebuild my forms, ORM classes and whatnot using the symfony propel build command. Rebuilding those infrastructures is just a single command-line call, so that's no problem for me.
Finally, after rebuilding, I run my unit tests to make sure everything is OK. If not, I'd better get them fixed!
I see that this question is already answered, but Doctrine can migrate your databases for you without having to blow away the whole thing. We use this for every schema change: one developer changes his/her local YAML file, generates new models locally, creates a migration using Doctrine, runs that migration locally to change the DB, then checks in both the new YAML file and the migration. Other developers then check out the changed YAML file and migration, generate new models and run the migration to sync their DB. Deploying the code to our QA and production environments is pretty much the same process.
More information on Doctrine migrations can be found on the Doctrine site.