I'm trying to unit test the classes I've created, but the majority of them deal with the database. I've gotten the non-database-related classes tested just fine locally, but I'm stumped when it comes to working with a database, especially remotely. The guide shows using PDO to access a local database that seems to be dumped to an XML file, so it's of little use to me, since my database is in the Amazon cloud and I'm using the pg_* functions to connect to a Postgres database.
Are there any good examples of a similar situation or can anyone offer any help? I don't know if I should have a local version of my DB in a file or connect to the remote server. If I have to connect, what do I have to do to make it work?
Conclusion:
The project architect and I did an investigation and we determined it would be best to implement an ORM, since there was no abstraction to the database. Until then, database testing will be on hold. Once in place, I'm sure the PHPUnit manual will make much more sense.
The short answer is to Read The Fine Manual entry on database testing in the PHPUnit manual.
And now the long answer ...
The first thing to remember about unit testing is that it needs to be performed in isolation from all other components. Often, this goal is simplified using inversion of control (IoC) techniques like dependency injection. When your classes explicitly ask for their dependencies in their constructors, it's a simple operation to mock those dependencies so that you can test the remaining code in isolation.
Testing code that interacts with models is a bit different, though. Usually it's not practical or advisable to inject your models into the class in which you need to access them. Your models are generally "dumb" data structures that expose limited or no capabilities. As a result, it's generally acceptable (in terms of testability) to instantiate your models on the fly inside your otherwise-injected classes. Unfortunately, this makes testing database code difficult because, as the PHPUnit documentation notes:
[T]he database is essentially a global input variable to your code
So how do you isolate and test code that interacts with the database if the models aren't directly injected? The simplest way to do this is to utilize test fixtures.
Since you're definitely already using PDO or an ORM library that builds on PDO (right?),
setting up the fixtures is as simple as seeding a basic SQLite database or XML file with data to accommodate your test cases and using that special database connection when you test the code that interacts with the database. You could specify this connection in your PHPUnit bootstrap file, but it's probably more semantically appropriate to set up a PHPUnit Database TestCase.
The generally accepted best-practice steps for testing DB code are (these are also echoed in the PHPUnit documentation on DB testing):
Set up fixture
Exercise System Under Test
Verify outcome
Teardown
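Those four steps can be sketched in plain PHP with PDO and an in-memory SQLite fixture (the table, the data, and the `countUsers` function below are invented for illustration):

```php
<?php
// 1. Set up fixture: seed known data into an in-memory SQLite database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
$pdo->exec("INSERT INTO users (name) VALUES ('alice'), ('bob')");

// The code under test: it only needs *a* PDO connection, not production's.
function countUsers(PDO $db)
{
    return (int) $db->query('SELECT COUNT(*) FROM users')->fetchColumn();
}

// 2. Exercise the System Under Test.
$result = countUsers($pdo);

// 3. Verify the outcome against the known fixture data.
assert($result === 2);

// 4. Teardown: dropping the connection discards the in-memory database.
$pdo = null;
```

Because the fixture data is known, the assertion can be exact instead of guessing at whatever a shared production database happens to contain.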
So, to summarize, all you need to do is create a "dummy" database fixture and have your code interact with that known data instead of an actual database you would use in production. This method allows you to successfully isolate the code under test because it deals with known data, and this means you can make specific/testable assertions about the results of your database operations.
UPDATE
Just because it's an extraordinarily useful guide for what not to do in your code if you want to promote testability, I'm adding a link to Misko Hevery's How to Write 3v1L, Untestable Code. It doesn't deal with database testing in particular, but it's helpful nevertheless. Happy testing!
UPDATE 2
I wanted to respond to the comment about putting off model testing because the existing code base doesn't implement PDO for database access:
Your models don't have to use PDO to implement PHPUnit's DbUnit extension.
It will make your life a bit easier if you use PDO, but you aren't required to do so. Say, for example, you've built your application with PHP's built-in pg_* PostgreSQL functions. PHPUnit still allows you to specify fixtures and it can still rebuild them for each test -- you would simply need to point your connection at the same resource the DbUnit extension uses for its fixture when performing tests.
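For example, the environment switch can be as small as a helper that picks the test connection string (the hostnames, database names, and the APP_ENV convention below are all invented for this sketch):

```php
<?php
// Hypothetical helper: choose the pg_connect() string per environment so
// tests hit the fixture database instead of the real one.
function connectionString($env)
{
    return $env === 'test'
        ? 'host=localhost dbname=myapp_test'   // the DB your fixtures seed
        : 'host=db.example.com dbname=myapp';  // production in the cloud
}

// Production code would do something like:
//   $conn = pg_connect(connectionString(getenv('APP_ENV')));
// Under test you pass 'test', and all pg_* calls hit the fixture data.
$dsn = connectionString('test');
```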
Related
I have a really nasty problem with a CodeIgniter application which I want to load balance, also using MySQL replication.
In other PHP frameworks this would be a 5-minute job because you can switch the datasource just before saving the data (CakePHP allows this, for example), but in CodeIgniter it seems that's a different story.
My problem is that the application is legacy and I have to load balance it. I have all the hardware setup already done, the filesystem is in sync, and the database has a slave that is updated as it should be, but I have to make the second CodeIgniter application read from the slave and write to the master.
I already found a solution, but since the application is legacy I wouldn't want to go into its internals because I could break something that already works (it's also pretty sensitive because people are paying within the app, and it's live, which makes things even worse).
The solution I've found is this one which makes you change all the models in the application.
As further information: the CodeIgniter database driver (CI_DB_mysql_driver in this case) doesn't have any method for switching the connection or anything similar.
Do you have another suggestion for tackling this problem? (besides changing all the models which I find a bit too intrusive)
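For what it's worth, the read-from-slave / write-to-master idea itself is small. Here is a framework-agnostic sketch with invented class names (a CodeIgniter version would have to wrap the CI_DB driver instead), demoed with two in-memory SQLite databases standing in for master and slave:

```php
<?php
// Invented wrapper: route reads to the slave and writes to the master.
class ReadWriteSplitter
{
    private $master;
    private $slave;

    public function __construct(PDO $master, PDO $slave)
    {
        $this->master = $master;
        $this->slave  = $slave;
    }

    public function query($sql)
    {
        // Crude heuristic: SELECTs are reads, everything else is a write.
        $isRead = stripos(ltrim($sql), 'SELECT') === 0;
        return ($isRead ? $this->slave : $this->master)->query($sql);
    }
}

// Demo: two in-memory SQLite databases pretend to be master and slave.
$master = new PDO('sqlite::memory:');
$slave  = new PDO('sqlite::memory:');
$master->exec('CREATE TABLE t (v TEXT)');
$slave->exec('CREATE TABLE t (v TEXT)');
$slave->exec("INSERT INTO t (v) VALUES ('from-slave')");

$db = new ReadWriteSplitter($master, $slave);
$db->query("INSERT INTO t (v) VALUES ('from-master')"); // goes to master
$row = $db->query('SELECT v FROM t')->fetchColumn();    // comes from slave
$masterCount = (int) $master->query('SELECT COUNT(*) FROM t')->fetchColumn();
```

The catch, as the question notes, is injecting such a wrapper into legacy models without touching all of them.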
What are the current best practices for testing database interaction with Symfony2? I have a simple CRUD setup and I want to make sure my testing is OK. Right now, I have 4 tests, each one making sure that the create, update, delete and list actions work correctly.
I have two magic methods, __construct and __destruct, on my test case. Inside them, I call exec() with 'php app/console ...' to create the database, create the schema and later on drop the database. However, this is SLOW as hell, and it happens every time when I have more than one test case.
How should I proceed when it comes to database testing and isolating such tests?
I think it's best to always start clean to be sure that tests are fully isolated. To do that I'm simply building the database structure before each test and then filling it with the fixtures needed for a given test.
Note that I'm only building the needed database tables and only inserting the needed fixtures. It's a bit faster than loading a big database dump. It's also cleaner, as tests don't share fixtures (which makes them easier to maintain).
I have a base test case class called KernelAwareTest which helps me in building the schema. You can find it on gist: https://gist.github.com/1319290
setUp() boots the Symfony kernel and stores a reference to it in a class property (together with references to the DIC and the entity manager). Also a call to generateSchema() is made to generate the database schema (it uses Schema Tool from Doctrine).
By default it generates the database structure for all entities known to the entity manager. You can change this behaviour in your test class by overriding the getMetadatas() method.
P.S.: I tried using an in-memory database (SQLite) but it wasn't perfect. Anyway, it's probably better to run tests against the database you use in production.
Database testing is always slow as you need to create/drop your schema before/after each test. To avoid unnecessary operations, you could:
create schema in the 'setUpBeforeClass' method;
ensure your 4 tests are launched in one thread with the '@depends' annotation;
drop schema in the 'tearDownAfterClass' method.
The schema will be created/dropped only once for your test case.
You can also set up Doctrine to use an in-memory SQLite database (which is very fast):
doctrine:
    dbal:
        driver: pdo_sqlite
        path: :memory:
        memory: true
Anyway, '__construct' and '__destruct' should never be used in PHPUnit test cases; use 'setUp' and 'tearDown' instead.
The question is pretty old but still valid today so here is my experience and how I handle it today on my Symfony projects.
I started off using an SQLite in-memory database for my tests and rebuilt the db schema + inserted fixtures before each single test case. This had two major drawbacks:
It was still way too slow :(
On dev & prod I used MySQL, which soon became a problem because SQLite simply does not have all the features needed and sometimes behaves differently
Using MySQL for the tests and rebuilding the schema + inserting fixtures before each test was simply too slow. So I was looking for alternatives...
I stumbled across this blog post: http://alexandre-salome.fr/blog/Symfony2-Isolation-Of-Tests
The author suggests running tests inside active database transactions and simply rolling back any changes after every single test.
I took this idea and created a bundle for it: https://github.com/dmaicher/doctrine-test-bundle
The setup of the bundle is really easy and does not require changing any existing php test classes. Internally it changes the doctrine config to use custom database connections + driver.
With this bundle you can simply create your database schema + insert fixtures ONCE BEFORE running the whole testsuite (I prefer doing this in a custom phpunit bootstrap file). Using the phpunit listener all tests will run inside database transactions.
I've been using this bundle for a while now, and for me it works out perfectly with SQLite, MySQL or PostgreSQL.
For a while now it has also been used on the symfony-demo project.
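Stripped of the bundle's plumbing, the transaction trick looks roughly like this (a hand-rolled PDO/SQLite sketch, not the bundle's actual code):

```php
<?php
// Seed the database ONCE, before the whole suite (schema + fixtures).
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)');
$pdo->exec("INSERT INTO posts (title) VALUES ('seed')");

// Per test: open a transaction and let the test mutate data freely...
$pdo->beginTransaction();
$pdo->exec("INSERT INTO posts (title) VALUES ('written-by-test')");
$during = (int) $pdo->query('SELECT COUNT(*) FROM posts')->fetchColumn();

// ...then roll everything back, so the next test sees the pristine seed
// without any schema rebuilding or fixture reloading.
$pdo->rollBack();
$after = (int) $pdo->query('SELECT COUNT(*) FROM posts')->fetchColumn();
```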
Testing on a local machine is a pain, so I started using the CI system buddy.works (there is a free stand-alone version), and for this I needed to resolve this issue on my own.
The result is:
all tests work
tests run on production SQL data
tests run in separation (not in dev or production), so I can do anything I want with the database
all pushes to git are tested and I get a notification if something is broken
all pushes/pull requests to the deploy branch are automatically uploaded to production
This is my solution:
a second parameters.yml in config with the configuration for tests
on production I make a daily SQL dump
when tests start on CI, this SQL backup is copied via scp to the test machine
to prepare all this I'm using robo.li (my config is below)
/**
 * This is the project's console commands configuration for the Robo task runner.
 *
 * @see http://robo.li/
 */
class RoboFile extends \Robo\Tasks
{
    // Create a fresh test database and load the production dump into it.
    function loadDb()
    {
        $this->taskExecStack()
            ->stopOnFail()
            ->exec("mysql -h mariadb -u root -pqwerty -e 'create database test'")
            ->exec("mysql -h mariadb -u root -pqwerty test < test.sql")
            ->run();
    }

    // Swap in the test parameters and point them at the test database host.
    function prepareDb()
    {
        $this->taskExecStack()
            ->stopOnFail()
            ->exec("cp app/config/parameters-test.yml app/config/parameters.yml")
            ->run();
        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_host: 127.0.0.1')
            ->to("database_host: 'mariadb'")
            ->run();
        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_user: dbuser')
            ->to("database_user: 'root'")
            ->run();
        $this->taskReplaceInFile('app/config/parameters.yml')
            ->from('database_password: 123')
            ->to("database_password: 'qwerty'")
            ->run();
    }
}
I hope this helps you form an idea of how to organize all this. A stand-alone CI is difficult to set up, but it's a really good idea.
I am looking for a (small) library that helps me cleanly implement a front controller for my pet project and dispatches requests to individual controller classes. The front controller/dispatcher and controller classes need to be fully unit-testable without sending HTTP requests.
Requirements
PSR-0 compatible
installable via its own PEAR channel
support for unit testing:
checking if the correct HTTP headers are sent
catches output to allow inspection in unit tests
preferably PHPUnit helper methods to help inspect the output (for different output types, i.e. HTML, XML, JSON)
allows setting of incoming HTTP headers, GET and POST parameters and cookies without actually doing HTTP requests
needs to be usable standalone - without the db abstraction, templating and so on that the fat frameworks all provide
Background
SemanticScuttle, the application that is bound to get proper "C" support, is an existing, working application. The library needs to blend in it and needs to work with the existing structure and classes. I won't rewrite it to match a framework's specific required directory layout.
The application already has unit tests, but they are based on HTTP requests, which makes them slow. Also, the current old way of having several dozen .php files in the www directory isn't the most manageable solution, which is why proper controller classes need to be introduced. All in all, there will be about 20-30 controllers.
Previous experience
In general, I was pretty happy with Zend Framework for some previous projects but it has several drawbacks:
not pear-installable, so I cannot use it as a dependency in my pear-installable applications
only available as one fat download, so I manually need to extract the required bits from it - for each single ZF update.
while unit test support exists for ZF controllers, it's lacking some advanced utility functionality like assertions for json, HTTP status code and content type checks.
While these points may seem like nit-picking, they are important for me. If I have to implement them myself, I don't need to use an external library but can write my own.
What I don't want
StackOverflow has a million "what's the best PHP framework" questions (1, 2, 3, 4, 5), but I'm not looking for those but for a specific library that helps with controllers. If it's part of a modular framework, fine.
I also know the PHP framework comparison website, but it doesn't help answer my question since my requirements are not listed there.
And I know that I can build this all on my own and invent another microframework. But why? There are so many of them already, and one just has to have all that I need.
Related questions
What's your 'no framework' PHP framework?
How do you convert a page-based PHP application to MVC?
Knowing Symfony2 well, I can assure you it's definitely possible to use it just for the "C" in MVC. The models and templates are completely free and are typically executed from the Controllers anyway, so if you don't call Doctrine or Twig specifically, you can do what you want.
As for functional testing, which is really what you're talking about in your article, what you want to look at is the WebTestCase class, which is well complemented by the LiipFunctionalTestBundle bundle for more advanced cases.
That allows for things like this example of testing a contact form that sends an email, where the entire HTTP request is done in-process. Since the framework is written to allow multiple requests per process and has no global state, this works very well and does not require an HTTP server to be running or anything. As you can see, I do assertions on the HTTP status code of the response too, and was able to capture the email without sending it, since in the test configuration sending of emails is disabled in the standard distro of Symfony2.
That being said, you could also just use the Request and Response classes from Symfony2's HttpFoundation component. It should allow you to test your code, but IMO you wouldn't get as many nice features as you could if you'd use the entire framework. Of course that's just my biased opinion ;)
I would recommend downloading the Symfony 2 framework Routing component: https://github.com/symfony/Routing
Documentation is found here: http://symfony.com/doc/current/book/routing.html
Perhaps it does not satisfy all your requirements, but it's the closest.
If you are familiar with Symfony (which I think you are), you should check out Silex. From their website, this is what they say about it:
A microframework provides the guts for building simple single-file apps. Silex aims to be:
Concise: Silex exposes an intuitive and concise API that is fun to use.
Extensible: Silex has an extension system based around the Pimple micro service-container that makes it even easier to tie in third party libraries.
Testable: Silex uses Symfony2's HttpKernel which abstracts request and response. This makes it very easy to test apps and the framework itself. It also respects the HTTP specification and encourages its proper use.
I'd add Net_URL_Mapper, it doesn't have the assertions though. Is that why you ruled it out?
Another pretty interesting thing is silex. It also comes with controller tests. I'd use that over Symfony2. But that's my personal preference.
Quite an understandable wishlist. I think we all hate it in testing when we run into dependencies that wreak havoc on testing. Tests should be simple and short; having many things to solve before and after running each test can be a burden.
From the description of your question it looks like that you pretty specifically know what you're looking for.
My first reaction would be to use PHPUnit for this. It does not satisfy all your requirements, but it's a base you can build on. It's highly extensible and flexible; however, it does not support PSR-0 but has an autoloader of its own, so that probably doesn't weigh too heavily.
From the information you give in your question, I'm not sure if it's the design of your test suite(s) or the design of your application that is hindering you from writing and performing the tests you would love to.
I smell probably both. If your application code is not easily testable, then there is not much a testing framework like PHPUnit can do about it. For example, if your controllers do not use a request object with an interface, it's not so easy to inject a request that was triggered not by an actual HTTP request but by your tests. As HTTP is most often the entry point into a web application, it pays to abstract here for tests. There exist some suggestions apart from specific frameworks: Fig/Http. However, this is just a pointer.
It's similar with the database scenario you give: if your application code depends on the database, then your tests will as well. If you don't want to test against your database all the time, your controllers need to be able to work without the concrete database. This is comparable to the HTTP requests.
There exist numerous approaches to cope with these circumstances, but as I read your question, you don't look uneducated; rather, you're looking for a better solution than the existing ones.
As with any code of your own, it's pretty hard to find something that matches your own design. The best suggestion I can give is to extend PHPUnit to add the suites and constraints you need for your application, while you use the support of automated tests to refactor your application to fit how you would like to test.
So you can start with the tests and then develop the controller like you need it. This will keep your controller light I assume and help you to find the solutions you need.
If you find something that is missing with PHPUnit, you can first extend it on your own and additionally the author is very helpful in adding missing features.
Keep in mind that if what you need does not exist, you will need to code it yourself. However, if you're able to share (part of) the work with others, you most often benefit more than by doing everything alone. That's a point for an existing framework, be it for testing or for the application.
So if, as of yet, there is no controller/MVC library that supports easy unit testing out of the box and fits your needs, chime in and develop one TDD-wise. If done right, it can exactly match your requirements. However, I think you're not alone with this problem. So not a very concrete answer, but I can only say that I have had very good experiences with PHPUnit and its extensibility. That includes the output tests you mention in your question.
And probably a little differentiation at the end: it's one thing to test code units and another to test whether they all work in concert in the application with its various requests. The latter most often requires larger test setups by nature. However, if you can separate units from each other and clearly define which other units they interact with, then you normally only need to test the interaction between those, which can reduce the setup. This does not save you from infrastructure problems, but those are normally not tested with unit tests anyway (albeit you can extend PHPUnit to perform other types of checks).
A popular framework - even with a bad design - has the big plus that its components tend to be better tested through use. That normally helps you get through the first years of your application, until design issues in the framework make you need to rewrite your whole code base (probably).
As controllers often sit in the middle of everything, this can lead to the scenario that you test the whole application while you only want to test the controller(s). So you should think about the design and role of the controllers and their place within the overall application - what you really want to test with your controllers - so you can really make them testable according to your needs. If you don't need to test the database, you don't need to test the models. So you could mock a model returning random data, to take it to the extreme.
But if you want to test whether HTTP handling is right, then probably a unit that abstracts HTTP handling is needed first. Each controller relying on it would not need to be tested (theoretically), as the HTTP processing has been tested already. It's a question of the level of abstraction as well. There is no overall solution; it's only that frameworks can offer something, but you're then bound to the paradigms the framework expects.
AFAIK testing in PHP is getting more and more popular, but that doesn't mean that the existing frameworks have good support for it. I know from Zend Framework that they have been working to improve the situation for a while. So it's probably worth looking into the more recent developments in the more popular frameworks to see where this leads.
And for the very specifics, you need to test on your own always.
Opting for PHPUnit and your own test cases does look like a practical way to me. Code your controllers as you need them for your project in TDD and you should get what you need.
Probably the more component-based approach of Symfony 2 fits your needs better than what you experienced with Zend Framework. However, I cannot suggest anything specific, as needs differ highly with application design. What's a quick and solid solution for one application is a burden for another. See Page Controller.
You could take a look at http://ezcomponents.org/, which is becoming Apache Zeta.
There are three ways to make eZ Components available for your PHP environment; please read the whole of this article before continuing with the practical part:
Use PEAR Installer for convenient installation via command line
Download eZ components packaged in an archive
Get the latest sources from SVN
I haven't got my hands dirty with it yet, but it looks like a good solution...
Seldaek: WebTestCase isn't quite the right thing - it's for testing a view directly, and a controller or model only indirectly.
A unit test case for a controller would invoke the controller, likely giving it a mock object for the templating engine (e.g. a mock Smarty object), then check the values that were assigned to that object for display: for example, if you invoked the controller for /countries/south-sudan, you could check that the template variable $continent was set to "Africa". This kind of unit testing wouldn't actually involve any template rendering in most cases.
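A minimal sketch of that idea, with a hand-rolled fake standing in for a real Smarty object (all class names below are invented):

```php
<?php
// Fake templating engine: records assigned variables, renders nothing.
class FakeTemplate
{
    public $vars = array();

    public function assign($name, $value)
    {
        $this->vars[$name] = $value;
    }
}

// A controller under test that assigns data for the view.
class CountryController
{
    public function show($slug, $template)
    {
        // Normally this would come from a model lookup.
        if ($slug === 'south-sudan') {
            $template->assign('continent', 'Africa');
        }
    }
}

// The test invokes the controller and inspects what was assigned;
// no template rendering and no HTTP request are involved.
$template   = new FakeTemplate();
$controller = new CountryController();
$controller->show('south-sudan', $template);
```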
I'm kind of new to unit testing, but I've recently seen how it can be quite useful. I've seen that most unit tests are self running. In fact most unit testing frameworks provide a way of running several tests at once (such as unit testing a whole system).
I wonder, though: how do you deal with external resources in self-running unit tests? I like the idea of testing a whole system and seeing which classes failed, but a class may, for example, create thumbnails from an uploaded image. How would that test be self-running when it relies on an uploaded image? Would I keep a directory of images and "pretend" to upload one of them in the test?
Any thoughts on the matter would be greatly appreciated.
If you are planning on testing external resources, then it would be integration testing. In pure unit testing, to test external resources you would have to mock the external resource. So in this case you would create an IDirectory interface, use, say, a FakeDirectory class, and then use FakeDirectory to "upload" the image. When you are actually using the application, you would pass an actual directory.
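In PHP terms the same idea might look like this (the interface, class, and function names are invented for the example):

```php
<?php
// The abstraction the code under test depends on.
interface Directory
{
    public function store($filename, $contents);
    public function exists($filename);
}

// The fake keeps "files" in memory, so no filesystem is ever touched.
class FakeDirectory implements Directory
{
    private $files = array();

    public function store($filename, $contents)
    {
        $this->files[$filename] = $contents;
    }

    public function exists($filename)
    {
        return isset($this->files[$filename]);
    }
}

// Code under test receives a Directory and never knows it's a fake.
function handleUpload(Directory $dir, $name, $bytes)
{
    $dir->store($name, $bytes);
    return $dir->exists($name);
}

$ok = handleUpload(new FakeDirectory(), 'photo.jpg', 'binary-data');
```

In production you would pass an implementation backed by the real filesystem instead of the fake.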
In integration testing, you could have a setup class which would do all the work to set up and then you would test.
I have come across this same situation while unit testing my PHP classes. There are functions which can be tested without using any other resources (unit testing), but many functions perform file read/write operations or require database access (integration testing). In order to test these functions, I've combined unit testing with integration testing. In my setUp and tearDown methods, the test class may load a database schema or fetch test data from a local test_data/ directory required by the class's functions.
If you need to test what happens with user input, you indeed need some sample data at hand. A directory with images, text files, PDFs or whatever else is needed should be kept alongside your unit tests. Or you can generate random data programmatically in your tests.
Yes, ideally a class that creates a thumbnail can use a placeholder image that you provide as a resource in your unit test directory. You should be able to test the class in isolation, with as little dependency on the rest of your application as possible. That's kind of what people mean when they recommend to design your code to be "testable."
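One way to push this further is to split the pure logic out of the thumbnailer so that part needs no image file at all (the function below is an invented example of such a split):

```php
<?php
// Pure-logic piece of a thumbnailer: compute target dimensions that fit
// inside a bounding box while preserving aspect ratio. Because it takes
// plain numbers, it can be unit tested with no upload and no GD calls.
function thumbnailSize($width, $height, $maxSide)
{
    $scale = $maxSide / max($width, $height);
    if ($scale >= 1) {
        return array($width, $height); // already small enough
    }
    return array((int) round($width * $scale), (int) round($height * $scale));
}

$landscape = thumbnailSize(1600, 900, 200);
$small     = thumbnailSize(100, 50, 200);
```

The remaining thin wrapper that actually reads and writes image files can then be covered by a single integration test using a placeholder image.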
Mock external dependencies. I have no real experience of mocking in PHP, but I have seen enough resources online just by googling for mock and PHP to know that it's being done.
We are two developers working on a web application based on PHP5, ZF, Doctrine and MySQL5. We are each working with a local webserver and a local database. The database schema is defined in a YAML file.
What's the best way to keep our database schemas in sync?
Here's how we do it: whenever developer "A" makes a change, he generates a migration class and commits the migration file. Developer "B" then executes the migration class.
But creating a migration class on every db change is a rather tedious process.
Do you have a better solution?
I don't know how you'd do it in Zend Framework with Doctrine. Here's how I would do it in Symfony with Propel; the exact procedure may vary, but the underlying concept is the same.
I have unit tests on my DAL.
Whenever the schema changes, we check in the yml and the generated ORM code (you do have source control, don't you?). I set the check-in to auto mode, meaning I will get all the check-ins instantly.
If the schema changes don't affect my thing, then I would just ignore the changes. But if the schema changes break my thing, then I will rebuild my form, ORM classes and whatnot by using symfony propel build command. Rebuilding those infrastructures is just a single command line thing, so there is no problem for me.
Finally, after rebuilding, I will run my unit tests, to make sure everything is OK. If not, I better get them fixed!
I see that this question is already answered but Doctrine can migrate your databases for you without having to blow away the whole thing. We use this for every schema change where one developer changes his/her local yaml file, generates new models locally, creates a migration using Doctrine, runs that migration locally to change the db, then checks in both the new yaml file and the migration. Then other developers check out the changed yaml file and migration, then they generate new models and run the migration to sync their db. Deploying the code to our QA and production environments is pretty much the same process.
More information on Doctrine migrations can be found on the Doctrine site.