I'm fairly new to unit testing, but I've recently seen how useful it can be. I've noticed that most unit tests are self-running; in fact, most unit testing frameworks provide a way of running several tests at once (such as unit testing a whole system).
I wonder, though: how do you deal with external resources in self-running unit tests? I like the idea of testing a whole system and seeing which classes failed, but a class may, for example, create thumbnails from an uploaded image. How would that test be self-running when it relies on an uploaded image? Would I keep a directory of images and "pretend" to upload one of them in the test?
Any thoughts on the matter would be greatly appreciated.
If you are planning on testing external resources, that is integration testing. In pure unit testing, to test code that touches an external resource you would mock that resource. So in this case you could create an IDirectory interface, then use, say, a FakeDirectory class and have FakeDirectory "upload" the image. When actually running the application, you would pass in an implementation backed by a real directory.
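Here is a minimal sketch of that idea; IDirectory and FakeDirectory are the hypothetical names from this answer, and the Thumbnailer class is invented for the example, not taken from any real library:

<?php
// IDirectory: hypothetical abstraction over the place images are stored.
interface IDirectory
{
    public function read($filename);              // returns raw image bytes
    public function write($filename, $contents);  // stores raw image bytes
}

// FakeDirectory: test double that keeps "files" in an array instead of touching disk.
class FakeDirectory implements IDirectory
{
    private $files = array();

    public function read($filename)
    {
        return isset($this->files[$filename]) ? $this->files[$filename] : '';
    }

    public function write($filename, $contents)
    {
        $this->files[$filename] = $contents;
    }
}

// The thumbnail class depends only on the interface, so a test can hand it a FakeDirectory
// while the real application passes in an implementation backed by an actual directory.
class Thumbnailer
{
    private $directory;

    public function __construct(IDirectory $directory)
    {
        $this->directory = $directory;
    }

    public function createThumbnail($source, $target)
    {
        $image = $this->directory->read($source);
        // ... resize $image here ...
        $this->directory->write($target, $image);
    }
}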
In integration testing, you could have a setup class that does all the preparation work, and then you would run the tests against it.
I have come across this same situation while unit testing my PHP classes. There are functions which can be tested without using any other resources (unit testing), but many functions perform file read/write operations or require database access (integration testing). In order to test these functions, I've combined unit testing with integration testing. In my setUp() and tearDown() methods, I may load a database schema or fetch test data from a local test_data/ directory that the class functions require.
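For illustration, a setUp()/tearDown() pair along those lines might look like the sketch below; the test_data/ directory is the one mentioned above, while ThumbnailService and sample.jpg are invented for the example:

<?php
class ThumbnailServiceTest extends PHPUnit_Framework_TestCase
{
    protected $testDataDir;
    protected $workDir;

    protected function setUp()
    {
        // Fixture files shipped alongside the tests.
        $this->testDataDir = __DIR__ . '/test_data';

        // Scratch directory for anything the code under test writes.
        $this->workDir = sys_get_temp_dir() . '/thumb_test_' . uniqid();
        mkdir($this->workDir);
    }

    protected function tearDown()
    {
        // Remove whatever the test produced.
        foreach (glob($this->workDir . '/*') ?: array() as $file) {
            unlink($file);
        }
        rmdir($this->workDir);
    }

    public function testCreatesThumbnailFromFixtureImage()
    {
        $service = new ThumbnailService(); // hypothetical class under test
        $service->createThumbnail($this->testDataDir . '/sample.jpg', $this->workDir . '/thumb.jpg');

        $this->assertFileExists($this->workDir . '/thumb.jpg');
    }
}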
If you need to test what happens with user input, you do indeed need some sample data at hand. A directory with images, text files, PDFs or whatever else is needed should live alongside your unit tests. Or you can generate random data programmatically in your tests.
Yes, ideally a class that creates a thumbnail can use a placeholder image that you provide as a resource in your unit test directory. You should be able to test the class in isolation, with as little dependency on the rest of your application as possible. That's kind of what people mean when they recommend designing your code to be "testable."
Mock external dependencies. I have no real experience with mocking in PHP, but I've seen enough resources online just by googling "mock" and "PHP" to know that it's being done.
I'm writing some unit tests (PHPUnit) for a project that will be used by another project. The project that I am testing has a class that will include files that will be present only in the other project.
Now how do I write a unit test for this class without having the files that it needs to include available? Would you recommend setting up a complete testing project with stub files (the file in question is a file that contains some settings) and running all the unit tests there? Or should I create directories and files using for example the setUp() method?
Edit:
To be more specific, I have a base project A, which is a website, and a project B, which contains a class that generates a form. The form class will be installed in project A using Composer. In project A, the form class will check for the existence of a directory with a settings file; if it exists, it will include it and load the settings from it. To test the form class, do you think I should create a project C (just for testing) that installs project B, and in which I set up the directory with the settings file for testing? Or would it be better to create the directory with the settings file in project B itself? The latter seems a bit odd to me, as I don't want all this unit testing material to end up in project A when I composer-install project B into it.
Yes!! To everything:
Now how do I write a unit test for this class without having the files that it needs to include available?
You can create test fixtures. What is the specification you're programming your code to? As you develop your code, are you using a test file? Reading documentation? Working from a specification given by the client? You could create an input file that fulfills the spec and provide it to your function.
Would you recommend setting up a complete testing project with stub files (the file in question is a file that contains some settings) and running all the unit tests there?
Yes, but only the smallest number of tests necessary to programmatically ensure that the functionality you're promising your client is actually being delivered! If the function is given a file path, parses the file, and then loads the settings, I feel there need to be at least a couple of test cases that ensure a file can be read from the operating system. Having a fixture file that your test loads to verify that the file-opening logic is correct should be a pretty reliable test. I think the tricky part is minimizing the number of these tests.
For example, if you need to test your settings-parsing logic, it may seem easy to create a settings file and have your test load and parse it. For a couple of tests this will be plenty fast and reliable, but as your test suite grows it becomes orders of magnitude slower than in-memory tests. Instead of going through the file system to test the settings-parsing logic, you could exercise a settings-parsing function directly by providing it with the string contents of the file. This way you can build a settings string in memory, in your test function, and pass it into the parsing function, avoiding any file system reads. If the file is too large for that and the parser expects a file-like object so that it can incrementally read data from the file system, you could create an in-memory stub object to use instead.
I'm not sure of the PHP API for that, but if there is something like a readline method, you could create a fake file object that satisfies the PHP file API and build your fake settings file in memory during the test, again avoiding the file system.
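A sketch of both variants; SettingsParser and its two entry points are invented for the example, and php://memory is the built-in stream I would reach for to fake a file without touching disk:

<?php
class SettingsParserTest extends PHPUnit_Framework_TestCase
{
    public function testParsesKeyValuePairsFromAString()
    {
        // The settings "file" is just a string built in memory.
        $contents = "host=localhost\nport=8080\n";

        $parser   = new SettingsParser();
        $settings = $parser->parseString($contents); // hypothetical string-based entry point

        $this->assertSame('localhost', $settings['host']);
        $this->assertSame('8080', $settings['port']);
    }

    public function testParsesFromAStreamWithoutTouchingDisk()
    {
        // php://memory is a real stream resource that never hits the file system.
        $stream = fopen('php://memory', 'r+');
        fwrite($stream, "host=localhost\n");
        rewind($stream);

        $parser   = new SettingsParser();
        $settings = $parser->parseStream($stream);   // hypothetical stream-based entry point

        $this->assertSame('localhost', $settings['host']);
        fclose($stream);
    }
}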
Or should I create directories and files using for example the setUp() method?
What is the benefit of this over having a static file? In my experience, minimizing test complexity and test logic is huge for test suite maintenance and performance.
You could create/delete the files and directories during setup and tear down, but be aware that this can make your tests flaky, as sometimes files cannot be deleted, e.g. due to lock issues or when a test fails and tear down is not called.
A safer way to interact with "your local filesystem" is vfsStream, which never actually writes to disk. The documentation contains a nice basic example, both for using the filesystem directly and for using vfsStream to mock the filesystem access: https://github.com/mikey179/vfsStream/wiki/Example
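Condensed from the idea in that example, a vfsStream-based test could look roughly like this (SettingsLoader and the settings file layout are made up for the sketch):

<?php
use org\bovigo\vfs\vfsStream;

class SettingsLoaderTest extends PHPUnit_Framework_TestCase
{
    public function testLoadsSettingsFromAVirtualDirectory()
    {
        // Build a directory tree that exists only in memory.
        vfsStream::setup('root', null, array(
            'config' => array(
                'settings.php' => '<?php return array("debug" => true);',
            ),
        ));

        $loader   = new SettingsLoader();                          // hypothetical class under test
        $settings = $loader->load(vfsStream::url('root/config'));  // it just sees a normal-looking path

        $this->assertTrue($settings['debug']);
    }
}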
Moving to Codeception from Behat and still getting used to its concepts and where things go.
In the hypothetical that my tests are 100% driven from .feature files, does this mean that all of the test code could be in Contexts? That there wouldn't be anything in any functional tests that extend PHPUnit_Framework_TestCase? (Assuming that all my functional tests would extend that)
Codeception is not driven by Gherkin the way Behat is. If you are moving away from Behat, you will write functions in classes in Codeception directly; you are not going to start from a Gherkin script and then derive the executable specs (in your context files and page objects).
In brief, the two flows:
Behat
Write the BDD scripts/Gherkin features. These are totally abstract and should usually be logical descriptions of the use cases your system implements. A product owner can start writing these, for instance, when a user story is created. This requires no programming knowledge.
For each line in the feature, implement an executable specification (a function in a Context class) that handles that action.
In Behat you usually also use Page Objects (I'm unsure whether this can also be done in Codeception, but I don't see why not if you can import a Page Object library).
Codeception
You write executable specifications as the first step, for instance in a Cept file. A developer is needed here, as this is actual PHP code/classes.
When you run Codeception, it prints out a list of all the statements it has run, like a report.
The above is a very simplified description, as your question is also quite generic; I hope it answers it.
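For a rough feel of the Codeception side, a minimal Cept-style scenario might look like this (assuming a standard acceptance suite with the generated AcceptanceTester actor; the page, field names, and file path are made up):

<?php
// tests/acceptance/LoginCept.php
$I = new AcceptanceTester($scenario);
$I->wantTo('log in with valid credentials');
$I->amOnPage('/login');
$I->fillField('username', 'admin');
$I->fillField('password', 'secret');
$I->click('Log in');
$I->see('Welcome back');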
I am about to start writing the PHPUnit tests for a project from scratch, so I was looking at some projects (like Zend) to see how they do things and how they organize their tests.
Most things are pretty clear; the only thing I have some problems with is how to organize the test suites properly.
Zend has an AllTests.php which loads the other test suites.
Though looking at the class, it uses PHPUnit_Framework_TestSuite to create a suite object and then adds the other suites to it. But if I look in the PHPUnit docs for organizing tests, for PHPUnit versions after 3.4 there is only a description of the XML or file-hierarchy approach; the one using classes to organize the tests was removed.
I haven't found anything saying that this method is deprecated, and projects like Zend are still using it.
But if it is deprecated, how would I organize tests in the same structure with the XML configuration? Executing all tests is no problem, but how would I organize the tests (in the XML) if I only wanted to execute a few of them? Maybe by creating several XML files in which I only specify a few tests/test suites to be run?
So if I wanted to test only module1 and module2 of the application, would I have an extra XML file for each, defining test suites only for those modules (the classes used by the module), and also one that defines a test suite for all tests?
Or would it be better to use the @group annotation on the specific tests to mark them as belonging to module1 or module2?
Thanks in advance for pointing me to some best practices.
I'll start off by linking to the manual and then go into what I've seen and heard in the field.
Organizing phpunit test suites
Module / Test folder organization in the file system
My recommended approach is combining the file system with an xml config.
tests/
  unit/
    module1/
    module2/
  integration/
  functional/
with a phpunit.xml containing a simple:
<testsuites>
  <testsuite name="My whole project">
    <directory>tests</directory>
  </testsuite>
</testsuites>
You can split the test suites if you want to, but that's a project-to-project choice.
Running phpunit will then execute ALL tests and running phpunit tests/unit/module1 will run all tests of module1.
Organization of the "unit" folder
The most common approach here is to mirror your source/ directory structure in your tests/unit/ folder structure.
You have one TestClass per ProductionClass anyway, so it's a good approach in my book.
In-file organization
One class per file.
It's not going to work anyway if you have more than one test class in one file, so avoid that pitfall.
Don't have a test namespace
It just makes writing the test more verbose, as you need an additional use statement, so I'd say the test class should go in the same namespace as the production class. That is nothing PHPUnit forces you to do; I've just found it to be easier, with no drawbacks.
Executing only a few tests
For example, phpunit --filter Factory executes all FactoryTests, while phpunit tests/unit/logger/ executes everything logging-related.
You can use @group tags for things like issue numbers, stories, or similar, but for "modules" I'd use the folder layout.
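A @group tag is just a docblock annotation on a test method (or the whole class); the group names below are invented for the example:

<?php
class PaymentTest extends PHPUnit_Framework_TestCase
{
    /**
     * @group issue-1234
     * @group slow
     */
    public function testRefundIsIdempotent()
    {
        // Selected with: phpunit --group issue-1234
        $this->assertTrue(true); // placeholder body for the sketch
    }
}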
Multiple xml files
It can be useful to create multiple xml files if you want to have:
one without code coverage
one just for the unit tests (but not for the functional or integration or long running tests)
other common "filter" cases
phpBB3, for example, does that for its phpunit.xml files.
Code coverage for your tests
As it is related to starting a new project with tests:
My suggestion is to use @covers tags as described in my blog (only for unit tests; always cover all non-public functions; always use covers tags).
Don't generate coverage for your integration tests. It gives you a false sense of security.
Always use whitelisting to include all of your production code so the numbers don't lie to you!
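As a small illustration of the @covers tag mentioned above (the Money class and its methods are invented for the example):

<?php
class MoneyTest extends PHPUnit_Framework_TestCase
{
    /**
     * @covers Money::add
     */
    public function testAddSumsTheAmounts()
    {
        $money = new Money(10);

        // Only lines executed inside Money::add count towards this test's coverage.
        $this->assertEquals(15, $money->add(new Money(5))->getAmount());
    }
}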
Autoloading and bootstrapping your tests
You don't need any sort of autoloading for your tests; PHPUnit will take care of that.
Use the <phpunit bootstrap="file"> attribute to specify your test bootstrap. tests/bootstrap.php is a nice place to put it. There you can set up your application's autoloader and so on (or call your application's bootstrap, for that matter).
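A typical tests/bootstrap.php along those lines is only a few lines; the Composer autoloader path below is the usual assumption, not something mandated by PHPUnit:

<?php
// tests/bootstrap.php -- referenced via <phpunit bootstrap="tests/bootstrap.php">
// Register the application's autoloader so the tests can find the production classes.
require_once __DIR__ . '/../vendor/autoload.php';

// Any other global test setup (constants, environment defaults, ...) goes here.
date_default_timezone_set('UTC');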
Summary
Use the xml configuration for pretty much everything
Separate unit and integration tests
Your unit test folders should mirror your application's folder structure
To execute only specific tests, use phpunit --filter or phpunit tests/unit/module1
Use strict mode from the get-go and never turn it off.
Sample projects to look at
Sebastian Bergmann's "Bank Account" example project
phpBB3, even though they have to fight a bit with their legacy ;)
Symfony2
Doctrine2
Basic Directory Structure:
I have been experimenting with keeping the test code right next to the code being tested, literally in the same directory with a slightly different file name from the file with the code it is testing. So far I am liking this approach. The idea is you don't have to spend time and energy keeping the directory structure in sync between your code and your test code. So if you change the name of the directory the code is in, you don't then also need to go and find and change the directory name for the test code. This also causes you to spend less time looking for the test code that goes with some code as it is right there next to it. This even makes it less of a hassle to create the file with the test code to begin with because you don't have to first find the directory with the tests, possibly create a new directory to match the one you are creating tests for, and then create the test file. You just create the test file right there.
One huge advantage of this is that other employees (not you, of course, because you would never do this) will be less likely to avoid writing test code in the first place because it is just too much work. Even as they add methods to existing classes, they will be more likely to add tests to the existing test code, because of the low friction of finding it.
One disadvantage is this makes it harder to release your production code without the tests accompanying it. Although if you use strict naming conventions it still might be possible. For example, I have been using ClassName.php, ClassNameUnitTest.php, and ClassNameIntegrationTest.php. When I want to run all the unit tests, there is a suite that looks for files ending in UnitTest.php. The integration test suite works similarly. If I wanted to, I could use a similar technique to prevent the tests from getting released to production.
Another disadvantage of this approach is that when you are just looking for actual code, not test code, it takes a little more effort to differentiate between the two. But I feel this is actually a good thing, as it forces us to feel the pain of the reality that test code is code too: it adds its own maintenance costs and is just as vital a part of the code as anything else, not just something off to the side somewhere.
One test class per class:
This is far from experimental for most programmers, but it is for me. I am experimenting with having only one test class per class being tested. In the past I had an entire directory for each class being tested, with several test classes inside that directory. Each test class set up the class being tested in a certain way and then had a bunch of methods, each making a different assertion. But then I started noticing that certain conditions I would get these objects into had things in common with conditions set up in other test classes. The duplication became too much to handle, so I started creating abstractions to remove it. The test code became very difficult to understand and maintain. I realized this, but I couldn't see an alternative that made sense to me. Just having one test class per class seemed like it would not be able to cover nearly enough situations without becoming overwhelming, with all that test code inside one test class. Now I have a different perspective on it. Even if I was right, this is a huge dampener on other programmers, and myself, wanting to write and maintain the tests. Now I am experimenting with forcing myself to have one test class per class being tested. If I run into too many things to test in that one test class, I am experimenting with seeing this as an indication that the class being tested is doing too much and should be broken up into multiple classes. For removing duplication, I try to stick to simpler abstractions as much as possible, ones that allow everything to live in one readable test class.
UPDATE
I am still using and liking this approach, but I have found a very good technique for reducing the amount of test code and the amount of duplication: write reusable assertion methods inside the test class itself and have the test methods in that class lean on them heavily. It helps me come up with the right kinds of assertion methods if I think of them as internal DSLs (something Uncle Bob promotes; well, actually he promotes making real internal DSLs). Sometimes you can take this DSL concept even further by accepting a string parameter whose value describes what kind of check you are trying to perform. For example, one time I made a reusable assertion method that accepted $left, $comparesAs, and $right parameters. This made the tests very short and readable, as the code read something like $this->assertCmp('a', '<', 'b').
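A rough version of that assertCmp() helper, with the comparison semantics guessed for the sketch:

<?php
class StringOrderingTest extends PHPUnit_Framework_TestCase
{
    // Tiny internal "DSL": one reusable assertion that the test methods lean on.
    private function assertCmp($left, $comparesAs, $right)
    {
        $cmp = strcmp($left, $right);
        switch ($comparesAs) {
            case '<':
                $this->assertTrue($cmp < 0, "Expected '$left' < '$right'");
                break;
            case '>':
                $this->assertTrue($cmp > 0, "Expected '$left' > '$right'");
                break;
            case '==':
                $this->assertSame(0, $cmp, "Expected '$left' == '$right'");
                break;
            default:
                $this->fail("Unknown comparison '$comparesAs'");
        }
    }

    public function testOrdering()
    {
        // The tests end up reading almost like sentences.
        $this->assertCmp('a', '<', 'b');
        $this->assertCmp('b', '>', 'a');
        $this->assertCmp('a', '==', 'a');
    }
}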
Honestly, I can't emphasize that point enough; it is the entire foundation of making writing tests something that is sustainable (something you and the other programmers want to keep doing). It makes it possible for the value that tests add to outweigh what they take away. The point is not that you need to use that exact technique; the point is that you need some kind of reusable abstraction that allows you to write short and readable tests. It might seem like I'm getting off topic from the question, but I'm really not. If you don't do this, you will eventually fall into the trap of needing to create multiple test classes per class being tested, and things really break down from there.
Is there a way to run an in-memory database such as HSQLDB from PHP?
I need it for unit testing. In Java there's no problem with this, but unfortunately there is in PHP. So, is there a way?
You should be able to use this Perl module to access HSQLDB via PHP
https://metacpan.org/pod/DBD::JDBC
Ideally, for unit testing you should mock your data access in order to test your components in isolation, so you would use mocks and stubs to remove the dependency on the database in your tests.
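For example, a small hand-written in-memory stub can stand in for the database layer entirely; the interface and class names here are invented for the sketch:

<?php
// Hypothetical port that the production code depends on.
interface UserRepository
{
    public function findNameByEmail($email);
}

// In-memory stand-in used only by the tests -- no database required.
class InMemoryUserRepository implements UserRepository
{
    private $users = array();

    public function add($email, $name)
    {
        $this->users[$email] = $name;
    }

    public function findNameByEmail($email)
    {
        return isset($this->users[$email]) ? $this->users[$email] : null;
    }
}

class GreeterTest extends PHPUnit_Framework_TestCase
{
    public function testGreetsAKnownUser()
    {
        $repository = new InMemoryUserRepository();
        $repository->add('ada@example.com', 'Ada');

        $greeter = new Greeter($repository); // hypothetical class under test
        $this->assertSame('Hello Ada', $greeter->greet('ada@example.com'));
    }
}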
I'm curious to see how other developers go about testing their web sites. PHP specifically in my case, but this probably spans multiple languages. I've been working on a site for over a year now, and I'd really like to automate a lot of the regression testing I do between versions.
This specific site is in CodeIgniter, so I have some tests for my models. I'd like to move beyond just testing those though. However, this is an issue even non-MVC developers have had to tackle I'm sure.
Edit: I think the functionality that would satisfy a lot of my test desires is the ability to assert that parameters have a specific value at the end of the script processing. In my case a lot of logic is in the controller, and that's the main area I'd like to test.
For actual unit testing, without testing the UI, you should just test the functions in the model. Most of your functionality should be in there anyway.
You might want to have a look at Selenium for testing the UI of your site. It can record your actions and play them back, or you can edit the scripting directly.
Have you tried FitNesse?
It helps with creating acceptance tests, which are especially useful for websites, where this kind of testing is otherwise a pain.
There are a couple of videos from Uncle Bob on the website too. The good thing is that FitNesse is not restricted to website testing, so what you learn using it can be applied to other apps as well.
The project I'm working on is a desktop app written in C++ that uses FitNesse tests.
But if you meant unit testing the models (which I think you didn't), those tests can be created using the PHPUnit library. I think the Zend Framework has a similar library for that.
You might want to check out PHPUnit
http://www.phpunit.de/manual/current/en/
I have started using it on my PHP projects and it's very easy to work with and very powerful. In particular, learn and use mocks:
http://www.phpunit.de/manual/3.0/en/mock-objects.html
Mocking is especially important when unit testing applications that do database operations.
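A hedged sketch of that, using the getMock() API from the PHPUnit versions the links above refer to (newer releases call it createMock()); the OrderGateway interface and DiscountService class are invented for the example:

<?php
// Hypothetical data-access interface the service depends on.
interface OrderGateway
{
    public function countOrders($customerId);
}

class DiscountServiceTest extends PHPUnit_Framework_TestCase
{
    public function testRepeatCustomersGetADiscount()
    {
        // The mock replaces the real database-backed gateway.
        $gateway = $this->getMock('OrderGateway');
        $gateway->expects($this->once())
                ->method('countOrders')
                ->with(42)
                ->will($this->returnValue(7));

        $service = new DiscountService($gateway); // hypothetical class under test
        $this->assertTrue($service->qualifiesForDiscount(42));
    }
}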
Take a look at TOAST. It's built specially for CodeIgniter. It uses the CI infrastructure, so you can run all tests via a browser, and the results are displayed as a web page (HTML). It's very simple to use.
I suggest you test your controllers as well. Testing the model is fine, but the model is just the DB storage layer. The controllers contain all the "business logic" and are the place where most things go wrong.
One of the best ideas I've heard of, as far as testing web apps go, was to create a script that would go over all the pages in the site and check them for differences from the previous scan, letting you accept changes and fix regressions.
Generally speaking, automatic testing of GUI applications (websites are GUI apps) is difficult and usually unnecessary. Unit tests work best with simple libraries.
I use Canoo WebTest. It is the best free web site unit test framework out there. It is entirely scriptable with XML and requires no browser so it can run from a build server.
We modified Watir (Ruby). It plays back "scripts" of URLs and form filling to IE, and we added a script "command" to take a screen capture; the captured image is compared against a known-good image (i.e. a master image), and if the images differ, this is logged (basically a web page of such results is prepared) and a human reviews the master/test image pair. Obviously there are two possible outcomes at that point: "the difference is intentional" or "there is an incorrect change". In the first case the master image is replaced with the new image; in the second we go and fix the bug, and the change is included in the next test run.