Zend Unit Test with database - php

I want to write some unit tests for my project (I am new to testing), but the tutorials I find online only seem to cover the simplest cases.
What I want to test is that sending a POST request to addAction in my SurveyController results in the corresponding rows being added to my survey and question tables (one-to-many).
What are the best practices for testing database-related code? Should I create a separate database for my test environment and run the tests against it? Is that the only (and correct) option?

It depends on your circumstances.
Here is my take on this:
The idea is to test but also be DRY (Don't repeat yourself). Your tests must cover all or as many different cases as possible to ensure that your component is thoroughly tested and ready to be released.
If you use an already developed framework to access your database, such as Doctrine, Zend Framework, or PhalconPHP, then you can assume that the framework has been tested and skip testing the actual CRUD operations. You can concentrate on what your own code does.
Some people might want to test even that, but in my view it is overkill and a waste of resources. One could actually include the tests of the particular framework alongside their own, if they just wanted more tests :)
If, however, you are responsible for the database layer classes and their interaction with your application, then tests are a must. You might not run them every time, but you need them to prove whether a database operation works through a given piece of code.
Finally, you can use mocks, as Mark Baker suggested, and assume that the database will respond as you expect it to (since it has already been tested). You can then see how your application reacts to different responses or results.
Mocking database operations will actually make your tests run faster (along with the other benefits that come with this strategy) since there won't be any database interactions in the tests themselves. This can become really handy in a project with hundreds if not thousands of tests and continuous integration.
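To make the mocking point concrete, here is a minimal sketch using PHPUnit's mocking API (modern namespaced PHPUnit; older versions use PHPUnit_Framework_TestCase). SurveyGateway and SurveyService are hypothetical names standing in for your own database layer and the code that uses it, not classes from the question:

```php
<?php
use PHPUnit\Framework\TestCase;

// Hypothetical gateway interface: the only thing that touches the DB.
interface SurveyGateway
{
    public function insertSurvey(array $data): int;
}

// Hypothetical service containing the logic we actually want to test.
class SurveyService
{
    private $gateway;

    public function __construct(SurveyGateway $gateway)
    {
        $this->gateway = $gateway;
    }

    public function create(array $data): int
    {
        if (empty($data['title'])) {
            throw new InvalidArgumentException('A survey needs a title');
        }
        return $this->gateway->insertSurvey($data);
    }
}

class SurveyServiceTest extends TestCase
{
    public function testCreateDelegatesToGateway()
    {
        // The mock stands in for the database layer, so no real
        // connection is needed and the test stays fast.
        $gateway = $this->createMock(SurveyGateway::class);
        $gateway->expects($this->once())
                ->method('insertSurvey')
                ->with(['title' => 'Customer feedback'])
                ->willReturn(42);

        $service = new SurveyService($gateway);
        $this->assertSame(42, $service->create(['title' => 'Customer feedback']));
    }
}
```

The test verifies your own logic (validation, delegation) while assuming the already-tested framework handles the actual INSERT.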
HTH

Related

Refactoring Code To Psr standard and making the code testable in Laravel 4

When I started making a mobile app (that uses Laravel on the server) I decided not to dig into testing or coding standards, as I thought it was better to get a working app first.
A few months later I have a working app, but my code doesn't follow any standards, nor have I written any tests.
The directory structure I'm currently using is:
app/controllers : contains all the controllers the app uses. The controllers aren't exactly thin: they contain most of the logic, and some of them even have multiple conditional statements (if/else). All database interactions occur in the controllers.
app/models : define relationships with other models and contain functions specific to the particular model, e.g. a validate function.
app/libraries : contain custom helper functions.
app/database : contains the migrations and seeds.
My app is currently working, and the reason for the slack is probably that I work alone on the app.
My concerns:
Should I go ahead and release the app and then see whether it's even worth making the effort to refactor, or should I refactor first?
I do wish to refactor the code, but I'm unsure what approach to take. Should I first get the standards right and then make my code testable? Or should I not worry about standards (and continue using classmap to autoload) and just try to make my code testable?
How should I structure my files?
Where should I place interfaces, abstract classes, etc.?
Note: I am digging into testing and coding standards using whatever resources I can find, but if you could point me to some resources I would appreciate it.
Oh dear. The classic definition of legacy code is code with no unit tests. So now you are in that unfortunate position where, if you are ever going to amend/enhance/reuse this code base, you are going to have to absorb a significant cost. The further away from your current implementation you get, the higher that cost is going to get. That's called technical debt. Having testable code doesn't get rid of it, though; it simply reduces the interest rate.
Should you do it? Well, if you release now, it achieves some popularity, and you therefore need to do more to it...
If it's going to be received with total indifference or screams of dismay, then there's no point in releasing it at all, in fact that would be counter-productive.
Should you choose to carry on with the code base, I can't see any efficient way to refactor to a coding standard without a unit test to prove you haven't broken the app in doing so. If you find a coding standard where testable and tested aren't a key part of it, ignore it; it's drivel.
Of course, you still have the problem of how to change code to make it testable without breaking it. I suspect you can iterate towards that with, say, delegation in your fat controllers without too much difficulty.
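As a hypothetical illustration of that delegation step (OrderTotals is an invented example, not code from the question): logic pulled out of a fat controller into a plain class becomes unit-testable without booting the framework at all.

```php
<?php
use PHPUnit\Framework\TestCase;

// Plain class extracted from a hypothetical fat controller action.
// It has no framework or database dependencies, so it tests instantly.
class OrderTotals
{
    public function grandTotal(array $lines): float
    {
        $total = 0.0;
        foreach ($lines as $line) {
            $total += $line['price'] * $line['qty'];
        }
        return $total;
    }
}

class OrderTotalsTest extends TestCase
{
    public function testGrandTotalSumsLineItems()
    {
        $totals = new OrderTotals();
        $lines = [
            ['price' => 10.0, 'qty' => 2],
            ['price' => 5.5,  'qty' => 1],
        ];
        $this->assertSame(25.5, $totals->grandTotal($lines));
    }
}
```

The controller then simply instantiates OrderTotals and passes it the data it loaded, keeping the untestable glue as thin as possible.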
Two key points.
Have at least some test first, even if it's a wholly manual one (say, a printout of the web page), before you do anything. A set of integration tests, perhaps using an automation suite, would be very useful, especially if you already have a coupling problem.
The other is, don't make too many changes at once and by too many I mean how many operations in the app will be affected by this code.
Michael Feathers' (no affiliation whatsoever) Working Effectively with Legacy Code would be a good purchase if you want to learn more.
Hopefully this lesson has already been learnt: don't do this again.
Last but not least, doing this sort of exercise is a great learning experience; rest assured you will run into situations (lots of them) where this perennial mistake has been made.

Behat/Mink strategy for testing with contexts

I have a question on how to implement behat/mink functional tests.
In my web app, I have users that can access some data sheets if they have the required credentials (i.e. no access / read-only / write).
I want to be able to test all the possible contexts via behat/mink.
The question is what is the best practice for such testing ?
A dev told me that I should create a scenario for each type of user I would like to use. Then I would have to reuse the user created there in other tests.
But I am not very comfortable with this idea: I believe it introduces coupling between my tests. If the test that creates the user fails, then the test that checks access to my datasheet for that specific user will also fail.
So I thought I could use some fixtures: before testing my app, I run a script that inserts all the profiles I need. I would have some tests dedicated to creating users, and I would use the fixtures to check whether a specific user is allowed to access a specific datasheet.
The downside of this solution is that I will have to maintain the fixture set.
Do you have any suggestion / idea ?
Hi user3333860 (what a username XD),
I'm not an expert in testing, and these days I'm more into ruby/rspec, but I personally think that both solutions are good and widely used.
Use a feature to create your User:
If your test for creating a user fails, it probably means your user creation code is broken too, so the fact that other tests then fail doesn't seem like a drawback to me. But I do understand that you don't want coupling between your tests.
The main questions are: are your tests run in a fixed order or randomly (e.g. rspec doesn't always launch tests in the same order), and are you ready to have the same test run multiple times in different features so that your other tests can complete successfully?
Use fixtures:
Well, it is also a good and popular solution (in fact, the one I use), and you have already pinpointed the fact that you will have to maintain them.
In the end, I'd take the fixtures path ONLY with a tool like FactoryGirl, which helps you maintain your object templates (here is the PHP version of it):
https://github.com/breerly/factory-girl-php
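As a sketch of the fixtures approach, a Behat 3 context can seed the users it needs before every scenario, so no test depends on a user-creation scenario having run first. The PDO DSN, table layout, and role names below are assumptions, not taken from the question:

```php
<?php
use Behat\Behat\Context\Context;

// Hypothetical context that resets and seeds the users table before
// each scenario, removing coupling between scenarios.
class UserFixtureContext implements Context
{
    private $db;

    public function __construct()
    {
        // Placeholder test-database credentials.
        $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
    }

    /** @BeforeScenario */
    public function seedUsers()
    {
        $this->db->exec('TRUNCATE TABLE users');
        $insert = $this->db->prepare(
            'INSERT INTO users (login, role) VALUES (?, ?)'
        );
        // One user per credential level the scenarios need.
        foreach ([['reader', 'read_only'], ['editor', 'write']] as $row) {
            $insert->execute($row);
        }
    }
}
```

Each scenario can then log in as "reader" or "editor" and assert on datasheet access, independently of any other scenario.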
Hope I helped with your dilemma.

Sharing unit testing fixtures

I'm writing unit tests for a project (written in PHP, using PHPUnit) that has its entire environment (loaded components, events, configuration, cache, per-environment singletons, etc.) held in an object which all the components use to interact with each other (a mediator pattern).
To make the unit tests run faster, I'm sharing the environment object and some other objects (for example, in my test case for the view object [as in the V of MVC], the view manager object [which acts as a factory for view objects and is responsible for the actual rendering]) among tests in the same test case (using PHPUnit's setUpBeforeClass() and static properties).
Even though, to the best of my knowledge, the objects I share shouldn't affect the integrity of the tests (in the views case, for example, the environment and view manager objects are shared, but a separate view object is created for every test, and that is the object actually being tested by the test case), it just feels increasingly wrong to me.
I would prefer it if each test used a completely isolated environment and couldn't affect other tests in the same test case in any way. However, that would make the tests run much slower, and it feels like a big price to pay for something whose downside I can't really pinpoint and that mainly just "feels wrong".
What do you think? Can you pinpoint any downsides, so I can convince myself it's worth the longer execution time? Or am I just overreacting and it's completely fine?
I share your feelings, so let me just state my goals and the solution I chose when I faced this issue:
Devs should have a test suite that runs very, very fast.
At the least, single test cases should execute in under a second.
I really want to be sure I don't have interdependencies between my test cases.
I'm going to assume you have a continuous integration server running. If not, a cronjob might do, but consider setting up Jenkins; it's really easy.
For normal usage:
Just share as many fixtures as you need to get the speed you need. It might not be pretty, and there might be better solutions along the way, but if you have something that is expensive to create, just do it once.
I'd suggest helper methods like getFoo() { if (!self::$foo) { /* create */ } return self::$foo; } over setUpBeforeClass(), because they can make sharing easier, but mainly because of the following point.
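Spelled out, that lazily shared helper might look like this (ExpensiveEnvironment is a stand-in for whatever object is costly to build):

```php
<?php
use PHPUnit\Framework\TestCase;

// Stand-in for the expensive shared object; the real one would load
// components, configuration, caches, etc.
class ExpensiveEnvironment {}

class ViewTest extends TestCase
{
    private static $environment;

    // The first test that asks for the environment builds it; every
    // later test in this case reuses the same instance. Under
    // --process-isolation the static resets, so it is rebuilt per test.
    private static function getEnvironment()
    {
        if (self::$environment === null) {
            self::$environment = new ExpensiveEnvironment();
        }
        return self::$environment;
    }

    public function testRendering()
    {
        $env = self::getEnvironment(); // shared across tests in this case
        // ...create a fresh view object per test and assert on it...
        $this->assertNotNull($env);
    }
}
```

Unlike setUpBeforeClass(), the helper only pays the construction cost if a test actually needs the object, and several test cases can share the same helper via a common base class.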
Once a night:
Run your test suite with --process-isolation and in that bootstrap recreate your complete database and everything.
It might run for 6 hours (disable code coverage for that!), but who cares? Your fixtures will be recreated for every single test case, since each runs in a new PHP process and the static variables don't carry over.
This way, once a day, you can be sure you haven't created interdependent tests. That's soon enough to remember what you did (and you can run with --filter and --process-isolation if you need to fix something).
Like writing "normal" code, when you write test cases, it's fine to rely on knowledge of how fixture objects work.
If a given factory method is documented as generating new instances every time, then I see only downside in creating the factory method anew each time, especially if the factory creation is itself expensive.
It helps to keep in mind a key goal around writing unit tests. You want to know within 5-10 minutes whether you broke the build. That way you can go out to lunch, go to a meeting, go home, etc. after you get the "all clear". If you know that some part of the fixture is reusable without creating interactions, then you should use that knowledge to make your tests even more comprehensive within that 5-10 minute window. I understand the purist impulse here, but it buys you nothing in terms of test independence, and it unnecessarily limits what your test suite will accomplish for you.
Perhaps this is blasphemous, but if you can still manage a significant threshold of code coverage, and you can also guarantee that no state pollution exists (and that is your real issue: are you making sure that each test will not be affected by the data left over from the test before?), I see no problem with leaving things the way they are.
Let the tests run quickly, and when a bug is found (which is inevitable in integration testing) then you have reason to invest the time in localizing one particular test or set of tests. If you have a toolkit which works for the general case, however, I would leave it as is.

How to unit-test an enterprise symfony project?

I'm working on a huge project at work. We have about 200 database tables and, accordingly, a huge number of models, actions, and so on.
How should I begin to write tests for this?
My biggest problem is this: with symfony you can test with the lime framework, which is great. But most of the code writes, deletes, or does other things with the database. So how can I test a model without interacting with the database?
I want to do unit tests because I've had problems with bugs in my code when refactoring functions, but I don't even know how to start. The examples in the documentation cover only very small functions. What does it look like when the action file is over 700 lines of code?
Great question. I've personally run into this all over the place.
Here's what I've found so far:
1) Get a dev database.. DO NOT test on a prod database!
2) It may sound trite, but start small and simple.
3) I don't know what your field is (e-commerce database, contacts database, etc.) but say it is an e-commerce DB. Start by testing creating some order models and saving them. Maybe recreate a real order in a test harness so it saves to the DB. Now you can quickly create 1000 orders to run tests on, which is way faster than manually doing web checkouts. For maximum benefit, create a model of something you are currently working on so you can use it during your testing.
4) Now start testing the various methods that your model provides. Again, stick to the ones that are relevant to what you are currently trying to fix/work with. Don't worry about testing everything, just test some stuff, and make sure you can repeat your tests.
5) Need to test controllers? Cool, now you have a model to work with that you don't care about messing up, because it isn't real... Need some variations? Create more test suites that build different models to fit each of your needs. Models can be complex, but you should be able to write some test functions that create variations of your various models. Then run your controllers against those...
6) Keep plucking away at code coverage.
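A sketch of step 3, with an invented Order stub rather than a real symfony/Propel model; the point is a helper that can mass-produce saved test orders:

```php
<?php
use PHPUnit\Framework\TestCase;

// Stand-in for a real symfony 1.x model; in practice the model's
// save() would persist to the dedicated dev database.
class Order
{
    private $amount;
    public function setAmount(float $amount) { $this->amount = $amount; }
    public function getAmount(): float { return $this->amount; }
    public function save() { /* real model: write to the dev DB */ }
}

class OrderFixtureTest extends TestCase
{
    // Helper that builds and saves one test order; far faster than a
    // manual web checkout when you need data to test against.
    private function createOrder(float $amount): Order
    {
        $order = new Order();
        $order->setAmount($amount);
        $order->save();
        return $order;
    }

    public function testBulkOrderCreation()
    {
        $orders = [];
        for ($i = 0; $i < 1000; $i++) {
            $orders[] = $this->createOrder(9.99);
        }
        $this->assertCount(1000, $orders);
        $this->assertSame(9.99, $orders[0]->getAmount());
    }
}
```

Once such a harness exists, the method tests in step 4 can reuse createOrder() instead of depending on hand-made data.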
WARNING: Be careful about being the only one running the unit tests.. You will quickly become the most effective problem solver, but people will then try to get YOU to fix everything...
700 lines in a single controller action? Testing has a way of exposing issues in your own code in other ways than the obvious assertions. If something is difficult to test for whatever reason, there is something wrong with the code.
When starting to test a project, big or small, the focus should be code coverage. Don't worry so much about edge cases at first (unless the situation calls for it). When starting to test a project I like to start with the models as they are the most straightforward. From there move on to controller testing and see how it goes.

How do you set up database testing using the PHP SimpleTest framework

I am using SimpleTest, a PHP-based unit testing framework. I am testing new code that will handle storing and retrieving website comments from a database. I am at a loss for how to structure the project to test the database access code.
I am looking for any suggestions as to best practices for testing db code in a PHP application. Examples are really great. Sites for further reading are great.
Thank you kindly. :)
This is an old question but I thought I'd add some specific experience we've had with this.
Other posters are technically correct that this is a form of integration test but from where I sit there is often too much logic in MySQL to be stubbed out in unit testing. If you are like us and have large, complex services that are heavily dependent on MySQL (and often several tables per service), having a robust testing framework that includes testing query logic is really handy. We mock out a good number of our dependencies in our unit tests but not MySQL.
We have a set of classes that wrap simpletest to provide this functionality. It works something like this:
Instructions to create each database table are stored in a file at tests/etc/schemas/table.sql. It contains the schema data as well as inserts for all the canned data the test will expect to find.
Each test that requires the database extends a Test_DbCase class which provides functionality to build the tables.
A bootstrap class takes care of creating and dropping the database on construct and destruct.
At runtime, the test calls loadTables('foo', 'bar') in the setUp method to execute the sql commands in foo.sql and bar.sql.
Tests are run against the canned data; the rest is obvious.
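A sketch of such a Test_DbCase base class, assuming SimpleTest is already loaded and using PDO for the connection (the class and method names follow the answer; the connection details are placeholders):

```php
<?php
// Base class for tests that need real tables; extends SimpleTest's
// UnitTestCase (assumes simpletest's autorun.php has been included).
abstract class Test_DbCase extends UnitTestCase
{
    protected $db;

    function __construct()
    {
        parent::__construct();
        // Placeholder credentials for the dedicated test database.
        $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
    }

    // loadTables('foo', 'bar') executes tests/etc/schemas/foo.sql and
    // bar.sql, each containing CREATE TABLE plus the canned INSERTs.
    protected function loadTables(/* ...$tables */)
    {
        foreach (func_get_args() as $table) {
            $sql = file_get_contents(__DIR__ . "/etc/schemas/$table.sql");
            $this->db->exec($sql);
        }
    }
}

// A concrete test then just declares the tables it needs:
class CommentServiceTest extends Test_DbCase
{
    function setUp()
    {
        $this->loadTables('comments', 'users');
    }

    function testCannedDataIsPresent()
    {
        $n = $this->db->query('SELECT COUNT(*) FROM comments')->fetchColumn();
        $this->assertTrue($n > 0);
    }
}
```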
One other tool we have is a bash script that makes it easier to create the table.sql files. This is really handy because otherwise we'd be writing the SQL by hand - you can take an existing set of tables, set up all your data in MySQL, and then export it to create the test files basically.
This works really well for us, though we ended up having to roll a lot of it ourselves.
I had a local database dedicated to unit testing with a known name and database username/password. The unit tests were hard-coded to that location but different developers could override those variables if they wanted.
Then before each test you TRUNCATE each table. This is much faster than dropping/creating tables or the database itself.
Note: Do not truncate after the tests! That way if a test fails you have the current state of the database which often helps diagnose the problem.
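A minimal SimpleTest sketch of that approach, with placeholder connection details and table names:

```php
<?php
// Truncate-before-each-test: reset state in setUp, never in tearDown,
// so a failing test leaves the database behind as evidence.
class CommentDbTest extends UnitTestCase
{
    protected $db;

    function setUp()
    {
        // Placeholder test-database credentials; devs can override them.
        $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
        // TRUNCATE is much faster than dropping and recreating tables.
        foreach (['comments', 'users'] as $table) {
            $this->db->exec("TRUNCATE TABLE $table");
        }
    }

    function testStoreComment()
    {
        $this->db->exec("INSERT INTO comments (body) VALUES ('hello')");
        $count = $this->db->query('SELECT COUNT(*) FROM comments')->fetchColumn();
        $this->assertEqual($count, 1); // SimpleTest's assertion API
    }
}
```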
You might want to let PHP create and supply data to a temporary table/database and test on that. Then you don't have to reset your database manually. Most frameworks have database manipulation libraries to make this easier. It might take time up front, but it will let you test much faster when you make changes later.
Testing against a database usually indicates bad tests, probably due to lack of encapsulation in the code under test. You should try to isolate the code that interacts with the database from the rest of your code as much as possible, keeping this interaction layer so simple that you can get away with a few, very basic tests.
In other words; The code that deals with comments, shouldn't be the same code that deals with database interaction. You could - for example - write a generic table module, that your comment model uses to access the database. You would still have to test the table module, but that should be done in isolation from the comment code.
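One hypothetical shape for that separation: a generic table gateway that owns all the SQL, and a comment model that holds the domain rules and can be tested against a fake gateway, with no database at all. All names here are illustrative:

```php
<?php
// Thin, generic table module: the only class that speaks SQL.
// It is simple enough to need only a few basic tests of its own.
class TableGateway
{
    private $db;
    private $table;

    public function __construct(PDO $db, $table)
    {
        $this->db = $db;
        $this->table = $table;
    }

    public function insert(array $row)
    {
        $cols  = implode(', ', array_keys($row));
        $marks = implode(', ', array_fill(0, count($row), '?'));
        $stmt  = $this->db->prepare(
            "INSERT INTO {$this->table} ($cols) VALUES ($marks)"
        );
        $stmt->execute(array_values($row));
    }
}

// The comment model holds the domain rules and never touches SQL,
// so it can be tested in isolation with a stubbed gateway.
class CommentModel
{
    private $table;

    public function __construct(TableGateway $table)
    {
        $this->table = $table;
    }

    public function add($author, $body)
    {
        if (trim($body) === '') {
            throw new InvalidArgumentException('Empty comment');
        }
        $this->table->insert(['author' => $author, 'body' => $body]);
    }
}
```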
When testing database code, it's good to always have the same database as the starting point, especially if you do unit testing (which I assume is the case here). One way is to truncate all tables, as Jason suggested, but I prefer to have some starting data in it. You know, you always need some 'default' data that is present in each database.
Also, some tests only make sense with full database. So, create a special instance of database for those tests. I have about 3 or 4 different databases that I plug-in (just copy the files in) before running some tests. Having the same starting point each time ensures repeatability.
So, just prepare a few database states that are good 'starting points' and back them up. Before running each set of tests, restore appropriate database, and then run it.
I'd encourage you to not try to test the database access code using SimpleTest.
Instead, create a functional test for your app using, for example, Selenium: record a test case when you start from a known state of a database; then add a comment and check (using Selenium's asserts) that the content indeed is there.
This way it is:
- easier to set up and maintain
- you verify not just the DB layer, but the presentation layer, too
That said, if you have stored procedures in your DB, do use SimpleTest - I've done it myself successfully. Basically, create SimpleTests that start from a known DB state, then perform a few INSERTS/UPDATES, then run the stored proc and make sure the state of the DB is what you'd expect.
If you really want to test against a database, I would recommend importing data/creating tables before each test. That way, your database starts from a known state for each test. Since this is rather performance-expensive, you can start a transaction (provided your RDBMS supports it) in setUp and roll back in tearDown. MySQL (which is likely the RDBMS you're using) doesn't support nested transactions, so if the code under test uses transactions, you can run into trouble. You can get around this using savepoints: set a savepoint before the test, and roll back to the savepoint after it.
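A sketch of the transaction-per-test idea in SimpleTest, with placeholder connection details; the savepoint variant for code that opens its own transactions is shown in the comments:

```php
<?php
// Wrap each test in a transaction and roll it back afterwards, so the
// database returns to its known starting state without re-importing.
class CommentTransactionTest extends UnitTestCase
{
    protected $db;

    function setUp()
    {
        // Placeholder test-database credentials.
        $this->db = new PDO('mysql:host=localhost;dbname=app_test', 'test', 'test');
        $this->db->beginTransaction();
        // Savepoint variant (if the code under test uses transactions):
        //   $this->db->exec('SAVEPOINT test_start');
    }

    function tearDown()
    {
        $this->db->rollBack();
        // Savepoint variant:
        //   $this->db->exec('ROLLBACK TO SAVEPOINT test_start');
    }

    function testInsertIsVisibleInsideTransaction()
    {
        $this->db->exec("INSERT INTO comments (body) VALUES ('temp')");
        $n = $this->db->query('SELECT COUNT(*) FROM comments')->fetchColumn();
        $this->assertEqual($n, 1); // rolled back again in tearDown
    }
}
```

Note that MySQL DDL statements (CREATE/DROP TABLE) cause an implicit commit, so the code under test must stick to DML for the rollback to work.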
I'll still maintain that if you need much of this, you should consider the possibility that your tests are trying to tell you something...
I think you should use an ORM and write a few integration tests for it. If the integration tests show that it works under the actual environment, then you only have to test it again when you change your environment (database, PHP version, platform, etc.). After that you can mock the ORM object, and you won't need to connect to the database.
I think this is the best way, but if you don't want to use an ORM, you can create a test database and mock the database connection (PDO) object. In that case you can create and drop test tables in the setUp and tearDown sections of your test cases. It's important that these are integration tests, not unit tests, so you don't need to run them every time, only when something changes between the PHP code and the SQL server. After you have tested your data access objects with your integration tests, you can mock them in your unit tests.