Code coverage with PHPUnit; can't reach one line - PHP

In the Xdebug code coverage report, the line "return false;" (below "!$r") shows as not covered by my tests. But the $sql is basically hard-coded. How do I get coverage on that? Do I override "$table" somehow? Or kill the database server for this part of the test?
I guess this is probably telling me I'm not writing my model very well, right? Because I can't test it well. How can I write this better?
Since this one line is not covered, the whole method is not covered and the reports are off.
I'm fairly new to PHPUnit. Thanks.
public function retrieve_all()
{
    $table = $this->tablename();
    $sql = "SELECT t.* FROM `{$table}` as t";
    $r = dbq($sql, 'read');
    if (!$r)
    {
        return false;
    }
    $ret = array();
    while ($rs = mysql_fetch_array($r, MYSQL_ASSOC))
    {
        $ret[] = $rs;
    }
    return $ret;
}

In theory, you should better separate the model from all the database-related code.
For example, in Zend Framework, the quickstart guide advises you to have:
your model classes
your data mappers, whose role is to "translate" the model into the database model
your DAOs (or table data gateways), which do the direct access to the tables
This is a really interesting pattern; you should have a look at it. It lets you truly separate the model from the data, and thus run tests only on the model part (without caring about any database problem/question).
But with your code as it is, I suggest you write a test in which the dbq() function returns false (maybe by making the DB connection fail "on purpose"), in order to get full code coverage.
I often run into this kind of situation, where testing all the "error cases" takes too much time, so I give up on having 100% code coverage.

I guess the function dbq() performs a database query. Just unplug your database connection and re-run the test.
The reason you have trouble testing is that you are using a global resource: your database connection. You would normally avoid this problem by providing the connection object to the method (or class) under test.
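For instance, if the model accepted the query function as a dependency, a PHPUnit test could force the failure path. The Model and ModelTest classes below are only a sketch of that idea, not your actual code:
// Sketch only: the model takes the query runner as an optional constructor
// argument, defaulting to the global dbq() so production code is unchanged.
class Model
{
    private $query;

    public function __construct($query = null)
    {
        $this->query = $query ? $query : 'dbq';
    }

    public function tablename()
    {
        return 'items'; // placeholder table name
    }

    public function retrieve_all()
    {
        $table = $this->tablename();
        $sql = "SELECT t.* FROM `{$table}` as t";
        $r = call_user_func($this->query, $sql, 'read');
        if (!$r) {
            return false; // this branch is now reachable from a test
        }
        $ret = array();
        while ($rs = mysql_fetch_array($r, MYSQL_ASSOC)) {
            $ret[] = $rs;
        }
        return $ret;
    }
}

class ModelTest extends PHPUnit_Framework_TestCase
{
    public function testRetrieveAllReturnsFalseWhenQueryFails()
    {
        // Stub query runner that simulates a failed read.
        $model = new Model(function ($sql, $mode) { return false; });
        $this->assertFalse($model->retrieve_all());
    }
}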

Related

PHP OOP Efficient DB Read Reduction

Six years ago I started a new PHP OOP project without having any experience, so I just made it up as I went along. Anyway, I noticed that my rather powerful MySQL server sometimes gets bogged down far too easily, and I was wondering about the best way to limit some DB activity when I came up with this, as an example...
private $q_enghours;

public function gEngHours() {
    if ( isset($this->q_enghours) ) {
    } else {
        $q = "SELECT q_eh FROM " . quQUOTES . " WHERE id = " . $this->id;
        if ($r = $this->_dblink->query($q)) {
            $row = $r->fetch_row();
            $r->free();
            $this->q_enghours = $row[0];
        } else {
            $this->q_enghours = 0;
        }
    }
    return $this->q_enghours;
}
This seems like it should be effective in greatly reducing the necessary reads from the db: if the object property is populated, there is no need to access the db. Note that there are almost two dozen classes, all with the same db access routines for the "getter". I've only implemented this change in one place and was wondering if there is a "best practice" for this that I may have missed, before I rewrite all the classes.
I'd say this question is based on the wrong premises.
If you want to deal with an "easily bogged down" database, you have to dig up the particular cause instead of just guessing. The trifling reads you are so concerned with will, in reality, make no difference. Profile your whole application and find the real cause.
If you want to reduce the number of reads, make your object map a particular database record by reading that record and populating all the properties once, at object creation. Constructors are made for this.
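A rough sketch of that idea, reusing the quQUOTES constant and q_enghours property from the question (only one column is loaded here for brevity; a real mapper would load the whole record):
class Quote
{
    private $id;
    private $q_enghours;

    public function __construct(mysqli $db, $id)
    {
        $this->id = (int) $id;
        // One read at construction time; getters never touch the database again.
        $stmt = $db->prepare("SELECT q_eh FROM " . quQUOTES . " WHERE id = ?");
        $stmt->bind_param('i', $this->id);
        $stmt->execute();
        $stmt->bind_result($eh);
        $this->q_enghours = $stmt->fetch() ? $eh : 0;
        $stmt->close();
    }

    public function gEngHours()
    {
        return $this->q_enghours;
    }
}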
As a side note, you really need a good database wrapper, just to reduce the amount of code you have to write for each database call; with one, this code could be written as
public function gEngHours() {
    if ( !isset($this->q_enghours) ) {
        $this->q_enghours = $this->db->getOne(
            "SELECT q_eh FROM ?n WHERE id = ?", quQUOTES, $this->id
        );
    }
    return $this->q_enghours;
}
where the getOne() method does all the work of running the query, fetching the row, getting the first result from it, and many other things like proper error handling and making the query safe.
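The answer doesn't name a specific wrapper, but a minimal PDO-based sketch of such a helper could look like the following (it only binds values, so the table-name constant is concatenated instead of being passed through a ?n placeholder):
class Db
{
    private $pdo;

    public function __construct(PDO $pdo)
    {
        $this->pdo = $pdo;
    }

    // Run the query and return the first column of the first row, or null.
    public function getOne($sql, array $params = array())
    {
        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($params);
        $value = $stmt->fetchColumn();
        return $value === false ? null : $value;
    }
}

// Roughly equivalent call to the example above:
// $this->q_enghours = $db->getOne(
//     "SELECT q_eh FROM " . quQUOTES . " WHERE id = ?",
//     array($this->id)
// );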

CakePHP caching association result

I'm querying big chunks of data with CakePHP's find. I use recursive 2 (sadly, I really do need that much recursion). I want to cache the results from associations, but I don't know where to return them. For example, I have a Card table and Card belongs to Artist. When I query something from Card, the find method runs on the Card table but not on the Artist table, yet I still get the Artist value for the Card's artist_id field, and I see a query in the query log like this:
SELECT `Artist`.`id`, `Artist`.`name` FROM `swords`.`artists` AS `Artist` WHERE `Artist`.`id` = 93
My question is: how can I cache this type of query?
Thanks!
1. Where does Cake "do" this?
CakePHP does this really cool but (as you have discovered yourself) sometimes expensive operation in its different DataSource::read() method implementations, for example in the Dbo datasource. As you can see, you have no direct 'hooks' (= callbacks) at the point where Cake reads the value of the $recursive option and may decide to query your associations. BUT we have before and after callbacks.
2. Where to Cache the associated Data?
Such an operation is, in my opinion, best placed in the beforeFind and afterFind callback methods of your Model classes, OR equivalently in Model.beforeFind and Model.afterFind event listeners attached to the model's event manager.
The general idea is to check your Cache in the beforeFind method. If you have some data cached, change the $recursive option to a lower value (e.g. -1, 0 or 1) and do the normal query. In the afterFind method, you merge your cached data with the newly fetched data from your database.
Note that beforeFind is only called on the Model from which you are actually fetching the data, whereas afterFind is also called on every associated Model, thus the $primary parameter.
3. An Example?
// We are in a Model.
protected $cacheKey;

public function beforeFind($query) {
    if (isset($query["recursive"]) && $query["recursive"] == 2) {
        $this->cacheKey = generate_my_unique_query_cache_key($query); // Todo
        if (Cache::read($this->cacheKey) !== false) {
            $query["recursive"] = 0; // 1, -1, ...
            return $query;
        }
    }
    return parent::beforeFind($query);
}

public function afterFind($results, $primary = false) {
    if ($primary && $this->cacheKey) {
        if (($cachedData = Cache::read($this->cacheKey)) !== false) {
            $results = array_merge($results, $cachedData);
            // Maybe use Hash::merge() instead of array_merge
            // or something completely different.
        } else {
            $data = ...;
            // Extract your data from $results here,
            // Hash::extract() is your friend!
            // But use debug($results) if you have no clue :)
            Cache::write($this->cacheKey, $data);
        }
        $this->cacheKey = null;
    }
    return parent::afterFind($results, $primary);
}
4. What else?
If you are having trouble with deep / high values of $recursive, have a look at Cake's Containable Behavior. This allows you to filter even the deepest recursions.
As another tip: sometimes such deep recursion can be a sign of a generally bad or suboptimal design (database schema, general software architecture, process and functional flow of the application, and so on). Maybe there is an easier way to achieve your desired result?
The easiest way to do this is to install the CakePHP Autocache Plugin.
I've been using this (with several custom modifications) for the last 6 months, and it works extremely well. It will not only cache the recursive data as you want, but also any other model query. It can bring the number of queries per request to zero, and still be able to invalidate its cache when the data changes. It's the holy grail of caching... ad-hoc solutions aren't anywhere near as good as this plugin.
Write the query result to the cache like this:
Cache::write("cache_name", $result);
When you want to retrieve the data from the cache, use:
$results = Cache::read("cache_name");
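Putting the two together, a model method could check the cache first and only hit the database on a miss (the key and find options below are just examples):
public function getCardsWithArtists() {
    $cards = Cache::read('cards_with_artists');
    if ($cards === false) {
        // Cache miss: run the expensive recursive find, then store it.
        $cards = $this->find('all', array('recursive' => 2));
        Cache::write('cards_with_artists', $cards);
    }
    return $cards;
}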

Escape SQL queries in PHP PostgreSQL

I have a website with lots of PHP files (really a lot...) which use the pg_query and pg_exec functions and do not escape apostrophes in the PostgreSQL queries. However, for security reasons, and to be able to store names containing apostrophes in my database, I want to add an escaping mechanism for my database input. One possible solution is to go through every PHP file and change pg_query and pg_exec to pg_query_params, but that is both time-consuming and error-prone. A good idea would be to somehow override pg_query and pg_exec with wrapper functions that do the escaping without having to change any PHP file, but in that case I guess I would have to change the PHP function definitions and recompile PHP, which is far from ideal.
So the question is open, and any ideas that would let me do what I want with minimal time spent are very welcome.
You post no code but I guess you have this:
$name = "O'Brian";
$result = pg_query($conn, "SELECT id FROM customer WHERE name='{$name}'");
... and you'd need to have this:
$name = "O'Brian";
$result = pg_query_params($conn, 'SELECT id FROM customer WHERE name=$1', array($name));
... but you think the task will consume an unreasonable amount of time.
While it's certainly complex, what alternatives do you have? You cannot override pg_query(), but it'd be extremely simple to search and replace it with my_pg_query(). And then what? Your custom function will just see strings:
SELECT id FROM customer WHERE name='O'Brian'
SELECT id FROM customer WHERE name='foo' OR '1'='1'
Even if you manage to implement a bug-free SQL parser:
It won't work reliably with invalid SQL.
It won't be able to determine whether the query is the product of intentional SQL injection.
Just take it easy and fix queries one by one. It'll take time but possibly not as much as you think. Your app will be increasingly better as you progress.
This is a perfect example of when a database layer and associated API will save you loads of time. A good solution would be to make a DB class as a singleton, which you can instantiate from anywhere in your app. A simple set of wrapper functions will allow you to make all queries to the DB go through one point, so you can then alter the way they work very easily. You can also change from one DB to another, or from one DB vendor to another without touching the rest of the app.
The problem you are having with escaping is properly solved by using the PDO interface with bound parameters instead of functions like pg_query(); parameter binding makes manual escaping unnecessary. Seeing as you'll have to alter every place in your app that uses the DB anyway, you may as well refactor to this pattern at the same time, as it'll be about the same amount of work.
class db_wrapper {
    // Singleton stuff
    private static $instance;

    private function __construct() {
        // Connect to DB and store connection somewhere
    }

    public static function get_db() {
        if (isset(self::$instance)) {
            return self::$instance;
        }
        return self::$instance = new db_wrapper();
    }

    // Public API
    public function query($sql, array $vars) {
        // Use PDO to connect to database and execute query
    }
}
// Other parts of your app look like this:
function do_something() {
    $db = db_wrapper::get_db();
    $sql = "SELECT * FROM table1 WHERE column = :name";
    $params = array('name' => 'valuename');
    $result = $db->query($sql, $params);
    // Use $result for something.
}
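One possible body for the query() method above, using PDO with named parameters (the DSN and credentials are placeholders, and a private $pdo property is assumed on db_wrapper):
public function query($sql, array $vars = array()) {
    if (!isset($this->pdo)) {
        // Lazy-connect on first use; adjust the DSN to your setup.
        $this->pdo = new PDO('pgsql:host=localhost;dbname=mydb', 'user', 'pass');
        $this->pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    }
    $stmt = $this->pdo->prepare($sql);
    $stmt->execute($vars); // values are bound, never interpolated into the SQL
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}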

Yii SQL query binding with params does not work properly

I'm new to Yii and everything seems good, but the problem appears when I'm using parameter binding, like this (DAO style):
$command = $this->conn->createCommand($sql);
$command->bindColumn("title", "test_title");
$result = $command->query();
or (Active Record):
$row = Movies::model()->find("m_id=:m_id", array(":m_id"=>27));
or
$row = Movies::model()->findByPk(24);
I've tried everything:
1) added a config param to the MySQL config in main.php: 'enableParamLogging' => true
2) changed strings from ' to "
3) added another param, just in case of MySQL version differences: 'emulatePrepare' => true
Nothing works for me.
I thought the problem was in the params, but the bindColumn method doesn't use them, so my presumption is that some module of Yii hasn't been included in the config file, or something like that.
My model looks like this (created in /models dir):
class Movies extends CActiveRecord {
    public static function model($className = __CLASS__) {
        parent::model($className);
    }
}
Just to avoid unnecessary questions: the database is configured properly in the main.php conf file, there is a movies table, and there are rows with PKs 24 and 27 as well.
All native SQL works fine, except when using the special DAO methods to bind params, or find/findByPk in AR. I hope this is clear; please don't bother me with the obvious simple things that I could (as you presume) have done wrong.
P.S. Another helpful bit of info: when calling
$command->bindColumn("title", "test_title");
the framework throws an exception: CDbCommand and its behaviors do not have a method or closure named "bindColumn". So, as mentioned above, I think Yii doesn't see those special methods. How can I fix this?
OK, and why doesn't this code work either? There are no exceptions, just a blank page.
$sql = "SELECT title, year_made FROM movies WHERE year_made=':ym'";
$command = $this->conn->createCommand($sql);
$command->bindParam(":ym", "2012", PDO::PARAM_STR);
$result = $command->query();
The DAO binding isn't working because there is no bindColumn. You only have bindParam, which binds a variable to a column, or bindValue, which binds a value.
I don't know what's wrong with the AR query though.
Edit: Your DAO code should look like this:
$sql = "SELECT * FROM sometable WHERE title = :title";
$command = $this->conn->createCommand($sql);
$command->bindParam(":title", $tile_var);
$result = $command->query();
Have you generated the Movies model with the Gii generator, or did you write it yourself?
Edit:
As far as I know, you have to add tableName():
public function tableName() {
    return 'movies';
}
You may have two separate issues. At least your DAO code (if that is what you actually have in your code) is binding wrong.
Binding needs to occur before the query is done (and it's done on the command object), not on the result object. See more info here: http://www.yiiframework.com/doc/guide/1.1/en/database.dao#binding-parameters
As far as your AR queries go, those could also be problematic. You can use findByAttributes instead of this line (although your line should work):
$row = Movies::model()->find("m_id=:m_id", array(":m_id"=>27));
The below should work if you have a standard id key. If you don't (and are using m_id), have you set up your model's primaryKey() function?
$row = Movies::model()->findByPk(24);
Also, you'll be getting a model object back, not a row, if that changes how you are handling the results...
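If the movies table doesn't declare m_id as its primary key in the schema, Yii won't pick it up automatically; overriding primaryKey() in the Movies model tells it explicitly (a minimal sketch, assuming the column really is m_id):
public function primaryKey() {
    return 'm_id';
}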

PHP MySQL Function Unit Testing

I need to test a number of functions that I have created using PHP 5 which carry out the required database CRUD type actions (SELECT, UPDATE, INSERT, DELETE) which are required by my web application.
I have been looking at PHP unit testing suites such as SimpleTest and PHPUnit, which seem to offer what I need; however, I am unsure how I am meant to achieve this, as equivalence partitioning and boundary analysis aren't all that clear to me. Do I just need to input different variables and vary them? This seems rather pointless, as a string that is different may not necessarily make any difference.
Any guidance on this would be helpful as I have not encountered this before.
If you are testing the interaction between your PHP code and your MySQL database, you are performing integration testing, rather than unit testing.
Here are some examples of integration testing with Enhance PHP framework, it tests a repository class that saves and retrieves a Tenant object.
Instead of running against a pre-populated database, it runs on an entirely empty database and creates and destroys the tables as it goes, using a simple table helper. This removes the dependency on particular data being in the right state in a test database, which is hard to keep in sync.
<?php
class TenantRepositoryTestFixture extends EnhanceTestFixture {
    private $Target;

    public function SetUp() {
        $tables = new TableHelper();
        $tables->CreateTenantTable();
        $this->Target = Enhance::GetCodeCoverageWrapper('TenantRepository');
    }

    public function TearDown() {
        $tables = new TableHelper();
        $tables->DropTenantTable();
    }

    public function SaveWithNewTenantExpectSavedTest() {
        $tenant = new Tenant();
        $tenant->Name = 'test';
        $saved = $this->Target->Save($tenant);
        $result = $this->Target->GetById($saved->Id);
        Assert::AreNotIdentical(0, $result->Id);
        Assert::AreIdentical($tenant->Name, $result->Name);
    }

    public function SaveWithExistingTenantExpectSavedTest() {
        $tenant = new Tenant();
        $tenant->Name = 'test';
        $saved = $this->Target->Save($tenant);
        $saved->Name = 'changed';
        $saved = $this->Target->Save($saved);
        $result = $this->Target->GetById($saved->Id);
        Assert::AreIdentical($saved->Id, $result->Id);
        Assert::AreIdentical($saved->Name, $result->Name);
    }
}
?>
Generally the idea with unit testing is to ensure that, if you make a change, you can simply run a series of tests to confirm that no existing functionality breaks. That said, you'll want to cover the typical data you're expecting, as well as edge/boundary cases, which might include strings with quotes (to verify that they're being escaped properly), SQL injection attacks (same), empty strings, strings of different encodings, NULL, a boolean true, etc.
Each test should verify that, given the data you input, you're getting the expected result, in this case (respectively): escaped string inserted, escaped string inserted, empty string inserted, different encoding converted or thrown out then inserted, an error thrown on a NULL value, the string 'true' inserted, etc.
I haven't used either test framework in a few years, but I remember having good results with PHPUnit.
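As a rough sketch of that edge-case approach with PHPUnit (insertUser() and the expected outcomes are hypothetical; adjust them to whatever your function is actually supposed to do), a data provider keeps all the cases in one place:
class InsertUserTest extends PHPUnit_Framework_TestCase
{
    public function nameProvider()
    {
        // Each case: the name to insert and the expected return value of insertUser().
        return array(
            'plain string'  => array('Alice', true),
            'quoted string' => array("O'Brian", true),
            'injection'     => array("x'; DROP TABLE users; --", true),
            'empty string'  => array('', true),
            'null value'    => array(null, false),
        );
    }

    /**
     * @dataProvider nameProvider
     */
    public function testInsertUser($name, $expected)
    {
        $this->assertSame($expected, insertUser($name));
    }
}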
