I am trying to write unit tests for a file import/export module.
One of my methods checks that a filename passed to it exists.
How would this be mocked and a test written to check if the file exists or not?
Unit tests are supposed to prove that a unit of code functions correctly in complete isolation. If your test depends on the file system to function correctly in order to pass, your test is suboptimal and could be lying to you on any given test run.
Like any experiment, when you have more than one variable in play, you can't be sure of your results. For PHP code that interacts with the file system, it's best to mock the file system using a custom stream wrapper (usually vfsStream, but you can easily write your own stream wrapper if you really want to).
$noTest < $testWithFileSystemDependency < $testThatMocksFileSystem
Usually, this is accomplished by passing file paths into the methods that use them directly:
<?php
function myFunction($someFilePath) {
// do stuff
}
In this way, you can mock the file system and pass in a testable dummy path that will behave however you have configured the mock to behave.
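For instance, here is a minimal sketch using vfsStream, assuming a hypothetical FileImporter class whose fileExists() method simply wraps file_exists() around the path it is given:

use org\bovigo\vfs\vfsStream;

// FileImporter and fileExists() are hypothetical names standing in for your class.
class FileImporterTest extends PHPUnit_Framework_TestCase
{
    public function testFileExistsReturnsTrueForExistingFile()
    {
        // Build a virtual directory containing one file
        $root = vfsStream::setup('importDir');
        vfsStream::newFile('data.csv')->at($root)->setContent('a,b,c');

        $importer = new FileImporter();
        $this->assertTrue($importer->fileExists(vfsStream::url('importDir/data.csv')));
    }

    public function testFileExistsReturnsFalseForMissingFile()
    {
        vfsStream::setup('importDir'); // empty virtual directory

        $importer = new FileImporter();
        $this->assertFalse($importer->fileExists(vfsStream::url('importDir/missing.csv')));
    }
}

Because the paths are vfs:// URLs, nothing ever touches the real disk, so the tests stay fast and deterministic.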
You write two tests. One creates the file and expects your method to succeed, the other ensures the file doesn't exist and expects your method to fail.
Personally, I just create a _files directory inside the tests directory and create files there. Or in /tmp.
There is a way to mock the filesystem: http://www.phpunit.de/manual/current/en/test-doubles.html#test-doubles.mocking-the-filesystem but from my experience I prefer real FS operations (in this case).
I have a function, which someone else wrote, that creates a cURL wrapper object inside the function. Simplified version below
public function getCodes()
{
//do some stufff
$communicator = new Communicator();
$result = $communicator->call($this->API_KEY);
//do some stuff with $result
}
I was tasked with learning PHPUnit and writing tests for this type of code. In doing so I found that it's really hard to test a function like this when the object is created inside of the function and also that tests shouldn't require any outside communication to work.
We wanted to push our tests to git as many projects do but we didn't want to accidentally or intentionally push our API credentials to git.
So my solution was to keep getCodes() public but make it a wrapper for a private function that accepts a Communicator object as a parameter. Then I could test the private method with a mock Communicator object.
But this would mean that getCodes is never tested (my boss wants 100% code coverage) and I also read that you shouldn't be writing tests for private functions in most circumstances.
So my question is basically, how do I write a test for a function like this with an API call.
I would really suggest rewriting the code to inject the Communicator object via constructor.
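A rough sketch of that refactoring, with a mocked Communicator in the test (the CodeService class name and the return values are illustrative, not taken from your code):

class CodeService
{
    private $communicator;
    private $apiKey;

    public function __construct(Communicator $communicator, $apiKey)
    {
        $this->communicator = $communicator;
        $this->apiKey = $apiKey;
    }

    public function getCodes()
    {
        $result = $this->communicator->call($this->apiKey);
        // do some stuff with $result
        return $result;
    }
}

class CodeServiceTest extends PHPUnit_Framework_TestCase
{
    public function testGetCodesDelegatesToTheCommunicator()
    {
        // The real Communicator (and the real API) is never touched here
        $communicator = $this->getMockBuilder('Communicator')
                             ->disableOriginalConstructor()
                             ->getMock();
        $communicator->expects($this->once())
                     ->method('call')
                     ->with('dummy-key')
                     ->will($this->returnValue(array('A1', 'B2')));

        $service = new CodeService($communicator, 'dummy-key');
        $this->assertEquals(array('A1', 'B2'), $service->getCodes());
    }
}

With this shape, getCodes() itself is covered by the test, and the API key comes from configuration rather than being hard-coded, so no credentials need to be committed.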
If you can already see that there is a big issue with writing tests for something, that is a very strong signal to re-think the current implementation.
Another thing is that you shouldn't test your privates. Sebastian Bergmann wrote a post on his blog about this some time ago, and the conclusion is: it is possible, just not good (https://sebastian-bergmann.de/archives/881-Testing-Your-Privates.html).
A completely different thing is that I think your tests shouldn't go outside the boundaries of your system. That is: mock everything that connects to outside systems. Such tests may fail for various reasons, none of which are valid from the sole perspective of running the tests.
You also mentioned coverage. Unfortunately this is something where, I hope, everyone will agree: you cannot have it the moment you start using native PHP resources (with some small exceptions like the filesystem). You have to understand that things like cURL, SSH, FTP and so on cannot be unit tested.
I have a class which reads a config file (array).
Here's a basic example:
class ConfigReader
{
function __construct($configFileName)
{
$this->config = $this->load($configFileName);
}
private function load($configFileName)
{
//load config file
}
public function hasValue($value)
{
//parse config for value, return true if found
}
public function getValue($key)
{
//etc....
}
}
How should I unit test this? Should I mock the config file? I assume not, because if I change something in the real config file the test would not pick it up... If I should mock it, how should I test the config file?
Dealing with the filesystem is tricky. There are a couple of ways to test this.
One is to have a test config file, containing example values, that you use in your test. I am not sure what exactly you are going to do with the values; that may require mocking other system values. The downside to this is that you are maintaining a file for testing and have to make sure that the paths/permissions of the file are such that the test can run. You can end up with the test failing because the file wasn't readable rather than because of an actual problem.
The preferred way would be to mock the filesystem. PHPUnit's documentation recommends using vfsStream. This would allow you to create a config file in your test on the fly and try to load it. The advantage this has is that you don't have to maintain separate files and paths just so the test can load the config.
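For example, a minimal sketch with vfsStream, assuming load() simply includes a PHP file that returns an array (adjust to however your load() actually reads the config):

use org\bovigo\vfs\vfsStream;

class ConfigReaderTest extends PHPUnit_Framework_TestCase
{
    public function testReadsValuesFromConfigFile()
    {
        // Create a virtual config file on the fly
        $root = vfsStream::setup('config');
        vfsStream::newFile('app.php')
            ->at($root)
            ->setContent('<?php return array("db_host" => "localhost");');

        $reader = new ConfigReader(vfsStream::url('config/app.php'));

        $this->assertTrue($reader->hasValue('localhost'));
        $this->assertEquals('localhost', $reader->getValue('db_host'));
    }
}

Since the file only exists in memory, the test never depends on a real config file changing underneath it.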
I am not able to provide more detail without having a better idea of what your class methods are trying to accomplish. But this should get you on the right start.
I'm calling phpunit with the argument being a directory (bonus question: why can't it accept a list of files?) and it's now complaining about a class being declared more than once because of a file included in the previous test!
If I run phpunit firstTest.php; phpunit secondTest.php, everything works.
But phpunit ./ fails with PHP Fatal error: Cannot redeclare class X
my tests are basically:
include 'class_to_be_tested.php';
class class1Test extends...
and nothing else. And I'm using the --process-isolation option. I could add require_once for my classes, but that's not what I want, since I want to be able to test them individually.
Shouldn't phpunit follow best testing practices and run one test, clear whatever garbage it has, then run another test in a clean state? Or am I doing something wrong?
Since you have include rather than include_once and there is no other code shown in your question, the cannot-redeclare error could also mean that you are including the file again somewhere in the code under test.
Assuming that is not the case, there are some behind-the-scenes things that happen with --process-isolation that can keep the global class declarations. This blog post gives more detail: http://matthewturland.com/2010/08/19/process-isolation-in-phpunit/
Basically, you will want to create your own base TestCase and override the run() method to set preserveGlobalState to false. This should properly allow all your tests to run together.
The base class would look similar to this (taken from the blog post I referred to):
class MyTestCase extends PHPUnit_Framework_TestCase
{
public function run(PHPUnit_Framework_TestResult $result = NULL)
{
$this->setPreserveGlobalState(false);
return parent::run($result);
}
}
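Your test classes then extend this base class instead of PHPUnit_Framework_TestCase, for example:

include 'class_to_be_tested.php';

class class1Test extends MyTestCase
{
    // test methods as before
}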
The way phpUnit works means that all your tests are run in the context of a single php program. phpUnit aims to isolate the tests from each other, but they are all run within the same program execution.
PHP's include statement will include the requested file regardless of whether it has been included before. This means that if you include a given class twice, you will get an error the second time. This is happening in your tests because each test is including the same file, but without any consideration to whether it's already been included by one of the other tests.
Solutions:
Wrap your include calls in an if (!class_exists('ClassName')) check so that you don't include the file if the class has already been defined.
Include the files in a phpUnit bootstrap file instead of in the tests (a minimal sketch follows this list).
Use include_once (or even require_once) instead of include.
Stop including files arbitrarily, and start using an autoloader.
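For the bootstrap option, a minimal sketch could look like this (the file locations are just an example):

<?php
// tests/bootstrap.php -- loaded once before any test runs
require_once __DIR__ . '/../class_to_be_tested.php';
// require_once any other classes the tests need here

Then point PHPUnit at it with phpunit --bootstrap tests/bootstrap.php ./ (or via the bootstrap attribute in phpunit.xml) and drop the include lines from the individual test files.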
Change:
include 'class_to_be_tested.php';
class class1Test extends...
to be:
include_once 'class_to_be_tested.php';
class class1Test extends...
In PHP you need to have a Really Good Reason to use the former.
Regarding "why can't it accept a list of files?": I think the design decision is that you generally don't need to. However, you can do it by creating a test suite in the phpunit.xml.dist file, see
http://www.phpunit.de/manual/current/en/organizing-tests.html#organizing-tests.xml-configuration
I am trying to use php-resque to queue and execute ffmpeg conversions on my server. I understand broadly how it should work, but I am having some trouble with the details and cannot find any tutorials. Specifically, I don't understand where I should place my job classes, how to give the classes to my workers, and how to start my workers. The readme only says "Getting your application underway also includes telling the worker your job classes, by means of either an autoloader or including them."
Hopefully someone can outline the overall structure of using php-resque.
You can put your job classes where you want. It'll depend on your application structure.
How to create a job class
For example, let's suppose you have a class VideoConversion that handles the ffmpeg conversion.
class VideoConversion {
public function perform() {
// The code for video conversion here
}
}
In your main application, before using php-resque, let's say you have something like this:
public function uploadVideo() {
// Upload and move the video to a temp folder
// Convert the video
}
And you want to enqueue the 'convert video' part. Let's just queue it to the convert queue:
public function uploadVideo() {
// Upload and move the video to a temp folder
// Let's suppose you need to convert a 'source video' to a 'destination video'
Resque::enqueue('convert', 'VideoConversion', array('origine-video.avi', 'destination-video.avi'));
}
When queuing the job, we passed the paths of the source and destination videos to the VideoConversion class. You can pass other arguments; it'll depend on how your VideoConversion class is written.
A worker will then poll the convert queue, and execute the VideoConversion job. What the worker will do is to instantiate the VideoConversion class, and execute the perform() method.
The job arguments (array('origine-video.avi', 'destination-video.avi')), i.e. the third argument when queueing the job with Resque::enqueue, will be available inside the perform() method via $this->args.
# VideoConversion.php
class VideoConversion
{
public function perform() {
// $this->args == array('origine-video.avi', 'destination-video.avi');
// Convert the video
    }
}
Find your job classes
The VideoConversion class can be put anywhere, but you have to tell your workers where to find it.
There are multiple ways to do that:
Put your job classes in the include_path
In your .htaccess or the Apache config, add the directory containing all your job classes to the include path. Your workers will automatically find them.
The main issue with this method is that all your job classes must be in the same folder and that they are available everywhere.
Tell each worker where to find your job classes
When starting the worker, use the APP_INCLUDE argument to point to the job classes 'autoloader'.
APP_INCLUDE=/path/to/autoloader.php QUEUE=convert php resque.php
The above command will start a new worker, polling the queue named convert.
We're also passing the file /path/to/autoloader.php to the worker. (see here to learn to start a worker)
Technically, the worker will include that file with include '/path/to/autoloader.php';.
You can then tell the workers how to find your job classes:
Use basic include
In the '/path/to/autoloader.php':
include '/path/to/VideoConversion.php';
include '/path/to/anotherClass.php';
...
Use an autoloader
Use a PHP autoloader to load your job classes.
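For example, a minimal autoloader.php could look something like this (the jobs directory is an assumption; point it at wherever your job classes actually live):

<?php
// /path/to/autoloader.php
spl_autoload_register(function ($className) {
    // Map a class name like VideoConversion to /path/to/jobs/VideoConversion.php
    $file = '/path/to/jobs/' . $className . '.php';
    if (is_readable($file)) {
        require $file;
    }
});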
Use set_include_path()
set_include_path('path/to/job');
That way, your jobs are in the include_path just for this worker.
Closing thought
APP_INCLUDE is bound to the worker you're starting. If you're starting another worker, use APP_INCLUDE again. You can use a different file for each worker.
You can also design your job classes to execute more than one job. There's a tutorial explaining how to do that. It covers everything from the basics of a queue system to how to use and implement one.
If it's still not enough, take a look at the Resque documentation. The php-resque API is exactly the same; the only difference is that Resque job classes are written in Ruby, whereas php-resque's are written in PHP.
Please check out the following tutorial on how to use Resque with Phalcon:
http://www.mehuldoshi.in/background-jobs-phalcon-resque/
I have asked a similar question to this one already but I think it was badly worded and confusing so hopefully I can make it a bit clearer.
I am programming in a native Linux file system.
I have a class of HelpTopic:
class HelpTopic extends Help{}
And a class of Help:
class Help{}
Now I go to include HelpTopic:
include('HelpTopic.php');
And even though I do not instantiate HelpTopic with new HelpTopic() PHP (in a Linux file system) still reads the class signature and tries to load Help with HelpTopic.
I do not get this behaviour from a cifs file system shared from a Windows System.
My best guess is that there is some oddity with Linux that causes PHP to react this way but not sure what.
Does anyone have any ideas or solutions to this problem?
EDIT:
I have added my loading function to show what I am doing:
public static function import($cName, $cPath = null){
if(substr($cName, -2) == "/*"){
$d_name = ROOT.'/'.substr($cName, 0, -2);
$d_files = getDirectoryFileList($d_name, array("\.php")); // Currently only accepts .php
foreach($d_files as $file){
glue::import(substr($file, 0, strrpos($file, '.')), substr($cName, 0, -2).'/'.$file);
}
}else{
if(!$cPath) $cPath = self::$_classMapper[$cName];
if(!isset(self::$_classLoaded[$cName])){
self::$_classLoaded[$cName] = true;
if($cPath[0] == "/" || preg_match("/^application/i", $cPath) > 0 || preg_match("/^glue/i", $cPath) > 0){
return include ROOT.'/'.$cPath;
}else{
return include $cPath;
}
}
return true;
}
}
I call this by doing glue::import('application/models/*'); and it goes through including all the models in my app. The thing is, PHP on a Linux-based file system (not on CIFS) is trying to load the parents of my classes without instantiation.
This is a pretty basic function that exists in most frameworks (in fact most of this code is based off of Yii's version), so I am confused as to why others have not run into this problem.
And even though I do not instantiate HelpTopic with new HelpTopic() PHP still reads the class signature and tries to load Help with HelpTopic.
Correct.
In order to know how to properly define a class, PHP needs to resolve any parent classes (all the way up) and any interfaces. This is done when the class is defined, not when the class is used.
You should probably review the PHP documentation on inheritance, which includes a note explaining this behavior:
Unless autoloading is used, then classes must be defined before they are used. If a class extends another, then the parent class must be declared before the child class structure. This rule applies to classes that inherit other classes and interfaces.
There are two ways to resolve this problem.
First, add a require_once at the top of the file that defines the child class, pulling in the file that defines the parent class. This is the simplest and most straightforward way, unless you have an autoloader.
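For example, at the top of HelpTopic.php (assuming Help.php sits in the same directory):

<?php
require_once __DIR__ . '/Help.php'; // make sure the parent class is defined first

class HelpTopic extends Help
{
    // ...
}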
The second way is to define an autoloader. This is also covered in the documentation.
The ... thing ... you're using there is not an autoloader. In fact, it's a horrible abomination that you should purge from your codebase. It's a performance sap and you should not be using it. It also happens to be the thing at fault.
We don't have the definition of getDirectoryFileList() here, so I'll assume it uses either glob() or a DirectoryIterator. This is the source of your problem. You're getting the file list in an undefined order. Or, rather, in whatever order the underlying filesystem wants to give to you. On one machine, the filesystem is probably giving you Help.php before HelpTopic.php, while on the other machine, HelpTopic.php is seen first.
At first glance, you might think this is fixable with a simple sort, but it's not. What happens if you create a Zebra class, and then later need to create an AlbinoZebra that inherits from it? No amount of directory sorting is going to satisfy both the "load ASCIIbetical" and the "I need the Zebra to be first" requirements.
Let's also touch on the performance aspect of the problem. On every single request, you're opening a directory and reading the list of files. That's one hell of a lot of stat calls. This is slow. Very slow. Then, one by one, regardless of whether or not you'll need them, you're including the files. This means that PHP has to compile and interpret every single one of them. If you aren't using a bytecode cache, this is going to utterly destroy performance if the number of files there ever grows to a non-trivial number.
A properly constructed autoloader will entirely mitigate this problem. Autoloaders run on demand, meaning that they'll never attempt to include a file before it's actually needed. Good-performing autoloaders will know where the class file lives based on the name alone. In modern PHP, it's accepted practice to name your classes such that they'll be found easily by an autoloader, using either namespaces or underscores -- or both -- to map directory separators. (Meaning namespace \Models; class Help or class Models_Help would live in Models/Help.php)
Unfortunately most examples won't be useful here, as I don't know what kind of weird things your custom framework does. Take a peek at the Zend Framework autoloader, which uses prefix registration to point class prefixes (Model_) at directories.
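Still, as a very rough sketch of what a name-based autoloader could look like, reusing the ROOT constant from your import() function (the underscore/namespace-to-path mapping is an assumption about how you would lay out the files):

spl_autoload_register(function ($class) {
    // Models_Help or \Models\Help both map to ROOT/Models/Help.php
    $path = str_replace(array('_', '\\'), '/', $class) . '.php';
    $file = ROOT . '/' . $path;
    if (is_readable($file)) {
        require $file;
    }
});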