I changed my environment to testing in my base spec, with sqlite as the driver and the database stored in memory.
function it_should_validate()
{
    // Create the user
    $data = array('email' => 'test@gmail.com', 'password' => 'password', 'remember' => false);
    $this->validate($data)->shouldBe(true);
}
How do I insert data into the testing database whenever I run the test? Right now the test fails because validate() always returns false.
What you're trying to do is integration testing.
PhpSpec is a tool for unit testing. It puts a big emphasis on testing in isolation. That means that the only object that's created is the object being tested; all its dependencies should be stubbed or mocked.
In other words there should be no database involved in specs you write.
You didn't show the full example of the class you're trying to spec, so it's hard to advise what your spec should look like.
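Purely as an illustration, a database-free spec could look roughly like the sketch below. The UserValidator class, the UserRepository collaborator and its emailExists() method are assumptions made up for the example, not something taken from your code:

namespace spec\App;

use App\UserRepository;
use PhpSpec\ObjectBehavior;

class UserValidatorSpec extends ObjectBehavior
{
    function let(UserRepository $users)
    {
        // The repository collaborator is doubled by PhpSpec, so no database
        // connection is ever opened while the spec runs.
        $this->beConstructedWith($users);
    }

    function it_should_validate(UserRepository $users)
    {
        // emailExists() is a made-up repository method; the point is that every
        // dependency call is stubbed instead of hitting real storage.
        $users->emailExists('test@gmail.com')->willReturn(true);

        $data = array('email' => 'test@gmail.com', 'password' => 'password', 'remember' => false);
        $this->validate($data)->shouldBe(true);
    }
}

Whatever the validator needs to know about stored users is then described through the collaborator's expected calls rather than through seeded rows.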
Recommended reading:
Test Driven Development by Example
Growing Object-Oriented Software Guided by Tests
Related
I have run into a major problem while writing tests. I am using Laravel 5.6.0 as the framework and PHPUnit 7.0 for testing. As the testing DB I am using sqlite with in-memory storage. This is from my database.php:
'sqlite_testing' => [
    'driver' => 'sqlite',
    'database' => ':memory:',
    'prefix' => '',
],
But my problem is that I have several places where I use whereRaw, for example $query->whereRaw('STR_TO_DATE(CONCAT(date, " ",from), "%Y-%m-%d %k") < ?', [$before]);. The problem here is that sqlite does not have the STR_TO_DATE or CONCAT functions that MySQL has, so PHPUnit throws a bunch of errors because of that.
My solution was to use a MySQL DB as the testing DB instead. But this doesn't seem to work either, since I get several different errors; mostly I have several tests where a foreign key constraint fails.
One example of this is that I have the following in my base TestCase setUp method:
if (Schema::hasTable('gym_schedule') && !empty(GymSchedule::createGymSchedules())) {
    $this->artisan('db:seed', ['--class' => 'GymScheduleTableSeeder']);
}
This fails every time except the first, because it says that a schedule with id 1 already exists (id is my primary key). I did try to truncate all tables between each test class using tearDown, but that did not help at all, and the testing also became really slow, around 3 seconds for each test.
So basically that approach does not work either.
I am at a loss. I have tried googling this and searching through StackOverflow. I am open to any suggestion that is not too complicated (either removing all MySQL functions somehow, getting the MySQL testing DB to work, or anything else really).
Does anyone have a good idea?
Regarding the first part of the question, unit testing a Laravel app when using whereRaw(), this is how I wrote the methods:
$raw_condition = "CONCAT(`country_code`, `number`) = ?";

if (env('DB_CONNECTION') === 'sqlite') {
    $raw_condition = "(`country_code` || `number`) = ?";
}

return Model::whereRaw($raw_condition, [$number])->firstOrFail();
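Two related sketches that may help, with the caveat that they are assumptions rather than drop-in code: asking the active connection for its driver (env() can return null once the config is cached), and one possible sqlite-friendly rewrite of the STR_TO_DATE/CONCAT condition from the question, which assumes `date` is a Y-m-d string and `from` holds an hour:

use Illuminate\Support\Facades\DB;

// Ask the active connection for its driver instead of reading env(),
// which can return null once the configuration has been cached.
$driver = DB::connection()->getDriverName();   // 'mysql' or 'sqlite'

if ($driver === 'sqlite') {
    // sqlite: || concatenates, printf() zero-pads the hour, datetime() normalises the string.
    $raw_condition = "datetime(`date` || ' ' || printf('%02d', `from`) || ':00') < ?";
} else {
    $raw_condition = "STR_TO_DATE(CONCAT(`date`, ' ', `from`), '%Y-%m-%d %k') < ?";
}

$query->whereRaw($raw_condition, [$before]);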
I am using the Braintree PHP Client, which relies heavily on static methods.
All my endpoints in this project are covered with integration tests.
Something like:
Storage::shouldReceive('put')->once()->andReturn(true);
$this->post('/api/payment');
As you can see, I'm also using Mockery in order to create mocks. However, since the Braintree library relies heavily on static methods, I'm not able to create mocks for it, and thus not able to test these endpoints.
Here's an example of code written using the Braintree PHP Client:
$result = Braintree\Transaction::sale([
    'amount' => '1000.00',
    'paymentMethodNonce' => 'nonceFromTheClient',
    'options' => ['submitForSettlement' => true]
]);
What options do I have here?
This answer will only work if you have Mockery 1.* installed; earlier versions won't do static method mocking. The code below works:
$brainTreeMock = Mockery::mock('alias:Braintree_Transaction');
$transaction = (object) ['id' => str_random(5)];

$brainTreeMock->shouldReceive('sale')->andReturn((object) [
    'success' => true,
    'transaction' => $transaction
]);
Mocking one of the components during integration tests should be done with great care, as it defeats the purpose.
I believe Braintree provides a sandbox for integration testing, so there is no need to mock it.
You can use an alias mock to mock public static method calls. You would use it like this:
$classMock = Mockery::mock('alias:NamespaceToClass\ClassName');
$classMock->shouldReceive('someMethod')->once()->andReturn('Something');
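One caveat worth adding: an alias mock replaces the class by name, so it only works if the real class has not been autoloaded yet in that process. A common workaround is to run such tests in a separate PHPUnit process. A rough sketch, assuming a Laravel-style Tests\TestCase and the /api/payment endpoint from the question (the expected status code is also an assumption):

use Mockery;
use Tests\TestCase;

class PaymentEndpointTest extends TestCase
{
    /**
     * @runInSeparateProcess
     * @preserveGlobalState disabled
     */
    public function test_payment_endpoint_charges_via_braintree()
    {
        $transaction = (object) ['id' => str_random(5)];

        // The alias mock intercepts the static Braintree\Transaction::sale() call
        // before the real class can be autoloaded in this (separate) process.
        Mockery::mock('alias:Braintree\Transaction')
            ->shouldReceive('sale')
            ->once()
            ->andReturn((object) ['success' => true, 'transaction' => $transaction]);

        // 200 is just an assumed success status for the endpoint.
        $this->post('/api/payment')->assertStatus(200);
    }

    public function tearDown()
    {
        Mockery::close();
        parent::tearDown();
    }
}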
Usually I would advocate mocking the responses from an API instead of making calls to it when running integration tests.
However, I contacted Braintree and they said (at the time of writing) that it's ok to run the tests against the sandbox as long as you don't exceed 25 concurrent requests per second.
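For reference, pointing the older static-style Braintree client at the sandbox looks roughly like this; the credential values are placeholders from your Braintree control panel, and 'fake-valid-nonce' is one of Braintree's published sandbox test nonces:

// Point the static Braintree client at the sandbox instead of production.
Braintree\Configuration::environment('sandbox');
Braintree\Configuration::merchantId('your_sandbox_merchant_id');
Braintree\Configuration::publicKey('your_sandbox_public_key');
Braintree\Configuration::privateKey('your_sandbox_private_key');

// Braintree publishes test nonces such as 'fake-valid-nonce' for sandbox use.
$result = Braintree\Transaction::sale([
    'amount' => '1000.00',
    'paymentMethodNonce' => 'fake-valid-nonce',
    'options' => ['submitForSettlement' => true],
]);

That way the integration tests exercise the real HTTP round trip without ever touching production.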
I am trying to test some of my controllers through unit testing, but something strange is happening. I have the following code in my test case:
public function test_username_registration_too_short()
{
    $result = $this->action('POST', 'App\\Controllers\\API\\UserController@store', null, [
        'username' => 'foo'
    ]);
    $this->assertEquals('not_saved', $result->getContent());

    // $result = $this->action('POST', 'App\\Controllers\\API\\UserController@store', null, [
    //     'username' => 'foo'
    // ]);
    // $this->assertEquals('not_saved', $result->getContent());
}

public function test_username_registration_too_short_run_2()
{
    $result = $this->action('POST', 'App\\Controllers\\API\\UserController@store', null, [
        'username' => 'foo'
    ]);
    $this->assertEquals('not_saved', $result->getContent());
}
When I run this, the initial too_short test passes, but the exact same code on run 2 does not pass (it even manages to save the user). Yet if I put that same code twice in the same method (which is commented out above), it works perfectly. I have nothing in my setUp or tearDown methods, and I am a bit lost here.
The code in the controller is the following:
$user = new User(Input::all());

if ($user->save())
{
    return 'saved';
}

return 'not_saved';
I'll keep repeating myself on this kind of question: there's a similar answer to a (somewhat) similar question. TL;DR: don't use a unit testing framework for functional / integration testing.
This is the area of functional testing, and there is a fabulous framework for it called Behat. You should do your own research, but essentially, while PHPUnit is great at testing more or less independent blocks of functionality, it sucks at testing bigger things like full request execution. Later you will start experiencing issues with session errors, a misconfigured environment, etc., all because each request is supposed to be executed in its own separate space and you force it into doing the opposite. Behat, on the other hand, works in a very different way: for each scenario (post robot, view non-existing page), it sends a fresh request to the server and checks the result. It is mostly used for final testing of everything working together, making assertions on the final result (response object / HTML / JSON).
If you want to test your code the proper way, consider using the right tools for the job. Once you know your way around Behat you'll fall in love with it, plus you can use PHPUnit from within Behat to make individual assertions.
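To make that concrete, a minimal Behat step-definition sketch is shown below; the endpoint URL and field names are assumptions, and it posts with plain PHP streams so the example stays self-contained:

use Behat\Behat\Context\Context;
use PHPUnit\Framework\Assert;

class FeatureContext implements Context
{
    private $response;

    /**
     * @When I register with the username :username
     */
    public function iRegisterWithTheUsername($username)
    {
        // Each scenario issues a real HTTP request against the running app,
        // so no state leaks from one request into the next.
        // The URL is an assumption; point it at wherever the app is served.
        $context = stream_context_create([
            'http' => [
                'method'        => 'POST',
                'header'        => "Content-Type: application/x-www-form-urlencoded\r\n",
                'content'       => http_build_query(['username' => $username]),
                'ignore_errors' => true,
            ],
        ]);
        $this->response = file_get_contents('http://localhost/api/users', false, $context);
    }

    /**
     * @Then the response should be :expected
     */
    public function theResponseShouldBe($expected)
    {
        // PHPUnit's assertions can be reused inside Behat step definitions.
        Assert::assertSame($expected, $this->response);
    }
}

Each scenario then drives the application through a real request/response cycle, and PHPUnit is used only to check the final result.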
Imagine that you have interfaces which describe the data access layer of your application. You haven't decided yet what kind of storage mechanism you want to use; you just want to make sure that, whatever you choose, it will handle concurrent requests well. For that you have to write concurrency tests against those interfaces.
I think a schematic concurrency test should be something like this:
public function testMoneyIsNotLostByConcurrentTransfers(){
    $accountRepository = DataAccessLayer::getBankAccountRepository();

    $accountOfTom = $accountRepository->create(array(
        'owner' => 'Tom',
        'balance' => new Money(10000)
    ));
    $accountOfBob = $accountRepository->create(array(
        'owner' => 'Bob',
        'balance' => new Money(10000)
    ));
    $accountOfSusanne = $accountRepository->create(array(
        'owner' => 'Susanne',
        'balance' => new Money(10000)
    ));

    $this->concurrentExecution(
        function () use ($accountOfTom, $accountOfBob){
            $accountOfTom->transfer($accountOfBob, new Money(5000));
        },
        function() use ($accountOfTom, $accountOfSusanne){
            $accountOfSusanne->transfer($accountOfTom, new Money(5000));
        }
    );

    $this->assertEquals($accountOfTom->getBalanceAmount(), 10000);
    $this->assertEquals($accountOfBob->getBalanceAmount(), 15000);
    $this->assertEquals($accountOfSusanne->getBalanceAmount(), 5000);
}
Is it possible to write such tests, or such a test runner, in PHP? Or is there any existing tool which can help with concurrency testing in PHP?
I could not find any test runner for this kind of concurrency test. I only found paratest, which can run independent tests, such as unit tests, in parallel.
According to PHP - parallel task runner, the best option I think is using pthreads with debug_backtrace. I think it will be hard even with that. I am looking forward to the installation problems, thread safety issues, resource sharing difficulties, backtrace bugs, etc... I will have a great time, I am sure... :S
I found async calls in the pthreads examples.
If I ever manage to solve this, I will share it on github and add a link here. Until then...
update
I just realized that I don't need a multi-thread or multi-process application to test concurrency. For example, I can start two transactions with two database connections from the same PHP file. What I need is to add event triggering for the statements the DB driver runs, so I can add breakpoints and wait for the other task wherever I want. File locking is just the same... So coroutines, or some hand-made multitasking plus statement logging, should be enough...
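For instance, the two-connection idea could look roughly like the sketch below; the DSN, credentials and the accounts table are placeholders, and the interleaving is simply wired by hand in one script:

$dsn = 'mysql:host=localhost;dbname=test_db';
$connA = new PDO($dsn, 'user', 'secret');
$connB = new PDO($dsn, 'user', 'secret');

$connA->beginTransaction();
$connB->beginTransaction();

// Both "tasks" read Tom's balance before either of them writes it back.
$read = 'SELECT balance FROM accounts WHERE owner = ?';
$stmtA = $connA->prepare($read); $stmtA->execute(['Tom']); $tomA = (int) $stmtA->fetchColumn();
$stmtB = $connB->prepare($read); $stmtB->execute(['Tom']); $tomB = (int) $stmtB->fetchColumn();

// Task A: Tom pays Bob 5000, then commits.
$connA->prepare('UPDATE accounts SET balance = ? WHERE owner = ?')->execute([$tomA - 5000, 'Tom']);
$connA->prepare('UPDATE accounts SET balance = balance + 5000 WHERE owner = ?')->execute(['Bob']);
$connA->commit();

// Task B: Susanne pays Tom 5000, but uses the stale balance it read earlier.
$connB->prepare('UPDATE accounts SET balance = ? WHERE owner = ?')->execute([$tomB + 5000, 'Tom']);
$connB->prepare('UPDATE accounts SET balance = balance - 5000 WHERE owner = ?')->execute(['Susanne']);
$connB->commit();

// Lost update: Tom ends up with 15000 instead of 10000, which is exactly the
// kind of anomaly the concurrency test above should catch.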
Concurrency should be built into your saving mechanism, not the execution layer.
For example, if you are using SQL, instead of setting the balance to a value computed in your code, use relative updates (+= and -=) so the database applies the change atomically.
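A small sketch of that with PDO (table and column names are placeholders again):

// Let the database apply the delta atomically instead of writing back a value
// computed in PHP; the WHERE clause also guards against overdrawing the account.
$debit = $pdo->prepare('UPDATE accounts SET balance = balance - ? WHERE owner = ? AND balance >= ?');
$debit->execute([5000, 'Tom', 5000]);

// rowCount() reports whether the debit actually happened, so two concurrent
// transfers can never both spend the same money.
if ($debit->rowCount() === 1) {
    $pdo->prepare('UPDATE accounts SET balance = balance + ? WHERE owner = ?')->execute([5000, 'Bob']);
}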
I've been working on converting an application of mine from CodeIgniter to Phalcon. I've noticed that [query heavy] requests that only took a maximum of 3 or 4 seconds using CI are taking up to 30 seconds to complete using Phalcon!
I've spent days trying to find a solution. I've tried using all the different means of access offered by the framework including submitting raw query strings directly to Phalcon's MySql PDO adapter.
I'm adding my database connection to the service container exactly like it is shown in Phalcon's INVO tutorial:
$di->set('db', function() use ($config) {
    return new \Phalcon\Db\Adapter\Pdo\Mysql(array(
        "host" => $config->database->host,
        "username" => $config->database->username,
        "password" => $config->database->password,
        "dbname" => $config->database->name
    ));
});
Using webgrind output, I've been able to narrow the bottleneck down to the constructor in Phalcon's PDO adapter class (the webgrind costs are in milliseconds).
I've already profiled and manually tested the relevant SQL to make sure the bottleneck isn't in the database (or my poorly constructed SQL!)
I've discovered the problem, which to me wasn't immediately apparent, so hopefully others will find this useful as well.
Every time a new query was started, the application was getting a new instance of the database adapter. The request which produced the webgrind output above had a total of 20 queries.
While re-reading Phalcon's documentation section on dependency injection, I saw that services can optionally be added to the service container as a "shared" service, which effectively forces the object to act as a singleton: once one instance of the class is created, the container simply returns that same instance whenever the service is requested instead of creating a new one.
There are several methods to force a service to be added as a shared service, details of which can be found here in Phalcon's Documentation:
http://docs.phalconphp.com/en/latest/reference/di.html#shared-services
Changing the code posted above so the adapter is registered as a shared service looks like this:
$di->setShared('db', function() use ($config) {
    return new \Phalcon\Db\Adapter\Pdo\Mysql(array(
        "host" => $config->database->host,
        "username" => $config->database->username,
        "password" => $config->database->password,
        "dbname" => $config->database->name
    ));
});
Here's what the webgrind output shows for the same request referenced above, after registering the database service as a shared one (cost in milliseconds): the invocation count is now 1 instead of 20, and the invocation cost dropped from 20 seconds down to 1 second!
I hope someone else finds this useful!
In most examples services are de facto shared, though not in the most obvious way, but via:
$di->set('service', …, true);
The last bool argument passed to set() makes the service shared, and in 99.9% of cases you'd want your DI services to be that way; otherwise similar things would happen as described by @the-notable, but because they are likely to be less "impactful", they would be hard to trace down.
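A tiny sketch of what that third argument buys you; $dbFactory stands in for the connection closure from the accepted answer:

// The two registrations below are equivalent.
$di->set('db', $dbFactory, true);     // third argument marks the service as shared
// $di->setShared('db', $dbFactory);  // same effect

// Resolving the service twice now yields the very same adapter instance.
var_dump($di->getShared('db') === $di->getShared('db')); // bool(true)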