So I load all my budgets like so:
$budgets = auth()->user()->budgets()->get(); // With some eager loading and ordering, not needed for this example.
The problem lies in the way I perform some checks in my view. Take for example this snippet:
@foreach($budgets as $budget)
    @if($budget->getRemainingAmount() < 1) {{-- getRemainingAmount() performs 6 queries --}}
        class="danger"
    @endif
@endforeach
Now I have one main issue with the above approach. I don't mind the 6 queries; that's just how it works behind the scenes, which is totally fine. The problem is that each time I call the method within the view, the 6 queries run again, which means they are duplicated over and over.
What I want to do is include that method in my query and assign its result to a variable, for example:
$budgets = auth()->user()->budgets()->doSomethingToAssignThatMethodToSomeVariable()->get();
Let's just say for now the method does the following simple thing:
public function getRemainingAmount()
{
return 100;
}
Now, how can I assign the result of getRemainingAmount() to a variable called $remaining while performing my query? Or is there perhaps a better way to approach this? In my view, I just want to be able to change this:
if($budget->getRemainingAmount() < 1)
To this (for example):
if($remaining < 1)
So that I can perform the check multiple times without running the 6 queries again on every call.
Any thoughts on how to achieve this in a simple manner? I actually have multiple methods that result in the same issue (right now my debugbar says: 66 statements were executed, 41 of which were duplicated, 25 unique). Obviously I want to remove the duplication.
You would probably be interested in using the cache for storing repetitive query results.
You may even pass a Closure as the default value. The result of the
Closure will be returned if the specified item does not exist in the
cache. Passing a Closure allows you to defer the retrieval of default
values from a database or other external service:
$value = Cache::get('key', function () {
return DB::table(...)->get();
});
You have control over how long results are cached and it can drastically improve performance and reduce db load.
So for you, accessing some-budget-result
$result = Cache::get('some-budget-result', function () {
return auth()->user()->budgets()->doSomethingToAssignThatMethodToSomeVariable()->get();
});
would run the query once; each subsequent time you access that cache item you get the same results with no additional queries.
https://laravel.com/docs/5.5/cache#retrieving-items-from-the-cache
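If the duplication only matters within a single request, a lighter alternative is to memoize the result on the model instance itself. Here is a minimal sketch stripped of Eloquent so it is self-contained; computeRemainingAmount() is a hypothetical stand-in for whatever currently runs the 6 queries, and the counter only exists to demonstrate that the expensive path runs once:

```php
<?php
class Budget
{
    public $queryCount = 0;       // tracks how often the expensive path runs
    protected $remainingAmount;

    public function getRemainingAmount()
    {
        if ($this->remainingAmount === null) {
            // Compute lazily, at most once per instance.
            $this->remainingAmount = $this->computeRemainingAmount();
        }
        return $this->remainingAmount;
    }

    protected function computeRemainingAmount()
    {
        $this->queryCount++;      // stands in for the 6 queries
        return 100;               // matches the simplified example above
    }
}

$budget = new Budget();
$budget->getRemainingAmount();
$budget->getRemainingAmount();
echo $budget->queryCount;         // the expensive path ran only once
```

With this pattern the view can keep calling $budget->getRemainingAmount() as often as it likes; unlike Cache::get, nothing persists between requests, so there is no invalidation to worry about.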
Related
The code that causes this is too big to post here, and the problem is so odd that I can't recreate a smaller example with the same issue. I have some code that looks roughly like this:
<?php
class A {
    // Lots of methods

    public function fastMethod($id) {
        return 0;
    }

    public function differentFastMethod($id) {
        $this->someOtherObject->someOtherFastMethod();
        return $this->fastMethod($id);
    }

    public function slowMethod() {
        $c = 0;
        $start = microtime(true);
        // hugeList has about 100,000 ids.
        foreach ($this->hugeList as $id) {
            // bigLookupTable has an array for almost all ids;
            // each array contains one or two items.
            if (isset($this->bigLookupTable[$id])) {
                foreach ($this->bigLookupTable[$id] as $differentId) {
                    $this->fastMethod($differentId);
                    $c++;
                }
            }
        }
        $end = microtime(true);
        print $end - $start;
    }
}
?>
Running slowMethod takes about 1 second, but if I delete the call to fastMethod it takes about 0.1 seconds. I thought it must just be function call overhead, but there's more to it than that.
If I change the call to fastMethod to one to differentFastMethod, for instance, it takes about the same time, even though that not only adds more function calls on this object but also calls methods on other objects. Furthermore, if I add more calls to fastMethod in the inner loop, each one only adds about another 0.1 seconds to the time. It seems like this loop takes an extra second to run based purely on whether or not it calls instance methods on itself, whatever those methods do and however many there are.
Another interesting thing about the performance is the inner loop in slowMethod. The arrays passed to it are mostly 1 element long, occasionally 2, so taking out the loop and just calling fastMethod on the first item of the array should be about as fast. But it is actually much faster, speeding things up by about the same amount as removing the call to fastMethod does.
I realise it's hard to diagnose without a working example, but I am stumped as to what could possibly cause behaviour like this; if anyone has any ideas, I could test them on the real code.
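For reference, the loop-free variant described above can be sketched like this. It is a self-contained toy (the class name and the tiny lookup table are invented for the example; the real list has ~100,000 ids), so the only thing it demonstrates is the shape of the change, not the timing difference:

```php
<?php
class A2 {
    public $hugeList = array(1, 2, 3, 4);
    public $bigLookupTable = array(
        1 => array(10),
        2 => array(20, 21),
        4 => array(40),
    );

    public function fastMethod($id) {
        return 0;
    }

    public function slowMethodNoInnerLoop() {
        $c = 0;
        foreach ($this->hugeList as $id) {
            if (isset($this->bigLookupTable[$id])) {
                // Call fastMethod only on the first item instead of looping;
                // per the description, most arrays have a single element anyway.
                $this->fastMethod($this->bigLookupTable[$id][0]);
                $c++;
            }
        }
        return $c;
    }
}

$a = new A2();
echo $a->slowMethodNoInnerLoop();   // one call per id present in the lookup table
```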
I am using fatfree framework with the cortex ORM plugin. I am trying to get the number of records matching a specific criteria. My code:
$group_qry = new \models\system\UserGroup;
$group_qry->load(array('type=?',$name));
echo $group_qry->count(); //always returns 3, i.e total number of records in table
Initially I thought this might be because the filtering wasn't working and it always fetched everything, but that is not the case, because I verified it with:
while (!$group_qry->dry()) {
    echo '<br/>'.$group_qry->type;
    $group_qry->next();
}
So how do I get the number of records actually loaded after filtering?
Yeah, this part is confusing: the count() method actually executes a SELECT COUNT(*) statement. It takes the same arguments as the load() method, so in your case:
$group_qry->count(array('type=?',$name));
It is not exactly what you need, since it will execute a second SELECT, which hurts performance.
What you need is to count the number of rows in the result array. Since this array is a protected variable, you'll need to create a dedicated function for that in the UserGroup class:
class UserGroup extends \DB\SQL\Mapper {
    function countResults() {
        return count($this->query);
    }
}
If you feel that's a bit of overkill for such a simple need, you can file a request asking for the framework to handle it. Sounds like a reasonable demand.
UPDATE:
It's now part of the framework. So calling $group_qry->loaded() will return the number of loaded records.
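With that in place, the original snippet only needs one extra line; a sketch under the assumption that the framework version in use already ships the loaded() method:

```php
$group_qry = new \models\system\UserGroup;
$group_qry->load(array('type=?', $name));
echo $group_qry->count();   // still SELECT COUNT(*): total matching rows in the table
echo $group_qry->loaded();  // number of records actually loaded by load()
```

No second SELECT is issued for loaded(); it simply reports the size of the result set already in memory.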
xfra35 is right: there is currently no method for just returning the loaded items when using the mappers as active record. Alternatively, you can just count the collection:
$result = $group_qry->find(array('type=?',$name));
echo count($result); // gives you the number of all found results
In addition, I've just added the countResults method, as suggested. Thx xfra35.
https://github.com/ikkez/F3-Sugar/commit/92fa18130892ab7d2303edc31d2b1f4bba70e881
I'm querying big chunks of data with CakePHP's find. I use recursive level 2. (I really need that much recursion, sadly.) I want to cache the results coming from the associations, but I don't know where to return them. For example, I have a Card table, and a Card belongs to an Artist. When I query something from Card, the find method runs on the Card table but not on the Artist table; yet I still get the Artist value for the Card's artist_id field, and I see a query like this in the query log:
SELECT `Artist`.`id`, `Artist`.`name` FROM `swords`.`artists` AS `Artist` WHERE `Artist`.`id` = 93
My question is how can I cache this type of queries?
Thanks!
1. Where does Cake "do" this?
CakePHP performs this really cool but, as you have discovered yourself, sometimes expensive operation in its different DataSource::read() method implementations (for the Dbo datasources, in DboSource::read()). As you can see, you have no direct hooks (callbacks) at the point where Cake reads the value of the $recursive option and decides whether to query your associations. BUT we have before and after callbacks.
2. Where to Cache the associated Data?
Such an operation is, in my opinion, best placed in the beforeFind and afterFind callback methods of your Model classes, OR equivalently in Model.beforeFind and Model.afterFind event listeners attached to the model's event manager.
The general idea is to check your Cache in the beforeFind method. If you have some data cached, change the $recursive option to a lower value (e.g. -1, 0 or 1) and do the normal query. In the afterFind method, you merge your cached data with the newly fetched data from your database.
Note that beforeFind is only called on the Model from which you are actually fetching the data, whereas afterFind is also called on every associated Model, thus the $primary parameter.
3. An Example?
// We are in a Model.
protected $cacheKey;

public function beforeFind($query) {
    if (isset($query["recursive"]) && $query["recursive"] == 2) {
        $this->cacheKey = generate_my_unique_query_cache_key($query); // Todo
        if (Cache::read($this->cacheKey) !== false) {
            $query["recursive"] = 0; // 1, -1, ...
            return $query;
        }
    }
    return parent::beforeFind($query);
}

public function afterFind($results, $primary = false) {
    if ($primary && $this->cacheKey) {
        if (($cachedData = Cache::read($this->cacheKey)) !== false) {
            $results = array_merge($results, $cachedData);
            // Maybe use Hash::merge() instead of array_merge,
            // or something completely different.
        } else {
            $data = ...;
            // Extract your data from $results here,
            // Hash::extract() is your friend!
            // But use debug($results) if you have no clue :)
            Cache::write($this->cacheKey, $data);
        }
        $this->cacheKey = null;
    }
    return parent::afterFind($results, $primary);
}
4. What else?
If you are having trouble with deep / high values of $recursion, have a look into Cake's Containable Behavior. This allows you to filter even the deepest recursions.
As another tip: sometimes such deep recursion can be a sign of a generally bad or suboptimal design (database schema, general software architecture, process and functional flow of the application, and so on). Maybe there is an easier way to achieve your desired result?
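With the Containable behavior attached, a deep fetch can be limited explicitly instead of pulling in everything $recursive = 2 would. A sketch using the model names from the question; the exact associations you contain depend on your schema:

```php
// In the Card model (app/Model/Card.php)
public $actsAs = array('Containable');

// In a controller: fetch Cards with only the associations you actually need,
// instead of relying on $recursive = 2
$cards = $this->Card->find('all', array(
    'contain' => array('Artist'),
));
```

Containable then rewrites the association queries so only Artist is joined in, which often removes the need for the caching workaround entirely.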
The easiest way to do this is to install the CakePHP Autocache Plugin.
I've been using this (with several custom modifications) for the last 6 months, and it works extremely well. It will not only cache the recursive data as you want, but also any other model query. It can bring the number of queries per request to zero, and still be able to invalidate its cache when the data changes. It's the holy grail of caching... ad-hoc solutions aren't anywhere near as good as this plugin.
Write the query result like this:
Cache::write("cache_name", $result);
When you want to retrieve the data from the cache:
$results = Cache::read("cache_name");
I have a Propel 1.6 generated class Group that has Inits related to it, and Inits have Resps related to them. Pretty straightforward.
I don't understand the difference between these two pieces of Propel code. Here in the first one, I re-create the $notDeleted criteria on every loop. This code does what I want -- it gets all the Resps into the $data array.
foreach ($group->getInits() as $init) {
    $notDeleted = RespQuery::create()->filterByIsDeleted(false);
    foreach ($init->getResps($notDeleted) as $resp) {
        $data[] = $resp;
    }
}
Here in the second code, I had the $notDeleted criteria pulled out of the loop, for (what I thought were) obvious efficiency reasons. This code does not work the way I want -- it only gets the Resps from one of the Inits.
$notDeleted = RespQuery::create()->filterByIsDeleted(false);
foreach ($group->getInits() as $init) {
    foreach ($init->getResps($notDeleted) as $resp) {
        $data[] = $resp;
    }
}
I thought it must be something to do with how the getResps() method caches the results, but that's not how the docs or the code reads in that method. The docs and the code say that if the criteria passed in to getResps() is not null, it will always get the results from the database. Maybe some other Propel cache?
(First off, I'm guessing you meant to use $init versus $initiative in your loops. That or there's some other code we're not seeing here.)
Here's my guess: In your second example you pull out the $notDeleted Criteria object, but each time through the inner foreach the call to getResps($notDeleted) is going to make Propel do a filterByInit() on the Criteria instance with the current Init instance. This will add a new WHERE condition to the SQL, but obviously a Resp can only have one Init.Id value, hence the lone result.
I don't think there is a good reason to pull that out, though; under the covers Propel just creates a new Criteria object by cloning the one you pass in, so no real memory is saved.
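If the accumulated WHERE conditions are indeed the cause, one possible workaround (a sketch, assuming the shared Criteria instance really is being mutated across iterations) is to keep the query template outside the loop but hand each getResps() call its own copy:

```php
$notDeleted = RespQuery::create()->filterByIsDeleted(false);
foreach ($group->getInits() as $init) {
    // clone gives each call a fresh Criteria, so no conditions from
    // previous iterations can accumulate on the shared instance
    foreach ($init->getResps(clone $notDeleted) as $resp) {
        $data[] = $resp;
    }
}
```

This keeps the readability benefit of defining the filter once while behaving like the first (working) example.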
I'm looking for a way to prevent repeated calls to the database if the item in question has already been loaded previously. The reason is that we have a lot of different areas that show popular items, latest releases, top rated etc. and sometimes it happens that one item appears in multiple lists on the same page.
I wonder if it's possible to save the object instance in a static array associated with the class and then check whether the data is already in there, but then how do I point the new instance to the existing one?
Here's a draft of my idea:
$baseball = new Item($idOfTheBaseballItem);
$baseballAgain = new Item($idOfTheBaseballItem);

class Item
{
    static $arrItems = array();

    function __construct($id) {
        if (isset(self::$arrItems[$id])) {
            // Point this instance to the object in self::$arrItems[$id]
            // But how?
        }
        else {
            // Call the database
            self::$arrItems[$id] = $this;
        }
    }
}
If you have any other ideas or you just think I'm totally nuts, let me know.
You should know that static variables only live for the duration of the request that created them: two users who load the same page and get served the same script still occupy two different memory spaces.
You should consider caching results; take a look at CodeIgniter database caching.
What you are trying to achieve is similar to a singleton factory:
$baseball = getItem($idOfTheBaseballItem);
$baseballAgain = getItem($idOfTheBaseballItem);

function getItem($id) {
    static $items = array();
    if (!isset($items[$id])) {
        $items[$id] = new Item($id);
    }
    return $items[$id];
}

class Item {
    // this stays the same
}
P.S. Also take a look at memcache. A very simple way to reduce database load is to create a /cache/ directory and save database results there for a few minutes, or until you deem the data old (this can be done in a number of ways, but most approaches are time-based).
You can't directly replace $this in a constructor. Instead, prepare a static function like getById($id) that returns the object from the list.
And as stated above: this will work only per page load.
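A minimal sketch of that getById() approach; the database call is replaced with a comment so the example stays self-contained:

```php
<?php
class Item
{
    private static $items = array();
    private $id;

    // Private constructor: instances can only be created via getById(),
    // so the identity map cannot be bypassed with "new Item(...)".
    private function __construct($id)
    {
        $this->id = $id;
        // A real implementation would load the row from the database here.
    }

    public static function getById($id)
    {
        if (!isset(self::$items[$id])) {
            self::$items[$id] = new self($id);
        }
        return self::$items[$id];
    }
}

$baseball = Item::getById(42);
$baseballAgain = Item::getById(42);
var_dump($baseball === $baseballAgain); // bool(true) -- same instance, one load
```

This is the classic identity-map pattern: repeated lookups of the same id within one request return the same object, and the database is hit at most once per id. As noted above, the map resets on every page load.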