Laravel 4: remember() results from ::all()

Is it possible to apply the remember(60) function to something like Service::all()?
This is a data set that will rarely change. I've attempted several variations with no success:
Service::all()->remember(60);
Service::all()->remember(60)->get();
(Service::all())->remember(60);
Of course, I am aware of the other caching methods available, but I prefer the cleanliness of this approach if it exists.

Yes, you should be able to simply swap the two, like so:
Change
Service::get()->remember(60);
to
Service::remember(60)->get();
An odd quirk, I agree, but I ran into this a few weeks back and realized that all I had to do was put remember($time_to_remember) in front of the rest of the query builder chain; it works like a charm.
For your perusing pleasure: See the Laravel 4 Query Builder Docs Here
/**
 * Indicate that the query results should be cached.
 *
 * @param  int     $minutes
 * @param  string  $key
 * @return \Illuminate\Database\Query\Builder
 */
public function remember($minutes, $key = null)
{
    list($this->cacheMinutes, $this->cacheKey) = array($minutes, $key);

    return $this;
}
L4 Docs - Queries
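So for the original example, a minimal usage sketch (the optional $key argument comes from the docblock above; the cache key name here is purely illustrative):

// Cache the full result set for 60 minutes
$services = Service::remember(60)->get();

// Or give the cached query an explicit cache key (second argument from the docblock)
$services = Service::remember(60, 'services.all')->get();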

Related

Loading of related documents in Doctrine ODM leads to too many queries

I'm stuck trying to reduce the number of database queries on a web API.
My database has 3 collections: playground, widget, token.
One playground has many widgets, and one widget has one token. Each relationship uses ReferenceOne/ReferenceMany.
So here are my simplified models:
/**
 * @MongoDB\Document()
 */
class Widget
{
    /**
     * @MongoDB\ReferenceOne(targetDocument="Token", inversedBy="widgets")
     */
    protected $token;

    /**
     * @MongoDB\ReferenceOne(targetDocument="Playground", inversedBy="widgets")
     */
    protected $playground;
}

/**
 * @MongoDB\Document()
 */
class Playground
{
    /**
     * @MongoDB\ReferenceMany(targetDocument="Widget", mappedBy="playground")
     */
    protected $widgets;
}

/**
 * @MongoDB\Document()
 */
class Token
{
    /**
     * @MongoDB\ReferenceMany(targetDocument="Widget", mappedBy="token")
     */
    protected $widgets;
}
I need to use the full playground with all its widgets and tokens, but by default Doctrine makes too many queries: one to get the playground (ok), one to get all widgets of the mapping (ok), and then one query per widget to get its token (not ok). Is there a way to query all tokens at once instead of fetching them one by one?
I've looked at prime but it does not seem to solve my problem...
Is there a way, other than using the query builder and manually hydrating all objects, to reduce the query count?
Edit:
As I added in my comment, what I'm looking for is to get the playground and all its dependencies as one big object, json-encode it and return it in the response.
What I do for now is query the playground and encode it, but Doctrine populates the dependencies in an inefficient way: first there is the query to get the playground, then there is one more query to get the related widgets, and then there is one query per widget to get its token.
As one playground can have hundreds of widgets, this leads to hundreds of database queries.
What I'm looking for is a way to tell Doctrine to get all this data using only 3 queries (one for the playground, one for the widgets and one for the tokens).
update: since the ArrayCollection in the $playground should contain all the widgets at least as proxy objects (or should get loaded when accessed), the following should work to fetch all required tokens...
Since the document manager keeps track of all managed objects, it should prevent additional queries from occurring. (Notice the omitted assignment from execute().)
$qb = $dm->createQueryBuilder('Token')->findBy(['widget' => $playground->getWidgets()]);
$qb->getQuery()->execute();
inspired by this page on how to avoid doctrine orm traps - point 5
old/original answer
I'm not quite familiar with mongodb, tbh, but according to doctrine's priming references, you should be able to somewhat comfortably hydrate your playground by:
$qb = $dm->createQueryBuilder('Widget')
    ->field('playground')->references($playground);
$qb->field('token')->prime(true);
$widgets = $qb->getQuery()->execute();
however, I might be so wrong.
What about this:
class Foo
{
    /** @var \Doctrine\ORM\EntityManagerInterface */
    private $entityManager;

    public function __construct(\Doctrine\ORM\EntityManagerInterface $entityManager)
    {
        $this->entityManager = $entityManager;
    }

    public function getAll(): array
    {
        $qb = $this->entityManager->createQueryBuilder();

        return $qb
            ->select('p', 'w', 't')
            ->from(Playground::class, 'p')
            ->join('p.widgets', 'w')
            ->join('w.token', 't')
            ->getQuery()
            ->getResult();
    }
}
At least with a MySQL backend, this fetch-join approach solves the "n+1 problem". Not sure about MongoDB, though.

Fastest method to save pageviews in database with doctrine

Using Symfony 4 with doctrine, I want to save pageviews for one entity Program in the database. I want to save this to the database because I want to give some users the rights to view these numbers. What I have done is add a property to the entity like this:
Program.php
/**
 * @ORM\Column(type="integer", nullable=true)
 */
private $pageViews;

/**
 * @return mixed
 */
public function getPageViews()
{
    return $this->pageViews;
}

/**
 * @param mixed $pageViews
 */
public function setPageViews($pageViews)
{
    $this->pageViews = $pageViews;
}
And in my ProgramController.php, in the showProgram function:
//...
$program->setPageViews($program->getPageViews()+1);
$em->persist($program);
$em->flush();
This works and adds 1 to the existing number every time the page is refreshed. My question is, is this an acceptable method or are there faster/better alternatives? And does this slow down performance or is that negligible?
Since you don't really need the entity you could directly do this with SQL, using Doctrine's connection:
$connection = $this->getDoctrine()->getConnection();
$connection->executeUpdate('UPDATE page_view_counter SET page_view = page_view+1;');
Or using a prepared statement:
$connection = $this->getDoctrine()->getConnection();
$statement = $connection->prepare(
'UPDATE programs SET page_views = page_views + 1 WHERE programs.id = :id'
);
$statement->bindValue('id', $id);
$statement->execute();
This would speed up things a bit by not using some of the more complex features of the ORM that you don't need for your case.
Another alternative for speeding things up could be to switch technologies, for example storing the counter in a cache like Redis. Whether this actually improves performance (especially under heavier load) would need to be verified with a load-testing tool such as JMeter.
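A minimal sketch of the Redis variant, assuming a Predis client is available (the key name and client wiring are illustrative, not from the original answer):

$redis = new \Predis\Client('tcp://127.0.0.1:6379');

// INCR is atomic, so concurrent page views never lose an increment
$views = $redis->incr(sprintf('program:%d:page_views', $program->getId()));

// The counter can later be flushed back to the programs table by a cron job or
// message handler so that privileged users can still read it from the entity.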

How to handle multiple concurrent updates in Laravel Eloquent?

Laravel 5.5
I'm wondering how to properly handle the possible case of multiple updates to the same records by separate users or from different pages by the same user.
For example, if an instance of Model_1 is read from the database responding to a request from Page_1, and a copy of the same object is loaded responding to a request from Page_2, how best to implement a mechanism to prevent a second update from clobbering the first update? (Of course, the updates could occur in any order...).
I don't know if it is possible to lock records through Eloquent (I don't want to use DB:: for locking, as you'd have to refer to the underlying tables and row ids), but even if it were possible, locking when loading the page and unlocking when submitting wouldn't be proper either (I'm going to omit the details).
I think detecting that a previous update has been made and failing the subsequent updates gracefully would be the best approach, but do I have to do it manually, for example by testing a timestamp (updated_at) field?
(I'm supposing Eloquent doesn't automatically compare all fields before updating, as this would be somewhat inefficient, if using large fields such as text/binary)
You should take a look at pessimistic locking; it is a feature that prevents any other update until the existing one is done.
The query builder also includes a few functions to help you do "pessimistic locking" on your select statements. To run the statement with a "shared lock", you may use the sharedLock method on a query. A shared lock prevents the selected rows from being modified until your transaction commits:
DB::table('users')->where('votes', '>', 100)->sharedLock()->get();
Alternatively, you may use the lockForUpdate method. A "for update" lock prevents the rows from being modified or from being selected with another shared lock:
DB::table('users')->where('votes', '>', 100)->lockForUpdate()->get();
Reference: Laravel Documentation
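Since the question prefers to stay in Eloquent rather than DB::, note that the same lock methods are proxied by the Eloquent builder as well. A minimal sketch, with the model and column names assumed for illustration:

DB::transaction(function () use ($id) {
    // The row stays locked until the transaction commits, so a concurrent
    // request will wait here instead of clobbering this update.
    $model = Model_1::where('id', $id)->lockForUpdate()->first();

    $model->field = 'new value';
    $model->save();
});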
What I came up with was this:
<?php

namespace App\Traits;

use Illuminate\Support\Facades\DB;

trait UpdatableModelsTrait
{
    /**
     * Lock record for update, validate updated_at timestamp,
     * and return true if valid and updatable, throws otherwise.
     * Throws on error.
     *
     * @return bool
     */
    public function update_begin()
    {
        $result = false;

        $updated_at = DB::table($this->getTable())
            ->where($this->primaryKey, $this->getKey())
            ->sharedLock()
            ->value('updated_at');

        $updated_at = \Illuminate\Support\Carbon::createFromFormat('Y-m-d H:i:s', $updated_at);

        if ($this->updated_at->eq($updated_at))
            $result = true;
        else
            abort(456, 'Concurrency Error: The original record has been altered');

        return $result;
    }

    /**
     * Save object, and return true if successful, false otherwise.
     * Throws on error.
     *
     * @return bool
     */
    public function update_end()
    {
        return parent::save();
    }

    /**
     * Save object after validating updated_at timestamp,
     * and return true if successful, false otherwise.
     * Throws on error.
     *
     * @return bool
     */
    public function save(array $options = [])
    {
        return $this->update_begin() && parent::save($options);
    }
}
Usage example:
try {
    DB::beginTransaction();

    $test1 = Test::where('label', 'Test 1')->first();
    $test2 = Test::where('label', 'Test 1')->first();

    $test1->label = 'Test 1a';
    $test1->save();

    $test2->label = 'Test 1b';
    $test2->save();

    DB::commit();
} catch (\Exception $x) {
    DB::rollback();
    throw $x;
}
The second save will abort, because the first save changed updated_at and the timestamps no longer match.
Notes:
This will only work properly if the storage engine supports row-locks. InnoDB does.
There is a begin and an end because you may need to update multiple (possibly related) models, and wish to see if locks can be acquired on all before trying to save. An alternative is to simply try to save and rollback on failure.
If you prefer, you could use a closure for the transaction (see the sketch after these notes)
I'm aware that the custom http response (456) may be considered a bad practice, but you can change that to a return false or a throw, or a 500...
If you don't like traits, put the implementation in a base model
I had to alter the original code to make it self-contained: if you find any errors, please comment.
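For the transaction-closure variant mentioned in the notes, a minimal sketch (DB::transaction rolls back automatically when the closure throws):

DB::transaction(function () {
    $test1 = Test::where('label', 'Test 1')->first();
    $test1->label = 'Test 1a';
    $test1->save(); // runs update_begin() via the trait before saving
});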

Symfony/Doctrine remove query in a loop

This is a bit of a followup on my previous question:
Symfony2 / Doctrine queries in loops. I thought it would be better to post this as a separate question though, as the first part has been solved.
I've updated some old query code in our product, because it was causing timeouts. I'm not very used to the whole Symfony/Doctrine/Repository/Entity concept (it's pretty confusing and convoluted in my limited experience), but I'm trying to fix it as best I can.
At first, the getRepository() call was inside a nested foreach loop, so I made a getUsersFromArray function in the repository so it doesn't have to loop over that (which was causing literally a million queries). I accomplished that by using a WHERE IN (:ids) statement. That helps a lot; at least it doesn't time out any more.
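A sketch of what such a repository method might look like (the exact implementation is not shown in the question; the field names and the Company type hint are assumptions):

public function getUsersFromArray(array $ids, Company $company)
{
    return $this->createQueryBuilder('u')
        ->where('u.id IN (:ids)')
        ->andWhere('u.company = :company')
        ->setParameter('ids', $ids)
        ->setParameter('company', $company)
        ->getQuery()
        ->getResult();
}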
I still have a foreach loop left over, which removes groups from users, so with 1000 users that's still 1000 queries... that should be doable in a single query, right?
$usersWhoShouldNotHaveGroup = $em->getRepository('AdminBundle:User')
    ->getUsersFromArray($session['usersNotHaveGroup'], $user->getCompany());

foreach ($usersWhoShouldNotHaveGroup as $u) {
    $u->removeGroup($group);
    $em->persist($u);
}
$em->flush();
With the removeGroup function being defined in the default FOSUserBundle:
public function removeGroup(GroupInterface $group)
{
    if ($this->getGroups()->contains($group)) {
        $this->getGroups()->removeElement($group);
    }

    return $this;
}
And groups being defined like this:
/**
 * @ORM\ManyToMany(targetEntity="Something\AdminBundle\Entity\Group", inversedBy="users", cascade={"persist"})
 * @ORM\JoinTable(name="user_group",
 *     joinColumns={@ORM\JoinColumn(name="user_id", referencedColumnName="id")},
 *     inverseJoinColumns={@ORM\JoinColumn(name="group_id", referencedColumnName="id")}
 * )
 */
protected $groups;
(user_group table just has 2 columns, user ids coupled to group ids, user can only be in a single group at a time)
My current code is listed here (point 3) as an antipattern, although I wouldn't know how to implement his solution in my case.
What I do not understand: according to the Symfony docs, the actual queries should only be executed once $em->flush() is called?
Then why does the profiler still show 1000 queries during the foreach loop?
How could I remedy this?
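One possible direction (not from the original thread) is to collapse the per-user removals into a single statement against the join table with DBAL. A rough sketch, assuming the user ids stored in the session and the user_group layout shown above; note it bypasses the ORM's collection tracking, so already-loaded entities would need to be refreshed:

$conn = $em->getConnection();
$conn->executeUpdate(
    'DELETE FROM user_group WHERE group_id = :groupId AND user_id IN (:userIds)',
    ['groupId' => $group->getId(), 'userIds' => $session['usersNotHaveGroup']],
    ['userIds' => \Doctrine\DBAL\Connection::PARAM_INT_ARRAY]
);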

Repositories Method For Dropdowns

I've previously seen repositories where developers have a method for retrieving fields to prepare dropdowns inside of a form.
This is something I'd like to take advantage of in my application, since I would use this logic in multiple areas for multiple entities.
It's something I want, but I can't quite reproduce it. I've done some research and haven't found it yet, even though I know I've seen it somewhere. Does anyone know where I can find this logic?
I finally came across something that helped me. I've also included a link for anyone that might be looking for something like this.
http://blog.dannyweeks.com/web-dev/repositories-in-laravel-sharing-my-base-repository
/**
 * Items for select options
 *
 * @param  string $data    column to display in the option
 * @param  string $key     column to be used as the value in the option
 * @param  string $orderBy column to sort by
 * @param  string $sort    sort direction
 * @return array           array with key value pairs
 */
public function getForSelect($data, $key = 'id', $orderBy = 'created_at', $sort = 'DESC')
{
    return $this->model
        ->with($this->relationships)
        ->orderBy($orderBy, $sort)
        ->lists($data, $key);
}
