I am calculating basketball stats and have the models Stat, User (which holds the basketball players), Team, Stat_Meta, Game, Season, and Substitution.
I have a view called statTable that can be added to any other view in the app. statTable basically just iterates through each player on the team and retrieves the calculation for each stat type (found in the Stat_Meta model). Within those calculations, queries are run against the Stat, Game, Season, etc. tables. By the time it iterates through every player and all their stats, we are looking at like 500 queries PER game (often we are going through ~30 queries/view, so you do the math, it's bad).
My question: With the Laravel debug bar installed, I can see that in my test environment, I've got 3,116 queries running when loading the front page, and 2,432 of them are duplicates. It takes forever to load as well. So, how can I re-work this system of queries to reduce the number of them?
Full disclosure, I'm not a CS person, so efficiency isn't something I'm trained in. Right now, I'm super happy this even works, but now it is going to cost me an arm and a leg to do all these queries at scale (not to mention horrible UX).
You could do some optimization of your queries by using Laravel's eager loading. Definition of eager loading from official documentation:
When accessing Eloquent relationships as properties, the relationship data is "lazy loaded". This means the relationship data is not actually loaded until you first access the property. However, Eloquent can "eager load" relationships at the time you query the parent model. Eager loading alleviates the N + 1 query problem.
You can read some great examples from the link I provided. I believe this will optimize your queries a lot.
Besides eager loading, you should always aim to accomplish as much as you can within your queries instead of processing data with PHP, Laravel collections, etc.
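A minimal sketch of what that could look like for the models in the question (the relation names `stats`, `game`, and `season` are assumptions, not taken from the actual codebase):

```php
// N+1 pattern: one query for the players, then one (or more) extra
// query per player the first time a relation is accessed in the loop.
$players = User::where('team_id', $team->id)->get();
foreach ($players as $player) {
    $player->stats; // triggers a fresh query for each player
}

// Eager-loaded version: the relations are fetched up front in a
// constant number of queries, regardless of how many players there are.
$players = User::with('stats.game', 'stats.season')
    ->where('team_id', $team->id)
    ->get();
foreach ($players as $player) {
    $player->stats; // already in memory, no additional query
}
```

With nested eager loading like `stats.game`, the statTable view can iterate every player and stat without touching the database again.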
Related
So I'm just about at my wit's end here. I've been tasked with optimizing a system because pages are loading too slowly. The system is built on Laravel 5.5 (yeah, I know, not the latest version, but that's a problem for another day). I've installed Clockwork to get a better feel for what could be causing things to be so slow and literally fell off my seat: loading a single page generated 10,412 queries in the database (you read that right, over 10,000 queries to load a single page)!
Anyway, I tweaked relationships so that things could be properly lazy/eager loaded when necessary, and that divided things by three... but that's still a lot considering how little information is currently displayed.
So I went on to try and tackle another problem they have: they defined custom attributes inside models that fetch information from three or four layers of relationships. Here's a "simple" example:
ride-->(departureStop, destinationStop)-->(state, event, city)
This path is what resolves the getNameAttribute() accessor defined inside the Ride model. So in essence, Laravel generates 5 queries just to calculate the name of a ride (said name being something like "Ottawa, ON to Toronto, ON"). Of course, being an attribute, this means we get those 5 queries FOR EVERY INSTANCE OF THE RIDE MODEL. And that's just one attribute; there are dozens of them just like this.
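As a hedged illustration of the pattern being described (all relation and field names here are assumptions), the accessor presumably looks something like this, and every property hop lazy-loads unless the relations were eager loaded first:

```php
// Hypothetical accessor on the Ride model; each relation access below
// fires its own query on a freshly loaded model: up to 5 per ride.
class Ride extends Model
{
    public function getNameAttribute()
    {
        return $this->departureStop->city->name . ', ' . $this->departureStop->state->code
             . ' to '
             . $this->destinationStop->city->name . ', ' . $this->destinationStop->state->code;
    }
}

// Eager loading the full relation path once per collection caps the
// query count no matter how many rides are listed:
$rides = Ride::with([
    'departureStop.city', 'departureStop.state',
    'destinationStop.city', 'destinationStop.state',
])->get();
```

The accessor itself doesn't change; it just finds the relations already hydrated instead of querying for them row by row.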
Oh, and just to add to the fun, we're also using the compoships package because some tables have composite primary keys.
I'm not sure what to include in terms of code here since everything is such a tangled mess, but every suggestion you can throw at me will be more than welcome.
I'm somewhat new to Laravel and working on my first full-stack Laravel project. I've been trying to follow Laravel best practices to help with optimization and performance. In the dashboard area of my app, the Laravel debug bar shows a model count of 253 on some of my pages, and the view in my dashboard layout has a count of 17.
Of those models, 249 are rows from a country model holding all the countries in the world (used as a dropdown select for users to pick their countries), plus notifications (that count varies per user).
From my knowledge so far, I know data like countries that rarely changes can be cached, but I'd like to know how many is too many when it comes to model and view counts, so I can watch out for those values in situations where I can't use the cache, or before data is cached.
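For the rarely-changing country list specifically, a hedged sketch of the caching idea mentioned above (the `Country` model and `countries` cache key are assumptions):

```php
// Cache the country list indefinitely; the closure only runs on a cache
// miss, so the 249-row query disappears from normal page loads.
$countries = Cache::rememberForever('countries', function () {
    return Country::orderBy('name')->get();
});

// Whenever the countries table is actually edited, clear the entry:
Cache::forget('countries');
```

This turns a per-request query into a one-time cost, which is usually the right trade for reference data that changes a few times a year at most.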
Thanks
That is fine: if you need to display 249 records from the table, then you will have to pull 249 records (or models).
What is an issue is when you involve relationships and run into problems such as the N+1 query problem, which is described here: What is the "N+1 selects problem" in ORM (Object-Relational Mapping)?
Laravel's Eloquent models lazy load by default. The problem is that this makes a lot of queries to the database, and especially during high traffic the Laravel application crashes, whereas a similar application built on Yii 1 has no issues.
After installing the Laravel debug bar, the problem is clearly too many queries being made on every page load. The next step is query optimization. I have been using eager loading as directed in Laravel's documentation, but there are still too many queries.
I was wondering if there is a way to set Eloquent to only "eager load" in the dev environment. That way, when a page fails to load, identifying the issue would be easier.
You could set default relations to eager load directly on the models:
class MyModel extends Model
{
    protected $with = ['relation'];
}
The solution for high database load is caching. Caching properly can give you incredible performance during high traffic, because it reduces common database queries to zero and redirects them to RAM, which is faster.
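As a hedged sketch of that idea (the cache key and the `Team` model are assumptions), an expensive, frequently-requested value can be served from the cache instead of being recomputed on every request:

```php
// Serve the count from the cache; the closure only runs on a miss.
// Note: the TTL argument is minutes in older Laravel versions (e.g. 5.x)
// and seconds in newer ones, so check your version's documentation.
$teamCount = Cache::remember('team_count', 10, function () {
    return Team::count();
});
```

The same pattern applies to any query whose result can tolerate being a few minutes stale.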
Enabling route caching will increase performance too:
php artisan route:cache
EDIT:
As Fx32 points out, you should make sure that you actually need Eloquent, and whether it wouldn't be better to query the DB directly, joining the tables you need and making a single query instead of many:
Cache is not a good solution as a fix for bad database querying. Eloquent is great, but often it's better to write proper queries with some joins. Don't just bash away at your DB and then throw all the results in a cache, as it will introduce new problems. If your use case is not a flat CRUD API, ActiveRecord patterns might not be the best solution anyway. If you carefully select and join results from the DB, and want to speed up retrieval of such items, caching can help.
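A minimal sketch of the single-joined-query approach using Laravel's query builder (the table and column names here are assumptions for illustration):

```php
// One query with a join, instead of one query per row for the relation.
// Each result row already carries the related user's name.
$posts = DB::table('posts')
    ->join('users', 'users.id', '=', 'posts.user_id')
    ->select('posts.*', 'users.name as author_name')
    ->get();
```

The trade-off is that you get plain result rows rather than Eloquent models, so model accessors and relationships are unavailable; for read-heavy list pages, that is often exactly what you want.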
I moved from an old server running CentOS at a managed host to a new one running Ubuntu in AWS.
Since the move I've noticed that the page that loads a list of items is now taking ~10-12 secs to render (sometimes even up to 74 secs). This was never an issue on the old server. I used New Relic to look at what was taking so long and found that sfPHPView->render() was taking 99% of the time. From New Relic there are approximately ~500 calls to the DB to render the page.
The page is a list of ideas, with each idea a row. I use the $idea->getAccounts()->getSlug() ability of Doctrine 1.2, where Account is another table linked to the idea as a foreign relation. This is called several times for each idea row. A partial is not currently used to hold the code for each row element.
1. Is there a performance advantage to using a partial for the row element (ignoring, for now, the benefit of code maintainability)?
2. What is best practice for referencing data connected via a foreign relation? I'm surprised that a call is made to the DB every time $idea->getAccounts()->getSlug() is called.
3. Is there anything obvious in Ubuntu that would otherwise make sfPHPView->render() run slower than on CentOS?
I'll give you my thoughts:
1. When using a partial for a row element, it's easier to cache, because you can fine-tune the caching per partial.
2. Because you don't explicitly define the relation when making the query, Doctrine won't hydrate the elements with the relation. You can manually define the relations you want to hydrate; then your call to $idea->getAccounts()->getSlug() won't perform a new query every time:
$q = $this->createQuery();
$q->leftJoin('Idea.Account');
No idea for point 3.
PS: for point 2, it's very common to have lots of queries in the admin generator when you want to display information from a relation (in the list view). The solution is to define the method used to retrieve the data:
In your generator.yml:
list:
  table_method: retrieveForBackendList
In the IdeaTable:
public function retrieveForBackendList(Doctrine_Query $q)
{
    $rootAlias = $q->getRootAlias();
    $q->leftJoin($rootAlias . '.Account');

    return $q;
}
Thought I would add what else I did to improve the page load speed, in addition to jOk's recommendations.
In addition to the explicit joins, I did the following:
1. Switched to returning a DQL Query object, which was then passed to Doctrine's paginator
2. Changed from using include_partial() to PHP's include(), which removed the object creation overhead of include_partial()
3. Hydrated the data from the DB as an array instead of an object
4. Removed some foreach loops by doing more leftJoins in the DB
5. Used result & query caching to reduce the number of DB calls
6. Used view caching to reduce PHP template generation time
Interestingly, doing 1 to 4 made 5 and 6 more effective and easier to implement. I think there is something to be said for improving your code before jumping in with caching.
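For points 1 and 3, a hedged Doctrine 1.2 sketch of joining the relation and hydrating as arrays (the alias and the `slug` column are assumptions):

```php
// One joined query, hydrated straight to nested arrays: no per-row
// relation queries and no Doctrine record objects to build.
$ideas = Doctrine_Query::create()
    ->from('Idea i')
    ->leftJoin('i.Account a')
    ->execute(array(), Doctrine_Core::HYDRATE_ARRAY);

// The relation's data is nested under its alias in each row:
$slug = $ideas[0]['Account']['slug'];
```

Array hydration is markedly cheaper than object hydration for read-only list views, at the cost of losing record methods like getSlug().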
Hey,
I've got a bit of a problem right now trying to figure out how to model a specific many-to-many relationship in Mongo.
I have an event scheduling system for the CRM I am building that allows events to be assigned to both users and teams. These events are particular to each lead.
So for example, I have a call at 5:00pm Thursday with Jimmy Dolittle. My sales team also has a call Thursday at 7:00am with Bob Jones.
If this were SQL, I would just create a leads table, an events table, a users table, and a teams table. I was thinking about putting the events in the users collection and in the teams collection, but then the problem arises when I have a list of leads and want to display the callback date next to each lead. Referencing like that in Mongo is going to be sloooow with a list of 500 leads.
I was also thinking about storing the events in the leads collection, but that would mean I would have to do the same sort of search for leads with events assigned to a particular user or team (there might be 500,000 leads in the database but only 500 with events for a particular user).
This kind of relationship is just going to be a problem in Mongo. In that situation, I would probably write a function for connecting those objects at the application level. Whenever a connection is made, save the relationship in both objects. Then you can search either direction with ease. You'll have redundant data and that causes a risk of getting them out of sync, but that's the price you have to pay with a non-relational structure. Your updates won't be as fast since you'll have to update two docs, but your selects should be speedy.
As Tim suggests, a good idea would be to solve this at the app level.
What I'd do here is create a new collection, 'Events', then store an array of _ids of related events inside the user and team objects. From there it will be super fast to do a lookup. It may mean a lot more queries, but queries on the _id field alone are highly optimised and not very resource intensive (unless you have millions of events per user), so if a team has the app up they can see their events, and if a user has theirs up they can see their events.
Also, I recommend storing back links to the user and team _ids in the event object. Yes, this is redundant data, but it's only a reference and, if managed properly at the app level, should keep the schema nice and tidy.
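A sketch of the resulting document shapes, written here as plain PHP arrays (all field names and ids are hypothetical illustrations, not a prescribed schema):

```php
// An event holds back-links to its assignees and its lead.
$event = [
    '_id'      => 'evt123',
    'lead_id'  => 'lead456',
    'when'     => '2013-05-02T17:00:00Z',
    'user_ids' => ['user1'],   // back-links: which users it's assigned to
    'team_ids' => ['teamA'],   // back-links: which teams it's assigned to
];

// Users and teams hold forward links, so either side of the
// relationship can be resolved with cheap _id-only lookups.
$user = [
    '_id'       => 'user1',
    'event_ids' => ['evt123'],
];

$team = [
    '_id'       => 'teamA',
    'event_ids' => ['evt123'],
];
```

The app-level rule is simply: whenever an event is assigned or unassigned, update both sides of the link in the same operation so the redundant references stay in sync.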
Best of Luck.