I have a RESTful project, and when I run a stress test with a maximum of 500 concurrent users, the route returns status:error.
I tried to optimize the Laravel queries and decrease the amount of data returned by each request,
but it made no visible difference in performance.
So the question is: how can I optimize my project, and which tools are helpful for getting the best performance possible?
What I have tried so far:
Replace whereHas and whereDoesntHave with whereIn and whereNotIn (one such change is sketched below)
Remove appends from the models
Return only the data that is actually used and strip extra fields from the responses
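Roughly what the whereHas-to-whereIn change looked like; the model and column names here are placeholders, not my real schema:

    $activeUserIds = Order::where('status', 'paid')->pluck('user_id');

    // before: correlated EXISTS subquery generated by whereHas()
    $users = User::whereHas('orders', function ($query) {
        $query->where('status', 'paid');
    })->get();

    // after: a plain WHERE IN (...) built from a pre-fetched id list
    $users = User::whereIn('id', $activeUserIds)->get();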
There are a few critical areas you should check carefully:
Use a SQL profiler to find out which queries are slow.
Check for and remove duplicate SQL queries.
Use caching with Redis wherever it is warranted.
Cache the config, routes, etc.; you can run the php artisan optimize command.
Remove unused services from the config.
Optimize the Composer autoloader: composer install --prefer-dist --no-dev -o
Cache the results of queries whose data rarely changes; e.g. if you have categories, cache them and refresh the cache when a category is updated (see the sketch after this list).
Use a CDN for static files such as images, JS, and CSS.
Move big jobs to a queue and use a queueing system wherever possible.
Use Laravel Octane to increase performance.
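A minimal sketch of the category-caching point above; the key name and TTL are arbitrary, and note that older Laravel versions interpret the TTL as minutes rather than seconds:

    use Illuminate\Support\Facades\Cache;

    // cache the category list (TTL in seconds on Laravel 5.8+)
    $categories = Cache::remember('categories.all', 3600, function () {
        return Category::all();
    });

    // whenever a category is created/updated/deleted, drop the cache
    // (e.g. from a model observer) so the next request rebuilds it
    Cache::forget('categories.all');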
I can't suggest more without looking at the code, so these are only general points.
barryvdh/laravel-debugbar is a great package for tracking the execution time of each query, memory usage, and other information.
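For reference, it installs as a dev dependency via Composer:

    composer require barryvdh/laravel-debugbar --dev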
Recently I developed a fully automated DB cache solution for MySQL + Redis. Based on the SQL queries it builds the cache, and it resets and updates itself without any manual cache management. It also handles multiple joins, inserts, deletes, and updates. It worked great in production in my previous project, which was written in PHP from scratch, without a framework.
I would like to make this plugin compatible with Laravel.
The problem is finding the best injection point, so that SQL queries can be intercepted and a cached result returned instead of executing the SQL.
The solution must be universal: it has to intercept, handle, and return the cached result for all SQL, no matter which ORM or database is used.
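One injection point I have been experimenting with is a custom connection class. This is only a sketch (the class name, key scheme, and TTL are made up, and it would still need to be registered through a custom connection resolver), but it shows the kind of interception I mean:

    use Illuminate\Database\MySqlConnection;
    use Illuminate\Support\Facades\Cache;

    class CachingMySqlConnection extends MySqlConnection
    {
        // intercept every SELECT and serve it from the cache when possible
        public function select($query, $bindings = [], $useReadPdo = true)
        {
            $key = 'sqlcache:' . md5($query . serialize($bindings));

            return Cache::remember($key, 600, function () use ($query, $bindings, $useReadPdo) {
                // cache miss: let the parent connection actually hit MySQL
                return parent::select($query, $bindings, $useReadPdo);
            });
        }
    }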
I will be glad to share the final solution as open source once it's done.
Any ideas?
So I switched from using Laravel's eager loading to checking the cache first before hitting the DB. I've got it to the point where it's only doing 8 queries to the database per request, versus the 100+ queries it was doing before.
However, the response time is now slower than it was before. Per this stackoverflow question, Laravel Caching with Redis is very slow, it seems that using the cache facade is a lot slower than calling redis/memcached directly.
Why is the cache facade slower than calling the cache directly? Should I switch over to using redis/memcached directly? What can I do to improve the performance of my application while still using the cache facade, if possible?
Leveraging caching as much as possible and minimizing the number of queries to the DB is a requirement for this project I'm working on.
Another thing to note is that I'm using both memcached and redis as cache drivers: some items stored in memcached are shared between the services I'm writing and the old services, while items that are exclusive to the new services can use Redis instead.
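To be concrete, this is the difference I'm talking about (the key name is just an example; note that the cache store typically adds its own key prefix, so the raw Redis keys differ):

    use Illuminate\Support\Facades\Cache;
    use Illuminate\Support\Facades\Redis;

    // through the cache facade: goes via Laravel's cache repository/store layer
    $viaFacade = Cache::store('redis')->get('user:42:profile');

    // "directly": straight through the redis facade to phpredis/predis
    $direct = Redis::connection()->get('user:42:profile');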
I'm working on a Laravel 5.3.30 app that is performing unacceptably slowly, and it seems like there is some issue beyond what any level of optimization can solve. I'm developing the app in the following environment:
Wampserver 3.0.6 64 bit including PHP 7, MySQL 5.7.14, and Apache 2.4.23
Windows 10 Pro 64 bit
ORM: Doctrine 2
Hardware: ZBook 17 G2, i7-4710MQ Processor, 8gb Memory, Quadro K2200M, SSD
I've read about Laravel performance, and the advice seems to be not to pay too much attention to speed, since the ample performance of production servers will handle the framework's bloat; I plan to put it on a VPS running on SSD drives. But my hardware isn't all that slow either and also runs on an SSD, so I'm not so sure. I feel that there is something fundamentally wrong with the setup. Below I've attached the debugbar results for loading a very basic page that runs a single SQL query (selecting the user to verify the session), along with the full SQL log with general_log enabled for the request.
I've run the usual optimizations with minimal effect:
php artisan optimize --force
php artisan config:cache
php artisan route:cache
php artisan clear-compiled
caching
I've seen many posts with people stating that achieving <100ms for basic requests on Laravel 5 like the one I benchmarked for this post is a non-issue, so I assume there is something else that's going on here, and I'm hoping someone can point me to what is causing this slow down.
Thanks!
Update
It just occurred to me as soon as I posted this, that I'm using a symlink inside the webroot to connect to the public folder in the Laravel app (which is outside the webroot), and I wonder if this symlink is what is causing the slowdown? I will update if I get the chance to benchmark this.
Update 2
As per the suggestion to test the speed for an Ajax request, the results from the debugbar below show it's just as slow, if not slower.
So having used PHP's get_included_files(), it turned out that Laravel was including some 700 files, and I thought this was the culprit. But after asking a couple of folks on Slack, who reported similar include counts, I realized that I was caching wrong. First off, I should have been running the optimize command after the config:cache command, as in:
php artisan config:cache
php artisan optimize --force
And sure enough, this reduced the number of included files to under 500. As per advice from Slack, I also cached as many of the remaining files as possible using OPcache, which helped as well. So now it's running at acceptable performance, and I imagine a more fine-tuned OPcache strategy would speed things up even more.
As the time needed to run the complete PHPUnit suite grows, our team has started wondering whether it is possible to run unit tests in parallel. Recently I read an article about Paraunit, and Sebastian Bergmann wrote that he'll add parallelism to PHPUnit 3.7.
But there remains the problem of integration tests or, more generally, tests that interact with the DB. For the sake of consistency, the test DB has to be reset and fixtures loaded after each test. But with parallel tests there is a problem with race conditions, because all processes use the same DB.
So to be able to run integration tests in parallel, we would have to assign each process its own database. I would like to ask whether anyone has thoughts on how this problem can be solved. Maybe there are already solutions to it in another xUnit implementation.
On my team we are using MongoDB, so one solution would be to programmatically create a config file for each PHPUnit process with a generated DB name (for that process), and in the setUp() method we could clone the main test DB into this temporary one. But before we start implementing this approach, I would like to ask for your ideas on the topic.
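To illustrate what I mean, a very rough sketch of the setUp() idea; the client calls and names below are just placeholders for whatever Mongo client we end up using:

    protected function setUp()
    {
        parent::setUp();

        // one database per PHPUnit process, e.g. named after the pid
        $processDbName = 'testdb_' . getmypid();

        $client = new MongoClient();                  // placeholder: our Mongo client
        $this->db = $client->selectDB($processDbName);

        // here we would clone/load the fixtures from the main test DB
        // into this per-process database before each test
    }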
This is a good question: preparing for parallel unit tests is going to require learning some new Best Practices, and I suspect some of them are going to slow our tests down.
At the highest level, the advice is: avoid testing with a database wherever possible. Abstract all interactions with your database, and then mock that class. But you've already noted your question is about integration tests, where this is not possible.
When using PDO, I generally use sqlite::memory:. Each test gets its own database; it is anonymous and automatically cleaned up when the test ends. (But I noted some problems with this when your real application is not using SQLite: Suggestions to avoid DB deps when using an in-memory sqlite DB to speed up unit tests.)
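A minimal sketch of that pattern (the schema here is obviously just an example):

    protected function setUp()
    {
        // each test gets its own anonymous in-memory database;
        // it disappears as soon as the PDO object is destroyed
        $this->pdo = new PDO('sqlite::memory:');
        $this->pdo->exec('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)');
    }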
When using a database that does not have an in-memory option, create the database with a random name. If the parallelization is at the PHPUnit process level, which is quite coarse, you could use the process PID, but that has no real advantage over a random name. (I know PHP is single-threaded, but perhaps in the future we will have a custom PHPUnit module that uses threads to run tests in parallel; we might as well be ready for that.)
If you have the xUnit Test Patterns book, chapter 13 is about testing databases (relatively short). Chapters 8 and 9 on transient vs. persistent fixtures are useful too. And, of course, most of the book is on abstraction layers to make mocking easier :-)
There is also this awesome library, fastest, for executing tests in parallel. It is optimized for functional/integration tests, giving an easy way to work with N databases in parallel.
Our old suite ran in 30 minutes; it now runs in 7 minutes on 4 processors.
Features
Functional tests can use a database per processor via an environment variable (see the sketch after the usage example).
Tests are randomized by default.
It is not coupled to PHPUnit; you can run any command.
It is developed in PHP with no dependencies.
As input you can use a phpunit.xml.dist file or a pipe.
It includes a Behat extension to easily pipe scenarios into fastest.
Increase verbosity with the -v option.
Usage
find tests/ -name "*Test.php" | ./bin/fastest "bin/phpunit -c app {};"
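In the test bootstrap or config you can then derive a database name per parallel process from the environment variable fastest sets (ENV_TEST_CHANNEL_READABLE, if I remember the name correctly; check the fastest README):

    // pick one database per parallel process (variable name assumed from the fastest docs)
    $channel = getenv('ENV_TEST_CHANNEL_READABLE') ?: '1';
    $databaseName = 'myapp_test_' . $channel;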
You can avoid integration test conflicts in two ways:
running in parallel only those tests that use very different tables of your database, so they don't conflict
creating a new database for the conflicting tests
Of course, you can combine these two solutions. I don't know of any PHPUnit test runner that supports either approach, so I think you'd have to write your own test runner to speed up the process... By the way, you can still group your integration tests and run only a few of them at once during development...
Be aware that the same kinds of conflicts can cause concurrency issues under heavy load in PHP. For example, if you lock two files in reverse order in two separate controller actions, your application can end up in a deadlock (sketched below)... I am looking for a way to test concurrency issues in PHP, but no luck so far. I don't currently have time to write my own solution, and I'm not sure I could manage it; it's pretty hard stuff... :S
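A stripped-down sketch of the file-lock deadlock I mean (file names are arbitrary):

    // controller action A
    $a = fopen('/tmp/a.lock', 'c');
    $b = fopen('/tmp/b.lock', 'c');
    flock($a, LOCK_EX);
    flock($b, LOCK_EX); // blocks if action B already holds b.lock

    // controller action B, handled concurrently by another PHP process
    $b = fopen('/tmp/b.lock', 'c');
    $a = fopen('/tmp/a.lock', 'c');
    flock($b, LOCK_EX);
    flock($a, LOCK_EX); // blocks if action A already holds a.lock
    // with unlucky timing each request waits forever for the other: deadlock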
In case your application is coupled to a specific vendor, e.g. PostgreSQL, you can create separate stacks with Docker and docker-compose, then group the tests by purpose, e.g. model tests, controller tests, and so on.
For each group, deploy a dedicated stack in your pipeline using docker-compose and run the tests via Docker. The idea is to give each group a separate environment with a separate database, so you avoid the conflicts.
I've just started using Yii and managed to finish my first app. Unfortunately, launch day is close and I want this app to be super fast. So far, the only way of speeding it up I've come across is standard caching. What other ways are there to speed up my app?
First of all, read Performance Tuning in the official guide. Additionally:
Check HTTP caching.
Update your PHP. Each major version gives you a good boost.
Use redis (or at least the database) for sessions, since default PHP sessions use files and are blocking; see the config sketch at the end of this answer.
Consider using nginx instead of (or alongside) Apache. It serves content much better.
Consider using CDN.
Tweak your database.
These are all general things that are relatively easy to do. If it's not acceptable afterwards, do not assume. Profile.
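For the sessions point above, a minimal config sketch, assuming Yii 2 with the yiisoft/yii2-redis extension (in Yii 1.x the same idea works via CCacheHttpSession):

    // config/web.php (fragment)
    'components' => [
        'redis' => [
            'class' => 'yii\redis\Connection',
            'hostname' => 'localhost',
            'port' => 6379,
            'database' => 0,
        ],
        'session' => [
            // sessions stored in redis instead of files: no disk I/O, shared across servers
            'class' => 'yii\redis\Session',
        ],
    ],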
1. Following best practices
In this recipe, we will see how to configure Yii for best performances and will see some additional principles of building responsive applications. These principles are both general and Yii-related. Therefore, we will be able to apply some of these even without using Yii.
Getting ready
Install APC (http://www.php.net/manual/en/apc.installation.php)
Generate a fresh Yii application using yiic webapp
2. Speeding up sessions handling
Native session handling in PHP is fine in most cases. There are at least two possible reasons why you will want to change the way sessions are handled:
When using multiple servers, you need to have a common session storage for both servers
Default PHP sessions use files, so the maximum performance possible is limited by disk I/O
3. Using cache dependencies and chains
Yii supports many cache backends, but what really makes Yii cache flexible is the dependency and dependency chaining support. There are situations when you cannot just simply cache data for an hour because the information cached can be changed at any time.
In this recipe, we will see how to cache a whole page and still always get fresh data when it is updated. The page will be dashboard-type and will show the five latest articles added and a total calculated for an account. Note that an operation cannot be edited after it was added, but an article can.
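A minimal data-caching sketch of the dependency idea, assuming Yii 1.1 (the table, column, key, and TTL are illustrative):

    // invalidate the cached value as soon as anything in the article table changes
    $dependency = new CDbCacheDependency('SELECT MAX(update_time) FROM article');

    $articles = Yii::app()->cache->get('dashboard.latest_articles');
    if ($articles === false) {
        $articles = Article::model()->findAll(array(
            'order' => 'id DESC',
            'limit' => 5,
        ));
        Yii::app()->cache->set('dashboard.latest_articles', $articles, 3600, $dependency);
    }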
4. Profiling an application with Yii
If all of the best practices for deploying a Yii application are applied and you still do not have the performance you want, then most probably, there are some bottlenecks with the application itself. The main principle while dealing with these bottlenecks is that you should never assume anything and always test and profile the code before trying to optimize it.
If most of your app is cacheable, you should try a proxy like Varnish.
Go for general PHP and MySQL performance tuning:
1) Memcached
Memcached is an open-source distributed memory object caching system; it helps you speed up dynamic web applications by reducing database server load (a plain-PHP sketch follows this list).
2) MySQL performance tuning
3) Web server performance tuning for PHP
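A plain-PHP sketch of memcached used as a read-through cache in front of MySQL (connection details and the key are placeholders):

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');

    $mc = new Memcached();
    $mc->addServer('127.0.0.1', 11211);

    $user = $mc->get('user:42');
    if ($user === false) {
        // cache miss: hit the database once, then keep the result for 5 minutes
        $stmt = $pdo->prepare('SELECT * FROM users WHERE id = ?');
        $stmt->execute([42]);
        $user = $stmt->fetch(PDO::FETCH_ASSOC);
        $mc->set('user:42', $user, 300);
    }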