I've started using Laravel 5 for a medium-complexity app, and although I was expecting a minor performance impact, the framework is proving to be worryingly slow.
I've been developing in a local environment, running WAMP on Windows 8 (i7, 8 GB RAM).
I've run php artisan optimize and enabled APC, and I'm still not getting reasonable results.
Should I expect much higher hardware requirements compared to an app developed without any framework (no MVC, routing, templating)?
For testing I've set up the simplest route:
public function test(){
return 1;
}
and even then the request takes ~250-300 ms (measured in Chrome).
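For reference, that method is a controller action; the route pointing at it might look like the following (the controller name is made up here, and in Laravel 5.0-5.2 routes live in app/Http/routes.php):
// app/Http/routes.php - minimal test route (sketch)
Route::get('test', 'TestController@test');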
This is particularly important for my API functionality.
I'm wondering how much my app could be optimized in a production server environment to minimize response time.
Is it possible to remove unused Laravel components?
Does Laravel provide other optimization mechanisms?
Are there better caching alternatives to APC?
Would an SSD be of any help?
Any help on this topic is greatly appreciated. Thanks for your time!
The results you are showing are indeed not good.
Do you have any Linux environment to measure performance of the same "test route application" there?
I believe results might be much better.
As a simpler option to check: you can try running Homestead and measuring performance inside a Linux virtual machine; a rough setup sketch follows.
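A minimal Homestead setup sketch, assuming Vagrant and VirtualBox are already installed (paths are illustrative and follow the Laravel documentation for a standalone install):
git clone https://github.com/laravel/homestead.git Homestead
cd Homestead && bash init.sh   # on Windows run init.bat instead; this generates Homestead.yaml, where you map your app folder and a site
vagrant up                     # boots the Linux VM and serves the mapped site from it
Once the VM is up, hitting the same test route through the Homestead site gives a like-for-like comparison against WAMP.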
I'm working on a Laravel 5.3.30 app that is performing unacceptably slowly, and it seems there is some issue beyond what any level of optimization can solve. I'm developing the app in the following environment:
Wampserver 3.0.6 64 bit including PHP 7, MySQL 5.7.14, and Apache 2.4.23
Windows 10 Pro 64 bit
ORM: Doctrine 2
Hardware: ZBook 17 G2, i7-4710MQ processor, 8 GB memory, Quadro K2200M, SSD
I've read about Laravel performance, and the advice seems to be not to worry too much about speed, since the ample performance of a production server will absorb the framework's overhead; I do plan to put the app on a VPS running on SSD drives. But my own hardware isn't slow and also runs on an SSD, so I'm not convinced. I feel that something is fundamentally wrong with the setup. I've attached the Debugbar results below for loading a very basic page that runs a single SQL query (selecting the user to verify the session), along with the full SQL log captured with general_log enabled for the request.
I've run the usual optimizations with minimal effect:
php artisan optimize --force
php artisan config:cache
php artisan route:cache
php artisan clear-compiled
caching
I've seen many posts where people state that achieving <100 ms for basic Laravel 5 requests like the one benchmarked here is a non-issue, so I assume something else is going on, and I'm hoping someone can point me to what is causing this slowdown.
Thanks!
Update
It occurred to me as soon as I posted this that I'm using a symlink inside the webroot to reach the public folder of the Laravel app (which lives outside the webroot), and I wonder if this symlink is causing the slowdown. I will update if I get the chance to benchmark this.
Update 2
As per the suggestion to test the speed for an Ajax request, the results from the debugbar below show it's just as slow, if not slower.
Using PHP's get_included_files(), it turned out that Laravel was including some 700 files, and I thought this was the culprit. But after asking a couple of folks on Slack, who had similar include counts, I realized that I was caching wrong. First off, I should have been calling the optimize command after the config:cache command, as per:
php artisan config:cache
php artisan optimize --force
And sure enough, this reduced the number of files being included to under 500. As per advice from Slack, I then took the remaining files and cached as many of them as possible using OPcache, which helped as well. So now it's running at acceptable performance, and I imagine a more fine-tuned OPcache strategy would speed things up even more.
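As a rough illustration of that kind of fine-tuning, here is a minimal pre-warming sketch, assuming you have dumped the output of get_included_files() from a representative request into a text file (the script name and file path are made up for the example):
<?php
// warm-opcache.php - hypothetical pre-warming script (sketch).
// Compiles each "hot" file into OPcache so the first real request
// does not pay the compilation cost.
$files = file('storage/framework/hot-files.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: [];
foreach ($files as $file) {
    if (is_file($file)) {
        opcache_compile_file($file);
    }
}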
We're looking for a PHP framework to work with in the future and are currently testing things out with Symfony 2. For this, we've redesigned our API and implemented it as a bundle in Symfony. It turns out that Symfony seems to be very slow - actually far slower than our old (not even well-designed) system.
We tried to optimise the performance by caching the byte code (using APC for this). While we've noticed a huge improvement (before: about 3 seconds to load the API; after: 0.6 seconds on average, which is still 0.5 seconds slower than our old system without APC), we're kind of excited - but still not really pleased with the high loading time of such an easy task as getting one result out of an almost empty database.
I don't know, but I could imagine this is due to Symfony autoloading all classes, even when not needed for a specific bundle.
Now, before we deep-six Symfony, we'd like to look out for further optimisations, possibly a way to exclude unneeded components in a specific bundle, as I personally think this would make a big difference.
I'd be thankful for any ideas on how to further improve the performance, experience reports with using Symfony or anything else that could be helpful for us on the lookout for a framework.
Edit:
Some information about the testing environment:
Operating system: Ubuntu 12.04.4 LTS (GNU/Linux 3.8.0-38-generic x86_64)
Apache version: Apache/2.2.22 (Ubuntu)
PHP version: 5.3.10-1ubuntu3.13
Notable PHP extensions: APC
Also, all tests are done on a local copy of our system, so possible network issues can be excluded.
These points can optimise your application performance:
Upgrade PHP. The performance gain between PHP 5.3 and PHP 5.4 is very high. PHP 5.5 would be even better, but it's not supported by all distributions, like Debian 7.
NGINX is faster than Apache and the configuration is easier.
Using PHP-FPM with NGINX is a good combination. You can also run your PHP with HHVM, which is on average 2x faster than PHP-FPM, provided you replace Symfony/Assetic with Grunt. Caution: HHVM requires extra care and testing before it can be deployed safely. You can follow these two articles (in French): JoliCode and Big Brains Company.
The PHP APC extension is deprecated. XCache, Memcached, or Redis are better options and are also better supported at the moment. For PHP >= 5.5, APCu can be used as a replacement for APC's user cache (see the sketch below).
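A minimal user-land caching sketch with APCu (the cache key and the build_routes() helper are invented for illustration; assumes the apcu extension is installed):
<?php
// Try the cache first; on a miss, rebuild the value and store it with a one-hour TTL.
$routes = apcu_fetch('app.routes', $hit);
if (!$hit) {
    $routes = build_routes();                 // hypothetical expensive computation
    apcu_store('app.routes', $routes, 3600);  // cache for one hour
}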
Additionally, you can read a few articles which talk about Symfony2 optimisation and provide Twig benchmarks.
PHP articles:
High-performance scripting (in French)
Comparison of PHP 5.3 and PHP 5.6 performance (in French)
An article to help you understand PHP optimisation (in French)
Google's recommendations for optimising PHP applications (in English)
10 best practices to optimise PHP (in English)
Symfony2 and Twig articles:
The Symfony documentation gives a few tips on how to build a performant application (in English)
Template optimisation (in French)
Twig include optimisation (in French)
Use the @Cache annotation on controllers (in English) - see the sketch after this list
Symfony2: Good practices (in French)
Limit the usage of unnecessary bundles
Symfony2 Twig performance optimisation (in English)
If you don't want to use the Twig engine, you can disable it (in French)
Caching in Symfony, from its Cookbook - really impressive!
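For the @Cache annotation item above, a minimal sketch using SensioFrameworkExtraBundle (the controller and action names are made up):
<?php
use Sensio\Bundle\FrameworkExtraBundle\Configuration\Cache;

class DemoController
{
    /**
     * Mark the response as publicly cacheable for ten minutes so a
     * reverse proxy (Varnish, or Symfony's HttpCache) can serve it
     * without invoking the controller again.
     *
     * @Cache(expires="+10 minutes", public=true)
     */
    public function indexAction()
    {
        // ...
    }
}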
Other optimisations:
Maybe you can use an updated version of Ubuntu.
Personally, I prefer using Debian which is also popular for servers because it's very stable.
Using a cache proxy like Varnish can be a finishing touch.
Varnish requires developer involvement and possibly some training. Using the NGINX FastCGI cache to limit the FastCGI requests reaching HHVM, FPM or PHP-NG can also improve response times.
Have you looked at this blog post? http://symfony.com/blog/push-it-to-the-limits-symfony2-for-high-performance-needs
Based on your information, I'd advise trying PHP 5.5 or 5.6 and NGINX with PHP-FPM; it can be 40% faster or more.
You can try the approach suggested in this article: http://stfalcon.com/en/blog/post/performance-symfony2-doctrine2-orm
The author suggests that you (two of these are sketched after the list):
Fetch all necessary associations up front (avoiding extra lazy-loading queries)
Update multiple entities with a single query
Get data as an associative array
Use reference proxies
Don't forget to use the Symfony profiler toolbar.
The code described is available on GitHub: https://github.com/lensky84/performance
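A brief sketch of two of those techniques using standard Doctrine 2 APIs (the entity class, fields, and variables are invented for illustration):
<?php
// Hydrate plain arrays instead of full entities - much cheaper for read-only listings.
$rows = $em->createQuery('SELECT u.id, u.email FROM AppBundle\Entity\User u')
           ->getArrayResult();

// Reference proxy: link to an entity by id without issuing a SELECT for it.
$author = $em->getReference('AppBundle\Entity\User', $authorId);
$post->setAuthor($author);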
I have installed the ZF2 skeleton application (from the official GitHub repo), and the first page takes 400-700 ms to load (the default "Welcome to Zend Framework 2" page, with no database connections and without handling anything).
In raw PHP (without a framework), it would take a few milliseconds to load.
Could you explain what causes such big delays?
I'm new to ZF and am now deciding whether or not to use it.
Zend Framework is a heavy PHP framework that uses a lot of PHP files. Because PHP is evaluated on the spot, every request spends a lot of time evaluating these files. You should use an opcode cache such as APC (among others) or, if you're on PHP 5.5, the built-in OPcache. An opcode cache keeps the compiled form of these files in memory, which gives a huge speed bonus.
Another possible problem: running this ZF2 application under Apache on Windows is much slower than under Apache on Linux.
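A quick way to check whether an opcode cache is actually active (a small sketch; run it through the web server rather than the CLI, since the CLI often has OPcache disabled):
<?php
// Dump OPcache status (hit rate, memory usage) if the extension is loaded.
if (function_exists('opcache_get_status')) {
    var_dump(opcache_get_status(false)); // false = skip the per-script listing
} else {
    echo 'No OPcache loaded; check for APC or install an opcode cache.';
}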
It's hard to know whether this is good or bad without any info on the server you're running this test on. You also implied that this was a stock ZF2 skeleton app, but your screenshot shows what I think is the ZF2 developer toolbar, an add-on module.
I just tested a fresh checkout of the skeleton app on my (admittedly decent-spec) dev machine and it loads in 30 ms (PHP 5.5). I would expect to improve on that with some simple production-type optimizations (classmap autoloading, config caching, the EdpSuperluminal module, etc.).
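As one generic example of the classmap idea: if the skeleton app was installed via Composer, you can ask Composer to generate an optimized class map (a standard Composer command, not specific to ZF2):
composer dump-autoload --optimize
This converts the PSR autoloading rules into a single class map, which saves filesystem lookups on every class load.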
Edit: I thought I'd see what I could get this down to with some quick tweaks. I also installed the developer toolbar just to be sure this wasn't slowing things down too much. Result:
I'm new to Symfony2 and I must admit I'm enjoying it. I started playing with SonataAdmin, but soon a major doubt popped up:
Is it normal that it takes almost 3 seconds to load a listing page (with an empty database)? I know that in production I should use APC or Memcached to speed things up, but it seems strange to me that it takes so long.
I'm developing in a virtual machine running TurnKey LAMP (1 GB RAM).
My PC is fairly new: Intel i3, 8 GB RAM.
Please tell me what you think, or share your experience.
Thanks.
In the development environment, it is hard to measure performance because the framework and bundles sometimes need to parse a lot of configuration files, introspect objects, perform other time-consuming tasks, and cache the output.
In production, a lot of work is done up front, i.e. when you deploy to your web server. This up-front work avoids parsing files, performing time-consuming tasks, etc., on every request. This is why in production you almost can't change anything without running php app/console cache:clear again after making the modification. Even changing a single Twig template requires a cache clear to update the output presented to the end user.
I have not tested this bundle personally, but an admin generator bundle needs to inspect a lot of properties and objects to do its job properly. This is indeed time-consuming, but it is only required in development mode. In production, this introspection is not necessary and the information is probably cached somewhere, which should give far better performance than in the development environment.
Bottom line: I don't think this bundle suffers from a performance problem, but that depends on your needs and objectives. The one thing I can be sure of: test it in production mode to see the speed it will give you in the end. Clear your cache for the production environment and use app.php instead of app_dev.php. Also, check the documentation on performance on symfony.com.
php app/console cache:clear --env=prod --no-debug
Hope this helps.
Regards,
Matt
I'm thinking about using Zend for my new project.
But I'm worried about using too much system resources.
I'm on the $20 Linode VPS
Will it be worth it?
What resources are you worried about?
Size-on-disk you can predict easily.
For memory and CPU it is harder. You could say there is always a balance: for a small app, you can probably do a better job coding everything yourself, since any framework adds overhead. But the bigger the app gets, the harder it gets to write good code yourself. A framework will help you, and in the end you will be better off using one simply because your code will be better (assuming you use the framework correctly).
So it is a really hard question to answer without specifics. My gut feeling is that you would not go too wrong by starting with Zend. I've seen a couple of smaller, not-too-high-end environments use Zend with success.
The big question is how much traffic are you seeing?
100 visitors a day?
1,000 visitors a day?
10,000 visitors a day?
I was running a Zend Framework site on a Rackspace Cloud server with similar specs to your $20 Linode VPS. It operated just fine, but I only had maybe 50 visitors a day.
Zend Framework uses an autoloader, so it is fairly lazy about which files it loads into memory. I've found ZF to be pretty quick for a framework of its size.
As the others have said, it depends on the traffic you're getting now and expect to receive over time. I have a basic VPS for one project and it works fine, but that was after standard tuning of Apache and MySQL. This included disabling unneeded services, adjusting logging and the KeepAlive timeout, amongst other variables (Apache), and tuning various caches (MySQL).
There's a lot that you can do to tune Zend Framework - which I'm quite keen on. Check out the talk by Rob Allen from the PHP London 2011 Conference for some great information.