I'm using Laravel to develop my API, but I was curious whether there is a way to boost performance even more by disabling the things I don't need,
like the command bus and the services Laravel uses to render views, because I'm only returning JSON and not using features such as the command bus or view rendering.
Not as far as I know, but the easiest thing to do to speed up your application would be to use route caching in Laravel 5.
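For reference, route caching is a one-liner at the command line (assuming you have shell access to the server and none of your routes use closures, which route caching doesn't support):

```shell
# Compile all registered routes into a single cached file
php artisan route:cache

# Remove the cache whenever you change your routes
php artisan route:clear
```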
Other than that, I guess general tips would be:
Use nginx instead of Apache
Use MyISAM instead of InnoDB (if you're using MySQL and only doing reads)
Use a read/write master-slave configuration and do reads from the slave.
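On the read/write split: Laravel's database layer supports this natively, so no extra package is needed. A minimal sketch of the relevant part of config/database.php (the host IPs are placeholders):

```php
// config/database.php - 'read' connections serve SELECTs,
// 'write' connections serve INSERT/UPDATE/DELETE
'mysql' => [
    'driver' => 'mysql',
    'read' => [
        'host' => '192.168.1.2', // slave (placeholder IP)
    ],
    'write' => [
        'host' => '192.168.1.1', // master (placeholder IP)
    ],
    'database' => 'database',
    'username' => 'root',
    'password' => '',
    'charset' => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix' => '',
],
```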
What response time are you getting and what is the response time that you would like?
I'm very new to Redis and fairly new to Laravel so could use some pointers on this.
We have a legacy PHP application that stores sessions in Redis, and I'm able to see them in Redis with the default naming convention, i.e.: PHPREDIS_SESSION:1bd9ca87f5b606a35891c807857c2fde
We're moving towards a Laravel API framework and as a short-term hybrid solution we want the API layer to be able to recognize and work with the session that's already been created through the legacy application.
I've been making slow progress into understanding Redis and I can see the legacy system's entries in Redis from Laravel (if I connect with a blank prefix), but is there a way to cut through Laravel's specialized handling and have it load the same PHPREDIS_SESSION space?
I've orbited this so many times I'm wondering if I've missed something simple.
Ultimately I got this working simply by updating my .env file with:
REDIS_PREFIX=PHPREDIS_SESSION:
CACHE_PREFIX=
I was hoping to avoid that since it's kind of bleh, but it functions and I suppose a config file is better than forcing Laravel to act against the grain.
Laravel is now recognizing the session being stored by my legacy application and trying to load it. Now I just need to get de/serialization to sync...
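For anyone hitting the same wall on serialization: the usual mismatch is that PHP's default session handler writes session_encode() format, while Laravel runs the whole session array through serialize(). A rough illustration (the data is made up):

```php
// What Laravel writes to the store: a plain serialize() of the session array
$data = ['user_id' => 42, 'name' => 'alice'];
echo serialize($data);
// a:2:{s:7:"user_id";i:42;s:4:"name";s:5:"alice";}

// What the legacy app writes with PHP's default "php" serialize handler:
// key|serialized-value pairs with no outer array, e.g.
// user_id|i:42;name|s:5:"alice";
```

One option worth checking: PHP 5.5.4+ supports setting session.serialize_handler to php_serialize in the legacy app, which makes the native handler use serialize() too, so both sides speak the same format.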
What am I doing?
Below is the code I use to get and set data in the cache:
\Cache::put('Categories', $Categories, 60); // store for 60 minutes
\Cache::forget('Categories'); // drop the cached entry
Question
What's the recommended location for the cache get/set code? So far I have done this in the controller.
Take a look at the Laravel 5.1 Cache documentation, especially the Cache Usage part; where you set or get the cache depends on you and the workflow of your app.
I recommend using it inside controllers.
Like with most "where do I put X" questions, the answer is: it depends. There is absolutely nothing wrong with doing it in your controller if you're building a small application and only caching a few things.
If you're writing a really big application, or something quite complex, then you could consider doing your caching via a repository; see Using Repository Pattern in Laravel 5 for some information about the Repository Pattern. If you wanted, you could make use of Laravel 5 Repositories, which provides not only a clean and well-documented way to implement repositories but also a specific way of doing caching; see its Cache Usage section.
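As a rough sketch of what a caching repository might look like (the class, method names, and the Category model are illustrative, not from any package; \Cache::remember itself is standard Laravel and only runs the closure on a cache miss):

```php
class CachedCategoryRepository
{
    public function all()
    {
        // remember() returns the cached value if present; otherwise it
        // runs the closure, stores the result for 60 minutes, and returns it
        return \Cache::remember('Categories', 60, function () {
            return Category::all();
        });
    }

    public function flush()
    {
        \Cache::forget('Categories');
    }
}
```

Controllers then depend on the repository and never touch \Cache directly, which keeps the caching policy in one place.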
I am working on a Symfony2 project. My project uses a database to store the data and Doctrine2 to retrieve it.
As the data in the database has grown, the queries have become very slow and the whole web app takes about two minutes to load, or does not load at all.
The only way I can see to fix this is to cache some query results, but how can I do that? Unless there is a different way of dealing with such an issue.
You need to have your cache driver installed and configured in the Doctrine configuration (result_cache_driver is the important one in your case). Once you have that done, you can make Doctrine use the result cache by calling useResultCache(true):
$cachedResult = $doctrine->getManager()
    ->createQueryBuilder()
    // ... build your query here ...
    ->getQuery()               // useResultCache() lives on the Query, not the builder
    ->useResultCache(true)
    ->getResult();
Check this blog post
NOTE: by default, the result cache is not used in the dev environment.
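For completeness, the driver configuration mentioned above lives in the DoctrineBundle config; a minimal sketch using APC (any supported driver such as memcache or redis works the same way):

```yaml
# app/config/config.yml
doctrine:
    orm:
        metadata_cache_driver: apc
        query_cache_driver: apc
        result_cache_driver: apc   # the one that matters for useResultCache()
```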
EDIT: since you're using DBAL and not the ORM, the Symfony DoctrineBundle doesn't support this kind of cache out of the box, but you can add that support yourself by following this detailed guide.
In dev mode Symfony2 rebuilds the cache on every request, so if you have many queries it will cache them all one by one, over and over.
This takes more time than in production mode, where the cache is built once and then reused.
I'm trying to implement whole-page caching on my website (just like Stack Overflow). I have already implemented the output cache, but a friend told me that Stack Overflow uses Redis as their cache layer, and I'm confused about the Redis part.
Is Redis the same as an output cache? Can I implement the output cache using Redis? (For Yii developers: I'm using Yii's output cache.)
Thanks!
Yii's output cache will store the cached content using the active cache component, which can be CDummyCache, CDbCache, CApcCache, CFileCache, CMemCache, etc. (whatever you set in the config file under the components area).
As it stands right now, there is no official CRedisCache component, but there is this extension: http://www.yiiframework.com/extension/rediscache/ which might help you.
Also, since Redis is a key/value store and a bit more (though I guess you won't use that "bit more" at all), you can give CMemCache a try (bearing in mind you need the memcache PHP extension and the memcached daemon installed on your server).
Later edit: I also found this for you: https://github.com/phpnode/YiiRedis which seems very neat.
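Whichever component you pick, it's wired up in the application config and the output cache uses it transparently; a minimal sketch for CMemCache (server details are placeholders):

```php
// protected/config/main.php
'components' => array(
    'cache' => array(
        'class' => 'CMemCache',
        'servers' => array(
            // placeholder host/port; list one entry per memcached server
            array('host' => '127.0.0.1', 'port' => 11211),
        ),
    ),
),
```

Swapping the backend later (e.g. to the Redis extension's component) only means changing this block; the COutputCache filters in your controllers stay untouched.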
I'm thinking of creating a web application with CakePHP that consumes a Python App Engine web service. But to install CakePHP I need to configure a database, and App Engine uses a different kind of data storage, not MySQL.
I was thinking of storing the data in App Engine, exposing it through Python web services, and having the CakePHP application communicate with the web service to insert and retrieve data.
Is there any good resource for this, or is it impossible?
Note: I'm also open to the possibility of developing the web application completely in Python, running on App Engine, if anyone has a good resource.
Thanks.
I think you should try a different solution: http://aws.amazon.com/simpledb/
It appears that CakePHP is an MVC framework very similar to Django, which is included in App Engine for Python. I'm not sure why you would want to store your data in Google App Engine unless you're dealing with an extremely large amount of data, in which case you're likely comfortable enough working in Python to just make the app work entirely on GAE.
See the official docs for more info:
http://code.google.com/appengine/docs/python/overview.html
http://code.google.com/appengine/articles/django.html
What you can do is run your CakePHP app on a standard LAMP web host and access the GAE data store through a REST or RPC web service. This isn't such a bad idea if you already have a CakePHP front-end that deals with RPCs in the backend, but if your Cake app stores all its models in MySQL it could take considerable effort to adapt. CakePHP models abstract their storage method using the DataSource class, so you might be able to find a DataSource class that uses REST or RPC. However, if you don't have a very considerable investment in CakePHP controllers and templates, I would suggest simply building your app entirely on GAE.
You cannot run PHP on GAE. And if you run PHP somewhere else, it is bad architecture to go over the internet for your data: it will be slow and a nightmare to develop against.
You should store your data where you run your PHP, unless you must have a distributed, globally scaling architecture, which as far as I understand is not the case here.
There's a detailed tutorial on getting CakePHP up using the PHP runtime that Google recently announced. http://aymanrb.blogspot.com/2013/05/cakephp-deployment-on-google-app-engine.html