Caching most of Symfony components and services - php

Coming from a Java background, one point I had missed for many years is that PHP recreates everything on every request.
So when using the Symfony framework, we recreate every component of this otherwise great framework: every service and the whole router are built and rebuilt on each request.
We can cache data in $_SESSION, but then we store the same data 50 times if we have 50 users. I thought I could use static or $_SERVER, but they don't work the way they would in Java.
The PHP way is to use Memcached, and there are plenty of examples of caching Doctrine queries, but I haven't seen any of them cache the router or the services. Do you know of examples, or is it just a bad idea?

That's one of the characteristics of PHP: nothing lives on once the request is finished.
There are a couple of projects that wrap the bootstrapping process of Symfony so it only runs once. Here's one such project: php-pm, and an article describing its usage.
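To make the idea concrete, here is a rough sketch of what such a wrapper does, assuming a Symfony2-style AppKernel. This is not actual php-pm code; the worker-loop function is made up purely for illustration.

```php
<?php
// Not php-pm's real code -- just the idea: boot the kernel (container,
// router, services) once, then handle many requests from the same process.

use Symfony\Component\HttpFoundation\Request;

require __DIR__.'/app/autoload.php';
require __DIR__.'/app/AppKernel.php';

$kernel = new AppKernel('prod', false);
$kernel->boot(); // everything expensive happens once, here

// wait_for_next_request() is a made-up placeholder; php-pm drives this
// loop with ReactPHP and bridges the raw HTTP data into a Request object.
while ($request = wait_for_next_request()) {
    $response = $kernel->handle($request); // reuses the already-built container
    $response->send();
    $kernel->terminate($request, $response);
}
```

The trade-off is that your code must be free of per-request global state, which is exactly what such projects have to work around.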

Related

Is the whole Zend Framework stack rebuilt for each request?

I am working on a web application built with PHP and Zend Framework 2. I come from a Java EE background. It seems to me that for each HTTP request the whole Zend application stack is rebuilt: a load of config files is read from disk, all my services are constructed, and whatnot. Is this correct? If so, this seems rather strange and inefficient to me compared to the Java EE approach of having a load of application services that are initialized on web server startup and then live across many requests. Seeing as the browser makes lots of little Ajax requests, isn't this PHP/Zend approach horribly slow? Do I need a paradigm shift in the way I approach web application design?
ZF2 can use caching to improve performance; the configuration can be cached as well.
A service or class in ZF2 shouldn't be constructed on every request. Take a DB connection, for example: ZF2 won't try to connect to the database unless you are actually doing something with the DB server.
Another example:
Suppose you had a REST API client that you need in one controller only. There is no need to construct this API client on every request; instead, you ask your module's service manager to construct that object for you, only in that controller (see the sketch below).
Take a look at this blog, it might help you: http://www.masterzendframework.com/articles-2/zend-framework-2-core-concepts-understanding-dependency-injection
or http://akrabat.com/ or the ZF2 team leader, Matthew Weier O'Phinney: http://mwop.net/
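A rough sketch of that last point (the service name, module and API client class below are made up, not from a real project):

```php
<?php
// module/Application/Module.php -- registering a lazily-built service.
namespace Application;

class Module
{
    public function getServiceConfig()
    {
        return array(
            'factories' => array(
                // Nothing is constructed here; only this closure is registered.
                // It runs the first time something asks the service manager
                // for 'MyRestApiClient'.
                'MyRestApiClient' => function ($services) {
                    $config = $services->get('Config');
                    return new Service\RestApiClient($config['rest_api']['base_url']);
                },
            ),
        );
    }
}
```

A controller that needs it can then fetch it with `$this->getServiceLocator()->get('MyRestApiClient')`; requests that never touch that controller never pay for building it.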
(if I understand your question, sorry if not)
It's correct that PHP is not running in a persistent environment the way Java does.
This means that your PHP application is being initialized on each request.
It seems to me that for each http request the whole zend application stack is being rebuilt
But this is not correct. The initialization of your PHP/ZF2 application is not as heavy as you may think when you're coming from Java.
Don't think of it as something like a Tomcat server being rebooted on each request. The whole Zend Framework is not being loaded, and neither are all of your application classes.
Only the classes that are needed for your specific request are being loaded.
PHP frameworks use autoloaders for this, so if you call new MyClass() in your app, that class will be loaded then. And in many cases even the file containing "MyClass" is not loaded from disk, but from RAM or a bytecode cache.
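A stripped-down illustration of the mechanism (not ZF2's actual autoloader, just the principle):

```php
<?php
// The file for a class is read only when the class is first used,
// not when the application boots.
spl_autoload_register(function ($class) {
    $file = __DIR__ . '/src/' . str_replace('\\', '/', $class) . '.php';
    if (is_file($file)) {
        require $file; // often served from an opcode/bytecode cache, not disk
    }
});

$helper = new \My\App\MyClass(); // only now is src/My/App/MyClass.php loaded
```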
If your server is set up correctly, the zf2 skeleton application for example will be loaded in a few milliseconds.
What about your services that need to be ready for your application?
They're nothing more than a "key" in an associative array inside your configuration. When your application needs a service for your specific request, it will find the entry in your configuration array extremely fast. The service will then be initialized from a factory method or a factory class for this request, and only the specific one that is needed.
Facebook uses PHP as well. It doesn't need a minute for each request, right? If PHP worked the way you're thinking of it at the moment, it would take minutes or hours I guess - or just end up in a timeout :)

Laravel 4: how to lazy load controllers?

Recently I've discovered that Laravel 4 includes all the controllers on each request, regardless of whether it needs them or not, even though Composer has the whole controller architecture mapped in its autoload. I'm trying to build a high-traffic website and I want to minimize the overhead as much as possible before doing any hardware scaling. Do you have any idea how I can "force" Laravel 4 into lazy loading the classes, i.e. only when they are actually requested by the code for the given request?
The way I see it now, when it parses the routes it includes all controllers and makes a reflection class of each of them to parse the methods, so that it has them mapped in the request's memory. IMO that's quite overkill. A simple Hello World costs me 5 MB of RAM and 15ms page generation time. That is too much for me. It's probably the most hyped framework at the moment and it's hyped as lightweight, but 5 MB/15ms is not lightweight in any case :/
There is a great argument/explanation here: http://forums.laravel.io/viewtopic.php?id=8175
Laravel 4 is meant to use more framework agnostic packages than Laravel 3.
Laravel 3 is very lightweight, perhaps you could look into using v3 instead. It's still actively being used and supported throughout the community.

ZF2 module loading performance

As far as I understand, every enabled module in a ZF2 application is loaded for every request (unless one uses optimization methods such as those offered by the zf2-lazy-loading-module module). I've been keeping an eye on modules that get published on modules.zendframework.org and I've come across modules which offer extremely limited functionality, such as the AkrabatFormatUkTelephone module, whose purpose is to format phone numbers to UK format.
Whilst I understand development should focus on creating single-purpose modules that are good at doing one thing (instead of modules which do many things but none of them very well), I'm thinking that if we start using modules with functionality as limited as the one mentioned, we will need to combine hundreds of modules to build a rich application, which could be disastrous for performance. Instead I would expect this sort of functionality to be put in a class (e.g. Zend\I18n?) and loaded on demand, which would be more optimized. But knowing Akrabat's reputation, I'm thinking I must be missing something, hence my question:
Is the loading of modules such as the one I mentioned significantly worse for performance than loading the same functionality via PHP classes (or is it similar due to the way ZF2 has been designed)? Does anybody have any figures (i.e. is it 5%, 10%, 15% slower) about module vs class loading performance?
Don't take this comment as a final answer, as hopefully one of the ZF2 devs will shed some more light on it, but generally only Module.php and usually module.config.php will be actively loaded. Everything else will simply be registered and called on demand. So as long as your Module.php and module.config.php are not TOO big in file size, performance shouldn't be THAT big of an issue.
In the case of Akrabat's example, all that happens is the registration of a new view helper (see the sketch below). Nothing else. The same goes for all the other view helpers inside Zend. Performance won't really matter a lot in these cases.
Personally, the skeleton application loaded in 80ms on my webspace, and with BjyAuthorize, ZfcBase, ZfcUser and my own module the loading time ramped up to 100ms. And that is without any sort of memory caching enabled!
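For illustration, registering a view helper amounts to roughly this in a module's module.config.php (the helper name and class below are guesses, not the actual module's code):

```php
<?php
// module.config.php -- only this array entry is merged at startup; the
// helper class itself is instantiated the first time a view script
// actually calls the helper.
return array(
    'view_helpers' => array(
        'invokables' => array(
            'formatUkTelephone' => 'AkrabatFormatUkTelephone\View\Helper\FormatUkTelephone',
        ),
    ),
);
```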
Loading a module is not much more than loading any class, like Sam pointed out.
As long as you don't use anything from your module and do things right, it's just being registered.
Now what does "do things right" mean?
Just try putting a big nonsense loop inside your module class's bootstrap() method. You will see that it slows down every request to your application, because the module's bootstrap method is called on every request; it should be used very carefully and only for lightweight tasks. The purposes you usually use the bootstrap() method for won't slow your app down by even a millisecond, but writing a file to disk in this method could slow your app down by many seconds on each request (see the sketch below).
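A sketch of the difference, using ZF2's onBootstrap() hook (the per-request hook the answer is talking about); module and listener names are illustrative:

```php
<?php
// Module.php -- onBootstrap() runs on EVERY request, so keep it cheap.
namespace MyModule;

use Zend\Mvc\MvcEvent;

class Module
{
    public function onBootstrap(MvcEvent $e)
    {
        // Fine: attaching a listener is cheap; the callback only runs
        // if and when the event actually fires.
        $events = $e->getApplication()->getEventManager();
        $events->attach(MvcEvent::EVENT_DISPATCH_ERROR, array($this, 'onDispatchError'));

        // Bad: this would hit the disk on every single request.
        // file_put_contents('data/boot.log', date('c') . PHP_EOL, FILE_APPEND);
    }

    public function onDispatchError(MvcEvent $e)
    {
        // handle the error...
    }
}
```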
If your app becomes really heavy, you should use the classmap_autoloader and some caching wherever you can. If you do "things right", you won't have any performance problems just because you have many modules or many classes in your app. One could say it's all about algorithms.
Keep on using best practices like the one you mentioned. Usually these aren't the bottlenecks of your application; your own algorithms and mistakes are.
edit:
When you're using modules from the community, you should always check them for performance issues. Even a module that seems very light could become a bottleneck for your application if it has bad algorithms. But the mere fact that you're loading an additional module is not the issue.
Good question. I would like to contribute a little bit to Sam's answer.
Module performance is not solely about loading the module (which is, as pointed out, quite fast), but also about the communication between modules. So this question might boil down to: how slow/fast are the ServiceLocator and the event-driven system compared to traditional non-modular systems?
I recall that ZF2 was built with performance in mind. For instance, the ServiceLocator registers factories, so that objects can be instantiated on the fly. This requires only a few extra in-memory objects and instantiations, so I guess it does not impact the total performance of your application much. The EventManager works in much the same way, and I have not seen it become overloaded with registered events, even in large applications.
What might slow things down, on the other hand, is the loading of the modules' configuration. I figure that using a cache might solve this problem. I'm not sure, but maybe Zend Optimizer does this already.
So, in short, applications should scale pretty well, provided that modules behave well and do not over-register events or misuse the ServiceLocator.
From the MVC component's perspective there are no modules at all! There's one big configuration file - the result of merging every module's configuration. As long as your modules don't have an onBootstrap method, or don't do much in it, module loading is as fast as invoking new Module() on every one of them, which is painless and memory-inexpensive.
The configuration merge procedure, which I mentioned above, only happens in DEV mode, which is enabled by default.
There are a number of tricks to speed up your ZF2 application, like:
Enable the merged config cache (see the config sketch after this list)
Use EdpSuperluminal module
Return the ViewModel objects from actions, not arrays
Explicitly set the template name on the ViewModel
Use a template map instead of the template path stack alone
Route order in the config matters! It's a LIFO queue (last in, first out).
Make sure you don't load console modules in an HTTP context.
Let Composer do the autoloading, not ZF2
... and more. There's quite a good talk by Gary Hockin on ZF2 application performance.
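For example, the first trick boils down to a couple of lines in application.config.php. The keys below are the standard module listener options as far as I recall; the cache key and paths are just examples:

```php
<?php
// config/application.config.php (sketch)
return array(
    'modules' => array(
        'Application',
        // ...
    ),
    'module_listener_options' => array(
        'config_cache_enabled' => true,        // cache the merged module configuration
        'config_cache_key'     => 'app_config',
        'cache_dir'            => 'data/cache/',
    ),
);
```

The template map trick works similarly: instead of letting the resolver scan the template path stack directories, you list 'template/name' => '/path/to/file.phtml' pairs in module.config.php, so no directory scanning happens at runtime.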
Authorization modules will surely slow down your app. There are a number of things going on under the hood: the user's identity needs to be fetched (from the database?), and the user needs to be authenticated against your rules. Surely you can speed things up by using memcached or the like, but this requires some knowledge about the lifecycle of a ZF2 application, about the modules you use, etc.
Also, Zend Framework 3 is going to be released soon; some things will get faster, but don't expect too much. A lot of the overhead is the result of your lack of knowledge about ZF2 - no offense!

Using Symfony doctrine models in a REST framework

I have a web application which has been developed with symfony 1.4. I have a pretty large code base (and growing). Circa 80,000 lines of code (actions, forms, models, templates etc.)
I'm using the default doctrine version which ships with symfony 1.4.
I've just started developing a mobile version using Sencha Touch. I don't wish to use symfony for the REST web services because:
REST support in symfony 1.4 is not great. For example, if I want a PUT request I have to pass an 'sf_method' parameter specifying that the request method is PUT. This isn't true REST and it's not ideal for Sencha Touch.
I don't need all of the unnecessary symfony functionality (for example the plugins that are autoloaded in the ProjectConfiguration file, the form framework, etc.) that you'd use to develop a standard web app. All I need is to define my REST routes and return the specified JSON (as everything that needs to be returned for Sencha Touch will be JSON).
I want to keep my mobile app as bloat-free, efficient and quick as possible, and unfortunately for this task symfony 1.4 would not be the best choice for the backend architecture of my mobile app. If I had chosen Symfony2 (it was in its beta phase at the time, alas) it would be a different story, as Symfony2 supports true REST functionality. What I do need, however, is the ability to use my current Doctrine models (I have circa 90 models) in the chosen REST framework.
Basically, in a nutshell what I need is as simple as this:
Call a REST route -> query my Doctrine models -> return the JSON, without using symfony.
So my question: what would be your advice? I don't want this to turn into a question of which is the best PHP REST framework; what I would like to know is which REST framework would let me develop REST services quickly and efficiently, make use of my Doctrine models, and be easily extendable.
Here at my employer, I've created a rather big application with an ExtJS frontend and a symfony 1.4 backend. And to be honest, I don't feel limited by symfony 1.4 in any way.
First off: I created my own base controller class (which extends sfActions). This controller can handle (render) different types of data: it has generic handling for the Doctrine_Query, Doctrine_Collection, Doctrine_Model and array types (a sketch of the idea follows below).
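The idea looks roughly like this. This is a hedged sketch, not the answerer's actual code; the class and method names are mine:

```php
<?php
// lib/myRestActions.class.php -- a base action class for symfony 1.4
// that renders Doctrine 1 results as JSON.
class myRestActions extends sfActions
{
    protected function renderJson($data)
    {
        // Normalize the common Doctrine 1 result types to plain arrays.
        if ($data instanceof Doctrine_Query) {
            $data = $data->fetchArray();
        } elseif ($data instanceof Doctrine_Collection || $data instanceof Doctrine_Record) {
            $data = $data->toArray(true);
        }

        $this->getResponse()->setContentType('application/json');

        return $this->renderText(json_encode($data));
    }
}
```

A concrete action then only has to build its query or record and end with something like `return $this->renderJson($query);`.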
Also, the plugins help me organize the code, and in some cases plugins are shared between different projects, so that's also a big plus.
And the extra functionality like forms: it's only prepared for you in the autoloader; you don't have to use it. And I don't think it causes any real performance issues (at least not for me). But I like to use the extra sfValidator framework to make sure the data is correct.
The only real "problem" is indeed the RESTful HTTP verbs, especially PUT and DELETE. I just worked around this problem by generating a controller for each 'manageable' model and implementing specific get, list, create, update and delete actions. So when I want to manage an object, I call the object's controller, which has executeCreate, executeUpdate and executeDelete actions.
The reason I read was that symfony didn't and couldn't implement this feature because PHP has really bad support for it. I don't know if this is true, but if this is your only 'real' issue, you could try to fix it in the symfony core.
So my advice:
If raw performance is your problem: try profiling your code, install an opcode cache (APC), and profile your code again (yes, profiling is listed twice on purpose).
If the HTTP PUT command is your problem: I would either work around it (that's the way I solved it) or try to fix it in the core.

PHP front controller library with support for unit testing

I am looking for a (small) library that helps me cleanly implement a front controller for my pet project and dispatches requests to single controller classes. The front controller/dispatcher and the controller classes need to be fully unit-testable without sending HTTP requests.
Requirements
PSR-0 compatible
installable via its own PEAR channel
support for unit testing:
checking if the correct HTTP headers are sent
catches output to allow inspection in unit tests
preferably PHPUnit helper methods to help inspect the output (for different output types, i.e. HTML, XML, JSON)
allows setting of incoming HTTP headers, GET and POST parameters and cookies without actually doing HTTP requests
needs to be usable standalone - without the DB abstraction, templating and so on that the fat frameworks all provide
Background
SemanticScuttle, the application that is bound to get proper "C" support, is an existing, working application. The library needs to blend in with it and work with the existing structure and classes. I won't rewrite it to match a framework's specific required directory layout.
The application already has unit tests, but they are based on HTTP requests, which makes them slow. Also, the current old way of having several dozen .php files in the www directory isn't the most manageable solution, which is why proper controller classes need to be introduced. All in all, there will be about 20-30 controllers.
Previous experience
In general, I was pretty happy with Zend Framework for some previous projects but it has several drawbacks:
not PEAR-installable, so I cannot use it as a dependency in my PEAR-installable applications
only available as one fat download, so I need to manually extract the required bits from it - for every single ZF update
while unit test support exists for ZF controllers, it lacks some advanced utility functionality like assertions for JSON, HTTP status codes and content type checks.
While these points may seem like nit-picking, they are important to me. If I have to implement them myself, there is no point in using an external library - I might as well write my own.
What I don't want
Stack Overflow has a million "what's the best PHP framework" questions (1, 2, 3, 4, 5), but I'm not looking for those; I'm looking for a specific library that helps with controllers. If it's part of a modular framework, fine.
I also know the PHP framework comparison website, but it doesn't help answer my question since my requirements are not listed there.
And I know that I can build this all on my own and invent another microframework. But why? There are so many of them already, and one just has to have all that I need.
Related questions
What's your 'no framework' PHP framework?
How do you convert a page-based PHP application to MVC?
Knowing Symfony2 well, I can assure you it's definitely possible to use it just for the "C" in MVC. The model and template layers are completely up to you and are typically invoked from the controllers anyway, so if you don't call Doctrine or Twig specifically, you can do whatever you want.
As for functional testing, which is really what you're talking about in your article, what you want to look at is the WebTestCase class, which is well complemented by the LiipFunctionalTestBundle for more advanced cases.
That allows for things like this example of testing a contact form that sends an email, where the entire HTTP request is done in-process. Since the framework is written to allow multiple requests per process and has no global state, this works very well and does not require an HTTP server to be running or anything. As you can see, I do assertions on the HTTP status code of the response too, and was able to capture the email without sending it, since in the test configuration sending of emails is disabled in the standard distribution of Symfony2 (a stripped-down sketch follows below).
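A stripped-down sketch along those lines; the route, form field names and the profiler-based mail assertion are illustrative, not the linked example's actual code:

```php
<?php
// src/Acme/ContactBundle/Tests/Controller/ContactControllerTest.php

use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;

class ContactControllerTest extends WebTestCase
{
    public function testContactFormSendsEmail()
    {
        $client = static::createClient();
        $client->enableProfiler(); // lets us inspect the mailer without sending anything

        $crawler = $client->request('GET', '/contact');
        $form = $crawler->selectButton('Send')->form(array(
            'contact[email]'   => 'user@example.com',
            'contact[message]' => 'Hello there',
        ));
        $client->submit($form);

        // Assertions on the in-process response -- no HTTP server involved.
        $this->assertEquals(302, $client->getResponse()->getStatusCode());

        $mailCollector = $client->getProfile()->getCollector('swiftmailer');
        $this->assertEquals(1, $mailCollector->getMessageCount());
    }
}
```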
That being said, you could also just use the Request and Response classes from Symfony2's HttpFoundation component. They should allow you to test your code, but IMO you wouldn't get as many nice features as you would if you used the entire framework. Of course that's just my biased opinion ;)
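If you go that route, a test can build the request entirely in memory; FrontController here is a hypothetical stand-in for your own dispatcher:

```php
<?php
use Symfony\Component\HttpFoundation\Request;

class FrontControllerTest extends PHPUnit_Framework_TestCase
{
    public function testPutRequestIsDispatched()
    {
        // Build a PUT request in memory -- no web server, no sockets.
        $request = Request::create('/bookmarks/42', 'PUT', array('title' => 'PHP'));

        $controller = new FrontController();          // hypothetical dispatcher
        $response   = $controller->handle($request);  // expected to return a Response

        $this->assertInstanceOf('Symfony\Component\HttpFoundation\Response', $response);
        $this->assertEquals(200, $response->getStatusCode());
    }
}
```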
I would recommend downloading the Symfony2 Routing component: https://github.com/symfony/Routing
Documentation is found here: http://symfony.com/doc/current/book/routing.html
Perhaps it does not satisfy all your requirements, but it's the closest (a small usage sketch follows below).
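Used standalone, it boils down to something like this (the route name and controller string are made up):

```php
<?php
use Symfony\Component\Routing\Route;
use Symfony\Component\Routing\RouteCollection;
use Symfony\Component\Routing\RequestContext;
use Symfony\Component\Routing\Matcher\UrlMatcher;

$routes = new RouteCollection();
$routes->add('bookmark_show', new Route('/bookmarks/{id}', array(
    '_controller' => 'BookmarkController::showAction',
)));

$matcher = new UrlMatcher($routes, new RequestContext('/'));

$params = $matcher->match('/bookmarks/42');
// array('_controller' => 'BookmarkController::showAction', 'id' => '42', '_route' => 'bookmark_show')
```

Your front controller then only has to call the matched controller and return its response.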
If you are familiar with Symfony (which I think you are) you should check out Silex. From their website, this is what they say about it:
A microframework provides the guts for building simple single-file apps. Silex aims to be:
Concise: Silex exposes an intuitive and concise API that is fun to use.
Extensible: Silex has an extension system based around the Pimple micro service-container that makes it even easier to tie in third party libraries.
Testable: Silex uses Symfony2's HttpKernel which abstracts request and response. This makes it very easy to test apps and the framework itself. It also respects the HTTP specification and encourages its proper use.
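Their canonical single-file example looks roughly like this (how you bootstrap it depends on the Silex version, phar vs. Composer):

```php
<?php
// web/index.php -- the whole "application"
require_once __DIR__.'/silex.phar'; // or Composer's vendor/autoload.php

$app = new Silex\Application();

$app->get('/hello/{name}', function ($name) use ($app) {
    return 'Hello ' . $app->escape($name);
});

$app->run();
```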
I'd add Net_URL_Mapper, it doesn't have the assertions though. Is that why you ruled it out?
Another pretty interesting thing is silex. It also comes with controller tests. I'd use that over Symfony2. But that's my personal preference.
Quite an understandable wish list. I think we all hate it in testing when we run into dependencies that turn testing into havoc. Tests should be simple and short; having many things to set up and tear down around each test can be a burden.
From the description of your question it looks like you know pretty specifically what you're looking for.
My first reaction would be to use PHPUnit for this. It does not meet all your requirements, but it's a base you can build on. It's highly extensible and flexible; however, it does not support PSR-0 but has an autoloader of its own, so that probably doesn't weigh too heavily.
From the information you give in your question, I'm not sure whether it is the design of your test suite(s) or the design of your application that is hindering you from writing and running the tests you would love to have.
I suspect it's probably a bit of both. If your application code is not easily testable, there is not much a testing framework like PHPUnit can do about it. For example, if your controllers do not use a request object with an interface, it's not so easy to inject a request that was triggered not by HTTP but by your tests. As HTTP is most often the entry point into a web application, it pays to abstract here for the sake of tests. There exist some suggestions apart from specific frameworks: Fig/Http. However, this is just a pointer.
Something similar applies to the database scenario you give: if your application code depends on the database, then your tests will too. If you don't want to test against your database all the time, your controllers need to be able to work without the concrete database. This is comparable to the HTTP requests (see the sketch below).
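To make that concrete, here is a small sketch; the interface and controller are hypothetical, only the PHPUnit calls are real:

```php
<?php
// A controller that depends on a request *interface* instead of PHP
// superglobals can be tested without any HTTP traffic at all.
interface RequestInterface
{
    public function getParam($name, $default = null);
}

class BookmarkController
{
    public function editAction(RequestInterface $request)
    {
        $id = (int) $request->getParam('id');
        // ... load the bookmark, prepare data for the view ...
        return array('id' => $id);
    }
}

class BookmarkControllerTest extends PHPUnit_Framework_TestCase
{
    public function testEditReadsTheIdParameter()
    {
        // The "request" is a test double -- no web server anywhere.
        $request = $this->getMock('RequestInterface');
        $request->expects($this->once())
                ->method('getParam')
                ->with('id')
                ->will($this->returnValue('42'));

        $controller = new BookmarkController();
        $result = $controller->editAction($request);

        $this->assertSame(42, $result['id']);
    }
}
```

The same pattern (depend on an interface, hand in a test double) applies to the database dependency mentioned above.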
There are numerous approaches to cope with these circumstances, but as I read your question you don't look uneducated; rather, you're looking for a better solution than the existing ones.
As with all home-grown code, it's pretty hard to find something that matches your own design. The best suggestion I can give is to extend PHPUnit with the suites and constraints you need for your application, while using the support of automated tests to refactor your application to fit the way you would like to test.
So you can start with the tests and then develop the controller the way you need it. This will keep your controller light, I assume, and help you find the solutions you need.
If you find something missing in PHPUnit, you can extend it on your own first; in addition, the author is very helpful about adding missing features.
Keep in mind that if what you need does not exist, you will have to code it yourself. However, if you're able to share (part of) the work with others, you usually benefit more than by doing everything alone. That's a point in favour of an existing framework, be it for testing or for the application.
So if as of yet there is no controller/MVC library that supports easy unit testing out of the box and fits your needs, chime in and develop one TDD-wise. If done right, it can exactly match your requirements. I think you're not alone with this problem. So not a very concrete answer, but I can only say that I have had very good experiences with PHPUnit and its extensibility. That includes the output tests you mention in your question.
And probably a little differentiation at the end: it's one thing to test code units and another to test whether they all work in concert in the application with its various requests. The latter usually requires larger test setups by nature. However, if you can separate units from each other and clearly define which other units they interact with, then you normally only need to test the interaction between those, which reduces the setup. This does not save you from infrastructure problems, but those are normally not tested with unit tests anyway (although you can extend PHPUnit to perform other types of checks).
A popular framework - even one with a bad design - has the big plus that its components tend to be better tested through sheer use. That normally helps carry you over the first years of your application, until design issues in the framework force you to rewrite your whole code base (probably).
As controllers often sit somewhere in the middle of everything, this can lead to a scenario where you end up testing the whole application while you only wanted to test the controller(s). So you should think about the design and role of the controllers and their place within the overall application - what you really want to test with your controllers - so that you can make them testable according to your needs.
If you don't need to test the database, you don't need to test the models. So you could mock a model that returns random data, to take it to the extreme. But if you want to test whether HTTP handling is right, then a unit that abstracts HTTP handling is probably needed first. Each controller relying on it would not need to be tested for that (theoretically), as the HTTP processing has already been tested. It's a question of the level of abstraction as well. There is no one-size-fits-all solution; frameworks can offer something, but then you're bound to the paradigms the framework expects.
AFAIK testing in PHP is getting more and more popular, but that doesn't mean the existing frameworks have good support for it. I know from Zend Framework that they have been working on improving this for a while. So it's probably worth looking at the more recent developments in the more popular frameworks to see where this leads as well.
And for the very specific parts, you always need to test on your own.
Opting for PHPUnit and your own test cases does look like a practical way to me. Code your controllers as you need them for your project, TDD-style, and you should get what you need.
The more component-based approach of Symfony 2 probably fits your needs better than what you experienced with Zend Framework. However, I cannot suggest anything specific, as needs differ greatly with application design. What's a quick and solid solution for one application is a burden for another. See Page Controller.
You could take a look at http://ezcomponents.org/, which is becoming Apache Zeta.
There are three ways to make eZ Components available for your PHP environment; please read the whole of this article before continuing with the practical part:
Use PEAR Installer for convenient installation via command line
Download eZ components packaged in an archive
Get the latest sources from SVN
I haven't got my hands on it yet, but it looks like a good solution...
Seldaek: WebTestCase isn't quite the right thing - it's for testing a view directly, and a controller or model only indirectly.
A unit test case for a controller would invoke the controller, likely giving it a mock object for the templating engine (e.g. a mock Smarty object), and then check the values that were assigned to that object for display. For example, if you invoked the controller for /countries/south-sudan, you could check that the template variable $continent was set to "Africa" (a sketch follows below). This kind of unit testing wouldn't actually involve any template rendering in most cases.
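Roughly like this; CountryController, its constructor and its show() action are hypothetical, only Smarty's assign() and the PHPUnit mock API are real:

```php
<?php
class CountryControllerTest extends PHPUnit_Framework_TestCase
{
    public function testSouthSudanIsAssignedToAfrica()
    {
        // Mock the templating engine: nothing is actually rendered.
        $smarty = $this->getMock('Smarty', array('assign', 'display'));
        $smarty->expects($this->once())
               ->method('assign')
               ->with('continent', 'Africa');

        $controller = new CountryController($smarty); // hypothetical controller
        $controller->show('south-sudan');             // hypothetical action
    }
}
```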
