So I have the following dilemma. I want to have a common library that all my ZF2 applications will use. This library will contain all the business logic for my website. Each application will consume different parts of the library to properly display/perform whatever actions are necessary. So far I've managed to create a library. Let's call it Foo. Foo has a Module.php which does the basic autoloading required to load the entire library.
Now here is where I start to have problems. I want to take advantage of dependency injection, the service manager, etc. from ZF2 inside Foo. The problem is I only have the one Module.php that loads Foo. This means that as my library grows, so will Module.php, since as far as I can tell I can't have sub-modules. Is there any way around this issue?
Essentially I want every app to just include Foo, and Foo to have several Module.php files, so that at least the dependency wiring can be handled on a module-by-module basis.
You're probably swimming against the current to try and do sub-modules -- and you probably don't need to.
If you've written your module nicely, loading it won't be a very expensive operation. Remember, the whole point of the service manager is that all those services are lazily created. So if the calling code never asks for a particular service in a particular request, that service's class file is never autoloaded, the object is never instantiated, etc. So you may be fine staying with a big, monolithic module.
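For illustration, a minimal sketch of what that laziness looks like in practice (the service name Foo\ReportService and its dependency are made up for this example):

    // In Foo's module.config.php: the service is only *described* here.
    // Its class file is not autoloaded and no object is created until
    // some caller asks the service manager for 'Foo\ReportService'.
    return array(
        'service_manager' => array(
            'factories' => array(
                'Foo\ReportService' => function ($services) {
                    // Runs on the first get() only -- never for requests
                    // that don't use this service.
                    return new \Foo\Service\ReportService(
                        $services->get('Foo\ReportMapper')
                    );
                },
            ),
        ),
    );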
The one place where things might get a little tricky is if you're leaning heavily on the EventManager and your module is attaching a bunch of listeners. But you can probably get around that by setting up some module configuration directives and then conditionally attaching listeners.
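A sketch of that idea, assuming a made-up foo.attach_listeners config flag (adjust to your own config layout):

    // Foo/Module.php -- only attach listeners when the consuming app opts in.
    namespace Foo;

    use Zend\Mvc\MvcEvent;

    class Module
    {
        public function onBootstrap(MvcEvent $e)
        {
            $app    = $e->getApplication();
            $config = $app->getServiceManager()->get('Config');

            if (empty($config['foo']['attach_listeners'])) {
                return; // nothing is attached for apps that don't need it
            }

            $app->getEventManager()->attach(
                MvcEvent::EVENT_DISPATCH,
                array($this, 'onDispatch'),
                100
            );
        }

        public function onDispatch(MvcEvent $e)
        {
            // lightweight work only
        }
    }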
Having said that, it probably makes sense to try to split your module up. So you could have FooBar and FooBaz modules.
If you really, really want sub-modules, you can dig into the ModuleManager and try to figure it out. I went a little way down that road once -- and then got distracted. In my case, I was dealing with shipping physical items. I wanted a "Fulfillment" module that could be configured to load a bunch of similar shipping modules (Fulfillment\Courier\USPSModule, Fulfillment\Courier\FedExModule, etc.), so that my main Fulfillment module could iterate over all loaded submodules without specific knowledge about any of them. If I recall correctly, the best way to do it was to essentially mirror what ZF2 does, but inside my Fulfillment\Module class. However, I can't think of many situations where you'd want to do that, unless you want a set of similar submodules that all implement the same interface, consumed by a super-module that has no specific knowledge of them. I also looked at this because I was thinking about runtime enabling/disabling of those submodules by end users (sort of like a plugin system).
If you're not doing that, I'd say stick to FooBarModule, FooBazModule, etc., as far as it makes sense. And remember, even if your module contains a ton of code, the ServiceManager will only autoload, parse, and instantiate the classes that are needed to satisfy the dependencies of any given request.
I'm just new to Kohana and its cascading file system.
From what I understand, the cascading file system allows extending core classes and making your module use the subclass in place of the original core class (kind of like monkey patching). What I don't quite understand is why we need to create blank subclasses and put all the logic in the Kohana_ classes. It just seems like a hack, and the duplicate classes make it very hard to trace the calls.
Based on this doc on the cascading file system, it will always check the application path first, before modules, so is it possible to just completely overwrite the core classes with new versions in the application path? I'm not sure where the blank classes fit in here. An actual concrete example would help, thanks.
I've never really understood the need for the empty classes extending the core Kohana ones either, so you're not alone.
I have often created classes with the same names as the empty ones in order to overwrite them completely. This can be done in either the modules or the application folder.
Kohana resolves files in cascading order -- system -> modules -> application, with later paths taking precedence -- so if you create a class with the same name within the application directory, it overrides any class with the same name in system or modules.
I often create reusable classes within my own modules and then overwrite certain methods within other modules if I need them to behave slightly differently. You can specify the order in which the modules load by changing your bootstrap.php file in the application directory.
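As a hedged example of a complete override: Kohana's own system/classes/cookie.php is nothing but an empty class Cookie extends Kohana_Cookie {}, so placing a file like this at application/classes/cookie.php replaces it for the whole application (the HTTP-only tweak is just for illustration):

    // application/classes/cookie.php
    class Cookie extends Kohana_Cookie
    {
        // Override only what you need; everything else falls through
        // to Kohana_Cookie.
        public static function set($name, $value, $expiration = NULL)
        {
            Cookie::$httponly = TRUE; // example tweak: force HTTP-only cookies
            return parent::set($name, $value, $expiration);
        }
    }

Module order works the same way: the modules listed earlier in Kohana::modules() in application/bootstrap.php are searched first, so when two modules ship a class with the same name, the one listed earlier wins.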
Pretty much the only reason I'm still using Kohana is because of the Hierarchical MVC (HMVC) capabilities, for which I can't seem to find equivalent functionality in any of the other frameworks. It is massively powerful and flexible, especially for large projects.
However, if you are only just getting into Kohana, you may want to reconsider, as it does seem to be a dying framework - the devs seem to have lost interest, which is a real shame because it has so much potential. It is a stable enough framework as it stands, though.
Hope this helps you.
As far as I understand, every enabled module in a ZF2 application is loaded for every request (unless one uses optimization methods such as those offered by the zf2-lazy-loading-module module). I've been keeping an eye on modules that get published on modules.zendframework.org and I've come across modules which offer extremely limited functionality, such as the AkrabatFormatUkTelephone module, whose purpose is to format phone numbers to UK format.
Whilst I understand development should focus on creating single-purpose modules that are good at doing one thing (instead of modules which do many things, but none of them very well), I'm thinking that if we start using modules which offer such limited functionality as the one mentioned, we will need to combine hundreds of modules to build a rich application, which could be disastrous for performance. Instead I would expect this sort of functionality to be put in a class (e.g. Zend\I18n?) and loaded on demand, which would be more optimized. But knowing Akrabat's reputation, I'm thinking I must be missing something, hence my question:
Is the loading of modules such as the one I mentioned significantly worse for performance than loading the same functionality via plain PHP classes (or is it similar due to the way ZF2 has been designed)? Does anybody have any figures (e.g. is it 5%, 10%, 15% slower) about module vs class loading performance?
Don't take this comment as a final answer, as hopefully one of the ZF2 devs will shed some more insight on it, but generally only Module.php and usually module.config.php will be actively loaded. Everything else will simply be registered and called on demand. So as long as your Module.php and module.config.php are not TOO big in file size, performance shouldn't be THAT big of an issue.
In the case of Akrabat's example, all that happens is the registration of a new view helper. Nothing else. The same goes for all the other view helpers inside Zend. Performance won't really matter a lot in these cases.
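For context, a module like that essentially boils down to a few lines of configuration (a sketch, not Akrabat's actual code; the class name is illustrative). The helper class itself is only autoloaded when a view script actually calls the helper:

    // module.config.php of a tiny view-helper module
    return array(
        'view_helpers' => array(
            'invokables' => array(
                'formatUkTelephone' => 'AkrabatFormatUkTelephone\View\Helper\FormatUkTelephone',
            ),
        ),
    );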
Personally, the skeleton application loaded in 80 ms on my webspace, and with BjyAuthorize, ZfcBase, ZfcUser and my own module, the loading time ramped up to 100 ms. And this is without any sort of memory caching enabled!
Loading a module is not much more than loading any class, like Sam pointed out.
As long as you don't use anything from your module and do things right, it's just being registered.
Now what does "do things right" mean?
Just try putting a big nonsense loop inside your module class's bootstrap() method. You will see that it slows down every request to your application, because the bootstrap method of your module is called on every request, so it should be used very carefully and only for lightweight tasks. The purposes you usually use the bootstrap() method for won't slow down your app even by a millisecond, but writing a file to disk in this method could slow down your app by many seconds on each request.
If your app becomes really heavy, you should use the classmap autoloader and some caching wherever you can. If you do things right, you won't have any performance problems just because you have many modules or many classes in your app. One could say it's all about algorithms.
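For the classmap part, a sketch of what that looks like inside a module's Module.php (the classmap file can be generated with the classmap_generator.php script that ships with ZF2):

    public function getAutoloaderConfig()
    {
        return array(
            // Fast path: direct class => file map, no directory probing.
            'Zend\Loader\ClassMapAutoloader' => array(
                __DIR__ . '/autoload_classmap.php',
            ),
            // Fallback for classes not (yet) in the map.
            'Zend\Loader\StandardAutoloader' => array(
                'namespaces' => array(
                    __NAMESPACE__ => __DIR__ . '/src/' . __NAMESPACE__,
                ),
            ),
        );
    }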
Keep using best practices like the one you mentioned. Usually these aren't the bottlenecks of your application; your own algorithms and mistakes are.
edit:
When you're using modules from the community, you should always check them for performance issues. Even a module that seems very light could be a bottleneck for your application if it has bad algorithms. But the mere fact that you're loading an additional module is not the problem.
Good question. I would like to add a little bit to Sam's answer.
Module performance is not solely about loading the module (which is, as pointed out, quite fast), but also about the communication between the modules. So this question might boil down to: how slow/fast are the ServiceLocator and the event-driven system in comparison to traditional non-modular systems?
I recall that ZF2 was built with performance in mind. For instance, the ServiceLocator registers factories, so that objects can be instantiated on the fly. This requires only a few extra in-memory objects and instantiations, so I guess it does not impact the total performance of your application much. The EventManager works in much the same way, and I have not seen it being overloaded with registered events, even in large applications.
What might slow things down, on the other hand, is the loading of the modules' configuration. I figure that using a cache might solve this problem. I'm not sure, but maybe Zend Optimizer does this already.
So, in short, applications should scale pretty well, provided that modules behave well and do not over-register events or misuse the ServiceLocator.
From the MVC component's perspective there are no modules at all! There's one big configuration file - the result of merging every module's configuration. As long as your modules don't have an onBootstrap method, or it doesn't do much, module loading is as fast as invoking new Module on every one of them, which is painless and memory-inexpensive.
The configuration merge procedure, which I mentioned above, happens only in DEV mode which is enabled by default.
There are also a number of tricks to speed up your ZF2 application, like:
Enable the merged config cache (see the sketch after this list)
Use the EdpSuperluminal module
Return the ViewModel objects from actions, not arrays
Explicitly set the template name on the ViewModel
Use template maps instead of template path stack alone
Route order in the config matters! It's a LIFO queue (last in, first out).
Make sure you don't load Console modules in HTTP context.
Let Composer do the autoloading, not ZF2
... and more. There's quite a good talk by Gary Hockin on ZF2 app performance.
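For the config cache item at the top of that list, the relevant knobs live in config/application.config.php; a sketch (the option keys are the standard module listener options, the values are just examples):

    // config/application.config.php (excerpt)
    return array(
        'modules' => array(
            'Application',
            // ... your other modules
        ),
        'module_listener_options' => array(
            'module_paths'             => array('./module', './vendor'),
            'config_cache_enabled'     => true,   // merge configs once, reuse the cached result
            'config_cache_key'         => 'app_config',
            'module_map_cache_enabled' => true,
            'module_map_cache_key'     => 'module_map',
            'cache_dir'                => 'data/cache/',
        ),
    );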
Authorization modules will surely slow down your app. There are a number of things going on under the hood: the identity of the user needs to be fetched (from the database?), and the user needs to be authenticated against your rules. Surely you can speed things up by using memcached or similar, but this requires some knowledge about the lifecycle of a ZF2 application, about the modules you use, etc.
Also, Zend Framework 3 is going to be released soon; some things will get faster, but don't expect much. A lot of overhead is the result of a lack of knowledge about ZF2 - no offense!
As our company starts using Zend Framework as the base framework for most of our projects, we want to share some common elements across all our projects. I'm talking about things like:
An implementation of a model (based on doctrine2)
RBAC for the model, including user, group, role models
An XML-based templating engine for AJAX backend interfaces
(you name it) ...
Basically, everything needed to put "Zend on Rails" and get going. What is the best way to package these components? I see two possibilities:
As modules
We include the necessary functionality as separate modules in the modules folder.
Pro:
We can set up routes and execute code, which is good for many modules (imaginary example: a PayPal module needs some kind of callback URL; if our module can set it up on its own, no configuration from the "project developer" is needed).
We can provide real functionality (like user administration) out of the box
We have a bootstrap to set up autoloading, Doctrine, etc.
Con:
Questionable place? It interferes with the user's project
A little harder to share between projects (Git submodules instead of the classpath)
In the library folder
We put it in the library folder and point the classpath to it.
Pro:
Clean solution
Sharing across projects
Con:
The bootstrap has to be called explicitly
No direct routing or actions - everything has to be proxied through the concrete project
So, how do you solve this? Where do you put your reusable, general-purpose stuff in ZF?
I think you should use both approaches.
When developing "library-like" code, as in kind of "infrastructure" classes and other things that are reusable (like ZF's own components, Doctrine 2's components etc.), you can put them into the library directory. (or its own entirely separate project)
When developing actual ZF modules (like an auth module for example), then format the code around the ZF module structure.
I think by using this kind of approach you get all the benefits you listed, and pretty much none of the cons :)
As one additional idea, if you develop your architecture parts as "services", you could even keep them running as their own web service endpoints.
I am looking for a (small) library that helps me cleanly implement a front controller for my pet project and dispatches requests to individual controller classes. The front controller/dispatcher and controller classes need to be fully unit-testable without sending HTTP requests.
Requirements
PSR-0 compatible
installable via its own PEAR channel
support for unit testing:
checking if the correct HTTP headers are sent
catches output to allow inspection in unit tests
preferably PHPUnit helper methods to help inspect the output (for different output types, e.g. HTML, XML, JSON)
allows setting of incoming HTTP headers, GET and POST parameters and cookies without actually doing HTTP requests
needs to be usable standalone - without the DB abstraction, templating and so on that the fat frameworks all provide
Background
SemanticScuttle, the application that is bound to get proper "C" support, is an existing, working application. The library needs to blend into it and work with the existing structure and classes. I won't rewrite the application to match a framework's specific required directory layout.
The application already has unit tests, but they are based on HTTP requests, which makes them slow. Also, the current old way of having several dozen .php files in the www directory isn't the most manageable solution, which is why proper controller classes need to be introduced. All in all, there will be about 20-30 controllers.
Previous experience
In general, I was pretty happy with Zend Framework for some previous projects, but it has several drawbacks:
not PEAR-installable, so I cannot use it as a dependency in my PEAR-installable applications
only available as one fat download, so I need to manually extract the required bits from it - for every single ZF update.
while unit test support exists for ZF controllers, it lacks some advanced utility functionality, like assertions for JSON, HTTP status codes and content type checks.
While these points may seem like nit-picking, they are important to me. If I have to implement them myself, I might as well not use an external library and write my own.
What I don't want
Stack Overflow has a million "what's the best PHP framework" questions (1, 2, 3, 4, 5), but I'm not looking for that; I'm looking for a specific library that helps with controllers. If it's part of a modular framework, fine.
I also know the PHP framework comparison website, but it doesn't help answer my question since my requirements are not listed there.
And I know that I can build this all on my own and invent yet another microframework. But why? There are so many of them already, and one of them just has to have everything I need.
Related questions
What's your 'no framework' PHP framework?
How do you convert a page-based PHP application to MVC?
Knowing Symfony2 well, I can assure you it's definitely possible to use it just for the "C" in MVC. The model and templating layers are completely up to you and are typically invoked from the controllers anyway, so if you don't call Doctrine or Twig specifically, you can do what you want.
As for functional testing, which is really what you're talking about in your article, what you want to look at is the WebTestCase class, which is well complemented by the LiipFunctionalTestBundle bundle for more advanced cases.
That allows for things like this example of testing a contact form that sends an email, where the entire HTTP request is handled in process. Since the framework is written to allow multiple requests per process and has no global state, this works very well and does not require an HTTP server to be running or anything. As you can see, I do assertions on the HTTP status code of the response too, and I was able to capture the email without sending it, since sending of emails is disabled in the test configuration of the standard Symfony2 distribution.
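The shape of such a WebTestCase-based test, as a minimal sketch (the /contact route and the form check are made up):

    use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;

    class ContactControllerTest extends WebTestCase
    {
        public function testContactPageLoads()
        {
            // The "request" is handled entirely in process by the kernel.
            $client  = static::createClient();
            $crawler = $client->request('GET', '/contact');

            $this->assertEquals(200, $client->getResponse()->getStatusCode());
            $this->assertGreaterThan(0, $crawler->filter('form')->count());
        }
    }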
That being said, you could also just use the Request and Response classes from Symfony2's HttpFoundation component. That should allow you to test your code, but IMO you wouldn't get as many nice features as you would if you used the entire framework. Of course that's just my biased opinion ;)
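If you go that component-only route, a rough sketch of what testing looks like with just HttpFoundation (the BookmarkController here is hypothetical; only Request and Response come from Symfony2):

    use Symfony\Component\HttpFoundation\Request;
    use Symfony\Component\HttpFoundation\Response;

    // Hypothetical controller written against Request/Response only.
    class BookmarkController
    {
        public function addAction(Request $request)
        {
            $title = $request->request->get('title', '');
            return new Response('Added: ' . $title, 201);
        }
    }

    // In a test: build a fake POST request -- no HTTP server, no superglobals.
    $request    = Request::create('/bookmarks', 'POST', array('title' => 'PHP'));
    $controller = new BookmarkController();
    $response   = $controller->addAction($request);

    assert($response->getStatusCode() === 201);
    assert($response->getContent() === 'Added: PHP');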
I would recommend downloading the Symfony 2 framework Routing component: https://github.com/symfony/Routing
Documentation is found here: http://symfony.com/doc/current/book/routing.html
Perhaps it does not satisfy all your requirements, but it's the closest.
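A minimal sketch of using the Routing component on its own (the route name and controller string are illustrative):

    use Symfony\Component\Routing\Route;
    use Symfony\Component\Routing\RouteCollection;
    use Symfony\Component\Routing\RequestContext;
    use Symfony\Component\Routing\Matcher\UrlMatcher;

    $routes = new RouteCollection();
    $routes->add('bookmark_view', new Route(
        '/bookmarks/{id}',
        array('_controller' => 'BookmarkController::viewAction')
    ));

    // Match a path to controller parameters, no HTTP request needed.
    $matcher = new UrlMatcher($routes, new RequestContext(''));
    $params  = $matcher->match('/bookmarks/42');
    // array('_controller' => 'BookmarkController::viewAction',
    //       'id' => '42', '_route' => 'bookmark_view')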
If you are familiar with Symfony (which I think you are), you should check out Silex. From their website, this is what they say about it:
A microframework provides the guts for building simple single-file apps. Silex aims to be:
Concise: Silex exposes an intuitive and concise API that is fun to use.
Extensible: Silex has an extension system based around the Pimple micro service-container that makes it even easier to tie in third-party libraries.
Testable: Silex uses Symfony2's HttpKernel, which abstracts request and response. This makes it very easy to test apps and the framework itself. It also respects the HTTP specification and encourages its proper use.
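To give a feel for it, a hello-world sketch (assuming a Composer-based Silex install; the route and the commented-out test snippet are illustrative):

    require_once __DIR__ . '/vendor/autoload.php';

    use Symfony\Component\HttpFoundation\Request;

    $app = new Silex\Application();

    $app->get('/hello/{name}', function ($name) use ($app) {
        return 'Hello ' . $app->escape($name);
    });

    // In a test you can push a Request straight into the kernel, no HTTP involved:
    // $response = $app->handle(Request::create('/hello/world'));
    // $response->getStatusCode();  // 200
    // $response->getContent();     // "Hello world"

    $app->run();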
I'd add Net_URL_Mapper; it doesn't have the assertions, though. Is that why you ruled it out?
Another pretty interesting option is Silex. It also comes with controller tests. I'd use that over Symfony2, but that's my personal preference.
Quite an understandable wishlist. I think we all hate it in testing when we run into dependencies that wreak havoc. Tests should be simple and short; having many things to set up before and tear down after each test can be a burden.
From the description of your question it looks like that you pretty specifically know what you're looking for.
My first reaction would be to use PHPUnit for this. It does not fulfil all your requirements, but it's a base you can build on. It's highly extensible and flexible; however, it does not support PSR-0 but has an autoloader of its own, so that probably doesn't weigh too heavily.
From the information you give in your question, I'm not sure whether it's the design of your test suite(s) or the design of your application that is hindering you from writing and running the tests you would love to have.
I suspect it's probably a bit of both. If your application code is not easily testable, then there is not much a testing framework like PHPUnit can do about it. For example, if your controllers do not use a request object with an interface, it's not so easy to inject a request that was not triggered by the HTTP request but by your tests. As HTTP is most often the entry point into a web application, it pays to abstract here for tests. There are some suggestions apart from specific frameworks, e.g. Fig/Http, but this is just a pointer.
It's similar with the database scenario you give: if your application code depends on the database, then your tests will too. If you don't want to test against your database all the time, your controllers need to be able to work without the concrete database. This is comparable to the HTTP request case.
There are numerous approaches to cope with these circumstances, but as I read your question, you don't look uneducated; it's more that you're looking for a better solution than the existing ones.
As with any code of your own, it's pretty hard to find something that matches your own design. The best suggestion I can give is to extend PHPUnit to add the suites and constraints you need for your application, while using the support of automated tests to refactor your application to fit the way you would like to test.
So you can start with the tests and then develop the controller as you need it. This will keep your controllers light, I assume, and help you find the solutions you need.
If you find something missing in PHPUnit, you can first extend it on your own; additionally, the author is very helpful in adding missing features.
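As a rough sketch of that route, a small base test case adding the content-type/JSON assertions mentioned in the question (the response object and its getHeader()/getBody()/getStatusCode() methods are placeholders for whatever your own controller layer returns):

    abstract class ControllerTestCase extends PHPUnit_Framework_TestCase
    {
        public function assertContentType($expected, $response)
        {
            $this->assertEquals($expected, $response->getHeader('Content-Type'));
        }

        public function assertStatusCode($expected, $response)
        {
            $this->assertEquals($expected, $response->getStatusCode());
        }

        public function assertJsonBody($response)
        {
            $this->assertContentType('application/json', $response);
            $this->assertNotNull(
                json_decode($response->getBody()),
                'Response body is not valid JSON'
            );
        }
    }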
Keep in mind that if what you need does not exist, you need to code it yourself. However, if you're able to share (part of) the work with others, you most often get more benefit than by doing everything alone. That's a point in favour of an existing framework, be it for testing or for the application.
So if, as of yet, there is no controller/MVC library that supports easy unit testing out of the box and fits your needs, chime in and develop one TDD-wise. If done right, it can exactly match your requirements. I think you're not alone with this problem, though. So not a very concrete answer, but I can say that I have had very good experiences with PHPUnit and its extensibility. That includes the output tests you mention in your question.
And probably a little differentiation at the end: it's one thing to test code units and another to test whether they all work in concert in the application with its various requests. The latter most often requires larger test setups by nature. However, if you can separate units from each other and clearly define which other units they interact with, then you normally only need to test the interaction between those, which can reduce the setup. This does not save you from infrastructure problems, but those are normally not tested with unit tests anyway (although you can extend PHPUnit to perform other types of checks).
A popular framework - even with a bad design - has the big plus that its components tend to be better tested through use. That normally helps to get you through the first years of your application, until design issues in the framework make you rewrite your whole code base (probably).
As controllers often sit in the middle of everything, this can lead to the scenario where you end up testing the whole application when you only want to test the controller(s). So you should think about the design and role of the controllers and their place within the overall application, and what you really want to test with your controllers, so you can make them testable according to your needs. If you don't need to test the database, you don't need to test the models; you could mock a model returning random data, to take it to the extreme. But if you want to test whether HTTP handling is right, then you probably first need a unit that abstracts HTTP handling. Each controller relying on it would then (theoretically) not need to be tested for that, as the HTTP processing has been tested already. It's a question of the level of abstraction as well.

There is no universal solution; frameworks can offer something, but then you're bound to the paradigms the framework expects. AFAIK testing in PHP is getting more and more popular, but that doesn't mean the existing frameworks have good support for it. I know from Zend Framework that they have been working to improve the situation for a while now. So it's probably worth looking into the more recent developments in the more popular frameworks to see where this leads as well.
And for the very specific bits, you always need to write your own tests.
Opting for PHPUnit and your own test cases does look like a practical way to me, however. Code your controllers as you need them for your project, TDD-style, and you should get what you need.
Probably the more component-based approach of Symfony 2 fits your needs better than what you experienced with Zend Framework. However, I cannot suggest anything specific, as needs differ greatly depending on application design. What's a quick and solid solution for one application is a burden for another. See the Page Controller pattern.
You could take a look at http://ezcomponents.org/, which is becoming Apache Zeta Components.
There are three ways to make eZ Components available for your PHP environment (please read the whole of this article before continuing with the practical part):
Use the PEAR installer for convenient installation via the command line
Download eZ components packaged in an archive
Get the latest sources from SVN
I haven't got my hands dirty with it yet, but it looks like a good solution...
Seldaek: WebTestCase isn't quite the right thing - it's for testing a view directly, and a controller or model only indirectly.
A unit test case for a controller would invoke the controller, likely giving it a mock object for the templating engine (e.g. a mock Smarty object), then check the values that were assigned to that object for display: for example, if you invoked the controller for /countries/south-sudan, you could check that the template variable $continent was set to "Africa". This kind of unit testing wouldn't actually involve any template rendering in most cases.
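In code, that could look roughly like this (CountryController, its constructor and showAction() are hypothetical; the mocking API is plain PHPUnit, and Smarty is assumed to be autoloadable):

    class CountryControllerTest extends PHPUnit_Framework_TestCase
    {
        public function testSouthSudanTemplateGetsAfrica()
        {
            // Mock the templating engine; no template is ever rendered.
            $smarty = $this->getMock('Smarty', array('assign', 'display'));

            // Expect the controller to assign continent = "Africa".
            $smarty->expects($this->once())
                   ->method('assign')
                   ->with('continent', 'Africa');

            $controller = new CountryController($smarty);
            $controller->showAction('south-sudan');
        }
    }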
I'm starting to familiarize myself with using the module-based architecture for Zend Framework projects. My real reason for being interested in the module architecture is to be able to take a module from one project and just drop it into another project. Maybe I'm not getting it right...
But what I'm noticing right off the bat is that controllers within each module cannot have the same name as any other controller in the main application (or in any other module, though I haven't tested this). This leads me to think that modules are not really independent self-contained units, so I wonder how this affects their ease of distribution from one project to another.
The other issue is: what if I were to take a module and drop it into another project? Do I have to update .zfproject.xml manually? And wouldn't that be a bit too cumbersome to do by hand?
Maybe I'm not clear on how modules should be used in Zend, so I'd like to know: when do you decide it's best to use them, and when not? Do you use them all the time, or never?
I have always used a module-based architecture in my projects so far, because I like to separate concerns. For example, I always have an ADMIN module whose classes and controllers don't mix with the rest of the application. Using modules, you can also reuse them in other applications, for example if you create a BLOG module.
The names of your controllers will be something like Admin_IndexController for the admin module even if the file is named IndexController.php.
Another concept that is nice and helps you reuse resources is plugins. Use them for authentication or to check the validity of requests.
You need to setup namespaces for your modules so that they are easily moved into a new project without renaming.
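For ZF1, a hedged sketch of what that looks like: enabling the modules resource in application.ini (resources.modules[] = "") plus a per-module bootstrap gives every module its own resource autoloader, so Blog_Model_Post, Blog_Form_Comment, etc. resolve inside the module's own folder and the whole folder can be copied between projects (the Blog name is illustrative):

    // application/modules/blog/Bootstrap.php
    class Blog_Bootstrap extends Zend_Application_Module_Bootstrap
    {
        // Intentionally empty: the parent constructor registers a
        // Zend_Application_Module_Autoloader for the "Blog_" prefix,
        // pointing at this module's directory.
    }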
If you are using Zend_Tool, then you will have to edit .zfproject.xml. I haven't spent a lot of time using it, so I'm not sure if there is another way that avoids manual editing.