I'm in the process of evaluating the benefits of Zend_Config_Ini versus using a simple constant file.
e.g. -
define('DB_HOST', 'localhost');
//versus
$config = new Zend_Config_Ini('/path/to/config.ini', 'staging');
echo $config->database->params->host; // prints "dev.example.com"
The only thing is that $config is not globally accessible, so you then need to use Zend_Registry to store it for application-wide use without having to instantiate it each time.
This seems to add more complexity than needed... am I missing something, or is Zend_Config + Zend_Registry a technique that pays off in the long run as an app grows?
The main advantage of using the Registry with a config is to avoid polluting the global namespace. Say you want to include a third-party lib, and both your app and the lib define a constant DB_HOST. No good.
In addition, many of the factories in Zend Framework use the config object to create instances. Zend_Db is a good example of this: you just pass it $config->database and it will take what it needs to instantiate your adapter.
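For example (a minimal sketch, assuming the ini's database section defines database.adapter and database.params.* keys):

// Bootstrap: load the config once and park it in the registry.
$config = new Zend_Config_Ini('/path/to/config.ini', 'staging');
Zend_Registry::set('config', $config);

// Anywhere else in the application:
$config = Zend_Registry::get('config');
$db = Zend_Db::factory($config->database); // reads 'adapter' and 'params' from that section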
You can also extend the registry with custom functionality, e.g. finder methods or stuff like that. Check the pattern description by Fowler.
A nice advantage of Zend_Config is that you don't depend on a PHP source code file; a plain .ini file will do -- and some people prefer modifying a .ini file to a PHP one.
And it's easier to modify a .ini / XML file programmatically -- there are classes / functions to do that!
With .ini files and Zend_Config, you also get some nice functionality already provided; for instance, inheritance between sections (i.e., you can have a "generic" section, plus "staging" and "production" sections that override some of its values).
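For instance, a hypothetical config.ini using that inheritance could look like this (section names and keys are only examples, chosen to match the snippet in the question):

[production]
database.adapter       = PDO_MYSQL
database.params.host   = db.example.com
database.params.dbname = app

[staging : production]
database.params.host   = dev.example.com

Loading the 'staging' section, as in the question's snippet, then gives dev.example.com for the host while every other value is inherited from [production].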
Another thing that can be interesting about Zend_Config is consistency: Zend Framework, with Zend_Application, already assumes you'll have one configuration file; so why not a second one? Or even re-use the first one?
And it's the same with several classes of the Framework that can work with, or be configured by, an instance of Zend_Config; if you already have one, things suddenly become easier ;-)
If I had to choose between a PHP file with defines and a .ini file for things that are configurable, I would probably go with the .ini file (and Zend_Config + Zend_Registry when needed).
Yes, you're correct: using the Zend-sanctioned method for configuration is more complex than defining a series of global constants. The advantages you get are:
No Global Namespace Collisions
If you ever need to integrate your application with another project, having global constants represent your application config may cause problems. What are the chances another project has a constant named DB_HOST? How can a developer who's using your hybrid system tell which constants are the config for your application vs. the integrated application?
Building on the Work of Others
So, you have a single file with all your config values. How are you going to deploy that into different environments? (production, staging, etc.) Chances are you're a smart person who could come up with a solution, but how is another developer coming into your project going to know how that solution works? How are other modules in your application going to know how to read your global config?
Zend_Config has already "solved" many of these problems. By taking on a bit more complexity and abstraction up front you get a known path forward, and can spend more time solving the problems of your application instead of inventing and supporting Yet Another Configuration System.
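For instance, one common approach (just a sketch; APPLICATION_ENV is an environment variable you would set per server, not something ZF requires) is to let the environment pick the config section:

// Choose the config section from the server environment; default to 'production'.
$env = getenv('APPLICATION_ENV');
if ($env === false) {
    $env = 'production';
}
$config = new Zend_Config_Ini('/path/to/config.ini', $env);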
I'm new to PHP and inherited a website project with hundreds of pages, all procedural (when I do a text search of the files, there isn't even a function definition anywhere). I'm coming from the C# and Java worlds, and I'm looking for a way to incrementally add OOP. (They want me to update the front end, and I'm trying to convince them to fix the back end at the same time, but they don't want to use a framework (dammit).)
While looking into autoloading... well, here's my understanding: it's a way of registering folders where classes are stored so that, when you instantiate a class, trait, etc., PHP searches those folders based on the class name/filename/namespace and loads the appropriate definition.
I have a few questions:
Does autoloader search the folder and load the appropriate definitions on every page lifecycle (or does it cache them)?
Pre-loading:
Is there a way to use autoloader, or some alternative, to pre-load ALL class definitions into memory and make them available across all sessions?
If so, when updating class files, how would I tell this mechanism to reload everything to memory when I make changes to class files?
UPDATE TO QUESTIONS:
Thank you both for your answers; they help a little, but... I do have a bad habit of posing the wrong question(s) on Stack Overflow.
The thing I want to avoid is slowing down pages by adding classes. So let's say I add a library and register its paths with the autoloader, and a page instantiates a class with multiple dependencies. Let's say that the dependency graph includes 15 files. For each request, the server loads the page plus 15 other files just for that one page.
Since I am coming from compiled languages, it feels a little strange not to keep these classes in memory. All the classes together shouldn't be more than, say, 5 MB.
(Or maybe I should just create a RAM disk, copy all the files into it on boot, and point a symlink at it?)
Autoloaders in PHP are lazy. When PHP encounters the use of a class it doesn't know about, it asks the registered autoloader (or chain of autoloaders) to go find it. It's the autoloader's job to figure out where the file defining the class lives and include it. Having a convention for naming your classes and organizing your class files is key to a useful autoloader, and several conventions have arisen in the PHP community, such as PSR-4.
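As a rough illustration (the namespace prefix and directory below are invented for the example), a minimal PSR-4-style autoloader can be as small as this:

// Register a lazy autoloader: nothing is included until an unknown class is first used.
spl_autoload_register(function ($class) {
    $prefix  = 'App\\';            // hypothetical namespace prefix
    $baseDir = __DIR__ . '/src/';  // hypothetical source directory
    if (strpos($class, $prefix) !== 0) {
        return; // not one of ours; let any other registered autoloader try
    }
    $file = $baseDir . str_replace('\\', '/', substr($class, strlen($prefix))) . '.php';
    if (is_file($file)) {
        require $file;
    }
});
// The first use of App\Report\Builder triggers exactly one include of src/Report/Builder.php.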
Does autoloader search the folder and load the appropriate definitions on every page lifecycle (or does it cache them)?
The autoloaders are called on every request, but only when the need to autoload a class actually arises.
Pre-loading: Is there a way to use autoloader, or some alternative, to pre-load ALL class definitions into memory and make them available across all sessions?
I don't believe so, and even if there were, it would become more and more wasteful as the number of classes grows.
Welcome to the wonderful[citation needed] world of legacy PHP, I highly recommend you check out Modernizing Legacy Applications In PHP. It's like a strategy guide for getting from Mordor back to the Shire.
I think you may misunderstand the purpose of autoloading. It is simply a set of instructions for what to do when your code calls for a class that PHP doesn't recognize. That's it. The autoloader just calls require '/path/to/classfile.php' so that PHP will see the class.
Does autoloader search the folder and load the appropriate definitions on every page lifecycle (or does it cache them)?
There is no caching across requests, so if you make a change to a file, the next HTTP request will incorporate that change. It's just as if you changed any other instruction in your script, for example changing echo 1 to echo 2.
Pre-loading: Is there a way to use autoloader, or some alternative, to pre-load ALL class definitions into memory and make them available across all sessions?
There is no need for this. A well written autoloader has instructions for where to find any class, so loading all possible classes ahead of time is wasteful. If you're still running into undefined classes errors, you need to either improve the autoloader or place the class files in accordance with the current autoloader instructions.
If you really want to preload all your classes, use the auto_prepend_file setting in php.ini. The docs say:
Specifies the name of a file that is automatically parsed before the main file.
Set it to an initialization script. In that script have something like:
// put all your class files in this folder
$dir = '/path/to/classes/folder';
$handle = opendir($dir);

// require all PHP files from the classes folder
while (false !== ($item = readdir($handle))) {
    $path = $dir . '/' . $item;
    if (is_file($path) && pathinfo($path, PATHINFO_EXTENSION) === 'php') {
        require_once $path;
    }
}
closedir($handle);
This is simplified. There is significant risk in blindly including every file in a directory into your script, so I would not do this. You would also need to adjust it if you want to include files in subdirectories.
Basically, don't do this. Just have a good autoloader.
No one posted what I was looking for, but it seems the best route is OPcache, which is built into PHP 5.5 and above (my client is using 5.3, so I didn't know about it).
https://github.com/zendtech/ZendOptimizerPlus
The Zend OPcache
The Zend OPcache provides faster PHP execution through opcode caching and optimization. It improves PHP performance by storing precompiled script bytecode in the shared memory. This eliminates the stages of reading code from the disk and compiling it on future access. In addition, it applies a few bytecode optimization patterns that make code execution faster.
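For what it's worth, once you are on 5.5+ enabling it is just a php.ini change; the values below are illustrative defaults, not tuned recommendations:

zend_extension=opcache.so       ; php_opcache.dll on Windows
opcache.enable=1
opcache.memory_consumption=128  ; MB of shared memory for compiled bytecode
opcache.validate_timestamps=1   ; keep checking files for changes
opcache.revalidate_freq=2       ; ...at most every 2 seconds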
I'm new to Kohana and its cascading file system.
From what I understand, the cascading file system allows extending core classes and making your module use the subclass in place of the original core class (kind of like monkey patching). What I don't quite understand is why we need to create blank subclasses and keep all the logic in the Kohana_-prefixed classes. It just seems like a hack, and the duplicate classes make it very hard to trace the calls.
Based on this doc on the cascading file system, it will always check the application path first, before modules, so is it possible to just completely override the core classes with new versions in the application path? I'm not sure where the blank classes fit in here. An actual concrete example would help, thanks.
I've never really understood the need for the empty classes extending the core Kohana ones either, so you're not alone.
I have often created classes with the same names as the empty ones in order to overwrite them completely. This would be done in either the modules or the application folders.
Kohana resolves files in order of precedence: application -> modules -> system, so if you create a class file with the same name within the application directory, it is used in place of any class with the same name in modules or system.
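To make that concrete (file paths follow the Kohana 3.x convention; the override itself is just an invented example): the stock system/classes/cookie.php is nothing more than class Cookie extends Kohana_Cookie {}, so dropping your own application/classes/cookie.php makes the cascade pick yours up instead:

// application/classes/cookie.php -- replaces the empty class shipped in system/
class Cookie extends Kohana_Cookie {

    // Example override: prefix every cookie the application sets.
    public static function set($name, $value, $expiration = NULL)
    {
        return parent::set('myapp_' . $name, $value, $expiration);
    }
}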
I often create re-usable classes within my own modules and then overwrite certain methods within other modules if I need them to behave slightly differently. You can specify the order that the modules load in by changing your bootstrap.php file in the application directory.
Pretty much the only reason I'm still using Kohana is because of the Hierarchical MVC (HMVC) capabilities, for which I can't seem to find equivalent functionality in any of the other frameworks. It is massively powerful and flexible, especially for large projects.
However, if you are only just getting into Kohana you may want to reconsider, as it does seem to be a dying framework; the devs seem to have lost interest, which is a real shame because it has so much potential. It is a stable enough framework as it stands, though.
Hope this helps you.
From php.net:
In PHP 5, this is no longer necessary. You may define an __autoload() function which is automatically called in case you are trying to use a class/interface which hasn't been defined yet. By calling this function the scripting engine is given a last chance to load the class before PHP fails with an error.
Now I am wanting to know, is it bad practice to solely use __autoload to load the appropriate classes on a dynamic site?
The way my site is set up is to include files into the index.php file, for example: http://www.site.com/index.php?p=PAGE-I-WANT-TO-LOAD
So if I am on the forums section or the blogs section of my site, I want only the appropriate classes and functions to be loaded, so I use autoload and never include a file manually. Should I be using __autoload only as a last resort, or is what I am doing fine even on a high-traffic system?
Bad? No. __autoload() is one of my favorite additions to PHP 5. It removes the responsibility (and annoyance) of manually having to include/require the class files necessary for your application. That being said, it's up to you as the developer to ensure that only the 'appropriate classes' are loaded. This is easily done with a structured naming scheme and directory structure. There are plenty of examples online of how to properly use __autoload(); do a Google search and you'll find plenty of information.
Autoloading is a good way to load only the classes that are needed.
In PHP 5 >= 5.1.2, most of the problems with the old __autoload() disappeared, thanks to spl_autoload_register().
Now I am wanting to know, is it bad practice to solely use __autoload to load the appropriate classes on a dynamic site?
Not at all. You can rely on autoload, all you need to do is to devise a good naming convention and implement an efficient autoloader.
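For example, with a ZF/PEAR-style underscore convention (the library path below is an assumption for the example), the whole thing stays tiny:

// Forum_Topic_Table  ->  /path/to/library/Forum/Topic/Table.php
function my_autoloader($class)
{
    $file = '/path/to/library/' . str_replace('_', '/', $class) . '.php';
    if (is_file($file)) {
        require $file;
    }
}
spl_autoload_register('my_autoloader');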
There is one major issue to consider. Autoloading and Zend Guard do not play well together, because Zend Guard tends to rename things, which will mean that the naming convention you decided to use will most likely not be the same. If you will be using Zend Guard (or any other obfuscator for that matter) you will most likely be forced to include all the files by hand.
Here is a quote from the Zend Guard user guide:
Autoloading classes will not work since the filename on the disk would not match the obfuscated class name.
The only danger to __autoload() is if you define a poor autoloading function. Generally, all you're going to get in terms of a performance hit is a few disk seeks as PHP looks for the right files that contain your classes. The upside is getting rid of all those annoying include() calls.
If you're worried about performance at this level, then you should already be using an opcode cache such as APC.
What is the best way to integrate an external script into the Zend Framework? Let me explain because I may be asking this the wrong way. I have a script that downloads and parses an XML file. This script, which runs as a daily cron job, needs to dump its data into the database.
I am using Zend Framework for the site that uses this script, and it seems to me that it would be best to use my model subclassed from Zend_Db_Abstract to do the adding and updating of the database. How does one go about doing this? Does my script go in the library next to the Zend components (i.e. library/Mine/Xmlparse.php) and thus have access to the various ZF components? Do I simply need to include the correct model files and the Zend DB component in the file itself? What is the best way to handle this sort of integration?
Yes, you should put your own classes (whether they extend Zend Framework classes or are entirely new) into your own folder next to the Zend Framework folder in library.
When you have Zend_Loader's autoloading enabled, class names will automatically map to the files you created, e.g.:
My_Db_Abstract will map to My/Db/Abstract.php.
In your library directory you should have your own library next to the Zend library folder. Whatever you call it (Mylib, Project, ...), you should register it with the Zend Autoloader, which is done as follows:
require_once 'Zend/Loader/Autoloader.php';

$loader = Zend_Loader_Autoloader::getInstance();
$loader->registerNamespace('Project_');
$loader->setFallbackAutoloader(true);

if ($configSection == 'development') {
    $loader->suppressNotFoundWarnings(false);
}
In order for your library to integrate nicely with ZF and the Autoloader, you should stick to the ZF naming conventions. This means two things:
if you extend an existing ZF class, replicate the ZF folder structure so that your file has the same path and name except for the library name. E.g. /library/Zend/Db/Abstract.php => /library/Project/Db/Abstract.php.
if you write your own classes, still stick to the ZF naming conventions for the autoloader to find them.
I just came across something that may be germane to this question: this IBM developerWorks article.
The author recommends simply creating a scripts folder in the ZF hierarchy and then using it as one normally would within ZF (though he does set the ini path and call the autoloader). Is it that simple? Does simply being in the hierarchy of the framework and including the path and autoloader grant your script access to all of the goodies?
I'm not 100% sure what you're trying to ask, but I will try to help. If at any point you add a reference to "/path/to/zend/framework" to your PHP include path, then you have in essence enabled the Zend Framework. From there, if you do:
require_once('Zend/Loader.php');
Zend_Loader::registerAutoload();
Then at any point in your script you can pretty much just create new Zend Framework objects and Zend_Loader will handle the rest.
One of the big things about the Zend Framework though is not forcing you to do things a certain way. That's why sometimes there are several ways to accomplish the same thing. So, if you feel you need to make your script use the Zend Framework just for the sake of doing so this is not really necessary. But if you think it may improve your script in some way then go for it.
I usually put custom stuff that I think could be used across projects in a custom folder in the library. So I have a library/Ak33m folder that has scripts that may be outside of the framework.
As a ZF noob myself, I think I understand some of what the OP is trying to figure out. So, I'll just explain a bit of what I understand in the hope that it is helpful either to the OP (or more likely, to a future reader, since the original question is so old and I imagine that OP is now a ZF guru).
I understand that ZF claims to be largely "use at will", so that you don't need to buy into an entire structure like Zend_Application, the Zend_Bootstrap class, the entire MVC approach, etc.
Further, I understand the conventions for class naming and file locations that enable easy autoloading. Ex: class App_Model_User resides in the file App/Model/User.php.
I think what can be potentially confusing is that in the script context, where you have not yet
done the .htaccess magic that pushes all requests to public/index.php
set your APPLICATION_PATH and include paths in public/index.php
created your Application or Bootstrap object tied to a config file
it can be a little unclear how best to avail yourself of the ZF goodness we get in that context and want in another context.
I guess my answer to the original question would be that the usual entry point sequence of
http request -> .htaccess -> index.php -> config
sets up much of our environment for us, so we would need to duplicate some of that for a different entry path.
So, for your script, my first instinct would be to create a common include file that mirrors much of what happens in index.php: it sets the include paths and APPLICATION_PATH, instantiates and calls a bootstrap, and then you do your script-specific processing.
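A stripped-down sketch of such an include (the paths, the 'Mine_' prefix and the resources.db section are placeholders borrowed from the question; your application.ini may look different):

// scripts/bootstrap.php -- mirrors what public/index.php normally sets up
define('APPLICATION_PATH', realpath(dirname(__FILE__) . '/../application'));
set_include_path(implode(PATH_SEPARATOR, array(
    realpath(APPLICATION_PATH . '/../library'),
    get_include_path(),
)));

require_once 'Zend/Loader/Autoloader.php';
Zend_Loader_Autoloader::getInstance()->registerNamespace('Mine_');

// Re-use the same config and DB adapter the web front controller uses.
$config = new Zend_Config_Ini(APPLICATION_PATH . '/configs/application.ini', 'production');
Zend_Registry::set('config', $config);
Zend_Db_Table::setDefaultAdapter(Zend_Db::factory($config->resources->db));

The cron script would then just require this bootstrap at the top and work with the models exactly as a controller would.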
Even better, it might be desirable to create a single entry point for all your scripts, like we do in the http/web context. Extend Zend_Application for your own script purposes so that $application->run(); no longer starts up the MVC router-controller-dispatch processing, but rather does your own stuff. In that way, this single script entry point would look almost identical to the web entry point, the only difference being which application object gets instantiated. Then pass the name of your desired Application class as a command line parameter to the script.
But here I confess to being less confident and just throwing out ideas.
Hope all this helps someone. It actually helped me to write it all down. Thanks and cheers!
Update 2009-09-29: Just ran across this article: Using Zend Framework from the Command Line
Update 2009-11-20: And another article: Cron jobs in Zend Framework | GS Design
Update 2010-02-25: Easy command line scripts with Zend Application - David Caunt
What solution would you recommend for including files in a PHP project?
No manual calls to require/include functions -- everything loads through autoload functions.
Package importing, when needed.
Here is the package importing API:
import('util.html.HTMLParser');
import('template.arras.*');
Inside this function you explode the string on dots (the package hierarchy delimiter) and loop through the files in the particular package (folder), including just one of them, or all of them if an asterisk is found at the end of the string, e.g. 'template.arras.*'.
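Roughly, the sketch I have in mind looks like this (ROOT is a constant pointing at the package root; everything here is illustrative):

// import('util.html.HTMLParser') -> ROOT/util/html/HTMLParser.php
// import('template.arras.*')     -> every .php file in ROOT/template/arras/
function import($package)
{
    $path = ROOT . '/' . str_replace('.', '/', $package);
    if (substr($package, -2) === '.*') {
        foreach (glob(dirname($path) . '/*.php') as $file) {
            require_once $file;
        }
    } else {
        require_once $path . '.php';
    }
}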
One of the benefits I can see in the package importing method is that it can force you to use better object decomposition and class grouping.
One of the drawbacks I can see in the autoload method is that the autoload function can become very big and not very obvious/readable.
What do you think about it?
What benefits/drawbacks can you name in each of these methods?
How can I find the best solution for the project?
How can I know if there will be any performance problems if package management is used?
I use __autoload() extensively. The autoload function that we use in our application has a few tweaks for backwards compatibility with older classes, but we generally follow a convention when creating new classes that allows __autoload() to work fairly seamlessly:
Consistent Class Naming: each class lives in its own file, and each class is named in camel case with underscore-separated segments. This maps to the class path. For example, Some_CoolClass maps to 'Some/CoolClass.class.php' under our class directory. I think some frameworks use this convention.
Explicitly Require External Classes: since we don't have control over the naming of any external libraries that we use, we load them using PHP's require_once() function.
The import method is an improvement, but it still loads more than is needed, either because of the asterisk or because you end up importing everything at the beginning of the script (importing before every "new Classname" would become cumbersome).
I'm a fan of __autoload(), or the even better spl_autoload_register().
It will include only the classes you're actually using, with the extra benefit of not caring where a class is located: if your colleagues move a file to another directory, you are not affected.
The downside is that it needs additional logic to make it work properly with directories.
I use require_once("../path-to-auto-load-script.php.inc") with auto load
I have a standard naming convention for all classes and inc files, which makes it easier to programmatically determine which class name is currently being requested.
for example, all classes have a certain extension like inc.php (so I know that they'll be in the /cls directory)
and
all inc files start with .ht (so they'll be in the /inc directory)
Autoload accepts one parameter, the class name, which I then use to determine where the file is actually located. Once I know my target directory, I loop, each time prepending "../" to account for sub-sub-pages (which otherwise seemed to break autoload for me), and finally require_once the actual code file once it is found.
I strongly suggest doing the following instead:
Throw all your classes into a static array mapping className => path/to/classFile. The autoload function can use that to load classes.
This ensures that you always load the minimum number of files. It also means you avoid completely silly class names, and the parsing of said names.
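Something along these lines (class names and paths are placeholders; in practice the map would be generated rather than typed by hand):

// Class map: every known class and the file that defines it.
$classMap = array(
    'HTMLParser'    => '/app/util/html/HTMLParser.php',
    'ArrasTemplate' => '/app/template/arras/Template.php',
);
spl_autoload_register(function ($class) use ($classMap) {
    if (isset($classMap[$class])) {
        require $classMap[$class];
    }
});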
If it's slow, you can throw on an accelerator, and that will gain you a whole lot more. If it is still slow, you can run things through a 'compile' process, where often-used files are just dumped into common files and the autoload references are updated to point to the correct place.
If you start running into issues where your autoloading is too slow (which I find hard to believe), you can split it up by package and have multiple autoload functions; this way only subsets of the array are required. This works best if your packages are defined around modules of your software (login, admin, email, ...).
I'm not a fan of __autoload(). In a lot of libraries (some PEAR libraries, for instance), developers use class_exists() without passing in the relatively new second parameter. Any legacy code you have could also have this issue. This can cause warnings and errors if you have an __autoload() defined.
If your libraries are clear though, and you don't have legacy code to deal with, it's a fantastic tool. I sometimes wish PHP had been a little smarter about how they managed the behavior of class_exists(), because I think the problem is with that functionality rather than __autoload().
Rolling your own packaging system is probably a bad idea. I would suggest that you go with explicit manual includes, or with autoload (or a combination for that matter).