I have a problem that causes me real headaches whenever I'm designing my apps in PHP: I don't know whether I should create separate files for each function (e.g. functions for validating specific forms).
OK, one could argue that this makes no sense because I would have to include each file separately, and that might result in a slower application.
But I still think it makes sense: for a single page load, I doubt the script would use most of the other functions at all, so why load them for nothing? Besides, I don't have to include each function file manually if the system I design does it dynamically for me (by parsing URL vars or similar), i.e. loading function files exactly when needed. What do you think?
The overhead of file includes is minimal; you shouldn't really have to worry about it, considering caching and other things. It's more about how you keep yourself organized and find your stuff quickly.
Honestly, I rarely use plain functions; I use classes. The rule is usually one class per file. But I also have a toolbox file that contains all my global functions.
Are you using OO? If so, then you should definitely keep it one class per file, and name the files intelligently...
class Page {
...
}
should be findable somewhere like classes/Page.php or includes/Page.class.php or similar.
If you just have a bunch of global functions, you should group them in files, e.g. includes/functions/general.php.
To elaborate, your functions folder may have...
array.php
string.php
form_validation.php
request.php
general.php
html.php
If you are organising your files like this, a better idea is to use a class and make the functions static, e.g. string::isAlphaNum($str). This is a better idea because it introduces only one new name into your global namespace, instead of a bunch of them needlessly.
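For example, a minimal sketch of such a static grouping (the class and method names are illustrative; since string is a reserved type name in PHP 7+, Str is used here instead):
class Str
{
    // Only "Str" enters the global namespace; the helpers hang off it.
    public static function isAlphaNum($str)
    {
        return ctype_alnum($str);
    }
}

var_dump(Str::isAlphaNum('abc123')); // bool(true)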
If you are using PHP 5.3, you could also look at namespaces.
You should just make sure that you have APC, XCache or eAccelerator installed. All of them provide a cache for compiled PHP bytecode.
That means that once a file has been included, its compiled form is kept in memory, ready to use for future requests; the file won't need to be read and recompiled from disk again.
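As a rough sketch, enabling APC comes down to a couple of php.ini directives (values illustrative):
extension=apc.so
apc.enabled=1
apc.shm_size=64M  ; shared memory for the bytecode cache (older APC builds expect a plain MB number)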
You will almost certainly see a more significant performance hit from the increased disk I/O of many separate file includes than from a smaller set of files containing many functions.
For automatic file includes, wrap functions into suitable classes and use spl_autoload to let PHP handle the include process.
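A minimal sketch of that approach, assuming a classes/ directory and a hypothetical FormValidation class:
spl_autoload_register(function ($class) {
    // Maps e.g. FormValidation to classes/FormValidation.php
    $file = __DIR__ . '/classes/' . $class . '.php';
    if (is_file($file)) {
        require $file;
    }
});

// The first static call triggers the include automatically:
FormValidation::check($_POST);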
I have a single functions file for my entire site, and on a given page 90% of the file isn't even called. So I want to load only the functions that are called on the page. I am new to PHP.
Group functions with common traits together into several objects by making them static functions, and put each object in a separate file. Then use PHP 5 autoloading to load the appropriate objects only when they are used.
You can split this file into many small files and include only the ones you need to use.
However, if it is not a big file, keeping everything in it will not hurt your performance at all.
You could split up your functions file into multiple files, but bear in mind that loading may even become slower, because you need more I/O operations to load the different files.
Furthermore, you should split files by functionality. If you feel all these functions belong together, keep them together in that file. It will not slow down your script very much.
If you like, you can put the functions in (static) classes and use an autoloader to load the file, but I'm not in favor of this solution. I think static classes are just an excuse to get functions (and vars) out of the global scope, and creating classes just for auto-loading is abusing the autoload functionality.
Of course, if you create a more object oriented script, using classes makes sense too, and auto-loading them might be convenient.
Actually it's not a problem to have a function available that you only use in part of your application. It's more important that you have everything at hand when you need it, without having to care about when to load it.
If your system grows, what you might look for is an autoloader. PHP supports autoloading of classes, but not of functions. However, you can group your functions into classes (some will slap me for making such a statement) and then make use of autoloading.
If your file size is not big, then it will not decrease your performance at all. Still, if you want to achieve your goal, group related functions, place them in separate files, and include only the necessary files.
I'd like to check whether my understanding of require_once() is correct. I have a bunch of functions in the file foo.php. Let's say 7 or 8 of them are always used, and one is rather rare and rather large. I have to keep this function's definition in foo.php. Can you tell me whether the following approach achieves anything, and whether you think it's worth what it achieves?
Take out the body of the function and put it in an external file.
Redefine the original function to require_once() that particular file and pass execution over to the helper function.
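A minimal sketch of what I mean (file and function names are made up):
// foo.php
function rare_heavy_task($data)
{
    // Pull in the real body only on the first call.
    require_once __DIR__ . '/rare_heavy_task_impl.php';
    return _rare_heavy_task_impl($data);
}

// rare_heavy_task_impl.php
function _rare_heavy_task_impl($data)
{
    // ... the large, rarely used body lives here ...
}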
I understand this could save server memory on requests where that function isn't run. What exactly am I saving, though? Just the memory it takes to hold the body of the function? That would mean it would have to be a pretty big function before it's worth it, I guess. Also, if I'm using something like APC, does it become less useful?
Please correct or add to this as appropriate!
Thank you very much.
I highly doubt you will gain anything from refactoring it into a separate file. Also, do you actually have any problems with the code as it is now that make you consider this? You know premature optimization is the root of all evil. Profile your code to see if and where it has bottlenecks before you waste your time on pointless micro-optimizations.
Update: From your comment to Gordon I see you're using Drupal: This method doesn't work there, because Drupal is heavily object-oriented already. What is described here is making use of the Autoloading mechanism for a function-based project using static classes.
As Gordon says, this will save memory only for really incredibly huge functions. In my experience, though, loading includes into memory takes up much more space than the exact number of bytes needed to hold the code. I'm not a PHP internals expert, but I assume the code is parsed or at least preprocessed either way.
Assuming your PHP application is entirely function-based with no OOP, one idea that comes to mind that you could use to split up your includes is putting your functions into classes, and making use of the autoloader mechanism:
class maintenance_functions
{
    public static function xyz() { /* ... */ }
}
and start calling them statically:
maintenance_functions::xyz();
One class would occupy one file.
Group all the rarely used functions into separate files.
Then, you could make use of the Autoloading mechanism. This will load needed classes automatically at the point they are needed. If I for example, call
datamining_functions::xyz();
the autoloader will look for the file containing datamining_functions and include that. This eliminates the complete 'require()' hassle and lets you concentrate on how to group your functions most efficiently.
This is not a real transition to OOP: We just use class constructs to group the functions, in order to be able to use the autoloader. It should be possible to migrate functions into such classes without the need for major rewrites.
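The autoloader itself can be as small as this (the lib/ directory is an assumption):
spl_autoload_register(function ($class) {
    // e.g. datamining_functions => lib/datamining_functions.php
    require __DIR__ . '/lib/' . $class . '.php';
});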
You should use Xdebug to find out the answer; whether it's worth it is subjective.
I'm trying to find the best pragmatic approach to import functions on the fly... let me explain.
Say I have a directory called functions which has these files:
array_select.func.php
stat_median.func.php
stat_mean.func.php
.....
I would like to load each individual file (which has one function defined inside) and use that function just like an internal PHP function, such as array_pop(), array_shift(), etc.
I once stumbled on a tutorial (which I can't find again now) that compiled user-defined functions into the PHP installation itself. That's not a very good solution, though, because on shared/reseller hosting you can't recompile the PHP installation.
I don't want conflicts with future versions of PHP or other extensions; i.e. if a function I named X suddenly becomes part of the internal PHP functions (even though it might not have the same functionality per se), I don't want PHP to throw a fatal error because of this and fail miserably.
So the best method I can think of is to check whether a function is defined using function_exists(); if it is, throw a notice so that it's easy to track in the log files, and otherwise define the function. However, that will probably translate to a lot of include/require statements in the other files where I need such a function, which I don't really like. Or possibly, read the directory and loop over each *.func.php file with include_once, though I find this a bit ugly.
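A sketch of that directory-scan variant (the paths and the filename-equals-function-name convention are assumptions):
foreach (glob(__DIR__ . '/functions/*.func.php') as $file) {
    $name = basename($file, '.func.php'); // e.g. "array_select"
    if (function_exists($name)) {
        // Already defined (perhaps a new PHP built-in): log it and skip.
        trigger_error("Skipping $file: $name already exists", E_USER_NOTICE);
        continue;
    }
    include_once $file;
}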
The question is, have you ever stumbled upon some source code which handled such a case? How was it implemented? Did you ever do something similar? I need as many ideas as possible! :)
One way you could pull something like this off is to put those functions into classes and then set up an __autoload function. If you are against wrapping the functions in classes, then this solution probably won't apply to you. Personally I like it, because it allows me to namespace my functions and share private methods between them.
First you set up your autoload function similar to this. You'll want to adjust the naming convention to fit your own style, and probably introduce some error handling, but this is just to get the basic idea across.
// Called automatically the first time an undefined class is referenced.
function __autoload($class_name)
{
    require_once(strtolower("library/$class_name.class.php"));
}
Then anywhere in your code regardless of scope you can do something like this.
arrayFunctions::doStuff($myArray);
PHP will automatically try to include "library/arrayFunctions.class.php" and look for a method called "doStuff" in the arrayFunctions class.
I have issues with this idea. Hitting the file system to include a single function is very expensive in terms of lowering your maximum possible requests per second.
It's generally much better to load/parse five functions in a single file (static class?) and only use two of them (one stat call) rather than load two files for two functions (two stat calls).
Which obviously becomes even worse when you need all five functions.
To automatically load stuff when needed, put your functions in classes and use autoloading.
For the name conflict, use namespaces (if you have PHP 5.3).
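For instance, PHP 5.3 namespaces cover plain functions as well as classes (the names here are illustrative):
// lib/strings.php
namespace mylib\str;

function truncate($s, $len)
{
    // strlen()/substr() fall through to the global built-ins.
    return strlen($s) <= $len ? $s : substr($s, 0, $len) . '...';
}

// caller:
require 'lib/strings.php';
echo \mylib\str\truncate('hello world', 5); // hello...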
If I had a large number of functions, would it be better to keep them all in one large file, or to separate them into several files of related functions? By better I mean more efficient, both for maintainability and for the server processing the request.
For example, right now I have all my functions in a single file named include.php. But would it be wiser to have an include file of includes like:
<?php
include('/functions/user.php');
include('/functions/admin.php');
include('/functions/content.php');
include('/functions/nav.php');
include('/functions/database.php');
include('/functions/other_junk.php');
?>
Definitely separate them, for maintainability's sake. I doubt performance will suffer at all, but even if it does (just a teensy bit), you're better off writing maintainable, readable code.
You want to be sure you're using a PHP cache like XCache or APC. Your PHP files should then all be in memory and you shouldn't be worried about your includes hitting the disk at all.
I would definitely find it easier if you broke related functions/classes up into their own files.
In terms of maintainability, it's usually better to separate your functions into related groups. ( like you showed above, user.php would be only the user-related functions ).
You should only have a file that has all of those includes if you know that you'll need all of the included files every time you need to include any file.
Otherwise, it defeats the purpose of having that 'catch-all' file.
In my experience, multiple includes and/or requires generally aren't going to set you back much if you're talking a couple dozen or so files for function libraries, especially if you can manage to call the statement for a particular file only once during a request's lifecycle.
Where it starts to show performance hits is when you get into OOP or a highly complex functional/procedural architecture where there may be hundreds of different classes/files. But generally at that point you'll hopefully have done some kind of mitigation via caching/compiling.
I have a list of includes in a central .config file.
For all OOP classes, though, I use autoload. I know it's slightly slower, but it saves having to include them whenever I create a new class, and they're only loaded as required.
As an aside, include is quicker than include_once as it doesn't have to check if the file has been included already.
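To illustrate (the path is hypothetical):
// include_once keeps a per-request list of files it has already loaded:
include_once 'functions/general.php'; // reads and compiles the file
include_once 'functions/general.php'; // skipped: already on the list (costs one lookup)

// Plain include skips that bookkeeping, so it is marginally faster, but
// including the same function file twice triggers a fatal "cannot redeclare" error.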
I have a javascript file that communicates to a controller class, which in turn delegates which function to run to a transactions class.
Is it better to have the transactions class broken up into multiple smaller files, and then in my switch statement include whichever smaller file I need? Or should I have all my transactions in one file?
I know keeping file size down is always a good idea, but will that affect my AJAX functions if my transactions file starts getting pretty lengthy?
I vote for smaller files, so you avoid the God Object antipattern.
Whether you include files or not is really up to you; in the end they use up very minimal disk I/O unless you include something like 100 files.
What I'd suggest is breaking them up into smaller files based on sets of functions (HTML functions, URL functions, etc) or classes (one class per file) just like any other script.
This is partly to save your sanity, and also because it is good practice to separate everything, so you can take one file containing all of the X functions over to another project that needs the same functions.