Is it quicker to not load/include many files in PHP?

I have a functions file of about 2,000 lines, and I have realized I can split it into about 4 files and only include the ones required. At the moment it is very neat, and ideally I would like to leave it that way; however, if speed gains can be had, I would like to include the different sections only under particular conditions.
My question is basically: would it be quicker to have an if statement and only load the PHP functions needed? Speed is a factor, as this library is called in an AJAX polling situation.

The best approach would be to divide all your code into meaningful folders/files/classes/functions. This serves the purpose of maintainable and readable code.
Then use some kind of opcode cache like APC, which removes the problem of many includes/IOs almost completely.
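A minimal sketch of that layout (the loader and the group/file names are hypothetical):
<?php
// Hypothetical loader: each function group lives in its own file,
// e.g. functions/db.php, functions/polling.php.
function load_function_groups(array $groups)
{
    foreach ($groups as $group) {
        // basename() stops a malicious name from escaping the directory;
        // require_once guards against redeclaring the same functions.
        require_once __DIR__ . '/functions/' . basename($group) . '.php';
    }
}

// An AJAX polling endpoint might pull in only what it needs:
load_function_groups(array('db', 'polling'));
With an opcode cache in place, the remaining per-request cost is mostly executing the function definitions, not re-reading and re-parsing the files.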

Related

List or single line array/parameters: Does one or the other perform better?

A little bit of a generic question, but it has been playing on my mind for a while.
Whilst learning PHP coding, to help me create a WordPress theme from scratch, I have noticed that some arrays/parameters are kept to a single line whilst others are listed underneath one another. Personally, I prefer listing the arrays underneath one another, as I feel this helps with readability and generally just looks tidier, especially if the array is long.
Does anyone know if listing arrays/parameters has any performance ill effects, such as slowing down the page load speed? As far as I can see, it is just a coder's preference. Is this a correct assumption?
Code formatting has no effect on performance.
Even if a larger file takes marginally longer to read, PHP 5.5 and later ship with an opcode cache: PHP caches the compiled form of your files for subsequent requests, eliminating any formatting that you have in your file.
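If you want to confirm the cache is active, a quick check (assuming the OPcache extension, bundled since PHP 5.5, is available):
<?php
// Sanity check: is the opcode cache compiled in and enabled?
if (function_exists('opcache_get_status')
    && ($status = opcache_get_status(false)) !== false) {
    var_dump($status['opcache_enabled']);
} else {
    echo "OPcache is not loaded or not enabled\n";
}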

One PHP file for multiple pages?

I'm creating a dashboard which will have different pages for different purposes. I'm wondering whether I should have one PHP file for each page, or one file for all the pages.
The pros as I see them:
One file with all functions to simplify work (fewer files, less clutter)
Cons:
Longer loading time because of one large file?
I guess my major question is if there really is a major advantage/disadvantage to either?
To use a building analogy: nobody says we can't pile 50 rooms one above another instead of spreading 25 across two levels!
1) Putting all code in one file is less clutter indeed, but it's more likely that everything breaks once something goes wrong at some point.
2) Depending on how many files you have to create, I would advise separating functionality into multiple files. If one breaks, the others may still work.
3) I love Ajax in that it can reduce all this "clutter issue", keeping in mind, though, that everything will rely on your Ajax code being well done and securely put together
:)
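One common middle ground is a single entry point that includes only the page file being requested, checked against a whitelist (file names here are hypothetical):
<?php
// index.php - hypothetical front controller for the dashboard.
$pages = array(
    'home'    => 'pages/home.php',
    'reports' => 'pages/reports.php',
    'users'   => 'pages/users.php',
);

$page = isset($_GET['page']) ? $_GET['page'] : 'home';

// Whitelisting avoids including arbitrary files from user input.
if (isset($pages[$page])) {
    require $pages[$page];
} else {
    http_response_code(404);
    echo 'Page not found';
}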

How many lines are too many when reading from a file?

I intend to make a dynamic list in PHP, for which I have a plain text file with one element of the list per line. Every line holds a string that needs to be parsed into several smaller chunks before rendering the final HTML document.
Last time I did something similar, I used the file() function to load my file into an array, but in this case I have a 12KB file with more than 50 lines, which will most certainly grow bigger over time. Should I load the entries from the file into a SQL database to avoid performance issues?
Yes, put the information into a database. Not for performance reasons (in terms of sequential reading), because a 12KB file will be read very quickly, but for the part about parsing into separate chunks. Make those chunks into columns of your DB table. It will make the whole programming process go faster, and give you greater flexibility.
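If you do move to a database, a sketch of the one-time import (the table, its columns, and the pipe delimiter are all assumptions about your data):
<?php
// Parse each line into chunks and store the chunks as columns.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO list_items (title, url, description) VALUES (?, ?, ?)'
);

foreach (file('list.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $parts = explode('|', $line, 3);
    if (count($parts) === 3) {
        $stmt->execute($parts); // one row per list element
    }
}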
Breaking data up into a properly structured database is almost always a good idea and will be a performance saver.
However, 50 lines is pretty minor (even a few hundred lines is pretty minor). A bit of quick math, 12KB / 50 lines tells me each line is only about 240 characters long on average.
I doubt that amount of processing (or even several times that much) will be a significant enough performance hit to cause dread unless this is a super high performance site.
While 50 lines doesn't seem like too much, it would be a good idea to use the database now rather than making the change later. One thing you have to remember is that using a database won't straight away eliminate performance issues, but it will help you make better use of resources. In fact, you could write a similarly optimized process using files, and it would work just about the same except for the I/O difference.
Rereading the question, I realize you might mean that you would load the file into the database on every request. I don't see how that can help unless you are using the database as a form of cache to avoid repeated hits to the file. Ultimately, reading from a file or a database only differs in how the script uses I/O, disk caches, etc. The processing you do on the list might make more of a difference here.
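For comparison, staying with the flat file is also only a few lines (again assuming pipe-delimited fields):
<?php
// Load the whole file and parse each line into chunks before rendering.
$items = array();
foreach (file('list.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $parts = explode('|', $line, 3);
    if (count($parts) === 3) {
        $items[] = $parts;
    }
}
// 50-odd lines of ~240 characters parse in well under a millisecond,
// so the database pays off for querying and flexibility, not raw speed.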

Most efficient way to share functions among several php pages

I have about 10 dynamic php pages which use about 30 functions. Each function is needed in more than 1 page, and every page needs a different subset of functions.
I've been pondering these options:
1 - all functions in a single include file: every page loads unneeded code
2 - each function in its own include file: too many includes (file reads) when loading each page
3 - a single include file with conditionals, declaring only the functions needed based on REQUEST_URI: additional processing when loading each page
4 - one include file per PHP page, with copies of the functions needed by that page: difficult to maintain
How do people handle this scenario? Thanks!
Option 1 is the simplest, the easiest to maintain, and probably quicker to run than options 2 and 3. Option 4 would be very very slightly faster to run, at the cost of being a maintenance nightmare.
Stick with option 1.
Throw related functions into a library include, and include the libraries as needed.
Further, if you spend another 5 seconds thinking about this, that will be 5 additional seconds you've wasted.
(In case you don't get what I'm saying: worrying about include optimization is about the five-billionth thing on your list of things you should ever worry about, until such time as a reported performance problem from end users, and subsequent profiling, tells you otherwise.)
Your problem shouts OOP.
Organize code into classes. Load classes as needed.
Read more into PHP OOP.
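A minimal sketch of loading classes on demand via PHP's autoloading (the class name and directory layout are hypothetical):
<?php
// Register an autoloader: classes/Foo.php is expected to define class Foo.
spl_autoload_register(function ($class) {
    $file = __DIR__ . '/classes/' . $class . '.php';
    if (is_file($file)) {
        require $file;
    }
});

// classes/ReportGenerator.php is only read from disk here, on first use.
$report = new ReportGenerator();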
The impact on your server of loading up 30 functions is negligible compared to the impact on you of having to maintain a map of, and possibly links to, all 30 functions.
Start by putting them all in one include_once'ed file.
If performance becomes an issue, start by installing a PHP accelerator (to cache the parsed PHP into opcodes, so constant re-parsing is avoided) on the server.
When maintenance becomes an issue, break them up by function. You'll still end up with a catch-all "util.inc" or "misc.inc" file, though...

Which is better performance in PHP?

I generally include one functions file in the header of my site. The site is pretty high traffic, and I like to make every little thing the best that I can, so my question here is:
Is it better to include multiple smaller function files with just the code that's needed for that page, or does it really make no difference to load it all as one big file? My current functions file has all the functions for my whole site; it's about 4,000 lines long and is loaded on every single page load, sitewide. Is that bad?
It's difficult to say. 4,000 lines isn't that large in the realms of file parsing. In terms of code management, that's starting to get on the unwieldy side, but you're not likely to see much of a measurable performance difference by breaking it up into 2, 5 or 10 files, and having pages include only the few they need (it's better coding practice, but that's a separate issue). Your differential in number-of-lines read vs. number-of-files that the parser needs to open doesn't seem large enough to warrant anything significant. My initial reaction is that this is probably not an issue you need to worry about.
On the opposite side of the coin, I worked on an enterprise-level project where some operations had an include() tree that often extended into the hundreds of files. Profiling these operations indicated that the time taken by the include() calls alone made up 2-3 seconds of a 10 second load operation (this was PHP4).
If you can install extensions on your server, you should take a look at APC.
It is free, by the way ;-) but you must be an admin of your server to install it, so it's generally not provided on shared hosting...
It is what is called an "opcode cache".
Basically, when a PHP script is called, two things happen:
the script is "compiled" into opcodes
the opcodes are executed
APC keeps the opcodes in RAM, so the file doesn't have to be re-compiled each time it is called, which is a great thing for both CPU load and performance.
To answer the question a bit more:
4,000 lines is not that much, speaking of performance; open a couple of files of any big application/framework, and you'll quickly get to a couple thousand lines
a really important thing to take into account is maintainability: what will be easier to work with for you and your team?
loading many small files might imply many system calls, which are slow; but those would probably be cached by the OS... so probably not that relevant
if you are doing even one database query, that query (including the network round-trip between the PHP server and the DB server) will probably take more time than the parsing of a couple thousand lines ;-)
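If you'd rather measure than guess, a rough sketch of timing the include itself (functions.php stands in for your real file):
<?php
// Rough timing of the include cost. Run with the opcode cache on and off
// to see the difference; this is a sanity check, not a rigorous benchmark.
$start = microtime(true);
require 'functions.php';
printf("include took %.3f ms\n", (microtime(true) - $start) * 1000);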
I think it would be better if you could split the functions file up into components appropriate for each page, and include those components in the appropriate pages. Just my 2 cents!
P.S. I'm a PHP amateur and I'm trying my hand at making a PHP site; I'm not using any functions. So can you enlighten me on what functions you would need for a site?
In my experience having a large include file which gets included everywhere can actually kill performance. I worked on a browser game where we had all game rules as dynamically generated PHP (among others) and the file weighed in at around 500 KiB. It definitely affected performance and we considered generating a PHP extension instead.
However, as usual, I'd say you should do what you're doing now until it is a performance problem and then optimize as needed.
If you load a 4000 line file and use maybe 1 function that is 10 lines, then yes I would say it is inefficient. Even if you used lots of functions of a combined 1000 lines, it is still inefficient.
My suggestion would be to group related functions together and store them in separate files. That way if a page only deals with, for example, database functions you can load just your database functions file/library.
Another reason for splitting the functions up is maintainability. If you need to change a function, you have to find it in your monolithic include file. You may also have functions that are very, very similar without even realising it. Sorting functions by what they do allows you to compare them and get rid of things you don't need, or merge two functions into one more general-purpose function.
Most of the time, disk I/O is what will kill your server, so I think the fewer files you fetch from disk the better. Furthermore, if it is possible to install APC, the compiled file will be stored in memory, which is a big win.
Generally it is better, file management wise, to break stuff down into smaller files because you only need to load the files that you actually use. But, at 4,000 lines, it probably won't make too much of a difference.
I'd suggest a solution similar to this:
<?php
// Helpers to pull in library/class files by name.
// Assumes files live under /path/to/lib/, e.g. /path/to/lib/db.lib.php.
function inc_lib($name)
{
    // include_once prevents fatal redeclaration errors on repeat calls.
    include_once '/path/to/lib/' . $name . '.lib.php';
}

function inc_class($name)
{
    include_once '/path/to/lib/' . $name . '.class.php';
}
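Usage would then be (library names here are hypothetical):
inc_lib('db');     // pulls in /path/to/lib/db.lib.php
inc_class('user'); // pulls in /path/to/lib/user.class.php
Note that functions and classes declared at the top level of an included file land in the global scope even when the include happens inside a function, so helpers like these work as expected.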
