PHP include inside loop - is it hard on disk IO?

I'm working on a website that shows products.
The site is written in PHP.
To make it easier to maintain, I created a PHP file for the "product" item with thumbnail, price, etc.
I would like to know whether it is hard on disk IO to put an include inside a foreach. Let's say the array contains about 200 items.
foreach ($wines as $wine):
    require 'components/wine.php';
endforeach;
Are we still OK, or will this cause hosting issues?
Thanks!

Answer
Regarding your question, though: it's probably OK for the disk. Files imported using require() are cached as precompiled bytecode the same way as the main file (if you have OPcache or another cache system enabled), so PHP won't read it from disk every time you include it.
Recommendation
I would not recommend that approach at all, though. A better approach is to define a function that returns or displays whatever you want to show, then require the file once and call the function inside the loop.
I see several downsides in your approach:
It's bad practice: it couples your code, because the included file now only works inside this one file. The contents of the file have to be aware of the file that includes them, which makes the code harder to maintain and more error-prone.
It can cause problems in the future, e.g. if someone declares a function inside the file, requiring the file twice would redeclare the function, leading to a fatal error.
It adds overhead to the execution, as PHP performs some validations and operations every time a file is included.
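A minimal sketch of the function-based approach (the function name, fields, and markup are illustrative, not taken from the original component):

```php
<?php
// Hypothetical contents of components/wine.php: instead of emitting
// markup directly, the file only defines a rendering function.
function render_wine(array $wine): string
{
    return sprintf(
        '<div class="wine"><h3>%s</h3><p>%s</p></div>',
        htmlspecialchars($wine['name']),
        htmlspecialchars($wine['price'])
    );
}

// In the page: require the component file once, then call the
// function for every item in the loop.
// require 'components/wine.php';
$wines = [
    ['name' => 'Merlot', 'price' => '$12'],
    ['name' => 'Syrah',  'price' => '$15'],
];
foreach ($wines as $wine) {
    echo render_wine($wine);
}
```

This way the file is parsed once per request, and the per-item work is an ordinary function call rather than an include.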
If you want more information about require or OPcache, the documentation is linked below:
https://www.php.net/manual/en/function.include.php
https://www.php.net/manual/en/intro.opcache.php

Related

PHP / xdebug profiler require_once poor performance

I just started using xdebug to profile my application and immediately noticed something strange in the results. One of the require_once calls is shown to be taking around 12% of the processing time. There are quite a few other require_once calls throughout the application, and they're all taking less than 1% of the processing time.
The poorly-performing require_once is including a file that's not significantly different or larger than any of the other files, so I'm not sure what could be causing the problem. Has anybody else ever experienced something like this?
Edit: Wanted to provide a little more info. I'm doing the profiling on Windows using XAMPP. Normally the application runs on a Unix box. I haven't got an easy way to get xdebug onto that box, so it may not be feasible for me to compare the results that way.
One last edit: Here's an idea of the code in case that helps (intentionally being vague for the standard CYA legal reasons blah blah blah):
This class is the one with the slow include (test.inc):
require_once('/xx/yy/zz/dao/basedao.inc');
require_once('/xx/yy/zz/vo/test.inc');

class TestDAO extends BaseDAO {
    // bunch of code to handle database records and return VO objects
}
And this is the file being included:
require_once('/xx/yy/zz/vo/basevo.inc');

class Test extends BaseVO {
    // bunch of properties, getters/setters, that kinda stuff
}
I have quite a few other VO/DAO objects that are built the exact same way, without any issue. All are located within the same respective paths.
That does indeed sound odd. Definitely worth pursuing, though it'll be hard to work it out for sure without seeing the actual code. 12% of the total program time for a single require_once() does sound very excessive.
But here are some thoughts on possible avenues of investigation:
require_once() keeps a lookup table of files that have been included, so perhaps it's slowing things down having to refer to that lookup table.
If this is the cause, you could solve it by using require() rather than require_once() wherever possible.
Perhaps it's the path lookup? Are you including a path with the filename? If not, PHP checks a number of places to find the file; if it isn't in the first place PHP looks, finding the file takes longer before it can be included.
If this is the cause, you could solve it by being more specific about the path in your code.
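A quick sketch of the absolute-path suggestion (the file name and class are made-up stand-ins for basedao.inc):

```php
<?php
// Demo: write a small include file, then require it by full path so
// PHP never has to probe the include_path directories to find it.
$file = sys_get_temp_dir() . '/basedao_demo.inc';
file_put_contents($file, '<?php class BaseDAO { public $ok = true; }');

require_once $file;   // parsed and recorded in the included-files table
require_once $file;   // no-op: the table says it's already loaded

$dao = new BaseDAO();
```

In real code the absolute path would typically be built from the current file's location, e.g. `dirname(__FILE__) . '/dao/basedao.inc'`.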
Hope that helps. Would be interested to hear how this pans out.
Oh, and by the way -- if your biggest problem area in your code is require_once(), then it sounds like you've done a good job with your code! I dream of the day require_once() even shows up in my profiler reports, let alone with a significant effect.

PHP's include_once is just a static "instance" of that file?

Reading over the documentation about include_once it basically says that the file in question won't be re-included if it's included already.
Can I equate this in my head as a static instance of the script, or will this lead me to trouble down the road?
EDIT
Not looking for a tutorial on include_once; I understand it quite well. I would like to know whether I can conceptually attach the properties and characteristics of a static member to this idea.
I'm not sure how the actual function works, but the best way to think about it is:
"If this file has been included/required earlier in this stream, no need to include/require it again."
The point is to avoid including a file that redefines a class or function that was already defined when the same file was included earlier.
The important thing to keep in mind is that the script (and any scripts it includes along the way) has a start and a finish. Any call to include_once checks whether that file has already been included since the start of the whole script. If it has, PHP doesn't bother to include it again and relies on the originally included copy.
Once the script is finished (no work left to be done, the stream is closed, interrupted, aborted, etc.), re-running the script treats the first include/include_once of a file as a first include again, since it's the first time that file has been included during the new run. In that case, you're back to square one, with all values reset to their defaults or unset.
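A small demonstration of that bookkeeping within a single run (the counter file is illustrative); on the next request, the included-files table starts empty again and the first include_once executes the file once more:

```php
<?php
// The helper file increments a counter every time it actually executes.
$file = sys_get_temp_dir() . '/counter_demo.inc';
file_put_contents($file, '<?php $GLOBALS["loads"] = ($GLOBALS["loads"] ?? 0) + 1;');

include_once $file;  // executed: $loads becomes 1
include_once $file;  // skipped: already in the included-files table
include $file;       // plain include ignores the table: $loads becomes 2

echo $GLOBALS['loads']; // 2
```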

Make getimagesize() referenced cached values

I am tweaking a zen-cart website to be more CPU-efficient. After profiling it, I see that the getimagesize function accounts for nearly 50% of page load time. The images are stored locally. One option is to go through zen-cart's source code and replace this function with something custom that references a cached value, since images rarely change. However, since PHP is open source, perhaps another option is available: is there any way to modify this function to make it just read a value from a cache, which I could set whenever I upload an image to the server? Maybe by adding an optional parameter to the function that makes it read from the cache.
Interesting idea, but this would require recompiling PHP. While not impossible, it's probably not a good idea from a maintenance point of view: You would have to re-integrate your patch on every PHP update.
However, you might be able to override getimagesize(): There seem to be PHP modules and libraries that can add this capability to PHP.
I have no experience with any of them, but here are some suggestions on how to do it:
PHP - override existing function
Override default php function (the namespaces idea is clever, but probably won't work in your case)
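If patching or overriding PHP is off the table, a userland wrapper is another sketch worth considering; the function name and caching policy here are my own invention, not zen-cart's:

```php
<?php
// A caching wrapper around getimagesize() - no PHP patching needed,
// though it does require touching the call sites. Results are keyed by
// path + mtime, so the cache invalidates itself when an image changes.
function getimagesize_cached(string $path)
{
    static $cache = [];
    $key = $path . ':' . filemtime($path);
    if (!isset($cache[$key])) {
        $cache[$key] = getimagesize($path);
    }
    return $cache[$key];
}
```

The static array only lives for one request; for cross-request caching you would persist it instead (e.g. serialize it to a file, or use a shared-memory cache).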

Alternative to eval() when caching and displaying generated PHP pages

I've worked on a CMS which would use Smarty to build the content pages as PHP files, then save them to disc so all subsequent views of the same page could bypass the generation phase, keeping DB load and page loading times down. These pages would be completely standalone and not have to run in the context of another script.
The problem was that when a user first visited a page that wasn't cached, they'd still have to be shown the generated content. I was hoping I could save my generated file and then include() it, but filesystem latency meant that this wasn't an option.
The only solution I could find was using eval() to run the generated string after it was generated and saved to disc. While this works, it's not nice to have to debug in, so I'd be very interested in finding an alternative.
Is there some method I could use other than eval in the above case?
Given your scenario, I do not think there is an alternative.
As for the debugging part, you could always write it to disc and include it during development to test and fix things up that way, then switch over to eval once you have the bugs worked out.
Not knowing your system, I won't second-guess that you know it better than I do, but it seems like a lot of effort, especially since the above scenario will only happen once per page... ever. Is it really worth it to display that one initial page through eval, and why couldn't you be the initial user who generates the pages?
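The write-then-include debugging idea above could be sketched like this (DEBUG, the function, and the paths are all illustrative):

```php
<?php
// During development, include the cached file from disc so errors
// report real file and line numbers; in production, eval the generated
// string directly and skip the re-read.
define('DEBUG', true);

function run_generated(string $php, string $cachePath): void
{
    file_put_contents($cachePath, "<?php " . $php);
    if (DEBUG) {
        include $cachePath;  // debuggable: a real file on disc
    } else {
        eval($php);          // production path
    }
}
```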

Which is better: require_once? or if (!defined('FOOBAR')) require? or something entirely different?

Assuming PHP version >= 5.2, which is a better solution for managing includes:
require_once DEFPATH . 'config.inc.php';
or
if (!defined('FOOBAR')) require DEFPATH . 'config.inc.php';
or something entirely different?
Site does not use a front controller or autoloader. Typical number of files needed to be included is 3 - 8.
I've heard, here and elsewhere, that require_once adds overhead and does not play nice with caching. But I've also heard that a small number of require_once statements is OK in practice in later versions of PHP.
There must also be some overhead associated with checking if something is defined, but that may be less of an issue.
In general, which is the better practice and why?
Yes, require_once() comes with CPU and memory overhead, but not in any way significant to performance. In fact, it's a very efficient operation because PHP just does one hashtable lookup to decide whether that file has already been parsed or not. Don't give yourself any unnecessary headaches with hacks like your !defined('FOOBAR') example, because it does the same thing only with less elegance.
What takes time during a PHP request is the file path lookup and then the subsequent parsing step. The amount of resources needed for the runtime to determine whether you've already parsed the file before is negligible.
There are a few things you can do to make your request run faster:
avoid unnecessarily long search paths in include() and require()
use absolute include paths where possible
include files only as needed
cache large output fragments
cache complex query results
write terse code; offload seldom-used functionality to included-on-demand source files
If you're using a code cache or some kind of runtime optimizer, read the manual - they have different strategies that can actually hurt performance or introduce bugs depending on your code.
One final piece of advice: beware premature optimization! Requests taking between 0 and 50 milliseconds on the server are practically indistinguishable on the user side.
I would use require_once in your case. Even if it adds overhead, it's negligible, especially when you only have to include at most 8 files. require_once is also a lot less error-prone and cleaner.
In your example, checking if FOOBAR is defined would be better, but only because your example does not specify a full path to the include file.
You should always specify the full path for just about anything, unless you specifically want a relative path. Otherwise PHP has to search the various include-path directories for your file, and if it happens to be in the last one, that adds a bit of overhead.
I always create an array containing the full path to directories I load files from, then I prepend the appropriate path to the file I'm loading.
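That path-array idea might look something like this (the directory names and the helper are illustrative):

```php
<?php
// Map logical areas of the application to absolute directories, then
// prepend the right one before requiring a file - so PHP never has to
// walk the include_path.
$paths = [
    'config' => '/var/www/app/config/',
    'lib'    => '/var/www/app/lib/',
];

function load(array $paths, string $area, string $file): void
{
    require_once $paths[$area] . $file;
}

// load($paths, 'config', 'config.inc.php');
```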
