Is PHP's include resource-expensive (particularly during iterations)? - php

Does PHP cache include requests? I was wondering how to clean up my code and thought about using includes a bit more. Consider the following scheme:
[foreach answer] [include answer.tpl.php] [/foreach]
This would require including answer.tpl.php hundreds of times.
Does it cache? Will it have a worth-considering effect on performance?
Is that considered a good practice? Bad?
In response to @Aaron Murray's answer:
No, that won't work. The whole point of the _once variants is to prevent including the same file more than once (e.g. to prevent errors caused by redefining constant values).
Practical example would look like this:
# index.php
<?php
$array = array('a', 'b', 'c');
$output = '';
foreach ($array as $e) {
    // First pass returns $e ('a'); every later pass returns bool(true),
    // because require_once refuses to include the file again.
    $output .= require_once 'test.php';
}
echo $output;

# test.php
<?php
return $e;

Does PHP cache include requests?
As far as I know, PHP does not cache includes by default, but your underlying filesystem cache likely will, so accessing the same file over and over, as in your example, should be quite fast after all.
If you run into actual problems, you would first need to profile the application to find out where the bottleneck really is. So unless you actually hit a problem, I would consider the repeated include not harmful.
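If you do want numbers rather than guesses, a rough micro-benchmark is easy to sketch. This is illustrative only (the file name and iteration count are made up here); real profiling should use a proper profiler:

```php
<?php
// Rough sketch: time N repeated includes of a tiny template versus
// N calls to an equivalent function. Names and counts are illustrative.
$tpl = sys_get_temp_dir() . '/answer.tpl.php';
file_put_contents($tpl, '<?php return strtoupper($e);');

$n = 1000;

$start = microtime(true);
for ($i = 0; $i < $n; $i++) {
    $e = 'x';
    $r = include $tpl;           // a plain include re-runs the file each time
}
$includeTime = microtime(true) - $start;

$fn = function ($e) { return strtoupper($e); };
$start = microtime(true);
for ($i = 0; $i < $n; $i++) {
    $r = $fn('x');
}
$callTime = microtime(true) - $start;

printf("include: %.5fs, function call: %.5fs\n", $includeTime, $callTime);
unlink($tpl);
```

On a warm filesystem cache the gap is usually small in absolute terms, which is exactly why profiling first matters.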
Regarding the good practice, I think this is fairly well explained in this article: When Flat PHP meets Symfony.
Making your code a bit more re-useable
There is no high design here; it just shows how you can start making things more modular. You should be able to take the code 1:1 from your include file, just take care that all needed template variables are passed into the function (don't use globals for that; it will stand in your way sooner or later):
# in your answer.tpl.php
function answer_template(array $vars) {
    extract($vars);
    // ... your template code ...
}

# your script
$array = array('a', 'b', 'c');
$output = '';
require 'answer.tpl.php';
foreach ($array as $e) {
    $output .= answer_template(compact('e'));
}
echo $output;

Have you considered:
require_once('answer.tpl.php')
or
include_once('answer.tpl.php')
Of course, then you could include the 'required' files only in the scripts where they are really required.
Edit: Revamped answer:
index.php ->
require_once('answer.php');
echo answer(); // This function can be called from anywhere in the scripts.
answer.php ->
function answer() {
    return '<div>This is the answer</div>';
}
Also on a side note you could use output buffering in your function to capture HTML (slightly cleaner method for separating HTML and php) within your answer.php.
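A minimal sketch of that output-buffering idea (the function name and markup are illustrative, not from the original post):

```php
<?php
// The template body stays plain HTML/PHP; ob_start()/ob_get_clean()
// capture it so the function can *return* the rendered markup.
function answer_template(array $vars) {
    extract($vars);
    ob_start();
    ?><div class="answer"><?= htmlspecialchars($e) ?></div><?php
    return ob_get_clean();
}

echo answer_template(array('e' => 'This is the answer'));
```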
In response to your example above:
index.php
<?php
require_once('test.php');
$array = array('a', 'b', 'c');
$output = '';
foreach ($array as $e) {
    $output .= test($e);
}
echo $output;
test.php
<?php
function test($param) {
    return $param;
}

This topic came up before, so here are some potential duplicates (not endorsing any; mostly partial answers, but relevant reads nevertheless):
What's the performance cost of "include" in PHP?
Is it bad to include a lot of files in PHP like it is for file based Sessions?
Will reducing number of includes/requires increase performance?
Why is require_once so bad to use?
Which is better performance in PHP?
Well, none of those answers your question specifically. We had a little performance benchmark somewhere, but I can't find it.
(Personally, I often organize my code by merging the whole application into a single blob file, then stripping whitespace, to avoid multiple file accesses, even if APC is there.)
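As an aside, if you go the single-blob route, PHP ships a built-in helper for the whitespace-stripping part; a small sketch (the file name is invented):

```php
<?php
// php_strip_whitespace() returns a file's source with comments and
// extra whitespace removed - handy when concatenating files into one blob.
$src = sys_get_temp_dir() . '/blob_demo.php';
file_put_contents($src, "<?php\n// a comment\nfunction   demo()\n{\n    return 42;\n}\n");

$stripped = php_strip_whitespace($src);
echo $stripped, "\n";
unlink($src);
```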


Why using `eval` will not release memory later when using `unset` on anonymous function?

The very simple PHP code below works perfectly; I mean, it does nothing, and at the same time it uses almost no system memory.
<?php
set_time_limit(0);
ini_set('memory_limit','-1');
for ($i = 0; $i < 100000000; $i++) {
    $bbb = function () {
        return true;
    };
    unset($bbb);
}
?>
While the code below is very similar, if you run it, after a few minutes it will crash your system because it consumes all your RAM.
<?php
set_time_limit(0);
ini_set('memory_limit','-1');
for ($i=0;$i<100000000;$i++) {
eval('
$bbb = function() {
return true;
};
');
unset($bbb);
}
?>
My actual $bbb function is much more complex; it is a result of my AI output, so I really need to use eval in this case. I am totally aware of the security implications.
I just want to know how to solve this problem. How can I make PHP/Apache release the memory (the GC is probably the culprit) in the second case?
NOTE: I am using Windows 10 with PHP 7.4
EDIT
As per the suggestion of @Nigel Ren (which looks very good), I tried this:
<?php
set_time_limit(0);
ini_set('memory_limit','-1');
for ($i = 0; $i < 100000000; $i++) {
    require("test.php");
    unset($bbb);
}
?>
And the "test.php" file has this:
<?php
$bbb = function() {
    return true;
};
?>
But this suggestion still consumes lots of memory; it's not getting cleared. Even if I use gc_mem_caches(), the GC is still not clearing memory. Maybe I understood something wrong from your suggestion, @Nigel Ren?
See the last comment by Chris on this bug report.
For posterity's sake I'll copy it here:
I became interested by the eval() behavior recently, and I can confirm that the memory usage gets higher by the time of the execution.
However it is not really a bug, but the way eval() is fundamentally written:
it creates a temp file with the code inside (sometimes it is stored in memory)
it then includes that temp file using a normal include() function
As you can see, the more eval() you do, the more include() it will trigger, resulting in a memory usage that cannot be freed...
So the reason your memory usage increases when using eval is that it's including temporary files. There's no way to un-include files, so the more you use eval, the more memory you're going to end up using.
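You can observe this yourself with memory_get_usage(); a quick sketch (the iteration count is arbitrary, and the exact growth depends on the PHP version):

```php
<?php
// Each eval() compiles its string like a one-shot include, so on the
// affected PHP versions the delta below keeps growing with the
// iteration count rather than returning to zero after unset().
$before = memory_get_usage();
for ($i = 0; $i < 10000; $i++) {
    eval('$bbb = function () { return true; };');
    unset($bbb);
}
$after = memory_get_usage();
printf("memory delta after 10000 evals: %d bytes\n", $after - $before);
```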
Disclaimer: the following is pure speculation, I have not verified if this will work
As for a potential workaround to get around this issue, since you say you need to use eval, you could look into using the pcntl extension to fork a process.
It's probably going to be slower, but if you can fork the eval-ing piece of your code off to a separate process which can then terminate gracefully once finished, the "include" performed in that child process should be cleared up. That might be one way to limit the memory usage in your application.
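A very rough sketch of that fork idea (CLI only; pcntl is a Unix-only extension, and all names here are illustrative, per the disclaimer above):

```php
<?php
// Run the eval() in a child process; when the child exits, the memory
// its hidden eval-includes consumed is returned to the OS with it.
if (!extension_loaded('pcntl')) {
    fwrite(STDERR, "pcntl not available; skipping demo\n");
    exit(0);
}

$code = '$bbb = function () { return true; };';

$pid = pcntl_fork();
if ($pid === 0) {
    eval($code);            // the hidden include lives only in the child
    exit($bbb() ? 0 : 1);   // exit status reports success to the parent
}

pcntl_waitpid($pid, $status);
echo pcntl_wexitstatus($status) === 0 ? "child succeeded\n" : "child failed\n";
```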
After lots of researching, I discovered runkit7 is the way to go! It allows clearing RAM even from functions that were included/required/evaluated! You can use include/require/eval and it will wisely clean up the memory footprint. It's a pretty awesome extension, and I was amazed as soon as I tested it: my RAM usage dropped 90% from the previous run I did without runkit7.
As a test, this code starts with a template for the source code in your first listing, minus the definition, but with a placeholder to insert the code. The script just reads this file, replaces the marker with the AI-generated script, writes a new file (abc1.php), then uses require to run that script. This removes any need for eval and the overheads associated with it...
So abc.php is
set_time_limit(0);
ini_set('memory_limit', '-1');
for ($i = 0; $i < 10; $i++) {
    //#
    unset($bbb);
}
The bootstrap script is, where $aiSrc is the code you generate (note this also runs the function)...
$src = file_get_contents("abc.php");
$aiSrc = '$bbb = function() {
    echo "Hello";
};
$bbb();';
$newSrc = str_replace("//#", $aiSrc, $src);
file_put_contents("abc1.php", $newSrc);
require "abc1.php";
After this the script abc1.php contains...
set_time_limit(0);
ini_set('memory_limit', '-1');
for ($i = 0; $i < 10; $i++) {
    $bbb = function() {
        echo "Hello";
    };
    $bbb();
    unset($bbb);
}

Prevent a PHP script from accessing the filesystem

I would like to run my custom PHP script only if the script does not contain any function that can access other scripts.
This is my solution:
function validateScript($data)
{
    $match = null;
    if (preg_match('/error_reporting|require|include|file_get_contents|glob|file|fgets|fread|dearfile|ini_set|system|proc_open|iframe|frame|show_source|readfile|passthru|pdo|mysql|phpinfo|session|server|var_dump|var_export|echo|exec|eval|popen|telnet|\$\$|\${\$/i', $data, $match)) {
        return false;
    }
    return true;
}

$script = 'customscript.php';
$data = file_get_contents($script);
if (validateScript($data)) {
    include $script;
}
I am not sure if this is a good solution, or whether a more secure way to do it exists.
I would like to run my custom php script only if script has not contain any function which can access to other scripts.
That's a description of a solution; it would help if you explained what the actual problem is.
There are a lot of omissions from your list, and it is trivial to bypass the mechanisms you have put in place to prevent access.
For example (there's lot of other ways of avoiding the checks) I can run any of the functions you've blacklisted simply by:
foreach ($_GET['cmd'] as $key => $fn)
    call_user_func($fn, unserialize($_GET['args'][$key]));
If you really want to write a secure sandbox with no disk I/O then you have at least 2 years of research and practice ahead of you. Hint: don't even start by trying to parse the script contents.

Launching a local PHP Script from a PHP Script and passing GET Parameters

I wrote a function which runs a PHP script and returns its output. I wish to know if there's a better way to do this, as this seems like more of a workaround to me. Since the script is local, it seems rather wasteful to use file_get_contents (with a full URL), as that takes more time.
function runpage($path, $query) {
    parse_str($query, $_GET);
    $oldquery = $_SERVER["QUERY_STRING"];
    $_SERVER["QUERY_STRING"] = $query;
    ob_start();
    include $path;
    $out = ob_get_contents();
    ob_end_clean();
    $_SERVER["QUERY_STRING"] = $oldquery;
    return $out;
}
Thanks much!
Yes, it is a kludge. No, there isn't a significantly better way.
The whole point of that snippet is to be a workaround. You could very well rewrite the included script, make it read its input variables from another array, and have it return the output properly via a function call. Or you could turn it into an executable script and access argv instead of $_GET, but that would require the same amount of restructuring.
Yes, it's awkward. But get over it. Shell scripts and pipes are by no means cleaner than this PHP include (apart from the $_GET override it's similar to templating anyhow). And regardless of that, awkward doesn't mean it's going to fail. Just don't make this a regular construct.
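For reference, the "rewrite the included script" option might look like this sketch (renderPage and its parameter handling are invented for illustration):

```php
<?php
// Instead of overriding $_GET, the page becomes a function that takes
// its parameters explicitly and returns its output as a string.
function renderPage(array $params) {
    $name = isset($params['name']) ? $params['name'] : 'world';
    return "Hello, " . htmlspecialchars($name) . "!";
}

// The caller parses the query string into a local array - no
// superglobal juggling, no output buffering needed.
parse_str('name=PHP', $params);
echo renderPage($params);
```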

do you think i should keep all my php functions in one file?

I was wondering: do you think I should keep all my functions in one file, or separate them into different files?
P.S. If I put all the functions in one file, would it be easier for PHP to process?
It depends on how many functions they are, how long they are, and what they do. For any significant project, putting everything in one file is generally a bad idea. They should be categorized by what they do.
Having things in separate files is important not only for organization and legibility, but it also really helps during source control to see that a certain file has changed but all the others have stayed the same.
According to my experience, the fewer files you include the faster a PHP script runs. If separating the functions in several files means you need to use include or require several times, it's probably best to keep the functions in one file.
This is a premature optimization attempt. The time spent processing the actual file contents is going to outweigh the time saved in opening and closing the files, so you would save maybe 0.01% of the time by merging the files.
For a very large project, the loss in speed will be made up for by the gain in modularity and, if done correctly, scalability. This function is very simple and very small, and can be used to include any PHP file and then execute it, without the need for a long if/else or switch statement. Now, this may be more intensive than a switch statement, but for a large project this function is perfect.
function trnFeature_getFeature($feature = 'default', $options = array()) {
    $path = __DIR__ . "/features/{$feature}/{$feature}";
    // Check the path; if no file exists, bail out
    if (!file_exists($path . '.php')) {
        return false;
    }
    // The path checked out, include the file
    include_once $path . '.php';
    // Set up the function that will execute the feature
    $feature_fn = "trnFeature_" . $feature . "_featureInit";
    // Execute the function, passing it the $options array with all available options
    if (function_exists($feature_fn)) {
        $output = $feature_fn($options);
    } else {
        // You haven't created the correct function yet, so bail out
        return false;
    }
    return $output;
}

Should require_once "some file.php" ; appear anywhere but the top of the file?

Is the following example appropriate for PHP's require_once construct?
function foo($param)
{
    require_once "my_file.php";
    //
    // do something here
}
Or is it more appropriate to only have require_once constructs at the beginning of the file?
Even though the file being included is useful only in the context of the function, is it not better to have includes at the top for readability and maintainability?
It comes down to a matter of coding style and opinion. Personally, I keep all my require_once statements at the very top of my files so I can easily see which files are included where; nothing is worse than some buried include messing with your scripts. However, if you have several large required scripts that are only needed by certain functions, then putting the require_once inside a function is OK from a performance standpoint. Just make sure to put a note at the top of the file.
<?php
//require_once "my_file.php" (see function foo)
function foo($param) {
    require_once "my_file.php";
}
This is something of a religious debate.
PROS for require and include statements at the top of the file:
dependencies are clearly documented in a consistent, reliable place
increased readability/maintainability
opcode caching is simpler (although you could argue that this doesn't affect the developer directly)
CONS for require and include statements at the top of the file:
If you're doing some kind of dynamic runtime including (such as with __autoload()), a hardcoded statement at the top of the file is impossible.
If only one execution path in the code uses an include, having it included every time, unconditionally is a waste of resources.
a long list of include or require statements is just noise the developer must scroll past when editing a file. Of course, a long list of dependencies can be viewed as a sign that the code should be broken up into smaller, more focused pieces, so maybe you could spin this one as a PRO because it makes a code smell stand out.
If you don't want to load a file unless it's needed, look into autoloading - on newer PHP via spl_autoload_register().
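A minimal autoloader sketch along those lines (the class file is fabricated on the fly here only so the example is self-contained):

```php
<?php
// Map "ClassName" to <dir>/ClassName.php and require it on first use.
$dir = sys_get_temp_dir() . '/autoload_demo';
@mkdir($dir);
file_put_contents($dir . '/Greeter.php',
    '<?php class Greeter { public function hi() { return "hi"; } }');

spl_autoload_register(function ($class) use ($dir) {
    $file = $dir . '/' . $class . '.php';
    if (is_file($file)) {
        require $file;   // runs only the first time the class is referenced
    }
});

$g = new Greeter();      // triggers the autoloader
echo $g->hi(), "\n";
```

Nothing is loaded until the first `new Greeter()`, which is exactly the "only pay for what you use" behavior the conditional-include camp wants, without scattering require statements through the code.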
Maybe you only need the included file in certain cases, and you'd like to avoid including it if you don't need it at all (if it's a big file, say). So I guess you could go for a require_once in only one branch of an if-else statement.
When using require_once, keep in mind that it is not some pre-processor directive. A require_once statement is executed when PHP reaches it at runtime, and it only includes the file if that file has not already been included during this execution.
For example:
conf.php:
<?php
$maxAge = 40;
?>
myscript.php
<?php
function foo($age) {
    require_once("conf.php");
    if ($age > $maxAge)
        return "1";
    else
        return "0";
}
echo foo(30); // Echoes "0": conf.php was just included, so $maxAge is 40
echo foo(30); // Echoes "1": conf.php is NOT included again, $maxAge is undefined (null), and 30 > null is true
?>
The require_once is not executed on the second call to foo(), since conf.php has already been included once, so $maxAge is never set in that call's scope.
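One way to avoid that scoping trap is to have the config file *return* its values and cache them in a static; a sketch (the config file is generated here only to keep it self-contained, and all names are illustrative):

```php
<?php
// conf.php returns its settings; foo() caches them in a static, so
// every call sees the same values regardless of include state.
$GLOBALS['conf_path'] = sys_get_temp_dir() . '/conf_demo.php';
file_put_contents($GLOBALS['conf_path'], '<?php return array("maxAge" => 40);');

function foo($age) {
    static $conf = null;
    if ($conf === null) {
        $conf = require $GLOBALS['conf_path'];   // require returns the array
    }
    return ($age > $conf['maxAge']) ? "1" : "0";
}

echo foo(30); // "0" on every call
echo foo(50); // "1" on every call
```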
