Does PHP's require() function cache its results in PHP 5.6? - php

In a PHP script intended to work on generic shared LAMP hosting, I am using require() as a function to read data from a file. The data file looks like this:
<?php
# storage.php
return [
// Rotating secrets
'last_rand' => '532e89355b78aafdb85f5f01f0eed20440d6bd9e0a2d6ae1bd17be4e1d7d21c7bb7a822a2077e3f4',
];
And I read it like this:
$data = require($root . '/storage.php');
$lastRand = $data['last_rand'];
In my script, I will read information from a configuration file using this approach. In some cases, the operation will re-write a new config file (using file_put_contents()) and then re-read it, again using require().
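Concretely, the write-and-re-read cycle looks roughly like this (the secret generation shown here is illustrative rather than my exact code):
$path = $root . '/storage.php';
$data = require($path);                                        // read the current values
$data['last_rand'] = bin2hex(openssl_random_pseudo_bytes(40)); // rotate the secret (illustrative)
file_put_contents($path, "<?php\nreturn " . var_export($data, true) . ";\n");
$data = require($path);                                        // re-read; this is the step that goes stale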
This has worked solidly so far, on a variety of LAMP hosts, but today I found a web host where the require() does not seem to notice changes in the file. This host is using PHP 5.6, whereas all the others I have tested are using 7+.
What's extremely odd is that require() gets the old contents of the file even though file_get_contents() can see the new version, as if require() is doing its own internal caching.
Failed fix attempts
I have even tried waiting for require() to catch up, in case some underlying cache needed time to expire:
$ok = true;
$t = microtime(true);
while ($oldRandom != $this->getFileService()->requirePhp($this->getStoragePath())['last_rand'])
{
    $elapsedTime = microtime(true) - $t;
    if ($elapsedTime > 4) // Wait 4 seconds
    {
        $ok = false;
        break;
    }
    usleep(1000);
}
That did not work either (the timeout just expires with no change in the results brought back from require()).
However, upon the next run of the PHP script, require() suddenly sees the new file contents.
I've also tried to delete the storage.php file before re-reading it, and that has no effect either! I was really expecting that to help.
Future actions
So, I don't understand this behaviour at all. To work around it I could:
rather than building the new config, saving it to file and then reloading it from file, I could refactor my code so that the new config is kept in memory, avoiding the reload entirely
refactor so that I use file_get_contents() rather than require() (for rather dull reasons it's not as convenient, but I'll prefer whatever works reliably; see the sketch below).
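For the second option, here is a rough sketch of what I have in mind; note that it assumes switching the storage format to JSON (storage.json is a made-up name), which is exactly the "not as convenient" part:
// Hypothetical JSON-based storage, read with file_get_contents() instead of require()
$path = $root . '/storage.json';
$data = json_decode(file_get_contents($path), true);
$lastRand = $data['last_rand'];
// Re-writing and immediately re-reading never touches the opcode cache
$data['last_rand'] = $newRandom; // $newRandom is illustrative
file_put_contents($path, json_encode($data));
$data = json_decode(file_get_contents($path), true);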
However, these both feel like I am ignoring a bug, and that I should investigate the cause. I happen to know also that this host (a free one) is nearly always under heavy CPU load. It's a 64 bit server running Linux on the rather old 2.6 kernel, and phpinfo() indicates the CGI/FastCGI server API is in force.
Can anything be done to mitigate this behaviour?
More attempts
Some helpful commenters suggested this problem could be related to opcaching, which seems plausible given that require() compiles the file it loads. I've added this code after the file_put_contents() that re-writes the config:
$reset = opcache_reset(); // Returns true
$invalid = opcache_invalidate($this->getStoragePath()); // Returns false
However, it makes no difference - require() stubbornly reads the same value. I have confirmed this by doing a require() before and after the file write, and also a file_get_contents() to get the true contents.
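If I revisit this, the next thing I would try (untested on this host, and assuming Zend OPcache is the cache in play) is forcing the invalidation and dumping the cache status to see what the host has configured:
// Second argument forces invalidation even if timestamps say the file is unchanged
$invalid = opcache_invalidate($this->getStoragePath(), true);
// Inspect whether OPcache is enabled here and how it is configured
var_dump(opcache_get_status(false));   // false = skip the per-script details
var_dump(opcache_get_configuration()); // e.g. opcache.validate_timestamps, opcache.revalidate_freq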

When I originally encountered this error, I was hesitant to work around it, since it felt like a critical bug that ought not be ignored. However, now that the culprit is likely to be opcaching (and thus related to require() specifically), I think working around it by keeping values in memory is not a bad solution. I shall have to remember that require() is only good for one call per HTTP request, even if that does not hold true for all PHP installations.
I am conscious also that if I tried to tackle the cache directly rather than work around it, I would have to contend with a number of opcache invalidation mechanisms, which is more complexity than I am comfortable with.
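A minimal sketch of the keep-it-in-memory approach (the class and method names here are illustrative, not my actual code):
class ConfigStore
{
    private $path;
    private $data; // in-memory copy, so require() is only ever called once per request

    public function __construct($path)
    {
        $this->path = $path;
        $this->data = require $path;
    }

    public function get($key)
    {
        return $this->data[$key];
    }

    public function set($key, $value)
    {
        $this->data[$key] = $value; // update memory first...
        // ...then persist, and never re-require within the same request
        file_put_contents($this->path, "<?php\nreturn " . var_export($this->data, true) . ";\n");
    }
}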

I found the same problem with a config file; the only reliable solution for now is to change the name of the config file with each modification:
1 - load config file
// get the config file name
$config_file = glob('config/config*.php')[0] ?? null;
// load the settings
if ($config_file) $settings = require_once($config_file);
2 - save the config file when needed
...
$data=['var'=>'value','other-var'=>'other value'];
$vars=var_export($data, TRUE);
// new file name
$file_name='config/config-'.time().'.php';
file_put_contents($file_name,"<?php\n return $vars;");
// delete old config file
if ($config_file) @unlink($config_file);

Related

How to use just one file as Doctrine annotation reader cache in Symfony 2?

I'm working on a Symfony 2 project running on shared hosting (cheap but slow, I have to admit). Performance is really bad, even after minifying all CSS/JS and enabling gzip compression (PHP output only, with a custom php.ini). The server responds in 5-10 seconds, which is the time it takes to parse and execute the PHP files. Downloading resources takes just milliseconds, hence the delay is between the request and the response. This is what (I think) I've found with the Chrome network console.
Looking at the annotation reader cache folder (app/cache/prod/annotations) I've noticed that there are 441 *.cache.php files in it. I think the annotation reader is going to read all of these files on each request!
This could be the source of the bad performance. How can I speed things up and write a custom annotation cache driver that uses fewer files, or a single file, as the annotations cache?
Merging the annotation cache files into one won't speed things up; on the contrary, it will slow them down. Here is my reasoning.
First of all, if you check the annotation cache (file) reader source:
82: public function getClassAnnotations(\ReflectionClass $class) {
    ....
    $path = $this->dir.'/'.strtr($key, '\\', '-').'.cache.php';
    if (!is_file($path)) {
        $annot = $this->reader->getClassAnnotations($class);
        $this->saveCacheFile($path, $annot);
        return $this->loadedAnnotations[$key] = $annot;
    }
    ....
    return $this->loadedAnnotations[$key] = include $path;
So, basically, what happens here is that every file holds the annotation content of one specific class. It's a simple, small file that gets fetched and is ready for use. If it's not there, the cache is 'slowly' generated manually and saved for future use.
If you wanted this driver to dump its cache into one file, then instead of:
include $path;
you'll have to do something like:
file_get_contents('one_big_cache');
...searching/regexps/arrays/loops...
return $result;
And that looks way slower, if you ask me.
I'm not claiming to be absolutely right on this one, and I'd be glad to hear other opinions on the topic.

How to get pthreads working in PHP?

I am using WampServer to test and run WordPress code on my local computer. In order to run pthreads, I have followed these steps:
1) I got the pthread zip file from http://windows.php.net/downloads/pecl/releases/pthreads/0.44/
(My machine has php 5.3.13 and downloaded the php_pthreads-0.44-5.3-ts-vc9-x86.zip file from the above link).
2) Extracted the zip file. Moved the php_pthreads.dll to the C:\wamp\bin\php\php5.3.13\ext directory.
3) Moved pthreadVC2.dll to the C:\wamp\bin\php\php5.3.13 directory.
4) Then I opened C:\wamp\bin\php\php5.3.13\php.ini and added the line extension=php_pthreads.dll at the beginning of the file.
But when I try to run the following code:
<?php
class My extends Thread {
public function run() {
printf("%s is Thread #%lu\n", __CLASS__, $this->getThreadId());
}
}
$my = new My();
$my->start();
?>
It gives me the following error:
Fatal error: Class 'Thread' not found in C:\wamp\www\wp-admin\includes\post.php on line 2
Can you please tell me how to install pthreads on my computer for use with PHP? And do I have to install any other software?
I've noticed that wampserver has php.ini in two separate places. One place is in the /wamp/bin/php/php5... directory, and the other place is in the /wamp/bin/apache/apache.../bin directory (where "..." represents version numbers). The two files need to be identical, because apparently both are loaded at different times by the overall wampserver boot-up procedure.
(Note I only discovered this recently, and may well be "behind the curve" of doing fancy things with wampserver; maybe everyone else has been dealing with both files for a long time. So I don't know if this simple thing will fix your problem; I came here looking for info myself, regarding doing some multi-threading stuff. :)
One other thing. According to this page: www.php.net/manual/en/pthreads.requirements.php
PHP has to be compiled with "--enable-zts" in order for pthreads stuff to work. I have not been able to find any evidence that the PHP part of wampserver was compiled that way.
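One quick way to check (assuming PHP 5.2.7 or later, where the PHP_ZTS constant exists) is a tiny script like this:
<?php
// Prints int(1) on a thread-safe (ZTS) build, int(0) on a non-thread-safe one;
// phpinfo() also shows a "Thread Safety" row with the same information.
var_dump(PHP_ZTS);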
(months later)
Having decided I didn't really immediately need to do any threading stuff, I went on to do other things, until the need actually arose. I can now say that the version of PHP compiled into WampServer does support the "pthreads" extension, although some set-up work is needed first. The instructions I saw mentioned putting a couple of .dll files (after a download and unzip) into certain places, but that didn't work for me. Copying them to the \Windows\System32 directory did work. (Putting them into the \apache...\bin directory also works; there are some other PHP .dll files in there.)
After that, much like what you did, it is necessary to define a "class" that extends the "Thread" class, in order to actually do something in another thread. The "run()" function in the Thread class is "abstract", and needs to be "realized" as an actual function in the extended class. Then the "new" operator can create an "instance", an object of that specified class, for actual use. Here's the class that I needed:
// Purpose: Use another thread to run the code in another php file, after a delay
class xT extends Thread
{
    public $fil, $tim;

    function savWhatWhen($f = "", $t = 0)
    {
        $this->fil = $f; // save What, file to process
        $this->tim = $t; // save When, delay before processing file
        return;
    }

    function run()
    {
        ini_set('max_execution_time', 600); // 600 seconds = 10 minutes
        sleep($this->tim);   // do a delay; beware of max-exec-time!
        include($this->fil); // load file-to-process, and process it
        return;
    }
}
That "savWhatWhen()" function was created specifically for this extension of the basic Thread class. Here's some code for using that class:
$TH = new xT(); //prepare a separate Thread
$TH->savWhatWhen("d:/wamp/myscripts/test.php", 45);//file-name and delay time
$TH->start(); //after delay, process file
//the code that does this can terminate, while OTHER thread is doing a delay
Note for anyone copying this code, you might need to make sure your "open_basedir" setting in the php.ini allows access to the specified file.
More months later: With lots of things being worked on, I haven't put a lot of time into using my pthread object. I did encounter a peculiarity that makes me wonder about whether or not I can actually use pthreads the way I had hoped. Here is what I have observed:
1. An initial php file is called by AJAX, to do something.
2. The PHP processor on the Web Server does that thing.
3. Various data is supposed to be echoed to the browser.
4. The initial php file calls for the creation of another thread, and terminates.
5. The browser does not yet receive the echoed data!
6. The PHP processor on the Web Server does the work delegated to the second thread.
7. When the second thread terminates, NOW the browser receives the echoed data!
At this writing I'm thinking I missed something. Perhaps I need to do some forceful "flush" stuff when the first thread ends, so that the browser can receive the echoed data and the user can do things while the PHP processor on the server is also doing things.
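If I get back to this, the first thing I'd try is flushing the output before starting the second thread, roughly like the sketch below (untested; note that fastcgi_finish_request() only exists under PHP-FPM, and $responseData is just a placeholder):
echo $responseData;               // whatever was meant for the browser
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();     // PHP-FPM only: close the connection to the browser now
} else {
    while (ob_get_level() > 0) {
        ob_end_flush();           // flush and close any output buffers
    }
    flush();                      // then push the web server's buffer out
}
$TH = new xT();                   // now start the delayed work in the other thread
$TH->savWhatWhen("d:/wamp/myscripts/test.php", 45);
$TH->start();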
Check for extension_dir = "ext" in your php.ini file. Make sure it points to the folder where your extensions reside, and make sure the line is not commented out (a commented line has a semicolon ; in front of it).
You have to add a require_once() with the path of the Thread class before extending it (if your framework doesn't use an autoloading system).
I've encountered the same problem; in my case, placing pthreadVC2.dll in
..\wamp\bin\apache\Apache2.4.4\bin
(instead of ..\wamp\bin\php\php5.4.16, as the guide on php.net instructs) solved the problem.
Wamp server has a separate php.ini config file for the browser and for the cli.
To use the pthreads module in the browser with WAMP Server you need to copy the "pthreadVC2.dll" into the apache "bin" directory also.
You should now have the "pthreadVC2.dll" in both of these folders (if installed in the default location):
C:\wamp\bin\php\php[x.x.xx]\bin
C:\wamp\bin\apache\apache[x.x.x]\bin
You will also need to update the php.ini file within the php bin directory AND the apache bin directory to include:
extension=php_pthreads.dll
This now means you can use pthreads in the browser and in the CLI with WAMP Server.
After encountering the same problem, I noticed that I had installed the wrong pthreads version (3.1.6, which requires PHP 7+), which wasn't compatible with my PHP version (5.5.12). Installing pthreads version 0.0.44 solved the problem; another early version would probably work as well.
Here is the download page for pthreads and the installation page. Be careful about both php.ini locations, as mentioned above (Apache folder = browser, PHP folder = CLI).

How to configure logrotate with php logs

I'm running php5 FPM with APC as an opcode and application cache. As is usual, I am logging php errors into a file.
Since that is becoming quite large, I tried to configure logrotate. It works, but after rotation, php continues to log to the existing logfile, even when it is renamed. This results in scripts.log being a 0B file, and scripts.log.1 continuing to grow further.
I think (haven't tried) that running php5-fpm reload in postrotate could resolve this, but that would clear my APC cache each time.
Does anybody know how to get this working properly?
I found that "copytruncate" option to logrotate ensures that the inode doesn't change. Basically what is [sic!] was looking for.
This is probably what you're looking for. Taken from: How does logrotate work? - Linuxquestions.org.
As written in my comment, you need to prevent PHP from writing into the same (renamed) file. Copying a file normally creates a new one, and truncating is the other half of the option's name, so I would assume the copytruncate option is an easy solution (from the manpage):
copytruncate
    Truncate the original log file in place after creating a copy,
    instead of moving the old log file and optionally creating a new
    one. It can be used when some program cannot be told to close its
    logfile and thus might continue writing (appending) to the previous
    log file forever. Note that there is a very small time slice between
    copying the file and truncating it, so some logging data might be
    lost. When this option is used, the create option will have no
    effect, as the old log file stays in place.
See Also:
Why we should use create and copytruncate together?
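For reference, a minimal stanza using copytruncate could look something like this (the log path is just an example; substitute the location of the question's scripts.log):
/var/log/php/scripts.log {
    weekly
    rotate 12
    missingok
    notifempty
    compress
    delaycompress
    copytruncate
}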
Another solution I found on a server of mine is to tell PHP to reopen its logs. I think nginx has this feature too, which makes me think it must be quite commonplace. Here is my configuration:
/var/log/php5-fpm.log {
    rotate 12
    weekly
    missingok
    notifempty
    compress
    delaycompress
    postrotate
        invoke-rc.d php5-fpm reopen-logs > /dev/null
    endscript
}

php5 reading remote file the valid way

I recently upgraded from PHP 4 to PHP 5, and with it came the discovery that all my remote PHP file access no longer worked. I've been doing quite a bit of research into this, and I don't seem to have a clear answer as to what the correct way is to include remote URLs in PHP 5.
The first example is to include the file this way
<?php
$data = file_get_contents("http://example.com/example.inc.php",0);
echo $data;
?>
The second is this way:
<?php
$ch = curl_init("http://example.com/example.inc.php");
curl_exec($ch);
curl_close($ch);
?>
and the third is to set, in my php.ini file:
allow_url_include = On
allow_url_fopen = On
and use the good old
<?php include_once('http://example.com/example.inc.php');?>
I want to do this correctly and securely.
All solutions are correct and there is no real difference in safety AFAIK.
I think the difference can be summed up like this:
The ini settings provide the behaviour known from prior versions. The reason they are disabled by default is the security threat, but that applies equally to all three solutions. Including remote files is a security problem regardless of whether you control the remote site or not.
file_get_contents() and the curl extension create some overhead, since you have to buffer the content, but for PHP include files that is mostly a cosmetic concern. Their usage is slightly more complex when reading through a script. But the buffering also adds benefits: you could create a local cache, for example, or a checksum for a basic plausibility check. A syntax check prior to execution is also possible, preventing a crash of your calling script.
Curl is provided as a PHP extension, so the curl solution only works when the extension is installed, but it offers a much greater degree of freedom and many more options. If you don't need those, stay with the built-in functions.
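To make the buffering point concrete, here is a rough sketch (the file names and the lint step are illustrative, not a drop-in solution):
// Fetch the remote file, cache it locally, syntax-check it, then include the local copy
$url   = 'http://example.com/example.inc.php';
$cache = __DIR__ . '/cache/example.inc.php';    // assumed local cache location

$code = file_get_contents($url);
if ($code !== false && $code !== @file_get_contents($cache)) {
    file_put_contents($cache, $code);           // refresh the local cache
}

// Basic plausibility check: run the PHP linter before executing the file
exec('php -l ' . escapeshellarg($cache), $output, $status);
if ($status === 0) {
    include $cache;                             // only execute code that at least parses
} else {
    // log and fall back; the remote file looks broken
}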
Well,
First method => is correct, and you shouldn't worry about using it.
Second method => is correct, but the curl extension must be enabled.
Third method => is correct, but using this option is not recommended, because enabling allow_url_include likely makes your site vulnerable. For more detail see http://en.wikipedia.org/wiki/Include_vulnerability and this link http://wiki.dreamhost.com/Allow_url_include

Using PHP APC cache in CLI mode using dumpfiles

I've recently started using APC cache on our servers. One of the most important parts of our product is a CLI (Cron/scheduled) process, whose performance is critical. Typically the batchjob consists of running some 16-32 processes in parallel for about an hour (they "restart" every few minutes).
By default, using APC cache in CLI is a waste of time due to the opcode cache not being retained between individual calls. But APC also contains apc_bin_dumpfile() and apc_load_dumpfile() functions.
I was thinking these two functions might be used to make APC efficient in CLI mode, by having it all compiled sometime outside the batchjob, stored in a single dumpfile, and having the individual processes load the dumpfile.
Does anybody have any experience with such a scenario, or can you give good reasons why it will or will not work? Can any significant gains reasonably be had, in either memory use or performance? What pitfalls are lurking in the shadows?
Disclaimer: As awesome as APC is when it works in CLI (and it is awesome), it can be equally frustrating. Use it with a healthy load of patience. Be thorough, step away from the problem if you're spinning, and keep in mind you are working with a cache; that is why it seems like it's doing nothing (it is actually doing nothing). Delete the dump file and start with just the basics; if that doesn't work, forget it and try a new machine or a new OS. If it is working, make a copy, then piece by piece expand functionality (there are loads of things that won't work); if it is still working, commit or make a copy, add another piece and test again, and for a sanity check re-check the copies that were working before. Clichés or not: if at first you don't succeed, try, try again; you can't keep doing the same thing expecting new results.
Ready? This is what you've been waiting for:
Enable apc for cli
apc.enable_cli=1
it is not ideal to create, populate and destroy the APC cache on every CLI request
- previous answer by unknown poster since removed.
You're absolutely right, that sucks; let's fix it, shall we?
If you try to use APC under CLI and it is not enabled, you will get warnings,
something like:
PHP Warning: apc_bin_loadfile(): APC is not enabled,
apc_bin_loadfile not available.
PHP Warning: apc_bin_dumpfile(): APC is not enabled,
apc_bin_dumpfile not available.
Warning: I suggest you don't enable CLI mode in php.ini; it is not worth the frustration. You are going to forget you did it and have numerous other headaches with other scripts. Trust me, it's not worth it; use a launcher script instead (see below).
apc_bin_loadfile and apc_bin_dumpfile in cli
As per the comment by mightye php, we need to disable apc.stat, or you will get warnings,
something like:
PHP Warning: apc_bin_dumpfile(): Excluding some files from apc_bin_dump[file].
Cached files must be included using full path with apc.stat=0.
launcher script - php-apc.sh
We will use this script to launch our apc enabled scripts (ex. ./php-apc.sh apc-cli.php) instead of changing the properties in php.ini directly.
#!/bin/sh
php -d apc.enable_cli=1 -d apc.stat=0 "$1"
Ready for the basic functionality? Sure you are =)
basic APC persisted - apc-cli.php
<?php
/** check if the dump file exists; you don't want to use file_exists */
if (false !== $dump_file = stream_resolve_include_path('apc.dump')) {
    /** so where were we, let's have a look-see shall we */
    if (false !== apc_bin_loadfile($dump_file)) {
        /** fetch what was stored last run just for fun */
        if (false !== $value = apc_fetch('my.awesome.apc.store')) {
            echo "$value from apc\n";
        }
    }
}
/** store what gets fetched the next run just for fun */
apc_store('my.awesome.apc.store', 'awesome in cli');
/** what a schlep, let's not do that all over again shall we */
apc_bin_dumpfile(array(), null, 'apc.dump');
Notice: Why not use file_exists()? Because file_exists == stat, you see, and we want to reap the reward that is apc.stat=0. So: work within the include path; use absolute and not relative paths (as returned by stream_resolve_include_path()); avoid include_once and require_once, and use the non-*_once counterparts; and check your stat usage when not using APC (muchos important, señor) with the help of a StreamWrapper that echoes calls to the url_stat method. Oops: Fatal scope over-run error! Aborting notice thread. See url_stat.
message: Error caused by StreamWrapper outside the scope of this discussion.
The smoke test
Using the launcher execute the basic script
./php-apc.sh apc-cli.php
A whole bunch of nothing happened; that's what we want, right? Why else would you use a cache? If it did output anything, then it didn't work, sorry.
There should be a dump file called apc.dump; see if you can find it. If you can't find it, then it didn't work, sorry.
Good, we have the dump file and there were no errors; let's run it again.
./php-apc.sh apc-cli.php
What you want to see:
awesome in cli from apc
Success! =)
There are few things in PHP as satisfying as a working APC implementation.
nJoy!
I would definitely not use it in the CLI, as each run starts fresh; it's almost as if it was never running in the first place!
The better way of using APC is to have it running on the web server itself all the time; that way, with it staying active, it will actually do what it's supposed to do!
I tried it with curl and APC, and it works.
Use this command in the CLI:
curl --data "param1=value2" http://testsite.com/test.php
It will post the data to test.php, and you write the handling code in that file.
