PHP 7.1 "require/include" result is not up to date - php

I have a file containing:
<?php
return '2000-01-01 00:00:00';
?>
and I have this code:
<?php
$oldValue = require 'file.php';
$now = new DateTime();
$handle = fopen('file.php', "w");
fputs($handle, "<?php\nreturn '" . $now->format('Y-m-d H:i:s') . "';");
fclose($handle);
$newValue = require 'file.php';
echo "Old value: $oldValue ";
echo "New value: $newValue ";
?>
The output with PHP 5.3 is:
Old value: 2000-01-01 00:00:00 New value: 2018-03-28 10:33:12
The output with PHP 7.1 is:
Old value: 2000-01-01 00:00:00 New value: 2000-01-01 00:00:00
In both cases, the string in the file on disk does change.
Can someone help me get the updated value with PHP 7.1?
Note: this is not the real problem, just an abstraction of it to keep things simple and comprehensible. So please, no lessons on PHP best practices; I would just like a good answer to my question.
Thanks :)

As commented by iainn, the issue is that PHP is caching the file once it has been loaded and is not re-reading it from disk on the second require, instead serving it from its in-memory cache.
As you have stated that
"the content of the file changes"
the issue is that the new contents are not passed to the script; it keeps using the cached copy of the older contents.
Therefore, call clearstatcache() to force-clear the cached file data. This call should be placed after the new data is written to the file and before the file is required a second time.
If this does not work, the file data may be cached somewhere else along the way.
<?php
$oldValue = require 'file.php';
$now = new DateTime();
$handle = fopen('file.php', "w");
fputs($handle, "<?php\nreturn '" . $now->format('Y-m-d H:i:s') . "';");
fclose($handle);
clearstatcache(); // THIS line should help you
$newValue = require 'file.php';
echo "Old value: $oldValue ";
echo "New value: $newValue ";
?>
As also commented by iainn, opcache_invalidate() may be a more specific (less general) solution for you.
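For illustration, a minimal sketch of that idea, assuming the OPcache extension is loaded (hence the function_exists() guard): the compiled copy of file.php is invalidated right after the file is rewritten and before it is required again.
<?php
$oldValue = require 'file.php';
$now = new DateTime();
file_put_contents('file.php', "<?php\nreturn '" . $now->format('Y-m-d H:i:s') . "';");
// Drop the compiled copy of file.php from OPcache so the next require
// recompiles it from disk; the second argument forces the invalidation.
if (function_exists('opcache_invalidate')) {
    opcache_invalidate('file.php', true);
}
clearstatcache();
$newValue = require 'file.php';
echo "Old value: $oldValue ";
echo "New value: $newValue ";
?>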

It could be an issue with your file being cached (OPcache), so PHP returns the same compiled file for both require calls.
Can you try modifying the OPcache settings with
opcache.enable = 0
and then testing it? There is also
opcache_reset()
which could help you, but if you are running your code from the CLI it may not work.
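As a rough sketch of that second option (again assuming OPcache is active and its functions are available), the whole opcode cache can be flushed after the file is rewritten; note this is heavier than invalidating a single script with opcache_invalidate():
<?php
$now = new DateTime();
file_put_contents('file.php', "<?php\nreturn '" . $now->format('Y-m-d H:i:s') . "';");
if (function_exists('opcache_reset')) {
    opcache_reset(); // flushes every cached script, not just file.php
}
$newValue = require 'file.php';
?>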

Related

How can I process files in a directory all at one time?

I have a directory with hundreds of xlsx files. What I want to do is convert all of these files into PDF at one time, or at least several at a time. The conversion process is working fine at the moment with foreach and cron, but it can only convert files one at a time, which increases the waiting time for the user who is waiting for the PDF files.
I am thinking about parallel processing but don't know how to implement it.
Here is my current code
$files = glob("/var/www/html/conversions/xlxs_files/*");
if (!empty($files)) {
    $now = time();
    $i = 1;
    foreach ($files as $file) {
        if (is_file($file) && $i <= 8) {
            echo $i . '-----' . basename($file) . '----' . date('m/d/Y H:i:s', @filemtime($file));
            echo '<br>';
            $path_parts = pathinfo(basename($file));
            $xlsx_file_name = basename($file);
            $pdf_file_name = $path_parts['filename'] . '.pdf';
            echo '<br>';
            try {
                $result = ConvertApi::convert('pdf', ['File' => $common_path . 'xlxs_files/' . $xlsx_file_name], 'xlsx');
                echo $log = 'conversion start for ' . basename($file) . ' on ' . date('d-M-Y h:i:s');
                echo '<br>';
                $result->getFile()->save($common_path . 'pdf_files/' . $pdf_file_name);
                echo $log = 'conversion start for ' . basename($file) . ' on ' . date('d-M-Y h:i:s');
                echo '<br>';
                mail('amit.webethics#gmail.com', 'test', 'test');
                unlink($common_path . 'xlxs_files/' . $xlsx_file_name);
            } catch (Exception $e) {
                $log_file_data = createAlogFile();
                $log = 'There is an error with your file .' . $xlsx_file_name . ' -- ' . $e->getMessage();
                file_put_contents($log_file_data, $log . "\n", FILE_APPEND);
                continue;
            }
            $i++;
        }
    }
} else {
    echo 'nothing to process';
}
Any help will be highly appreciated. Thanks
Q: I am thinking about parallel processing at this time but don't know how to implement this.
Fact #1: this is not a kind of true-[PARALLEL] orchestration of the processing flow.
Fact #2: the standard GNU parallel (for all details, kindly read man parallel) will help you maximise the performance of your processing pipeline, given the list of all files to convert and by tweaking other parameters such as the number of CPU cores used and the RAM resources you may reserve/allocate to perform this batch conversion as fast as possible.
ls _files_to_convert.mask_ | parallel --jobs _nCores_      \
                                      --load 99%           \
                                      --block _RAMblock_   \
                                      ...                  \
                                      --dry-run            \
                                      _converting_process_
might serve as an immediate appetiser for what GNU parallel is capable of.
All credits and thanks are to go to Ole Tange.
You could start multiple PHP scripts at a time. A detailed answer on how to do that is here: https://unix.stackexchange.com/a/216475/91593
I would go for this solution:
N=4
(
for thing in a b c d e f g; do
    ((i=i%N)); ((i++==0)) && wait
    task "$thing" &
done
)
Another way is to try to use PHP itself for that. There is an in-depth answer to this question here: https://stackoverflow.com/a/36440644/625521
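As a rough illustration of that idea, here is a minimal sketch that launches a few conversions as background processes from PHP. The worker script convert_one.php is a hypothetical placeholder for a script that converts a single file and exits:
<?php
// Launch one background worker per file, up to a fixed limit, instead of
// converting the files sequentially inside a single request.
$files = glob('/var/www/html/conversions/xlxs_files/*');
$maxJobs = 4; // hypothetical cap on simultaneous conversions
foreach (array_slice($files, 0, $maxJobs) as $file) {
    // Redirecting output and appending "&" lets exec() return immediately.
    $cmd = 'php convert_one.php ' . escapeshellarg($file) . ' > /dev/null 2>&1 &';
    exec($cmd);
}
?>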

ob_implicit_flush(), flush(), ob_flush() - not working on remote server

If I load this script in Chrome from my local server on XAMPP:
header("Content-Type:text/plain");
set_time_limit(0);
$max = 40;
for ($i = 0; $i < $max; $i++) {
$response = array( 'server time: ' . date("h:i:s", time()), 'progress' => round($i/$max*100));
echo json_encode($response);
ob_flush();
flush();
sleep(1);
}
ob_clean();
It works as you would expect: every second the page displays a new response.
However, when I upload it to my remote server (running the same version of PHP), it waits until the entire script finishes before it displays the output.
On very long scripts, it updates the output every 30-60 seconds or so.
As the title suggests, I've tried using all the different flush functions, but nothing works.
There is likely some difference between the php.ini of my local server and my remote server, but I don't know what it is.
Please help.
---EDIT---
I've been doing some more testing. I've noticed that it updates exactly every 4096 bytes, which happens to be my remote server's php.ini value for 'output_buffering'.
However, for some reason, if I change output_buffering to '1' or 'off', nothing changes. It still only updates every 4096 bytes.
I'm testing the two identical scripts on different servers in the same browser.
I hadn't taken into account nginx, which has its own output buffer.
I simply added header("X-Accel-Buffering: no"); to the top of the PHP script and it all works fine now.
For me, adding header('Content-Encoding: none'); did the trick. This is needed when using PHP-FPM.
This works fine with Apache + PHP:
header('Content-Encoding: none');
ob_implicit_flush(1);
echo "<br>PROCESSING bla bla bla";
Optionally, you can add the following line (after every small piece of data) if you want to push out really small chunks too.
echo str_repeat(' ',1024*64);
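Combining the suggestions above into one minimal sketch (assuming PHP-FPM behind nginx; the loop body is just a placeholder): both headers are sent before any output, and each chunk is flushed explicitly.
<?php
header('Content-Type: text/plain');
header('X-Accel-Buffering: no');   // tells nginx not to buffer this response
header('Content-Encoding: none');  // avoids compression buffering under PHP-FPM
for ($i = 0; $i < 10; $i++) {
    echo "tick $i\n";
    if (ob_get_level() > 0) {
        ob_flush();                // flush PHP's own output buffer, if any
    }
    flush();                       // push the data out to the web server
    sleep(1);
}
?>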
Before that, you need to use ob_start() and ob_end_clean(), add a Content-Length or Transfer-Encoding: chunked header, and check that implicit_flush is On in your php.ini.
Add padding to the response. Check this code:
<?php
set_time_limit(0);
ob_start();
header('Content-Type: text/plain');
define("PADDING", 16);
// +padding
for ($i = 0; $i < PADDING; $i++) {
    // 64 spaces (1 block)
    echo str_repeat(' ', 64);
}
$max = 40;
for ($i = 0; $i < $max; $i++) {
    $response = array('server time: ' . date("h:i:s", time()), 'progress' => round($i / $max * 100));
    echo json_encode($response);
    ob_flush();
    flush();
    sleep(1);
}
ob_end_clean();
?>

Check Files In Directory Are Modified

I need to deploy a PHP application written with CodeIgniter to a client's web server (CentOS 5 or 6). As PHP is a scripting language, it does not need to be compiled to binary code for deployment. There is a chance that the client will modify the PHP program themselves without notifying me. If the client has modified the program in a way that breaks the application, we need extra manpower to find their modification and fix it.
So I would like to make something that easily lets me know whether any files (PHP, CSS, HTML, etc.) of the application have been modified after my deployment. Can anyone suggest a method?
Thank you,
Use filemtime()
int filemtime ( string $filename )
This PHP function returns the time when the data blocks of a file were being written to, that is, the time when the content of the file was changed.
<?php
// outputs e.g. somefile.txt was last modified: December 12 2014 09:16:23.
$filename = 'somefile.txt';
if (file_exists($filename)) {
    echo "$filename was last modified: " . date("F d Y H:i:s.", filemtime($filename));
}
?>
To get the last modification time of a directory, you can use this:
$getLastModDir = filemtime("/path/to/directory/.");
Note the trailing dot, which is needed to treat the directory as a file and actually get its last modification date.
This comes in handy when you want just one 'last updated' message on the front page of your website while still taking all the files of your website into account.
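A minimal sketch of that idea applied to the whole application directory (the path /var/www/myapp is a hypothetical placeholder): record every file's modification time at deployment, save the snapshot, and later compare it against a fresh snapshot to spot changed files.
<?php
// Walk the deployment directory and record each file's mtime.
function snapshotMtimes($dir)
{
    $snapshot = array();
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
    );
    foreach ($it as $file) {
        $snapshot[$file->getPathname()] = $file->getMTime();
    }
    return $snapshot;
}
// Save a snapshot right after deployment ...
file_put_contents('mtimes.json', json_encode(snapshotMtimes('/var/www/myapp')));
// ... and later, list files whose mtime differs from the saved snapshot.
$old = json_decode(file_get_contents('mtimes.json'), true);
foreach (snapshotMtimes('/var/www/myapp') as $path => $mtime) {
    if (!isset($old[$path]) || $old[$path] !== $mtime) {
        echo "$path was modified\n";
    }
}
?>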
To get the modification date of some remote file, you can use the fine function by notepad at codewalker dot com (with improvements by dma05 at web dot de and madsen at lillesvin dot net).
But you can achieve the same result more easily now with stream_get_meta_data (PHP>4.3.0).
However a problem may arise if some redirection occurs. In such a case, the server HTTP response contains no Last-Modified header, but there is a Location header indicating where to find the file. The function below takes care of any redirections, even multiple redirections, so that you reach the real file of which you want the last modification date.
<?php
// get remote file last modification date (returns unix timestamp)
function GetRemoteLastModified($uri)
{
    // default
    $unixtime = 0;

    $fp = fopen($uri, "r");
    if (!$fp) {
        return;
    }

    $MetaData = stream_get_meta_data($fp);

    foreach ($MetaData['wrapper_data'] as $response) {
        // case: redirection
        if (substr(strtolower($response), 0, 10) == 'location: ') {
            $newUri = substr($response, 10);
            fclose($fp);
            return GetRemoteLastModified($newUri);
        }
        // case: last-modified
        elseif (substr(strtolower($response), 0, 15) == 'last-modified: ') {
            $unixtime = strtotime(substr($response, 15));
            break;
        }
    }
    fclose($fp);

    return $unixtime;
}
?>

Appending multiple entries to an output file with a crontab using php?

I am writing a PHP script that has to write the system uptime, the current time, and the number of users logged into the system to a log file, updated continually via a crontab.
What I need help with is that I would like the updates to accumulate within the file, being added continually. So far, whenever my script gets executed, the newest update overwrites the previous one.
What I've done is declare an array of entries and, as I iterate through the array, push the contents of the update into it (it might be a bit of half-baked logic on my part).
My Code:
$fileName = '../so-and-so directory/output.log';
$dt = date('m/d/y');
$time = date('h:i A');
$data = shell_exec('uptime');
$uptime = explode(' up ', $data);
$uptime = explode(', ', $uptime[1]);
$uptime = $uptime[0] . ',' . $uptime[1];
$users = system('w', $who);
$array = new SplFixedArray(3);
$fileLog = fopen($fileName, 'w');
$fileString = "Date: " . $dt . "\n" . " Time: " . $time . "\n" .
    "System uptime " . $uptime . "\n" . "Users " . $users;
foreach ($array as $entry) {
    array_push(file_put_contents($fileName, $fileString));
}
fclose($fileLog);
I feel that the solution is very simple but I'm missing it. Would somebody please clue me in?
The "w" filemode truncates the file on open. "a" appends to the end instead. See fopen(3) or the PHP documentation for details.
Also, file_put_contents() is destroying the file. Try fwrite() instead.
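A minimal sketch of that fix, reusing the $fileName and $fileString variables from the question:
$fileLog = fopen($fileName, 'a');      // 'a' appends instead of truncating
fwrite($fileLog, $fileString . "\n");  // write the new entry on its own line
fclose($fileLog);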
Drop fopen(); simply use
file_put_contents($fileName, $fileString);
file_put_contents() will overwrite the existing file by default.
In short:
$fileName = '../so-and-so directory/output.log';
$dt = date('m/d/y');
$time = date('h:i A');
$data = shell_exec('uptime');
$uptime = explode(' up ', $data);
$uptime = explode(', ', $uptime[1]);
$uptime = $uptime[0] . ',' . $uptime[1];
$users = system('w', $who);
$fileString = "Date: " . $dt . "\n" . " Time: " . $time . "\n" .
    "System uptime " . $uptime . "\n" . "Users " . $users;
file_put_contents($fileName, $fileString);
So it turns out that I needed to edit my crontab file like this:
* * * * * such-and-such-script.php >> ../so-and-so directory/output.log 2>&1
This makes the entries append without the previous one being overwritten by the new one. I also dropped the fopen() and, instead of doing file_put_contents(), I did fwrite() into the file. It works great now. Thank you!

How do I format the error log message in a log file using PHP?

Suppose I want to write the SQL error to a log file which I have created myself, for example an error log file called "myerror.log".
Now I am using the following code to print my message to the log file: error_log(mysql_error(), 3, "tmp/myerror.log");
Every time, the message is printed to the myerror.log file on the same line. I want the messages to be printed one after another.
Kindly help me
Thanks
Dinesh Kumar Manoharan
As I understand it, you want to have each entry/message on a new line. If so:
error_log(mysql_error() . PHP_EOL, 3, "tmp/myerror.log");
If you want to have a timestamp as well, you will have to add it yourself:
$dt = date('Y-m-d H:i:s', time());
error_log("[{$dt}] " . mysql_error() . PHP_EOL, 3, "tmp/myerror.log");
If you need to use such formatting constantly, I recommend creating your own function (something like this):
define('MY_ERROR_LOG', 'tmp/myerror.log');
function myErrorLog($message)
{
    $dt = date('Y-m-d H:i:s', time());
    error_log("[{$dt}] " . $message . PHP_EOL, 3, MY_ERROR_LOG);
}
// use it
myErrorLog(mysql_error());
Maybe you need a function like this (renamed so it doesn't collide with the built-in error_log(), and using FILE_APPEND so each call adds a line instead of overwriting the file):
function my_error_log($message, $log_file) {
    file_put_contents($log_file, $message . "\n", FILE_APPEND);
}
