Trouble using output buffers after the connection is lost in PHP

First of all, sorry for my probably poor English...
I'm trying to make a script that runs only until an event occurs or the connection is stopped. I have to work with output buffers (for logging and debugging reasons), but I noticed that after the connection is lost, anything I try to put into a buffer never appears.
Here is a script I made to test this behaviour:
<?php
// prevent the script from stopping when the connection is lost,
// in order to continue with logging, debugging or whatever
ignore_user_abort(true);

// a utility function that flushes some output
// to let us know whether the connection is still active
$fnExec = 0; // counts the function's executions, for logging

function is_connection_aborted() {
    global $fnExec;
    $fnExec++;

    // get the content of all output buffers and store it,
    // then clean and end each buffer
    $obs = 0;
    $contents = array();
    while (ob_get_level() > 0) {
        array_unshift($contents, ob_get_contents());
        ob_clean();
        ob_end_flush();
        $obs++;
    }

    echo "\r"; // needed in order to have at least one character to flush
    flush();
    $conn = connection_aborted();

    // start an output buffer to log what's inside the $contents variable
    ob_start();
    var_dump($contents);
    $fh = fopen('test/fn_'.$fnExec.'.txt', 'w');
    fwrite($fh, ob_get_clean());
    fclose($fh);

    // now restore the content into the same number of output buffers
    for ($index = 0; $index < $obs; $index++) {
        ob_start();
        echo $contents[$index];
    }

    // finally, return the value of connection_aborted()
    return $conn;
}
// start an output buffer here to manage the script's output
ob_start();

// a simple loop that runs for at most 5 seconds, printing an
// incrementing number each iteration and then sleeping 2 seconds.
// It can be stopped by aborting the connection!
$start = time();
$attempts = 0;
while (true) {
    echo ++$attempts."\r\n";
    if (is_connection_aborted() || (time() - $start) > 5) {
        break;
    }
    sleep(2);
}
// Here I get the content of the output buffer
$content = ob_get_clean();
// And finally I log the content in a file
$fh = fopen('test/test_ob.txt', 'w');
fwrite($fh, $content);
fclose($fh);
Now to the test!
I keep the connection open for up to 2 seconds, so I expect to see only 1-2 iterations and then a final output like '1 2 3' in the test/test_ob.txt file.
Instead, the result is that test/test_ob.txt is empty, even though the log files from each call of the function are all there! It seems like every ob_start() after the connection is lost captures nothing!
Why do output buffers only work until the connection is lost? Is the loss of the connection really the problem?
Thanks in advance!
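A stripped-down sketch that reproduces the same symptom, with the save/restore logic removed (the file path is arbitrary and assumes the same writable test/ directory):
<?php
// minimal sketch: does a fresh output buffer still capture anything
// once the client has disconnected?
ignore_user_abort(true);

// flushing at least one byte is what lets PHP notice the lost connection
while (!connection_aborted()) {
    echo "\r";
    flush();
    sleep(1);
}

// the connection is gone now; try to capture output in a new buffer
ob_start();
echo "captured after abort?";
file_put_contents('test/after_abort.txt', ob_get_clean());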

Related

Shared access: How to fix "fread(): Length parameter must be greater than 0"?

When I run this function from several scripts at the same time, one script generates this warning:
fread(): Length parameter must be greater than 0
function test($n){
    echo "<h4>$n at ".time()."</h4>";
    for ($i = 0; $i < 50; $i++){
        $fp = fopen("$n.txt", "r");
        $s = fread($fp, filesize("$n.txt"));
        fclose($fp);
        $fp = fopen("$n.txt", "w");
        $s = $_SERVER['HTTP_USER_AGENT'].' '.time();
        if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
            fwrite($fp, $s);
            // fflush($fp); // flush output before releasing the lock
            flock($fp, LOCK_UN); // release the lock
        } else {
            echo "Couldn't get the lock!";
        }
    }
}
I'm trying to let multiple users read and write the same file, but only one user can write it at a time. I know that when I use fwrite with flock (LOCK_EX), other scripts must wait until the write is finished. But here it seems that filesize doesn't wait until the write operation has finished. My guess is that it stats the file at the moment its size is 0 (just after another script has opened it for writing), and as a result fread is asked to read 0 bytes from the file.
Is it possible to fix this for the fread call?
The purpose of this script is to test fread with a length limit, and to check the data I read later, to see whether the data was really written even though I did not use fflush.
function test($n){
    echo "<h4>$n at ".time()."</h4>";
    for ($i = 0; $i < 50; $i++){
        $start = microtime(true);
        $fp = fopen("$n.txt", "r");
        if (filesize("$n.txt") > 0)
        {
            $s = fread($fp, filesize("$n.txt"));
            fclose($fp);
            $fp = fopen("$n.txt", "w");
            $s = $_SERVER['HTTP_USER_AGENT'].' '.time();
            if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
                fwrite($fp, $s);
                // fflush($fp); // flush output before releasing the lock
                flock($fp, LOCK_UN); // release the lock
            } else {
                echo "Couldn't get the lock!";
            }
        }
        else
        {
            echo "Filesize must be greater than 0";
        }
    }
}
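One detail worth noting with this guard: filesize() results come from PHP's stat cache, so in a tight loop the check can keep returning a stale value. Clearing the cache before each check helps (a minimal sketch of just that part of the loop):
// filesize() is served from PHP's stat cache; clear it so the check
// reflects what another process may just have written or truncated
clearstatcache();
if (filesize("$n.txt") > 0) {
    $s = fread($fp, filesize("$n.txt"));
}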
Please also rename one of the $s variables; the same name is being used for two different things.
$fp = fopen("$n.txt", "r");
$s = fread($fp, filesize("$n.txt") );
fclose($fp);
The error occurs in the middle line of the above three lines.
Firstly, these three lines could be rewritten into a single line as follows:
$s = file_get_contents("$n.txt");
However, this isn't necessary, as these three lines are entirely redundant in your code. They don't do anything useful.
What they do is open a file, store its contents to $s and then close it.
But you are then immediately setting $s to a different value, thus throwing away the previous value, and making it pointless to have read it from the file in the first place.
If you need to keep the original contents of the file, then use file_get_contents() and make sure you don't overwrite the contents of the variable.
If you don't need the original contents of the file, then just delete those three lines from your code.
Incidentally, this error highlights a couple of good coding practices that you should take on board: first, never re-use a variable for two different things; and second, always give your variables (and functions) good names. $s is not a good name; $previousFileContents would be a better one, and it would have made the error much more obvious.
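Putting that advice together, the writing loop might look like this (a sketch; keep $previousFileContents only if you actually use the old contents somewhere):
function test($n){
    echo "<h4>$n at ".time()."</h4>";
    for ($i = 0; $i < 50; $i++){
        // only read the old contents if you actually need them
        $previousFileContents = file_get_contents("$n.txt");

        // note: mode "w" truncates the file as soon as it is opened,
        // before the lock below is acquired
        $fp = fopen("$n.txt", "w");
        $newContents = $_SERVER['HTTP_USER_AGENT'].' '.time();
        if (flock($fp, LOCK_EX)) { // acquire an exclusive lock
            fwrite($fp, $newContents);
            flock($fp, LOCK_UN); // release the lock
        } else {
            echo "Couldn't get the lock!";
        }
        fclose($fp);
    }
}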

file_exists returns false between file_put_contents (or fwrite) and unlink in second execution

I'm working on a cron system and need to execute a script only once at a time. Using the following code, I execute the script a first time, and while it's looping (the loop is only there as a delay) I execute it again: file_exists always returns false in the second execution, while the first execution can still read the file's content after its loop is done.
Cronjob.php:
include "Locker.class.php";
Locker::$LockName = __DIR__.'/OneTime_[cron].lock';
$Locker = new Locker();
for ($i = 0 ; $i < 1000000; $i++){
echo 'Z';
$z = true;
ob_end_flush();
ob_start();
}
Locker.class.php:
class Locker{
    static $LockName;

    function __construct(){
        if (!basename(static::$LockName)){
            die('Locker: Not a filename.');
        }
        // It doesn't help
        clearstatcache();
        if (file_exists(static::$LockName)){ // returns false always
            die('Already running');
        } else {
            $myfile = fopen(static::$LockName, "x"); // Tried with 'x' and 'w', no luck
            fwrite($myfile, 'Keep it alive'); // Tried with file_put_contents also, no luck
            fclose($myfile);
        }
        // The following call returns true, by the way!
        // echo file_exists(static::$LockName);
    }

    function __destruct(){
        // This outputs the content
        echo file_get_contents(static::$LockName);
        unlink(static::$LockName);
    }
}
What is the problem? Why does file_exists always return false?
I suspect the PHP parser has noticed that you never use the variable $Locker, so it immediately destroys the object, which runs the destructor and removes the file. Try putting a reference to the object after the loop:
include "Locker.class.php";
Locker::$LockName = __DIR__.'/OneTime_[cron].lock';
$Locker = new Locker();
for ($i = 0 ; $i < 1000000; $i++){
echo 'Z';
$z = true;
ob_end_flush();
ob_start();
}
var_dump($Locker);
If your goal is to prevent a potentially long-running job from executing multiple copies at the same time, you can take a simpler approach and just flock() the file itself.
This would go in cronjob.php
<?php
$wb = false;
$fp = fopen(__FILE__, 'r');
if (!$fp) die("Could not open file");

$locked = flock($fp, LOCK_EX | LOCK_NB, $wb);
if (!$locked) die("Couldn't acquire lock!\n");

// do work here
sleep(20);

flock($fp, LOCK_UN);
fclose($fp);
To address your actual question: running your code, I found that the file goes away because a second invocation outputs Already running while a job is running, and then, as it exits, its destructor deletes the file before the initial task has finished.
The flock method above solves this problem. Otherwise, you'll need to ensure that only the process that actually creates the lock file is able to delete it (and take care that it never gets left around too long).
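A sketch of that idea: remember in the instance whether this process created the file, and make the creation atomic with fopen mode 'x' (the PID payload is just illustrative):
class Locker{
    static $LockName;
    private $owner = false;

    function __construct(){
        // mode 'x' fails if the file already exists, so the existence
        // check and the creation happen as a single atomic step
        $fh = @fopen(static::$LockName, 'x');
        if (!$fh){
            die('Already running');
        }
        fwrite($fh, getmypid());
        fclose($fh);
        $this->owner = true; // this process created the lock file
    }

    function __destruct(){
        // only the process that created the lock file may remove it
        if ($this->owner){
            unlink(static::$LockName);
        }
    }
}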

Please help me improve my PHP cache

I have read a bit around the internet about PHP caching.
At the moment I am using this system to cache my pages.
This is put at the start of the page:
<?php
// Settings
$cachedir = 'cache/'; // Directory to cache files in (keep outside web root)
$cachetime = 600; // Seconds to cache files for
$cacheext = 'html'; // Extension to give cached files (usually cache, htm, txt)

// Ignore List
$ignore_list = array(
    'addedbytes.com/rss.php',
    'addedbytes.com/search/'
);

// Script
$page = 'http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']; // Requested page
$cachefile = $cachedir . md5($page) . '.' . $cacheext; // Cache file to either load or create

$ignore_page = false;
for ($i = 0; $i < count($ignore_list); $i++) {
    $ignore_page = (strpos($page, $ignore_list[$i]) !== false) ? true : $ignore_page;
}

$cachefile_created = ((@file_exists($cachefile)) and ($ignore_page === false)) ? @filemtime($cachefile) : 0;
@clearstatcache();

// Show file from cache if still valid
if (time() - $cachetime < $cachefile_created) {
    //ob_start('ob_gzhandler');
    @readfile($cachefile);
    //ob_end_flush();
    exit();
}

// If we're still here, we need to generate a cache file
ob_start();
?>
MY HTML CODE Goes here .............
And the code below goes at the footer of my page:
<?php
// Now the script has run, generate a new cache file
$fp = @fopen($cachefile, 'w');

// save the contents of the output buffer to the file
@fwrite($fp, ob_get_contents());
@fclose($fp);

ob_end_flush();
?>
There are some things that I need which this code doesn't have:

gzip
the expired cache is not auto-deleted after it expires.

I also wanted to ask whether this code is secure to use. If someone can suggest a better one, or something to improve the current code, that would be just great.
Thank you for reading this post.
Best regards,
Meo
….
// Show file from cache if still valid
if (time() - $cachetime < $cachefile_created) {
    //ob_start('ob_gzhandler');
    echo gzuncompress(file_get_contents($cachefile));
    //ob_end_flush();
    exit();
} else {
    if (file_exists($cachefile) && is_writable($cachefile)) unlink($cachefile);
}
….
and
// Now the script has run, generate a new cache file
$fp = @fopen($cachefile, 'w');

// save the contents of the output buffer to the file, gzip-compressed
@fwrite($fp, gzcompress(ob_get_contents(), 9));
@fclose($fp);

ob_end_flush();
?>
Use ob_start("ob_gzhandler"); to initiate gzipped buffering (it'll take care of determining if the client can actually accept/wants gzipped data and adjust things accordingly).
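For example, at the top of the page (a minimal sketch; note that ob_get_contents() still returns the uncompressed buffer, since the handler only runs when the buffer is flushed, so the cache file stays plain HTML):
<?php
// ob_gzhandler checks the client's Accept-Encoding header and
// compresses the output only when the client supports it
ob_start('ob_gzhandler');

// ... page generation ...

// the buffer contents are still uncompressed at this point
file_put_contents($cachefile, ob_get_contents());
ob_end_flush();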
To delete the cached files:
if (time() - $cachetime < $cachefile_created) {
    @readfile($cachefile);
    //ob_end_flush();
    exit();
} else {
    unlink($cachefile);
    exit();
}
But there can be a delay, or maybe an error, if the file is being written at the moment someone requests that page. You should use flock to overcome such problems, as mentioned at Error during file write in simple PHP caching.
Something like this at the end of the page:
<?php
$fp = @fopen($cachefile, 'w');
if (flock($fp, LOCK_EX | LOCK_NB)) {
    fwrite($fp, gzcompress(ob_get_contents(), 9));
    flock($fp, LOCK_UN);
    fclose($fp);
}
ob_end_flush();
?>
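The read side can take a matching shared lock so a request never reads a half-written cache file (a sketch, reusing the gzcompress variant from above):
$fp = @fopen($cachefile, 'r');
if ($fp && flock($fp, LOCK_SH)) { // shared lock: many readers, but never alongside a writer
    echo gzuncompress(stream_get_contents($fp));
    flock($fp, LOCK_UN);
    fclose($fp);
    exit();
}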

PHP Readfile() number of bytes when user aborted

I'm using a PHP script to stream a live video (i.e. a file which never ends) from a remote source. The output is viewed in VLC, not a web browser. I need to keep a count of the number of bytes transferred. Here is my code:
<?php
ignore_user_abort(true);

$stream = $_GET['stream'];
if ($stream == "vid1")
{
    $count = readfile('http://127.0.0.1:8080/');
    logThis($count);
}

function logThis($c)
{
    $myFile = "bytecount.txt";
    $handle = fopen($myFile, 'a');
    fwrite($handle, "Count: " . $c . "\n");
    fclose($handle);
}
?>
However, it appears that when the user presses the stop button, logThis() is never called, even though I've put in ignore_user_abort(true);
Any ideas on what I'm doing wrong?
Thanks
Update 2: I shouldn't be using ignore_user_abort(true), as that would continue downloading the file forever even after the client has gone. I've changed my code to this:
<?php
$count = 0;

function bye()
{
    // Create a dummy file with a filename equal to the count
}
register_shutdown_function('bye');

set_time_limit(0);
ignore_user_abort(false);

$stream = $_GET['stream'];
if ($stream == "vid1")
{
    $GLOBALS['count'] = readfile('http://127.0.0.1:8080/');
    exit();
}
?>
My problem now is that when the script is aborted (i.e. user presses stop), readfile won't return a value (i.e. count remains at 0). Any ideas on how I can fix this?
Thanks
When a PHP script is running normally, the NORMAL state is active. If the remote client disconnects, the ABORTED state flag is turned on; a remote client disconnect is usually caused by the user hitting the STOP button. If the PHP-imposed time limit (see set_time_limit()) is hit, the TIMEOUT state flag is turned on.
So setting set_time_limit to 0 should help.
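Those states are what connection_status() reports, so you can check them directly (a small sketch; the return value is a bitfield, since ABORTED and TIMEOUT can be set at the same time):
$status = connection_status();
if ($status === CONNECTION_NORMAL) {
    // client still connected, no timeout
}
if ($status & CONNECTION_ABORTED) {
    // the client hit stop / closed the connection
}
if ($status & CONNECTION_TIMEOUT) {
    // the set_time_limit() limit was reached
}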
OK folks, I managed to fix this. The trick was to not use readfile() but to read the video stream in small chunks. It may not be 100% accurate, but a few bytes of inaccuracy here or there are OK.
<?php
$count = 0;

function logCount()
{
    // Write out a dummy file with a filename equal to the count
}
register_shutdown_function('logCount');

set_time_limit(0);
ignore_user_abort(false);

$stream = $_GET['stream'];
if ($stream == "vid1")
{
    $filename = 'http://127.0.0.1:8080/';
    $f = fopen($filename, "rb");
    while ($chunk = fread($f, 1024)) {
        echo $chunk;
        flush();
        if (!connection_aborted()) {
            $GLOBALS['count'] += strlen($chunk);
        } else {
            exit();
        }
    }
}
?>

PHP output buffering to text file

I am having trouble with an update script. It runs for a few hours, so I would like it to output live to a text file.
I start the document with
ob_start();
Then within the while loop (as it iterates through the records of the database) I have this
$size = ob_get_length();
if ($size > 0)
{
    $content = ob_get_contents();
    logit($contents);
    ob_clean();
}
And finally the logit function
function logit($data)
{
    file_put_contents('log.txt', $data, FILE_APPEND);
}
However the log file remains empty. What am I doing wrong?
try
logit($content);
// ^^ Note the missing s
$contents is not the same variable as $content.
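A bug like this is easy to catch early if notices are displayed during development; PHP flags the undefined variable immediately:
// during development, let PHP complain about undefined variables
error_reporting(E_ALL);
ini_set('display_errors', '1');

logit($contents); // Notice (a Warning in PHP 8): Undefined variable: contents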
For anyone coming here looking for a function for this, I wrote a nice one today:
// buffer PHP output between consecutive calls and optionally store it to a file:
function buffer( $toFilePath=0, $appendToFile=0 ){
    $status = ob_get_status();
    if ($status['level'] === 1) return ob_start(); // start the buffer
    $res = ob_get_contents();
    ob_end_clean();
    if ($toFilePath) file_put_contents($toFilePath, $res, ($appendToFile ? FILE_APPEND : 0));
    return $res;
}
Sample usage:
buffer(); //start the buffer
echo(12345); //log some stuff
echo(678910);
$log = buffer('mylog.txt',1); //add these lines to a file (optional)
echo("Here's the latest log: ".$log);
