I'm trying to learn long polling and I'm using the following PHP script:
<?php
$lastMod = filemtime('test.txt');
$counter = 10;
while ($counter > 0) {
    if (filemtime('test.txt') !== $lastMod) {
        echo file_get_contents('./test.txt');
        exit;
    }
    $counter--;
    sleep(1);
}
echo $lastMod;
But no matter what I do or try, it doesn't react when I change the content of test.txt during script execution.
I would be really happy if somebody could tell me where the mistake is.
The filemtime() result is cached during runtime, so you need to reset the cache explicitly using clearstatcache().
Call it right before your if.
From the docs:
Note: The results of this function are cached. See clearstatcache()
for more details.
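Applied to the script above, a minimal sketch of the fix would look like this (same logic, just with the stat cache cleared before each check):

<?php
$lastMod = filemtime('test.txt');
$counter = 10;
while ($counter > 0) {
    clearstatcache();                         // drop PHP's cached stat() results
    if (filemtime('test.txt') !== $lastMod) { // now sees the real modification time
        echo file_get_contents('./test.txt');
        exit;
    }
    $counter--;
    sleep(1);
}
echo $lastMod;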
I have a PHP script that calls a Go program. It gets results every 1-2 seconds and prints them. Using PHP's exec and its output parameter, I only get the results when the program finishes. Is there a way I can check the output to see when it changes, and output that while it's still running?
Something like this, but pausing the execution?:
$return_status = 0;
$output = [];
$old_output = ["SOMETHING ELSE"];
while ($return_status == 0) {
    exec($my_program, $output, $return_status); # somehow pause this?
    if ($output != $old_output) {
        print_r($output);      # $output is an array of lines
        $old_output = $output;
    }
}
Yes. Use the popen() function to get a file handle for the command's output, then read from it a line at a time.
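A minimal sketch of that approach, assuming $my_program holds the same command line you were passing to exec():

<?php
$my_program = './my_go_binary';         // assumption: whatever command you run now

$handle = popen($my_program, 'r');      // start the command and get a read handle
if ($handle === false) {
    die('could not start the program');
}

while (($line = fgets($handle)) !== false) {
    echo $line;                         // output each line as soon as it appears
    flush();                            // push it to the client right away
}

pclose($handle);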
I am using a PHP loop in order to detect file changes. If a file change occurs, I am trying to stop the loop. But in my case, instead of stopping, the loop continues.
Route::get('api.chat.buffer/{job_id}', function ($job_id) {
    $award = DB::table('job_awards')->where('job_id', $job_id)->first();
    $dirname = base_path().'/files/chat/';
    $filename = $award->client_id.'_'.$award->user_id.'.timelog.log';
    $i = 1;
    for ($z = 0; $z <= 20; $z++) {
        $current_file_time = filemtime($dirname.$filename);
        if ($_GET['timestamp'] < $current_file_time) {
            echo json_encode(array('status' => 'success', 'data' => $current_file_time, 'node' => 1)); die;
            break;
        }
        sleep(1); // this should halt for 1 second on every loop
    }
    echo json_encode(array('status' => 'success', 'data' => $current_file_time, 'node' => 0));
    die;
});
I am creating a chat script and tracking changes whenever the file changes.
Thanks
Can you use the sleep function inside a for loop? Have you tried yield? Try passing the number of enclosing loop structures to break; in this case it would be break 1; http://php.net/manual/en/control-structures.break.php
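For example, with a numeric argument break ends that many enclosing loops at once ($messages and $found here are only placeholders to show the syntax):

for ($z = 0; $z <= 20; $z++) {
    foreach ($messages as $message) {
        if ($found) {
            break 2;   // leaves both the foreach and the for
        }
    }
    sleep(1);
}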
I'm trying to write a function that pings a few hundred addresses and returns their values (milliseconds). So far I've achieved the initial idea, which is to ping and get the result, but the problem arises when using the same code for hundreds of addresses: the PHP page stalls until it either times out or reaches the last ping command.
I would be glad if I could get some suggestions on how to output the results progressively; here is my current code:
<?php
// "for" loop added according to suggestion for browser compatibility (IE, FF, CHR, OPR, SFR)
for ($i = 0; $i < 5000; $i++)
{
    echo ' ';
}

function GetPing($ip = NULL) {
    // Returns the client ping if no address has been passed to the function
    if (empty($ip)) {
        $ip = $_SERVER['REMOTE_ADDR'];
    }
    // Check which OS is being run by the client
    if (getenv('OS') == 'Windows_NT') {
        //echo '<b>Detected local system:</b> Windows NT/2000/XP/2003/2008/Vista/7<p>';
        $exec = exec("ping -n 1 -l 32 -i 128 " . $ip);
        return end(explode(' ', $exec));
    }
    else {
        //echo '<b>Detected local system:</b> Linux/Unix<p>';
        $exec = exec("ping -c 1 -s 32 -t 128 " . $ip);
        $array = explode('/', end(explode('=', $exec)));
        return ceil($array[1]) . 'ms';
    }
    // ob_flush and flush added according to suggestion for buffer output
    ob_flush();
    flush();
}

// Added to test 20 sequential outputs
for ($count = 0; $count < 20; $count++)
    echo GetPing('8.8.8.8') . '<div>';
?>
After some feedback, I've added a for loop as well as ob_flush() and flush() to my script and I've also set output_buffering to 0 in php.ini. It seems to work for most browsers that I tested so far (IE8, Firefox 12, Chrome 19, Opera 11, Safari 5). It seems the current code is now working as intended but any suggestion to improve on it is immensely appreciated.
Thank you for your feedback.
This is just a guess; years ago I wrote a very wobbly chat script that also used output buffering (and fetched new messages in a while(true) loop).
While doing this I encountered the same problems: sometimes the script stalled (blank screen), sometimes it took a while until the characters appeared, and this was also browser specific.
Here are the relevant code snippets I added to the script to have it work with IE6 and FF2 (as I said, years ago ...)
<?php
// End output buffering
ob_end_flush();

// IE and Safari Workaround
// They will only display the webpage if it's completely loaded or
// at least 5000 bytes have been "printed".
for ($i = 0; $i < 5000; $i++)
{
    echo ' ';
}

while( ... )
{
    echo 'Message';
    ob_flush();
    flush();
}
?>
It worked for me, so maybe you could give it a try as well. (Although I have no idea how modern browsers and server infrastructure will behave with this.)
I think what you may be looking for is progressively running and outputting the script, rather than asynchronous functions.
See Is there a way to make PHP progressively output as the script executes?
I have a .php script that I use for creating the list of my products.
I am on shared hosting, so I can't do a lot of queries otherwise I get a blank page.
This is how I use my script now:
script.php?start=0&end=500&indexOfFile=0 ->> make a product0.txt file with first 500 products
script.php?start=501&end=1000&indexOfFile=1 ->> product1.txt file with another 500 products
script.php?start=1001&end=1500&indexOfFile=2 ->> product2.txt file with last 500 products
How can I modify the script so it will make all these files automatically, so that I don't have to change the link manually each time?
I would like to click a button which will do this:
make the product0.txt file with the first 500 products
wait 5 seconds
make the product1.txt file with another 500 products
wait 5 seconds
make the product2.txt file with the last 500 products
Use:
sleep(NUMBER_OF_SECONDS);
Before starting your actions, use:
sleep(5);
Or:
usleep(NUMBER_OF_MICROSECONDS);
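For example, one script could loop over the three ranges and pause between them; makeProductFile() here is just a hypothetical stand-in for whatever your script.php does for a single batch:

<?php
for ($indexOfFile = 0; $indexOfFile < 3; $indexOfFile++) {
    $start = $indexOfFile * 500 + ($indexOfFile > 0 ? 1 : 0); // 0, 501, 1001
    $end   = ($indexOfFile + 1) * 500;                        // 500, 1000, 1500
    makeProductFile($start, $end, $indexOfFile);              // hypothetical helper that writes product{$indexOfFile}.txt
    sleep(5);                                                 // wait 5 seconds between batches
}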
In Jan 2018, the only solution that worked for me:
<?php
if (ob_get_level() == 0) ob_start();

for ($i = 0; $i < 10; $i++) {
    echo "<br> Line to show.";
    echo str_pad('', 4096)."\n";

    ob_flush();
    flush();
    sleep(2);
}

echo "Done.";
ob_end_flush();
?>
I use this:
$i = 1;
$last_time = time(); // $_SERVER['REQUEST_TIME'] never changes during a request, so use time()
while ($i > 0) {
    $total = time() - $last_time;
    if ($total >= 2) {
        // Code Here
        $i = -1;
    }
}
you can use
function WaitForSec($sec) {
    $last_time = time();              // again, time() instead of the fixed $_SERVER['REQUEST_TIME']
    while (true) {
        $total = time() - $last_time;
        if ($total >= $sec) {         // compare against the requested number of seconds
            return 1;
        }
    }
}
and run it like:
WaitForSec(your_sec);
Example:
WaitForSec(5);
Or you can use sleep.
Example:
sleep(5);
I am on shared hosting, so I can't do a lot of queries otherwise I get a blank page.
That sounds very peculiar. I've got the cheapest PHP hosting package I could find for my last project - and it does not behave like this. I would not pay for a service which did. Indeed, I'm stumped to even know how I could configure a server to replicate this behaviour.
Regardless of why it behaves this way, adding a sleep in the middle of the script cannot resolve the problem.
Since, presumably, you control your product catalog, new products should be relatively infrequent (or are you trying to get stock reports?). If you control when you change the data, why run the scripts automatically? Or do you mean that you already have these URLs and you get the expected files when you run them one at a time?
From https://www.php.net/manual/es/function.usleep.php:
<?php
// Wait 2 seconds
usleep(2000000);
// if you need 5 seconds
usleep(5000000);
?>
So I've been trying to write this little piece of code to read a file (status.txt), search it for 1 of 4 keywords, and loop until either time runs out (5 minutes) or it finds one of the words. I've already written a few simple PHP scripts to write the words to a txt file, but I can't seem to get this part to work. It either doesn't clear the file in the beginning, or it seems to hang and never picks up the changes. Any advice would be hugely helpful.
<?php
//Variables
$stringG = "green";
$stringR = "red";
$stringB = "blue";
$stringO = "orange";
$clear = "";
$statusFile = "status.txt";

//erase file
$fh = fopen($statusFile, 'w'); //clear the file by writing $clear (an empty string)
fwrite($fh, $clear);
fclose($fh);

//Insert LOOP
$counter = 0;
while ($counter <= 10) {
    //echo "loop begun";

    // Read THE FILE
    $fh = fopen($statusFile, 'r');
    $data = fread($fh, filesize($statusFile));
    fclose($fh);

    //process the file
    if (stristr($data, $stringG)) {
        echo "Green!";
        $counter = $counter + 30; //stop if triggered
    }
    elseif (stristr($data, $stringR)) {
        echo "Red";
        $counter = $counter + 30; //stop if triggered
    }
    elseif (stristr($data, $stringB)) {
        echo "Blue";
        $counter = $counter + 30; //stop if triggered
    }
    elseif (stristr($data, $stringO)) {
        echo "Orange";
        $counter = $counter + 30; //stop if triggered
    }
    else {
        //increment loop counter
        $counter = $counter + 1;
        //Insert pause
        sleep(10);
    }
}
?>
You should open the file before your read loop, and close it after the loop. As in:
open the file
loop through the lines in the file
close the file
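A minimal sketch of that structure:

$fh = fopen($statusFile, 'r');            // open the file once, before the loop
while (($line = fgets($fh)) !== false) {  // loop through the lines in the file
    // check $line for your keywords here
}
fclose($fh);                              // close the file once, after the loop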
Also, if you clear the file before you read it, isn't it going to be empty every time?
Well, first of all, you don't need to "clear" the file this way... The "w" option in fopen will already do that for you.
Also, I wouldn't try to read the whole file at once, because, if it's very large, that won't work without intense memory usage.
What you should do is read the file sequentially, which means you always read a fixed amount of bytes and look for the keywords. To avoid losing keywords which are cut in half by your reading mechanism, you could make your reads overlap a bit (by the length of your longest keyword minus 1) to solve that problem.
Then you should modify your while loop so that it also checks if you are at the end of the file ( while(!feof($fh)) ).
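A rough sketch of that idea, keeping the last few bytes of each chunk so a keyword split across two reads is still found (the chunk size and keyword list are just placeholders):

$keywords   = array('green', 'red', 'blue', 'orange');
$overlapLen = 5;                                    // longest keyword ("orange") minus 1
$chunkSize  = 4096;

$fh   = fopen($statusFile, 'r');
$tail = '';
while (!feof($fh)) {                                // also stop at the end of the file
    $buffer = $tail . fread($fh, $chunkSize);       // prepend the overlap from the previous read
    foreach ($keywords as $word) {
        if (stristr($buffer, $word)) {
            echo ucfirst($word) . "!";
            break 2;                                // keyword found, leave both loops
        }
    }
    $tail = substr($buffer, -$overlapLen);          // keep the overlap for the next read
}
fclose($fh);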
PS: It has been mentioned that you clear your file before reading it. What I understood is that your file gets a lot of input really fast, so you expect it to already have content again when you reopen it. If that's not the case, you really need to rethink your logic ;)
PPS: You don't need to abort your while loop by incrementing your counter variable past the boundaries you define. You can also use the break keyword.
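In your script that would mean, for example:

if (stristr($data, $stringG)) {
    echo "Green!";
    break;   // leave the while loop right away instead of bumping $counter
}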
You haven't included the code which deletes the file in the while loop, so it only clears the file once. Also, I'd use unlink($statusFile); to delete the file.
You should rather use a for loop. And to your problem: you clear the file, then get the data from it. Try dumping this $data; you'll end up with string(0) "" for sure. First save the data, then clear the file.
Edit: If you are changing the file from another thread while the loop is running, there's another problem. You should look into an atomic file stream. For example, you can use the Nette SafeStream class.