Do-while loop while bound to a socket - PHP

So I've got my PHP script to bind to a socket and write incoming data to a text file, as new information comes down the stream. This is working well. What I'd like to do is have another, separate PHP script run each time that text file is updated.
I've tried to make this happen by using an 'include' command, but it's not working. I've tested that my 'include' method works by mocking up a very simple script with that command. Maybe I'm putting the line of code in the wrong place? I have it in the do-while loop (see code below).
Maybe the script/socket needs to be restarted or reset for my changes to take effect?
Any help would be appreciated. Thank you!
Here's the gist of my code:
[socket create, bind, and listen stuff goes here]
do {
    $input = socket_read($spawn, 4096, 1) or die("Could not read input\n");
    $trimmed = trim($input);
    if ($trimmed != "") {
        echo date($dateformat) . ": Received input: $trimmed\n";
        if ($trimmed == "END") {
            socket_close($spawn);
            break;
        } else {
            // write content
            $fhp = fopen($textFile, 'w') or die("can't open file");
            fwrite($fhp, $trimmed);
            fclose($fhp);
            echo date($dateformat) . ": Wrote: " . $trimmed . "\n";
            // run my other PHP script - the crux of my issue - this is not working
            include '/home/public_html/update.php';
        }
        echo date($dateformat) . ": updated \n";
    }
} while (true);
socket_close($socket);
echo "Socket term\n";

The logic in the script in your question is:
Start loop
Read socket
Open file handle
Write to file
Close file
include '/home/public_html/update.php';
echo
Close socket
End loop
I can't understand why the script doesn't work the way you say, because the file gets written and closed before the include line.
The following line of code is outside the inner if/else and might cause some confusion, since it runs whenever the outer IF condition is TRUE, regardless of which branch was taken:
echo date($dateformat) . ": updated \n";
If the problem continues, can you paste the update.php script? Maybe the problem is located there.
-- Edited
To include update.php only if the file has content, apply this code after the file is closed:
if (filesize($textFile) != 0) {
    include '/home/public_html/update.php';
}
The code above checks that the size of $textFile is not zero before including update.php.
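Two extra points worth knowing, neither from the original answer: filesize() results are cached for the duration of a script, so a long-running socket loop should call clearstatcache() before the check, and running update.php out of process keeps a fatal error in it from killing the loop. A minimal sketch, assuming the PHP CLI binary is at /usr/bin/php and update.php works as a standalone script:
clearstatcache(true, $textFile);   // filesize() is cached per script run
if (filesize($textFile) !== 0) {
    // run update.php in its own process; a fatal error there won't stop this loop
    exec('/usr/bin/php /home/public_html/update.php > /dev/null 2>&1 &');
}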

Related

Delete content stored in an array from a log file in php

I extracted some data from a log file and put it in an array (let's call it $line_content). I copied the first 15 lines from the array into another array ($line_content15). I want to delete/remove these 15 lines from the log file. How should I do it? I tried to use str_replace, as in this code snippet:
file_put_contents($filename, str_replace($line_content15 . "\r\n", "",
file_get_contents($filename)));
Any input would be helpful. Thank you!
As #user3783243 commented, I needed to implode the array and use
array_splice($imploded_content, $initial_line_number, $last_line_number);
In this case, $initial_line_number = 0 and $last_line_number = 15.
Update:
I also needed to change permissions so that PHP could access and modify the file.
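A minimal sketch of that approach as I read it (not the OP's exact code; it assumes the whole log fits comfortably in memory):
$lines = file($filename, FILE_IGNORE_NEW_LINES);                  // read the log as an array of lines
array_splice($lines, 0, 15);                                      // drop the first 15 lines
file_put_contents($filename, implode(PHP_EOL, $lines) . PHP_EOL); // write the remainder back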
Given that log files often get very large, trying to do this in memory is not a good solution. Also, given that log files are usually part of one's audit trail, they should not be modified. But assuming there is a valid reason...
function delete_lines($fname, $startline, $endline)
{
    $tmp = tmpfile();
    $in = fopen($fname, 'r+');
    if (!flock($in, LOCK_EX, $wouldblock) || $wouldblock) {
        trigger_error("Unable to lock file");
        return false;
    }
    // copy the lines before the range to be deleted into the temp file
    for ($x = 0; $x < $startline; $x++) {
        fputs($tmp, fgets($in));
    }
    // skip over the lines to be deleted
    for ($x = 0; $x < ($endline - $startline); $x++) {
        fgets($in);
    }
    // copy the rest of the file
    while (!feof($in)) {
        fputs($tmp, fgets($in));
    }
    // rewind both files and write the temp copy back over the original
    fseek($tmp, 0);
    fseek($in, 0);
    $newsize = 0;
    while (!feof($tmp)) {
        $newsize += fputs($in, fgets($tmp));
    }
    ftruncate($in, $newsize);
    fclose($in);
    fclose($tmp);
    return true;
}
You may want to add additional error handling to the above. This could be implemented with a single open file, but it can become messy quickly.
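For the question's case (removing the first 15 lines), the call would look something like this; the second argument is a zero-based start line and the third an exclusive end line:
delete_lines($filename, 0, 15);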

Is it possible to run a php script in the background every hour?

I'm trying to get the links of all script tags of any given site (I will only be trying it on my own personal sites), using simplehtmldom (http://simplehtmldom.sourceforge.net/manual.htm). I have a working bit of code, but no idea how, or if it is even possible, to run this script on a LAMP server in the background once an hour, forever. By "in the background" I mean without a user actually on the site. Is there a simple way to achieve this? Even a quick botched (simple) way would be great! Thanks.
require 'simple_html_dom.php';

function logToFile($filename, $msg)
{
    // open file
    $fd = fopen($filename, "a");
    // write string
    fwrite($fd, $msg . "\n");
    // close file
    fclose($fd);
}

$html = file_get_html('randomsite.com');
// set default timezone
date_default_timezone_set('Pacific/Auckland');
$current_date = date('d/m/Y | H:i:s');
// set scripts src
$current_src = '';
$scripts = $html->find('script');
foreach ($scripts as $s) {
    if (strpos($s->src, 'jquery') !== false) {
        // do nothing
    } else {
        $current_src = $current_src . $s->src . ' ';
    }
}
echo $current_src;
logToFile("data.log", "$current_date : $current_src" . PHP_EOL);
Yeah, this is possible. If you want to run any script every hour, you can use a cron job, in which you define an entry like the one below and it will work fine.
Syntax:
minute hour day month day-of-week command-line-to-execute
If you want it to run every hour, the entry needs to run at minute 0 of every hour:
0 * * * * php /var/www/html/VMonitor/download_file.php
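As a fuller example (my own addition, assuming php is on cron's PATH and the path above is correct; adjust as needed), open the crontab with crontab -e and add the entry, redirecting output to a log so failures are visible:
# run at minute 0 of every hour and append any output or errors to a log
0 * * * * php /var/www/html/VMonitor/download_file.php >> /var/www/html/VMonitor/cron.log 2>&1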

bash echoing value to process, php loop (the process) not reading stdin

Background:
I'm in a position where I'm placing data on the command line, and I need a PHP loop (what will become a server of sorts) to read STDIN and just echo what it reads to the shell it's running in.
The following terrible code works when the process is running in the same shell as the content echoed:
<?php
echo getmypid();
$string = "/proc/" . getmypid() . "/fd/0";
while (true) {
    fwrite(STDOUT, fgets(fopen($string, 'r'), 4096) . " worked\n");
}
?>
I've tried many variants:
<?php
echo getmypid();
$string = "/proc/" . getmypid() . "/fd/0";
while (true) {
    $fo = fread(STDIN, 1024);
    fwrite(STDOUT, $fo);
}
?>
The problem is that whenever I write to this loop from a separate terminal, the output appears in the other terminal but is not processed by the loop.
When I enter text in the same terminal, the text is echoed right back.
I need a way to get command line data into this loop from any source.
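There is no answer in the thread, but one common way to get data into a long-running loop from any terminal is a named pipe (FIFO). A minimal sketch of that idea, assuming /tmp/myloop.fifo is an acceptable path and the posix extension is available; any process can then feed the loop with e.g. echo hello > /tmp/myloop.fifo:
<?php
$fifo = '/tmp/myloop.fifo';            // hypothetical path
if (!file_exists($fifo)) {
    posix_mkfifo($fifo, 0666);         // requires the posix extension
}
$fh = fopen($fifo, 'r');               // blocks until a writer opens the pipe
while (true) {
    $line = fgets($fh);
    if ($line === false) {             // the writer closed the pipe; reopen and wait again
        fclose($fh);
        $fh = fopen($fifo, 'r');
        continue;
    }
    fwrite(STDOUT, $line);
}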

PHP writing into text file while looping

I'm developing an app where the user uploads an Excel [.xlsx] file for dumping data into a MySQL database. I have programmed it so that a LOG is created for each import, so the user can see if any error occurred, etc. My script was working perfectly before implementing the log system.
After implementing the log system I can see duplicate rows inserted into the database. Also, the die() command is not working.
It just keeps looping continuously!
I have written sample code below. Please tell me what's wrong with my logging method.
Note: if I remove the logging [writing into the file], the script works correctly.
$file = fopen("20131105.txt", "a");
fwrite($file, "LOG CREATED" . PHP_EOL);
foreach ($hdr as $k => $v) {
    $username = $v['un'];
    $address = $v['adr'];
    $message = $v['msg'];
    if ($username == '') {
        fwrite($file, 'Error: Missing User Name' . PHP_EOL);
        continue;
    } else {
        // insert into database
    }
}
fwrite($file, PHP_EOL . "LOG CLOSED");
fclose($file);
echo 1;
die();
First, your die statement is after your loop. It needs to be inside your loop to end it.
Second, you're looping over $hdr, but it's not defined in your snippet. It has to be an array. What does it contain?
var_dump($hdr);
The documentation for foreach in the PHP manual highlights:
"Reference of a $value and the last array element remain even after the foreach loop. It is recommended to destroy it by unset()."
Try unsetting the value after the foreach using unset($value). This might be the reason for the duplicate values.
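A minimal sketch of the pattern that quote describes (illustrative only; note the question's loop does not iterate by reference, so this applies only if you switch to a by-reference foreach):
foreach ($hdr as $k => &$v) {    // the & makes $v a reference to each element
    // ... validate and insert $v here ...
}
unset($v);                       // break the lingering reference to the last element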

File reading, searching and looping problems

So I've been trying to write this little piece of code to read a file (status.txt), search it for one of four keywords, and loop until either time runs out (5 minutes) or it finds one of the words. I've already written a few simple PHP scripts to write the words to a txt file, but I can't seem to get this part to work. It either doesn't clear the file in the beginning or seems to hang and never picks up the changes. Any advice would be hugely helpful.
<?php
// Variables
$stringG = "green";
$stringR = "red";
$stringB = "blue";
$stringO = "orange";
$clear = "";
$statusFile = "status.txt";

// erase file
$fh = fopen($statusFile, 'w'); // clear the file by writing an empty string
fwrite($fh, $clear);
fclose($fh);

// Insert LOOP
$counter = 0;
while ($counter <= 10) {
    //echo "loop begun";
    // Read THE FILE
    $fh = fopen($statusFile, 'r');
    $data = fread($fh, filesize($statusFile));
    fclose($fh);
    // process the file
    if (stristr($data, $stringG)) {
        echo "Green!";
        $counter = $counter + 30; // stop if triggered
    } elseif (stristr($data, $stringR)) {
        echo "Red";
        $counter = $counter + 30; // stop if triggered
    } elseif (stristr($data, $stringB)) {
        echo "Blue";
        $counter = $counter + 30; // stop if triggered
    } elseif (stristr($data, $stringO)) {
        echo "Orange";
        $counter = $counter + 30; // stop if triggered
    } else {
        // increment loop counter
        $counter = $counter + 1;
        // Insert pause
        sleep(10);
    }
}
?>
You should open the file before your read loop, and close it after the loop. As in:
open the file
loop through the lines in the file
close the file
Also, if you clear the file before you read it, isn't it going to be empty every time?
Well, first of all, you don't need to "clear" the file this way... The "w" option in fopen will already do that for you.
Also, I wouldn't try to read the whole file at once, because if it's very large, that won't work without intense memory usage.
What you should do is read the file sequentially, which means you always read a fixed number of bytes and look for the keywords. To avoid losing keywords that are cut in half by your reading mechanism, you could make your reads overlap a bit (by the length of your longest keyword minus 1); see the sketch after this answer.
Then you should modify your while loop so that it also checks whether you are at the end of the file (while (!feof($fh))).
PS: It has been mentioned that you clear your file before reading it. What I understood is that your file gets a lot of input really fast, so you expect it to already have content again when you reopen it. If that's not the case, you really need to rethink your logic ;)
PPS: You don't need to abort your while loop by incrementing your counter variable past the boundaries you define. You can also use the break keyword.
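A minimal sketch of that overlap idea (my own illustration, not code from the thread; it assumes the longest keyword is "orange", so the overlap is 5 bytes):
$keywords = ['green', 'red', 'blue', 'orange'];
$overlap  = 5;                        // length of the longest keyword minus 1
$found    = null;
$tail     = '';
$fh = fopen($statusFile, 'r');
while (!feof($fh)) {
    // keep the tail of the previous chunk so a keyword split across chunks is still found
    $chunk = $tail . fread($fh, 4096);
    foreach ($keywords as $kw) {
        if (stristr($chunk, $kw)) {
            $found = $kw;
            break 2;                  // leave both the foreach and the while
        }
    }
    $tail = substr($chunk, -$overlap);
}
fclose($fh);
if ($found !== null) {
    echo ucfirst($found) . "!";
}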
You haven't included the code which deletes the file in the while loop, so it only clears the file once. Also, I'd use unlink($statusFile); to delete the file.
You should rather use a for loop. And to your problem: you clear the file, then get the data from it. Try dumping this $data; you'll end up with string(0) "" for sure. First save the data, then clear the file.
Edit: If you are changing the file in the loop itself in another thread, there's another problem. You should look into atomic file streams. For example, you can use the Nette SafeStream class.
