PHP - Saving array to file - Not stacking up

Really stumped on this one and feel like an idiot! I have a small PHP cron job that does its thing every few minutes. The client has requested that the app email them a daily overview of issues raised.
To do this, I decided to dump an array to a file for storage purposes. I decided against a SQL DB to keep this standalone and lightweight.
What I want to do is open said file, add to a set of numbers and save again.
I have tried this with SimpleXML and serialize/file_put_contents.
The issue I have is that what is written to the file does not correspond with the array echoed the line before. Say I'm adding 2 to the total: the physical file has added 4.
The following is ugly and just a snippet:
echo "count = ".count($result);"<br/>";
$arr = loadLog();
dumpArray($arr, "Pre Load");
$arr0['count'] = $arr['count']+(count($result));
echo "test ".$arr0['count'];
dumpArray($arr0, "Pre Save");
saveLog($arr0);
sleep(3);
$arr1 = loadLog();
dumpArray($arr1, "Post Save");
// serialize the array and write it to the status file
function saveLog($arr){
    $content = serialize($arr);
    var_dump($content);
    file_put_contents(STATUS_SOURCE, $content);
}

// read the status file back in and unserialize it into an array
function loadLog(){
    $content = unserialize(file_get_contents(STATUS_SOURCE));
    return $content;
}

// debug helper: dump an array under a heading
function dumpArray($array, $title = false){
    echo "<p><h1>".$title."</h1><pre>";
    var_dump($array);
    echo "</pre></p>";
}
Output File: a:1:{s:5:"count";i:96;}
I'd really appreciate any heads up - someone else has had a look as well and also scratched his head.

Check that .htaccess isn't routing 404 errors to the same script. Chrome was looking for favicon.ico, which did not exist; this caused the script to execute a second time.
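If the rewrite can't be changed, a cheap safeguard is to bail out early for any request that isn't the expected one. A minimal sketch, assuming the cron endpoint is named cron.php (a hypothetical name):
<?php
// bail out if a rewritten 404 (e.g. Chrome's favicon.ico probe)
// lands on this script, so the counter is only bumped once
$uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
if (basename((string)parse_url($uri, PHP_URL_PATH)) !== 'cron.php') {
    http_response_code(404);
    exit;
}
?>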


How To Make a PHP Script Synchronously Update From Terminal Values

Using a Raspberry Pi, I created a script which serves the Pi's CPU temperature through an Apache server to the browser.
<?php
// read the SoC temperature and strip the "temp=" prefix and "'C" suffix
$temp = exec('vcgencmd measure_temp');
$temp = str_replace('temp=', '', $temp);
$temp = str_replace('\'C', '', $temp);
echo $temp;
?>
Using the code above I have to manually refresh the page to see the latest value.
It works fine but I'd like to know how I could set this up without having to refresh the browser all the time.
Within the Terminal on the Pi I was able to use the "watch" command which will give me the current value every 0.1 seconds.
But by executing this script, the browser will give me a blank page.
<?php
$temp = exec('watch -n 0.1 vcgencmd measure_temp');
$temp = str_replace('temp=','',$temp);
$temp = str_replace('\'C','',$temp);
echo $temp;
?>
Is there any way to make the script using the "watch" command work with the PHP script? If not, is there any other way to make it refresh every time the value changes in the terminal?
Note: I am new to programming and using the Pi.
I would really appreciate any helpful information!
Thank you in advance!
watch won't work in your case. Instead, pull in jQuery from the official CDN and poll the script with the function below. Don't forget to open the console (F12) to see what comes back.
Add this to your PHP file:
// $_GET is always set, so this runs on every request to the script
if (isset($_GET)) {
    $temp = exec('vcgencmd measure_temp');
    $temp = str_replace('temp=', '', $temp);
    $temp = str_replace('\'C', '', $temp);
    echo $temp;
}
Then add this to your index.html:
$(function() {
    startRefresh();
});

function startRefresh() {
    setTimeout(startRefresh, 1000); // 1000 ms = 1 second; change as needed
    $.get('index.php', function(data) { // assumes index.php sits in the same folder as this HTML file
        console.log(data);
    });
}
I actually found an easier way to set this up.
What I wanted to do was not having to manually refresh the page in order to get the current temperature values.
The answers above were correct, but I wasn't able to set them up by myself, so I figured out I could add a refresh header to my PHP script, which makes the page refresh every second (or whatever timeframe is needed).
The code looks like this now:
<?php
header('refresh: 1'); // ask the browser to reload the page every second
$temp = exec('vcgencmd measure_temp');
$temp = str_replace('temp=', '', $temp);
$temp = str_replace('\'C', '', $temp);
echo $temp;
?>
Thank you to everyone who was trying to help me!

Return only updated result PHP

I was wondering how to detect when a page gets updated with PHP. I've researched things on Google, but came across nothing.
What I want to do is call a specific function when a page gets updated. I will be running a cron job in order to run the code.
I want something like this:
$contents = file_get_contents('http://example.com/page');
$Pupdate = file_get_contents('pageupdate.txt');
if ($Pupdate == $contents) {
    // the content is the same
} else {
    // the page has been updated, do whatever you need to do
    // and store only the new updated content in the file
    $fp = fopen('pageupdate.txt', 'w');
    fwrite($fp, $contents);
    fclose($fp);
}
If I can't do something like that, then I'd at least like to know how to detect when a page gets updated, store it in a txt file, and return only the updated result with PHP.
Please help, thanks!
If you're running the script with cron, could you simply check whether the file has been updated since the script was last run?
Get the last modified time of the file with filemtime().
Check whether (time now - last modified time) is less than the cron interval.
If it is, "do whatever you need to do" - see the sketch below.
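A minimal sketch of that check, assuming a five-minute cron interval and the pageupdate.txt file from the question:
<?php
$file     = 'pageupdate.txt';
$interval = 300; // assumed cron interval: the script runs every 5 minutes

$mtime = filemtime($file);
if ($mtime !== false && (time() - $mtime) < $interval) {
    // the file changed since the last run - do whatever you need to do
}
?>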

PHP file_get_contents is asynchronous?

I read that file_get_contents is synchronous, but after trying the code below I don't think so:
$url = "http://foo.com";
$a = array("file11.php", "file2.php", "file3.php");
foreach ($a as $file)
{
    $final = $url . "/" . $file;
    print "Calling $final ...";
    $res = file_get_contents($final);
    if ($res)
        print "OK";
    else
        print "ERR!";
    print "<br>";
}
Each file executes some complex tasks, so I know the minimal execution time of any script, but this code runs very fast and does not seem to wait for each request! How can I wait for each file request?
Thanks :)
The above code is definitely synchronous. So if the code finishes after a few seconds when it should take a lot longer, you probably have a problem in the code itself.
Try wrapping the code in a try/catch and print the error to see what it says:
try { /* code here */ } catch (Exception $e) { echo $e->getMessage(); }
Also, the default php.ini setting for max_execution_time is 30 seconds; after that the script exits on a fatal timeout error. Check the setting in your php.ini and adjust it to your needs.
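One quick way to confirm the requests really run one after the other is to time each call. A minimal sketch built on the question's own loop (foo.com and the file names are the question's placeholders):
<?php
$url = "http://foo.com";
$a = array("file11.php", "file2.php", "file3.php");
foreach ($a as $file) {
    $start = microtime(true);
    $res = file_get_contents($url . "/" . $file);
    // each duration should roughly match that script's execution time
    printf("%s: %.2f s (%s)<br>", $file, microtime(true) - $start,
           $res !== false ? "OK" : "ERR");
}
?>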
Edit:
Gathering from your comments, I now assume you are trying to execute the PHP files you are referring to. This makes your question very confusing and the tags just wrong.
For a local path, the code in your example only reads the contents of the file, so it's not executing anything, which explains why it returns so fast while you expect it to take a while. (With an http:// URL the remote server does execute the script and returns its output.)
If you want to execute the referred PHP files locally, approach it like this:
include_once($final);
instead of reading the contents.

Check when file_get_contents is finished

Is there any way I can check when file_get_contents has finished loading a file, so I can load another one? Will it automatically finish loading one file before going on to the next?
Loading a file with file_get_contents() will block operation of your script until PHP has finished reading it in completely. It must, because otherwise there would be nothing to assign to $content.
PHP is single threaded - all functions happen one after the other. There is a php_threading PECL extension if you did want to try loading files asynchronously, but I haven't tried it myself so I can't say if it would work or not.
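If the goal really is to fetch several URLs at once, another common route is the curl multi interface that ships with PHP's curl extension. A rough sketch with placeholder URLs:
<?php
// fetch several URLs concurrently with curl_multi (placeholder URLs)
$urls = array("http://example.com/a", "http://example.com/b");

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// drive all transfers until every one has finished
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status == CURLM_OK);

foreach ($handles as $ch) {
    echo strlen(curl_multi_getcontent($ch)) . " bytes fetched<br>";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>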
A simple example that loops five times, fetching google.co.uk#q=* and outputting whether it got the page or not. Pretty useless, but it sort of answers your question: a check can be done to see whether file_get_contents succeeded before fetching the next file. Obviously Google could be swapped for something else, though it wouldn't be very practical. Note also that output buffering doesn't flush output from within functions.
<?php
// force any buffered output out to the browser
function _flush(){
    echo(str_repeat("\n\n", 256));
    if (ob_get_length()){
        @ob_flush();
        @flush();
        @ob_end_flush();
    }
    @ob_start();
}

function get_file($loc){
    return file_get_contents($loc);
}

for ($i = 0; $i < 5; $i++){ // five sequential requests
    $content[$i] = @get_file("http://www.google.co.uk/#q=".$i);
    if ($content[$i] === FALSE){
        // the previous fetch failed - stop before starting the next one
        echo 'Error getting google ('.$i.')<br>';
        return;
    } else {
        echo 'Got google ('.$i.')<br>';
    }
    ob_flush();
    _flush();
}
?>

Crazy buffer from Ajax and php script

I have a PHP web crawler that just checks out websites. A few days ago I decided to show the crawler's progress in real time using AJAX. The PHP script writes JSON to a file, and AJAX reads that tiny file.
I double- and triple-checked my PHP script, wondering what on earth was going on, because after I finished the simple AJAX script the data appearing in my browser leaped up and down in strange directions.
The PHP script executed perfectly and very quickly, but my AJAX values would slowly increase, every 2 seconds as set, then drop. The numbers only ever increase in PHP; they never go down. Yet the numbers showing up on my webpage go up and down as if the buffer were working on multiple sessions, or reading from something still being updated, even though the PHP stopped about an hour ago.
Is there something I'm missing or need to clear, like a buffer or a reset button?
This is the most I can show; I just slapped it together a really long time ago. If you know of better code then please share - I welcome any help. But I'm fairly new, so please explain anything beyond basic functions.
AJAX
//create the request object and read our json file
var ajaxRequest = new XMLHttpRequest();
ajaxRequest.onreadystatechange = function(){
    if(ajaxRequest.readyState == 4){
        //display json file contents
        document.form.total_emails.value = ajaxRequest.responseText;
    }
}
ajaxRequest.open("GET", "test_results.php", true);
ajaxRequest.send(null);
PHP
//get addresses and links
for ($x = 0; $x <= $limit; $x++) {
    $input = get_link_contents($link_list[0]);
    array_shift($link_list);
    $link_list = ($x % 100 == 0 || $x == 5) ? filter_urls($link_list, $blacklist) : $link_list;

    //add the links to the link list and remove duplicates
    if (count($link_list) <= 1000) {
        preg_match_all($link_reg, $input, $new_links);
        $link_list = array_merge($link_list, $new_links);
        $link_list = array_unique(array_flatten($link_list));
    }

    //check the addresses against the blacklist before adding to a file as JSON
    $res = preg_match_all($regex, $input, $matches);
    if ($res) {
        foreach (array_unique($matches[0]) as $address) {
            if (!strpos_arr($address, $blacklist)) {
                $enum++;
                json_file($results_file, $link_list[0], $enum, $x);
                write_addresses_to_file($address, $address_file);
            }
        }
    }
    unset($input, $res, $efile);
}
The symptoms suggest the PHP script is not closing the file properly after writing, and/or a race condition where the AJAX routine fetches the JSON data in between PHP's fopen() and the new data being written.
A possible solution would be for the PHP script to write to a temp file, then rename to the desired filename after the data is written and the file is properly closed.
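A minimal sketch of that write-then-rename pattern (the file names and the $status payload are assumptions; rename() to a path on the same filesystem is atomic on POSIX systems):
<?php
// hypothetical names - adjust to your crawler's output
$target = 'test_results.json';
$tmp    = $target . '.tmp';

// write the complete JSON payload to a temp file first...
file_put_contents($tmp, json_encode($status));
// ...then swap it into place in one step, so the AJAX reader
// never sees a half-written file
rename($tmp, $target);
?>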
Also, it's a good idea to check response.status == 200 as well as response.readyState == 4.
Tools like ngrep and tcpdump can help debugging this type of problem.
