PHP file_get_contents is asynchronous?

I read that file_get_contents is synchronous, but after trying the code below I don't think it is:
$url = "http://foo.com";
$a = array("file11.php", "file2.php", "file3.php");
foreach ($a as $file)
{
    $final = $url . "/" . $file;
    print "Calling $final ...";
    $res = file_get_contents($final);
    if ($res)
        print "OK";
    else
        print "ERR!";
    print "<br>";
}
Each file executes some complex tasks, so I know the minimum execution time of each script, but this code runs very fast and does not seem to wait for each request! How can I make it wait for each file request?
Thanks :)

The above code is definitely synchronous. So if the code exits after a few seconds when it should take a lot longer, then you probably have a problem with the code.
Try wrapping this code in a try/catch block and printing the error to see what it says:
try { /* code here */ } catch (Exception $e) { echo $e->getMessage(); }
Also, the default max_execution_time setting in php.ini is 30 seconds on most installations. After that the script exits with a fatal timeout error. Check the setting in your php.ini and adjust it to your needs.
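If you cannot edit php.ini, a minimal sketch that raises the limit from within the script itself (the 300-second value is only an illustration):

// Allow this script to run for up to 5 minutes (0 would mean no limit).
set_time_limit(300);
// Equivalent alternative via the ini setting:
ini_set('max_execution_time', '300');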
Edit:
Gathering from your comments, I now assume you are trying to execute the PHP files you are referring to. This makes your question very confusing and the tags just wrong.
The code you use in your example only reads the contents of the file, so it's not executing anything. Which explains why it returns so fast, while you expect it to take a while.
If you want to execute the referred php files, approach it like this:
include_once($final);
Instead of opening the contents.
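A minimal sketch of the loop rewritten that way, assuming the files live in a local directory rather than behind a URL (including over HTTP is normally disabled by allow_url_include):

$dir = "/path/to/scripts";   // local directory instead of a URL (assumption)
$a = array("file11.php", "file2.php", "file3.php");
foreach ($a as $file)
{
    $final = $dir . "/" . $file;
    print "Running $final ...";
    include_once($final);    // executes the script in the current process and waits for it
    print "<br>";
}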

Related

PHP - Saving array to file - Not stacking up

Really stumped on this one and feel like an idiot! I have a small PHP cron job that does its thing every few minutes. The client has requested that the app emails them with a daily overview of issues raised....
To do this, I decided to dump an array to a file for storage purposes. I decided against a SQL DB to keep this standalone and lightweight.
What I want to do is open said file, add to a set of numbers and save again.
I have tried this with SimpleXML and serialize/file_put_contents.
The issue I have is that what is written to the file does not correspond with the array echoed the line before. Say I'm adding 2 to the total; the physical file has added 4.
The following is ugly and just a snippet:
echo "count = ".count($result);"<br/>";
$arr = loadLog();
dumpArray($arr, "Pre Load");
$arr0['count'] = $arr['count']+(count($result));
echo "test ".$arr0['count'];
dumpArray($arr0, "Pre Save");
saveLog($arr0);
sleep(3);
$arr1 = loadLog();
dumpArray($arr1, "Post Save");
function saveLog($arr){
$content = serialize($arr);
var_dump($content);
file_put_contents(STATUS_SOURCE, $content);
}
function loadLog(){
$content = unserialize(file_get_contents(STATUS_SOURCE));
return $content;
}
function dumpArray($array, $title = false){
echo "<p><h1>".$title."</h1><pre>";
var_dump($array);
echo "</pre></p>";
}
Output File: a:1:{s:5:"count";i:96;}
I'd really appreciate any heads-up. I've had someone else look at it, who also scratched his head.
Check that .htaccess isn't sending 404 errors to the same script. In my case Chrome was requesting favicon.ico, which did not exist, and that caused the script to execute a second time.
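If your rewrite rules funnel every request (including the missing favicon.ico) into the same script, a minimal guard at the top of the script can rule that out; this is just a sketch, not part of the original answer:

// Bail out early if this execution was triggered by a stray favicon request
// (assumption: the script is reachable over HTTP and .htaccess rewrites unknown paths to it).
if (isset($_SERVER['REQUEST_URI']) && basename($_SERVER['REQUEST_URI']) === 'favicon.ico') {
    exit;
}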

Does file_get_contents store any data

I'm using file_get_contents as below and set up a cron job to run that file every hour, so it opens the described URL, which runs some other functions. Now I have two closely related questions.
<?php
file_get_contents('http://107.150.52.251/~IdidODiw/AWiwojdPDOmwiIDIWDIcekSldcdndudsAoiedfiee1.php');
?>
1) If the above URL returns a null value, does it store anything on the server (a temporary value or a log)?
2) If the above URL returns an error, does it permanently store anything on the server, such as errors or temporary values?
The function itself does not leave any trace.
Since you are running this code in a cron job, you cannot directly inspect its output. Therefore you need to write the result to a log file. Look into Monolog, for instance.
You will then log the result of your function like this :
$contents = file_get_contents( ... );
if ($contents === false) {
    $log->error("An error occurred");
} else {
    $log->debug("Result", array('content' => $contents));
}
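For completeness, a minimal sketch of the Monolog setup that defines the $log channel used above (assuming monolog/monolog 1.x/2.x is installed via Composer and the log path is writable):

require __DIR__ . '/vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

// Log everything from DEBUG level up into a file the cron user can write to.
$log = new Logger('cron');
$log->pushHandler(new StreamHandler(__DIR__ . '/cron.log', Logger::DEBUG));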
If you suspect anything wrong with the above call or want to debug it, you can print the error or success message with the following code and redirect it to a log file.
$error = error_get_last();
echo $error['message'];
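Put together, a hedged sketch of the hourly cron script that records either outcome to a file (the URL and log path are placeholders, not the originals):

$result = file_get_contents('http://example.com/hourly-task.php'); // placeholder URL

if ($result === false) {
    $error   = error_get_last();
    $message = $error ? $error['message'] : 'unknown error';
    // message_type 3 appends the string to the file named in the third argument.
    error_log(date('c') . " FAILED: " . $message . "\n", 3, '/var/log/hourly-task.log');
} else {
    error_log(date('c') . " OK (" . strlen($result) . " bytes)\n", 3, '/var/log/hourly-task.log');
}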

PHP - file_put_contents writes content multiple times

I have an extremely oversimplified logger function:
<?php
class Logger {
    private $logFile;

    public function __construct($logFile) {
        $this->logFile = $logFile;
    }

    public function log($message) {
        $message = date('c') . $message;
        // Append to the log file, holding an exclusive lock while writing.
        file_put_contents($this->logFile, $message, FILE_APPEND | LOCK_EX);
        echo "it ran ";
    }
}
Calling it like this
$logger = new Logger('log.txt');
$logger->log("message");
echo "called the method";
causes the message to be written to the file exactly 3 times, instead of 1.
The code is outside of any loop, which is confirmed by echo statements, which get printed only once.
Also, if I simply call file_put_contents() in place of the log method, it works fine and writes the content just once. So it might have something to do with my class, but I have no clue what.
EDIT: @Tommy: here is the log file content:
2014-09-26T07:24:51-04:00message2014-09-26T07:24:54-04:00message2014-09-26T07:24:54-04:00message
EDIT 2: I tried using the die() function after calling the method, and then it wrote the message just once. So I kept moving the die() through the code, and it starts writing the message 3 times after this exact line:
if (isset($_POST['create_account'])) {
die;
Since there's a die below it, it shouldn't even matter what's in the rest of the code, right?
I wonder if it might be some sort of PHP bug; this is very strange. If I put the die() above this line, it works fine and writes the message just once.
There's a fairly good chance that your code does a redirect or reload somewhere. This causes a new request to start, which wipes away the original echo but does not remove the one written to file. As a result it looks like it was echo'd once and written thrice. But really, it was echo'd thrice as well, just the other copies have been removed.
If you want to see what's going on, print part of the stack-trace into the log-file along with the message. You can see exactly on which line the message is created and during which function call.
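A minimal way to do that inside the log() method shown above (this is a sketch, not the original code) is to append a compact backtrace to each entry:

public function log($message) {
    // Summarise the call stack as "file:line" pairs so each entry shows where it came from.
    $frames = array();
    foreach (debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS) as $frame) {
        $frames[] = (isset($frame['file']) ? $frame['file'] : '?') . ':'
                  . (isset($frame['line']) ? $frame['line'] : '?');
    }
    $message = date('c') . $message . ' [' . implode(' <- ', $frames) . ']';
    file_put_contents($this->logFile, $message . "\n", FILE_APPEND | LOCK_EX);
}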
In my experience the main issue is that index.php is being called twice. So, to fix:
change the file name
make sure favicon.ico actually exists, so the browser's request for it doesn't run index.php a second time
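To confirm whether a second request really is the culprit, a small hedged sketch that logs every request hitting the script (the log path is an assumption):

// Record each incoming request with its URI; two lines per page load means a second request
// (e.g. favicon.ico or a redirect) is re-running the script.
file_put_contents(
    __DIR__ . '/requests.log',
    date('c') . ' ' . (isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '(cli)') . "\n",
    FILE_APPEND
);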

Check when file_get_contents is finished

Is there any way I can check when file_get_contents has finished loading the file, so I can load another file? Will it automatically finish loading one file before going on to the next one?
Loading a file with file_get_contents() will block your script until PHP has finished reading it in completely. It must, because otherwise it couldn't assign the result to $content.
PHP is single threaded: all functions happen one after the other. There is a php_threading PECL extension if you do want to try loading files asynchronously, but I haven't tried it myself, so I can't say whether it would work or not.
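One way to convince yourself of the blocking behaviour is to time each call; a minimal sketch (the URLs are placeholders):

$urls = array('http://example.com/a', 'http://example.com/b');

foreach ($urls as $url) {
    $start = microtime(true);
    $data  = file_get_contents($url);   // the script waits here until the response is complete
    $ms    = round((microtime(true) - $start) * 1000);
    echo $url . ' finished in ' . $ms . ' ms, ' . strlen((string) $data) . " bytes\n";
}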
Here is a simple example that loops through and fetches google.co.uk#q=* 5 times and outputs whether it got each one or not. It's pretty useless, but it does answer your question: a check can be done to see whether file_get_contents was successful before doing the next one. Obviously Google could be changed to something else, but it wouldn't be very practical. Also note that output buffering doesn't flush output from within functions.
<?php
function _flush (){
    echo(str_repeat("\n\n", 256));
    if (ob_get_length()){
        #ob_flush();
        #flush();
        #ob_end_flush();
    }
    #ob_start();
}

function get_file($loc){
    return file_get_contents($loc);
}

for($i=0; $i<=5; $i++){
    $content[$i] = @get_file("http://www.google.co.uk/#q=".$i);
    if($content[$i] === FALSE){
        echo 'Error getting google ('.$i.')<br>';
        return;
    } else {
        echo 'Got google ('.$i.')<br>';
    }
    ob_flush();
    _flush();
}
?>

register_shutdown_function: registered function cannot include files if terminating on failure?

Greetings,
I am writing some code inside a framework for PHP 5.3, and I am trying to catch all errors in a way that will allow me to crash gracefully on the client side and add a log entry at the same time. To also catch parse errors, I am using register_shutdown_function.
Here is the function that I register
static function shutdown()
{
    if(is_null($e = error_get_last()) === FALSE)
        if($e["type"] == E_PARSE)
            self::error($e["type"], $e["message"], $e["file"], $e["line"], array(self::$url));
}
The error method does two things:
It adds an error entry to a log file, using fopen in append mode.
It renders an error display: it explicitly sets the HTTP code to 500 and displays a custom-formatted 500 error page. Some includes (which I do within a wrapper class, though it is only an include for now) are required from there.
For some reason, I can fopen my log file and append, but cannot do a simple include; it just silently dies from there.
Here is what the log outputs if I add a log entry for each include:
static public function file($file)
{
    if(class_exists("Logs"))
        Logs::append("errors.log", $file . ":" . ((include $file) ? 1 : 0));
    else
        include $file;
}

// Inside the Logs class...
static public function append($file, $message)
{
    if(!is_scalar($message))
        $message = print_r($message, true);

    $fh = fopen(Config::getPath("LOGS") . "/" . $file, 'a');
    fwrite($fh, $message."\r\n");
    fclose($fh);
}
Here is what the log file gives me:
/Users/mt/Documents/workspace/core/language.php:1
...
/Users/mt/Documents/workspace/index.php:1
/Users/mt/Documents/workspace/core/view.php:1
[2010-01-31 08:16:31] Parsing Error (Code 4) ~ syntax error, unexpected T_VARIABLE {/Users/mt/Documents/workspace/controllers/controller1.php:13}
After the parse error is hit, it does run the registered function, but as soon as it hits a new include file, it dies a silent death... Is there a way to circumvent this? Why would I be able to open a file for reading and writing, but not for inclusion?
Try using Lagger.
It would seem that it is related either to something in the config or to some build specifics.
I ran the code initially on MacOSX, which would not run and fail as described, but it runs on a compiled version of PHP under Ubuntu.
Which is kind of fine for me, but it pretty much makes me wonder why it still fails under OS X (XAMPP, to be more precise).
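One thing worth ruling out (the PHP manual notes that the working directory can change inside a shutdown function under some web servers, e.g. Apache): log what PHP can actually see at that point before attempting the include. A hedged diagnostic sketch built on the wrapper above, not the original code:

static public function file($file)
{
    // During shutdown, record the current working directory and whether the file is
    // resolvable and readable; a changed cwd or an unreadable path would show up here.
    if (class_exists("Logs"))
        Logs::append("errors.log", "cwd=" . getcwd()
            . " exists=" . (int) file_exists($file)
            . " readable=" . (int) is_readable($file));

    include $file;
}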
