I'm using file_get_contents as below, and I have set up a cron job to run that file every hour, so it opens the given URL, which in turn runs some other functions. Now I have two closely related questions.
<?php
file_get_contents('http://107.150.52.251/~IdidODiw/AWiwojdPDOmwiIDIWDIcekSldcdndudsAoiedfiee1.php');
?>
1) If the above URL returns a null value, does it store anything on the server (a temporary value or a log)?
2) If the above URL returns an error, does it store anything like errors or temporary values on the server permanently?
The function itself does not leave any trace.
Since you are running this code in a cron job, you cannot directly inspect its output. Therefore you need to write the result to a log file. Look into Monolog, for instance.
You would then log the result of your call like this:
$contents = file_get_contents( ... );
if ($contents === false) {
    $log->error("An error occurred");
} else {
    $log->debug("Result", array('content' => $contents));
}
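If you go the Monolog route, here is a minimal sketch of how $log could be created; the channel name and log file path are placeholder values, not part of the original code:
use Monolog\Logger;
use Monolog\Handler\StreamHandler;
// 'cron' and the log path are placeholders; adjust them to your project.
$log = new Logger('cron');
$log->pushHandler(new StreamHandler(__DIR__ . '/cron.log'));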
If you suspect anything wrong with the above command, or simply want to debug it, you can print the error message with the following code and redirect it to a log file.
$error = error_get_last();
echo $error['message'];
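Since the script runs unattended, echoing is of little use on its own; here is a minimal sketch of appending that message to a file instead (the log path is hypothetical):
$contents = file_get_contents($url); // $url is the address the cron script calls
if ($contents === false) {
    $error = error_get_last();
    // Message type 3 appends the string to the given file; the path is a placeholder.
    error_log(date('c') . ' ' . $error['message'] . "\n", 3, __DIR__ . '/cron-errors.log');
}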
I am trying to print out a multi-level array so I can see what info is passed and maybe add something.
I tried with a foreach, but there are too many different levels depending on the request.
There is also HTML text, which makes it hard to read the output with a simple print_r.
I copied a normally working exception-logging snippet and edited it. For some reason it is not working.
I also created the file to write to, and the folder exists.
I don't find any error logs indicating that there is an error.
The script continues to execute after this part.
I also tried this with the error_log function, but I can't find that file either. Here I checked under ./var/log/ and not, as with the dump, under source/log/.
I also checked the other load-balancing servers...
Code:
$sOfferId = $aArticle['ggebayofferid'];
$aBody = $this->createOfferItem($aArticle);
print_r("\n abody: ".$aBody." \n"); //<- this shows me that the execution is reaching here
$sDump = "\n";
$sDump .= $aBody;
file_put_contents( "log/ebayupdate.log", $sDump, FILE_APPEND ); //<-- not able to find this log
$sURL = $this->_sURL.'/sell/inventory/v1/offer/'.$sOfferId;
//error_log( print_r($aBody, TRUE) ); <-- not able to find this log
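Two things worth checking here: use an absolute path, because a relative path like "log/ebayupdate.log" is resolved against the current working directory of the PHP process (often not the script's directory under a web server or cron), and test the return value of file_put_contents so the write cannot fail silently. A minimal sketch, with a hypothetical path built from __DIR__:
$logFile = __DIR__ . '/log/ebayupdate.log'; // assumes the log folder sits next to this script
if (file_put_contents($logFile, $sDump, FILE_APPEND) === false) {
    // Surface the reason instead of failing silently; this goes to the PHP error log.
    $error = error_get_last();
    error_log('Could not write ' . $logFile . ': ' . $error['message']);
}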
I read that file_get_contents is synchronous, but after trying the code below I don't think it is:
$url = "http://foo.com";
$a = array("file11.php", "file2.php", "file3.php");
foreach ($a as $file) {
    $final = $url . "/" . $file;
    print "Calling $final ...";
    $res = file_get_contents($final);
    if ($res) {
        print "OK";
    } else {
        print "ERR!";
    }
    print "<br>";
}
Each file executes some complex tasks, so I know the minimal execution time of each script, but this code runs very fast and does not seem to wait for each request! How can I make it wait for each file request?
Thanks :)
The above code is definitely synchronous. So if you say that the code exits after a few seconds, while it should take a lot longer, then you probably have a problem in the code.
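One way to check this is to time each request; here is a minimal sketch based on the loop from the question:
foreach ($a as $file) {
    $final = $url . "/" . $file;
    $start = microtime(true);
    $res = file_get_contents($final);
    // The next iteration only starts once this request has fully returned.
    printf("Calling %s took %.2f seconds<br>", $final, microtime(true) - $start);
}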
Try wrapping this code in a try/catch block, print the error, and see what it says:
try { /* code here */ } catch (Exception $e) { echo $e->getMessage(); }
Also, the default max_execution_time in php.ini is usually 30 seconds. After that, the script will exit with a fatal timeout error too. Check the setting in your php.ini and adjust it to your needs.
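A small sketch of inspecting and raising that limit from within the script itself, if you cannot edit php.ini:
// Current limit in seconds; 0 means unlimited (the CLI defaults to 0).
echo ini_get('max_execution_time');
// Allow this script to run for up to five minutes.
set_time_limit(300);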
Edit:
Going by your comments, I now assume you are trying to execute the PHP files you are referring to. This makes your question very confusing and the tags just wrong.
The code in your example only reads the contents of the file, so it's not executing anything, which explains why it returns so fast while you expect it to take a while.
If you want to execute the referenced PHP files, approach it like this:
include_once( $final );
Instead of just reading the contents.
The following function receives a string parameter representing a URL and loads that URL into a simple_html_dom object. If the loading fails, it attempts to load the URL again.
public function getSimpleHtmlDomLoaded($url)
{
    $ret = false;
    $count = 1;
    $max_attempts = 10;
    while ($ret === false) {
        $html = new simple_html_dom();
        $ret = $html->load_file($url);
        if ($ret === false) {
            echo "Error loading url: $url\n";
            sleep(5);
            $count++;
            $html->clear();
            unset($html);
            if ($count > $max_attempts) {
                return false;
            }
        }
    }
    return $html;
}
However, if loading the URL fails once, it keeps failing for the current URL, and after the maximum attempts are used up it also keeps failing on the next calls to the function with the rest of the URLs it has to process.
It would make sense for it to keep failing if the URLs were temporarily offline, but they are not (I've checked while the script was running).
Any ideas why this is not working properly?
I would also like to point out that when it starts failing to load the URLs, it only gives one warning (instead of multiple ones), with the following message:
PHP Warning: file_get_contents(http://www.foo.com/resource): failed to open stream: HTTP request failed! in simple_html_dom.php on line 1081
Which is prompted by this line of code:
$ret = $html->load_file($url);
I have tested your code and it works perfectly for me; every time I call that function it returns a valid result on the first attempt.
So even if you load pages from the same domain, there can be some protection on the page or the server.
For example, the page can look for certain cookies, or the server can look at your user agent, and if it sees you as a bot it will not serve the correct content.
I had similar problems while parsing some websites.
The answer for me was to see what the page/server expects and make my code simulate that: everything from faking the user agent to generating cookies and such.
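For instance, here is a minimal sketch of fetching a page with a browser-like User-Agent through a stream context and then handing the string to the parser; the User-Agent value is just an example:
$context = stream_context_create(array(
    'http' => array(
        // Some servers refuse requests that carry no (or a bot-like) User-Agent.
        'header' => "User-Agent: Mozilla/5.0 (X11; Linux x86_64)\r\n",
    ),
));
$content = file_get_contents('http://www.foo.com/resource', false, $context);
$html = new simple_html_dom();
$html->load($content);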
By the way, have you tried creating a simple PHP script just to test that the 'simple html dom' parser runs on your server with no errors? That is the first thing I would check.
Finally, I must add that in one case, after numerous failed attempts at parsing one page, I could not win the masking game. In the end I made a script that loads the page in the Linux command-line text browser lynx and saves the whole page locally, and then I parsed that local file, which worked perfectly.
Maybe it is a problem with the load_file() function itself.
The problem was that error_get_last() also returns all previous errors; I don't know, maybe that depends on the PHP version?
I solved the problem by changing the function as follows (check whether the error has changed, not whether it is null)
(or use the non-object function file_get_html()):
function load_file()
{
    // Remember the last error before the call, so only a new one counts.
    $preerror = error_get_last();
    $args = func_get_args();
    $this->load(call_user_func_array('file_get_contents', $args), true);
    // Throw an error if we can't properly load the dom.
    if (($error = error_get_last()) !== $preerror) {
        $this->clear();
        return false;
    }
}
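With that change, calling code like the loop in the question keeps working, since only an error raised by this particular call makes load_file() return false; a short usage sketch:
$html = new simple_html_dom();
$ret = $html->load_file($url);
if ($ret === false) {
    // Only a failure of this request lands here, not a leftover earlier error.
    echo "Error loading url: $url\n";
}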
This is a bit of a long shot, but I figured I'd ask anyway. I have an application that has web-based code editing, like you find on Github, using the ACE editor. The problem is, it is possible to edit code that is within the application itself.
I have managed to detect parse errors before saving the file, which works great, but if the user creates a runtime error, such as MyClass extends NonExistentClass, the file passes the parse check, but saves to the filesystem, killing the application.
Is there any way to test whether the new code will cause a runtime error before I save it to the filesystem? It seems completely counter-intuitive, but I figured I'd ask.
Possibly use register_shutdown_function to build a JSON object containing information about the fatal error, then use an AJAX call to test the file and parse the returned value to see if there is an error. (Obviously you could also run the PHP file and parse the JSON object without using AJAX; I'm just thinking about what would be best from a UX standpoint.)
function my_shutdown() {
    $error = error_get_last();
    // E_ERROR === 1: only report fatal errors (and guard against no error at all).
    if ($error !== null && $error['type'] === E_ERROR) {
        echo json_encode($error);
    }
}
register_shutdown_function('my_shutdown');
Will output something like
{"type":1,"message":"Fatal error message","line":1}
Prepend that to the beginning of the test file, then:
$.post('/test.php', function(data) {
var json = $.parseJSON(data);
if( json.type == 1 ) {
// Don't allow test file to save?
}
});
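As noted above, the same check can be done server-side without AJAX; a minimal sketch, assuming a hypothetical local URL for the saved copy of the edited file:
$raw  = file_get_contents('http://localhost/test.php'); // hypothetical location of the test copy
$json = json_decode($raw, true);
if (is_array($json) && isset($json['type']) && (int) $json['type'] === E_ERROR) {
    // The shutdown handler reported a fatal error: refuse to save the file.
}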
Possibly helpful: php -f <file> will return a non-zero exit code if the script dies with a fatal error.
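A rough sketch of running that check from PHP; the path to the candidate file is hypothetical:
$file = '/tmp/candidate.php'; // hypothetical path to the code being saved
exec('php -f ' . escapeshellarg($file) . ' 2>&1', $output, $exitCode);
if ($exitCode !== 0) {
    // The script died with a fatal error; $output holds the error text.
}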
Perhaps run the code in a separate file first and attach some fixed code at the bottom to check whether it evaluates?
Greetings,
I am writing some code inside a framework for PHP 5.3, and I am trying to catch all errors in a way that lets me crash gracefully on the client side and add a log entry at the same time. To be sure to also catch parse errors, I am using register_shutdown_function.
Here is the function that I register
static function shutdown()
{
    if (is_null($e = error_get_last()) === FALSE) {
        if ($e["type"] == E_PARSE) {
            self::error($e["type"], $e["message"], $e["file"], $e["line"], array(self::$url));
        }
    }
}
The error method does two things:
It adds an error entry to a log file, using fopen in append mode.
It renders an error display: it explicitly sets the HTTP code to 500 and displays a custom 500 error page. Some includes (which I do through a wrapper class, but which are plain includes for now) are required from there.
For some reason, I can fopen my log file and append, but cannot do a simple include; it just silently dies from there.
Here is what the log outputs if I add a log entry for each include:
static public function file($file)
{
    if (class_exists("Logs")) {
        Logs::append("errors.log", $file . ":" . ((include $file) ? 1 : 0));
    } else {
        include $file;
    }
}
// Inside the Logs class...
static public function append($file, $message)
{
    if (!is_scalar($message)) {
        $message = print_r($message, true);
    }
    $fh = fopen(Config::getPath("LOGS") . "/" . $file, 'a');
    fwrite($fh, $message . "\r\n");
    fclose($fh);
}
Here is what the log file gives me:
/Users/mt/Documents/workspace/core/language.php:1
...
/Users/mt/Documents/workspace/index.php:1
/Users/mt/Documents/workspace/core/view.php:1
[2010-01-31 08:16:31] Parsing Error (Code 4) ~ syntax error, unexpected T_VARIABLE {/Users/mt/Documents/workspace/controllers/controller1.php:13}
After the parse error is hit, it does run the registered function, but as soon as it hits a new include, it dies a silent death... Is there a way to circumvent this? Why would I be able to open a file for reading and appending, but not for inclusion?
Try to use Lagger
It would seem that it is related either to something in the config or to some build specifics.
I initially ran the code on Mac OS X, where it failed as described, but it runs on a version of PHP compiled under Ubuntu.
That is fine for my purposes, but it still makes me wonder why it fails under OS X (XAMPP, to be more precise).