CakePHP never-ending request - php

I'm quite new to CakePHP and trying to debug someone else's code.
The problem is that I get a never-ending request, even though both the view and the controller seem to run properly. I even tried adding an exit; to both of them, and even introduced a syntax error into the controller, but the request never ends and the browser keeps trying to load the page endlessly.
Here is the controller code:
public function categories()
{
    file_put_contents("/tmp/logfile.log", time()." categories bla\n", FILE_APPEND);
    $catData = $this->SpecificKeywordCategorie->find('all');
    $modelnameLessValues = array();
    foreach ($catData as $singleCat)
    {
        $modelnameLessValues[] = $singleCat['SpecificKeywordCategorie'];
    }
    $this->set('categories', $modelnameLessValues);
    file_put_contents("/tmp/logfile.log", time()." categories end blu\n", FILE_APPEND);
}
and the view code, "categories.ctp":
<?php
file_put_contents("/tmp/logfile.log", "view json ".json_encode($categories), FILE_APPEND);
print(json_encode($categories));
file_put_contents("/tmp/logfile.log", "view json before exit", FILE_APPEND);
exit;
?>
All the file_put_contents entries are written to the log file, but the exit seems to be ignored, and if I make the request in a browser it never ends...
The same thing happens if I add a syntax error to the controller or the view (of course, in that case the log entries are not written).
I know nothing about CakePHP internals, but PHP scripts running outside it run fine on the same Apache instance.
Any idea where to look to find where this infinite request comes from?
We are running CakePHP 2.2.3.
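One way to narrow this down is to request the action outside the browser with a hard timeout, to see whether the response body and headers ever arrive or the connection simply never closes. This is a sketch; the URL is a placeholder for the real CakePHP route:

```php
<?php
// Sketch: fetch the action with a short timeout; the URL is a
// placeholder for the actual route of the categories action.
$context = stream_context_create(['http' => ['timeout' => 5]]);
$body = @file_get_contents('http://localhost/app/categories', false, $context);

if ($body === false) {
    echo "no response within 5s - the server never finished the request\n";
} else {
    var_dump($http_response_header); // status line and headers that arrived
    echo $body;
}
```

If the body and headers arrive here but the browser still spins, the problem is on the output side (e.g. a Content-Length mismatch keeping the connection open); if nothing arrives, the request is genuinely stuck server-side.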

Related

PHP - file_put_contents writes content multiple times

I have an extremely oversimplified logger function:
<?php
class Logger {
    private $logFile;

    public function __construct($logFile) {
        $this->logFile = $logFile;
    }

    public function log($message) {
        $message = date('c') . $message;
        file_put_contents($this->logFile, $message, FILE_APPEND | LOCK_EX);
        echo "it ran ";
    }
}
Calling it like this:
$logger = new Logger('log.txt');
$logger->log("message");
echo "called the method";
causes the message to be written to the file exactly 3 times, instead of 1.
The code is outside of any loop, which is confirmed by the echo statements, which get printed only once.
Also, if I simply call file_put_contents() in the place where I'd call the log method, it works fine and writes the content just once. So it might have something to do with my class, but I have no clue what.
EDIT: @Tommy: here is the log file content:
2014-09-26T07:24:51-04:00message2014-09-26T07:24:54-04:00message2014-09-26T07:24:54-04:00message
EDIT 2: I tried using the die() function after calling the method, and then it did write the message just once. So I kept moving the die() through the code, and it starts writing the message 3 times right after this exact line:
if (isset($_POST['create_account'])) {
    die;
Since there's a die below it, it shouldn't even matter what's in the further code, right?
I wonder if it might be some sort of PHP bug; this is very strange. If I put the die() above this line, it works fine and writes the message just once.
There's a fairly good chance that your code does a redirect or reload somewhere. That starts a new request, which wipes away the original echo output but does not remove what was written to the file. As a result it looks like the message was echoed once and written three times; really it was echoed three times as well, the other copies were just wiped away.
If you want to see what's going on, print part of the stack trace into the log file along with the message. You can then see exactly on which line the message is created and during which function call.
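A minimal sketch of that idea (class and file names are illustrative): record the immediate caller's file and line with every message, so duplicate lines reveal which request or call site produced each write:

```php
<?php
// Sketch: log the caller's file:line with each message so duplicate
// entries in the log show exactly where each write originated.
class TracingLogger {
    private $logFile;

    public function __construct($logFile) {
        $this->logFile = $logFile;
    }

    public function log($message) {
        // Frame 0 is the call to log() itself, i.e. the caller's location.
        $trace  = debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS, 1);
        $origin = $trace[0]['file'] . ':' . $trace[0]['line'];
        $line   = date('c') . " [$origin] $message\n";
        file_put_contents($this->logFile, $line, FILE_APPEND | LOCK_EX);
    }
}
```

If all three entries carry the same file:line, the whole script is being re-run (a redirect or repeated request); if they differ, something inside your code calls log() more than once.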
In my experience the main cause is that index.php is being called twice, typically because the browser's extra request for a missing favicon.ico gets rewritten to index.php. To fix it, add the missing favicon.ico (or adjust the rewrite rules so that requests for missing assets don't reach index.php).

Double page load: how to prevent it?

I have the following if statement in PHP:
if (isset($_POST['approved']))
{
    // do stuff
}
else
{
    // give the user an error message
}
Now the problem is that BOTH the if branch AND the else branch get executed.
I've tracked the problem down to a double page load: a POST and a GET request are sent at (almost) the same time, or something like that, but both execute the same file.
Simply put, how do I prevent this so my users don't get error messages when everything was actually done all right?
Is there a way to configure PHP to allow only a single execution?
Trying to think about this gives me a serious headache, as if there were a parallel universe where the opposite happens, but it all comes back to this universe as if both had happened...
Schrödinger's cat, but without the collapsed wave function.
If the page is being loaded twice, then the problem is not with the PHP in this file; it's a problem with whatever calls it. Basically, it's loading twice because it's being requested twice. Don't try to stop the PHP file from executing; stop the second request from happening.
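If the duplicate request cannot be stopped at the source, a one-time form token at least keeps the second request out of the error branch. This is a sketch; consumeToken and the form_token session key are illustrative names, and random_bytes() assumes PHP 7+ (use openssl_random_pseudo_bytes() on older versions):

```php
<?php
// Sketch: the first POST consumes the one-time token; a duplicated
// request no longer matches and is ignored instead of raising an error.
session_start();

function consumeToken(array &$session, $posted) {
    if (isset($session['form_token']) && hash_equals($session['form_token'], (string)$posted)) {
        unset($session['form_token']); // consumed: a second identical POST fails
        return true;
    }
    return false;
}

if (isset($_POST['approved']) && consumeToken($_SESSION, isset($_POST['token']) ? $_POST['token'] : '')) {
    // do stuff
} else {
    // duplicate or stray request: skip the error message
}

// When rendering the form, issue a fresh token for the next submission
// and embed it as a hidden input named "token":
$_SESSION['form_token'] = bin2hex(random_bytes(16));
```

The token doubles as basic CSRF protection, which is why frameworks implement essentially this pattern.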

Attempting to load a URL again when it fails

The following function receives a string parameter representing a URL and loads it into a simple_html_dom object. If the loading fails, it attempts to load the URL again.
public function getSimpleHtmlDomLoaded($url)
{
    $ret = false;
    $count = 1;
    $max_attempts = 10;
    while ($ret === false) {
        $html = new simple_html_dom();
        $ret = $html->load_file($url);
        if ($ret === false) {
            echo "Error loading url: $url\n";
            sleep(5);
            $count++;
            $html->clear();
            unset($html);
            if ($count > $max_attempts) {
                return false;
            }
        }
    }
    return $html;
}
However, if the URL loading fails once, it keeps failing for the current URL, and after the max attempts are exhausted it also keeps failing on subsequent calls to the function with the rest of the URLs it has to process.
It would make sense for it to keep failing if the URLs were temporarily offline, but they are not (I checked while the script was running).
Any ideas why this is not working properly?
I would also like to point out that when it starts failing to load the URLs, it gives only a single warning (instead of multiple ones), with the following message:
PHP Warning: file_get_contents(http://www.foo.com/resource): failed to open stream: HTTP request failed! in simple_html_dom.php on line 1081
Which is prompted by this line of code:
$ret = $html->load_file($url);
I have tested your code and it works perfectly for me; every time I call that function it returns a valid result on the first try.
So even if you load the pages from the same domain, there can be some protection on the page or the server.
For example, the page can look for certain cookies, or the server can inspect your user agent, and if it sees you as a bot it will not serve the correct content.
I had similar problems while parsing some websites.
The answer for me was to find out what the page/server expects and make my code simulate that: everything from faking the user agent to generating cookies and such.
By the way, have you tried creating a simple PHP script just to test that the 'simple html dom' parser runs on your server with no errors? That is the first thing I would check.
Finally, I must add that in one case I failed in numerous tries to parse a page and could not win the masking game. In the end I made a script that loads the page in the Linux command-line text browser lynx, saves the whole page locally, and then parses that local file, which worked perfectly.
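For the user-agent part, a sketch of the idea (the URL is the placeholder from the warning above): pass a browser-like User-Agent to file_get_contents() through a stream context, since simple_html_dom's load_file() ultimately calls file_get_contents():

```php
<?php
// Sketch: send a browser-like User-Agent header with the request,
// so a server that blocks obvious bots serves the normal content.
$context = stream_context_create([
    'http' => [
        'header'  => "User-Agent: Mozilla/5.0 (X11; Linux x86_64)\r\n" .
                     "Accept: text/html\r\n",
        'timeout' => 10,
    ],
]);
$html = @file_get_contents('http://www.foo.com/resource', false, $context);
if ($html === false) {
    echo "request failed even with a browser-like User-Agent\n";
}
```

If this plain fetch succeeds where load_file() fails, you can feed the string into the parser with str_get_html() instead.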
It may be a problem with the load_file() function itself.
The problem was that the function error_get_last() also returns all previous errors; I don't know, maybe this depends on the PHP version?
I solved the problem by changing it to check whether the error changed, not whether it is null
(or use the non-object function file_get_html()):
function load_file()
{
    $preerror = error_get_last();
    $args = func_get_args();
    $this->load(call_user_func_array('file_get_contents', $args), true);
    // Throw an error if we can't properly load the dom.
    if (($error = error_get_last()) !== $preerror) {
        $this->clear();
        return false;
    }
}

PHP test for fatal error

This is a bit of a long shot, but I figured I'd ask anyway. I have an application with web-based code editing, like you find on GitHub, using the ACE editor. The problem is that it is possible to edit code that is part of the application itself.
I have managed to detect parse errors before saving the file, which works great, but if the user creates a runtime error, such as MyClass extends NonExistentClass, the file passes the parse check but saves to the filesystem, killing the application.
Is there any way to test whether the new code will cause a runtime error before I save it to the filesystem? It seems completely counter-intuitive, but I figured I'd ask.
Possibly use register_shutdown_function to build a JSON object containing information about the fatal error, then use an AJAX call to test the file and parse the returned value to see if there is an error. (You could also run the PHP file and parse the JSON object without AJAX; I'm just thinking about what would be best from a UX standpoint.)
function my_shutdown() {
    $error = error_get_last();
    if ($error !== null && $error['type'] == 1) { // 1 === E_ERROR
        echo json_encode($error);
    }
}
register_shutdown_function('my_shutdown');
Will output something like
{"type":1,"message":"Fatal error message","line":1}
Prepend that to the beginning of the test file, then:
$.post('/test.php', function(data) {
    var json = $.parseJSON(data);
    if (json.type == 1) {
        // Don't allow test file to save?
    }
});
Possibly helpful: php -f <file> will return a non-zero exit code if there's a runtime error.
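A sketch of that approach: execute the candidate file in a child PHP process and treat a non-zero exit code as a fatal error (hasFatalError is an illustrative name):

```php
<?php
// Sketch: run the candidate file in a separate PHP process; a fatal
// runtime error makes the interpreter exit with a non-zero code
// (typically 255), while a clean run exits with 0.
function hasFatalError($file) {
    $cmd = escapeshellarg(PHP_BINARY) . ' -f ' . escapeshellarg($file) . ' 2>&1';
    exec($cmd, $output, $exitCode);
    return $exitCode !== 0;
}
```

Note that this actually runs the code, so it is only safe for files without side effects (or inside a sandboxed copy of the application).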
Perhaps run the code in a separate file first, with some fixed code appended at the bottom, to check whether it evaluates?

Selenium RC WaitForPageToLoad Hangs

I am trying to get Selenium RC up and running for doing some automated testing on my website. I am finding that I constantly want to verify that I haven't broken any features, and manual testing is starting to become tiresome.
However, I can't seem to get Selenium RC to work with WaitForPageToLoad.
I tried copying the basic example from the Selenium documentation, but the test always gets stuck at $this->waitForPageToLoad("30000");. I can see that it gets that far in the window it brings up, and the page appears to have loaded correctly (we are at a Google search result page), but the test fails with a timeout.
require_once 'PHPUnit/Extensions/SeleniumTestCase.php';

/**
 * Description of Test
 *
 * @author brian
 */
class Test extends PHPUnit_Extensions_SeleniumTestCase {
    function setUp() {
        $this->setBrowser("*safari");
        $this->setBrowserUrl("http://www.google.com/");
    }

    function testMyTestCase() {
        $this->open("/");
        $this->type("q", "selenium rc");
        $this->click("btnG");
        $this->waitForPageToLoad("30000");
        $this->assertTrue($this->isTextPresent("Results * for selenium rc"));
    }
}
What is even more interesting is that if I refresh the page while it is waiting, everything continues on as expected. So it would appear that waitForPageToLoad isn't realizing that the page has already loaded.
The example in the Selenium RC documentation is obsolete. Google changed the way their home page worked quite a while ago, and it is no longer a simple HTML page. Pressing the search button is now an AJAX-type operation that sends the search request and gets back a JSON response that is processed by the JavaScript code in the page. So the page never is re-loaded, and WaitForPageToLoad() eventually times out.
There is also another possible cause of this situation that I ran into just now. According to the documentation, if you call ANY SELENIUM COMMANDS in between loading a page and calling waitForPageToLoad, then it is possible that waitForPageToLoad will hang. (If I understand it correctly, it is technically a race condition between the test script and selenium server, so it happens sometimes, not necessarily all the time).
In most cases, the page load is caused by a click event. When you have a test script like:
$this->click("/some/path");
// <<- NO SELENIUM COMMANDS HERE
$this->waitForPageToLoad("30000");
Make extra sure that no selenium commands ever accidentally get inserted into the marked area.
While this is not technically the same problem that the OP posted about, it has the same error message, and I couldn't find this information without digging around quite a bit. Hopefully this is easier to find for other people in the future.
I have observed the same problem many times, so I avoid this command when the user is not navigating away from the current page. It hangs at times; instead I use isElementPresent in a while loop and exit once it returns true.
An alternative to WaitForPageToLoad() is to wait for an element to be present.
$SECONDS = 360;
for ($second = 0; ; $second++) {
    if ($second >= $SECONDS) $this->fail("TIMEOUT");
    try {
        if ($this->isElementPresent("edit-submit")) break;
    } catch (Exception $e) {}
    sleep(1);
}
This code will loop for up to 360 seconds, checking once per second ("sleep(1)") whether the element (edit-submit) is present. It achieves essentially the same result as WaitForPageToLoad, but lets you specify an explicit target.