Double page load: how to prevent it? - PHP

I have the following if statement in PHP:
if (isset($_POST['approved']))
{
    // do stuff
}
else
{
    // give the user an error message
}
The problem is that both the if branch AND the else branch are getting executed.
I've tracked it down to a double page load: a POST and a GET request are sent at roughly the same time (or something like that), but both end up executing the same file.
Simply put, how do I prevent this, so my users don't get served error messages even though everything went fine?
Is there a way to configure PHP to allow only a single execution?
Trying to think about this gives me serious headaches, as if there were a parallel universe where the opposite happens, but it all comes back to this universe as if both had happened...
Schrödinger's cat, but without the collapsed wave function.

If the page is being loaded twice, then the problem is not with the PHP in this file; it's a problem with whatever requests it. It's loading twice because it's being requested twice. Don't try to stop the PHP file from executing, stop the second request from happening.
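One common way to keep a duplicate or follow-up request from re-running the POST logic (and from falling into the error branch on a plain GET) is the Post/Redirect/Get pattern. The sketch below is only an illustration, not the asker's code; the approved field comes from the question, while the flash session key and the 303 redirect are assumptions:

<?php
// Minimal Post/Redirect/Get sketch (illustrative only).
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_POST['approved'])) {
    // do stuff ...

    // Remember the outcome, then redirect so a reload or a second
    // request arrives as a harmless GET instead of re-running the POST.
    $_SESSION['flash'] = 'Saved successfully.';
    header('Location: ' . $_SERVER['REQUEST_URI'], true, 303);
    exit;
}

// On GET: show the outcome of the previous POST (if any) exactly once,
// and otherwise render the page without any error message.
if (isset($_SESSION['flash'])) {
    echo htmlspecialchars($_SESSION['flash']);
    unset($_SESSION['flash']);
}
?>

With this shape, only a genuine POST that is missing the approved field would ever reach an error branch.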

Related

CakePHP never-ending request

I'm quite new to CakePHP and am trying to debug someone else's code.
The problem is that I get a never-ending request, despite the fact that both the view and the controller seem to run properly. I even tried adding an exit; in both of them, or even introducing a syntax error in the controller; the request never ends and the browser keeps trying to load the page endlessly.
Here is the code of the controller:
public function categories()
{
    file_put_contents("/tmp/logfile.log", time()." categories bla\n", FILE_APPEND);
    $catData = $this->SpecificKeywordCategorie->find('all');
    $modelnameLessValues = array();
    foreach ($catData as $singleCat)
    {
        $modelnameLessValues[] = $singleCat['SpecificKeywordCategorie'];
    }
    $this->set('categories', $modelnameLessValues);
    file_put_contents("/tmp/logfile.log", time()." categories end blu\n", FILE_APPEND);
}
and the view code, "categories.ctp":
<?php
file_put_contents("/tmp/logfile.log","view json ".json_encode($categories),FILE_APPEND);
print(json_encode($categories));
file_put_contents("/tmp/logfile.log","view json before exit",FILE_APPEND);
exit;
?>
All the file_put_contents entries are written to the logfile, but the exit seems to be ignored, and if I make the request in a browser it never ends...
The same thing happens if I add a syntax error to the controller or the view (of course, in that case, the log entries are not written).
I know nothing about CakePHP internals, but PHP scripts running outside it run fine on the same Apache instance.
Any idea where to look to find where this infinite request comes from?
We are running CakePHP 2.2.3.

totally random timeouts after session_start() is called

I have been trying to fix this weird PHP session issue for some time now.
Setup: running on IIS v6.0, PHP for Windows v5.2.6.
Issue:
At totally random times, the next PHP line after session_start() times out.
I have a file, auth.php, that gets included on every page of an extranet site (to check for valid logins).
auth.php
session_start();
if (isset($_SESSION['auth']==1) { <---- times out here
do something ...
}
...
When using the site, I get random "maximum execution time of 30 seconds exceeded" errors at line 2: if (isset($_SESSION['auth']==1) {
If I modify this script to
session_start();
echo 'testing'; <---- times out here
if (isset($_SESSION['auth']==1) {
do something ...
}
...
The random error now happens on line 2 as well (echo 'testing'), which is a simple echo statement. Strange.
It looks like session_start() is randomly causing issues, making the line of code right after it throw a timeout error (even for a simple echo statement)...
This is happening on all sorts of pages on the site (DB-intensive, relatively static, ...), which is making it difficult to troubleshoot. I have been tweaking the session variables and timeouts in php.ini without any luck.
Has anyone encountered something like this, or can anyone suggest possible places to look?
Thanks!
A quick search suggests that you should be using session_write_close() to close the session as soon as you are done using it if you are on an NTFS file system. Starting a session locks the session file so that no other request can access it while your code is running. For some reason, the lock sometimes doesn't release reliably on Windows/NTFS, so you should manually close the session when you are done with it.
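A minimal sketch of what that could look like in auth.php, assuming the session is only read (not written) after the check; the login.php redirect target is an assumption:

session_start();

// Read what we need from the session first.
$isAuthenticated = isset($_SESSION['auth']) && $_SESSION['auth'] == 1;

// Release the session file lock as early as possible so that other
// requests from the same user are not blocked behind this one.
session_write_close();

if (!$isAuthenticated) {
    header('Location: login.php'); // assumed login page
    exit;
}
// ... rest of the page ...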

PHP - how to assure correct command execution sequence

Given simple code like:
$file = 'hugefile.jpg';
$bckp_file = 'hugeimage-backup.jpg';
copy($file, $bckp_file);
// here comes some manipulation on $bckp_file.
The assumed problem is that if the file is big or huge (let's say a JPG), one would think that it will take the server some time to copy it (by time I mean even a few milliseconds), but one would also assume that the execution of the next line would be much faster.
So in theory I could end up with a "no such file or directory" error when trying to manipulate a file that has not yet been created, or worse, start to manipulate a TRUNCATED file.
My question is: how can I assure that $bckp_file was created (or in this case, copied) successfully before the NEXT line, which manipulates it?
What are my options to "pause" or "delay" the next line's execution until the file creation / copy has completed?
Right now I can only think of something like
if (!copy($file, $bckp_file)) {
    echo "failed to copy $file...\n";
}
which will only alert me but will not resolve anything (same as just having the PHP error),
or
if (copy($file, $bckp_file)) {
    // move the manipulation to here ..
}
But this is also not really valid, because let's say the copy was not executed; I would just skip the block without achieving my goal and without any errors.
Is this even a problem, or am I over-thinking it?
Or does PHP have a built-in mechanism to ensure that?
Any recommended practices?
Any thoughts on the issue?
What are my options to "pause" or "delay" the next line's execution until the file creation / copy has completed?
copy() is a synchronous function, meaning that code will not continue past the call to copy() until copy() either completely finishes or fails.
In other words, there's no magic involved.
if (copy(...)) { echo 'success!'; } else { echo 'failure!'; }
Along with synchronous IO, there is also asynchronous IO. It's a bit complicated to explain in technical detail, but the general idea of it is that it runs in the background and your code hooks into it. Then, whenever a significant event happens, the background execution alerts your code. For example, if you were going to async copy a file, you would register a listener to the copying that would be notified when progress was made. That way, your code could do other things in the mean time, but you could also know what progress was being made on the file.
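PHP's built-in copy() offers no such hook, but the listener idea can be roughly approximated by copying the file in chunks and invoking a callback after each chunk. The sketch below is illustrative only; the function name and callback are invented for the example, and the copy is still synchronous, it just makes progress observable:

// Illustrative only: a chunked copy that reports progress via a callback.
function copy_with_progress($src, $dst, callable $onProgress, $chunkSize = 1048576)
{
    $in  = fopen($src, 'rb');
    $out = fopen($dst, 'wb');
    if (!$in || !$out) {
        return false;
    }

    $total  = filesize($src);
    $copied = 0;

    while (!feof($in)) {
        $chunk = fread($in, $chunkSize);
        if ($chunk === false) {
            break; // read error
        }
        fwrite($out, $chunk);
        $copied += strlen($chunk);
        $onProgress($copied, $total); // e.g. write progress to a log
    }

    fclose($in);
    fclose($out);
    return true;
}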
PHP handles file uploads by saving the whole file in a temporary directory on the server before executing any of your script (so you can use $_FILES from the beginning), and it's safe to assume all functions are synchronous -- that is, PHP will wait for each line to finish executing before moving to the next line.
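For example, by the time an upload-handling script starts running, the uploaded file is already fully written to the temporary directory, and move_uploaded_file() is synchronous as well. The field name photo and the destination path below are assumptions for the sake of the sketch:

// The whole upload is already on disk in a temp file before this code runs.
if (isset($_FILES['photo']) && $_FILES['photo']['error'] === UPLOAD_ERR_OK) {
    // The next line only runs once the move has finished (or failed).
    if (move_uploaded_file($_FILES['photo']['tmp_name'], 'uploads/photo.jpg')) {
        // safe to manipulate uploads/photo.jpg here
    }
}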

PHP filemtime function - "stat failed for"

I have a problem with the PHP filemtime function. In my webapp I use the Smarty template engine with the caching option. In my webapp I can do some actions which generate the error, but let's focus on only one action. When I click a link on the page, some content is updated; I can click a few times and everything is OK, but about one request in 10 fails. The following error occurs:
filemtime() [<a href='function.filemtime'>function.filemtime</a>]: stat failed for
and the line that causes the problem:
return ($_template->getCachedFilepath() && file_exists($_template->getCachedFilepath())) ? filemtime($_template->getCachedFilepath()) : false ;
As you can see, the file exists, because that is checked first.
The problematic line of code is in smarty_internal_cacheresource_file.php (part of the Smarty lib v3.0.6).
The app runs on a UNIX system, external hosting.
Any ideas? Should I post more details?
file_exists internally uses the access system call which checks permissions as the real user, whereas filemtime uses stat, which performs the check as the effective user. Therefore, the problem may be rooted in the assumption of effective user == real user, which does not hold. Another explanation would be that the file gets deleted between the two calls.
Since both the result of $_template->getCachedFilepath() and the existence of the file can change in between system calls, why do you call file_exists at all? Instead, I'd suggest just
return @filemtime($_template->getCachedFilepath());
If $_template->getCachedFilepath() can be set to a dummy value such as false, use the following:
$path = $_template->getCachedFilepath();
if (!$path) return false;
return @filemtime($path);
Use:
Smarty::muteExpectedErrors();
I used filemtime successfully without checking "file_exists" for years. The way I have always interpreted the documentation is that FALSE should be returned from "filemtime" upon any error. Then a few days ago something very weird occurred. If the file did not exist, my Cron job terminated with a result. The result was not in the program output but rather in the Cron output. The message was "file length exceeded". I knew the Cron job ended on the filemtime statement because I sent myself an email before and after that statement. The "after" email never arrived.
I inserted a file_exists check on the file to fix the Cron job. However, that should not have been necessary. I still do not know what was changed on the hosting server I use. Several other Cron jobs started failing on the same day. I do not know yet whether they have anything to do with filemtime.
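For reference, the fix described above amounts to guarding the call, roughly like this (the path is a placeholder):

$f = '/path/to/the/watched/file'; // placeholder path
$mtime = file_exists($f) ? filemtime($f) : false;
if ($mtime === false) {
    // handle the missing or unreadable file here instead of letting the job die
}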

Can PHP tell when the browser goes away?

If I'm generating a stream of data to send out to a browser, and the user closes the browser, can I tell within PHP that I don't need to bother generating or sending the rest of the stream? I'd like to insert something into this loop:
while (!feof($pipes[1])) {
echo fgets($pipes[1]);
}
My fallback plan is to have the browser use a JavaScript onunload to hit another PHP page to kill the process that's generating the data, but it would be cleaner if PHP could tell when I'm echoing to nowhere.
By default PHP will abort the script if the user navigates away. There are, however, times when you don't want this to happen, so PHP has a configuration setting for it called ignore_user_abort.
http://php.net/manual/en/misc.configuration.php
There's also a function called register_shutdown_function() that is supposedly executed when execution halts. I've never actually used it, so I won't vouch for how well it works, but I thought I'd mention it for completeness.
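An untested illustration of how register_shutdown_function() could be combined with connection_aborted() here; $mypid is assumed to have been obtained earlier (e.g. from proc_get_status() on the handle that produced $pipes), and the kill -4 comes from the asker's follow-up below:

// Register a cleanup hook that runs when the script stops for any reason
// (normal completion, user abort, fatal error).
register_shutdown_function(function () use ($mypid) {
    if (connection_aborted()) {
        // Kill the background process so it doesn't keep running
        // after the browser has gone away.
        exec('kill -4 ' . (int) $mypid);
    }
});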
I believe the script will automatically abort when loaded normally (no AJAX). But if you want to implement some sort of long polling via PHP using XMLHttpRequest, I think you will have to handle it with some JavaScript, because PHP can't detect the disconnect in that case. I'd also like to know about the precise case.
These answers pointed me towards what I was looking for. The underlying process needed special attention to kill it. I needed to jump out of the loop. Thanks again, Stack Overflow.
while (!feof($pipes[1]) && !connection_aborted())
{
    echo fgets($pipes[1]);
}
if (connection_aborted())
{
    exec('kill -4 '.$mypid);
}
