PHP script unexpectedly not continuing - php

I have a PHP script (let's call it execute.php) that draws the whole page (HTML tags and body tags etc.) at the beginning and, after that, executes some commands (C++ programs) in the background. It then waits for these programs to terminate (some depend on the results of others, so they may have to be executed sequentially) and then has a JavaScript that auto-submits a form to another PHP script (which we will call results.php), because results.php needs the POST information from the previous script.
execute.php:
<?php
print"
<html>
<body>
Some HTML code here
</body>
</html>
";
// Here come some C++-program calls
$pid_program1 = run_in_background($program1);
$pid_program2 = run_in_background($program2);
while (is_running($pid_program1) or is_running($pid_program2) )
{
//echo(".");
sleep(1);
}
// Here come some later C++-program calls that execute quickly
$pid_program3 = run_in_background($program3);
$pid_program4 = run_in_background($program4);
while (is_running($pid_program3) or is_running($pid_program4) )
{
sleep(1);
}
...
// We are now finished
print "
<form action=\"results.php\" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
<script type=\"text/javascript\">
AutoSubmitForm( 'go_to_results' );
</script>
";
This works nicely if the C++ programs 1 and 2 execute quickly. However, when they take their time (around 25 minutes in total), the PHP script seems to fail to continue. Interestingly, the C++ programs 3 and 4 are nevertheless executed and produce the expected outputs.
However, when I put an echo("."); in the first while loop before the sleep(), it works and continues until the JavaScript auto-submit.
So it seems to me that the remaining PHP code (including the auto-submit) is, for whatever reason, not sent when there is no output in the first while loop.
I have also tried using set_time_limit(0) and ignore_user_abort(true) and various other things, like writing to an output buffer instead of the echo (I don't want to clutter the already fully displayed page), but none of these work.
When I run the same scripts on a machine with multiple cores, so that programs 1 and 2 can be executed in parallel, it also works, without the echo(".").
So I am currently very confused, can't find any error messages in the Apache log or PHP log, and would really appreciate your thoughts on this one.
EDIT
Thanks again for your suggestions so far.
I have now adopted a solution involving (really simple) AJAX and it's definitely nicer this way.
However, if the C++ program executions take "longer", the page does not auto-submit to the results page, even though the results page is actually created this time (it failed to be created before).
Basically what I have done is:
process.php:
<?php
$params = "someparam=1";
?>
<html>
<body>
<script type="text/javascript">
function run_analyses(params){
// Use AJAX to execute the programs independently in the background
// Allows for the user to close the process-page and come back at a later point to the results-link, w/o need to wait.
if (window.XMLHttpRequest)
{
http_request = new XMLHttpRequest();
}
else
{
//Fallback for IE5 and IE6, as these don't support XMLHttpRequest
http_request = new ActiveXObject("Microsoft.XMLHTTP");
}
//Is http_request still false
if (!http_request)
{
alert('Giving up :( Cannot create an XMLHttpRequest instance');
}
http_request.onreadystatechange=function(){
if (http_request.readyState==4 && http_request.status==200){
// Maybe used to display the progress of the execution
//document.getElementById("output").innerHTML=http_request.responseText;
// Call of programs is finished -> Go to the results-page
document.getElementById( "go_to_results" ).submit();
}
};
http_request.open("POST","execute.php",true);
//Send the proper header information along with the request
http_request.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
http_request.setRequestHeader("Content-length", params.length);
http_request.setRequestHeader("Connection", "close");
http_request.send(params);
};
</script>
<?php
// Do some HTML-markup
...
// Start the programs!
print "
<script type=\"text/javascript\">
run_analyses('".$params."');
</script>
<form action=\"results.html" id=\"go_to_results\" method=\"POST\">
<input type='hidden' name=\"session_id\" value=\"XYZ\">
</form>
";
?>
</body>
</html>
and execute.php contains the C++ program calls, the waiting routines and finally, via include("results.php"), the creation of the results page.
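For reference, a minimal sketch of what that execute.php could look like, reusing the run_in_background() / is_running() helpers and the $program variables from the question (they are not built-in PHP functions, just the names used earlier):
<?php
// execute.php (sketch): called via the AJAX request from process.php.
set_time_limit(0);        // the program calls can take a long time
ignore_user_abort(true);  // keep going even if the user leaves the page
// first batch: programs 1 and 2 must finish before anything else starts
$pids = array(run_in_background($program1), run_in_background($program2));
foreach ($pids as $pid) {
    while (is_running($pid)) {
        sleep(1);
    }
}
// ... same pattern for programs 3 and 4 ...
include("results.php");   // build the results page as the AJAX response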
Again, for "not so long" program executions, the autosubmission works as expected, but not if it takes "longer". By "longer" I mean around 25 minutes.
I have absolutely no idea what could cause this as again, there are no error-messages to be found.
Am I missing a crucial configuration option there (apache, php, etc.)?
EDIT
As it turned out, letting the requested PHP script "echo" something repeatedly prevents the timeout. So it is basically the same as for the PHP solution without AJAX, but this time, since the responseText of the AJAX request is not necessarily needed, the progress page is not cluttered, and it may be used as a workaround. That said, I would not necessarily recommend it as a general solution or good practice.
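In other words, the wait loops inside execute.php now emit a keep-alive byte, roughly like this (the single space is arbitrary, the helpers are again the ones from the question, and a flush() is added for good measure since whether a bare echo reaches the server depends on the buffering configuration):
while (is_running($pid_program1) or is_running($pid_program2)) {
    echo " ";  // keep-alive byte; the AJAX responseText is not used anyway
    flush();   // push it out so the connection is not considered idle
    sleep(1);
}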

It occurs to me that a better approach would be to:
1. Output the complete HTML page
2. Show a loading message to the user
3. Send an AJAX request to start the external program
4. Wait for the callback (i.e. for the external program to finish)
5. Repeat steps 3 and 4 until all programs have been executed
6. Update the page to tell the user what is going on
7. Submit the form
This way, you get the HTML to the user as quickly as possible, then you execute the programs sequentially in an orderly and controlled fashion without worrying about hitting the max_execution_time threshold. This also enables you to keep your user informed - after each AJAX callback, you can tell the user that "program ABC has completed, starting DEF..." and so on.
EDIT
Per request, I'll add an outline of how this could be implemented. A caveat, too: if you are going to be adding more JavaScript-driven functionality to your page, you'll want to consider using a library like jQuery or MooTools (my personal favorite). This is a decision you should make right away: if you aren't going to be doing much JavaScript beyond this, then a library will only bloat your project, but if you are going to be adding a lot of JavaScript, you don't want to have to come back later and rewrite your code because you added a library three-quarters of the way through the project.
I've used MooTools to create this demonstration, but it isn't necessary or even advisable to add MooTools if this is the only thing you're going to use it for. It is simply easier for me to write an example really quickly without having to stop and think :)
First, the main page. We'll call this page view.php. This should contain your initial HTML as well as the javascript that will fire off the AJAX requests. Basically, this entire jsFiddle would be view.php: http://jsfiddle.net/WPnEy/1/
Now, execute.php looks like this:
<?php
$program_name = isset($_POST['program_name']) ? $_POST['program_name'] : false;
switch ($program_name) {
case 'program1':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 1';
break;
case 'program2':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 2';
break;
case 'program3':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 3';
break;
case 'program4':
$program_path = '/path/to/executable/';
$friendly_name = 'Some program 4';
break;
default:
die(json_encode(array(
'program_name'=>'Invalid',
'status'=>'FAILED',
'error'=>true,
'error_msg'=>'Invalid program'
)));
break;
}
$pid = run_in_background($program_path);
while (is_running($pid)) {
sleep(1);
}
// check here for errors, get any error messages you might have
$error = false;
$error_msg = '';
// use this for failures that are not necessarily errors...
$status = 'OK';
die(json_encode(array(
'program_name'=>$friendly_name,
'status'=>$status,
'error'=>$error,
'error_msg'=>$error_msg
)));
execute.php would then be called once for each program. The $friendly_name variable gives you a way to send back something for the user to see. The switch statement there makes sure that the script isn't being asked to execute anything you aren't expecting. The program is executed, and when it is done you send along a little package of information with the status, the friendly name, any errors, etc. This comes back to the JavaScript on view.php, which then decides if there are more programs to run. If so, it will call execute.php again... if not, it will submit the form.

This seems rather convoluted... and very risky. Any network glitch, the user's browser closing for whatever reason, or even a firewall timing out, and this script is aborted.
Why not run the whole thing in the background?
<?php
session_start();
$_SESSION['background_run_is_done'] = false;
session_write_close(); // release session file lock
set_time_limit(0);
ignore_user_abort(true); // allow job to keep running even if client disconnects.
.... your external stuff here ...
if ($successfully_completed) {
session_start(); // re-open session file to update value
$_SESSION['background_run_is_done'] = TRUE;
}
... use curl to submit job completion post here ...
?>
This disconnects the state of the user's browser from the processing of the jobs. You then just have your client-side code ping the server occasionally to monitor the job's progress.
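A minimal sketch of that "ping" endpoint, assuming a hypothetical check_status.php that only reports the flag set by the background job above:
<?php
// check_status.php (hypothetical name): polled by the client-side code.
session_start();
$done = !empty($_SESSION['background_run_is_done']);
session_write_close();                 // don't hold the session lock longer than needed
header('Content-Type: application/json');
echo json_encode(array('done' => $done));
The client-side code can request this every few seconds and submit the results form (or redirect) once it sees done = true.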

Launching and managing multiple and long-running processes from a webserver PHP process is fraught with complications and complexity. It's also very different on different platforms (you didn't say which you are using).
Handling the invocation of these processes synchronously from the execution of your PHP is not the way to address this. You really need to run the programs in a separate session group - and use (e.g.) Ajax or Comet to poll their status.

Related

Stop PHP with ajax

I have a JavaScript function which calls a PHP function through AJAX.
The PHP function has a set_time_limit(0) for its purposes.
Is there any way to stop that function when I want, for example with an HTML button event?
I want to explain better the situation:
I have a php file which uses a stream_copy_to_stream($src, $dest) php function to retrieve a stream in my local network. The function has to work until I want: I can stop it at the end of the stream or when I want. So I can use a button to start and a button to stop. The problem is the new instance created by the ajax call, in fact I can't work on it because it is not the function that is recording but it is another instance. I tried MireSVK's suggest but it doesn't worked!
That depends on the function. If it is a while loop checking for a certain condition on each iteration, then you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required; a sketch follows below).
It looks like a bad idea, however. Why do you want to do it?
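A minimal sketch of that file-based check, with a made-up stop.flag file and with doSomething() standing in for the actual work (the same placeholder name used in the answers below):
<?php
// Long-running script: keeps working until someone creates stop.flag.
set_time_limit(0);
$stop_file = __DIR__ . '/stop.flag';   // hypothetical flag file
while (!file_exists($stop_file)) {
    doSomething();                     // one chunk of the long-running work
    clearstatcache();                  // don't let a cached stat() hide the new file
}
unlink($stop_file);                    // clean up so the next run starts fresh
A separate stop script (called from the button via AJAX) would then simply do touch(__DIR__ . '/stop.flag');.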
var running = true;
function doSomething(){
//do something........
}
setInterval(function(){if(running){doSomething()}},2000); ///this runs do something every 2 seconds
on button click simply set running = false;
Your code looks like:
set_time_limit(0);
while(true==true){//infinite loop
doSomething(); //your code
}
Let's upgrade it
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
function should_i_stop_loop(){
if (session_status() !== PHP_SESSION_ACTIVE) {
session_start(); // re-open the session so changes made by stopit.php are picked up
}
if( $_SESSION['do_a_loop'] == false ) {
//let's stop the loop
exit();
}
session_write_close(); // release the session lock so stopit.php can write to the session
}
while(true==true){
doSomething();
should_i_stop_loop(); //your new function
}
Create new file stopit.php
session_start();
$_SESSION['do_a_loop'] = false;
All you have to do now is create a request on stopit.php file (with ajax or something)
Edit the code according to your needs; this is just the idea, one of many possible solutions.
Sorry for my English
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script the script spawns a new instance of itself. Thus anything you send to it will be sent to a new operation, not the operation you had previously started.
There are a number of workarounds:
1. Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
2. Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times (a rough sketch follows after this list).
3. Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can then be remotely controlled by what you write to the file/database.
4. Try another programming language (this is a silly option, but I'm a fan of Node). Node.js does this sort of thing very, very easily.
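For the socket option (2), here is a very rough sketch of a long-running worker (run from the CLI) that can be told to stop through a local TCP socket. The port number, the STOP command and the doSomething() work unit are made up for illustration, not a drop-in implementation:
<?php
// Rough sketch: long-running CLI worker with a small control socket.
set_time_limit(0);
$server = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_bind($server, '127.0.0.1', 9500);   // arbitrary local port
socket_listen($server);
socket_set_nonblock($server);              // don't block the work loop while waiting for controllers
$keepGoing = true;
while ($keepGoing) {
    doSomething();                         // one chunk of the long-running work
    $client = @socket_accept($server);     // false immediately if nobody is connecting
    if ($client !== false) {
        $command = trim((string) socket_read($client, 1024));
        if ($command === 'STOP') {
            $keepGoing = false;            // finish this chunk, then leave the loop
        }
        socket_close($client);
    }
}
socket_close($server);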

using ignore_user_abort and set_time_limit(0)

I have a form in page1.php which both redirects to page3.php and also triggers an ajax post in page2.php (with no success function), page2.php might need to run for an hour, but the user doesn't need the results. I do need the user to see page2.php, but he might navigate away.
Do I need to use these two functions in page2.php? Or just one of them? Or none? I want to make sure the script in page2.php runs to the end.
Page1.php
<form id="form" action="page2.php" method="post">
<!--form input elements with names-->
</form>
<script>
$('#form').submit(function() {
$.post('page3.php', {
var1: $('input[name="name1"]').val(),
var2: $('input[name="name2"]').val()
});
});
</script>
Page2.php
<?php
ignore_user_abort(true); // Allow user to navigate away
set_time_limit(0); // Allow script to run indefinitely
// a lot of code which will run for a while - between 3 minutes and an hour
?>
Page3.php
<html>
<!--some code here including links to go to page4.php-->
</html>
I am asking partly because I thought there was no need for any of these functions, but I was told to use them. When I try using them, even though there is a die(); and the script stops, it still seems to be processing something, and I'm afraid that because of this "indefinitely" it will put too much load on the server, and I don't want to add unnecessary load.
Yes, you would need both of those functions in order to meet your current criteria. My suggestion, however, would be to move this out of the HTTP protocol: depending on what your script is actually doing, if it requires no further interaction from the client it would be best run from the command line.
One approach would be to create a cron script that is called at the needed intervals; it would then work through a queue which page2.php populates.
If there are queued jobs, the cron script processes them the way page2.php currently does. Since your script runs for a long time, I would suggest using a locking mechanism for the cron job; see php.net/flock for a simple file-system lock. You check the file: if it's locked, another run is already in progress.
Here is a simple example that you put into a standalone script for processing via cron:
$fp = fopen(DATA_PATH . '/locks/isLocked', 'w+');
if (!flock($fp, LOCK_EX | LOCK_NB)) { //locks the file
$logger->info('Already Running');
exit(0);
}
fwrite($fp, time()); //write our new time so we can inspect when it ran last if needed
try {
if (hasQueue()) { //checks to see if any jobs are waiting in mysql
run(); //process normally completed by page2.php
}
} catch (Exception $e) {
//something went wrong here could setup a log / email yourself etc..
}
flock($fp, LOCK_UN); //unlock the file
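The queue itself is not shown above; a minimal sketch of how page2.php could hand its work off to such a queue, with a made-up job_queue table and database credentials:
<?php
// page2.php (sketch): instead of doing the hour-long work here, just queue it.
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$stmt = $pdo->prepare('INSERT INTO job_queue (payload, created_at) VALUES (:payload, NOW())');
$stmt->execute(array(':payload' => json_encode($_POST)));
echo 'Your job has been queued and will be processed shortly.';
The cron script's hasQueue()/run() pair would then read from the same table, mark a row as in progress, and delete (or flag) it when done.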

Keep posting number on line with for statements PHP

I am looking for something that shows a number on each line, rather than just all the numbers after the page has loaded.
The code, for instance, is:
for($a=0;$a<=10;$a++){
echo $a;
echo '<br>';
}
The output would of course be 1, 2, and so on all the way to 10, but only after the page has loaded. Instead I want it to show 1, then 2, without the browser just loading everything at once. I want a pause, so I can watch the numbers increase.
You can force PHP to flush its output with flush(). Of course, if PHP's output buffering is enabled, this will only flush it into the output buffer.
Once PHP flushes, it's not guaranteed to go directly to the browser, PHP flushes to the web server, which sends it on to the browser depending on the web server's own configuration.
However, as far as PHP is concerned, the following will work (at least on the command line, or on your webserver, if it's configured right):
demo.php
<?php
ob_end_flush(); // make sure output buffering is off
for($i=0;$i<10;$i++){
echo "{$i}\n";
flush();
sleep(1);
}
From the command line:
$ php demo.php
should display 0 ... 1 ... 2 ... up to 9, with a one-second delay between each number.
EDIT: One more thing I thought of. Even if the web server does "stream" your output as you flush from PHP, if your output is in the middle of other markup, the users' browsers may not render anything until the entire response is received.
That said, if you're doing something basic, I've used the above strategy to output status for long-running utility scripts. In those cases, I probably didn't even include HTML tags in my output, but it worked like you want it to (at least on the servers I was dealing with at the time).
This approach might be good enough for internal tools, but I'd never rely on this technique for anything end-users might ever see.
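For completeness, a variation of demo.php for running behind a web server rather than on the command line. The str_pad() padding is a common workaround for servers and browsers that buffer the first few kilobytes before rendering anything; the exact amount needed varies by setup, so treat this as a sketch rather than a guarantee:
<?php
while (ob_get_level() > 0) {
    ob_end_flush();            // drop any active PHP output buffers
}
ob_implicit_flush(true);       // flush automatically after every echo
echo str_pad('', 4096), "\n";  // padding to push past server-side buffering
for ($i = 0; $i < 10; $i++) {
    echo "{$i}<br>\n";
    sleep(1);
}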
Here is a script which will accomplish this. It isn't really a PHP script, but rather a page that accomplishes the same with JavaScript, which gives you much more control over what happens in the client's browser. With output buffering and everything else in the way, there's no way to guarantee the browser will render the output as you want just by controlling it from PHP.
<?php ?>
<html>
<head>
<title>
The Counting Page
</title>
<script type="text/javascript">
var oldbody;
function countTo(a,b)
{
if(oldbody == null)
{
oldbody = document.body.innerHTML;
document.body.innerHTML = '';
}
if(a <= b)
{
elem = document.createElement("div");
elem.innerHTML = a;
document.body.appendChild(elem);
a++;
setTimeout('countTo(' + a + ',' + b + ')',1000);
}
else
{
document.body.innerHTML = oldbody;
}
}
</script>
</head>
<body onload="countTo(1,10)">
Here is the body text.
</body>
</html>

Ajax long polling (comet) + php on lighttpd v1.4.22 multiple instances problem

I am new to this site, so I really hope I will provide all the necessary information regarding my question.
I've been trying to create a "new message arrived" notification using long polling. Currently I am initiating the polling request from the window.onload event of each page on my site.
On the server side I have an infinite loop:
while(1){
if(NewMessageArrived($current_user))break;
sleep(10);
}
echo $newMessageCount;
On the client side I have the following (simplified) ajax functions:
function poll_new_messages(){
xmlhttp=GetXmlHttpObject();
//...
xmlhttp.onreadystatechange=got_new_message_count;
//...
xmlhttp.send();
}
function got_new_message_count(){
if (xmlhttp.readyState==4){
updateMessageCount(xmlhttp.responseText);
//...
poll_new_messages();
}
}
The problem is that with each page load, the above loop starts again. The result is multiple infinite loops for each user that eventually make my server hang.
*The NewMessageArrived() function queries the MySQL DB for new unread messages.
*At the beginning of the PHP script I run session_start() in order to obtain the $current_user value.
I am currently the only user of this site, so it is easy for me to debug this behaviour by writing time() to a file inside the loop. What I see is that the file is being written more often than once every 10 seconds, but this only starts when I go from page to page.
Please let me know if any additional information might help.
Thank you.
I think I found a solution to my problem. I would appreciate it if anyone could tell me whether this is the technique that is being used in Comet, and how scalable this solution is.
I used a user based semaphore like this:
$sem_id = sem_get($current_user);
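// note: sem_get() expects an integer key, so this assumes $current_user is numeric (or can safely be cast to a number)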
sem_acquire($sem_id);
while(1){
if(NewMessageArrived($current_user))break;
sleep(10);
}
sem_release($sem_id);
echo $newMessageCount;
It seems common for long-polling requests to time out after 30 seconds. So in your while loop you could echo 'CLOSE' after 30 seconds.
// assuming NewMessageArrived() returns the number of new messages (0/false when there are none)
$newMessageCount = 0;
$timer = 0;
while(!$newMessageCount && $timer < 30){
$newMessageCount = NewMessageArrived($current_user);
if(!$newMessageCount) {
sleep(10);
$timer += 10;
}
}
if($newMessageCount) {
echo $newMessageCount;
} else {
echo 'CLOSE';
}
In the Javascript, you can listen for the CLOSE.
function poll_new_messages(){
xmlhttp=GetXmlHttpObject();
//...
xmlhttp.onreadystatechange=got_new_message_count;
//...
xmlhttp.send();
}
function got_new_message_count(){
if (xmlhttp.readyState==4){
if(xmlhttp.responseText != 'CLOSE') {
updateMessageCount(xmlhttp.responseText);
}
//...
poll_new_messages();
}
}
Now, the PHP will return a response within 30 seconds, no matter what. If your user stays on the page and you receive a CLOSE, you just don't update the count on the page, and you ask again.
If the user moves to a new page, your PHP instance will stop the loop regardless within 30 seconds, and return a response. Being on a new page though, the XHR that cared about that connection no longer exists, so it won't start up another loop.
You might try checking connection_aborted() periodically. Note that connection_aborted() might not pick up on the fact that the connection has in fact been aborted until you've written some output and done a flush().
In fact, just producing some output periodically may be sufficient for php to notice the connection close itself, and automatically kill your script.
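A sketch of how that check could sit inside the loop from the question; NewMessageArrived(), $current_user and $newMessageCount are taken from the original post, and the padding byte is only there so PHP has a chance to notice the dropped connection (the client would need to trim it):
while (1) {
    if (NewMessageArrived($current_user)) {
        break;
    }
    echo " ";                  // write something small...
    flush();                   // ...and flush it, otherwise the abort may go unnoticed
    if (connection_aborted()) {
        exit;                  // client disconnected: stop this instance of the loop
    }
    sleep(10);
}
echo $newMessageCount;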

Being redirected to the wrong place - sometimes

I have a JavaScript setInterval that checks an external page for mail every 5 seconds. I am finding that if I log in or click a form submit at the same moment the request goes out, I sometimes end up looking at a Y or an N (what my JS was supposed to intercept) instead of the real page I wanted to go to.
How does one debug this? I am using Firefox with Firebug; my app uses PHP with JavaScript.
EDIT: it's almost as if the onComplete is being missed by the JavaScript, and it just dumps the response as the user is signing in... it only happens when someone is changing pages while the JavaScript request is running at the same time.
EDIT 2: If you want to see this for yourself, you'll need to visit my site, create an account and go through the signup process (2-3 minutes to do, tops). The website is http://mikesandmegs.com and the beta password is goldfish. What you want to do is log in just as the mail check sends its request off. It's like I need to cancel something or tell the JavaScript to throw the callback out. You should see the requests every 5 seconds (well, it adds 5 seconds to each request), but you'll see. It may take a couple of tries or some luck, but it is reproducible.
This is the JavaScript that is running (I think I have it all posted). If I seem to be missing anything, let me know. I also posted the HTML input element that the JavaScript checks...
<input id="hasMail" type="hidden" value="y">
<script type='text/javascript'>
var mailInterval = 10000; // current polling interval in ms (grows by 5 seconds after each empty check)
mailTimer = setInterval("checkMail();", mailInterval);
function checkMail()
{
// should we check the mail now?
if ($('hasMail').value == "y")
{
// check for new mail (mail-check.php returns y or n)
new Ajax.Request('mail-check.php',
{
method: 'post',
postBody: '',
onComplete: checkMailNotify
});
}
}
function checkMailNotify(req)
{
if (req.responseText.length > 5)
{
$('hasMail').value = "n";
clearInterval (mailTimer);
return;
}
if (req.responseText == "y")
{
$('hasMail').value = "n";
$('topMessage').update('You have new mail...');
$('alertBox').appear();
clearInterval (mailTimer);
}
else
{
clearInterval (mailTimer);
mailInterval = mailInterval + 5000;
mailTimer = setInterval("checkMail();", mailInterval);
}
}
</script>
I know this is nowhere near a solution, but it WILL help to increase the 5-second interval, even to something like 30 seconds. I've done work with mail servers before, and we often came across problems where people would have e.g. their iPhone as well as their desktop mail client ping the server at very short intervals. This would result in confusing (to them) failures because of locks.
So yeah, 5 seconds for messages is very quick (it doesn't look like chat but rather just messages, is that right?). At best, if you do that, the problem will happen a lot less, if at all. You will however have the horrible knowledge that it can still happen.
Please don't take this as an attempt at a solution to your problem, just a suggestion.
I think what's happening is that while changing pages, the data from the mail-check.php is clashing with the new request that is coming back from the network at the same time. I think a possible solution is to disable the setInterval whenever you change a page or submit a form, then re-enable it after loading the new data.
Something like:
<input type="button" onClick="clearInterval('mailTimer'); this.submit()" />
...
