Causing two things to load in parallel? - php

I'm writing some PHP that does a fair amount of processing and then generates reports of the results. Previously it would do a periodic flush(), but we're moving to Zend Framework and can't do that anymore. Instead, I would like to have some kind of status display that updates while the report is generated. So I made a progress bar that loads in an iframe, added shared memory to the progress bar update action and the report generation action, and caused the output to load via XMLHttpRequest. This all works fine.
My issue is that the browser wants to do the two requests serially instead of in parallel, so it will request the progress bar and then BLOCK until the progress bar completes BEFORE it requests the actual output. This means that the process will never end, since the real work never starts.
I've searched all morning for some way around this and came up empty-handed.
Is there some way to cause two connections, or am I just screwed?
My next action will be to break the processing apart some more and make the status updating action do the actual work, save the result, and then use the other action to dump it. This will be really painful and I'd like to avoid it.
Edit: Here is the javascript, as requested:
function startProgress()
{
var iFrame = document.createElement('iframe');
document.getElementsByTagName('body')[0].appendChild(iFrame);
iFrame.id = 'progressframe';
iFrame.src = '/report/progress';
}
function Zend_ProgressBar_Update(data)
{
document.getElementById('pg-percent').style.width = data.percent + '%';
document.getElementById('pg-text-1').innerHTML = data.text;
document.getElementById('pg-text-2').innerHTML = data.text;
}
function Zend_ProgressBar_Finish()
{
document.getElementById('pg-percent').style.width = '100%';
document.getElementById('pg-text-1').innerHTML = 'Report Completed';
document.getElementById('pg-text-2').innerHTML = 'Report Completed';
document.getElementById('progressbar').style.display = 'none'; // Hide it
}
function ajaxTimeout(){
xmlhttp.abort();
alert('Request timed out');
}
var xmlhttp;
var xmlhttpTimeout;
function loadResults(){
if (window.XMLHttpRequest){
// code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp=new XMLHttpRequest();
}else{
// code for IE6, IE5
xmlhttp=new ActiveXObject("Microsoft.XMLHTTP");
}
xmlhttp.open("POST","/report/output",true);
xmlhttp.onreadystatechange=function(){
if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
clearTimeout(xmlhttpTimeout);
document.getElementById('report-output').innerHTML=xmlhttp.responseText;
}
}
xmlhttpTimeout=setTimeout(ajaxTimeout,600000); // Ten minutes; assign to the variable declared above rather than shadowing it with a local "var"
xmlhttp.setRequestHeader('Content-Type','application/x-www-form-urlencoded');
xmlhttp.send('".file_get_contents("php://input")."'); // this JS is echoed from a PHP string; the concatenation splices in the original POST body
}
This gets called from the following onload script:
onload="startProgress(); setTimeout(loadResults,1000);"
The issue is not in the JavaScript. If you put an alert() in there, the alert will be triggered at the right time, but the browser delays the second HTTP transaction until the first completes.
Thank you everyone for your input.
I didn't come up with a satisfactory answer for this within the timeframe permitted by our development schedule. It appears that every common browser wants to reuse an existing connection to a site when doing multiple transactions with that site. Nothing I could come up with would cause the browser to initiate a parallel connection on demand. Any time there were two requests to the same server, the client wanted to perform them serially.
I ended up breaking the processing into parts and moving it into the status bar update action, saving the report output into a temporary file on the server, then causing the status bar finish function to initiate the xmlhttprequest to load the results. The output action simply spits out the contents of the temporary file and then deletes it.
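For illustration, the "dump and delete" output action can be as small as the sketch below; the per-session file naming scheme is an assumption for the example, not the actual code:
<?php
// Output action sketch: stream the saved report, then remove the temporary file.
session_start();
$file = sys_get_temp_dir() . '/report_' . session_id() . '.html'; // assumed naming scheme
if (is_file($file)) {
readfile($file); // stream the saved report to the client
unlink($file); // then delete the temporary file
} else {
header($_SERVER['SERVER_PROTOCOL'] . ' 404 Not Found', true, 404);
}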

Using two async Ajax calls could do the trick. With the first Ajax request you start the process by calling the PHP CLI to do the actual work deep in the background (so it doesn't expire or get cancelled) and return the id of the process (task). Once you have the process id, you can start a periodic Ajax poll to display the progress made.
Making a DB table containing process_id, state and user would not be a bad thing. In that case, even if the user closed the browser while the process was running, the process would continue until done. The user could revisit the page and see the percentage done, because the process running in the CLI would save its progress into the DB table.
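A rough sketch of that idea, assuming a progress table with process_id, state and percent columns (all names and credentials here are illustrative, not a prescribed schema):
<?php // start.php - kick off the CLI worker and return the task id to the client
$taskId = uniqid('task_', true);
// detach the worker so this web request can return immediately
exec(sprintf('nohup php worker.php %s > /dev/null 2>&1 &', escapeshellarg($taskId)));
echo json_encode(array('task' => $taskId));
And the polled endpoint:
<?php // progress.php - polled periodically by the client
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$stmt = $pdo->prepare('SELECT state, percent FROM progress WHERE process_id = ?');
$stmt->execute(array(isset($_GET['task']) ? $_GET['task'] : ''));
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC) ?: array('state' => 'unknown'));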

Make a system call to the php file and detach it?
ex:
exec('nohup php test.php > test.out 2> test.err < /dev/null &');
echo 'I am totally printing here';
test.php contains a two-second sleep and then prints, but the echo on the calling page returns immediately.
Have it store the results in a file/database/whatever. It will act like a very dirty fork.
You could probably also do something similar with a cURL call if you have trouble using exec().
Credit here for the code example from bmellink (mine was way worse than his).
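For the cURL route mentioned above, one common fire-and-forget trick is to request the script over HTTP with a very short timeout and deliberately ignore the timeout error; note that the target script must call ignore_user_abort(true) so it keeps running after the client disconnects. A sketch (the URL is a placeholder):
<?php
$ch = curl_init('http://localhost/test.php'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // give up after 100 ms; the remote script keeps running
curl_exec($ch); // the timeout "error" here is expected and ignored
curl_close($ch);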

If you are able to load the report in the iFrame, you can kind of reverse your logic (I have done this to track file uploads to PHP).
Load Report in iFrame (can be hidden or whatever you like).
Make ajax call to get progress (step 1 will have to log progress as others have mentioned).
When the progress reports loading complete, you may show the iframe or whatever is needed to complete.
Hope that helps. I just did a whole lot with iframes, CORS, and Ajax calls to APIs.

Related

Efficient and user-friendly way to present slow-loading results

I have read many similar questions concerning cancelling a POST request with jQuery, but none seem to be close to mine.
I have your everyday form that has a PHP-page as an action:
<form action="results.php" method="post">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
Processing results.php on the server side, based on the POST information given in the form, takes a long time (30 seconds or even more, and we expect an increase because our search space will grow in the coming weeks). We are accessing a BaseX server (version 7.9, not upgradable) that contains all the data. A user-generated XPath code is submitted in a form, and the action URL then sends the XPath code to the BaseX server, which returns the results. From a usability perspective, I already show a "loading" screen so users at least know that the results are being generated:
$("form").submit(function() {
$("#overlay").show();
});
<div id="overlay"><p>Results are being generated</p></div>
However, I would also want to give users the option to press a button to cancel the request and cancel the request when a user closes the page. Note that in the former case (on button click) this also means that the user should stay on the same page, can edit their input, and immediately re-submit their request. It is paramount that when they cancel the request, they can also immediately resend it: the server should really abort, and not finish the query before being able to process a new query.
I figured something like this:
$("form").submit(function() {
$("#overlay").show();
});
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
// abort correct request
}
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
But as you can see, I am not entirely sure how to fill in abortRequest to make sure the post request is aborted, and terminated, so that a new query can be sent. Please fill in the blanks! Or would I need to .preventDefault() the form submission and instead do an ajax() call from jQuery?
As I said I also want to stop the process server-side, and from what I read I need exit() for this. But how can I exit another PHP function? For example, let's say that in results.php I have a processing script and I need to exit that script, would I do something like this?
<?php
if (isset($_POST['my-input'])) {
$input = $_POST['my-input'];
function processData() {
// A lot of processing
}
processData();
}
if (isset($_POST['terminate'])) {
function terminateProcess() {
// exit processData()
}
}
and then do a new ajax request when I need to terminate the process?
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
$.ajax({
url: 'results.php',
data: {terminate: true},
type: 'post',
success: function() { alert("terminated"); }
});
}
I did some more research and I found this answer. It mentions connection_aborted() and also session_write_close() and I'm not entirely sure which is useful for me. I do use SESSION variables, but I don't need to write away values when the process is cancelled (though I would like to keep the SESSION variables active).
Would this be the way? And if so, how do I make one PHP function terminate the other?
I have also read into Websockets and it seems something that could work, but I don't like the hassle of setting up a Websocket server as this would require me to contact our IT guy who requires extensive testing on new packages. I'd rather keep it to PHP and JS, without third party libraries other than jQuery.
Considering most comments and answers suggest that what I want is not possible, I am also interested to hear alternatives. The first thing that comes to mind is paged Ajax calls (similar to many web pages that serve search results, images, what-have-you in an infinite scroll). A user is served a page with the X first results (e.g. 20), and when they click a button "show next 20 results" those are shown are appended. This process can continue until all results are shown. Because it is useful for users to get all results, I will also provide a "download all results" option. This will then take very long as well, but at least users should be able to go through the first results on the page itself. (The download button should thus not disrupt the Ajax paged loads.) It's just an idea, but I hope it gives some of you some inspiration.
As I understand it, the key points are:
You cannot cancel a specific request once a form has been submitted. On the client side you have nothing by which to identify the state of a form request (whether it has been posted, whether it is processing, etc.), so the only way to cancel it is to reset the $_POST variables and/or refresh the page. The connection will then be broken and the previous request will not be completed.
Regarding your alternative solution: when you send another Ajax call with {terminate: true}, results.php can stop processing with a simple die(). But as it will be an async call, you cannot map it to the previous form submission, so in practice this will not work.
Probable solution: submit the form with Ajax. With jQuery ajax you will have an xhr object which you can abort() upon window unload.
UPDATE (upon the comment):
A synchronous request is one during which your page blocks (all user actions) until the result is ready. Pressing a submit button in a form performs a synchronous call to the server by submitting the form, by definition [https://www.w3.org/TR/html-markup/button.submit.html].
Once the user has pressed the submit button, the connection from browser to server is synchronous, so it will not be interrupted until the result arrives. Any other call made to the server while the submit is in progress has no reference to that operation, because it has not finished. That is why sending a termination call with Ajax will not work.
Thirdly: for your case you can consider the following code example:
HTML:
<form action="results.php">
<input name="my-input" type="text">
<input id="resultMaker" type="button" value="submit">
</form>
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
JQUERY:
<script type="text/javascript">
var jqXhr = '';
$('#resultMaker').on('click', function(){
$("#overlay").show();
jqXhr = $.ajax({
url: 'results.php',
data: $('form').serialize(),
type: 'post',
success: function() {
$("#overlay").hide();
}
});
});
var abortRequest = function(){
if (jqXhr != '') {
jqXhr.abort();
}
};
$("#overlay button").on('click', abortRequest);
window.addEventListener('unload', abortRequest);
</script>
This is example code; I have just used your code samples and changed a few things here and there.
Himel Nag Rana demonstrated how to cancel a pending Ajax request.
Several factors may interfere and delay subsequent requests, as I have discussed earlier in another post.
TL;DR: 1. it is very inconvenient to try to detect the request was cancelled from within the long-running task itself and 2. as a workaround you should close the session (session_write_close()) as early as possible in your long-running task so as to not block subsequent requests.
connection_aborted() cannot be used. This function is supposed to be called periodically during a long task (typically, inside a loop). Unfortunately there is just one single significant, atomic operation in your case: the query to the data back end.
If you applied the procedures advised by Himel Nag Rana and myself, you should now be able to cancel the Ajax request and immediately allow a new requests to proceed. The only concern that remains is that the previous (cancelled) request may keep running in the background for a while (not blocking the user, just wasting resources on the server).
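To illustrate the session advice above (point 2 of the TL;DR), a minimal sketch of releasing the session lock before the long-running work, so that a later abort request from the same session is not queued behind it:
<?php
session_start();
// read whatever you need from the session first...
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null;
// ...then release the session lock immediately; without this, PHP serializes
// concurrent requests that share the same session file
session_write_close();
// the long-running query goes here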
The problem could be rephrased to "how to abort a specific process from the outside".
As Christian Bonato rightly advised, here is a possible implementation. For the sake of the demonstration I will rely on Symfony's Process component, but you can devise a simpler custom solution if you prefer.
The basic approach is:
Spawn a new process to run the query, save the PID in session. Wait for it to complete, then return the result to the client
If the client aborts, it signals the server to just kill the process.
<?php // query.php
use Symfony\Component\Process\PhpProcess;
require __DIR__ . '/vendor/autoload.php'; // Composer autoloader (assuming a standard Composer setup)
session_start();
if(isset($_SESSION['queryPID'])) {
// A query is already running for this session
// As this should never happen, you may want to raise an error instead
// of just silently killing the previous query.
posix_kill($_SESSION['queryPID'], SIGKILL);
unset($_SESSION['queryPID']);
}
$queryString = parseRequest($_POST);
$process = new PhpProcess(sprintf(
'<?php $result = runQuery(%s); echo fetchResult($result);',
$queryString
));
$process->start();
$_SESSION['queryPID'] = $process->getPid();
session_write_close();
$process->wait();
$result = $process->getOutput();
echo formatResponse($result);
?>
<?php // abort.php
session_start();
if(isset($_SESSION['queryPID'])) {
$pid = $_SESSION['queryPID'];
posix_kill($pid, SIGKILL);
unset($_SESSION['queryPID']); // clear the stored PID; unsetting the local $pid here would wipe it before the echo below
echo "Query $pid has been aborted";
} else {
// there is nothing to abort, send a HTTP error code
header($_SERVER['SERVER_PROTOCOL'] . ' 599 No pending query', true, 599);
}
?>
// javascript
function abortRequest(pendingXHRRequest) {
pendingXHRRequest.abort();
$.ajax({
url: 'abort.php',
success: function() { alert("terminated"); }
});
}
Spawning a process and keeping track of it is genuinely tricky, which is why I advised using existing modules. Integrating just one Symfony component should be relatively easy via Composer: first install Composer, then the Process component (composer require symfony/process).
A manual implementation could look like this (beware, this is untested, incomplete and possibly unstable, but I trust you will get the idea):
<?php // query.php
session_start();
$queryString = parseRequest($_POST); // $queryString should be escaped via escapeshellarg()
$processHandler = popen("/path/to/php-cli/php asyncQuery.php $queryString", 'r');
// fetch the first line of output, PID expected
$pid = trim(fgets($processHandler)); // trim() strips the trailing newline
$_SESSION['queryPID'] = $pid;
session_write_close();
// fetch the rest of the output
while($line = fgets($processHandler)) {
echo $line; // or save this line for further processing, e.g. through json_encode()
}
fclose($processHandler);
?>
<?php // asyncQuery.php
// echo the current PID
echo getmypid() . PHP_EOL;
// then execute the query and echo the result
$result = runQuery($argv[1]);
echo fetchResult($result);
?>
With BaseX 8.4, a new RESTXQ annotation %rest:single was introduced, which allows you to cancel a running server-side request: http://docs.basex.org/wiki/RESTXQ#Query_Execution. It should solve at least some of the challenges you described.
The current way to return only chunks of the result is to pass the indexes of the first and last result you want in your request, and to do the filtering in XQuery:
$results[position() = $start to $end]
By returning one more result than requested, the client will know that there will be more results. This may be helpful, because computing the total result size is often much more expensive than returning only the first results.
I hope I understood this correctly.
Instead of letting the browser "natively" submit the FORM, don't: write JS code that does this instead. In other words (I didn't test this; so interpret as pseudo-code):
<form action="results.php" onsubmit="return false;">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
So now, when that "submit" button is clicked, nothing will happen.
Obviously, you want your form POSTed, so write JS to attach a click handler on that submit button, collect values from all input fields in the form (it is actually NOT nearly as scary as it sounds; check out the first link below), and send them to the server, while saving the reference to the request (check the second link below) so that you can abort it (and maybe signal the server to quit also) when the cancel button is clicked. Alternatively, you can simply abandon it by not caring about the results.
Submit a form using jQuery
Abort Ajax requests using jQuery
Alternatively, to make that HTML markup "clearer" relative to its functionality, consider not using the FORM tag at all: otherwise, what I suggested makes its usage confusing (why is it there if it's not used, know what I mean?). But don't get distracted by this suggestion until you make it work the way you want; it's optional and a topic for another day (it might even relate to your changing the architecture of the whole site).
HOWEVER, a thing to think about: what do you do if the form post has already reached the server, the server has already started processing it, and some "world" changes have already been made? Maybe your get-results routine doesn't change data, in which case that's fine. But this approach probably cannot be used with data-changing POSTs with the expectation that the "world" won't change if the cancel button is clicked.
I hope that helps :)
The user doesn't have to experience this synchronously.
Client posts a request
The server receives the client request and assigns an ID to it
The server "kicks off" the search and responds with a zero-data page and search ID
The client receives the "placeholder" page and starts checking if the results are ready based on the ID (with something like polling or websockets)
Once the search has completed, the server responds with the results next time it's polled (or notifies the client directly when using websockets)
This is fine when performance isn't quite the bottleneck and the nature of processing makes longer wait times acceptable. Think flight search aggregators that routinely run for 30-90 seconds, or report generators that have to be scheduled and run for even longer!
You can make the experience less frustrating if you don't block user interactions, keep them updated of search progress and start showing results as they come in if possible.
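Sketched server-side, the "is it ready yet?" check from step 4 can be as small as this; the file-based storage scheme is an assumption here, and a DB row or cache entry works just as well:
<?php // status.php - polled by the client with the search ID from step 3
$id = basename(isset($_GET['id']) ? $_GET['id'] : ''); // sanitize: no path traversal
$resultFile = "/var/tmp/search_{$id}.json"; // assumed location the worker writes to
if ($id !== '' && is_file($resultFile)) {
header('Content-Type: application/json');
readfile($resultFile); // results are ready: return them
} else {
http_response_code(202); // accepted but not finished yet
echo '{"state":"pending"}';
}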
You must solve this conceptually first before writing any code. Here are some things that come to mind offhand:
What does it mean to free up resources on the server?
What constitutes a graceful abort that will free up resources?
Is it enough to kill the PHP process waiting for the query result(s)? If so, the route suggested by RandomSeed could be interesting. Just keep in mind that it will only work on a single server. If you have multiple load balanced servers you won't have a way to kill a process on another server (not as easily at least).
Or do you need to cancel the database request from the database itself? In that case the answer suggested by Christian Grün is of more interest.
Or is it that there is no graceful shutdown and you have to force everything to die? If so, this seems awfully hacky.
Not all clients are going to explicitly abort
Some clients are going to close the browser, but their last request won't come through; some clients will lose internet connection and leave the service hanging, etc. You are not guaranteed to get an "abort" request when a client disconnects or has gone away.
You have to decide whether to live with potentially unwanted behavior, or implement an additional active state tracking, e.g. client pinging server for keepalive.
Side notes
a query time of 30 seconds or more is potentially long; is there a better tool for the job, so that you won't have to solve this with a hack like this?
you are looking for features of a concurrent system, but you're not using a concurrent system; if you want concurrency use a better tool/environment for it, e.g. Erlang.

Wait few second before closing page

I need a script that, when the user closes the page, waits a few seconds (without popups) and then closes the page.
I remember seeing somewhere here a way to do this, using Ajax (if I remember correctly), by running a PHP file and waiting for the answer before closing, but I can't find it anymore. The PHP file contained a sleep function.
Any help is greatly appreciated.
(This is used mainly to fade out text. When the user comes to the site, text fades in via a CSS3 transition, and when he leaves the page the text fades out. I just need time for the fadeout. Yes, I know this is not user-friendly, but I was specifically asked to do it this way.)
You're probably thinking of a synchronous Ajax request (which blocks the UI):
window.addEventListener('unload',function()
{
var xhr = new XMLHttpRequest();
xhr.open('GET', 'script.php?when=unload',false);//<-- false makes request synchronous
xhr.send();
},false);
But there are other ways, check this answer
On the whole, I'd not do things like this. If a site attempted to deny me the option of closing the window when I feel like it, I'd never use/visit it again. That, and the fact that your JS code is still subject to how the browser implements it, and the browser can be controlled by the client's OS. If I close the browser application, a JS event has nothing to say in that matter, especially if I terminate the browser process (using kill -9, or ctrl+alt+del).
The very least you can do is offer the client a choice, to either force-quit, or wait, explaining why you'd rather the client waited a while:
window.addEventListener('beforeunload',function u(e)
{
var forceQuit = confirm('\tDo you wish to leave now?\n' +
'If you do, some changes you made won\'t be saved');
if (forceQuit)
{
return e;
}
//synchronous ajax result here, or:
e.returnValue = false;
e.cancelBubble = true;
if (e.preventDefault)
{
e.preventDefault();
e.stopPropagation();
}
setTimeout(function()
{//first, remove handler, so the beforeunload's behaviour is back to default
window.removeEventListener('beforeunload',u,false);
//dispatch new beforeunload event:
window.dispatchEvent( new Event('beforeunload'));
},5000);
},false);
Have a look at jQuery's unload. You can bind a delay function to the unload event.

Run 'Long Wait' Script On Page Load, With AJAX Check If Its Finished

I have a webpage that I am embedding a script in that could take up to 10 minutes to run on the back end.
I have tried many different versions of an Ajax script loader with a timer.
What I need:
On page load, I need to trigger the main working script to run.
The very last line of this script will create a unique text file in a folder, with the filename of the user, so that the file appears once the script has completed.
Then, also triggered from page load, an AJAX function would load a second script every 10 seconds;
this second script is very minimal, and checks if the user file (from the script above) is in the specified directory.
If the file is not there yet (script still working), then it echoes
<img src="../loading.gif">
and if the file is now there (script has finished), then it echoes a link (or maybe a header to another page, I haven't decided about that yet)...
This means that on page load, the main processing script starts and also triggers the Ajax script (which will instantly display the loading image), and once the main script has finished executing, the loading image will change to a link (or maybe just redirect you to another page).
Sorry for rambling, just trying to give as much info as possible...
PS: I presume I will need a simple load-once Ajax function to call the main processing script, so that it works in the background, or the main page will take ages to load.
my latest attempt:
function MakeRequest()
{
// getXMLHttp() is assumed to be a small helper that returns a new XMLHttpRequest
// (with an ActiveXObject fallback for old IE)
var xmlHttp = getXMLHttp();
xmlHttp.onreadystatechange = function()
{
if(xmlHttp.readyState == 4)
{
HandleResponse(xmlHttp.responseText);
}
}
xmlHttp.open("GET", "processing_script.php?user=<?php echo $_SESSION['user']; ?>", true);
xmlHttp.send(null);
}
function HandleResponse(response)
{
document.getElementById('ResponseDiv').innerHTML = response;
}
To load a file every 10 seconds you can call your function at that interval with the setInterval function.
Have you considered using a JavaScript framework such as jQuery? It provides some very easy-to-use Ajax methods to simplify the whole process. Handling an Ajax request "manually" is always a "pain in the ass" to me.
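For the checker script itself, something as small as this would do; the flag directory and the way the username is obtained are assumptions:
<?php // checkdone.php - polled every 10 seconds by the page
session_start();
$user = basename(isset($_SESSION['user']) ? $_SESSION['user'] : ''); // sanitize the filename component
$flag = __DIR__ . '/flags/' . $user . '.txt'; // the file the main script creates when finished
if ($user !== '' && is_file($flag)) {
echo '<a href="results.php">Your results are ready</a>'; // placeholder link
} else {
echo '<img src="../loading.gif">';
}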

php monitor process

I'm sure similar questions have been answered over and over again. If so, then I googled in the wrong direction, and I apologize for that.
My problem:
I'm writing a web page with a process running in the background. The process I'm talking about is an R script which runs quite long, maybe several days. When the process is started, it's started like this:
exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));
I'd like to track whether the process is still running, so that when the user checks, he gets a message saying whether it is finished or not. The tracking starts and stops when the user chooses the input file he uploaded to the server, for example after he logs in again or does something else on the page. I also want the page to update when the process finishes, so the page changes in case he is looking at it at that moment.
I have two possibilities. I can either check it via the process id or whether an output file is generated or not.
I tried
while(is_process_running($ps)){
ob_flush();
flush();
sleep(1);
}
This kind of works, except that all other functionality on the page freezes.
That's my is_process_running($ps) function.
function is_process_running($PID){
exec("ps $PID", $ProcessState);
return(count($ProcessState) >= 2);
}
What I really need is another process running in the background checking whether the first process is still running and if not, refreshes the page when the first process finishes.
How would you do that? Please let me know if you need additional information. All help is much appreciated.
With symcbean's answer I was able to solve it. My JavaScript code is below; maybe it's of use to someone who faces the same problem. I'm also open to improvements. I still consider myself a beginner.
function loadXMLDoc(){
var xmlhttp;
if (window.XMLHttpRequest){// code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp=new XMLHttpRequest();
}else{// code for IE6, IE5
xmlhttp=new ActiveXObject("Microsoft.XMLHTTP");
}
return xmlhttp;
}
var lines = 0; // global: tracks whether polling has already started
var timer;
function start(file,time){
var xmlhttp = loadXMLDoc();
if(lines == 0){
timer = setInterval(function(){sendRequest(xmlhttp,file)},time);
}
}
function sendRequest(xmlhttp,file){
xmlhttp.open("POST",file,true);
xmlhttp.onreadystatechange=function() {
if (xmlhttp.readyState==4 && xmlhttp.status==200 && xmlhttp.responseText != ""){
var text = xmlhttp.responseText;
var splitted = text.split("\n");
var length = splitted.length;
if(splitted[length-2]=="finished"){
refresh_page();
clearInterval(timer);
lines = 0;
}
}
}
xmlhttp.send(); // send after the handler is attached
}
The start method is called with the file path and the time interval at which it's supposed to check for changes. Note the refresh_page method, which I did not post; that is just whatever you want to refresh on your page. The page is refreshed when the last line in the file says "finished".
I included the lines variable to check whether the process has already started or not. I have not fully tested the code, but so far it's doing what I want it to do.
First off, there are a number of issues with the way you are starting the process - it really needs to be in a separate process group from the PHP which launches it.
As to checking its status, again, you don't want PHP stuff hanging around for a long time at the end of an http connection. Also getting the webserver to flush content to the browser on demand AND getting the browser to render it progressively is very difficult - and fragile. Even if you get it working on one browser/webserver combination it's unlikely to work in another.
Use Ajax to poll a simple script which reports back on the process status / progress.
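Such a status script can be tiny; a sketch that reuses the $pidfile written by the exec() line above (the path is a placeholder):
<?php // status.php - polled via Ajax; reports whether the R script is still alive
$pidfile = '/path/to/pidfile'; // placeholder: the file written at launch
$pid = (int) trim(file_get_contents($pidfile));
// signal 0 checks for process existence without actually sending a signal
echo posix_kill($pid, 0) ? 'running' : 'finished';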

How to run a php file in background [duplicate]

Possible Duplicate:
Best way to manage long-running php script?
I have to build a big email list. Everything works perfectly, but when I submit the form, the page keeps loading until every email is sent. So I want this email-sending script to run in the background, and to notify the user that the script is running in the background.
I can't use Ajax.
I want something like proc_open, exec, shell_exec...
You can have a cron job which runs a PHP script that takes the queue from the DB and sends the emails.
In the main script you just need to add the emails to the queue.
I would use Ajax only if you need a progress bar. With an Ajax solution you would need to keep the window open until it has finished.
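A sketch of such a worker, run from cron every minute or so; the table and column names and the credentials are assumptions:
<?php // send_queue.php - e.g. crontab entry: * * * * * php /path/to/send_queue.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder credentials
$rows = $pdo->query('SELECT id, address, subject, body FROM email_queue WHERE sent = 0 LIMIT 100');
$mark = $pdo->prepare('UPDATE email_queue SET sent = 1 WHERE id = ?');
foreach ($rows as $row) {
if (mail($row['address'], $row['subject'], $row['body'])) {
$mark->execute(array($row['id'])); // only flag rows actually handed to the MTA
}
}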
You could build an Ajax call that calls the PHP script. This way, your site will still be operational while the request is fulfilled. And when it's finished, your Ajax call will return and you can show a message box to the user.
For more information, check at least this and if you understand what AJAX is and what it does, use it with this
An Ajax request would be the best choice for this. You can send a request using JavaScript and even report progress to the user (which might require some additional work).
If you find Ajax too difficult - run the script in an iframe. This is not the most elegant method, but it is the simplest.
Submit the form with AJAX and update the progress in a Div
For example: write the current state of your script's run ("complete"/"incomplete") to some place "A" (a DB or a file). After starting the script in the background, send your user a waiting page which uses AJAX to watch for changes at "A".
This Ajax script will execute a PHP file in the background. It can also send the response to an HTML element if you want.
<script type="text/javascript" language="javascript">
function execute(filename,var1,var2,var3)
{
var xmlhttp;
if(window.XMLHttpRequest)
{
//Code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp = new XMLHttpRequest();
}
else if(window.ActiveXObject)
{
//Code for IE6, IE5
xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
}
else
{
alert("Your browser does not support AJAX!");
}
var url = filename+"?";
var params = "var1="+encodeURIComponent(var1)+"&var2="+encodeURIComponent(var2)+"&var3="+encodeURIComponent(var3);
xmlhttp.open("POST", url, true);
xmlhttp.onreadystatechange=function()
{
if(xmlhttp.readyState==4)
{
//Below line will fill a DIV with ID 'response'
//with the reply from the server. You can use this to troubleshoot
//document.getElementById('response').innerHTML=xmlhttp.responseText;
}
}
//Send the proper header information along with the request
//(browsers set Content-Length and Connection themselves; setting them from script is refused)
xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
xmlhttp.send(params);
}
</script>
You can also try to run the script through an Ajax function if you don't want to set up a cron script.
PHP has a function that can keep a process running even if the user that requested the page leaves it: ignore_user_abort. If you check the comments there, you can see this example:
<?php
ignore_user_abort(1); // run script in background
set_time_limit(0); // run script forever
$interval=60*15; // do every 15 minutes...
do{
// add the script that has to be ran every 15 minutes here
// ...
sleep($interval); // wait 15 minutes
}while(true);
?>
It IS a pure PHP cron job, BUT the risk with this script is that it continues indefinitely, or at least until you reset/kill PHP.
Changing set_time_limit(0); to set_time_limit(86400); would kill the script after a day.
This should point you in the right direction.
IMPORTANT
After the problem encountered by the OP, it is advisable to only run this script if you have SSH access to the server, so you can kill/restart PHP or Apache in case the server keeps hanging.
Also, do not run the script on a LIVE server.
