I have read many similar questions concerning cancelling a POST request with jQuery, but none seem to be close to mine.
I have your everyday form that has a PHP-page as an action:
<form action="results.php" method="post">
    <input name="my-input" type="text">
    <input type="submit" value="submit">
</form>
Processing results.php on the server side, based on the POST data from the form, takes a long time (30 seconds or more, and we expect this to grow because our search space will increase in the coming weeks). We access a BaseX server (version 7.9, not upgradable) that contains all the data: a user-generated XPath expression is submitted through the form, and the action URL sends it to the BaseX server, which returns the results. From a usability perspective, I already show a "loading" screen so users at least know that the results are being generated:
$("form").submit(function() {
$("#overlay").show();
});
<div id="overlay"><p>Results are being generated</p></div>
However, I would also like to give users the option to press a button to cancel the request, and to cancel it automatically when a user closes the page. Note that in the former case (on button click) the user should stay on the same page, be able to edit their input, and immediately re-submit their request. It is paramount that when they cancel the request they can also immediately resend it: the server should really abort, and not finish the running query before being able to process a new one.
I figured something like this:
$("form").submit(function() {
$("#overlay").show();
});
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
// abort correct request
}
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
But as you can see, I am not entirely sure how to fill in abortRequest to make sure the POST request is aborted and terminated, so that a new query can be sent. Please fill in the blanks! Or would I need to .preventDefault() the form submission and do a $.ajax() call from jQuery instead?
As I said, I also want to stop the process server-side, and from what I read I need exit() for this. But how can I exit another PHP function? For example, let's say that in results.php I have a processing script and I need to exit that script; would I do something like this?
<?php
if (isset($_POST['my-input'])) {
    $input = $_POST['my-input'];
    function processData() {
        // A lot of processing
    }
    processData();
}

if (isset($_POST['terminate'])) {
    function terminateProcess() {
        // exit processData()
    }
}
and then do a new ajax request when I need to terminate the process?
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
    $.ajax({
        url: 'results.php',
        data: { terminate: true },
        type: 'post',
        success: function () { alert("terminated"); }
    });
}
I did some more research and I found this answer. It mentions connection_aborted() and also session_write_close(), and I'm not entirely sure which is useful for me. I do use SESSION variables, but I don't need to save any values when the process is cancelled (though I would like to keep the SESSION variables active).
Would this be the way? And if so, how do I make one PHP function terminate the other?
I have also read up on WebSockets, which seem like something that could work, but I don't like the hassle of setting up a WebSocket server, as this would require me to contact our IT guy, who requires extensive testing of new packages. I'd rather keep it to PHP and JS, without third-party libraries other than jQuery.
Considering most comments and answers suggest that what I want is not possible, I am also interested to hear alternatives. The first thing that comes to mind is paged Ajax calls (similar to the many web pages that serve search results, images, and what-have-you in an infinite scroll): a user is served a page with the first X results (e.g. 20), and when they click a "show next 20 results" button, those are appended. This process can continue until all results are shown. Because it is useful for users to get all results, I will also provide a "download all results" option. That will still take very long, but at least users can go through the first results on the page itself. (The download button should thus not disrupt the paged Ajax loads.) It's just an idea, but I hope it gives some of you some inspiration.
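For what it's worth, here is a minimal sketch of what such paged calls could look like. Note that the start/count parameters, the #results and #show-more elements, and the JSON response shape are all hypothetical; results.php would have to support them:

var start = 0, pageSize = 20;

// Load one page of results and append it to the list.
function loadNextPage() {
    $.post('results.php',
        { 'my-input': $('[name="my-input"]').val(), start: start, count: pageSize },
        function (results) {
            $.each(results, function (i, item) {
                $('#results').append($('<li>').text(item));
            });
            start += pageSize;
            // Hide the button when the server returns a short (i.e. last) page.
            if (results.length < pageSize) { $('#show-more').hide(); }
        }, 'json');
}

$('#show-more').on('click', loadNextPage);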
As I understand it, the key points are:

You cannot cancel a specific request once a form has been submitted. The reason is that on the client side you have nothing with which to identify the state of the request (whether it has been posted, is being processed, etc.). The only way to cancel it is to reset the $_POST variables and/or refresh the page; the connection will then be broken and the previous request will never be completed.

Regarding your alternative solution: when you send another Ajax call with {terminate: true}, results.php can stop processing with a simple die(). But since that is an asynchronous call, you cannot map it to the previous form submission, so in practice this will not work.

Probable solution: submit the form with Ajax. With jQuery's $.ajax() you will have an xhr object which you can abort() upon window unload.
UPDATE (in response to the comment):

A synchronous request is one where your page blocks (all user actions) until the result is ready. Pressing a submit button in a form performs a synchronous call to the server by submitting the form, by definition [https://www.w3.org/TR/html-markup/button.submit.html].

So once the user has pressed the submit button, the browser-to-server connection is synchronous and will not be interrupted until the result is there. While the submission is in progress, no reference to this operation is available to other calls made to the server, as it has not finished. That is why sending a termination call with Ajax will not work.

Thirdly, for your case you can consider the following code example:
HTML:
<form action="results.php">
    <input name="my-input" type="text">
    <input id="resultMaker" type="button" value="submit">
</form>
<div id="overlay">
    <p>Results are being generated</p>
    <button>Cancel</button>
</div>
JQUERY:
<script type="text/javascript">
var jqXhr = null;

$('#resultMaker').on('click', function () {
    $("#overlay").show();
    jqXhr = $.ajax({
        url: 'results.php',
        data: $('form').serialize(),
        type: 'post',
        success: function () {
            $("#overlay").hide();
        }
    });
});

var abortRequest = function () {
    if (jqXhr) {
        jqXhr.abort();
    }
};

$("#overlay button").on('click', abortRequest);
window.addEventListener('unload', abortRequest);
</script>
This is example code; I have just taken your code samples and changed a few things here and there.
Himel Nag Rana demonstrated how to cancel a pending Ajax request.
Several factors may interfere and delay subsequent requests, as I have discussed earlier in another post.
TL;DR: 1. it is very inconvenient to try to detect, from within the long-running task itself, that the request was cancelled, and 2. as a workaround you should close the session (session_write_close()) as early as possible in your long-running task, so as not to block subsequent requests.
connection_aborted() cannot be used here. That function is supposed to be called periodically during a long task (typically, inside a loop). Unfortunately, in your case there is just one single significant, atomic operation: the query to the data back end.
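For point 2, a minimal sketch of the workaround in results.php (processData() is the name from your question; the rest is illustrative):

<?php // results.php -- sketch of the session_write_close() workaround
session_start();
$input = $_POST['my-input'];   // read the request/session values you need first
session_write_close();         // release the session lock as early as possible
// Other requests from the same session (e.g. an abort call) are no
// longer blocked while the long-running task below executes.
processData();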
If you applied the procedures advised by Himel Nag Rana and myself, you should now be able to cancel the Ajax request and immediately allow new requests to proceed. The only remaining concern is that the previous (cancelled) request may keep running in the background for a while (not blocking the user, just wasting resources on the server).
The problem could be rephrased to "how to abort a specific process from the outside".
As Christian Bonato rightfully advised, here is a possible implementation. For the sake of the demonstration I will rely on Symfony's Process component, but you can devise a simpler custom solution if you prefer.
The basic approach is:
Spawn a new process to run the query and save its PID in the session; wait for the process to complete, then return the result to the client
If the client aborts, it signals the server to just kill the process.
<?php // query.php

use Symfony\Component\Process\PhpProcess;

session_start();

if (isset($_SESSION['queryPID'])) {
    // A query is already running for this session
    // As this should never happen, you may want to raise an error instead
    // of just silently killing the previous query.
    posix_kill($_SESSION['queryPID'], SIGKILL);
    unset($_SESSION['queryPID']);
}

$queryString = parseRequest($_POST);

$process = new PhpProcess(sprintf(
    '<?php $result = runQuery(%s); echo fetchResult($result);',
    $queryString
));
$process->start();

$_SESSION['queryPID'] = $process->getPid();
session_write_close();

$process->wait();

$result = $process->getOutput();
echo formatResponse($result);
?>
<?php // abort.php

session_start();

if (isset($_SESSION['queryPID'])) {
    $pid = $_SESSION['queryPID'];
    posix_kill($pid, SIGKILL);
    unset($_SESSION['queryPID']);
    echo "Query $pid has been aborted";
} else {
    // there is nothing to abort, send an HTTP error code
    header($_SERVER['SERVER_PROTOCOL'] . ' 599 No pending query', true, 599);
}
?>
?>
// javascript
function abortRequest(pendingXHRRequest) {
    pendingXHRRequest.abort();
    $.ajax({
        url: 'abort.php',
        success: function () { alert("terminated"); }
    });
}
Spawning a process and keeping track of it is genuinely tricky, which is why I advised using existing modules. Integrating just one Symfony component should be relatively easy via Composer: first install Composer, then the Process component (composer require symfony/process).
A manual implementation could look like this (beware, this is untested, incomplete and possibly unstable, but I trust you will get the idea):
<?php // query.php

session_start();

$queryString = parseRequest($_POST); // $queryString should be escaped via escapeshellarg()
$processHandler = popen("/path/to/php-cli/php asyncQuery.php $queryString", 'r');

// fetch the first line of output, PID expected
$pid = fgets($processHandler);
$_SESSION['queryPID'] = $pid;
session_write_close();

// fetch the rest of the output
while ($line = fgets($processHandler)) {
    echo $line; // or save this line for further processing, e.g. through json_encode()
}
fclose($processHandler);
?>
<?php // asyncQuery.php
// echo the current PID
echo getmypid() . PHP_EOL;
// then execute the query and echo the result
$result = runQuery($argv[1]);
echo fetchResult($result);
?>
With BaseX 8.4, a new RESTXQ annotation %rest:single was introduced, which allows you to cancel a running server-side request: http://docs.basex.org/wiki/RESTXQ#Query_Execution. It should solve at least some of the challenges you described.
The current way to return only chunks of the result is to pass the indexes of the first and last results you want to your query, and to do the filtering in XQuery:
$results[position() = $start to $end]
By returning one more result than requested (i.e. filtering on position() = $start to $end + 1 and displaying only the first $end - $start + 1 items), the client will know that there are more results. This can be helpful, because computing the total result size is often much more expensive than returning only the first results.
I hope I understood this correctly.
Instead of letting the browser "natively" submit the FORM, don't: write JS code that does this instead. In other words (I didn't test this, so interpret it as pseudo-code):
<form action="results.php" onsubmit="return false;">
    <input name="my-input" type="text">
    <input type="submit" value="submit">
</form>
So now, when that "submit" button is clicked, nothing will happen.
Obviously, you want your form POSTed, so write JS that attaches a click handler to that submit button, collects the values from all input fields in the form (it is actually NOT nearly as scary as it sounds; check out the first link below), and sends them to the server while saving a reference to the request (check the second link), so that you can abort it (and maybe signal the server to quit as well) when the cancel button is clicked. Alternatively, you can simply abandon it by not caring about the results. A minimal sketch follows after the links.
Submit a form using jQuery
Abort Ajax requests using jQuery
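Here is that sketch (untested; showResults() and the #cancel-button selector are placeholders for your own code):

var currentRequest = null;

$('form input[type="submit"]').on('click', function () {
    currentRequest = $.ajax({
        url: 'results.php',
        type: 'post',
        data: $('form').serialize(),   // collect the values of all input fields
        success: function (data) { showResults(data); }
    });
});

$('#cancel-button').on('click', function () {
    if (currentRequest) { currentRequest.abort(); }
});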
Alternatively, to make the HTML markup "clearer" relative to its functionality, consider not using a FORM tag at all; otherwise, what I suggested makes its usage confusing (why is it there if it isn't used, you know what I mean?). But don't get distracted by this suggestion until you make it work the way you want; it's optional and a topic for another day (it might even relate to changing the architecture of the whole site).
HOWEVER, here is a thing to think about: what do you do if the form post has already reached the server, and the server has already started processing it and has made some "world" changes? Maybe your get-results routine doesn't change data, and then that's fine. But this approach probably cannot be used for data-changing POSTs with the expectation that the "world" won't change when the cancel button is clicked.
I hope that helps :)
The user doesn't have to experience this synchronously.
Client posts a request
The server receives the client request and assigns an ID to it
The server "kicks off" the search and responds with a zero-data page and search ID
The client receives the "placeholder" page and starts checking if the results are ready based on the ID (with something like polling or websockets)
Once the search has completed, the server responds with the results next time it's polled (or notifies the client directly when using websockets)
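A rough sketch of this flow with plain polling (all endpoint and field names here are made up for illustration):

// Steps 1-3: submit the search; the server replies immediately with a search ID.
$.post('start_search.php', $('form').serialize(), function (data) {
    pollResults(data.searchId);
}, 'json');

// Steps 4-5: poll until the server reports that the results are ready.
function pollResults(searchId) {
    $.getJSON('check_search.php', { id: searchId }, function (data) {
        if (data.ready) {
            showResults(data.results);
        } else {
            setTimeout(function () { pollResults(searchId); }, 2000); // try again in 2s
        }
    });
}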
This is fine when performance isn't quite the bottleneck and the nature of processing makes longer wait times acceptable. Think flight search aggregators that routinely run for 30-90 seconds, or report generators that have to be scheduled and run for even longer!
You can make the experience less frustrating if you don't block user interactions, keep them updated of search progress and start showing results as they come in if possible.
You must solve this conceptually first before writing any code. Here are some things that come to mind offhand:
What does it mean to free up resources on the server?
What constitutes a graceful abort that will free up resources?
Is it enough to kill the PHP process waiting for the query result(s)? If so, the route suggested by RandomSeed could be interesting. Just keep in mind that it will only work on a single server. If you have multiple load balanced servers you won't have a way to kill a process on another server (not as easily at least).
Or do you need to cancel the database request from the database itself? In that case the answer suggested by Christian Grün is of more interest.
Or is it that there is no graceful shutdown and you have to force everything to die? If so, this seems awfully hacky.
Not all clients are going to explicitly abort
Some clients will close the browser without their last request coming through; some clients will lose their internet connection and leave the service hanging, etc. You are not guaranteed to get an "abort" request when a client disconnects or goes away.
You have to decide whether to live with potentially unwanted behavior, or to implement additional active state tracking, e.g. the client pinging the server for keepalive, as sketched below.
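For instance, a client-side keepalive could be as simple as this sketch (keepalive.php and currentSearchId are hypothetical); the server would then abort any query whose client has not pinged recently:

// Ping the server every 15 seconds while the page is open. If the pings
// stop arriving, the server can consider this client gone and kill its query.
setInterval(function () {
    $.post('keepalive.php', { searchId: currentSearchId });
}, 15000);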
Side notes
A query time of 30 seconds or more is potentially long; is there a better tool for the job, so that you won't have to solve this with a hack like this?
You are looking for features of a concurrent system while not using a concurrent system; if you want concurrency, use a better tool/environment for it, e.g. Erlang.
The Apache server I am using to develop my system will not respond to requests while the script that controls the polling of messages is running. This only happens at the domain level, meaning that I can send an HTTP request to any other app hosted locally and get a response. When I do eventually get a response, it arrives about a minute later.
Here is the JS:
window.fetch_messages = function ()
{
    var last_message = $("div.message:last").attr('data-ai_id');
    var last_message_status = $("p.message_status:last").text();
    var project_id = getParameterByName('project-id');

    $.ajax({
        url: '/project_messages',
        type: 'POST',
        data: { project_id: project_id, latest_message: last_message, status: last_message_status },
        timeout: 50000,
        async: true,
        success: new_messages, // this, upon completion, also resends the request
        error: function (data) { console.log(data); setTimeout(fetch_messages, 50000); }
    });
}; // On the page that uses this, I call this function to start polling
Here is the server-side code:
do
{
    // Check for status change
    $status_change = $this->mentor_model->query_status($this->project_id, $this->last_message_id, $this->last_message_status, $_SESSION['user']);

    // Check for new messages
    $messages = $this->mentor_model->query_messages($this->project_id, $this->last_message_id);

    // If there is a status update or new message, stop waiting.
    if ($messages || $status_change)
        break;

    usleep(1000000); // sleep 1 second before checking again
}
while (empty($messages) && empty($status_change));

echo json_encode(array("messages" => $messages, "status" => $status_change));
exit;
While this action is running, the server takes a long time to handle any request, whether it is a GET, POST or another Ajax request. I have also tried changing both code sets, to no avail; as long as it is long polling, the server takes a long time to respond.
Do I have this wrong, or is there some Apache setting I'm supposed to change? I am using XAMPP on Windows 8.1 and have also tried WAMP, with no change.
Thanks to Steven for this. The answer is taken straight from the PHP manual page for session_write_close():
You can have interesting fun debugging anything with sleep() in it if
you have a session still active. For example, a page that makes an
ajax request, where the ajax request polls a server-side event (and
may not return immediately).
If the ajax function doesn't do session_write_close(), then your outer
page will appear to hang, and opening other pages in new tabs will
also stall.
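Applied to the long-polling controller above, that means something like this sketch (copying the needed session values into locals first is an assumption on my part):

<?php // sketch: release the session lock before entering the polling loop
session_start();
$user = $_SESSION['user'];   // copy what the loop needs out of the session first
session_write_close();       // other requests from this session are no longer blocked

// ... run the do/while polling loop from above, using $user instead of $_SESSION['user'] ...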
I have constructed a PHP file which scrapes a web page (using cURL) to obtain some data, and outputs it to the screen in JSON format.
The target website involves some redirects which temporarily outputs data to my PHP file. Once the redirects have completed successfully, the JSON is presented as expected. The problem that I am encountering is that when I try to access the JSON using jQuery's $.ajax() method, it sometimes returns the incorrect data, because it isn't waiting for the redirects to complete.
My question is if it's possible to tell the AJAX request to wait a certain number of seconds before returning the data, thus allowing time for the redirects in the PHP script to execute successfully?
Please note that there is no cleaner solution for the page scrape, the redirects are essential and have to be outputted to the screen for the scraping to complete.
There's always timeout in the settings.
jQuery docs:
timeout (Number)
Set a timeout (in milliseconds) for the request. This will override any global timeout set with $.ajaxSetup(). The timeout period starts at the point the $.ajax call is made; if several other requests are in progress and the browser has no connections available, it is possible for a request to time out before it can be sent. In jQuery 1.4.x and below, the XMLHttpRequest object will be in an invalid state if the request times out; accessing any object members may throw an exception. In Firefox 3.0+ only, script and JSONP requests cannot be cancelled by a timeout; the script will run even if it arrives after the timeout period.
You should use promise() in jQuery.
You could always store the result of your Ajax call and then wait for the redirects to finish, i.e.:
$.ajax({
    url: 'scrape.php', // hypothetical endpoint: your scraping script
    success: function (data)
    {
        // wait 5 seconds before handing the response on
        var wait = setTimeout(function () { doSomethingWithData(data); }, 5000); // 5 sec
    }
});
Alternatively, you could set up an interval that checks every x milliseconds whether something has happened (the redirects finished). I'm assuming your redirects let you know when they have completed?
http://examples.hmp.is.it/ajaxProgressUpdater/
$i = 0;
while (true)
{
    if (self::$driver->executeScript("return $.active == 0")) {
        break;
    }
    if ($i == 20) {
        break;
    }
    $i++;
    echo $i;
    usleep(10000);
}
I have a bunch of AJAX requests, the first is a
$.post("go.php", { url: pastedURL, key: offskey, mo: 'inward', id: exID});
This request's go.php includes a system command which, despite nohup and & at the end, doesn't stop whirling in Firebug, i.e. it keeps waiting for a response.
Unfortunately, the next AJAX request
$.post("requestor.php", { request: 'getProgress', key: offskey}, function(data) { console.log(data.response); }, "json");
Doesn't run; it whirls around in Firebug (I'm guessing until go.php has finished) and eventually overloads everything (it is on a timer that checks every few seconds).
So I guess the question is: is there an Ajax method which simply throws data and walks away instead of waiting for a response, or some way I can perform another request while the other is waiting?
Hope someone knows what I mean.
Check for async: false as a parameter of the $.ajax() method in jQuery.
You need it to be set to async: true.
It turns out the PHP script on the other end needs the following appended to the shell command so that it does not keep waiting:
"> /dev/null 2>/dev/null &"
source
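In other words, something along these lines in go.php (the command itself is a placeholder):

<?php // sketch: fire-and-forget a shell command from go.php
// Redirecting stdout and stderr and backgrounding with & lets PHP return
// immediately instead of waiting for the command to finish.
exec('nohup /path/to/long_task > /dev/null 2>/dev/null &');
echo 'started';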
There are a limited number of connections used by the browser. The number varies by browser; for older ones like IE7 it can be as few as 2 connections per server. Once you fill them with pending requests, you have to wait for those to complete before you can make any additional requests to that server.
(this is on a timer to check every few seconds)
It sounds like you may be making additional requestor.php requests using setTimeout or setInterval before the previous ones have finished? If so, don't do that. Use a full $.ajax (perhaps with a timeout as well) and make the next request only from the success/error callback, as in the sketch below.
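Something like this sketch, reusing the variables from the question:

function pollProgress() {
    $.ajax({
        url: 'requestor.php',
        type: 'post',
        data: { request: 'getProgress', key: offskey },
        dataType: 'json',
        timeout: 10000, // give up on an individual poll after 10s
        success: function (data) { console.log(data.response); scheduleNext(); },
        error: scheduleNext // on timeout or error, just try again later
    });
}

function scheduleNext() { setTimeout(pollProgress, 3000); } // next poll starts only after the previous finished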
I have a file which processes many other files and may take up to 30 minutes to run. I make an Ajax request to this file from the front end. The file writes the percentage of completion to another, temporary file, and when it finishes it outputs a success message as its own XML output (not to the tmp file).
The problem I am having is that when the processing time is short, say up to 3 minutes, the Ajax request (made through jQuery) stays alive. But when the processing takes longer (above 4 minutes), a timeout occurs and the Ajax connection is cut. How do I prevent this and make it stay alive until the browser is closed?
You won't be able to do that, unless it is a comet server that can keep the connection alive on the server side and push out the contents whenever the data is updated.
In your case, the only way I can think of is doing this:
function ajax_call() {
    $.ajax({
        url: 'get_file_processing_output.html',
        success: function (response) {
            // Check your response: if file processing is not finished,
            // call ajax_call() again; if it is finished, do whatever you
            // need. (response.finished and handleResult() are placeholders.)
            if (response.finished) {
                handleResult(response);
            } else {
                setTimeout(ajax_call, 5000);
            }
        },
        error: function (jqXHR, textStatus) {
            // Note: in $.ajax, timeout is a number, and a timed-out request
            // lands in the error callback; poll again, ideally after a delay.
            if (textStatus === 'timeout') {
                setTimeout(ajax_call, 5000);
            }
        }
    });
}
I have a success callback in the Ajax call above because I feel your server side should respond with something that tells the client the processing is not yet done.