I have noticed a strange phenomenon in my LAMP environment.
Over the frontend I execute an AJAX post request with jQuery like this:
$.post('save.php', {data1: d1, data2: d2, [...], dataN: dN})
The variables d1 to dN are collected from the website (e.g. from text inputs, textareas, checkboxes, etc.) with jQuery beforehand.
The file save.php takes the post parameters data1 to dataN and saves them in the database in one query.
The request takes about 500ms and works without problems unless I change pages (e.g. by clicking a link) during the request.
Normally, I would expect the request to be aborted and ignored (which would be fine), but (and this is the strange behaviour) the request seems to be completed, only with just part of the data transmitted and thus saved.
That means for example, that the php script saves only data1 to data5 and sets data6 to dataN to empty.
The problem seems to be caused by the AJAX request itself (not the PHP script), since the fields $_POST['data6'] to $_POST['dataN'] are not set in PHP in this scenario.
So my questions:
Why does this happen (is this expected behaviour)?
How can I avoid it?
Update
The problem lies neither in jQuery nor in PHP alone. jQuery collects the values correctly and tries to post them to PHP. I just validated it - it works.
The php script on the other hand handles everything it gets as expected - it just does not receive the whole request.
So the problem must be the interrupted request itself. Contrary to what I'd expect, it does not abort or fail; it still transmits all the data up to the cut-off.
Then php gets this post data and starts handling it - obviously missing some information.
Update 2
I fixed the problem by adding a parameter eof after dataN and checking in PHP whether it was set. This way I can be sure the whole request was transmitted.
Nevertheless, this does not fix the source of the problem, which I still don't understand.
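For reference, here is roughly what my check looks like on the PHP side (just a sketch; eof is the trailing parameter described above):
<?php
// save.php -- if the trailing eof parameter is missing, the POST body
// was cut off somewhere before the end: refuse to save anything.
if (!isset($_POST['eof'])) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', true, 400);
    die();
}
// ... safe to read data1 .. dataN and run the database query ...
?>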
Any help anyone?
Try the following actions to debug the problem:
Check post_max_size in your PHP settings and compare it with the size of the data you are posting (see the sketch after this list).
Use an HTTP request builder, e.g. Fiddler, to make the HTTP request manually and check what it returns.
Use print_r($_POST); at the top of save.php to check what you are actually receiving.
Use a tool like Firebug to check what jQuery has posted.
You should also verify the JSON object you are posting on the client side, e.g. with JSON.stringify(some_object);
Try posting some basic sample data { "data1":1, "data2":2, "data3":3, "data4":4, "data5":5 , "data6":6 }
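For the first check, something along these lines should work (a sketch; the suffix parsing assumes the usual K/M/G shorthand used in php.ini):
<?php
// Compare the declared request size with post_max_size. If the body
// exceeds the limit, PHP silently leaves $_POST empty.
$limit = ini_get('post_max_size');        // e.g. "8M"
$bytes = (int) $limit;
switch (strtoupper(substr($limit, -1))) {
    case 'G': $bytes *= 1024;             // fall through
    case 'M': $bytes *= 1024;             // fall through
    case 'K': $bytes *= 1024;
}
if (isset($_SERVER['CONTENT_LENGTH']) && (int) $_SERVER['CONTENT_LENGTH'] > $bytes) {
    error_log('POST body larger than post_max_size: ' . $_SERVER['CONTENT_LENGTH'] . ' > ' . $bytes);
}
?>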
Most probably you are sending too much data, or the data is invalid!
Edits:
A rather foolish act, but let's say you posted count as well; then you can directly check isset($_POST['data' . $_POST['count']]).
I think we can rule out problems on the server side (unless it's some exotic or self-crafted server daemon), because nobody ever sends "end-of-data" parameters with an HTTP POST request to make sure all data is really sent. This is handled by HTTP itself (see e.g. Detect end of HTTP request body). Moreover, I don't think that you have to check the Content-Length header when POSTing data to your server, simply because of the fact that nobody does this, ever. At least not in totally common circumstances like you describe them (sending an Ajax POST through jQuery).
So I suppose that jQuery sends a syntactically correct POST, but it's cut off. My guess is that if you interrupt this data collecting by navigating to another page, jQuery builds an Ajax request out of the data which it was able to gather and sends a syntactically correct POST to your server, but with cut off data.
Since you're using Firebug, please go to its net tab and activate persist, so traffic data is not lost when navigating to another page. Then trigger your Ajax POST, navigate to another page (and thereby "interrupt" the Ajax call) and check in Firebug's net tab what data has actually been sent to the server by opening ALL the POST requests and checking the Headers tab (and inside this, the Request Headers tab).
My guess is that one of two things might happen:
You will see that the data sent to the server is cut off already in the headers being presented to you in Firebug's net tab and the Content-Length is calculated correctly according to the actual (cut off) length of the POST data. Otherwise, I'm sure the server would reject the request as Bad Request as a whole.
You will see that there are multiple POST requests, some of them (perhaps with the full, non-cut-off data) actually interrupted and therefore never reaching the server, but at least one POST request (again, with the cut-off data) that is triggered by some other mechanism in your Javascript, i.e. not by the trigger you thought; by navigating to another page, more and other Ajax requests might be triggered (just a guess since I don't know your source code).
In either case, I think you'll find out that this problem is client-related and the server just processes the (incomplete, but (in terms of HTTP) syntactically valid) data the client sent to it.
From that point on, you could debug your Javascript and implement some mechanism that prevents sending incomplete data to your server. Again, it's hard to tell what to do exactly since I don't know the rest of your source code, but maybe there's some heavy action going on in collecting the data, and you could possibly make sure that the POST only happens if all the data is really collected. Or, perhaps you could prevent navigation until the Ajax request is completed or such things.
What might be interesting, if all of this doesn't make sense, would be to have a look at more of your source code, especially how the Ajax POST is triggered and if there are any other events and such if you navigate to another page. Sample data you're sending could also be interesting.
EDIT: I'd also like to point out that outputting data with console.log() might be misleading, since it's in no way guaranteed that this is the data actually being sent, it's just a logline which evaluates to the given output at the exact time when console.log() is called. That's why I suggested sniffing the network traffic, because then (and only then) you can be sure what is really being sent (and received).
Nonetheless, this is a little tricky if you're not used to it (and impossible if you use encrypted traffic e.g. by using HTTPS), so the Firebug net tab might be a good compromise.
You can verify the value of the Content-Length header received by PHP.
This value ought to have been calculated client-side when building the POST request. If it does not match, that's your error then and there. And that's all the diagnostics you need - if the Content-Length does not match the POST data, reject the POST as invalid; no need for extra parameters (computing the POST data length might be a hassle, though). Also, you might want to investigate why PHP, while decoding the POST and therefore being able to verify its length, nonetheless seems to accept a wrong length (maybe the information needed to detect the error is somewhere among the $_SERVER variables?).
If it does match though, and still data isn't arriving (i.e., the Content-Length is smaller, and correctly describes the cut-off POST), then it is proof that the Content-Length was calculated after the cut-off, and therefore either the error is in the browser (or, unlikely, in jQuery), or there is something between the browser and the server (a proxy?) that is receiving an incomplete query (with Content-Length > actual length) and is incorrectly rewriting it, making it appear "correct" to the server, instead of rejecting it out of hand.
Some testing of both the theory and the workaround
Executive summary: I got the former wrong, but the latter apparently right. See code below for a sample that works on my test system (Linux OpenSuSE 12.3, Apache).
I believed that a request with wrong Content-Length would be refused with a 400 Bad Request. I was wrong. It seems that at least my Apache is much more lenient.
I used this simple PHP code to access the key variables of interest to me
<?php
$f = file_get_contents("php://input");
print $_SERVER['CONTENT_LENGTH'];
print "\nLen: " . strlen($f) . "\n";
?>
and then I prepared a request with a wrong Content-Length sending it out using nc:
POST /p.php HTTP/1.0
Host: localhost
Content-Length: 666
answer=42
...and lo and behold, nc localhost 80 < request yields no 400 error:
HTTP/1.1 200 OK
Date: Fri, 14 Jun 2013 20:56:07 GMT
Server: Apache/2.2.22 (Linux/SUSE)
X-Powered-By: PHP/5.3.17
Vary: Accept-Encoding
Content-Length: 12
Content-Type: text/html
666
Len: 10
It occurred to me then that the content length might well be off by one or two in case the request ended with a line break - and which kind of line break: LF? CRLF? However, when I added simple HTML to be able to POST it from a browser
<form method="post" action="?"><input type="text" name="key" /><input type="submit" value="go" /></form>
I was able to verify that in Firefox (latest), IE8, Chrome (latest), all running on XP Pro SP3, the value of the Content-Length is the same as the strlen of php://input.
Except when the request is cut off, that is.
The only problem is that php://input is not always available even for POST data.
This leaves us still in a quandary:
IF THE ERROR IS AT THE NETWORK LEVEL, i.e., the POST is prepared and supplied with a correct Content-Length, but the interruption makes it so that the whole data is cut off as this comment by Horen seems to indicate:
So only the first couple of post parameters arrived, sometimes the value of one parameter was even interrupted in the middle
then really checking Content-Length will prevent PHP from handling an incomplete request:
<?php
if ('POST' == $_SERVER['REQUEST_METHOD'])
{
    if (!isset($_SERVER['CONTENT_LENGTH']))
    {
        header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
        die();
    }
    if (strlen(file_get_contents('php://input')) != (int)($_SERVER['CONTENT_LENGTH']))
    {
        header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
        die();
    }
}
// ... go on
?>
ON THE OTHER HAND if the problem is in jQuery, i.e. somehow the interruption prevents jQuery from assembling the full POST, and yet the POST is made up, the Content-Length calculated from the incomplete data, and the packet sent off -- then my workaround can't possibly work, and the "telltale" extra field must be used, or... perhaps the .post function in jQuery might be extended to include a CRC field?
A POST request looks just like a GET request on the wire:
Header1: somedata1\r\n
Header2: somedata2\r\n
...
HeaderN: somedataN\r\n
\r\n
data1=1&data2=2&...&dataN=N
When the request is aborted, in some cases the last line may be transmitted only partially. So here are some possible solutions:
Compare Content-Length and strlen($HTTP_RAW_POST_DATA) (see the sketch below)
Validate input data
Pass less data at one time
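A compact sketch of the first option; since $HTTP_RAW_POST_DATA is deprecated, this reads php://input instead:
<?php
// Reject the request if the received body is shorter (or longer) than
// the declared Content-Length.
$raw = file_get_contents('php://input');
if (!isset($_SERVER['CONTENT_LENGTH'])
    || strlen($raw) != (int) $_SERVER['CONTENT_LENGTH']) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', true, 400);
    die();
}
?>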
I have tried to recreate this problem using triggers, manually, changing server settings, doing my very best to #$%& things up, using different data sizes, but I never ever got only half a request in PHP. Simply because Apache will not invoke PHP until the request is completely done. See this question about Reading "chunked" POST data in PHP.
So the only thing that can go wrong is that jQuery only gathers part of the data and then makes a POST request. Using just $.post('save.php', data) as you mentioned, that won't happen. It's either still gathering the data or waiting for a response from the server.
If you switch sites during the gathering, there won't be a request. And if you switch after the request has been made and you move away quickly, before all data has been transmitted, the Apache server will see it as half a request and won't invoke PHP.
Some suggestions:
Is it possible that you are using separate pages for successful and partial requests? PHP only adds the first 1000 elements to $_POST, so perhaps the failed requests have more than 1000 dataN elements - in which case there won't be an eof param.
Is it possible that you are gathering the data in a global variable in Javascript and have an onbeforeunload handler that sends data as well? Then there might only be half the data in the POST.
Can you share some information on the data you are sending? Are there a lot of small elements (like data1 through data10000) or a few large ones?
Is it always the same element that you receive last, like always data6 as you mention? If it is, the chances of a failed attempt cutting off at exactly the same dataN field every time would be very slim.
My problem was that there were too many variables in one of my POST objects.
PHP has a max_input_vars variable which is set to 1000 by default.
I added this line to my .htaccess file (since I don't have access to the php.ini file):
php_value max_input_vars 5000
Problem solved!
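If you want to detect the truncation rather than just raise the limit, a sketch like this could work (assuming that reaching the limit exactly means variables were dropped):
<?php
// If the number of received input variables has reached max_input_vars,
// the request was most likely truncated.
$limit = (int) ini_get('max_input_vars');
if ($limit > 0 && count($_POST, COUNT_RECURSIVE) >= $limit) {
    error_log("POST probably truncated: max_input_vars ($limit) reached");
}
?>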
Can you check your host log at
/var/log/messages
Last time I had "missing" POST variables in PHP, I found that I was sending ASCII NUL chars and the server (CentOS) was considering it an attack, dropping those specific variables... Took me a week to figure it out! This was the server log entry:
suhosin[1173]: ALERT - ASCII-NUL chars not allowed within request variables - dropped variable 'data3' (attacker '192.168.0.37', file '/var/www/upload_reader.php')
If that is your problem, try to compress your variables with JS and encode them with base64. Post them with Ajax, then receive them in PHP, base64-decode and uncompress! That solved it for me ;)
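The receiving side could look roughly like this (a sketch; the field name payload and the use of zlib compression on the client are assumptions, not part of my original setup):
<?php
// 'payload' is a hypothetical field holding the base64-encoded,
// zlib-compressed data from the client; encoding removes any raw NUL
// bytes from the request, so suhosin has nothing to drop.
$raw = isset($_POST['payload']) ? base64_decode($_POST['payload']) : false;
if ($raw !== false) {
    $data = gzuncompress($raw); // assumes the client used zlib deflate
}
?>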
Solved the problem by increasing max_input_vars in my server's php.ini file
Since I had more than 1000 variables in the array, only part of them was received by the server!
Hope this helps someone!
In order to find out more about what is happening, why not code your Ajax request properly using jQuery's ajax function, and use all the callback functions to track what happened to your call and what came back? The type option is set to POST and the data option carries whatever object structure { ... } you like.
$.ajax({
url : "save.php",
type : "POST",
data : {
"ajax_call" : "SOME_CUSTOM_AJAX_REQUEST_REFERENCE",
"data1" : data1,
"data2" : data2,
"data2" : data2,
"dataN" : dataN
},
//dataType : "html", contentType: "text/html; charset=utf-8",
dataType : "json", contentType: "application/json; charset=utf-8",
beforeSend: function () {
//alert('before send...');
},
dataFilter: function () {
//alert('data filter...');
},
success: function(data, textStatus, jqXHR) {
//alert('success...');
var response = data; // already parsed, since dataType is "json"
if (undefined == response.data) {
my_error_function();
return;
}
my_response_function(response.data);
},
error: function(jqXHR, textStatus, errorThrown) {
//alert('error...');
},
complete: function (xhr, status) {
//alert('end of call...');
my_continuation_function();
}
});
Before sending the request, set a beforeunload handler on the window (to prohibit the transition to another page), and unbind it after success.
For example:
$(window).on('beforeunload', function(e){
// Returning a string asks the browser to show a confirmation dialog,
// so the user can wait a bit until the request completes
return 'Please wait until the request has completed.';
});
A guess: the suhosin extension on your Ubuntu/Debian server causes the fields to be cut off?
I have the same problem. POST size is about 650KB, and gets corrupted if closing the browser window or refreshing the page in case of Ajax, before the post is completed. ajax.success() is not fired, but partial data is posted to "save.php" with 200 OK status. $_SERVER's Content-Length is of no use as suggested elsewhere, since it matches the actual content length of the partial data.
I figured out 2 ways of overcoming this:
as you propose, append a post variable at the end. Seems to work, although it seems a bit risky.
make your "save.php" script save to a temporary db column, and then use the ajax.success() to call another php script, say savefinal.php, without passing any data, which transfers data from the temp column to the final column (or just flags this data as valid). That way if post is interrupted data will only reside on the temp column on the database (or will not be flagged as valid).
The .success() is not called if post is interrupted, so this should work.
I presume this is a jQuery bug sending a wrong (very small) Content-Length to Apache, and Apache is forced to assume that the POST request has completed, but I'm not really sure.
Why does this happen (is this expected behaviour)?
I was also looking for the cause of this strange behaviour where some variables from the POST were missing, and came across this question, where very similar behaviour is explained by the slow connection of a web client sending a POST request with multipart/form-data content.
Here is the relevant part of that question:
I am facing a problem when a remote web client with slow connection
fails to send complete POST request with multipart/form-data content
but PHP still uses partially received data to populate $_POST array.
As a result one value in $_POST array can be incomplete and more
values can be missing.
See How to check for incomplete POST request in PHP
How can I avoid it?
There you can find the recommended solution, as well. Nevertheless, the same solution was already proposed by you.
You can add field <input type="hidden" name="complete"> (for example)
as the last parameter. in PHP check firstly whether this parameter was
sent from client. if this parameter sent - you can be sure that you
got the entire data.
Related
I'm pretty inexperienced with jQuery. My code is
function edit(uID){
var php = "body/weekly_deals.php";
var data = {"edit" : "post is here"}
$.post(php, data,function(response){
console.log(response);
});
}
This is defined in WeeklyDeal.php. Now, in my body/weekly_deals.php page, I var_dump($_POST['edit']) and I'm getting NULL; however, in the console.log I'm seeing the "post is here" string. So I'm confused. How can it be there, but not be there at the same time?
I suspect you are misunderstanding how this works.
I var_dump($_POST['edit']) and I'm getting NULL
The way you phrased your question, it sounds like you are not seeing that null in your console.log(), which is what would happen if you had the var_dump and called it with ajax. Instead, it sounds like you are loading weekly_deals.php directly in the browser and, since that is a GET request, not a POST request, and no parameters are being passed, it comes back empty.
however in the console.log, I'm seeing the value "post is here" string
Right, because javascript is making an HTTP request using the POST method and passing a parameter.
I think you may be confused about how an HTTP request works. To break it down, you have a resource which comes as a URI -- you know this as the web address. You can ask it questions in a few different ways -- GET, POST, PUT, etc. A web browser, when navigating to a page, issues a GET request to the resource. The web server returns the response and renders it.
When you make an AJAX request, you are doing something very similar as far as the request life cycle is concerned. When you make the request, the server renders the response and sends it back. That is why your console.log() has what you expect to see -- because AJAX made the request the server side expected to get. When you navigate to the page directly in the browser, it is the wrong type of request and thus you see the wrong response.
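To see the difference yourself, a small guard at the top of body/weekly_deals.php makes it obvious when the file is opened directly (a sketch, not part of the original code):
<?php
// Direct navigation issues a GET request, so $_POST is empty; only the
// jQuery $.post() call will get past this guard.
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    exit('Open this endpoint via the AJAX POST, not directly in the browser.');
}
var_dump($_POST['edit']); // "post is here" when called through $.post()
?>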
I have read many similar questions concerning cancelling a POST request with jQuery, but none seem to be close to mine.
I have your everyday form that has a PHP-page as an action:
<form action="results.php">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
Processing results.php on the server-side, based on the post information given in the form, takes a long time (30 seconds or even more and we expect an increase because our search space will increase as well in the coming weeks). We are accessing a Basex server (version 7.9, not upgradable) that contains all the data. A user-generated XPath code is submitted in a form, and the action url then sends the XPath code to the Basex server which returns the results. From a usability perspective, I already show a "loading" screen so users at least know that the results are being generated:
$("form").submit(function() {
$("#overlay").show();
});
<div id="overlay"><p>Results are being generated</p></div>
However, I would also want to give users the option to press a button to cancel the request and cancel the request when a user closes the page. Note that in the former case (on button click) this also means that the user should stay on the same page, can edit their input, and immediately re-submit their request. It is paramount that when they cancel the request, they can also immediately resend it: the server should really abort, and not finish the query before being able to process a new query.
I figured something like this:
$("form").submit(function() {
$("#overlay").show();
});
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
// abort correct request
}
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
But as you can see, I am not entirely sure how to fill in abortRequest to make sure the post request is aborted, and terminated, so that a new query can be sent. Please fill in the blanks! Or would I need to .preventDefault() the form submission and instead do an ajax() call from jQuery?
As I said I also want to stop the process server-side, and from what I read I need exit() for this. But how can I exit another PHP function? For example, let's say that in results.php I have a processing script and I need to exit that script, would I do something like this?
<?php
if (isset($_POST['my-input'])) {
$input = $_POST['my-input'];
function processData() {
// A lot of processing
}
processData();
}
if (isset($_POST['terminate'])) {
function terminateProcess() {
// exit processData()
}
}
and then do a new ajax request when I need to terminate the process?
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
$.ajax({
url: 'results.php',
data: {terminate: true},
type: 'post',
success: function() { alert("terminated"); }
});
}
I did some more research and I found this answer. It mentions connection_aborted() and also session_write_close() and I'm not entirely sure which is useful for me. I do use SESSION variables, but I don't need to write away values when the process is cancelled (though I would like to keep the SESSION variables active).
Would this be the way? And if so, how do I make one PHP function terminate the other?
I have also read into Websockets and it seems something that could work, but I don't like the hassle of setting up a Websocket server as this would require me to contact our IT guy who requires extensive testing on new packages. I'd rather keep it to PHP and JS, without third party libraries other than jQuery.
Considering most comments and answers suggest that what I want is not possible, I am also interested to hear alternatives. The first thing that comes to mind is paged Ajax calls (similar to many web pages that serve search results, images, what-have-you in an infinite scroll). A user is served a page with the X first results (e.g. 20), and when they click a button "show next 20 results" those are shown are appended. This process can continue until all results are shown. Because it is useful for users to get all results, I will also provide a "download all results" option. This will then take very long as well, but at least users should be able to go through the first results on the page itself. (The download button should thus not disrupt the Ajax paged loads.) It's just an idea, but I hope it gives some of you some inspiration.
To my understanding, the key points are:
You cannot cancel a specific request once a form is submitted. The reason is that on the client side you have nothing with which to identify the state of the form request (whether it has been posted, whether it is processing, etc.). So the only way to cancel it is to reset the $_POST variables and/or refresh the page; the connection will then be broken and the previous request will not be completed.
Regarding your alternative solution: when you send another Ajax call with {terminate: true}, results.php can stop processing with a simple die(). But as it will be an async call, you cannot map it to the previous form submit, so this will not work in practice.
Probable solution: submit the form with Ajax. With jQuery ajax you will have an xhr object which you can abort() upon window unload.
UPDATE (upon the comment):
A synchronous request is one where your page blocks (all user actions) until the result is ready. Pressing a submit button in a form performs a synchronous call to the server by submitting the form, by definition [https://www.w3.org/TR/html-markup/button.submit.html].
Now, once the user has pressed the submit button, the connection from browser to server is synchronous, so it will not be interrupted until the result is there. When other calls to the server are made while the submit process is going on, no reference to this operation is available to them, as it is not finished. That is the reason why sending a termination call with Ajax will not work.
Thirdly: for your case you can consider the following code example:
HTML:
<form action="results.php">
<input name="my-input" type="text">
<input id="resultMaker" type="button" value="submit">
</form>
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
JQUERY:
<script type="text/javascript">
var jqXhr = '';
$('#resultMaker').on('click', function(){
$("#overlay").show();
jqXhr = $.ajax({
url: 'results.php',
data: $('form').serialize(),
type: 'post',
success: function() {
$("#overlay").hide();
}
});
});
var abortRequest = function(){
if (jqXhr != '') {
jqXhr.abort();
}
};
$("#overlay button").on('click', abortRequest);
window.addEventListener('unload', abortRequest);
</script>
This is example code - I have just used your code examples and changed something here and there.
Himel Nag Rana demonstrated how to cancel a pending Ajax request.
Several factors may interfere and delay subsequent requests, as I have discussed earlier in another post.
TL;DR: 1. it is very inconvenient to try to detect the request was cancelled from within the long-running task itself and 2. as a workaround you should close the session (session_write_close()) as early as possible in your long-running task so as to not block subsequent requests.
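A minimal sketch of that TL;DR, assuming the long-running part only needs to read the session:
<?php
// Release the session lock before the long-running work, so subsequent
// requests from the same session (e.g. the abort call) are not blocked.
session_start();
$userId = isset($_SESSION['userId']) ? $_SESSION['userId'] : null; // read what you need first ('userId' is hypothetical)
session_write_close(); // from here on, parallel requests can proceed
// ... long-running query goes here ...
?>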
connection_aborted() cannot be used. This function is supposed to be called periodically during a long task (typically, inside a loop). Unfortunately there is just one single significant, atomic operation in your case: the query to the data back end.
If you applied the procedures advised by Himel Nag Rana and myself, you should now be able to cancel the Ajax request and immediately allow a new requests to proceed. The only concern that remains is that the previous (cancelled) request may keep running in the background for a while (not blocking the user, just wasting resources on the server).
The problem could be rephrased to "how to abort a specific process from the outside".
As Christian Bonato rightfully advised, here is a possible implementation. For the sake of the demonstration I will rely on Symfony's Process component, but you can devise a simpler custom solution if you prefer.
The basic approach is:
Spawn a new process to run the query, save the PID in session. Wait for it to complete, then return the result to the client
If the client aborts, it signals the server to just kill the process.
<?php // query.php
use Symfony\Component\Process\PhpProcess;
session_start();
if(isset($_SESSION['queryPID'])) {
// A query is already running for this session
// As this should never happen, you may want to raise an error instead
// of just silently killing the previous query.
posix_kill($_SESSION['queryPID'], SIGKILL);
unset($_SESSION['queryPID']);
}
$queryString = parseRequest($_POST);
$process = new PhpProcess(sprintf(
'<?php $result = runQuery(%s); echo fetchResult($result);',
$queryString
));
$process->start();
$_SESSION['queryPID'] = $process->getPid();
session_write_close();
$process->wait();
$result = $process->getOutput();
echo formatResponse($result);
?>
<?php // abort.php
session_start();
if(isset($_SESSION['queryPID'])) {
$pid = $_SESSION['queryPID'];
posix_kill($pid, SIGKILL);
unset($_SESSION['queryPID']);
echo "Query $pid has been aborted";
} else {
// there is nothing to abort, send a HTTP error code
header($_SERVER['SERVER_PROTOCOL'] . ' 599 No pending query', true, 599);
}
?>
// javascript
function abortRequest(pendingXHRRequest) {
pendingXHRRequest.abort();
$.ajax({
url: 'abort.php',
success: function() { alert("terminated"); }
});
}
Spawning a process and keeping track of it is genuinely tricky, this is why I advised using existing modules. Integrating just one Symfony component should be relatively easy via Composer: first install Composer, then the Process component (composer require symfony/process).
A manual implementation could look like this (beware, this is untested, incomplete and possibly unstable, but I trust you will get the idea):
<?php // query.php
session_start();
$queryString = parseRequest($_POST); // $queryString should be escaped via escapeshellarg()
$processHandler = popen("/path/to/php-cli/php asyncQuery.php $queryString", 'r');
// fetch the first line of output, PID expected
$pid = fgets($processHandler);
$_SESSION['queryPID'] = $pid;
session_write_close();
// fetch the rest of the output
while($line = fgets($processHandler)) {
echo $line; // or save this line for further processing, e.g. through json_encode()
}
fclose($processHandler);
?>
<?php // asyncQuery.php
// echo the current PID
echo getmypid() . PHP_EOL;
// then execute the query and echo the result
$result = runQuery($argv[1]);
echo fetchResult($result);
?>
With BaseX 8.4, a new RESTXQ annotation %rest:single was introduced, which allows you to cancel a running server-side request: http://docs.basex.org/wiki/RESTXQ#Query_Execution. It should solve at least some of the challenges you described.
The current way to return only chunks of the result is to pass the indexes of the first and last result you want along with your request, and to do the filtering in XQuery:
$results[position() = $start to $end]
By returning one more result than requested, the client will know that there will be more results. This may be helpful, because computing the total result size is often much more expensive than returning only the first results.
I hope I understood this correctly.
Instead of letting the browser "natively" submit the FORM, write JS code that does it instead. In other words (I didn't test this, so interpret it as pseudo-code):
<form action="results.php" onsubmit="return false;">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
So now, when that "submit" button is clicked, nothing will happen.
Obviously, you want your form POSTed, so write JS to attach a click handler to that submit button, collect values from all input fields in the form (actually, it is NOT nearly as scary as it sounds; check out the first link below), and send them to the server, while saving the reference to the request (check the second link below), so that you can abort it (and maybe signal the server to quit also) when the cancel button is clicked (alternatively, you can simply abandon it by not caring about the results).
Submit a form using jQuery
Abort Ajax requests using jQuery
Alternatively, to make that HTML markup "clearer" relative to its functionality, consider not using the FORM tag at all: otherwise, what I suggested makes its usage confusing (why is it there if it's not used, know what I mean?). But don't get distracted with this suggestion until you make it work the way you want; it's optional and a topic for another day (it might even relate to changing the architecture of the whole site).
HOWEVER, a thing to think about: what to do if the form post already reached the server, the server already started processing it, and some "world" changes have already been made? Maybe your get-results routine doesn't change data, so then that's fine. But this approach probably cannot be used with change-data POSTs with the expectation that the "world" won't change if the cancel button is clicked.
I hope that helps :)
The user doesn't have to experience this synchronously.
Client posts a request
The server receives the client request and assigns an ID to it
The server "kicks off" the search and responds with a zero-data page and search ID
The client receives the "placeholder" page and starts checking if the results are ready based on the ID (with something like polling or websockets)
Once the search has completed, the server responds with the results next time it's polled (or notifies the client directly when using websockets)
This is fine when performance isn't quite the bottleneck and the nature of processing makes longer wait times acceptable. Think flight search aggregators that routinely run for 30-90 seconds, or report generators that have to be scheduled and run for even longer!
You can make the experience less frustrating if you don't block user interactions, keep them updated of search progress and start showing results as they come in if possible.
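A hedged sketch of the polling endpoint from the steps above; the searchId parameter and the file-based result store are assumptions for illustration:
<?php
// status.php -- answers the client's poll: 202 while pending, results when done.
session_start();
$id = isset($_GET['searchId']) ? preg_replace('/[^a-zA-Z0-9]/', '', $_GET['searchId']) : '';
$resultFile = sys_get_temp_dir() . "/search-$id.json"; // hypothetical result store
if ($id !== '' && is_file($resultFile)) {
    header('Content-Type: application/json');
    readfile($resultFile); // the search has finished
} else {
    header($_SERVER['SERVER_PROTOCOL'] . ' 202 Accepted', true, 202);
    echo '{"status":"pending"}'; // the client should poll again
}
?>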
You must solve this conceptually first before writing any code. Here are some things that come to mind offhand:
What does it mean to free up resources on the server?
What constitutes a graceful abort that will free up resources?
Is it enough to kill the PHP process waiting for the query result(s)? If so, the route suggested by RandomSeed could be interesting. Just keep in mind that it will only work on a single server. If you have multiple load balanced servers you won't have a way to kill a process on another server (not as easily at least).
Or do you need to cancel the database request from the database itself? In that case the answer suggested by Christian Grün is of more interest.
Or is it that there is no graceful shutdown and you have to force everything to die? If so, this seems awfully hacky.
Not all clients are going to explicitly abort
Some clients are going to close the browser, but their last request won't come through; some clients will lose internet connection and leave the service hanging, etc. You are not guaranteed to get an "abort" request when a client disconnects or has gone away.
You have to decide whether to live with potentially unwanted behavior, or implement an additional active state tracking, e.g. client pinging server for keepalive.
Side notes
a query time of 30 secs or more is potentially long; is there a better tool for the job, so you won't have to solve this with a hack like this?
you are looking for features of a concurrent system, but you're not using a concurrent system; if you want concurrency use a better tool/environment for it, e.g. Erlang.
I am posting data with a size of approximately 200KB to my server running PHP 5.3.3.7.
The data is actually a JavaScript object with nested properties,
in the request it looks something like this: myObject[prop1][subprop1][key] = 5.
However, not all of the data is received on the server. The last part of the posted data is cut off. post_max_size is set to 80MB, so that shouldn't be the issue. I have compared the request form data with the data that is accessible through $_POST, and there is a lot of data missing.
What could be causing this?
You said you use PHP 5.3.3, but maybe this is not quite right? Since PHP 5.3.9 there is a new setting max_input_vars that limits the number of POST (and GET, and COOKIE, and so on) variables one can pass to a script.
If I am right, then it is enough to adjust it in php.ini, the VirtualHost definition, or in .htaccess (ini_set will not work, since the POST has already been trimmed by the time your script starts).
This setting was introduced for security reasons, so be cautious:
http://www.phpclasses.org/blog/post/171-PHP-Vulnerability-May-Halt-Millions-of-Servers.html
From the client side, try using jQuery and converting your data to JSON before sending the POST to the server:
$.ajax({
method: 'POST',
url: 'http://someurl.com',
data: JSON.stringify(youJsObject),
success: function(data) {
// processing data from server
}
});
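One caveat worth adding (an assumption about the receiving script, not part of the snippet above): a raw JSON body is not form-encoded, so PHP will not populate $_POST from it; read and decode it manually on the server:
<?php
// The raw JSON body bypasses PHP's form parsing, so $_POST stays empty;
// decode the body manually instead.
$payload = json_decode(file_get_contents('php://input'), true);
if ($payload === null) {
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', true, 400);
    die();
}
?>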
Let's assume this is being executed in jQuery:
$.ajax({
url : 'ajaxcall.php',
data : { 'data' : { ary : [1,2,3,3,4,5], txt : "ima let u finish" } },
dataType : 'json',
type : 'post',
timeout : 10000
});
And ajaxcall.php contains:
$return_obj = array();
$return_obj['ary'] = array(9,8,7,6,5);
$return_obj['txt'] = "ok then";
echo json_encode($return_obj);
die();
I'm expecting the following situations to occur (due to packet loss, connection problems, etc):
Ajaxcall.php executes, but the $_POST variable is empty.
The promises of the $.ajax() call are executed, but the data returned to them is empty.
However, what I'm really worried about are situations like these:
Ajaxcall.php executes and $_POST['data']['txt'] has expected values but $_POST['data']['ary'] is missing some values.
The promises of the $.ajax() call are executed and data.ary has the expected values, but data.txt is only half a string (e.g., "ok t").
Are these situations possible?
In short: no, that's not possible. HTTP is based on TCP which guarantees delivery of data. Both the server and client would be aware of an issue that would cause some data to be missed. The TCP layer would retransmit the data as needed until it is complete.
Packet loss and out-of-order delivery are not uncommon on the internet, since there is no rule that says routers must forward all packets, but TCP automatically corrects for those issues.
None of these situations should happen.
Packet loss is managed at a lower level of the protocol stack.
Over the internet, TCP takes care of integrity of each packet and that all the packets arrive properly and in the right order.
On a higher level of the protocol stack, HTTP has a response header called Content-Length that tells the browser the size of the returned content, it is used by the browser to make sure the response is complete.
Though, some HTTP requests can be answered with a Transfer-Encoding: chunked header that makes the Content-Length useless. These are persistent connections and are mainly used when the length of the content is not known beforehand.
Do you have any example of cases where the data is not complete upon arrival?
I'm not sure if I really understand your question, however if the response is somehow truncated (but returned - ie, the server still responds with a 200 status OK) it won't be valid JSON.
It's unclear from your question how data.txt could be 'only half a string' and yet remain valid.
I'm no expert in AJAX (or jQuery), but I thought what I was doing was pretty easy, yet when I send an Ajax request with:
$.ajax ( requestObj );
it doesn't send and I'm hoping someone can help. In order to give context, I've set the "requestObj" up as follows:
//initialise a request object
var requestObj = {};
requestObj.response = 'ajax-response';
requestObj.type = 'POST';
requestObj.url = my_config['ajax-service-list'][service]['url'];
requestObj.data = $.extend ( requestObj.data , {
action: service,
other: parameters,
_ajax_nonce: my_config['ajax-service-list'][service]['nonce']
});
requestObj.global = false;
requestObj.timeout = 30000;
requestObj.success = function ( r ) {
alert ( "Success: " + r );
}
requestObj.error = function ( r ) {
console.log ("FAILURE WITH AJAX Call ( " + JSON.stringify (r) + ")");
}
There's one thing that probably needs explaining. The two references to "my_config" are references to a Javascript variable that I set using Wordpress's wp_localize_script() function. Basically it just provides context about where to find the URL, the NONCE to use, etc. I have tested that the URL and NONCE information is working correctly, so that shouldn't be the problem. For example, I put a breakpoint in the browser's debugger on the line after the two references are defined and verified their values.
When I call the ajax function it immediately executes the success function and sends in the value of 0. Looking at my PHP error logs, though, I can see that the request was never sent. What could be getting in the way of $.ajax(requestObj) actually sending the request?
UPDATE:
Thanks to Michael's sage advice I realised that I am in fact getting a request to go out, but as it's running in a local environment the response is coming back lightning fast. Now I suspect this has more to do with the Wordpress configuration. I have hooked into wp_ajax_[service_name] but it immediately returns 0. I'll re-ask this question with this new information in the Wordpress forum.
You should be using a browser inspector to detect whether an Ajax request is made. Open up the network tab of any inspector, and you can watch requests as they happen. How is the $.ajax() method being invoked? You may have an issue with the surrounding code, as opposed to $.ajax() itself.
Once you've used the inspector, look at the $_POST or $_GET data you're sending in the headers section, and then look at the response. Is the HTTP response code 200? If it's 500, then you probably have an error in your PHP controller that receives the request.
If you have PHP CLI, run this to see if you have a syntax error:
php -l path/to/php/controller.php
If you have a non-fatal error in your file, you'll see the error output in the request response.
Try var_dump( $_REQUEST ) at the top of your php file, too, to make sure that the file is receiving the data, and you can inspect it inside the browser-inspector response.
If you have a problem with the program inside of your controller... you've got yourself a new question to post. :)
At first look, it looks like your URL has spaces around get_action_template. That might be an issue.
Also, passing dataType might help.
If not, try getting a JSON response without any parameters and post the output.
OK, I've finally answered this damn question. Arrgh. BIG, BIG THANKS to Mathew, whose troubleshooting skills I could not have done without. Anyway, the problem was in the AJAX request, and as a result the Wordpress Ajax manager was never respecting the "hooks" I had put into place on the PHP side.
How was my AJAX request off? I had a POST request but the URL had GET variables hanging off of it. The key variable for Wordpress based Ajax requests is the "action" variable. This is the variable which WP's ajax manager uses to distinguish the various services and is the name that you'll be hooking into.
So in the end, my URL was:
http://mysite.com/wp-admin/admin-ajax.php
and my POST variables included:
action: get-action-template
My wordpress hook is:
add_action ( 'wp_ajax_get-action-template' , 'AjaxServiceManager::ajax_handler' );
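For completeness, a hedged sketch of what the matching handler class could look like (only the hook name above comes from my code; the nonce action name is an assumption and must match what wp_localize_script() handed to the client as _ajax_nonce):
<?php
class AjaxServiceManager {
    public static function ajax_handler() {
        check_ajax_referer('my_nonce_action'); // checks $_REQUEST['_ajax_nonce']; the action name here is hypothetical
        echo json_encode(array('status' => 'ok'));
        wp_die(); // without this, admin-ajax.php appends a stray "0" to the output
    }
}
?>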
My sleepless nights may continue but they won't be related to this damn problem anymore. :^)