I have a page where a user can select from multiple processes to run. The short version of what I want to accomplish is: Select a process, click Execute, and go about the rest of your work on the site while it runs in the background.
The problem I face is that after the process is submitted, the user cannot take any further actions on the site: they cannot navigate away from the page, log out, or do anything else.
The code for the page is this:
<h1>Execute Processes</h1>
<div class='box'>
<select id='processes' style='width:300px;'>
<option value=''>Processes</option>
<option value='test'>Test Process</option>
</select><br />
<input type="submit" value="Execute" id='execute'/>
</div>
<script>
$("#execute").click(function() {
var process = $("#processes").val();
if(process != "") {
$.ajax({
url: '_processes/' + process + '.php',
type: "POST",
data: {}
});
message_popup("Process submitted");
} else {
alert("Please select a process");
}
});
</script>
For this example, we only have one process, "test.php". The code for "test.php" is this:
<?php
require_once("../_include/constants.php");
require_once("../_include/connection.php");
require_once("../_include/_functions/functions_common.php");
require_once("../_include/_functions/functions_form.php");
require_once("../_include/_functions/functions_browser.php");
require_once("../_include/_functions/functions_legacy.php");
require_once("../_include/_functions/functions_data.php");
$p_name = "Sample Process";
$unix_pid = getmypid();
$p_started_by = mysqli_prep($_SESSION['username']);
execute_query("INSERT INTO site_process_monitor (process_id,process_name,start_time,status,started_by) VALUES ({$unix_pid},'{$p_name}',NOW(),'Started','{$p_started_by}')");//PROCESS START
$p_id = mysqli_insert_id($connection);
///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
sleep(60);
//CLOSING STATS////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
execute_query("UPDATE site_process_monitor SET end_time = NOW(), status = 'Completed' WHERE id = {$p_id}");//PROCESS COMPLETE
?>
Very basic. Output some opening stats to the process monitor table on my database, wait 60 seconds, output some closing stats.
Is there a way to start this process so that my user can continue to browse the site and perform other tasks (or submit other processes)?
If you're using default file-based sessions in PHP, note that PHP will keep the session file locked while any particular script instance is using it. This will prevent a user from using any other session-enabled part of your site, as each of their requests will now be waiting for the session lock to be relinquished.
For long-running scripts, you'll need to explicitly release the session lock:
session_write_close();
the_long_running_job();
session_start();
$_SESSION will still be available for reading after you "close" it. But any changes made to $_SESSION after the close will be lost, since PHP won't write the session file again when the script exits; reopen the session with session_start() (as above) if you need to save further changes.
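Applied to the question's "test.php", a minimal sketch could look like this (hedged: execute_query() and mysqli_prep() are the question's own helpers, and I'm assuming the session is started somewhere in the includes):

<?php // test.php -- sketch: release the session lock before the long-running part
require_once("../_include/constants.php");
// ... the other require_once lines from the question ...
$p_started_by = mysqli_prep($_SESSION['username']); // read everything you need from the session first
session_write_close();                              // release the lock; the user's other requests can now proceed
// insert the 'Started' row, as in the question
sleep(60);                                          // the long-running work
// update the row to 'Completed', as in the question
// only call session_start() again here if you need to write to $_SESSION afterwards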
I think you need to use a Web Worker:
https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/basic_usage
A Web Worker spawns a new browser thread which ensures that user interactions on the page are not hindered.
However, because Web Workers don't have direct DOM access, you won't be able to use jQuery for your Ajax request; you'll have to use a plain XMLHttpRequest instead.
http://techslides.com/html5-web-workers-for-ajax-requests
I hope this helps.
JavaScript will never block the UI thread of a web page this way. (Sorry, but the "web worker" answer above is naive.) If your UI (in JavaScript) becomes unresponsive, you probably have an infinite loop. Check the network panel in your browser and see whether hundreds of POSTs are being made. Try moving the POST action outside of the $("#execute").click(function() handler. Perhaps you have an id="execute" elsewhere on the page, triggering multiple recursive click()s.
Related
I have read many similar questions concerning cancelling a POST request with jQuery, but none seem to be close to mine.
I have your everyday form that has a PHP-page as an action:
<form action="results.php">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
Processing results.php on the server side, based on the POST information given in the form, takes a long time (30 seconds or even more, and we expect an increase because our search space will grow in the coming weeks). We are accessing a BaseX server (version 7.9, not upgradable) that contains all the data. User-generated XPath code is submitted in the form, and the action URL then sends the XPath code to the BaseX server, which returns the results. From a usability perspective, I already show a "loading" screen so users at least know that the results are being generated:
$("form").submit(function() {
$("#overlay").show();
});
<div id="overlay"><p>Results are being generated</p></div>
However, I would also want to give users the option to press a button to cancel the request and cancel the request when a user closes the page. Note that in the former case (on button click) this also means that the user should stay on the same page, can edit their input, and immediately re-submit their request. It is paramount that when they cancel the request, they can also immediately resend it: the server should really abort, and not finish the query before being able to process a new query.
I figured something like this:
$("form").submit(function() {
$("#overlay").show();
});
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
// abort correct request
}
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
But as you can see, I am not entirely sure how to fill in abortRequest to make sure the post request is aborted, and terminated, so that a new query can be sent. Please fill in the blanks! Or would I need to .preventDefault() the form submission and instead do an ajax() call from jQuery?
As I said I also want to stop the process server-side, and from what I read I need exit() for this. But how can I exit another PHP function? For example, let's say that in results.php I have a processing script and I need to exit that script, would I do something like this?
<?php
if (isset($_POST['my-input'])) {
    $input = $_POST['my-input'];
    function processData() {
        // A lot of processing
    }
    processData();
}
if (isset($_POST['terminate'])) {
    function terminateProcess() {
        // exit processData()
    }
}
and then do a new ajax request when I need to terminate the process?
$("#overlay button").click(abortRequest);
$(window).unload(abortRequest);
function abortRequest() {
    $.ajax({
        url: 'results.php',
        data: {terminate: true},
        type: 'post',
        success: function() { alert("terminated"); }
    });
}
I did some more research and I found this answer. It mentions connection_aborted() and also session_write_close() and I'm not entirely sure which is useful for me. I do use SESSION variables, but I don't need to write away values when the process is cancelled (though I would like to keep the SESSION variables active).
Would this be the way? And if so, how do I make one PHP function terminate the other?
I have also read into Websockets and it seems something that could work, but I don't like the hassle of setting up a Websocket server as this would require me to contact our IT guy who requires extensive testing on new packages. I'd rather keep it to PHP and JS, without third party libraries other than jQuery.
Considering most comments and answers suggest that what I want is not possible, I am also interested to hear alternatives. The first thing that comes to mind is paged Ajax calls (similar to many web pages that serve search results, images, what-have-you in an infinite scroll). A user is served a page with the X first results (e.g. 20), and when they click a button "show next 20 results" those are shown are appended. This process can continue until all results are shown. Because it is useful for users to get all results, I will also provide a "download all results" option. This will then take very long as well, but at least users should be able to go through the first results on the page itself. (The download button should thus not disrupt the Ajax paged loads.) It's just an idea, but I hope it gives some of you some inspiration.
As I understand it, the key points are:
You cannot cancel a specific request once a form is submitted. The reason is that on the client side you have no reference to the request with which to identify its state (whether it has been posted, is processing, etc.). So the only way to cancel it is to reset the $_POST variables and/or refresh the page; the connection will then be broken and the previous request will not be completed.
In your alternative solution, when you send another Ajax call with {terminate: true}, results.php can stop processing with a simple die(). But as it will be an async call, you cannot tie it to the previous form submit, so in practice this will not work.
Probable solution: submit the form with Ajax. With jQuery ajax you will have an xhr object which you can abort() upon window unload.
UPDATE (upon the comment):
A synchronous request is one where your page blocks (all user actions) until the result is ready. Pressing a submit button in a form performs such a synchronous call to the server, by definition [https://www.w3.org/TR/html-markup/button.submit.html].
Now, once the user has pressed the submit button, the browser-to-server connection is synchronous, so it is held until the result is there. Any other calls made to the server while the submit is still in progress have no reference to this pending operation, because it has not finished. That is why sending a termination call with Ajax will not work.
Thirdly: for your case you can consider the following code example:
HTML:
<form action="results.php">
<input name="my-input" type="text">
<input id="resultMaker" type="button" value="submit">
</form>
<div id="overlay">
<p>Results are being generated</p>
<button>Cancel</button>
</div>
JQUERY:
<script type="text/javascript">
var jqXhr = '';
$('#resultMaker').on('click', function(){
$("#overlay").show();
jqXhr = $.ajax({
url: 'results.php',
data: $('form').serialize(),
type: 'post',
success: function() {
$("#overlay").hide();
});
});
});
var abortRequest = function(){
if (jqXhr != '') {
jqXhr.abort();
}
};
$("#overlay button").on('click', abortRequest);
window.addEventListener('unload', abortRequest);
</script>
This is example code; I have just used your code samples and changed a few things here and there.
Himel Nag Rana demonstrated how to cancel a pending Ajax request.
Several factors may interfere and delay subsequent requests, as I have discussed earlier in another post.
TL;DR: 1. it is very inconvenient to try to detect the request was cancelled from within the long-running task itself and 2. as a workaround you should close the session (session_write_close()) as early as possible in your long-running task so as to not block subsequent requests.
connection_aborted() cannot be used. This function is supposed to be called periodically during a long task (typically, inside a loop). Unfortunately there is just one single significant, atomic operation in your case: the query to the data back end.
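For reference, this is the kind of pattern connection_aborted() is designed for (a sketch: the helper functions are made up, and note that PHP only notices the abort when output is attempted):

<?php
ignore_user_abort(true); // keep running after a disconnect so we can clean up ourselves
while ($chunk = getNextChunkOfWork()) { // hypothetical helper
    processChunk($chunk);               // hypothetical helper
    echo ' ';                           // some output must be attempted...
    flush();                            // ...and flushed, or the abort is never noticed
    if (connection_aborted()) {
        cleanUp();                      // hypothetical helper
        exit;
    }
}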
If you applied the procedures advised by Himel Nag Rana and myself, you should now be able to cancel the Ajax request and immediately allow a new requests to proceed. The only concern that remains is that the previous (cancelled) request may keep running in the background for a while (not blocking the user, just wasting resources on the server).
The problem could be rephrased to "how to abort a specific process from the outside".
As Christian Bonato rightfully advised, here is a possible implementation. For the sake of the demonstration I will rely on Symfony's Process component, but you can devise a simpler custom solution if you prefer.
The basic approach is:
Spawn a new process to run the query, save the PID in session. Wait for it to complete, then return the result to the client
If the client aborts, it signals the server to just kill the process.
<?php // query.php

require __DIR__ . '/vendor/autoload.php'; // Composer autoloader, needed for the Process component

use Symfony\Component\Process\PhpProcess;

session_start();

if (isset($_SESSION['queryPID'])) {
    // A query is already running for this session.
    // As this should never happen, you may want to raise an error instead
    // of just silently killing the previous query.
    posix_kill($_SESSION['queryPID'], SIGKILL);
    unset($_SESSION['queryPID']);
}

$queryString = parseRequest($_POST);

$process = new PhpProcess(sprintf(
    '<?php $result = runQuery(%s); echo fetchResult($result);',
    $queryString
));
$process->start();

$_SESSION['queryPID'] = $process->getPid();
session_write_close();

$process->wait();
$result = $process->getOutput();
echo formatResponse($result);
?>
<?php // abort.php

session_start();

if (isset($_SESSION['queryPID'])) {
    $pid = $_SESSION['queryPID'];
    posix_kill($pid, SIGKILL);
    unset($_SESSION['queryPID']);
    echo "Query $pid has been aborted";
} else {
    // there is nothing to abort, send a HTTP error code
    header($_SERVER['SERVER_PROTOCOL'] . ' 599 No pending query', true, 599);
}
?>
// javascript
function abortRequest(pendingXHRRequest) {
    pendingXHRRequest.abort();
    $.ajax({
        url: 'abort.php',
        success: function() { alert("terminated"); }
    });
}
Spawning a process and keeping track of it is genuinely tricky, which is why I advised using existing modules. Integrating just one Symfony component should be relatively easy via Composer: first install Composer, then the Process component (composer require symfony/process).
A manual implementation could look like this (beware, this is untested, incomplete and possibly unstable, but I trust you will get the idea):
<?php // query.php

session_start();

$queryString = parseRequest($_POST); // $queryString should be escaped via escapeshellarg()
$processHandler = popen("/path/to/php-cli/php asyncQuery.php $queryString", 'r');

// fetch the first line of output, PID expected
$pid = fgets($processHandler);
$_SESSION['queryPID'] = $pid;
session_write_close();

// fetch the rest of the output
while ($line = fgets($processHandler)) {
    echo $line; // or save this line for further processing, e.g. through json_encode()
}
fclose($processHandler);
?>
<?php // asyncQuery.php
// echo the current PID
echo getmypid() . PHP_EOL;
// then execute the query and echo the result
$result = runQuery($argv[1]);
echo fetchResult($result);
?>
With BaseX 8.4, a new RESTXQ annotation %rest:single was introduced, which allows you to cancel a running server-side request: http://docs.basex.org/wiki/RESTXQ#Query_Execution. It should solve at least some of the challenges you described.
The current way to only return chunks of the result is to pass on the index to the first and last result in your result, and to do the filtering in XQuery:
$results[position() = $start to $end]
By returning one more result than requested, the client will know that there will be more results. This may be helpful, because computing the total result size is often much more expensive than returning only the first results.
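In PHP, the chunking could look roughly like this (a sketch: runQuery() and fetchResult() are the same placeholder helpers as in the earlier answer, buildUserQuery() is made up, and I'm assuming fetchResult() returns an array):

<?php
$pageSize = 20;
$start = isset($_POST['start']) ? (int) $_POST['start'] : 1;
$end = $start + $pageSize; // deliberately request one extra result

$xquery = sprintf('%s[position() = %d to %d]', buildUserQuery($_POST), $start, $end);
$results = fetchResult(runQuery($xquery));

// If more than $pageSize results came back, at least one more page exists.
echo json_encode(array(
    'results' => array_slice($results, 0, $pageSize),
    'hasMore' => count($results) > $pageSize,
));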
I hope I understood this correctly.
Instead of letting the browser "natively" submit the FORM, write JS code that does it instead. In other words (I didn't test this, so interpret it as pseudo-code):
<form action="results.php" onsubmit="return false;">
<input name="my-input" type="text">
<input type="submit" value="submit">
</form>
So now, when that "submit" button is clicked, nothing will happen.
Obviously, you want your form POSTed, so write JS to attach a click handler to that submit button, collect the values from all input fields in the form (it is not nearly as scary as it sounds; check out the first link below), and send them to the server while saving a reference to the request (see the second link below), so that you can abort it (and maybe signal the server to quit as well) when the cancel button is clicked. Alternatively, you can simply abandon it by not caring about the results.
Submit a form using jQuery
Abort Ajax requests using jQuery
Alternatively, to make that HTML markup "clearer" relative to its functionality, consider not using the FORM tag at all: otherwise, what I suggested makes its usage confusing (why is it there if it's not used, know what I mean?). But don't get distracted by this suggestion until you make it work the way you want; it's optional and a topic for another day (it might even relate to changing the architecture of the whole site).
HOWEVER, a thing to think about: what do you do if the form POST has already reached the server, the server has already started processing it, and some "world" changes have already been made? Maybe your get-results routine doesn't change data; then that's fine. But this approach probably cannot be used with data-changing POSTs under the expectation that the "world" won't change when the cancel button is clicked.
I hope that helps :)
The user doesn't have to experience this synchronously.
Client posts a request
The server receives the client request and assigns an ID to it
The server "kicks off" the search and responds with a zero-data page and search ID
The client receives the "placeholder" page and starts checking if the results are ready based on the ID (with something like polling or websockets)
Once the search has completed, the server responds with the results next time it's polled (or notifies the client directly when using websockets)
This is fine when performance isn't quite the bottleneck and the nature of processing makes longer wait times acceptable. Think flight search aggregators that routinely run for 30-90 seconds, or report generators that have to be scheduled and run for even longer!
You can make the experience less frustrating if you don't block user interactions, keep them updated of search progress and start showing results as they come in if possible.
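A minimal sketch of the server side of this flow (all names here are made up for illustration; the client keeps polling status.php with the returned jobId):

<?php // start.php -- receives the search request and returns a job ID
session_start();
session_write_close(); // don't hold the session lock while the job runs
$jobId = bin2hex(random_bytes(8)); // PHP 7+
insert_job($jobId, 'pending'); // hypothetical helper writing to a jobs table
exec('nohup php worker.php ' . escapeshellarg($jobId) . ' > /dev/null 2>&1 &');
echo json_encode(array('jobId' => $jobId));

<?php // status.php -- polled by the client until the job is done
session_start();
session_write_close();
$job = fetch_job($_GET['jobId']); // hypothetical helper reading the jobs table
echo json_encode($job); // e.g. {"status":"running","progress":40}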
You must solve this conceptually first before writing any code. Here are some things that come to mind offhand:
What does it mean to free up resources on the server?
What constitutes a graceful abort that will free up resources?
Is it enough to kill the PHP process waiting for the query result(s)? If so, the route suggested by RandomSeed could be interesting. Just keep in mind that it will only work on a single server. If you have multiple load balanced servers you won't have a way to kill a process on another server (not as easily at least).
Or do you need to cancel the database request from the database itself? In that case the answer suggested by Christian Grün is of more interest.
Or is it that there is no graceful shutdown and you have to force everything to die? If so, this seems awfully hacky.
Not all clients are going to explicitly abort
Some clients are going to close the browser, but their last request won't come through; some clients will lose internet connection and leave the service hanging, etc. You are not guaranteed to get an "abort" request when a client disconnects or has gone away.
You have to decide whether to live with the potentially unwanted behavior, or to implement additional active state tracking, e.g. the client pinging the server for keepalive.
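A crude keepalive could look like this (a sketch: the ping file location and the 30-second threshold are arbitrary choices of mine):

<?php // ping.php -- called by the client every few seconds while it waits
session_start();
$pingFile = sys_get_temp_dir() . '/ping_' . session_id();
session_write_close();
touch($pingFile); // record when this client was last seen

// ...and inside the worker's processing loop (given the same $pingFile path):
clearstatcache();
if (time() - filemtime($pingFile) > 30) { // no ping for 30 seconds: assume the client is gone
    exit; // abort and free the resources
}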
Side notes
A query time of 30 seconds or more is potentially long; is there a better tool for the job, so that you won't have to solve this with a hack like this?
you are looking for features of a concurrent system, but you're not using a concurrent system; if you want concurrency use a better tool/environment for it, e.g. Erlang.
I have a problem with two simultaneous AJAX requests. I have a PHP script which exports data to XLSX. This operation takes a lot of time, so I'm trying to show progress to the user, using an AJAX-and-database approach. Actually, I'm pretty sure it used to work, but I can't figure out why it's no longer working in any browser. Did something change in new browsers?
$(document).ready(function() {
    $("#progressbar").progressbar();

    $.ajax({
        type: "POST",
        url: "{$BASE_URL}/export/project/ajaxExport",
        data: "type={$type}&progressUid={$progressUid}" // unique ID I'm using to track progress from database
    }).done(function(data) {
        $("#progressbar-box").hide();
        clearInterval(progressInterval);
    });

    progressInterval = setInterval(function() {
        $.ajax({
            type: "POST",
            url: "{$BASE_URL}/ajax/progressShow",
            data: "statusId={$progressUid}" // the same unique ID
        }).done(function(data) {
            data = jQuery.parseJSON(data);
            $("#progressbar").progressbar({ value: parseInt(data.progress) });
            if (data.title) { $("#progressbar-title").text(data.title); }
        });
    }, 500);
});
The progress is correctly updating in the database.
The JS timer is trying to get the progress (I can see it in the console), but all of these requests keep loading for the whole duration of the first script; as soon as that script ends, the progress calls complete.
So, why is the second AJAX call waiting for the first one to finish?
Sounds like a session blocking issue
By default PHP writes its session data to a file. When you initiate a session with session_start(), it opens that file for writing and locks it to prevent concurrent edits. That means that each request going through a PHP script that uses the session has to wait for the previous request to be done with the file.
The way to fix this is to change PHP sessions to not use files or to close your session write like so:
<?php
session_start(); // starting the session
$_SESSION['foo'] = 'bar'; // Write data to the session if you want to
session_write_close(); // close the session file and release the lock
echo $_SESSION['foo']; // You can still read from the session.
After a bit of hair-pulling, I found one other way that these non-parallel AJAX requests can happen, totally independent of PHP session-handling... So I'm posting it here just for anyone getting here through Google with the same problem.
XDebug can cause this, and I wouldn't be surprised if Zend Debugger could too.
In my case, I had:
XDebug installed on my local LAMP stack
xdebug.remote_autostart enabled
My IDE accepting inbound debugger-connections, even though no breakpoints were active
This caused all my AJAX tests to run sequentially, no matter what. In retrospect it makes a lot of sense (from the standpoint of debugging things) to force sequential processing, but I simply hadn't noticed that my IDE was still interacting behind-the-scenes.
After telling the IDE to stop listening entirely, parallel runs resumed and I was able to reproduce the race-condition I had been looking for.
Be aware that session_write_close() (see chrislondon's answer) may not resolve the problem if you have output buffering enabled (the default in PHP 7+). You have to set output_buffering = Off in php.ini, otherwise the session won't be closed correctly.
When working with APIs, you sometimes need to issue multiple AJAX requests to different endpoints. Instead of waiting for one request to complete before issuing the next, you can speed things up with jQuery by requesting the data in parallel, by using jQuery's $.when() function:
Run multiple AJAX requests in parallel
a.php generates a main HTML page that contains two simultaneous AJAX calls to b.php and c.php. In order for b.php and c.php to share session variables, the session variables must exist BEFORE the first AJAX call. Provided this is true, b.php and c.php can change the values of the session variables and see each other's values. Therefore, create the session variables with a.php while generating the HTML page. At least that's how it works with Rogers shared web hosting.
You could also set
async: true,
I have a PHP script which can take quite a lot of time (up to 3-5 minutes), so I would like to notify the user how it is going.
I read this question and decided to use the session for keeping information about the work progress.
So, I have the following instructions in php:
public function longScript()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $generatingProgressSession->unsetAll();
    ....
    $generatingProgressSession->total = $productsNumber;
    ...
    $processedProducts = 0;
    foreach ($models as $model) {
        //Do some processing
        $processedProducts++;
        $generatingProgressSession->processed = $processedProducts;
    }
}
And I have a simple script that takes the data from the session (the numbers of total and processed items) and returns them in JSON format.
So, here is js code for calling long script:
$.ajax({
    url: 'pathToLongScript',
    data: {fileId: fileId, format: 'json'},
    dataType: 'json',
    success: function(data){
        if (data.success) {
            if (typeof successCallback == "function")
                successCallback(data);
        }
    }
});

//Start checking progress functionality
var checkingGenerationProgress = setInterval(function(){
    $.ajax({
        url: 'pathToCheckingStatusFunction',
        data: {format: 'json'},
        success: function(data){
            console.log("Processed " + data.processed + " items of " + data.total);
            if (data.processed == data.total) {
                clearInterval(checkingGenerationProgress);
            }
        }
    });
}, 10000);
So, the long script is called via Ajax. Then after 10 seconds the checking script is called once, after 20 seconds a second time, etc.
The problem is that none of the requests to the checking script complete until the main long script is complete. So what does that mean? Does the long script consume too many resources, so the server cannot process any other request? Or do I have some wrong Ajax parameters?
(Screenshot: the network panel shows the checking requests stuck pending until the export request finishes.)
UPDATE:
Here is a php function for checking status:
public function checkGenerationProgressAction()
{
    $generatingProgressSession = new Zend_Session_Namespace('generating_progress');
    $this->view->total = $generatingProgressSession->total;
    $this->view->processed = $generatingProgressSession->processed;
}
I'm using the ZF1 ActionContext helper here, so the result of this function is a JSON object: {'total': 'some value', 'processed': 'another value'}.
I'd
exec('nohup php ...');
the file and send it to the background. You can set points where the long-running script inserts a single value into the DB to show its progress. Then you can check every ten (or however many) seconds whether a new value has been added and inform the user. It might even be possible to inform the user while they are on another page within your project, depending on your environment.
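A rough sketch of that idea (assumptions: export.php stands in for your long script, and the export_progress table and execute_query() helper are made up for illustration):

<?php // launcher -- returns immediately, the export keeps running in the background
session_start();
session_write_close(); // release the session lock (see the other answers)
exec('nohup php export.php > /dev/null 2>&1 &');
echo 'export started';

<?php // inside export.php, at convenient checkpoints
// assumption: an export_progress table that the status requests can read
execute_query("UPDATE export_progress SET processed = processed + 100 WHERE id = 1");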
Yes, it's possible that the long script hogs the entire server and that any other requests made in that time have to wait their turn. Also, I would recommend not running the check script every 10 seconds regardless of whether the previous check has finished, but instead letting the check script trigger itself after it has completed.
Taking for example your image with the requests pending: instead of having 3 checking requests running at the same time, you can chain them so that at any one time only one checking request runs.
You can do this by replacing your setInterval() function with a setTimeout() function and re-initializing the setTimeout() after the AJAX check request has completed.
Most likely, the following calls are not completing due to session locking. When one thread has a session file open, no other PHP threads can open that same file, as it is read/write locked until the previous thread lets go of it.
Either that, or your server or browser is limiting concurrent requests and is therefore waiting for this one to complete.
My solution would be to either fork or break the long-running script off somehow. Perhaps a call to exec to another script with the requisite parameters, or any way you think would work. Break the long-running script into a separate thread and return from the current one, notifying the user that the execution has begun.
The second part would be to log the progress of the script somewhere. A database, Memcache, or a file would work. Simply set a value in a pre-determined location that the follow-up calls can check on.
Not that "pre-determined" should not be the same for everyone. It should be a location that only the user's session and the worker know.
Can you paste the PHP of "pathToCheckingStatusFunction" here?
Also, I notice that the "pathToCheckingStatusFunction" ajax function doesn't have a dataType: "json". This could be causing a problem. Are you using the $_POST['format'] anywhere?
I also recommend chaining the checks so that each one starts only after the previous check has completed. If you need help with that, I can post a solution.
Edit, add possible solution:
I'm not sure that using Zend_Session_Namespace is the right approach. I would recommend using session_name() and session_start(), and reading the variables out of $_SESSION.
Example File 1:
session_name('test');
session_start();
$_SESSION['percent'] = 0;
session_write_close(); // release the lock so file 2 can read while this runs
...stuff...
session_start();
$_SESSION['percent'] = 90;
session_write_close();
Example File 2 (get percent):
session_name('test');
session_start();
echo $_SESSION['percent'];
I have a page with some user-selectable options and a button that, when clicked, runs a PHP script and then refreshes a div with another PHP file that uses a session variable created at the end of the first PHP script. If the user presses the button again with different options selected, the div is updated using the newly replaced session variable. The problem is that sometimes, perhaps 1 time in 10 or so, the old session variable data is loaded. I suspect that the second PHP file is catching the variable too early, before it has been updated, but I tried unsetting the session variable at various points without any luck.
First PHP file:
session_start();
$needle = array();
foreach ($_POST['checkboxes'] as $key => $value) {
    $needle[] = "$value";
}
// code that processes the values from needle and outputs $data
unset($_SESSION['data']);
$_SESSION['data'] = $data;
Second PHP file:
session_start();
echo $_SESSION['data'];
Javascript:
$(".userdata").click(function() {
$.post("first.php", $("form#checkboxes").serialize());
});
$(function() {
$("#button").click(function() {
$("#div").load('second.php')
})
})
The problem is that in some cases the first PHP script has not finished running before you click the button that loads the second PHP script (like I implied before in my comment). The fact that this happens is related to how scripts are scheduled by the webserver (which is a different subject entirely).
You thus need to make sure that when you click the button that runs the second script, the first script has completely finished running.
Because, to my knowledge, JavaScript does not allow blocking/signaling on a variable (like Java does), you'll have to use a 'dirtier' technique called busy waiting.
The best way to do this, is to include an extra variable in the javascript you are using.
var wait = false;

function reloadSecond() {
    if (wait) {
        setTimeout(reloadSecond, 200);
    } else {
        $("#div").load('second.php');
    }
}

$(".userdata").click(function() {
    wait = true;
    $.post("first.php", $("form#checkboxes").serialize(), function() {
        wait = false;
    });
});

$(function() {
    $("#button").click(reloadSecond);
});
While busy waiting is generally not considered the most elegant solution, I think you don't have many other options in this case (except for server-side push, which is much more complicated). Additionally, you'll probably only incur the extra 200-millisecond waiting time (or less; you can of course change this value) once or twice.
(side note: I assume that javascript is single threaded here, which is true in almost all cases: Is JavaScript guaranteed to be single-threaded?).
Could it be a case of the browser or a proxy server caching the HTML data? Try setting headers that tell them not to cache. See the examples in http://php.net/manual/en/function.header.php for which headers to set.
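For instance, the standard no-cache headers, sent before any output in second.php (the Expires date just needs to be in the past):

<?php // second.php -- send these before any output
header("Cache-Control: no-store, no-cache, must-revalidate, max-age=0");
header("Pragma: no-cache"); // for old HTTP/1.0 proxies
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // any date in the past
session_start();
echo $_SESSION['data'];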
I'm using a jQuery dialog to ask the user if he really wants to delete the record (user is logged in).
If yes, I fetch the record's ID and run jQuery.ajax.
My questions are:
Can a user execute jQuery code without interacting with the screen?
Can a user, through some "hackware", pass in any record_id, thus deleting a record that he's not supposed to delete?
Here's my code:
function initDeleteRecord() {
    var options = {
        title: "Delete record",
        modal: true,
        buttons: {
            "Ok": function() {
                var record_id = jQuery('#recordID').val(); // <-- Can this be manipulated?
                deleteRecord(record_id);
            },
            "Cancel": function() {
                jQuery(this).dialog("close");
            }
        }
    };
    jQuery('#dialog').dialog(options);
    jQuery('#dialog').dialog("open");
}

function deleteRecord(record_id) {
    jQuery.ajax({
        url: siteURL + "/jquery.php",
        data: {instance: 'deleteRecord', record_id: record_id},
        success: (function(data) {
            jQuery('#dialog').dialog("close");
            window.location = siteURL;
        }),
        dataType: 'json'
    });
}
Yes and Yes. A user can execute any jQuery code, for example using Opera Dragonfly or Firebug for Firefox.
And a user can always use his own implementation of a "browser". Never ever trust the data coming to you; always perform the checks server-side (again). Client-side checks are only good for increasing comfort when the user accidentally enters incorrect data.
1) Sure, it's trivial to extract the URL from a chunk of JavaScript and invoke the web service directly. It's impossible to guarantee 100% of the time that script x.php was invoked by a piece of JavaScript executing in a particular page. As far as the PHP script is concerned, a POST done by an AJAX call is the same as a POST done from a form on a completely different page or server.
2) Easily. Consider someone putting a simple .html page on their own local computer with a form in it:
<form method="post" action="http://yourserver.com/jquery.php">
<input type="hidden" name="instance" value="deleteRecord" />
<input type="text" name="record_id" />
<input type="submit"
</form>
this will have exactly the same effect as your jquery ajax call.
Ad 1: Yes he can execute Javascript as he sees fit.
Ad 2: Yes, he sure can (e.g. with Tamper Data). Never validate anything only on the client side; always validate on the server side. If a user may only delete certain IDs, you have to make sure on the server side that the authenticated user (via sessions or whatever) has the rights to do so prior to executing your SQL.
1) Yes. A user can execute any snippet of JavaScript, including your jQuery functions, within his own browser without interacting with the page elements themselves.
2) Yes. The value of the #recordID element can be manipulated to be whatever value the user desires. Alternatively, the user could simply call deleteRecord() directly with any record ID.
There are some pretty serious security concerns here. jQuery (and JavaScript in general) are not going to be able to control your users' permissions in this way. You would need to keep track of the logged in user and his permissions on the server (through sessions or something similar) and only delete rows that the user would have permission to delete.
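A minimal sketch of such a server-side check in jquery.php (assumptions: $mysqli is your database connection and the records table carries an owner_id column; adapt to your schema):

<?php // jquery.php -- never trust the client-side dialog
session_start();
if (!isset($_SESSION['user_id'])) {
    http_response_code(403); // not logged in
    exit;
}
if ($_REQUEST['instance'] === 'deleteRecord') { // the question's Ajax call defaults to GET
    $recordId = (int) $_REQUEST['record_id'];
    // only delete rows this user actually owns
    $stmt = $mysqli->prepare("DELETE FROM records WHERE id = ? AND owner_id = ?");
    $stmt->bind_param('ii', $recordId, $_SESSION['user_id']);
    $stmt->execute();
    echo json_encode(array('deleted' => $stmt->affected_rows));
}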