I've got a stack of ajax requests:
$("someTable tr").each(function() {
// send the data.
var scriptURL = "/module/action/data/" + $(this).find(".data").html() + "/";
document.cyonVars.xhrPool[count] = $.ajax({
type: "GET",
url: scriptURL,
queue: "autocomplete",
cancelExisting: true,
dataType: 'json',
success: function(data){
// do something
}
}
});
count++;
})
While these requests are running, the user can press a button. This triggers another ajax request, something like this:
var scriptURL = "/module/anotheraction/" +
data) + "/";
$.ajax({
type: "GET",
url: scriptURL,
queue: "autocomplete",
cancelExisting: true,
dataType: 'json',
success: function(data){
// Do another thing
}
});
The requests from the first action respond asynchronously, as I want. But when a user triggers the second request, it waits until the other requests are finished, and it should be processed earlier. I already tried session_write_close(), but it didn't change anything. Thanks for helping.
I'm not so sure it should be processed earlier. Browsers limit the number of concurrent connections to the same server, and if a request takes long to reply, your app may stall on a few or more requests. Keep in mind that async requests are only fired asynchronously (the code after the ajax call continues to execute); the requests themselves may still be executed in sequence.
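For illustration, a minimal sketch (all names assumed, not from the question) of a client-side queue that caps how many requests are in flight at once, so the burst of per-row requests cannot monopolize the browser's connection pool:

var pending = [], active = 0, MAX_ACTIVE = 4; // illustrative cap

function enqueue(url, onSuccess) {
    pending.push({ url: url, onSuccess: onSuccess });
    drain();
}

function drain() {
    while (active < MAX_ACTIVE && pending.length > 0) {
        var job = pending.shift();
        active++;
        $.ajax({ type: "GET", url: job.url, dataType: "json" })
            .done(job.onSuccess)
            .always(function () { active--; drain(); }); // free the slot, start the next job
    }
}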
The problem was Zend_Session::start();
I had to avoid using Zend_Session (which locks the session) while processing the async requests.
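For illustration, a minimal sketch of that fix in ZF1, assuming the action only needs to read session state: start the session, then release the lock before the slow work.

<?php
// Sketch only: release the session lock early so parallel ajax requests are not serialized.
Zend_Session::start();
// ... read whatever session state the action needs ...
Zend_Session::writeClose(); // same effect as session_write_close()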
After the page has fully loaded, I make an ajax request to an action.
While waiting for the response from the action (it takes 2 or 3 seconds), if the user clicks another link, I want to abort the previous request and kill the MySQL process at once.
How can I do this?
I tried this:
var xhr = $.ajax({
/*
params
*/
});
//before unload page
xhr.abort();
but I think that will not kill the SQL process.
Keep a reference to the ajax call. Here the jqXHR object is stored in ajx:
var formData = {}; // your data for the POST
var ajx = $.ajax({
    type: "POST",        // HTTP verb
    url: "filename.php", // file to request
    data: formData,
    success: function(response){
        console.log("ajax completed");
        console.log(response);
    }
});
Stop the ajax request when the button is clicked:
$( "#button_id" ).click(function() {
ajx.abort();
});
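Note that aborting the jqXHR only makes the browser stop listening; the PHP script and its query keep running on the server. A sketch of one common approach, with all file names, credentials, and session keys illustrative rather than taken from the question: have the long-running script record its MySQL connection's thread id, then KILL that thread from a second endpoint.

<?php
// longquery.php (illustrative): record this connection's thread id before the slow query.
session_start();
$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // assumed credentials
$_SESSION['query_thread_id'] = $mysqli->thread_id;       // remember which connection to kill
session_write_close();                                   // release the session lock for other requests
$result = $mysqli->query('SELECT SLEEP(30)');            // stand-in for the slow query

<?php
// kill.php (illustrative): terminate the recorded connection, aborting its query.
session_start();
$threadId = isset($_SESSION['query_thread_id']) ? (int) $_SESSION['query_thread_id'] : 0;
session_write_close();
if ($threadId > 0) {
    $mysqli = new mysqli('localhost', 'user', 'pass', 'db');
    $mysqli->query('KILL ' . $threadId); // KILL closes the connection and stops its running query
}

On the client, pair the two: ajx.abort(); $.get('kill.php');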
I have created a PHP web application with MySQL as the database backend, but there is an 'Aborted' note in the status column of Firebug's Net panel when I access the web page. Why?
$('#submit').on('click', function () {
// e.preventDefault();
var formData = JSON.stringify($("#frmPayoye").serializeObject());
console.log(formData);
$.ajax({
type: "POST",
url: "http://www.sanaroopay.com/pg/api/ectransact/",
data: formData,
cache: false,
timeout: 60000,
async: false,
processData: true,
dataType: 'json', //you may use jsonp for cross origin request
contentType: "application/json; charset=utf-8",
crossDomain: true,
success: function (data) {
alert(JSON.parse(data));
// alert("ok");
console.log("success");
// window.location.assign('https://secure.payoye.net/web/merchant');
},
error: function () {
console.log("Failed");
}
});
});
You are not cancelling the form submission, so the Ajax call is aborted and the page submits the form, as it is designed to do. You need to stop the form submission:
$('#submit').on('click', function (evt) {
evt.preventDefault(); //stop the default action of the button
//Rest of your code
});
Please see the documentation of XHR open(), for example here: https://developer.mozilla.org/en-US/docs/DOM/XMLHttpRequest
Note: Calling this method on an already active request (one for which open() or openRequest() has already been called) is the equivalent of calling abort().
Just create a new XHR instance whenever you need one. Better yet, use jQuery or other JS library to do AJAX. It should shield you from these intricacies.
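For illustration, a minimal sketch (function name assumed) that creates a fresh XMLHttpRequest for every call, so the implicit abort() described above can never occur:

function getJSON(url, onDone) {
    var xhr = new XMLHttpRequest(); // a new instance per request, never reused
    xhr.open('GET', url, true);
    xhr.onload = function () {
        if (xhr.status === 200) {
            onDone(JSON.parse(xhr.responseText));
        }
    };
    xhr.send();
}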
In a web application that interacts with one server, there is a jQuery Ajax call to a PHP script that takes about 20 seconds to execute.
$(document).ready(function() {
$.ajax({
type: 'POST',
dataType: 'json',
url: 'createPDF.php',
data: str,
success: function(response) {
showPDF(response);
},
error: function() {
console.log('ERROR WHEN CREATING PDF');
}
});
});
In the meantime, a user asks the application for a list (getList.php) through another jQuery Ajax call, independent from createPDF.
$(document).ready(function() {
$.ajax({
type: 'POST',
dataType: 'json',
url: 'getList.php',
data: str,
success: function(response) {
showList(response);
},
error: function() {
console.log('ERROR WHEN GETTING LIST');
}
});
});
The problem is that getList.php only starts executing once createPDF.php finishes.
The application works with sessions. Tests have been done from separate clients, in which case both requests run in parallel, but from the same computer they don't.
In this scenario, how can both requests be made to run in parallel?
Thank you
Basically what is happening is that on your server the session file is locked by the first script (createPDF.php here). After that script has completed its task, the session is unlocked again. Unfortunately your getList.php script wants to access the same session; since it is locked, getList.php has to wait until it is unlocked.
One solution is to call session_write_close() after all writing to the session data has been done; that causes the session file to be unlocked and lets the next script use it. The call should happen as soon as possible after session_start() to minimize the waiting time for other requests.
For a more detailed explanation have a look at https://stackoverflow.com/a/6933294/2442804
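A minimal sketch of that pattern for the slow script, assuming it only needs to read from the session (the session key is illustrative):

<?php
// createPDF.php: read what is needed, then release the session lock immediately.
session_start();
$userId = $_SESSION['user_id']; // illustrative read of the session
session_write_close();          // unlocks the session file; getList.php can now run in parallel
// ... the ~20-second PDF generation happens here without blocking other requests ...
// note: writes to $_SESSION after session_write_close() are not persisted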
I know how to use $.ajax. I have a CodeIgniter project, so I just call:
url:'<?php echo base_url()."somecontroller/somefunction"?>',
This is all OK, but ajax waits for the response. I just want to call the URL, as you would by typing it into your browser. I don't want to wait for the response, because the controller does a redirect and then loads a view. I also need to be able to send some data via POST.
How can I do this?
You can use the following to send an asynchronous request with additional parameters.
$.ajax({
type: "POST",
url: url,
data: data, // additional parameters
async: true,
success: function(response){
}
});
If you want the request to be synchronous, set async: false.
For a POST request, use type: 'POST'.
Refer to http://api.jquery.com/jQuery.ajax/
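As a sketch of the fire-and-forget case from the question (the POST payload is illustrative): attach no handlers, and execution continues immediately; whatever the controller renders or redirects to is simply ignored by the client.

$.ajax({
    type: "POST",
    url: '<?php echo base_url()."somecontroller/somefunction"?>',
    data: { id: 42 }, // illustrative POST payload
    async: true       // the default; the script does not wait for the response
});
// no success/error handlers: the controller's redirect and view are never used client-side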
You can try this. Replace the alert() call with your own function:
$.ajax({
url: 'test.html',
async: false,
success: function () { alert('hello people from the world'); }
});
Never use async: false.
Since JavaScript runs on the UI thread, an async: false request will freeze the browser until the server replies.
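The asynchronous equivalent of the snippet above, sketched with the jqXHR promise methods:

$.ajax({ url: 'test.html' })
    .done(function () { alert('hello people from the world'); }) // runs when the response arrives
    .fail(function () { console.log('request failed'); });       // meanwhile the UI stays responsive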
I have an ajax script that fires off about 250 synchronous PHP calls. This is my script:
$(document).ready(function(){
$("#generate").html("<div class='modal'><p>Initializing...</p></div>");
$.ajax({
url:'/fetch around 250 url from database.php',
async:false,
dataType: 'json',
success: function(data){
$.each(data,function(key,val){
$("#generate").html("<div class='modal'><p>Fetching "+val.url+"</p></div>");
saveimage(val.url);
});
$("#generate").html("<div class='modal'><p>done</p></div>");
finalcreate();
},
});
});
function saveimage(url){ // url would be passed along to the PHP script
$.ajax({
url: 'do some php work.php',
async: false,
});
}
function finalcreate(){
$.ajax({
url: 'do some php work.php',
async: false,
});
}
In the first part, the script fetches more than 250 URLs from the database, and for every URL it does some PHP work using another ajax call. When the loop ends, the script makes a final ajax call.
When I run this program in Firefox, it runs successfully for only about 40 URLs; then the browser shows a dialog asking whether the user wants to stop the script. If the user chooses to continue, it runs for the next 40 URLs, and the same process repeats until the end.
How can I optimize this script? I don't want the browser to show the option to stop the script. Please help.
Thanks
Try this:
function nextrequest() {
if (requests.length == 0) {
$("#generate").html("<div class='modal'><p>done</p></div>");
finalcreate();
return;
}
var val = requests.pop();
$("#generate").html("<div class='modal'><p>Fetching "+val.url+"</p></div>");
saveimage(val.url);
}
var requests = [];
$(document).ready(function(){
$("#generate").html("<div class='modal'><p>Initializing...</p></div>");
$.ajax({
url:'/fetch around 250 url from database.php',
dataType: 'json',
success: function(data){
requests = data;
nextrequest();
},
});
});
function saveimage(url){ // url would be passed along to the PHP script
$.ajax({
url: 'do some php work.php',
success: function(data) {
// do something...
nextrequest();
}
});
}
function finalcreate(){
$.ajax({
url: 'do some php work.php',
});
}
You store all the URLs in a global variable, and every time a request is done you get the next one, until all of them are consumed (requests.length == 0), at which point you make the final request.
This way the user can still do something else on the page, and you can display progress every time a request completes. Also, a good thing is that you can make 2 or more calls at once to speed the process up.
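For example, starting two "workers" with the queue above is enough to double up (the count is illustrative). One caveat, noted in the comments: with more than one worker, finalcreate() may fire while another request is still in flight, so an exact "done" signal would need a counter of active workers, which is not shown here.

var CONCURRENCY = 2; // illustrative
for (var i = 0; i < CONCURRENCY; i++) {
    nextrequest(); // each worker chains itself via the success callback in saveimage()
}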
An ajax call takes a lot of time to complete because it communicates with a remote server, and the round trip to the server is the slowest part. You should send one batch request with all the data needed; the server should then split the data up and handle it. Cutting ~250 round trips down to one should make the whole thing dramatically faster.
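A sketch of that batched variant, assuming urls holds the list fetched earlier; 'saveimages_batch.php' is an assumed endpoint (not from the question) that would loop over the URLs server-side:

$.ajax({
    type: 'POST',
    url: 'saveimages_batch.php',          // assumed batch endpoint
    data: { urls: JSON.stringify(urls) }, // all ~250 URLs in one round trip
    success: function () {
        finalcreate();                    // one follow-up call instead of 250
    }
});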
Alternatively, put a time interval between the ajax requests:
success: function(data){
    $.each(data, function(key, val){
        $("#generate").html("<div class='modal'><p>Fetching "+val.url+"</p></div>");
        // wrap the call in a function and scale the delay by the index;
        // setTimeout(saveimage(val.url), 3000) would invoke saveimage immediately
        setTimeout(function(){ saveimage(val.url); }, 3000 * key);
    });
}