I have one PHP file which processes adding records to the database from an array.
For example, the array has 5 items:
array an = 'abc','xyz','ert','wer','oiu'
I want to call that PHP file with the jQuery AJAX method:
um = an.split(',');
var counter = 0;
if(counter < uemail.length) {
    $("#sending_count").html("Processing Record "+ ecounter +" of " + an.length);
    var data = {uid: um[counter]
    $.ajax({
        type: "POST",
        url: "save.php",
        data: data,
        success: function(html){
            echo "added";
            counter++;
        }
What it does: it completes the whole loop, but save.php is still working.
What I want: after one record it should stop until save.php has completed, then wait 10 seconds and start adding the 2nd element.
Thanks
Not sure if I understand your issue correctly, but you may want to use synchronous (blocking) AJAX calls instead of asynchronous (non-blocking) ones. When you make an asynchronous call, code execution continues immediately, leaving the AJAX call "in the background". A synchronous call blocks code execution until the request has finished.
$.ajax({ async: false, ... });
It's not my place to question why you would want to do this, although what you are trying could result in a slow and unresponsive UI.
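A minimal sketch of how that flag could fit the question's loop (assuming an holds the comma-separated string and save.php is the endpoint from the question):
var um = an.split(',');
for (var counter = 0; counter < um.length; counter++) {
    $("#sending_count").html("Processing Record " + (counter + 1) + " of " + um.length);
    $.ajax({
        type: "POST",
        url: "save.php",
        data: { uid: um[counter] },
        async: false // block until save.php has finished before the next iteration
    });
}
Keep in mind that synchronous requests freeze the page while they run, and browsers have deprecated async: false on the main thread.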
You'll want a while loop, not an if statement:
while(counter < uemail.length) {
Other points present themselves as well.
You'll want to turn off the async flag to ensure each call is complete before executing the next line. A delay between iterations will also help.
$.ajax({
    async: false, // ensure our requests are synchronous
    type: "POST",
    url: "save.php",
    data: data,
    success: function(html){
        echo "added"; //?? not a valid javascript function
        delay(10000); //10000ms = 10 seconds (placeholder: plain JavaScript has no built-in delay())
    }
});
counter++;
}
Also, echo is not a valid jQuery/JavaScript function, and your braces are somewhat unclear and probably missing.
I have assumed above that counter++ sits outside your success callback, because if it didn't and you got a failure, the loop could continue forever.
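For completeness, here is a different, non-blocking sketch (my own, not the answer above): it chains each request from the success callback and uses setTimeout for the 10-second pause the question asks for. The um array, uid parameter, and save.php endpoint are taken from the question.
var um = an.split(',');
var counter = 0;

function saveNext() {
    if (counter >= um.length) {
        return; // all records processed
    }
    $("#sending_count").html("Processing Record " + (counter + 1) + " of " + um.length);
    $.ajax({
        type: "POST",
        url: "save.php",
        data: { uid: um[counter] },
        success: function(html) {
            counter++;
            setTimeout(saveNext, 10000); // wait 10 seconds, then send the next record
        }
    });
}

saveNext();
This keeps the UI responsive and only moves on once save.php has answered.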
I have been staring at this problem for the past 2 hours and can't seem to fathom it, even after validating that everything loads correctly when scouring the console.
I basically have two sliders on my page which will eventually populate results in a table. Every time I change a slider I send an array of two values to my AJAX script:
function update_results(values)
{
    $.ajax({
        type: "GET",
        url: "./app/core/commands/update_results.php",
        data: { query : values },
        cache: false,
        success: function(data) {
            // eventually some success callback
        }
    });
}
The browser successfully finds update_results.php but it does not perform the logic on the page (I assume it has found the page, as no 404 error appears in my console).
At this point in time the script is extremely bare-bones as I'm obviously trying to establish communication between both files:
<?php
$vals = $_GET['values'];
echo $vals;
In this case $vals is never echoed to the page; am I missing something in my AJAX? I know the values enter the function, as I alerted them out before attaching the PHP script.
AJAX calls can suffer from the browser cache. If your browser thinks it already knows the content of update_results.php, it will return the cached content and not trigger the script, so your modified code might simply not get executed (and therefore your query wasn't executed).
To make sure this is not happening in your case, it is always a good idea to pass an extra parameter (a timestamp) to the script, so the browser treats each request as a new one:
function update_results(values)
{
    $.ajax({
        type: "GET",
        url: "./app/core/commands/update_results.php?random_parameter=" + (new Date().getTime()),
        data: { query : values },
        cache: false,
        success: function(data) {
            // eventually some success callback
        }
    });
}
This will ensure that the URL is different for every request, so the browser cache is bypassed for update_results.php, no matter what the browser cache settings or server-side cache headers say.
When the AJAX call is done, the success callback is triggered and the output of your PHP script is passed in as data.
You can handle the data like this:
$.ajax({
    type: "GET",
    url: "./app/core/commands/update_results.php",
    data: { query : values },
    cache: false,
    dataType: "text",
    success: function(data) {
        document.write(data);
    }
});
PHP, running on the server, is unaware of what happens in the front-end browser; it simply responds to an AJAX request like any other normal HTTP request. So a failing SQL query has nothing to do with JavaScript, which is only responsible for sending the AJAX request and receiving and handling the response. I guess there are some errors in your PHP script.
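As a debugging aid (a sketch of my own, not part of the answer), an error callback makes whatever the PHP script actually returns visible in the console; the endpoint and query parameter are the ones from the question:
function update_results(values) {
    $.ajax({
        type: "GET",
        url: "./app/core/commands/update_results.php",
        data: { query : values },
        cache: false,
        success: function(data) {
            console.log("server replied:", data); // raw output of the PHP script
        },
        error: function(jqXHR, textStatus, errorThrown) {
            console.log("request failed:", textStatus, errorThrown, jqXHR.responseText);
        }
    });
}
Watching the Network tab (or these logs) should show whether the PHP side is reached and what it sends back.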
I've made a simple PHP jQuery chat application with short polling (AJAX refresh): every 2-3 seconds it asks for new messages. But I read that long polling is a better approach for chat applications, so I went through some long polling scripts.
I made it like this:
JavaScript:
$("#submit").click(function(){
$.ajax({
url: 'chat-handler.php',
dataType: 'json',
data: {action : 'read', message : 'message'}
});
});
var getNewMessage = function() {
$.ajax({
url: 'chat-handler.php',
dataType: 'json',
data: {action : 'read', message : 'message'},
function(data){
alert(data);
}
});
getNewMessage();
}
$(document).ready(getNewMessage);
PHP
<?php
$time = time();
while ((time() - $time) < 25) {
    $data = $db->getNewMessage();
    if (!empty($data)) {
        echo json_encode($data);
        break;
    }
    usleep(1000000); // 1 second
}
?>
The problem is that once getNewMessage() starts, it keeps executing until it gets some response from chat-handler.php, and it calls itself recursively. But if someone wants to send a message in the meantime, the $("#submit").click() handler never executes, because getNewMessage() is still executing. So is there any workaround?
I strongly recommend that you read up on two things: the idea behind long polling, and jQuery callbacks. I'll quickly go into both, but only in as much detail as this box allows me to.
Long polling
The idea behind long polling is to have the webserver artificially "slow down" when returning the request so that it waits until an event has come up, and then immediately gives the information, and closes the connection. This means that your server will be sitting idle for a while (well, not idle, but you know what I mean), until it finally gets the info that a message went through, sends that back to the client, and proceeds to the next one.
On the JS client side, the effect is that the Ajax callback (this is the important bit) is delayed.
jQuery .ajax()
$.ajax() returns immediately. This is not good. You have two choices to remedy this:
bind your recursion call in the success and error callback functions (this is important: the error callback might very well fire due to a timeout), or
use $.when(), as below:
var x = $.ajax({blah});
$.when(x).done(function(a) { recursiveCallHere(); });
Both amount to the same thing in the end. You're triggering your recursion on callback and not on initiation.
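A sketch of the first option (assuming the same chat-handler.php endpoint and a timeout slightly longer than the server's 25-second loop):
var getNewMessage = function() {
    $.ajax({
        url: 'chat-handler.php',
        dataType: 'json',
        data: {action : 'read', message : 'message'},
        timeout: 30000, // a bit longer than the 25s server loop
        success: function(data) {
            alert(data);
            getNewMessage(); // re-poll once a message has arrived
        },
        error: function() {
            setTimeout(getNewMessage, 1000); // re-poll after a timeout or error, with a short pause
        }
    });
};
Note that the $("#submit").click() handler can now run freely in between, because nothing blocks while the poll request is pending.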
P.S: what's wrong with sleep(1)?
In long polling, a new request should be initiated only after you have received the data from the previous one. Otherwise you get infinite recursion and the browser freezes.
var getNewMessage = function() {
    $.ajax({
        url: 'chat-handler.php',
        dataType: 'json',
        data: {action : 'read', message : 'message'},
        success: function(data) {
            alert(data);
            getNewMessage(); // <-- should be here
        }
    });
}
I want to run several AJAX calls on the same page from the same client.
The AJAX calls start correctly, but the server queues the requests and executes just one at a time.
I've also checked the request start times and the response times.
Looking at the responses, each request takes a multiple of the previous request's time.
Help me please!
$("document").ready(function() {
$(".user-id").each(function() {
var id = $(this).html();
getData(id);
});
});
function getData(id) {
$.ajax({
url: 'loadOperatorDiagram.php',
type: 'GET',
data: {id: id},
async: true,
cache: false,
success: function(resp) {
$("#boxes").append(resp);
draw(id); // A javascript function which draw into a canvas
}
});
}
loadOperatorDiagram.php runs some queries and its execution time is about 5 seconds. The first AJAX request responds after 5 seconds, the second one after 10, and so on. But every request starts asynchronously and correctly, within a few milliseconds of the others.
If you are using sessions in PHP (it sounds like it, otherwise you could do at least 2 simultaneous requests...), you should close the session as soon as possible in your PHP script, because PHP locks the session while the script runs.
Just call session_write_close(); as soon as you have what you need from the session.
I have an AJAX script that fires off about 250 synchronous PHP calls. This is my script:
$(document).ready(function(){
    $("#generate").html("<div class='modal'><p>Initializing...</p></div>");
    $.ajax({
        url:'/fetch around 250 url from database.php',
        async:false,
        dataType: 'json',
        success: function(data){
            $.each(data,function(key,val){
                $("#generate").html("<div class='modal'><p>Fetching "+val.url+"</p></div>");
                saveimage(val.url);
            });
            $("#generate").html("<div class='modal'><p>done</p></div>");
            finalcreate();
        }
    });
});
function saveimage(){
    $.ajax({
        url: 'do some php work.php',
        async: false
    });
}

function finalcreate(){
    $.ajax({
        url: 'do some php work.php',
        async: false
    });
}
In the first part the script fetches more than 250 URLs from the database, and for every URL it does some PHP calculation with another AJAX call. When the loop ends, the script does a final AJAX call.
When I run this program in Firefox, it runs successfully for only about 40 URLs; then the browser shows a dialog asking whether the user wants to stop the script. If the user lets it continue, the script runs for the next 40 URLs, and the same process repeats until the end.
How can I optimize this script? I don't want the browser to show the option to stop the script. Please help.
Thanks
Try this:
function nextrequest() {
    if (requests.length == 0) {
        $("#generate").html("<div class='modal'><p>done</p></div>");
        finalcreate();
        return;
    }
    var val = requests.pop();
    $("#generate").html("<div class='modal'><p>Fetching "+val.url+"</p></div>");
    saveimage(val.url);
}

var requests = [];

$(document).ready(function(){
    $("#generate").html("<div class='modal'><p>Initializing...</p></div>");
    $.ajax({
        url:'/fetch around 250 url from database.php',
        dataType: 'json',
        success: function(data){
            requests = data;
            nextrequest();
        }
    });
});

function saveimage(){
    $.ajax({
        url: 'do some php work.php',
        success: function(data) {
            // do something...
            nextrequest();
        }
    });
}

function finalcreate(){
    $.ajax({
        url: 'do some php work.php'
    });
}
You store all the URLs in a global variable, and every time a request is done you get the next one, until all of them are consumed (requests.length == 0), at which point you call the final request.
This way the user can still do something else on the page, and you can display progress every time a request finishes. A nice side effect is that you can make 2 calls at once, or more, to make the process faster, as sketched below.
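A rough sketch of that idea (my own, not the answer's exact code), assuming the server tolerates a few parallel requests; queue, inFlight, CONCURRENCY and pump() are names introduced here, while the placeholder URL and finalcreate() come from above:
var queue = [];      // filled from the first AJAX call, e.g. queue = data;
var inFlight = 0;
var CONCURRENCY = 3; // assumption: the server can handle 3 requests at once

function pump() {
    while (inFlight < CONCURRENCY && queue.length > 0) {
        var val = queue.pop();
        inFlight++;
        $("#generate").html("<div class='modal'><p>Fetching " + val.url + "</p></div>");
        $.ajax({
            url: 'do some php work.php', // placeholder from the question
            data: { url: val.url }       // assumption: the URL is passed as a parameter
        }).always(function() {
            inFlight--;
            if (queue.length === 0 && inFlight === 0) {
                // every request finished: show "done" and run the final call
                $("#generate").html("<div class='modal'><p>done</p></div>");
                finalcreate();
            } else {
                pump(); // start the next request
            }
        });
    }
}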
An AJAX call needs quite some time to complete, because it communicates with a remote server; the round trip to the server is the slowest part. You should send one batch request with all the needed data to the server, which should then split the data up and handle it. Everything should complete about 250 times faster.
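A hedged sketch of that batching idea; process_all.php is a hypothetical endpoint (not from the question) that would have to loop over the URLs server-side:
$.ajax({
    type: "POST",
    url: "process_all.php",               // hypothetical batch endpoint
    dataType: "json",
    data: { urls: JSON.stringify(data) }, // `data` is the array from the first request
    success: function(result) {
        $("#generate").html("<div class='modal'><p>done</p></div>");
        finalcreate();
    }
});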
Make some time interval between the AJAX requests:
success: function(data){
    $.each(data,function(key,val){
        $("#generate").html("<div class='modal'><p>Fetching "+val.url+"</p></div>");
        // wrap the call in a function and stagger by the index,
        // otherwise saveimage() would run immediately for every item
        setTimeout(function(){ saveimage(val.url); }, key * 3000);
    });
}
I've got 2 AJAX requests on one page. I run the first request and separately start the second one, but the second one stops working once the first has been run and continues only when the first is over.
The first request takes a long time, something like 30-60 seconds, and during this time I need the second request to show logs of what is happening with the first request. I tried to use async: true but it didn't help me.
Here is my code:
<script type="text/javascript">
var auto_refresh = setInterval(function() {
    asyncGet('log.php');
}, 1000);

function asyncGet(addr) {
    $.ajax({
        url: addr,
        async: true,
        success: function (response) {
            $('#loadLog').html(response);
        }
    });
}

function getConn(addr) {
    $.ajax({
        url: addr,
        async: true,
        success: function (response) {
            stopGet();
        }
    });
}
</script>
<div id="loadLog" class="lLog"></div>
I call the first AJAX request with getConn('main.php') from a function when a button is pressed.
The second request is running, but it does not show a response until the first request completes.
I will attach an image from Firebug.
main.php is the request that takes the longer time.
log.php is the logger that is blocked.
Would really appreciate some pointers to where I'm going wrong.
This may be a problem with the session. Check out this post. You probably need to close the session in your main.php as fast as possible (e.g. with session_write_close()), so that log.php is not blocked waiting for the session lock.