I have a simple question, but can't find a quick answer. Maybe somebody here can help me.
If I trigger an AJAX request on my HTML page with jQuery and then trigger a refresh while the AJAX request isn't finished yet, does the AJAX request stop? Or does it continue on the server?
Thanks.
The AJAX request will stop on the client side. Whether your server still processes the request depends on whether it had completely received the request before the connection was dropped.
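If it matters whether the server keeps going, the script can also check for the disconnect itself. This is only a rough PHP sketch, with do_one_chunk_of_work() as a hypothetical placeholder for your long-running job; note that PHP only detects an aborted connection when it tries to send output:

<?php
// Take control of abort handling so we can stop (or clean up) deliberately
// instead of letting PHP kill the script in the middle of a write.
ignore_user_abort(true);

for ($step = 0; $step < 10; $step++) {
    do_one_chunk_of_work($step); // hypothetical unit of work

    // PHP only notices the disconnect when it attempts to send output,
    // so push a byte to the client and then check the connection state.
    echo ' ';
    flush(); // with output buffering you may also need ob_flush()

    if (connection_aborted()) {
        // The browser refreshed or navigated away: stop (or roll back) here.
        exit;
    }
}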
This is one of the rare cases where using async: false to get the behaviour you want is acceptable. However, it would be better to simply wait to do the refresh until the AJAX request is complete.
I believe it depends on whether or not a session was started on the page the AJAX request goes to.
If it was, PHP will 'block' until the AJAX request has completed, and the page refresh will not happen until then.
You can, however, prevent this blocking behaviour with session_write_close(), which leaves the session open for reading but not for writing.
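As a rough sketch of what that looks like in the PHP file that handles the AJAX request (the file name and do_expensive_work() are made-up placeholders):

<?php
// ajax-handler.php (hypothetical endpoint)
session_start();

// Read whatever you need from the session first...
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;

// ...then release the session lock so other requests from the same browser
// (including a full page refresh) are not serialized behind this one.
// $_SESSION stays readable after this call, but writes are no longer saved.
session_write_close();

// The long-running work now happens without holding the session lock.
$result = do_expensive_work($userId);

header('Content-Type: application/json');
echo json_encode(array('result' => $result));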
It will be aborted immediately on the client side, but it could still make changes to the database if the server-side script performs any update operations.
A few times a day, when our website is busy, we have more than 1000 requests per second on our database.
During these busy times, when a user clicks on an element and makes an AJAX call and then clicks another element and makes another AJAX call, the second call will wait for the response of the first call.
How can I have these AJAX calls run simultaneously? Is this gap between the two calls caused by the server being busy? If so, how can we handle simultaneous AJAX calls?
I had a similar problem in the past. In my case the cause was an open session on the server side: even though the AJAX call was asynchronous, it had to wait for the server because of the lock on the session file.
Try closing the session for writing as soon as you no longer need to write to it, then check your AJAX calls again.
Here is a reference to the relevant function: session_write_close
I am calling the server via AJAX and I read the server's response by checking the xmlHttp object's properties, like this:
if (xmlHttp.readyState == 4) {
    if (xmlHttp.status == 200) {
        // AJAX succeeded
    }
}
Now, I have a timeout mechanism that waits 10 seconds for the request to complete, using:
setTimeout();
in JavaScript. If I haven't received status 200 for the request by then, I abort it via:
xmlHttp.abort();
and then I show a timeout message and a button to resend the request.
The AJAX request works perfectly about 98% of the time, but:
The problem occurs when my internet connection is unstable. I send the request successfully, but while the response is on its way back the connection drops and I lose the response. In that case the timeout message ("request not completed, please try again") is shown to the user, yet on the server side (PHP) the request was actually executed successfully, and it triggers an email to the user. So the user receives an email saying the request succeeded while also seeing the timeout message.
So what should I do? When the internet is down completely, the timeout mechanism works fine. But when the ISP connection is flaky like this, how can I prevent the PHP script from executing?
Thanks for your help
You should leave it as is, unless achieving this kind of redundancy is of critical importance.
In case it is critical:
Instead of immediately throwing an error message, you could retry sending the request 2 or 3 times. This would give the server more chances to respond to the request.
To do that, you'd have to make sure that the same request isn't processed more than once (in your case, sending the mail). You'd have to implement a simple system in your PHP to cache responses and recognize requests that were already fulfilled so they won't be processed again:
Create a unique id in JavaScript and send it as a parameter in your AJAX request.
Use a session array to store the responses of your requests, keyed by this unique id.
When a request comes in, check the responses array to see if it was already fulfilled, in which case you just echo back the cached response without processing it again.
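A rough PHP sketch of that server-side check, assuming the client generates a requestId and sends it with every attempt (the parameter name and send_confirmation_email() are made up for illustration):

<?php
session_start();

$requestId = isset($_POST['requestId']) ? $_POST['requestId'] : null;
if ($requestId === null) {
    http_response_code(400);
    exit('Missing requestId');
}

// Initialise the per-session response cache on first use.
if (!isset($_SESSION['ajax_responses'])) {
    $_SESSION['ajax_responses'] = array();
}

// If this request was already fulfilled, replay the cached response
// instead of doing the work (and sending the email) a second time.
if (isset($_SESSION['ajax_responses'][$requestId])) {
    echo $_SESSION['ajax_responses'][$requestId];
    exit;
}

// First time we see this id: do the real work exactly once.
send_confirmation_email(); // hypothetical helper for the email step
$response = json_encode(array('status' => 'ok'));

$_SESSION['ajax_responses'][$requestId] = $response;
echo $response;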
I have a situation where I need to run some PHP, specifically to send out a SOAP request, wait for the response, and then do something with the result; however, these requests can sometimes be slow and take up to 9 seconds.
Now I don't really want the user sitting there waiting 9 seconds for this to complete.
Basically the user flow is:
User comes to payment page
User clicks button to pay via payment gateway (Paypal)
User then returns to the site (SOAP request and all that need to be finished at this stage)
I was thinking of running it with the Paypal IPN notification but then didn't think it would be finished by the time the user got back to the site.
So I'm wondering if I could fire off an AJAX call when the user hits the first page and have it run while the user is submitting the payment, so that by the time they get back to the site it should be done. It's not a big deal if they don't end up going through with the payment, so I'm not worried about running this code before confirming payment.
My question is, if I fire this off to be run via AJAX, will the code still be executed if the user leaves the page before it has finished? If not, any ideas?
Once a request is sent to the server, the server side of the request will be completed irrespective of whether you navigate away from the page.
The only thing that will not happen is the execution of client side callback method.
If you are using PHP, there is a php.ini setting, ignore_user_abort, that tells PHP what to do when the client aborts the request.
Its value is false by default.
http://www.php.net/manual/en/misc.configuration.php#ini.ignore-user-abort
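You can also set it per script instead of in php.ini. A minimal sketch for the SOAP scenario above (the WSDL URL and operation name are hypothetical):

<?php
// Keep running even if the browser disconnects (the user navigated away,
// closed the tab, or is off at the payment gateway).
ignore_user_abort(true);

// The SOAP round trip can take around 9 seconds, so allow enough run time.
set_time_limit(60);

// Hypothetical slow SOAP call.
$client = new SoapClient('https://example.com/service?wsdl');
$result = $client->__soapCall('SlowOperation', array());

// Persist the result somewhere the payment-return page can pick it up,
// for example in the database, keyed by the user's order id.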
The user clicks a link which executes an AJAX request - let's say this request takes 20 seconds.
Before the request is complete, the user clicks another link which redirects (no AJAX) the whole page to another page.
What will happen to the AJAX request? Will it always complete on the server side, with the response simply going nowhere? Or will the AJAX request be killed on the server side immediately?
I ask because I have a script which takes some time to run, but the user doesn't need to know the result - it's just fire and forget. Maybe there is even some option in AJAX to force it not to send any response?
The browser should kill the AJAX request, closing the connection to the server; however, this does not mean that your processing on the server is necessarily killed too: ignore_user_abort()
The server will complete the request, unaware that the client has "moved on." The server will return the response to the client like it normally does. The client will simply ignore the response.
So expect everything server-side to happen as normal, meaning the "fire and forget" approach will work (since the client has moved on and has "forgotten"). But if you want to do anything client-side in response (which would negate the "forget" part), there's no way for the new page to intercept the response; the browser will ignore it.
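If you want a true fire-and-forget endpoint that never keeps the client waiting, one option - assuming you run PHP under PHP-FPM, where fastcgi_finish_request() is available, and with run_long_task() standing in for your slow script - is to end the response first and keep working afterwards:

<?php
// Keep going even if the client disconnects.
ignore_user_abort(true);

// Send a tiny response right away so the browser is free to navigate on.
header('Content-Type: text/plain');
echo 'accepted';

// Flush the response and close the connection (PHP-FPM only); everything
// after this line runs with no client attached.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// The slow, fire-and-forget work happens here.
run_long_task(); // hypothetical 20-second job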
There might be cases where your request takes a long time because of problems with the client's internet connection or with the server's connection. Since the client doesn't want to wait, he clicks the AJAX link again, which sends the request to the server a second time and messes up the following:
Rendering of our website in the browser, because we are putting extra load on the browser.
What if the second request is processed correctly and you show the user the page, and then along comes the error message from your first request (saying the request timed out), which loads on top of the correct content and interferes with the user reading it?
I want to stop the 1st Ajax response if the Ajax function is called twice. How do I do this?
so I want to stop the 1st Ajax response if the Ajax function is called twice
What you actually want is to prevent a second request when a first request is in progress.
For example, you may want to change the Save button to Saving..., disable it, and add a little progress wheel to give live feedback to the user. (Facebook does this.)
The key is live feedback to the user. If the user has no idea what is going on, they are going to think nothing is happening.
You might also want to check why the operation is taking so long:
If it is a complex or time-consuming operation, like, say, report generation or a file upload, a progress bar should do.
If it is because of the client's internet connection, say so up front, like Gmail does: You have a slow Internet connection and this site may be slow. Better still, provide a fallback option with less or no AJAX.
You say "because we are giving extra load to the browser": this is a bit fishy. You will not be putting extra load on the browser unless you are giving it tons of HTML to render. Use AJAX only for small updates in the browser; you may want to reload the whole page if you expect a large change.
How about this: seeing as you're already using some form of JavaScript, keep that link hidden or disabled, so to speak, until the page's request has gone through. You could, for example, have the AJAX request return a flag that re-enables the link so the user can click it. Until that flag comes back from the original AJAX request, the link stays disabled; if it is clicked again, it disables itself once more and waits for the same flag to come back.
Think of how some developers disable a submit button on a form so a user can't double-submit it - it's the same concept here.