Handling responses to multiple server request calls in Objective-C - php

I am building an iOS matching app in Objective-C where a user needs to continuously keep tapping on an image; on every tap, a request (match.php) is made to the server to check whether that image is found there. If the image is present on the server, the server sends back a positive response; if it is absent, it sends no response at all.
If a positive response is received, the app shows a "Found" alert on the screen. After every tap, a new image is presented to the user.
My problem is this: if I wait after tapping on one image, everything works properly; I get the desired positive response from the PHP file for that particular image, and the alert is then shown or not accordingly. But if I keep tapping on consecutive images, I stop getting the earlier responses, and the alert does not pop up even when the image is present on the server. I only get the response for the last image, the one I wait on.
Is there any solution for this? Can I queue the responses so that every server call gets a response? Currently it seems that on every tap the connection is overwritten and the old ones are lost.
I am using NSURLConnection to make the server call, and the alert is shown in connectionDidFinishLoading depending on the response I get. In between, I also use didReceiveData to append the incoming data and didReceiveResponse to inspect the response.
I know I should be using NSURLSession, but will it solve this problem, or is there something else I need to do?
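For what it's worth, the shape of the fix is per-request state: each request owns its own response buffer and completion handler instead of sharing one delegate-level buffer, which is exactly what NSURLSession's dataTaskWithURL:completionHandler: gives you. A minimal sketch of that idea, written in JavaScript for brevity; the image query parameter and the "found" response format are assumptions for illustration:

function checkImage(imageId) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "match.php?image=" + encodeURIComponent(imageId), true);
    xhr.onload = function () {
        // xhr and imageId are captured by this closure, so a response
        // arriving after later taps still belongs to the tap that sent it
        if (xhr.status == 200 && xhr.responseText.indexOf("found") !== -1) {
            alert("Found!"); // stands in for the app's "Found" alert
        }
    };
    xhr.send();
}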

Related

How to stop the PHP script when the internet fails only for the AJAX response

I am calling the server via AJAX and I read the server's response by checking the XmlHttp object's properties, like:

if (xmlHttp.readyState == 4) {
    if (xmlHttp.status == 200) {
        // AJAX succeeded
    }
}
Now, I have a timeout mechanism that waits 10 seconds, using setTimeout() in JavaScript, for the request to complete. If I don't get status 200 for the request within that window, I abort the request via xmlHttp.abort() and then show a timeout message together with a button to resend the request.
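Concretely, the setup is roughly this (the endpoint name and the UI helper are placeholders):

var xmlHttp = new XMLHttpRequest();
xmlHttp.open("POST", "process.php", true); // endpoint name is a placeholder
var timer = setTimeout(function () {
    xmlHttp.abort(); // no status 200 within 10 seconds: give up
    showTimeoutAndResendButton(); // placeholder: message + resend button
}, 10000);
xmlHttp.onreadystatechange = function () {
    if (xmlHttp.readyState == 4 && xmlHttp.status == 200) {
        clearTimeout(timer); // AJAX succeeded in time
    }
};
xmlHttp.send();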
The AJAX request works perfectly about 98% of the time, but here is the problem: when my connection is flaky, I send the request successfully, but when the response tries to come back the connection fails and I lose the response. In that case the timeout message is shown to the user (request not completed, please try again), yet on the server side (PHP) the request actually executed successfully, and its result is attached to an email sent to the user. So the user gets an email saying the request succeeded while also seeing a timeout message.
So what shall I do? When the internet is off completely, the timeout mechanism works fine. But when the ISP connection is flaky like this, how can I prevent the PHP script from executing?
Thanks for your help
You should leave it as is, unless achieving this redundancy is of critical importance.
In case it is critical:
Instead of immediately throwing an error message, you could retry sending the request 2 or 3 times. This would give the server more chances to respond to the request.
To do that, you'd have to make sure that the same request isn't processed more than once (in your case, sending the mail). You'd have to implement a simple system in your PHP to cache responses and recognize requests that were already fulfilled so they won't be processed again:
Create a unique id in JavaScript and send it as a parameter in your AJAX call.
Use a session array to store the responses of your requests, keyed by this unique id.
When a request comes in, check the responses array to see if it was already fulfilled, in which case you just echo back the cached response without processing.
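A hedged sketch of the client half of that scheme; the endpoint name, the requestId parameter, and the retry count are illustrative, and the PHP side is assumed to cache and replay responses keyed by the id, as described above:

// Generate one id per logical action; retries reuse it so the server
// can replay its cached response instead of sending the email twice.
function submitRequest(retriesLeft, requestId) {
    requestId = requestId || Date.now() + "-" + Math.random().toString(36).slice(2);
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "process.php", true); // endpoint name is an assumption
    xhr.timeout = 10000; // same 10-second budget as the question uses
    xhr.onload = function () {
        // success: the server either processed the request or replayed
        // the cached response for this requestId
    };
    xhr.ontimeout = xhr.onerror = function () {
        if (retriesLeft > 0) {
            submitRequest(retriesLeft - 1, requestId); // resend, same id
        } else {
            // give up: show the timeout message and a resend button
        }
    };
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("requestId=" + encodeURIComponent(requestId));
}
submitRequest(2); // one try plus up to two retries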

Is there a way to cancel a POST once it has been submitted without using AJAX?

This is a JavaScript question, but not an AJAX question. Is there a way to cancel a post once it has been submitted and is in motion?
So the user hits the post button; all is well, the form is submitted, and PHP is ready to catch the details.
For whatever reason there is network congestion and the server does not respond.
I want to give the user a chance to post again after a time has passed.
Is there a way to cancel the actual POST once it has been sent?
Is there a way to actually detect - on the server side - that a post was received? (In this case data is to be saved in the database)
I'm imagining the whole post procedure has a beginning and an end on the server side?
Or is there a way to know for sure that the post is going nowhere? It has failed and that's the end of it?
Is there a way to cancel the actual POST once it has been sent?
No, not from the server. Only from the client. You can choose not to respond to the request, but cancelling it "in the middle" is not possible from the server without some overly complicated acrobatics which you really shouldn't be doing.
Is there a way to actually detect - on the server side - that a post was received?
Your code will only run once a request is received; so by default, if your code is running, the request was received. You can use any of the logging mechanisms provided by PHP to log this event, or just check the web server logs.
I'm imagining the whole post procedure has a beginning and an end on the server side?
Everything starts from the client, and it ends with the client as well. The client requests a resource. If there are no clients, your code sits idle, twiddling its thumbs. Once the server receives a request from a client, it maps the request URL to a resource and then needs to deliver a response back to the client.
All web requests happen this exact same way. They are started from the client's side; and they all end when the client receives a response and the connection is closed. Then the whole cycle starts again for a new request.
Is there a way to know for sure that the post is going nowhere? It has failed and that's the end of it?
If your code was not called, then there was an error at the server end (perhaps a misconfiguration). The key thing to remember is that the client will always get a response, and it is up to you to figure out what happened.
The best way to do this is to have smart logging in your application - or generally to monitor the server logs (where all requests are tracked).
If a post has gone "nowhere", the corresponding log entry will tell you. If the log entry does not show any errors and the action you had expected (for example, a database record being created) hasn't happened, this means the problem was with your code.
If you have a requirement to make sure that a record is created only if a request was successful, then use transactions if your database supports them.

$_POST persistency using an AJAX call

I am trying to set up a comment system in which, when a user sends a comment, it is displayed on their screen as if it were already stored in the database.
My question is: what happens if the user sends a comment and then navigates away
(or, more specifically, closes the window immediately), or loses the connection right after the AJAX post?
On the code side I have ajax({})...
Then I have code that takes the user input from the textarea and adds it to a div.
This means that the user gets to see the comment they entered instantaneously. But I would like to know for sure whether the server will still get the POST data if the connection is lost, the window is closed, or the user navigates away.
More info for the question:
A user sends a POST to the server with 1 MB of values; one millisecond after clicking the button that made the post, the browser window is closed.
With ignore_user_abort(true); inside the file, does the server still receive and parse the request? Was the POST data received?
Is there any difference if it were GET instead of POST in this case?
Assume website.com?myget=value. Imagine hitting that URL in the address bar of a browser window and then closing the window right away, as if automatically:
step 1: go to website.com?myget=value, don't wait at all for any server response, and (within a millisecond, or however long it takes) completely close the window.
Would $_GET['myget'] be received server side at index.php of website.com?
This is a UX problem, not a technical one. What you want to do is display the new comment only after it has been stored. The workflow should go something like this:
User types message
User clicks on "submit" button
System grays out "submit" button and displays a message that reads, "Posting..."
When System can confirm that the message has been successfully stored, System removes the "Posting..." text and displays the actual new message.
This way the user knows not to close their browser or navigate away until the request is done.
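A minimal sketch of that workflow, assuming a comment.php endpoint and element ids that are purely illustrative:

// Disable the button, show "Posting...", and render the comment only
// after the server has confirmed it was stored.
function postComment(text) {
    var button = document.getElementById("submit"); // id is an assumption
    var status = document.getElementById("status"); // id is an assumption
    button.disabled = true;
    status.textContent = "Posting...";
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "comment.php", true); // endpoint name is an assumption
    xhr.onload = function () {
        if (xhr.status == 200) {
            appendCommentToPage(text); // hypothetical rendering helper
        }
    };
    xhr.onloadend = function () { // fires on success and on failure
        button.disabled = false;
        status.textContent = "";
    };
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send("comment=" + encodeURIComponent(text));
}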
Alternatively, you can use onbeforeunload to warn your users to wait before closing the browser or navigating away. The workflow would be something like:
Prerequisite: You have a persistent counter somewhere (cookie, local storage, hidden field, etc). When the page loads, it starts at 0.
User types message
User clicks on "submit" button
AJAX request is sent
Counter is increased by 1
Request is complete, you get a response (whether it's successful or not - error handling is another issue), decrease the counter by 1
If at any point the unload event is triggered, System checks the counter. If it is greater than 0, warn the user that their request has not been completed and that they might lose their comment (à la Gmail).
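A sketch of the counter idea; the endpoint and names are illustrative, and a plain variable stands in for the cookie/local-storage options mentioned above:

var pendingRequests = 0; // starts at 0 when the page loads

function sendComment(data) {
    pendingRequests++; // one more request in flight
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "comment.php", true); // endpoint name is an assumption
    xhr.onloadend = function () {
        pendingRequests--; // complete, whether it succeeded or not
    };
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send(data);
}

window.onbeforeunload = function () {
    if (pendingRequests > 0) {
        // returning a string asks the browser to warn before leaving
        return "Your comment has not been saved yet.";
    }
};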
I'll add my five cents. With ajax({}) you ask the browser to start communicating with your server. It needs some time to establish the connection (ping time) and to send the data to the server; both parts take time to complete. For PHP to start executing, the browser must have sent all the data it has to send, no matter whether it is POST or GET. If the user interrupts the sending procedure (browser crashed, tab closed, computer turned off), PHP will not even be started. For instance, try sending a large file and watch with a debugger when the PHP script starts: only after the file is delivered completely (you can even close your browser before the file is uploaded and see whether your script executes at all). It makes sense to start PHP execution only after all the data is delivered to the server, and to ignore connections broken before that point; otherwise there could be problems with corrupted data, and nobody wants that. Plus, imagine PHP starting before everything is delivered: you could never be sure whether $_POST["something"] is unavailable because the user never entered it or because its data simply hasn't arrived yet.
There is no difference between a regular form submit and an XMLHttpRequest. In both cases the browser needs some time to establish a connection with the server and pass the data to it.
In most cases, whatever action you perform against the server will continue to be executed up to the point where the PHP running on the server tries to output results back to the browser. Only at this point will PHP check whether the connection still exists and do whatever it should do based on the user-abort settings. So if, for example, you wanted to receive the post, update a database entry, and then echo back some sort of success message, the database activity will continue as long as you have not produced any output before the database is queried.
POST vs. GET makes no difference in this behavior.

Is there a way to limit the number of times per day a user can send a certain type of request?

What I'm doing at the moment is creating a row in a table for each Facebook request that gets sent. Then, every time a user opens the FB friend picker to send a request, I call a PHP file that queries that table and returns a list of the FB user ids of all the people they have sent a request to in the last 24 hours. I do this for each type of request the user can send.
The issue I'm having at the moment is that if the user initiates a request, sends it off to a number of people, and then immediately opens the FB friend picker again, the previous request action's records have not yet all been added to our internal table. Thus players, if they go fast enough, can send multiple requests to the same FB friends.
Is there a way, on the FB side, to limit this behavior? Or is this entirely up to the developer to constrain? In either case, is there a recommended method by which I should achieve this behavior? Thank you.
Update
It occurred to me that our DB already keeps multiple requests from being entered per user per 24-hour period. So what I do now is simply allow the second request to be made on the FB side, and when the code attempts and fails to insert the second row into our DB, it makes an FB Graph call that uses the app's auth_token to delete the request from Facebook itself. This means the request will show up for a moment on the receiving player's request page on Facebook, but since it isn't linked to a row in the internal DB, the user won't receive any reward for clicking through anyway.
Thanks for the suggestions, though, everybody. @Gil Birman, I went ahead and accepted your answer since it's perfectly valid, even if it's not what I ultimately used to fix the problem. Thanks!
There are several ways to solve the lag problem you mentioned. One way would be to disable your send-request button via JavaScript as soon as it is pressed. In your JavaScript code, instead of immediately displaying the send-request dialog via FB.ui, send a JSON request to your server. Only when the server responds should you display the FB send-request dialog; the response that the server sends should also include the list of friends to exclude. After the FB request is sent, your JavaScript code should send one more JSON request to the server to indicate which rows in the database need to be updated. Only when the server responds this second time should you finally re-enable your send-request button (a sketch follows below).
However, there is no way to actually limit the number of requests your user can send. No matter how well you design your JavaScript/PHP code, a user could still theoretically invoke the request dialog via the JavaScript console and completely bypass your attempts to secure the app.
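A hedged sketch of that sequence, assuming the Facebook JS SDK is loaded; exclude_list.php, record_requests.php, the element id, and the response shapes are all assumptions:

var sendButton = document.getElementById("send-request"); // id is an assumption

sendButton.onclick = function () {
    sendButton.disabled = true; // block double-clicks while we work
    // 1. Ask our server who was already invited in the last 24 hours.
    fetch("exclude_list.php") // endpoint name is an assumption
        .then(function (r) { return r.json(); })
        .then(function (excludeIds) {
            // 2. Only now show the friend picker, excluding those ids.
            FB.ui({
                method: "apprequests",
                message: "Come play with me!",
                exclude_ids: excludeIds
            }, function (response) {
                if (response && response.to) {
                    // 3. Record the new recipients, then re-enable the button.
                    fetch("record_requests.php", { // endpoint name is an assumption
                        method: "POST",
                        headers: { "Content-Type": "application/json" },
                        body: JSON.stringify({ to: response.to })
                    }).then(function () { sendButton.disabled = false; });
                } else {
                    sendButton.disabled = false; // dialog was cancelled
                }
            });
        });
};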

How to stop the 1st ajax request if it is called again before the first call gets a response?

There might be cases where your request takes a long time because of problems with the client's internet connection or with the server connection. Since the client doesn't want to wait, he clicks the Ajax link again, which sends the request to the server again and messes up the following:
Rendering of our website in the browser, because we are giving extra load to the browser.
What if the second request is processed correctly and you show the user the page, and then along comes the error message from the first request (saying the request timed out), which loads on top of the correct content and disrupts the user reading it?
I want to stop the 1st Ajax response if the Ajax function is called twice. How do I do this?
so I want to stop the 1st Ajax response if the Ajax function is called twice
What you actually want is to prevent a second request when a first request is in progress.
For example, you may want to change the Save button to Saving..., disable it (and add a little progress wheel) to give live feedback to the user. (Facebook does this.)
The key is live feedback to the user. If users are clueless about what is going on, they are going to think nothing is happening.
You might also want to check why the operation is taking so long:
If this is a complex or time-consuming operation, like, say, report generation or a file upload, a progress bar should do.
If it is because of the client's internet connection, say so up front, like Gmail does: You have a slow Internet connection and this site may be slow. Better still, provide a fallback option with less or no Ajax.
You say we are giving extra load to the browser: this is kind of fishy. You will not be giving extra load to the browser unless you are giving it tons of HTML to render. Use Ajax only for small updates in the browser; you may want to reload the page if you expect a large change.
How about this: since you're using some form of JavaScript to begin with, keep that link either hidden or disabled, in a manner of speaking, until the page's request has been followed through. You could, for example, have the Ajax code wait for a variable that enables the link so the user can click it; until that variable is received from the original AJAX request, the link stays disabled to clicks, and if it is clicked it disables again and waits for the same variable to come back.
Think of how some developers disable a submit button on a form so a user can't double-submit the form; it's the same concept here.
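A sketch of that guard with XMLHttpRequest, keeping a handle to the in-flight request so a repeat click aborts the stale one; the endpoint name and element id are assumptions:

var currentRequest = null; // handle to the in-flight request, if any

function loadData() {
    if (currentRequest) {
        currentRequest.abort(); // drop the stale request outright
    }
    var xhr = new XMLHttpRequest();
    currentRequest = xhr;
    xhr.open("GET", "data.php", true); // endpoint name is an assumption
    xhr.onload = function () {
        if (xhr !== currentRequest) return; // superseded by a newer click
        currentRequest = null;
        if (xhr.status == 200) {
            // safe to render: no older response can overwrite this one
            document.getElementById("content").innerHTML = xhr.responseText;
        }
    };
    xhr.send();
}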
