I have a situation where I need to run some PHP: specifically, I need to send out a SOAP request, wait for the response, and then do something with the result. However, these requests can sometimes be slow, taking up to 9 seconds.
Now I don't really want the user sitting there waiting 9 seconds for this to complete.
Basically the user flow is:
User comes to the payment page
User clicks a button to pay via the payment gateway (PayPal)
User then returns to the site (the SOAP request and everything else needs to be finished by this stage)
I was thinking of running it from the PayPal IPN notification, but I didn't think it would be finished by the time the user got back to the site.
So I'm wondering if I could send off a call via AJAX when the user hits the first page, and have it run while the user is submitting payment, so that by the time they get back to the site it should be done. It's not a big deal if they don't end up going through with payment, so I'm not worried about running this code before confirming payment.
My question is, if I fire this off to be run via AJAX, will the code still be executed if the user leaves the page before it has finished? If not, any ideas?
Once a request is sent to the server, the server-side portion of the request will run to completion, irrespective of whether you navigate away from the page.
The only thing that will not happen is the execution of the client-side callback method.
If you are using PHP, there is a php.ini setting, ignore_user_abort, that tells PHP what to do when the client aborts the request.
Its value is false by default.
http://www.php.net/manual/en/misc.configuration.php#ini.ignore-user-abort
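A minimal sketch of such a script, where the WSDL URL and operation name are placeholders rather than anything from the question:

<?php
ignore_user_abort(true);    // keep executing even if the client disconnects
set_time_limit(0);          // don't let max_execution_time kill the ~9 second call

$client = new SoapClient('https://example.com/service?wsdl');          // placeholder WSDL
$result = $client->__soapCall('SlowOperation', [['orderId' => 123]]);  // placeholder operation
// Stash the result somewhere the return page can pick it up later:
file_put_contents('/tmp/soap-result-123.json', json_encode($result));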
I have a user status field (online/offline) in the database. When the user hits the logout button, the database status changes from online to offline. But if the user closes the browser directly without hitting logout, how do I update the database?
Unless you have a process which pings your servers at a regular interval, you can't.
Your logout button most likely sends a request back to your server. Upon receiving that request, the server runs/delegates logic to update the DB. If the user closes the browser without clicking the logout button, the server never gets this request.
@Antony gave a possible solution.
Another possible solution would be to send a message to your server at a given interval. The server should expect this call; if the server doesn't receive the message, mark the user as logged out. The downside is that the logout timestamp will be off by up to the interval, so the logout time will not be exact. See this thread for more detail: Ajax call with timer
Edit #1
The link mentioned above shows how you can run JavaScript code at an interval:
setInterval(function() {
    // Hypothetical heartbeat endpoint; point the URL at your own API
    $.ajax({ url: '/api/heartbeat', method: 'POST' });
}, 5000); // every 5 seconds
In your case use this function to call your API.
Your API should record a timestamp each time it receives a call from your JavaScript. Have a cron job, or another piece of logic, check when the recordings for a particular user stop. The last timestamp would be the approximate time the user logged out; see the sketch below.
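As a rough sketch of both sides, where the table and column names are assumptions rather than anything from the question:

<?php
// heartbeat.php -- hypothetical endpoint hit by the setInterval() loop above
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE users SET last_seen = NOW() WHERE id = ?');
$stmt->execute([$_SESSION['user_id']]);

// cron job (run every minute): anyone silent for 15+ seconds is marked offline;
// their last_seen value is then the approximate logout time
$pdo->exec("UPDATE users SET status = 'offline'
            WHERE status = 'online' AND last_seen < NOW() - INTERVAL 15 SECOND");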
It's a very involved process for simply tracking a user's logout behavior. You may want to consider if it's worth the trouble.
What I'm doing at the moment is creating a row in a table for each Facebook request that gets sent. Then, every time a user opens the FB friend picker to send a request, I make a call to a PHP file that queries that table and returns a list of the FB user IDs of all the people they have sent a request to in the last 24 hours. I do this for each type of request the user can send.
The issue I'm having at the moment is that if the user initiates a request, sends it off to a number of people, and then immediately opens the FB friend picker again, the previous request action's records have not yet all been added to our internal table. Thus, players, if they go fast enough, can send multiple requests to the same FB friends.
Is there a way, on the FB side, to limit this behavior? Or is this entirely up to the developer to constrain? In either case, is there a recommended method by which I should achieve this behavior? Thank you.
Update
It occurred to me that our DB already keeps multiple requests from being entered per user per 24-hour period. What I do now is simply allow the second request to be made on the FB side; when the code attempts and fails to insert the second row into our DB, it makes a FB Graph call that uses the app's auth_token to delete the request from Facebook itself. This means the request will show up for a moment on the receiving player's request page on Facebook, but since it isn't linked to a row in the internal DB, the user won't receive any reward for clicking through anyway.
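That Graph call looks roughly like the following, where the IDs are placeholders and the app access token is assumed to be in the app_id|app_secret form:

// Delete an apprequest: HTTP DELETE on "<request_id>_<recipient_user_id>"
$fullRequestId = '123456789' . '_' . '100000123';   // placeholder IDs
$token = 'APP_ID' . '|' . 'APP_SECRET';             // app access token
$ch = curl_init('https://graph.facebook.com/' . $fullRequestId
              . '?access_token=' . urlencode($token));
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'DELETE');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$deleted = curl_exec($ch);   // the response body is "true" on success
curl_close($ch);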
Thanks for the suggestions, though, everybody. @Gil Birman, I went ahead and accepted your answer since it's perfectly valid, even if it's not what I ultimately used to fix the problem. Thanks!
There are several ways to solve the lag problem you mentioned. One way:
Disable your send-request button via JavaScript as soon as it is pressed.
In your JavaScript code, instead of immediately displaying the send-request dialog via FB.ui, send a JSON request to your server.
Only when the server responds should you display the FB send-request dialog; the server's response should include the list of friends to exclude.
After the FB request is sent, your JavaScript code should send one more JSON request to the server to indicate which rows in the database need to be updated.
Only when the server responds this second time should you finally re-enable your send-request button.
A sketch of the server side of that first call is below.
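As a rough sketch of that first server call, with the table and column names as assumptions:

<?php
// Hypothetical endpoint: returns the FB user IDs this sender has already
// invited in the last 24 hours, so the friend picker can exclude them.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT recipient_fb_id FROM fb_requests
     WHERE sender_fb_id = ? AND request_type = ?
       AND created_at > NOW() - INTERVAL 24 HOUR'
);
$stmt->execute([$_GET['sender_id'], $_GET['type']]);
header('Content-Type: application/json');
echo json_encode(['exclude_ids' => $stmt->fetchAll(PDO::FETCH_COLUMN)]);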
However, there is no way to actually limit the number of requests your user can send. That is, no matter how well you design your JavaScript/PHP code, your user could still theoretically invoke the request dialog via the JavaScript console and completely bypass your attempts to secure the app.
I want to create a bidding system where users can see the current price of items. If any other user in any other location places a bid before me, it should auto-update the bid in my browser.
I have read about auto-update JS+AJAX functions, but even if I set a 5-second timer to auto-update the content in the user's browser, won't that put extra load on the server by making an AJAX call every 5 seconds? It's a bidding system, so bids will update within 1-2 seconds; an auto-update AJAX call every 1-2 seconds would put a lot of burden on the server.
So I am wondering: is there any better way to handle this type of thing? How do Twitter/Facebook update users' feeds?
AJAX or not, bidding systems always see a high volume of requests because people keep refreshing the page to check for the latest bid information.
You can take a look at long polling. Long polling is a method where you "push" data from the server to the browser in response to the browser's HTTP request, over a normal HTTP connection. This may reduce the number of requests sent from users to the server; however, you will still have many open and active connections between your users and your server.
You will want to look at long polling. In essence, this is how it works:
On the server you need some sort of event mechanism (no problem with PHP)
The client (browser) starts an AJAX request referencing a bidding item
The server checks for changes on the bid; if there is one, it returns the request immediately
If not, the server waits for some time (in the minute range) for an event concerning this bid
If such an event occurs, the server returns the request with the new bid info; if not, it eventually returns the request with "no bid" info
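A rough PHP sketch of such an endpoint, with all names and the schema assumed (PHP has no built-in event wait here, so this version simply re-checks the database once a second while holding the request open):

<?php
// long_poll.php -- illustrative long-polling endpoint; names and schema are assumed.
set_time_limit(70);                     // allow holding the request for ~1 minute
session_start();
session_write_close();                  // don't block the user's other requests
$itemId  = (int) $_GET['item_id'];
$lastBid = (float) $_GET['last_bid'];   // highest bid the client already knows about
$pdo = new PDO('mysql:host=localhost;dbname=auction', 'user', 'pass');
$deadline = time() + 60;
while (time() < $deadline) {
    $stmt = $pdo->prepare('SELECT MAX(amount) FROM bids WHERE item_id = ?');
    $stmt->execute([$itemId]);
    $current = (float) $stmt->fetchColumn();
    if ($current > $lastBid) {          // a newer bid arrived: answer immediately
        echo json_encode(['bid' => $current]);
        exit;
    }
    sleep(1);                           // check again in a second
}
echo json_encode(['bid' => null]);      // timed out; the client simply re-polls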
You might be able to get away with a streaming model...
Each JS client connects to the server once and keeps the connection open. As new events arrive at the server, they are broadcast to all open connections in real time.
This is similar to the mechanism Twitter uses to broadcast tweets.
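A rough sketch of that streaming idea using server-sent events, with the endpoint name and schema assumed (the client would consume this with an EventSource):

<?php
// stream.php -- illustrative SSE endpoint; emits a message whenever a new bid appears.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
session_start();
session_write_close();                  // don't block the user's other requests
$pdo = new PDO('mysql:host=localhost;dbname=auction', 'user', 'pass');
$itemId = (int) $_GET['item_id'];
$last = 0.0;
while (!connection_aborted()) {
    $stmt = $pdo->prepare('SELECT MAX(amount) FROM bids WHERE item_id = ?');
    $stmt->execute([$itemId]);
    $current = (float) $stmt->fetchColumn();
    if ($current > $last) {             // only emit when a new bid arrives
        echo 'data: ' . json_encode(['bid' => $current]) . "\n\n";
        flush();
        $last = $current;
    }
    sleep(1);
}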
I've looked all over for an answer on this and haven't been able to find one; hopefully someone can point me in the right direction. I think I'm close.
I have two hosts, let's call them host1.mydomain.com and host2.mydomain.com (to get around the 2-concurrent-connections-per-host/per-browser issue). They both point to the same content; one is just an alias of the other.
The user goes to host1.mydomain.com, enters some information to register, and clicks Go, which loads an iframe on the same page pointing to a page on host2.mydomain.com. That page calls a PHP script via exec("curl"), sending the request to the background to start a website scraper; the process ID is then stored in the database for the user. After the iframe has successfully loaded (it only takes 1 second, since it just creates a background process), I have an AJAX request set on an interval to periodically check the status of the cURL process (by its process ID in the database) so that I can display the current step of the scraper (there are 6 steps in total). All good so far.
The problem is that the AJAX requests are timing out after step 4 of the scraper (the browser default timeout is 115/120 seconds), even though they shouldn't be, because I'm working with two different hosts. It's almost as if I'm clogging both connections on host1.mydomain.com, which I shouldn't be, because I initiated the scraper from host2.
The iframe loads this URL: http://host2.mydomain.com/page.php
The PHP script calls:
exec("curl -o /dev/null 'http://host2.mydomain.com/page.php?method=process' > /dev/null & echo $!", $op);
Then my AJAX request polls http://host1.mydomain.com/status.php?pid=x, which looks up the status in the database by the process ID,
and once the scraper gets to step 4, my AJAX requests time out.
I think I confused myself explaining this, but hopefully someone can help me
Turns out I was successfully getting around the 2-connections-per-server/browser limitation. However, in doing some research I found that the reason my AJAX request was hanging is that I was trying to access and write to the session data from both requests. Digging a little deeper I found session_write_close(), which closes the session for reading/writing. I basically have to call this after each page request of the scraper and then reinitialize the session; this allows my AJAX requests to go through and stops the blocking of the request.
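For anyone hitting the same thing, the pattern looks roughly like this (the function name is a placeholder for the long-running work):

<?php
session_start();
$userId = $_SESSION['user_id'];   // read what you need from the session first
session_write_close();            // release the session lock so other requests proceed

run_scraper_step($userId);        // placeholder for the long-running work

session_start();                  // reopen only if you need to write again
$_SESSION['current_step'] = 4;
session_write_close();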
Hopefully someone else finds this useful if you stumble across the same issue
Cheers!
Jeff
Instead of waiting for the request to finish, you should spawn a new process which runs in the background on the server, and use JavaScript to "check back" every few seconds to see whether execution has finished. Then all you have to do is pick up the result and display it.
Additionally, you might want to make sure that only one PHP process is spawned; see the sketch below.
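A rough sketch of both halves, with all paths and file names as assumptions:

// spawner: start the worker in the background and return immediately
exec('php /var/www/worker.php > /dev/null 2>&1 & echo $!', $out);
file_put_contents('/tmp/worker.pid', (int) $out[0]);   // so status checks can find it

// top of worker.php: a non-blocking lock ensures only one instance runs
$lock = fopen('/tmp/worker.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit;   // another worker already holds the lock
}
// ...long-running job, writing progress somewhere the AJAX poll can read...
flock($lock, LOCK_UN);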
What is the best way of spitting out the view file in CodeIgniter while still doing some work on the server? The situation basically is:
I have a PayPal checkout where the user goes to PayPal and clicks Pay Now! A set_express_checkout() call is made to start things off.
The user is returned to the Thank You page.
I have to call Get_express_checkout_details() and do_checkout() before showing him the Thank You page, and these are 2 calls to a pretty slow PayPal server.
My problem is that when the user clicks the Pay Now! button, he is redirected back to my site but hangs at PayPal for at least 5 seconds (while my server makes the 2 requests) before he can see anything. So the question is: where should I make these two calls so the user doesn't have to wait so long before anything is shown to them?
I think using an AJAX request is just what you want. The idea is the following:
Output your page to the client without performing any PayPal requests
Create an additional page/method that only performs the PayPal requests and outputs the data as JSON
On the outputted page, place an AJAX call to that new page
Process the response to know whether the request was successful
For AJAX calls you might want to have a look at jQuery.ajax. The most convenient way to output JSON data from PHP is the json_encode PHP function.
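A rough sketch of that extra controller method (the $this->paypal library and its methods are assumptions modeled on the function names in the question):

// In your controller: performs the two slow PayPal calls and returns JSON.
public function confirm_payment()
{
    $details = $this->paypal->get_express_checkout_details($this->input->get('token'));
    $ok      = $this->paypal->do_checkout($details);   // both calls assumed from the question

    $this->output
         ->set_content_type('application/json')
         ->set_output(json_encode(array('success' => (bool) $ok)));
}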
You could enable hooks and use the 'post_system' hook to make your two calls to the slow server. See http://codeigniter.com/user_guide/general/hooks.html for more information.
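Registering such a hook would look something like this (the class and file names are made up for illustration):

// application/config/hooks.php
$hook['post_system'] = array(
    'class'    => 'Paypal_followup',
    'function' => 'run',
    'filename' => 'Paypal_followup.php',
    'filepath' => 'hooks'
);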
However this will leave you with no easy way of showing any result of the two calls.