I have a cURL request where, at times and depending on the request load, it takes minutes to receive a response from the API because of processing and calculations.
For the sake of good user experience, this behavior is undesirable.
While waiting for the response, sometimes for a long time, the user is unable to perform any functions on the website.
So I am looking for ways to let the user keep using the application while it waits for results from the API.
Solutions I have already considered:
Recording the request and using a cron job to process it.
Unfortunately, there are a couple of pitfalls to that:
- The need to run a cron job every minute, or constantly.
- When the user's request is minimal or the API happens to be fast, the whole thing may take only 2-3 seconds. But if the request was recorded, say, 30 seconds before the next cron run, you end up with a 32-second turnaround.
So this solution may improve some cases and worsen others; I am not sure I really like that.
Aborting the cURL request after a few seconds.
Although the API sends a separate response to a URL endpoint of my choice, and it seems safe to terminate the transmission after posting the request, there may be situations where those few seconds are not even enough to establish the connection with the API. What I am trying to say is that, by terminating cURL, I have no way of knowing whether the actual request made it through.
Is there another approach that I could consider?
Thank you.
Sounds like what you're asking for is an asynchronous cURL call.
Here's the curl_multi_init documentation
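A minimal sketch of what that looks like (the URL is just a placeholder); note that curl_multi doesn't make the response arrive any sooner, it just lets you run several transfers at once or do other work while waiting:
$ch = curl_init('https://api.example.com/slow-endpoint'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$mh = curl_multi_init();
curl_multi_add_handle($mh, $ch);
$running = null;
do {
    curl_multi_exec($mh, $running); // starts/advances the transfer without blocking
    // ... do other work here, or add more handles to run in parallel ...
    curl_multi_select($mh, 0.1);    // wait briefly for socket activity instead of spinning
} while ($running > 0);
$response = curl_multi_getcontent($ch);
curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);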
I have a PHP site in which I make an AJAX call, and in that AJAX call I call an API that returns XML, which I then parse. The problem is that the XML is sometimes so huge that processing it takes a long time. The load balancer in EC2 has a timeout value of 20 minutes, so if my call takes longer than that I get a 504 error. How can I solve this? I know it's a server issue, but how can I work around it? I don't think php.ini is helpful here.
HTTP is a stateless protocol. It works best when responses to requests are made within a few seconds of the request. When you don't respond quickly, timeouts start coming into play. This might be a timeout you can control (fcgi process timeout) or one you can't control (third party proxy, client browser).
So what do you do when you have work that will take longer than a few seconds? Use a message queue of course.
The cheap way to do this is store the job in a db table and have cron read from the table and process the work. This can work on a small scale, but it has some issues when you try to get larger.
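A rough sketch of that cheap approach; the table, column, and process_job() names here are made up for illustration. The web request only inserts a row, and a cron-driven worker does the slow part:
// In the web request: record the job and return to the user immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode($jobParams)]); // whatever the worker needs to do its work

// worker.php, run from cron, e.g.  * * * * * php /path/to/worker.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
foreach ($pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending'") as $job) {
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    process_job(json_decode($job['payload'], true)); // the slow API call / calculation
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}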
The proper way to do this is to use a real message queue system. Amazon has SQS, but you could just as well use Gearman, ZeroMQ, RabbitMQ, and others to handle this.
I'm not an expert on HTTP requests, so this question might be trivial for some. I'm sending a request to a PHP script which takes a lot of time to process a file and return a response. Is there a way to send a response before this script finishes its task, to let the user know about the process status? Since this task can take up to several minutes, I'd like to notify the user when key parts of the process are finished.
Note: I cannot break this request into several others
I might not have the correct approach here; if so, do you have other ideas for how this could be handled?
Technically yes, but it would require fine-grained control of the HTTP stack, which you may or may not have in a typical PHP setup. I would suggest you look into other solutions, e.g. make a request to start the task, then poll to get an update on its progress.
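A bare-bones sketch of that poll approach, assuming the long-running script periodically writes its progress to a small status file keyed by a job id (the file layout and field names here are invented for illustration):
// In the long-running script: report progress as key parts of the work finish.
file_put_contents("/tmp/job-$jobId.status", json_encode(['step' => 'parsing file', 'percent' => 40]));

// status.php: the browser polls this with AJAX every few seconds.
$jobId = preg_replace('/[^a-z0-9]/i', '', $_GET['job']); // sanitise, it ends up in a file path
$file  = "/tmp/job-$jobId.status";
header('Content-Type: application/json');
echo file_exists($file)
    ? file_get_contents($file)
    : json_encode(['step' => 'queued', 'percent' => 0]);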
http://www.redips.net/javascript/ajax-progress-bar/
Here's a great article that goes over creating an AJAX progress bar to use with PHP.
Let me know if it doesn't make sense!
I think the best approach for long-running requests is cron jobs. You can send a request that creates a 'task' and have a cron job pick the task up. The cron job can update the task's status while it works, and you can check that status with requests at intervals. I can't think of another way to inform users about request processing: as soon as you respond, your headers are sent and PHP stops.
EDIT: it should be noted that cron jobs are only available on Linux servers. Windows servers would require access to the Task Scheduler, which most web hosts will not allow.
I have a website that needs to send notifications to online clients in real time, much like Facebook. After some googling I found a lot of documentation about push and pull technology, and ways of implementing them using AJAX or sockets. I need to know which is best to use in my case, and how it would be coded using JavaScript or jQuery and PHP.
I cannot tell you what's best to use in your case without knowing your case in detail.
In most cases it is enough to have the clients check with the server every one or two seconds, asking if something new has happened. I prefer this over sockets most of the time because it works on every web server without any configuration changes and in any browser supporting AJAX, even old ones.
If you have few clients (every client requires an open socket on the server) and you want true real time, you can use websockets. There are several PHP implementations, for example this one: http://code.google.com/p/phpwebsocket/
If you can ensure that there will be only a single browser window open per logged-in user, then you can apply this long-polling technique easily.
Policy for the AJAX call:
Do not make a request every 2 seconds.
Instead, wait and make the next request only 2 seconds after getting the response to the previous one.
If a request does not respond within 12 seconds, do not keep waiting; send a fresh request. This is the connection-lost case.
Policy for the server response:
If there is an update, respond immediately. To check whether there is an update, rely on the session (better still, send some hint from the client side, such as the latest message received; this second update-checking mechanism removes the single-browser restriction mentioned above).
Otherwise sleep() for 1 second (do not use a busy loop; use sleep), then check again whether there is an update; if there is, respond; if not, sleep for another second. Repeat this until a total of 10 seconds has elapsed, then respond with "no update".
If you apply this policy (commonly known as long polling, sketched below), you will find processor usage reduced from 95% to 4% under heavy load.
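A minimal sketch of the server side of that policy, assuming a hypothetical has_update_since() function backed by your database or session data:
// long-poll.php: hold the request open for up to ~10 seconds.
session_start();
$lastSeen = isset($_GET['last_id']) ? (int) $_GET['last_id'] : 0; // the hint sent by the client
session_write_close(); // release the session lock so other requests from this user aren't blocked

for ($i = 0; $i < 10; $i++) {
    $update = has_update_since($lastSeen); // your own check against the database/session
    if ($update !== null) {
        echo json_encode(['update' => $update]);
        exit;
    }
    sleep(1); // wait, don't busy-loop
}
echo json_encode(['update' => null]); // nothing new after 10 seconds; the client simply asks again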
Hope this explains. Best of luck.
Just apply the long-polling technique using jQuery.
Sockets are not yet supported everywhere, and you would also need to open a listening socket on the server for this to work.
I have a login script that passes data to another script for processing. The processing is unrelated to the login script but it does a bit of data checking and logging for internal analysis.
I am using cURL to pass this data, but cURL waits for the response. I do not want to wait for the response, because it forces the user to sit through the analysis before they can log in.
I am aware that the request could fail, but I am not overly concerned.
I basically want it to work like a multi-threaded application, where cURL is used to fork off a process. Is there any way to do this?
My code is below:
// Log user in
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://site.com/userdata.php?e=' . urlencode($email)); // encode the address for the query string
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch); // blocks here until userdata.php finishes and responds
curl_close($ch);
// Redirect user to their home page
That's all it does, but at the moment it has to wait for the cURL request to get a response.
Is there any way to make a get request and not wait for the response?
You don't need cURL for this. Just open a socket, fire off a manual HTTP request, and then close the socket. This is also useful because you can set a custom user agent so as not to skew your logging.
See this answer for an example.
Obviously, it's not "true" async/forking, but it should be quick enough.
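Something along these lines, reusing the userdata.php call from the question; the request is written to the socket and the connection closed without ever reading the response:
// Fire-and-forget GET: we never wait for userdata.php to finish.
$host = 'site.com';
$path = '/userdata.php?e=' . urlencode($email);
$fp = fsockopen($host, 80, $errno, $errstr, 2); // 2-second connect timeout
if ($fp) {
    $out  = "GET $path HTTP/1.1\r\n";
    $out .= "Host: $host\r\n";
    $out .= "User-Agent: internal-logger\r\n"; // custom UA so it doesn't skew the analytics
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp); // close immediately; the remote script keeps running
}
// ...then redirect the user to their home page straight away.
Depending on the server configuration, the remote script may need ignore_user_abort(true) so it keeps running after the connection is dropped.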
I like Matt's idea best; however, to speed up your request you could
a) just make a HEAD request (CURLOPT_NOBODY), which is significantly faster (no response body),
or
b) just set the request timeout really low. However, I guess you should test whether aborting the request is actually faster than only HEADing.
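In cURL terms those two tweaks look roughly like this (CURLOPT_TIMEOUT_MS needs a reasonably recent cURL and PHP; on older setups CURLOPT_TIMEOUT with whole seconds is the fallback):
$ch = curl_init('http://site.com/userdata.php?e=' . urlencode($email));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// a) HEAD request: the server still runs the script, but no response body is transferred
curl_setopt($ch, CURLOPT_NOBODY, true);
// b) or give up quickly: abort the whole request after 500 ms
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 500);
curl_exec($ch);
curl_close($ch);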
Another possibility: Since there's apparently no need to do the analysis immediately, why do it immediately? If your provider allows cron jobs, just have the script that curl calls store the passed data quickly in a database or file, and have a cron job execute the processing script once a minute or hour or day. Or, if you can't do that, set up your own local machine to regularly run a script that invokes the remote one which processes the stored data.
It strikes me that what you're describing is a queue. You want to kick off a bunch of offline processing jobs and process them independently of user interaction. There are plenty of systems for doing that, though I'd particularly recommend beanstalkd using pheanstalk in PHP. It's far more reliable and controllable (e.g. managing retries in case of failures) than a cron job, and it's also very easy to distribute processing across multiple servers.
The equivalent of your calling a URL and ignoring the response is creating a new job in a 'tube'. It solves your particular problem because it will return more or less instantly and there is no response body to speak of.
At the processing end you don't need exec - run a CLI script in an infinite loop that requests jobs from the queue and processes them.
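A rough sketch with pheanstalk; the method names follow the classic Pheanstalk API and may differ slightly between versions, and process_analysis() stands in for your existing code:
// Producer: instead of calling userdata.php over HTTP, push a job and return immediately.
require 'vendor/autoload.php';
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');
$pheanstalk->useTube('analysis')->put(json_encode(['email' => $email]));

// Worker: a long-running CLI script, started once (not from cron).
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');
while (true) {
    $job  = $pheanstalk->watch('analysis')->ignore('default')->reserve(); // blocks until a job arrives
    $data = json_decode($job->getData(), true);
    process_analysis($data); // your existing checking / logging code
    $pheanstalk->delete($job); // remove the job once it has been handled successfully
}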
You could also look at ZeroMQ.
Overall this is not dissimilar to what GZipp suggests, it's just using a system that's designed specifically for this mode of operation.
If you have a restrictive ISP that won't let you run other software, it may be time to find a new ISP - Amazon AWS will give you a free EC2 micro instance for a year.
I am going to have a daemon that will run on a FreeBSD server and exchange small amounts of data with a list of URIs every minute.
I am thinking of using the curl_multi functions to run them all at once, or in groups, every minute, using POST. I am open to other ideas, though.
I will have to do some benchmarking later on, but for now, does anyone know how resource intensive it is to make many small posts with curl?
Is there a less intensive way to do this? SOAP perhaps? To start, there will only be a few every minute, but it may grow quickly.
Thanks!
I'd say that SOAP processing (generating the query, sending it, having it processed, then getting the answer back) may be more resource-intensive than just a POST, even a "multithreaded" one. Profiling or benchmarking is probably the only way to be sure anyway.