Abort cURL in an Ajax request - PHP

I'm sending an Ajax request to an ajax.php file that downloads an XML document using cURL.
//ajax.php
$ch = curl_init();
curl_setopt($ch, CURLOPT_USERPWD, USERNAME.':'.PASSWORD);
curl_setopt($ch, CURLOPT_POSTFIELDS, getData()); // setting POSTFIELDS implicitly makes this a POST request
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // limits the connect phase only, not the transfer itself
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
$data = curl_exec($ch);
curl_close($ch);
echo $data;
The user isn't told about this process and may refresh the webpage.
Sometimes curl_exec($ch) takes a long time and I get a timeout error, which prevents the script from continuing. I searched and found no exact solution to this problem.
The bigger problem is that if the user refreshes the page while the Ajax request is still processing in the background, the page won't reload until the Ajax request times out or finishes.
I thought aborting that cURL request when the page is refreshed would be a good temporary solution, but I don't know how to do that.
Note: the Ajax request is set up using jQuery, and aborting it with .abort() did not solve the problem.

You should change the way your application handles this functionality.
I would put another abstraction layer between the Ajax call and the actual processing.
For example, the Ajax call could start a PHP background process, or better yet, push a message onto a middleware queue with workers. In the response, you give the customer a job id (or store it in the database, linked to the user id). The client then polls on a timer with further Ajax requests to get the status of the job. In the meantime, the background process or middleware worker processes the task and saves the response into the database under the same job id.
So in the end, the customer initializes a job and after that just checks the status of that job. When the job is finished, you know it on the server side, so on the next Ajax request you respond with the actual job result. The JavaScript receives the response with the result and executes a callback function that continues the work on the client side.
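A minimal sketch of that pattern, assuming a hypothetical jobs table (id, payload, status, result) and a separate worker that picks up pending rows; the file names and credentials are placeholders:
//start_job.php - called by the initial Ajax request
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')");
$stmt->execute([json_encode($_POST)]);
// Hand the job id back to the client; the slow cURL download runs in the worker.
echo json_encode(['job_id' => $pdo->lastInsertId()]);

//job_status.php - polled by the client on a timer
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("SELECT status, result FROM jobs WHERE id = ?");
$stmt->execute([$_GET['job_id']]);
// Returns status 'pending' until the worker has written the result.
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));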

You could try using a try/catch block to return something that you can deal with whenever this (or another) error occurs.
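For reference, the procedural cURL functions don't throw exceptions, so in practice the try/catch turns into a return-value check; a minimal sketch against the question's code:
$data = curl_exec($ch);
if ($data === false) {
    // Error 28 (CURLE_OPERATION_TIMEDOUT) means the timeout was hit.
    $error = curl_errno($ch) . ': ' . curl_error($ch);
    echo json_encode(['error' => $error]); // give the Ajax caller something it can handle
} else {
    echo $data;
}
curl_close($ch);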

Related

PHP: understand the cURL timeout

From a PHP page, I have to do a GET request to another PHP file.
I don't care about waiting for the response of the GET or knowing whether it succeeded or not.
The called file's script could also take 5-6 seconds to finish, so I don't know how to handle the GET timeout considering what was said before.
The code is this
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://mywebsite/myfile.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false); // curl_exec() echoes the body and returns a bool...
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$content = trim(curl_exec($ch)); // ...so trim() here gets true/false, not the response body
curl_close($ch);
For the first task (where you don't need to wait for a response), you can start a new background process, and below that write the code that redirects you to another page.
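A minimal sketch of that idea, assuming a CLI php binary is available and the called script can run outside the web server (the paths are placeholders):
// Launch the slow script as a detached background process and move on.
exec('php /path/to/myfile.php > /dev/null 2>&1 &');

// Execution continues here immediately, so the redirect is not delayed.
header('Location: /next-page.php');
exit;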
Yeah, you definitely shouldn't be creating a file on the server in response to a GET request. Even as a side-effect, it's less than ideal; as the main purpose of the request, it just doesn't make sense.
If you were doing this as a POST, you'd still have the same issue to work with, however. In that case, if the action can't be guaranteed to happen quickly enough to be acceptable in the context of HTTP, you'll need to hive it off somewhere else. E.g. make your HTTP request send a message to some other system which then works in parallel whilst the HTTP response is free to be sent back immediately.

Does curl_exec() retry on timeout?

In my application I have to make a POST call to a webservice. They send me an XML response, basically saying "Accepted" or "Refused".
Last week I had an issue with one of these calls: I received a "Refused" response while their backend was telling me this request had been accepted.
I asked them what happened and they told me they received 2 requests (with the same ID - a parameter I send to them). First one was "Refused", second one was "Accepted".
I investigated: in my code, if I receive a "Refused" response, I log it, I update my database, and that's it. I do not try again.
The only remaining suspect would be the PHP cURL functions.
The day the problem occurred, the webservice took an unusually long time to respond (20 seconds).
Could cURL have made several calls? There is no retry option in the PHP function (or I didn't find one), but I'd rather ask here to be sure.
Here is my curl code.
$ch = curl_init();
$myArrayWithDatas = array( '...' );
$httpQueryFields = http_build_query($myArrayWithDatas);
$url = "https://www.webservice.com/api";
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $httpQueryFields);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
if (empty($response)) {
// I log an error
// (no trace of that in my logs)
} else {
// I log the XML response
// (one "Refused" response logged)
}
curl_close($ch);
Is there any case where this code could send 2 or more requests to the $url?
curl_exec() will only make one call.
Are you running your code via a cron job or scheduled task? If that's the case, maybe your code was launched twice, and that would explain why two calls were made.
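To rule cURL out the next time this happens, it may help to log the specific cURL error when curl_exec() fails, so a timeout is distinguishable from an empty response; a sketch against the code above, where log_error() stands in for whatever logging you already use:
$response = curl_exec($ch);
if ($response === false) {
    // Error 28 (CURLE_OPERATION_TIMEDOUT) would confirm the 20-second hang.
    log_error('cURL error ' . curl_errno($ch) . ': ' . curl_error($ch));
} elseif (empty($response)) {
    // I log an error (as before)
} else {
    // I log the XML response (as before)
}
curl_close($ch);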

Page waiting for new POST VARIABLE

I want to redirect my page when it receives a POST variable from another, external domain. My page is:
http://goo.gl/kpm2GT
When you push the red button "Realizar Pago", a new window automatically opens with the bank payment platform. When you finish all the bank's payment steps, that external site sends some POST variables with important data to my page.
This is what I want: when someone clicks "Realizar Pago", the page waits for the new $_POST variables (from the payment platform); once the POST variables have been sent to my page, I want to redirect to a payment-successful page.
Thanks for the help guys, and sorry for my English.
This is not possible in the way you are thinking about it.
PHP executes each request separately. When your server executes the request from the external service, it doesn't know anything about the other request from your user.
The $_POST array is unique to every request and cannot be read across requests.
Okay, it sounds like you want to connect to an outside web service from your page and then display the results to your users. In PHP, you'd probably want to create a form processor that takes the user data and then uses cURL to pass it along to the banking end. Once the bank receives the request, they will send back a response which you can then display to the user, or redirect them to a page that says it was a success.
cURL will wait a while for the response from the banking folks (you can specify how long it waits). In this example, I have told the program to wait for up to 30 seconds. If the response arrives before the 30 seconds are up, it will go ahead and close the connection.
<?php
$bank_url = 'http://www.bank.com';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $bank_url);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30); // wait up to 30 seconds for the bank to respond
$response = curl_exec($ch);
curl_close($ch);
print $response;

Asynchronous call in a PHP web page

I am new to PHP. I have a block of code on my webpage which I want to execute asynchronously. This block does the following:
1. A shell_exec command.
2. An ftp_get_content call.
3. Two image resizes.
4. One call to MySQL for an insert.
Is there a way to make this block async, so that the rest of the page loads quickly?
Please ask if any more details are required.
One possible solution is to use cURL to make a pseudo-async call. You can put the async part of your code in a separate PHP file and call it via cURL with a tiny timeout (the called script may need ignore_user_abort(true) so it keeps running after cURL disconnects). For example:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'YOUR_URL_WITH_ASYNC_CODE');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1); // needed for sub-second timeouts on many systems
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1); // fire and forget: give up after 1 ms and move on
curl_exec($ch);
curl_close($ch);
You could put the 4 tasks into a queue, maybe something like Beanstalkd, then have a background worker process this queue.
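A rough sketch of such a worker, polling a hypothetical jobs table instead of Beanstalkd (with Beanstalkd the loop would block on reserve() rather than poll); the table, columns and credentials are placeholders:
//worker.php - run from the command line: php worker.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
while (true) {
    $job = $pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending' LIMIT 1")
               ->fetch(PDO::FETCH_ASSOC);
    if ($job === false) {
        sleep(1); // nothing queued; check again shortly
        continue;
    }
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    // Here you would run the slow work: the shell_exec command, the FTP fetch,
    // the two image resizes, and the MySQL insert.
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}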

cURL hangs on request, waits for timeout to proceed

I'm encountering a problem to which I can't find a solution anywhere. Even worse, no one else seems to have this problem, so I'm probably doing something very stupid.
Some background info: I'm trying to make a proxy-like page that forwards an AJAX request to a different server, to circumvent the same-origin policy. All I want this code to do is take the POST variables, forward them to a different page, and then return the results. It's been working except for one thing: every time, it waits for the timeout before continuing. I've set the timeout to 1 second for now, so it's doing OK, but I'd rather have a fast response and a proper timeout.
Here's my code:
// create a new cURL resource
$call = curl_init();
// set URL and other appropriate options
curl_setopt($call, CURLOPT_URL, $url);
curl_setopt($call, CURLOPT_POST, true);
curl_setopt($call, CURLOPT_POSTFIELDS, $params);
curl_setopt($call, CURLOPT_HEADER, false);
curl_setopt($call, CURLOPT_RETURNTRANSFER, true);
curl_setopt($call, CURLOPT_CONNECTTIMEOUT, 1);
// grab URL and pass it to the browser
$response = curl_exec($call);
// close cURL resource, and free up system resources
curl_close($call);
echo $response;
I've tried sending a "Connection: close" header with it, and several ways to make the target code signal that it's done running (setting Content-Length, flushing, die(), etc.). At this point I really don't know what's going on; what surprises me most is that I can't find anyone with a similar problem.
Who can help me?
This would make sense if the server weren't actually completing the request. This would be expected in a page streaming or service streaming scenario. Are you sure that the server is actually returning a full and complete HTTP response to each request?
Sounds like it's trying to connect, timing out, and the retry is working.
This fixed it for me:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
I can connect on the command line via IPv6, so I don't know why this helps.
