PHP: understanding the cURL timeout

From a PHP page, I have to make a GET request to another PHP file.
I don't care about waiting for the response of the GET or knowing whether it succeeded or not.
The called script could also take 5-6 seconds to finish, so I don't know how to handle the GET timeout given what I've just said.
The code is this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://mywebsite/myfile.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$content = trim(curl_exec($ch));
curl_close($ch);

For the first part (where you don't need to wait for the response), you can start a new background process, and below that write the code that redirects you to another page.
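A minimal sketch of that idea, assuming a Unix-like host where exec() is allowed; the worker path and the redirect target are placeholders:

// Fire and forget: launch the slow script as a separate PHP process.
// Output is discarded and the trailing & detaches it, so exec() returns
// immediately and the current request can carry on.
exec('php /path/to/worker.php > /dev/null 2>&1 &');

// Continue straight away, e.g. redirect the user elsewhere.
header('Location: /next-page.php');
exit;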

Yeah, you definitely shouldn't be creating a file on the server in response to a GET request. Even as a side-effect, it's less than ideal; as the main purpose of the request, it just doesn't make sense.
If you were doing this as a POST, you'd still have the same issue to work with, however. In that case, if the action can't be guaranteed to happen quickly enough to be acceptable in the context of HTTP, you'll need to hive it off somewhere else. E.g. make your HTTP request send a message to some other system which then works in parallel whilst the HTTP response is free to be sent back immediately.
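One simple way to sketch that hand-off, assuming a hypothetical spool directory (/var/spool/myjobs here) that a cron job or a separate worker process polls:

// The HTTP handler only records the work to be done and returns at once.
$job = ['action' => 'generate_file', 'requested_at' => time()];
file_put_contents(
    '/var/spool/myjobs/' . uniqid('job_', true) . '.json',
    json_encode($job)
);
// A separate worker reads these files and does the slow work, so the
// HTTP response is free to be sent back immediately.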

Related

Retrieve / send back HTTP headers with PHP / Curl

I have an HTML/PHP/JS page that I use for an automation process.
On load, it performs a cURL request like this:
function get_data($url) {
    $curl = curl_init();
    $timeout = 5;
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, $timeout);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($curl);
    curl_close($curl);
    return $data;
}
$html = get_data($url);
Then it uses DOMDocument to retrieve a specific element on the remote page. My PHP code handles it, performs some operations, and then stores it in a variable.
My purpose, as you can guess, is to simulate a "normal" connection. To do so, I used the Tamper tool to see which requests are performed when I interact with the remote page by hand. The HTTP headers consist of a UA, cookies (among them a session cookie), and so on. The only POST variable I have to send back is my PHP variable (the one which was calculated and stored in a PHP var). I also tested the process with Chrome, which lets me copy/paste requests as cURL.
My question is simple: is there a way to handle HTTP requests / cookies in a simple way? Or do I have to retrieve them, parse them, store them and send them back one by one?
Indeed, a request and a response are slightly different, but in this case they have many things in common. So I wonder if there is a way to explore the remote page as a browser would and interact with it, using for instance an extra PHP library.
Or maybe I'm doing it the wrong way and I should use another language (Perl...)?
The code shown above does not handle requests and cookies; I've tried, but it was a bit too tricky to handle, hence I'm asking this question here :) I'm not lazy, but I wonder if there is a simpler way to achieve my goal.
Thanks for your advice, and sorry for the English.
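cURL itself can persist and resend cookies between requests, which covers most of this. A minimal sketch, assuming a writable cookie file and placeholder URLs, field name and value (example.com, my_field, $computedValue):

// Reuse one cookie jar so the session cookie set by the first request
// is sent back automatically on the following ones.
$cookieFile = __DIR__ . '/cookies.txt';

$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://example.com/page');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_USERAGENT, 'Mozilla/5.0');    // mimic a browser UA
curl_setopt($curl, CURLOPT_COOKIEJAR, $cookieFile);      // write received cookies here on close
curl_setopt($curl, CURLOPT_COOKIEFILE, $cookieFile);     // read and resend them on each request
$html = curl_exec($curl);
curl_close($curl);

// Later, POST the computed value back; the same jar resends the session cookie.
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, 'http://example.com/submit');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_COOKIEJAR, $cookieFile);
curl_setopt($curl, CURLOPT_COOKIEFILE, $cookieFile);
curl_setopt($curl, CURLOPT_POSTFIELDS, ['my_field' => $computedValue]);
$result = curl_exec($curl);
curl_close($curl);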

Page waiting for new POST VARIABLE

I want to redirect my page when I receive a POST variable from another, external domain. My page is:
http://goo.gl/kpm2GT
When you push the red button "Realizar Pago", a new window to the bank's payment platform opens automatically. When you finish all of the bank's payment steps, this external site sends some POST variables with important data back to my page.
This is what I want: when someone clicks "Realizar Pago", the page should keep waiting for the new $_POST variables (from the payment platform), and once the POST variables have been sent to my page, I want to redirect to a "payment successful" page.
Thanks for the help, guys, and sorry for my English.
This is not possible in the way you think about it.
PHP executes each request separately. When your server handles the request from the external service, you can assume it knows nothing about that other request from your user.
The $_POST array is unique to every request and cannot be read across requests.
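A hedged sketch of the usual way around that, assuming the payment platform posts back an order_id and that some shared store links the two requests (a flag file per order here, purely for illustration):

// callback.php - the separate request the payment platform sends to your server
file_put_contents('/tmp/payment_' . basename($_POST['order_id']), 'paid');

// check.php - polled via Ajax by the customer's waiting page; it redirects
// to the success page once it sees "paid"
echo file_exists('/tmp/payment_' . basename($_GET['order_id'])) ? 'paid' : 'pending';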
Okay, it sounds like you want to connect to an outside web service from your page and then display the results to your users. In PHP, you'd probably want to create a form processor that takes the user data and then uses cURL to pass it along to the bank. Once the bank receives the request, it will send back a response which you can then display to the user, or you can redirect them to a page that says it was a success.
cURL will wait for a while (you can specify how long it waits) for the response from the banking folks. In this example, I have told the program to wait for 30 seconds. If it finishes before the 30 seconds, it will go ahead and close the connection.
<?php
$bank_url = 'http://www.bank.com';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $bank_url);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$response = curl_exec($ch);
if ($response === false) {
    // the request failed, e.g. it hit the 30-second timeout
    $response = 'Error: ' . curl_error($ch);
}
curl_close($ch);
print $response;

Abort cURL in an Ajax request

I'm sending an Ajax request to an ajax.php file that downloads an XML document using cURL.
//ajax.php
$ch = curl_init();
curl_setopt($ch, CURLOPT_USERPWD, USERNAME.':'.PASSWORD);
curl_setopt($ch, CURLOPT_POSTFIELDS, getData());
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
$data = curl_exec($ch);
echo $data;
The user is not told about this process and may refresh the web page.
Sometimes curl_exec($ch) takes a long time and I get a timeout error. This error prevents the script from continuing. I searched and found no exact solution to this problem.
The bigger problem is that, while the Ajax request is processing in the background, if the user refreshes the page it won't reload until the Ajax request times out or ends.
I thought that aborting the cURL request when the page is refreshed would be a good temporary solution, but I don't know how to do that.
Note: the Ajax request is set up using jQuery, and aborting it with ajax.abort() did not solve the problem.
You should change the way your application handles this functionality.
I would put another abstraction layer between the Ajax call and the actual processing.
For example, the Ajax call could start a PHP background process, or better yet put a message on a middleware queue with workers. In the response you give the customer a job id (or store it in the database linked to the user id). Then, on a timer, the page sends Ajax requests to get the status of the job. In the meantime the background process or middleware worker processes the task and saves the response into the database under the same job id.
So in the end the customer starts a job and after that just checks the status of that job. When the job is finished, you know it on the server side, so on the next Ajax request you respond with the actual job result. The JavaScript receives the response with the result and executes a callback function that continues the work on the client side.
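A minimal sketch of the status-check half of that idea, assuming a hypothetical jobs table with id, status and result columns (the enqueueing code and the worker are left out, and the PDO credentials are placeholders):

// status.php - polled by the client until the job is done
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT status, result FROM jobs WHERE id = ?');
$stmt->execute([$_GET['job_id']]);
$job = $stmt->fetch(PDO::FETCH_ASSOC);

// The client keeps polling while status is "pending" and runs its
// callback with "result" once status becomes "done".
header('Content-Type: application/json');
echo json_encode($job ?: ['status' => 'unknown']);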
You could try using a try/catch block to return something you can deal with whenever this (or another) error occurs.

php curl check if url is reachable before query

We're having problems with an API we are using.
Here is the code we're using (naming no names on the API front):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://apiurl.com/whatever/api/we/call');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$ch_output = curl_exec($ch);
curl_close($ch);
The request times out, but only after a very long while. This hideously slows down our web app, and further code then breaks because of the bad return value. That part I can fix; what I don't know how to fix is the response timeout. Is there any way to quickly see whether a URL is "responding" (something like ping in a terminal) before trying to make a cURL request?
Thank you.
Do you mean using curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, NUMERIC_TIMEOUT_VALUE); to set the timeout?
Your best option would be to set the timeouts on cURL to a more acceptable level. There are several timeout options available for the DNS lookup, the connection, the transfer, etc. More information is available here: http://php.net/manual/en/function.curl-setopt.php
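A minimal sketch of what that could look like for the snippet above; the exact limits are assumptions to tune for your API:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://apiurl.com/whatever/api/we/call');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2); // give up if no connection within 2 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 5);        // give up if the whole transfer exceeds 5 seconds
$ch_output = curl_exec($ch);
if ($ch_output === false) {
    // an unreachable or slow API ends up here instead of hanging the app
    error_log('API request failed: ' . curl_error($ch));
}
curl_close($ch);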

cURL hangs on request, waits for timeout to proceed

I'm encountering a problem to which I can't find a solution anywhere. Even worse, no one else seems to have this problem, so I'm probably doing something very stupid.
Some background info: I'm trying to make a proxy-like page that forwards an Ajax request to a different server, to circumvent the same-origin policy. All I want this code to do is take the POST variables, forward them to a different page, and then return the results. It's been working, except for one thing: every time, it waits for the timeout before continuing. I've set it to 1 second for now, so it's doing OK, but I'd rather have a fast response and a proper timeout.
Here's my code:
// create a new cURL resource
$call = curl_init();
// set URL and other appropriate options
curl_setopt($call, CURLOPT_URL, $url);
curl_setopt($call, CURLOPT_POST, true);
curl_setopt($call, CURLOPT_POSTFIELDS, $params);
curl_setopt($call, CURLOPT_HEADER, false);
curl_setopt($call, CURLOPT_RETURNTRANSFER, true);
curl_setopt($call, CURLOPT_CONNECTTIMEOUT, 1);
// grab URL and pass it to the browser
$response = curl_exec($call);
// close cURL resource, and free up system resources
curl_close($call);
echo $response;
I've tried sending a "Connection: close" header with it, and several ways to make the target code signal that it's done running (setting Content-Length, flushing, die(), etc.). At this point I really don't know what's going on; what surprises me most is that I can't find anyone with a similar problem.
Who can help me?
This would make sense if the server weren't actually completing the request. This would be expected in a page streaming or service streaming scenario. Are you sure that the server is actually returning a full and complete HTTP response to each request?
Sounds like it's trying to connect, timing out, and the retry is working.
This fixed it for me:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
I can connect on the command line via IPv6, so I don't know why this helps.
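For the snippet above, that option would slot in alongside the other curl_setopt() calls; pairing it with an overall timeout is my own assumption, as a safety net in case the stall has another cause:

curl_setopt($call, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4); // skip the IPv6 attempt that can stall
curl_setopt($call, CURLOPT_TIMEOUT, 5);                   // hard cap on the whole transfer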
