PHP cURL request takes up the entire Timeout - php

I currently have a Laravel application which is making a cURL request from one route to another route within the same application. My cURL code looks like this:
//LOGGING THAT A CURL CALL IS ABOUT TO BE MADE
$url = env('APP_URL') . '/tests/add/results';
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the response as a string instead of outputting it
curl_setopt($ch, CURLOPT_HEADER, FALSE);
curl_setopt($ch, CURLINFO_HEADER_OUT, FALSE);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, $test_post_data);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
In the route that the POST is being sent to, the first thing I log is the request data:
//LOGGING THAT I RECEIVED THE CURL CALL in receiving function
I'm noticing that the request data gets logged exactly one timeout period after the call is made, meaning the request is only received 10 seconds after the initial call.
In my logs I'll see something like:
10:10:10 - LOGGING CURL CALL
10:10:20 - Receiving CURL call
If I change the timeout to 30, then the log shows that I received the cURL call 30 seconds later.
Does anyone have any idea why this may be happening?
The response from the cURL call always just comes back as false.

I did the following to make the POST request work:
Instead of calling the route via cURL, I did a POST directly to the route's controller function:
$testController = new TestsController;
$test_data_request = new \Illuminate\Http\Request();
$test_data_request->setMethod('POST');
$test_data_request->request->add( $test_post_data );
$testId = $testController->addTestResults($test_data_request);
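If you'd rather not instantiate the controller yourself, a similar result can be had by dispatching the request through Laravel's HTTP kernel instead. A sketch only, assuming the same /tests/add/results route and $test_post_data payload:
// Build a POST request for the route and run it through the application itself,
// so routing and middleware still apply - no HTTP round-trip involved
$request = \Illuminate\Http\Request::create('/tests/add/results', 'POST', $test_post_data);
$response = app(\Illuminate\Contracts\Http\Kernel::class)->handle($request);
$testId = $response->getContent(); // whatever the controller returned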

You've not provided enough information, but I think the problem will be one or more of the following:
The web server at http://127.0.0.1:8000 is not running
The script located at http://127.0.0.1:8000/tests/add/results runs too long and the request times out before it completes
The requested path returns redirect headers and creates an infinite loop
The response is too big to finish the data transfer in thirty seconds (very weird if on localhost)
Try some more debugging and provide more information, so we can help you.
PS: First, I would try to capture the outgoing request headers (curl_setopt($ch, CURLINFO_HEADER_OUT, true);) and print out the response (var_dump($response);), or save it to some file :)
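A minimal debugging sketch along the lines of that PS, reusing $url and $test_post_data from the question (the logging target is just error_log here), might look like:
//DEBUGGING VERSION OF THE CALL: capture sent headers, errors and timing info
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLINFO_HEADER_OUT, true); // record the request headers curl actually sends
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $test_post_data);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
if ($response === false) {
    // curl_errno()/curl_error() say why the call failed (e.g. 28 = operation timed out)
    error_log('cURL error ' . curl_errno($ch) . ': ' . curl_error($ch));
}
error_log('Request headers sent: ' . var_export(curl_getinfo($ch, CURLINFO_HEADER_OUT), true));
error_log('Transfer info: ' . var_export(curl_getinfo($ch), true));
curl_close($ch);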

Related

What am I doing wrong here (CURL), no matter what I try it returns empty/null

$url = "http://www.reddit.com/r/{mysubreddit}/new.json";
$fields = "sort=new";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $fields);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);
var_dump($data);
{mysubreddit} is whatever subreddit I want to check. It works fine to just grab that URL via Postman, or even in the browser. But when I use PHP/cURL, it returns empty. I've tried replacing the URL with a URL to another site, and it works fine, so the cURL part is working.
Is there something with reddit that I have to set? Headers? Or explicitly ask for JSON? Or what?
I thought it might have to do with POST, but I tried GET too, and it's still empty/null.
$url = "http://www.reddit.com/r/{mysubreddit}/new.json?sort=new";
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);
That doesn't work either.
You just need to add:
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
As others have mentioned, reddit is sending you a 302 redirect to https. You would be able to see that by examining the headers returned by curl_getinfo().
Enabling redirect following, as sorak describes, will work. However, it's not a good solution: you will make two HTTP requests on every single API call. That is a completely unnecessary waste of network resources and increases the execution time of your script. Instead, just change the URL that you're requesting so it points at https://www.reddit.com/ in the first place.
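A minimal sketch of that approach, using just the GET variant from the question switched to https (json_decode is optional, shown only to confirm the body really is JSON):
$url = "https://www.reddit.com/r/{mysubreddit}/new.json?sort=new"; // https, so no 302 redirect is issued
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);
$listing = json_decode($data, true); // should now be the subreddit listing, not an empty redirect body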

PHP: Call URL with setting timeout?

I want to call a certain URL in my PHP script, and if I do not get a response after 10s, the script should just continue. Does anyone know how I can do that?
I only found two ways. One is fopen(), which terminates my script if it doesn't get a response, and the other is cURL, which just calls the URL without waiting for or getting a response?
But how can I say: try to get the content of the URL, and if you do not get anything after 10s, then continue and ignore the URL?
You can use curl.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "url");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
curl_close($ch);
If a timeout occurs, the $response variable will be boolean false.
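If you also want to know whether a false response was really the timeout and not some other failure, curl_errno() can tell you. A small sketch, with "url" standing in for your real URL as above:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "url");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // give up after 10 seconds
$response = curl_exec($ch);
if ($response === false && curl_errno($ch) === 28) { // 28 = CURLE_OPERATION_TIMEDOUT
    // no answer within 10s - ignore this URL and carry on with the script
    $response = null;
}
curl_close($ch);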

Does curl_exec() retry if timeout?

In my application I have to make a POST call to a webservice. They send me an XML response, basically saying "Accepted" or "Refused".
Last week I had an issue with one of these calls: I received a "Refused" response while their backend was telling me this request had been accepted.
I asked them what happened and they told me they received 2 requests (with the same ID - a parameter I send to them). First one was "Refused", second one was "Accepted".
I investigated: in my code, if I receive a "Refused" response, I log it, I update my database, and that's it. I do not try again.
The only thing I can think of that might have sent a second request is the PHP cURL functions.
The day the problem occurred, the webservice took an unusually long time to respond (20 seconds).
Could curl have made several calls? There is no retry option in the PHP function (or I didn't find it), but I'd rather ask here to be sure.
Here is my curl code.
$ch = curl_init();
$myArrayWithDatas = array( '...' );
$httpQueryFields = http_build_query($myArrayWithDatas);
$url = "https://www.webservice.com/api";
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $httpQueryFields);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
if (empty($response)) {
    // I log an error
    // (no trace of that in my logs)
} else {
    // I log the XML response
    // (one "Refused" response logged)
}
curl_close($ch);
Is there any case where this code could send 2 or more requests to the $url?
curl_exec() will only make one call.
Are you running your code via a cron job or scheduled task? If that's the case, maybe your code was launched twice, and that would explain why two calls were made.
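If a double launch from cron is the suspicion, one common safeguard is a simple lock around the job. A sketch only - the lock file path is made up:
// Prevent two overlapping runs of the same job from posting the same request twice
$lock = fopen('/tmp/webservice-call.lock', 'c'); // hypothetical lock file location
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    // another instance is already running - log it and bail out
    error_log('Job already running, skipping this launch');
    exit(0);
}

// ... the existing cURL code from the question goes here ...

flock($lock, LOCK_UN);
fclose($lock);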

Using CURL - Post and redirect help

I've been banging my head against a wall for a few hours now - and it's probably something really obvious I've missed!
I'm trying to connect to a payment service provider (PSP) using cURL, POST data, and follow the POST so the user actually ends up on the PSP's site.
Using the following:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://psp.com/theirpage');
curl_setopt($ch, CURLOPT_REFERER, "http://mysite.com/mypage");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS,$params);
curl_setopt($ch, CURLOPT_POST, 1);
$result=curl_exec($ch);
curl_close($ch);
This successfully connects, verifies the data I've passed, but instead of redirecting the user to the PSP, it just loads the HTML on my site. Safe mode is off, and open_basedir is blank.
What am I doing wrong?
cURL does an internal redirect, and it won't have any effect on the user viewing your script. Keep in mind that the payment was made by your server, NOT the user's computer, hence expecting the session to work for the user is incorrect. cURL 'is the browser' here.
If you just want a redirect after payment is made via cURL, you will have to do it via header() or by using some JS like window.location.
The cURL request is being made from your server, and as such your server is receiving the response page. There's no way to initiate the request from the server and have the client receive the response. Either return the HTML to the user from your site (as you're doing), or make the request from the client's browser using JavaScript. Hope that helps.
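For example, once the server-to-server POST has come back and been validated, you might send the visitor on with a plain header() redirect. A sketch only - the confirmation URL is made up:
if ($result !== false) {
    // redirect the *user's browser*; the cURL call itself cannot move the user anywhere
    header('Location: https://psp.com/confirmation'); // hypothetical confirmation page on the PSP side
    exit;
}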

How to make a cUrl request without receiving the response?

Normally I POST data when I initiate cURL, and I wait for the response, parse it, etc.
I want to simply POST data and not wait for any response.
In other words, can I send data to a URL via cURL and close my connection immediately (not waiting for any response, or even to see if the URL exists)?
It's not a normal thing to ask, but I'm asking anyway.
Here's what I have so far:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $MyUrl);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data_to_send);
curl_exec($ch);
curl_close($ch);
I believe the only way to not actually receive the whole response from the remote server is by using CURLOPT_WRITEFUNCTION. For example:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $MyUrl);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data_to_send);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'do_nothing');
curl_exec($ch);
curl_close($ch);
function do_nothing($curl, $input) {
    return 0; // aborts transfer with an error
}
Important notes
Be aware that this will generate a warning, as the transfer will be aborted.
Make sure that you do not set the value of CURLOPT_RETURNTRANSFER, as this will interfere with the write callback.
You could do this through the curl_multi_* functions that are designed to execute multiple simultaneous requests - just fire off one request and don't bother asking for the response.
Not sure what the implications are in terms of what will happen if the script exits and curl is still running.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $MyUrl);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data_to_send);
$mh = curl_multi_init();
curl_multi_add_handle($mh,$ch);
$running = 'idc';
curl_multi_exec($mh,$running); // asynchronous
// don't bother with the usual cleanup
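On the open question about the script exiting: a single curl_multi_exec() call can return before the request has even been written to the socket. If you want to give curl a short, bounded head start without waiting for the response, a sketch like this (continuing from the handles above; the 0.5-second budget is arbitrary) is one option:
// Drive the multi handle until curl stops asking to be called again immediately
do {
    $status = curl_multi_exec($mh, $running);
} while ($status === CURLM_CALL_MULTI_PERFORM);
// Give the transfer at most ~0.5s to push the request out, then move on regardless
$deadline = microtime(true) + 0.5;
while ($running && microtime(true) < $deadline) {
    curl_multi_select($mh, 0.1); // wait briefly for socket activity
    curl_multi_exec($mh, $running);
}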
Not sure if this helps, but via command-line I suppose you could use the '--max-time' option - "Maximum time in seconds that you allow the whole operation to take."
I had to do something quick and dirty and didn't want to re-program code or wait for a response, so I found the --max-time option in the curl manual:
curl --max-time 1 URL
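And if you need to fire that from inside PHP rather than from a shell, one equally quick-and-dirty sketch (assuming a Unix-like host with the curl binary available, and that $data_to_send is already a URL-encoded string) is to background the command so the script never waits:
// Fire-and-forget: run the curl binary in the background and discard its output
$cmd = 'curl --max-time 1 --data ' . escapeshellarg($data_to_send)
     . ' ' . escapeshellarg($MyUrl) . ' > /dev/null 2>&1 &';
exec($cmd);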
