I have a function that checks whether a URL is valid before putting it on my page, and I use
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_exec($ch);
$retcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
to do so. However, an external site recently stopped responding and kept loading forever, and that problem was reflected onto my site: it got stuck at that last line of code and eventually ended in a fatal timeout error. How can I tell the server to only try this check for a couple of seconds and just return false if it's taking too long? I'd rather not get an uncatchable timeout error that compromises my whole page instead of just hiding a URL.
See http://php.net/manual/en/function.curl-setopt.php and use CURLOPT_CONNECTTIMEOUT.
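For example, a minimal sketch of such a check with both a connect timeout and an overall timeout (the five-second limits and the function name are just illustrative):
function url_is_reachable($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo anything
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);     // give up connecting after 5 seconds
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);            // give up on the whole request after 5 seconds
    curl_exec($ch);
    $retcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $errno = curl_errno($ch);                        // non-zero on timeout or any other failure
    curl_close($ch);
    return $errno === 0 && $retcode >= 200 && $retcode < 400;
}
On a timeout curl_exec() fails, curl_errno() is non-zero, and the function returns false instead of hanging.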
Related
So I have obviously Googled the error, but PHP (PHP 7.4.4 (cli)) cURL gives me the error
Curl error: operation aborted by callback
with the following code:
private function curl_post($url, $post, $file = null, $file_type = 'audio/wav') {
    $ch = curl_init($url);
    if (!empty($file)) {
        $post['SoundFile'] = new CURLFile(UPLOAD_PATH.$file, $file_type, $file);
    }
    // Assign POST data
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    if (curl_errno($ch)) echo 'Curl error: '.curl_error($ch);
    curl_close($ch);
    print '<pre>Curl (rec): '."\n"; print_r($result); print '</pre>';
}
I control both (Ubuntu) servers and have rebooted them both. I am posting a fairly large amount of data, but from my Google searching that didn't seem to be what triggers the curl_error. Does anyone know what is causing it? It was working perfectly fine and then it stopped.
Additionally, putting file_put_contents(time().'.txt','log'); as a break in my receiving server does log the response, so it's clearly landing in the right area.
What I will also add is that the two servers talk to each other a number of times through cURL (so one curls to the other and then back again). Furthermore, error 42 is the cURL response, but https://curl.haxx.se/libcurl/c/libcurl-errors.html doesn't seem to provide much help. I've tried tracing the various calls to each other and can't see why it is breaking - it errors/breaks before the post/calls even occur.
So I found the answer, and I hope this helps anyone else in this situation. The reason was that the file for the CURLFile was missing on the server (having previously been there). My code now reads:
if (!empty($file) && is_file(UPLOAD_PATH.$file)) {
    $post['SoundFile'] = new CURLFile(UPLOAD_PATH.$file, $file_type, $file);
}
And this no longer generates the error. The key was that it errored even before submitting the POST, but the error itself wasn't that helpful until I broke the code down in a separate test script and added the file element back in.
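For anyone hitting the same error, a quick way to make the failure visible (a sketch, assuming the same UPLOAD_PATH constant) is to log the missing file rather than silently skipping it:
if (!empty($file)) {
    if (is_file(UPLOAD_PATH.$file)) {
        $post['SoundFile'] = new CURLFile(UPLOAD_PATH.$file, $file_type, $file);
    } else {
        // example only: surface the real problem instead of a cryptic cURL error 42
        error_log('curl_post: missing upload file '.UPLOAD_PATH.$file);
    }
}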
Edit: I have now checked by using an HTML form to submit the data via POST, and this works perfectly, so it is definitely a cURL error - any help hugely appreciated! It seems strange that this worked last night and not tonight...
So I managed to get my first cURL function working last night. For some reason (and with no changes that I'm aware of) it is not working today.
My code is:
<?php
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, "http://www.example.com/api");
curl_setopt($curl, CURLOPT_POST, 1);
curl_setopt($curl, CURLOPT_POSTFIELDS, "apiKey=var1&message=var2&to=var3&from=var4");
curl_exec($curl);
curl_close($curl);
?>
Now I don't get any errors in the PHP log and I can't see any obvious syntax error. I've checked that I'm using PHP 7 and that cURL is installed (if that's the correct technical term). When I access the API address the POST data is going to without using this function, it throws up an authentication error, so at the very least I would expect that, but I'm only getting a blank page.
This leads me to believe the cURL itself is not POSTing the data but just ignoring it somehow. If I put headers in after the cURL they do follow, so there is definitely no fatal error in the code.
Sorry to point out the obvious, but have you checked for cURL errors?
$err = curl_errno($curl);
$errmsg = curl_error($curl);
$info = curl_getinfo($curl);
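Applied to the code in the question, a minimal sketch of this check would look something like the following (it also enables CURLOPT_RETURNTRANSFER so the response can be inspected):
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, "http://www.example.com/api");
curl_setopt($curl, CURLOPT_POST, 1);
curl_setopt($curl, CURLOPT_POSTFIELDS, "apiKey=var1&message=var2&to=var3&from=var4");
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);   // capture the response instead of printing it
$response = curl_exec($curl);
if ($response === false) {
    echo 'cURL error ('.curl_errno($curl).'): '.curl_error($curl);
} else {
    var_dump(curl_getinfo($curl));                  // status code, timings, etc.
    echo $response;
}
curl_close($curl);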
So it turns out the provider that hosts the API stopped accepting requests over HTTP and now requires HTTPS.
A quick fix for this - though I hasten to add it is not optimal from a security standpoint, which is fine for us as the data is NOT sensitive, but something to bear in mind - is to add this line of code:
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
At the beginning of your curl function.
I found the answer here which also discusses a much more robust fix:
http://unitstep.net/blog/2009/05/05/using-curl-in-php-to-access-https-ssltls-protected-sites/
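For reference, a sketch of the more robust fix is to keep peer verification on and point cURL at a CA bundle (the path below is only an example; use whatever bundle your system provides):
curl_setopt($curl, CURLOPT_URL, "https://www.example.com/api");            // note https
curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($curl, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($curl, CURLOPT_CAINFO, "/etc/ssl/certs/ca-certificates.crt");  // example path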
I've been using cURL to get the output of an external page and it's worked great for months, but suddenly it stopped working. My code is like this:
$ch = curl_init($URL);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
The URL is valid, I checked that it still works and it does, and through debugging I found that the $output variable's value is false, which according to the PHP manual is what curl_exec() returns on failure.
So, after working for a long time, and without any changes to my code (that I know of), the cURL transfer is suddenly not working.
How can I debug why it's not working?
I would start with curl_error()
You can use the curl_error() function to see the error returned by curl.
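For example, a minimal debugging sketch along those lines:
$ch = curl_init($URL);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
if ($output === false) {
    // curl_errno() gives the numeric code, curl_error() a human-readable message
    echo 'cURL error ('.curl_errno($ch).'): '.curl_error($ch);
}
curl_close($ch);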
I'm encountering a problem to which I can't find a solution anywhere. Even worse, no one else seems to have this problem, so I'm probably doing something very stupid.
Some background info: I'm trying to make a proxy-like page that forwards an AJAX request to a different server, to circumvent the same-origin policy. All I want this code to do is take the POST variables, forward them to a different page, and then return the results. It's been working except for one thing: every request waits for the timeout before continuing. I've set it to 1 second now, so it's doing OK for the moment, but I'd rather have a fast response and a proper timeout.
Here's my code:
// create a new cURL resource
$call = curl_init();
// set URL and other appropriate options
curl_setopt($call, CURLOPT_URL, $url);
curl_setopt($call, CURLOPT_POST, true);
curl_setopt($call, CURLOPT_POSTFIELDS, $params);
curl_setopt($call, CURLOPT_HEADER, false);
curl_setopt($call, CURLOPT_RETURNTRANSFER, true);
curl_setopt($call, CURLOPT_CONNECTTIMEOUT, 1);
// grab URL and pass it to the browser
$response = curl_exec($call);
// close cURL resource, and free up system resources
curl_close($call);
echo $response;
I've tried sending a "Connection: close" header with it, and several ways to make the target code signal that it's done running (setting Content-Length, flushing, die(), etc.). At this point I really don't know what's going on; what surprises me most is that I can't find anyone with a similar problem.
Who can help me?
This would make sense if the server weren't actually completing the request. This would be expected in a page streaming or service streaming scenario. Are you sure that the server is actually returning a full and complete HTTP response to each request?
Sounds like it's trying to connect, timing out, and the retry is working.
This fixed it for me:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
I can connect on the commandline via ipv6, so I don't know why this helps.
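In the code from the question, that would just mean adding this line before curl_exec():
curl_setopt($call, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4); // force IPv4 name resolution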
I'm running a curl request on an eXist database through php. The dataset is very large, and as a result, the database consistently takes a long amount of time to return an XML response. To fix that, we set up a curl request, with what is supposed to be a long timeout.
$ch = curl_init();
$headers["Content-Length"] = strlen($postString);
$headers["User-Agent"] = "Curl/1.0";
curl_setopt($ch, CURLOPT_URL, $requestUrl);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, 'admin:');
curl_setopt($ch, CURLOPT_TIMEOUT, 1000);
$response = curl_exec($ch);
curl_close($ch);
However, the cURL request consistently ends before the request is completed (it takes under 1000 seconds when requested via a browser). Does anyone know if this is the proper way to set timeouts in cURL?
See documentation: http://www.php.net/manual/en/function.curl-setopt.php
CURLOPT_CONNECTTIMEOUT - The number of seconds to wait while trying to connect. Use 0 to wait indefinitely.
CURLOPT_TIMEOUT - The maximum number of seconds to allow cURL functions to execute.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 400); //timeout in seconds
Also don't forget to increase the maximum execution time of the PHP script itself:
set_time_limit(0); // zero means no limit, for example
Hmm, it looks to me like CURLOPT_TIMEOUT defines the amount of time that any cURL function is allowed to take to execute. I think you should actually be looking at CURLOPT_CONNECTTIMEOUT instead, since that tells cURL the maximum amount of time to wait for the connection to complete.
There is a quirk with this that might be relevant for some people... From the PHP docs comments.
If you want cURL to timeout in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to timeout immediately if the value is < 1000 ms with the error "cURL Error (28): Timeout was reached". The explanation for this behavior is:
"If libcurl is built to use the standard system name resolver, that portion of the transfer will still use full-second resolution for timeouts with a minimum timeout allowed of one second."
What this means to PHP developers is "You can't use this function without testing it first, because you can't tell if libcurl is using the standard system name resolver (but you can be pretty sure it is)"
The problem is that on (Li|U)nix, when libcurl uses the standard name resolver, a SIGALRM is raised during name resolution which libcurl thinks is the timeout alarm.
The solution is to disable signals using CURLOPT_NOSIGNAL. Here's an example script that requests itself causing a 10-second delay so you can test timeouts:
if (!isset($_GET['foo'])) {
    // Client
    $ch = curl_init('http://localhost/test/test_timeout.php?foo=bar');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);
    $data = curl_exec($ch);
    $curl_errno = curl_errno($ch);
    $curl_error = curl_error($ch);
    curl_close($ch);

    if ($curl_errno > 0) {
        echo "cURL Error ($curl_errno): $curl_error\n";
    } else {
        echo "Data received: $data\n";
    }
} else {
    // Server
    sleep(10);
    echo "Done.";
}
From http://www.php.net/manual/en/function.curl-setopt.php#104597
Your code sets the timeout to 1000 seconds. For milliseconds, use CURLOPT_TIMEOUT_MS.
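For example, if one second really is the intended cap (keep in mind the CURLOPT_NOSIGNAL caveat from the previous answer for sub-second values on systems using the standard resolver):
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1000); // overall timeout in milliseconds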
You will need to check the timeouts between you and the file. In this case that means both PHP and cURL.
To tell Curl to never timeout when a transfer is still active, you need to set CURLOPT_TIMEOUT to 0, instead of 1000.
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
In PHP, again, you must remove time limits or PHP itself (after 30 seconds by default) will kill the script along with cURL's request. This alone should fix your issue.
In addition, if you require data integrity, you could add a layer of security by using ignore_user_abort:
# The maximum execution time, in seconds. If set to zero, no time limit is imposed.
set_time_limit(0);
# Make sure the script keeps running even if the client disconnects.
ignore_user_abort(true);
A client disconnection would otherwise interrupt the execution of the script and possibly damage data, e.g. a non-transactional database query, building a config file, etc., while in your case it would download a partial file... and you may or may not care about this.
Answering this old question because this thread is at the top of search engine results for CURL_TIMEOUT.
You can't run the request from a browser; it will time out waiting for the server running the cURL request to respond. The browser is probably timing out after 1-2 minutes, the default network timeout.
You need to run it from the command line/terminal.
If you are using PHP as a fastCGI application then make sure you check the fastCGI timeout settings.
See: PHP curl put 500 error