Using PHP's cURL, how do I know that a page is not responding, so that I can grab another one?
Use the CURLOPT_CONNECTTIMEOUT option:
// Wait two seconds before bailing
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 2);
There's also CURLOPT_TIMEOUT, which applies to the entire request (including DNS resolution and reading the data).
To check whether a call timed out, look at curl_exec()'s return value: it is false on failure. The handle's curl_errno() is then set, and you can compare it to CURLE_OPERATION_TIMEDOUT (or just against CURLE_OK).
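For example, a minimal sketch of that check (assuming $curl is an already-initialized handle for the page being fetched):
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 2);    // give up connecting after 2 seconds
curl_setopt($curl, CURLOPT_TIMEOUT, 10);          // cap the whole transfer at 10 seconds
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
$body = curl_exec($curl);
if ($body === false && curl_errno($curl) === CURLE_OPERATION_TIMEDOUT) {
    // Timed out: move on and grab another page here.
}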
Related
From a PHP page, I have to do a GET request to another PHP file.
I don't care to wait for the response of the GET, or to know whether it succeeded.
The called script could also take 5-6 seconds to finish, so I don't know how to handle the GET timeout, given what has been said above.
The code is this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://mywebsite/myfile.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$content = trim(curl_exec($ch));
curl_close($ch);
For the first task (where you don't need to wait for the response), you can start a new background process and, below that, put the code that redirects to another page.
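A rough sketch of the fire-and-forget variant (the timeout values are illustrative; CURLOPT_NOSIGNAL is needed for sub-second timeouts on Unix, as discussed further down):
$ch = curl_init('http://mywebsite/myfile.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);     // allow timeouts below 1 second on Unix
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // bail out almost immediately
curl_exec($ch);                            // expected to time out; result is ignored
curl_close($ch);
Note that myfile.php itself should call ignore_user_abort(true) so it keeps running after cURL disconnects.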
Yeah, you definitely shouldn't be creating a file on the server in response to a GET request. Even as a side-effect, it's less than ideal; as the main purpose of the request, it just doesn't make sense.
If you were doing this as a POST, you'd still have the same issue to work with, however. In that case, if the action can't be guaranteed to happen quickly enough to be acceptable in the context of HTTP, you'll need to hive it off somewhere else. E.g. make your HTTP request send a message to some other system which then works in parallel whilst the HTTP response is free to be sent back immediately.
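As a rough illustration of that hand-off (the spool file path and the worker are hypothetical; a real message queue would be more robust):
// HTTP endpoint: record the job and answer immediately.
$job = json_encode(array('task' => 'slow-thing', 'params' => $_POST)) . "\n";
file_put_contents('/var/spool/myapp/jobs.log', $job, FILE_APPEND | LOCK_EX);
http_response_code(202); // Accepted: the work happens out-of-band
// A separate worker (cron job or daemon) reads jobs.log and does the slow work.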
I am calling a series of links using file_get_contents() in a loop. Each link may take more than 15 minutes to process, so I'm worried about whether PHP's file_get_contents() has a timeout period.
If it does, a call will time out and move on to the next link, and I don't want to call the next link before the prior one has finished.
So please tell me whether file_get_contents() has a timeout period. The script containing the file_get_contents() call has its set_time_limit() set to zero (unlimited).
The default timeout is defined by the default_socket_timeout ini setting, which is 60 seconds. You can also change it on the fly:
ini_set('default_socket_timeout', 900); // 900 Seconds = 15 Minutes
Another way to set a timeout would be to use stream_context_create() to set the timeout as an HTTP context option of the HTTP stream wrapper in use:
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1200, // 1200 seconds = 20 minutes
    ),
));
echo file_get_contents('http://example.com/', false, $ctx);
As @diyism mentioned, "default_socket_timeout, stream_set_timeout, and stream_context_create timeout are all the timeout of every line read/write, not the whole connection timeout." And the top answer by @stewe failed me.
As an alternative to using file_get_contents, you can always use curl with a timeout.
So here is working code for calling links:
$url='http://example.com/';
$ch=curl_init();
$timeout=5;
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$result=curl_exec($ch);
curl_close($ch);
echo $result;
Yes! By passing a stream context in the third parameter:
Here with a timeout of 1s:
file_get_contents("https://abcedef.com", false, stream_context_create(["http" => ["timeout" => 1]]));
Source: the comment section of https://www.php.net/manual/en/function.file-get-contents.php
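On timeout, file_get_contents() returns false (and raises a warning), so a minimal check could be:
$ctx = stream_context_create(["http" => ["timeout" => 1]]);
$body = @file_get_contents("https://abcedef.com", false, $ctx);
if ($body === false) {
    // Timed out or otherwise failed: handle the error here.
}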
HTTP context options (a combined example follows the lists below):
method
header
user_agent
content
request_fulluri
follow_location
max_redirects
protocol_version
timeout
Non-HTTP stream contexts:
Socket
FTP
SSL
CURL
Phar
Context (notifications callback)
Zip
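For instance, several of the HTTP options above can be combined in one context (the values here are illustrative):
$ctx = stream_context_create(array(
    'http' => array(
        'method'          => 'GET',
        'header'          => "Accept: text/html\r\n",
        'timeout'         => 30, // seconds
        'follow_location' => 1,
        'max_redirects'   => 5,
    ),
));
echo file_get_contents('http://example.com/', false, $ctx);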
It is worth noting that when changing default_socket_timeout on the fly, it can be useful to restore its value after your file_get_contents() call:
$default_socket_timeout = ini_get('default_socket_timeout');
....
ini_set('default_socket_timeout', 10);
file_get_contents($url);
...
ini_set('default_socket_timeout', $default_socket_timeout);
For me it worked when I changed php.ini on my host:
; Default timeout for socket based streams (seconds)
default_socket_timeout = 300
For prototyping, using curl from the shell with the -m (--max-time) parameter lets you cap the whole transfer in seconds (decimal values are allowed), and it works in every case: whether the connection never initiates, the server returns a 404 or 500, the URL is bad, or the full body isn't retrieved in the allowed window, the timeout always takes effect. PHP will never hang.
Simply don't pass unsanitized user data in the shell call.
system("curl -m 50 -X GET 'https://api.kraken.com/0/public/OHLC?pair=LTCUSDT&interval=60' -H 'accept: application/json' > data.json");
// This data was fetched within the 50-second limit
var_dump(json_decode(file_get_contents("data.json"),true));
We're having problems with an api we are using.
Here is the code we're using (naming no names on the api front)
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://apiurl.com/whatever/api/we/call');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$ch_output = curl_exec($ch);
curl_close($ch);
This response times out, but only after a long wait. That hideously slows down our web app, and further code then breaks because of the bad return value. The breakage I can fix; the response timeout, however, I don't know how to fix. Is there any way to quickly see whether a URL is "responding" (something like ping in a terminal) before attempting the curl request?
Thank you.
Do you mean using curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, NUMERIC_TIMEOUT_VALUE); to set the timeout?
Your best option would be to set the cURL timeout to a more acceptable level. There are several timeout options available for DNS lookup, connect timeout, transfer timeout, etc. More information is available at http://php.net/manual/en/function.curl-setopt.php
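For instance, a quick "is it responding?" probe before the real call could look like this (the thresholds are illustrative; CURLOPT_NOBODY turns it into a HEAD-style request):
$probe = curl_init('http://apiurl.com/whatever/api/we/call');
curl_setopt($probe, CURLOPT_NOBODY, true);       // HEAD-style request, no body
curl_setopt($probe, CURLOPT_RETURNTRANSFER, true);
curl_setopt($probe, CURLOPT_CONNECTTIMEOUT, 2);  // give up connecting after 2 seconds
curl_setopt($probe, CURLOPT_TIMEOUT, 5);         // hard cap on the whole probe
$alive = (curl_exec($probe) !== false);
curl_close($probe);
if ($alive) {
    // Run the real request here, itself with sensible timeouts set.
}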
I'm encountering a problem to which I can't find a solution anywhere. Even worse, no one else seems to have this problem, so I'm probably doing something very stupid.
Some background info: I'm trying to make a proxy-like page that forwards an AJAX request to a different server, to circumvent the same-origin policy. All I want this code to do is take the POST variables, forward them to a different page, and then return the results. It's been working except for one thing: every time, it waits for the timeout before continuing. I've set it to 1 second now, so it's doing OK for the moment, but I'd rather have a fast response and a proper timeout.
Here's my code:
// create a new cURL resource
$call = curl_init();
// set URL and other appropriate options
curl_setopt($call, CURLOPT_URL, $url);
curl_setopt($call, CURLOPT_POST, true);
curl_setopt($call, CURLOPT_POSTFIELDS, $params);
curl_setopt($call, CURLOPT_HEADER, false);
curl_setopt($call, CURLOPT_RETURNTRANSFER, true);
curl_setopt($call, CURLOPT_CONNECTTIMEOUT, 1);
// grab URL and pass it to the browser
$response = curl_exec($call);
// close cURL resource, and free up system resources
curl_close($call);
echo $response;
I've tried sending a "Connection: close" header with it, and several ways to make the target code signal that it's done running (setting Content-Length, flushing, die(), etc.). At this point I really don't know what's going on; what surprises me most is that I can't find anyone with a similar problem.
Who can help me?
This would make sense if the server weren't actually completing the request. This would be expected in a page streaming or service streaming scenario. Are you sure that the server is actually returning a full and complete HTTP response to each request?
Sounds like it's trying to connect, timing out, and the retry is working.
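One way to see where the time actually goes is curl_getinfo()'s timing fields; a diagnostic sketch that would replace the curl_exec() line in the question ($call is the handle from there):
$response = curl_exec($call);
$info = curl_getinfo($call);
// namelookup_time, connect_time and starttransfer_time show which phase stalls.
printf("dns=%.3fs connect=%.3fs first_byte=%.3fs total=%.3fs\n",
    $info['namelookup_time'], $info['connect_time'],
    $info['starttransfer_time'], $info['total_time']);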
This fixed it for me:
curl_setopt($ch, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
I can connect on the command line via IPv6, so I don't know why this helps.
I'm running a curl request on an eXist database through PHP. The dataset is very large, and as a result the database consistently takes a long time to return an XML response. To deal with that, we set up a curl request with what is supposed to be a long timeout.
$ch = curl_init();
// CURLOPT_HTTPHEADER expects a plain array of "Name: value" strings,
// not an associative array.
$headers = array(
    "Content-Length: " . strlen($postString),
    "User-Agent: Curl/1.0",
);
curl_setopt($ch, CURLOPT_URL, $requestUrl);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, 'admin:');
curl_setopt($ch, CURLOPT_TIMEOUT, 1000);
$response = curl_exec($ch);
curl_close($ch);
However, the curl request consistently ends before the request is completed (the same request completes in under 1000 seconds when made via a browser). Does anyone know if this is the proper way to set timeouts in curl?
See documentation: http://www.php.net/manual/en/function.curl-setopt.php
CURLOPT_CONNECTTIMEOUT - The number of seconds to wait while trying to connect. Use 0 to wait indefinitely.
CURLOPT_TIMEOUT - The maximum number of seconds to allow cURL functions to execute.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 400); //timeout in seconds
Also don't forget to increase the execution time of the PHP script itself:
set_time_limit(0); // to infinity, for example
Hmm, it looks to me like CURLOPT_TIMEOUT defines the amount of time that any cURL function is allowed to take to execute. I think you should actually be looking at CURLOPT_CONNECTTIMEOUT instead, since that tells cURL the maximum amount of time to wait for the connection to complete.
There is a quirk with this that might be relevant for some people... From the PHP docs comments.
If you want cURL to timeout in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to timeout immediately if the value is < 1000 ms with the error "cURL Error (28): Timeout was reached". The explanation for this behavior is:
"If libcurl is built to use the standard system name resolver, that portion of the transfer will still use full-second resolution for timeouts with a minimum timeout allowed of one second."
What this means to PHP developers is "You can't use this function without testing it first, because you can't tell if libcurl is using the standard system name resolver (but you can be pretty sure it is)"
The problem is that on (Li|U)nix, when libcurl uses the standard name resolver, a SIGALRM is raised during name resolution which libcurl thinks is the timeout alarm.
The solution is to disable signals using CURLOPT_NOSIGNAL. Here's an example script that requests itself causing a 10-second delay so you can test timeouts:
if (!isset($_GET['foo'])) {
    // Client
    $ch = curl_init('http://localhost/test/test_timeout.php?foo=bar');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);
    $data = curl_exec($ch);
    $curl_errno = curl_errno($ch);
    $curl_error = curl_error($ch);
    curl_close($ch);

    if ($curl_errno > 0) {
        echo "cURL Error ($curl_errno): $curl_error\n";
    } else {
        echo "Data received: $data\n";
    }
} else {
    // Server
    sleep(10);
    echo "Done.";
}
From http://www.php.net/manual/en/function.curl-setopt.php#104597
Your code sets the timeout to 1000 seconds. For milliseconds, use CURLOPT_TIMEOUT_MS.
You will need to account for the timeouts at every layer between you and the file; in this case, PHP and cURL.
To tell Curl to never timeout when a transfer is still active, you need to set CURLOPT_TIMEOUT to 0, instead of 1000.
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
In PHP, again, you must remove the time limit, or PHP itself (after 30 seconds by default) will kill the script along with cURL's request. This alone should fix your issue.
In addition, if you require data integrity, you could add a layer of security by using ignore_user_abort:
# The maximum execution time, in seconds. If set to zero, no time limit is imposed.
set_time_limit(0);
# Make sure to keep alive the script when a client disconnect.
ignore_user_abort(true);
A client disconnection would otherwise interrupt the execution of the script and possibly damage data, e.g. a non-transactional database query or a config file being built, etc.; in your case it would mean a partially downloaded file... and you may or may not care about that.
Answering this old question because this thread is at the top of search-engine results for CURL_TIMEOUT.
You can't run the request from a browser; the browser will time out waiting for the server running the cURL request to respond, probably after 1-2 minutes, the default network timeout.
You need to run it from the command line/terminal.
If you are using PHP as a fastCGI application then make sure you check the fastCGI timeout settings.
See: PHP curl put 500 error
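For example (assuming an nginx + PHP-FPM setup; directive names vary with your stack), the settings to look at are typically:
; php-fpm pool config: hard-kill a request after 300 seconds
request_terminate_timeout = 300
# nginx: how long to wait for a response from the FastCGI backend
fastcgi_read_timeout 300;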