I am using a simple cURL call to fetch and parse XML on my site. When the API is up and working it works fine, but as soon as the API goes down for any reason the entire site crashes.
$url = 'http://www.mydomain.com/webservicexample';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
$data = curl_exec($ch);
curl_close($ch);
$xml = new SimpleXmlElement($data);
Is there a conditional I can put around the URL so that it only runs the cURL script if there's a positive response from the API? I tried the following, but it didn't work because it never got a server response to provide any headers:
$url_headers = @get_headers($url);
if($url_headers[0] == 'HTTP/1.1 200 OK') {
// do script
}
Any help/advice much appreciated!
You can check the return value of curl_exec():
if (false === ($data = curl_exec($ch))) {
die("Eek! Curl error! " . curl_error($ch));
}
And check the HTTP response code too:
if (200 !== (int)curl_getinfo($ch, CURLINFO_HTTP_CODE)) {
die("Oh dear, no 200 OK?!");
}
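Combining both checks with your original snippet, something along these lines should keep the page alive when the API is unreachable (a sketch, not a drop-in fix; how you fall back is up to you):
$url = 'http://www.mydomain.com/webservicexample';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);

$data = curl_exec($ch);
$code = (int) curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

$xml = null;
if ($data !== false && $code === 200) {
    try {
        $xml = new SimpleXMLElement($data);   // only parse a successful response
    } catch (Exception $e) {
        $xml = null;                          // malformed XML: treat it like a failure
    }
}
// if $xml is still null here, render the page without the feed instead of crashing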
In the end I was able to get it working by setting timeouts with CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT and then putting a conditional around the parsing using curl_errno().
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);
curl_setopt($ch, CURLOPT_TIMEOUT, 3);
$data = curl_exec($ch);
if (!curl_errno($ch)) {
    curl_close($ch);
    $xml = new SimpleXMLElement($data);
    return $xml;
}
curl_close($ch); // close the handle on failure too, then fall back as needed
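If the snippet above lives in a hypothetical fetchXml() helper, the calling code can then degrade gracefully whenever it returns nothing (a sketch; fetchXml() and the cache file are assumptions, not part of the original code):
// fetchXml() is a hypothetical wrapper around the cURL code above;
// it returns a SimpleXMLElement on success and nothing on failure.
$xml = fetchXml($url);
if (!$xml) {
    // API is down or timed out: fall back instead of letting the page crash
    $xml = @simplexml_load_file('cache/webservice-last-good.xml');
}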
Related
I'm using CURL to get the status of a site, if it's up/down or redirecting to another site. I want to get it as streamlined as possible, but it's not working well.
<?php
$ch = curl_init($url);
curl_setopt($ch,CURLOPT_RETURNTRANSFER,1);
curl_setopt($ch,CURLOPT_TIMEOUT,10);
$output = curl_exec($ch);
$httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
return $httpcode;
?>
I have this wrapped in a function. It works fine, but performance is not the best because it downloads the whole page; the thing is, if I remove $output = curl_exec($ch); it returns 0 all the time.
Does anyone know how to make the performance better?
First make sure the URL is actually valid (a string, not empty, good syntax); this is quick to check server-side. For example, doing this first could save a lot of time:
if (!$url || !is_string($url) || !preg_match('/^http(s)?:\/\/[a-z0-9-]+(\.[a-z0-9-]+)*(:[0-9]+)?(\/.*)?$/i', $url)) {
    return false;
}
Make sure you only fetch the headers, not the body content:
curl_setopt($ch, CURLOPT_HEADER, true);  // we want headers
curl_setopt($ch, CURLOPT_NOBODY, true);  // we don't need the body
For more details on getting the HTTP status code of a URL, see another post I made (it also helps with following redirects):
How can I check if a URL exists via PHP?
As a whole:
$url = 'http://www.example.com';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, true); // we want headers
curl_setopt($ch, CURLOPT_NOBODY, true); // we don't need body
curl_setopt($ch, CURLOPT_RETURNTRANSFER,1);
curl_setopt($ch, CURLOPT_TIMEOUT,10);
$output = curl_exec($ch);
$httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
echo 'HTTP code: ' . $httpcode;
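Wrapped back into a reusable function (a sketch combining the validation and the header-only request above; the function name is my own):
// Returns the HTTP status code for $url, or false if the URL is invalid
// or no response was received at all.
function getHttpStatus($url)
{
    if (!$url || !is_string($url) ||
        !preg_match('/^http(s)?:\/\/[a-z0-9-]+(\.[a-z0-9-]+)*(:[0-9]+)?(\/.*)?$/i', $url)) {
        return false;
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, true);   // we want headers
    curl_setopt($ch, CURLOPT_NOBODY, true);   // we don't need the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);

    curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    return $httpcode ?: false;                // 0 means no response at all
}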
// must set $url first....
$http = curl_init($url);
// do your curl thing here
$result = curl_exec($http);
$http_status = curl_getinfo($http, CURLINFO_HTTP_CODE);
curl_close($http);
echo $http_status;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0)");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false); // SSL verification disabled here
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // needed for CURLOPT_MAXREDIRS to take effect
curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 20);
$rt = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
echo $info["http_code"];
Try PHP's "get_headers" function.
Something along the lines of:
<?php
$url = 'http://www.example.com';
print_r(get_headers($url));
print_r(get_headers($url, 1));
?>
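Note that get_headers() returns false when the host cannot be reached at all, so it is worth guarding for that before reading the status line (a sketch; the regex is just one way to pull the code out):
// Extract the numeric status code from get_headers(), or false on failure.
$headers = @get_headers($url);
if ($headers === false) {
    $status = false;                                  // host unreachable
} else {
    // The status line looks like "HTTP/1.1 200 OK"
    preg_match('{HTTP/\S+\s(\d{3})}', $headers[0], $m);
    $status = isset($m[1]) ? (int) $m[1] : false;
}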
curl_getinfo — Get information regarding a specific transfer
Check curl_getinfo
<?php
// Create a curl handle
$ch = curl_init('http://www.yahoo.com/');
// Execute
curl_exec($ch);
// Check if any error occurred
if(!curl_errno($ch))
{
$info = curl_getinfo($ch);
echo 'Took ' . $info['total_time'] . ' seconds to send a request to ' . $info['url'];
}
// Close handle
curl_close($ch);
curl_exec is necessary. Try CURLOPT_NOBODY to not download the body. That might be faster.
Use this hitCurl method to fetch any type of API response, i.e. GET or POST:
function hitCurl($url,$param = [],$type = 'POST'){
$ch = curl_init();
if(strtoupper($type) == 'GET'){
$param = http_build_query((array)$param);
$url = "{$url}?{$param}";
}else{
curl_setopt_array($ch,[
CURLOPT_POST => (strtoupper($type) == 'POST'),
CURLOPT_POSTFIELDS => (array)$param,
]);
}
curl_setopt_array($ch,[
CURLOPT_URL => $url,
CURLOPT_RETURNTRANSFER => true,
]);
$resp = curl_exec($ch);
$statusCode = curl_getinfo($ch,CURLINFO_HTTP_CODE);
curl_close($ch);
return [
'statusCode' => $statusCode,
'resp' => $resp
];
}
Demo function to test the API:
function fetchApiData(){
$url = 'https://postman-echo.com/get';
$resp = hitCurl($url, [
'foo1'=>'bar1',
'foo2'=>'bar2'
],'get');
$apiData = "Getting header code {$resp['statusCode']}";
if($resp['statusCode'] == 200){
$apiData = json_decode($resp['resp']);
}
echo "<pre>";
print_r ($apiData);
echo "</pre>";
}
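A POST call would look much the same (a sketch against the same postman-echo test service; the /post endpoint is an assumption):
// POST variant of the demo above; postman-echo.com/post is assumed to
// echo the submitted fields back as JSON.
$resp = hitCurl('https://postman-echo.com/post', [
    'foo1' => 'bar1',
    'foo2' => 'bar2'
], 'POST');

if ($resp['statusCode'] == 200) {
    print_r(json_decode($resp['resp'], true));
} else {
    echo "Request failed with HTTP code {$resp['statusCode']}";
}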
Here is my solution for when you need to get the HTTP status to check a server's status regularly:
$url = 'http://www.example.com'; // Your server link
while(true) {
$strHeader = get_headers($url)[0];
$statusCode = substr($strHeader, 9, 3 );
if($statusCode != 200 ) {
echo 'Server down.';
// Send email
}
else {
echo 'oK';
}
sleep(30);
}
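One caveat with this approach: when the server is completely unreachable, get_headers() blocks for the default socket timeout, which can stall the loop. A minimal sketch of capping that wait (the 5-second value is arbitrary; passing a stream context to get_headers() requires PHP 7.1+):
// Cap how long get_headers() may wait for an unreachable server.
$context = stream_context_create([
    'http' => [
        'method'  => 'HEAD',   // headers only, no body
        'timeout' => 5,        // seconds; arbitrary choice
    ],
]);

$headers = @get_headers($url, 0, $context);
if ($headers === false) {
    echo 'Server down (no response at all).';
}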
I'm getting no response and no errors from this PHP code. Does anyone know what I'm doing wrong, please? It seems straightforward:
php:
$details_url = "https://maps.googleapis.com/maps/api/geocode/json?address=436+Grant+Street+Pittsburgh&sensor=false&key=mykey";
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);
$response = curl_exec($ch);
print_r($response);
You never bothered to tell curl about your URL. You should have
$ch = curl_init($details_url);
^^^^^^^^^^^^
or
curl_setopt($ch, CURLOPT_URL, $details_url);
And note that print_r is not a good debug tool. You probably got a boolean false from curl_exec, which print_r won't display at all:
php > $x = false;
php > print_r($x);
php > var_dump($x);
bool(false)
A better option would be
$response = curl_exec($ch);
if ($response === false) {
die(curl_error($ch));
}
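Putting both fixes together, the original snippet could look like this (a sketch; the API key placeholder stays as in the question):
$details_url = "https://maps.googleapis.com/maps/api/geocode/json?address=436+Grant+Street+Pittsburgh&sensor=false&key=mykey";

$ch = curl_init($details_url);               // pass the URL this time
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 0);

$response = curl_exec($ch);
if ($response === false) {
    die(curl_error($ch));                    // surfaces transport errors
}
curl_close($ch);

var_dump(json_decode($response, true));      // var_dump shows false/null clearly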
I am using PHP cURL; my target URL returns 200 or 500 depending on the request parameters.
However, whether it throws a 500 or a 200, I always get 200 from curl_getinfo($ch, CURLINFO_HTTP_CODE). Here is the code:
/**
 * Fetches a file from a remote URI.
 *
 * @param  string $url
 * @return string|false  the response body, or false if the HTTP code is not 200
 */
public function getFileUsingCurl($url)
{
//set all option
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$file = curl_exec($ch);
if (200 == curl_getinfo($ch, CURLINFO_HTTP_CODE)) {
curl_close($ch);
return $file;
} else {
curl_close($ch);
return false;
}
}
How can I get the right HTTP code from my target URL?
Try this:
public function getFileUsingCurl($url)
{
//set all option
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
$file = curl_exec($ch);
$curlinfo = curl_getinfo($ch);
curl_close($ch);
$httpcode = $curlinfo['http_code'];
if($httpcode == "200"){
return $file;
}else{
return false;
}
}
Note:
Make sure you're not being redirected (code 301 or 302)
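If a redirect turns out to be the culprit, one option (a sketch, not part of the original answer) is to let cURL follow it so CURLINFO_HTTP_CODE reflects the final response:
// Follow up to 5 redirects so the reported status is that of the final hop.
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 5);

$file = curl_exec($ch);
$httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);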
Could you try curl'ing the URL in your terminal to check its status code with:
curl -I www.site.com
(I know this is a question rather than an answer, but I don't have enough Stack Overflow rep to comment yet, so I'll edit this into an answer when I have more info.)
You should use curl_setopt($ch, CURLOPT_HEADER, true); to include the headers in the output.
http://www.php.net/manual/en/function.curl-setopt.php
Then use var_dump($file) to see whether it is really 200 or not...
Using the code below to check the status should work:
$infoArray = curl_getinfo($ch);
$httpStatus = $infoArray['http_code'];
if($httpStatus == "200"){
// do stuff here
}
curl_setopt($ch, CURLOPT_HEADER, true); // you need this to get the headers
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
Use Guzzle to interact with cURL in a sane manner. Then your script becomes something like:
<?php
use Guzzle\Http\Client;
// Create a client and provide a base URL
$client = new Client('http://www.example.com');
$response = $client->get('/')->send();
$code = $response->getStatusCode();
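Since the original problem is an API that sometimes goes down, note that Guzzle throws exceptions on transport failures and error responses, so you will want to wrap the call (a sketch; the catch is kept deliberately generic rather than naming Guzzle's exception classes):
try {
    $response = $client->get('/')->send();
    $code = $response->getStatusCode();
} catch (\Exception $e) {
    $code = 0;                       // treat any failure as "no usable response"
    error_log($e->getMessage());
}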
I use the piece of code below to send data to another server via a URL and it works successfully. I want to capture the response from the server and process it, but I can't seem to capture it.
CODE
$url="http://www.example.com/com_spc/api.php?username=".urlencode($uname)."&password=".urlencode($pwd);
$ch = curl_init(); // create cURL handle (ch)
if (!$ch) {
die("Couldn't initialize a cURL handle");
}
// set some cURL options
$ret = curl_setopt($ch, CURLOPT_URL, $url);
$ret = curl_setopt($ch, CURLOPT_HEADER, 0);
$ret = curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
$ret = curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
$ret = curl_setopt($ch, CURLOPT_TIMEOUT, 30);
// execute
$ret = curl_exec($ch);
if (empty($ret)) {
// some kind of an error happened
die(curl_error($ch));
curl_close($ch); // close cURL handler
} else {
$info = curl_getinfo($ch);
curl_close($ch); // close cURL handler
if (empty($info['http_code'])) {
die("No HTTP code was returned");
} else {
}
}
Make sure to set CURLOPT_RETURNTRANSFER to 1. Otherwise curl_exec() will print the response directly instead of returning it, so there is nothing to capture in $ret.
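Concretely, the only line in the snippet above that needs to change is the CURLOPT_RETURNTRANSFER one (a sketch of just the affected lines):
// Return the body from curl_exec() instead of printing it straight to output.
$ret = curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// ...
$ret = curl_exec($ch);   // $ret now holds the response body as a string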