I have a script where I read a file using:
file_get_contents(urlencode($url));
I get this error:
failed to open stream: HTTP request failed! HTTP/1.0 400 Bad request
I've tried several workarounds, but I still get the error.
This:
ini_set('default_socket_timeout', 120);
This:
$opts = array('http'=>array('timeout' => 120));
$context = stream_context_create($opts);
$resul = file_get_contents($url,0,$context);
And this:
$opts = array('http'=>array('timeout' => 120,'header'=>'Connection : close'));
$context = stream_context_create($opts);
$resul = file_get_contents($url,false,$context);
Can you help me figure out why I get the error?
You need to encode only the query string: extract the query from the URL, encode its values, and append the encoded query back onto the URL.
Note: file_get_contents requires allow_url_fopen=On in php.ini; if it is disabled, use cURL instead.
Example (read the comments in the code).
Note: this example reports both connection errors and HTTP errors.
<?php
//Set your page example
$uri = 'http://localhost/path/webservice.php?callback=&id=153&provenance=153&ret=a:1:{s:5:"infos";a:8:{s:8:"civilite";s:3:"Mme";s:5:"lname";s:0:"";s:5:"fname";s:8:"Nathalie";s:5:"email";s:17:"tometnata#free.fr";s:3:"tel";s:0:"";s:7:"adresse";s:0:"";s:6:"date_n";s:14:"10:"01/06/1969";s:2:"cp";s:0:"";}}';
//extract url
$parsed_url = parse_url($uri);
//Create fixed url
$fixed_url = $parsed_url['scheme'] . '://' . $parsed_url['host'] . $parsed_url['path'];
//If exists query
if (isset($parsed_url['query'])) {
    $output = array();
    $result = array();
    //Extract querystring
    parse_str($parsed_url['query'], $output);
    //Encode values in querystring
    foreach ($output as $k => $v) {
        $result[] = $k . '=' . rawurlencode($v);
    }
    //Append encoded querystring
    $fixed_url .= '?' . implode('&', $result);
}
echo 'GET url: ', $fixed_url, '<br>';
//Get result in page
$ch = curl_init();
$timeout = 30; //set to zero for no timeout
curl_setopt ($ch, CURLOPT_URL, $fixed_url);
curl_setopt ($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt ($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
$file_contents = curl_exec($ch);
$errornum = curl_errno($ch);
$info = curl_getinfo($ch);
$status = (int) $info['http_code'];
if ($errornum !== 0) {
    echo 'Error: ', curl_error($ch);
    $file_contents = NULL;
} else if ($status !== 200) {
    echo 'http_error: ', $status;
    $file_contents = NULL;
} else {
    echo 'Result:<hr>';
    echo $file_contents;
}
curl_close($ch);
?>
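If you would rather stay with file_get_contents, here is a small sketch of checking allow_url_fopen up front (this setting cannot be changed with ini_set at runtime):
// Sketch: verify allow_url_fopen before using file_get_contents on a URL
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is disabled in php.ini; use the cURL version above instead.');
}
$file_contents = file_get_contents($fixed_url);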
Enable curl
Windows (Xampp): https://stackoverflow.com/a/1347340/1518921
Linux (like debian): https://stackoverflow.com/a/11724633/1518921
Mac OSX (probably outdated): https://stackoverflow.com/a/11354731/1518921
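A quick sketch for verifying from PHP that the cURL extension is actually loaded before calling curl_init():
// Sketch: confirm the cURL extension is available
if (!extension_loaded('curl') || !function_exists('curl_init')) {
    die('The cURL extension is not enabled; see the links above for your platform.');
}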
I'm trying to implement reCAPTCHA on my website. Everything seems to be working fine except the return value from file_get_contents().
Here is my code:
if ($_REQUEST["send"] == 1){
    // access
    $secretKey = 'my_key';
    $captcha = $_POST['g-recaptcha-response'];
    $ip = $_SERVER['REMOTE_ADDR'];
    $response = file_get_contents("https://www.google.com/recaptcha/api/siteverify?secret=".$secretKey."&response=".$captcha."&remoteip=".$ip);
    $responseKeys = json_decode($response,true);
    echo ($responseKeys);exit;
    if(intval($responseKeys["success"]) !== 1) {
        $message = 'Invalid reCAPTCHA';
    } else {
        $msg = 'content';
        send_mail('send_to',"Subject",$msg);
        header("location:index.php?send=1");exit;
    }
}
My $response variable is coming back empty.
I tried opening https://www.google.com/recaptcha/api/siteverify? manually, inserting the variables, and it seems to work fine.
Am I forgetting something?
Thanks
Their API expects a POST request, but your code sends a GET request.
See the answer here: How to post data in PHP using file_get_contents?
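For reference, a minimal sketch of that POST using file_get_contents with a stream context (assuming allow_url_fopen is enabled and reusing the variables from the question):
$postdata = http_build_query(array(
    'secret'   => $secretKey,
    'response' => $captcha,
    'remoteip' => $ip,
));
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => 'Content-Type: application/x-www-form-urlencoded',
        'content' => $postdata,
    ),
));
$response = file_get_contents('https://www.google.com/recaptcha/api/siteverify', false, $context);
$responseKeys = json_decode($response, true);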
My URL wrappers were disabled, which is why I couldn't reach the URL and read the response.
As I don't have access to php.ini, the workaround was to send the request with cURL. Here is the code:
$url = "https://www.google.com/recaptcha/api/siteverify?secret=".$secretKey."&response=".$captcha."&remoteip=".$ip;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
if (curl_errno($ch)) {
    echo curl_error($ch);
    echo "\n<br />";
    $response = '';
}
curl_close($ch); // close the handle whether or not the request failed
if (!is_string($response) || !strlen($response)) {
    echo "Failed to get contents.";
    $response = '';
}
$responseKeys = json_decode($response, true);
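Since the siteverify endpoint expects a POST (as the other answer points out), here is a sketch of the same cURL call sending the parameters as POST fields instead of in the query string, reusing the variables above:
$ch = curl_init('https://www.google.com/recaptcha/api/siteverify');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'secret'   => $secretKey,
    'response' => $captcha,
    'remoteip' => $ip,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
$response = curl_exec($ch);
curl_close($ch);
$responseKeys = json_decode($response, true);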
I need to parse JSON from multiple URLs. Here is the way that I'm following:
<?php
//call
$url1 = file_get_contents("https://www.url1.com");
$url2 = file_get_contents("https://www.url2.com");
$url3 = file_get_contents("https://www.url3.com");
$url4 = file_get_contents("https://www.url4.com");
$url5 = file_get_contents("https://www.url5.com");
//parse
$decode1 = json_decode($url1, true);
$decode2 = json_decode($url2, true);
$decode3 = json_decode($url3, true);
$decode4 = json_decode($url4, true);
$decode5 = json_decode($url5, true);
//echo
if (is_array($decode1)) {
    foreach ($decode1 as $key => $value) {
        if (is_array($value) && isset($value['price'])) {
            $price = $value['price'];
            echo '<span><b>' . $price . '</b></span>';
        }
    }
}
?>
This approach slows down page loads. On top of that, I get these errors:
Warning: file_get_contents(https://www.url1.com): failed to open
stream: Redirection limit reached, aborting in
/home/directory/public_html/file.php on line 12
Warning: file_get_contents(https://www.url2.com): failed to open
stream: Redirection limit reached, aborting in
/home/directory/public_html/file.php on line 13
etc.
How can I fix the redirection limit reached warning?
I would suggest using cURL for fetching remote data. You could do this:
$urls = [
    "https://www.url1.com",
    "https://www.url2.com",
    "https://www.url3.com",
    "https://www.url4.com",
    "https://www.url5.com"
];

$decoded = array_map("loadJSON", $urls);

if (is_array($decoded[0])) {
    foreach ($decoded[0] as $key => $value) {
        if (is_array($value) && isset($value['price'])) {
            $price = $value['price'];
            echo '<span><b>' . $price . '</b></span>';
        }
    }
}
/**
 * Downloads a JSON file from a URL and returns its decoded content
 */
function loadJSON($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0); // Only if your server cannot verify SSL certificates
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // Follow redirections
    curl_setopt($ch, CURLOPT_MAXREDIRS, 10);     // 10 max redirections
    $content = curl_exec($ch);
    curl_close($ch);
    $res = json_decode($content, true);
    return $res;
}
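The question also mentions slowness: the five requests above run one after another. As a sketch (an addition, not part of the original answer), they could be fetched in parallel with curl_multi; the helper name loadJSONParallel is made up for illustration:
function loadJSONParallel(array $urls) {
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }
    // Drive all transfers until every request has finished
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $status == CURLM_OK);
    $results = array();
    foreach ($handles as $i => $ch) {
        $results[$i] = json_decode(curl_multi_getcontent($ch), true);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
// Usage: $decoded = loadJSONParallel($urls);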
I am trying to make a redirect PHP script. I want the script to check whether a link exists and, if so, redirect the user to it; if it doesn't exist, it should try the next link, and so on. For some reason it's not working. Maybe you could give me some help with this.
<?php
$URL = 'http://www.site1.com';
$URL = 'http://www.site2.com';
$URL = 'http://www.site3.com';
$handlerr = curl_init($URL);
curl_setopt($handlerr, CURLOPT_RETURNTRANSFER, TRUE);
$resp = curl_exec($handlerr);
$ht = curl_getinfo($handlerr, CURLINFO_HTTP_CODE);
if ($ht == '404')
{ echo "Sorry the website is down atm, please come back later!";}
else { header('Location: '. $URL);}
?>
You are overwriting your $URL variable:
$URL = 'http://www.site1.com';
$URL = 'http://www.site2.com';
$URL = 'http://www.site3.com';
Put these URLs in an array and go through it with a foreach loop.
You have a few issues in your code. For one, your $URL assignments overwrite each other, leaving only one URL. They need to go in an array:
array( 'http://www.site1.com', 'http://www.site2.com', 'http://www.site3.com' );
You can get many responses, not just a 404, so you should tell cURL to follow redirects. If the URL is itself a redirect, you could get a 301 that leads to a 200, so we want to follow it.
Try This:
<?php
function curlGet($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $output = curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch); // free the handle before returning
    if ( $httpcode == 200 ) {
        return true;
    }
    return false;
}

$urlArray = array( 'http://www.site1.com', 'http://www.site2.com', 'http://www.site3.com' );

foreach ( $urlArray as $url ) {
    if ( $result = curlGet($url) ) {
        header('Location: ' . $url);
        exit;
    }
}
// if we made it here, we looped through every url
// and none of them worked
echo "No valid URLs found...";
http://php.net/manual/en/function.file-exists.php#74469
<?php
function url_exists($url) {
    if (!$ch = curl_init($url)) return false;
    // curl_init() alone never contacts the server, so actually send a HEAD request
    curl_setopt_array($ch, array(CURLOPT_NOBODY => true, CURLOPT_RETURNTRANSFER => true, CURLOPT_FOLLOWLOCATION => true));
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code >= 200 && $code < 400;
}
?>
This gives you the URL-existence check.
to check multiple urls though, you need an array:
<?php
$url_array = [];
$url_array[] = 'http://www.site1.com';
$url_array[] = 'http://www.site2.com';
$url_array[] = 'http://www.site3.com';

foreach ($url_array as $url) {
    if (url_exists($url)) {
        // do what you need;
        break;
    }
}
?>
PS - this is completely untested, but should theoretically do what you need.
Here's my current script that does the API calling:
$client = "55447265ed444bb5b768ecb0765ba9cb";
$query = $_POST['q'];
$clnum = mt_rand(1,3);
$api = "https://api.instagram.com/v1/tags/".$query."/media/recent?client_id=".$client;
function get_curl($url) {
    if (function_exists('curl_init')) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
        $output = curl_exec($ch);
        echo curl_error($ch);
        curl_close($ch);
        return $output;
    } else {
        return file_get_contents($url);
    }
}
$response = get_curl($api);
$images = array();
if ($response) {
    foreach (json_decode($response)->data as $item) {
        $src = $item->images->standard_resolution->url;
        $thumb = $item->images->thumbnail->url;
        $url = $item->link;
        $images[] = array(
            "src" => htmlspecialchars($src),
            "thumb" => htmlspecialchars($thumb),
            "url" => htmlspecialchars($url)
        );
    }
}

print_r(str_replace('\\/', '/', json_encode($images)));
die();
I found two code samples that do caching, but I need help integrating them into my current script. One is longer than the other. Both set up a $cache variable followed by an if/else block and then branch into different code. The second one is really similar to my current script, but I can't figure out how to merge them.
1st code:
// Also, perhaps you should cache the results, as the Instagram API is slow
$cache = './'.sha1($url).'.json';
if (file_exists($cache) && filemtime($cache) > time() - 60*60) {
    // If a cache file exists, and it is newer than 1 hour, use it
    $jsonData = json_decode(file_get_contents($cache));
} else {
    $jsonData = json_decode(file_get_contents($url));
    file_put_contents($cache, json_encode($jsonData));
}

$result = '<div id="instagram">'.PHP_EOL;
foreach ($jsonData->data as $key => $value) {
    $result .= "\t".'<a class="fancybox" data-fancybox-group="gallery"
title="'.htmlentities($value->caption->text).' '.htmlentities(date("F j, Y, g:i a", $value->caption->created_time)).'"
style="padding:3px" href="'.$value->images->standard_resolution->url.'">
<img src="'.$value->images->low_resolution->url.'" alt="'.$value->caption->text.'" width="'.$width.'" height="'.$height.'" />
</a>'.PHP_EOL;
}
$result .= '</div>'.PHP_EOL;
return $result;
} // closing brace of the enclosing function (not shown in this snippet)
2nd code:
$cache = './cache.json';
if (file_exists($cache) && filemtime($cache) > time() - 60*60) {
    // If a cache file exists, and it is newer than 1 hour, use it
    $images = json_decode(file_get_contents($cache), true); // Decode as a JSON array
} else {
    // Make an API request and create the cache file
    // For example, gets the 32 most popular images on Instagram
    $response = get_curl($api); // change request path to pull different photos
    $images = array();
    if ($response) {
        // Decode the response and build an array
        foreach (json_decode($response)->data as $item) {
            $title = (isset($item->caption)) ? mb_substr($item->caption->text, 0, 70, "utf8") : null;
            $src = $item->images->standard_resolution->url; // Caches standard res img path to variable $src
            // Location coords seemed empty in the results, but you would need to check them as they are mostly undefined
            $lat = (isset($item->data->location->latitude)) ? $item->data->location->latitude : null; // Caches latitude as $lat
            $lon = (isset($item->data->location->longtitude)) ? $item->data->location->longtitude : null; // Caches longitude as $lon
            $images[] = array(
                "title" => htmlspecialchars($title),
                "src" => htmlspecialchars($src),
                "lat" => htmlspecialchars($lat),
                "lon" => htmlspecialchars($lon) // Consolidates variables to an array
            );
        }
        file_put_contents($cache, json_encode($images)); // Save as JSON
    }
}
//Debug out
print_r($images);
//Added curl for faster response
function get_curl($url) {
    if (function_exists('curl_init')) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
        $output = curl_exec($ch);
        echo curl_error($ch);
        curl_close($ch);
        return $output;
    } else {
        return file_get_contents($url);
    }
}
I used the code below, which is taken from the code you provided, and it seems to be working fine:
<?php
$client = "55447265ed444bb5b768ecb0765ba9cb";
$query = $_POST['q'];
$clnum = mt_rand(1,3);
$api = "https://api.instagram.com/v1/tags/".$query."/media/recent?client_id=".$client;
function get_curl($url) {
    if (function_exists('curl_init')) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
        curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
        $output = curl_exec($ch);
        echo curl_error($ch);
        curl_close($ch);
        return $output;
    } else {
        return file_get_contents($url);
    }
}
$images = array();

$cache = './cache.json';
if (file_exists($cache) && filemtime($cache) > time() - 60*60) {
    // If a cache file exists, and it is newer than 1 hour, use it
    $images = json_decode(file_get_contents($cache), true); // Decode as a JSON array
} else {
    // Make an API request and create the cache file
    // For example, gets the 32 most popular images on Instagram
    $response = get_curl($api); // change request path to pull different photos
    $images = array();
    if ($response) {
        // Decode the response and build an array
        foreach (json_decode($response)->data as $item) {
            $title = (isset($item->caption)) ? mb_substr($item->caption->text, 0, 70, "utf8") : null;
            $src = $item->images->standard_resolution->url; // Caches standard res img path to variable $src
            // Location coords seemed empty in the results, but you would need to check them as they are mostly undefined
            $lat = (isset($item->data->location->latitude)) ? $item->data->location->latitude : null; // Caches latitude as $lat
            $lon = (isset($item->data->location->longtitude)) ? $item->data->location->longtitude : null; // Caches longitude as $lon
            $images[] = array(
                "title" => htmlspecialchars($title),
                "src" => htmlspecialchars($src),
                "lat" => htmlspecialchars($lat),
                "lon" => htmlspecialchars($lon) // Consolidates variables to an array
            );
        }
        // Write the cache once, after the loop, rather than on every iteration
        file_put_contents($cache, json_encode($images)); // Save as JSON
    }
}

//Debug out
echo "<pre>";
print_r($images);
Yes, this works, but you do need the caching, because the Instagram API is really slow.
It doesn't matter whether you use cURL or file_get_contents; the fastest option is to fetch the data with jQuery on the client side. jQuery runs on the visitor's machine while PHP runs on your server, so if your server is far away from the API servers, the request takes longer.
I'm trying to save a user's Facebook profile image using cURL. When I use the code below, I get a JPEG file, but it has zero bytes. However, if I swap in the URL https://fbcdn-profile-a.akamaihd.net/hprofile-ak-snc4/211398_812269356_2295463_n.jpg, which is where http://graph.facebook.com/' . $user_id . '/picture?type=large redirects the browser, the image is saved without a problem. What am I doing wrong here?
<?php
$url = 'http://graph.facebook.com/' . $user_id . '/picture?type=large';
$file_handler = fopen('pic_facebook.jpg', 'w');
$curl = curl_init($url);
curl_setopt($curl, CURLOPT_FILE, $file_handler);
curl_setopt($curl, CURLOPT_HEADER, false);
curl_exec($curl);
curl_close($curl);
fclose($file_handler);
?>
There is a redirect, so you have to add this option for cURL:
// if safe_mode is off:
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
But if safe_mode is on (CURLOPT_FOLLOWLOCATION is not allowed then), follow the redirect manually:
// if safe_mode is on:
<?php
function curl_redir_exec($ch)
{
    static $curl_loops = 0;
    static $curl_max_loops = 20;
    if ($curl_loops++ >= $curl_max_loops)
    {
        $curl_loops = 0;
        return FALSE;
    }
    curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    // Split the headers from the body (this line was commented out in the original,
    // which left $header undefined below)
    list($header, $data) = explode("\r\n\r\n", $data, 2);
    $http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($http_code == 301 || $http_code == 302)
    {
        $matches = array();
        preg_match('/Location:(.*?)\n/i', $header, $matches);
        $url = @parse_url(trim(array_pop($matches)));
        if (!$url)
        {
            // couldn't process the url to redirect to
            $curl_loops = 0;
            return $data;
        }
        $last_url = parse_url(curl_getinfo($ch, CURLINFO_EFFECTIVE_URL));
        if (empty($url['scheme'])) $url['scheme'] = $last_url['scheme'];
        if (empty($url['host']))   $url['host']   = $last_url['host'];
        if (empty($url['path']))   $url['path']   = $last_url['path'];
        $new_url = $url['scheme'] . '://' . $url['host'] . $url['path'] . (isset($url['query']) ? '?'.$url['query'] : '');
        return $new_url;
    } else {
        $curl_loops = 0;
        return $data;
    }
}
function get_right_url($url) {
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_HEADER, false);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    return curl_redir_exec($curl);
}

$url = 'http://graph.facebook.com/' . $user_id . '/picture?type=large';
$file_handler = fopen('pic_facebook.jpg', 'w');
$curl = curl_init(get_right_url($url));
curl_setopt($curl, CURLOPT_FILE, $file_handler);
curl_setopt($curl, CURLOPT_HEADER, false);
curl_exec($curl);
curl_close($curl);
fclose($file_handler);
If you can't process the redirect, try this instead:
Make the request to https://graph.facebook.com/<USER ID>?fields=picture and parse the response, which will be in JSON format and look like this - e.g. for Zuck you get this response:
{
"picture": "http://profile.ak.fbcdn.net/hprofile-ak-snc4/157340_4_3955636_q.jpg"
}
Then make your curl request directly to retrieve the image from that cloud storage URL
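As a rough sketch of that two-step approach (assuming $user_id is set, allow_url_fopen is enabled, and the JSON has the shape shown above, which can differ between Graph API versions):
// 1. Ask the Graph API for the picture URL instead of the picture itself
$meta = json_decode(file_get_contents('https://graph.facebook.com/' . $user_id . '?fields=picture'), true);
$picture_url = $meta['picture']; // newer API versions nest this under ['picture']['data']['url']
// 2. Fetch the image directly from the CDN URL, so no redirect is involved
$fp = fopen('pic_facebook.jpg', 'w');
$curl = curl_init($picture_url);
curl_setopt($curl, CURLOPT_FILE, $fp);
curl_setopt($curl, CURLOPT_HEADER, false);
curl_exec($curl);
curl_close($curl);
fclose($fp);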
Set CURLOPT_FOLLOWLOCATION to true so that cURL follows the 301/302 redirect and reads the image file from its final location:
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);
I managed to do it this way, works perfectly fine:
$data = file_get_contents('https://graph.facebook.com/[App-Scoped-ID]/picture?width=378&height=378&access_token=[Access-Token]');
$file = fopen('fbphoto.jpg', 'w+');
fputs($file, $data);
fclose($file);
You just need an App Access Token (APPID . '|' . APPSECRET), and you can specify width and height.
You can also add "redirect=false" to the URL, to get a JSON object with the URL (For example: https://fbcdn-profile-a.akamaihd.net/hprofile-ak-xpa1...)
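For example, a sketch of the redirect=false variant; the exact data/url field layout is an assumption and may differ between API versions:
$json = file_get_contents('https://graph.facebook.com/[App-Scoped-ID]/picture?width=378&height=378&redirect=false&access_token=[Access-Token]');
$data = json_decode($json, true);
// expected shape: {"data":{"url":"https://fbcdn-profile-a.akamaihd.net/...","width":378,...}}
$picture_url = isset($data['data']['url']) ? $data['data']['url'] : null;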
Note: CURLOPT_FOLLOWLOCATION cannot be set when open_basedir is in effect (or, before PHP 5.4, when safe_mode is on), so it is not always an option.