HTTP error on file_get_contents for FB Graph API - php

I am trying to retrieve details of a page/profile from Facebook into my PHP application. I retrieve the id first and then request the following URL with file_get_contents(), but I get this error:
Warning: file_get_contents(https://graph.facebook.com/v2.8/40444963499?fields=id,name,picture.width(700).height(700),albums.limit(5){name,photos.limit(2){name, picture}},posts.limit(5)&access_token="my access token here"): failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request
However, if I run the same URL on the browser, I get back the JSON correctly.

Windows almost always has problems with SSL certificates. I wouldn't recommend doing this on your production site, but during development it's fine. By disabling the SSL check you're effectively saying you don't care whether the site has a valid SSL certificate, which means that if someone were impersonating graph.facebook.com you would keep communicating with them and likely hand over your access token.
$context = stream_context_create(array(
    "ssl" => array(
        "verify_peer" => false,
        "verify_peer_name" => false,
    )
));
file_get_contents('https://graph.facebook.com/v2.8/40444963499?fields=id,name,picture.width(700).height(700),albums.limit(5){name,photos.limit(2){name, picture}},posts.limit(5)&access_token=FB_ACCESS_TOKEN', false, $context);
Now if you're interested in actually fixing the problem on your machine, then review this answer: PHP - SSL certificate error: unable to get local issuer certificate
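As a rough sketch of that proper fix, point PHP at a valid CA bundle instead of disabling verification (the cacert.pem path below is an assumption; use wherever your bundle lives, e.g. the one from https://curl.se/docs/caextract.html, or set openssl.cafile in php.ini):
$context = stream_context_create(array(
    "ssl" => array(
        "cafile" => "C:/php/extras/ssl/cacert.pem", // assumed path to your CA bundle
        "verify_peer" => true,
        "verify_peer_name" => true,
    )
));
$response = file_get_contents($graph_url, false, $context); // $graph_url = the Graph API URL above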

So the error was that there was a space between two of the parameters passed in the URL. It didn't show up when testing in the browser, presumably because the browser encodes the space automatically, but file_get_contents() sends the URL as-is, so the request failed.
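For illustration, here is the fields string from the question with the stray space removed ("{name, picture}" becomes "{name,picture}"); FB_ACCESS_TOKEN is still a placeholder for the real token:
$fields = 'id,name,picture.width(700).height(700),'
        . 'albums.limit(5){name,photos.limit(2){name,picture}},posts.limit(5)';
$response = file_get_contents(
    'https://graph.facebook.com/v2.8/40444963499?fields=' . $fields
    . '&access_token=FB_ACCESS_TOKEN'
);
// Alternatively, keep the space but encode it as %20 (e.g. via rawurlencode());
// file_get_contents() will not encode it for you.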

Related

No SSL certificate needed for HTTPS in cURL PHP?

I'm writing a simple cURL script to request data from a website (https://openweathermap.org) which is HTTPS.
Basically, we need to configure the cURL instance to deal with SSL-enabled websites, through something like this:
curl_setopt($ch, CURLOPT_CAINFO, "path/to/cert");
However, I first tried making a request using the testing URL provided by openweathermap:
<?php
$ressource = curl_init("https://samples.openweathermap.org/data/2.5/weather?q=London,uk&appid=439d4b804bc8187953eb36d2a8c26a02");
$data = curl_exec($ressource);
curl_close($ressource);
Normally, I would have an error like: Error: SSL certificate problem: unable to get local issuer certificate
Instead, the request works perfectly.
Is it supposed to work and not return an error by not providing an SSL certificate when requesting an HTTPS URL?
Did I miss something about requesting HTTPS URLs via cURL?
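(For reference, explicitly pointing cURL at a CA bundle would look roughly like the sketch below; the bundle path is an assumption and varies by system.)
$ch = curl_init("https://samples.openweathermap.org/data/2.5/weather?q=London,uk&appid=439d4b804bc8187953eb36d2a8c26a02");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_CAINFO, "/etc/ssl/certs/ca-certificates.crt"); // assumed CA bundle path
$data = curl_exec($ch);
if ($data === false) {
    echo 'Error: ' . curl_error($ch);
}
curl_close($ch);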

Pinterest API - 403 error when requesting access token on Dreamhost server

I'm using this php wrapper, successfully getting a code from pinterest using the link generated according to the docs:
$loginurl = $pinterest->auth->getLoginUrl($callback_url, array('read_public'));
Then when I run this:
$token = $pinterest->auth->getOAuthToken($_GET['code']);
It works fine on my local server, but when I try to run it on our Dreamhost server, I get:
Pinterest error (code: 403) with message: Forbidden
I looked through the error documentation Pinterest supplies, but I can't find anything relating to 403 errors when retrieving oauth tokens.
The only two places I've seen mention of 403 errors when requesting OAuth tokens from Pinterest's API have concluded that Pinterest is blocking the IPs or the user-agent string.
I've tried manually overriding the user agent string to no avail.
I've tried contacting Pinterest to find out if there is anything I'm missing and they directed me here.
Make sure the Dreamhost server is using an HTTPS URL, i.e. that it is serving the site over TLS. That could be the reason why you get a 403.
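A minimal sketch of what that means for the wrapper calls above (the callback URL below is a placeholder; the point is that it must be an https:// address served by the Dreamhost site):
// Hypothetical HTTPS callback URL; Pinterest may reject a plain-http callback with a 403.
$callback_url = 'https://www.example.com/pinterest/callback';
$loginurl = $pinterest->auth->getLoginUrl($callback_url, array('read_public'));
// ...after the redirect back:
$token = $pinterest->auth->getOAuthToken($_GET['code']);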

PHP error doing file_get_contents on a Hostinger host: failed to open stream

I'm trying to use the http://ip-api.com/json/ API to get and show information about a visitor that connects to my website hosted on Hostinger.
When I do the following, it should respond with all the info in JSON format:
echo file_get_contents("http://ip-api.com/json/{$user_ip}");
But I get the following error message:
Warning: file_get_contents(http://ip-api.com/json/[MY PUBLIC IP]): failed to open stream: Connection refused in /home/[HOSTINGUER USER]/public_html/ip2.php on line 31
Another strange thing: if I use a different API it works correctly and returns the correct JSON. The other API call:
echo file_get_contents("http://ipinfo.io/{$user_ip}/json");
So I want to use the ip-api.com API because its results are more accurate, but it only works if I use the ipinfo.io API... Why can I make the request to one website and not to the other?
On the other hand, I tried both locally using curl and by typing the URLs into my web browser, and they work correctly. I also tried locally on a LAMP stack and both work perfectly. Finally I tried something like this post: PHP file_get_contents() returns "failed to open stream: HTTP request failed!" in hostinguer, and it doesn't work either.
I thought maybe it is something in the Hostinger configuration but...
Thanks in advance!
Your IP address is banned. If you're using ordinary shared web hosting, your outgoing IP address is shared with other users, who have probably made more requests than allowed.
Go to http://ip-api.com/docs/unban and enter your server's outgoing IP address (check it via http://ipinfo.io/json).
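A quick way to check that outgoing address from PHP itself (a sketch; the "ip" field is what ipinfo.io returns in its JSON response):
$info = json_decode(file_get_contents('http://ipinfo.io/json'), true);
echo $info['ip']; // the server's outgoing IP address to enter on the unban page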

YouTube API with https: "Google_IOException: problem with the SSL CA"

Short form: I've got some PHP code that is uploading videos from my site to YouTube. I'm using the usual Google-provided PHP library, google-api-php-client. I have this code running on two servers; it works on one (https://www.example.com) but has suddenly stopped working on the other (https://dev.example.com), after a period of working nicely.
Details: The code doing the transfer is relatively standard, as far as I can tell: Once the libraries are loaded and some variables get some values, I'm doing:
$client = new Google_Client();
$client->setClientId($youtube_client_id);
$client->setClientSecret($youtube_client_secret);
$redirect = filter_var('https://example.com/upload-to-youtube', FILTER_SANITIZE_URL);
$client->setRedirectUri($redirect);
$youtube = new Google_YoutubeService($client);
$client->authenticate();
header('Location: ' . $redirect);
For the server that's not working, the $client->authenticate line throws the error:
Google_IOException: HTTP Error: (0) Problem with the SSL CA cert (path? access rights?) in Google_CurlIO->makeRequest() (line 128 of /var/www/html/example/includes/google-api-php-client/src/io/Google_CurlIO.php).
Other possibly-relevant details:
Back in the Developer's Console for my YT application, I have the following Redirect URIs set up:
https://dev.example.com/delete-from-youtube
https://dev.example.com/upload-to-youtube
https://www.example.com/delete-from-youtube
https://www.example.com/upload-to-youtube
According to the SSL certificate tester I found at https://www.sslshopper.com/ssl-checker.html, the certificates on both sites are visible and valid.
The certificate for dev.example.com is from Comodo; the certificate for www.example.com is from GeoTrust. I suppose I could try getting a new cert from GeoTrust, but I'd rather not spend the money unless I know it will fix the problem.
Both servers are running the same version of curl, if that's relevant.
It seems (to me) that the certificate and access to it should be OK (unless the certificate tester is wrong), so I don't understand where the complaint is coming from. The code and the server configuration have been unchanged for quite a while, hence my search for an external explanation. (I understand that these are Famous Last Words, but whatever.) Any thoughts out there? Thanks!
Potential duplicate question:
Amazon MarketplaceWebServiceOrders requests suddenly failing, PHP curl giving SSL CA cert error?
I had to restart the server, not just Apache, in order to solve the issue.

file_get_contents - failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found

I'm having some weird problems with file_get_contents after moving my site to a new domain. I had to set up a new domain and IP address (using Plesk) to get a new SSL certificate working. Now a file_get_contents call to a script on the same domain gives me this:
failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found
If I call the same URL using file_get_contents from another server it works fine, and if I call www.google.com from the server that's failing, that works too, so it only seems to happen when I call a URL on the same server!
I have a feeling it might have something to do with having two IPs with two different SSL certificates on the one server. When I file_get_contents the / (index page) of the server from the server itself, I get the Plesk 'this is a new domain' page, so it's like Apache isn't looking up the right virtual host when it's called from its own server.
To clarify (hopefully!):
On the server hosting the domain:
file_get_contents('https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12');
gives "failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found"
file_get_contents('http://www.google.com');
works correctly
On another server:
file_get_contents('https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12');
works fine.
I have tried turning ssl off and I still get the same problem.
I've had this problem too, when I was working on a little test server at home. The domain name is resolved to your external IP address and the request is sent there, but because the request is coming from inside your own network, the router doesn't recognise it as a normal request. It probably tries to serve a page from its own web-based management interface, which is then not found at the path you specified.
In that case I was working on a Windows PC, and I could solve it by adding the domain I was testing to my hosts file, specifying 127.0.0.1 as the IP address (or the IP address of the server, if it is another machine within the same network). In Linux the same trick works with /etc/hosts.
The problem isn't PHP or your server, but your router.
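For example, an entry like this in the hosts file (C:\Windows\System32\drivers\etc\hosts on Windows, /etc/hosts on Linux); the domain and address are placeholders for your own:
127.0.0.1    mydomain.com    # or the server's LAN IP if it is another machine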
If you just need to handle the warning when the URL is not found (as I did), you may just do this to turn Warnings into Exceptions:
set_error_handler(
    function ($err_severity, $err_msg, $err_file, $err_line) {
        // do not throw an exception if the @-operator was used to suppress the error
        if (error_reporting() === 0) return false;
        throw new ErrorException($err_msg, 0, $err_severity, $err_file, $err_line);
    },
    E_WARNING
);
try {
    $contents = file_get_contents($your_url);
} catch (Exception $e) {
    echo $e->getMessage();
}
restore_error_handler();
Solution based on this thread/question.
Many hosting providers now disable the allow_url_fopen directive, which is what lets file_get_contents() load data from an external URL.
You can use cURL or a PHP HTTP client library like Guzzle instead.
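A minimal cURL replacement for the file_get_contents() call would look roughly like this ($url stands for the address being fetched):
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$contents = curl_exec($ch);
if ($contents === false) {
    echo curl_error($ch);
}
curl_close($ch);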
Try building the query string properly, for example with http_build_query():
file_get_contents('https://mydomain.com?' . http_build_query(array('limit' => 4, 'offset' => 0, 's_date' => '2012-02-05', 'e_date' => '2012-03-13', 'order' => 'release_date', 'dir' => 'desc', 'cid' => 12)));
I got the same error in CodeIgniter 3. I was doing it like this
file_get_contents(base_url('database.json'));
and then this
file_get_contents(site_url('database.json'));
My problem got resolved after I changed it to this
file_get_contents(__DIR__.'/database.json');
The reason is that I was trying to fetch an internal resource through the external URL that base_url and site_url return, whereas __DIR__ gives the local filesystem path. For instance (values are illustrative), base_url('database.json') returns something like http://example.com/database.json, while __DIR__ . '/database.json' is a path like /var/www/application/controllers/database.json that file_get_contents() can read directly.
