file_get_contents - failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found - php

I'm having some weird problems with file_get_contents after moving my site to a new domain. I had to set up a new domain and IP address (using Plesk) to get a new ssl certificate working. Now my file_get_contents calling a script on the same domain is giving me this:
failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found
If I call the same URL using file_get_contents on another server it works fine, and if I call www.google.com from the server that's failing, that works too, so it only seems to fail when I call a URL on the same server!
I have a feeling it might have something to do with having two IPs with two different SSL certificates on the one server. When I file_get_contents / (the index page) of the server from the server itself, I get the Plesk 'this is a new domain' page, so it's as if Apache isn't looking up the right virtual host when it's called from its own server.
To clarify (hopefully!):
On the server hosting the domain:
file_get_contents('https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12');
gives "failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found"
file_get_contents('http://www.google.com');
works correctly
On another server:
file_get_contents('https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12');
works fine.
I have tried turning ssl off and I still get the same problem.
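(A quick diagnostic sketch, not part of the original question: from the affected server, check what the domain resolves to and what status line actually comes back; mydomain.com is the placeholder used above.)
$resolved = gethostbyname('mydomain.com');
echo "mydomain.com resolves to: $resolved\n";

$body = @file_get_contents('https://mydomain.com/');
// the HTTP wrapper fills $http_response_header after the call, even when the response is a 404
var_dump($http_response_header ?? null);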

I've had this problem too, when I was working on a little test server at home. The domain name is resolved to your external IP address, and a request is sent. But because the request is coming from inside your network, the router doesn't recognise it as a normal request. It probably has a web interface for configuring it, and tries to return a page from its own management system, which is then not found at the path you specified.
In that case, I was working on a Windows PC, and I could solve it by adding the domain I was testing to my hosts file, specifying 127.0.0.1 as the IP address (or the IP address of the server, if it is another machine within the same network). In Linux there should be a similar solution, I think.
The problem isn't PHP or your server, but your router.
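For example, on Linux the entry goes in /etc/hosts (on Windows: C:\Windows\System32\drivers\etc\hosts); the domain and the addresses below are placeholders:
# hosts file on the machine running the PHP code
127.0.0.1      mydomain.com www.mydomain.com
# or, if the web server is a different machine on the same network:
# 192.168.1.10   mydomain.com www.mydomain.com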

If you just need to handle the warning when the URL is not found (as I did), you may just do this to turn Warnings into Exceptions:
set_error_handler(
    // turn warnings into ErrorExceptions so a failed request can be caught
    // (the old fifth $errcontext callback parameter is omitted here; it was removed in PHP 8)
    function ($err_severity, $err_msg, $err_file, $err_line) {
        // do not throw an exception if the @-operator was used to suppress the error
        if (error_reporting() === 0) {
            return false;
        }
        throw new ErrorException($err_msg, 0, $err_severity, $err_file, $err_line);
    },
    E_WARNING
);
try {
    $contents = file_get_contents($your_url);
} catch (Exception $e) {
    echo $e->getMessage();
}
restore_error_handler();
Solution based on this thread/question.
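An alternative sketch (not from the answer above): pass a stream context with ignore_errors so the body is returned even on a 4xx/5xx response, then inspect the status line in $http_response_header yourself.
$context = stream_context_create(array(
    'http' => array('ignore_errors' => true),   // return the body even on 404 instead of emitting a warning
));
$contents = file_get_contents($your_url, false, $context);
// $http_response_header[0] is the status line, e.g. "HTTP/1.1 404 Not Found"
if (isset($http_response_header[0]) && strpos($http_response_header[0], ' 404 ') !== false) {
    echo "Not found\n";
}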

Most hosting providers now disable the allow_url_fopen setting, which is what allows file_get_contents() to load data from an external URL.
You can use cURL or a PHP HTTP client library like Guzzle instead.
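A minimal cURL sketch as an alternative ($url is a placeholder for the address you want to fetch):
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow redirects
$data = curl_exec($ch);
if ($data === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);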

Try encoding the query-string values properly, for example with http_build_query() (urlencode()-ing the whole string would also encode the & and = separators and break the URL):
$params = array(
    'limit'  => 4,
    'offset' => 0,
    's_date' => '2012-02-05',
    'e_date' => '2012-03-13',
    'order'  => 'release_date',
    'dir'    => 'desc',
    'cid'    => 12,
);
file_get_contents('https://mydomain.com?' . http_build_query($params));

I got the same error in CodeIgniter 3. I was doing this:
file_get_contents(base_url('database.json'));
and then this:
file_get_contents(site_url('database.json'));
My problem got resolved after I changed it to this:
file_get_contents(__DIR__.'/database.php');
The reason was that I was trying to fetch an internal resource through the external URL that base_url and site_url return, whereas __DIR__ gives the internal (filesystem) path.
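To illustrate the difference, a rough sketch (the domain and paths are examples, not from the original post):
base_url('database.json');      // e.g. "http://example.com/database.json"  -- an external HTTP URL
site_url('database.json');      // e.g. "http://example.com/index.php/database.json"
__DIR__ . '/database.json';     // e.g. "/home/user/app/controllers/database.json" -- a local filesystem path

// reading from disk avoids the HTTP round trip back to your own site:
$json = file_get_contents(__DIR__ . '/database.json');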

Related

Google reCaptcha validation (siteverify) time out

I am trying to validate Google reCaptcha on my website (I am using a GoDaddy server (cPanel)). But when I try to verify it server-side, it takes too long, and then I get a time out.
Can anyone point out what is causing the problem?
if (isset($this->data['g-recaptcha-response']) && !empty($this->data['g-recaptcha-response'])) {
    $u = "https://www.google.com/recaptcha/api/siteverify?secret=" . DataSecret
       . "&response=" . $this->data['g-recaptcha-response']
       . "&remoteip=" . $_SERVER['REMOTE_ADDR'];
    $response = @file_get_contents($u);   // @ suppresses the warning; check whether $response === false
    $arr = json_decode($response, true);
}
I get this error message:
Warning (2): file_get_contents(https://www.google.com/recaptcha/api/siteverify?secret={XXXXXXXXXX}&response={XXXXXXXXXXXX}&remoteip=xxx.x.xx.xx): failed to open stream: Connection timed out
NOTE: On localhost (which is on the same server), file_get_contents is working. I just have an issue with reCAPTCHA.
@Yogesh Saroya, have you checked allow_url_fopen=On and allow_url_include=On in your server settings? I think these two settings are required.
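If allow_url_fopen cannot be enabled, a cURL-based sketch along these lines should do the same verification (DataSecret and $this->data are taken from the question; the timeout value is just an example):
$ch = curl_init('https://www.google.com/recaptcha/api/siteverify');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'secret'   => DataSecret,
        'response' => $this->data['g-recaptcha-response'],
        'remoteip' => $_SERVER['REMOTE_ADDR'],
    )),
    CURLOPT_TIMEOUT        => 10,   // fail fast instead of hanging until the server times out
));
$response = curl_exec($ch);
curl_close($ch);
$arr = ($response !== false) ? json_decode($response, true) : null;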

HTTP error on file_get_contents for FB Graph API

I am trying to retrieve details of a page/profile from Facebook into my PHP application. So I retrieve the id first and then run the following query. However I am getting an error.
I am getting the following error:
Warning: file_get_contents(https://graph.facebook.com/v2.8/40444963499?fields=id,name,picture.width(700).height(700),albums.limit(5){name,photos.limit(2){name, picture}},posts.limit(5)&access_token="my access token here"): failed to open stream: HTTP request failed! HTTP/1.1 400 Bad Request
However, if I run the same URL on the browser, I get back the JSON correctly.
Windows almost always has problems with SSL certificates. I wouldn't recommend you do this for your production site, but during development it's fine. By disabling the SSL check you're effectively saying you don't care whether the site has a valid SSL certificate, which means that if someone were trying to impersonate graph.facebook.com, you would be communicating with a site that is likely trying to steal your access token.
$context = stream_context_create(array(
    "ssl" => array(
        "verify_peer"      => false,
        "verify_peer_name" => false,
    )
));
file_get_contents('https://graph.facebook.com/v2.8/40444963499?fields=id,name,picture.width(700).height(700),albums.limit(5){name,photos.limit(2){name, picture}},posts.limit(5)&access_token=FB_ACCESS_TOKEN', null, $context);
Now if you're interested in actually fixing the problem on your machine, then review this answer: PHP - SSL certificate error: unable to get local issuer certificate
So the error was that there was a space between two of the parameters passed in the URL. It didn't show up in the web browser, presumably because the browser handles the space itself, but because of the space file_get_contents wasn't working.
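In other words, make sure the fields value contains no raw spaces, or encode it before appending it to the URL. A rough sketch (FB_ACCESS_TOKEN stands for the real token, as in the snippet above):
$fields = 'id,name,picture.width(700).height(700),albums.limit(5){name,photos.limit(2){name,picture}},posts.limit(5)';
$url = 'https://graph.facebook.com/v2.8/40444963499'
     . '?fields=' . rawurlencode($fields)          // no stray space after the comma, and the value is URL-encoded
     . '&access_token=' . FB_ACCESS_TOKEN;
$json = file_get_contents($url);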

PHP Warning: php_network_getaddresses: getaddrinfo failed: Name or service not known in /some/path on line x

Our application had been running for years without any glitch, and one day we decided to use Redis as our caching server for speedy delivery of data. Redis was installed and configured on a new server, content was cached per the business requirements, and the Redis service was used at the application end. After the production deployment, pages started throwing numerous warnings:
warning: php_network_getaddresses: getaddrinfo failed: Name or service not known in /path/of/the/calling/script/file
What would be the reason and the resolution?
During development and on the testbed we were using a different server, where I was able to connect to the Redis server. But after the production failover I noticed that the system was unable to resolve the address; it could not connect to caching.example.com:
$redisServer = 'caching.example.com';
try {
    $redis = new Redis();
    $redis->connect($redisServer, 6379);
} catch (Exception $e) {
    print_r($e);
}
Doing the root cause analysis, I added a hosts entry for caching.example.com on the production server and it worked.
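The hosts entry looked roughly like this (the IP address below is a placeholder for the Redis host's real address):
# /etc/hosts on the production web server
203.0.113.25    caching.example.com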
Basically/theoretically, if you are trying to access a remote URL, then file_get_contents() is your best bet. You can provide a full URL to that function, and it will fetch the content at that location using a normal HTTP request.
If you only want to send an HTTP request and ignore the response, you could use fsockopen() and send the HTTP request headers manually, ignoring any response. It might be easier with cURL, though, or just plain old fopen(), which will open the connection but not necessarily read any response.
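A minimal fsockopen() sketch of that fire-and-forget approach (the host and path are placeholders):
$fp = @fsockopen('example.com', 80, $errno, $errstr, 5);
if ($fp === false) {
    echo "Connection failed: $errno $errstr\n";
} else {
    // send the request and close without reading the response
    $request  = "GET /ping HTTP/1.1\r\n";
    $request .= "Host: example.com\r\n";
    $request .= "Connection: Close\r\n\r\n";
    fwrite($fp, $request);
    fclose($fp);
}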

PHP error doing file_get_contents on a Hostinger host: failed to open stream

I'm trying to use the http://ip-api.com/json/ API to get and show information about visitors who connect to my website hosted on Hostinger.
When I do the following, it should respond with a JSON containing all the info:
echo file_get_contents("http://ip-api.com/json/{$user_ip}");
But I get the following error message:
Warning: file_get_contents(http://ip-api.com/json/[MY PUBLIC IP]): failed to open stream: Connection refused in /home/[HOSTINGUER USER]/public_html/ip2.php on line 31
Another strange thing: if I use a different API it works correctly and returns the correct JSON. The other API call:
echo file_get_contents("http://ipinfo.io/{$user_ip}/json");
So, I want to use the ip-api.com API because its results are more accurate, but only the ipinfo.io API works... Why can I make the request to one website and not to the other?
Also, I tried both locally using curl and by typing the URLs into my web browser, and they work correctly. I also tried locally on a LAMPP install and both work perfectly. Finally, I tried something like this post: PHP file_get_contents() returns "failed to open stream: HTTP request failed!" on Hostinger, and it doesn't work.
I thought maybe it is something in the Hostinger configuration, but...
Thanks in advance!
Your IP address is banned. If you're using normal web hosting, your outgoing IP address is shared with other users, who probably made more requests than allowed.
Go to http://ip-api.com/docs/unban and enter your server's outgoing IP address (check it via http://ipinfo.io/json)
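For example, you can print the server's outgoing address from a script on that host and then submit it on the unban page:
// run this on the Hostinger server; the "ip" field is the outgoing address to unban
echo file_get_contents('http://ipinfo.io/json');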

Is it possible to differentiate a proxy error from an endpoint error with PHP cURL

A little cURL question ^^
You have:
a web site (endpoint.com/request.php)
a proxy server (proxy.com:8080)
When sending requests through the proxy with php_curl to a remote web site (e.g. an API), is it possible to differentiate the cases where the proxy part failed (e.g. proxy down, bad auth, timeout ...) from the cases where the remote web site fails (e.g. 404, site down, bad auth, timeout ...)?
Thx
See the PHP cURL documentation and/or the libcurl documentation.
It looks like the only proxy-specific error code is CURLE_COULDNT_RESOLVE_PROXY (5).
You need to check the result of curl_exec($curl) and display the error:
$response = curl_exec($curl);
if ($response === false) {   // curl_exec() returns false on failure
    echo "cURL error number: " . curl_errno($curl) . "\t";
    echo "cURL error: " . curl_error($curl) . "\n";
}
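If it helps, here is a rough sketch (proxy.com:8080 and endpoint.com/request.php are the placeholders from the question) of one way to separate the two failure classes: transport errors reported by curl_errno() versus HTTP status codes returned by the endpoint.
$curl = curl_init('https://endpoint.com/request.php');
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_PROXY          => 'proxy.com:8080',
));
$response = curl_exec($curl);
if ($response === false) {
    // transport-level failure: check which hop broke
    if (curl_errno($curl) === CURLE_COULDNT_RESOLVE_PROXY) {   // error 5
        echo "Proxy failure: " . curl_error($curl) . "\n";
    } else {
        echo "Connection failure (proxy or endpoint): " . curl_error($curl) . "\n";
    }
} else {
    // the request got through; the HTTP status comes from the endpoint,
    // while CURLINFO_HTTP_CONNECTCODE holds the proxy's reply to a CONNECT (HTTPS only)
    $status      = curl_getinfo($curl, CURLINFO_HTTP_CODE);
    $connectCode = curl_getinfo($curl, CURLINFO_HTTP_CONNECTCODE);
    echo "Endpoint responded with HTTP $status (proxy CONNECT code: $connectCode)\n";
}
curl_close($curl);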
