file_get_contents from remote client and SSL - php

I have PHP Laravel5 project. I use file_get_contents() in a controller's action like the following:
$production = json_decode(file_get_contents(url('/operation/get-production/'.$job->id)))->production;
The above route is excluded from authentication and authorization, and the above line of code works well from localhost. However, when a remote user from the Internet uses that code, it generates the following error:
I'm not using IIS; the server is Apache on Ubuntu Linux. I don't know why SSL appears in this case. I'm pretty sure that the URL supplied to the function is http://..., not https://...

Your production URL may be forced to use https (and that's a good thing).
file_get_contents() isn't the perfect choice for fetching data over https, but if the php_openssl extension is enabled and allow_url_fopen is set to "On", you can fetch content from https URLs.
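A quick sanity check for both prerequisites (purely diagnostic, not part of the fix):
<?php
// Confirm the openssl extension is loaded and allow_url_fopen is enabled
var_dump(extension_loaded('openssl')); // should print bool(true)
var_dump(ini_get('allow_url_fopen'));  // should print string(1) "1"
?>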
Since you don't seem to have a valid SSL certificate, using PHP cURL may be a better idea (http://php.net/manual/fr/book.curl.php), as you can disable the SSL errors with this:
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
(Only do this if you trust the URL; never do this with an external website.)
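For reference, a complete sketch along those lines might look like the following; the URL is a placeholder, and disabling verification is only acceptable for a host you control:
<?php
$ch = curl_init('http://www.example.com/operation/get-production/1'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body rather than printing it
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0);    // skip the hostname check (trusted host only!)
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);    // skip the certificate check (trusted host only!)
$body = curl_exec($ch);
if ($body === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);
$production = json_decode($body)->production;
?>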

It sounds like the certificate verification is failing because PHP doesn't have a reference file of root certificates to check against. See this Stack Overflow answer.
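If you'd rather keep file_get_contents(), you can also point the https stream wrapper at a CA bundle explicitly via a stream context; a minimal sketch, assuming the Debian/Ubuntu bundle path (adjust it for your system):
<?php
$context = stream_context_create(array(
    'ssl' => array(
        'verify_peer' => true,
        'cafile'      => '/etc/ssl/certs/ca-certificates.crt', // path varies by distribution
    ),
));
$data = file_get_contents('https://www.example.com/', false, $context);
?>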

You have two possible situations:
Possible answer #1: if you are using Apache, check your configuration to see whether your domain is forcing all requests over SSL.
- OR -
Possible answer #2: create a stream context for your GET URL and send headers to the other server:
<?php
// Create a stream context with a browser-like user agent and extra headers
$opt = array(
    'http' => array(
        'method' => 'GET',
        'user_agent' => 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7) Gecko/20040803 Firefox/0.9.3',
        // double quotes so that \r\n is interpreted as CRLF
        'header' => "Accept-language: en\r\n" .
                    "Cookie: foo=bar\r\n"
    )
);
$context = stream_context_create($opt);
// Get the external URL's response
$file = file_get_contents('http://www.example.com/some_file', false, $context);
?>

Related

How to get a CSV from a site with a 503 error with CloudFlare protection?

I have created a PHP script which gets a CSV from an external site using fopen and fgetcsv to store the data into an array.
The external site sporadically throws 503 errors. When this occurs, fopen fails and returns an error saying the website is unavailable.
The external site in question continues to work fine in a browser, as it is protected by Cloudflare.
Is there any way to still get the CSV in this scenario? I imagine it would involve somehow mimicking a browser in my script to get the file...? It may not be possible, but I need confirmation.
Cloudflare support site says:
On the other hand, a 503 Service Temporarily Unavailable error message with "cloudflare-nginx" in it means you are hitting a connection limit in a Cloudflare datacenter. Please contact Cloudflare support with the following information:
link
If the site works with browsers, it might be allowing connections only from browsers to save bandwidth; but I think that when your server contacts the site, the connection limit has already been reached, so it doesn't depend on your server.
You can still try using curl to emulate a normal browser and see if it works.
<?php
$url = "https://example.com";
$agent = 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.0.3705; .NET CLR 1.1.4322)';
$ch = curl_init();
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, $agent);
curl_setopt($ch, CURLOPT_URL, $url);
$result = curl_exec($ch);
var_dump($result);
?>
But JavaScript still won't run, and the site may notice that.
There is no way to bypass the Cloudflare protection using a User-Agent or the like, because if that were possible, Cloudflare wouldn't be any security at all.
What is probably happening is that either the backend has failed but Cloudflare can let the browser use a cached response, or the failure is intermittent and the browser works simply because it makes the next call. It might well happen that your CSV scraper succeeds while the browser fails, and you never notice, because when the scraper succeeds you have no reason to check with the browser at all.
As for what you can do: yes, you can emulate a human with a browser. You do this by caching any successful responses together with a timestamp, and by retrying after a short pause when you get an error (a retry sketch follows; a caching sketch comes after it).
function scrapeCSV($retries = 3) {
    if (0 === $retries) {
        // return an invalid response to signify an error
        return null;
    }
    $fp = @fopen(...); // suppress the warning; failure is handled below
    if (!$fp) {
        // failed: wait a moment, then retry
        sleep(1);
        return scrapeCSV($retries - 1);
    }
    ...
    return $csv;
}
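To complete the picture, here is a minimal sketch of the caching half of the approach; the cache path, lifetime, and wrapper name are all hypothetical:
// Hypothetical wrapper: serve a recent cached copy when the live fetch fails
function getCSVCached($cacheFile = '/tmp/feed.csv', $maxAge = 3600) {
    $csv = scrapeCSV(); // the retry function above
    if ($csv !== null) {
        file_put_contents($cacheFile, $csv); // remember the successful response
        return $csv;
    }
    // fall back to the cached copy if it is fresh enough
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        return file_get_contents($cacheFile);
    }
    return null; // no live response and no usable cache
}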
UPDATE
To access the second-level cache "as a browser would do", you probably need to cross-breed two different solutions: how to "fake" a browser connection, and how to read from curl as if it were a stream (i.e. fopen).
If you're fine with recovering the whole CSV in one fell swoop and parsing it later, once you've got it as a local file, then you only need the first answer (there's a more upvoted, more detailed and procedural answer below the one I linked; the one I linked is mine ;-) ).
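A sketch of that fetch-then-parse approach, assuming a placeholder URL and a browser-like user agent:
// Fetch the whole CSV in one request, then parse the local copy with fgetcsv
$ch = curl_init('https://example.com/feed.csv'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; Linux x86_64) Firefox/102.0'); // browser-like UA
$raw = curl_exec($ch);
curl_close($ch);
if ($raw !== false) {
    $local = tempnam(sys_get_temp_dir(), 'csv'); // temporary local file
    file_put_contents($local, $raw);
    $rows = array();
    $fp = fopen($local, 'r');
    while (($row = fgetcsv($fp)) !== false) {
        $rows[] = $row; // each row becomes an array of fields
    }
    fclose($fp);
}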

PHP Send Custom HTTP Request

How can I send a custom HTTP Request to a server whose URL is "http://[IP]:[Port]/"?
What I mean is that, instead of the first line being a normal GET or POST like so:
GET /index.html HTTP/1.1
Host: www.example.com
How can this be replaced with something like:
CUSTOM
Host: [IP]
I don't mind having to use any additional libraries if necessary such as cURL.
UPDATE:
I've tried using the following cURL code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://[IP]:[Port]/");
curl_setopt($ch, CURLOPT_PORT, [Port]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
//curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "CUSTOM");
$output = curl_exec($ch);
curl_close($ch);
print($output);
But it just keeps loading for about 2 minutes until it reports an internal error (both with and without CURLOPT_CUSTOMREQUEST). However, if I use a standard website such as http://google.com, it works fine.
Also, I forgot to mention: the port my server is using is 7899; is this OK? It opens in my web browser fine, but cURL doesn't seem to be able to reach it.
Looks like there's nothing wrong with your code. If you're using a shared hosting provider, ask them to open up outbound TCP port 7899.
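As an aside, explicit timeouts make this kind of blocked-port failure surface in seconds rather than minutes; a minimal sketch (the IP and method name are placeholders, the port is from the question):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://203.0.113.1:7899/"); // placeholder IP, port from the question
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // give up connecting after 5 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 10);        // give up on the whole request after 10 seconds
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "CUSTOM"); // replaces GET/POST on the request line
$output = curl_exec($ch);
if ($output === false) {
    echo 'cURL error: ' . curl_error($ch); // e.g. a connect timeout if the port is blocked
}
curl_close($ch);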
You can register a custom HTTP request method using the http_request_method_register function (from the pecl_http extension, not core PHP):
http://www.php.net/manual/en/function.http-request-method-register.php
This code needs to run on the server that will be handling the request. If you send a non-standard HTTP request method to any old server, then depending on the server it may ignore it or return an error code.

loading FB thru PHP file_get_contents() throws 'You are using an Incompatible Web Browser' error

I have PHP hosting with GoDaddy. For the last hour I have not been able to load Facebook content from my PHP scripts; it always says
'You are using an Incompatible Web Browser'.
I know that sounds like a browser issue, but I am sure it is not, because I have tried Firefox, Chrome, and IE on two Windows machines and Firefox and Safari on a Mac. I get the same error every time.
Could you please let me know what could be a possible reason for this?
[Try loading http://cabbie.apprumble.in/index.php?r=site/test]
In normal circumstances this should load the Facebook home page properly, instead of showing the error that you have an incompatible browser.
[PS: I am loading the Facebook page using the PHP call file_get_contents("http://facebook.com"), which was working perfectly fine until an hour ago. Also, if I load the URL directly in the browser, it works perfectly fine, but if it's invoked from within PHP using file_get_contents, the said error appears.]
Could someone please reply soon, as I am stuck in my development due to this.
Thanks,
Kshitij
file_get_contents uses the user agent set in your php.ini file via the user_agent setting. You probably cannot change this, as you are on GoDaddy hosting.
You will need to switch from file_get_contents to something that lets you control the user agent. You could use curl or sockets. Here is a curl example (a stream-context alternative is sketched below):
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.facebook.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.13) Gecko/20101206 Ubuntu/10.10 (maverick) Firefox/3.6.13'); // set the user agent here
$data = curl_exec($ch);
curl_close($ch); // free the handle
echo $data; // this is the homepage
Facebook is attempting to block bots by not allowing certain user agents to request pages. You need to spoof the user agent to look like a normal browser.
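Incidentally, if curl isn't available, the user agent can also be overridden per request with a stream context, so file_get_contents itself keeps working; a minimal sketch:
// Spoof the user agent via a stream context instead of php.ini
$context = stream_context_create(array(
    'http' => array(
        'user_agent' => 'Mozilla/5.0 (X11; U; Linux x86_64; en-US) Firefox/3.6.13',
    ),
));
$data = file_get_contents('http://www.facebook.com/', false, $context);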

Can't connect to HTTPS site using cURL. Returns 0 length content instead. What can I do?

I have a site that connects using cURL (latest version) to a secure gateway for payment.
The problem is that cURL always returns 0-length content; I get headers only, and only when I set cURL to return headers. I have the following flags in place.
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($ch, CURLOPT_URL, $gatewayURI);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POST, 1);
The header returned is
HTTP/1.1 100 Continue
HTTP/1.1 200 OK
Date: Tue, 25 Nov 2008 01:08:34 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
Content-Length: 0
Content-Type: text/html
Set-Cookie: ASPSESSIONIDxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx; path=/
Cache-control: private
I have also tried cURLing different sites and they return content fine. I think the problem might have something to do with the https connection.
I have spoken with the company and they are unhelpful.
Has anyone else experienced this error and know a workaround? Should I ditch cURL and try using fsockopen()?
Thank you. :)
I had the same problem today.
cURL ships with an outdated file of root certificates for authenticating HTTPS connections.
Get the new one from:
http://curl.haxx.se/ca/cacert.pem
Save it into some directory on your site and add
curl_setopt($curl_ch, CURLOPT_CAINFO, dirname(__FILE__)."/cacert.pem");
to every request :-)
IGNORE any dumbass comments about disabling CURLOPT_SSL_VERIFYPEER and CURLOPT_SSL_VERIFYHOST!! That leaves your code vulnerable to man-in-the-middle attacks!
December 2016 edit:
Solve this properly by using Jasen's method mentioned below.
Add curl.cainfo=/etc/ssl/certs/ca-certificates.crt to your php.ini.
October 2017 edit:
There is now a composer package that helps you manage the CA certificates, so that you're not left vulnerable when your cacert.pem becomes outdated due to revoked certificates.
https://github.com/paragonie/certainty -> composer require paragonie/certainty:dev-master
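A usage sketch based on that project's README at the time of writing (treat the class and method names as assumptions and check the current documentation):
use ParagonIE\Certainty\RemoteFetch;

// Keep a locally cached, auto-refreshed CA bundle and hand it to cURL
$fetcher = new RemoteFetch('/path/to/writable/directory'); // directory for the cached bundle
$bundle = $fetcher->getLatestBundle();
curl_setopt($ch, CURLOPT_CAINFO, $bundle->getFilePath());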
You should also try checking the error message from curl_error(). You might need to call it once after each curl_* function.
http://www.php.net/curl_error
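For example, a minimal error-checking pattern:
$ch = curl_init('https://www.example.com/'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
if ($result === false) {
    // curl_error() describes the most recent failure on this handle
    echo 'cURL error (' . curl_errno($ch) . '): ' . curl_error($ch);
}
curl_close($ch);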
Note: this is strictly not for production use. If you want to debug quickly, it may be useful; otherwise, please use #SchizoDuckie's answer above.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
Just add them. It works.
Just had a very similar problem and solved it by adding
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
Apparently the site I'm fetching redirects to another location and php-curl doesn't follow redirects by default.
I had a situation where this helped: (PHP 5.4.16 on Windows)
curl_setopt($ch, CURLOPT_SSLVERSION, 3);
Whenever I'm testing something with PHP/Curl, I try it from the command line first, figure out what works, and then port my options to PHP.
Sometimes you have to upgrade cURL's CA certificates to the latest version to avoid errors with https.
I used file_get_contents with stream_context_create and it works fine:
$postdataStr = http_build_query($postdataArr);
$context_options = array(
    'http' => array( // the context key is always 'http', even for https URLs!
        'method' => 'POST',
        'header' => "Content-type: application/x-www-form-urlencoded\r\n"
                  . "Content-Length: " . strlen($postdataStr) . "\r\n"
                  . "Cookie: " . $cookies . "\r\n",
        'content' => $postdataStr
    )
);
$context = stream_context_create($context_options);
$HTTPSReq = file_get_contents('https://www.example.com/', false, $context);
There might be a problem at your web hosting company, from where you are testing the secure communication with the gateway: they might not allow it.
There might also be a username and password that must be provided before connecting to the remote host.
Or your IP might need to be on the remote server's list of approved IPs before communication can be initiated.
I discovered this error on a recent application project. I was writing it to run from the command line or the browser window, so I was using server detection to get the relative URL of the document I was asking for. The trouble was, the site is https, and each time I attempted to access http://(same server), cURL helpfully changed it to https.
This works fine from the browser, but from the command line I'd then get an SSL error even with both verify options set to false. What I had to do (sketched after this list) was:
1) Check $_SERVER['HTTP_HOST']. If present, use ($_SERVER['HTTPS'] ? "https://" : "http://").$_SERVER['HTTP_HOST'].
2) Check $_SERVER['COMPUTERNAME'], and if it matches the production server, use the https URL ("https://(servername)").
3) If neither condition passes, it means I'm running from the command line on a different server, so use "http://localhost".
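A minimal sketch of that detection logic, assuming a hypothetical production hostname ("prodserver"):
// Decide which base URL to use depending on where the script is running
function detectBaseUrl() {
    if (!empty($_SERVER['HTTP_HOST'])) {
        // running through the web server: mirror its scheme and host
        $scheme = !empty($_SERVER['HTTPS']) ? 'https://' : 'http://';
        return $scheme . $_SERVER['HTTP_HOST'];
    }
    if (isset($_SERVER['COMPUTERNAME']) && $_SERVER['COMPUTERNAME'] === 'prodserver') {
        // command line on the production machine (hypothetical name)
        return 'https://prodserver';
    }
    // command line anywhere else
    return 'http://localhost';
}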
Now, this worked, but it's a hack. Also, I never did figure out why on one server (https) cURL changed my URL, while on the other (also https) it left my URL alone.
Weird.
The best way to use https and avoid security issues is to use Firefox (or another tool) and download the certificate to your server. This webpage helped me a lot, and these were the steps that worked for me:
1) Open in Firefox the URL you're going to use with cURL.
2) On the address bar, click the padlock > More Information (Firefox versions have different menus; just find it). Click the View Certificate button > Details tab.
3) Highlight the "right" certificate in the certificate hierarchy. In my case it was the second of three, called "cPanel, Inc. Certification Authority". I discovered the right one by trial and error.
4) Click the Export button. In my case the one that worked was the file type "PEM with chains" (again found by trial and error).
5) Then in your PHP script add:
5) Then in your PHP script add:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_CAINFO, [PATH_TO_CRT_FILE]);
In addition, note that these steps will probably need to be redone once a year, or whenever the site's certificate is replaced or renewed.
You are using the POST method, but are you providing an array of data? E.g.:
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $data);

reading SSL page with CURL (php) [duplicate]

This question already has answers here:
PHP - SSL certificate error: unable to get local issuer certificate
(19 answers)
Closed 1 year ago.
I am trying to download the content of a secure (uses https) webpage using php and curl libraries.
However, reading fails and I get error 60: "SSL certificate problem, verify that the CA cert is OK."
Also: "Details: SSL3_GET_SERVER_CERTIFICATE:certificate verify failed".
So... pretty self-explanatory error messages.
My question is: how do I send an SSL certificate (the right one?) and get this page to verify it and let me in?
Also, here is my options array in case you are wondering:
$options = array(
    CURLOPT_RETURNTRANSFER => true,  // return web page
    CURLOPT_HEADER         => false, // don't return headers
    CURLOPT_FOLLOWLOCATION => true,  // follow redirects
    CURLOPT_ENCODING       => "",    // handle all encodings
    CURLOPT_USERAGENT      => "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:x.x.x) Gecko/20041107 Firefox/x.x", // who am i
    CURLOPT_AUTOREFERER    => true,  // set referer on redirect
    CURLOPT_CONNECTTIMEOUT => 120,   // timeout on connect
    CURLOPT_TIMEOUT        => 120,   // timeout on response
    CURLOPT_MAXREDIRS      => 10,    // stop after 10 redirects
    CURLOPT_SSL_VERIFYHOST => 1,
);
Any suggestions would be great,
Andrew
It sounds like you might be misinterpreting the error. It looks to me like the site you're connecting to uses a self-signed certificate or has some other common problem. Just like with the usual browser warning, your easiest workaround is to disable the checks.
You'll need to set CURLOPT_SSL_VERIFYPEER and CURLOPT_SSL_VERIFYHOST to FALSE. This should disable the two main checks. They may not both be required, but this should at least get you going.
To be clear, this disables a feature designed to protect you. Only do this if you have verified the certificate and server by some other means.
More info on the PHP site: curl_setopt()
If you want to use SSL peer verification (turning it off is not always a good idea), you can use the following solution on Windows, globally for all applications:
Download the file with root certificates from here:
http://curl.haxx.se/docs/caextract.html
Add to php.ini:
curl.cainfo=C:/path/to/cacert.pem
That's all the magic; cURL can now verify certificates.
(As far as I know there is no such problem on Linux, at least not on Ubuntu.)
Even after following the advice on SO, you may still have problems, with an error like:
error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error
The problem is with the SSL version. Use the following for version 3:
curl_setopt($ch, CURLOPT_SSLVERSION, 3);
I am assuming that you have enabled verification of peer and host as well, and are pointing to an actual certificate file, e.g.:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_CAINFO, getcwd() . "/cacert.pem");
This is a "problem" with openssl and VeriSign.
I had a similar problem and my openssl was missing the intermediate ssl certificate used by VeriSign to sign the server certificate.
https://knowledge.verisign.com/support/ssl-certificates-support/index?page=content&id=AR657
I had to import these intermediate certificates from the VeriSign Homepage or Firefox cert-database-export into my local ca-certificates list and after this step I was able to use wget/curl to use the protected connection without any errors.
If it's a developer machine - you can also add this certificate in you system.
Something like this - https://www.globalsign.com/support/intermediate/intermediate_windows.php
It's for WinXP, but it works also on other versions of windows.
You're not SENDing the SSL cert. The problem is with the SSL cert as it is installed on the host you are contacting. With command-line curl, use option -k (or --insecure) to get past the complaint.
Ah. See Ryan Graham's answer.
This is apparently an openssl bug. Tomcat can be configured to work around it in /etc/tomcat7/server.xml by restricting the available cipher list:
<Connector protocol="HTTP/1.1" SSLEnabled="true" ... ciphers="SSL_RSA_WITH_RC4_128_SHA"/>
