I’ve been trying to access this particular REST service from a PHP page I’ve created on our server. I narrowed the problem down to these two lines. So my PHP page looks like this:
<?php
$response = file_get_contents("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json");
echo $response; ?>
The page dies on line 2 with the following errors:
Warning: file_get_contents(): SSL operation failed with code 1.
OpenSSL Error messages: error:14090086:SSL
routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed in
...php on line 2
Warning: file_get_contents(): Failed to enable crypto in ...php on
line 2
Warning:
file_get_contents(https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json):
failed to open stream: operation failed in ...php on line 2
We’re using a Gentoo server. We recently upgraded to PHP 5.6, and the problem appeared right after that upgrade.
I found that when I replace the REST service URL with an address like https://www.google.com, my page works just fine.
In an earlier attempt I set "verify_peer" => false and passed that in as an argument to file_get_contents, as described here: file_get_contents ignoring verify_peer=>false? But as that writer noted, it made no difference.
I’ve asked one of our server administrators whether these lines exist in our php.ini file:
extension=php_openssl.dll
allow_url_fopen = On
He told me that since we’re on Gentoo, OpenSSL is compiled in when we build, so it isn’t set in the php.ini file.
I also confirmed that allow_url_fopen is working (a minimal way to check both settings is sketched below). Due to the specialized nature of this problem, I’m not finding a lot of information to help. Have any of you come across something like this? Thanks.
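An illustrative sketch for confirming those two settings from PHP itself:
<?php
// Quick sanity check (illustrative): the OpenSSL extension is available
// and URL wrappers are enabled for file_get_contents().
var_dump(extension_loaded('openssl'));        // bool(true) if OpenSSL support is compiled in or loaded
var_dump((bool) ini_get('allow_url_fopen'));  // bool(true) if https:// URLs may be opened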
This was an enormously helpful link to find:
http://php.net/manual/en/migration56.openssl.php
An official document describing the changes made to OpenSSL in PHP 5.6.
From here I learned of one more parameter I should have set to false: "verify_peer_name"=>false
Note: This has very significant security implications. Disabling verification potentially permits a MITM attacker to use an invalid certificate to eavesdrop on the requests. While it may be useful to do this in local development, other approaches should be used in production.
So my working code looks like this:
<?php
$arrContextOptions = array(
    "ssl" => array(
        "verify_peer" => false,
        "verify_peer_name" => false,
    ),
);
$response = file_get_contents("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json", false, stream_context_create($arrContextOptions));
echo $response; ?>
You shouldn't just turn off verification. Instead, you should download a certificate bundle; perhaps the curl bundle will do.
Then put it on your web server, giving the user that runs PHP permission to read the file. Then this code should work for you:
$arrContextOptions = [
    'ssl' => [
        'cafile' => '/path/to/bundle/cacert.pem',
        'verify_peer' => true,
        'verify_peer_name' => true,
    ],
];
$response = file_get_contents(
    'https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json',
    false,
    stream_context_create($arrContextOptions)
);
Hopefully, the root certificate of the site you are trying to access is in the curl bundle. If it isn't, this still won't work until you get the root certificate of the site and put it into your certificate file.
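If you are unsure which certificates the site presents, one way to find out (a sketch, assuming the openssl extension is available; the host is the one from the question) is to capture the peer's certificate chain from PHP and inspect it:
<?php
// Sketch: capture the server's certificate chain so you can identify
// (and, if you trust them, export) the root/intermediate certificates.
$ctx = stream_context_create(['ssl' => [
    'capture_peer_cert_chain' => true,
    'verify_peer'             => false, // inspection only; never disable this for real requests
    'verify_peer_name'        => false,
]]);
$client = stream_socket_client('ssl://maps.co.weber.ut.us:443', $errno, $errstr, 30,
    STREAM_CLIENT_CONNECT, $ctx);
if ($client === false) {
    die("Connect failed: $errstr");
}
$chain = stream_context_get_params($client)['options']['ssl']['peer_certificate_chain'];
foreach ($chain as $i => $cert) {
    $info = openssl_x509_parse($cert);
    echo $i, ': ', $info['name'], PHP_EOL; // subject of each certificate in the chain
    openssl_x509_export($cert, $pem);      // $pem now holds the PEM block you could append to your bundle
}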
I fixed this by making sure that OpenSSL was installed on my machine and then adding this to my php.ini:
openssl.cafile=/usr/local/etc/openssl/cert.pem
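To confirm that PHP actually picks the setting up after a restart, openssl_get_cert_locations() (PHP 5.6+) shows where PHP and OpenSSL will look; an illustrative check:
<?php
// Illustrative: "ini_cafile" should reflect the openssl.cafile value
// from php.ini once the web server / PHP process has been restarted.
print_r(openssl_get_cert_locations());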
You can get around this problem by writing a custom function that uses curl, as in:
function file_get_contents_curl($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_AUTOREFERER, true);    // set the Referer header automatically on redirects
    curl_setopt($ch, CURLOPT_HEADER, 0);            // exclude response headers from the output
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);    // return the body instead of printing it
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
Then just use file_get_contents_curl instead of file_get_contents whenever you're calling a URL that begins with https. This likely works because curl uses its own CA bundle (or the one set via curl.cainfo), so verification can succeed where the openssl stream wrapper's defaults fail.
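A short usage sketch (illustrative) with basic error handling, since curl_exec() returns false on failure:
<?php
// Illustrative usage of the helper above.
$json = file_get_contents_curl("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json");
if ($json === false) {
    die("Request failed");
}
echo $json;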
This works for me; I am using PHP 5.6. The openssl extension must be enabled, and when calling the Google Maps API I set verify_peer to false.
The code below works for me.
<?php
$arrContextOptions = array(
    "ssl" => array(
        "verify_peer" => false,
        "verify_peer_name" => false,
    ),
);
$url = "https://maps.googleapis.com/maps/api/geocode/json?latlng="
. $latitude
. ","
. $longitude
. "&sensor=false&key="
. Yii::$app->params['GOOGLE_API_KEY'];
$data = file_get_contents($url, false, stream_context_create($arrContextOptions));
echo $data;
?>
First, you need to have the curl extension enabled in PHP. Then you can use this function:
function file_get_contents_ssl($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false); // disables certificate verification; see the security note above
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_REFERER, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);     // 3 sec. (CURLOPT_CONNECTTIMEOUT takes seconds, not milliseconds)
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // 10 sec.
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
It works similarly to file_get_contents(), but note that setting CURLOPT_SSL_VERIFYPEER to false disables certificate verification, with the same security implications described above.
Example:
echo file_get_contents_ssl("https://www.example.com/");
Output:
<!doctype html>
<html>
<head>
<title>Example Domain</title>
...
After falling victim to this problem on CentOS after updating PHP to 5.6, I found a solution that worked for me.
Get the default directory where your certs should be placed with this:
php -r "print_r(openssl_get_cert_locations()['default_cert_file']);"
Then use this to fetch the cert bundle and put it in the default location found by the command above:
wget http://curl.haxx.se/ca/cacert.pem -O <default location>
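If wget isn't available, a PHP equivalent along these lines should work too (a sketch; the bundle is fetched over plain HTTP here, exactly as in the wget command, so there is no certificate chicken-and-egg problem):
<?php
// Sketch: download the curl CA bundle into OpenSSL's default cert location.
// Requires allow_url_fopen and write access to the target path.
$target = openssl_get_cert_locations()['default_cert_file'];
copy('http://curl.haxx.se/ca/cacert.pem', $target);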
You basically have to set the environment variable SSL_CERT_FILE to the path of the PEM file of the SSL certificate bundle downloaded from the following link: http://curl.haxx.se/ca/cacert.pem.
It took me a lot of time to figure this out.
If your PHP version is 5, try installing cURL by typing the following command in the terminal:
sudo apt-get install php5-curl
Following the steps below will fix this issue:
Download the CA certificate bundle from this link: https://curl.haxx.se/ca/cacert.pem
Find and open php.ini
Look for curl.cainfo and set it to the absolute path of the downloaded certificate, e.g. curl.cainfo = "C:\wamp\htdocs\cert\cacert.pem"
Restart WAMP/XAMPP (the Apache server).
It works! Hope that helps!
Just wanted to add to this since I ran into the same problem and nothing I could find anywhere would work (e.g. downloading the cacert.pem file, setting cafile in php.ini, etc.).
If you are using NGINX and your SSL certificate comes with an "intermediate certificate", you need to concatenate the intermediate cert onto your main "mydomain.com.crt" file, and it should work. Apache has a directive specifically for intermediate certs, but NGINX does not, so the intermediate must live in the same file as your regular cert.
The reason for this error is that PHP does not have a list of trusted certificate authorities.
PHP 5.6 and later try to load the CAs trusted by the system automatically, and problems with that can be fixed. See http://php.net/manual/en/migration56.openssl.php for more information.
PHP 5.5 and earlier are really hard to set up correctly, since you have to manually specify the CA bundle in each request context, a thing you do not want to sprinkle around your code.
So I decided for my code that for PHP versions < 5.6, SSL verification simply gets disabled:
$req = new HTTP_Request2($url);
if (version_compare(PHP_VERSION, '5.6.0', '<')) {
    // correct SSL validation on PHP 5.5 is a pain, so disable it
    $req->setConfig('ssl_verify_host', false);
    $req->setConfig('ssl_verify_peer', false);
}
Had the same error with PHP 7 on XAMPP and OS X.
The answer mentioned above (https://stackoverflow.com/) is good, but it did not completely solve the problem for me. I had to provide the complete certificate chain to make file_get_contents() work again. Here's how I did it:
Get root / intermediate certificate
First of all, I had to figure out which are the root and the intermediate certificates.
The most convenient way is perhaps an online cert tool like ssl-shopper.
There I found three certificates: one server certificate and two chain certificates (one the root, the other apparently the intermediate).
All I needed to do was search the internet for both of them. In my case, this is the root:
thawte DV SSL SHA256 CA
And it leads to its URL, thawte.com. So I just put this cert into a text file and did the same for the intermediate. Done.
Get the host certificate
The next thing I had to do was download my server cert. On Linux or OS X it can be done with openssl:
openssl s_client -showcerts -connect whatsyoururl.de:443 </dev/null 2>/dev/null|openssl x509 -outform PEM > /tmp/whatsyoururl.de.cert
Now bring them all together
Now just merge all of them into one file. (Maybe it would be fine to just put them into one folder; I merged them into one file.) You can do it like this:
cat /tmp/thawteRoot.crt > /tmp/chain.crt
cat /tmp/thawteIntermediate.crt >> /tmp/chain.crt
cat /tmp/whatsyoururl.de.cert >> /tmp/chain.crt
Tell PHP where to find the chain
There is a handy function, openssl_get_cert_locations(), that will tell you where PHP is looking for cert files. And there is the cafile context parameter, which tells file_get_contents() where to look for them. Maybe both ways will work; I preferred the parameter way (compared to the solution mentioned above).
So this is now my PHP code:
$arrContextOptions = array(
    "ssl" => array(
        "cafile" => "/Applications/XAMPP/xamppfiles/share/openssl/certs/chain.pem",
        "verify_peer" => true,
        "verify_peer_name" => true,
    ),
);
$response = file_get_contents($myHttpsURL, false, stream_context_create($arrContextOptions));
That's all. file_get_contents() is working again, without cURL and hopefully without security flaws.
<?php
$stream_context = stream_context_create([
    "ssl" => [
        "verify_peer" => false,
        "verify_peer_name" => false,
    ]
]);
$response = file_get_contents("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json", false, $stream_context);
echo $response;
?>
Just tested on PHP 7.2; it works well.
EDIT: Also tested and working on PHP 7.1
Had the same SSL problem on my development machine (PHP 7, XAMPP on Windows) with a self-signed certificate while trying to fopen an "https://localhost/..." file. Obviously the root certificate bundle (cacert.pem) didn't help here.
I manually copied the contents of the Apache server.crt file into the downloaded cacert.pem and added the openssl.cafile=path/to/cacert.pem entry to php.ini.
Another thing to try is to re-install ca-certificates as detailed here.
# yum reinstall ca-certificates
...
# update-ca-trust force-enable
# update-ca-trust extract
And another thing to try is to explicitly allow the one site's certificate in question as described here (especially if the one site is your own server and you already have the .pem in reach).
# cp /your/site.pem /etc/pki/ca-trust/source/anchors/
# update-ca-trust extract
I was running into this exact error after upgrading to PHP 5.6 on CentOS 6, trying to access the server itself, which had a cheapsslsecurity certificate that may have needed updating. Instead, I installed a letsencrypt certificate, and with the two steps above that did the trick. I don't know why the second step was necessary.
Useful Commands
View openssl version:
# openssl version
OpenSSL 1.0.1e-fips 11 Feb 2013
View PHP cli ssl current settings:
# php -i | grep ssl
openssl
Openssl default config => /etc/pki/tls/openssl.cnf
openssl.cafile => no value => no value
openssl.capath => no value => no value
Regarding errors similar to
[11-May-2017 19:19:13 America/Chicago] PHP Warning: file_get_contents(): SSL operation failed with code 1. OpenSSL Error messages:
error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed
Have you checked the permissions of the cert and directories referenced by openssl?
You can do this
var_dump(openssl_get_cert_locations());
To get something similar to this
array(8) {
["default_cert_file"]=>
string(21) "/usr/lib/ssl/cert.pem"
["default_cert_file_env"]=>
string(13) "SSL_CERT_FILE"
["default_cert_dir"]=>
string(18) "/usr/lib/ssl/certs"
["default_cert_dir_env"]=>
string(12) "SSL_CERT_DIR"
["default_private_dir"]=>
string(20) "/usr/lib/ssl/private"
["default_default_cert_area"]=>
string(12) "/usr/lib/ssl"
["ini_cafile"]=>
string(0) ""
["ini_capath"]=>
string(0) ""
}
This issue frustrated me for a while, until I realized that my "certs" folder had 700 permissions when it should have had 755. Remember, this is not the folder for keys but for certificates. I recommend reading this link on SSL permissions.
Once I did
chmod 755 certs
The problem was fixed, at least for me anyway.
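To check this from PHP's point of view (an illustrative sketch), you can test whether the PHP process can actually read the default locations:
<?php
// Illustrative: verify that the PHP process can read the default cert file/dir.
$loc = openssl_get_cert_locations();
var_dump(is_readable($loc['default_cert_file'])); // should be bool(true)
var_dump(is_readable($loc['default_cert_dir']));  // should be bool(true)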
Fix for macOS 12.4 / MAMP 6.6 / Homebrew 3.5.2 / openssl@3
Terminal
Check version
openssl version -a
Mine was pointing to:
...
OPENSSLDIR: "/opt/homebrew/etc/openssl@3"
...
So I looked through Homebrew's directory /opt/homebrew/etc/openssl@3, found the cert.pem, and made sure the php.ini for MAMP's current PHP version pointed to that openssl version's cert.pem.
Add to php.ini:
openssl.cafile=/opt/homebrew/etc/openssl@3/cert.pem
I had the same issue for another secure page when using wget or file_get_contents. A lot of research (including some of the responses on this question) led to a simple solution: installing curl and PHP-curl. If I've understood correctly, curl ships the root CA for Comodo, which resolved the issue.
Install curl and the PHP-curl add-on, then restart Apache:
sudo apt-get install curl
sudo apt-get install php-curl
sudo /etc/init.d/apache2 reload
All now working.
For me, I was running XAMPP on a Windows 10 machine (localhost) and recently upgraded to PHP 8. I was trying to open a localhost HTTPS link via file_get_contents().
In my php.ini file, there was a line that read:
openssl.cafile="C:\Users\[USER]\xampp\apache\bin\curl-ca-bundle.crt"
This was the certificate bundle being used to validate "outside" URLs, and was a package from Mozilla as some people have discussed. I don't know if XAMPP came that way or if I set it up in the past.
At some point I had set up HTTPS on my localhost, resulting in another certificate bundle. This bundle needed to be used to validate "localhost" URLs. To remind myself where that bundle was, I opened httpd-ssl.conf and found the line that read:
SSLCertificateFile "conf/ssl.crt/server.crt"
(The complete path was C:\Users\[USER]\xampp\apache\conf\ssl.crt\server.crt)
To make both localhost and outside URLs work simultaneously, I copied the contents of my localhost "server.crt" file into Mozilla's bundle "curl-ca-bundle.crt".
.
.
.
m7I1HrrW9zzRHM76JTymGoEVW/MSD2zuZYrJh6j5B+BimoxcSg==
-----END CERTIFICATE-----
Localhost--I manually added this
================================
-----BEGIN CERTIFICATE-----
MIIDGDCCAgCgAwIBAgIQIH+mTLNOSKlD8KMZwr5P3TANBgkqhkiG9w0BAQsFADAU
...
At that point I could use file_get_contents() with both localhost URLs and outside URLs with no additional configuration.
file_get_contents("https://localhost/...");
file_get_contents("https://google.com");
// Captures TLS session metadata via the ssl "capture_session_meta" context
// option (deprecated as of PHP 5.6) while fetching a geolocation response.
$csm = stream_context_create(['ssl' => ['capture_session_meta' => true]]);
$sourceCountry = file_get_contents("https://api.wipmania.com/{$ip}?website.com", false, $csm);
echo $sourceCountry;
We're using a curl HEAD request in a PHP application to verify the validity of generic links. We check the status code just to make sure that the link the user has entered is valid. Links to all websites have succeeded, except LinkedIn.
While it seems to work locally (on a Mac), when we attempt the request from any of our Ubuntu servers, LinkedIn returns a 999 status code. It's not an API request, just a simple curl like we do for every other link. We've tried on a few different machines and tried altering the user agent, but no dice. How do I modify our curl so that working links return a 200?
A sample HEAD request:
curl -I --url https://www.linkedin.com/company/linkedin
Sample Response on Ubuntu machine:
HTTP/1.1 999 Request denied
Date: Tue, 18 Nov 2014 23:20:48 GMT
Server: ATS
X-Li-Pop: prod-lva1
Content-Length: 956
Content-Type: text/html
To respond to #alexandru-guzinschi a little better: we've tried masking the user agents. To sum up our trials:
Mac machine + Mac UA => works
Mac machine + Windows UA => works
Ubuntu remote machine + (no UA change) => fails
Ubuntu remote machine + Mac UA => fails
Ubuntu remote machine + Windows UA => fails
Ubuntu local virtual machine (on Mac) + (no UA change) => fails
Ubuntu local virtual machine (on Mac) + Windows UA => works
Ubuntu local virtual machine (on Mac) + Mac UA => works
So now I'm thinking they block any curl requests that don't provide an alternate UA, and also block hosting providers?
Is there any other way I can check whether a link to LinkedIn is valid, or whether it will lead to their 404 page, from an Ubuntu machine using PHP?
It looks like they filter requests based on the user-agent:
$ curl -I --url https://www.linkedin.com/company/linkedin | grep HTTP
HTTP/1.1 999 Request denied
$ curl -A "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3" -I --url https://www.linkedin.com/company/linkedin | grep HTTP
HTTP/1.1 200 OK
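For the PHP side of the link checker, a sketch along these lines sends the same HEAD request with a spoofed User-Agent (the UA string is illustrative; as noted below, LinkedIn may still block by IP):
<?php
// Sketch: HEAD request with a browser User-Agent; returns the HTTP status code.
function check_link_status($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_USERAGENT,
        'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'); // illustrative UA
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status;
}

echo check_link_status('https://www.linkedin.com/company/linkedin'); // 200 if not blocked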
I found a workaround; it is important to set the accept-encoding header:
curl --url "https://www.linkedin.com/in/izman" \
--header "user-agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.94 Safari/537.36" \
--header "accept:text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8" \
--header "accept-encoding:gzip, deflate, sdch, br" \
| gunzip
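In PHP, curl can be asked to handle the compressed response itself: setting CURLOPT_ENCODING to an empty string advertises all encodings curl supports and decompresses automatically. The headers below mirror the curl command above; treat this as a sketch:
<?php
// Sketch: same request from PHP, letting curl negotiate and decode gzip.
$ch = curl_init('https://www.linkedin.com/in/izman');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_ENCODING, ''); // '' = accept every encoding curl supports, auto-decompress
curl_setopt($ch, CURLOPT_USERAGENT,
    'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.94 Safari/537.36');
curl_setopt($ch, CURLOPT_HTTPHEADER, [
    'accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
]);
$html = curl_exec($ch);
curl_close($ch);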
It seems like LinkedIn filters on both user agent AND IP address. I tried this both at home and from a DigitalOcean node:
curl -A "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3" -I --url https://www.linkedin.com/company/linkedin
From home I got a 200 OK, from DO I got 999 Denied...
So you need a proxy service like HideMyAss or another (I haven't tested them, so I can't say which are valid). Here is a good comparison of proxy services.
Or you could set up a proxy on your home network, for example using a Raspberry Pi to proxy your requests. Here is a guide on that.
A proxy would work, but I think there's another way around it. I can see that requests from AWS and other clouds are blocked by IP, while I can issue the request from my own machine and it works just fine.
I did notice that the response to the cloud service includes some JavaScript that the browser has to execute to take you to a login page. Once there, you can log in and access the page. The login page only appears for those accessing via a blocked IP.
If you use a headless client that executes JavaScript, or perhaps go straight to the subsequent link and provide the credentials of a LinkedIn user, you may be able to bypass it.
I have a Red Hat linux box with apache running several domains, including a.com and b.com.
I have a PHP script a.com/wget.php, which makes an exec() call to wget to download a file from the local domain b.com. Running the PHP script from the command line is successful.
But running this script from a web page results in a 404 error. The command is:
/usr/bin/wget -k -S --save-headers --keep-session-cookies
-O <local-file-name> -o <local-log-file-name> -U \"Mozilla/5.0
(Macintosh; Intel Mac OS X 10.8; rv:24.0) Gecko/20100101
Firefox/24.0\" --max-redirect=100 "http://b.com/page.php"
No log messages are written to the Apache access log file for domain b.com for this call.
BUT the server access log file (/var/log/httpd/access_log) is NOT empty; it shows that an attempt was made to open the page "/page.php" on the server (the entry in the access log has no domain).
xx.xx.xx.xx - - [19/May/2014:12:02:49 +0100] "GET /page.php
HTTP/1.0" 404 285 "-" "Mozilla/5.0 (Macintosh;
Intel Mac OS X 10.8; rv:24.0) Gecko/20100101 Firefox/24.0"
Server error log (/var/log/httpd/error_log) gives this error:
[Mon May 19 12:02:49 2014] [error] [client xx.xx.xx.xx]
File does not exist: /var/www/vhosts/default/htdocs
So it would seem that something is stripping the domain name from "http://b.com/page.php", and the resulting URL that wget is trying to connect to is "/page.php". This will not work, given that the server hosts many domains.
Has anyone come across this? Is there some setting in wget, PHP, or Apache that would cause this? I've tried different things based on suggestions for similar problems, but nothing has worked so far.
Thanks.
The problem turned out to be not in wget, but in firewall settings. The wget call, executed from behind the firewall, was resolving the domain to an external IP address, and connections to the external IP address were failing. Correcting this in the firewall fixed the wget problem.
For some reason, this doesn't work:
$.ajax({
url: "News.html",
cache: false,
}).done(function(data) {
$("#content").load(data);
});
It gives me:
GET http://127.0.0.1/News.html 404 (Not Found)
But for whatever reason, opening that URL manually (copy-pasting it) works just fine.
I thought it had something to do with the browser cache at first, so I added the cache: false option to the ajax call, but even then... argh.
It also does not show up as a requested URL in my access.log file.
For information, I'm running:
lighttpd
php as fast-cgi via localhost:port
mapped .html => .php
Running OpenBSD 5.3
and uncommented (in /etc/php.ini):
cgi.fix_pathinfo=1
Also:
# ls *.html
News.html index.html
And here's the request headers for News.html:
Request URL:http://127.0.0.1/News.html
Request Method:GET
Status Code:404 Not Found
Request Headers
Accept:*/*
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:127.0.0.1
Referer:http://127.0.0.1/index.php
User-Agent:Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.71 Safari/537.36
X-Requested-With:XMLHttpRequest
Response Headers
Content-type:text/html
Date:Tue, 16 Jul 2013 21:55:05 GMT
Server:lighttpd/1.4.32
Transfer-Encoding:chunked
X-Powered-By:PHP/5.3.21
Checkpoint
The conclusion from the comments so far is that this might not be a jQuery issue at all.
The server responds with all the data (I've checked the raw data sent) and it contains everything, but the response header says 404.
Meaning the data is found but the header says 404; it's odd, to say the least.
curl test
curl 'http://127.0.0.1/News.html' -H 'Accept-Encoding: gzip,deflate,sdch' -H 'Host: 127.0.0.1' -H 'Accept-Language: en-US,en;q=0.8' -H 'User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.71 Safari/537.36' -H 'Accept: */*' -H 'Referer: http://127.0.0.1/' -H 'X-Requested-With: XMLHttpRequest' -H 'Connection: keep-alive' -H 'Cache-Control: max-age=0' --compressed
Here you'll soon find a facebook feed, among other things :)
Zerkms test
# echo "wham bam" > zerkms_doesnt_believe.html
#
Config files
lighttpd.conf
php-5.3.ini
Error logs and what not
lighttpd-error.log
cURL test
Manual FastCGI test via a Python client:
# python fcgi_app.py
{'FCGI_MAX_CONNS': '1', 'FCGI_MPXS_CONNS': '0', 'FCGI_MAX_REQS': '1'}
After some tinkering, I figured out how the FastCGI protocol works, and I found a client that matched my needs. Funny enough, it matched the name of my script, so here's the output:
# python fcgi_app.py
('404 Not Found', [('x-powered-by', 'PHP/5.3.21'), ('content-type', 'text/html')], '<html>\n\t<head>\n\t\t<title>test php</title>\n\t</head>\n<body>\nChecking</body>\n</html>', '')
And here's the source.
This gives me the conclusion that it is in fact a PHP issue (even though I've hated on lighttpd for not honoring the 200 code PHP should respond with, and for that I'm sorry; I should go bash a little on PHP instead and see if that helps me come to a conclusion).
Temporary Solution
Placing the following at the top of your .php page will work around this issue.
Note that this is a workaround: it will work, but it's not a long-term fix.
<?php
header("HTTP/1.0 200 Found");
?>
This smells a bit like a same-origin policy issue.
The path you are specifying may be causing the issue.
Try
$.ajax({
url: "/News.html",
cache: false,
}).done(function(data) {
$("#content").load(data);
});
And let me (us) know if that helps.
This one had me stymied for a bit. Feeling some compulsive urges, I installed lighttpd and php5 on a fresh Ubuntu 12.10 VM (I didn't have a BSD one handy). I had to switch the event handler from kqueue to poll, but other than that I used your lighttpd.conf. And everything worked fine.
So then I installed your php.ini file, and BAM, HTTP status 404 while returning the proper content. That narrowed it down to php-cgi.
It turns out that when the service started, it would log:
PHP Warning: PHP Startup: Unable to load dynamic library '/usr/local/lib/php-5.3/modules/pdo.so' - /usr/local/lib/php-5.3/modules/pdo.so: cannot open shared object file: No such file or directory in Unknown on line 0
So I did a quick search and changed one line in the php.ini from
extension_dir = "/usr/local/lib/php-5.3/modules"
to
extension_dir = "/usr/lib/php5/20100525"
restarted php-cgi, and voila, status 200 to go along with the content.
After setting up a fresh OpenBSD 5.3 server and installing your config files, I was able to narrow down the root cause.
In lighttpd.conf you have server.chroot = "/var/www/", so all of lighttpd's path names omit the leading /var/www. The php-fastcgi process is not chrooted, so it has a slightly different view of the file system.
Solution #1:
Don't chroot lighttpd; instead change server.document-root, accesslog.filename, and server.errorlog to absolute paths.
Solution #2:
Use php-fpm or similar to make PHP chroot-aware.
Use the simple jQuery .load() method, which fetches the URL and inserts the returned HTML in one step (note that the original code's .load(data) treats the already-fetched HTML as a URL; $("#content").html(data) would be the fix there):
$(document).ready(function () {
$("#content").load('News.html');
});