I had a problem with PHPMailer suddenly reporting that my certificate had expired and refusing to connect properly to port 587 with TLS encryption, starting Oct 1 2021.
Changing the SSL flags to disable verify_peer and verify_peer_name is a temporary fix for the email issue.
$mail->SMTPOptions = array(
    'ssl' => array(
        'verify_peer' => false,
        'verify_peer_name' => false,
        'allow_self_signed' => true
    )
);
But it's not an ideal solution.
If I go to the same server over the web in a browser, there is nothing wrong with the certificate.
If I connect with the OpenSSL command line, it says the certificate expired on Sep 30 2021.
The same problem also appears with the PHP function file_get_contents().
NOTE: This issue is PHPMailer and email specific and provides good information about PHPMailer, so it should not be closed. It has nothing to do with Docker or the other question it's associated with, other than the cause and fix being similar.
The issue here is a real expired authority certificate embedded in the Let's Encrypt chain, which really DID expire on Sep 30 2021.
From the OpenSSL blog ...
The currently recommended certificate chain as presented to Let’s Encrypt ACME clients when new certificates are issued contains an intermediate certificate (ISRG Root X1) that is signed by an old DST Root CA X3 certificate that expires on 2021-09-30. In some cases the OpenSSL 1.0.2 version will regard the certificates issued by the Let’s Encrypt CA as having an expired trust chain.
Read more here ...
https://www.openssl.org/blog/blog/2021/09/13/LetsEncryptRootCertExpire/
It mainly affects OpenSSL 1.0.2. On my Mac with OpenSSL 1.1.1 I did not have the issue.
CentOS, and I'm sure other distributions, have provided fixes for this issue ...
Backup
cp -i /etc/pki/tls/certs/ca-bundle.crt ~/ca-bundle.crt-backup
Add certificate to blacklist directory
trust dump --filter "pkcs11:id=%c4%a7%b1%a4%7b%2c%71%fa%db%e1%4b%90%75%ff%c4%15%60%85%89%10" | openssl x509 | sudo tee /etc/pki/ca-trust/source/blacklist/DST-Root-CA-X3.pem
Update root store
sudo update-ca-trust extract
Verify removal
diff ~/ca-bundle.crt-backup /etc/pki/tls/certs/ca-bundle.crt
The CentOS-specific steps above are from this post ...
https://blog.devgenius.io/rhel-centos-7-fix-for-lets-encrypt-change-8af2de587fe4#:~:text=So%2C%20DST%20Root%20CA%20X3%20needs%20to%20be,The%20manual%20steps%20below%20are%20no%20longer%20necessary.
This is quite a crazy issue that appeared out of nowhere (unless you follow the OpenSSL blog).
It took me approximately a day to track down, all the while no emails were being sent and large pieces of the web site were not appearing.
Hope this points people in the right direction.
UPDATE: As pointed out by @hakre, you may be able to get away with just ...
yum upgrade ca-certificates
If you are using a different OS from the one in the accepted answer: simply edit the fullchain.pem file and remove the last certificate.
I know this has been asked on SO before but I think my situation is a little bit different:
When I'm trying to use curl inside PHP I receive the following error when trying to interact with Apple's push notification service (https://api.push.apple.com/3/device/)
Curl failed: NSS: client certificate not found (nickname not specified)
This is due to the fact that on CentOS, PHP is built against a curl that uses NSS instead of OpenSSL.
What I tried so far:
Recompiling curl (worked! The binary is able to perform the call, but PHP is not)
Recompiling PHP (didn't work, as it requires curl-devel to be installed, which might link to NSS again)
So my next approach is to fix this NSS problem, but it turns out NSS is a very bad piece of software: even a simple rename of an imported Let's Encrypt certificate doesn't work.
Could someone please explain to me how I could fix this? I already tried importing a Let's Encrypt certificate into the NSS database stored in /etc/pki/nssdb; that worked, but unfortunately the certificate is not recognized in PHP, even if I provide its nickname in CURLOPT_SSLCERT => 'nickname'.
Maybe this is because it has special characters inside its nickname, which I cannot change as NSS fails to rename it (lol).
When I directly try to provide certificates in php using
CURLOPT_SSLCERT => $certFile,
CURLOPT_SSLKEY => $keyFile,
CURLOPT_CAINFO => $caCertFile
I get:
Curl failed: Peer's Certificate issuer is not recognized.
I also turned off peer verification with
CURLOPT_SSL_VERIFYPEER => FALSE
ending in
Curl failed: security library failure
Is there anybody out there who could teach me how to fix this, or how to build PHP on CentOS with its built-in curl using OpenSSL?
Finally I got this working, here is what I did:
Recompiled curl with openssl and put the libcurl.so.4 in a new folder /home/mylibs/
Copied all libs from /usr/lib to /home/mylibs/ while not replacing my libcurl.so.4
Located the system's php-cgi binary, renamed it to php-cgi-real
Created a new file php-cgi with the following contents:
#! /bin/bash
export LD_PRELOAD=/home/mylibs/libcurl.so.4
exec php-cgi-real "$@"
Restarted the service
Done!
Since last night, several of my scripts (on different servers) using file_get_contents("https://...") and curl functions stopped working.
Example request that fails:
file_get_contents("https://domain.tld/script.php");
Error:
PHP Warning: file_get_contents(): SSL operation failed with code 1. OpenSSL Error messages:
error:1416F086:SSL routines:tls_process_server_certificate:certificate verify failed in /home/domain/public_html/script.php on line 19
I already "fixed" the problem using:
$arrContextOptions=array(
"ssl"=>array(
"verify_peer"=>false,
"verify_peer_name"=>false,
),
);
file_get_contents("https://domain.tld/path/script.php", false, stream_context_create($arrContextOptions));
The "fix" is far from ideal since I'm not verifying the authenticity of the connection, but until I understand the origin of the problem and how to prevent it from happening again, I'll be forced to use it.
Notes:
PHP scripts with Curl also stopped working and the fix is similar:
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0); curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
The SSL certificate is issued by Let's Encrypt and it was renewed last night ("not valid before 2020/12/24");
All servers have the same timezone;
I'm using CentOS 7/Ubuntu 18 and Virtualmin;
If I open "https://domain.tld/script.php" on Firefox/Chrome, no SSL warnings are shown and the certificate is valid;
I've tried to update the CA certificates (yum install ca-certificates.noarch), but the latest version is already installed;
I understand what's wrong, what I cannot figure out is why it started happening and how to fix it (the real fix).
Question:
How to fix and prevent it from happening again?
The problem was an outdated CA certificate and I found the solution on a Let's Encrypt community thread:
Manual Solution:
Replace the contents of /home/[domain]/ssl.ca with lets-encrypt-r3-cross-signed.pem
Restart Apache/Nginx
Virtualmin Solution:
Go to Virtualmin -> Server Configuration -> SSL Certificate -> CA Certificate
Option 1: Choose upload file and use lets-encrypt-r3-cross-signed.pem
Option 2: Paste the contents of lets-encrypt-r3-cross-signed.pem using the Pasted certificate text option.
Press "Save Certificate"
Note:
This issue was fixed in Webmin 1.970, so make sure you have the latest version installed. That wasn't my case because the Webmin repo was not enabled; if that's also your case, just enable or add the Webmin repo and run yum update.
I’ve been trying to access this particular REST service from a PHP page I’ve created on our server. I narrowed the problem down to these two lines. So my PHP page looks like this:
<?php
$response = file_get_contents("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json");
echo $response; ?>
The page dies on line 2 with the following errors:
Warning: file_get_contents(): SSL operation failed with code 1.
OpenSSL Error messages: error:14090086:SSL
routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed in
...php on line 2
Warning: file_get_contents(): Failed to enable crypto in ...php on
line 2
Warning:
file_get_contents(https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json):
failed to open stream: operation failed in ...php on line 2
We’re using a Gentoo server. We recently upgraded to PHP version 5.6. It was after the upgrade when this problem appeared.
I found that when I replace the REST service with an address like https://www.google.com, my page works just fine.
In an earlier attempt I set "verify_peer"=>false and passed that in as an argument to file_get_contents, as described here: file_get_contents ignoring verify_peer=>false? But as the writer noted, it made no difference.
I’ve asked one of our server administrators if these lines in our php.ini file exist:
extension=php_openssl.dll
allow_url_fopen = On
He told me that since we're on Gentoo, OpenSSL is compiled when we build, so it's not set in the php.ini file.
I also confirmed that allow_url_fopen is working. Due to the specialized nature of this problem, I'm not finding a lot of information for help. Have any of you come across something like this? Thanks.
This was an enormously helpful link to find:
http://php.net/manual/en/migration56.openssl.php
An official document describing the changes made to OpenSSL in PHP 5.6.
From here I learned of one more parameter I should have set to false: "verify_peer_name"=>false
Note: This has very significant security implications. Disabling verification potentially permits a MITM attacker to use an invalid certificate to eavesdrop on the requests. While it may be useful to do this in local development, other approaches should be used in production.
So my working code looks like this:
<?php
$arrContextOptions=array(
"ssl"=>array(
"verify_peer"=>false,
"verify_peer_name"=>false,
),
);
$response = file_get_contents("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json", false, stream_context_create($arrContextOptions));
echo $response; ?>
You shouldn't just turn off verification. Rather, you should download a certificate bundle; perhaps the curl bundle will do?
Then you just need to put it on your web server, giving the user that runs php permission to read the file. Then this code should work for you:
$arrContextOptions= [
'ssl' => [
'cafile' => '/path/to/bundle/cacert.pem',
'verify_peer'=> true,
'verify_peer_name'=> true,
],
];
$response = file_get_contents(
'https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json',
false,
stream_context_create($arrContextOptions)
);
Hopefully, the root certificate of the site you are trying to access is in the curl bundle. If it isn't, this still won't work until you get the root certificate of the site and put it into your certificate file.
I fixed this by making sure that OpenSSL was installed on my machine and then adding this to my php.ini:
openssl.cafile=/usr/local/etc/openssl/cert.pem
You can get around this problem by writing a custom function that uses curl, as in:
function file_get_contents_curl( $url ) {
$ch = curl_init();
curl_setopt( $ch, CURLOPT_AUTOREFERER, TRUE );
curl_setopt( $ch, CURLOPT_HEADER, 0 );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 1 );
curl_setopt( $ch, CURLOPT_URL, $url );
curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, TRUE );
$data = curl_exec( $ch );
curl_close( $ch );
return $data;
}
Then just use file_get_contents_curl instead of file_get_contents whenever you're calling a url that begins with https.
This works for me. I am using PHP 5.6; the openssl extension should be enabled, and when calling the Google Maps API, set verify_peer to false.
The code below works for me.
<?php
$arrContextOptions=array(
"ssl"=>array(
"verify_peer"=>false,
"verify_peer_name"=>false,
),
);
$url = "https://maps.googleapis.com/maps/api/geocode/json?latlng="
. $latitude
. ","
. $longitude
. "&sensor=false&key="
. Yii::$app->params['GOOGLE_API_KEY'];
$data = file_get_contents($url, false, stream_context_create($arrContextOptions));
echo $data;
?>
After falling victim to this problem on CentOS after updating PHP to 5.6, I found a solution that worked for me.
Get the default certificate file location with this:
php -r "print_r(openssl_get_cert_locations()['default_cert_file']);"
Then use this to download the CA bundle and put it in the default location found by the code above:
wget http://curl.haxx.se/ca/cacert.pem -O <default location>
First, you need to have the curl extension enabled in PHP. Then you can use this function:
function file_get_contents_ssl($url) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_REFERER, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3); // 3 sec.
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // 10 sec.
$result = curl_exec($ch);
curl_close($ch);
return $result;
}
It works similarly to file_get_contents().
Example:
echo file_get_contents_ssl("https://www.example.com/");
Output:
<!doctype html>
<html>
<head>
<title>Example Domain</title>
...
You basically have to set the environment variable SSL_CERT_FILE to the path of the CA bundle PEM file downloaded from the following link: http://curl.haxx.se/ca/cacert.pem.
It took me a lot of time to figure this out.
If your PHP version is 5, try installing cURL by typing the following command in the terminal:
sudo apt-get install php5-curl
Following the steps below will fix this issue:
Download the CA Certificate from this link: https://curl.haxx.se/ca/cacert.pem
Find and open php.ini
Look for curl.cainfo and set it to the absolute path where you have downloaded the certificate, e.g. curl.cainfo = "C:\wamp\htdocs\cert\cacert.pem"
Restart WAMP/XAMPP (apache server).
It works!
Hope that helps!
Just wanted to add to this since I ran into the same problem and nothing I could find anywhere would work (e.g. downloading the cacert.pem file, setting cafile in php.ini, etc.).
If you are using NGINX and your SSL certificate comes with an "intermediate certificate", you need to combine the intermediate cert file with your main "mydomain.com.crt" file, and it should work. Apache has a setting specifically for intermediate certs, but NGINX does not, so the intermediate must be in the same file as your regular cert.
The reason for this error is that PHP does not have a list of trusted certificate authorities.
PHP 5.6 and later try to load the CAs trusted by the system automatically. Issues with that can be fixed. See http://php.net/manual/en/migration56.openssl.php for more information.
PHP 5.5 and earlier are really hard to set up correctly, since you have to manually specify the CA bundle in each request context, which is not something you want to sprinkle around your code.
So I decided for my code that for PHP versions < 5.6, SSL verification simply gets disabled:
$req = new HTTP_Request2($url);
if (version_compare(PHP_VERSION, '5.6.0', '<')) {
//correct ssl validation on php 5.5 is a pain, so disable
$req->setConfig('ssl_verify_host', false);
$req->setConfig('ssl_verify_peer', false);
}
Had the same error with PHP 7 on XAMPP and OSX.
The above-mentioned answer in https://stackoverflow.com/ is good, but it did not completely solve the problem for me. I had to provide the complete certificate chain to make file_get_contents() work again. Here's how I did it:
Get root / intermediate certificate
First of all, I had to figure out which are the root and the intermediate certificates.
The most convenient way is maybe an online cert tool like SSL Shopper's.
There I found three certificates, one server certificate and two chain certificates (one is the root, the other one apparently the intermediate).
All I needed to do was search the internet for both of them. In my case, this is the root:
thawte DV SSL SHA256 CA
And it leads to its URL, thawte.com. So I just put this cert into a text file and did the same for the intermediate. Done.
Get the host certificate
The next thing I had to do was download my server cert. On Linux or OS X it can be done with openssl:
openssl s_client -showcerts -connect whatsyoururl.de:443 </dev/null 2>/dev/null|openssl x509 -outform PEM > /tmp/whatsyoururl.de.cert
Now bring them all together
Now just merge all of them into one file. (Maybe it's good to just put them into one folder, I just merged them into one file). You can do it like this:
cat /tmp/thawteRoot.crt > /tmp/chain.crt
cat /tmp/thawteIntermediate.crt >> /tmp/chain.crt
cat /tmp/whatsyoururl.de.cert >> /tmp/chain.crt
Tell PHP where to find the chain
There is this handy function openssl_get_cert_locations() that will tell you where PHP is looking for cert files. And there is the cafile context parameter that will tell file_get_contents() where to look for cert files. Maybe both ways will work; I preferred the parameter way (compared to the solution mentioned above).
So this is now my PHP-Code
$arrContextOptions=array(
"ssl"=>array(
"cafile" => "/Applications/XAMPP/xamppfiles/share/openssl/certs/chain.pem",
"verify_peer"=> true,
"verify_peer_name"=> true,
),
);
$response = file_get_contents($myHttpsURL, 0, stream_context_create($arrContextOptions));
That's all. file_get_contents() is working again. Without CURL and hopefully without security flaws.
<?php
$stream_context = stream_context_create([
"ssl" => [
"verify_peer" => false,
"verify_peer_name" => false
]
]);
$response = file_get_contents("https://maps.co.weber.ut.us/arcgis/rest/services/SDE_composite_locator/GeocodeServer/findAddressCandidates?Street=&SingleLine=3042+N+1050+W&outFields=*&outSR=102100&searchExtent=&f=json", false, $stream_context);
echo $response;
?>
Just tested on PHP 7.2; it's working well.
EDIT: Also tested and working on PHP 7.1
I had the same SSL problem on my developer machine (PHP 7, XAMPP on Windows) with a self-signed certificate while trying to fopen an "https://localhost/..." file. Obviously the root certificate assembly (cacert.pem) didn't work.
I manually copied the contents of the Apache server.crt file into the downloaded cacert.pem and added the openssl.cafile=path/to/cacert.pem entry in php.ini.
Another thing to try is to re-install ca-certificates as detailed here.
# yum reinstall ca-certificates
...
# update-ca-trust force-enable
# update-ca-trust extract
And another thing to try is to explicitly allow the one site's certificate in question as described here (especially if the one site is your own server and you already have the .pem in reach).
# cp /your/site.pem /etc/pki/ca-trust/source/anchors/
# update-ca-trust extract
I ran into this exact error after upgrading to PHP 5.6 on CentOS 6, trying to access the server itself, which had a cheapsslsecurity certificate that maybe needed to be updated. Instead I installed a Let's Encrypt certificate, and with the two steps above it did the trick. I don't know why the second step was necessary.
Useful Commands
View openssl version:
# openssl version
OpenSSL 1.0.1e-fips 11 Feb 2013
View PHP cli ssl current settings:
# php -i | grep ssl
openssl
Openssl default config => /etc/pki/tls/openssl.cnf
openssl.cafile => no value => no value
openssl.capath => no value => no value
Regarding errors similar to
[11-May-2017 19:19:13 America/Chicago] PHP Warning: file_get_contents(): SSL operation failed with code 1. OpenSSL Error messages:
error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed
Have you checked the permissions of the cert and directories referenced by openssl?
You can do this
var_dump(openssl_get_cert_locations());
To get something similar to this
array(8) {
["default_cert_file"]=>
string(21) "/usr/lib/ssl/cert.pem"
["default_cert_file_env"]=>
string(13) "SSL_CERT_FILE"
["default_cert_dir"]=>
string(18) "/usr/lib/ssl/certs"
["default_cert_dir_env"]=>
string(12) "SSL_CERT_DIR"
["default_private_dir"]=>
string(20) "/usr/lib/ssl/private"
["default_default_cert_area"]=>
string(12) "/usr/lib/ssl"
["ini_cafile"]=>
string(0) ""
["ini_capath"]=>
string(0) ""
}
This issue frustrated me for a while, until I realized that my "certs" folder had 700 permissions, when it should have had 755 permissions. Remember, this is not the folder for keys but certificates. I recommend reading this link on SSL permissions.
Once I did
chmod 755 certs
The problem was fixed, at least for me anyway.
Fix for macOS 12.4 / MAMP 6.6 / Homebrew 3.5.2 / openssl@3
Terminal
Check version
openssl version -a
Mine was pointing to:
...
OPENSSLDIR: "/opt/homebrew/etc/openssl@3"
...
So I looked through Homebrew's dir /opt/homebrew/etc/openssl@3, found the cert.pem, and made sure the php.ini file for MAMP's current PHP version was pointing to the cert.pem of Homebrew's correct OpenSSL version.
Add to php.ini:
openssl.cafile=/opt/homebrew/etc/openssl@3/cert.pem
I had the same issue for another secure page when using wget or file_get_contents. A lot of research (including some of the responses to this question) resulted in a simple solution: installing curl and php-curl. If I've understood correctly, curl has the root CA for Comodo, which resolved the issue.
Install curl and the PHP curl add-on, then restart Apache:
sudo apt-get install curl
sudo apt-get install php-curl
sudo /etc/init.d/apache2 reload
All now working.
For me, I was running XAMPP on a Windows 10 machine (localhost) and recently upgraded to PHP 8. I was trying to open a localhost HTTPS link via file_get_contents().
In my php.ini file, there was a line that read:
openssl.cafile="C:\Users\[USER]\xampp\apache\bin\curl-ca-bundle.crt"
This was the certificate bundle being used to validate "outside" URLs, and was a package from Mozilla as some people have discussed. I don't know if XAMPP came that way or if I set it up in the past.
At some point I had set up HTTPS on my localhost, resulting in another certificate bundle. This bundle needed to be used to validate "localhost" URLs. To remind myself where that bundle was, I opened httpd-ssl.conf and found the line that read:
SSLCertificateFile "conf/ssl.crt/server.crt"
(The complete path was C:\Users\[USER]\xampp\apache\conf\ssl.crt\server.crt)
To make both localhost and outside URLs work simultaneously, I copied the contents of my localhost "server.crt" file into Mozilla's bundle "curl-ca-bundle.crt".
.
.
.
m7I1HrrW9zzRHM76JTymGoEVW/MSD2zuZYrJh6j5B+BimoxcSg==
-----END CERTIFICATE-----
Localhost--I manually added this
================================
-----BEGIN CERTIFICATE-----
MIIDGDCCAgCgAwIBAgIQIH+mTLNOSKlD8KMZwr5P3TANBgkqhkiG9w0BAQsFADAU
...
At that point I could use file_get_contents() with both localhost URLs and outside URLs with no additional configuration.
file_get_contents("https://localhost/...");
file_get_contents("https://google.com");
$csm = stream_context_create(['ssl' => ['capture_session_meta' => TRUE]]);
$sourceCountry = file_get_contents("https://api.wipmania.com/{$ip}?website.com", FALSE, $csm);
echo $sourceCountry;
I'm getting:
Warning: ldap_start_tls() [function.ldap-start-tls]: Unable to start TLS: Connect error in /var/www/X.php on line Y
/etc/ldap/ldap.conf:
TLS_CACERT /etc/ssl/certs/ca.crt
ca.crt is the CA which signed the LDAP server certificate. The certificate on the LDAP server is expired and I can't change it.
You can ignore the certificate validity on Windows by issuing
putenv('LDAPTLS_REQCERT=never');
in your PHP code. On *nix you need to edit your /etc/ldap.conf to contain
TLS_REQCERT never
Another thing to be aware of is that it requires LDAP protocol version 3 (version 2 is the PHP default):
//$hostnameSSL example would be "ldaps://just.example.com:636" , just make sure it has ldaps://
$con = ldap_connect($hostnameSSL);
ldap_set_option($con, LDAP_OPT_PROTOCOL_VERSION, 3);
To get a better idea of what's going on, you can enable debug logging by:
ldap_set_option(NULL, LDAP_OPT_DEBUG_LEVEL, 7);
This can be done before the ldap_connect takes place.
The specific scenario presented in the question--with an expired certificate that can't be changed--does appear to require disabling certificate validation on the LDAP client.
However, I suspect a lot of people, like me, reach this page for other root causes of receiving opaque LDAP TLS errors, where disabling validation of TLS certificates is not an appropriate answer.
In my case--using the LDAP Authentication extension for Mediawiki on an Ubuntu 18.04 LTS server, and authenticating against Active Directory on a Windows Server 2012 server--authentication stopped working in January/February 2020. The server certificate and the CA certificate were still both valid, and openssl s_client -verify 2 -connect <AD server>:636 from the Mediawiki server passed just fine.
Eventually I noticed that the signature algorithm in the SSL certificate served by AD/LDAP was SHA1, which I remembered recently suffered from the first known chosen-prefix collision exploit. This led me to investigate the changelog for packages that had recently been updated on the system, which turned up "Mark SHA1 as insecure for certificate signing" in the gnutls28 changelog circa January 8th, 2020. (The chain of dependencies from the php-ldap package in Ubuntu 18.04 goes to php7.2-ldap -> libldap-2.4-2 -> libgnutls30, whose source package is gnutls28.)
I followed some instructions to update the Windows CA to use SHA256 and then selectively followed instructions to renew the AD/LDAP cert, installed the new CA cert on my Mediawiki server, and the problem was solved! Briefly, these steps included:
In an Admin PowerShell on the AD server, run certutil -setreg ca\csp\CNGHashAlgorithm SHA256
In the Certification Authority MMC, right click on the CA -> All Tasks -> Renew CA Certificate
In a blank MMC, add snap-in for Certificates; select Local Computer
Under Personal -> Certificates, find the current entry used by LDAPS (Kerberos Authentication template type) -> All Tasks -> Advanced Options -> Renew This Certificate with the Same Key
In the same window, open the new CA certificate -> Details -> Copy to file -> no private key -> base64-encoded X.509
Copy the resulting file to /usr/share/ca-certificates/ on the Mediawiki server, then run sudo dpkg-reconfigure ca-certificates and select the new CA cert for inclusion.
P.S. For SEO purposes, depending on the mode I was using, error messages included:
ldap_start_tls(): Unable to start TLS: Connect error in /var/www/mediawiki/extensions/LdapAuthentication/LdapAuthenticationPlugin.php in the HTTP error log
ldap_start_tls(): Unable to start TLS: Can't contact LDAP server in [...]
Failed to start TLS. in the Mediawiki debug log (when using wgLDAPEncryptionType = ssl, i.e. encrypted LDAP port, 636)
Failed to bind as CN=foobar,CN=Users,DC=myOrgName,DC=local in the Mediawiki debug log (when using wgLDAPEncryptionType = tls, i.e. STARTTLS on the unencrypted LDAP port, 389)
My solution/workaround is to use
/etc/ldap/ldap.conf:
#TLS_CACERT /etc/ssl/certs/ca.crt
TLS_REQCERT never
If you have any better idea, please post another answer.
The path for ldap.conf in Windows is fixed:
c:\openldap\sysconf\ldap.conf
A restart of the web server may be required to apply changes.
In debian based systems:
Install the package: ldap-utils and in the file
/etc/ldap/ldap.conf, edit the line:
TLS_CACERT /etc/ldap/cacerts/cacert.asc
Create the directory /etc/ldap/cacerts and copy the cacert to
/etc/ldap/cacerts/cacert.asc
Restart apache.
In redhat based systems:
Install the package: openldap-clients and in the file
/etc/openldap/ldap.conf edit the line:
TLS_CACERT /etc/openldap/cacerts/cacert.asc
Create the directory /etc/openldap/cacerts and copy the cacert to
/etc/openldap/cacerts/cacert.asc
Restart httpd
I was able to get this working properly with openldap on Amazon Linux (Elastic Beanstalk PHP 7.0) with MacOS Server 5 LDAP, with TLS set to demand.
in /etc/openldap/ldap.conf:
TLS_REQCERT demand
TLS_CACERT /etc/openldap/certs/yourcacert.pem
(note that if you are not using openldap, the path will be /etc/ldap/certs/yourcacert.pem). This setup did not work until I placed the certificate inside the certs folder; it did not work from any other path.
The certificate to be placed in that path is NOT the TLS certificate of the server. It is the CA (Certificate Authority) certificate of the authority that issued the server/domain-specific TLS certificate. Only with the CA certificate placed in that path will TLS work before attempting an LDAP bind in PHP. Get the CA certificate from your server or download it from the authority's site; they are freely available.
To test whether LDAP bind is even working without TLS, set TLS_REQCERT never temporarily (you may need to comment out TLS_CACERT). If you get "Can't connect to LDAP" it is not a TLS error; it simply cannot connect to the server, and you likely need to open port 389 (not 636, which is for TLS).
Remember to restart your Apache server every time you make a change to the config file or certificate.
Some additional help for others: the certificate solution here solved my ldapsearch command-line issue, but PHP still complained **Can't contact LDAP server**.
It turned out that SELinux on RHEL 7 (CentOS 7) blocks HTTPD from using LDAP ports 389 and 636 by default. You can unblock this with:
setsebool -P httpd_can_network_connect 1
Check your SELinux audit log file for things being blocked.