How to connect to Amazon RDS via SSL? - php

I'm trying to set up an SSL connection to a MySQL database hosted via Amazon RDS. I'm confused as to how to connect.
According to Amazon's documentation, I need to download a CA certificate called "rds-ca-2015-root.pem" and use it in my SSL connection. I set the database user that I am connecting with to require SSL.
In PHP, I include the code below to initiate the connection:
$mysqli = mysqli_init();
mysqli_options($mysqli, MYSQLI_OPT_SSL_VERIFY_SERVER_CERT, true);
$mysqli->ssl_set(NULL, NULL, "/path/to/pem", NULL, NULL);
$mysqli->real_connect("host", "username", "password", "name", 3306, NULL, MYSQLI_CLIENT_SSL);
However, no matter which path I specify as the third parameter of ssl_set() (even an invalid one), an SSL connection is successfully established. The only case that fails is passing NULL as the third parameter.
I verify this by running this query: SHOW STATUS LIKE 'Ssl_cipher';. The output verifies that the connection is encrypted (Ssl_cipher => AES256-SHA).
Could someone please explain to me how this works? I am confused as to why the connection continues to work successfully when the path is incorrect. How is the RDS server being verified?

The RDS documentation actually explains why this is happening, and suggests that you don't even need the CA cert:
Amazon RDS began updating the SSL certificates on all DB instances on
March 23, 2015, but did not initiate a reboot of the instances. No
operational impact or downtime is incurred when these updates are
performed, and in many situations we will perform the update in your
maintenance window. Amazon RDS will not update the certificate for
your instances if you have already performed the update. Also note
that Amazon RDS is not updating the certificates in AWS GovCloud (US)
and the China (Beijing) regions.
Regardless of whether you manually update the certificate or Amazon
RDS updated the certificate, the DB instance must be rebooted for the
new certificate to take effect. You can decide when you want to
manually reboot the DB instance, but you must update the certificate
and reboot the instance before the old certificate (rds-ca-2010)
expires on April 3, 2015.
You can check the certificate authority (CA) being used by your DB
instance using the Amazon RDS console. The CA is listed under the
Security and Network section of your DB instance details. If your
instance shows rds-ca-2015, then the new certificate has been
successfully applied. You still need to reboot your database instance
and update your client application to use the new SSL certificate.
If the Amazon RDS console shows your instance CA as rds-ca-2010, then
the new certificate has not been applied to your database instance
yet. Use the instructions following to update the SSL certificate on
your database instances.
The third parameter is essentially being ignored by the client, and if every parameter is NULL there is no point in calling mysqli::ssl_set() at all.
Try removing that function call altogether.
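To illustrate, here is a minimal sketch (placeholder host and credentials, as in the question) of the same connection with the ssl_set() call removed, plus the Ssl_cipher check from the question:
<?php
// Placeholders only - substitute your own RDS endpoint, user, password and schema.
$mysqli = mysqli_init();

// Ask for SSL without pointing at any local CA file.
if (!$mysqli->real_connect("host", "username", "password", "name", 3306, NULL, MYSQLI_CLIENT_SSL)) {
    die('Connect error: ' . mysqli_connect_error());
}

// Confirm the session is encrypted.
$row = $mysqli->query("SHOW STATUS LIKE 'Ssl_cipher'")->fetch_assoc();
print_r($row); // e.g. [Variable_name => Ssl_cipher, Value => AES256-SHA]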

As best I can tell, if you ask for or require an SSL connection to RDS MySQL you will get one, regardless of whether the local copy of the certificate is valid or readable.
I am using the following config in Drupal and regardless of where or what I put in the cert path I get an SSL connection:
$databases['default']['default'] = [
    'database' => 'drupal_db',
    'username' => 'db_user',
    'password' => 'db_password',
    'host' => 'hostname',
    'port' => '3306',
    'driver' => 'mysql',
    'prefix' => '',
    'pdo' => array(
        PDO::MYSQL_ATTR_SSL_CA => '/etc/ssl/certs/rds-combined-ca-bundle.pem',
        PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT => false,
    ),
];
However, if I set the MYSQL_ATTR_SSL_VERIFY_SERVER_CERT value to true I get a connection error and Drupal fails. So far I have not been able to find any explanation of why this won't validate the certificate and/or how to make it do so successfully!
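For reference, a hedged PDO sketch of what enabling verification looks like (endpoint and credentials are placeholders). Verification generally only has a chance of succeeding when you connect using the instance's full RDS endpoint hostname, since that is what the server certificate is issued for, and point the driver at the combined CA bundle:
<?php
// Placeholders - substitute your own RDS endpoint, credentials and database.
$dsn = 'mysql:host=mydb.xxxxxx.us-east-1.rds.amazonaws.com;port=3306;dbname=drupal_db';

$pdo = new PDO($dsn, 'db_user', 'db_password', array(
    PDO::MYSQL_ATTR_SSL_CA => '/etc/ssl/certs/rds-combined-ca-bundle.pem',
    // Verification compares the server certificate against the host in the DSN above.
    PDO::MYSQL_ATTR_SSL_VERIFY_SERVER_CERT => true,
));

print_r($pdo->query("SHOW STATUS LIKE 'Ssl_cipher'")->fetch(PDO::FETCH_ASSOC));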

Related

Why is my ftp connection not working in php laravel but is working in FileZilla?

I have a PHP Laravel (5.6) system that I need to connect to an FTP server to upload a single file. The FTP server that I am connecting to restricts access by IP address, uses port 990, and otherwise has a seemingly simple configuration. On my local machine (I'm running Linux Ubuntu, if that helps) I am able to connect to the FTP server in FileZilla just fine; FileZilla seemed to automatically choose FTPS. I am also able to ping this server.
Now this PHP Laravel (5.6) application is running on NGINX (had this for a while, everything else server-wise seems fine). As of now I am just trying to get this working locally, though there is a production server that it will have to be pushed onto (pretty much identical configuration though).
I started out trying to use the built in PHP function ftp_connect and ftp_ssl_connect - both using the same host and port number (990) as in FileZilla. I have been unable to get past this step - it returns false (so never even gets to my login logic).
$ftp = ftp_connect(env('FTP_HOST'),env('FTP_PORT')); // returns FALSE
$ftp = ftp_ssl_connect(env('FTP_HOST'),env('FTP_PORT')); // returns FALSE
After searching for a while I decided to try Laravel's filesystem to see if that would make it easier, these are my settings in config/filesystems.php:
'ftp' => [
    'driver' => 'ftp',
    'host' => env('FTP_HOST'),
    'username' => env('FTP_USER'),
    'password' => env('FTP_PASSWORD'),
    'port' => env('FTP_PORT'),
    'ssl' => true,
    'timeout' => 60,
],
'sftp' => [
    'driver' => 'sftp',
    'host' => env('FTP_HOST'),
    'username' => env('FTP_USER'),
    'password' => env('FTP_PASSWORD'),
    'port' => env('FTP_PORT'),
    'timeout' => 60,
],
I figured I'd try both ftp and sftp here, I then tried the following:
Storage::disk('ftp')->put('test.csv', $file);
and
Storage::disk('sftp')->put('test.csv', $file);
The first just timed out, the second gave me the message: League\Flysystem\Sftp\ConnectionErrorException: Could not login with username: ###, host: ### in...
Any ideas of what this could be or next steps I could take towards troubleshooting would be greatly appreciated, I feel like I just don't know what to try to get a better understanding of what's wrong here. Thanks!
EDIT:
I realized that previously I had always used the quick connect feature in FileZilla for this. I looked into it further and was able to confirm that the encryption has to be implicit FTP over TLS - So I'm wondering if there is a setting for that I'm missing.
Port 990 is used for implicit TLS/SSL, as you eventually figured out.
Implicit TLS/SSL is not supported by PHP's built-in FTP functions, nor by the flysystem library that Laravel uses (which probably relies on PHP's built-in FTP internally anyway).
Implicit TLS/SSL was a temporary hack back in the 90s to let legacy FTP software use an encrypted connection without modification. It was never meant to be used long term, and definitely not 30 years later. Implicit FTPS never even became part of the FTP standard; only explicit TLS/SSL was standardized, by RFC 4217 in 2005. Since then, no one should be using implicit TLS/SSL.
Get your FTP server fixed to use explicit TLS/SSL.
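If changing the server is not an option, one possible workaround (a sketch only, with placeholder host, credentials and paths) is PHP's curl extension, which does support implicit FTPS via the ftps:// scheme on port 990:
<?php
// Placeholders - replace host, credentials and file paths with your own.
$local = '/path/to/test.csv';

$ch = curl_init('ftps://ftp.example.com:990/test.csv'); // ftps:// = implicit TLS
curl_setopt($ch, CURLOPT_USERPWD, 'ftp_user:ftp_password');
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_INFILE, fopen($local, 'r'));
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($local));
// Relax peer verification only while testing against a self-signed certificate.
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);

if (curl_exec($ch) === false) {
    echo 'Upload failed: ' . curl_error($ch);
}
curl_close($ch);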
Take a look at the additional settings (https://flysystem.thephpleague.com/v2/docs/adapter/ftp/) you can add to filesystems.php for your ftp disk. Match them with what you have in your FileZilla settings and see if it helps.
Underneath, Laravel uses the flysystem adapters to connect to different storage types, so you can reference the settings from the above URL.
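As a rough illustration (only meaningful once the server offers explicit FTP over TLS, since implicit TLS on port 990 is not supported by this driver), the extra options might look something like this in config/filesystems.php. The option names here are the flysystem FTP adapter's, so treat this as a sketch:
'ftp' => [
    'driver'   => 'ftp',
    'host'     => env('FTP_HOST'),
    'username' => env('FTP_USER'),
    'password' => env('FTP_PASSWORD'),
    'port'     => 21,    // explicit TLS normally runs on the plain FTP port
    'ssl'      => true,  // AUTH TLS (explicit)
    'passive'  => true,  // match FileZilla's passive-mode default
    'timeout'  => 60,
    'root'     => '/',
],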

PHP on Azure App Service slow performance when connected to Azure Database for MySQL

I am hosting a PHP 7.2 website written on CodeIgniter 3 framework on Linux Azure App Service (plan P1v2). The database is MySQL 5.7 running on Azure Database for MySQL (General Purpose, 2vCores, 5GB storage) with SSL disabled.
When I browse my website, it takes about 8-9 seconds to load; the browser stays on a blank white page and then displays everything at once after loading finishes. (Most of the website's content is pulled from the database.)
I have been monitoring both the App Service and the MySQL DB, and the average %CPU has never reached 50% of the allocated resources, so the specs do not seem to be the problem.
My outsourced developer team has their own testing environment (not on Azure) and they said it loads almost instantly on their environment, so they are blaming Azure.
I then noticed that Azure Advisor was telling me two things about this database, both marked as High impact:
Improve MySQL connection latency
Our internal telemetry indicates that your application connecting to MySQL server may not be managing connections efficiently. This may result in higher application latency. To improve connection latency, we recommend that you enable connection redirection. This can be done by enabling the connection redirection feature of the PHP driver.
Improve MySQL connection management
Our internal telemetry indicates that your application connecting to MySQL server may not be managing connections efficiently. This may result in unnecessary resource consumption and overall higher application latency. To improve connection management, we recommend that you reduce the number of short-lived connections and eliminate unnecessary idle connections. This can be done by configuring a server side connection-pooler, such as ProxySQL.
I am now working on enabling connection redirection to reduce latency, but have not succeeded yet because I am not familiar with CodeIgniter, so I need to wait for the developers to fix it.
My question is: what is/are the cause(s) of this performance issue? Are those two recommendations really the causes of this problem?
Both App Service and database are in the same region. Do they communicate locally like some kind of LAN connection? I hope the connection from App Service would not go out through the internet then back to Azure DB, causing slow performance.
Thanks.
I have fixed this issue; it turns out the Azure Advisor was right! I enabled connection redirection and now my website finishes loading in 3 seconds. Here is what I did.
TL;DR: Configure your app to use SSL when connecting to the database, put the mysqlnd_azure extension file in your deployed app, and set the Azure Web App's App Settings to read the ini file that loads the extension. Then set the Azure MySQL DB server parameter redirect_enabled to ON and enable Enforce SSL with TLS 1.2. Restart the app.
Beware: Make sure that setting Enforce SSL is the last thing you do, otherwise it will reject all non-SSL connections and your app wouldn't work at all.
Update Jan 2021
I have just found out that Microsoft appears to have built the required mysqlnd_azure extension into Azure App Service, so the majority of the steps below are no longer required.
You should first test whether the extension is already loaded by following step 7. If mysqlnd_azure is in the list, you can skip straight to steps 8-10. You still need to follow step 1 to configure the SSL certificate for your database, though.
Afterwards, confirm that database connection redirection is in use by following the sample code link in the references at the end of this answer. You should see the text mysqlnd_azure.enableRedirect: preferred, followed by text in this format: [random text and numbers].[random text and numbers again].[your azure region].worker.database.windows.net.
If the text still shows the MySQL hostname you used, e.g. mydatabase.mysql.database.azure.com, then connection redirection is not in use. Check your configuration again.
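A rough way to check the same thing from code (placeholder hostname, user, password and CA path; adjust to your own setup) is to compare the reported host info against the gateway hostname you connected to:
<?php
// Is the extension loaded and redirection enabled?
echo 'mysqlnd_azure.enableRedirect: ' . ini_get('mysqlnd_azure.enableRedirect') . PHP_EOL;

// Placeholders - substitute your own Azure MySQL host, user, password and database.
$conn = mysqli_init();
$conn->ssl_set(NULL, NULL, '/home/site/wwwroot/cert/BaltimoreCyberTrustRoot.crt.pem', NULL, NULL);
$conn->real_connect('mydatabase.mysql.database.azure.com', 'myadmin@mydatabase', 'password', 'mydb', 3306, NULL, MYSQLI_CLIENT_SSL);

// With redirection active this reports a *.worker.database.windows.net backend;
// without it, it still shows the gateway hostname used above.
echo $conn->host_info . PHP_EOL;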
Original answer below this line.
1. Configure your app to use SSL when connecting to the database. In CodeIgniter this is the database.php file in /application/config/. Make sure all MySQL drivers that you use have SSL enabled. Download the SSL certificate from https://www.digicert.com/CACerts/BaltimoreCyberTrustRoot.crt.pem and store it in your app's directory, e.g. /cert/BaltimoreCyberTrustRoot.crt.pem
/* Snippet from database.php */
$db['production'] = array(
    'dsn' => '',
    'hostname' => getenv("DB_HOST"),
    'username' => getenv("DB_USER"),
    'password' => getenv("DB_PWD"),
    'database' => getenv("DB_NAME"),
    'dbdriver' => 'mysqli',
    'dbprefix' => '',
    'pconnect' => FALSE,
    'db_debug' => FALSE,
    'cache_on' => FALSE,
    'cachedir' => '',
    'char_set' => 'utf8',
    'dbcollat' => 'utf8_general_ci',
    'swap_pre' => '',
    'encrypt' => [
        'ssl_key' => NULL,
        'ssl_cert' => NULL,
        'ssl_ca' => '/home/site/wwwroot/cert/BaltimoreCyberTrustRoot.crt.pem',
        'ssl_capath' => NULL,
        'ssl_cipher' => NULL,
        'ssl_verify' => FALSE
    ],
    'compress' => FALSE,
    'stricton' => FALSE,
    'failover' => array(),
    'save_queries' => TRUE
);
/* Snippet for PHP (PDO) */
<?php
define('CONN_HOST', getenv("DB_HOST"));
define('CONN_DATABASE', getenv("DB_NAME"));
define('CONN_USER', getenv("DB_USER"));
define('CONN_PASSWORD', getenv("DB_PWD"));
define('CONN_OPTION', array(
    PDO::MYSQL_ATTR_SSL_CA => '/home/site/wwwroot/cert/BaltimoreCyberTrustRoot.crt.pem',
    PDO::MYSQL_ATTR_INIT_COMMAND => "SET NAMES utf8"
));
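For completeness, a hedged sketch of actually opening the PDO connection with those constants (the DSN format is the only assumption here):
try {
    $pdo = new PDO(
        'mysql:host=' . CONN_HOST . ';dbname=' . CONN_DATABASE . ';charset=utf8',
        CONN_USER,
        CONN_PASSWORD,
        CONN_OPTION
    );
} catch (PDOException $e) {
    die('Connection failed: ' . $e->getMessage());
}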
2. Install the mysqlnd_azure extension by following this link: https://azureossd.github.io/2019/01/29/azure-app-service-linux-adding-php-extensions/ or, if you just want the .so file right away, download it from http://www.mediafire.com/file/g6mzeld0wnqedw0/mysqlnd_azure.so/file; otherwise build it yourself from https://github.com/microsoft/mysqlnd_azure
3. Put mysqlnd_azure.so in your app's directory, e.g. /bin/mysqlnd_azure.so, then deploy your code to Azure.
4. SSH into your App Service and create a directory in /home/site called ini. In it, create an .ini file with any name, e.g. settings.ini
5. Use vi or nano to edit that ini file and paste in these lines:
extension=/home/site/wwwroot/bin/mysqlnd_azure.so
mysqlnd_azure.enableRedirect = on
6. Tell your app to load the mysqlnd_azure extension by adding an application setting called PHP_INI_SCAN_DIR with the value /usr/local/etc/php/conf.d:/home/site/ini
7. Azure will force-restart your app after saving the app settings. After that, you can check whether the extension has been loaded by SSHing in again and running php -m to list all loaded extensions. If you see mysqlnd_azure in the list, you're on the right path! If not, that command should tell you why the extension couldn't be loaded.
8. On the Azure Portal, navigate to your Azure MySQL DB and set the server parameter redirect_enabled to ON.
9. Under Connection Security, enable Enforce SSL connection and choose TLS 1.2.
10. Restart the app again.
If you use sslmode=verify-ca or sslmode=verify-full, you need to switch to the new SSL certificate on October 26, 2020. Refer to https://learn.microsoft.com/en-us/azure/mysql/concepts-certificate-rotation Update: Microsoft has extended the deadline for the root certificate deprecation to February 15, 2021.
If you have any deployment slots, make sure to do steps 4-7 for every slot.
References:
Sample code to confirm if connection redirection is working https://learn.microsoft.com/en-us/azure/mysql/howto-redirection#confirm-redirection
App Service Linux - Update PHP Settings https://azureossd.github.io/2019/01/29/azure-app-service-linux-update-php-settings/
Configure SSL connectivity in your application to securely connect to Azure Database for MySQL https://learn.microsoft.com/en-us/azure/mysql/howto-configure-ssl

CFFTP equivalents in Laravel

I have a ColdFusion page on which a secure FTP connection is made to a remote ftp server, and the names of all of the files in a certain directory on that server are listed (in a query object). This is relatively simple in CF - you use the cfftp tag to open an ftp connection, and then use the cfftp tag again to list out the files that are in a specific directory:
<cfftp action = "open"
    username = "username"
    connection = "myConnection"
    password = "password"
    fingerprint = "(eight pairs of hexadecimal values in the form hh:hh:hh:hh:hh:hh:hh:hh)"
    server = "xxx.xxx.xx.xx"
    secure = "yes">
<cfftp action = "listDir"
    connection = "myConnection"
    directory = "Output_Directory"
    name = "qDir" />
<!--- Files in the specified directory are now stored in a query object named "qDir", and their filenames can be pulled using a CF query of query --->
I'm new to PHP/Laravel, and am trying to figure out how to do the same in Laravel. So far, I haven't found any videos or tutorials that explain how to open ftp connections in Laravel (normal or secure), or how to view the files within a directory once that connection has been made.
I did find something at https://laravel.com/docs/5.6/filesystem#downloading-files, on what to put in the filesystems.php page for SFTP driver configuration. Which, I gather, needs to be set up before I can open any SFTP connection within my Laravel site. But, I'm finding this a little confusing. As seen below, that code appears to require values for "privateKey" and "password", and includes no optional setting for "fingerprint", which I'm including in the CFFTP tag above.
// include in filesystems.php:
'sftp' => [
    'driver' => 'sftp',
    'host' => 'example.com',
    'username' => 'your-username',
    'password' => 'your-password',
    // Settings for SSH key based authentication...
    // 'privateKey' => '/path/to/privateKey',
    // 'password' => 'encryption-password',
    // Optional SFTP Settings...
    // 'port' => 22,
    // 'root' => '',
    // 'timeout' => 30,
],
I can connect to the ftp server securely through an ftp client (Bitvise), which only asks for the username, password, host IP, and port (22). So I'm not sure what "privateKey" or "password" values need to be provided to the code above...nor how there can be two variables named "password" within the 'sftp' array (one for "your-password" and one for "encryption-password").
Laravel seems to have good tutorial videos in general -- can anyone point me to any where ftp/sftp is covered, particularly "open" and "listDir" type actions? (I'm looking for Laravel in particular -- I can figure out how it's done in vanilla PHP on my own). Thanks much.
(Addendum) adding this here later, so that I can wrap the below in code format. I think one thing I need to do is use the ftp filesystem driver, rather than the sftp filesystem driver, and set 'ssl' to true. So on filesystems.php, I should have something like:
'ftp' => [
    'driver' => 'ftp',
    'host' => env('FTP_SERVER'),
    'port' => 22, // env('FTP_PORT'),
    'username' => env('FTP_USERNAME'),
    'password' => env('FTP_PASSWORD'),
    'passive' => true,
    'ssl' => true,
    'timeout' => 30,
    'root' => '/',
],
Still working on establishing a connection, though. I was getting an error message when I tried to do that: "Use of undefined constant FTP_BINARY - assumed 'FTP_BINARY'". Figured out that problem: it was because I needed to enable php_ftp.dll in the PHP config, as this is not enabled by default on Windows. But now I'm getting a different error message: "Could not connect to host xxx.xxx.xx.xx port:22". Will dig into that, see what the problem is and report back.
(Later) Looks like I'm having an authentication problem of some sort. I've tried connecting to the same FTP server just using vanilla PHP, i.e., the ftp_ssl_connect() function. That function is returning false, which means there's an error connecting. (Before I enabled php_ftp.dll, I was just getting an error message saying that ftp_ssl_connect() was an undefined function). Wondering if omitting the fingerprint, which we were previously including as a param with the Coldfusion cfftp tag, has something to do with it. But this isn't a possible parameter in the Laravel or core PHP functions. Digging...will report back...
(4/11/18)
The crucial question seems to be what exactly is happening with the cfftp tag and how it's making its connection (SFTP or FTPS?), because however it works is what we want to re-create in PHP/Laravel. I've asked a separate question regarding this.
The documentation for the cfftp tag says nothing about it using sFTP or FTPS. It only says that when you specify secure="yes", then it allows you to open a connection to a Secure Shell (SSH) server by using either symmetric or asymmetric encryption. When using symmetric encryption (which is what we're doing), you pass username, password, and fingerprint (which we do).
You can specify the port in the cfftp tag; if no port is specified, then it defaults to 21. As a test, I've explicitly passed the port as both 21 and 22, and the connection worked fine in both cases. I'm not sure why that is; I'd expect it to work in one case or the other, but not both.
Meanwhile, in case we indeed need to use SFTP, I've tried making a SFTP connection to the ftp server in core PHP, using the ssh2_connect() function. I'm getting an error message saying that this function is undefined, even though I've added and enabled the php_ssh2.dll extension, so that's another thing to look into.
The privateKey field is for when you're using public key authentication on your SSH connection. Public key authentication is used as an alternative to standard username/password authentication and is typically considered more secure than username/password alone. Here's a decent primer.
Laravel's SFTP filesystem driver uses The PHP League's flysystem-sftp:
https://github.com/thephpleague/flysystem-sftp
If you opt to not use public key authentication, you can probably leave that field commented out. The second password entry is probably just a typo in the documentation; see the flysystem-sftp project for the actual configuration syntax.
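Once the sftp disk is configured, the rough equivalent of the cfftp listDir action is the filesystem's files() call; a small sketch, with the directory name taken from the question:
use Illuminate\Support\Facades\Storage;

// Rough equivalent of <cfftp action="listDir" directory="Output_Directory">
$files = Storage::disk('sftp')->files('Output_Directory');

foreach ($files as $path) {
    echo $path . PHP_EOL; // each entry is a path string, e.g. "Output_Directory/report.csv"
}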

Connecting to cloud sql from a google container engine instance docker image running craft cms

I am trying to externalise the database from a craft cms docker image running in a google container engine instance and connect it to a gcloud sql second generation instance running in the same project.
I've set the cloud sql permission to enabled in the cluster permissions section of the console and I've tried various authorisation settings in the access control settings of the database instance. I can only connect from the gcloud shell.
It seems I need a unix socket connection and I should be authorised to connect, but when I deploy and run I see the dreaded CrashLoopBackOff error. I've tried the socket connection string with and without the region, the IP of the SQL instance, with and without a password, and authorising the 0.0.0.0/0 network... I can connect via the gcloud shell, however. This is the connection config in db.php in the craft/config folder:
'unixSocket' => '/cloudsql/website-1351:asia-east-1:pzr-craft-database',
'user' => 'root',
'password' => 'xxxxxxxx',
'database' => 'craft',
'tablePrefix' => 'craft',
Any idea how this can be achieved?
You need https://github.com/GoogleCloudPlatform/cloudsql-proxy to connect from GKE to Google Cloud SQL
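With the proxy running as a sidecar in the pod and mounting /cloudsql, the db.php connection would look roughly like the following sketch. The instance connection name here is a placeholder; it must match exactly what `gcloud sql instances describe` reports (project:region:instance, with the region in GCP's format, e.g. asia-east1):
// craft/config/db.php (sketch)
return array(
    'unixSocket'  => '/cloudsql/my-project:asia-east1:my-craft-database', // placeholder
    'user'        => 'root',
    'password'    => 'xxxxxxxx',
    'database'    => 'craft',
    'tablePrefix' => 'craft',
);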

MySql Error: cannot connect to local MySql server through socket '/var/run/mysqld/mysqld.sock' (2), while other pages do connect correctly

I understand this problem has been a recurring problem on this site, but my issue is a little different than the previous ones.
PROBLEM
Some pages use the correct socket directory, while other pages try to connect through an incorrect socket directory - or at least that is what I believe the problem is, based on the error I am receiving.
DETAILS
HOST: example.com
cakePHP version: 1.3.2 (Not my choice).
Page's content comes from database.
URL: http://example.com
My website has 2 sections:
anonymous section
login section for members or admin
The anonymous section works. It accesses the database, adds the content, and functions as it should.
ERROR
The error occurs when I click a link "view more.." under "Job Links" on the home page. A login form should pop up; instead I receive the error "cannot connect to local MySql server through socket '/var/run/mysqld/mysqld.sock' (2)".
In addition, after I log in via the "Members login" button, also on the home page, with the correct credentials, it produces the same error.
QUESTION
Why would different sections on my webpage try to access the sockets through different directories?
ADDITIONAL STUFF
I signed up today and this is my first post, so feedback on my post regarding enough information would be helpful for future posts.
Thanks for your time.
UPDATE
Upon further research, MySql has been using the socket directory /var/run/mysqld/mysqld.sock from the start. Not sure what this means yet, but continuing research..
database.php file
class DATABASE_CONFIG {
    var $default = array(
        'driver' => 'mysqli',
        'persistent' => true,
        'host' => 'redlabelcom.netfirmsmysql.com',
        'login' => 'bcp',
        'password' => '********',
        'database' => 'bcp',
        'prefix' => '',
        'encoding' => 'UTF8',
        //'socket' => '/var/run/mysqld/mysqld.sock', // I've tried commenting out all variations of socket and port
        //'port' => '/var/run/mysqld/mysqld.sock',   // nothing works.
    );
    var $test = array(
        'driver' => 'mysqli',
        'persistent' => false,
        'host' => 'redlabelcom.netfirmsmysql.com',
        'login' => 'bcp',
        'password' => '********',
        'database' => 'bcp',
        'prefix' => '',
        'encoding' => 'UTF8',
        //'port' => '/var/run/mysqld/mysqld.sock',
    );
}
?>
This problem is really strange. I guess it has something to do with the MySQL connection configuration. MySQL is a daemon, usually configured in /etc/mysql/my.cnf. There you define how the client will connect to the MySQL server:
[client]
port = 3306
socket = /var/run/mysqld/mysqld.sock
If your socket is wrong (there's no such socket on the server), you are probably connecting to different MySQL hosts/ports from the anonymous and secured parts of the application. This is possible (I mean different db connections from different application parts). Check your configuration (host, port, etc.) and try to connect from the MySQL console:
$ mysql -h hostname -u username -p dbname # enter password
and check whether you can connect. If you can connect from the console, you'll surely be able to connect from any server-side application (php, python, whatever).
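As a hedged illustration of the point above (host and user copied from the config in the question, password obviously a placeholder): with mysqli, the hostname decides whether a unix socket or TCP is used, and 'localhost' means "use the local socket", which is exactly what produces the mysqld.sock error when no local server is running:
<?php
mysqli_report(MYSQLI_REPORT_OFF); // report errors via connect_error rather than exceptions

// 'localhost' makes the client use the unix socket (/var/run/mysqld/mysqld.sock).
$viaSocket = new mysqli('localhost', 'bcp', 'password', 'bcp');
// A real hostname (or 127.0.0.1) forces a TCP connection instead.
$viaTcp = new mysqli('redlabelcom.netfirmsmysql.com', 'bcp', 'password', 'bcp');

echo 'socket: ' . ($viaSocket->connect_error ?: 'connected') . PHP_EOL;
echo 'tcp:    ' . ($viaTcp->connect_error ?: 'connected') . PHP_EOL;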
It's an application that someone else was developing, right, and now you have to maintain it?
PROBLEM SOLVED!
Thanks everyone for your support!
This was an interesting issue that initially looked more complicated than the actual problem.
cakephp version 1.3.2 is set up to make a new connection to the database if you need to access the phpbb data fields. The new connection uses a different configuration than what is set up in the database.php file.
Below is a detailed description with file locations.
the 'app/webroot/discussion/common.php' file makes a NEW connection to the database using a different set of MySql parameters than the database.php file. Again, it does NOT use database.php's MySql configuration.
Instead, cakephp defines new connection variables located at:
'app/webroot/discussion/config.php'
To solve the problem, simply change the
$dbhost => localhost -> $dbhost => yourservername
$dbname => wrongname -> $dbname => rightname
etc..
Keep in mind that it is possible the previous developer changed these values, so if this doesn't help you then good luck.
BTW, the error message above was the result of trying to connect to localhost.
To resolve this issue, I ran the following commands:
mysqld --tc-heuristic-recover=ROLLBACK
systemctl start mariadb.service
