Using LOB data in PHP with PDO instead of OCI_Connect

I was using oci_connect to connect to my Oracle database.
Because of some internal policies, I need to change it to PDO.
With oci_connect, I can read LOB data from the database with the ->load() method on the result, something like this:
$this->Conn = oci_connect($this->User, $this->Pass, $this->Name, 'AL32UTF8');
$sql = "select field from table";
$s = oci_parse($this->Conn, $sql);
oci_execute($s);
$res = oci_fetch_array($s, OCI_ASSOC + OCI_RETURN_NULLS);
echo $res['FIELD']->load();
and it worked very well.
Now I need to do the same thing with PDO, and because my queries can vary in the number and names of the fields, I cannot bind the output variables before executing them.
This is what I'm using now to connect:
$dbTns = "(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = $server)(PORT = $port)) (CONNECT_DATA = (SERVICE_NAME = $service_name) (SID = $sid)))";
$paramArray = array( PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION, PDO::ATTR_EMULATE_PREPARES => false, PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC);
$this->PDO = new PDO("oci:dbname=" . $dbTns . ";charset=utf8", $db_username, $db_password, $paramArray );
With PDO everything works fine, but I can't call ->load() on the LOB field, as that method does not exist here.
Is there an equivalent way to get the LOB data after the query runs?
Any suggestions are welcome.
(Yes, I did search for a solution before posting this question here.)
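One approach worth trying: the PDO_OCI driver normally returns CLOB/BLOB columns as PHP stream resources, so stream_get_contents() can stand in for ->load(). A minimal sketch, assuming a LOB column named FIELD (the table and column names are the placeholders from the question):
$stmt = $this->PDO->query("select field from table");
$row = $stmt->fetch(PDO::FETCH_ASSOC);
// PDO_OCI usually hands back LOB columns as streams; fall back to the raw value
// in case the driver returned a plain string instead.
$lob = $row['FIELD'];
echo is_resource($lob) ? stream_get_contents($lob) : $lob;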

Related

Stream very large Blob from MySQL to PHP and create a file

We have a MySQL database with some very large files stored in BLOB fields, such as videos that are over 700 MB; file sizes range from 0.5 MB PDFs to JPEGs, etc.
I'm trying to use PHP to retrieve these columns and create a file on the server that will later be offered as a download.
I'm currently using the following method:
$dsn = "mysql:host=$host;dbname=$db;charset=$charset";
$options = [
PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
PDO::ATTR_EMULATE_PREPARES => false,
PDO::MYSQL_ATTR_MAX_BUFFER_SIZE => 1024*1024*500
];
try {
$pdo = new PDO($dsn, $user, $pass, $options);
} catch (\PDOException $e) {
throw new \PDOException($e->getMessage(), (int)$e->getCode());
}
$stmt = $pdo->prepare('SELECT a.TITLE, a.ATTVERSION, a.ATTACHMENTID, a.CONTENTTYPE, a.FILESIZE, d.DATA FROM ATTACHMENTS a
LEFT JOIN ATTACHMENTDATA d ON d.ATTACHMENTID=a.ATTACHMENTID
WHERE a.ATTACHMENTID= ?');
$stmt->execute([$fileid]);
$file = $stmt->fetch();
file_put_contents($storage_dir . "/" . $filename, $file['DATA']);
This works for smaller files (note I'm setting the buffer size to 500MB), but larger files get truncated and corrupted.
I next tried the LOB and unbuffered query approach:
$dsn = "mysql:host=$host;dbname=$db;charset=$charset";
$options = [
PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
PDO::ATTR_EMULATE_PREPARES => false,
PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false
];
try {
$pdo = new PDO($dsn, $user, $pass, $options);
} catch (\PDOException $e) {
throw new \PDOException($e->getMessage(), (int)$e->getCode());
}
$stmt = $pdo->prepare('SELECT a.TITLE, a.ATTVERSION, a.ATTACHMENTID, a.CONTENTTYPE, a.FILESIZE, d.DATA FROM ATTACHMENTS a
LEFT JOIN ATTACHMENTDATA d ON d.ATTACHMENTID=a.ATTACHMENTID
WHERE a.ATTACHMENTID= ?');
$stmt->execute([$fileid]);
$stmt->bindColumn(1, $title, PDO::PARAM_STR, 256 );
$stmt->bindColumn(2, $attversion, PDO::PARAM_INT);
$stmt->bindColumn(3, $attid, PDO::PARAM_INT);
$stmt->bindColumn(4, $contenttype, PDO::PARAM_STR, 256);
$stmt->bindColumn(5, $filesize, PDO::PARAM_INT);
$stmt->bindColumn(6, $data, PDO::PARAM_LOB);
$stmt->fetch(PDO::FETCH_BOUND);
file_put_contents($storage_dir . "/" . $filename, $data)
With this option, I only get 1 MB files. It seems to be ignoring MYSQL_ATTR_USE_BUFFERED_QUERY => false, and since I'm not setting a buffer size, it's defaulting to 1 MB. Can anyone offer any advice or see anything glaring? MySQL is on a different server, but I'd also be open to doing this another way, such as calling a bash script from PHP.
This was initially done on PHP 5.4.16, which did not seem to support the unbuffered stream. After upgrading to PHP 8.0.11, I now have this working, the only remaining wrinkle being that I had to increase the PHP memory limit via ini_set.
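For anyone hitting the same truncation, a memory-friendlier variant is to treat the bound LOB as a stream and copy it straight to disk. A minimal sketch, assuming the same statement and column bindings as above; note that pdo_mysql may return the LOB as a string rather than a stream, so both cases are covered:
$stmt->execute([$fileid]);
$stmt->bindColumn(6, $data, PDO::PARAM_LOB);
$stmt->fetch(PDO::FETCH_BOUND);
$out = fopen($storage_dir . "/" . $filename, 'wb');
if (is_resource($data)) {
    // The driver returned a stream: copy it chunk by chunk without loading it all into memory.
    stream_copy_to_stream($data, $out);
} else {
    // The driver returned the whole value as a string.
    fwrite($out, $data);
}
fclose($out);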

Connecting to AWS RDS via PDO

I have been trying to connect my PHP application to a MySQL database on AWS RDS via PDO. I have seen a similar question here: Unable to connect to AWS RDS through PDO but this is over 4 years old with no definitive answers.
I have tried this a couple of ways: firstly, passing the host name as '<my-db-name.eu-west-2.rds.amazonaws.com:3306', and secondly, passing the port explicitly in the DSN string via
$dsn = "mysql:host=" . $this->host . ";port=" . $this->port . ";dbname=" . $this->name . ";charset=utf8";
(commented out below). Neither works!
The code snippet is:
$dsn = null;
$options = null;

$this->host = SYSTEM_CONFIG["database"]["host"];
$this->type = SYSTEM_CONFIG["database"]["type"];
$this->name = SYSTEM_CONFIG["database"]["name"];
$this->user = SYSTEM_CONFIG["database"]["user"];
$this->pass = SYSTEM_CONFIG["database"]["pass"];
/* New */
$this->port = SYSTEM_CONFIG["database"]["port"];

switch ($this->type) {
    case "SQLSRV":
        $dsn = "sqlsrv:Server=" . $this->host . ";Database=" . $this->name;
        $options = [
            PDO::ATTR_PERSISTENT => false,
            PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
            PDO::SQLSRV_ATTR_FETCHES_NUMERIC_TYPE => true,
            PDO::ATTR_STRINGIFY_FETCHES => false
        ];
        break;
    default:
        $dsn = "mysql:host=" . $this->host . ";dbname=" . $this->name;
        //$dsn = "mysql:host=" . $this->host . ";port=" . $this->port . ";dbname=" . $this->name . ";charset=utf8";
        $options = [
            PDO::ATTR_PERSISTENT => false,
            PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
            PDO::ATTR_EMULATE_PREPARES => false,
            PDO::ATTR_STRINGIFY_FETCHES => false
        ];
}

try {
    $this->pdo = new PDO($dsn, $this->user, $this->pass, $options);
} catch (PDOException $e) {
    $this->logError($e);
} catch (Exception $e) {
    $this->logError($e);
}
One thing that does work is to pass the IP address and port as the host name in the form
$this->host = "<IP-address>:3306"
However, I only found the IP address by pinging the host name and I am not sure if this is static or a dynamic IP address (the latter would be no good in a config file!).
Any help on this would be much appreciated!
I have got the code working now, although quite frustratingly I never got to the bottom of why it wasn't working in the first place. I suspect it was something to do with not picking up the port number properly - maybe a typo somewhere that got 'accidentally' corrected (rather than deliberately) while I was trying things out. This code now works (just for MySQL):
$dsn = null;
$options = null;

$this->host = SYSTEM_CONFIG["database"]["host"];
$this->type = SYSTEM_CONFIG["database"]["type"];
$this->name = SYSTEM_CONFIG["database"]["name"];
$this->user = SYSTEM_CONFIG["database"]["user"];
$this->pass = SYSTEM_CONFIG["database"]["pass"];
$this->port = SYSTEM_CONFIG["database"]["port"];

switch ($this->type) {
    case "SQLSRV":
        // Other untested code...
        break;
    default:
        $dsn = "mysql:host={$this->host};port={$this->port};dbname={$this->name};charset=utf8";
        $options = [
            PDO::ATTR_PERSISTENT => false,
            PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
            PDO::ATTR_EMULATE_PREPARES => false,
            PDO::ATTR_STRINGIFY_FETCHES => false
        ];
}

try {
    $this->pdo = new PDO($dsn, $this->user, $this->pass, $options);
} catch (PDOException $e) {
    $this->logError($e);
} catch (Exception $e) {
    $this->logError($e);
}
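A side note on the IP-address workaround: RDS endpoint IPs can change (for example after a failover), so the endpoint hostname belongs in the config rather than a pinged IP. A small sketch, using the same config keys as above, for checking that the host actually resolves from the web server:
$host = SYSTEM_CONFIG["database"]["host"]; // the RDS endpoint hostname
$ip = gethostbyname($host);
if ($ip === $host) {
    // gethostbyname() returns its input unchanged when resolution fails,
    // which would explain why only a hard-coded IP address worked.
    error_log("DNS resolution failed for database host: " . $host);
}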

Connecting to PGSQL over SSL via Red Bean PHP

$dbh = new PDO('pgsql:host=localhost;port=26257;dbname=bank;sslmode=require;sslcert=[path]/client.maxroach.crt;sslkey=[path]/client.maxroach.key;sslrootcert=[path]/ca.crt;',
    'maxroach', null, array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::ATTR_EMULATE_PREPARES => true,
    ));
This is the plain PDO approach; I need to configure the RedBeanPHP connection for an SSL PostgreSQL connection. Something like this?
R::setup( "pgsql:host=$ip;port=$port;dbname=$dbname", $user, $password, $frozen );
You definitely need to write the full path to the certificates and keys, otherwise nothing will work.
$crt = $_SERVER["DOCUMENT_ROOT"]."/client.crt";
$key = $_SERVER["DOCUMENT_ROOT"]."/client.key";
$ca = $_SERVER["DOCUMENT_ROOT"]."/ca.crt";
R::setup( "pgsql:host=$ip;port=$port;dbname=$dbname;sslmode=verify-ca;sslcert=$crt;sslkey=$key;sslrootcert=$ca;",$user, $password, $frozen );
Instead of sslmode=verify-ca, it is better to use sslmode=verify-full.
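A quick way to confirm the SSL parameters are being picked up, assuming your RedBeanPHP version provides R::testConnection() (treat that call as an assumption on older releases):
R::setup("pgsql:host=$ip;port=$port;dbname=$dbname;sslmode=verify-full;sslcert=$crt;sslkey=$key;sslrootcert=$ca;", $user, $password, $frozen);
// R::testConnection() returns true once the underlying PDO connection can be established.
if (!R::testConnection()) {
    error_log('RedBeanPHP could not connect to PostgreSQL over SSL');
}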

PHP connection string to remote database ORA-12154

I have two computers:
PC 1 - this is where I installed XAMPP.
PC 2 - this is where my database, Oracle 9i, is installed.
I am using PHP 7 and have already added the PDO_OCI extension.
Here is my connection string:
define("DB_HOST", "192.168.10.30:1521");
define("DB_NAME", "BACKEND");
define("DB_USER", "sa");
define("DB_PASS", "sa_backend");
new PDO('oci:dbname='. DB_NAME . ';host='. DB_HOST .';', DB_USER, DB_PASS);
When I use this code, I get this error:
Warning: Uncaught PDOException: SQLSTATE[42S02]: pdo_oci_handle_factory: ORA-12154: TNS:could not resolve the connect identifier specified (ext\pdo_oci\oci_driver.c:709)
UPDATE 1
$server = "192.168.10.30";
$db_username = "sa";
$db_password = "sa_backend";
$service_name = "backend";
$sid = "backend";
$port = 1521;
$dbtns = "(DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = $server)(PORT = $port)) (CONNECT_DATA = (SERVICE_NAME = $service_name) (SERVER = SHARED) (SID = $sid)))";
$this->dbh = new PDO("oci:dbname=" . $dbtns . ";charset=utf8", $db_username, $db_password, array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    PDO::ATTR_EMULATE_PREPARES => false,
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC));
I used this code, which I found in another Stack Overflow answer. Now I am getting this error:
SQLSTATE[HY000]: pdo_oci_handle_factory: ORA-12520: TNS:listener could not find available handler for requested type of server (ext\pdo_oci\oci_driver.c:728)
I tried increasing the processes setting to 200 and even 400, but the error is still the same.
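For anyone debugging a similar setup: PDO_OCI also accepts an Easy Connect style descriptor, which avoids the host=/TNS resolution that triggered the original ORA-12154 (it requires Oracle client libraries 10g or newer). A minimal sketch reusing the credentials from the question; note that ORA-12520 is usually a server-side issue, for example asking for SERVER = SHARED against a database that only runs dedicated server processes, or the listener running out of handlers:
// Easy Connect syntax: //host:port/service_name
$dsn = "oci:dbname=//192.168.10.30:1521/backend;charset=utf8";
$dbh = new PDO($dsn, $db_username, $db_password, array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));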

Optimize PHP PDO Transaction from CURL Stream

I'm using cURL to request large XML files from an API.
To avoid running out of memory, I use this cURL option to stream the data and send it to the curlCallback function:
curl_setopt($ch, CURLOPT_WRITEFUNCTION, array($splitter, 'curlCallback'));
In curlCallback I parse the incoming XML stream and call the function below to store every main XML node in the MySQL database. Everything works well, but I want to make storing the data in MySQL more efficient. This is the current code:
public function processLine($str) {
    $prdData = simplexml_load_string($str);

    // connect to mysql db
    $servername = "localhost";
    $username = "";
    $password = "";
    $dbname = 'temp';
    $db = new \PDO('mysql:host=' . $servername . ';dbname=' . $dbname . ';charset=utf8mb4',
        $username,
        $password,
        array(
            \PDO::ATTR_ERRMODE => \PDO::ERRMODE_EXCEPTION,
            \PDO::ATTR_PERSISTENT => false
        )
    );

    try {
        $stmt = $db->prepare("INSERT IGNORE INTO Product (PRDNO, DSCRD, DSCRF, DSCRLONGD, DSCRLONGF, PRTNO, SMCAT, DEL, BNAMD) VALUES (:prdno, :dscrd, :dscrf, :dscrlongd, :dscrlongf, :prtno, :smcat, :del, :bnamd)");

        // MySQL Transaction
        $db->beginTransaction();
        $stmt->bindParam(':prdno', $prdData->PRDNO);
        $stmt->bindParam(':dscrd', $prdData->DSCRD);
        $stmt->bindParam(':dscrf', $prdData->DSCRF);
        $stmt->bindParam(':dscrlongd', $prdData->DSCRLONGD);
        $stmt->bindParam(':dscrlongf', $prdData->DSCRLONGF);
        $stmt->bindParam(':prtno', $prdData->PRTNO);
        $stmt->bindParam(':smcat', $prdData->SMCAT);
        $stmt->bindParam(':del', $prdData->DEL);
        $stmt->bindParam(':bnamd', $prdData->BNAMD);
        $stmt->execute();
        $db->commit();
    } catch (\PDOException $e) {
        error_log(date("d.m.Y H:i:s") . ' | ' . $e->getMessage() . PHP_EOL, 3, '/var/www/html/log/import.log');
        $db->rollBack();
    }
}
How can I optimize this to send just one transaction containing, for example, 100 rows?
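One way to do that, sketched under the assumption that the connection and the prepared statement can be hoisted out of processLine() (the class and method names below are hypothetical): open the PDO connection and prepare the statement once, start a transaction when a batch begins, and commit every 100 rows plus once more when the stream ends.
class ProductImporter {
    private \PDO $db;
    private \PDOStatement $stmt;
    private int $pending = 0;
    private const BATCH_SIZE = 100;

    public function __construct(\PDO $db) {
        $this->db = $db;
        // Prepare once and reuse for every row.
        $this->stmt = $db->prepare(
            "INSERT IGNORE INTO Product (PRDNO, DSCRD, DSCRF, DSCRLONGD, DSCRLONGF, PRTNO, SMCAT, DEL, BNAMD)
             VALUES (:prdno, :dscrd, :dscrf, :dscrlongd, :dscrlongf, :prtno, :smcat, :del, :bnamd)"
        );
    }

    public function processLine(string $str): void {
        $prd = simplexml_load_string($str);

        if ($this->pending === 0) {
            $this->db->beginTransaction();
        }

        $this->stmt->execute([
            ':prdno'     => (string) $prd->PRDNO,
            ':dscrd'     => (string) $prd->DSCRD,
            ':dscrf'     => (string) $prd->DSCRF,
            ':dscrlongd' => (string) $prd->DSCRLONGD,
            ':dscrlongf' => (string) $prd->DSCRLONGF,
            ':prtno'     => (string) $prd->PRTNO,
            ':smcat'     => (string) $prd->SMCAT,
            ':del'       => (string) $prd->DEL,
            ':bnamd'     => (string) $prd->BNAMD,
        ]);

        if (++$this->pending >= self::BATCH_SIZE) {
            $this->db->commit(); // one commit per 100 rows
            $this->pending = 0;
        }
    }

    public function finish(): void {
        // Commit whatever is left after the last full batch.
        if ($this->pending > 0) {
            $this->db->commit();
            $this->pending = 0;
        }
    }
}
Call finish() after curl_exec() returns so the last partial batch is committed; reusing one connection instead of reconnecting per row is usually a bigger win than the batched commits themselves.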
