PDO There is no active transaction - php

One of my class files has been spitting out the "There is no active transaction" warning when creating some records. It is the only one doing it, and I can't figure out why, because all of my queries execute successfully.
My php_errors log has given me zero insight into the issue, nor have any of the other related questions provided a solution.
try {
    App::$DB->beginTransaction();
    $rQuery = App::$DB->prepare("INSERT INTO `Cervidae` (`".implode('`, `', array_keys($aSQL))."`) VALUES (".implode(",", $aPlaceholders).")");
    $rQuery->execute(array_values($aSQL));
    $this->_Cervid_ID = App::$DB->lastInsertId();
    if (is_array($aData["Treatments"])) {
        foreach ($aData["Treatments"] as $sTreatmentName => $sTreatmentData) {
            $oTreatmentToCervid = new TreatmentToCervid(array("Treatment_ID" => $sTreatmentData["Treatment_ID"], "Cervid_ID" => $this->_Cervid_ID, "CreatedBy" => $aSQL["CreatedBy"], "UpdatedBy" => $aSQL["UpdatedBy"], "Dose" => $sTreatmentData["TreatmentDose"]), "NEW");
        }
    }
    $oHerdSize = App::getFlat(self::get(array("COUNT(Cervidae.Cervid_ID) AS HerdSize"), array("Cervidae.Herd_ID = $nHerdID", "Cervidae.IsActive = 1")));
    $rQuery = App::$DB->prepare("UPDATE `Herds` SET `HerdSize` = ".$oHerdSize->HerdSize." WHERE `Herd_ID` = ".$nHerdID);
    $rQuery->execute();
    App::$DB->commit();
    $this->__construct($this->_Cervid_ID, 'ID');
}
catch (Exception $e) {
    App::$DB->rollBack();
    throw new Exception($e->getMessage());
}
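A common cause of this warning (an assumption here, since the constructor bodies aren't shown) is that something inside the try block ends the transaction early -- for example, the TreatmentToCervid constructor committing or rolling back on its own, or a statement that triggers an implicit commit -- so the later commit() or rollBack() runs when no transaction is open. A minimal defensive sketch using PDO::inTransaction(), demonstrated with an in-memory SQLite database:

```php
<?php
// Sketch (assumption): guard rollback with PDO::inTransaction() so that
// ending an already-ended transaction no longer raises
// "There is no active transaction".

function safeRollback(PDO $db): void
{
    if ($db->inTransaction()) {
        $db->rollBack();
    }
}

$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$db->beginTransaction();
safeRollback($db); // rolls back the open transaction
safeRollback($db); // no-op: nothing active, no warning thrown
```

In the catch block above, `if (App::$DB->inTransaction()) { App::$DB->rollBack(); }` would avoid turning the original failure into a second, misleading exception while you track down what ended the transaction early.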


Google Task API PHP: Setting the TaskList ID & "Invalid task list ID" error

I'm trying to loop through task lists in order to generate a list of tasks using the Google Tasks PHP library.
I have:
- done all the credential work, and can call the API
- retrieved the task lists
- verified, using the ids from the step above and the tasklist parameter in the Tasks API explorer, that the tasks for each task list are output correctly
Where I'm stuck:
I'm not sure if I'm 1) calling the wrong method or 2) passing the wrong parameter to get the list of tasks for a given task list id.
My code:
function getGcalTasks(){
    $client = $this->getGcalTaskClient();
    try {
        $service = new Google_Service_Tasks($client);
        $optParamLists = array(
            'maxResults' => 10,
        );
        $result_lists = $service->tasklists->listTasklists($optParamLists);
        if (is_array($result_lists->getItems()) && count($result_lists->getItems())) {
            foreach ($result_lists->getItems() as $tasklist) {
                $taskListId = trim($tasklist->getId());
                $taskListTitle = trim($tasklist->getTitle());
                if ($taskListId) {
                    $optParamsTasks = array(
                        // I've tried all of the below and still get: "Invalid task list ID"
                        'id' => $taskListId,
                        'kind' => 'tasks#taskList',
                        'title' => $taskListTitle,
                        //'tasklist' => $taskListId,
                        //'taskList' => $taskListId,
                        //'tasklistId' => $taskListId,
                        //'listName' => $taskListTitle,
                    );
                    $result_tasks = $service->tasks->listTasks($optParamsTasks);
                }
            }
        }
    } catch (Exception $e) {
        log_message('error', $e->getMessage());
    }
}
Welp, I took a look a few minutes later and realized that listTasks() only accepts one parameter: the task list id. The code below is working for me:
function getGcalTasks(){
    $client = $this->getGcalTaskClient();
    $tasks = array();
    try {
        $service = new Google_Service_Tasks($client);
        $optParamLists = array(
            'maxResults' => 10,
        );
        $result_lists = $service->tasklists->listTasklists($optParamLists);
        if (is_array($result_lists->getItems()) && count($result_lists->getItems())) {
            foreach ($result_lists->getItems() as $tasklist) {
                $taskListId = trim($tasklist->getId());
                if ($taskListId) {
                    // listTasks() takes the task list id directly.
                    $result_tasks = $service->tasks->listTasks($taskListId);
                    $tasks[] = $result_tasks->getItems();
                }
            }
            return $tasks;
        }
    } catch (Exception $e) {
        log_message('error', $e->getMessage());
    }
}

How to optimize an API call in PHP

I'm working (for fun) with an API (the Riot API), and I made something to retrieve match histories (from the game). Everything works really fine, but I have a problem: I don't know how to optimize it. What I mean is: I make the API calls every time the user refreshes the page, and it takes a very long time. I tried calling it with AJAX, but didn't find a good way; AJAX can't access my objects.
Here is the code :
$match_history = [];
$api = "";
if (!empty($region)) {
    try {
        $api = new App\Library\RiotAPI\RiotAPI([
            App\Library\RiotAPI\RiotAPI::SET_KEY => App\Config::API_KEY,
            App\Library\RiotAPI\RiotAPI::SET_CACHE_RATELIMIT => true,
            App\Library\RiotAPI\RiotAPI::SET_CACHE_CALLS => true,
            App\Library\RiotAPI\RiotAPI::SET_REGION => App\Library\RiotAPI\Definitions\Region::getRegion($region),
        ]);
    } catch (\Exception $e) {
        // die($e->getMessage());
    }
}
if ($api) {
    // Insert current rank etc...
    try {
        $summoner = $api->getSummonerByName(App\Repository\UserRepository::getInstance()->getUserDetail($_SESSION['user']['id'], 'summoner_name'));
    } catch (\Exception $e) {
        $summoner = null;
    }
    // Match history
    if (!empty($summoner)) {
        try {
            $matches = $api->getRecentMatchlistByAccount($summoner->accountId);
            // For every match
            foreach ($matches as $match) {
                $a_match = $api->getMatch($match->gameId);
                if ($a_match->gameType === "MATCHED_GAME") {
                    $gameCreation = date("d-M-Y H:i:s", substr($a_match->gameCreation, 0, 10));
                    if ($gameCreation >= date("d-M-Y", strtotime($user['created_at']))) {
                        // Get the participant ID of the customer
                        foreach ($a_match->participantIdentities as $participantIdentity) {
                            if ($participantIdentity->player->currentAccountId === $summoner->accountId) {
                                $participantId = $participantIdentity->participantId;
                            }
                        }
                        // Get stats of the participant
                        foreach ($a_match->participants as $participant) {
                            if ($participant->participantId === $participantId) {
                                $match_history[$match->gameId]['gameCreation'] = $gameCreation;
                                $match_history[$match->gameId]['championId'] = $participant->championId;
                                $match_history[$match->gameId]['spells']['spell1'] = $participant->spell1Id;
                                $match_history[$match->gameId]['spells']['spell2'] = $participant->spell2Id;
                                $match_history[$match->gameId]['win'] = $participant->stats->win;
                                $match_history[$match->gameId]['kills'] = $participant->stats->kills;
                                $match_history[$match->gameId]['deaths'] = $participant->stats->deaths;
                                $match_history[$match->gameId]['assists'] = $participant->stats->assists;
                                $match_history[$match->gameId]['goldEarned'] = $participant->stats->goldEarned;
                                $match_history[$match->gameId]['totalMinionsKilled'] = $participant->stats->totalMinionsKilled;
                                $match_history[$match->gameId]['items']['item0'] = $participant->stats->item0;
                                $match_history[$match->gameId]['items']['item1'] = $participant->stats->item1;
                                $match_history[$match->gameId]['items']['item2'] = $participant->stats->item2;
                                $match_history[$match->gameId]['items']['item3'] = $participant->stats->item3;
                                $match_history[$match->gameId]['items']['item4'] = $participant->stats->item4;
                                $match_history[$match->gameId]['items']['item5'] = $participant->stats->item5;
                                $match_history[$match->gameId]['items']['item6'] = $participant->stats->item6;
                            }
                        }
                    }
                }
            }
        } catch (\Exception $e) {
            // die($e->getMessage());
        }
    }
}
I would like to know if there's a way to run it in the background without AJAX, or something like: keep the match_history for X time, and after X time has passed, make the calls again when the user refreshes the page.
Thanks for your help!
Best Regards.
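One simple approach for the "keep match_history for X time" idea is a small file-based cache with a TTL, so the expensive API calls only run when the cached copy has expired. This is a hedged sketch: the cache path, the 600-second TTL, and the `getCached()` helper are all assumptions, not part of the Riot API library.

```php
<?php
// File-based TTL cache sketch: return the cached JSON if it is fresh
// enough, otherwise run the expensive $fetch callable and store its result.

function getCached(string $cacheFile, int $ttlSeconds, callable $fetch): array
{
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttlSeconds) {
        return json_decode(file_get_contents($cacheFile), true);
    }
    $data = $fetch(); // the slow Riot API calls would happen in here
    file_put_contents($cacheFile, json_encode($data));
    return $data;
}

// Hypothetical usage: wrap the existing match-history loop in a closure so
// it only runs when the per-summoner cache file is older than 10 minutes.
// $match_history = getCached(
//     sys_get_temp_dir() . '/match_history_' . $summoner->accountId . '.json',
//     600,
//     function () use ($api, $summoner) { /* build and return $match_history */ }
// );
```

The design trade-off: every refresh within the TTL is served from disk in milliseconds, at the cost of the history being up to TTL seconds stale. A cron job that pre-warms the cache would remove even the occasional slow refresh.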

PHP: Improve performance when executing multiple queries while reading a file with thousands of lines

I'm trying to build a script that reads a txt file and runs some processing on each line. For example, I need to check whether the ID exists: if the information has been updated, update the current table; if not, insert a new row into another temporary table to be checked manually later.
These files may contain more than 20-30 thousand lines.
When I just read the file and print some dummy content from the lines, it takes 40-50 ms. However, when I connect to the database to do all those verifications, the script stops before the end due to a timeout.
This is what I'm doing so far:
$handle = fopen($path, "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        $segment = explode('|', $buffer);
        if (strlen($segment[0]) > 6) {
            $param = [':code' => intval($segment[0])];
            $codeObj = Sql::exec("SELECT value FROM product WHERE code = :code", $param);
            if (!$codeObj) {
                $param = [
                    ':code' => $segment[0],
                    ':name' => $segment[1],
                    ':value' => $segment[2],
                ];
                Sql::exec("INSERT INTO product_tmp (code, name, value) VALUES (:code, :name, :value)", $param);
            } else {
                if ($codeObj->value !== $segment[2]) {
                    $param = [
                        ':code' => $segment[0],
                        ':value' => $segment[2],
                    ];
                    Sql::exec("UPDATE product SET value = :value WHERE code = :code", $param);
                }
            }
        }
    }
    fclose($handle);
}
And this is my Sql Class to connect with PDO and execute the query:
public static function exec($sql, $param = null) {
    try {
        $conn = new PDO('mysql:charset=utf8mb4;host=....'); // I've just deleted the information to connect to the database (password, user, etc.)
        $q = $conn->prepare($sql);
        if (isset($param)) {
            foreach ($param as $key => $value) {
                $$key = $value;
                $q->bindParam($key, $$key);
            }
        }
        $q->execute();
        $response = $q->fetchAll();
        if (count($response)) return $response;
        return false;
    } catch (PDOException $e) {
        return 'ERROR: ' . $e->getMessage();
    }
}
As you can see, each query I run through Sql::exec() opens a new connection. I don't know if this is the cause of such a delay in the process, because when I don't run any SQL query the script finishes within milliseconds. Or is some other part of the code causing this problem?
First of all, rewrite your function like this, to avoid multiple connects and also to get rid of useless code:
// Assumes the class declares a static property, e.g. protected static $conn;
public static function getPDO() {
    if (!static::$conn) {
        static::$conn = new PDO('mysql:charset=utf8mb4;host=....');
    }
    return static::$conn;
}

public static function exec($sql, $param = null) {
    $q = static::getPDO()->prepare($sql);
    $q->execute($param);
    return $q;
}
Then create a unique index on the code field, and use a single INSERT ... ON DUPLICATE KEY UPDATE query instead of your three queries.
You may also want to wrap your inserts in a transaction; it can speed up the inserts by up to 70 times.
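The three suggestions above could be combined roughly as follows. The `product` table and column names come from the question; the unique index, the upsert statement, and the single surrounding transaction are the assumptions being illustrated, and the line-parsing helper is hypothetical.

```php
<?php
// Parse one pipe-delimited line into the parameter set used by the upsert.
// Returns null for lines that should be skipped (same length rule as the
// question's code: the code field must be longer than 6 characters).
function parseProductLine(string $line): ?array
{
    $segment = explode('|', rtrim($line, "\r\n"));
    if (count($segment) < 3 || strlen($segment[0]) <= 6) {
        return null;
    }
    return [
        ':code'  => intval($segment[0]),
        ':name'  => $segment[1],
        ':value' => $segment[2],
    ];
}

// Hypothetical usage, assuming a unique index on product.code: one prepared
// upsert reused for every line, inside one transaction for the whole file.
// $pdo = Sql::getPDO();
// $stmt = $pdo->prepare(
//     "INSERT INTO product (code, name, value)
//      VALUES (:code, :name, :value)
//      ON DUPLICATE KEY UPDATE value = VALUES(value)"
// );
// $pdo->beginTransaction();
// while (($line = fgets($handle, 4096)) !== false) {
//     if ($params = parseProductLine($line)) {
//         $stmt->execute($params);
//     }
// }
// $pdo->commit();
```

This replaces up to three round-trips per line (SELECT, then INSERT or UPDATE) with one, and amortizes the commit cost over the whole file.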

PHP PDO MySQL transactions not executed

I'm struggling with implementing PHP and MySQL transactions. The script receives an SQL statement along with some bind parameters through a blocking Redis queue. Everything is passed to a function, do_transaction, which keeps track of the number of statements received.
I've debugged the PDO statement (after it has been processed) with PdoDebugger, and the output is correct:
UPDATE bla SET processed = 1, severity_ou1 = 'low',
severity_ou2 = 'low', severity_ou3 = 'low', severity_ou4 = 'low',
severity_ou5 = 'low', saved = '1', hname = '1', sname = '1', if = '1',
v = '1', translated = 'blablabla.', filtered = 1, repeated = '1',
excessed = '1', eventfilterid = '212', building = '1', floor = '1'
WHERE id = '121614624'
global $batchcount;
$batchcount = 1;
while (true) {
    $redis = new Redis();
    $redis->connect('xxx', xxx);
    $sqlbatch = $redis->blpop('xxx:xxx:sqlfiltermatch', 0);
    // blpop returns array: 0 has key, 1 has data.
    if (is_array($sqlbatch)) {
        if (isJson($sqlbatch[1])) {
            $batchstatements = array();
            $batchstatements[] = json_decode($sqlbatch[1], true);
            // Get statement and bindparams.
            $sqlstatement = $batchstatements[0]['statement'];
            $bindparams = $batchstatements[0]['bindparams'];
            // Replace empty bindparams.
            foreach ($bindparams as $column => $value) {
                if (is_null($value)) { $bindparams[$column] = '1'; }
                if (empty($value)) { $bindparams[$column] = '1'; }
            }
        }
        $batchcount++;
        do_transaction($sqlstatement, $bindparams, $batchcount);
    }
}

function do_transaction($sqlstatement, $bindparams) {
    global $batchcount;
    if ($batchcount >= 4) {
        try {
            // Setup DB
            $db = new PDO('mysql:host=xxx;dbname=xxx;charset=utf8', 'xxx', 'xxx', array(PDO::ATTR_PERSISTENT => true, PDO::ATTR_AUTOCOMMIT => FALSE, PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING));
            echo $db->getAttribute(PDO::ATTR_AUTOCOMMIT)."\n\n";
            $db->beginTransaction();
            $stmt = $db->prepare($sqlstatement);
            // Setup bindparams.
            foreach ($bindparams as $column => $value) {
                $stmt->bindParam(":$column", $value);
            }
            $stmt->execute() or die(print_r($stmt->errorInfo(), true));
            echo PdoDebugger::show($sqlstatement, $bindparams)."\n";
            $db->commit();
        } catch (PDOExecption $e) {
            //$db->rollback();
            print_r("ERROR"); exit;
        }
        $batchcount = 0;
    }
    $batchcount++;
}
I've made sure that AUTOCOMMIT = FALSE. Where in "do_transaction" does it go wrong?
There is no point in using transactions this way, so just leave them out.
function do_query($db, $sqlstatement, $bindparams) {
    $stmt = $db->prepare($sqlstatement);
    $stmt->execute($bindparams);
    return $stmt;
}
is all the code you actually need.
Use it this way:
$db = new PDO('mysql:host=xxx;dbname=xxx;charset=utf8', 'xxx', 'xxx',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);
while (true) {
    // whatever redis code goes here
    do_query($db, $sqlstatement, $bindparams);
}
Nota bene: if you want to make your inserts faster, you should ask a question titled "How to make inserts faster", not "My transactions do not work".
But your idea of batching transactions is wrong too.
A single transaction (in terms of the business logic) has to be written to the database as soon as possible, without interfering with other transactions. This means you should never couple different business-logic transactions within a single database transaction, because an error in one of them will ruin the whole batch. So just write them separately.

Unable to catch Exception using CodeIgniter

This is my controller code, which saves data in a table. But when I submit more data than a column's maximum length, it does not return STATUS false; nothing happens at all. Please help.
function saveImproveUs() {
    $status = array("STATUS" => "false");
    try {
        $improveUs = array(
            'NAME' => trim($this->input->post('name')),
            'EMAIL' => trim($this->input->post('email')),
            'LOCATION' => trim($this->input->post('location')),
            'MESSAGE_TYPE' => trim($this->input->post('messageType')),
            'COMMENTS' => trim($this->input->post('comments'))
        );
        // Save improve us
        $this->db->insert('trn_improve_us', $improveUs);
        if ($this->db->affected_rows() > 0) {
            $status = array("STATUS" => "true");
        }
    } catch (Exception $ex) {
        //show_error($ex);
        echo "I am in exception";
        exit;
    }
    echo json_encode(array($status));
}
You have to throw the exception, it won't do this for you.
if ($this->db->affected_rows() > 0) {
    $status = array("STATUS" => "true");
}
else {
    throw new Exception("Could not insert data");
}
Also, inserting more data than a column can hold gets silently truncated by MySQL (unless strict mode is enabled); the insert won't actually fail. You should use strlen to check the length of each string and validate it manually.
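A minimal sketch of that manual check. The field names come from the question's $improveUs array; the maximum lengths and the validateLengths() helper are assumptions, so substitute your real column sizes:

```php
<?php
// Return a list of human-readable errors for any field whose value is
// longer than its column allows. An empty array means the data fits.
function validateLengths(array $data, array $maxLengths): array
{
    $errors = [];
    foreach ($maxLengths as $field => $max) {
        if (isset($data[$field]) && strlen($data[$field]) > $max) {
            $errors[] = "$field exceeds $max characters";
        }
    }
    return $errors;
}

// Hypothetical usage before the insert in saveImproveUs():
// $errors = validateLengths($improveUs, ['NAME' => 100, 'EMAIL' => 255, 'COMMENTS' => 1000]);
// if ($errors) {
//     echo json_encode(array("STATUS" => "false", "ERRORS" => $errors));
//     return;
// }
```

Returning the specific errors (rather than a bare STATUS false) also lets the caller show the user which field was too long.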
