In CodeIgniter, are we also allowed to use the "Active Record Class" like this?
<?php
$data_balance = array(
    'balance' => $newbalance
);
$data_transaction = array(
    'amount' => $amount,
    'user'   => $user
);

$this->db->trans_start();

// This is an "insert" query on one table
$this->db->set('date', 'NOW()', FALSE);
$this->db->insert('financial_transactions', $data_transaction);

// This is an "update" query on another table
$this->db->where('user_id', $user_id);
$this->db->update('financial_balances', $data_balance);

$this->db->trans_complete();

if ($this->db->trans_status() === FALSE) {
    // Do stuff if it failed
}
?>
Note that I use $this->db-> for both queries, so I don't know whether the result of the first one is cleared before the second one runs.
Is this going to work? Can I trust that either both queries succeed or neither of them does (I don't want one to succeed and the other to fail)?
From documentation:
By default CodeIgniter runs all transactions in Strict Mode. When strict mode is enabled, if you are running multiple groups of transactions, if one group fails all groups will be rolled back. If strict mode is disabled, each group is treated independently, meaning a failure of one group will not affect any others.
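If you want to decide yourself when to abort (for example when a query touches zero rows), CodeIgniter also offers manual transaction control alongside trans_start()/trans_complete(). A minimal sketch, reusing the arrays from the question:

```php
// Sketch: manual transaction control with trans_begin()/trans_commit()/
// trans_rollback(), so the rollback decision is in your hands.
$this->db->trans_begin();

$this->db->set('date', 'NOW()', FALSE);
$this->db->insert('financial_transactions', $data_transaction);

$this->db->where('user_id', $user_id);
$this->db->update('financial_balances', $data_balance);

// Roll both queries back if the update matched no row or anything failed;
// otherwise commit both together.
if ($this->db->affected_rows() === 0 || $this->db->trans_status() === FALSE) {
    $this->db->trans_rollback();
} else {
    $this->db->trans_commit();
}
```

With trans_begin() (unlike trans_start()) nothing is committed or rolled back until you say so, which answers the "should I break the transaction manually?" case.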
I tried to make my first query return affected rows: 0 to see if the transaction fails, but it continued executing the second query.
Should I break the transaction manually?
DB::transaction(function () {
    User::where('id', 1002)->update(['name' => 'x']); // id 1002 doesn't exist
    Post::where('user_id', 1)->update(['title' => 'New Title']);
});
There's not a lot of context around your sample code, but a very basic approach would be something like this:
$user = User::findOrFail(1002);
$user->update(['name' => 'x']);

if ($user->wasChanged('name')) {
    Post::where('user_id', 1)->update(['title' => 'New Title']);
}
So the first line will throw an exception if the model isn't found. Then we do an update. You specifically said you were checking for 0 affected rows, so next we use the wasChanged() method. It "determines if any attributes were changed when the model was last saved within the current request cycle." If that's the case, we proceed with the next update.
There are other changes that could be made involving, for example, route model binding if there was more of your code shown in the question, but hopefully this is a helpful start.
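If you do want both updates inside one transaction that aborts when the first touches no rows, note that throwing any exception inside the DB::transaction closure rolls the whole transaction back. A sketch, reusing the models from the question:

```php
// Sketch: Eloquent's update() returns the number of affected rows, so we
// can abort the transaction manually by throwing when it returns 0.
DB::transaction(function () {
    $affected = User::where('id', 1002)->update(['name' => 'x']);

    if ($affected === 0) {
        // Any exception thrown here makes DB::transaction roll back.
        throw new \RuntimeException('User 1002 not found, aborting.');
    }

    Post::where('user_id', 1)->update(['title' => 'New Title']);
});
```

This way neither statement is committed unless the first one actually changed a row.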
We currently encounter a Duplicate entry QueryException when executing the following code:
Slug::firstOrCreate([
    Slug::ENTITY_TYPE => $this->getEntityType(),
    Slug::SLUG => $slug
], [
    Slug::ENTITY_ID => $this->getKey()
]);
Since Laravel's firstOrCreate method first checks whether an entry with the given attributes exists before inserting it, this exception should never occur. However, we have an application with millions of visitors and millions of actions every day, and we therefore use a master DB connection with two slaves for reading. So it is possible that race conditions occur.
We tried separating the query and forcing the master connection for the read:
$slugModel = Slug::onWriteConnection()->where([
    Slug::SLUG => $slug,
    Slug::ENTITY_TYPE => $this->getEntityType()
])->first();

if ($slugModel && $slugModel->entity_id !== $this->getKey()) {
    $class = get_class($this);
    throw new \RuntimeException("The slug [{$slug}] already exists for a model of type [{$class}].");
}

if (!$slugModel) {
    return $this->slugs()->create([
        Slug::SLUG => $slug,
        Slug::ENTITY_TYPE => $this->getEntityType()
    ]);
}
However, the exception still occurs sometimes.
Our next approach would be to lock the table before the read check and release the lock after the write, to prevent any inserts with the same slug from other database actions between our read and our write. Does anyone know how to solve this? I don't really understand how Laravel's pessimistic locking can help here. We use MySQL for our database.
I would not recommend locking the table, especially if you have millions of visitors.
Most race conditions can be fixed by locks, but this one is not, because you cannot lock a row that does not exist (there is something like gap locking, but it won't help here).
Laravel does not handle race conditions by itself. If you call firstOrCreate, it does two queries:
SELECT item where slug=X and entity_type=Y
If it does not exist, create it
Because there are two queries, a race condition is possible: two users in parallel reach step 1, then both try to create the entry in step 2, and one of the inserts fails.
Since you already get a Duplicate Key error, you have evidently already put a unique constraint on the two columns that identify your row, which is good.
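For reference, such a composite unique constraint is typically declared in a migration. A sketch, with the table and column names assumed from the question:

```php
// Sketch: composite unique index on (entity_type, slug); the table name
// 'slugs' is an assumption based on the model in the question.
Schema::table('slugs', function (Blueprint $table) {
    $table->unique(['entity_type', 'slug']);
});
```

The database then rejects the second concurrent insert no matter what the application code does.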
What you could do now is catch the duplicate key error, like this:
try {
    $slug = Slug::firstOrCreate([
        Slug::ENTITY_TYPE => $this->getEntityType(),
        Slug::SLUG => $slug
    ], [
        Slug::ENTITY_ID => $this->getKey()
    ]);
} catch (\Illuminate\Database\QueryException $e) {
    $errorCode = $e->errorInfo[1];
    if ($errorCode == 1062) { // 1062 = MySQL duplicate entry
        $slug = Slug::where('slug', '=', $slug)
            ->where('entity_type', '=', $this->getEntityType())
            ->first();
    }
}
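As an aside: newer Laravel versions (10.x and later, if available in your version) ship a createOrFirst helper that inverts firstOrCreate, attempting the insert first and falling back to the select when the unique constraint rejects it. A sketch, assuming that method exists in your framework version:

```php
// Sketch: createOrFirst tries the INSERT first and only SELECTs on a
// duplicate-key failure, closing firstOrCreate's race window. Requires
// the unique constraint and a recent Laravel version.
$slug = Slug::createOrFirst([
    Slug::ENTITY_TYPE => $this->getEntityType(),
    Slug::SLUG => $slug
], [
    Slug::ENTITY_ID => $this->getKey()
]);
```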
One solution for this is to use a Laravel queue and make sure that it runs one job at a time; that way you will never have two identical queries running at the same time.
Of course, this will not work if you want to return the result within the same request.
We have an API function which checks a condition on the database with a select query, and if it is true we want to insert something into the database just once (for example, recording that the insertion was done). The problem is that when we call this API function multiple times concurrently, a race condition happens. In other words, assume we call this function twice: the first request checks the condition and it's true, then the second request checks it and it's true again, so both do an insert. What we want is that, once we check the condition, no one else can check it again until we finish the insertion.
We use PHP/Laravel and know about approaches like insert into ... select, replace into ..., and so on.
$order = Order::find($orderId);
$logRefer = $order->user->logrefer;

if (!is_null($logRefer) && is_null($logRefer->user_turnover_id)) {
    $userTurnover = new UserTurnover();
    $userTurnover->user_id = $logRefer->referrer_id;
    $userTurnover->order_id = $order->id;
    $userTurnover->save();

    $logRefer->order_id = $order->id;
    $logRefer->user_turnover_id = $userTurnover->id;
    $logRefer->save();
}
If a logrefer without a user-turnover is found, we set it and the corresponding user-turnover exactly once. We expect to see just one user-turnover related to this order, but after running this multiple times concurrently we see that multiple user-turnovers have been inserted.
I usually take advantage of transactions when operations need to be sequential, but your case is a bit more complex, because the condition itself has to be evaluated while the function is running. The idea I can give you is to keep a variable (in another table) on the database, used as a semaphore, which allows or forbids actions on the table (the condition being whether the semaphore value is set or not). Semaphores are useful in many concurrent scenarios.
The database should have a unique key on columns expected to be unique, even if some mechanism in the code prevents duplicates.
Wrap connected queries into a transaction, which will fail and roll back in a race condition:
try {
    DB::transaction(function () use ($orderId) {
        $order = Order::find($orderId);
        ...
        $logRefer->save();
    });
} catch (\Illuminate\Database\QueryException $ex) {
    Log::error("failed to write to database");
}
I'm running a dozen update statements in a script and some updates just won't happen; I need to know why and how to fix this.
I have this piece of code:
foreach ($clis as $c => $cli) {
    // ... do some stuff
    foreach ($emails as $e) {
        $Mailqueue = $this->em->getRepository('Sendsys\Entity\Mailqueue')
            ->findBy(array('email' => $e, 'status' => 'P'));

        foreach ($Mailqueue as $queue) {
            $queue->setStatus('S');
            $this->em->persist($queue);
        }
    }
    $this->em->flush(); // entity manager is retrieved before this
}
This basically runs through a (somewhat large) table to update the emails with status 'P' to 'S'; these emails are on a list in the $emails array.
I have tried using Doctrine's executeUpdate with a prepared statement instead of retrieving and persisting an entity each time, but it results in the exact same issue.
Some more details:
- The issue happens even with the mail sending function commented out.
- I know that all of the emails are being sent (when uncommented) and that the script generates all of the update statements; I printed them when testing executeUpdate.
- I know that the updates are reaching Postgres, because some rows do update, and if I refresh, the remaining rows randomly update until there are none left.
- I can't just update everything afterwards; I need to know where the script stopped in case of failure.
EDIT
It's not Doctrine related; I've connected directly to the database and ran the following statements inside a loop:
$dbh->setAttribute(\PDO::ATTR_ERRMODE, \PDO::ERRMODE_EXCEPTION);

$stmt = $dbh->prepare("UPDATE mailqueue SET status = 'S' WHERE status = :status AND email LIKE :mail");
$stmt->execute(array(':status' => 'P', ':mail' => $e));

echo $stmt->rowCount();
The update statements are only run at the end, by $this->em->flush().
This is the only call that tells Doctrine to begin the SQL transaction; if you need updates to be executed before that, just call the flush method earlier.
Until you flush, the changes remain only in the entities tracked by the entity manager.
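So, to know where the script stopped, you can flush (and therefore commit) in smaller units. A sketch, reusing the repository from the question:

```php
// Sketch: flushing inside the inner loop commits one batch per email
// address, so a failure pinpoints the address being processed when it
// happened, instead of losing a single big flush at the end.
foreach ($emails as $e) {
    $queues = $this->em->getRepository('Sendsys\Entity\Mailqueue')
        ->findBy(array('email' => $e, 'status' => 'P'));

    foreach ($queues as $queue) {
        $queue->setStatus('S');
        $this->em->persist($queue);
    }

    $this->em->flush(); // one transaction per email address
}
```

The trade-off is more, smaller transactions, which is usually acceptable for a queue-status update like this one.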
I am calling multiple models in controllers and all models do database queries.
I did something like this:
public function InsertSale()
{
    $this->db->trans_start(TRUE);
    // all the logic and model calls which insert/update/delete
    $this->db->trans_complete();
}
The above code is not working: even if something fails after some queries, they don't roll back.
Having TRUE in $this->db->trans_start(TRUE); puts the transaction into test mode, which means that, regardless of what happens, the queries will be rolled back.
If you want to see whether the queries would have worked, you can use:
$this->db->trans_status();
which will return either TRUE or FALSE depending on the outcome.
So you should do it like this:
$this->db->trans_start(); # Starting transaction
$this->db->trans_strict(FALSE);

$this->db->insert('table_name', $someDataArray); # Inserting data

# Updating data
$this->db->where('id', $id);
$this->db->update('table_name', $someDataArray);

$this->db->trans_complete();
This works fine.