I have this code:
$tx = Yii::$app->db->beginTransaction();
try {
    // CODE (a lot of active-record reads and writes)
    $tx->commit();
    echo "All good!";
} catch (\Throwable $ex) {
    $tx->rollBack();
    echo "Error";
}
It executes and I get "All good!" as a result.
However, nothing actually gets stored in the database.
The code had been working for several months and wasn't modified.
It suddenly stopped working yesterday.
After an hour of debugging, I can confirm the operations go fine; I can echo intermediate results, including the IDs of the records I'm inserting. But still, nothing is saved in the DB after the final commit.
If I remove the transaction, the code works and everything gets stored in the DB, as it used to with the transaction around.
I want to ensure the atomicity of the changes, so I want to get the code back inside the transaction.
Or, at least, I want to understand which code (or DB state, or whatever else) prevents the data from being stored, and why I don't get any exception and my "Error" echoed, since the transaction failed to commit.
I was sure that if a transaction fails to commit (i.e. to actually write to the DB), the commit() method would throw an exception, but it does not. Is there a way to get one?
Thank you very much in advance.
The problem turned out to be in the code.
Thanks, @Michal Hynčica, you were right indeed.
There was a part like:
foreach (...) {
    $tx = Yii::$app->db->beginTransaction();
    if ($oneRareCondition) {
        continue;
    }
    ...
    $tx->commit();
}
It was solved by adding $tx->commit(); before the continue; in that rarely-triggered if:
foreach (...) {
    $tx = Yii::$app->db->beginTransaction();
    if ($oneRareCondition) {
        $tx->commit();
        continue;
    }
    ...
    $tx->commit();
}
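For anyone hitting the same symptom: Yii 2 counts nested beginTransaction() calls, so a transaction opened inside the loop and never closed leaves the nesting level unbalanced, and a later commit() silently just decrements the counter instead of writing anything. A minimal defensive sketch of the fixed loop (the $items collection and the level/isActive checks are my additions, not from the original post):

foreach ($items as $item) {
    $tx = Yii::$app->db->beginTransaction();
    try {
        if ($oneRareCondition) {
            $tx->commit(); // keep every begin paired with a commit
            continue;
        }

        // ... active-record reads and writes ...

        // Optional sanity check: a leftover inner transaction would make
        // this commit() merely decrement the nesting level.
        if ($tx->level !== 1) {
            throw new \RuntimeException('Unbalanced nested transaction');
        }

        $tx->commit();
    } catch (\Throwable $ex) {
        if ($tx->isActive) {
            $tx->rollBack();
        }
        throw $ex;
    }
}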
I'm building a full system with both Laravel and VueJS, and at some point I'm generating pictures from a list of pictures with Image Intervention.
But this process can break; there are many issues that I've faced and resolved, and more can appear in the future.
What would you recommend to keep broken code from stopping the rest? I was thinking of some service that would be called independently and asynchronously.
Can Laravel cover that? I have read about events in both Laravel and Symfony, but that is something I never understood.
Greetings
Well, I ran into a similar problem some days ago, although mine was related to inserting data from a CSV into the database, so there was a chance of hitting a different datatype that would raise an error and halt the whole remaining process. I used try/catch inside my job. I'll show you my reference; you can modify it as you wish:
$error_arr = array();
$error_row_numbers = array();

try {
    // Write your code here that might throw an error
    $row = Model::updateOrCreate(
        ['id' => $id],
        $Arr
    );
} catch (\Exception $e) {
    // Optionally you can store the error message and the
    // number of the row/image that failed here
    $error_arr[] = $e->getMessage();
    $error_row_numbers[] = $row_no; // $row_no is a separate counter,
                                    // incremented in the loop, used to
                                    // identify the exact image
}
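To run this independently and asynchronously, as the question asks, a queued job is the usual Laravel tool: each picture gets its own job, and one failure doesn't stop the others. A rough sketch under my own assumptions (the class name, the generation code, and the queue setup are hypothetical, not from the original posts):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Log;

class GeneratePicture implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $tries = 3; // let the queue retry a failed picture

    private $picturePath;

    public function __construct($picturePath)
    {
        $this->picturePath = $picturePath;
    }

    public function handle()
    {
        try {
            // ... the Image Intervention work goes here ...
        } catch (\Throwable $e) {
            // One broken picture fails only this job, not the rest.
            Log::error("Picture {$this->picturePath} failed: " . $e->getMessage());
        }
    }
}

Dispatching one job per picture keeps the loop going even when single items break:

foreach ($pictures as $path) {
    GeneratePicture::dispatch($path);
}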
I needed to make sure an external API call (which might take 200-300 ms to respond) does not run more than once (please see my edit below for a better explanation of what I'm trying to do).
This is what I finally thought of:
try {
    // Start transaction
    $db->beginTransaction();

    // UPDATE (this sets in the database that the API call was successful)
    // Yes, before even calling the API
    $stmt = $db->prepare("QUERY1");
    $stmt->execute();

    // Row wasn't updated?
    if ($stmt->rowCount() < 1) {
        // STOP: it might be running in another code execution
        // (which is what we're trying to prevent here)
        throw new PDOException('Not updated!');
    }

    // Some external API call
    $call = $external_api->action();

    // External API call was NOT successful?
    if ($call !== TRUE) {
        throw new PDOException('API Failed!');
    }

    // UPDATE the same row again with the "real" API response data
    $stmt = $db->prepare("QUERY2");
    $stmt->execute();

    // All good
    $db->commit();
} catch (PDOException $e) {
    // If something went wrong, roll back!
    $db->rollBack();
}
But I have 3 questions I don't know the answer to:
Is this a good approach, or might it cause more harm than good?
What happens if PHP times out and we don't roll back? Will the transaction roll back automatically if we never commit?
In the following part of the code:
// Row wasn't updated?
if ($stmt->rowCount() < 1) {
    // STOP: it might be running in another code execution
    // (which is what we're trying to prevent here)
    throw new PDOException('Not updated!');
}
If rowCount() is 0, it won't roll anything back, right? (I don't want it to roll back the same query from another code execution.)
In the end, if there is a better method or approach to prevent double execution in my case, please kindly let me know.
Edit:
What I'm trying to do here is stop people from running this page from two different browsers at the same time (or refreshing too quickly) and getting the external API call inside this page executed more than once.
https://example.com/?execute_id=15
So if execution goes inside if ($stmt->rowCount() < 1) {...}, it means this row was already updated by another execution, and throwing the exception stops the following API call from running again.
I don't know how to explain it better :(
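For context, QUERY1 in this pattern is typically a conditional "claim" update. A minimal sketch with a hypothetical executions table (columns id and status; not the asker's actual schema):

// Claim the row: succeeds only if no other execution claimed it first.
// With InnoDB, the matched row stays locked until commit/rollback, so a
// concurrent request blocks here and then sees rowCount() === 0.
$stmt = $db->prepare(
    "UPDATE executions SET status = 'running'
     WHERE id = :id AND status = 'pending'"
);
$stmt->execute([':id' => $_GET['execute_id']]);

if ($stmt->rowCount() < 1) {
    // Another execution got here first. Rolling back now is harmless:
    // this session changed nothing, and it cannot undo another
    // session's committed work.
    throw new PDOException('Already executed!');
}

// Note: if PHP times out before commit, MySQL rolls the still-open
// transaction back when the connection closes.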
I'm having a big headache with Laravel's chunk and each functions for breaking up result sets.
I have a table that has a column processed with a value of 0. If I run the following code, it goes through all 13002 records.
Record::where(['processed' => 0])->each(function ($record) {
    Listing::saveRecord($record);
}, 500);
This code will run through all 13002 records. However, if I add some code to mark a record as processed, things go horribly pear-shaped.
Record::where(['processed' => 0])->each(function ($record) {
    $listing_id = Listing::saveRecord($record);
    $record->listing_id = $listing_id;
    $record->processed = 1;
    $record->save();
}, 500);
When this code runs, only 6002 records are processed.
From my understanding, on each iteration of the chunk (each runs on top of chunk), a new query is executed. Since the query filters on processed = 0 and the loop keeps flipping rows to processed = 1, the result set shrinks while the offset keeps advancing, so roughly half the records get skipped.
I've come from Yii2 and I'm mostly happy with the move, except for this hiccup, which has me pulling my hair out. Yii2 has similar functions (each and batch), but they seem to use result sets and pointers, so even if you update the table while you're processing your results, it doesn't affect your result set.
Is there actually a better way to do this in Laravel?
Try this:
Records::where('processed', 0)->chunk(100, function ($records) {
    foreach ($records as $record) {
        // do your stuff...
    }
});
https://laravel.com/docs/5.3/queries#chunking-results
Sorry about the indentation; I'm on my phone and that apparently doesn't work.
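Note that a plain chunk() over a processed = 0 filter still suffers from the shifting-offset problem described in the question. Laravel also offers chunkById(), which paginates on the primary key instead of an offset, so updating rows inside the loop is safe; a sketch assuming Record has an auto-incrementing id column:

Record::where('processed', 0)->chunkById(500, function ($records) {
    foreach ($records as $record) {
        $listing_id = Listing::saveRecord($record);
        $record->listing_id = $listing_id;
        $record->processed = 1;
        $record->save();
    }
});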
Hello friends from SO!
Alright, this time I have a slightly more complex problem. We have a web crawler that runs and functions normally during most of the day.
However, from time to time it just stops: the link which is supposed to be analyzed next never changes its state (from pending to scanning), and of course this stops the whole cycle.
We're logging all PHP errors using:
// production errors
#ini_set('error_reporting', -1);
#ini_set('log_errors','On');
#ini_set('display_errors','Off');
#ini_set('error_log','/var/www/vhosts/xxx.com/xxx.com/xxx');
There's no evidence of anything that could cause the problem described. Zero anomalies.
Therefore, I believe the problem might be related to some kind of MySQL issue.
Every single MySQL query we run goes through MySQLi via custom-made functions, so my question here is:
Is there any simple approach to record every single MySQL error in the same file where we are storing the PHP errors?
Here are some of the functions used to query MySQL:
function db_ob($db_link, $ask) {
    $feedback = mysqli_fetch_object(mysqli_query($db_link, $ask));
    return $feedback;
}
and:
function db_ob_all($db_link, $ask) {
    $value = array(); // initialize so an empty result returns an empty array
    $feedback = mysqli_query($db_link, $ask);
    while ($row = mysqli_fetch_object($feedback)) {
        $value[] = $row;
    }
    return $value;
}
So what I'm looking for is a one- or two-line solution that I could add to these functions in order to store and track any issue or error in the same file where I'm currently storing the PHP errors.
Thanks in advance!
Chris
Solved:
1) Make a function that logs the errors to the PHP error_log:
function informar_error_db($db_link) {
    error_log("Dreadful SQL error: " . mysqli_error($db_link) . " in " . $_SERVER["SCRIPT_FILENAME"]);
}
2) If there are MySQLi issues, log them:
function db_ask($db_link, $ask) {
    $feedback = mysqli_query($db_link, $ask);
    if (mysqli_error($db_link)) { informar_error_db($db_link); }
    return $feedback;
}
Here:
if (mysqli_error($db_link)) { informar_error_db($db_link); }
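A side note, not part of the original solution: on PHP 5.3+ you can also switch MySQLi into exception mode once at startup, so every failing query throws and, if uncaught, ends up in the same PHP error_log without touching each helper function:

// Make every MySQLi error raise a mysqli_sql_exception.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

try {
    $feedback = mysqli_query($db_link, $ask);
} catch (mysqli_sql_exception $e) {
    error_log("Dreadful SQL error: " . $e->getMessage()
        . " in " . $_SERVER["SCRIPT_FILENAME"]);
}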
I have a bit of an issue with my code.
I'm making an administrative panel for users to add things to the database. On occasion, they might try to save data without changing it (open a dialog and click save without changing anything). This, of course, makes mysql_affected_rows() return 0 when checking whether the UPDATE query worked.
Is there another query that will always UPDATE regardless of whether the data is the same or not (or can I modify a simple UPDATE query so it always updates)?
EDIT
This is for users who don't have programming experience. Of course you wouldn't want to update if there's no reason to, but when a user tries to update and nothing happens, I end up showing a failure message. Nothing is actually wrong; the row just doesn't need updating. I need a way to show the user that, instead of a generic 'failure' message. If it failed for another reason, I still need to know.
From the MySQL Documentation:
If you set a column to the value it currently has, MySQL notices this
and does not update it.
Instead of checking mysql_affected_rows, just check to see if the query was successful:
if (!mysql_query("UPDATE ...")) {
    // failure
} else {
    $verification = mysql_query("SELECT ROW_COUNT() as rows_affected");
    $row = mysql_fetch_row($verification);
    $rows_affected = $row[0];

    if ($rows_affected > 0) {
        // update was performed
    } else {
        // no update was needed
    }
}
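Alternatively (a side note under my own assumptions, not part of the original answer): MySQL can report matched rows instead of changed rows if the connection is opened with the CLIENT_FOUND_ROWS flag, in which case an UPDATE that matches a row but changes nothing still counts as affected. The legacy mysql_* extension doesn't expose the flag cleanly, but with MySQLi it looks like this:

// Connect with MYSQLI_CLIENT_FOUND_ROWS so affected_rows counts
// matched rows, not just changed ones.
$db = mysqli_init();
mysqli_real_connect(
    $db, 'localhost', 'user', 'pass', 'dbname',
    3306, null, MYSQLI_CLIENT_FOUND_ROWS
);

mysqli_query($db, "UPDATE ...");
if (mysqli_affected_rows($db) > 0) {
    // the row was matched, even though nothing may have changed
}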