I want to move records from my audits table, which has millions of rows, to another table using a console command and chunkById, but it stops partway through. For example, when I want to move the audits where MONTH(created_at) = 02 and YEAR(created_at) = 2021, it does not run through all the records matching that condition. When I check in MySQL there should be around 5 million records, but it only processes a few hundred thousand. My code in the console command is below:
Audit::query()
    ->whereRaw("MONTH(created_at) = '$month'")
    ->whereRaw("YEAR(created_at) = '$year'")
    ->chunkById(1, function ($audits) use ($table_name) {
        foreach ($audits as $audit) {
            dump($audit->id);
            $newRecord = $audit->replicate()->fill([
                'audit_id'   => $audit->id,
                'created_at' => $audit->created_at,
                'updated_at' => $audit->updated_at,
            ]);
            $newRecord->setTable($table_name);
            $newRecord->save();
            if (str_contains($audit->auditable_type, 'User') || str_contains($audit->auditable_type, 'Trans') || str_contains($audit->auditable_type, 'Step') || str_contains($audit->auditable_type, 'Team')) {
                $audit->delete();
            }
        }
    }, $column = 'id');
I have already followed many solutions I found on various sites, but it is still not working. Is there anything I missed?
In the Laravel documentation (https://laravel.com/docs/9.x/queries) there is a note that says: "When updating or deleting records inside the chunk callback, any changes to the primary key or foreign keys could affect the chunk query. This could potentially result in records not being included in the chunked results."
And in your code you are deleting audits in some cases.
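One possible workaround, given that warning, is to avoid chunking the same query you are deleting from. A minimal sketch, assuming the same Audit model and $table_name as above (untested, and the chunk size of 1000 is arbitrary): collect the matching IDs first, then iterate over that fixed list so deletes can no longer shift the chunk boundaries.

$ids = Audit::query()
    ->whereRaw("MONTH(created_at) = '$month'")
    ->whereRaw("YEAR(created_at) = '$year'")
    ->pluck('id');

// Iterate over a snapshot of the IDs instead of chunking the live query.
foreach ($ids->chunk(1000) as $chunk) {
    foreach (Audit::whereIn('id', $chunk)->get() as $audit) {
        $newRecord = $audit->replicate()->fill([
            'audit_id'   => $audit->id,
            'created_at' => $audit->created_at,
            'updated_at' => $audit->updated_at,
        ]);
        $newRecord->setTable($table_name);
        $newRecord->save();

        if (str_contains($audit->auditable_type, 'User')
            || str_contains($audit->auditable_type, 'Trans')
            || str_contains($audit->auditable_type, 'Step')
            || str_contains($audit->auditable_type, 'Team')) {
            $audit->delete();
        }
    }
}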
I'm working on a simple search project where I'm returning the results. The search function appears to work; however, the total and page fields return the wrong values. The total field returns the total number of rows in the database, not the total number of results from the search, and the page is always {}.
Here's the model function I've created:
public function search($string)
{
    $results = $this->select('*')->orLike('title', $string)->orLike('excerpt', $string);

    if ( empty( $results ) )
    {
        return [];
    } else
    {
        $data = [
            'results' => $results->paginate(2),
            'total' => $results->countAllResults(),
            'page' => $this->pager,
        ];

        return $data;
    }
}
What's puzzling is that if I place the total field above the results value, the count works, but then the results field returns everything in the database at paginate(2).
OK, I managed to solve this by running two separate queries against the database. The processing cost appears to be minimal, and it should be alright once the responses are cached. As it turns out, you can chain queries, but only in a particular order and if you use grouping (see ->groupStart()).
$results = $this->select('title, image, categories, id, excerpt')
    ->groupStart()
        ->like('title', $search)
        ->orLike('excerpt', $search)
    ->groupEnd()
    ->where('status', 'live')
    ->paginate(2);

$total = $this->select('title, image, categories, id, excerpt')
    ->groupStart()
        ->like('title', $search)
        ->orLike('excerpt', $search)
    ->groupEnd()
    ->where('status', 'live')
    ->countAllResults();
Some may argue that two queries are inefficient, but this works for my use case :) Hope this helps anyone else stuck on a similar problem.
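As a possible variation (untested, and an assumption about your CodeIgniter 4 version): countAllResults() accepts a $reset flag, so passing false may let you count and then paginate the same builder without rebuilding the conditions.

$builder = $this->select('title, image, categories, id, excerpt')
    ->groupStart()
        ->like('title', $search)
        ->orLike('excerpt', $search)
    ->groupEnd()
    ->where('status', 'live');

$total   = $builder->countAllResults(false); // false = keep the built query for reuse
$results = $builder->paginate(2);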
EDIT:
I want to thank #jimmix for giving me some ideas to get started on my last post, but unfortunately my post was put on hold due to the lack of details.
Here is the real scenario; I'm sorry if I didn't explain my question well.
From my CSV file I have raw data, which I upload using my upload() function into my phpMyAdmin database, into a table named "tbldumpbio".
See the table structure below (tbldumpbio):
From my tbldumpbio table data, I have a function called processTimesheet().
Here's the code:
public function processTimesheet(){
    $this->load->model('dbquery');
    $query = $this->db->query("SELECT * FROM tbldumpbio");

    foreach ($query->result() as $row){
        $dateTimeExplArr = explode(' ', $row->datetimex);
        $dateStr = $dateTimeExplArr[0];
        $timeStr = $dateTimeExplArr[1];

        if($row->status='C/Out' and !isset($timeStr) || empty($timeStr) ){
            $timeStrOut = '';
        } else {
            $timeStrOut = $dateTimeExplArr[1];
        }

        if($row->status='C/In' and !isset($timeStr) || empty($timeStr) ){
            $timeStrIn = '';
        } else {
            $timeStrIn = $dateTimeExplArr[1];
        }

        $data = array(
            'ID' => '',
            'companyAccessID' => '',
            'name' => $row->name,
            'empCompID' => $row->empid,
            'date' => $dateStr,
            'timeIn' => $timeStrIn,
            'timeOut' => $timeStrOut,
            'status' => '',
            'inputType' => ''
        );

        $this->dbquery->modInsertval('tblempbioupload', $data);
    }
}
This function inserts data into another table called "tblempbioupload". But here are the results that I'm getting:
Please see the data below (tblempbioupload):
The problem is:
the date should not be duplicated
Time In data should be added if the status is 'C/In'
Time Out data should be added if the status is 'C/Out'
The expected result should be something like this:
The first problem I see is that you have a time expressed as 15:xx:yy PM, which is an ambiguous format, as one can write 15:xx:yy AM and that would not be a valid time.
That said, if what you want is that a row should be written every time the date changes, you should do just that: store the previous date in a variable, and when you move to the next record in the source table, compare its date with the previous one. If they differ, insert the row; otherwise, simply keep reading the next bit of data.
Remember that this approach only works if you're certain the input rows are in exact order, meaning ordered by EmpCompId first, then by date, then by time; if they aren't, this procedure doesn't work properly.
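A minimal sketch of that "compare with the previous date" idea, reusing the names from the question's code and assuming the rows arrive ordered by employee, date, and time (untested):

$prevKey = null;
$pending = null;

foreach ($query->result() as $row) {
    list($dateStr, $timeStr) = array_pad(explode(' ', $row->datetimex), 2, '');
    $key = $row->empid . '|' . $dateStr;

    if ($key !== $prevKey) {
        // Employee/date changed: flush the accumulated row and start a new one.
        if ($pending !== null) {
            $this->dbquery->modInsertval('tblempbioupload', $pending);
        }
        $pending = array(
            'ID' => '', 'companyAccessID' => '',
            'name' => $row->name, 'empCompID' => $row->empid,
            'date' => $dateStr, 'timeIn' => '', 'timeOut' => '',
            'status' => '', 'inputType' => '',
        );
        $prevKey = $key;
    }

    if ($row->status == 'C/In')  { $pending['timeIn']  = $timeStr; }
    if ($row->status == 'C/Out') { $pending['timeOut'] = $timeStr; }
}

if ($pending !== null) {
    $this->dbquery->modInsertval('tblempbioupload', $pending); // flush the last row
}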
I would probably try another approach: if (though this is not clear from your question) only one row per empCompID and date should be present, I would run a grouping query on the source table to find the minimum entrance time, another one to find the maximum exit time, and use both of them as the source for the insert query.
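A rough sketch of that grouping approach, assuming datetimex is stored in a format MySQL can parse as a datetime (column and table names are taken from the question and may need adjusting):

$sql = "
    SELECT name, empid, DATE(datetimex) AS workdate,
           MIN(CASE WHEN status = 'C/In'  THEN TIME(datetimex) END) AS timeIn,
           MAX(CASE WHEN status = 'C/Out' THEN TIME(datetimex) END) AS timeOut
    FROM tbldumpbio
    GROUP BY name, empid, DATE(datetimex)
";

foreach ($this->db->query($sql)->result() as $row) {
    $this->dbquery->modInsertval('tblempbioupload', array(
        'ID' => '', 'companyAccessID' => '',
        'name' => $row->name, 'empCompID' => $row->empid,
        'date' => $row->workdate,
        'timeIn' => (string) $row->timeIn,   // '' when no C/In row exists (cast from NULL)
        'timeOut' => (string) $row->timeOut, // '' when no C/Out row exists
        'status' => '', 'inputType' => '',
    ));
}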
I currently have the code below; however, when adding around 2000 rows this runs too slowly because it is inside a foreach loop.
foreach ($tables as $key => $table) {
    $class_name = explode("\\", get_class($table[0]));
    $class_name = end($class_name);
    $moved = 'moved_' . $class_name;
    ${$moved} = [];

    foreach ($table[0]->where('website_id', $website->id)->get() as $value) {
        $value->website_id = $live_website_id;
        $value->setAppends([]);
        $table[0]::on('live')->updateOrCreate([
            'id' => $value->id,
            'website_id' => $value->website_id
        ], $value->toArray());
        ${$moved}[] = $value->id;
    }

    // Remove deleted rows
    if ($table[1]) {
        $table[0]::on('live')->where([ 'website_id' => $live_website_id ])
            ->whereNotIn('id', ${$moved})
            ->delete();
    }
}
What is happening is basically this: users add/update/delete data on a development server, and when they hit a button this data needs to be pushed into the live table while retaining the IDs, since auto-incrementing IDs on the live server won't work because of look-up tables and multiple users pushing data live at the same time.
What is the best way to do this? Should I simply remove all the data in that table (there is a unique identifier for chunks of data) and then just insert?
I think you can do the following:
You can create a temp table, fill it just like on your development server, and then rename it to the published table (see the sketch after this list).
Use a job and a background service (an async queue).
You can set your DB connection up as a singleton, because connecting multiple times wastes time.
You can use SQL-level tools; for example, on PostgreSQL you can use a Foreign Data Wrapper (FDW) to connect the databases to each other and update and create rows much faster.
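A hypothetical sketch of the first option for a MySQL setup (the products table, the 'live' connection name, and $rows are made-up placeholders): stage the data in a copy of the table, then swap it in with a single atomic rename.

use Illuminate\Support\Facades\DB;

// Build the staging table on the live connection.
DB::connection('live')->statement('CREATE TABLE products_staging LIKE products');

// $rows would be built from the development database, keeping the original IDs.
DB::connection('live')->table('products_staging')->insert($rows);

// Swap the tables atomically, then drop the old copy.
DB::connection('live')->statement('RENAME TABLE products TO products_old, products_staging TO products');
DB::connection('live')->statement('DROP TABLE products_old');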
So I am making a quiz in which the user gets confronted with a succession of questions which he answers through a form.
Each series of problems contains a given number of questions, and the questions are asked one after the other as the user validates each answer.
I am therefore trying to re-render the view with the form for each problem until they're all done. This is my action:
public function actionAnswer($id_serie)
{
    if ($id_serie != 0) // getting the serie's info
    {
        $serie = Serie::find()
            ->where(['id' => $id_serie])
            ->one();

        $problems = (new \yii\db\Query()) // getting the problems in the serie
            ->select('*')
            ->from('problems')
            ->where(['id_serie' => $id_serie])
            ->all();

        $prob_counter = $serie->nbr_of_problems; // counts the number of questions answered
        $id_serie = 0;
    }

    $model = new Answer;

    if ($model->load(Yii::$app->request->post()) && $model->validate())
    {
        $model->save(); // works just fine every time

        if (--$prob_counter <= 0)
        {
            return $this->redirect('index.php?r=student/entry');
        }
    }

    return $this->render('answer', [
        'model' => $model,
        'problems' => $problems,
        'serie' => $serie,
        'prob_counter' => $prob_counter, // these last two are for debug
        'id_serie' => $id_serie,
    ]);
}
When this action is executed the first time, $id_serie is never null or 0. Hence I am using it to query the DB only once and set a counter to the total number of problems in the serie (i.e. the number of times the user has to submit the form).
If the answer is valid, I decrement my counter, and once it reaches 0 there are no more questions to answer and the user gets redirected.
However, this counter never goes down to 0: it is set correctly, it is decremented only once, and then it never falls lower, no matter where I put the line (inside or outside any loop).
On the other hand the data from the form is properly inserted in the db each time.
What am I getting wrong?
As per your code, $prob_counter just stores the total number of problems for each series. You need to change this so it reflects the number of unanswered problems for the series. How you implement this will depend on your models and database, but it should be something like:
$problems = (new \yii\db\Query()) // getting the unanswered problems in the serie
    ->select('*')
    ->from('problems')
    ->where(['id_serie' => $id_serie])
    ->andWhere('not exists (select id from answer where problemid = problems.id)')
    ->all();
Also you should probably look at working with relational data and avoid using Query() in the above section.
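As a small follow-up sketch (names assumed from the answer above): once the query is filtered to unanswered problems, the counter can simply be recomputed from the result on every request instead of being decremented, since a plain local variable does not survive between HTTP requests anyway.

$prob_counter = count($problems); // recomputed on every request

if ($prob_counter <= 0) {
    return $this->redirect('index.php?r=student/entry');
}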
I am trying to update multiple records in one field in my database. For some reason I keep getting SQL Error: 1054: Unknown column '520947b9' in 'field list'. 520947b9 is part of my ID. I'm not understanding why that value is being treated as a field name. Here is my code. That said, I'm not sure I'm updating these records correctly. If I'm not, please point it out to me. Thanks!!
public function findPolicyIds($coverageId = null) {
    $policyid = $this->Policy->find('all', array(
        'recursive' => -1,
        'conditions' => array('Policy.coverage_id' => $coverageId),
        'fields' => array('Policy.id')));

    foreach ($policyid as $id) {
        $all[] = $id['Policy']['id'];

        foreach ($all as $key) {
            $this->Policy->Declination->updateAll(
                array('Declination.policy_id' => $key),
                array('Declination.coverage_id <=' => $coverageId)
            );
        }
    }
}
Here are my errors
Query: UPDATE declinations AS Declination LEFT JOIN policies AS Policy ON (Declination.policy_id = Policy.id) SET Declination.policy_id = 520947b9-0210-4067-94ea-70f8ae78509d WHERE Declination.coverage_id <= '520947b9-1fa0-45db-992e-70f8ae78509d'
Query: UPDATE declinations AS Declination LEFT JOIN policies AS Policy ON (Declination.policy_id = Policy.id) SET Declination.policy_id = 520947b9-0694-4724-b353-70f8ae78509d WHERE Declination.coverage_id <= '520947b9-1fa0-45db-992e-70f8ae78509d'
By the looks of your query, updateAll is not recognizing $key as a string. Either cast it as such, or add the ' characters yourself. Example:
$this->Policy->Declination->updateAll(
    array('Declination.policy_id' => "'" . $key . "'"),
    array('Declination.coverage_id <=' => $coverageId)
);
That's the SQL error.
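If you'd rather not build the quotes by hand, a possible alternative (a sketch against CakePHP 2.x; verify on your version) is to let the datasource escape the value for you:

$db = $this->Policy->Declination->getDataSource();

$this->Policy->Declination->updateAll(
    array('Declination.policy_id' => $db->value($key, 'string')), // quotes and escapes $key
    array('Declination.coverage_id <=' => $coverageId)
);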
Now
"That said, Im not sure Im updating these records correctly."
... Well, what do you want to do? Reading your code, you are getting an array of Policy IDs and, for each of them, updating all Declinations with a coverage_id <= $coverageId. That doesn't make much sense, because every iteration of that foreach updates the policy_id for the same condition, so in the end only the last change sticks: the last policy_id of the foreach ends up on every Declination with a coverage_id equal to or less than $coverageId. That doesn't make much sense to me, even without knowing what you need to do.
Based on the SQL, and assuming you are using an ORM, it appears to me that policy_id is defined as a numeric field in your Declination model when it really needs to be a string. The coverage_id field is working correctly, so compare the two definitions.