update last record of matching condition in single query - php

I want to update the last record that matches a condition. I was getting more than one result, so I did it a different way:
$u = user_payments::where('id', $id)->where('staff_id', $staffId)->orderBy('id','desc')->first();
$up = user_payments::where('id', $u->id)->update(['transaction_id' => $transaction_id]);
Here I have to query twice, so it hits the DB two times. I want to optimise this and do it in a single query. How can I do this? Thanks.

This should do what you need. Ultimately only a single UPDATE statement runs on the database:
$result = user_payments::where('id', $id)
    ->where('staff_id', $staffId)
    ->orderBy('id', 'desc')
    ->take(1)
    ->update(['transaction_id' => $transaction_id]);
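For reference, with the MySQL grammar this should compile down to roughly a single statement like the one below (a sketch assuming the model's default user_payments table; other database grammars may ignore the ORDER BY/LIMIT on an UPDATE):
UPDATE user_payments
SET transaction_id = ?
WHERE id = ? AND staff_id = ?
ORDER BY id DESC
LIMIT 1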

Related

Trying to delete all rows in database that do not match array

I am trying to figure out how to delete all ids in the database that do not exist in an array. I have been trying to use NOT IN in my query, but I am not sure why it won't work when run from a script, even though the same query works when I enter it into MySQL manually. Here is an example:
mysqli_query($con, "DELETE FROM table WHERE id NOT IN ($array)");
$array is a list of ids from a JSON API. I use cURL to fetch the ids, and I am trying to delete all ids in the database that do not match the ids in $array.
First I use another simple cURL script to scrape the APIs and insert the ids found into the database; what I am trying to do here is basically build a link/data checker.
If the ids in the database are not found in the array when rechecking, I want them deleted.
I thought the query above would work perfectly, but for some reason it doesn't. When the query is run from a script, the MySQL log shows the queries being run like this.
Example:
DELETE FROM table WHERE id NOT IN ('166')
or this when I am testing multiple values.
DELETE FROM table WHERE id NOT IN ('166', '253', '3324')
And what happens is it deletes every row in the table every time. I don't really understand, because if I copy/paste the same query from the log and run it manually myself, it works perfectly.
I have been trying various ways of capturing the array data, such as array_column, array_map, array_search, and various functions I have found, but the end result is always the same.
Right now, just for testing, I am using these two bits of code against two different APIs, which give me the same SQL query log output as above. The functions used are just a couple of random ones that I found.
// $result is the result from cURL using json_decode
function implode_r($g, $p) {
    return is_array($p) ?
        implode($g, array_map(__FUNCTION__, array_fill(0, count($p), $g), $p)) :
        $p;
}

foreach ($result['data'] as $info) {
    $ids = implode_r(',', $info['id']);
    mysqli_query($con, "DELETE FROM table WHERE id NOT IN ($ids)");
}
And
$arrayLength = count($result);
for ($i = 0; $i < $arrayLength; $i++) {
    mysqli_query($con, "DELETE FROM table WHERE id NOT IN ('".$result[$i]['id']."')");
}
If anyone knows what is going on, I'd appreciate the help, or any suggestions on how to achieve the same result. I am using PHP 7 and MySQL 5.7 with InnoDB tables, if that helps.
It probably doesn't work because your IN value is something like this:
IN('1,2,3,4');
when what you want is this:
IN('1','2','3','4')
OR
IN(1,2,3,4)
To get this with implode, include the quotes like this:
$in = "'".implode("','", $array)."'";
NOTE: whenever directly inputting variables into SQL there are security implications to consider, such as SQL injection. If the IDs are from a canned source you're probably OK, but I like to sanitize them anyway.
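For example, a minimal sketch of that sanitizing step, assuming $result['data'] is the decoded API payload from the question and the ids are numeric:
// cast every id to an integer, then build a single NOT IN list
$ids = array_map('intval', array_column($result['data'], 'id'));
if (!empty($ids)) {
    $in = implode(',', $ids); // e.g. 166,253,3324
    mysqli_query($con, "DELETE FROM table WHERE id NOT IN ($in)");
}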
You can't mix an array and a string like that.
This works:
mysqli_query($con, "DELETE FROM table WHERE id NOT IN (".implode(',', $ids).")");

Eloquent chunk() missing half the results

I have a problem with Laravel's Eloquent ORM chunk() method.
It misses some results.
Here is a test query:
$destinataires = Destinataire::where('statut', '<', 3)
    ->where('tokenized_at', '<', $date_active)
    ->chunk($this->chunk, function ($destinataires) {
        foreach ($destinataires as $destinataire) {
            $this->i++;
        }
    });
echo $this->i;
It gives 124838 results.
But:
$num_dest = Destinataire::where('statut', '<', 3)
    ->where('tokenized_at', '<', $date_active)
    ->count();
echo $num_dest;
gives 249676, exactly twice the result of the first code example.
My script is supposed to edit all matching records in the database. Each time I launch it, it only processes half of the remaining records.
I tried with DB::table() instead of the Model.
I tried to add a ->take(20000) but it doesn't seem to be taken into account.
I echoed the query with ->toSql() and everything seems to be fine (the LIMIT clause is added when I add the ->take() parameter).
Any suggestions?
Imagine you are using the chunk method to delete all of the records. The table has 2,000,000 records and you are going to delete all of them in chunks of 1,000:
$query->orderBy('id')->chunk(1000, function ($items) {
    foreach ($items as $item) {
        $item->delete();
    }
});
It will delete the first 1,000 records by fetching the first 1,000 records with a query like this:
SELECT * FROM table ORDER BY id LIMIT 0, 1000
And then the next query from the chunk method is:
SELECT * FROM table ORDER BY id LIMIT 1000, 1000
Our problem is here: we delete 1,000 records and then ask for results from offset 1,000. We are actually skipping the first 1,000 of the remaining records, which means 1,000 records escape the first step of the chunk! The same thing happens in every subsequent step: each step misses another 1,000 records, and this is why we don't get the expected result in these situations.
I made the example about deletion because that way the exact behavior of the chunk method is easy to see.
UPDATE:
You can use chunkById() for deleting safely.
Read more here:
http://laravel.at.jeffsbox.eu/laravel-5-eloquent-builder-chunk-chunkbyid
https://laravel.com/api/5.4/Illuminate/Database/Eloquent/Builder.html#method_chunkById
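A minimal sketch of the deletion example above rewritten with chunkById() (same $query as before):
// chunkById paginates on the primary key (WHERE id > last seen id),
// so deleting rows inside the callback does not shift the next page
$query->chunkById(1000, function ($items) {
    foreach ($items as $item) {
        $item->delete();
    }
});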
Quick answer: Use chunkById() instead of chunk().
When updating or deleting records while iterating over them, any changes to the primary key or foreign keys could affect the chunk query. This could potentially result in records not being included in the results.
The explanation can be found in the Laravel documentation:
If you are updating database records while chunking results, your chunk results could change in unexpected ways. If you plan to update the retrieved records while chunking, it is always best to use the chunkById method instead. This method will automatically paginate the results based on the record's primary key.
Example usage of chunkById():
DB::table('users')->where('active', false)
    ->chunkById(100, function ($users) {
        foreach ($users as $user) {
            DB::table('users')
                ->where('id', $user->id)
                ->update(['active' => true]);
        }
    });
(end of the update)
Below is the original answer which used the cursor() method instead of the chunk() method to solve the problem:
I had the same problem - only half of the total results were passed to the callback function of the chunk() method.
Here is the code which had the same problem - half of the transactions were not processed:
Transaction::whereNull('processed')->chunk(100, function ($transactions) {
    $transactions->each(function ($transaction) {
        $transaction->process();
    });
});
I used Laravel 5.4 and managed to solve the problem by replacing the chunk() method with the cursor() method and changing the code accordingly:
foreach (Transaction::whereNull('processed')->cursor() as $transaction) {
    $transaction->process();
}
Even though the answer doesn't address the problem itself, it provides a valuable solution.
For anyone looking for a bit of code that solves this, here you go:
while (Model::where('x', '>', 'y')->count() > 0) {
    Model::where('x', '>', 'y')->chunk(10, function ($models) {
        foreach ($models as $model) {
            $model->delete();
        }
    });
}
The problem is in the deletion/removal of models while chunking through the total. Wrapping it in a while loop makes sure you get them all! This example works when deleting models; change the while condition to suit your needs.
When you fetch data using chunk, the same SQL query is executed each time; only the offset differs, increasing by the amount specified as the chunk method's parameter. For example:
SELECT * FROM users WHERE status = 0;
Let's say there are 200 records (let's suppose that is a lot, so we want to retrieve the data in chunks). The queries then look like this:
SELECT * FROM users WHERE status = 0 LIMIT 50 OFFSET 0
(the offset is dynamic: next time it is 50, after that 100, and the last time 150)
The problem when using Laravel's chunk while updating is that only the offset changes, while the number of rows matching the query also changes with every update. The first time, there are 200 records that match the where condition. But if we update the status of that first chunk to 1 (status = 1), the next time we fetch data we still execute the same query:
SELECT * FROM users WHERE status = 0 LIMIT 50 OFFSET 50
(next time the offset is 100, and the last time 150)
Only 150 records still match this query, since we already set status = 1 for 50 rows. We also said the offset on the second run is 50, so we skip 50 of those 150 remaining rows and apply the same update to the next batch. In other words, of the 150 remaining rows, it is rows 50 to 100 whose status gets updated to 1 (status = 1).
The third time we run this query:
SELECT * FROM users WHERE status = 0 LIMIT 50 OFFSET 100
But by now only 100 users in total still have status = 0, so an offset of 100 skips past all of them, nothing is returned, and there is no more data to go through.
This is not what you would expect at first thought, but this is how it works, and it is why only half of the data gets updated while the other part is skipped.
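A sketch of the same users/status example using chunkById(), which paginates on the last seen id instead of an offset, so rows that were already updated cannot shift the window:
DB::table('users')->where('status', 0)
    ->chunkById(50, function ($users) {
        foreach ($users as $user) {
            DB::table('users')->where('id', $user->id)->update(['status' => 1]);
        }
    });
// roughly the queries issued:
//   SELECT * FROM users WHERE status = 0 ORDER BY id ASC LIMIT 50
//   SELECT * FROM users WHERE status = 0 AND id > <last id of previous chunk> ORDER BY id ASC LIMIT 50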

Not getting all rows from a query with odbc_exec in php

I'm trying to display how many ids my procedure finds, but the variable $processz only seems to contain one row of the SQL result. It should show that there are 17 rows or ids, but I only get 1. Why does this happen?
$conexion = con_abrir();
$sqlquery = "OEE.dbo.VerPlanillas_fechas '$Linea_ID','$fecha1','$fecha2'";
$processz = odbc_exec($conexion,$sqlquery);
con_cerrar($conexion);
$res = count($processz);
echo $res;
count($processz) tells you how many results you have - one.
If you want to know how many rows are in the result, you need to call odbc_num_rows($processz);
Look into using PDO rather than odbc specific functions.
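If odbc_num_rows() reports -1 for your driver (some ODBC drivers do not return row counts for SELECTs), a fallback sketch is to count while fetching, reusing the con_abrir()/con_cerrar() helpers from the question:
$conexion = con_abrir();
$processz = odbc_exec($conexion, $sqlquery);
$res = 0;
while (odbc_fetch_row($processz)) { // fetch row by row and count
    $res++;
}
con_cerrar($conexion);
echo $res;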

Laravel - multi-insert rows and retrieve ids

I'm using Laravel 4, and I need to insert some rows into a MySQL table, and I need to get their inserted IDs back.
For a single row, I can use ->insertGetId(); however, it has no support for multiple rows. If I could at least retrieve the ID of the first row, as plain MySQL does, that would be enough to figure out the other ones.
This is MySQL's last_insert_id behavior:
Important
If you insert multiple rows using a single INSERT statement, LAST_INSERT_ID() returns the value generated for the first inserted row only. The reason for this is to make it possible to reproduce easily the same INSERT statement against some other server.
You can try doing many single inserts and taking their ids (see the sketch below), or, after a save, $data->id should be the last id inserted.
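A minimal sketch of that idea, assuming the attributes are mass assignable on the model and $rows is a hypothetical array of attribute arrays; each create() call returns the saved model with its id:
$ids = [];
foreach ($rows as $attributes) {
    $ids[] = User::create($attributes)->id; // one INSERT per row
}
// $ids now holds every inserted id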
If you are using InnoDB, which supports transactions, then you can easily solve this problem.
There are multiple ways that you can solve this problem.
Let's say there's a table called users which has two columns, id and name, and it maps to the User model.
Solution 1
Your data looks like this:
$data = [['name' => 'John'], ['name' => 'Sam'], ['name' => 'Robert']]; // this will insert 3 rows
Let's say that the last id in the table was 600. You can insert multiple rows into the table like this:
DB::beginTransaction();
User::insert($data); // remember: $data is an array of associative arrays, not just a single assoc array
$startID = DB::select('select last_insert_id() as id'); // returns an array that has only one item in it
$startID = $startID[0]->id; // this will return 601
$lastID = $startID + count($data) - 1; // this will return 603
DB::commit();
Now you know the inserted rows have ids in the range 601 to 603.
Make sure to import the DB facade at the top using this
use Illuminate\Support\Facades\DB;
Solution 2
This solution requires that you have a varchar or some sort of text field.
$randomstring = Str::random(8);
$data = [['name' => "John$randomstring"], ['name' => "Sam$randomstring"]];
You get the idea here. You add that random string to a varchar or text field.
Now insert the rows like this:
DB::beginTransaction();
User::insert($data);

// this will return the last inserted ids
$lastInsertedIds = User::where('name', 'like', '%' . $randomstring)
    ->select('id')
    ->get()
    ->pluck('id')
    ->toArray();

// now you can update those rows back to the original values you actually wanted
User::whereIn('id', $lastInsertedIds)
    ->update(['name' => DB::raw("replace(name, '$randomstring', '')")]);

DB::commit();
Now you know which rows were inserted.
As user Xrymz suggested, DB::raw('LAST_INSERT_ID();') returns the first inserted id.
According to the Schema API, insertGetId() accepts an array:
public int insertGetId(array $values, string $sequence = null)
So you should be able to do:
DB::table('table')->insertGetId($arrayValues);
That said, if you are using MySQL, you could retrieve the first id this way and calculate the rest. There is also DB::getPdo()->lastInsertId(), which could help.
Or, if one of these methods returns the last id instead, you can calculate back to the first inserted id from that too.
EDIT
According to comments, my suggestions may be wrong.
Regarding the question of 'what if a row is inserted by another user in between', it depends on the storage engine. If an engine with table-level locking (MyISAM, MEMORY, MERGE) is used, the question is irrelevant, since there cannot be two simultaneous writes to the table.
If a row-level locking engine (InnoDB) is used, then another possibility might be to just insert the data and then retrieve all the rows by some known field with the whereIn() method, or to figure out table-level locking.
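A minimal sketch of that whereIn() idea, assuming name (or some other known field) is distinctive enough to identify the freshly inserted rows:
User::insert($data); // $data as in the earlier examples
$ids = User::whereIn('name', array_column($data, 'name'))
    ->pluck('id')
    ->toArray();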
$result = Invoice::create($data);
if ($result) {
    $id = $result->id;
}
It worked for me.
Note: Laravel version 9.

Incrementing a Table Column, and Getting the New Value using ActiveRecord in CodeIgniter?

I have something similar to the following:
$this->db->set('val', 'val+1', FALSE);
$this->db->where('id', $id);
$query = $this->db->update(TABLE_NAME);
How can I then get the new value of 'val'? Do I have to do another select, or can I get it from $query?
I have some experience with CodeIgniter, and as far as I know you have to query again.
The update only returns the number of rows affected.
The update query doesn't return the value. If you want to get the value of val, you'll have to run another query.
You could also do it the other way around: fetch the value of val before running this update, pass it as a parameter to your query next to $id, and then return that value + 1.
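A minimal sketch of the two-query approach with CodeIgniter's active record, assuming the same TABLE_NAME and $id as above:
$this->db->set('val', 'val+1', FALSE);
$this->db->where('id', $id);
$this->db->update(TABLE_NAME);

// re-select the incremented value in a second query
$row = $this->db->select('val')->where('id', $id)->get(TABLE_NAME)->row();
$newVal = $row ? $row->val : null;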
