batch delete in codeigniter - php

insert_batch() is a CodeIgniter built-in function which inserts rows in batches of 100 at a time. That's why it is so much faster for inserting a large amount of data.
Now I want to delete a large number of rows the same way the insert_batch() function does.
Is there any way to do it?
I am already using the where_in function, but it is not nearly as fast as insert_batch(), and that's why timeout errors occur often.
I specifically want to know: can I make a function like insert_batch() or insert_update() in codeigniter system/database/db_query_builder?
If I can, how do I do it? Or any other suggestion, please?

If we are talking about a lot of rows, it may be easier to perform a table 'switch' by inserting the rows you want to keep into a copy and dropping the original table.
However, one drawback is that any auto-increment IDs will be lost.
<?php
// Create a copy with the same data structure.
$this->db->query( "CREATE TABLE IF NOT EXISTS `TB_TMP` .. " );

// Select the data you wish to -Keep- by excluding the rows by some condition.
// E.g. something that is no longer active.
$recordsToKeep =
    $this->db
        ->select( "*" )
        ->get_where( "TB_LIVETABLE", [ "ACTIVE" => 1 ])
        ->result();

// Insert the kept rows into the temporary table.
$this->db->insert_batch( "TB_TMP", $recordsToKeep );

// Perform the switch.
$this->db->query( "RENAME TABLE TB_LIVETABLE TO TB_OLDTABLE" );
$this->db->query( "RENAME TABLE TB_TMP TO TB_LIVETABLE" );
$this->db->query( "DROP TABLE TB_OLDTABLE" );
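If you need to keep the original table (and its auto-increment counter), another option is to delete in chunks so that no single query runs long enough to hit the timeout. A minimal sketch, assuming an id primary key and the same table/column names as above:
<?php
// Collect the IDs to remove (e.g. everything that is no longer active).
$idsToDelete = array_column(
    $this->db->select( "id" )
             ->get_where( "TB_LIVETABLE", [ "ACTIVE" => 0 ] )
             ->result_array(),
    "id"
);

// Delete them in slices of 100, mirroring what insert_batch() does for inserts.
foreach ( array_chunk( $idsToDelete, 100 ) as $chunk )
{
    $this->db->where_in( "id", $chunk )
             ->delete( "TB_LIVETABLE" );
}
Each statement then touches at most 100 rows, which keeps every individual query short.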

Related

function render makes website 500% slow! can anyone fix that please?

Someone told me :
because it sends a database request on each iteration of the loop (it's not the only problem with this chunk of code but it's the most taxing one)
Yes, I understand what that means. His suggestion is:
you need to get all of the data before you start building the menu,
then you just insert the data instead of requesting more data on each
iteration
But I don't know how to do it!
<?php
$menu_html = '';
function render_menu($parent_id, $actmenuid)
{
    $obj = new Database();
    $con = $obj->dbconnectt();
    global $menu_html;
    $result = mysqli_query($con, "select * from tbl_menu where parent_id='$parent_id'");
    if (mysqli_num_rows($result) == 0) return;
    if ($parent_id == 0) {
        $menu_html .= '<ul class="topnav">';
    } else {
        $menu_html .= '<ul>';
    }
    while ($row = mysqli_fetch_array($result)) {
        $childnum = $obj->recordcount("SELECT * FROM tbl_menu WHERE parent_id='".$row['id']."'");
        if ($childnum == 0) {
            $linkvalue = '/category/'.$row['id'].'.html';
        } else {
            $linkvalue = '#';
        }
        if ($row['id'] == $actmenuid && $actmenuid != NULL) {
            $actv = 'class="active"';
        } else {
            $actv = '';
        }
        $menu_html .= '<li '.$actv.'><a href="'.$linkvalue.'">'.$row['title'].'</a>';
        render_menu($row['id'], $actmenuid);
        $menu_html .= '</li>';
    }
    $menu_html .= '</ul>';
    return $menu_html;
}
if ($isDsh == false) {
    echo render_menu(0, $actmenuid);
}
?>
Depending on how many records you have, try removing this query from inside the loop, since it runs once for every record returned by the first query.
$childnum = $obj->recordcount("SELECT * FROM tbl_menu WHERE parent_id='".$row['id']."'");
Change it to a single query like this, which returns a count for each parent_id, and place it outside of the loop:
$parentcount = mysqli_query($con, "SELECT parent_id, COUNT(*) AS cnt FROM tbl_menu GROUP BY parent_id");
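For example (a rough sketch, not tested against your schema), you can index those counts by parent_id once and then do a plain array lookup inside the existing loop:
<?php
// One query for all child counts, executed before the while loop.
$counts = [];
$res = mysqli_query($con, "SELECT parent_id, COUNT(*) AS cnt FROM tbl_menu GROUP BY parent_id");
while ($r = mysqli_fetch_assoc($res)) {
    $counts[$r['parent_id']] = (int) $r['cnt'];
}

// Inside the existing while loop, replace the per-row query with a lookup:
$childnum = isset($counts[$row['id']]) ? $counts[$row['id']] : 0;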
There may be other issues, so please post the database structure and number of records that you're working with too.
Don't make recursive queries.
Having "more than 1000" rows is not too big. You can simply pull everything from the table into PHP, then perform the recursive HTML build in PHP. This has a memory overhead, but far less processing overhead, because you only ever make one trip to the db.
Alternatively (when your db table is prohibitively large), you should avoid gathering rows unnecessarily by adding a new column. The new column stores all "descendants" for the respective row; it is written when the row is INSERTed and refreshed when the row is UPDATEd. Then you only need to reference this column when calling specific rows. In other words, do the recursive processing only once (when writing to the db) and not every time you need to display the data. This will, again, produce a finite result set in one query which can then be recursively traversed to build the desired output.
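A minimal sketch of the "one query, then recurse in PHP" approach (column names taken from the question; adjust to your schema):
<?php
// 1. One trip to the database: pull every menu row.
$result = mysqli_query($con, "SELECT id, parent_id, title FROM tbl_menu");

// 2. Group the rows by parent_id so child lookups are instant.
$byParent = [];
while ($row = mysqli_fetch_assoc($result)) {
    $byParent[$row['parent_id']][] = $row;
}

// 3. Recurse over the in-memory array - no further queries.
function build_menu(array $byParent, $parentId, $actmenuid)
{
    if (empty($byParent[$parentId])) {
        return '';
    }
    $html = ($parentId == 0) ? '<ul class="topnav">' : '<ul>';
    foreach ($byParent[$parentId] as $row) {
        $hasChildren = !empty($byParent[$row['id']]);
        $link = $hasChildren ? '#' : '/category/'.$row['id'].'.html';
        $actv = ($row['id'] == $actmenuid) ? ' class="active"' : '';
        $html .= '<li'.$actv.'><a href="'.$link.'">'.$row['title'].'</a>';
        $html .= build_menu($byParent, $row['id'], $actmenuid);
        $html .= '</li>';
    }
    return $html.'</ul>';
}

echo build_menu($byParent, 0, $actmenuid);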
Basically you need to do what #spudly has suggested.
But there is a small catch in his solution: depending on the number of rows in your tbl_menu table, you may use a big chunk of memory to fetch all the records.
You can optimise it further by keeping his solution but changing the query to:
select
parent_tbl_menu.id,
count(child_tbl_menu.id) as cnt
from
tbl_menu as parent_tbl_menu
left join
tbl_menu as child_tbl_menu
on parent_tbl_menu.id = child_tbl_menu.parent_id
where
parent_tbl_menu.parent_id = ?
group by
parent_tbl_menu.id
This way you will only fetch the child records of a specific parent.
And please consider using prepared statements, as your code has an SQL injection vulnerability.
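For instance (a sketch using mysqli prepared statements; binding the parameter is the main point):
<?php
$stmt = mysqli_prepare(
    $con,
    "SELECT parent_tbl_menu.id, COUNT(child_tbl_menu.id) AS cnt
     FROM tbl_menu AS parent_tbl_menu
     LEFT JOIN tbl_menu AS child_tbl_menu
       ON parent_tbl_menu.id = child_tbl_menu.parent_id
     WHERE parent_tbl_menu.parent_id = ?
     GROUP BY parent_tbl_menu.id"
);
mysqli_stmt_bind_param($stmt, 'i', $parent_id); // bind instead of concatenating user input
mysqli_stmt_execute($stmt);
$result = mysqli_stmt_get_result($stmt);        // requires the mysqlnd driver
while ($row = mysqli_fetch_assoc($result)) {
    // use $row['id'] and $row['cnt'] here
}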
Connect (from PHP to MySQL) only once for the entire web page.
Don't put a SELECT inside a loop if you can do all the work in a single SELECT, such as with a JOIN. (Exception: a "hierarchical" table needs the nested SELECT. Exception to the exception: MySQL 8.0 and MariaDB 10.2 can do it with a "recursive CTE"; see the sketch after this list.)
Don't fetch all the columns (SELECT *) when all you want is a record count. Instead, SELECT COUNT(*) ... and use the number returned.
1000 of anything is probably excessive for a web page. Re-think the UI.
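For reference, a sketch of that recursive-CTE exception (MySQL 8.0+ / MariaDB 10.2+ only; column names taken from the question), which pulls the whole menu tree in one round trip:
<?php
$sql = "
    WITH RECURSIVE menu_tree AS (
        SELECT id, parent_id, title, 0 AS depth
        FROM tbl_menu
        WHERE parent_id = 0
        UNION ALL
        SELECT m.id, m.parent_id, m.title, t.depth + 1
        FROM tbl_menu AS m
        JOIN menu_tree AS t ON m.parent_id = t.id
    )
    SELECT * FROM menu_tree ORDER BY depth, parent_id, id";
$result = mysqli_query($con, $sql); // one query for the entire hierarchy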

Laravel Query Builder returning empty rows from 100k+ rows

My users table has over 100K rows. I need to fetch them all for export purposes. But when I use the following code, it doesn't return anything. If I apply a limit, it returns rows: it returns 10K rows, but when I set the limit to 20K, it doesn't return anything. If I use mysqli_query, it returns everything fine on the same server and DB.
$myRows = DB::table('users')->get();//returns empty
$myRows = DB::table('users')->take(10000);//returns 10,000 rows
$myRows = DB::table('users')->take(20000);//returns empty
I am new to Laravel.
Thanks in advance.
I think chunking (scroll to the "Chunking Results From A Table" section) would be a good option here.
DB::table('users')->chunk(10000, function($users) {
//some connect code
});
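A sketch of how that could look for an export, writing each chunk to a CSV file so the full 100K rows never sit in memory at once (the file path and columns are just examples):
$handle = fopen(storage_path('users_export.csv'), 'w');

DB::table('users')->orderBy('id')->chunk(10000, function ($users) use ($handle) {
    foreach ($users as $user) {
        // Write one row per user; pick whichever columns you actually need.
        fputcsv($handle, [$user->id, $user->name, $user->email]);
    }
});

fclose($handle);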

Laravel - multi-insert rows and retrieve ids

I'm using Laravel 4, and I need to insert some rows into a MySQL table, and I need to get their inserted IDs back.
For a single row, I can use ->insertGetId(), however it has no support for multiple rows. If I could at least retrieve the ID of the first row, as plain MySQL does, would be enough to figure out the other ones.
This is MySQL's behavior for
last-insert-id
Important
If you insert multiple rows using a single INSERT statement, LAST_INSERT_ID() returns the value generated for the first inserted row only. The reason for this is to make it possible to reproduce easily the same INSERT statement against some other server.
You can try using multiple inserts and taking their IDs, or after save(), $data->id should be the last inserted ID.
If you are using INNODB, which supports transaction, then you can easily solve this problem.
There are multiple ways that you can solve this problem.
Let's say that there's a table called users which has 2 columns, id and name, and is backed by a User model.
Solution 1
Your data looks like
$data = [['name' => 'John'], ['name' => 'Sam'], ['name' => 'Robert']]; // this will insert 3 rows
Let's say that the last id on the table was 600. You can insert multiple rows into the table like this
DB::beginTransaction();
User::insert($data); // remember: $data is array of associative array. Not just a single assoc array.
$startID = DB::select('select last_insert_id() as id'); // returns an array that has only one item in it
$startID = $startID[0]->id; // This will return 601
$lastID = $startID + count($data) - 1; // this will return 603
DB::commit();
Now, you know the rows are between the range of 601 and 603
Make sure to import the DB facade at the top using this
use Illuminate\Support\Facades\DB;
Solution 2
This solution requires that you have a varchar or some sort of text field.
$randomstring = Str::random(8);
$data = [['name' => "John$randomstring"], ['name' => "Sam$randomstring"]];
You get the idea here. You add that random string to a varchar or text field.
Now insert the rows like this
DB::beginTransaction();
User::insert($data);
// this will return the last inserted ids
$lastInsertedIds = User::where('name', 'like', '%' . $randomstring)
->select('id')
->get()
->pluck('id')
->toArray();
// now you can update that row to the original value that you actually wanted
User::whereIn('id', $lastInsertedIds)
->update(['name' => DB::raw("replace(name, '$randomstring', '')")]);
DB::commit();
Now you know what are the rows that were inserted.
As user Xrymz suggested, DB::raw('LAST_INSERT_ID();') returns the first.
According to the Schema API, insertGetId() accepts an array:
public int insertGetId(array $values, string $sequence = null)
So you have to be able to do
DB::table('table')->insertGetId($arrayValues);
That said, if using MySQL, you could retrieve the first id this way and calculate the rest. There is also a DB::getPdo()->lastInsertId(); function that could help.
Or if one of these methods returns the last id, you can calculate back to the first inserted one too.
EDIT
According to comments, my suggestions may be wrong.
Regarding the question of 'what if a row is inserted by another user in between', it depends on the storage engine. If an engine with table-level locking (MyISAM, MEMORY, MERGE) is used, then the question is irrelevant, since there cannot be two simultaneous writes to the table.
If a row-level locking engine is used (InnoDB), then another possibility might be to just insert the data and then retrieve all the rows by some known field with the whereIn() method, or figure out table-level locking.
$result = Invoice::create($data);
if ($result) {
    $id = $result->id;
}
It worked for me.
Note: Laravel version 9.

Codeigniter, join of two tables with a WHERE clause

I've this code:
public function getAllAccess(){
$this->db->select('accesscode');
$this->db->where(array('chain_code' => '123'));
$this->db->order_by('dateandtime', 'desc');
$this->db->limit($this->config->item('access_limit'));
return $this->db->get('accesstable')->result();
}
I need to join it with another table (codenamed table), I've to tell it this. Not really a literal query but what I want to achieve:
SELECT * accesscode, dateandtime FROM access table WHERE chain_code = '123' AND codenames.accselect_lista != 0
So basically accesstable has a column code which is a number, let us say 33, this number is also present in the codenames table; in this last table there is a field accselect_lista.
So I have to select only the rows where accselect_lista != 0, and from there get the corresponding accesstable rows whose code matches those codenames rows.
Looking for this?
SELECT *
FROM access_table a INNER JOIN codenames c ON
a.chain_code = c.chain_code
WHERE a.chain_code = '123' AND
c.accselect_lista != 0
It will bring up all columns from both tables for the specified criteria. The table and column names need to be exact, obviously.
Good start! But I think you might be getting a few techniques mixed up here.
Firstly, there are two main ways to run multiple where queries. You can use an associative array (like you've started to do there).
$this->db->where(array('accesstable.chain_code' => '123', 'codenames.accselect_lista !=' => 0));
Note that I've appended the table name to each column. Also notice that you can add alternative operators if you include them in the same block as the column name.
Alternatively you can give each its own line. I prefer this method because I think it's a bit easier to read. Both will accomplish the same thing.
$this->db->where('accesstable.chain_code', '123');
$this->db->where('codenames.accselect_lista !=', 0);
Active record will format the query with 'and' etc on its own.
The easiest way to add the join is to use from with join.
$this->db->from('accesstable');
$this->db->join('codenames', 'codenames.accselect_lista = accesstable.code');
When using from, you don't need to include the table name in get, so to run the query you can now just use something like:
$query = $this->db->get();
return $query->result();
Check out Codeigniter's Active Record documentation if you haven't already, it goes into a lot more detail with lots of examples.
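Putting the pieces together, the original method could end up looking something like this (a sketch; the join condition is the one suggested above, so adjust the ON clause to your actual keys):
public function getAllAccess(){
    $this->db->select('accesstable.accesscode, accesstable.dateandtime');
    $this->db->from('accesstable');
    $this->db->join('codenames', 'codenames.accselect_lista = accesstable.code');
    $this->db->where('accesstable.chain_code', '123');
    $this->db->where('codenames.accselect_lista !=', 0);
    $this->db->order_by('dateandtime', 'desc');
    $this->db->limit($this->config->item('access_limit'));

    $query = $this->db->get();
    return $query->result();
}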

Multiple (sharded) tables in a Model - CakePHP

There are many tables as the following:
table_2010
table_2009
table_2008
table_2007
.
.
Using MySQL 4 + PHP5 + CakePHP 1.3
My Question is
How to treat these tables in a model?
I want to treat them like this:
Table->find('all', "2010", array("conditions" => ""));
I agree with Nik -- unless you're sharding for performance reasons, I would combine all of your tables into one table, with a column for the year (if you make it an INT, it won't affect performance much).
However, if you need to shard your tables, I'd recommend that you just override the Model::find() method to accept additional parameters. In your model, write something like the pseudocode below:
function find( $type, $options = array() ) {
if( isset( $options['table'] ) ) { // this is the index where you'll pass your table name
$this->setSource( $options['table'] );
}
return parent::find( $type, $options );
}
Basically the call to setSource will change your table that you are querying, at runtime. See Can a CakePHP model change its table without being re-instantiated? for more information.
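With that override in place, the call from the question could look something like this (a sketch; Record is a placeholder model name and table_2010 is one of your yearly shards):
// Query the 2010 shard through the overridden find().
$rows = $this->Record->find('all', array(
    'table'      => 'table_2010',
    'conditions' => array(),
));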
For me the smarter way is to use one table - posts, for example - with a special column called year. So the find will be something like:
$this->Post->find('all', array('conditions' => array('Post.year' => 2010)));
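If you do consolidate, a one-off migration can be as simple as copying each yearly table into the combined one (a sketch; the posts table and its title/body columns are assumptions, so substitute your real schema):
// Run once, e.g. from a shell task, to merge the yearly shards.
foreach (array(2007, 2008, 2009, 2010) as $year) {
    $this->Post->query(
        "INSERT INTO posts (title, body, year)
         SELECT title, body, {$year} FROM table_{$year}"
    );
}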
