Yii firing multiple queries instead of one - php

Trying to cut down on the number of queries on my site... Why would a single query run as multiple queries? Is there a way to fix this?
For example, from the following line of code (line 43)...
$model = Menu::model()->findAll();
We can see in my query log that 4 separate queries were fired...
Or am I just reading this wrong?

Rows 1, 2 & 4 in your screenshot above are doing database queries.
ActiveRecord in Yii does SHOW COLUMNS FROM <table> and SHOW CREATE TABLE <table> before the query so it knows what columns / column types the table has. In production mode you can turn on schema caching to reduce these queries:
http://www.yiiframework.com/doc/blog/1.1/en/final.deployment#enabling-schema-caching
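As a sketch of what that might look like in Yii 1.1 (the cache backend, DSN and duration below are placeholders, not values from the question):

```php
<?php
// Sketch of a Yii 1.1 config enabling schema caching (hypothetical
// cache backend and connection string). With schemaCachingDuration set,
// the SHOW COLUMNS / SHOW CREATE TABLE metadata queries are cached
// instead of being fired on every request.
$config = array(
    'components' => array(
        'cache' => array(
            'class' => 'CFileCache',   // or CApcCache, CMemCache, ...
        ),
        'db' => array(
            'connectionString' => 'mysql:host=localhost;dbname=mydb',
            'schemaCachingDuration' => 3600, // cache table metadata for 1 hour
        ),
    ),
);
// In protected/config/main.php you would `return $config;`
```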

Related

CodeIgniter ActiveRecord query large tables causes MySQL to freeze

I'm trying to fetch all the customers and their remarks + products and treatments ... in one query.
The following query causes the MySQL database to crash on the server even though the following tables...
remarks_treatments
remarks_arrangements
remarks_products
I only have around 100,000 rows, which should be no problem for MySQL.
Here is a screenshot of the query.
I can't even print the...
$this->db->last_query()
... to paste it into PHPMyAdmin for optimisation/debugging because it causes the whole database and website to freeze.
Thanks in advance.

PHP multiple tables data pagination

In a database there are 2 tables - 10 records in each of them;
I want to retrieve all the data, so there will be 20 records in a response;
Each page should list only 5 records, so there will be 4 pages at the bottom to switch between.
When I retrieve data from the database I must query each table separately.
And there is the problem: after choosing page number 4 the limit is 5 and the offset is 15, but each table only has 10 records, so both queries return no data.
Is it possible to solve such a problem? I'm not looking for an implementation, only a written explanation.
I don't use any framework, just plain mysqli_query calls (it's a very old project).
Thanks for any advice!
It's possible with UNION, but that problem points to very bad database architecture; you should change it (merge the tables into one) before bigger problems come.
Have you tried using UNION, putting each table's query into its own subquery?
Example:
([query_from_first_table]) UNION ([query_from_second_table]) LIMIT 5;
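Since the question uses plain mysqli, the paged version of that UNION can be built as a single SQL string like this (the table and column names are placeholders, not from the question):

```php
<?php
// Build one UNION query that pages across two tables as if they were one.
// $page is 1-based; table/column names are hypothetical placeholders.
function buildPagedUnionQuery(int $page, int $perPage = 5): string
{
    $offset = ($page - 1) * $perPage;
    return "(SELECT id, name FROM table_one)"
         . " UNION ALL "
         . "(SELECT id, name FROM table_two)"
         . " ORDER BY id"            // a stable order is required for paging
         . " LIMIT $perPage OFFSET $offset";
}

// Page 4 at 5 per page reaches rows 16-20 of the combined 20 rows,
// something the two separate per-table queries could never return:
echo buildPagedUnionQuery(4), "\n";
```

Because the two SELECTs are parenthesized, MySQL applies the ORDER BY and LIMIT/OFFSET to the combined result, which is exactly what per-page numbering needs.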

How to filter cached query in Laravel

Need help/advice with this concept. I have a pretty complex fluent query which pulls rows according to the user's filters.
I was thinking of making an unfiltered query (only joins, without where/whereIn clauses) which would be cached, and then somehow filtering that cached result according to the user's needs.
There's a 2-3 second lag on each database query whenever the form filter changes, so I'm guessing this can perform better.
The unfiltered query returns around 5k rows, and an average filtered one brings back 500-1000 rows.
The query has around 25 columns with 4 CONCATs, 3 CASE statements and 14 leftJoins.
Is that the right way? Any other suggestions?
Thanks in advance!
Maybe you can use an SQL view.
Or you can store your filtered data in another database table, and update it automatically using a trigger.
That way you can filter your data quickly straight from a database table using SQL.
It will work like a DB cache, but you will control it.
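The cache-then-filter idea from the question can be sketched in plain PHP (framework-free; the row shape and filter values are made up for illustration). In Laravel you would wrap the expensive query in `Cache::remember()` and filter the resulting collection; here a plain array stands in for the ~5k cached rows:

```php
<?php
// Stand-in for the expensive 14-join query. In Laravel this would be
// something like: Cache::remember('report.rows', 600, fn () => ...->get());
function getUnfilteredRows(): array
{
    return [
        ['id' => 1, 'status' => 'open',   'city' => 'Berlin'],
        ['id' => 2, 'status' => 'closed', 'city' => 'Berlin'],
        ['id' => 3, 'status' => 'open',   'city' => 'Madrid'],
    ];
}

// Apply the user's filters in PHP against the cached rows instead of
// re-running the heavy query on every filter change.
function filterRows(array $rows, array $filters): array
{
    return array_values(array_filter($rows, function (array $row) use ($filters) {
        foreach ($filters as $column => $value) {
            if (($row[$column] ?? null) !== $value) {
                return false;   // row fails one of the active filters
            }
        }
        return true;
    }));
}

$open = filterRows(getUnfilteredRows(), ['status' => 'open']);
```

Filtering ~5k rows in PHP is cheap; the trade-off is cache staleness, so the cache duration has to match how fresh the data needs to be.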

How to Improve Select Query Performance For Large Data in Mysql

Currently I am working on a PHP project. For a project extension I needed to add more data to the MySQL database, all of it into one particular table. That table is now 610.1 MB and has 3,491,534 rows. One column holds 22 distinct values; one of those values covers about 1,700,000 rows and another about 800,000.
Since then, a SELECT statement takes a long time (6.890 sec) to execute, even though every relevant column in that table is indexed.
I tried two things to speed up retrieval:
1. A stored procedure, with indexes on the relevant table columns.
2. Partitions.
Both still took a long time to execute a SELECT against the distinct values that have the most rows. Can anyone suggest a better alternative, or point out a mistake in what I tried?
When working with a large number of rows like you do, you should be careful with heavy, complex nested SELECT statements. Each level of nesting uses more resources to get to the results you want.
If you are using something like:
SELECT DISTINCT column FROM table
WHERE condition
and it is still taking long to execute even though you have indexes and partitions in place, then the bottleneck might be physical resources.
Tune your structure first, and then tune your code.
Hope this helps.
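For a query of that shape, one thing worth checking is whether a composite index matches it exactly; as a sketch (the table and column names below are hypothetical, not from the question):

```sql
-- Hypothetical names; adapt to your schema. An index with the WHERE
-- column first and the DISTINCT column second lets MySQL answer the
-- query from the index alone:
ALTER TABLE big_table ADD INDEX idx_status_type (status, record_type);

-- Verify the index is actually chosen before trusting any timings:
EXPLAIN SELECT DISTINCT record_type
FROM big_table
WHERE status = 'active';
```

If EXPLAIN still shows a full table scan, the index order or the condition is the problem rather than hardware.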

PHP-MySQL: merge many queries into one for faster execution

I have a PHP script that adds a new record after checking for it in table1, table2 and table3: if the record does not exist, it is added to table3; otherwise the record is updated in table1 or table2 (wherever it exists).
I have a large amount of data to check. Is it possible to perform this task with a single MySQL query?
Thanks in advance.
Please keep in mind that joining two large tables may be a lot slower than using 2 or 3 separate queries to get the data out of them one by one. The main question is what you consider huge. Joining millions of rows is never a good idea in MySQL, AFAIK, if you have large rows.
So while having it done in one query is definitely possible it may not be the economical thing to do.
We also need some info about row sizes, indexes, basic query syntax and stuff like that.
