I am having an issue querying a couple of MySQL VIEWS.
The MySQL tables are generated hourly by a process that pulls data from a central Oracle database. Once the update is complete, a query automatically refreshes the VIEWS. (I believe this is standard, but I could be wrong.)
At this point, I am using PHP to query said VIEWS to return data to the page. This is where my problem begins.
When the query runs, it can take well over 30 seconds to return just 4 records, even though the VIEW contains no more than 30K records. And when I try to return a larger data set to the page, the server freezes.
The query in my PHP script is very simple. It looks like the following:
$sql = "
SELECT
    DISCHARGE_ETA,
    TERMINAL_DISCH,
    IMPORT_RAMP,
    ORIG_RAMP_FINAL_DEST_ETA,
    POOL_LOCATION_FROM_IMP,
    POOL_LOCATION_TO_IMP,
    ROAD_HAULIER_IMP
    -- a few more columns
FROM
    view1
WHERE
    " . $_SESSION['where'];
I am using jQuery to send parameters over to the PHP script. PHP then builds the query by plugging all the parameters into the WHERE clause. The data is then returned via JSON.
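For illustration, a parameterized version of such an endpoint might look like the following; the field name ramp and the connection details are placeholders, not from the original post. (Binding values through a prepared statement also avoids the SQL injection risk of concatenating a raw WHERE string into the query.)

<?php
// Hedged sketch: bind each incoming value through a prepared
// statement instead of concatenating a pre-built WHERE string.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');

$sql = "SELECT DISCHARGE_ETA, TERMINAL_DISCH, IMPORT_RAMP,
               ORIG_RAMP_FINAL_DEST_ETA, POOL_LOCATION_FROM_IMP,
               POOL_LOCATION_TO_IMP, ROAD_HAULIER_IMP
        FROM view1
        WHERE IMPORT_RAMP = :ramp";

$stmt = $pdo->prepare($sql);
$stmt->execute([':ramp' => $_POST['ramp']]);

// Return the rows to the jQuery caller as JSON.
header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));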
The query that builds the actual VIEW in MySQL is a little more intense (currently not shown here).
My question is: are VIEWS the best route for retrieving the data I need?
It seems it would be easier to query a TABLE instead of a VIEW. Perhaps the same hourly process that updates the VIEWS could instead populate a single TABLE, and I could query that TABLE instead.
Would this be the best bet? If not, are there any alternatives?
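If the TABLE route does win out, one common pattern is to rebuild a shadow table from the view's defining query each hour and swap it in atomically (RENAME TABLE is atomic in MySQL). The table names here are illustrative:

-- Rebuild a shadow copy from the view's query, then swap it in.
CREATE TABLE report_data_new LIKE report_data;
INSERT INTO report_data_new SELECT * FROM view1;
RENAME TABLE report_data TO report_data_old,
             report_data_new TO report_data;
DROP TABLE report_data_old;

The usual win is that report_data can carry indexes matched to your WHERE clauses, which a view that must be materialized on every query cannot.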
Edit
Here are the results from EXPLAIN:
Related
Right now I am learning to write an Android API in PHP for some CRUD operations.
I am able to fetch data from the database without any issue.
Now I want to fetch 200, 500 and 600 records from three different tables.
In the end I am going to show them in the UI, grouped at their appropriate positions.
My question is: should I write multiple PHP files for fetching the records from each table and send them back to the user separately (obviously I would have to call the API one request after another from the app, after getting the response to each previous call),
OR
Only one PHP file where I fetch all the records from the three tables and send them back to the user in one shot?
I am returning the data in json format.
Please help me figure out which of the above methods I should use, and the advantages or disadvantages of each, if any.
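For what it's worth, a rough sketch of the one-shot option follows; the table names and row limits are purely illustrative:

<?php
// Sketch of the single-request option: fetch all three result
// sets and return them grouped in one JSON payload.
// Table names (products, orders, reviews) are invented.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

$response = [
    'products' => $pdo->query('SELECT * FROM products LIMIT 200')->fetchAll(PDO::FETCH_ASSOC),
    'orders'   => $pdo->query('SELECT * FROM orders LIMIT 500')->fetchAll(PDO::FETCH_ASSOC),
    'reviews'  => $pdo->query('SELECT * FROM reviews LIMIT 600')->fetchAll(PDO::FETCH_ASSOC),
];

header('Content-Type: application/json');
echo json_encode($response);

A single request avoids three full HTTP round trips from the phone, and at a few hundred rows per table the combined JSON payload is still small.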
I need to do some calculation each time a user clicks the "View Statistics" link. The calculation is based on a query whose result I couldn't figure out how to re-group (I posted a question about this before). Now I'm thinking of first creating a view of the MERGE type. Then, every time I need to show statistics, I can just query this view.
I have little knowledge of MySQL, so I'm not sure if this is the normal way to do statistics calculations.
When does the view get updated? Does it keep updating whenever a new entry related to this view is posted? Or does it update only when queried?
Should I delete the view every time after calculation?
You could use stored procedures as an alternative to views. This way you know that your data is freshly grabbed from your tables.
A view is not a stored copy of the data that gets updated on insert; it is simply a predefined query that runs against the current tables each time you use it.
If you need complex calculations to produce statistics, consider creating one or more triggers to maintain a statistics table for you.
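As a rough illustration of the trigger idea (the orders and order_stats tables and their columns are invented, and this assumes stat_date is the primary or a unique key of order_stats):

-- Keep a per-day statistics row current as orders are inserted.
CREATE TRIGGER trg_orders_stats
AFTER INSERT ON orders
FOR EACH ROW
  INSERT INTO order_stats (stat_date, row_count, total_amount)
  VALUES (DATE(NEW.created_at), 1, NEW.amount)
  ON DUPLICATE KEY UPDATE
    row_count    = row_count + 1,
    total_amount = total_amount + NEW.amount;

Queries for the statistics then hit the small, pre-aggregated table instead of recomputing over the raw data on every click.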
We've been prototyping a search results system for a MySQL database with about 2 million names and addresses and 3 million associated subscription and conference attendance records.
At the moment the search is executed and all results are returned; for each result I then execute a second query to look up that person's subscriptions / conferences by their unique ID. I've got indexes on all the important columns, and the individual queries execute quite quickly in phpMyAdmin (0.0xxx seconds), but feed this into a webpage to display (PHP, paged using DataTables) and the page takes seconds to render. We've tried porting the data to a Lucene database and it's like LIGHTNING, but the bottleneck still seems to be displaying the results rather than retrieving them.
I guess this would be due to the overhead of building, serving and rendering the page in the browser. I think I can remove the subquery I mention above by using GROUP_CONCAT to get the subscription codes in the original query (sketched below), but how can I speed up the display of the page with the results on it?
I'm thinking that little-and-often querying with AJAX / server-side paging might be the way to go here (maybe get 50 results at a time: the query is smaller, the page is smaller and can be served more quickly), but I welcome any suggestions you might have.
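For reference, the GROUP_CONCAT rewrite mentioned above might look something like this; the table and column names are invented, and note that GROUP_CONCAT output is truncated at group_concat_max_len (1024 characters by default):

-- One row per person, with their subscription codes folded
-- into a single comma-separated column (no per-row subquery).
SELECT p.person_id,
       p.name,
       GROUP_CONCAT(s.subscription_code) AS subscription_codes
FROM people p
LEFT JOIN subscriptions s ON s.person_id = p.person_id
WHERE p.name LIKE 'Smith%'
GROUP BY p.person_id, p.name;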
Even if you are using pagination with DataTables, all of the results are loaded into the page source up front unless you enable its server-side processing feature.
Loading 2 million rows at once will always render slowly. You have to go for server-side pagination, whether driven by AJAX or by a normal PHP script.
You can also consider a cache layer to speed up loading and to avoid calling the database when it isn't needed. If your data can change at arbitrary times, add a check for whether the data has changed since it was last cached, and refresh the cached copy only when it has.
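A minimal sketch of a server-side processing endpoint for DataTables, with search filtering omitted and the table/column names invented (DataTables sends draw, start and length, and expects draw, recordsTotal, recordsFiltered and data in the reply):

<?php
// Rough sketch of a DataTables server-side processing endpoint.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

$start  = max(0, (int) ($_GET['start']  ?? 0));
$length = min(100, max(1, (int) ($_GET['length'] ?? 50)));

$total = (int) $pdo->query('SELECT COUNT(*) FROM people')->fetchColumn();

// Only one small page of rows is ever fetched and serialized.
$stmt = $pdo->prepare('SELECT id, name, address FROM people
                       ORDER BY name LIMIT :len OFFSET :off');
$stmt->bindValue(':len', $length, PDO::PARAM_INT);
$stmt->bindValue(':off', $start, PDO::PARAM_INT);
$stmt->execute();

header('Content-Type: application/json');
echo json_encode([
    'draw'            => (int) ($_GET['draw'] ?? 0),
    'recordsTotal'    => $total,
    'recordsFiltered' => $total, // no search filter applied in this sketch
    'data'            => $stmt->fetchAll(PDO::FETCH_ASSOC),
]);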
I need a button that fires an action to copy all records for a given client from one database to another with PHP.
The template database has 12 tables (each with different columns), but all of them include a client_id column so that the WHERE clause works properly.
The question is, how do I do this?
Thanks,
Pluda
Since PHP is a server-side programming language, you can't copy something from the client. You can, however, upload data (like XML), parse it and then insert it into your MySQL database.
If you want to copy records from one database to another, you might want to read them from the first database and save them in a format like SQL. Then you could send those queries to the second database.
One piece of advice at this point: if you need to run the same query (with different values) over and over again, you should use a prepared statement. It is compiled once by the database and then just filled in with new values, which is much faster than building a fresh INSERT every time.
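A sketch of that read-then-insert loop with prepared statements, assuming both databases are reachable from PHP; the table name clients_data, its columns, and the credentials are placeholders:

<?php
// Copy one client's rows from the source DB to the target DB.
$src = new PDO('mysql:host=host1;dbname=source_db;charset=utf8mb4', 'user', 'pass');
$dst = new PDO('mysql:host=host2;dbname=target_db;charset=utf8mb4', 'user', 'pass');

$clientId = 42;

$read = $src->prepare('SELECT client_id, name, email FROM clients_data WHERE client_id = ?');
$read->execute([$clientId]);

// One prepared INSERT, compiled once, executed once per row.
$write = $dst->prepare('INSERT INTO clients_data (client_id, name, email) VALUES (?, ?, ?)');

$dst->beginTransaction();
while ($row = $read->fetch(PDO::FETCH_NUM)) {
    $write->execute($row);
}
$dst->commit();

Repeat the same pattern for each of the 12 tables. If both databases happen to live on the same MySQL server, you can skip the PHP loop entirely with a cross-database INSERT INTO target_db.clients_data SELECT * FROM source_db.clients_data WHERE client_id = ?.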
I am wondering if it is possible, either automatically or at the press of a button, to move MySQL table information from one table to another, deleting it from the first table and putting it in the second, using PHP.
My MySQL table is big, and the page that adds information to it runs 70 queries, which slows the page refresh times. I need to move information from the first table to the second at a certain time every day so that those queries don't have to look through my entire 27k-row table.
Is this possible?
Also if someone could help me with my comment on this page I would be grateful.
link text
PHP doesn't have a constantly running server you can schedule background tasks with.
If you have access to the server you can set up a cron job (or scheduled task under windows) to run the PHP script for you.
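For example, a crontab entry that runs the script every day at 2 a.m. might look like this (the paths are illustrative):

# minute hour day-of-month month day-of-week  command
0 2 * * * /usr/bin/php /var/www/scripts/move_rows.php >> /var/log/move_rows.log 2>&1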
Or (and this isn't so nice) you can put the script on the webserver and call it manually at the appropriate time by entering the URL in your browser.
A 27k row table is small by SQL standards, as long as it is properly indexed.
For instance, if you don't care about data from before yesterday, you can add an indexed date column and filter with WHERE myDate > NOW() - INTERVAL 1 DAY; MySQL will then restrict the query to rows younger than 24 hours.
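Concretely (the table and column names are illustrative):

-- One-time: index the date column so the filter can seek, not scan.
ALTER TABLE big_table ADD INDEX idx_mydate (myDate);

-- Each query then only touches the last 24 hours of rows.
SELECT *
FROM big_table
WHERE myDate > NOW() - INTERVAL 1 DAY;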
I am wondering if it is possible to automate or by button press to move mysql table information from one table to another table deleting it from the first table and putting it in another table? Using php.
You can initiate it from PHP, but what you ask is effectively MySQL's domain.
It can be accomplished in two statements:
Use an INSERT INTO ... SELECT statement to copy the rows from the old table to the new one
Delete the copied rows from the old table
My preference would be to do this in a stored procedure, for the sake of a transaction and ease of execution (in case you want it initiated by cron etc.), because it is easier to call one thing than several.
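A sketch of such a procedure; live_table, archive_table and the created_at cutoff are invented for illustration, and the DELIMITER lines are only needed in the mysql CLI:

DELIMITER $$
CREATE PROCEDURE archive_old_rows()
BEGIN
  START TRANSACTION;
    -- Copy yesterday's (and older) rows into the archive table...
    INSERT INTO archive_table
    SELECT * FROM live_table
    WHERE created_at < CURDATE();
    -- ...then remove exactly those rows from the live table.
    DELETE FROM live_table
    WHERE created_at < CURDATE();
  COMMIT;
END$$
DELIMITER ;

A daily cron job (or a MySQL EVENT) can then simply run CALL archive_old_rows();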
27k rows is not a very big table, and MySQL should work fine with that. Do you have all the required indexes? Have you used EXPLAIN on your slow queries?
As for the question about moving data from one table to another: create a PHP script that will be run by cron and will move the rows one by one. What's the problem here?