Is it possible, either on a schedule or at the press of a button, to move rows from one MySQL table to another, deleting them from the first table as they are inserted into the second? Using PHP.
My MySQL table is big, and the page that adds information to it runs 70 queries, which slows page refresh times. I need to move rows from the first table to the second at a certain time of day, every day, so that those queries don't have to scan my entire 27k-row table.
Is this possible?
PHP doesn't have a constantly running server process you can schedule background tasks with.
If you have access to the server, you can set up a cron job (or a Scheduled Task under Windows) to run the PHP script for you.
Or (and this isn't so nice) you can put the script on the web server and call it manually at the appropriate time by entering its URL in your browser.
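For example, a crontab entry along these lines (the paths and script name are assumptions; adjust them to your PHP binary and your script's location) would run the move every night at 2 a.m.:
0 2 * * * /usr/bin/php /path/to/move_rows.php >> /var/log/move_rows.log 2>&1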
A 27k-row table is small by SQL standards, as long as it is properly indexed.
For instance, if you don't care about data from yesterday, you can add an indexed date column and filter with WHERE myDate > NOW() - INTERVAL 1 DAY; MySQL will use the index to restrict the query to the rows younger than 24 hours.
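A sketch of that approach, with my_table as a placeholder name:
ALTER TABLE my_table ADD INDEX idx_mydate (myDate);
SELECT * FROM my_table WHERE myDate > NOW() - INTERVAL 1 DAY;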
Is it possible, either on a schedule or at the press of a button, to move rows from one MySQL table to another, deleting them from the first table as they are inserted into the second? Using PHP.
You can initiate it from PHP, but what you ask is effectively MySQL's domain.
It can be accomplished in two statements:
Use an INSERT INTO ... SELECT statement to copy the rows from the old table to the new one
Delete the copied rows from the old table
My preference would be to do this in a stored procedure, both for the sake of wrapping it in a transaction and for ease of execution (in case you want it initiated by cron etc.), because it is easier to call one thing than several.
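A sketch of the two statements run from PHP inside a transaction, so the delete only happens if the copy succeeds (the table names and the created_at column are assumptions):
<?php
$pdo->beginTransaction();
try {
    // Copy yesterday's rows to the archive table, then remove them.
    $pdo->exec("INSERT INTO archive_table
                SELECT * FROM live_table WHERE created_at < CURDATE()");
    $pdo->exec("DELETE FROM live_table WHERE created_at < CURDATE()");
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}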
27k rows is not a very big table, and MySQL should work fine with that. Do you have all the required indexes? Have you used EXPLAIN on your slow queries?
As for the question about moving data from one table to another: create a PHP script that will be run by cron and will move rows one by one. What's the problem here?
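Running EXPLAIN is a one-liner; if the type column says ALL, the query is scanning the whole table and an index is missing (the query below is a placeholder):
EXPLAIN SELECT * FROM my_table WHERE some_column = 'some value';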
I would like to back up one of my database tables, abc, using PHP.
One of the columns is a timestamp. I would like to back up the rows that are more than 3 months old into filename.sql.gz and delete them from the table, keeping the latest 3 months of data in the table.
If possible, the output file should contain only INSERT queries.
You first need to go through the PHP + database integration tutorials. You can achieve the above in a MySQL query itself, or with PHP; either way you have to write your own code for this simple task. The following are the steps I would take to complete it; you can modify them according to your requirements, and a sketch follows the list.
You have not mentioned which database you are using; I am assuming it is MySQL/MariaDB.
Connect to the database (proper access required); some PHP file-management knowledge is also necessary
Based on the timestamp, write a query that pulls the old data and writes it to a .sql file (plenty of questions have already been asked on this topic on Stack Overflow)
On success of STEP 2, you can perform the DELETE operation
While doing this activity, make sure that no other processes/queries are running on this database/table
The script must include TRANSACTION START/BEGIN, COMMIT and ROLLBACK
If you want the output file to contain INSERT queries, then a BATCH INSERT is required. Refer to this to get an idea of what a batch query looks like (How to do a batch insert in MySQL)
You can write a custom PHP function to generate the BATCH implementation
No need to touch the data that you want to keep; it will remain as it is
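A minimal sketch of those steps, assuming PDO, the zlib extension, a table named abc, and a timestamp column named created_at (all names hypothetical):
<?php
// Hypothetical connection details; adjust to your environment.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

$cutoff = date('Y-m-d H:i:s', strtotime('-3 months'));
$gz = gzopen('backup_' . date('Ymd') . '.sql.gz', 'w9');

$pdo->beginTransaction();
try {
    // Pull the old rows and write each one out as an INSERT statement.
    $stmt = $pdo->prepare('SELECT * FROM abc WHERE created_at < ?');
    $stmt->execute([$cutoff]);
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $values = array_map(
            fn($v) => $v === null ? 'NULL' : $pdo->quote((string)$v),
            array_values($row));
        gzwrite($gz, 'INSERT INTO abc (' . implode(', ', array_keys($row))
            . ') VALUES (' . implode(', ', $values) . ");\n");
    }
    // Delete only what was just backed up.
    $del = $pdo->prepare('DELETE FROM abc WHERE created_at < ?');
    $del->execute([$cutoff]);
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack(); // leave the table untouched on any failure
    throw $e;
} finally {
    gzclose($gz);
}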
I have a PHP page with a lot of queries that takes 2-3 minutes to load.
The database the queries hit is updated once a month (the 1st of every month).
The page hits different schemas of the database depending on user selection (the user selects options from a select box, which determines which schemas are used).
So, if the database is updated once a month, the same results per input selection are displayed until the next month.
Is there any solution (caching etc.) so that once any client has generated the page, it is reused for the remaining days instead of running the queries again?
Sounds like you might be able to just write the output to a file and have a separate command-line script that generates the reports once a month.
An example of someone doing this in PHP is at https://www.sanwebe.com/2013/09/php-cache-dynamic-pages-speed-up-load-times, although I'm sure there are lots of others out there :)
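A minimal file-cache sketch along those lines; the cache directory, the schema parameter, and the file naming are all assumptions. Embedding the month in the filename makes the cache expire on its own when the data changes:
<?php
// Derive a cache key from the user's selection, sanitized for use in a filename.
$schema    = preg_replace('/[^a-z0-9_]/i', '', $_GET['schema'] ?? 'default');
$cacheFile = __DIR__ . '/cache/report_' . $schema . '_' . date('Y_m') . '.html';

if (is_file($cacheFile)) {
    readfile($cacheFile); // serve this month's cached copy
    exit;
}

ob_start();
// ... run the expensive queries and echo the page here ...
$html = ob_get_clean();
file_put_contents($cacheFile, $html, LOCK_EX);
echo $html;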
2-3 minutes is too long; this means your database is not optimized. You should build the right indexes for your tables: analyze your queries and create indexes according to what they need.
You can also create a summary table that holds the data for your most frequent queries (MySQL has no materialized views, so a regular table rebuilt periodically plays that role). Refresh it once per month, or after every big update of your data in the DB.
Create a cron task that runs once per month to rebuild the summary table, then run your queries against it.
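A sketch of the monthly rebuild, with all table and column names as placeholders; filling a fresh copy and swapping it in keeps readers from ever seeing a half-built table:
<?php
$pdo->exec('CREATE TABLE report_summary_new LIKE report_summary');
$pdo->exec('INSERT INTO report_summary_new (user_id, total)
            SELECT user_id, SUM(amount) FROM big_table GROUP BY user_id');
// RENAME TABLE swaps both tables in one atomic step.
$pdo->exec('RENAME TABLE report_summary TO report_summary_old,
                         report_summary_new TO report_summary');
$pdo->exec('DROP TABLE report_summary_old');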
You can also think about replication if there are a lot of SELECT queries; try to spread them across several servers to reduce the load.
Google unfortunately didn't seem to have the answers I wanted. I currently own a small search engine website for specific content using PHP GET.
I want to add a latest searches page, meaning to have each search recorded, saved, and then displayed on another page, with the "most searched" at the top, or even the "latest search" at the top.
In short: Store my latest searches in a MySQL database (or anything that'll work), and display them on a page afterwards.
I'm guessing this would best be accomplished with MySQL, and then I'd like to output it in to PHP.
Any help is greatly appreciated.
Recent searches could be abused easily. All someone has to do is go onto your site and search for "your site sucks" or worse, and they've essentially defaced your site. I'd think hard before adding that feature.
In terms of building the most popular searches and scaling it nicely, I'd recommend:
Log queries somewhere. It could be a MySQL table, but a logfile would be more sensible, as it's a log.
Run a script/job periodically to extract/group data from the log
Have that periodic job populate some table with the most popular searches (see the sketch after this list)
I like this approach because:
A backend script does all of the hard work - no GROUP BY etc. is run by user requests
You can introduce filtering or any other logic in the backend script and it doesn't affect user requests
You don't ever need to put big volumes of data into the database
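A minimal sketch of that pipeline, assuming a plain-text log with one query per line and a popular_searches table (all names hypothetical):
<?php
// At search time: append the query to the logfile (cheap, no DB work).
file_put_contents('/var/log/mysite/searches.log',
    str_replace(["\r", "\n"], ' ', $_GET['q'] ?? '') . "\n",
    FILE_APPEND | LOCK_EX);

// In the periodic job: count the queries and keep the top 20.
$counts = array_count_values(array_filter(array_map('trim',
    file('/var/log/mysite/searches.log'))));
arsort($counts);
$top = array_slice($counts, 0, 20, true);

$pdo->beginTransaction();
$pdo->exec('DELETE FROM popular_searches');
$ins = $pdo->prepare('INSERT INTO popular_searches (query, hits) VALUES (?, ?)');
foreach ($top as $query => $hits) {
    $ins->execute([$query, $hits]);
}
$pdo->commit();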
Create a database, create a table (for example recent_searches) with fields such as query (the query searched) and timestamp (the Unix timestamp at which the query was made). Then, for your script, your MySQL query will be something like:
SELECT * FROM `recent_searches` ORDER BY `timestamp` DESC LIMIT 0, 5
This should return the 5 most recent searches, with the most recent one appearing first.
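Recording a search could then be a prepared insert like this sketch (same hypothetical table and columns as above):
<?php
$stmt = $pdo->prepare('INSERT INTO recent_searches (query, timestamp) VALUES (?, ?)');
$stmt->execute([$searchQuery, time()]); // $searchQuery holds the user's search term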
Create a table (named something like latest_searches) with fields query, searched_count, and results_count.
Then after each search (if results_count > 0), check whether this search query already exists in that table, and update the existing row or insert a new one.
And on some page you can just use data from this table.
It's pretty simple.
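With a unique index on query, the check-then-update collapses into a single upsert; a sketch with the hypothetical names above:
<?php
$stmt = $pdo->prepare(
    'INSERT INTO latest_searches (query, searched_count, results_count)
     VALUES (?, 1, ?)
     ON DUPLICATE KEY UPDATE searched_count = searched_count + 1,
                             results_count  = VALUES(results_count)');
$stmt->execute([$query, $resultsCount]);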
Ok, your question is not yet clear. But I'm guessing that you mean you want to READ the latest results first.
To achieve this, follow these steps:
When storing the results, use an extra field to hold a DATETIME. Note that When (and Table, if you really name your table that) is a reserved word in MySQL and must be backtick-quoted; also, in real code, pass $strSearchItem through a prepared statement rather than interpolating it. Your insert query will look like this:
INSERT INTO `Table` (SearchItem, `When`) VALUES ('$strSearchItem', NOW())
When retrieving, make sure you include an ORDER BY, like this:
SELECT * FROM `Table` ORDER BY `When` DESC
I hope this is what you meant to do :)
You simply store the link and the name of the link/search in MySQL, and add a timestamp to record when somebody searched for them. Then you pull them out of the DB ordered by the timestamp and display them on the website with PHP.
Create a table with three columns: search, link, timestamp.
Then write a PHP script to insert rows when needed (this is done when the user actually searches)
Your main page, where you want the stuff displayed, simply gets the data back out and puts each result into a link containing $nameOfWebsite
It's probably best to use a for/while loop to do step 3 (see the sketch below)
You could additionally add something like a counter to know which searches are the most popular / this would be another field in MySQL that you just keep updating (increasing it by one, but limited per IP)
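A sketch of the loop in step 3, using the column names assumed above and escaping user-supplied values before output:
<?php
$stmt = $pdo->query('SELECT search, link FROM searches ORDER BY timestamp DESC LIMIT 10');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    printf('<a href="%s">%s</a><br>',
        htmlspecialchars($row['link']),
        htmlspecialchars($row['search']));
}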
Is it possible to queue client requests for database access in MySQL? I am trying to do this for concurrency management. MySQL locks can be used, but somehow I am not able to get the desired outcome.
Effectively what I am trying to do is:
INSERT something in a new row
SELECT a column from that row
Store that value in a variable
The issue comes up when two different clients INSERT at the same time, so the variables for both clients store the value of the last INSERT.
I worked out the following alternative, but it failed in a few test runs, and the bug is quite evident:
INSERT
LOCK Table
SELECT
Store
UNLOCK
Thanks!
My best guess is that you have an auto-increment column and want to get its value after inserting a row. One option is to use LAST_INSERT_ID(), which is maintained per connection, so concurrent inserts by other clients cannot overwrite the value you read.
If this is not applicable, then please post some more details. What exactly are you trying to do and what queries are being fired?
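A sketch with PDO (mysqli's insert_id works the same way), assuming a table with an AUTO_INCREMENT id column:
<?php
$pdo->prepare('INSERT INTO jobs (payload) VALUES (?)')->execute([$payload]);
$newId = $pdo->lastInsertId(); // per-connection, so safe under concurrent inserts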
I am writing a PHP/MySQL application (using CodeIgniter) that uses some jQuery functionality for dragging table rows. I have a table in which the user can drag rows to the desired order (kind of a queue for which I need to preserve the rank of each row). I've been trying to figure out how to (and whether I should) update the database each time the user drops a row, in order to simplify the UI and avoid a "Save" button.
I have the jQuery working and can send a serialized list back to the server onDrop, but is it good design practice to run an update query this often? The table will usually have 30-40 rows max, but if the user drags row 1 far down the list, then potentially all the rows would need to be updated to update the rank field.
I've been wondering whether to send a giant query to the server, to loop through the rows in PHP and update each row with its own Update query, to send a small serialized list to a stored procedure to let the server do all the work, or perhaps a better method I haven't considered. I've read that stored procedures in MySQL are not very efficient and use a separate process for each call. Any advice as to the right solution here? Thanks very much for your help!
Any question that includes "The table will usually have 30-40 rows max" ends with "Do whatever you want to it." I can't imagine an operation, however frequently it's performed, that would have any appreciable performance impact on a table that tiny.
The only real question is what the visitor will be doing while your request is going to and returning from the server. Will they be locked out of making other changes? If not, make sure you have a mechanism to ensure that the most recent change is the one that's really taken effect. (It's possible for requests to reach the server out of order, and you wouldn't want an outdated request to get saved as the final state.)
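A sketch of a handler along those lines; it assumes the client posts the ordered row IDs plus an ever-increasing sequence number, and a one-row queue_meta table remembering the last sequence applied (all names hypothetical):
<?php
// $_POST['order'] = comma-separated row IDs in their new order
// $_POST['seq']   = client-side counter so stale requests can be ignored
$ids = array_map('intval', explode(',', $_POST['order'] ?? ''));
$seq = (int)($_POST['seq'] ?? 0);

$pdo->beginTransaction();
$last = (int)$pdo->query('SELECT last_seq FROM queue_meta FOR UPDATE')->fetchColumn();
if ($seq > $last) {
    // Apply the new ranking only if this request is newer than the last one saved.
    $stmt = $pdo->prepare('UPDATE queue SET `rank` = ? WHERE id = ?');
    foreach ($ids as $rank => $id) {
        $stmt->execute([$rank + 1, $id]);
    }
    $pdo->prepare('UPDATE queue_meta SET last_seq = ?')->execute([$seq]);
}
$pdo->commit();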