I have a logging database where I archive new tables every week, like:
log_20170823
log_20170816
log_20170809
log_20170802
log_20170726
How can I easily merge all these tables into one table on another (archive) server, for query simplicity? The current procedures all use the "log" table.
I mean, I know I can use UNION, but I want this to be dynamic, because the archiving of the tables is itself dynamic, meaning I won't necessarily know the names of the archived tables.
So far I was thinking of doing:
SELECT *
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND TABLE_NAME LIKE 'log\_%'  -- underscore escaped: a bare '_' in LIKE matches any single character
And then save the result into an array, loop over the array, and build a dynamic string with the names of the tables I'd need to join/union/merge.
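A minimal sketch of the string-building step described above, in Python for illustration (the table names and query shape are assumptions based on the question; in MySQL the name list would come from the INFORMATION_SCHEMA query):

```python
# Sketch: given the matching table names, build one UNION ALL statement.
# UNION ALL is used rather than UNION so duplicate log rows are kept and
# no de-duplication sort is performed.

def build_union_query(table_names):
    """Build a single UNION ALL query over the given log tables."""
    if not table_names:
        raise ValueError("no tables to merge")
    parts = ["SELECT * FROM `%s`" % name for name in sorted(table_names)]
    return "\nUNION ALL\n".join(parts)

# In MySQL these names would come from:
#   SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES
#   WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME LIKE 'log\_%'
tables = ["log_20170823", "log_20170816", "log_20170809"]
sql = build_union_query(tables)
```

The resulting string could then be run as a prepared statement, or used as the body of a view.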
Is there any other way to do this directly in MySQL?
My way seems sketchy and I'm worried about its performance.
Thanks in advance.
Related
I have a performance and best-practice question concerning MySQL tables.
I'm working on an application which connects to a database that gets filled by other programs.
This system is deployed in different locations, and from location to location the names of some database tables can change (but the fields in these tables stay the same).
As I don't want to change all the SQL queries in my application for every location, I thought about creating a MySQL view which simply mirrors the contents of such a table under the normally used table name.
Is this a suitable solution, or could it get awfully slow with big tables?
Simple views (created as SELECT * FROM table) behave like the specified table performance-wise.
It should be a suitable solution for your case.
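A self-contained demo of the idea, using Python's sqlite3 as a stand-in for MySQL (table and view names are invented for illustration): the application always queries one fixed view name, and only the view definition changes per deployment.

```python
# Demonstrate that a view defined as SELECT * FROM the real table lets
# queries keep using one fixed name while the underlying table name
# varies per location.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE site_a_orders (id INTEGER, amount REAL)")
conn.execute("INSERT INTO site_a_orders VALUES (1, 9.5), (2, 3.0)")

# The application always queries `orders`; only this one statement
# changes from deployment to deployment.
conn.execute("CREATE VIEW orders AS SELECT * FROM site_a_orders")

rows = conn.execute("SELECT id, amount FROM orders ORDER BY id").fetchall()
```

In MySQL such a trivial view is handled with the MERGE algorithm, i.e. it is folded into the outer query rather than materialized.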
Mmm, this is tricky. If there are multiple tables, then a quick and dirty version would be something like:
SELECT * FROM (SELECT * FROM table1
               UNION
               SELECT * FROM table2
               UNION
               SELECT * FROM table3) t
Which I think will work (UNION ALL would be faster here, since plain UNION sorts the combined result to remove duplicate rows). You will of course have problems with pagination, sorting and searching, because you will have to do these over three or more tables.
Another way would be this:
Create a table with the table names and a counter
ImportTable
name
id
Now in this you can enter the names of the tables and the last id that you want to import from.
Create another table to import the records
TableRecords
source
id
field1
field2
etc
Now run something that goes through the tables listed in ImportTable, grabs any new records and shoves them into TableRecords.
Now this becomes really simple: you can query TableRecords and have pagination, sorting and searching with none of the previous troubles.
Make something that runs this every 2 minutes, say, so TableRecords will be 2 minutes behind, but everything will be really easy and run like a dream.
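A rough sketch of that import loop, in Python with sqlite3 so it is runnable end to end (MySQL would work the same way; the column names and sample data are assumptions, the table names come from the answer):

```python
# ImportTable remembers the last imported id per source table; each run
# copies only the rows newer than that id into TableRecords, then
# advances the bookmark.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table1 (id INTEGER PRIMARY KEY, field1 TEXT);
    CREATE TABLE ImportTable (name TEXT PRIMARY KEY, id INTEGER);
    CREATE TABLE TableRecords (source TEXT, id INTEGER, field1 TEXT);
    INSERT INTO table1 VALUES (1, 'a'), (2, 'b'), (3, 'c');
    INSERT INTO ImportTable VALUES ('table1', 1);  -- imported up to id 1
""")

def run_import(conn):
    for name, last_id in conn.execute("SELECT name, id FROM ImportTable").fetchall():
        # The table name comes from our own ImportTable, not user input,
        # so interpolating it here is acceptable for this sketch.
        rows = conn.execute(
            "SELECT id, field1 FROM %s WHERE id > ? ORDER BY id" % name,
            (last_id,),
        ).fetchall()
        for row_id, field1 in rows:
            conn.execute("INSERT INTO TableRecords VALUES (?, ?, ?)",
                         (name, row_id, field1))
        if rows:
            conn.execute("UPDATE ImportTable SET id = ? WHERE name = ?",
                         (rows[-1][0], name))

run_import(conn)  # e.g. triggered by cron every 2 minutes
imported = conn.execute("SELECT source, id FROM TableRecords ORDER BY id").fetchall()
```

After one run, the two new rows have been copied and the bookmark in ImportTable has moved to the highest imported id.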
I have a trivial question. I'm using PHP + MySQL to manage a huge DB.
I want to search an entire table for a keyword typed into an input.
The problem is that the main table has 100+ columns, so I had to write the PHP query manually:
[...]
$sql="select *
from db
where ID LIKE '%".$q."%' or USER_ID LIKE '%".$q."%' or Phone_ID LIKE '%".$q."%' or
Fax_ID LIKE '%".$q."%' or email_ID LIKE '%".$q."%' or [...]
And this is chaos whenever I modify, add, or remove a column...
Is there any other way to do this search? If not, I thought about creating a separate PHP function that obtains all the column names and builds the query automatically.
I tried to look for info with no success
https://stackoverflow.com/search?q=search+entire+table
Unfortunately there isn't any simple way to do this.
One option is to select all the columns of the table, fetch them as an array, iterate over it, and build your WHERE clause:
SELECT column_name FROM information_schema.columns
WHERE table_name = 'TableName'
This will make the whole script slower; if you want to go this way, I would recommend using some caching.
You could get the column info for the 'main table' using info from the information schema. Here are some methods for using MySQL. Here is how to do it using PHP.
You can do a SHOW COLUMNS on the table, then loop over the Field values to get all the column names; at least that way you don't have a hand-coded mess to deal with.
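A possible sketch of the approach in the answers above, in Python for illustration (the table and column names are invented examples): build the OR'd LIKE clause from the column list, and bind the keyword as a parameter rather than concatenating it, since the string-concatenated query in the question is open to SQL injection.

```python
# Build a "search every column" query from a list of column names.
# In MySQL the columns would come from SHOW COLUMNS or
# information_schema.columns, ideally cached between requests.

def build_search_query(table, columns, q):
    """Return (sql, params) searching every column for the keyword."""
    where = " OR ".join("`%s` LIKE ?" % col for col in columns)
    params = ["%" + q + "%"] * len(columns)
    return "SELECT * FROM `%s` WHERE %s" % (table, where), params

sql, params = build_search_query("db", ["ID", "USER_ID", "Phone_ID"], "smith")
```

The (sql, params) pair can then be handed to a prepared-statement API (PDO in the PHP case), so adding or removing a column never requires editing the query by hand.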
EDIT: Well, I guess I should have asked this before the question below: would it be better to have a database full of tables (one per college name) that store numbers which can be sorted in ascending order, or a database with one table, selecting all the rows with the same "college name" and then sorting the data from those rows?
"
Is it possible to add a table in a database like...
CREATE TABLE table_name
(
column_name1 data_type,
column_name2 data_type,
column_name3 data_type,
....
)
...but call from a webpage instead of adding a table through mysql? So make a table in a database from code on my website?"
Yes, you can send SQL queries through PHP.
Here is a resource that shows just what you're looking for, I think:
PHP MySQL Create Database and Tables
edit:
It depends on what you're doing, but I agree with the above comments that creating a table on page view is in most cases the wrong move.
If they all have the same basic structure, I would put them all in the same table, and you can index the "college name" column. Reading from the database will still be quick even with very many rows, and if you decide to change something later you won't have to change X number of tables.
You can also retrieve sorted results:
SELECT * FROM Colleges WHERE name = 'University of Wisconsin' ORDER BY student_count ASC
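A runnable version of the single-table design above, using Python's sqlite3 in place of MySQL (the sample rows and index name are invented): one Colleges table with an index on the name column, instead of one table per college.

```python
# One table for all colleges; the index on name makes the per-college
# lookup fast, and ORDER BY handles the ascending sort.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Colleges (name TEXT, student_count INTEGER)")
conn.execute("CREATE INDEX idx_colleges_name ON Colleges (name)")
conn.executemany("INSERT INTO Colleges VALUES (?, ?)", [
    ("University of Wisconsin", 43000),
    ("University of Wisconsin", 1200),   # e.g. a satellite campus row
    ("Some Other College", 8000),
])

rows = conn.execute(
    "SELECT student_count FROM Colleges WHERE name = ? ORDER BY student_count ASC",
    ("University of Wisconsin",),
).fetchall()
```

Adding a new college is now an INSERT rather than a CREATE TABLE, which is exactly why the one-table design scales better.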
I'm using MySQL v5.0.77 and I was wondering if it is possible to count the number of tables in a database. I have done a lot of research and am unable to find a way to do this that is not deprecated.
For each user that signs up, I have been generating a table in a database for them. And now I am trying to get a live count of those tables, if that makes sense. Is this a logical way of storing user information? I am unable to programmatically create entire databases on my server.
You can run a query against information_schema.tables:
SELECT COUNT(*) FROM information_schema.TABLES
WHERE TABLE_SCHEMA='$database_name';
You could do it like this:
SELECT COUNT(*) FROM information_schema.tables WHERE table_schema='YOUR_DB_NAME_HERE'
http://dev.mysql.com/doc/refman/5.0/en/information-schema.html
For each user that signs up I had generated a table in a database for them. And now I am trying to get a live count of those tables, if that makes sense. Is this a logical way of storing user information?
If you're creating a separate table for each and every user then probably not. There are better ways to store the data and relationships, which can be more complicated to learn but will provide far more flexibility, abilities, and scalability later on.
I think you're probably trying to reproduce the functionality of the database in your programming e.g. getting a list of users would require you to run a query on every single user table.
I recommend you take a look at database design and database normalization (try to focus on the concept of how best to store data first without getting bogged down in the specifics).
I don't know if this is slower than ajreal's answer, but I've been using:
$sql = "SHOW TABLES";
$res = mysql_query($sql);
$num_tables = mysql_num_rows($res);
(Note that the old mysql_* extension is deprecated and removed in PHP 7; mysqli_query/mysqli_num_rows or PDO are the modern equivalents.)
Open a terminal and type use myDB, then show tables;
It will give you the table names, and the last line shows the total count, e.g. 47 rows in set (0.00 sec)
Or run the query below:
SELECT COUNT(*) FROM information_schema.tables WHERE table_schema='myDB'
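The same counting idea in a runnable form, using Python's sqlite3 where the catalog is sqlite_master rather than MySQL's information_schema.tables (the table names are invented for the demo):

```python
# Count the tables in the current schema by counting rows in the
# database's own catalog table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t1 (x INTEGER)")
conn.execute("CREATE TABLE t2 (x INTEGER)")

(table_count,) = conn.execute(
    "SELECT COUNT(*) FROM sqlite_master WHERE type = 'table'"
).fetchone()
```

In MySQL the filter is WHERE table_schema = 'myDB' instead of WHERE type = 'table', but the shape of the query is the same.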
I have an array of data :
$ary = array(
array("domain"=>"domain1", "username"=>"username1"),
array("domain"=>"domain1", "username"=>"username2"),
array("domain"=>"domain2", "username"=>"username3"),
);
I need to use this data to retrieve data from a MySql database with the following table structure (simplified for illustration).
domain_table
domain_id
domain
user_table
user_id
user
stuff_table
stuff_id
... details
link_table
link_id
user_id -- The user we are searching for connections on
connected_user_id -- from the array above
stuff_id
I need to fetch every row in the stuff table for a single user_id that also has a connected_user_id from the array.
I'm using PDO
There may be hundreds (possibly thousands) of entries in $ary.
I could generate a very large query by looping through the array and adding loads of joins.
I could perform a single query for each row in $ary.
I could create a temporary table with $ary and use a join.
Something else?
What is the best way - fastest processor time without being too arcane - to achieve this?
Performing a single query for each row is a bad approach because it is slow.
Many joins are better than one query per row.
If it is possible, make a view and use it.
If your entire dataset is HUGE and doesn't fit in memory, joins shouldn't be your choice.
Do sequential selects: select rows from your link_table and gather the user_ids out of the result in PHP. Then select rows from user_table using "WHERE user_id IN (?)". Handle the grouping of results in PHP.
Even with large tables, selects by key will be fast, and having 2-5 selects instead of 1 select is not a problem.
Joins will be fast while your DB fits into RAM. Then problems arise.
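A sketch of the sequential-select suggestion above, in Python for illustration, with the one fiddly part spelled out: "user_id IN (?)" has to be expanded to one placeholder per value, since most drivers (PDO included) do not expand arrays for you. The table and column names follow the question; the id values are invented.

```python
# Expand an IN clause to one bound placeholder per value, so the ids
# gathered from the first (link_table) select can be fed safely into
# the second select.

def in_clause_query(column, values):
    """Return (sql_fragment, params) for a parameterized IN clause."""
    placeholders = ", ".join("?" for _ in values)
    return "%s IN (%s)" % (column, placeholders), list(values)

connected_ids = [3, 7, 12]  # gathered from the link_table result
fragment, params = in_clause_query("user_id", connected_ids)
sql = "SELECT * FROM user_table WHERE " + fragment
```

With a few hundred ids this stays one indexed lookup per value; with thousands, the temporary-table-plus-join option from the question becomes the better trade.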