Change values in MySQL trigger with PHP?

I've created a financialTrack table in MySQL to log rows inserted into the financial table, and I created this trigger to do it:
CREATE TRIGGER INS_after_financ
AFTER INSERT ON `financial` FOR EACH ROW
BEGIN
INSERT INTO `financialTrack` (user, changedValue) VALUES (NEW.user, NEW.Value);
END;
This is the structure of my tables:
TABLE NAME: financial
+--------------+--------------+-------+-------+
| Column | Type | Null | AI |
+--------------+--------------+-------+-------+
| id | int(10) | FALSE | TRUE |
| user | VARCHAR(40) | FALSE | |
| Value | BIGINT(12) | FALSE | |
+--------------+--------------+-------+-------+
TABLE NAME: financialTrack
+--------------+--------------+-------+-----------------+
| Column | Type | Null | Def.Value |
+--------------+--------------+-------+-----------------+
| user | VARCHAR(40) | FALSE | |
| changedValue | BIGINT(12) | FALSE | |
| ts | timestamp | FALSE |CURRENT_TIMESTAMP|
+--------------+--------------+-------+-----------------+
Do you have any suggestions for filling the user field in the financialTrack table from a PHP script, so that I can remove the user column from the financial table?

There are several ways to approach this task, but this reading will surely help you learn the basics of handling database queries with PHP: http://php.net/manual/en/book.pdo.php.
The PDO extension is currently quite popular and is often preferred over the native mysql and mysqli extensions. You will find plenty of other useful information by searching for PDO on Stack Overflow.
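One common pattern, sketched below under assumptions (a PDO connection in $pdo, the logged-in user's name in $_SESSION['user'], and an $amount variable; these names are illustrative): set a MySQL user variable from PHP just before the insert, and let the trigger read it. The financial table then no longer needs a user column.

// PHP side: tell MySQL who is acting, then insert.
// The trigger reads @current_user on the same connection.
$pdo->prepare('SET @current_user = ?')->execute([$_SESSION['user']]);
$pdo->prepare('INSERT INTO `financial` (Value) VALUES (?)')->execute([$amount]);

-- Trigger side: read the user variable instead of NEW.user.
CREATE TRIGGER INS_after_financ
AFTER INSERT ON `financial` FOR EACH ROW
INSERT INTO `financialTrack` (user, changedValue)
VALUES (@current_user, NEW.Value);

Note that user variables live per connection, so be careful with persistent or pooled connections: always set @current_user immediately before the statement that fires the trigger.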

Related

How to efficiently calculate averages from a big table?

I have a table called ratings with the following fields:
+-----------+------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-----------+------------+------+-----+---------+----------------+
| rating_id | bigint(20) | NO | PRI | NULL | auto_increment |
| user_id | int(11) | NO | MUL | NULL | |
| movie_id | int(11) | NO | | NULL | |
| rating | float | NO | | NULL | |
+-----------+------------+------+-----+---------+----------------+
Indexes on this table:
+---------+------------+----------+--------------+-------------+-----------+-------------+----------+--------+------+------------+---------+---------------+
| Table | Non_unique | Key_name | Seq_in_index | Column_name | Collation | Cardinality | Sub_part | Packed | Null | Index_type | Comment | Index_comment |
+---------+------------+----------+--------------+-------------+-----------+-------------+----------+--------+------+------------+---------+---------------+
| ratings | 0 | PRIMARY | 1 | rating_id | A | 100076 | NULL | NULL | | BTREE | | |
| ratings | 0 | user_id | 1 | user_id | A | 564 | NULL | NULL | | BTREE | | |
| ratings | 0 | user_id | 2 | movie_id | A | 100092 | NULL | NULL | | BTREE | | |
+---------+------------+----------+--------------+-------------+-----------+-------------+----------+--------+------+------------+---------+---------------+
I have another table called movie_average_ratings which has the following fields:
+----------------+---------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+----------------+---------+------+-----+---------+-------+
| movie_id | int(11) | NO | PRI | NULL | |
| average_rating | float | NO | | NULL | |
+----------------+---------+------+-----+---------+-------+
As is obvious by this point, I want to calculate the average rating of movies from the ratings table and update the movie_average_ratings table. I tried the following SQL query:
UPDATE movie_average_ratings
SET average_rating = (SELECT AVG(rating)
FROM ratings
WHERE ratings.movie_id = movie_average_ratings.movie_id);
Currently there are around 10,000 movie records and 100,000 rating records, and I get a Lock wait timeout exceeded; try restarting transaction error. The number of records can grow significantly, so I don't think increasing the timeout is a good solution.
So, how can I write a 'scalable' query to achieve this? Is iterating over the movie_average_ratings records and calculating each average individually the most efficient solution?
Without an EXPLAIN, it's hard to be sure what's holding you up. It's also not clear that you will get a performance improvement by storing this aggregated data in a denormalized table: if the query to calculate the ratings executes in 0.04 seconds, querying your denormalized table is unlikely to be much faster.
In general, I recommend only denormalizing if you know you have a performance problem.
But that's not the question.
I would do the following:
DELETE FROM movie_average_ratings;

INSERT INTO movie_average_ratings
SELECT movie_id, AVG(rating)
FROM ratings
GROUP BY movie_id;
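If you'd rather not empty the table first, a single statement can do the same refresh. A sketch, relying on movie_id being the primary key of movie_average_ratings (which it is, per the structure above):

INSERT INTO movie_average_ratings (movie_id, average_rating)
SELECT movie_id, AVG(rating)
FROM ratings
GROUP BY movie_id
ON DUPLICATE KEY UPDATE average_rating = VALUES(average_rating);

Because it touches each movie_average_ratings row once with a precomputed value, it should hold row locks for much less time than the correlated-subquery UPDATE.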
I just found something in another post:
What is happening is, some other thread is holding a record lock on some record (you're updating every record in the table!) for too long, and your thread is being timed out.
This means that some of your records are locked; you can force-unlock them from the console:
1) Enter MySQL: mysql -u your_user -p
2) See the list of locked tables: mysql> show open tables where in_use>0;
3) See the list of current processes; one of them is locking your table(s): mysql> show processlist;
4) Kill one of these processes: mysql> kill put_process_id_here;
You could redesign the movie_average_ratings table to
movie_id (int)
sum_of_ratings (int)
num_of_ratings (int)
Then, when a new rating is added, you can fold it into movie_average_ratings and compute the average on demand, as sketched below.
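A minimal sketch of that incremental update, assuming movie_id is the primary key of the redesigned table (the movie id 42 and rating 4 are illustrative):

-- A new rating of 4 arrives for movie 42:
INSERT INTO movie_average_ratings (movie_id, sum_of_ratings, num_of_ratings)
VALUES (42, 4, 1)
ON DUPLICATE KEY UPDATE
    sum_of_ratings = sum_of_ratings + VALUES(sum_of_ratings),
    num_of_ratings = num_of_ratings + 1;

-- The average is computed on read:
SELECT movie_id, sum_of_ratings / num_of_ratings AS average_rating
FROM movie_average_ratings
WHERE movie_id = 42;

This touches exactly one row per new rating instead of re-aggregating the whole ratings table.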

Logging Application Activities on PHP and MySQL

I'm trying to make a small logging table in my database.
Users
+----+------+
| id | name |
+----+------+
| 1 | FOO |
| 2 | BAR |
| 3 | LOS |
+----+------+
Log_Users
+-------------+-------------------+------+-----+-------------------+-----------------------------+
| Field | Type | Null | Key | Default | Extra |
+-------------+-------------------+------+-----+-------------------+-----------------------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| old_id | int(11) | YES | | NULL | |
| old_name | varchar(100) | YES | | NULL | |
| new_id | int(11) | YES | | NULL | |
| new_name | varchar(100) | YES | | NULL | |
| action_type | enum('C','U','D') | YES | | NULL | |
| time | timestamp | NO | | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| doers | int(11) | YES | | NULL | |
+-------------+-------------------+------+-----+-------------------+-----------------------------+
I have a small application written in PHP that saves the user's id into the session. How do I send this user's id (from the PHP session) to a trigger on one of the tables, to log their activities, like deleting or updating other users? I've tried to use a trigger on the log table to do all of this, something like:
CREATE TRIGGER userTrigger BEFORE INSERT ON Log_Users FOR EACH ROW
BEGIN
    IF (NEW.action_type = 'C') THEN
        INSERT INTO Users (id, name) VALUES (NEW.new_id, NEW.new_name);
    ELSEIF (NEW.action_type = 'U') THEN
        UPDATE Users SET id = NEW.new_id, name = NEW.new_name WHERE id = NEW.old_id;
    ELSEIF (NEW.action_type = 'D') THEN
        SET NEW.old_name = (SELECT name FROM Users WHERE id = NEW.old_id);
        DELETE FROM Users WHERE id = NEW.old_id;
    END IF;
END~
But I'm struggling when users update multiple records in the same column. In the end, what is an optimal way to log activities using PHP and MySQL, and how do I implement it? I have no solution for this right now. Thank you.
I've never done this using triggers, so sadly I can't help you with that. Here is how I usually do it:
Your users should NEVER have direct access to mysql or phpMyAdmin; they should create, edit, delete and do everything else through a PHP script you provide. This way you have total control over what your users can and can't do, and you greatly narrow the possible actions performed, so logging them is much easier. For example:
You have a PHP script that users use to do some work, say insert a new row; right after that you do an insert on the log table recording this last action.
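A minimal sketch of that idea, assuming InnoDB tables, a PDO connection in $pdo, the acting user's id in $_SESSION['user_id'], and $userId/$newName holding the record being changed (all names are illustrative):

// Perform the action and its log entry atomically.
$pdo->beginTransaction();
try {
    $stmt = $pdo->prepare('UPDATE Users SET name = ? WHERE id = ?');
    $stmt->execute([$newName, $userId]);

    $log = $pdo->prepare(
        'INSERT INTO Log_Users (old_id, new_id, new_name, action_type, doers)
         VALUES (?, ?, ?, ?, ?)'
    );
    $log->execute([$userId, $userId, $newName, 'U', $_SESSION['user_id']]);

    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack(); // keep the action and its log entry in sync
    throw $e;
}

The transaction guarantees you never get an action without its log entry (or vice versa), and it also covers the multi-record update case: run one log insert per affected record inside the same transaction.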

JSON data in MySQL

I have a mysql table called "Data",
+---------+------------------+------+-----+-------------------+----------------+
| Field | Type | Null | Key | Default | Extra |
+---------+------------------+------+-----+-------------------+----------------+
| id | int(10) unsigned | NO | PRI | NULL | auto_increment |
| data | text | YES | | NULL | |
| created | timestamp | NO | MUL | CURRENT_TIMESTAMP | |
+---------+------------------+------+-----+-------------------+----------------+
The field "data" has values like this:
606 | {"first_name":"JOHN","last_name":"SLIFKO","address":"123 main AVE","city":"LAKEWOOD","state":"OH","zip":"20190","home_phone":2165216359,"email":"john@gmail.com",} | 2012-12-04 16:37:23 |
So, it is saving the records in JSON format from a PHP script that I have.
THIS IS THE THING:
How can I structure this table to make searches or lookups on every single field faster, with queries like:
SELECT * FROM Data WHERE first_name = 'john';
How can I do this? Help please.
Yikes. Not a good design. About the best you can do is use the LIKE keyword:
Select * from Data Where data like '%"first_name":"JOHN"%'
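If upgrading MySQL is an option, 5.7+ can index into JSON directly via a generated column. A sketch, assuming the stored text is valid JSON (note the sample row above has a trailing comma that would need cleaning first; the column and index names are illustrative):

ALTER TABLE Data
    ADD COLUMN first_name VARCHAR(100)
        AS (JSON_UNQUOTE(JSON_EXTRACT(data, '$.first_name'))) STORED,
    ADD INDEX idx_first_name (first_name);

SELECT * FROM Data WHERE first_name = 'JOHN';

The generated column is computed from the JSON once per write, and the index makes the equality search fast. The cleaner long-term fix is still to promote the fields you search on to real columns.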

Speed Up MySQL (MyISAM) COUNTs with WHERE Clauses

We are implementing a system that analyses books. The system is written in PHP; for each book it loops through the words and analyses each of them, setting certain flags (which translate to database fields) based on various regular expressions and other tests.
This results in a matches table, similar to the example below:
+------------------------+--------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+------------------------+--------------+------+-----+---------+----------------+
| id | bigint(20) | NO | PRI | NULL | auto_increment |
| regex | varchar(250) | YES | | NULL | |
| description | varchar(250) | NO | | NULL | |
| phonic_description | varchar(255) | NO | | NULL | |
| is_high_frequency | tinyint(1) | NO | | NULL | |
| is_readable | tinyint(1) | NO | | NULL | |
| book_id | bigint(20) | YES | | NULL | |
| matched_regex | varchar(255) | YES | | NULL | |
| [...] | | | | | |
+------------------------+--------------+------+-----+---------+----------------+
Most of the omitted fields are tinyint, either 0 or 1. There are currently 25 fields in the matches table.
There are ~2,000,000 rows in the matches table, the output of analyzing ~500 books.
Currently, there is a "reports" area of the site which queries the matches table like this:
SELECT COUNT(*)
FROM matches
WHERE is_readable = 1
AND other_flag = 0
AND another_flag = 1
However, at present it takes over a minute to fetch the main index report, as each query takes about 0.7 seconds. I am caching this at the query level, but it still takes too long for the initial page load.
As I am not very experienced in managing datasets like this, can anyone advise me on a better way to store or query this data? Are there any optimisations I can use with MySQL to improve the performance of these COUNTs, or am I better off using another database or data structure?
We are currently using MySQL with MyISAM tables and a VPS for this, so switching to a new database system altogether isn't out of the question.
You need to use indexes; create them on the columns you filter on in a WHERE most frequently.
ALTER TABLE `matches` ADD INDEX ( `is_readable` )
etc..
You can also create indexes on multiple columns; if you're doing the same type of query over and over, it's useful. phpMyAdmin has the index option on the table's structure page, at the bottom.
Add a multi-column index to this table, since you are selecting by more than one field. The index below should help a lot. This type of index is very good for boolean/int columns. For indexes on varchar values, read more here: http://dev.mysql.com/doc/refman/5.0/en/create-index.html
ALTER TABLE `matches` ADD INDEX ( `is_readable`, `other_flag`, `another_flag` )
One more thing: check your queries by using EXPLAIN {YOUR WHOLE SQL STATEMENT} to see which index the DB uses. So in this example you should run:
EXPLAIN SELECT COUNT(*) FROM matches WHERE is_readable = 1 AND other_flag = 0 AND another_flag = 1;
More info on EXPLAIN: http://dev.mysql.com/doc/refman/5.0/en/explain.html

Uploading a CSV into MySQL using PHP, with updates as well

OK, so I have a database table called requests with this structure:
mysql> desc requests;
+------------+--------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+------------+--------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| artist | varchar(255) | YES | | NULL | |
| song | varchar(255) | YES | | NULL | |
| showdate | date | YES | | NULL | |
| amount | float | YES | | NULL | |
+------------+--------------+------+-----+---------+----------------+
Here is some example data
+----+-----------+-------------------------+------------+--------+
| id | artist | song | showdate | amount |
+----+-----------+-------------------------+------------+--------+
| 6 | Metallica | Hello Cruel World | 2010-09-15 | 10.00 |
| 7 | someone | Some some | 2010-09-18 | 15.00 |
| 8 | another | Some other song | 2010-11-10 | 45.09 |
+----+-----------+-------------------------+------------+--------+
I need a way to let a user upload a CSV with the same structure, and have it update or insert based on what's in the CSV. I have found many scripts online, but most have a hard-coded CSV, which is not what I need. I need the user to be able to upload the CSV... Is that easy with PHP?
Here is an example csv
id artist song showdate amount
11 NewBee great stuff 2010-09-15 12.99
10 anotherNewbee even better 2010-09-16 34.00
6 NewArtist New song 2010-09-25 78.99
As you can see, I have id 6, which is already in the database and needs to be updated. The other two will get inserted.
I am not asking for someone to write the whole script, but I could use some direction on the upload and where to go from there... thanks.
Create a stored procedure (MySQL syntax) like the one below and test it; it works:
DELIMITER //
CREATE PROCEDURE csv(
    IN p_id INT,
    IN p_artist VARCHAR(50),
    IN p_songs VARCHAR(100),
    IN p_showdate DATETIME,
    IN p_amount FLOAT
)
BEGIN
    -- Note that dummy1 is my table.
    IF EXISTS (SELECT id FROM dummy1 WHERE id = p_id) THEN
        UPDATE dummy1
        SET artist = p_artist, songs = p_songs, showdate = p_showdate, amount = p_amount
        WHERE id = p_id;
    ELSE
        INSERT INTO dummy1 (artist, songs, showdate, amount)
        VALUES (p_artist, p_songs, p_showdate, p_amount);
    END IF;
END //
DELIMITER ;
1) Upload the file to a directory using move_uploaded_file.
2) Use fgetcsv to read the uploaded CSV and process each row as you like (a sketch follows below).
3) Delete the CSV file.
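A minimal sketch of those three steps, assuming a comma-separated file with a header row, a form file input named csv_file, and a PDO connection in $pdo (all names are illustrative):

// 1) Move the upload out of PHP's temp directory.
$path = '/tmp/requests_' . uniqid() . '.csv';
move_uploaded_file($_FILES['csv_file']['tmp_name'], $path);

// 2) Read each row and insert-or-update on the primary key.
$stmt = $pdo->prepare(
    'INSERT INTO requests (id, artist, song, showdate, amount)
     VALUES (?, ?, ?, ?, ?)
     ON DUPLICATE KEY UPDATE
         artist = VALUES(artist), song = VALUES(song),
         showdate = VALUES(showdate), amount = VALUES(amount)'
);
$fh = fopen($path, 'r');
fgetcsv($fh); // skip the header row
while (($row = fgetcsv($fh)) !== false) {
    $stmt->execute($row); // [id, artist, song, showdate, amount]
}
fclose($fh);

// 3) Remove the file.
unlink($path);

The ON DUPLICATE KEY UPDATE clause is what makes row id 6 update in place while ids 10 and 11 are inserted.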
