I have the following problem:
I have a MySQL database with songs in it. The database has the following structure:
id INT(11)(PRIMARY)
title VARCHAR(255)
album VARCHAR(255)
track INT(11)
duration INT(11)
The user should be able to enter a specific time into a PHP form, and the PHP function should give him a list of all possible combinations of songs that add up to the given time ±X minutes.
So if the user wants to listen to 1 hour of music ±5 minutes, he would enter 60 minutes and a 5-minute threshold into the form and would receive all possible song sets that add up to a total of 55 to 65 minutes. It should not print out duplicates.
I have already found several approaches to this problem, but they only gave me back the durations that add up to X, not the song names etc. So my problem is how to solve this in a way that gives me back the IDs of the songs which add up to the desired time, or prints out the list with the corresponding song names.
This seems to be one of the best answers I found, but I am not able to adapt it to my database.
What you are describing is a Knapsack Problem. When I was in college, we used Dynamic Programming to attack problems like this.
The brute-force method (just try every combination until one or more works) is exponential: there are O(2^n) possible subsets to iterate through and evaluate, which becomes extremely lengthy to calculate very quickly!
If your times for tracks are stored as an int (in seconds seems the easiest math to me), then your knapsack size is 3300-3900 (3600 seconds == 1 hour).
Is your goal to always return the first set that matches, or to always return a random set?
Note - by bounding your sack size, you greatly expand the number of possible answers.
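To make the knapsack idea concrete, here is a minimal sketch in PHP that returns the song IDs instead of only the matching durations. It assumes the songs table and columns from the question, that duration is stored in seconds, and placeholder PDO credentials; it uses plain recursive backtracking rather than a DP table, so it is only practical for a fairly small number of songs.

```php
<?php
// Minimal sketch: enumerate song combinations whose total duration falls
// inside [target - threshold, target + threshold]. Table and column names are
// taken from the question; the PDO credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=music;charset=utf8mb4', 'user', 'pass');

$targetSeconds    = 60 * 60;  // 60 minutes
$thresholdSeconds = 5 * 60;   // plus/minus 5 minutes
$min = $targetSeconds - $thresholdSeconds;
$max = $targetSeconds + $thresholdSeconds;

// Load id and duration once; durations are assumed to be stored in seconds.
$songs = $pdo->query('SELECT id, title, duration FROM songs ORDER BY id')
             ->fetchAll(PDO::FETCH_ASSOC);

$results = [];

// Recursive backtracking: at each index we either include the song or skip it.
// Because we only move forward through the list, every set is produced exactly
// once, so there are no duplicate combinations.
function collect(array $songs, int $index, int $sum, array $picked, int $min, int $max, array &$results): void
{
    if ($sum >= $min && $sum <= $max) {
        $results[] = $picked;            // store the IDs of one valid set
        // keep going: longer sets may still fit inside the window
    }
    if ($sum > $max || $index >= count($songs)) {
        return;                          // prune: over the limit or no songs left
    }
    // include songs[$index]
    collect($songs, $index + 1, $sum + (int) $songs[$index]['duration'],
            array_merge($picked, [$songs[$index]['id']]), $min, $max, $results);
    // skip songs[$index]
    collect($songs, $index + 1, $sum, $picked, $min, $max, $results);
}

collect($songs, 0, 0, [], $min, $max, $results);

// $results now holds arrays of song IDs; join back to title/album as needed.
foreach ($results as $ids) {
    echo implode(',', $ids), PHP_EOL;
}
```

For a larger library, the dynamic-programming approach described above would instead build a table of reachable sums and record, for each sum, which song indices produced it, so the IDs can still be reconstructed without enumerating every subset.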
I know that title sounds confusing - let me explain the situation. I have a table called hands. Each row in this table is a specific combination of 4 playing cards from a deck of cards, but the hand itself is not unique. What IS unique is the specific combination of hand + sim_id. The table looks like this:
hand / sim_id / percent
AsKsQdJd / 346 / 100
There are 270,000 unique combinations of 4 playing cards, but not every unique combination is currently stored in the database. The percent column displays what percent of the time the player should play that specific hand. When I imported, I only imported hands with percent > 0.
Now, I want to retroactively add ALL combinations to the database for each sim_id. In other words, for all unique 4-card combinations of hand for a given sim_id that are NOT currently in the database, I want to add them with percent = 0.
I can think of a lot of slow and dumb ways to do this, like literally looping through all possible combinations and checking if they exist for all possible sim_ids, but the database is currently 60 million rows and this update will bring it to >200 million, so time is definitely of the essence for this operation. Thanks in advance.
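One common set-based pattern for this kind of back-fill (sketched here as an assumption, not something stated in the question) is to generate the full list of 4-card combinations once into a helper table and then let MySQL insert all missing (hand, sim_id) pairs in a single INSERT ... SELECT, rather than looping in PHP. The all_hands helper table and the connection details below are hypothetical.

```php
<?php
// Hedged sketch of a set-based fill-in. Assumes a helper table all_hands(hand)
// that already contains every 4-card combination (populated once by a small
// script), and that the distinct sim_ids can be read from the existing hands
// table. Names other than hands/hand/sim_id/percent are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=poker;charset=utf8mb4', 'user', 'pass');

// Cross join every possible hand with every sim_id, keep only the pairs that
// are not already stored, and insert them with percent = 0. The anti-join
// (LEFT JOIN ... IS NULL) lets MySQL do the work instead of a PHP loop.
$sql = "
    INSERT INTO hands (hand, sim_id, percent)
    SELECT a.hand, s.sim_id, 0
    FROM all_hands AS a
    CROSS JOIN (SELECT DISTINCT sim_id FROM hands) AS s
    LEFT JOIN hands AS h
           ON h.hand = a.hand AND h.sim_id = s.sim_id
    WHERE h.hand IS NULL
";
$affected = $pdo->exec($sql);
echo "Inserted {$affected} missing rows\n";
```

A composite index (or primary key) on hands (hand, sim_id) is what keeps the anti-join from scanning the whole 60-million-row table, and running the statement one sim_id at a time keeps each transaction to a manageable size.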
I have a small table called "DataVisitorActivity" with these fields:
id int auto_increment primary key,
vID int null,
category varchar(128) null,
timestamp timestamp default CURRENT_TIMESTAMP not null,
value text null,
handle text null
It has two indexed fields:
handle_index(handle)
DataVisitorActivity_vID_index(vID)
Until now I had no performance problems; everything ran in around 0.01 seconds.
Currently the table has around 2 million entries, and it gets bigger every day (we save every website the user visits in this list).
The only thing I had to change the last time I edited the table was to set "handle" to "text", because we have really long strings that get saved in that field.
With that change, the query I use is
SELECT COUNT(*) AS `blog_count`, handle FROM DataVisitorActivity WHERE value = "blog" GROUP BY handle ORDER BY blog_count DESC Limit 5
and it now needs 0.1-0.3 seconds, which is still fine for me.
I now see that the query sometimes (seemingly at random) needs around 5-15 seconds to execute.
I wrote a while loop and let it run 10x10 times, 100 runs in total.
Around 60 runs were under 1 second, 20 were under 5 seconds, and all the others took longer than 5 seconds.
So my question is: is this query taking so long because the table is getting bigger and bigger? Why does the execution time vary so much?
Edit: In phpMyAdmin this query executes in under 0.001 seconds every time.
I would think that your GROUP BY handle is the problem. How big can the field get and do you have an index on it? Check here for indices on text columns: https://dev.mysql.com/doc/refman/5.5/en/column-indexes.html.
A possible solution would be to add a column where you store for example a sha1 hash of the handle column. That will have a fixed width so you can easily add an index - and GROUP BY - on that. Then use EXPLAIN to see where you can improve more.
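As a rough illustration of that suggestion, this is what the hash column could look like; the column name handle_hash, the index name, and the connection details are made up here, and the hash is backfilled with a one-off UPDATE since generated columns are not available on MySQL 5.5.

```php
<?php
// Sketch of the suggested fixed-width hash column. Run the DDL once; after
// that, write the hash on every insert and GROUP BY the hash instead of the
// full TEXT column.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

// 1) Add the column and backfill it from the existing data.
$pdo->exec("ALTER TABLE DataVisitorActivity ADD COLUMN handle_hash CHAR(40) NULL");
$pdo->exec("UPDATE DataVisitorActivity SET handle_hash = SHA1(handle)");

// 2) A fixed-width CHAR(40) can be indexed in full, unlike the TEXT column.
$pdo->exec("CREATE INDEX idx_handle_hash ON DataVisitorActivity (handle_hash)");

// 3) Group on the hash; pick any one handle per group for display.
$stmt = $pdo->prepare("
    SELECT COUNT(*) AS blog_count, MIN(handle) AS handle
    FROM DataVisitorActivity
    WHERE value = :value
    GROUP BY handle_hash
    ORDER BY blog_count DESC
    LIMIT 5
");
$stmt->execute(['value' => 'blog']);
print_r($stmt->fetchAll(PDO::FETCH_ASSOC));
```

New rows would then also need handle_hash populated (on MySQL 5.7+ a generated column could maintain it automatically), and EXPLAIN will show whether the WHERE value = 'blog' part needs similar treatment.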
I am currently working on a simple booking system and I need to select some ranges and save them to a mysql database.
The problem I am facing is deciding if it's better to save a range, or to save each day separately.
There will be around 500 properties, and each will have from 2 to 5 months booked.
So the client will insert his property and will choose some dates that will be unavailable. The same will happen when someone books a property.
I was thinking of having a separate table for unavailable dates only, so if a property is booked from 10 June to 20 June, instead of having one record (2016-06-10 => 2016-06-20) I would have 10 records, one for each booked day.
I think this is easier to work with when searching between dates, but I am not sure.
Will the performance be noticeably worse?
Should I save the ranges or single days?
Thank you
I would advise that all "events" go into one table and they all have a start and end datetime. Use of indexes on these fields is of course recommended.
The reasons are that when you are looking for bookings and available events, you are not selecting from two different tables (or joining them). And storing a full range is much better for the code, as you can easily perform the checks within a SQL query, and all the PHP code that handles events works the same way for both. If you store one event type differently from another, you'll find loads of "if"s in your code and find it harder to write the SQL.
I run many booking systems at present and have made mistakes in this area before so I know this is good advice - and also a good question.
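For illustration, here is a minimal sketch of the range check this answer alludes to; the events table name, its columns and the connection details are assumptions, but the overlap condition itself (an existing range conflicts when it starts before the new end and ends after the new start) is the standard one.

```php
<?php
// Minimal sketch of the range-overlap check, assuming one table `events`
// (hypothetical name) holding both bookings and owner-blocked dates, each row
// with property_id, start_date and end_date. An index on
// (property_id, start_date, end_date) is assumed.
$pdo = new PDO('mysql:host=localhost;dbname=booking;charset=utf8mb4', 'user', 'pass');

function isAvailable(PDO $pdo, int $propertyId, string $start, string $end): bool
{
    // Two ranges overlap exactly when one starts before the other ends and
    // ends after the other starts.
    $stmt = $pdo->prepare("
        SELECT COUNT(*)
        FROM events
        WHERE property_id = :property_id
          AND start_date < :end_date
          AND end_date   > :start_date
    ");
    $stmt->execute([
        'property_id' => $propertyId,
        'start_date'  => $start,
        'end_date'    => $end,
    ]);
    return $stmt->fetchColumn() == 0;   // no overlapping event means available
}

// Example: can property 42 be booked from 2016-06-10 to 2016-06-20?
var_dump(isAvailable($pdo, 42, '2016-06-10', '2016-06-20'));
```

Because both bookings and owner-blocked periods live in the same table, the same query answers both "is it free?" and "what is already taken?", which is the point above about avoiding per-type ifs in the code.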
This is too much for a comment, so I will leave it as an answer.
So the table's primary key would be the property_id and the date.
I don't recommend it, because if you think of a scenario where this logic is applied to a system running for 5 or 10 years, the performance will be worse. You will get approximately 30*12*1 = 360 rows per property for one year. Instead, implement logic that calculates the duration of a booking and stores it in the table against the user.
Right, I'm trying to create a system whereby a user can do something, but then must wait until all the other users in the MySQL table have made their move, i.e.
User1 makes move, user2 and 3 must wait
user2 makes move, user 1 and 3 must wait
user3 makes move, user 1 and 2 must wait
user1 makes move...
One way I thought of was to give each of the users a number (ranging from 1 to the total number of players, say 6), and then when a player makes a move, set their number to the max number (6) and decrease everyone else's number by one, so the one with the minimum number is the only one who can play.
That's my only idea, is there an easier or alternative way?
My suggestion would be to just store the last move date as a datetime. When you need to check if a user can move, simply select from the table all of the other players whose last move date is less than or equal to the current player's last move date. If the number of rows is not 0, then the player cannot move yet.
The benefit of this approach is its simplicity: every time you allow a player to make a move, just update the column with the current date and time.
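A minimal sketch of that check, assuming a players table with id, game_id and last_move columns (these names and the connection details are assumptions):

```php
<?php
// Sketch of the "last move date" check from this answer. The players table
// and its columns (id, game_id, last_move) are assumed names.
$pdo = new PDO('mysql:host=localhost;dbname=game;charset=utf8mb4', 'user', 'pass');

function canMove(PDO $pdo, int $gameId, int $playerId): bool
{
    // Count the *other* players who have not moved since this player's last
    // move; if any exist, this player still has to wait.
    $stmt = $pdo->prepare("
        SELECT COUNT(*)
        FROM players AS other
        JOIN players AS me ON me.id = :me_id
        WHERE other.game_id = :game_id
          AND other.id <> :other_id
          AND other.last_move <= me.last_move
    ");
    $stmt->execute([
        'me_id'    => $playerId,
        'game_id'  => $gameId,
        'other_id' => $playerId,
    ]);
    return $stmt->fetchColumn() == 0;
}

function recordMove(PDO $pdo, int $playerId): void
{
    // Allowing a move just means stamping the current date and time.
    $pdo->prepare("UPDATE players SET last_move = NOW() WHERE id = :id")
        ->execute(['id' => $playerId]);
}
```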
Your proposed solution seems a little circuitous:
You're updating+reading every player every move, when the minimum information you need to maintain is whose move it is.
You're losing information about player order as you encode next turn information.
A high-level solution:
Create a games table, one row per game, with a column like INT currentTurn
Create a gameUsers table on a per-game basis, linked to its game in games
Do assign each of the n users in gameUsers an INT playerOrder ranging [1-n]
Only accept a move from playerN if playerN == "SELECT playerID FROM gameUsers WHERE playerOrder = currentTurn"
After a successful move: "UPDATE games SET currentTurn = currentTurn + 1 WHERE game = thisGame"
I believe the above table structure is a good object-oriented representation of an actual game model. You can stash other per-game things into games, like winner, length, date, etc. Pardon the pseudo-SQL.
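A small sketch of how that turn check and advance could look in PHP; the games.id and gameUsers.gameID columns, and the modulo wrap-around back to player 1, are assumptions added here.

```php
<?php
// Sketch of the currentTurn flow described above. Table and column names follow
// the answer (games.currentTurn, gameUsers.playerOrder / playerID); gameID,
// games.id and the wrap-around are added assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=game;charset=utf8mb4', 'user', 'pass');

function isPlayersTurn(PDO $pdo, int $gameId, int $playerId): bool
{
    // Only accept a move from the player whose playerOrder equals currentTurn.
    $stmt = $pdo->prepare("
        SELECT u.playerID
        FROM games AS g
        JOIN gameUsers AS u
          ON u.gameID = g.id AND u.playerOrder = g.currentTurn
        WHERE g.id = :game_id
    ");
    $stmt->execute(['game_id' => $gameId]);
    return (int) $stmt->fetchColumn() === $playerId;
}

function advanceTurn(PDO $pdo, int $gameId, int $playerCount): void
{
    // After a successful move, pass the turn on; wrap from the last player
    // back around to player 1.
    $stmt = $pdo->prepare("
        UPDATE games
        SET currentTurn = (currentTurn % :player_count) + 1
        WHERE id = :game_id
    ");
    $stmt->execute(['player_count' => $playerCount, 'game_id' => $gameId]);
}
```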
You could have a table with a column hasMoved TINYINT(1) NOT NULL DEFAULT 0 and query for rows where hasMoved = 0; if the query returns no rows, then all players have moved.
(Note: this is based on "must wait for all other users", NOT for a strict move order - i.e. 'A' must move before 'B' must move before 'C', etc.)
Additionally, queries using this method are somewhat slow and (to me) seem somewhat unnecessarily resource-intensive - perhaps think about using Ajax instead?
Have a game sequence number that starts at zero. Have a "last moved" number for each player. When a player moves, set their "last moved" number equal to the game sequence number. Once every player has moved, increment the game sequence number.
You may want to use a timeout though, otherwise a player who doesn't move can delay the other players indefinitely.
I would first determine $sequence by calculating speed, then compare speeds to determine the order, then use the order to send out notices for each player's move. Use a timestamp to ensure the user doesn't take over a day (or however long); you will need a cron job just for this.
Have a variable or array hold the first and last sequence so you can easily move the last-moved player to the back without mixing up the order.
Have the page check the player's order sequence and not allow any action unless it's at 1 or 0. Be sure to sanitize inputs so no manipulation is possible. Then add your form, graphics and game equations.
You can save the date-time of the last move of each user. Then, if you sort this table by this date-time column in ascending order, you only have to fetch the first row of the result; it will contain the ID of the player who is allowed to make a move.
In PHP - how do I display 5 results out of a possible 50 randomly, but ensure all results are displayed an equal number of times?
For example table has 50 entries.
I wish to show 5 of these randomly on every page load, but I also need to ensure all results are rotated through and displayed an equal number of times.
I've spent hours googling for this but can't work it out - would very much like your help please.
Please scroll down to "biased randomness" if you don't want to read all of this.
In MySQL you can just use SELECT * FROM table ORDER BY RAND() LIMIT 5.
But what you want just does not work; it is logically contradictory.
You have to understand that complete randomness by definition means equal distribution after an infinite period of time.
The longer the selection interval, the more even the distribution.
If you MUST have an even distribution of selections over, for example, every 24-hour interval, you cannot use a purely random algorithm. That is contradictory by definition.
It really depends on what your goal is.
You could, for example, pick some element at random and then lower the probability of the same element being re-chosen in the next run. This way you have a heuristic that gives you a more even distribution after a shorter amount of time. But it's not truly random - well, certain parts are.
You could also randomly select from your database, mark the selected elements as such, and from then on select only from those not yet selected. When no elements are left, reset them all.
Very trivial, but it might do the job.
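A small sketch of that idea, assuming a hypothetical items table with a shown flag column (all names and connection details are assumptions); resetting as soon as fewer than 5 unshown rows remain is a simplification, so the equal-frequency guarantee is only approximate at the rotation boundary.

```php
<?php
// Sketch of "mark as selected, reset when everything has been shown".
// The items table and its shown flag are assumed names.
$pdo = new PDO('mysql:host=localhost;dbname=site;charset=utf8mb4', 'user', 'pass');

function pickFive(PDO $pdo): array
{
    // If fewer than 5 unshown rows remain, reset the rotation first.
    $left = (int) $pdo->query("SELECT COUNT(*) FROM items WHERE shown = 0")->fetchColumn();
    if ($left < 5) {
        $pdo->exec("UPDATE items SET shown = 0");
    }

    // Pick 5 random rows among those not yet shown in this rotation.
    $rows = $pdo->query("SELECT id, title FROM items WHERE shown = 0 ORDER BY RAND() LIMIT 5")
                ->fetchAll(PDO::FETCH_ASSOC);

    // Mark them so they are not repeated until the rest have been shown too.
    $ids = array_column($rows, 'id');
    if ($ids) {
        $in = implode(',', array_fill(0, count($ids), '?'));
        $pdo->prepare("UPDATE items SET shown = 1 WHERE id IN ($in)")->execute($ids);
    }
    return $rows;
}

print_r(pickFive($pdo));
```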
You can also do something like that with timestamps to make the distribution a bit more elegant.
This could probably look like ORDER BY RAND() * ((timestamp - MIN(timestamp)) / (MAX(timestamp) - MIN(timestamp))) DESC, or something like that. Basically, you normalize the timestamp of an entry's last selection over the time window so it falls between 0 and 1, and then multiply it by RAND(); that way you get a mix of roughly 50% "fresh items are less likely to be selected" and 50% randomness. I am not sure about the formula above - I just typed it down and it is probably wrong, but the principle works.
I think what you want is generally referred to as "biased randomness". There are a lot of papers on that and some articles on SO, for example here:
Biased random in SQL?
Copy the 50 results to some temporary place (a file, a database, whatever you use). Then every time you need random values, select 5 random values from the 50 and delete them from your temporary data set.
Once your temporary data set is empty, create a new one copying the original again.