I'm coding a quest system for my site that pretty much works like any you find in an MMORPG. I've got the whole thing working, but I really need to speed things up since I coded it inefficiently. Not because I can't code, but because I just wasn't sure how to go about it.
Basically I want to display all the quests that are available to the user. Some quests have no requirements and some do. To see if a quest has already been completed (and therefore whether a quest that depends on it is available), you check the questuser table for the value of questuserCompleted. If it's 1, then it's complete.
Here are the tables I have. I left irrelevant things out.
quest table - Holds all the quest data
questID
questNPC - where they get the quest from
questPreReq - refers to a quest they would have needed to complete to get this one

questuser table - Holds all the quests the user has accepted, complete or not
questuserID
questuserUser - the user's ID
questuserQuest - refers to the ID of the quest from the quest table
questuserCompleted - 0 is in progress, 1 is complete
There's definitely a better way to do this than what I have now. I'm usually more efficient at things like this, but since I've never coded something like this before, I'm not really sure how to go about it.
Basically it just loops through every single quest and, with an if statement, checks questuserCompleted. Once there start to be a lot of quests, this is going to get pretty slow on every page load.
function displayAvailable($npc = 0){
    $completed[] = 0;   // treat "no prerequisite" (0) as already satisfied
    $notCompleted = array();
    $count = 0;
    $return = '';
    $query = mysql_query("
        SELECT a.questID, a.questPreReq, a.questTitle, b.questuserCompleted, a.questText, a.questNPC
        FROM quest a
        LEFT JOIN questuser b ON a.questID = b.questuserQuest AND b.questuserUser = '".$this->userID."'
        ORDER BY a.questID");
    $comments = $this->ProcessRowSet($query);
    $num = mysql_num_rows($query);
    if($num){
        foreach ($comments as $c){
            // Sort each quest into completed / not completed
            if($c['questuserCompleted']){
                $completed[] = $c['questID'];
            }else{
                $notCompleted[] = $c['questID'];
            }
            // Available = prerequisite completed, not completed itself, not already in progress
            if(in_array($c['questPreReq'], $completed) && !in_array($c['questID'], $completed) && $c['questuserCompleted'] != '0'){
                if($npc == 0 || $c['questNPC'] == $npc){
                    $count++;
                    $return .= "<p>".$c['questTitle']."</p>";
                }
            }
        }
    }
    if(!$count){
        $return = "You have no available quests";
    }
    return $return;
}
Thanks for any help.
Subqueries to the rescue
SELECT a.questID, a.questPreReq, a.questTitle, a.questText , a.questNPC
FROM quest a
WHERE a.questPreReq IS NULL
OR a.questPreReq IN (SELECT questuserQuest FROM questuser WHERE questuserUser = 'UserID' AND questuserCompleted = 1)
ORDER BY a.questID
So you let the database sort it out for you; this should be super fast.
The query is still missing the exclusion of quests the user has already done. I'll leave that as an exercise, as I am writing this on my smartphone ;)
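For completeness, one way that exclusion could look (an untested sketch that keeps the same assumptions as the query above, i.e. questPreReq is NULL when there is no prerequisite and 'UserID' is a placeholder); the NOT IN clause drops every quest the user has already accepted, whether in progress or complete:

SELECT a.questID, a.questPreReq, a.questTitle, a.questText, a.questNPC
FROM quest a
WHERE (a.questPreReq IS NULL
       OR a.questPreReq IN (SELECT questuserQuest FROM questuser
                            WHERE questuserUser = 'UserID' AND questuserCompleted = 1))
  AND a.questID NOT IN (SELECT questuserQuest FROM questuser
                        WHERE questuserUser = 'UserID')
ORDER BY a.questID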
Related
This is my first question on here, but after some extensive searching I could not find an answer to my question, and thought maybe you nice folks can help me out :) I am writing a script that behaves differently depending on what products someone had in their order. For explanation purposes: I have 10 customers. They all have an order_id. All the product_ids belonging to that order_id are stored in another table.
I want to "predict" if an order has product_id 1 AND product_id 2, because then I need the script to behave differently. All other combinations are fine (so only product 1, or only product 2, or everything >= product 3).
Here is what I've got so far (for readability, I left the unimportant stuff out):
<?php
$qGetMailadressen = 'A select EMAIL query';
while($emailrow = mysqli_fetch_assoc($mailadressenresult)) {
    $customers_email = $emailrow['emailadres'];

    $qGetLaatsteOrderid = 'A select LAST order from this email QUERY';
    $orderidrow = mysqli_fetch_assoc($GetLaatsteOrderidresult);
    $orderid = $orderidrow['orders_id'];
    $lecountry = $orderidrow['customers_country'];
    $country = $orderidrow['order_language'];

    $qGetLaatsteProducten = "A select ALL PRODUCTS from this ORDERID QUERY";
    while($productenrow = mysqli_fetch_assoc($GetLaatsteProductenresult)){
        $productnaam = $productenrow['products_name'];
        $productquantity = $productenrow['products_quantity'];
        $productid = $productenrow['products_id'];
    }
}
So in this last WHILE loop, you can see I get the product ID I need. Ideally, what I want is:
if ($productid == '1' AND $productid == '2'){
    //do this
}else{
    // do that
}
However, I'm aware that this is not possible, because it's only IN the while loop that I know each product ID. I tried putting all the data in an array, but since I am out of the while loop I'm not sure how to do this.
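For reference, this is roughly the direction I was trying to go with the array idea (an untested sketch; $GetLaatsteProductenresult is the result of the products query above):

$productids = array();
while ($productenrow = mysqli_fetch_assoc($GetLaatsteProductenresult)) {
    $productids[] = $productenrow['products_id'];   // collect every product id for this order
}
// after the loop, check whether the order contained both product 1 and product 2
if (in_array('1', $productids) && in_array('2', $productids)) {
    // do this
} else {
    // do that
}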
I understand it might be unclear, and I'm sorry for that. I hope you guys can help me out; I can clarify further if something is not clear.
Thank you!
/love Grumpy
I have a bunch of photos on a page and I'm using jQuery UI's Sortable plugin to allow them to be reordered.
When my sortable function fires, it writes a new order sequence:
1030:0,1031:1,1032:2,1040:3,1033:4
Each item of the comma-delimited string consists of the photo ID and the order position, separated by a colon. When the user has completely finished their reordering, I'm posting this order sequence to a PHP page via AJAX, to store the changes in the database. Here's where I get into trouble.
I have no problem getting my script to work, but I'm pretty sure it's the incorrect way to achieve what I want, and will suffer hugely in performance and resources - I'm hoping somebody could advise me as to what would be the best approach.
This is my PHP script that deals with the sequence:
if ($sorted_order) {
    $exploded_order = explode(',', $sorted_order);
    foreach ($exploded_order as $order_part) {
        $exploded_part = explode(':', $order_part);

        $part_count = 0;
        foreach ($exploded_part as $part) {
            $part_count++;
            if ($part_count == 1) {
                $photo_id = $part;
            } elseif ($part_count == 2) {
                $order = $part;
            }

            $SQL = "UPDATE article_photos ";
            $SQL .= "SET order_pos = :order_pos ";
            $SQL .= "WHERE photo_id = :photo_id;";

            ... rest of PDO stuff ...
        }
    }
}
My concerns arise from the nested foreach functions and also running so many database updates. If a given sequence contained 150 items, would this script cry for help? If it will, how could I improve it?
** This is for an admin page, so it won't be heavily abused **
You can use one update, with some clever code like so:
Create the array $data['order'] (sort position => photo ID) in the loop, then:
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach($data['order'] as $sort => $id){
$q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END ) WHERE photo_id IN (".implode(",",$data['order']).")";
A little clearer, perhaps:
UPDATE article_photos SET order_pos = (CASE photo_id
    WHEN 1 THEN 999
    WHEN 2 THEN 1000
    WHEN 3 THEN 1001
END)
WHERE photo_id IN (1,2,3)
I use this approach for exactly what you're doing: updating sort orders.
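One caveat: since the IDs and positions come from the client, it's worth forcing them to integers while building the query. A minimal sketch of the same idea with casting added:

$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach ($data['order'] as $sort => $id) {
    $q .= " WHEN ".(int)$id." THEN ".(int)$sort;   // cast both values so user input can't break the SQL
}
$q .= " END) WHERE photo_id IN (".implode(",", array_map('intval', $data['order'])).")";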
No need for the second foreach: you know it's going to be two parts if your data passes validation (I'm assuming you validated this. If not: you should =) so just do:
if (count($exploded_part) == 2) {
    $id = $exploded_part[0];
    $seq = $exploded_part[1];
    /* rest of code */
} else {
    /* error - data does not conform despite validation */
}
As for update hammering: do your DB updates in a transaction. Your db will queue the ops, but not commit them to the main DB until you commit the transaction, at which point it'll happily do the update "for real" at lightning speed.
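A minimal sketch of what that could look like with PDO (assuming $pdo is your existing connection and $exploded_order is the array from the question):

$pdo->beginTransaction();
$stmt = $pdo->prepare("UPDATE article_photos SET order_pos = :order_pos WHERE photo_id = :photo_id");
foreach ($exploded_order as $order_part) {
    list($photo_id, $order) = explode(':', $order_part);
    // each execute() is queued up inside the transaction
    $stmt->execute(array(':photo_id' => $photo_id, ':order_pos' => $order));
}
$pdo->commit();   // all the updates become visible in one go

Preparing the statement once outside the loop also saves re-parsing the same SQL for every row.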
I suggest making your script even simpler and changing the names of the variables, so the code is much more readable.
$parts = explode(',', $sorted_order);
foreach ($parts as $part) {
    list($id, $position) = explode(':', $part);
    // Now you can work with $id and $position
}
More info about list: http://php.net/manual/en/function.list.php
Also, about performance and your data structure:
The way you store your data is not perfect, but you will not suffer any performance issues from it; this way you send less data, so there is less overhead overall.
However, the drawback of your data structure is that you will most probably be unable to establish relationships between tables, make joins, or alter the table structure in a clean way.
I've got a script which is supposed to run through a MySQL database and perform a certain 'test' on the cases. Simplified: the database contains records which represent trips that have been made by persons. Each record is a single trip, but I want to use only round trips. So I need to search the database and match two trips to each other: the trip to and the trip from a certain location.
The script is working fine. The problem is that the database contains more than 600,000 cases. I know this should be avoided if possible, but for the purpose of this script and the use of the database records later on, everything has to stick together.
Executing the script takes hours right now, running on my iMac with MAMP. Of course I made sure that it can use a lot of memory, etcetera.
My question is: how could I speed things up, and what's the best approach to do this?
Here's the script I have right now:
$table = $_GET['table'];
$output = '';

// Select all cases that have not been marked as invalid in a previous test
$query = "SELECT persid, ritid, vertpc, aankpc, jaar, maand, dag FROM MON.$table WHERE reasonInvalid != '1' OR reasonInvalid IS NULL";
$result = mysql_query($query) or die($output .= mysql_error());

$totalCountValid = 0;
$totalCountInvalid = 0;
$totalCount = 0;

// For each record:
while($row = mysql_fetch_array($result)){
    $totalCount += 1;

    // Do another query: get all rows for this person's ID that share postal codes.
    // The postal codes are reversed between the two trips of a round trip.
    $persid = $row['persid'];
    $ritid = $row['ritid'];
    $pcD = $row['vertpc'];
    $pcA = $row['aankpc'];
    $jaar = $row['jaar'];
    $maand = $row['maand'];
    $dag = $row['dag'];

    $thecountquery = "SELECT * FROM MON.$table WHERE persid=$persid AND vertpc=$pcA AND aankpc=$pcD AND jaar = $jaar AND maand = $maand AND dag = $dag";
    $thecount = mysql_num_rows(mysql_query($thecountquery));

    if($thecount >= 1){
        // No worries, this person ID has multiple trips attached
        $totalCountValid += 1;
    }else{
        // Ow my, the case is invalid!
        $totalCountInvalid += 1;
        // Call markInvalid from functions.php
        markInvalid($table, '2', 'ritid', $ritid);
    }
}

// Echo the result
$output .= 'Total cases: '.$totalCount.'<br>Valid: '.$totalCountValid.'<br>Invalid: '.$totalCountInvalid;
echo $output;
Your basic problem is that you are doing the following.
1) Getting all cases that haven't been marked as invalid.
2) Looping through the cases obtained in step 1).
What you can easily do is combine the queries written for 1) and 2) into a single query and loop over the data. This will speed things up considerably.
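For example, a self-join along these lines could find the trips without a matching return trip in a single pass (an untested sketch; 'yourtable' stands in for the $table value, and the join conditions mirror the inner query from the question):

SELECT a.persid, a.ritid
FROM MON.yourtable a
LEFT JOIN MON.yourtable b
       ON b.persid = a.persid
      AND b.vertpc = a.aankpc
      AND b.aankpc = a.vertpc
      AND b.jaar = a.jaar
      AND b.maand = a.maand
      AND b.dag = a.dag
WHERE (a.reasonInvalid != '1' OR a.reasonInvalid IS NULL)
  AND b.ritid IS NULL  -- no return trip found: these are the invalid cases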
Also bear in mind the following tips.
1) Selecting all columns is not a good thing to do. It takes an ample amount of time for the data to travel over the network. I would recommend replacing the wildcard with only the columns that you really need:
SELECT col1, col2, ...   -- instead of SELECT *
2) Use indexes - sparingly, efficiently and appropriately. Understand when to use them and when not to.
3) Use views if you can.
4) Enable MySQL slow query log to understand which queries you need to work on and optimize.
log_slow_queries = /var/log/mysql/mysql-slow.log
long_query_time = 1
log-queries-not-using-indexes
5) Use the correct MySQL field types and storage engine (very, very important).
6) Use EXPLAIN to analyze your query - EXPLAIN is a useful command in MySQL which can give you great detail about how a query is run, which index is used, how many rows it needs to check through, and whether it needs to do file sorts, temporary tables and other nasty things you want to avoid.
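For instance, running it on the inner query from the question (with hypothetical placeholder values) shows whether an index covering (persid, vertpc, aankpc, jaar, maand, dag) is actually being used:

EXPLAIN SELECT * FROM MON.yourtable
WHERE persid = 123 AND vertpc = 1000 AND aankpc = 2000
  AND jaar = 2012 AND maand = 5 AND dag = 14;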
Good luck.
I have been given access to a third party's database and wish to create a tool using their information. The database, designed for its original purpose, is very large and segregated. I need to complete the following task:
From the schema below, I need to do the following:
Look up the item in invTypes, then check both invTypeMaterials and ramTypeRequirements to see if any materials are needed to build the item. If yes, look up each of those materials in invTypes, and again repeat the process to see if those in turn need components. This loop keeps going until the check on both invTypeMaterials and ramTypeRequirements comes back false. This can be 5 or 6 loops deep, with 5 or 6 items to check per loop, so it could be 1561 loops, assuming 1 loop for the original item and then 5 loops per material, of which there are 5, 5 times over.
Now I tried to complete the code and came up with the following:
$materialList = array();

function getList($dbc, $item) {
    global $materialList;
    // Obtain initial material list
    $materials = materialList($dbc, $item);
    // For each row in the database
    while ($material = mysqli_fetch_array($materials)) {
        // Check if there are any sub-materials required
        if (subList($dbc, $material['ID'])) {
            // If so, then recurse over the list the given quantity (it has already done it once)
            for ($i = 0; $i < $material['Qty'] - 1; $i++) {
                if (!subList($dbc, $material['ID'])) {
                    break;
                }
            }
        } else {
            // If there are no further materials then this is the base material, so add it to the array.
            $materialList[] = array(
                "Name" => $material['Name'],
                "Qty"  => $material['Qty'],
                "ID"   => $material['ID']
            );
        }
    }
    return $materialList;
}
function subList($dbc, $item) {
    global $materialList;
    // Query the material in case it requires further building
    $mMaterials = materialList($dbc, $item['ID']);
    // If the database returns any rows, then it must have more sub-materials required
    if (mysqli_num_rows($mMaterials) > 0) {
        // Check the sub-materials to see if they in turn require further materials
        if (subList($dbc, $material['ID'])) {
            // If the function returns true then iterate over the list the given quantity (it's already done it once before)
            for ($i = 0; $i < $material['Qty'] - 1; $i++) {
                if (!subList($dbc, $material['ID'])) {
                    break;
                }
            }
        } else {
            // if the database returns 0 rows then this object is the base material, so add it to the array.
            $materialList[] = array(
                "Name" => $mMaterial['Name'],
                "Qty"  => $mMaterial['Qty'],
                "ID"   => $material['ID']
            );
            return true;
        }
    } else {
        return false;
    }
}
function materialList($dbc, $item) {
    // Query
    $query = "SELECT i.typeID AS ID, i.typeName AS Name, m.Quantity AS Qty
              FROM invTypes AS i
              LEFT JOIN invTypeMaterials AS m
                ON m.materialTypeID = i.typeID
              LEFT JOIN ramTypeRequirements AS r
                ON r.typeID = i.typeID
              WHERE groupID NOT IN (278,269,278,270,268) AND m.typeID = $item";
    $snippets = mysqli_query($dbc, $query) or die('Error: ' . mysqli_error($dbc));
    return $snippets;
}
As I'm sure you have all noticed, this code breaks about every programming law there is when it comes to recursive database calls. It's not really practical, especially in that subList() calls itself continually until it comes back false. SQL isn't my strong suit, but I cannot for the life of me work out how to get around this problem.
Any pointers would be very helpful. I'm certainly not asking any of you to re-write my entire code for me, but if you have any ideas as to what I should consider, I would be grateful.
As a generic solution I would do the following:
For every typeID, gather from both invTypeMaterials and ramTypeRequirements
From the gathered data, you create a new SELECT query and continue the cycle
Initial query
SELECT t.*, m.materialTypeID, m.quantity AS m_quantity, r.requiredTypeID, r.quantity AS r_quantity
FROM invTypes t
LEFT JOIN invTypeMaterials m USING (typeID)
LEFT JOIN ramTypeRequirements r USING (typeID)
WHERE <conditions to select the types>
I've just made a guess at which data from the extra tables are required to load; expand where necessary.
The materialTypeID and requiredTypeID will be non-null for matched rows and null otherwise.
Keep a list of the types you have already loaded, for faster reference. Then for the second query you change the condition to something like `WHERE t.typeID IN (...)`, filled with the new IDs you gathered.
Let me know if this makes sense and whether it's even close to what's useful to you :)
It looks like recursion is unavoidable here. I second Jack's answer, and will just extend it with PHP code :)
I must warn you that I never executed it, so it will need debugging, but I hope you will get the idea. :)
$checked_dependencies = array();
$materials = array();

function materialList( $ids ) {
    global $checked_dependencies, $materials, $dbc; // $dbc is the mysqli connection from the surrounding code

    // if we have an array of IDs, the condition is ".. IN (...)"
    if(is_array($ids)) {
        $condition = 'IN ('.implode(',',$ids).')';
        // add all to checked dependencies
        foreach($ids as $id) { $checked_dependencies[] = $id; }
    }else{
        // otherwise, checking for a particular ID
        $condition = "= {$ids}";
        // add to checked dependencies
        $checked_dependencies[] = $ids;
    }

    $query = "SELECT t.*,
                     m.materialTypeID, m.quantity AS m_quantity,
                     r.requiredTypeID, r.quantity AS r_quantity
              FROM invTypes t
              LEFT JOIN invTypeMaterials m ON t.typeID = m.typeID
              LEFT JOIN ramTypeRequirements r ON t.typeID = r.typeID
              WHERE t.typeID {$condition}";
    $res = mysqli_query($dbc, $query);

    // this will be the list of IDs which we still need to fetch
    $ids_to_check = array();

    while($material = mysqli_fetch_assoc($res)) {
        $materials[] = $material; // you can keep only the fields you need

        // if we didn't check the dependencies already, add them to the list
        // (if they aren't there yet)
        if(!in_array($material['materialTypeID'], $checked_dependencies)
            && !in_array($material['materialTypeID'], $ids_to_check)
            && !is_null($material['materialTypeID'])) {
            $ids_to_check[] = $material['materialTypeID'];
        }
        if(!in_array($material['requiredTypeID'], $checked_dependencies)
            && !in_array($material['requiredTypeID'], $ids_to_check)
            && !is_null($material['requiredTypeID'])) {
            $ids_to_check[] = $material['requiredTypeID'];
        }
    }

    // if the result array isn't empty, recursively call the same function
    if(!empty($ids_to_check)) { materialList($ids_to_check); }
}
I used global arrays here, but it's easy to rewrite the function to return the data instead.
Also, we could put a depth limit in here to avoid too much recursion.
Generally, I'd say this is not a very convenient organization of the DB data for this task. It's quite comfortable to store data recursively like that, but, as you see, it results in an unknown number of iterations and requests to the database to get all the dependencies. And that can be expensive (PHP <-> MySQL <-> PHP <-> ...): on each iteration we lose time, especially if the DB is on a remote server, as in your case.
Of course, it would be great to re-arrange the data structure to make it possible to get all requirements at once, but as I understand it you have read-only access to the database. The second solution which comes to mind is a recursive MySQL stored procedure, which is also impossible here.
In some cases (not generally) it is good to get as much data as possible in one query and operate on it locally, to reduce the number of iterations. It is hard to say whether that is possible here, because I don't know the size and structure of the DB, etc., but e.g. if all required dependencies are stored in one group, and the groups aren't enormously large, it might be faster to get all the group info in one request into a PHP array and then collect the info from that array locally. But that is only a guess, and it needs testing and checking.
I am trying to implement a check in my PHP code that checks if there is a duplicate uid in the database, and if so, assigns a new uid and checks again. I am having trouble nailing down the logic. Here is what I have thus far:
function check($uid){
    $sql = mysql_query("SELECT * FROM table WHERE uid='$uid'");
    $pre = mysql_num_rows($sql);
    if($pre >= 1){
        return false;
    }else{
        return true;
    }
}
And then using that function I thought of using a while loop to continue looping through until it evaluates to true
$pre_check = check($uid);
while($pre_check == false){
    //having trouble figuring out what should go here
}
So basically, once I have a usable uid, write everything to the database, else keep generating new ones and checking them till it finds one that is not already in use.
It is probably really simple, but for some reason I am having trouble with it.
Thanx in advance!
$uid = 100; // pick some other value you want to start with or have stored based on the last successful insert.
while($pre_check == false){
    $pre_check = check(++$uid);
}
Of course, this is exactly what auto-incrementing primary keys are useful for. Are you aware of auto-incrementing primary keys in MySQL?
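For reference, a minimal sketch of what that looks like (hypothetical table and column names); MySQL then hands out the uid itself on every INSERT:

CREATE TABLE users (
    uid INT NOT NULL AUTO_INCREMENT,
    name VARCHAR(100) NOT NULL,
    PRIMARY KEY (uid)
);

INSERT INTO users (name) VALUES ('alice');  -- uid is generated automatically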
EDIT
In light of your comment regarding maintaining someone else's code that uses the random function like that (ewwwww)... I would use the method I suggest above and store the last inserted id somewhere you can read it again for the next user. This will allow you to "fill-in-the-blanks" for the uids that are missing. So, if for example you have uids 1, 2, 5, 9, 40, 100... you can start with $uid = 1; Your while loop will return once you get to 3. Now you store the 3 and create the new record. Next time, you start with $uid = 3; and so on. Eventually you will have all numbers filled in.
It is also important to realize that you will need to do the check and the insert while locking the table for WRITES. You don't want to get into a race condition where two different users are given the same uid because they are both searching for an available uid at the same time.
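A rough sketch of that locking, using the old mysql_* API from the question ('table' and the uid bookkeeping are placeholders; the lock has to cover both the search and the insert):

mysql_query("LOCK TABLES table WRITE");

$uid = $last_uid; // wherever you stored the last successfully used uid
while (!check(++$uid)) {
    // keep incrementing until check() reports a free uid
}
mysql_query("INSERT INTO table (uid) VALUES ('$uid')");

mysql_query("UNLOCK TABLES");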
Indeed, the best option is to use auto-increment IDs, but if you don't have that choice, you can use a recursive function like this:
function find_uid() {
    $new_uid = rand(1000000000, 9999999999);
    $sql = mysql_query("SELECT COUNT(*) AS 'nb' FROM table WHERE uid=".$new_uid.";");
    $row = mysql_fetch_assoc($sql);
    $pre = $row['nb'];
    return ($pre >= 1 ? find_uid() : $new_uid);
}
COUNT(*) should be more performant because the counting is done by MySQL and not PHP.
By the way, if you need a new uid shouldn't the condition be ($pre > 0) instead of ($pre > 1) ?