I just started maintaining a browser game and optimizing its code. I can see that some functions run more than a hundred times per request, so I thought about how to optimize the code:
1- Port the game to a whole new platform. (This doesn't make sense: it would take at least nine months and would probably introduce unexpected bugs, and I don't want to go down that road.)
2- Work on caching for more efficiency.
I am going with the second option. To clarify, here is one of the game's functions:
function modifyResource($vid, $wood, $clay, $iron, $crop, $mode) {
    // Negative amounts are clamped to zero (the crop clamp is commented out).
    if ($wood < 0) $wood = 0;
    if ($clay < 0) $clay = 0;
    if ($iron < 0) $iron = 0;
    //if ($crop < 0) $crop = 0;

    // Query 1: read the village row.
    $q = "SELECT * from " . TB_PREFIX . "vdata where wref = $vid";
    $result = $this->query($q, __FILE__, __LINE__);
    $res = $this->mysql_fetch_all($result);

    // $mode truthy: adding resources, so cap each amount at the storage limit.
    if ($wood + $res[0]['wood'] > $res[0]['maxstore'] && $mode) $wood = $res[0]['maxstore'] - $res[0]['wood'];
    if ($clay + $res[0]['clay'] > $res[0]['maxstore'] && $mode) $clay = $res[0]['maxstore'] - $res[0]['clay'];
    if ($iron + $res[0]['iron'] > $res[0]['maxstore'] && $mode) $iron = $res[0]['maxstore'] - $res[0]['iron'];
    if ($crop + $res[0]['crop'] > $res[0]['maxcrop'] && $mode) $crop = $res[0]['maxcrop'] - $res[0]['crop'];

    // $mode falsy: spending resources, so cap at what the village actually has...
    if ($res[0]['wood'] < $wood && !$mode) $wood = $res[0]['wood'];
    if ($res[0]['clay'] < $clay && !$mode) $clay = $res[0]['clay'];
    if ($res[0]['iron'] < $iron && !$mode) $iron = $res[0]['iron'];
    if ($res[0]['crop'] < $crop && !$mode) $crop = $res[0]['crop'];

    // ...and negate the amounts so the UPDATE below subtracts them.
    if (!$mode) {
        $wood = -$wood;
        $clay = -$clay;
        $iron = -$iron;
        $crop = -$crop;
    }

    // Query 2: apply the deltas.
    $q = "UPDATE " . TB_PREFIX . "vdata set wood = wood + $wood, clay = clay + $clay, iron = iron + $iron, crop = crop + $crop where wref = $vid";
    $r = $this->query($q, __FILE__, __LINE__);
    if ($r) {
        // $this->logging->addResourceLog($vid, $wood, $clay, $iron, $crop, $log);
    }
    return true;
}
As you can see, the function SELECTs from the vdata table to get the necessary information, modifies the values, and finally UPDATEs the table, so two queries run in this function. vdata is a really big table; in this case it is more than 500 MB for 80,000 rows. The function runs 88 times for every user request, which means the big table is SELECTed and UPDATEd 88 times. This costs a lot of hardware resources and also raises the loading time.
An idea crossed my mind: store the result of the SELECT query in a variable the first time the function runs, and on every later call only update that variable; when the database class is destroyed, apply all the accumulated changes at once. This would save 2×87 queries.
Note:
This in-variable cache would only persist until the user's request completes; after that, all stored (cached) data is destroyed.
This is just an idea; I have no idea how it could work in practice, or whether it would be useful if I coded it. Is there an alternative approach, or any plugin or class that does the same thing?
Therefore my question has three parts:
1- Would this idea be useful?
2- If yes, is there a class or plugin I could use, or take ideas from?
3- If no, what would you suggest to optimize the game?
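To make the idea concrete, here is a rough, untested sketch of what I imagine (all class and method names here are invented; it assumes the same $this->query() / mysql_fetch_all() helpers used above):

// Per-request write-behind cache for vdata rows (sketch only).
class VillageCache {
    private $db;
    private $rows = array();   // wref => cached vdata row
    private $delta = array();  // wref => accumulated resource deltas

    public function __construct($db) {
        $this->db = $db;
    }

    // First call per village runs the SELECT; later calls hit the array.
    public function getVillage($vid) {
        if (!isset($this->rows[$vid])) {
            $q = "SELECT * from " . TB_PREFIX . "vdata where wref = " . (int) $vid;
            $result = $this->db->query($q, __FILE__, __LINE__);
            $res = $this->db->mysql_fetch_all($result);
            $this->rows[$vid] = $res[0];
            $this->delta[$vid] = array('wood' => 0, 'clay' => 0, 'iron' => 0, 'crop' => 0);
        }
        return $this->rows[$vid];
    }

    // Apply an (already clamped) change in memory only.
    public function addResources($vid, $wood, $clay, $iron, $crop) {
        $this->getVillage($vid);
        foreach (array('wood' => $wood, 'clay' => $clay, 'iron' => $iron, 'crop' => $crop) as $k => $v) {
            $this->rows[$vid][$k] += $v;
            $this->delta[$vid][$k] += $v;
        }
    }

    // One UPDATE per village at the end of the request.
    public function flush() {
        foreach ($this->delta as $vid => $d) {
            $q = "UPDATE " . TB_PREFIX . "vdata set "
               . "wood = wood + {$d['wood']}, clay = clay + {$d['clay']}, "
               . "iron = iron + {$d['iron']}, crop = crop + {$d['crop']} "
               . "where wref = " . (int) $vid;
            $this->db->query($q, __FILE__, __LINE__);
        }
        $this->delta = array();
    }

    public function __destruct() {
        $this->flush();
    }
}

modifyResource() would then do its clamping against getVillage($vid) and call addResources() instead of running its own two queries, so each request would run one SELECT and one UPDATE per distinct village rather than 88 of each.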
My problem is simple. On my website I'm loading several results from MySQL tables inside a while loop in PHP, and for some reason the execution time varies from reasonably short (0.13 s) to confusingly long (11 s), and I have no idea why. Here is a shortened version of the code:
<?php
$sql =
    "SELECT * FROM test_users, image_uploads
    WHERE test_users.APPROVAL = 'granted'
    AND test_users.NAME = image_uploads.OWNER
    " . $checkmember . "
    " . $checkselected . "
    ORDER BY " . $sortingstring . " LIMIT 0, 27
    ";
$result = mysqli_query($mysqli, $sql);
$data = "";
$c = 0;
$start = microtime(true);

while ($value = mysqli_fetch_array($result)) {
    $files_key = $value["KEY"];
    $file_hidden = "no";
    // KEY is a reserved word in MySQL, so it needs backticks in the query
    $inner_query = "SELECT * FROM my_table WHERE `KEY` = '" . $files_key . "' AND HIDDEN = '" . $file_hidden . "'";
    $inner_result = mysqli_query($mysqli, $inner_query);
    while ($row = mysqli_fetch_array($inner_result)) {
        // getting all variables with $row[n]
    }

    $sql = "SELECT * FROM some_other_table WHERE THM=? AND MEMBER=?";
    $fstmt = $mysqli->prepare($sql);
    $member = 'username'; // bind_param requires variables, not literals
    $fstmt->bind_param("ss", $value['THM'], $member);
    $fstmt->execute();
    $fstmt->store_result();
    if ($fstmt->num_rows > 0) {
        $part0 = 'some elaborate string';
    } else {
        $part0 = 'some different string';
    }
    $fstmt->close();

    // generate a document using the gathered data
    include "../data.php"; // produces $partsMerged
    // save to data string
    $data .= $partsMerged;
    $c++;
}
$time_elapsed_secs = substr(microtime(true) - $start, 0, 5);
// takes sometimes only 0.13 seconds
// and other times up to 11 seconds and more
?>
I was wondering where the problem could be.
Does it have to do with my DB connection, or is my code flawed? I didn't have this problem when I first implemented it, but for a few months now it has been behaving strangely. Sometimes it loads very fast; other times, as I said, it takes 11 seconds or even more.
How can I fix this?
There are a few ways to debug this.
Firstly, consider any dynamic variables that form part of your query (e.g. $checkmember): we have no way of knowing here whether these are the same or different each time you execute the query. If they differ, then each time you are executing a different query! So it goes without saying that the time may vary depending on which query is actually being run.
Regardless of the answer, try running the SQL through the MySQL command line and see how long that query takes.
If the timing is similar on each run (i.e. not an 11-second range), then the answer is that it's nothing to do with the actual query itself.
You also need to say what environment you're running this in: a web server (accessing the PHP script via a browser) or the command line.
There isn't enough information to answer your question definitively, but you need to at least establish some of these things first.
The rule of thumb is that if your raw SQL executes on the MySQL command line in a similar amount of time on subsequent attempts, the problem area is elsewhere (e.g. the connection between the browser and the web server). That can be monitored in the Network tab of your browser.
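If you want hard numbers from the MySQL side, the query profiler times each statement. A minimal example for the mysql CLI (the dynamic parts of your query are filled in by hand here, and the ORDER BY column is an arbitrary stand-in for $sortingstring):

SET profiling = 1;
SELECT * FROM test_users, image_uploads
WHERE test_users.APPROVAL = 'granted'
AND test_users.NAME = image_uploads.OWNER
ORDER BY test_users.NAME LIMIT 0, 27;
SHOW PROFILES; -- lists each statement with its duration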
I am working with a MySQL table and I need to increment a value in one column for each row, of which there are over 6.5 million.
The column type is varchar and can contain an integer or a string (e.g. "+1"). The table engine is MyISAM.
I have attempted this with PHP:
$adjust_by = 1;
$adjusted = array(); // was missing: collects the adjusted values per option

foreach ($options as $option) {
    $original_turnaround = $option['turnaround'];
    $adjusted_turnaround = $option['turnaround'];

    if (preg_match('/\+/i', $original_turnaround)) {
        // String value such as "+1": bump the numeric part, keep the "+"
        $tmp = intval($original_turnaround);
        $tmp += $adjust_by;
        $adjusted_turnaround = '+' . $tmp;
    } else {
        $adjusted_turnaround += $adjust_by;
    }

    if (!array_key_exists($option['optionid'], $adjusted)) {
        $adjusted[$option['optionid']] = array();
    }
    $adjusted[$option['optionid']][] = array(
        'original_turn' => $original_turnaround,
        'adjusted_turn' => $adjusted_turnaround
    );
}//end fe options

//update turnarounds:
if (!empty($adjusted)) {
    foreach ($adjusted as $opt_id => $turnarounds) {
        foreach ($turnarounds as $turn) {
            $update = "UPDATE options SET turnaround = '" . $turn['adjusted_turn'] . "' WHERE optionid = '" . $opt_id . "' and turnaround = '" . $turn['original_turn'] . "'";
            run_query($update);
        }
    }
}
For obvious reasons there are serious performance issues with this approach. Running it in my local dev environment leads to numerous errors and eventually crashes the server.
Another thing I need to consider is what happens when this runs in production. This is for an ecommerce store, and I cannot have a huge update like this lock the database or cause any other issues.
One possible solution I have found is this: Fastest way to update 120 Million records
But creating another table comes with its own issues. The codebase is not in a good state, and similar queries are run on this table in loads of places, so I would have to modify a large number of queries and files to make that approach work.
What are my options (if there are any)?
You can do this task with SQL alone.
With CAST you can convert a string into an integer.
With IF and SUBSTR you can check whether the string starts with +.
With CONCAT you can prepend the + to the calculated result (where necessary).
Just try this SQL:
"UPDATE `options` SET `turnaround` = CONCAT(IF(SUBSTR(`turnaround`, 1, 1) = '+', '+', ''), CAST(`turnaround` AS SIGNED) + " + $adjust_by + ") WHERE 1";
Can't you just say:
UPDATE whatevertable SET whatever = whatever + 1
Try it and see; I'm pretty sure it will work!
EDIT: You have strings OR integers? Then your DB design is flawed and this probably won't work, but it would have been the correct answer had your DB design been stricter.
You probably don't have, but need, this composite index (in either order):
INDEX(optionid, turnaround)
Please provide SHOW CREATE TABLE.
Another slight performance boost is to explicitly LOCK TABLES ... WRITE before the update loop, and UNLOCK TABLES afterwards. Caution: this only applies to MyISAM.
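For example (table name taken from the question; MyISAM only):

LOCK TABLES options WRITE;
-- ... run the UPDATE loop here ...
UNLOCK TABLES;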
You would be much better off with InnoDB.
I have a bunch of photos on a page, and I'm using jQuery UI's Sortable plugin to allow them to be reordered.
When my sortable function fires, it writes a new order sequence:
1030:0,1031:1,1032:2,1040:3,1033:4
Each item of the comma-delimited string consists of the photo ID and the order position, separated by a colon. When the user has completely finished reordering, I post this order sequence to a PHP page via AJAX to store the changes in the database. Here's where I get into trouble.
I have no problem getting my script to work, but I'm pretty sure it's the wrong way to achieve what I want and will suffer hugely in performance and resources. I'm hoping somebody could advise me on the best approach.
This is my PHP script that deals with the sequence:
if ($sorted_order) {
    $exploded_order = explode(',', $sorted_order);
    foreach ($exploded_order as $order_part) {
        $exploded_part = explode(':', $order_part);
        $part_count = 0;
        foreach ($exploded_part as $part) {
            $part_count++;
            if ($part_count == 1) {
                $photo_id = $part;
            } elseif ($part_count == 2) {
                $order = $part;
            }
            $SQL = "UPDATE article_photos ";
            $SQL .= "SET order_pos = :order_pos ";
            $SQL .= "WHERE photo_id = :photo_id;";
            // ... rest of PDO stuff ...
        }
    }
}
My concerns arise from the nested foreach loops and from running so many database updates. If a given sequence contained 150 items, would this script cry for help? If so, how could I improve it?
** This is for an admin page, so it won't be heavily abused **
You can do this with one UPDATE and some clever code, like so:
Create the array $data['order'] in the loop, then:
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach($data['order'] as $sort => $id){
$q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END ) WHERE photo_id IN (".implode(",",$data['order']).")";
A little clearer perhaps:
UPDATE article_photos SET order_pos = (CASE photo_id
    WHEN 1 THEN 999
    WHEN 2 THEN 1000
    WHEN 3 THEN 1001
END)
WHERE photo_id IN (1,2,3)
I use this approach for exactly what you're doing: updating sort orders.
No need for the second foreach: you know it's going to be two parts if your data passes validation (I'm assuming you validated it; if not, you should =) so just do:
if (count($exploded_part) == 2) {
    $id = $exploded_part[0];
    $seq = $exploded_part[1];
    /* rest of code */
} else {
    /* error - data does not conform despite validation */
}
As for update hammering: do your DB updates inside a transaction. Your DB will queue the operations but not commit them to the main DB until you commit the transaction, at which point it will happily apply the updates "for real" at lightning speed.
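A minimal sketch of that with PDO (assuming a $pdo connection is already set up):

$pdo->beginTransaction();
$stmt = $pdo->prepare(
    "UPDATE article_photos SET order_pos = :order_pos WHERE photo_id = :photo_id"
);
foreach (explode(',', $sorted_order) as $pair) {
    list($photo_id, $order_pos) = explode(':', $pair);
    $stmt->execute(array(':photo_id' => $photo_id, ':order_pos' => $order_pos));
}
$pdo->commit(); // all updates hit the table in one shot

Preparing the statement once outside the loop also saves the server re-parsing the same statement 150 times.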
I suggest making your script even simpler and renaming the variables, so the code is much more readable:
$parts = explode(',', $sorted_order);
foreach ($parts as $part) {
    list($id, $position) = explode(':', $part);
    // Now you can work with $id and $position
}
More info about list: http://php.net/manual/en/function.list.php
Also, about performance and your data structure:
The way you store your data is not perfect, but for this purpose it will not cause you any performance issues; you send less data, so there is less overhead overall.
The drawback of your data structure is that you will most probably be unable to establish relationships between tables, make joins, or alter the table structure in a clean way.
I've got a script which is supposed to run through a MySQL database and perform a certain 'test' on the cases. Simplified: the database contains records which represent trips made by persons. Each record is a single trip. But I want to use only round trips, so I need to search the database and match two trips to each other: the trip to and the trip from a certain location.
The script is working fine. The problem is that the database contains more than 600,000 cases. I know this should be avoided if possible, but for the purpose of this script and the use of the database records later on, everything has to stay together.
Executing the script takes hours right now, running on my iMac under MAMP. Of course I made sure that it can use a lot of memory, et cetera.
My question is: how could I speed things up? What's the best approach to do this?
Here's the script I have right now:
$table = $_GET['table'];
$output = '';

// Select all cases that have not been marked invalid in a previous test
$query = "SELECT persid, ritid, vertpc, aankpc, jaar, maand, dag FROM MON.$table WHERE reasonInvalid != '1' OR reasonInvalid IS NULL";
$result = mysql_query($query) or die($output .= mysql_error());

$totalCountValid = 0;
$totalCountInvalid = 0;
$totalCount = 0;

// For each record:
while ($row = mysql_fetch_array($result)) {
    $totalCount += 1;
    // Run another query: get all rows for this person ID that share postal
    // codes. The postal codes are reversed between the two trips.
    $persid = $row['persid'];
    $ritid = $row['ritid'];
    $pcD = $row['vertpc'];
    $pcA = $row['aankpc'];
    $jaar = $row['jaar'];
    $maand = $row['maand'];
    $dag = $row['dag'];
    $thecountquery = "SELECT * FROM MON.$table WHERE persid=$persid AND vertpc=$pcA AND aankpc=$pcD AND jaar = $jaar AND maand = $maand AND dag = $dag";
    $thecount = mysql_num_rows(mysql_query($thecountquery));
    if ($thecount >= 1) {
        // No worries, this person ID has a matching return trip
        $totalCountValid += 1;
    } else {
        // Ow my, the case is invalid!
        $totalCountInvalid += 1;
        // Call markInvalid from functions.php
        markInvalid($table, '2', 'ritid', $ritid);
    }
}

// Echo the result
$output .= 'Total cases: ' . $totalCount . '<br>Valid: ' . $totalCountValid . '<br>Invalid: ' . $totalCountInvalid;
echo $output;
Your basic problem is that you are doing the following:
1) Getting all cases that haven't been marked as invalid.
2) Looping through the cases obtained in step 1) and firing one extra query per case.
What you can easily do is combine the queries written for 1) and 2) into a single query and loop over its result once, as in the sketch below. This will speed things up considerably.
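An untested sketch of that single query (tablename stands in for your dynamic $table):

SELECT t1.persid, t1.ritid,
       EXISTS (
           SELECT 1 FROM MON.tablename t2
           WHERE t2.persid = t1.persid
             AND t2.vertpc = t1.aankpc  -- postal codes reversed on the return trip
             AND t2.aankpc = t1.vertpc
             AND t2.jaar = t1.jaar AND t2.maand = t1.maand AND t2.dag = t1.dag
       ) AS has_return_trip
FROM MON.tablename t1
WHERE t1.reasonInvalid != '1' OR t1.reasonInvalid IS NULL;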
Also bear in mind the following tips.
1) Selecting all columns is not a good thing to do; it takes an ample amount of time for the data to travel over the network. I would recommend replacing the wildcard with only the columns you really need:
SELECT persid, ritid, ... instead of SELECT *
2) Use indexes - sparingly, efficiently and appropriately. Understand when to use them and when not to.
3) Use views if you can.
4) Enable MySQL slow query log to understand which queries you need to work on and optimize.
log_slow_queries = /var/log/mysql/mysql-slow.log
long_query_time = 1
log-queries-not-using-indexes
5) Use correct MySQL field types and the right storage engine (very, very important).
6) Use EXPLAIN to analyze your query. EXPLAIN is a useful command in MySQL which can give you great detail about how a query is run, which index is used, how many rows it needs to check, and whether it needs to do file sorts, temporary tables and other nasty things you want to avoid.
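For instance, prefixing the inner lookup from your script with EXPLAIN (values filled in by hand) shows whether it can use an index or has to scan the whole table:

EXPLAIN SELECT * FROM MON.tablename
WHERE persid = 1 AND vertpc = 1000 AND aankpc = 2000
  AND jaar = 2011 AND maand = 1 AND dag = 1;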
Good luck.
When I run my script I receive the following error before all rows of data have been processed:
maximum execution time of 30 seconds exceeded
After researching the problem, I learned I should be able to extend max_execution_time, which should resolve the problem.
But being in my PHP programming infancy, I would like to know if there is a more optimal way of writing my script below, so I do not have to rely on "get out of jail" cards.
The script is:
1 Taking a CSV file
2 Cherry-picking some columns
3 Trying to insert 10k rows of CSV data into a MySQL table
In my head I think I should be able to insert in chunks, but that is so far beyond my skill set that I do not even know how to write one line :\
Many thanks in advance
<?php
function processCSV()
{
    global $uploadFile;
    include 'dbConnection.inc.php';
    dbConnection("xx", "xx", "xx");
    $rowCounter = 0;
    $loadLocationCsvUrl = fopen($uploadFile, "r");
    if ($loadLocationCsvUrl !== false)
    {
        // fgetcsv takes (handle, length, delimiter); 0 = no line-length limit
        while ($locationFile = fgetcsv($loadLocationCsvUrl, 0, ','))
        {
            $officeId = $locationFile[2];
            $country = $locationFile[9];
            $country = trim($country);
            $country = htmlspecialchars($country);
            $open = $locationFile[4];
            $open = trim($open);
            $open = htmlspecialchars($open);
            $insString = "insert into countrytable set officeId='$officeId', countryname='$country', status='$open'";
            // Skip the header row, insert everything else
            if ($country != 'Country')
            {
                if (!mysql_query($insString))
                {
                    echo "<p>error " . mysql_error() . "</p>";
                }
            }
            $rowCounter++;
        }
        echo "$rowCounter inserted.";
        fclose($loadLocationCsvUrl);
    }
}
processCSV();
?>
First: in 2011 you do not use mysql_query. You use mysqli or PDO with prepared statements; then you do not need to figure out how to escape strings for SQL. (You used htmlspecialchars, which is totally wrong for this purpose.) Next, you could use a transaction to speed up the many inserts. MySQL also supports multi-row inserts.
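A minimal sketch of that combination (PDO assumed; the connection details are placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=xx;charset=utf8', 'xx', 'xx');
$pdo->beginTransaction();
$stmt = $pdo->prepare(
    "INSERT INTO countrytable (officeId, countryname, status) VALUES (?, ?, ?)"
);
while ($locationFile = fgetcsv($loadLocationCsvUrl, 0, ',')) {
    if (trim($locationFile[9]) == 'Country') continue; // skip the header row
    $stmt->execute(array($locationFile[2], trim($locationFile[9]), trim($locationFile[4])));
}
$pdo->commit();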
But the best bet would be the CSV storage engine: http://dev.mysql.com/doc/refman/5.0/en/csv-storage-engine.html - read up on it there. You can instantly load everything into SQL and then manipulate it there as you wish. The article also shows the LOAD DATA INFILE command.
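Roughly like this (untested; the column positions follow the script above, with unused CSV columns dumped into throwaway user variables):

LOAD DATA LOCAL INFILE '/path/to/upload.csv'
INTO TABLE countrytable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(@c0, @c1, @officeId, @c3, @open, @c5, @c6, @c7, @c8, @country)
SET officeId = @officeId, status = TRIM(@open), countryname = TRIM(@country);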
Well, you could build a single query like this:
$query = "INSERT INTO countrytable (officeId, countryname, status) VALUES ";
$entries = array();
while ($locationFile = fgetcsv($loadLocationCsvUrl, ',')) {
// your code
$entries[] = "('$officeId', '$country', '$open')";
}
$query .= implode(', ', $enties);
mysql_query($query);
But whether this works depends on how long the resulting query is and what the server limit is set to.
And as you can read in the other posts, there are better ways to meet your requirements; I just thought I should share an approach you had been thinking about.
You can try calling the following function before inserting. It sets the time limit to unlimited instead of the default 30 seconds.
set_time_limit( 0 );