I have an application where investment profit should be calculated on a daily basis, which I intend to do using a cron job.
Presently, based on the rate a user has and the amount they possess, I have written a statement to calculate the profit as:
$rate = $row_cron['rate'];
$amount = $row_cron['amount'];
$usernamex = $row_cron['Username'];
$check = $rate * $amount / 100;
$sql = "UPDATE users
SET invest = invest + $check
WHERE status = 'member'";
To fetch my records I have also created a recordset of data using:
mysql_select_db($database_emirate, $emirate);
$query_cron = "SELECT * FROM users";
$cron = mysql_query($query_cron, $emirate) or die(mysql_error());
$row_cron = mysql_fetch_assoc($cron);
$totalRows_cron = mysql_num_rows($cron);
I have used this query to get all users in my table.
Now the above code snippet only returns the first record in my table to be updated.
What I am trying to achieve is to return all records in my table so as to calculate and update them at the same time, based on their various column data below:
$rate = $row_cron['rate'];
$amount = $row_cron['amount'];
Please help.
Simply use pure SQL, as you are updating the same table with a calculated column. No loops are needed; just run one query from PHP.
UPDATE `users`
SET invest = invest + (rate * amount / 100)
WHERE status = 'member'
Do note: SQL allows arithmetic calculations, and some RDBMSs like MySQL even provide mathematical and statistical functions.
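The entire cron script then shrinks to a single query. A minimal sketch, assuming a mysqli connection in $link (the question's own code uses the deprecated mysql_* API with $emirate):
<?php
// One set-based UPDATE replaces the per-row PHP calculation entirely.
$sql = "UPDATE users
        SET invest = invest + (rate * amount / 100)
        WHERE status = 'member'";
mysqli_query($link, $sql) or die(mysqli_error($link));
?>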
My database is for keeping track of users' balances, and there are certain actions that will add or remove balance from the users. Currently I am doing this to get the balance and update it:
$conn->prepare("SELECT * FROM users WHERE userid=:uid")
$conn->bindValue(':uid', $data['id']
$conn-execute()
$currentBal = $conn->fetch()
$newBal = $currentBal['balance'] + 100
$conn->prepare("UPDATE users SET balance=:bal WHERE userid=:uid")
$conn->bindValue(':bal', $newBal)
$conn->bindValue(':uid', $data['id']
Is there a way to directly do math within the query to add or remove? It is an integer.
Just do arithmetic:
UPDATE users
SET balance = balance + :inc
WHERE userid = :uid;
You can hardcode the 100, but I think it is better to pass it in as a parameter.
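A minimal PDO sketch of that, reusing $conn and $data['id'] from the question (the 100 is just the example amount; pass a negative value to remove balance):
$stmt = $conn->prepare("UPDATE users SET balance = balance + :inc WHERE userid = :uid");
$stmt->bindValue(':inc', 100, PDO::PARAM_INT);
$stmt->bindValue(':uid', $data['id']);
$stmt->execute();
Because the database applies the increment itself, this also avoids the read-modify-write race in the original two-query version.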
I am creating a running inventory system. I am looking for suggestions on how to speed up the processing time to populate my table that contains the daily inventories.
I have 4 database tables that I am using to do this:
1). Daily Usage table
2). Incoming Product table (product ordered to come in)
3). Physical Inventory Counts table (Monthly Physical Count totals held here)
4). Perpetual Running Inventory table (In a perfect world this is what inventories should be)
As of now I have each piece of data being updated individually in my perpetual table, and it takes quite a while for my code to run through, do the calculations, connect to the database, and update the information.
My question is: How can I speed this up? Is there a way that I can upload all info from an array at once so I am only updating the database once instead of a couple hundred times?
Sorry for dumping 75 lines of code below, but I wanted to make sure to show everything so it can easily be seen what is going on.
My logic behind the code:
I start off by pulling info from all 4 tables into arrays and then cycling through the arrays. I use a date function to start from 2 weeks ago instead of the very beginning of the table (beginning of 2016). Once the date quota is met, I pull the previous day's perpetual running inventory, subtract the current day's daily usage, and add the current day's incoming to get the current day's perpetual. Once I have the current day's perpetual I update the perpetual table with that piece of data.
If I simply stored the information in the array, could I basically write over the whole table at the end? I'm assuming this would be much faster, if it is possible?
Here's the code:
<?php
include("connection.php");
/* $rowper below (Row Perpetual) */
$query= "SELECT * FROM perpetual";
$result = mysqli_query($link, $query);
/* $rowdaily below */
$query2= "SELECT * FROM dailyusage";
$result2 = mysqli_query($link, $query2);
/* $rowpo below (Row Purchase Order) */
$query3= "SELECT * FROM incomming";
$result3 = mysqli_query($link, $query3);
/* $rowinv below (Row Inventory) */
$query4= "SELECT * FROM inventory";
$result4 = mysqli_query($link, $query4);
$checkdate = mktime(0, 0, 0, date('n'), date('d')-14, date('y'));
$checkdate= date('Y-m-d', $checkdate);
$b=1;
while (($rowper = mysqli_fetch_array($result)) and ($rowdaily = mysqli_fetch_array($result2)) and ($rowpo = mysqli_fetch_array($result3)) and ($rowinv = mysqli_fetch_array($result4))) {
    $a = 2;
    if ($b == 1) {
        while ($a < mysqli_num_fields($result)) {
            /* $it holds the item #s from the column headers */
            $it[$a] = $rowper[$a];
            $a++;
        }
        $a = 2;
    }
    if ($rowper[1] >= $checkdate) {
        if ($b > 2) {
            while ($a < mysqli_num_fields($result)) {
                if ($rowinv[$a] != 0) {
                    /* a physical count exists: use it as the day's perpetual */
                    $rowper[$a] = $rowinv[$a];
                    $previnv[$a] = $rowper[$a];
                    /* the 'p' is because the column name was made from the item # plus 'p' at the end to make a valid column name */
                    $query = "UPDATE perpetual SET " . $it[$a] . "p ='" . $rowper[$a] . "' WHERE date='" . $rowper[1] . "' LIMIT 1";
                    mysqli_query($link, $query);
                } else {
                    /* otherwise: previous day's perpetual - usage + incoming */
                    $rowper[$a] = $previnv[$a] - $rowdaily[$a] + $rowpo[$a];
                    $previnv[$a] = $rowper[$a];
                    $query = "UPDATE perpetual SET " . $it[$a] . "p ='" . $rowper[$a] . "' WHERE date='" . $rowper[1] . "' LIMIT 1";
                    mysqli_query($link, $query);
                }
                $a++;
            }
        }
    }
    $b++;
}
?>
Stored procedures and user-defined functions are what you are looking for. You can schedule them to run at certain times to perform operations on your tables.
Side note: remember the fundamentals behind ETL.
Extract - Transform - Load, in that order.
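A rough MySQL sketch of that idea (all names and the join logic are illustrative only; the real body would mirror the per-item columns from the question, and the monthly physical-count override is omitted for brevity). The procedure recomputes the last two weeks in one set-based statement, and the event scheduler runs it nightly:
DELIMITER //
CREATE PROCEDURE update_perpetual()
BEGIN
    UPDATE perpetual p
    JOIN perpetual prev ON prev.date = p.date - INTERVAL 1 DAY
    JOIN dailyusage d ON d.date = p.date
    JOIN incomming i ON i.date = p.date
    SET p.item1p = prev.item1p - d.item1p + i.item1p  -- one assignment per item column
    WHERE p.date >= CURDATE() - INTERVAL 14 DAY;
END //
DELIMITER ;

-- requires SET GLOBAL event_scheduler = ON;
CREATE EVENT nightly_perpetual_update
    ON SCHEDULE EVERY 1 DAY
    DO CALL update_perpetual();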
I need to synchronize specific information between two databases (one MySQL, the other a remotely hosted SQL Server database) for thousands of rows. When I execute this PHP file it gets stuck or times out after several minutes, so I wonder how I can fix this issue and maybe also optimize the way of "synchronizing" it.
What the code needs to do:
Basically I want to get, for every row (= one account) in my database which gets updated, two specific pieces of information (= 2 SELECT queries) from another SQL Server database. Therefore I use a foreach loop which creates 2 SQL queries for each row, and afterwards I write that information into 2 columns of this row. We are talking about ~10k rows which need to run through this foreach loop.
My idea which may help?
I have heard about things like PDO transactions, which are supposed to collect all those queries and send them afterwards as one package of SELECT queries, but I have no idea whether I would be using them correctly or whether they even help in such cases.
This is my current code, which is timing out after a few minutes:
// DBH => MSSQL DB | DB => MySQL DB
$dbh->beginTransaction();
// Get all referral IDs which needs to be updated:
$listAccounts = "SELECT * FROM Gifting WHERE refsCompleted <= 100 ORDER BY idGifting ASC";
$ps_listAccounts = $db->prepare($listAccounts);
$ps_listAccounts->execute();
foreach($ps_listAccounts as $row) {
    $refid = $row['refId'];

    // Refsinserted
    $refsInserted = "SELECT count(username) as done FROM accounts WHERE referral='$refid'";
    $ps_refsInserted = $dbh->prepare($refsInserted);
    $ps_refsInserted->execute();
    $row = $ps_refsInserted->fetch();
    $refsInserted = $row['done'];

    // Refscompleted
    $refsCompleted = "SELECT count(username) as done FROM accounts WHERE referral='$refid' AND finished=1";
    $ps_refsCompleted = $dbh->prepare($refsCompleted);
    $ps_refsCompleted->execute();
    $row2 = $ps_refsCompleted->fetch();
    $refsCompleted = $row2['done'];

    // Update fields for local order db
    $updateGifting = "UPDATE Gifting SET refsInserted = :refsInserted, refsCompleted = :refsCompleted WHERE refId = :refId";
    $ps_updateGifting = $db->prepare($updateGifting);
    $ps_updateGifting->bindParam(':refsInserted', $refsInserted);
    $ps_updateGifting->bindParam(':refsCompleted', $refsCompleted);
    $ps_updateGifting->bindParam(':refId', $refid);
    $ps_updateGifting->execute();

    echo "$refid: $refsInserted Refs inserted / $refsCompleted Refs completed<br>";
}
$dbh->commit();
You can do all of that in one query with a correlated sub-query:
UPDATE Gifting
SET
refsInserted=(SELECT COUNT(USERNAME)
FROM accounts
WHERE referral=Gifting.refId),
refsCompleted=(SELECT COUNT(USERNAME)
FROM accounts
WHERE referral=Gifting.refId
AND finished=1)
A correlated sub-query is essentially a sub-query (a query within a query) that references the parent query. So notice that in each of the sub-queries I am referencing the Gifting.refId column in the where clause. While this isn't the best for performance, because each of those sub-queries still has to run independently of the others, it would perform much better (and likely as well as you are going to get) than what you have there.
Edit:
And just for reference: I don't know if a transaction will help here at all. Typically they are used when you have several queries that depend on each other, and they give you a way to roll back if one fails. For example, banking transactions. You don't want the balance to be deducted until the purchase has been inserted, and if the purchase fails to insert for some reason, you want to roll back the change to the balance. So when inserting a purchase, you start a transaction, run the update-balance query and the insert-purchase query, and only if both go in correctly and have been validated do you commit to save.
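An illustrative PDO sketch of that pattern (the table and column names here are made up, not from the question):
try {
    $dbh->beginTransaction();
    $dbh->prepare("UPDATE balances SET amount = amount - :cost WHERE userid = :uid")
        ->execute([':cost' => $cost, ':uid' => $uid]);
    $dbh->prepare("INSERT INTO purchases (userid, cost) VALUES (:uid, :cost)")
        ->execute([':uid' => $uid, ':cost' => $cost]);
    $dbh->commit(); // both statements persist together
} catch (Exception $e) {
    $dbh->rollBack(); // undo the balance change if the insert failed
    throw $e;
}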
Edit2:
If I were doing this without an export/import, this is what I would do. It makes a few assumptions though: first, that you are using MSSQL 2008 or newer, and second, that the referral id is always a number. I'm also using a temp table that I insert numbers into, because you can insert multiple rows easily with a single query, and then run a single update query to update the Gifting table. This temp table follows the structure CREATE TABLE tempTable (refId int, done int, total int).
//get list of referral accounts
//if you are using one column, only query for one column
$listAccounts = "SELECT DISTINCT refId FROM Gifting WHERE refsCompleted <= 100 ORDER BY idGifting ASC";
$ps_listAccounts = $db->prepare($listAccounts);
$ps_listAccounts->execute();
//loop over and get list of refIds from above.
$refIds = array();
foreach($ps_listAccounts as $row){
    $refIds[] = $row['refId'];
}
if(count($refIds) > 0){
    //implode into string for use in query below
    $refIds = implode(',', $refIds);
    //select out total count
    $totalCount = "SELECT referral, COUNT(username) AS cnt FROM accounts WHERE referral IN ($refIds) GROUP BY referral";
    $ps_totalCounts = $dbh->prepare($totalCount);
    $ps_totalCounts->execute();
    //array of counts, keyed by referral id
    $counts = array();
    //loop over total counts
    foreach($ps_totalCounts as $row){
        //if referral id not found, add it
        if(!isset($counts[$row['referral']])){
            $counts[$row['referral']] = array('total'=>0,'done'=>0);
        }
        //add to count
        $counts[$row['referral']]['total'] += $row['cnt'];
    }
    $doneCount = "SELECT referral, COUNT(username) AS cnt FROM accounts WHERE finished=1 AND referral IN ($refIds) GROUP BY referral";
    $ps_doneCounts = $dbh->prepare($doneCount);
    $ps_doneCounts->execute();
    //loop over done counts
    foreach($ps_doneCounts as $row){
        //if referral id not found, add it
        if(!isset($counts[$row['referral']])){
            $counts[$row['referral']] = array('total'=>0,'done'=>0);
        }
        //add to count
        $counts[$row['referral']]['done'] += $row['cnt'];
    }
    //now loop over counts and generate insert rows for a temp table.
    //I suggest using a temp table because you can insert multiple rows
    //in one query and then the update is one query.
    $sqlInsertList = array();
    foreach($counts as $refId=>$count){
        $sqlInsertList[] = "({$refId}, {$count['done']}, {$count['total']})";
    }
    //clear out the temp table first so we are only inserting new rows
    $truncSql = "TRUNCATE TABLE tempTable";
    $ps_trunc = $db->prepare($truncSql);
    $ps_trunc->execute();
    //make insert sql with multiple insert rows
    $insertSql = "INSERT INTO tempTable (refId, done, total) VALUES ".implode(',', $sqlInsertList);
    //prepare sql for insert into mssql
    $ps_insert = $db->prepare($insertSql);
    $ps_insert->execute();
    //sql to update existing rows
    $updateSql = "UPDATE Gifting
        SET refsInserted=(SELECT total FROM tempTable WHERE refId=Gifting.refId),
            refsCompleted=(SELECT done FROM tempTable WHERE refId=Gifting.refId)
        WHERE refId IN (SELECT refId FROM tempTable)
        AND refsCompleted <= 100";
    $ps_update = $db->prepare($updateSql);
    $ps_update->execute();
} else {
    echo "There were no reference ids found from \$dbh";
}
The following code runs incredibly slowly when performing a WHILE loop using data from the table product and updating another table, stock_figures, within the same database.
The code loops through each row in product, taking the values from product_id and wholesale_price, and then performs some calculations on the product table before updating the stock_figures table with the results.
I'd be grateful for any suggestions which would improve the performance of my queries.
PHP WHILE LOOP
<?
// Retrieve data from database
$loop = " SELECT product_id, wholesale_price FROM product";
$query= mysql_query($loop);
while($rows=mysql_fetch_assoc($query))
{
$row = mysql_fetch_row($query);
$id = $row[0];
$price = $row[1];
?>
QUERIES WITHIN WHILE LOOP
<?
$bawtry_stock = "
SELECT product_id,
( kids_uk_j_105 + kids_c_17 + kids_c_18 + kids_c_19 + ... etc )
AS SUM FROM product WHERE product_id = '$id'";
$result_bawtry = mysql_query($bawtry_stock) or die (mysql_error());
$line = mysql_fetch_row($result_bawtry);
$bawtry = $line[1];
$chain_stock = "
SELECT product_id,
(quantity_c_size_26_chain + quantity_c_size_28_chain + quantity_c_size_30_chain +
... etc )
AS SUM FROM product WHERE product_id = '$id'";
$result_chain = mysql_query($chain_stock) or die (mysql_error());
$line = mysql_fetch_row($result_chain);
$chain = $line[1];
/*
* Declare the total value of all pairs from Bawtry, Chain
*/
$totalpairs = $chain + $bawtry;
/*
* Insert values for stock to write to databse
* Total stock for Bawtry, Chain
* Total value of stock for Bawtry, Chain
*
*/
$bawtry_value = (float)($bawtry * $price);
$chain_value = (float)($chain * $price);
$total_value = (float)($price * ($bawtry + $chain));
$sql2="
UPDATE stock_figures SET
bawtry_stock='$bawtry',
chain_stock='$chain',
totalstock='$totalpairs',
bawtry_value='$bawtry_value',
chain_value='$chain_value',
totalvalue='$total_value'
WHERE id='$id'";
$result2=mysql_query($sql2) or die (mysql_error());
?>
// close while loop
<? } ?>
UPDATED CODE
$sql = "SELECT product_id, wholesale_price,
(kids_uk_j_105 + kids_c_17 + kids_c_18 + kids_c_19 + kids_c_20 + kids_c_21 +
... )
AS bawtry,
(quantity_c_size_26_chain + quantity_c_size_28_chain + quantity_c_size_30_chain +
... )
AS chain from product";
$result = mysql_query($sql) or die (mysql_error());
while ($line=mysql_fetch_assoc($result))
{
$id = $line['product_id'];
$price = $line['wholesale_price'];
$bawtry = $line['bawtry'];
$chain = $line['chain'];
/*
* Declare the total value of all pairs from Bawtry, Chain
*/
$totalpairs = $chain + $bawtry;
/*
* Insert values for stock to write to database
* Total stock for Bawtry, Chain
* Total value of stock for Bawtry, Chain
*
*/
$bawtry_value = (float)($bawtry * $price);
$chain_value = (float)($chain * $price);
$total_value = (float)($price * ($bawtry + $chain));
$sql2="
UPDATE stock_figures SET
bawtry_stock='$bawtry',
chain_stock='$chain',
totalstock='$totalpairs',
bawtry_value='$bawtry_value',
chain_value='$chain_value',
totalvalue='$total_value'
WHERE id='$id'";
$result2=mysql_query($sql2) or die (mysql_error());
However, it's still taking an absolute age to complete. It seems to run really fast when I comment out the UPDATE statement at the end. Obviously this needs to remain in the code, so I'll probably run the whole thing as a cron job.
Unless any further improvements can be suggested?
It seems you are doing a lot of wasted selects.
You first select some data from the product table, then for each row you select again from the same table. Twice. Then you finally write the result into another table, stock_figures.
And the only operation you are doing is adding lots of figures together.
All of this can be done in a single query:
select product_id,
       wholesale_price,
       (kids_uk_j_105 +
        kids_c_17 +
        ...) as bawtry,
       (quantity_c_size_26_chain +
        quantity_c_size_28_chain +
        ...) as chain
from product;
If this is still taking a lot of time you need to check some server settings and also the number of rows.
Every write you make is a transaction, and depending on your ACID level commits might be slow. Changing innodb_flush_log_at_trx_commit to 2 will speed up writes.
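For example (note the durability trade-off: with a value of 2, the log is flushed to disk only about once per second, so a crash can lose up to a second of commits):
SET GLOBAL innodb_flush_log_at_trx_commit = 2;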
You are doing a full table scan on the product table. I guess this is intended, but if that table is big, reading it will take a while, and writing all those rows back to stock_figures will take even longer.
Consider another approach. For each write (insert, update, or delete) to product, have a trigger update the corresponding row in stock_figures. Not only will it eliminate the batch job, it will also keep stock_figures correct at any given time.
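A rough sketch of such a trigger (column lists abbreviated as in the question; an AFTER INSERT twin would look the same). MySQL applies single-table SET assignments left to right, so the later assignments can reuse the freshly computed columns:
DELIMITER //
CREATE TRIGGER product_stock_sync AFTER UPDATE ON product
FOR EACH ROW
BEGIN
    UPDATE stock_figures
    SET bawtry_stock = NEW.kids_uk_j_105 + NEW.kids_c_17 /* + ... */,
        chain_stock  = NEW.quantity_c_size_26_chain + NEW.quantity_c_size_28_chain /* + ... */,
        totalstock   = bawtry_stock + chain_stock,
        bawtry_value = bawtry_stock * NEW.wholesale_price,
        chain_value  = chain_stock * NEW.wholesale_price,
        totalvalue   = totalstock * NEW.wholesale_price
    WHERE id = NEW.product_id;
END //
DELIMITER ;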
The first thing is:
$row = mysql_fetch_row($query);
$id = $row[0];
$price = $row[1];
I don't know if it works for you, but you already fetch $rows in your while condition, so you should probably change it to:
$id = $rows['product_id'];
$price = $rows['wholesale_price'];
Then you can combine the next 2 queries into:
SELECT product_id,
( kids_uk_j_105 + kids_c_17 + kids_c_18 + kids_c_19 + ... etc )
AS `SUM` FROM product WHERE product_id = '$id'
UNION ALL
SELECT product_id,
(quantity_c_size_26_chain + quantity_c_size_28_chain + quantity_c_size_30_chain +
... etc )
AS `SUM` FROM product WHERE product_id = '$id'
or even:
SELECT product_id,
( kids_uk_j_105 + kids_c_17 + kids_c_18 + kids_c_19 + ... etc )
AS `SUM1`,
(quantity_c_size_26_chain + quantity_c_size_28_chain + quantity_c_size_30_chain +
... etc )
AS `SUM2`
FROM product WHERE product_id = '$id'
because those 2 queries run on the same table.
But in fact you can use just one query to get everything about your products, as Andreas Wederbrand pointed out in his answer.
But there are more problems:
You use the old mysql_* functions instead of mysqli_* or PDO, and your code is vulnerable to SQL injection.
For each product you run 2 extra queries (a select with union all, if you go my way, plus an update).
I don't know how many products you have, but if you have for example 1000 or 10000 products, you cannot expect it to be very fast. In that case you should run your script from cron, or refresh the page and do the job for a small amount of products at a time (for example 10 or 100 per run; see the sketch after this list).
You should also consider whether your database structure is the best one. Using many columns the way you do here (kids_uk_j_105, kids_c_17, kids_c_18) is usually not the best choice.
I hope you have at least set a primary key on the product_id column.
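If you go the batching route, a hypothetical sketch (the offset file is just one way to remember progress between cron runs):
$offset = (int)@file_get_contents('offset.txt');
$sql = "SELECT product_id, wholesale_price, ... FROM product
        ORDER BY product_id LIMIT 100 OFFSET $offset";
// ...process these 100 products and run the UPDATEs as before...
file_put_contents('offset.txt', (string)($offset + 100));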
When executing many SQL commands, parsing them takes some time. You can reduce this overhead by using prepared statements: http://php.net/manual/en/mysqli.quickstart.prepared-statements.php
How much you gain depends on the case.
Prepared statements are also good for security reasons.
This answer does not void the other answers here. Try to gain efficiency by reducing the number of queries, analyzing their work, merging them if possible, etc.
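For this question that could look roughly like the following sketch (it assumes the code has been ported to mysqli with a connection in $link; adjust the bind types to your real column types):
// Prepare the UPDATE once; mysqli binds by reference, so each
// execute() picks up the current values of the variables.
$stmt = mysqli_prepare($link, "UPDATE stock_figures
    SET bawtry_stock=?, chain_stock=?, totalstock=?,
        bawtry_value=?, chain_value=?, totalvalue=?
    WHERE id=?");
mysqli_stmt_bind_param($stmt, 'iiiddds',
    $bawtry, $chain, $totalpairs,
    $bawtry_value, $chain_value, $total_value, $id);
while ($line = mysqli_fetch_assoc($result)) {
    // ...compute $bawtry, $chain, $totalpairs and the values as before...
    mysqli_stmt_execute($stmt);
}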
I am currently working on a simple auction site. I am storing bids in their own MySQL table called 'bids'. I am wondering what is the best way of ensuring that two of the same bids are not submitted at the exact same time.
My current strategy for verifying that the bid submitted is in fact the highest bid is to do the following (as an example):
$sql = "SELECT * FROM bids WHERE amount >= '".$bidamount."'";
$result = mysql_query($sql);
if(mysql_num_rows($result) == 0) {
$sql = "INSERT INTO bids SET amount = '".$bidamount."'";
mysql_query($sql);
$bidid = mysql_insert_id();
}
The problem with the above set of queries is that between the time the SELECT query is run and the INSERT query is run, another user could insert the same bid.
Is there some way to lock the table during the SELECT that would prevent this double-bidding from occurring? My main concern with locking tables for such a purpose would be performance problems when you have a lot of people bidding at once.
You may want to make a conditional insert, like:
$amount = intval($amount);
$query = "
INSERT INTO
bids(amount)
SELECT
{$amount}
FROM
(SELECT 1) tmp_tbl
WHERE NOT EXISTS(
SELECT * FROM bids WHERE amount >= {$amount}
)
";
and check for affected (inserted) rows.
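In PHP that check could look like this sketch (using PDO instead of the question's deprecated mysql_* API; the amount appears under two placeholder names because native prepares don't allow reusing one named placeholder):
$stmt = $pdo->prepare("
    INSERT INTO bids (amount)
    SELECT :amount
    FROM (SELECT 1) tmp_tbl
    WHERE NOT EXISTS (SELECT 1 FROM bids WHERE amount >= :amount2)
");
$stmt->execute([':amount' => $bidamount, ':amount2' => $bidamount]);
if ($stmt->rowCount() === 1) {
    $bidid = $pdo->lastInsertId(); // our bid went in as the new highest
} else {
    // an equal or higher bid already existed, or another user beat us to it
}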