Update pricing in MySQL where multiple prices are in the same row - PHP

I have been tasked to update the price list on our website. Currently, we have to do this one item at a time inside the Admin Panel.
However, we have over 3000 items, and tiered pricing for each item (up to 5 tiers).
Problem is, in the sell_prices table, the prices are structured like so:
10.50:9.50:8.50;7.50;6.50 in one cell.
I am attempting to write a script that will increase each sell price by 10%:
UPDATE inv_items
SET sell_prices = sell_prices * 1.10
WHERE id = xxxxxx
I have also tried:
UPDATE inv_items
SET sell_prices = sell_prices * 1.10; sell_prices = sell_prices * 1.10
WHERE id = xxxxxx
But naturally I received an error.
The first query works, but it only updates the first price in the cell and leaves the rest blank.
Currently, I am working through PhpMyAdmin but I will write a new Price Increase script once I can figure this out.
I have backed up the database.

If I understand you correctly, you have five(?) prices in one field, colon separated?
That is a really bizarre way of doing it. There may be a nifty way of doing it with MySQL string parsing, but from PHP you're going to need to pull the value out, explode it into an array, apply the price increase to each element, implode it back with the colons, and write it back to the database. It's as clunky as all get-out, but faster than doing it by hand. Just.
Going forward, if you can, you really need to look at refactoring that; it's just going to keep biting you.
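Just to illustrate what a refactor could look like (a sketch only; the table and column names here are invented, and $mysqli is assumed to be a connected handle as in the code further down), one row per item per price tier makes the bulk update trivial:

// hypothetical normalised layout: one row per item per price tier
$mysqli->query("
    CREATE TABLE inv_item_prices (
        item_id INT NOT NULL,        -- references inv_items.id
        tier    TINYINT NOT NULL,    -- price tier 1..5
        price   DECIMAL(10,2) NOT NULL,
        PRIMARY KEY (item_id, tier)
    )
");

// a 10% increase across the board is then a single statement, no string surgery needed
$mysqli->query("UPDATE inv_item_prices SET price = price * 1.10");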

You'll need to do something like:
select sell_prices from inv_items
(get the values into php)
(split the values by delimiter ':')
(update each value)
(rebuild the line of values with ':' in between)
update inv_items set sell_prices = (value string you just created)

EDIT with mysqli function as suggested:
$qry = 'SELECT id, sell_prices FROM `tablename`';
if ($result = $mysqli->query($qry)) {
    $prices = array();
    while ($row = $result->fetch_assoc()) {
        // adjust the delimiter here and in implode() below to whatever the column actually uses (':' or ';')
        $temp = explode(';', $row['sell_prices']);
        foreach ($temp as &$price) {
            // keep two decimal places so the stored string stays in "10.50" form
            $price = number_format((float)$price * 1.1, 2, '.', '');
        }
        unset($price); // break the reference left behind by the foreach
        $prices[$row['id']] = implode(';', $temp);
    }
    $stmt = $mysqli->prepare("UPDATE `tablename` SET sell_prices = ? WHERE id = ?");
    foreach ($prices as $id => $pricesStr) {
        $stmt->bind_param('si', $pricesStr, $id);
        $stmt->execute();
    }
    $stmt->close();
}
Please note that I wrote this on the fly without testing; I may have overlooked something :)

Related

PHP or SQL for Large Data and Sub/Related Data Sets

I can't imagine I'm the first or the last to ask about this, but I was unable to find the answer by searching.
I have a large dataset of orders in SQL Server. I need to return every order with each of their line items and each of their payments, and I'm looking for the most efficient way to query and iterate through this data. I am using PHP/MSSQL to pull the orders; looping through those with foreach and querying for a list of items and payments for each one would be incredibly inefficient.
I've thought about creating a union to create one massive dataset that has a bunch of columns, one of which is a column indicating what type of record (payment,line_item,order_data). But that also seems pretty inefficient.
This is an outline of the process I have been working with:
<?php
// query for orders
$records = sql_function("select * from v_migration_orders");
$json_output = array();
foreach ($records as $r) {
    $data['order']['org_code'] = $r['org_code'];
    $data['order']['po_number'] = $r['po_number'];
    ...
    // query for line items
    $data['order']['line_items'] = array();
    $items = sql_function("select * from v_migration_line_items where invoice = ".sanitize_function($r['invoice']));
    foreach ($items as $i) {
        $line['line_item_type'] = $i['line_item_type'];
        $line['sku'] = $i['sku'];
        $line['legacy_id'] = $i['legacy_id'];
        ...
        array_push($data['order']['line_items'], $line);
    }
    // query for payments
    $data['order']['payments'] = array();
    $payments = sql_function("select * from v_migration_payments where invoice = ".sanitize_function($r['invoice']));
    foreach ($payments as $p) {
        $pyt['Amount'] = $p['Amount'];
        $pyt['payment_type'] = $p['payment_type'];
        $pyt['Check_Number'] = $p['Check_Number'];
        ...
        array_push($data['order']['payments'], $pyt);
    }
    array_push($json_output, $data);
}
echo json_encode($json_output);
With a focus on efficiency, I'm hoping someone in the community can help point me in the right direction here. The number of calls from the web server to the database server are triple what I'd like them to be. The alternative I can think of is to find a way to place them all into a single trip to the database and have PHP parse the full response once it receives it.
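For example, one fixed query per view plus a group-by on Invoice in PHP would look roughly like this (untested sketch; it assumes sql_function() returns rows as associative arrays keyed by the column names listed below):

// Three round trips total, regardless of how many orders there are
$orders   = sql_function("select * from v_migration_orders");
$items    = sql_function("select * from v_migration_orders_line_items");
$payments = sql_function("select * from v_migration_orders_payments");

// Index the child rows by Invoice so each order can pick up its own rows
$items_by_invoice = array();
foreach ($items as $i) {
    $items_by_invoice[$i['Invoice']][] = $i;
}
$payments_by_invoice = array();
foreach ($payments as $p) {
    $payments_by_invoice[$p['Invoice']][] = $p;
}

$json_output = array();
foreach ($orders as $r) {
    $inv = $r['Invoice'];
    $data = array('order' => $r);
    $data['order']['line_items'] = isset($items_by_invoice[$inv]) ? $items_by_invoice[$inv] : array();
    $data['order']['payments']   = isset($payments_by_invoice[$inv]) ? $payments_by_invoice[$inv] : array();
    $json_output[] = $data;
}
echo json_encode($json_output);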
Below is the structure of the needed data that I've put into views. Note that I'm using the integer Invoice as the key to join on.
v_migration_orders:
    line_type (value is "Order")
    Org_Code
    PO_Number
    Invoice
    Tax_Total
    Order_Total
    EMAIL
    status
    order_t_stamp
    order_legacy_id
v_migration_orders_line_items:
    line_type (value is "line_item")
    line_item_type
    Invoice
    sku
    legacy_id
    TITLE
    quantity
    unit_price
v_migration_orders_payments:
    line_type (value is "payment")
    Invoice
    Amount
    payment_type
    Check_Number
    Card_Processor
    cc_message
    card_type
    cardholder_name
    card_expiry
    payment_token
    payment_t_stamp
Thanks in advance!

PHP MYSQL Check & Append Function

I hope someone can help. Basically I'm fairly OK with PHP and MySQL,
however, I need some advice on how to complete this task.
As my system is too complex to explain, I've condensed it down so it's clearer.
Basically, I have a simple PHP form that asks the user for their:
Name, Item Ordered, Item Quantity. The OrderID is autogenerated and is a random
4-digit number. So at the moment I do it with this:
$sql="INSERT INTO system_orders
(orderid,name,itemordered,itemquantity) VALUES
('$randomgeneratednumber', '$_POST[name]','$_POST[itemordered]','$_POST[itemquantity]')"; and
run $sql
Now what I want is: if they put the quantity as "2", I want it to create an additional row and append a suffix to the random generated number. For example, if the random generated number was 9876 and the quantity was 2, it would create an additional new row with $randomgeneratednumber-2, in this example 9876-2.
Would anyone know how to achieve this?
I have temporarily used an if statement (which I know is really bad programming practice) to append the -2 manually, but there must be a better way to detect whether $quantity = 2 and then create an additional row with the appended -2, and so on for 3, 4, 5, 6, 7, 8...
Use a loop:
if ($quantity > 1) {
    for ($q = 2; $q <= $quantity; $q++) {
        $sql = "INSERT INTO system_orders
                (orderid, name, itemordered, itemquantity) VALUES
                ('$randomgeneratednumber-$q', '$_POST[name]', '$_POST[itemordered]', '$_POST[itemquantity]')";
        // run $sql
    }
}
You should also switch to a database API that supports parameterized queries, or at least escape the user-supplied inputs.
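For example, a rough mysqli prepared-statement version of the loop above (untested sketch; it assumes a connected $mysqli handle, which isn't shown in the question):

$name          = $_POST['name'];
$itemordered   = $_POST['itemordered'];
$itemquantity  = (int)$_POST['itemquantity'];
$orderid       = $randomgeneratednumber;

$stmt = $mysqli->prepare(
    "INSERT INTO system_orders (orderid, name, itemordered, itemquantity) VALUES (?, ?, ?, ?)"
);
// bind_param binds by reference, so reassigning $orderid before each execute() is enough
$stmt->bind_param('sssi', $orderid, $name, $itemordered, $itemquantity);

for ($q = 1; $q <= $itemquantity; $q++) {
    // first row keeps the plain order id, later rows get "-2", "-3", ...
    $orderid = ($q == 1) ? $randomgeneratednumber : $randomgeneratednumber . "-" . $q;
    $stmt->execute();
}
$stmt->close();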
$sql="INSERT INTO system_orders
(orderid,name,itemordered,itemquantity) VALUES
('$randomgeneratednumber', '$_POST[name]','$_POST[itemordered]','$_POST[itemquantity]')"; and
run $sql
if ($_POST['itemquantity']>1) {
$multipleorderid = $randomgeneratednumber."-".$POST['itemquantity'];
$sql="INSERT INTO system_orders
(orderid,name,itemordered,itemquantity) VALUES
('$multipleorderid', '$_POST[name]','$_POST[itemordered]','$_POST[itemquantity]')"; and
run $sql
}

Splitting a string of values like 1030:0,1031:1,1032:2 and storing data in database

I have a bunch of photos on a page and using jQuery UI's Sortable plugin, to allow for them to be reordered.
When my sortable function fires, it writes a new order sequence:
1030:0,1031:1,1032:2,1040:3,1033:4
Each item of the comma delimited string, consists of the photo ID and the order position, separated by a colon. When the user has completely finished their reordering, I'm posting this order sequence to a PHP page via AJAX, to store the changes in the database. Here's where I get into trouble.
I have no problem getting my script to work, but I'm pretty sure it's the incorrect way to achieve what I want, and will suffer hugely in performance and resources - I'm hoping somebody could advise me as to what would be the best approach.
This is my PHP script that deals with the sequence:
if ($sorted_order) {
    $exploded_order = explode(',', $sorted_order);
    foreach ($exploded_order as $order_part) {
        $exploded_part = explode(':', $order_part);
        $part_count = 0;
        foreach ($exploded_part as $part) {
            $part_count++;
            if ($part_count == 1) {
                $photo_id = $part;
            } elseif ($part_count == 2) {
                $order = $part;
            }
            $SQL = "UPDATE article_photos ";
            $SQL .= "SET order_pos = :order_pos ";
            $SQL .= "WHERE photo_id = :photo_id;";
            ... rest of PDO stuff ...
        }
    }
}
My concerns arise from the nested foreach functions and also running so many database updates. If a given sequence contained 150 items, would this script cry for help? If it will, how could I improve it?
** This is for an admin page, so it won't be heavily abused **
You can use one update, with some clever code like so:
Create the array $data['order'] in the loop, then:
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach ($data['order'] as $sort => $id) {
    $q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END ) WHERE photo_id IN (" . implode(",", $data['order']) . ")";
A little clearer, perhaps:
UPDATE article_photos SET order_pos = (CASE photo_id
    WHEN 1 THEN 999
    WHEN 2 THEN 1000
    WHEN 3 THEN 1001
END)
WHERE photo_id IN (1,2,3)
I use this approach for exactly what you're doing, updating sort orders.
No need for the second foreach: you know it's going to be two parts if your data passes validation (I'm assuming you validated this; if not, you should =) so just do:
if (count($exploded_part) == 2) {
    $id = $exploded_part[0];
    $seq = $exploded_part[1];
    /* rest of code */
} else {
    /* error - data does not conform despite validation */
}
As for update hammering: do your DB updates in a transaction. Your db will queue the ops, but not commit them to the main DB until you commit the transaction, at which point it'll happily do the update "for real" at lightning speed.
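A minimal sketch of that (assuming a connected PDO handle $pdo, and a $pairs array of photo_id => order_pos that you've built from the posted string):

$pdo->beginTransaction();
$stmt = $pdo->prepare("UPDATE article_photos SET order_pos = :order_pos WHERE photo_id = :photo_id");
foreach ($pairs as $photo_id => $order_pos) {
    // each execute is queued inside the transaction and committed in one go below
    $stmt->execute(array(':order_pos' => $order_pos, ':photo_id' => $photo_id));
}
$pdo->commit();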
I suggest making your script even simpler and changing the names of the variables, so the code is more readable.
$parts = explode(',', $sorted_order);
foreach ($parts as $part) {
    list($id, $position) = explode(':', $part);
    // Now you can work with $id and $position
}
More info about list: http://php.net/manual/en/function.list.php
Also, about performance and your data structure:
The way you store your data is not perfect, but you will not suffer any performance issues this way; you send less data, so there is less overhead overall.
The drawback of your data structure, however, is that you will most probably be unable to establish relationships between tables, make joins, or alter the table structure cleanly.

Transform MySQL table and rows

I have one problem here, and I don't even have a clue what to Google or how to solve this.
I am making a PHP application to export and import data from one MySQL table into another, and I have a problem with these tables.
The source table looks like this: [screenshot of the source table]
My destination table has ID_ and pr0, pr1, pr2 as columns, so it looks like this: [screenshot of the destination table]
Now the problem is the following: if I just copy (insert every value of the first table as a new row in the second), it will end up with something like 20,000 rows instead of, for example, 1,000.
Even if I copy every record as a new row in the second table, is there any way I can fuse rows? Basically I need to check whether a value already exists in the last row with that ID_: if that row already has a value in the column in question (pr2, for example), insert a new row; but if the last row with the same ID_ does not have a value in the pr2 column, just update that row with the value for pr2.
I need an idea of how to do this in PHP or MySQL.
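Roughly the behaviour I'm after, sketched in PHP (untested; $mysqli, $id, $value and the row_id primary key are all made-up names for illustration only):

// Decide whether to fill pr2 on the last row for this ID_, or start a new row
$res  = $mysqli->query("SELECT row_id, pr2 FROM DEST_TABLE WHERE ID_ = $id ORDER BY row_id DESC LIMIT 1");
$last = $res ? $res->fetch_assoc() : null;

if ($last && ($last['pr2'] === null || $last['pr2'] === '')) {
    // last row for this ID_ has no pr2 yet: fuse the value into it
    $mysqli->query("UPDATE DEST_TABLE SET pr2 = $value WHERE row_id = {$last['row_id']}");
} else {
    // no row for this ID_ yet, or pr2 already filled: insert a new row
    $mysqli->query("INSERT INTO DEST_TABLE (ID_, pr2) VALUES ($id, $value)");
}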
So you've got a few problems:
1) Copying the table from SQL to PHP: pay attention to memory usage, and check it with PHP's memory_get_usage(). It will show you that importing SQL data can be expensive. Look this up. Another thing is that PHP doesn't release memory when you overwrite array values; that will be useful to know later on.
2) I didn't understand whether the values are unique at the source or should be unique at the destination table, so I will assume that everything from the source needs to end up in the destination as-is.
I will also assume that pr = pr0 and quant = pr1.
3) You have mismatched names; that can also be an issue and is worth taking care of.
4) I will use mysqli as the SQL connector and assume $mysqli is already connected.
SCRIPT:
<?php
// Pull everything from the source table (assumes $mysqli is an open connection)
$select_sql = "SELECT * FROM Table_source";
$result = $mysqli->query($select_sql);

$data_source = array();
while ($row = $result->fetch_row()) {
    $data_source[] = $row;
}

$bulk = 2000; // rows per batched REPLACE
$start_query = "REPLACE INTO DEST_TABLE (`ID_`, `pr0`, `pr1`, `pr2`)";
$insert_data = array();

foreach ($data_source as $data) {
    // build one "(id, pr0, pr1, 0)" group; the trailing 0 fills the missing pr2 value
    // (values go in unquoted, as in the original idea - quote/escape them if they are not numeric)
    $insert_data[] = '(' . implode(',', $data) . ',0)';

    if (count($insert_data) >= $bulk) {
        $mysqli->query($start_query . ' VALUES ' . implode(',', $insert_data));
        $insert_data = array();
    }
}

// flush whatever is left over from the last partial batch
if (count($insert_data) > 0) {
    $mysqli->query($start_query . ' VALUES ' . implode(',', $insert_data));
}
?>
It's off the top of my head, but check this idea and tell me if it works. The bugs might be in small things I forgot in the query structure; print the query and paste it into phpMyAdmin or your DB query tool and see that it's all good, but this concept will save a lot of problems.

Referencing the next iteration before it happens in PHP

I have a table in MySQL with "text", "date_posted", and "user". I currently query all text from user=Andy, and call those questions. All of the other text fields from other users are answers to the most recent question.
What I want is to associate those answers with the most recent question, with a loop similar to "for each text where user=Andy, find the text where user!=Andy until date>the next user=Andy (question)"
This seems awfully contrived, and I'm wondering if it can be done roughly as I've outlined, or if I can save myself some trouble in how I'm storing the data or something.
Thanks for any advice.
EDIT: I've added in the insert queries I've been using.
$url = "http://search.twitter.com/search.json?q=&ands=&phrase=&ors=&nots=RT%2C+%40&tag=andyasks&lang=all&from=amcafee&to=&ref=&near=&within=1000&units=mi&since=&until=&tude%5B%5D=%3F&rpp=50)";
$contents = file_get_contents($url);
$decode = json_decode($contents, true);
foreach($decode['results'] as $current) {
$query = "INSERT IGNORE INTO andyasks (questions, date, user) VALUES ('$current[text]','$current[created_at]','Andy')";
mysql_query($query);
}
$url2 = "http://search.twitter.com/search.json?q=&ands=&phrase=&ors=&nots=RT&tag=andyasks&lang=all&from=&to=amcafee&ref=&near=&within=15&units=mi&since=&until=&rpp=50";
$contents2 = file_get_contents($url2);
$decode2 = json_decode($contents2, true);
foreach($decode2['results'] as $current2) {
$query2 = "INSERT IGNORE INTO andyasks (questions, date, user) VALUES ('$current2[text]','$current2[created_at]','$current2[from_user]')";
mysql_query($query2);
}
And then on the SELECT side, this is where I am currently:
$results = mysql_query("SELECT * FROM andyasks");
$answers = mysql_query("SELECT * FROM andyasks WHERE 'user' != 'Andy'");
while ($row = mysql_fetch_array($results)) {
    if ($row['user'] == 'Andy') {
        print(preg_replace($pattern, $replace, "<p>".$row["questions"]."</p>"));
    }
}
while ($row = mysql_fetch_array($answers)) {
    print(preg_replace('/#amcafee/', '', "<p>".$row["questions"]."</p>"));
}
What you have in mind could, I believe, be done with subtle use of JOIN or nested SELECT, ORDER BY, LIMIT, etc, but, as you surmise, it would be "awfully contrived" and likely pretty slow.
As you suspect, you would save yourself a lot of trouble at SELECT time if you added a column to the table which, for answers, holds the primary key of the question they're answering (that could easily be obtained at INSERT time, since it's the latest entry with user equal to Andy). Then the retrieval would be easier!
If you can alter your schema this way but need help with the SQL, please comment or edit your question to indicate that and I'll be happy to follow up (similarly, I'd be happy to follow up if you're stuck with this schema and need the "awfully contrived" SQL -- I just don't know which of the two possibilities applies!-).
Edit: since the schema's changed, the INSERT could be (using form :name to indicate parameters you should bind):
INSERT IGNORE INTO andyasks
(questions, date, user, answering)
SELECT :text, :created_at, :from_user,
IF(:from_user='Andy', NULL, aa.id)
FROM andyasks AS aa
WHERE user='Andy'
ORDER BY date DESC
LIMIT 1
i.e.: use `INSERT INTO ... SELECT` to do a query-within-insertion, which picks the latest post by Andy. I'm assuming you also have an auto-increment primary key `id`, which is the normal arrangement of things.
Later to get all answers to a given question, you only need to select rows whose answering attribute equals that question's id.
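For instance, something along these lines (sketch only; it assumes a connected PDO handle $pdo and the question's id in $question_id, rather than the mysql_* calls used in the question):

$stmt = $pdo->prepare("SELECT * FROM andyasks WHERE answering = :question_id ORDER BY date");
$stmt->execute(array(':question_id' => $question_id));
$answers = $stmt->fetchAll(); // every row answering that one question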
If I understand you correctly you want something like:
$myArr = array("bob","joe","jennifer","mary");
// note: next() advances the internal pointer before returning, so this starts at the second element
while ($something = next($myArr)) {
    if ($nextone = next($myArr)) {
        // do something
        prev($myArr);
    }
}
See http://jp2.php.net/next as well as the sections on prev, reset and current.
