When I submit an edited order in which the reward points awarded are negative (because the new processed order total is less than the original), one of the entries in the reward history is pushed forward 12 hours ahead of the most recent entry.
This isn't causing any bad effects on the total reward points, as the value is being calculated correctly.
There's really nothing my edit-order module could be doing to cause this. To edit an order, the module does exactly the same thing as Reorder, but it also creates a new session entry which stores the previous order's items, their quantities, the date the order was made, and its increment ID. The points are then calculated as normal, except there is a check for this session entry: if it exists, the module takes the base grand total, puts it through $this->getReward()->getRateToPoints()->calculateToPoints(), and subtracts that from the points delta.
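For reference, here is a minimal sketch of what that session step might look like. The 'editorder' and 'original_order' keys match the code further down; everything else ($incrementId, the item fields) is an assumption for illustration, not the actual module code.

// Hypothetical sketch only. Stash the original order's details in the session
// before the reorder is performed.
$originalOrder = Mage::getModel('sales/order')->loadByIncrementId($incrementId);

$items = array();
foreach ($originalOrder->getAllVisibleItems() as $item) {
    $items[] = array('sku' => $item->getSku(), 'qty' => $item->getQtyOrdered());
}

// core/session data lives in $_SESSION, so the getData('editorder') call in
// getPoints() below will see it.
Mage::getSingleton('core/session')->setData('editorder', array(
    'original_order' => $originalOrder->getIncrementId(),
    'items'          => $items,
    'ordered_at'     => $originalOrder->getCreatedAt(),
));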
When we process the order in the checkout, it just creates a new order, but the transaction ID is the same as the previous order's. Since internet sales looks at this transaction ID, they'll know which order to edit; the old order will be cancelled during the next run of the synchronization scripts.
Given all of that, at no time do I ever do anything with the reward history, and it's not always the order you're editing whose reward history entry is pushed up: I edited order 100000040, but 100000042 was the one pushed up.
NOTE: I should also point out that the orders are not storing the same transaction ID at this time, as I am waiting on sample XML to compare to the request/response XML from the payment gateway, so there is no way that an order, once processed, is tied to another order in any way.
The following code is taken from Enterprise_Reward_Model_Action_OrderExtra::getPoints; it is placed between $pointsDelta = $this->getReward()->getRateToPoints()->calculateToPoints((float)$monetaryAmount); and return $pointsDelta;
$editOrder = Mage::getModel('core/session')->getData('editorder');
if (!is_null($editOrder)) {
    // Look up the original order by the increment ID stored in the session.
    $resource       = Mage::getSingleton('core/resource');
    $readConnection = $resource->getConnection('core_read');
    $result = $readConnection->fetchAll(
        "SELECT sfo.entity_id
           FROM " . $resource->getTableName('sales/order') . " AS sfo
          WHERE sfo.increment_id = ?
          LIMIT 1",
        array($editOrder['original_order'])
    );
    if (!empty($result)) {
        $order     = Mage::getModel('sales/order')->load($result[0]['entity_id']);
        $orderData = $order->getData();
        // Original order amount excluding shipping and tax, the same basis as $monetaryAmount.
        $previousOrderMonetaryAmount = $orderData['base_grand_total']
            - $orderData['base_shipping_amount']
            - $orderData['base_tax_amount'];
        // Subtract the points the original order earned from the new points delta.
        $pointsDelta -= $this->getReward()->getRateToPoints()->calculateToPoints((float)$previousOrderMonetaryAmount);
    }
}
Above is a snapshot of what's happening: the top shows what the reward history was before an order was edited and processed, and the bottom shows what happens after (for #100000049 nothing happened, so I had to do it again to get the glitch to show up).
The items in #100000044 are completely different from those in order #100000048. Order #100000046 is missing because when the points delta is 0 (meaning no points will be added or subtracted) it doesn't show, which is good.
All the orders with negative values are processed edited orders of ones that had positive values but had some items removed during the edit, reducing the cart total (in the gateway this would then refund the customer, so it makes sense that they should not keep the points for the amount they were refunded).
The problem is with Magento itself: forcing a clean build of Magento to always have a negative amount in the order, using $pointsDelta = $pointsDelta - 100000, results in exactly the same problem. No idea if Magento will consider it a bug, though, since the only way to have discovered it is via customization of Magento, but I'll report it nonetheless.
Related
I am designing a web store as a project (using PHP Laravel and MySQL, FYI) and I am at the part where I have to create the logic behind the production system, which goes like this:
On my database:
- I have one ORDER table with all the information regarding the shipping, customer, etc.
- I have another table called ITEM which lists all the items in an order (so if an order has 3 items, there will be 3 rows in the ITEM table with a foreign key pointing to the ORDER).
Now I'm creating the PRODUCTION DASHBOARD. Right now I'm able to scan the item ID and get the shipment information on the Dashboard.
After that, for orders with multiple items, what I want is for the system to tell the user to deposit the item in a numbered box to wait for the rest of the items from the order. That way the user can keep scanning items from other orders, and once another item from the order stored in box X is produced, he can scan it and the system will tell him that the other items from the same order are placed in box X. He can keep doing that until the order is complete.
My question is: what would be the best way and logic, database-wise (and also Laravel-wise if you want to expand your answer further, hehe), to implement this BOX system?
I hope my question is clear enough and thank you very much :)
I had a similar system for a project I was working on. What I did was create a database table called temp_orders with a column called items, where each item was separated by a line break. Until the order was finalized (100% processed), the order would remain in temp_orders.
Once finalized, it would get deleted from temp_orders and moved over to the orders table. If I needed to check items, I would explode() the data from the items column in the temp_orders table on the line break, putting the items into an array and then using the data however I needed to.
You need to determine when you want to finalize the order. It could be upon credit card payment, or upon user order confirmation, for example.
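A rough sketch of the explode() approach described above, assuming Laravel's query builder (since the question mentions Laravel) and hypothetical temp_orders column names:

use Illuminate\Support\Facades\DB;

// Fetch the not-yet-finalized order and turn its newline-separated
// items column back into an array.
$row = DB::table('temp_orders')->where('order_id', $orderId)->first();

$items = array_filter(array_map('trim', explode("\n", $row->items)));

foreach ($items as $item) {
    // inspect / display each item as needed
}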
I'm working on a lightweight invoicing system and trying to update my 'items' table correctly, without fully succeeding.
Once an invoice is saved, I need to mark the items as paid that were purchased so far that day for that buyer.
Right now this is my query:
$markitemspaidquery = "UPDATE solditems SET paidstatus='paid', paidtime='$date' WHERE buyerid='$buyer_id'";
This updates the records for the buyer correctly, but the problem is it will mark EVERY item in the table for that buyer 'paid'; even if it is from 3 days ago.
How can I use a query kind of like this one, or achieve this effect?
$markitemspaidquery = "UPDATE solditems SET paidstatus='paid', paidtime='$date' WHERE buyerid='$buyer_id' AND DATE(timesold)=CURDATE() AND paidstatus='unpaid'";
In reality, everything should be paid by the end of the day anyway because of the way the company works, but I would like to know for future reference, since it just uses unnecessary resources to update every item for the buyer.
Here is an example with ORDER BY and LIMIT:
update questions
set prepStatus = 1,
    status_bef_change = status,
    cv_bef_change = closeVotes,
    max_cv_reached = greatest(max_cv_reached, closeVotes)
where status = 'O' and prepStatus = 0
order by qId desc
limit 900;
There is a flaw in your data model: you have sold items and invoices, but no connection between them.
So you muddle your way through by saying: when there is an invoice on a day for a customer, it must be an invoice exactly covering all the items that customer bought on that very day. This is a rule you are making up - the database doesn't tell you this.
So have a reference in your sold items table to the invoice table. Once an invoice is entered in the system, it must be linked to all the sold items it includes. Thus the update is easy, or better yet not necessary, as it is not the single sold item that is paid, but the invoice. So the sold items shouldn't have the paidstatus and paidtime columns; the invoice should.
UPDATE: Here is an example of a working data model. Each item has an invoice number which is null at first. Once the customer checks out, an invoice is written and all the customer's items without an invoice number get this new invoice number (a sketch of that step follows the model below). The invoice's total amount is the sum of its items, i.e. you don't store the sum redundantly.
auction_item
auction_item_no - primary key
customer_no - not null
description - not null
price - not null
invoice_no - nullable (null as long as not checked out)
invoice
invoice_no - composite primary key part 1
customer_no - composite primary key part 2
date_time - not null
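A rough sketch of that checkout step, assuming a PDO connection in $pdo and the table/column names above; $invoiceNo and $customerNo are illustrative:

// Write the invoice, then stamp all of the customer's not-yet-invoiced
// items with the new invoice number, in one transaction.
$pdo->beginTransaction();

$pdo->prepare("INSERT INTO invoice (invoice_no, customer_no, date_time) VALUES (?, ?, NOW())")
    ->execute(array($invoiceNo, $customerNo));

$pdo->prepare("UPDATE auction_item SET invoice_no = ? WHERE customer_no = ? AND invoice_no IS NULL")
    ->execute(array($invoiceNo, $customerNo));

$pdo->commit();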
I am currently trying to implement a trading engine. My code can successfully create and fill limit and market orders. In addition, I would like my engine to have the ability to successfully fill "Fill or Kill" orders. Unfortunately, I have no idea how to write an algorithm for this "Fill or Kill". Does anyone know if there is an efficient algorithm for doing this?
The following is some background for those of you with less trading knowledge. Usually assets to be bought are ordered in a manner such as the following:
Price Amount
$200 3
$300 4
$350 2.5
$400 1.11
If one wants to buy assets, a simple algorithm goes from top to bottom in order to give the customer the best value. A whole row does not have to be completely bought. For example, if I want 4 apples, the code will sell me 3 at $200 and 1 at $300 (going from top to bottom).
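A toy sketch of that top-to-bottom walk (names are illustrative; the rows are assumed to be sorted by price ascending, e.g. as pulled from the SQL table):

// Walk the book from the cheapest row down, taking as much as needed from
// each row; partial rows are allowed for ordinary offers.
function greedyFill(array $book, $wanted)
{
    $fills = array();
    foreach ($book as $row) {                 // $row = array('price' => ..., 'amount' => ...)
        if ($wanted <= 0) {
            break;
        }
        $take    = min($row['amount'], $wanted);
        $fills[] = array('price' => $row['price'], 'amount' => $take);
        $wanted -= $take;
    }
    return $fills;                            // e.g. 4 apples -> 3 @ $200, 1 @ $300
}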
Now, certain sellers may give the option of "Fill or Kill". This means that a given row cannot be bought partially. So now let's say that the second row is designated "Fill or Kill":
Price Amount
$200 3
$300 4 (Fill or Kill)
$350 2.5
$400 1.11
In this case, if I want 4 apples, I cannot buy fewer than 4 from the second row. So now I have two obvious options: I can buy 4 apples from the second row for 4*$300 = $1200, or I can buy three apples from the first row and 1 from the third for a price of (3*$200)+(1*$350) = $950. In this case, the second option is the better deal. Unfortunately, that will not always be the case, depending on the prices and amounts of the orders.
These tables will be implemented in SQL and ordered by price. I am trying to implement this algorithm in php. Any help will be greatly appreciated.
The purpose of a "Fill or Kill" order is to ensure that a position is placed on the market at a desired price. The transaction should therefore be executed immediately and completely, or not enter the market at all. A FOK order prohibits a broker from partially filling it. You need a 1:1 order-to-market match before the order is placed in the queue/book.
The "no partial fulfillment" is what differentiates "Fill Or Kill" (and "All or None") orders from "Immediate Or Cancel" orders. IoC orders allow partial filling and wait in the book to get more stock incrementally until order expiration time.
The difference between FOK and AON orders is that AON orders which cannot be executed immediately remain active until they are executed or cancelled: an AON order keeps waiting in the book for a full match.
A FOK order gets dropped if there is no immediate full match.
Efficient algorithm for Fill Or Kill?
There are two forms: "Multi Fill Or Kill" and "Single Fill Or Kill".
Single Fill Or Kill
This is probably the easier one. It's a 1:1 match of order to market: if there is no direct order match, kill. If you do this with a SQL command, you have WHERE equality comparisons.
Single means the order will be rejected if it cannot be immediately filled against a single order of equal or greater size.
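As a hedged illustration of that single-order check (an assumed order-book table and column names, queried via PDO; not a full matcher):

// Look for one resting sell order that alone covers the incoming buy FOK:
// acceptable price AND equal-or-greater size. No row -> kill.
$stmt = $pdo->prepare(
    "SELECT id, price, amount
       FROM order_book
      WHERE side = 'sell' AND price <= :limit_price AND amount >= :qty
      ORDER BY price ASC
      LIMIT 1"
);
$stmt->execute(array(':limit_price' => $limitPrice, ':qty' => $qty));
$match = $stmt->fetch(PDO::FETCH_ASSOC);

if ($match === false) {
    // no single contra-order can fill it completely -> kill the FOK order
}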
Multi Fill Or Kill
This is probably the algorithm you are after.
Multi means: fill with multiple internal orders.
It's said that a FOK order prohibits a broker from partially filling it.
That's only true for the order book communication to the outside.
Internally, an "immediate internal partial filling" happens until the order is "filled".
For the incoming FOK order, you create an OrderEventGroup.
The matcher might have multiple orders sitting on the book that can satisfy the incoming FOK order, but only IF ADDED TOGETHER. On the MatchEvent, the OrderEventGroup is filled with contra-orders matching the order constraint. You match orders in the order book where the quantity/price is lower than or equal to the requested amount and price. You fill the OrderEventGroup until FOK.order.amount equals OrderEventGroup.amountTogether. If OrderEventGroup.amountTogether doesn't add up, and/or this takes longer than what you defined as "immediate" execution time, the FOK order is killed.
You get a single transaction, which might have multiple matching steps with possibly different prices. But that's not communicated. The order report contains "filled", "price" and "qty" - where price is the average price of the OrderEventGroup. The order submitter would not even know that his order was filled against more than one other order.
The good thing: elimination of execution risk for the order submitter.
The bad thing: you get the average price, without a clue about the min/max prices in the OrderEventGroup or the number of collected contra-orders.
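A rough sketch of that accumulation step. OrderEventGroup, the variable names and the book representation are assumptions for illustration, not a real API:

// Collect resting contra-orders at acceptable prices until their combined
// amount covers the FOK order; otherwise kill it without touching the book.
$group          = array();   // the "OrderEventGroup"
$amountTogether = 0;

foreach ($restingSellOrders as $contra) {      // sorted by price, then time (FIFO)
    if ($contra['price'] > $fokLimitPrice) {
        break;                                 // no more acceptable prices
    }
    $take = min($contra['amount'], $fokAmount - $amountTogether);
    $group[] = array('order_id' => $contra['id'], 'amount' => $take, 'price' => $contra['price']);
    $amountTogether += $take;

    if ($amountTogether >= $fokAmount) {
        break;                                 // fully covered -> fill as one transaction
    }
}

if ($amountTogether < $fokAmount) {
    $group = array();                          // cannot be filled completely -> kill
}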
Fill And Kill
Then, there is "Fill and Kill".
The order is filled to the extent of the quantity that can be immediately filled at the requested price. The remainder of the order is killed/cancelled. These orders might end with the status "partially filled". The algorithm is the same as for Multi Fill Or Kill above, except that FOK.order.amount doesn't have to equal OrderEventGroup.amountTogether; OrderEventGroup.amountTogether is allowed to be lower (partial fill).
The matching algorithm used is mainly a pure time-priority (FIFO) algorithm, because this kind of algorithm maximizes the number of effective orders. (Some markets use pro-rata matching, and somewhere in between are the LIFFE and CME algorithms, both combining elements of FIFO and pro-rata.)
There should be a pseudo-polynomial dynamic programming solution, similar to the one for the knapsack problem. Maintain a table giving, for every quantity <= the desired quantity, the lowest possible cost at which that quantity can be bought. Start with all entries in the table set to the best you can get using only ordinary offers. Take each fill-or-kill offer in turn and use it to amend the table: the new cheapest price for quantity k is min(previous price for k, previous price for k-x plus the cost of the x shares provided by the offer you are currently considering). Keep enough backtracking information to work back to a combination of offers from the cheapest price for the target quantity after considering all fill-or-kill offers. This could be, for instance, a note for each offer of the quantities for which it forms part of the cheapest price.
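A sketch of that table-building idea in PHP. The names are illustrative; it assumes whole-unit quantities (which the pseudo-polynomial table requires) and omits the backtracking bookkeeping:

// dp[$q] = lowest cost at which quantity $q can be bought.
function cheapestCost(array $ordinary, array $fillOrKill, $wanted)
{
    // Base table: ordinary (divisible) offers only, filled greedily from the
    // cheapest row, which is optimal when rows may be bought partially.
    usort($ordinary, function ($a, $b) {
        return $a['price'] < $b['price'] ? -1 : ($a['price'] > $b['price'] ? 1 : 0);
    });

    $dp    = array_fill(0, $wanted + 1, INF);
    $dp[0] = 0.0;
    for ($q = 1; $q <= $wanted; $q++) {
        $left = $q;
        $cost = 0.0;
        foreach ($ordinary as $row) {
            $take  = min($row['amount'], $left);
            $cost += $take * $row['price'];
            $left -= $take;
            if ($left <= 0) {
                $dp[$q] = $cost;
                break;
            }
        }
    }

    // Amend the table with each fill-or-kill offer (all-or-nothing), iterating
    // quantities downwards so each offer is used at most once.
    foreach ($fillOrKill as $offer) {
        $amount = $offer['amount'];            // assumed integer
        $cost   = $amount * $offer['price'];
        for ($q = $wanted; $q >= $amount; $q--) {
            $dp[$q] = min($dp[$q], $dp[$q - $amount] + $cost);
        }
    }

    return $dp[$wanted];                       // INF means the quantity is unreachable
}

With the book from the question, this returns $950 for 4 apples (3 @ $200 plus 1 @ $350), which beats taking the $1200 fill-or-kill row.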
You can skip the offer with the "Fill or Kill" attribute and pick the next offer instead, then repeat with the skipped offer included, and take the cheaper of the two solutions.
First, FOK don't sit on the order book. They are like IOC (Immediate Or Cancel) and have a Time In Force of zero. If they don't execute, they are cancelled immediately.
All Or None orders, which are even less supported and less standard, might have a TIF of end of day or good-till-cancel. In that case they are not displayed and will execute when they are able to.
Now with that out of the way: are you trying to obey securities law, or is this just a toy system? FOK orders aren't as standardized as Limit, Market, or IOC orders. Some venues may not even implement them, and some have different definitions if they do.
But if you are trying to obey Reg NMS, you cannot skip the inside of the book to fill deeper (the trade-through requirement). I only ever see FOK orders filled at the NBBO (the best bid/offer, the inside), and I think I've only used them at one place that I can remember.
So the algorithm would be simple: see if the inside of the book has enough size and, if so, execute against it. If you wanted to be a little more lenient, you could work down to the next level; however, under Reg NMS a venue couldn't do that if another venue was also showing size at that level (trade-through strikes again). That is kind of the point of the order type, though: to prevent leakage of your intent to buy/sell.
Horrible title I know, I couldn't find a better way to articulate my problem.
I searched through SO without success, but if the answer's already out there, feel free to link it and I will read it thoroughly.
I'm building a book ordering system. People will order books throughout the month, and one large order arrives at the end of every month.
Orders get recorded into an Order table with the following fields:
order_id, book_id, quantity, language, person_ordering, timestamp, month, year
When orders arrive, they are entered into a Received table with the following fields:
book_id, quantity, language
Now suppose one person orders (2) copies of book 1. And another person orders (3). And another (5). For a grand total of (10).
Then the order arrives and it only has 7 copies.
I'm trying to write a script/function that will find out:
Which persons will receive their copies of the book (it's first come, first served, so the people that ordered it first will have their orders fulfilled first).
If a person can only have their order partially fulfilled, how to update the Order table (or possibly a new Pending Orders table is needed?) to reflect that they have X copies still waiting to be fulfilled. Then the following month their orders would be fulfilled first, again based on date ordered.
I thought about pulling the information from the Orders table based on time-stamp of when the order was made and sorting through it, then pulling the information out of the Received table and somehow comparing the two arrays and updating a third array?
Or perhaps there's a simpler solution that I'm missing.
If more information is needed, I will gladly provide.
I've been pulling my hair out over this problem for 2 days straight.
Any help would be appreciated.
Thanks!
To me, it sounds like you could do away with your timestamp, month, and year variables and replace them with a single datetime stamp.
Also, add a column in your Received table for the datetime stamp of collection.
Finally, make another table to keep track of fulfilled orders.
Do a SELECT * on the month needed, ordered by the datetime stamp ascending (oldest first, since it's first come, first served), for your "orders" data.
In a separate query get your "received" data for programmatic comparison using the correct month.
Put the "received" quantity of each "received" book_id into separate variables (probably a key/value array).
For each "received" book_id, compare and compute the "ordered" quantity needed from each order in a sub-loop, updating the "received" quantity value. At the end of each iteration of your sub-loop, if your "ordered" quantity does not equal 0, then you need to add another entry for the next month with the remaining quantity, also enter all fulfilled orders into your new table.
* You **will not** want to modify any existing records, as you will almost certainly need this untouched in the future.
Clear as mud? :P
* Database tip - Every entry of every record should **always** have a datetime stamp and when possible, a "who did it".
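A rough sketch of that comparison loop; the exact schema, the fulfilled-orders table and the key/value shape are assumptions:

// $received: book_id => quantity received this month (the key/value array above)
// $orders:   Order rows for the month, sorted by datetime ascending (first come, first served)
foreach ($orders as $order) {
    $bookId    = $order['book_id'];
    $available = isset($received[$bookId]) ? $received[$bookId] : 0;
    $given     = min($available, $order['quantity']);

    if ($given > 0) {
        $received[$bookId] -= $given;
        // ... insert a row into the new "fulfilled orders" table for $given copies
    }

    $short = $order['quantity'] - $given;
    if ($short > 0) {
        // ... insert a new Order entry for next month for the $short remaining
        //     copies, leaving the original record untouched (as advised above)
    }
}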
You can try adding an extra field to the Received table.
When the order comes in, you can calculate the number of copies that can be provided at that time:
   book_id  quantity  qty_available  language
1. B1       2         2              EN
2. B1       3         3              EN
3. B1       5         2              EN
Assuming that the Received table is for current stock, and that orders are only shipped when the entire order can be shipped, here's a suggestion for what you can do:
Add a field to Order for whether the Order has been fulfilled.
Select from Order those who have ordered the book, ordered by date. You can limit by the quantity in Received.
Start fulfilling the Orders and marking them as such until the quantity from Received is reached.
If what is left of the Received quantity is not enough to fulfill an Order, leave it, and leave the remaining quantity in Received for next time.
Hope it makes sense.
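A minimal sketch of those steps, assuming whole orders only (as stated above); $pdo is a PDO connection, and the fulfilled flag and column names are illustrative:

$left = $receivedQty;   // copies of this book_id in Received

$stmt = $pdo->prepare(
    "SELECT order_id, quantity
       FROM `Order`
      WHERE book_id = ? AND fulfilled = 0
      ORDER BY ordered_at ASC"
);
$stmt->execute(array($bookId));

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $order) {
    if ($order['quantity'] > $left) {
        break;   // can't ship this one in full: leave it, and keep $left in Received for next time
    }
    $pdo->prepare("UPDATE `Order` SET fulfilled = 1 WHERE order_id = ?")
        ->execute(array($order['order_id']));
    $left -= $order['quantity'];
}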
I am building a platform on PHP/MySQL that allows users to bid for a percentage of a product. Basically, they choose what percentage they want to bid for and the markup price they'd like to bid at, e.g. 1.2.
The issue I have is that if there are several bids placed simultaneously for the same product, I need to queue the bids so when processed only the valid ones go through. For example, if there is 20% available at a value of 1.2 and two users simultaneously bid for 20% at 1.2, the process for each bid would be:
1. Select the current bids
2. PHP processing to work out if the bid is still available
3. Place the bid
The issue here is that if both of the 'check if available' steps happen before either of the 'place bid' steps, then both bids will go through and effectively cause a stock issue.
Is there a way of queuing this process so that the whole of steps 1-3 happens for one bid before the next one can run?
My initial thought was to use a cache table to store all the bids and then run a cron every few seconds which processes them in order of ID from the cache table and notifies accordingly, but this seems like it may not be viable on a high-traffic system, as the server would get hammered.
Any help much appreciated
If two bids can never be identical, I would solve it at the data layer: create a unique key on your bid table based on item_id, bid_value and whatever else needs to be unique. The second insert will fail and you can let the user know they were beaten to the post.
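A sketch of that approach; the table and column names are assumptions, and $pdo is assumed to be a PDO connection set to throw exceptions:

// One-off schema change: no two rows may share the same item + bid values.
//   ALTER TABLE bids ADD UNIQUE KEY uniq_bid (item_id, percentage, markup);

try {
    $pdo->prepare("INSERT INTO bids (item_id, user_id, percentage, markup) VALUES (?, ?, ?, ?)")
        ->execute(array($itemId, $userId, $percentage, $markup));
} catch (PDOException $e) {
    if (isset($e->errorInfo[1]) && $e->errorInfo[1] == 1062) {   // MySQL duplicate-key error
        // the other simultaneous bid won - tell the user they were beaten to the post
    } else {
        throw $e;
    }
}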
I solved this at the PHP level by inserting the bid as an 'accepted bid' first, so that any subsequent bids would fail. Then I ran a check to see if the bid was OK; if not, the bid gets removed, and if it is, the table gets updated as required, in place for new bids.