Inventory system database design - php

I am trying to build an inventory system for a company. When a user enters an item into the inventory there is no problem, as the table shown below handles that. However, how do I know how many items are left when some of them are sold? For example, if I purchased 100 bags and Mr Y purchased 20 bags, how will the system show 80 bags left? Any help would be appreciated. Thanks.
CREATE TABLE `inventory` (
`inv_id` int(11) NOT NULL AUTO_INCREMENT,
`inv_reference_no` varchar(40) NOT NULL,
`inv_part_no` varchar(100) NOT NULL,
`inv_category_id` int(11) NOT NULL,
`inv_product_name` varchar(200) NOT NULL,
`inv_quantity` int(11) NOT NULL,
`inv_description` varchar(500) NOT NULL,
`inv_cost_price` float(12,2) NOT NULL,
`inv_cost_sub_total` float(14,2) NOT NULL,
`inv_product_type` enum('cons','serv','stock') NOT NULL,
`inv_date` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (`inv_id`)
) ENGINE=InnoDB AUTO_INCREMENT=7 DEFAULT CHARSET=utf8

You can add a field, say "in_stock", to your table to store the quantity of products left.
Or you can decrement your existing quantity field every time an item is purchased.
Hope it helps.

Well, it should be straightforward, as you already have an 'inv_quantity' field that you can update; I'm not sure whether the trouble is with figuring out how to update it.
Here is some partial pseudo-code that could help. It would start out as UPDATE inventory SET inv_quantity = inv_quantity - quantity_purchased WHERE inv_id = purchased_product_id.
Soon after, you would SELECT inv_quantity FROM inventory WHERE inv_id = purchased_product_id. The values of quantity_purchased and purchased_product_id have to be supplied by you, however you are sending form data in your application.
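Putting those two statements together, a minimal sketch in SQL, assuming the inventory table above; the values 20 (quantity purchased) and 6 (inv_id) are placeholders for whatever your form submits, and the transaction keeps the decrement and the read-back consistent:
START TRANSACTION;
-- subtract the sold quantity from the stock on hand
UPDATE inventory
SET inv_quantity = inv_quantity - 20
WHERE inv_id = 6
AND inv_quantity >= 20;   -- guard against selling more than is in stock
-- read back what is left (80 in the 100-bag example)
SELECT inv_quantity
FROM inventory
WHERE inv_id = 6;
COMMIT;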


Speed up MySQL inner join with LIKE clause

I have the following 2 tables: api_analytics_data and telecordia.
CREATE TABLE `api_analytics_data` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`upload_file_id` bigint(20) NOT NULL,
`partNumber` varchar(100) DEFAULT NULL,
`clei` varchar(45) DEFAULT NULL,
`description` varchar(150) DEFAULT NULL,
`processed` tinyint(1) DEFAULT '0',
PRIMARY KEY (`id`),
KEY `idx_aad_clei` (`clei`),
KEY `idx_aad_pn` (`partNumber`),
KEY `id_aad_processed` (`processed`),
KEY `idx_combo1` (`partNumber`,`clei`,`upload_file_id`)
) ENGINE=InnoDB CHARSET=latin1;
CREATE TABLE `telecordia` (
`tid` int(11) NOT NULL AUTO_INCREMENT,
`ProdID` varchar(50) DEFAULT NULL,
`Mfg` varchar(20) DEFAULT NULL,
`Pn` varchar(50) DEFAULT NULL,
`Clei` varchar(50) DEFAULT NULL,
`Series` varchar(50) DEFAULT NULL,
`Dsc` varchar(50) DEFAULT NULL,
`Eci` varchar(50) DEFAULT NULL,
`AddDate` date DEFAULT NULL,
`ChangeDate` date DEFAULT NULL,
`Cost` float DEFAULT NULL,
PRIMARY KEY (`tid`),
KEY `telecordia.ProdID` (`ProdID`) USING BTREE,
KEY `telecordia.clei` (`Clei`),
KEY `telecordia.pn` (`Pn`),
KEY `telcordia.eci` (`Eci`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
Users upload data via a web interface using Excel/CSV files into api_analytics_data. The data contains EITHER the partNumbers or the CLEIs. I then update the api_analytics_data table by joining against the telecordia table. The telecordia table is the master list of partNumbers and CLEIs.
So if a user uploads a file of CLEIs, the update/join I use is:
update api_analytics_data aad
inner join telecordia t on aad.clei = t.Clei
set aad.partNumber = t.Pn
where aad.partNumber is null
and aad.upload_file_id = 5;
It works quickly, but not very thoroughly. The problem I have is that the CLEI uploaded may only be a substring of the CLEI in the telecordia table.
For example, the uploaded CLEI may be "5SC1DX0". In the telecordia table, the correct matching row is:
tid: 184324
ProdID: 472467
Mfg: PLSE
Pn: AUA58-2-REV-E
Clei: 5SC1DX04AA
Series: null
Dsc: DL SGL-PTY POTS CU RT
Eci: 205756
AddDate: 1994-03-18
ChangeDate: 1998-04-13
Cost: null
So obviously my update doesn't work in this case, even though 5SC1DX0 and 5SC1DX04AA are the same part.
What I need is a wildcard search. However, when I try this, it is crazy slow. With about 4500 rows uploaded into the api_analytics_data table, it runs for about 10 minutes, and then loses the connection with the server.
update api_analytics_data aad
inner join telecordia t on aad.clei like concat(t.Clei,'%')
set aad.partNumber = t.Pn
where aad.partNumber is null
and aad.upload_file_id = 5;
Is there a way to optimize this so that it runs quickly?
The correct answer is "no". The better course of action is to create a new column in telecordia with the correct Clei value in it, one that can be used for joining the tables. In the most recent versions of MySQL, this can even be a computed column and be indexed.
That said, you might be able to do something if the matching portion is always the same length. If so, try this:
update api_analytics_data aad
inner join telecordia t on t.Clei = left(aad.clei, 7)
set aad.partNumber = t.Pn
where aad.partNumber is null
and aad.upload_file_id = 5;
For this query, you want an index on api_analytics_data(upload_file_id, partNumber, clei) and telecordia(clei, pn).
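If you go the computed-column route mentioned above, here is a sketch of what it could look like, assuming MySQL 5.7 or later and that the uploaded CLEIs are always the first 7 characters of the full CLEI (clei_prefix and the index names are made up here):
ALTER TABLE telecordia
ADD COLUMN clei_prefix varchar(7) GENERATED ALWAYS AS (LEFT(Clei, 7)) STORED,
ADD KEY idx_telecordia_clei_prefix (clei_prefix, Pn);
-- plus the indexes suggested above:
ALTER TABLE api_analytics_data
ADD KEY idx_aad_file_pn_clei (upload_file_id, partNumber, clei);
-- the join condition then becomes a plain, indexable equality:
-- on t.clei_prefix = aad.clei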

MYSQL will not return more than 1 category

I've tried to find an answer to this problem, but I can't.
I have a three-table MySQL database.
I use one table for all of the product information, CarpetInfo,
one table to list my categories, CarpetCategories,
and one table to attach categories to the products, CarpetCategorySort.
My CarpetCategorySort table has 3 columns, Manufacturer, Style, CategoryID.
Example would be Manufacturer = Aladdin, Style = Alma Mater, CategoryID = 14/ 15/ 18/ 19/ 20/ 21/ 67/
My CarpetCategories Table has 2 Columns CategoryID and Category.
2 Examples would be CategoryID = 14, Category = Commercial &
CategoryID = 15, Category = Commercial Loop
I can only get the code to work when I type in Commercial into the $category variable below. The code will not work if I type Commercial Loop into the $category variable. It's like it will only pull in the first number 14 and all of the others are ignored. The pricing order and everything else works right, just not the CategoryID part.
Here is my code.
<?php $mill = "Aladdin"; $category = "Commercial Loop";
$order = mysqli_query($con, "
SELECT * FROM CarpetInfo JOIN CarpetCategorySort USING (Manufacturer, Style)
JOIN CarpetCategories USING (CategoryID)
WHERE Manufacturer='$mill' AND Category LIKE '%$category%'
order by Price = 0, Price asc,
Style asc");
include($_SERVER['DOCUMENT_ROOT'].'/includes/pricing/carpet-order-test.htm');?>
Any help is greatly appreciated!
That's a backward way of handling things; a field with several data points smooshed together like that ("14/ 15/ 18/ ...") almost never has any justification for existing. I'd list the carpets in one table, list the categories in another, and finally list cross-references between the two in a junction table, where you get a many-to-one relationship at both ends (a many-to-many overall).
You need an actual category table only if duplicating the category info for each row would take too much room, and/or you can't control user input for categories (i.e., you aren't just using a pulldown to pick a name).
Something like:
CREATE TABLE IF NOT EXISTS `carpets` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`name` varchar(50) NOT NULL,
`abbrev` varchar(10),
`description` varchar(250),
[...]
`updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=latin1;
CREATE TABLE IF NOT EXISTS `carpet_categories` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`carpet_id` int(11) NOT NULL,
`category_id` int(11) NOT NULL,
[...]
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=latin1;
CREATE TABLE IF NOT EXISTS `carpet_category_info` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`price-per-sqf` int(11) NOT NULL,
`name` varchar(50),
[...]
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=latin1;
Then all your joins become simple, fast, and accurate.
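With that layout, filtering by category becomes a plain equality join instead of a LIKE against a packed string. A minimal sketch, assuming carpet_categories.category_id references carpet_category_info.id and that the category name ("Commercial Loop") lives in carpet_category_info.name:
SELECT c.*
FROM carpets c
JOIN carpet_categories cc ON cc.carpet_id = c.id
JOIN carpet_category_info cci ON cci.id = cc.category_id
WHERE cci.name = 'Commercial Loop'
ORDER BY c.name;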

What's the best way for generating mysql reports in huge databases?

I am wondering about the best way to generate MySQL reports from large databases, as I have a database for a POS application with more than 40 stores working on it.
The database has more than 1.5M rows in each of four tables:
two for headers and the others for details.
I generate reports by joining headers with details and some other tables to get the full info for the view.
I tried archiving the data into one table that holds all the data required for reporting, but I found it puts a huge load on the server, and the MySQL events that copy data into that table do not always run, which may lead to data loss.
I've also tried indexing the tables, but it didn't help much, as the queries are too big and take too much time, which leads to heavy load on the server and can make the application stop responding at all.
I searched Google and found some ideas about partitioning tables, others about archiving, changing the storage engine, or even upgrading the server.
The relation between the two tables (invoice_header and invoice_detail) is one-to-many: invoice_header is the header of an invoice, with only totals, and is linked to invoice_detail by location ID (loc_id) and invoice number (invo_no), as each location has its own serial numbering. invoice_detail contains the line items of each invoice.
Sample query, which takes too long (15 - 20 seconds) to fetch:
- Total rows: 1495873
- Total fetched rows: 9 - 12
SELECT SUM(invoice_detail.qty) AS qty, Month(invoice_header.date) AS month
FROM invoice_detail
JOIN invoice_header ON invoice_detail.invo_no = invoice_header.invo_no
AND invoice_detail.loc_id = invoice_header.loc_id
WHERE invoice_detail.item_id = {$itemId}
GROUP BY Month(invoice_header.date)
ORDER BY Month(invoice_header.date)
EXPLAIN:
invoice_header table structure:
CREATE TABLE `invoice_header` (
`invo_type` varchar(1) NOT NULL,
`invo_no` int(20) NOT NULL AUTO_INCREMENT,
`invo_code` varchar(50) NOT NULL,
`date` date NOT NULL,
`time` time NOT NULL,
`cust_id` int(11) NOT NULL,
`loc_id` int(3) NOT NULL,
`cash_man_id` int(11) NOT NULL,
`sales_man_id` int(11) NOT NULL,
`ref_invo_no` int(20) NOT NULL,
`total_amount` decimal(19,2) NOT NULL,
`tax` decimal(19,2) NOT NULL,
`discount_amount` decimal(19,2) NOT NULL,
`net_value` decimal(19,2) NOT NULL,
`split` decimal(19,2) NOT NULL,
`qty` int(11) NOT NULL,
`payment_type_id` varchar(20) NOT NULL,
`comments` varchar(255) NOT NULL,
PRIMARY KEY (`invo_no`,`loc_id`)
) ENGINE=InnoDB AUTO_INCREMENT=20286 DEFAULT CHARSET=utf8
invoice_detail table structure:
CREATE TABLE `invoice_detail` (
`invo_no` int(11) NOT NULL,
`loc_id` int(3) NOT NULL,
`serial` int(11) NOT NULL,
`item_id` varchar(11) NOT NULL,
`size_id` int(5) NOT NULL,
`qty` int(11) NOT NULL,
`rtp` decimal(19,2) NOT NULL,
`type` tinyint(1) NOT NULL,
PRIMARY KEY (`invo_no`,`loc_id`,`serial`),
KEY `item_id` (`item_id`),
KEY `size_id` (`size_id`),
KEY `invo_no` (`invo_no`),
KEY `serial` (`serial`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8
After adding "Extract":
EXPLAIN SELECT SUM(invoice_detail.qty) AS qty, MONTH(invoice_header.date) AS month
FROM invoice_detail
JOIN invoice_header ON invoice_detail.invo_no = invoice_header.invo_no
AND invoice_detail.loc_id = invoice_header.loc_id
WHERE invoice_detail.item_id = 11321
GROUP BY EXTRACT(YEAR_MONTH FROM invoice_header.date)
I am using quite a good dedicated server with:
Intel Xeon Quad Core 3.3GHz (8 threads)
1 Gbps Uplink
16 GB RAM
1,000 GB RAID-1 Drives
25 TB Bandwidth
Any suggestions?
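One detail worth checking before anything bigger: item_id is declared varchar(11), but the query compares it to an unquoted number (item_id = 11321), which forces MySQL to convert the column value on every row and prevents it from using the item_id index; quoting the value avoids that. Beyond that, a composite index covering the filter and the join columns lets the invoice_detail side of the join be read from the index alone. A sketch, assuming the table definitions above (the index name is made up):
ALTER TABLE invoice_detail
ADD KEY idx_detail_item_join (item_id, loc_id, invo_no, qty);
-- and quote the varchar literal in the query:
-- WHERE invoice_detail.item_id = '11321'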

CakePHP Voucher/Discount code system

I looked around Google for samples/solutions for this, for example Creating Discount Code System (MySQL/php), but I haven't found a good solution.
My situation is that I have a platform where the user has a balance in a virtual currency and can buy virtual items with it. Now there's a wish to implement vouchers and discounts. There would be different kinds of codes, e.g. one that gives a 50% discount on purchases, one that gives x extra items (with or without a minimum item amount), a code that simply grants some currency, or a referral code that gives the referrer something.
I have implemented it as Campaign and CampaignType, where the first holds the campaign info and the second holds the action info.
Here's the structure:
-- Table structure for table `cake_campaigns`
CREATE TABLE IF NOT EXISTS `cake_campaigns` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`name` varchar(50) CHARACTER SET utf8 NOT NULL,
`code` varchar(100) COLLATE utf8_bin NOT NULL,
`type_id` varchar(50) CHARACTER SET utf16 COLLATE utf16_bin NOT NULL DEFAULT '1',
`value` int(10) unsigned NOT NULL DEFAULT '5' COMMENT 'Percentage or amount',
`min_amount` bigint(20) unsigned NOT NULL DEFAULT '0',
`owner_id` bigint(20) unsigned NOT NULL,
`created` datetime NOT NULL,
`active` tinyint(1) unsigned NOT NULL DEFAULT '1',
`single_use` tinyint(1) unsigned NOT NULL DEFAULT '0',
PRIMARY KEY (`id`),
UNIQUE KEY `code` (`code`),
KEY `owner_id` (`owner_id`),
FULLTEXT KEY `name` (`name`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin AUTO_INCREMENT=4 ;
-- Table structure for table `cake_campaign_types`
CREATE TABLE IF NOT EXISTS `cake_campaign_types` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`name` varchar(50) CHARACTER SET utf8 NOT NULL,
`unit` varchar(10) CHARACTER SET utf16 NOT NULL DEFAULT '%',
`multiplier` double(10,8) NOT NULL DEFAULT '0.01000000',
`type` varchar(50) COLLATE utf8_bin DEFAULT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `name` (`name`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin AUTO_INCREMENT=7 ;
Currently my logic is that when a campaign is used, the action taken depends on the CampaignType's name; for example, in the purchase logic:
if (isset($this->request->data['Purchase']['code'])) {
$code = $this->request->data['Purchase']['code'];
$campaign = $this->Campaign->findByCode($code);
$this->Campaign->id = $campaign['Campaign']['id'];
// campaign no longer active
if ($this->Campaign->field('active') == 0) $code = false;
if ($this->CampaignLog->find('first', array('conditions' => array(
'user_id' => $this->User->field('id'),
'campaign_id' => $this->Campaign->field('id'),
'activated' => 1,
)))) $code = false; // code has already been used
unset($this->request->data['Purchase']['code']);
} else $code = false;
// Some purchasing logic here
if ($code) {
$this->CampaignLog->create();
$this->CampaignLog->save(array(
'campaign_id' => $this->Campaign->field('id'),
'user_id' => $this->User->field('id'),
'activated' => 1,
'source' => $this->Session->read('referrer'),
'earnings' => $earned,
'created' => strftime('%Y-%m-%d %H:%M:%S'),
));
if ($this->Campaign->field('single_use') == 1) {
$this->Campaign->saveField("active", 0);
}
// Apply code here
}
Now, my question is:
What would be the best way to go about applying those codes? I'm a bit queasy about walking through all the possible code types with if-then-else or switch-case, but right now, since there are so many things that can differ (e.g. a discount can be a percentage or a set amount), that seems to be the only option.
Maybe the structure/logic of the codes should be different?
It's already fairly straightforward from my point of view; integrating it with the purchase flow is the best way to surface further problems. Assume $this->request->data['price'] holds the price and that a type_id of 1 represents a percentage discount.
All we have to do is read the campaign's value and apply the percentage, something like:
$discountRate = $this->Campaign->field('value') / 100;   // e.g. 50 => 0.50
$finalPrice = $this->request->data['price'] * (1 - $discountRate);
It's better to implement this in a switch-case to isolate the logic of each type. The details depend on how you implement it, but that's the gist of the concept.
Check the cart plugin: it uses events for everything and nothing is hardcoded, which makes it pretty flexible.
There is one method that is fired whenever the cart needs to be recalculated; inside, it calls other methods to calculate taxes and discounts.
Implementing this through events has the advantage that it is very easy to extend with additional discounts or tax calculations later.
Feel free to review the whole code of the plugin; it is not a trivial implementation, covers cookie, session, and database storage for the cart data, and has events for a lot of things.

deleting inactive products from oscommerce

I am using the latest osCommerce.
I have a huge number of inactive products and I want to remove them. Going through the admin one at a time is really slow.
I thought that if I create a new temp category and move all inactive products into it, I can then easily delete them from the osCommerce back end. Doing this will also remove the associated images.
Products are identified by product id, and the category association is done in the products_to_categories table. Inactive products have products_status = 0.
CREATE TABLE IF NOT EXISTS `products` (
`products_id` int(11) NOT NULL AUTO_INCREMENT,
`products_quantity` int(4) NOT NULL,
`products_model` varchar(64) COLLATE utf8_unicode_ci DEFAULT NULL,
`products_ean` varchar(64) COLLATE utf8_unicode_ci NOT NULL,
`google_product_category` varchar(300) COLLATE utf8_unicode_ci DEFAULT NULL,
`products_image` varchar(64) COLLATE utf8_unicode_ci DEFAULT NULL,
`products_price` decimal(15,4) NOT NULL,
`products_date_added` datetime NOT NULL,
`products_last_modified` datetime DEFAULT NULL,
`products_date_available` datetime DEFAULT NULL,
`products_weight` decimal(5,2) NOT NULL,
`products_status` tinyint(1) NOT NULL,
`products_tax_class_id` int(11) NOT NULL,
`manufacturers_id` int(11) DEFAULT NULL,
`products_ordered` int(11) NOT NULL DEFAULT '0',
`products_last_import` datetime DEFAULT NULL,
`products_submit_google` smallint(6) NOT NULL DEFAULT '1',
`icecat_prodid` int(10) unsigned NOT NULL,
`vendors_id` int(11) DEFAULT '1',
`products_availability` smallint(6) NOT NULL DEFAULT '0',
PRIMARY KEY (`products_id`),
KEY `idx_products_model` (`products_model`),
KEY `idx_products_date_added` (`products_date_added`),
KEY `idx_icecat_prodid` (`icecat_prodid`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=292067 ;
CREATE TABLE IF NOT EXISTS `products_to_categories` (
`products_id` int(11) NOT NULL,
`categories_id` int(11) NOT NULL,
PRIMARY KEY (`products_id`,`categories_id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
I have tried the following query, but I get an error: #1062 - Duplicate entry '276917-29240' for key 'PRIMARY'
Update products p ,products_to_categories pc
set pc.categories_id = 29598
where p.products_id = pc.products_id
and p.products_status = 0
You most likely have a product that is linked to more than one category.
Example: product ABC with products_id = 123 exists twice in the products_to_categories table if it is in two categories (say categories_id 222 and 333), so you have two entries in the table, 123-222 and 123-333.
When your update first encounters product/category 123-222, it changes that row to 123-29598. When it then encounters 123-333, it also tries to change that row to 123-29598, which violates your primary key and causes the error you see.
Perhaps in your script you can check whether the product (123) is already in the temp category, and if so remove the second entry (123-333) rather than changing its category to 29598. See here for information on deleting entries with the same id from your table.
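If keeping the old category links for inactive products is not important, another way to sidestep the duplicate-key problem entirely is to drop all of their links first and then add a single link to the temp category. A sketch, assuming 29598 is the temp category:
-- remove every existing category link for inactive products
DELETE pc
FROM products_to_categories pc
JOIN products p ON p.products_id = pc.products_id
WHERE p.products_status = 0;
-- re-link each inactive product to the temp category exactly once
INSERT INTO products_to_categories (products_id, categories_id)
SELECT products_id, 29598
FROM products
WHERE products_status = 0;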
