I have 4 arrays:
qb_array = { sku, stock }, size: 20803
valid_array = { sku, name, price, status }, size: 199803
By intersecting qb_array and valid_array based on sku, I then have:
intersect_sku_array = { sku }, size: 18795
Now, there is some data that I want to grab out of my first 2 arrays.
From qb_array, I want stock.
From valid_array, I want name, price, and status.
Then I decided to create my 4th array and call it:
inserted_array (because I will use this array to insert into my database)
I tried to construct it, and now I am stuck.
Here is my approach:
First, I did:
foreach ($intersect_sku_array as $key) {
    $inserted_array[] = $key;
}
So far so good: everything works, and when I dd($inserted_array); I see all the stuff in it.
Then, moving on to add the other 3 fields from my valid_array:
foreach ($valid_array as $key => $value) {
    if (in_array($key, array_map('strtolower', $intersect_sku_array))) {
        $inserted_array[] = $value['name'];
        $inserted_array[] = $value['price'];
        $inserted_array[] = $value['status'];
    }
}
Then I did dd($inserted_array); at the end, and it hangs on me.
After about 5 minutes I got this error:
Maximum execution time of 300 seconds exceeded
Is it because I have too much data, or is my code stuck in an infinite loop somehow?
Can someone please explain all of this in detail?
Maybe this would help:
foreach ($intersect_sku_array as $sku) {
    // array_column() + array_search() find the index of the row matching this sku
    $qbRow    = $qb_array[array_search($sku, array_column($qb_array, 'sku'))];
    $validRow = $valid_array[array_search($sku, array_column($valid_array, 'sku'))];
    $inserted_array[] = array($sku, $qbRow['stock'], $validRow['name'], $validRow['price'], $validRow['status']);
}
Although I think it would be easier for you to use named arrays like the following:
$qb_array = ['sku' => ['stock' => 'actual stock']];
$valid_array = ['sku' => ['name' => 'actual name', 'price' => 'actual price', 'status' => 'actual status']];
Like so.
For one thing, you are running the lowercasing (array_map('strtolower', ...)) inside the loop, so the whole lowercased array is rebuilt on every iteration, which is certainly not the best way to save time.
Then you are constructing a "flat" array: first all the skus, then (name, price, status) triplets appended in sequence, and no stock at all.
I would rather have the database do a join on both tables, since SQL is a much better tool than PHP for that kind of job.
Now if for some reason you can't do that, better use sku as a key for your two arrays.
foreach ($qb_array as $val) $k_qb[strtolower($val['sku'])] = $val['stock'];
foreach ($valid_array as $val) $k_va[strtolower($val['sku'])] = array($val['name'], $val['price'], $val['status']);
(if you must lowercase something, better do it late than never, but frankly this should also be done in the database, unless you're forced to work with garbage data)
Then you can do the join manually without any intermediate intersection array, like so:
foreach ($k_qb as $sku => $stock) // iterate the smaller array
{
    if (!isset($k_va[$sku])) continue; // skip non-matching records
    $record = $k_va[$sku];
    $inserted_array[] = array(
        'sku' => $sku,
        'stock' => $stock,
        'name' => $record[0],
        'price' => $record[1],
        'status' => $record[2]);
}
With both arrays keyed by sku, the lookups become hash lookups, so the algorithm is roughly O(M + N) instead of O(M·N), where M and N are the sizes of your two arrays.
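For completeness, here is a minimal sketch of the SQL-join route mentioned above; the table names (qb_items, valid_items) and the connection details are made up for illustration, not taken from the question:

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'secret');

// One query returns the joined, case-insensitive result in one go
$sql = "SELECT q.sku, q.stock, v.name, v.price, v.status
        FROM qb_items AS q
        JOIN valid_items AS v ON LOWER(v.sku) = LOWER(q.sku)";

$inserted_array = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);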
Related
I am saving a complex dataset in Laravel 4.2 and I am looking for ways to improve this.
$bits has several $bobs. A single $bob can be one of several different classes. I am trying to duplicate a singular $bit and all its associated $bobs and save all of this to the DB with as few calls as possible.
$newBit = $this->replicate();
$newBit->save();
$bobsPivotData = [];
foreach ($this->bobs as $index => $bob) {
$newBob = $bob->replicate();
$newBobs[] = $newBob->toArray();
$bobsPivotData[] = [
'bit_id' => $newBit->id,
'bob_type' => get_class($newBob),
'bob_id' => $newBob->id,
'order' => $index
];
}
// I now want to save all the $bobs kept in $newBobs[]
DB::table('bobs')->insert($newBobs);
// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
My problem is that I can't access $newBob->id inside the loop, because the $newBobs are only inserted after the loop.
I am looking for how best to reduce saves to the DB. My best guess is that if I can predict the ids that are going to be used, I can do all of this in one loop. Is there a way I can predict these ids?
Or is there a better approach?
You could insert the bobs first and then use the generated ids to build the pivot data. This isn't a great solution in a multi-user environment, as new bobs could be inserted in the interim and mess things up, but it could suffice for your application.
$newBit = $this->replicate();
$newBit->save();

$bobsPivotData = [];
foreach ($this->bobs as $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
}

// With MySQL, insertGetId on a multi-row insert returns the id of the first inserted row
$insertId = DB::table('bobs')->insertGetId($newBobs);
$insertedBobs = DB::table('bobs')->where('id', '>=', $insertId)->orderBy('id')->get();

foreach ($insertedBobs as $index => $newBob) {
    $bobsPivotData[] = [
        'bit_id' => $newBit->id,
        'bob_type' => get_class($newBob), // careful: the query builder returns stdClass rows, not model instances
        'bob_id' => $newBob->id,
        'order' => $index
    ];
}

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
I have not tested this, so some pseudo-code to be expected.
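If relying on the generated id range feels too fragile, a fallback sketch (also untested) is to save each replicated bob individually so Eloquent fills in its id, and bulk-insert only the pivot rows; you lose the single bobs insert but still keep the pivot writes down to one query:

$newBit = $this->replicate();
$newBit->save();

$bobsPivotData = [];
foreach ($this->bobs as $index => $bob) {
    $newBob = $bob->replicate();
    $newBob->save(); // one insert per bob, but its id is known immediately
    $bobsPivotData[] = [
        'bit_id' => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id' => $newBob->id,
        'order' => $index
    ];
}

// Still only one query for all the pivot rows
DB::table('bobs_pivot')->insert($bobsPivotData);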
Alright, so I have an insert query that I would like to run, but the issue I am having is with getting the object properties/values that I need to insert.
Say I have a query that looks like the one below.
$this->db->insert('tblitems_in', array(
'platform' => $item['Platform'],
'ram' => $item['RAM'],
'qty' => $item['qty'],
'rate' => number_format($item['rate'], 2, '.', ''),
'rel_id' => $insert_id,
'rel_type' => 'estimate',
'item_order' => $item['order'],
'unit' => $item['unit']
));
This works fine when the person chooses RAM on the webpage, which sets the $item object's property 'RAM' to the value that was picked. Now if they choose HardDrive, that property's name is sent as 'HardDrive' with the value they chose. Is there a way I could replace the 'ram' and 'RAM' in the example above with a variable, so I could change which property name I want to insert and insert it into the corresponding db column?
EDIT:
I should have added that the options on the webpage are also dynamically created from a database, so at the time of coding I do not know what the property names are. They could be RAM, HardDrive, Processor, maybe even Elephant. I was hoping I could use variables, so that I could look at the DB used to create the webpage to find the property names and then dynamically add those names to the query.
EDIT:
Right now I am using the following code to get all the possible options that can be received from the webpage, taken from the DB the webpage uses to build itself.
$plat_options = $this->db->get('tblplatform_options')->row()->name;
In the database right now it is only populated with the names RAM and HardDrive, to keep things known for testing purposes. So this returns $plat_options = {RAM, HardDrive}. I now have to figure out how to test whether $item has these (RAM and HardDrive) as properties, and if it does, add them to the query shown previously.
You can set an array of key => variable names, then loop over those values to see if they exist in the $item variable and, if so, add that value to the data to be inserted into the db:
//default array of data to insert
$data = [
'platform' => $item['Platform'],
'qty' => $item['qty'],
'rate' => number_format($item['rate'], 2, '.', ''),
'rel_id' => $insert_id,
'rel_type' => 'estimate',
'item_order' => $item['order'],
'unit' => $item['unit']
];
//Get column names from db
$plat_options = $this->db->get('tblplatform_options')->row()->name;
// $plat_options = [RAM, HardDrive]
// Check if $item[$key] exists. If it does, add it to the
// array of data to be inserted
foreach ($plat_options as $key) {
    if (array_key_exists($key, $item)) {
        $data[$key] = $item[$key];
    }
}
$this->db->insert('tblitems_in', $data);
edit
I'm not sure this will work (I don't understand the use case).
It is possible, using array_diff_key, to get a list of the array keys that exist in $item but not in $data. With this array of keys, you can add the missing entries.
I have altered my previous code to demonstrate this.
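A minimal sketch of that array_diff_key idea (note the key comparison is case-sensitive, so keys like 'Platform' vs 'platform' would still need mapping):

// Keys present in $item but not yet in $data
$missing = array_diff_key($item, $data);

foreach ($missing as $key => $value) {
    $data[$key] = $value;
}

$this->db->insert('tblitems_in', $data);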
You could create the array one element at a time based on whatever field data you received. I used a switch statement, but it could be a simple if/then/else as well.
$data_array = array();
$data_array['platform'] = $item['Platform'];
switch ($item['Object']) {
    case 'HardDrive':
        $data_array['harddrive'] = $item['HardDrive'];
        break;
    case 'RAM':
        $data_array['ram'] = $item['RAM'];
        break;
}
$data_array['qty'] = $item['qty'];
$data_array['rate'] = number_format($item['rate'], 2, '.', '');
$data_array['rel_id'] = $insert_id;
$data_array['rel_type'] = 'estimate';
$data_array['item_order'] = $item['order'];
$data_array['unit'] = $item['unit'];
$this->db->insert('tblitems_in', $data_array);
Not sure how to title this properly, but here's the issue I am running into currently. I built a cart and checkout system, and it loads all the data into a database when it finalizes the order. To save some space, I originally stored just the item IDs, but then I ran into the issue that if I deleted an item from the database (because it was discontinued or whatever) it wouldn't return the info I needed. And if they ordered more than 1 item, the database record would be wrong. So I stored the data like so:
Itemid:Quantity:Price:name, itemid2:quantity2:price2:name2
OR
1:3:20.00:Flower Hat, 2:1:17.75:diamonds
The issue I have right now that I need help with is this: I need to separate the four values into variables like $price, $item, $id, $amount so I can display them on the order history page, and I need to loop through all items in the array so I can print a row for each item with all four fields for that item.
I already use strpos to get the shipping info from the same database field, which is formatted as METHOD:Price, but since I have 3 colons in my string I'm not sure how to go through each one. Thanks.
Here's a function:
function parseItems($dbjunk) {
    $cart = array();
    $items = explode(",", $dbjunk);
    foreach ($items as $i) {
        $chunks = explode(":", trim($i)); // trim() drops the space left after the comma
        $cart[] = array(
            "ItemID"   => $chunks[0],
            "Quantity" => $chunks[1],
            "Price"    => $chunks[2],
            "name"     => $chunks[3]
        );
    }
    return $cart;
}
Example usage:
$dbjunk = "Itemid:Quantity:Price:name, itemid2:quantity2:price2:name2";
$parsed = parseItems($dbjunk);
print_r($parsed);
See: https://3v4l.org/rBkXF
If you need variables instead of an array you can use list(), like this..
$dbjunk = "Itemid:Quantity:Price:name, itemid2:quantity2:price2:name2";
$parsed = parseItems($dbjunk);
foreach($parsed as $p){
list($itemID, $Quantity, $Price, $name) = array_values($p);
var_dump($itemID, $Quantity, $Price, $name);
}
see: https://3v4l.org/l4vsn
You should not physically delete items from your database. Instead, add a new column named 'is_active' or something like that to indicate whether the product is active/not deleted.
Answering your question, here is my suggestion:
$orderString = '1:3:20.00:Flower Hat, 2:1:17.75:diamonds';
$items = array();
foreach(explode(', ', $orderString) as $itemString) {
$itemData = explode(':', $itemString);
$items[] = array(
'id' => $itemData[0],
'amount' => $itemData[1],
'value' => $itemData[2],
'description' => $itemData[3]
);
}
With this code you will obtain an array with the data of all the items in the string, no matter how many items it contains.
Try something like this for a single item (split on ', ' first if the field holds several items):
$data = '1:3:20.00:Flower Hat';
list($id, $quantity, $price, $name) = explode(":", $data);
echo $name;
echo $price;
Read about First Normal Form. Basically, you want to store one value in one field. So, instead of this:
shipping = "method:price"
You want something like this:
shipping_method = "method"
shipping_price = "price"
Don't concern yourself with space -- it's essentially free nowadays.
Regarding your deleted items dilemma, your initial implementation was the way to go:
I stored just the item IDs originally
In addition to reverting to this technique, I would recommend two things:
Add a boolean field to your item table to represent if the item is currently available or not. This gives you the additional feature of being able to toggle items on/off without having to delete/insert records and change ids.
Before deleting an item, check to see if it's ever been ordered. If not, it's ok to delete. If so, instead just deactivate it.
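To make that second point concrete, here is a hedged PDO sketch; the table and column names (items, order_items, is_active) are assumptions, not the asker's actual schema:

$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'secret');
$itemId = 1; // example id

$stmt = $pdo->prepare('SELECT COUNT(*) FROM order_items WHERE item_id = ?');
$stmt->execute([$itemId]);

if ((int) $stmt->fetchColumn() === 0) {
    // never ordered: safe to physically delete
    $pdo->prepare('DELETE FROM items WHERE id = ?')->execute([$itemId]);
} else {
    // has order history: just deactivate it
    $pdo->prepare('UPDATE items SET is_active = 0 WHERE id = ?')->execute([$itemId]);
}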
I have a loop like the following, and it runs for more than 6000 records:
foreach ($csv as $value) {
$research = ResearchData::create(array('company_id' => Input::get('company'), 'date' => Input::get('date')));
}
In here I used 2 values, company_id and date.
I want to know which of the following two versions is the better way to do this.
1)
$company_id=Input::get('company_id');
$date=Input::get('date');
foreach ($csv as $value) {
    $research = ResearchData::create(array('company_id' => $company_id, 'date' => $date));
}
2)
foreach ($csv as $value) {
$research = ResearchData::create(array('company_id' => Input::get('company'), 'date' => Input::get('date')));
}
From a performance point of view, number 1 will be faster, but only because Input::get takes a tiny little bit longer, as it does some checks and an array concatenation and eventually grabs something from an array. This takes a completely negligible amount of time, but option 1 does it once, whereas option 2 does it on every iteration of the loop.
From any other point of view (code clarity, documentation etc) it's completely opinion based.
You can do a bulk insert. I didn't run a performance check, but I expect better performance. Check below:
$company_id = Input::get('company_id');
$date = Input::get('date');
$data = array_fill(0, count($csv), ['company_id' => $company_id, 'date' => $date]); // one row per CSV line, skipping the large foreach
ResearchData::insert($data); // one bulk insert instead of one query per row
Documentation:
http://php.net/array_fill
http://laravel.com/docs/4.2/queries#inserts
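If the CSV grows into the thousands of rows, a single huge INSERT can hit query-size limits; a hedged variation of the same idea splits the rows with array_chunk (the chunk size of 500 is an arbitrary choice):

$company_id = Input::get('company_id');
$date = Input::get('date');

$rows = array_fill(0, count($csv), ['company_id' => $company_id, 'date' => $date]);

foreach (array_chunk($rows, 500) as $chunk) {
    ResearchData::insert($chunk); // one query per 500 rows
}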
I want to import an Excel sheet into my DB using Symfony/Doctrine (via the ddeboer data-import bundle).
What is the best practice for importing the data while first checking whether it has already been imported?
I was thinking of two possibilities:
1)
$numarray = $repo->findAllAccounts();

foreach ($reader as $readerobjectkey => $readervalue) {
    $import = true; // reset for every row read from the sheet
    foreach ($numarray as $numkey) {
        if ($numkey->getNum() == $readervalue['number']) {
            $import = false;
        }
    }
    if ($import) {
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num' => $readervalue['number'],
                    'name' => $readervalue['name'],
                    'company' => $companyid
                )
            )
            ->finish();
    }
}
2)
foreach ($reader as $row => $value) {
    // check if already imported
    $check = $this->checkIfExists($repo, 'num', $value['number']);
    if ($check) {
        echo $value['number'] . " Exists <br>";
    } else {
        echo $value['number'] . " newly Imported <br>";
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num' => $value['number'],
                    'name' => $value['name'],
                    'company' => $companyid
                )
            )
            ->finish();
    }
}

public function checkIfExists($repo, $field, $value)
{
    $check = $repo->findOneBy(array($field => $value));
    return $check;
}
The problem is that with big Excel sheets (3000+ rows), both solutions give me a timeout:
Error: Maximum execution time of 30 seconds exceeded
In general, for performance: is it preferable to generate 1000 queries to check whether a value exists (findOneBy), or to use two foreach loops to compare values?
Any help would be awesome!
Thx in advance...
You can try to check the filemtime of the file: http://php.net/manual/en/function.filemtime.php
I'm not sure whether it would work properly, but it's worth a shot to see if the modified date behaves as expected.
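A minimal sketch of that idea, where $excelPath and $lastImportTimestamp are hypothetical values you would track yourself (for example, stored after each successful import):

if (filemtime($excelPath) <= $lastImportTimestamp) {
    return; // the file has not changed since the last import, skip it entirely
}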
Otherwise you should think of another way than checking the data like this; it would take a lot of resources. Maybe add some metadata to the Excel file:
http://docs.typo3.org/typo3cms/extensions/phpexcel_library/1.7.4/manual.html#_Toc237519906
Any approach that avoids looping over everything or querying the database row by row for large datasets is better.
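Building on the question's own code, here is a hedged sketch of that: load the existing numbers once, key them in a PHP array, and test membership with isset instead of querying per row or nesting two loops:

// One query up front, then O(1) lookups per row
$existing = [];
foreach ($repo->findAllAccounts() as $account) {
    $existing[$account->getNum()] = true;
}

foreach ($reader as $readervalue) {
    if (isset($existing[$readervalue['number']])) {
        continue; // already imported
    }
    $doctrineWriter->disableTruncate()
        ->prepare()
        ->writeItem(array(
            'num' => $readervalue['number'],
            'name' => $readervalue['name'],
            'company' => $companyid
        ))
        ->finish();
}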