How to get specific columns in Laravel [Maatwebsite/Laravel-Excel] - php

The file I am importing has thousands of records, which is slowing everything down. I want to read only the columns I need before inserting into the database. When the file is processed, it should pick out just those columns rather than the whole file, and fetch the rows for those columns only.
Excel::load($path, function($reader) {
    // Getting headers using this
    $headers = $reader->first()->keys()->toArray();

    // This is the array of required columns
    $headings = array('registrant_name', 'registrant_address', 'registrant_phone', 'registrant_zip', 'registrant_email', 'registrant_country', 'registrant_state', 'registrant_city');
});
Data insertion after the file has been read:
if ($data->count() > 0)
{
    foreach ($data->toArray() as $value)
    {
        $insert[] = array(
            'registrant_name'    => $value['registrant_name'],
            'registrant_address' => $value['registrant_address'],
            'registrant_phone'   => $value['registrant_phone'],
            'registrant_zip'     => $value['registrant_zip'],
            'registrant_email'   => $value['registrant_email'],
            'registrant_country' => $value['registrant_country'],
            'registrant_state'   => $value['registrant_state'],
            'registrant_city'    => $value['registrant_city']
        );
    }
}

if (!empty($insert))
{
    DB::table('customers')->insert($insert);
}

On collections you can use the only() method (see the collection documentation), i.e.
$headers = $reader->first()->only(['registrant_name','registrant_address','registrant_phone','registrant_zip','registrant_email','registrant_country','registrant_state','registrant_city']);
You might also chunk the data and insert it in smaller batches rather than in one big insert, as sketched below.
Could you update the post with your App\Import class so we can help you further?
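A minimal sketch of that chunked insert, assuming the $insert array built above and an arbitrary chunk size of 500:
// Split the rows into batches of 500 and insert each batch separately,
// so no single statement grows too large.
foreach (array_chunk($insert, 500) as $chunk) {
    DB::table('customers')->insert($chunk);
}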

Related

Predicting future IDs used before saving to the DB

I am saving a complex dataset in Laravel 4.2 and I am looking for ways to improve this.
$bits has several $bobs. A single $bob can be one of several different classes. I am trying to duplicate a singular $bit and all its associated $bobs and save all of this to the DB with as few calls as possible.
$newBit = $this->replicate();
$newBit->save();

$bobsPivotData = [];

foreach ($this->bobs as $index => $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();

    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id'   => $newBob->id,
        'order'    => $index
    ];
}

// I now want to save all the $bobs kept in $newBobs[]
DB::table('bobs')->insert($newBobs);

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
My problem is that I can't access $newBob->id inside the loop, because the new bobs aren't inserted until after the loop.
I am looking for how best to reduce saves to the DB. My best guess is that if I can predict the ids that are going to be used, I can do all of this in one loop. Is there a way I can predict these ids?
Or is there a better approach?
You could insert the bobs first and then use the generated ids to build the pivot rows. This isn't a great solution in a multi-user environment, since new bobs could be inserted in the interim and throw the ids off, but it could suffice for your application.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
$bobsPivotData = [];

foreach ($this->bobs as $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
}

// Note: insertGetId() is meant for a single row; with a batch this relies on
// MySQL's lastInsertId() returning the id of the first row of that batch.
$insertId = DB::table('bobs')->insertGetId($newBobs);

// Fetch the rows that were just inserted (don't forget ->get()).
$insertedBobs = DB::table('bobs')->where('id', '>=', $insertId)->orderBy('id')->get();

foreach ($insertedBobs as $index => $newBob) {
    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        // $newBob is now a plain stdClass row, so take the class from the original model
        'bob_type' => get_class($this->bobs[$index]),
        'bob_id'   => $newBob->id,
        'order'    => $index
    ];
}

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
I have not tested this, so some pseudo-code to be expected.
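If one insert per bob is acceptable, another option (just a sketch, not tested against your schema) is to save each replicated bob inside a transaction so its id is immediately available, and only batch the pivot insert:
DB::transaction(function () {
    $newBit = $this->replicate();
    $newBit->save();

    $bobsPivotData = [];

    foreach ($this->bobs as $index => $bob) {
        $newBob = $bob->replicate();
        $newBob->save(); // one insert per bob, but its id is now known

        $bobsPivotData[] = [
            'bit_id'   => $newBit->id,
            'bob_type' => get_class($newBob),
            'bob_id'   => $newBob->id,
            'order'    => $index
        ];
    }

    // Only the pivot rows are batched
    DB::table('bobs_pivot')->insert($bobsPivotData);
});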

Compare array to db and add difference to db

I'm using CodeIgniter.
I'm trying to compare some posted values from a form with entries from a database.
Put simply, I want to check whether the entry is already in the database; if so, ignore the posted data, but if there is no database entry then add it to the database.
My thinking was that it shouldn't actually be that hard, but I'm having some issues and am now completely confused. Any pointers in the right direction would be appreciated.
I have an array coming from POST, $assigned_ids.
And I want to compare that with the data from $assign_data, which is being output from the database.
I've been trying foreach loops: looping over the posted data, and then inside that looping through the database data, comparing and adding if necessary.
It works up to a point, but if there is data coming from the database, it's adding multiple entries.
Here's my code. Surely I'm overcomplicating things?
// posted values foreach - loop through all posted values
foreach ($assigned_ids as $id) {
    if (is_array($assign_data) && count($assign_data) > 0) {
        // query all data in assignments table
        foreach ($assign_data as $key => $value) {
            // is the user id from assignments table in posted id's
            if (in_array($value->user_id, $id)) {
                // if it is, then do the course id's match as well? if so, do nothing, already an entry
                if ($value->course_id == $course_id) {
                    echo "match id and course, do nothing";
                } else {
                    // else if there isn't an entry for this user for this course, add the entry
                    $add_data = array(
                        'user_id'   => $value->user_id,
                        'course_id' => $course_id,
                        'org_id'    => $org_id
                    );
                    $this->assignment_model->save_org_assignments($add_data);
                }
            } else {
                // the user id was not in the array from the db, but was in the posted vars, so log in db
                $add_data = array(
                    'user_id'   => $id,
                    'course_id' => $course_id,
                    'org_id'    => $org_id
                );
                $this->assignment_model->save_org_assignments($add_data);
            }
        }
    } else {
        $add_data = array(
            'user_id'   => $id,
            'course_id' => $course_id,
            'org_id'    => $org_id
        );
        $this->assignment_model->save_org_assignments($add_data);
    }
}
I think your main issue is that your array is not properly structured, which is why you're having a hard time.
My suggestion is to reshape your DB result right after fetching it:
function getAssignedData($field)
{
    // It's better to select only the field you'll need than to fetch everything.
    // 'assignments' is assumed to be your table name; adjust as needed.
    $result = $this->db->select($field)->get('assignments');

    if ($result->num_rows()) {
        $existing_ids = [];
        foreach ($result->result() as $row) {
            $existing_ids[] = $row->$field;
        }
        // Flip so the ids become keys, which makes isset() lookups cheap.
        return array_flip($existing_ids);
    }

    return FALSE;
}
And then you can compare the values like this:
// 'user_id' is assumed here; use whatever field you passed to getAssignedData()
$existing_ids = $this->getAssignedData('user_id');

foreach ($assigned_ids as $id) {
    if ($existing_ids === FALSE || !isset($existing_ids[$id])) {
        // not in the database yet, so add it
    }
}
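Taking that a step further, a sketch of the insert side (assuming the same $course_id and $org_id as in your code, and that the table is called assignments): collect the missing rows and write them with a single insert_batch() call instead of one insert per row.
$new_rows = array();

foreach ($assigned_ids as $id) {
    if ($existing_ids === FALSE || !isset($existing_ids[$id])) {
        $new_rows[] = array(
            'user_id'   => $id,
            'course_id' => $course_id,
            'org_id'    => $org_id
        );
    }
}

if (!empty($new_rows)) {
    // one query for all new assignments
    $this->db->insert_batch('assignments', $new_rows);
}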
Hope that helps.

Multiple insert to related tables

I would like to ask if any of you have tried inserting multiple records at once into related tables?
Here's the scenario. I have a Drug table and a DrugMovement table. I want to insert records into both tables using an xls file as the source. The xls file contains thousands of records, and when the user uploads it, all of its content should be inserted into the tables. I'm thinking of a batch upload, but I have no idea what the best approach would be.
Below is the schema
======== Drug Table ==========
class Drug extends Model
{
    public function drugMovements() {
        return $this->hasMany('App\DrugMovement');
    }
}
======== Drug Movement ===========
class DrugMovement extends Model
{
    public function drug() {
        return $this->belongsTo('App\Drug');
    }
}
Now I want to save records to both of these tables, inserting thousands of records at once. How can I achieve this?
If I do something like the following, it wastes resources, because I have to loop over all the records and do thousands of inserts.
foreach ($datas as $data) {
    $drug = Drug::create([
        "pharmacy_id" => 1,
        "name"        => $data->drug_name,
        "strength"    => $data->strength,
    ]);

    $drug_movements = new DrugMovement([
        "drug_id"       => $drug->id,
        "quantity"      => $data->quantity,
        "pharmacist_id" => 1,
    ]);

    $drug->drugMovements()->save($drug_movements);
    $drug->save();
}
As you can see, if the data has thousands of records it will run thousands of inserts. How can I optimize this?
The best thing would be to build two arrays and then insert each of them into the database in one go.
// Assumes nothing else inserts drugs while this runs, so the new ids are consecutive.
$latestDrugID = Drug::max('id'); // null on an empty table, which ++ turns into 1

$drugs = [];
$drug_movements = [];

foreach ($datas as $data) {
    $drugs[] = [
        "pharmacy_id" => 1,
        "name"        => $data->drug_name,
        "strength"    => $data->strength,
    ];

    $drug_movements[] = [
        // pre-increment so the first movement points at the first new drug id
        "drug_id"       => ++$latestDrugID,
        "quantity"      => $data->quantity,
        "pharmacist_id" => 1,
    ];
}

Drug::insert($drugs);
DrugMovement::insert($drug_movements);
PS: You may want to chunk these arrays into smaller pieces depending on how large your data is; see the sketch below.
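A minimal sketch of that chunking, assuming the $drugs and $drug_movements arrays built above and an arbitrary chunk size of 500; the surrounding transaction keeps the two tables consistent if one of the inserts fails:
DB::transaction(function () use ($drugs, $drug_movements) {
    foreach (array_chunk($drugs, 500) as $chunk) {
        Drug::insert($chunk);
    }

    foreach (array_chunk($drug_movements, 500) as $chunk) {
        DrugMovement::insert($chunk);
    }
});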

How to import Excelsheets in Symfony doctrine entity

I want to import an Excel sheet into my DB using Symfony/Doctrine (with the ddeboer data-import bundle).
What is the best practice for importing the data while first checking whether it has already been imported?
I was thinking of two possibilities:
1)
$numarray = $repo->findAllAccounts();

foreach ($reader as $readerobjectkey => $readervalue) {
    // reset the flag for every row
    $import = true;

    foreach ($numarray as $numkey) {
        if ($numkey->getNum() == $readervalue['number']) {
            $import = false;
        }
    }

    if ($import) {
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num'     => $readervalue['number'],
                    'name'    => $readervalue['name'],
                    'company' => $companyid
                )
            )
            ->finish();
    }
}
2)
foreach ($reader as $row => $value) {
    // check if already imported
    $check = $this->checkIfExists($repo, 'num', $value['number']);

    if ($check) {
        echo $value['number'] . " Exists <br>";
    } else {
        echo $value['number'] . " new Imported <br>";

        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num'     => $value['number'],
                    'name'    => $value['name'],
                    'company' => $companyid
                )
            )
            ->finish();
    }
}

public function checkIfExists($repo, $field, $value)
{
    $check = $repo->findOneBy(array($field => $value));
    return $check;
}
The problem is that with big Excel sheets (3000+ rows) I get a timeout with both solutions:
Error: Maximum execution time of 30 seconds exceeded
In general, for performance: is it preferable to issue ~1000 queries to check whether a value exists (findOneBy), or to use two foreach loops to compare the values?
Any help would be awesome!
Thanks in advance.
You can try checking the filemtime of the file: http://php.net/manual/en/function.filemtime.php
I'm not sure it would work properly, but it's worth a shot to see whether the modified date behaves as expected.
Otherwise, you should avoid checking the data row by row like this, as it takes a lot of resources. Maybe add some metadata to the Excel file:
http://docs.typo3.org/typo3cms/extensions/phpexcel_library/1.7.4/manual.html#_Toc237519906
For large data, anything is better than looping over or querying the database per row.
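For completeness, a sketch of how the per-row checks could be avoided using only the pieces already shown in the question ($repo, $reader, $doctrineWriter, $companyid and getNum()): fetch the existing numbers once, index them in a lookup array, and test membership with isset() instead of nested loops or a findOneBy() per row.
// One query to load the already-imported numbers into a lookup set
$existing = array();
foreach ($repo->findAllAccounts() as $account) {
    $existing[$account->getNum()] = true;
}

foreach ($reader as $readervalue) {
    // isset() is a constant-time array lookup, no extra query per row
    if (isset($existing[$readervalue['number']])) {
        continue;
    }

    $doctrineWriter->disableTruncate()
        ->prepare()
        ->writeItem(
            array(
                'num'     => $readervalue['number'],
                'name'    => $readervalue['name'],
                'company' => $companyid
            )
        )
        ->finish();

    $existing[$readervalue['number']] = true;
}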

Multiple Keys to an array in if/else search

My code is pretty basic. I'm using an array to generate a datasheet link for a product based on its SKU and a file path.
My array looks like this:
$a = array(
    "/images/ManualSheets/factSheetCLASSIC.pdf" => "KE800/6",
    "/images/ManualSheets/factSheetMICRO.pdf"   => "KE800/12",
    "/images/ManualSheets/factSheetSMALL.pdf"   => "KE4000/12",
    "/images/ManualSheets/factSheetMEDIUM.pdf"  => "KE8000/12",
);
Here the key is the file path and the value is the SKU (as generated by the system). I then use an if/else to generate a button, so if a product is not in the array it returns a blank value and there is no button leading nowhere.
$factsheetweblink_url = array_search($product_sku, $a);

if ($factsheetweblink_url == false) {
    echo " ";
} else {
    echo "<div class='productpagestockistBTN'>
        <img src='/images/FactSheet_btn.png' >
    </div>";
}
?>
This code works fine. The catch comes when I have products with different SKUs but the same datasheet file, (same brand and make but a different model). Currently I can only get it to work by uploading multiple copies of the datasheets with different names, but it's becoming a big waste of space.
I have tried using an array to hold multiple values against the one key:
"/images/ManualSheets/factSheetMEDIUM.pdf"=> array("KE8000/12","KE7000/12"),
but it doesn't seem to be working... I'm not quite sure if I need to refine my if statement to search within the sub arrays as well or..?
Any help would be appreciated, thanks in advance.
You should use arrays like this:
$products = array(
    0 => array(
        "pdf"  => "/images/ManualSheets/factSheetCLASSIC.pdf",
        "skus" => array("KE800/6", "KE900/6")
    ),
    1 => array(
        "pdf"  => "/images/ManualSheets/factSheetCLASSIC3.pdf",
        "skus" => array("KE100/6", "KE200/6")
    )
);
This is because array_search() returns only the first matching key.
Then just write your own search function, like:
function findBySku($items, $sku)
{
    $pdf = ""; // return empty if not found
    foreach ($items as $row) {
        if (in_array($sku, $row['skus'])) {
            $pdf = $row['pdf'];
            break;
        }
    }
    return $pdf;
}
and call that function:
$pdf = findBySku($products, "some sku");
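Plugged back into the button code from the question (a sketch; it assumes the same $product_sku variable and markup as above):
$factsheetweblink_url = findBySku($products, $product_sku);

if ($factsheetweblink_url == "") {
    echo " ";
} else {
    echo "<div class='productpagestockistBTN'>
        <img src='/images/FactSheet_btn.png' >
    </div>";
}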
