I want to import an Excel sheet into my DB using Symfony/Doctrine (with the ddeboer data-import bundle).
What is the best practice to import the data while first checking whether it has already been imported?
I was thinking of two possibilities:
1)
$numarray = $repo->findAllAccounts();
foreach ($reader as $readerobjectkey => $readervalue) {
    // reset the flag for every row, otherwise one duplicate blocks all later imports
    $import = true;
    foreach ($numarray as $numkey) {
        if ($numkey->getNum() == $readervalue['number']) {
            $import = false;
        }
    }
    if ($import) {
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num'     => $readervalue['number'],
                    'name'    => $readervalue['name'],
                    'company' => $companyid
                )
            )
            ->finish();
    }
}
2)
foreach ($reader as $row => $value) {
    // check if already imported
    $check = $this->checkIfExists($repo, 'num', $value['number']);
    if ($check) {
        echo $value['number'] . " exists<br>";
    } else {
        echo $value['number'] . " newly imported<br>";
        $doctrineWriter->disableTruncate()
            ->prepare()
            ->writeItem(
                array(
                    'num'     => $value['number'],
                    'name'    => $value['name'],
                    'company' => $companyid
                )
            )
            ->finish();
    }
}

public function checkIfExists($repo, $field, $value)
{
    return $repo->findOneBy(array($field => $value));
}
The problem: with big Excel sheets (3000+ rows), both solutions run into a timeout:
Error: Maximum execution time of 30 seconds exceeded
In general, for performance, is it preferable to fire thousands of queries to check whether a value exists (findOneBy), or to compare the values in two nested foreach loops?
Any help would be awesome!
Thanks in advance...
You can try checking the filemtime of the file: http://php.net/manual/en/function.filemtime.php
I'm not sure whether it would work properly, but it's worth a shot to see if the modified date behaves as expected.
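A minimal sketch of that idea (untested; $excelFilePath and $lastImportTime are hypothetical values: the path to your sheet and a timestamp you would persist after each successful import):

$lastModified = filemtime($excelFilePath);
if ($lastModified !== false && $lastModified <= $lastImportTime) {
    // the file has not changed since the last import, skip it entirely
    return;
}
// ... otherwise run the import and persist $lastModified for next time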
Otherwise you should think of another way than checking the data like this; it takes a lot of resources. Maybe add some metadata to the Excel file:
http://docs.typo3.org/typo3cms/extensions/phpexcel_library/1.7.4/manual.html#_Toc237519906
Any approach that avoids looping over the whole data set or querying the database row by row is better.
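In your case, that could mean querying the existing numbers once, flipping them into array keys, and testing each row with isset(), a hash lookup instead of a query or an inner loop per row. A sketch (untested, reusing the names from your question):

// build a lookup table of the numbers already in the db: one query in total
$existing = array();
foreach ($repo->findAllAccounts() as $account) {
    $existing[$account->getNum()] = true;
}

$doctrineWriter->disableTruncate()->prepare(); // prepare once, not per row
foreach ($reader as $readervalue) {
    if (isset($existing[$readervalue['number']])) {
        continue; // already imported
    }
    $doctrineWriter->writeItem(array(
        'num'     => $readervalue['number'],
        'name'    => $readervalue['name'],
        'company' => $companyid,
    ));
}
$doctrineWriter->finish(); // flush once at the end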
The file I am importing has thousands of records, which is causing my network to slow down. I want to read only the columns I need before inserting into the database. When the file is processed, it should search only those columns, not the whole file, and fetch the rows for those columns.
Excel::load($path, function($reader) {
    // getting the headers
    $headers = $reader->first()->keys()->toArray();
    // this is the array of required columns
    $headings = array('registrant_name', 'registrant_address', 'registrant_phone', 'registrant_zip', 'registrant_email', 'registrant_country', 'registrant_state', 'registrant_city');
});
Data insertion after the file is read:
if ($data->count() > 0)
{
    foreach ($data->toArray() as $value)
    {
        $insert[] = array(
            'registrant_name'    => $value['registrant_name'],
            'registrant_address' => $value['registrant_address'],
            'registrant_phone'   => $value['registrant_phone'],
            'registrant_zip'     => $value['registrant_zip'],
            'registrant_email'   => $value['registrant_email'],
            'registrant_country' => $value['registrant_country'],
            'registrant_state'   => $value['registrant_state'],
            'registrant_city'    => $value['registrant_city']
        );
    }
}

if (!empty($insert))
{
    DB::table('customers')->insert($insert);
}
In collections, you can check the documentation for the only method, i.e.:
$headers = $reader->first()->only(['registrant_name','registrant_address','registrant_phone','registrant_zip','registrant_email','registrant_country','registrant_state','registrant_city']);
You might also use chunk to insert the data in smaller collections rather than in one big insert, as sketched below.
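A rough sketch (untested) using the chunk filter from maatwebsite/excel 2.x, which the Excel::load() syntax above suggests you are on; column names are the ones from your snippet:

Excel::filter('chunk')->load($path)->chunk(250, function($results) {
    $insert = array();
    foreach ($results as $row) {
        $insert[] = array(
            'registrant_name'  => $row->registrant_name,
            'registrant_email' => $row->registrant_email,
            // ... the remaining registrant_* columns ...
        );
    }
    if (!empty($insert)) {
        // one insert per chunk instead of one for the whole file
        DB::table('customers')->insert($insert);
    }
});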
Could you update the post with the App\Import class so we can help you more?
I am saving a complex dataset in Laravel 4.2 and I am looking for ways to improve this.
$bits has several $bobs. A single $bob can be one of several different classes. I am trying to duplicate a single $bit and all its associated $bobs, and save all of this to the DB with as few calls as possible.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
$bobsPivotData = [];
foreach ($this->bobs as $index => $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id'   => $newBob->id,
        'order'    => $index
    ];
}

// I now want to save all the $bobs kept in $newBobs[]
DB::table('bobs')->insert($newBobs);

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
My problem is that I can't access $newBob->id before I have inserted $newBob after the loop.
I am looking for the best way to reduce saves to the DB. My best guess is that if I can predict the ids that are going to be used, I can do all of this in one loop. Is there a way I can predict these ids?
Or is there a better approach?
You could insert the bobs first and then use the generated ids to build the pivot rows. This isn't a great solution in a multi-user environment, as new bobs could be inserted in the interim and mess things up, but it could suffice for your application.
$newBit = $this->replicate();
$newBit->save();

$newBobs = [];
$bobsPivotData = [];
foreach ($this->bobs as $bob) {
    $newBob = $bob->replicate();
    $newBobs[] = $newBob->toArray();
}

// note: insertGetId() normally expects a single row; with a batch,
// MySQL reports the id of the first inserted row
$insertId = DB::table('bobs')->insertGetId($newBobs);
$insertedBobs = DB::table('bobs')->where('id', '>=', $insertId)->get();

foreach ($insertedBobs as $index => $newBob) {
    $bobsPivotData[] = [
        'bit_id'   => $newBit->id,
        'bob_type' => get_class($newBob),
        'bob_id'   => $newBob->id,
        'order'    => $index
    ];
}

// Saving all the pivot data in one go
DB::table('bobs_pivot')->insert($bobsPivotData);
I have not tested this, so some pseudo-code to be expected.
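If you are on MySQL, one way to narrow the race mentioned above is to lock the tables around the two inserts. A rough, untested sketch (table names taken from the answer):

// nothing else can insert into bobs between our insert and the id read-back
DB::statement('LOCK TABLES bobs WRITE, bobs_pivot WRITE');
// ... run the insertGetId / pivot-building logic from above ...
DB::statement('UNLOCK TABLES');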
I'm using codeigniter.
I'm trying to compare some posted values from a form with entries from a database.
Put simply, i want to check to see if the entry is already in the database, if so ignore the posted data, but if there is no database entry then add to the database.
My thinking was that it shouldn't actually be that hard, but I'm having some issues and now I'm completely confused. Any pointers in the right direction would be appreciated.
I have an array coming from POST: $assigned_ids.
And I want to compare that with the data from $assign_data, which is being output from the database.
I've been trying foreach loops: looping over the posted data, and then inside that looping through the database data, comparing and adding where necessary.
It works up to a point, but if there is data coming from the database, it's adding multiple entries.
Here's my code; surely I'm overcomplicating things?
// posted values foreach - loop through all posted values
foreach ($assigned_ids as $id) {
    if (is_array($assign_data) && count($assign_data) > 0) {
        // query all data in the assignments table
        foreach ($assign_data as $key => $value) {
            // is the user id from the assignments table in the posted ids?
            if (in_array($value->user_id, $id)) {
                // if it is, do the course ids match as well? if so, do nothing, already an entry
                if ($value->course_id == $course_id) {
                    echo "match id and course, do nothing";
                } else {
                    // else if there isn't an entry for this user for this course, add the entry
                    $add_data = array(
                        'user_id'   => $value->user_id,
                        'course_id' => $course_id,
                        'org_id'    => $org_id
                    );
                    $this->assignment_model->save_org_assignments($add_data);
                }
            } else {
                // the user id was not in the array from the db, but was in the posted vars, so log it in the db
                $add_data = array(
                    'user_id'   => $id,
                    'course_id' => $course_id,
                    'org_id'    => $org_id
                );
                $this->assignment_model->save_org_assignments($add_data);
            }
        }
    } else {
        $add_data = array(
            'user_id'   => $id,
            'course_id' => $course_id,
            'org_id'    => $org_id
        );
        $this->assignment_model->save_org_assignments($add_data);
    }
}
I think your main issue is that your array is not properly structured; that's why you're having a hard time.
My suggestion is to re-index your db result after fetching it:
function getAssignedData($table, $field) {
    // it's better to fetch only the field you'll need than to fetch everything
    $result = $this->db->select($field)->get($table);
    if ($result->num_rows()) {
        $existing_ids = array();
        foreach ($result->result() as $row) {
            $existing_ids[] = $row->$field;
        }
        // flip so the ids become array keys; isset() lookups are then O(1)
        return array_flip($existing_ids);
    }
    return FALSE;
}
And then you can compare the values like this:
foreach ($assigned_ids as $id) {
    if (!isset($existing_ids[$id])) {
        // do something
    }
}
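For completeness, $existing_ids would come from the model method above; the table and field names here are just placeholders:

$existing_ids = $this->getAssignedData('assignments', 'user_id');
if ($existing_ids === FALSE) {
    $existing_ids = array(); // empty table: every posted id is new
}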
Hope that helps.
I have 4 arrays:
qb_array = { sku, stock } size:20803
valid_array = { sku, name, price, status } size:199803
By intersecting qb_array && valid_array based on sku, I then have:
intersect_sku_array { sku } size:18795
Now, there are some data that I want to grab out of my first 2 arrays.
From qb_array, I want stock
From valid_array, I want name, price, and status.
Then I decided to create my 4th array and call it:
4 - inserted_array (because I will use this array to insert into my database)
I tried to construct it, and now I am stuck.
Here is my approach :
First, I did
foreach ($intersect_sku_array as $key) {
    $inserted_array[] = $key;
}
So far so good: everything works, and when I dd($inserted_array); I see all the stuff in it.
Then, moving on to add the other 3 fields from my valid_array:
foreach ($valid_array as $key => $value) {
    if (in_array($key, array_map('strtolower', $intersect_sku_array))) {
        $inserted_array[] = $value['name'];
        $inserted_array[] = $value['price'];
        $inserted_array[] = $value['status'];
    }
}
Then I did dd($inserted_array); at the end, and it just hangs. After about 5 minutes I got this error:
Maximum execution time of 300 seconds exceeded
Is it because I have too much data, or is my code stuck in an infinite loop somehow?
Can someone please explain all of this in detail?
Maybe this would help:
foreach ($intersect_sku_array as $sku)
{
    $qbRow = $qb_array[array_search($sku, $qb_array)];
    $validRow = $valid_array[array_search($sku, $valid_array)];
    $inserted_array[] = array($sku, $qbRow[1], $validRow[1], $validRow[2], $validRow[3]);
}
Although I think it would be easier for you to use named arrays like the following:
$qb_array = ['sku' => ['stock' => 'actual stock']];
$valid_array = ['sku' => ['name' => 'actual name', 'price' => 'actual price', 'status' => 'actual status']];
Like so.
For one thing you are running the lowercasing inside the loop, which is certainly not the best way to save time.
Then you are constructing a "flat" array that will contain (sku, name, price, status) quadruplets (and no stock) in sequence.
I would rather have the database do a join on both tables, since SQL is a much better tool than PHP for that kind of job.
Now if for some reason you can't do that, better use sku as a key for your two arrays.
foreach ($qb_array as $val) $k_qb[strtolower($val['sku'])] = $val['stock'];
foreach ($valid_array as $val) $k_va[strtolower($val['sku'])] = array($val['name'], $val['price'], $val['status']);
(If you must lowercase something, better to do it once here than on every comparison; frankly, this should be done in the database, unless you're forced to work with garbage data.)
Then you can do the join manually without any intermediate intersection array, like so:
foreach ($k_qb as $sku => $stock) // use the smallest array
{
    if (!isset($k_va[$sku])) continue; // skip non-matching records
    $record = $k_va[$sku];
    $inserted_array[] = array(
        'sku'    => $sku,
        'stock'  => $stock,
        'name'   => $record[0],
        'price'  => $record[1],
        'status' => $record[2]);
}
With the arrays keyed by sku, the lookups are hash-based, so the algorithm is roughly O(N + M) instead of O(M × N), where M and N are the sizes of your two arrays.
Again I find myself at the mercy of the stackoverflow community!
I've gone over to using CodeIgniter for my PHP projects and it's been a breeze so far; however, I've gotten stuck trying to update a database field with some post data.
My array is the usual: array(name => value, name => value, name => value);
which again is populated from the submitted $_POST data.
Similarly to the array, I have a database table with 2 fields, setting and value, where the names under setting correspond to the array keys, and value to the array keys' values.
(Did I explain that right?)
Nonetheless, I've been trying for a little while now to get this to work as it should, but I'm really just waving my hands around in the dark.
I hope some of you bright minds out there can help me with this annoying issue!
Edit:
Thanks to everyone who replied! I managed to produce the result that I wanted with the following code:
foreach ($form_data as $key => $val)
{
    $this->db->where('setting', $key);
    $this->db->set('value', $val);
    $this->db->update('recruitment');
}
Now, I tried following up with this by adding:
if ($this->db->affected_rows() >= 1) { return true; }
return false;
to my model, and
if ($this->RecruitmentSettings->Update($form_data) == TRUE)
{
    redirect('recruitment/success');
}
to my controller, but it's not working as expected at all. Any ideas what I'm doing wrong?
There are a lot of questions here. Do you already have values in the database and you want to update them? Or do you want to put in new data every time? The answer depends on that.
What you want is the insert_batch() or update_batch() methods of the active record class (if that's what you're using for the db).
foreach ($post_array as $key => $value)
{
    $settings[] = array(
        'setting' => $key,
        'value'   => $value
    );
}
$this->db->insert_batch('your_db_table', $settings);
OR, for updates:
$this->db->update_batch('your_db_table', $settings, 'setting');
You could do a query to check for existing settings and call insert_batch or update_batch depending on whether there are results. If you wanted to insert every time, you could delete the rows in the table before doing the insert. I wouldn't do that without a transaction, however; see the sketch below.
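A rough, untested sketch of that delete-then-insert variant with CodeIgniter's transaction helpers, so a failed insert cannot leave the table half-empty (table name as in the examples above):

$this->db->trans_start();
$this->db->empty_table('your_db_table');             // clear out the old rows
$this->db->insert_batch('your_db_table', $settings); // re-insert everything
$this->db->trans_complete();

if ($this->db->trans_status() === FALSE) {
    // something failed and the whole block was rolled back
}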
So you want to store the array data in the database? You could do this
Model
foreach ($data as $key => $item)
{
    $this->db->set('setting', $key);
    $this->db->set('value', $item);
    $this->db->insert('table_name');
}
return ($this->db->affected_rows() > 0);
Controller
if ($this->RecruitmentSettings->Update($form_data))
{
    redirect('recruitment/success');
}
else
{
    echo "error";
}