TL;DR:
I want to take data from a one-dimensional array of arbitrary size, created from user input, and fill its values into the appropriate fields of a two-dimensional array of arbitrary size, created via a query from the database.
I have a web application where the user can access the DB's data both in read mode and write mode.
The DB records accessible to a user are determined by the departments they belong to.
The records are organized in a DB structure where a coretable contains data visible to ALL departments, while extensiontables referencing the coretable via FK contain data that is only visible to users who belong to the department associated with that extensiontable.
I'm using Lumen/Laravel and its Eloquent models to interact with the DB from my backend; some example code for accessing the DB looks like this:
$join = coretable::with($permittedTables)->find(1);
Here, $permittedTables is an array of table names referencing the extensiontables accessible to the user.
The above code then fetches the following result:
{
    "id": 1,
    "Internal_key": "TESTKEY_1",
    "extensiontable_itc": {
        "description": "EXTENSION_iTC_1"
    },
    "extensiontable_sysops": {
        "description": "EXTENSION_SYSOPS_1"
    }
}
Now, the user will have a list-like view in the front-end where all this data has been merged into a big table. In this list, the user can click a field and change its value, then send an HTTP request to persist these changes to the DB.
This HTTP request will carry an array in JSON format, which I will json_decode() in my backend, using the transmitted id to fetch the model as seen above.
Now, at this point, two sets of data, both organized as associative arrays, will face each other. The input from the HTTP request will likely be a one-dimensional array, while the model from the DB will almost certainly be the multidimensional array you've seen above.
Furthermore, there is a huge number of possible combinations of datasets. It can be the combination seen above, but it can also be a combination of data from other tables not listed here, and both the model and the user input can aggregate a bigger or smaller set of tables.
Therefore, the process writing the incoming input to the DB must be able to determine dynamically which field of the input must be put into which field of the DB.
I'm doing this for the first time and I don't have many ideas on how to do it. The only thing that came to my mind was mirroring the DB column names in the indexes of the input array, then looping through the model and comparing its indexes to the index of the currently selected element of the input. If they match, the value from the respective field of the input is set on the respective field of the DB.
I am aware that, for this to work, each column of the tables affected by this process MUST have a unique name across the DB. This is doable, though.
Still, this is currently the only idea I could come up with.
However, I wanted to ask you two things about this:
1) Are there other, less "hacky" approaches to solve the problem outlined above?
2) Can someone give me a custom function/a set of custom functions capable of iterating over two arrays, of which at least one will be multidimensional, while comparing their indexes, then setting the value from Array A to Array B when the indexes match?
Below is a little code example creating two arrays that reproduce the described situation, which you can fiddle around with:
First, the inputside:
$inputArray = array(
    "id" => 1,
    "internal_key" => "TESTKEY_1",
    "CPU" => "intelTest1",
    "GPU" => "nvidiaTest1",
    "Soundcard" => "AsusTest1",
    "MAC" => "macTest1",
    "IP" => "ipTest1",
    "VLAN" => "vlanTest1"
);
Then, the DB-side:
$modelArray = array(
    "id" => 1,
    "internal_key" => "TESTKEY_2",
    "extensiontable_itc" => array(
        "CPU" => "intelTest1",
        "GPU" => "nvidiaTest2",
        "Soundcard" => "AsusTest1"
    ),
    "extensiontable_sysops" => array(
        "MAC" => "macTest2",
        "IP" => "ipTest1",
        "VLAN" => "vlanTest1"
    )
);
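To illustrate the index-mirroring idea from above, here is a rough sketch of what I have in mind, working purely on the two plain arrays (the function name mergeInputIntoModel is just a placeholder, and writing the merged values back to the Eloquent model would still be a separate step):
/**
 * Recursively walk the nested model array and overwrite every leaf
 * whose key also exists in the flat input array.
 */
function mergeInputIntoModel(array $model, array $input)
{
    foreach ($model as $key => $value) {
        if (is_array($value)) {
            // Nested table (e.g. an extensiontable): recurse into it.
            $model[$key] = mergeInputIntoModel($value, $input);
        } elseif (array_key_exists($key, $input)) {
            // Leaf column whose name matches an input index: take the input value.
            $model[$key] = $input[$key];
        }
    }

    return $model;
}

$merged = mergeInputIntoModel($modelArray, $inputArray);
// $merged now holds "TESTKEY_1", "nvidiaTest1", "macTest1", etc. from $inputArray,
// placed in the matching fields of the nested structure.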
Related
I need to make an import method that takes a CSV file and imports everything into the database.
I've done the parsing with one of Laravel's CSV add-ons and it works perfectly, giving me a big array of values set as:
[
    'col1_name' => 'col1 value',
    'col2_name' => 'col2 value',
    'col3_name' => 'col3 value',
    '...' => '...'
]
This is also perfect since all the column names fit my model, which makes the database inserts a breeze.
However, a lot of the column values are strings that I'd like to turn into separate tables/relations. For example, one column contains the name of the item manufacturer, and I have a manufacturer table set up in my database.
My question is - what's the easy way to go through the imported CSV and swap the strings with the corresponding ID from the relationship table, making it compatible with my database design?
Something that would make the imported line:
[
'manufacturer' => 'Dell',
]
into:
[
'manufacturer' => '32',
]
I know I could just do a foreach loop comparing the needed values with values from the relationship models, but I'm sure there's an easier and cleaner way of doing it.
I don't think there's any "nice" way to do this - you'll need to look up each value for "manufacturer" - the question is, how many queries will you run to do so?
A consideration you need to make here is how many rows you will be importing from your CSV file.
You have a couple of options.
1) Querying 1 by 1
I'm assuming you're going to be looping through every line of the CSV file anyway, and then making a new model? In which case, you can add an extra database call in here;
$model->manufacturer_id = Manufacturer::whereName($colXValue)->first()->id;
(You'd obviously need to put in your own checks etc. here to make sure manufacturers exist)
This method is fine for relatively small datasets; however, if you're importing lots and lots of rows, it might end up sluggish with a lot of arguably unnecessary database calls.
2) Mapping ALL your Manufacturers
Another option would be to create a local map of all your Manufacturers before you loop through your CSV lines;
$mappedManufacturers = Manufacturer::all()->pluck('id', 'name');
This will make $mappedManufacturers a collection of manufacturers with name as the key and id as the value. This way, when you're building your model, you can do;
$model->manufacturer_id = $mappedManufacturers[$colXValue];
This method is also fine, unless you have tens of thousands of Manufacturers!
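To make option 2 concrete, here is a minimal sketch of the import loop using that map ($csvRows and the Item model are placeholders for whatever your parser and schema actually provide):
// Build the name => id map once, before looping over the CSV rows.
$mappedManufacturers = Manufacturer::all()->pluck('id', 'name');

foreach ($csvRows as $row) {
    $model = new Item();                  // hypothetical target model
    $model->name = $row['name'];          // ...set the other columns as usual

    // Swap the manufacturer string for its id via the map,
    // falling back to null if the CSV contains an unknown manufacturer.
    $model->manufacturer_id = isset($mappedManufacturers[$row['manufacturer']])
        ? $mappedManufacturers[$row['manufacturer']]
        : null;

    $model->save();
}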
3) Where in - then re-looping
Another option would be to build up a list of manufacturer names when looping through your CSV lines, going to the database with 1 whereIn query and then re-looping through your models to populate the manufacturer ID.
So in your initial loop through your CSV, you can temporarily set a property to store the name of the manufacturer, whilst adding it to another array;
$models = collect();
$model->..... = ....;
$model->manufacturer = $colXValue;
$models->push($model);
Then you'll end up with a collection of models. You then query the database for ONLY manufacturers which have appeared:
$manufacturers = Manufacturer::whereIn('name', $models->lists('manufacturer'))->get()->keyBy('name')->toArray();
This will give you an array of manufacturers, keyed by their name.
You then loop through your $models collection again, assigning the correct manufacturer id using the map;
$model->manufacturer_id = $manufacturers[$model->manufacturer]['id'];
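For completeness, a sketch of option 3 end to end, matching the fragments above ($csvRows and Item are again placeholders, and pluck() is the newer name for lists() on collections):
$models = collect();

// First pass: build the models and remember the manufacturer name on each one.
foreach ($csvRows as $row) {
    $model = new Item();
    $model->name = $row['name'];                  // ...other columns
    $model->manufacturer = $row['manufacturer'];  // temporary property
    $models->push($model);
}

// One query for only the manufacturers that actually appear in the CSV.
$manufacturers = Manufacturer::whereIn('name', $models->pluck('manufacturer'))
    ->get()
    ->keyBy('name')
    ->toArray();

// Second pass: swap the temporary name for the real foreign key and save.
foreach ($models as $model) {
    $model->manufacturer_id = $manufacturers[$model->manufacturer]['id'];
    unset($model->manufacturer);                  // drop the temporary property
    $model->save();
}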
Hopefully this will give you some ideas of how you can achieve this. I'd say the solution mostly depends on your use case - if this was going to be a heavy-duty task, I'd definitely queue it and be tempted to use Option 1! :P
I am trying to fetch data recursively (over associations) using CakePHP 3.1.
In Cake 3 I can use the "contain" key to fetch the next level of associated data, but I need to fetch one more level. Does anyone know how to do this? I read the docs but didn't find anything there, and with Google it's the same.
The 3 Levels are connected like this:
OperationalCostInvoice (belongsTo Object)
-> Object (hasMany OperationalCostTypes)
-> OperationalCostType
With OperationalCostInvoice->get($object_id, ['contain' => 'Object']) I can get the Object that is associated with the OperationalCostInvoice but I also want to fetch the OperationalCostTypes from the Object in (if possible) just one call.
I don't need tips about association linking; the reason the entities are linked like this is that it lets me easily implement a history function.
Thanks in advance!
I just meant one function call (on the Table object) to fetch everything. I know that more than one query is required.
Just create your own table method then and return all your results in one array or implement whatever you want and return it.
public function foo() {
    return [
        'one' => $this->find()...->all(),
        'two' => $this->Assoc->find()...->all(),
    ];
}
But in CakePHP 2 there was the recursive option, which controlled how many levels of associated data were fetched.
recursive was a pretty stupid thing in Cake2. The first thing we always did was to set it to -1 in the AppModel to avoid unnecessary data fetching. Using contain is always the better choice. I would stay away from using recursive at all, especially for deeper levels.
Also, contain is still, as it was in Cake2 as well, able to fetch associations more than one level deep.
$this->find()->contain([
    'FirstLevel' => [
        'SecondLevel' => [
            'ThirdLevel'
        ]
    ]
])->all();
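Applied to the associations from the question, that would be a single get() call (assuming the hasMany association on Object is named OperationalCostTypes; adjust the names to however your table classes actually define them):
// One get() call, three levels deep: invoice -> object -> its cost types.
$invoice = $this->OperationalCostInvoices->get($id, [
    'contain' => [
        'Object' => ['OperationalCostTypes']
    ]
]);
The dot notation 'contain' => ['Object.OperationalCostTypes'] is an equivalent shorthand.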
Hello everyone and thank you for viewing this question.
Since someone asked what I am doing this for, here is the answer:
An artist asked me to make him a web app to store all his new concerts etc. Now, when it comes to adding the instruments, artists and so on, there could be 10 instruments, or maybe 100. Everything is entered into a form. Some data is fixed, like location and time, but these other fields are added dynamically using the DOM.
I am building a system in which the user sets up a form to be stored in a database like:
Name, Surname, field_1
Let's say that this is the "fixed" part of the form, but the user should be able to add 'n' other fields with no limit. Therefore my problem is that I would end up with one row made of, let's say, 4 columns and another one of maybe 100 columns. Then I will need to access these rows, and row one should have 4 cols, row two 100. This can't be done in a "traditional" way, since each row should have the same number of cols.
I thought of creating a new table for each submission, but that doesn't really make much sense to me.
Storing all the possible fields in a single one and then accessing them through an array? That would require too much, especially since my fields should have the possibility to be edited. Each field would then be a mixture of variables, like field1:a=12, field2:b=18... too complex.
Any help would be very appreciated
I would go with the one-field approach. You could have three columns: Name, Surname, and field_values. In the field_values column, store a PHP-serialized string of an array representing what would otherwise be your columns. For example, running:
array(
    'col1' => 'val',
    'col2' => 'val1',
    'col3' => 'val2',
    'col4' => 'val3'
)
through serialize() would give you:
a:4:{s:4:"col1";s:3:"val";s:4:"col2";s:4:"val1";s:4:"col3";s:4:"val2";s:4:"col4";s:4:"val3";}
and you can take this value and run it back through unserialize() to restore your array and use it however you need to. Loading/saving data within this array is no more difficult than changing values in the array before serializing it and then saving it to the field_values column.
With this method you can have as many or few 'columns' as you need with no need for a ton of columns or tables.
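A minimal sketch of that round trip, assuming a submissions table with name, surname and field_values columns and an existing PDO connection $pdo (all of these names are hypothetical placeholders):
// Saving: collapse the dynamic fields into the single field_values column.
$dynamicFields = array('instrument_1' => 'guitar', 'instrument_2' => 'piano');
$stmt = $pdo->prepare(
    'INSERT INTO submissions (name, surname, field_values) VALUES (?, ?, ?)'
);
$stmt->execute(array('John', 'Doe', serialize($dynamicFields)));

// Loading: restore the array and use it like normal columns.
$row = $pdo->query('SELECT * FROM submissions WHERE id = 1')->fetch(PDO::FETCH_ASSOC);
$fields = unserialize($row['field_values']);
echo $fields['instrument_1']; // guitar
Editing a field is then just a matter of unserializing, changing the value, serializing again and updating the row.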
In this case I would personally create a new table for each user, with a new row inserted for every new custom field. You must have a master table containing the table name of each user table so you can access the data within later.
Why is it considered best practice to store your messages or results in a two-dimensional array?
I have turned this over in my mind a lot but failed to produce an exact answer.
The answers I could come up with after a lot of thought are the following:
To store 2 messages at one time
To have the facility to store large messages
To store a large number of messages
Though I am not sure about any of them, and I admit that my problem is not that programming-oriented!
It might be best to look at what the PHP docs have to say about arrays first:
An array in PHP is actually an ordered map. A map is a type that
associates values to keys. This type is optimized for several
different uses; it can be treated as an array, list (vector), hash
table (an implementation of a map), dictionary, collection, stack,
queue, and probably more. As array values can be other arrays, trees
and multidimensional arrays are also possible.
As you can see from that definition, PHP arrays are very flexible and cover a lot of use cases. The particular area you are asking about is the multidimensional (2D) PHP array style. Now take a look at how creating a 2D array looks:
$blank2DArray = array(array());
It's fairly clear that what you have is simply an array of arrays, i.e. a 2D array.
So where 2D arrays are useful is in cases where you have data that goes beyond simple key => value usage. A simple example: you have results from multiple race car drivers and their scores from a race course. Each driver has multiple pieces of information, so you need more than just a single key => value stored per driver. You could make an object with attributes to store this kind of thing, but you could handle it very quickly and simply with a PHP 2D array like this:
$drivers = array();
$drivers[0] = array('driver_id' => 2, 'course_id' => 5, 'score' => 61.6);
$drivers[1] = array('driver_id' => 3, 'course_id' => 4, 'score' => 70.8);
$drivers[2] = array('driver_id' => 8, 'course_id' => 2, 'score' => 76.8, 'winner' => 1);
Each driver and their data are represented by a new array, and each is added with an index (this does not need to be numeric). Notice that $drivers[2] has an attribute winner that the others do not have; this is allowed because PHP allows for jagged arrays, i.e. not all entries have to be the same size. You can easily access child elements of each array like this:
echo $drivers[0]['driver_id'];  // prints 2
echo $drivers[1]['course_id'];  // prints 4
echo $drivers[2]['score'];      // prints 76.8
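And because the structure is just an array of arrays, iterating over it is a plain nested foreach:
foreach ($drivers as $index => $driver) {
    // $driver is itself an associative array of that driver's data.
    echo "Driver #{$index}: ";
    foreach ($driver as $key => $value) {
        echo "{$key}={$value} ";
    }
    echo PHP_EOL;
}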
PHP arrays are excellent for solving a variety of problems and 2D arrays specifically allow for representing complex data far beyond simple key => value storage. For an in-depth look under the hood at PHP arrays check out this blog post: Link
So to answer your question: it may not always be best practice to use a 2D array; it will depend on the problem you are trying to solve. PHP arrays are a Swiss Army knife, and the 2D variety is excellent for solving problems where you need to store variable, complex data elements.
Your question is very broad so I'll cover the two possibilities that come to mind:
1 > You're looking for a way to have PHP access and use multidimensional arrays:
$sData[0] = array("Name" => "T-Rex", "Type" => "dinosaur");
$sData[1] = array("Name" => "Frog", "Type" => "amphibian");
$sData[2] = array("Name" => "Salamander", "Type" => "amphibian");
This will allow you to have multiple rows with multiple sub-rows worth of data inside of them. There's no limit (besides machine memory) as to how many rows deep you can go.
2 > You're trying to figure out how to store that information in the database. In which case you need two tables like such:
Table: types
Structure: id INT(8) autoincrement, typename VARCHAR(65)
Example Data: 0, dinosaur -- 1, amphibian
Table: animals
Structure: id INT(8) autoincrement, type_id INT(8), name VARCHAR(65)
Example Data: 0, 0, T-Rex -- 1, 1, Frog -- 2, 1, Salamander
problem
I have two data tables, SEQUENCES and ORGANISMS, whose many-to-many relationship is mapped in the table SOURCES. There is also a 1:m relationship between SOURCES and ENTRIES. I will append the detailed structure.
What I want to achieve is the display of all sequences with all associated organisms and entries, where a condition within the SEQUENCES table is met. I have some ideas on how to achieve this, but I need the solution with the best performance, as each of these tables contains 50k+ entries.
idea one
Select all organisms that belong to the same sequence as a concatenated string in SQL, and split it in PHP. I have no idea, though, how to do the concatenation in SQL.
idea two
Select the same sequences with different organisms as distinct records, order by organism, and join them later in PHP. Though this somehow feels just wrong.
idea three
Use views. ANY idea on this one is appreciated.
structure
SEQUENCES
SEQUENCE_ID
DESCRIPTION
ORGANISMS
ORGANISM_ID
NAME
SOURCES
SOURCE_ID
SEQUENCE_ID FK to SEQUENCES.SEQUENCE_ID
ORGANISM_ID FK to ORGANISMS.ORGANISM_ID
ENTRIES
SOURCE_ID FK to SOURCES.SOURCE_ID
ENTRY_VALUE
desired outcome
array(
    array(
        "SEQUENCE_ID" => 4,
        "DESCRIPTION" => "Some sequence",
        "SOURCES" => array(
            array(
                "ORGANISM_ID" => 562,
                "ORGANISM_NAME" => "Escherichia coli",
                "ENTRIES" => array(
                    "some entry",
                    "some other entry"
                )
            ),
            array(
                "ORGANISM_ID" => 402764,
                "ORGANISM_NAME" => "Aranicola sp. EP18",
                "ENTRIES" => array()
            )
        )
    ),
    array(
        "SEQUENCE_ID" => 5,
        .....
    )
)
PHP5 and FIREBIRD2.5.1
You can't fetch a nested array like that directly from a flat table structure. But if I get you right, what you want to do is not that hard to achieve.
I don't understand why you would concatenate things and then split them again, that's hard to maintain and probably slow.
I see two approaches here:
Fetch everything at once as a flat table using a JOIN and loop through it in PHP. This approach creates a lot of duplication, but it's fast because you can fetch all data in one query and then process it with PHP.
Fetch every entity separately, then loop and fetch the next hierarchy level as you go. This approach will be slower. It takes complexity away from the SQL query and doesn't fetch redundant data. It also gives you more freedom as to how you loop through your data and what you do with it.
Alternatively you might want to actually store hierarchical data in a NoSQL way, where you could already store the array structure you mentioned.
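For the first approach, here is a minimal sketch of the JOIN plus grouping step, assuming a PDO connection to Firebird (pdo_firebird) in $pdo, the column names from the structure above, and a placeholder WHERE condition on SEQUENCES:
$sql = "
    SELECT s.SEQUENCE_ID, s.DESCRIPTION,
           o.ORGANISM_ID, o.NAME AS ORGANISM_NAME,
           e.ENTRY_VALUE
    FROM SEQUENCES s
    JOIN SOURCES src ON src.SEQUENCE_ID = s.SEQUENCE_ID
    JOIN ORGANISMS o ON o.ORGANISM_ID = src.ORGANISM_ID
    LEFT JOIN ENTRIES e ON e.SOURCE_ID = src.SOURCE_ID
    WHERE s.DESCRIPTION LIKE :condition
";
$stmt = $pdo->prepare($sql);
$stmt->execute(array(':condition' => '%sequence%'));

$sequences = array();
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    $sid = $row['SEQUENCE_ID'];
    $oid = $row['ORGANISM_ID'];

    // First time this sequence shows up: create its node.
    if (!isset($sequences[$sid])) {
        $sequences[$sid] = array(
            'SEQUENCE_ID' => $sid,
            'DESCRIPTION' => $row['DESCRIPTION'],
            'SOURCES'     => array(),
        );
    }
    // First time this organism shows up for the sequence: create its source node.
    if (!isset($sequences[$sid]['SOURCES'][$oid])) {
        $sequences[$sid]['SOURCES'][$oid] = array(
            'ORGANISM_ID'   => $oid,
            'ORGANISM_NAME' => $row['ORGANISM_NAME'],
            'ENTRIES'       => array(),
        );
    }
    // The LEFT JOIN yields NULL when a source has no entries.
    if ($row['ENTRY_VALUE'] !== null) {
        $sequences[$sid]['SOURCES'][$oid]['ENTRIES'][] = $row['ENTRY_VALUE'];
    }
}

// Drop the helper keys so the result matches the numerically indexed outcome above.
foreach ($sequences as &$seq) {
    $seq['SOURCES'] = array_values($seq['SOURCES']);
}
unset($seq);
$sequences = array_values($sequences);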