How do you save an ArrayCollection field of an object to a MySQL database? I usually would loop over each item in the ArrayCollection and pass it a unique key so I can track it back later, but that's getting cumbersome now.
Is there a way to save an ArrayCollection that also allows easy updates of individual items?
Thanks in advance
Johnny
If you won't update elements of the collection separately, you can store it in one row (the array values joined into one string, separated with ','). Use a BLOB/TEXT field if the string will be long. If you need to update some part of that collection, you could try to split that part out into a separate row, but that's not good practice; if you want to update individual items, putting each element of the collection in a separate row is probably the best solution.
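For illustration, here's a minimal PHP sketch of that single-row approach, assuming a PDO connection and a hypothetical collections table with a TEXT items column (neither is from the question):

// write: join the collection's item values into one comma-separated string
$csv = implode(',', $values); // $values holds the collection items
$stmt = $pdo->prepare('UPDATE collections SET items = ? WHERE id = ?');
$stmt->execute([$csv, $collectionId]);

// read: split the stored string back into an array
$stmt = $pdo->prepare('SELECT items FROM collections WHERE id = ?');
$stmt->execute([$collectionId]);
$values = explode(',', $stmt->fetchColumn());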
I have a string column in a table that holds comma-separated data like 1,2,3,4,5,6; each id is supposed to reference a row in another table.
I have a working solution to fetch the data for those ids, but it's a bit messy. These are the steps of my current solution:
Get all the data first
Loop over the data
Convert the column that holds comma-separated data to an array
Loop over the converted column
Query the other table
Save the queried data
Is there an easy way to achieve this via an Eloquent query?
Thanks
I don't know your exact situation, but you can convert that table column into an n:m relationship. I mean that if you have an "array" column identifying multiple rows in another table, it's possible that the best DB structure is a new pivot table relating the two already existing tables.
For new pivot table in a similar problem see: https://stackoverflow.com/a/58706027/7702532
For subquerying a new table in a similar problem see: https://stackoverflow.com/a/58706063/7702532
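For illustration, a minimal sketch of the pivot approach; Post and Tag are hypothetical stand-ins for your two tables, and post_tag is the new pivot table:

use Illuminate\Database\Eloquent\Model;

// Post currently holds the comma-separated column; Tag is the table its ids point to
class Post extends Model
{
    public function tags()
    {
        // the pivot table post_tag has post_id and tag_id columns
        return $this->belongsToMany(Tag::class, 'post_tag');
    }
}

With that in place, $posts = Post::with('tags')->get(); eager-loads all related rows in one extra query instead of one query per row.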
If you can't create a new table and the subquery option is not available for your version, you can use accessors & mutators on the model attribute so you don't run a query inside a loop over all results of the first query. https://laravel.com/docs/8.x/eloquent-mutators#accessors-and-mutators
I'm thinking of two options for you; I don't have all the details of the problem, but I'm trying to help ;D :
In the accessor you can query the other table and return the correct value, but that's very expensive for your server, because for every item of the first query you're running another new query.
Another option is to return a well-formed array from the accessor. After getting the first query's results, merge all the arrays from your attribute and make a single query to your second table. This option needs a loop over the results of the first table to assign the correct values, filtering the second query's results (a sketch follows below).
The first option is easier to code but more expensive in memory and computation. The second one can be a bit harder to code, but it's a good solution if you can't change the database structure or do the subqueries properly.
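A rough sketch of that second option; the Form and Item models and the item_ids column are assumptions, since I don't have your schema:

// first query: all rows that carry the comma-separated column
$forms = Form::all();

// collect every referenced id, then hit the second table exactly once
$ids = $forms->flatMap(function ($form) {
    return explode(',', $form->item_ids);
})->unique();
$items = Item::whereIn('id', $ids)->get()->keyBy('id');

// loop over the first results and assign the matching rows from memory
foreach ($forms as $form) {
    $form->items = collect(explode(',', $form->item_ids))
        ->map(function ($id) use ($items) {
            return $items->get((int) $id);
        });
}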
On the other hand, I think you could also write a custom relationship or look for a community contribution.
I'm learning Laravel and would like to know how to read data from one database and write it automatically to a second one.
First I read from db1 and it works:
$paciente_q = Pacientes::on('db1')->find($id);
Then I wish to move the data to an identical table on db2 (assigned in the configuration):
Pacientes::create($paciente_q);
The error is that I'm passing an object while ::create wants an array. I converted it to an array but it didn't work. The only option I can find is to build an array with the data by hand and then call ::create. But I think there should be an easier way; I'm talking about 10 columns here.
What could I do if we talk about hundreds of columns?
Your approach probably didn't work because mass assignment is prevented by default for security reasons; you need to list the model's mass-assignable fields in the model's fillable property (an array). If you don't care about that protection, or are sure you'll never mass-assign user input directly to your models, you can make all fields mass assignable by setting the model's guarded property to an empty array.
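A minimal sketch on the Pacientes model; the column list is a placeholder for your actual columns:

use Illuminate\Database\Eloquent\Model;

class Pacientes extends Model
{
    // option 1: whitelist the columns that may be mass assigned
    protected $fillable = ['numerohistoria' /* , ...your other columns */];

    // option 2: make everything mass assignable (only if you trust the input)
    // protected $guarded = [];
}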
Once that's done, your code is mostly correct; just convert the model to an array and don't forget to select the second database when creating the model, like so:
// the model to insert, converted to an array - find() already returns a
// single model for a primary key, so no extra first() or get() is needed
$paciente_q = Pacientes::on("db1")->find($id)->toArray();
// create the same model on the second database
Pacientes::on("db2")->create($paciente_q);
Now, if you want to do this occasionally for a few rows, the above approach is suitable; otherwise you may look at bulk insertion. Here's an example that copies the entire table from your first database to the second one:
// an array with all the rows
$patients = Pacientes::on("db1")->get()->toArray();
// get the model's table name
$table = with(new Pacientes)->getTable();
// bulk insert all these rows into the second database
DB::connection("db2")->table($table)->insert($patients);
Note that here we're not using Eloquent for the insert, so we must first get the table's name from an instance of the model; if the table's name on the second database differs from the first, adjust the $table variable accordingly.
The solution was to change the get() to first() because we were searching for one item. I misread the first solution from @André... sorry! I should learn to read instead of learning Laravel!
$paciente_q = Pacientes::on('db1')->where('numerohistoria',$numerohistoria)->first()->toArray();
Pacientes::create($paciente_q);
Now it works!! Thanks to all and especially to @André!
I currently have a database structure for dynamic forms as such:
grants_app_id | user_id | field_name | field_value
5             | 42434   | full_name  | John Doe
5             | 42434   | title      | Programmer
5             | 42434   | email      | example@example.com
I found this to be very difficult to manage, and it fills up the rows in the database very quickly. The number of field_names varies and can reach 78 rows per form, so it proved very costly to update the field_values or simply search them. I would like to combine the rows and use either JSON or PHP serialize to greatly reduce the impact on the database. Does anyone have advice on how I should approach this? Thank you!
This would be the expected output:
grants_app_id | user_id | data
5             | 42434   | {"full_name":"John Doe", "title":"Programmer", "email":"example@example.com"}
It seems you don't have a simple primary key in those rows.
Speeding up the current solution:
create an index for (grants_app_id, user_id)
add an auto-incrementing primary key
switch from field_name to field_id
The index will make retrieving full-forms a lot more fun (while taking a bit extra time on insert).
The primary key allows you to update a row by specifying a single value backed by a unique index, which should generally be really fast.
You probably already have some definition of the fields. Add integer IDs and use them to speed up the process, as less data is stored, compared, and indexed; a sketch of all three changes follows.
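For illustration, the three changes as SQL run through PDO; the table name grants is a guess, adjust it to yours:

// auto-incrementing primary key, so single rows can be updated fast
$pdo->exec('ALTER TABLE grants ADD COLUMN id INT AUTO_INCREMENT PRIMARY KEY FIRST');

// composite index, so fetching one full form is a single index lookup
$pdo->exec('CREATE INDEX idx_app_user ON grants (grants_app_id, user_id)');

// lookup table, so rows can store a small field_id instead of a field_name string
$pdo->exec('CREATE TABLE fields (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(64) NOT NULL UNIQUE)');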
Switching to a JSON-Encoded variant
Converting arrays to JSON and back can be done using json_encode and json_decode, available since PHP 5.2.
How can you switch to JSON?
Probably the best way would be to use a PHP script (or similar) to retrieve all data from the old table, group it correctly, and insert it into a fresh table; afterwards you may switch the table names. This is an offline approach, sketched below.
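A rough sketch of such a conversion script; grants and grants_json are hypothetical table names:

// group the old EAV rows per (grants_app_id, user_id)
$rows = $pdo->query('SELECT grants_app_id, user_id, field_name, field_value FROM grants')
            ->fetchAll(PDO::FETCH_ASSOC);
$grouped = [];
foreach ($rows as $row) {
    $grouped[$row['grants_app_id'] . ':' . $row['user_id']][$row['field_name']] = $row['field_value'];
}

// write one JSON row per form into the fresh table
$insert = $pdo->prepare('INSERT INTO grants_json (grants_app_id, user_id, data) VALUES (?, ?, ?)');
foreach ($grouped as $key => $fields) {
    list($appId, $userId) = explode(':', $key);
    $insert->execute([$appId, $userId, json_encode($fields)]);
}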
An alternative would be to add a new column and indicate with field_name = NULL that the new column contains the data. Afterwards you are free to convert data at any time, or to store only new data as JSON.
Use JSON?
While it is certainly tempting to have all the data in one row, there are some things to remember:
with all fields preserved in a single text field, searching for a value inside one field may become a two-phase approach, as a % inside any LIKE can skip into other fields' values. Also, LIKE '%field:value%' is not easily optimized by indexing the column.
changing a single field means updating all stored fields. As long as you are sure only one process changes the data at any given time this is OK; otherwise there tend to be more problems.
the JSON column needs to be big enough to hold field names + values + separators, which can be a lot. Also, miscalculating the length of a long value in any field means truncation, with the risk of losing all information on all fields after the long value.
So in your case, even with 78 different fields, it may still be better to have one row per form, user, and field. (It may even turn out that JSON is more practical for forms with few fields.)
As explained in this question, you have to remember that JSON is just more text to MySQL.
In my database there is a table which has a column of type text. This column holds a serialized array. This array is read and stored by another application, and I cannot change its format.
The serialized array holds a selection of database names, table names and column names in two different languages.
I would like to write a controller, entity, form, etc. in Symfony2 that is able to modify this serialized array.
There is a script that I can use that can provide an array of all possible db names, table names and column names that each serialized array may contain.
The goal is to present a list of check boxes where users can select db's, tables and columns. Next, they can do a translation of the names.
Since all data is so volatile, I am not sure whether this is even possible in Symfony2.
An alternative is to make the following entities: { database, table, column } and do it fully OO. And then I could export a selection in a serialized array, to the external application that expects it that way...
Can you guys follow my reasoning? Am I overlooking a strategy here...?
Added:
The array is nested up to the fifth level. Databases contain tables, which contain columns, and every item has an original name and a translated name.
I think you answered your own question:
An alternative is to make the following entities: { database, table, column } and do it fully OO.
And then I could export a selection in a serialized array, to the external application that
expects it that way...
You would start with a master entity mapped to your table.
class SomeEntity
{
    protected $serializedInfo;

    public function getDatabases()
    {
        // unserialize $serializedInfo into an array of Database objects and return them
        return unserialize($this->serializedInfo);
    }
}
You then pass SomeEntity to a SomeEntityFormType, which in turn uses a collection of DatabaseFormTypes. Each DatabaseFormType then has a collection of TableFormTypes, and so on; a sketch follows.
Eventually your form is posted and SomeEntity is updated; you then serialize again before persisting. It should be straightforward. It might be a bit more challenging if you want users to add information, but even then it is doable.
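A minimal Symfony2 sketch of that chain; SomeEntity and DatabaseFormType come from the discussion above, everything else is illustrative:

use Symfony\Component\Form\AbstractType;
use Symfony\Component\Form\FormBuilderInterface;

class SomeEntityFormType extends AbstractType
{
    public function buildForm(FormBuilderInterface $builder, array $options)
    {
        // one embedded DatabaseFormType per object returned by getDatabases()
        $builder->add('databases', 'collection', array(
            'type'      => new DatabaseFormType(),
            'allow_add' => true,
        ));
    }

    public function getName()
    {
        return 'some_entity';
    }
}

DatabaseFormType would in turn add a 'collection' of TableFormTypes, and so on down to the columns.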
I know it's really late, but I was really busy with university, so I couldn't answer sooner.
This is what I think is best to do.
Imagine that the table containing the column which holds your array is called foo.
So you make an entity called Foo with a text field named whatever you like.
Now the tricky part is to make an object called Database that contains all the relations you need (to Table objects, and from Table objects to Column objects).
So even though I told you to make the field type text, you will pass the Database object to this field.
So how is it going to work?
The Database object will have a __toString method that returns the serialized array of the object the way you want.
This way, when Doctrine2 tries to save the Database object in the text field, it is saved as the string that __toString returns.
And you will have a getDatabase method that converts the serialized array back into a Database object.
This is the idea I have; I'm not sure whether it suits you or not. A rough sketch:
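All names here are illustrative, not from any framework API:

class Database
{
    private $name;
    private $tables = array(); // Table objects, which in turn hold Column objects

    public function __toString()
    {
        // this string is what Doctrine2 ends up writing into the text field
        return serialize(array('name' => $this->name, 'tables' => $this->tables));
    }
}

getDatabase() on the owning entity would then do the inverse: unserialize() the stored text and map the nested arrays back onto Database, Table, and Column objects.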
I am curious to know the preferred method for a table which stores a list of IDs associated with a UserID.
Should I have one field storing a serialized array of all the IDs (one row per UserID),
or one row for every ID for every player? Is the answer clear-cut, or are there factors to take into account when making such a decision?
Hope you can help explain the best choice.
If you're going to fetch the whole list of IDs every time or most of the time, or if the lists of IDs are short, then serializing them is fine (though I wouldn't recommend it anyway).
However, you'll miss out on indexing if you ever need to search for a user using an ID in the list as a constraint.
The best way is to have a separate table storing the many-to-many relations (which is what I'm assuming you want); a sketch follows below. If you actually have a one-to-many relationship, then reverse the way you reference the relation and add a user_id column to the table containing the IDs.
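A minimal sketch of that relation table, created here through PDO; table and column names are placeholders:

$pdo->exec('CREATE TABLE user_item (
    user_id INT NOT NULL,
    item_id INT NOT NULL,
    PRIMARY KEY (user_id, item_id), -- fast: all items for a given user
    KEY idx_item (item_id)          -- fast: all users for a given item
)');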
I would keep one row for every ID for every player; to me that is the more manageable and clean approach. How many records are you looking at here, though?