MySQL dynamic column with PHP

I have a problem deciding how to store some data in MySQL.
I have a website that, when a link is pressed, passes some data to a PHP file, which reads the data via GET and writes it to a MySQL database. I'm passing campaign_id and an unknown number of parameters.
http://domain.com/somefile.php?campaignid=1&parameter1=sometext1&parameter2=sometext2&parameter3=sometext3,....etc..
I don't know the actual number of parameters because the user creates them in some sort of CMS. The problem I'm facing is how to store them in the database. I was thinking of structuring it as below, but I'm not sure it's the right or the most effective way:
Combinations Table
-combination_id (Primary key and auto increment)
-campaign_id
-parameter1
-parameter2
-parameter3
-parameter4
-parameter5
-parameter6
-parameter7
-parameter8
-parameter9
-parameter10
In this example I assume the user will not add/use more than 10 parameters (which I think is lame, but I can't come up with a better solution).
Also, if I use this design, I assume that in the PHP file that receives them I need to check whether each parameter exists (whether it was passed) before writing to the database.

You have to normalize your schema.
Assume the following tables:
Entity: id, campaign_id, other fields.
Parameter: id, entityId, parameterValue.
This is a many-to-one relation (many Parameter rows per Entity row).
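In SQL, that schema could look like the following minimal sketch; the column types are assumptions, only the column names come from the answer above:

CREATE TABLE entity (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    campaign_id INT UNSIGNED NOT NULL
    -- other fields ...
);

CREATE TABLE parameter (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    entityId INT UNSIGNED NOT NULL,
    parameterValue VARCHAR(255) NOT NULL,
    FOREIGN KEY (entityId) REFERENCES entity (id)
);

The foreign key also gives you the index you need to fetch all parameters of one entity quickly.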

What about storing all the parameters as JSON in a single column of one table row?
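A minimal sketch of that idea, assuming a TEXT (or, in MySQL 5.7+, JSON) column named params and an existing PDO connection $pdo; the table and column names are illustrative:

$params = $_GET;
unset($params['campaignid']);            // everything except the campaign id
$stmt = $pdo->prepare('INSERT INTO combinations (campaign_id, params) VALUES (?, ?)');
$stmt->execute([(int) $_GET['campaignid'], json_encode($params)]);

The trade-off: reading a whole set back is one json_decode() call, but searching for a single parameter value inside the JSON is hard to index.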

You could try something like this:
combination_id (primary key auto increment)
campaign_id (indexed / foreign key / must not be unique!)
param_name
param_value
You'd have to create an entry for every parameter you receive, but you could theoretically add a thousand parameters or more.
It might not be the fastest method, though, and it can be a bit hard to work with.
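In PHP that could look like the following sketch, assuming $pdo is an existing PDO connection and the table is the combinations table described above:

$campaignId = (int) $_GET['campaignid'];
$stmt = $pdo->prepare(
    'INSERT INTO combinations (campaign_id, param_name, param_value) VALUES (?, ?, ?)'
);
foreach ($_GET as $name => $value) {
    if ($name === 'campaignid') {
        continue;                        // the campaign id itself is not a parameter
    }
    $stmt->execute([$campaignId, $name, $value]);
}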

I think this is the kind of data NoSQL databases are made for... At least, trying to force it into an SQL database always ends up as some kind of kludge (been there, done that...).
As far as I can see, you have three different ways of storing it:
As you proposed. Probably the easiest way to handle it and also probably the most efficient. But the moment you get an 11th parameter, you are in for major problems...
Make a parameter table: parameter_id, campaign_id, parameter value (plus a parameter name if it matters). This gives you total flexibility, but everything else, except searching for single values, gets more difficult.
Combine the parameters and store them all in a single TEXT or VARCHAR field. This is probably even more efficient than option 1, except when searching for single parameter values (see the sketch after this list).
And if I may add:
Use a database system with an array type, e.g. PostgreSQL.
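A minimal sketch of option 3, assuming a TEXT column named params and an existing PDO connection $pdo (both illustrative); http_build_query gives a compact, reversible encoding:

$params = $_GET;
unset($params['campaignid']);
$combined = http_build_query($params);   // "parameter1=sometext1&parameter2=..."
$stmt = $pdo->prepare('INSERT INTO combinations (campaign_id, params) VALUES (?, ?)');
$stmt->execute([(int) $_GET['campaignid'], $combined]);

// reading it back:
parse_str($row['params'], $decoded);     // $decoded is the original array again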

If you don't know the actual number of parameters that will come through the URL, the best option is to store an arbitrary number of values per campaign_id.
For that you can create multiple rows in the table, like:
insert into table_name values(<campaign_id>,<parameter1>,<sometext>)
insert into table_name values(<campaign_id>,<parameter2>,<sometext>)
insert into table_name values(<campaign_id>,<parameter3>,<sometext>)
insert into table_name values(<campaign_id>,<parameter4>,<sometext>)
This assumes the campaign_id is unique in the URL.

Related

Combine Multiple Rows in MySQL into JSON or Serialize

I currently have a database structure for dynamic forms as such:
grants_app_id | user_id | field_name | field_value
5             | 42434   | full_name  | John Doe
5             | 42434   | title      | Programmer
5             | 42434   | email      | example@example.com
I found this very difficult to manage, and it filled up the number of rows in the database very quickly. I have different field_names that can vary, up to 78 rows per form, so it proved very costly when making updates to the field_values or simply searching them. I would like to combine the rows and use either JSON or PHP's serialize to greatly reduce the impact on the database. Does anyone have any advice on how I should approach this? Thank you!
This would be the expected output:
grants_app_id | user_id | data
5             | 42434   | {"full_name":"John Doe", "title":"Programmer", "email":"example@example.com"}
It seems you don't have a simple primary key in those rows.
Speeding up the current solution:
create an index for (grants_app_id, user_id)
add an auto-incrementing primary key
switch from field_name to field_id
The index will make retrieving full forms a lot more fun (while costing a bit of extra time on insert).
The primary key allows you to update a row by specifying a single value backed by a unique index, which should generally be really fast.
You probably already have some definition of the fields. Add integer IDs and use them to speed up the process, as less data is stored, compared, indexed, ...
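In SQL, those three changes could look like this sketch; the table name grants_fields and the column sizes are assumptions:

ALTER TABLE grants_fields
    ADD COLUMN id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    ADD INDEX idx_form (grants_app_id, user_id);

-- give each field definition an integer id ...
CREATE TABLE field_defs (
    field_id SMALLINT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    field_name VARCHAR(64) NOT NULL UNIQUE
);
-- ... and store field_id in grants_fields instead of field_name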
Switching to a JSON-Encoded variant
Converting arrays to JSON and back can be done with json_encode and json_decode, available since PHP 5.2.
How can you switch to JSON?
Possibly the best way would be to use a PHP script (or similar) to retrieve all data from the old table, group it correctly, and insert it into a fresh table. Afterwards you may switch the names, ... This is an offline approach.
An alternative would be to add a new column and indicate by field_name=NULL that the new column contains the data. Afterwards you are free to convert data at any time or store only new data as JSON.
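A rough sketch of the offline approach, assuming $pdo is an existing PDO connection; the old_fields and new_forms table names are illustrative:

// group the old rows per form and write one JSON row each
$rows = $pdo->query(
    'SELECT grants_app_id, user_id, field_name, field_value
       FROM old_fields
      ORDER BY grants_app_id, user_id'
);
$forms = [];
foreach ($rows as $r) {
    $forms[$r['grants_app_id']][$r['user_id']][$r['field_name']] = $r['field_value'];
}
$ins = $pdo->prepare(
    'INSERT INTO new_forms (grants_app_id, user_id, data) VALUES (?, ?, ?)'
);
foreach ($forms as $appId => $users) {
    foreach ($users as $userId => $fields) {
        $ins->execute([$appId, $userId, json_encode($fields)]);
    }
}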
Use JSON?
While it is certainly tempting to have all data in one row, there are some things to remember:
with all fields preserved in a single text field, searching for a value inside one field may become a two-phase approach, as a % inside any LIKE can skip into another field's values. Also, LIKE '%field:value%' is not easily optimized by indexing the column.
changing a single field means updating all stored fields. As long as you are sure only one process changes the data at any given time this is OK; otherwise there tend to be more problems.
the JSON column needs to be big enough to hold field names + values + separators, which can be a lot. Also, miscalculating the length of any field's value means truncation, with the risk of losing all information on the fields after the long value.
So in your case, even with 78 different fields, it may still be better to have one row per form, user and field. (It may even turn out that JSON is more practical for forms with few fields.)
As explained in this question, you have to remember that JSON is just more text to MySQL.

save array in mysql field and search in that field

I have a mysql table looking like this:
id
some_field1
some_field2
variable_fields
datetime
...
Now I want to store more than 1 value in variable_fields like this:
user_id:5;message_id:10
The reason I do not create a separate field for every value I want to store is that these values differ throughout the project, so I am storing different values as the project goes on.
At some time variable_fields contains this value:
user_id:5;message_id:10
And at some other time it contains this value:
car_id:56;payment_id:45
This wouldn't be a big problem, but I want to be able to search in this field, with something like: variable_fields LIKE '%payment_id:45%'.
This obviously takes time for MySQL. Is there another way of handling this instead of creating a field for every value? Some kind of dynamic field in MySQL?
I'm happy for every kind of help. Thank you in advance!
Best regards,
Freddy
If you add a MyISAM full-text index or employ any other full-text tool on that column (e.g. Sphinx, Lucene), the searches you described will work much better; however, that isn't really advisable.
I would suggest either dividing the dynamic meta data into different tables per case and keeping a type_id in the main table, or keeping columns for all options, set to NULL by default. It really depends on whether there is a simple division or it is truly dynamic and changing over time. If you divide the data into several tables, a JOIN according to type_id gives you the ability to query by those specific field values. Be sure to create an index on the mutual id in both tables.
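A sketch of the per-type table idea; all table and column names here are illustrative:

-- main table keeps a type_id telling which meta table applies
CREATE TABLE events (
    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    type_id TINYINT UNSIGNED NOT NULL,   -- 1 = message meta, 2 = payment meta, ...
    created_at DATETIME NOT NULL
);

CREATE TABLE message_meta (
    event_id INT UNSIGNED NOT NULL PRIMARY KEY,   -- the mutual id
    user_id INT UNSIGNED NOT NULL,
    message_id INT UNSIGNED NOT NULL,
    INDEX (message_id)
);

-- querying by a specific field value then becomes an indexed JOIN:
-- SELECT e.* FROM events e JOIN message_meta m ON m.event_id = e.id WHERE m.message_id = 10;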

Storing values in two MySQL tables

I have a scenario in which I am not sure about what to do.
I have a website where users can update their status. I am allowing the use of hashtags, so a possible user post might look like:
Went for a great hike today!! #hiking
Now, I intend to store the post in a table appropriately named "POSTS", which is structured like this:
post_id | user_id | text | date
When a user submits the form that holds the post text, I run a script to extract all of the hashtag terms the user used and store them in an array.
I can then loop through that array and insert the tags into the aptly named "TAGS" table. The structure of this table is:
tag_id | post_id | user_id | tag
The only problem with this is that I do not know the post_id of the post until after I insert the data into the "POSTS" table (post_id is the primary key and auto-increments).
Now, I was thinking I could just SELECT the last row of data from the "POSTS" table for that user (after I insert the post), and then use the returned post_id in the query that inserts the tag data into the "TAGS" table. This doesn't seem like the best way. My question is:
Is this the best solution or is there a better way to go about this scenario?
I am brand new to Stack Overflow, so please don't downvote me. Comment and tell me what I am doing wrong, and I will learn and ask better questions.
Thanks
You can get the last inserted ID very simply:
mysql_insert_id() if you don't use PDO, or the lastInsertId() method if you do.
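With PDO, the post insert and tag inserts could look like this sketch ($pdo is assumed to be an existing connection; the table and column names are taken from the question):

$stmt = $pdo->prepare('INSERT INTO posts (user_id, text, date) VALUES (?, ?, NOW())');
$stmt->execute([$userId, $text]);
$postId = (int) $pdo->lastInsertId();    // id of the row inserted just above

$tagStmt = $pdo->prepare('INSERT INTO tags (post_id, user_id, tag) VALUES (?, ?, ?)');
foreach ($tags as $tag) {
    $tagStmt->execute([$postId, $userId, $tag]);
}

Both mysql_insert_id() and lastInsertId() are per-connection, so a simultaneous insert by another user cannot hand you the wrong id; that is the main advantage over selecting the last row back.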
Have a new column, unique_id, in both tables, which holds a string you generate in code before querying the database. That way you have an ID to tie posts and tags together before submission. I use this method all the time for similar applications.
The only issue is uniqueness, but there are a variety of ways to generate unique IDs (I normally use a mixture of timestamps and hashing).
This sort of depends on which version of MySQL you're using and how you want to organize your code.
Option 1. Do exactly what you've said. PHP would contain the code to manage the database and how data is stored in it. The only drawback I see in what you've outlined is that if there's an issue while handling the hashtags, you could end up with a post that was inserted into the database while the hashtag part did not complete successfully. For certain applications (like a bank account) this may not be acceptable, and this is what database transactions are for (see the sketch after these options).
Option 2. Another way to handle this would be to write a MySQL stored procedure that does both the insert and the handling of the hashtags. The stored procedure could also wrap the whole thing in a transaction so that your database stays consistent. Note that this requires a version of MySQL that supports stored procedures. The downside is that you would have to write it in SQL, which is different from PHP.
Both MySQL and PHP can handle this application logic/datastore logic; it is a matter of how you want to organize the code. I would prefer keeping the layers distinct. Even if you do this in PHP, at least have a separate class that deals with the database and nothing else. When your code gets bigger, having a separate class, module or namespace that manages this kind of code makes it much easier to change and test.
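A sketch of option 1 wrapped in a transaction (PDO; InnoDB tables assumed, since MyISAM ignores transactions):

try {
    $pdo->beginTransaction();

    $pdo->prepare('INSERT INTO posts (user_id, text, date) VALUES (?, ?, NOW())')
        ->execute([$userId, $text]);
    $postId = (int) $pdo->lastInsertId();

    $tagStmt = $pdo->prepare('INSERT INTO tags (post_id, user_id, tag) VALUES (?, ?, ?)');
    foreach ($tags as $tag) {
        $tagStmt->execute([$postId, $userId, $tag]);
    }

    $pdo->commit();       // post and tags become visible together
} catch (Exception $e) {
    $pdo->rollBack();     // on any failure, neither the post nor any tag is kept
    throw $e;
}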

MySQL – Should I use last_insert_id()? or something else?

I have a table users that has an auto-incrementing id column. For every new user, I essentially need to insert three rows into three different tables: 1. user, 2. user_content, and 3. user_preferences. The rows inserted into user_content and user_preferences are referenced by IDs that correspond to each user's id (held in user).
How do I accomplish this?
Should I do the INSERT INTO user query first, obtaining that auto-incremented id with last_insert_id(), and then the other two INSERT INTO queries using the obtained user id? Or, is there a more concise way to do this?
(note: I am using MySQL and PHP, and if it makes a difference, I am using a bigint to store the id values in all three tables.)
Thank you!
The approach that you've described (insert into user first, take the result of last_insert_id(), and use that to insert into the other two tables) is perfectly reasonable; I see nothing wrong with it.
It might be technically possible to combine the three queries and use the LAST_INSERT_ID() MySQL function to insert values into the other two tables, but this would be significantly more complex without any corresponding benefit. Not really worth doing, in other words.
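For reference, the SQL-only variant could look like this sketch; note that LAST_INSERT_ID() changes again as soon as another auto-increment insert runs, so it is safest to copy it into a variable first (the column lists are illustrative):

INSERT INTO user (name) VALUES ('alice');
SET @uid = LAST_INSERT_ID();                       -- freeze the generated id
INSERT INTO user_content (id, content) VALUES (@uid, '');
INSERT INTO user_preferences (id, prefs) VALUES (@uid, '');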
I see 3 options:
PHP-side, using some *_last_insert_id function (as you describe)
Create a trigger (a sketch follows below)
Use a stored procedure.
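A sketch of the trigger option, assuming the dependent tables use the user's id as their own key; all names are illustrative:

DELIMITER //
CREATE TRIGGER user_after_insert AFTER INSERT ON user
FOR EACH ROW
BEGIN
    -- NEW.id is the auto-increment value just assigned to the new user
    INSERT INTO user_content (id) VALUES (NEW.id);
    INSERT INTO user_preferences (id) VALUES (NEW.id);
END//
DELIMITER ;

With the trigger in place, a single INSERT INTO user produces all three rows, and a failure inside the trigger fails the whole statement.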

Maintain unique id across multiple tables

I want to build a database-wide unique id. That unique id should be one field of every row in every table of that database.
There are a few approaches I have considered:
Create one master table with an auto-increment field, and a trigger in every other table, like:
"before insert here, insert into master-table -> get the auto-increment value -> use this value as the primary key here"
I have seen this before, but instead of making one INSERT it makes two INSERTs, which I expect would not be that performant.
Add a field uniqueId to every table and fill it with a PHP-generated integer, something like a Unix timestamp plus a random number.
But then I would have to use BIGINT as the datatype, which means a big index_length and a big data_length.
Similar to the uniqueId idea, but instead of BIGINT I use VARCHAR and populate the value with uniqid().
Since you are looking for opinions... Of the three ideas you give, I would "vote" for the uniqid() solution. It seems pretty low-cost in terms of execution (but possibly not implementation).
A simpler solution (I think) would be to just add a field to each table to store a GUID, and set the default value of the field to MySQL's GUID-generating function (I think it is UUID). This lets the database do the work for you.
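A sketch of that; note that MySQL only accepts an expression such as UUID() as a column default from version 8.0.13 on, so older servers need a trigger instead (the table name is illustrative):

-- MySQL 8.0.13 or newer:
ALTER TABLE some_table
    ADD COLUMN unique_id CHAR(36) NOT NULL DEFAULT (UUID());

-- older servers, as an alternative:
ALTER TABLE some_table ADD COLUMN unique_id CHAR(36) NULL;
CREATE TRIGGER some_table_uuid BEFORE INSERT ON some_table
    FOR EACH ROW SET NEW.unique_id = COALESCE(NEW.unique_id, UUID());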
And in the spirit of coming up with random ideas... It would be possible to have some kind of offline process fill in the IDs asynchronously. Make sure every table has the appropriate field and make the default value 0/empty. Then the offline process could simply run a query on each table to find the rows that do not yet have a unique ID and fill them in. That would let you control the ID and even use some kind of incrementing integer. This, of course, requires that you do not need the unique ID instantly each time a record is inserted.
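The offline backfill itself could be a single statement per table (the table name is illustrative); UUID() is evaluated once per row, so every updated row gets a distinct value:

UPDATE some_table
   SET unique_id = UUID()
 WHERE unique_id IS NULL;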
