I am experimenting, trying to find the best and most efficient way to alter the data in a given table through a form using PHP.
The scenario is a list of items in a table; if you right-click -> edit an item, a request is made to MySQL for all the data and the fields are populated.
The user can change or leave the data untouched in any of the fields, and then presses save which sends everything back to PHP.
The easy way would be to just update all the columns regardless of whether or not they have changed, i.e.:
$this->model->set('name', 'some name string from the form', $itemId);
$this->model->set('price', 'number from the form', $itemId);
...etc...
So potentially I could change just the name and needlessly update the rest of the columns with the same data as what was received. (As a side question, does MySQL know this and ignore the update behind the scenes?)
Would a good way to perform an intelligent update be to compare two arrays, one containing the original data and the other the data from the user? If the values at a given index don't match, then the field must have changed, so run the update?
i.e. a very simplified example:
if($submittedValues['name'] != $originalValues['name'])
{
...Update...
}
I guess you answered your own question: you could compare the two arrays, either in your PHP code or in JavaScript, and instead of sending everything to the server, only send the changed values.
But in general I wouldn't worry about resetting all the data; rewriting every field can be faster than comparing old and new data in arrays. I would take much more care if I were making many queries to the database, but this is only one UPDATE query.
What could be interesting to test is what happens when the user leaves a field empty: the request will send an empty string, and the update will then store an empty string where a NULL value would carry more meaning.
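If you do go the comparison route, here is a minimal sketch of the idea (the table and column names are made up, and the argument unpacking needs PHP 5.6+): it diffs the two arrays, maps empty strings to NULL as suggested, and updates only the columns that actually changed.

// $original holds the row as loaded from MySQL, $submitted the form input;
// both are column => value arrays with the same (whitelisted) keys.
$changed = array();
foreach ($submitted as $column => $value) {
    if ($value === '') {
        $value = null; // store a real NULL instead of an empty string
    }
    if ($value !== $original[$column]) {
        $changed[$column] = $value;
    }
}

if ($changed) {
    $assignments = array();
    $params = array();
    foreach ($changed as $column => $value) {
        $assignments[] = "`$column` = ?";
        $params[] = $value;
    }
    $params[] = $itemId;
    $sql = 'UPDATE items SET ' . implode(', ', $assignments) . ' WHERE id = ?';
    $stmt = $mysqli->prepare($sql);
    $stmt->bind_param(str_repeat('s', count($params)), ...$params); // 's' is fine for numbers too
    $stmt->execute();
}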
Related
I have an in-house PHP class that I use for Object Relational Mapping. I would like to add an improvement to it though, and I'm not sure how to go about it:
When a new record is created, it has a flag marking it as such. When the record is saved, it checks for that flag; if the flag is set, then any fields that were flagged as "auto_increment" are assigned the mysqli::insert_id value.
What I'd like to do is update this so that ~any~ field that the database updates on the save (e.g. TIMESTAMP DEFAULT NOW()), gets assigned back to the object.
So, the only way that comes to mind is as follows:
Save the record
Grab the auto-increment field if applicable
Reload the record with another SELECT
I guess that would work, but it seems a bit roundabout to me. It also depends on none of the key fields being automatically assigned.
Is there any way to get auto-assigned values other than insert_id from the mysqli object after saving the record?
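A sketch of that save-then-reload approach (the table name, property names, and the use of get_result(), which needs the mysqlnd driver, are all assumptions):

public function save()
{
    if ($this->isNew) {
        $this->insert();
        // insert_id is the only auto-assigned value mysqli hands back directly
        $this->id = $this->mysqli->insert_id;
        $this->isNew = false;
    } else {
        $this->update();
    }

    // Re-read the row so columns the database filled in on its own
    // (e.g. a TIMESTAMP DEFAULT NOW() column) end up on the object too.
    $stmt = $this->mysqli->prepare('SELECT * FROM items WHERE id = ?');
    $stmt->bind_param('i', $this->id);
    $stmt->execute();
    foreach ($stmt->get_result()->fetch_assoc() as $field => $value) {
        $this->$field = $value;
    }
}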
I currently have a database structure for dynamic forms as such:
grants_app_id | user_id | field_name | field_value
5             | 42434   | full_name  | John Doe
5             | 42434   | title      | Programmer
5             | 42434   | email      | example@example.com
I found this to be very difficult to manage, and it filled up the number of rows in the database very quickly. A form can have up to 78 different field_names, so each submission can add up to 78 rows, which proved very costly when updating the field_values or simply searching them. I would like to combine the rows and use either JSON or PHP serialize to greatly reduce the impact on the database. Does anyone have any advice on how I should approach this? Thank you!
This would be the expected output:
grants_app_id | user_id | data
5             | 42434   | {"full_name":"John Doe", "title":"Programmer", "email":"example@example.com"}
It seems you don't have a simple primary key in those rows.
Speeding up the current solution:
create an index for (grants_app_id, user_id)
add an auto-incrementing primary key
switch from field_name to field_id
The index will make retrieving full forms a lot faster (while taking a bit of extra time on insert).
The primary key allows you to update a row by specifying a single value backed by a unique index, which is generally very fast.
You probably already have some definition of the fields. Add integer IDs and use them to speed up the process, as less data has to be stored, compared and indexed.
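Roughly, the first two suggestions translate to the following one-off statements (the table name grants_data is an assumption):

// speed up per-form lookups
$mysqli->query('ALTER TABLE grants_data ADD INDEX idx_app_user (grants_app_id, user_id)');
// give every row a cheap unique handle for updates
$mysqli->query('ALTER TABLE grants_data ADD id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY');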
Switching to a JSON-Encoded variant
Converting arrays to JSON and back can be done by using json_encode and json_decode since PHP 5.2.
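For example, using the field names from the question:

$fields = array(
    'full_name' => 'John Doe',
    'title'     => 'Programmer',
    'email'     => 'example@example.com',
);
$json = json_encode($fields);       // {"full_name":"John Doe","title":"Programmer",...}
$fields = json_decode($json, true); // true returns an associative array instead of a stdClass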
How can you switch to JSON?
Probably the best way would be to use a PHP script (or similar) to retrieve all data from the old table, group it correctly and insert it into a fresh table. Afterwards you can switch the table names. This is an offline approach.
An alternative would be to add a new column and indicate by field_name=NULL that the new column contains the data. Afterwards you are free to convert data at any time or store only new data as JSON.
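A rough sketch of the offline conversion (grants_data as the old table and grants_data_json as the new one are assumed names; for a one-off script, holding everything in memory is usually acceptable):

// Group the old EAV rows per (grants_app_id, user_id) ...
$result = $mysqli->query(
    'SELECT grants_app_id, user_id, field_name, field_value FROM grants_data'
);

$forms = array();
while ($row = $result->fetch_assoc()) {
    $key = $row['grants_app_id'] . ':' . $row['user_id'];
    $forms[$key]['grants_app_id'] = $row['grants_app_id'];
    $forms[$key]['user_id'] = $row['user_id'];
    $forms[$key]['data'][$row['field_name']] = $row['field_value'];
}

// ... and write one JSON row per form into the fresh table.
$stmt = $mysqli->prepare(
    'INSERT INTO grants_data_json (grants_app_id, user_id, data) VALUES (?, ?, ?)'
);
foreach ($forms as $form) {
    $json = json_encode($form['data']);
    $stmt->bind_param('iis', $form['grants_app_id'], $form['user_id'], $json);
    $stmt->execute();
}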
Use JSON?
While it is certainly tempting to have all data in one row, there are some things to remember:
with all fields stored in a single text column, searching for a value inside one field may become a two-phase approach, as a % in a LIKE pattern can match across other fields' values. Also, LIKE '%field:value%' is not easily optimized by indexing the column.
changing a single field means rewriting all stored fields. As long as you are sure only one process changes the data at any given time this is OK; otherwise you risk conflicting updates.
the JSON column needs to be big enough to hold field names + values + separators, which can be a lot. And if you miscalculate and a long value gets truncated, you risk losing all information on all fields after that value.
So in your case, even with 78 different fields, it may still be better to have one row per form, user and field. (It may even turn out that JSON is more practical for forms with few fields.)
As explained in this question, you have to remember that JSON is just more text as far as MySQL is concerned.
I've got a PHP script pulling a file from a server and plugging the values from it into a database every 4 hours.
This file can, and most likely will, change within the 4 hours (or whatever timeframe I finally choose). It's a list of properties and their owners.
Would it be better to check the file and compare it to each DB entry and update any if they need it, or create a temp table and then compare the two using an SQL query?
Neither.
What I'd personally do is run the INSERT command using ON DUPLICATE KEY UPDATE (assuming your table is properly designed and that you are using at least one piece of information from your file as a UNIQUE key, which, based on your comment, you should).
Reasons
Creating a temp table is a hassle.
Comparing is a hassle too. You need to select a record, compare it, update it if it differs, and so on; it's a giant waste of time to compare each piece of info when there's a better way to do it.
It is so much easier to just insert everything you find; if a clash occurs, that means the record exists and most likely needs updating.
That way you take care of everything with one query, your data integrity is preserved, and you can just keep filling your table or updating it with new records.
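A sketch of that, assuming each line of the file holds a property id and an owner, and that property_id carries the UNIQUE key:

$stmt = $mysqli->prepare(
    'INSERT INTO properties (property_id, owner) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE owner = VALUES(owner)'
);
foreach (file('properties.txt', FILE_IGNORE_NEW_LINES) as $line) {
    list($propertyId, $owner) = explode(',', $line, 2); // file format is an assumption
    $stmt->bind_param('ss', $propertyId, $owner);
    $stmt->execute(); // inserts new properties, updates owners of existing ones
}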
I think it would be best to download the file and update the existing table, maybe using REPLACE or REPLACE INTO. "REPLACE works exactly like INSERT, except that if an old row in the table has the same value as a new row for a PRIMARY KEY or a UNIQUE index, the old row is deleted before the new row is inserted." http://dev.mysql.com/doc/refman/5.0/en/replace.html
Presumably you have a list of columns that will have to match in order for you to decide that the two things match.
If you create a UNIQUE index over those columns then you can use either INSERT ... ON DUPLICATE KEY UPDATE (manual) or REPLACE INTO ... (manual)
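For instance, with that UNIQUE index in place (table and column names assumed), REPLACE INTO is a one-liner; just note that it deletes and re-inserts the row, so any columns you don't list fall back to their defaults:

$mysqli->query("REPLACE INTO properties (property_id, owner) VALUES ('P-1001', 'John Doe')");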
I have a many-to-many table, in which I input some form info. I recently made this form dynamic, so that when an input element's value is changed, it is sent to the database via AJAX.
So my question is:
Is it faster to find the values that exist, edit them, create the ones that don't and delete the ones that are no longer used, OR should I delete all of the values for the id and insert all of the new ones?
In response to a comment, an elaboration.
A form that has about 10 fields, some of them mandatory, some not. Every time you access it, it generates a random identifier.
When a user starts filling in the form, every time an element loses focus the whole form is submitted through AJAX, and all of the values that are not empty are written to the many-to-many table.
The table has 3 fields: form identifier, element name, element value.
The question rephrased:
Do I delete all of the entries with the given form identifier, or try to find the fields and edit them?
It will require less code to delete all the existing relations and add new ones
Make sure you do this in a transaction
Handle errors correctly
Less code == fewer bugs, less developer time. So that is definitely faster.
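A minimal sketch of that, using the three-column table from the question (it assumes an InnoDB table, PHP 5.5+ for begin_transaction(), and mysqli in exception mode so the catch block actually fires):

mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

$mysqli->begin_transaction();
try {
    $stmt = $mysqli->prepare('DELETE FROM form_values WHERE form_id = ?');
    $stmt->bind_param('s', $formId);
    $stmt->execute();

    $stmt = $mysqli->prepare(
        'INSERT INTO form_values (form_id, element_name, element_value) VALUES (?, ?, ?)'
    );
    foreach ($values as $name => $value) {
        if ($value === '') {
            continue; // only non-empty values are stored, per the question
        }
        $stmt->bind_param('sss', $formId, $name, $value);
        $stmt->execute();
    }

    $mysqli->commit();
} catch (Exception $e) {
    $mysqli->rollback(); // nothing is half-applied if any statement fails
    throw $e;
}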
I always delete all & insert in cases like this. I'd suspect it takes more processing time to search, edit, create and delete.
You can also try looking at:
INSERT INTO table (field1, field2) VALUES ('Value1', 'Value2')
ON DUPLICATE KEY UPDATE field1 = 'Value1'
which will insert a new record or update an existing one (I'd still suspect the delete/insert approach to be faster), depending on the number of fields you'd be updating at any given time.
I want to use temporary tables in my PHP code, for a form that will be mailed. I do use session variables and arrays, but some of the data filled in must be stored in table format, and the user must be able to delete entries in case of typos etc. Doing this with arrays could work (not sure), but I'm kind of new at PHP and using tables seems so much simpler.
My problem is that using mysql_connect creates the table and adds the line of data, but when I add my 2nd line it drops the table and creates it again. Using mysql_pconnect works in that it doesn't drop the table, but it sometimes creates more than one instance of the table, and deleting entries? What a mess!
How can I best use temporary tables and not have them dropped when my page refreshes? Not using temporary tables may cause other issues if the user closes the page, leaving the table behind in the database.
Sounds like a mess! I am not sure why you are using a temp table at all, but you could create a random table name and assign it to a session variable. But this is hugely wrong as you would have a table for each user!
If you must use a database, add a field to the table called sessionID. When you do your inserting/deleting, reference the PHP session id.
Just storing the data in the session would probably be much easier though...
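Something along these lines, with an assumed form_entries table:

// Tag every row with the PHP session id so each visitor only sees their own rows.
$sid = session_id();
$stmt = $mysqli->prepare('INSERT INTO form_entries (session_id, entry) VALUES (?, ?)');
$stmt->bind_param('ss', $sid, $entry);
$stmt->execute();

// Deleting a mistyped entry then only needs the session id plus the row id.
$stmt = $mysqli->prepare('DELETE FROM form_entries WHERE session_id = ? AND id = ?');
$stmt->bind_param('si', $sid, $rowId);
$stmt->execute();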
Better to create a permanent table and temporary rows. So, say you've serialized the object holding all your semi-complete form data as $form_data. At the end of the script, the last thing that should happen is that $form_data is stored to the database and the resulting row id is stored in your $_SESSION. Something like:
$form_data = $mysqli->real_escape_string(serialize($form_object)); // you can also serialize arrays, etc.
// $form_data may also need to be base64_encoded
$q = "INSERT INTO incomplete_form_table (thedata) VALUES ('$form_data')";
$r = $mysqli->query($q);
$id = $mysqli->insert_id;
$_SESSION['currentform'] = $id;
Then, when you get to the next page, you reconstitute your form like this:
$q="SELECT thedata FROM incomplete_form_table WHERE id={$_SESSION['currentform']}";
$r=$mysql->query($q);
$form_data=$r->fetch_assoc();
$form=$form_data['thedata'];//if it was base64_encoded, don't forget to unencode it first
You can (auto-) clean up the incomplete_form_data table periodically if the table has a timestamp field. Just delete everything that you consider expired.
The above code has not been exhaustively checked for syntax errors.
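For the periodic cleanup mentioned above, a single query from a cron job is enough (the created_at column and the one-day cut-off are assumptions):

$mysqli->query('DELETE FROM incomplete_form_table WHERE created_at < NOW() - INTERVAL 1 DAY');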