MySQL – Should I use last_insert_id()? or something else? - php

I have a table users that has an auto-incrementing id column. For every new user, I essentially need to insert one row into each of three tables: 1. user, 2. user_content, and 3. user_preferences. The rows inserted into user_content and user_preferences are referenced by ids that correspond to each user's id (held in user).
How do I accomplish this?
Should I do the INSERT INTO user query first, obtaining that auto-incremented id with last_insert_id(), and then the other two INSERT INTO queries using the obtained user id? Or, is there a more concise way to do this?
(note: I am using MySQL and PHP, and if it makes a difference, I am using a bigint to store the id values in all three tables.)
Thank you!

The approach that you've described (insert into user first, take the result of last_insert_id(), and use that to insert to the other two tables) is perfectly reasonable; I see nothing wrong with it.
It might be technically possible to combine the three queries and use the LAST_INSERT_ID() MySQL function to insert values into the other two tables, but this would be significantly more complex without any corresponding benefits. Not really worth doing, in other words.
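For what it's worth, here is a minimal PDO sketch of that flow, wrapped in a transaction so a failure can't leave a user row without its two companion rows (the non-id column names are placeholders, not from the question):

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    // 1. Insert the user row; id is filled in by AUTO_INCREMENT.
    $pdo->prepare('INSERT INTO user (name) VALUES (?)')->execute([$name]);

    // 2. Grab the generated id (PDO's wrapper around last_insert_id()).
    $userId = (int) $pdo->lastInsertId();

    // 3. Reuse it for the two dependent rows.
    $pdo->prepare('INSERT INTO user_content (user_id) VALUES (?)')->execute([$userId]);
    $pdo->prepare('INSERT INTO user_preferences (user_id) VALUES (?)')->execute([$userId]);

    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack();
    throw $e;
}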

I see 3 options:
PHP side using some *_last_insert_id (as you describe)
Create a trigger (see the sketch after this list)
Use a stored procedure.
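For option 2, a rough sketch of what such a trigger could look like, assuming user_content and user_preferences rows can be created with just the user id (everything besides the table names from the question is illustrative):

// Runs once at setup time: afterwards every insert into user
// automatically creates the two dependent rows.
$pdo->exec("
    CREATE TRIGGER user_after_insert
    AFTER INSERT ON user
    FOR EACH ROW
    BEGIN
        INSERT INTO user_content (user_id) VALUES (NEW.id);
        INSERT INTO user_preferences (user_id) VALUES (NEW.id);
    END
");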

Related

How to increase performance of MySQL database insertion

I'm working on a PHP project related to web scraping, and my aim is to store the data in a MySQL database. I'm using a unique key index across 3 of the 9 columns in the table, and there are more than 5k records.
Should I check for unique data at the program level, e.g. by putting values in arrays and comparing them before inserting into the database?
Is there any way I can speed up my database insertion?
Never create a duplicate table; that is an SQL anti-pattern and it makes it more difficult to work with your data.
PDO and prepared statements may give you a little boost, but don't expect wonders from them.
Multiple INSERT IGNORE statements may also give you a little boost, but again, don't expect wonders.
You should generate a multi-row INSERT query, like so:
INSERT INTO database.table (columns) VALUES (values),(values),(values)
Keep in mind to stay under MySQL's maximum packet size (max_allowed_packet).
This way the index only has to be updated once.
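A rough PHP/PDO sketch of generating that query in chunks (the table and column names are placeholders, not from the question):

// $rows holds the scraped records, each as [col_a, col_b, col_c].
// Chunking keeps each statement under max_allowed_packet.
foreach (array_chunk($rows, 500) as $chunk) {
    $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?, ?)'));
    $stmt = $pdo->prepare(
        "INSERT IGNORE INTO scraped_data (col_a, col_b, col_c) VALUES $placeholders"
    );
    $stmt->execute(array_merge(...$chunk));   // flatten the chunk into one parameter list
}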
You could create a duplicate of the table that you currently have, except with no indices on any field, and store the incoming data in this table.
Then use events to move the data from the temp table into the main table. Once the data has been moved to the main table, delete it from the temp table.
You can track your inserts with a trigger: update the table as usual, and write a trigger for this table that does the extra work.
Use PDO or the mysqli_* functions to speed up insertion into the database.
You could use "INSERT IGNORE" in your query. That way the record will not be inserted if any unique constraints are violated.
Example:
INSERT IGNORE INTO table_name SET name = 'foo', value = 'bar', id = 12345;

MySQL dynamic columns with PHP

I have a problem with how to store some data in MySQL.
I have a website which, when a link is pressed, passes some data to a PHP file, which reads this data with GET and writes it to a MySQL database. I'm passing campaign_id and an unknown number of parameters.
http://domain.com/somefile.php?campaignid=1&parameter1=sometext1&parameter2=sometext2&parameter3=sometext3,....etc..
I don't know the actual number of parameters because the user creates them in some sort of CMS. The problem I'm facing is how to store them in the database. I was thinking of the design below, but I'm not sure if it's the right and most effective way:
Combinations Table
-combination_id (Primary key and auto increment)
-campaign_id
-parameter1
-parameter2
-parameter3
-parameter4
-parameter5
-parameter6
-parameter7
-parameter8
-parameter9
-parameter10
In this example I assume that the user will not add/use more than 10 parameters (which I think is lame, but I can't come up with a better solution).
Also, if I use this design, I assume I need to check, in the file where I receive them, whether each parameter exists (i.e. whether it was passed) before writing to the database.
You have to normalize your schema.
Assume the following tables:
Entity: id, campaign_id, other fields.
Parameter: id, entityId, parameterValue.
This is a Many-to-One relation.
What about storing all the parameters as JSON in one table row?
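A minimal sketch of that idea (params_json is a made-up column name; it would just be a TEXT column on the combinations table):

// Store everything except the campaign id as one JSON blob.
$params = $_GET;
unset($params['campaignid']);
$pdo->prepare('INSERT INTO combinations (campaign_id, params_json) VALUES (?, ?)')
    ->execute([(int) $_GET['campaignid'], json_encode($params)]);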
You could try something like this:
combination_id (primary key auto increment)
campaign_id ( indexed / foreign key / can't be unique!)
param_name
param_value
You'd have to create an entry for every parameter you're getting, but you could theoretically add a thousand parameters or more.
Might not be the fastest method though and can be a bit hard to work with.
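A sketch of how somefile.php could fill such a table, assuming a PDO connection and the table name combinations from the question:

$campaignId = (int) $_GET['campaignid'];
$stmt = $pdo->prepare(
    'INSERT INTO combinations (campaign_id, param_name, param_value) VALUES (?, ?, ?)'
);
// One row per parameter, however many were passed.
foreach ($_GET as $name => $value) {
    if ($name === 'campaignid') {
        continue;
    }
    $stmt->execute([$campaignId, $name, $value]);
}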
I think this is the kind of data NoSQL databases are made for... At least, trying to force it into an SQL database always ends up as some kind of kludge (been there, done that...).
As far as I can see, you have three different ways of storing it:
1. As you proposed. Probably the easiest way to handle it, and also probably the most efficient. But the moment you get 11 parameters, you are in for major problems...
2. Make a parameter table: parameter_id, campaign_id, parameter value (and possibly a parameter name, if it matters). This gives you total flexibility, but everything other than searching for single values gets more difficult.
3. Combine the parameters and store them all in one text or varchar field. This is probably even more efficient than 1, except when searching for single parameter values.
And if I may add:
4. Use a database system with an array type, e.g. PostgreSQL.
If you don't know the actual number of parameters that will come through the URL, the best option is to store an arbitrary number of values per campaign_id.
For that you can create multiple rows in the table, like:
insert into table_name values(<campaign_id>,<parameter1>,<sometext>)
insert into table_name values(<campaign_id>,<parameter2>,<sometext>)
insert into table_name values(<campaign_id>,<parameter3>,<sometext>)
insert into table_name values(<campaign_id>,<parameter4>,<sometext>)
This assumes the campaign_id is unique in the URL.

Inserting in two tables with a single query

I am developing a web app using Zend Framework, and my problem is about combining 2 SQL queries to improve efficiency. My table structure is like this:
>table message
id(int auto incr)
body(varchar)
time(datetime)
>table message_map
id(int auto incr)
message_id(foreign key from message table's id column)
sender(int) comment 'user id of sender'
receiver(int) comment 'user id of receiver'
To get the code working, I first insert the message body and time into the message table, and then, using the last inserted id, I insert the message sender and receiver into the message_map table. What I want to do now is accomplish this task in a single query, since one query would be more efficient. Is there any way to do so?
No there isn't. You can insert in only one table at once.
But I can't imagine you need to insert so many messages that performance really becomes an issue. Even with these separate statements, any database can easily insert thousands of records a minute.
bulk inserts
Of course, when inserting multiple records into the same table, that's a different matter. This is indeed possible in MySQL and it will make your query a lot faster. It will give you trouble, though, if you need the insert ids of all those records.
mysql_insert_id() returns the first id inserted by the last INSERT statement, if it was a bulk insert. So you could query all ids that are >= that id. That should give you all the records you just inserted, although the result may contain ids that other people inserted between your insert and the subsequent query for those ids.
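A small illustration of that behavior with mysqli (the table and column names are made up):

// Bulk insert of three rows in one statement.
$mysqli->query("INSERT INTO items (name) VALUES ('a'), ('b'), ('c')");

// insert_id holds the id generated for the FIRST of the three rows.
$firstId = $mysqli->insert_id;

// Subject to the race described above, the new rows can then be fetched with:
$result = $mysqli->query("SELECT id FROM items WHERE id >= $firstId");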
If it's only for these two tables, why don't you create a single table having all these columns in one, like:
>table message
id(int auto incr)
body(varchar)
sender(int) comment 'user id of sender'
receiver(int) comment 'user id of receiver'
time(datetime)
Then it will work the way you want.
I agree with GolezTrol. Otherwise, if you want optimized performance for your query, you might choose to use stored procedures.
Indeed, combining those two inserts isn't possible. While you can use JOIN in SELECT queries, you can't combine INSERT queries. If you're really worried about performance, isn't there any way to join those two tables together? As far as I can see there's no point in keeping them separate; they're both about the message.
As stated before, executing a second insert query isn't that much of a server load, by the way.
As others have pointed out, you cannot really insert into multiple tables at once. And you should not really be worried about performance unless you are inserting thousands of messages in a short period of time.
Now, there is one thing you could worry about. Imagine you first insert the message body, and then try to insert the receiver/sender IDs. Suppose the first insert succeeds, while the second (for whatever reason) fails. That would corrupt your data a bit. To avoid that, you can use transactions, e.g.:
mysql_query("START TRANSACTION", $connection);
//your code
mysql_query("COMMIT", $connection);
That would ensure that either both inserts get into the database, or neither do. If you are using PDO, look into http://www.php.net/manual/en/pdo.begintransaction.php for examples.
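For completeness, a sketch of the same pattern with PDO, using the tables from the question ($body, $senderId and $receiverId are assumed to come from the application):

$pdo->beginTransaction();
try {
    $pdo->prepare('INSERT INTO message (body, time) VALUES (?, NOW())')
        ->execute([$body]);
    $messageId = (int) $pdo->lastInsertId();

    $pdo->prepare('INSERT INTO message_map (message_id, sender, receiver) VALUES (?, ?, ?)')
        ->execute([$messageId, $senderId, $receiverId]);

    $pdo->commit();             // both rows become visible together
} catch (Exception $e) {
    $pdo->rollBack();           // neither row is written if anything failed
    throw $e;
}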

bulk database insert with id

I have hundreds of thousands of elements to insert into a database. I realized that calling an insert statement per element is way too costly and I need to reduce the overhead.
I reckon each insert can have multiple data elements specified, such as:
INSERT INTO example (Parent, DataNameID) VALUES (1,1), (1,2)
My issue is that since the "DataName" keeps repeating itself for each element, I thought it would optimize space if I stored those string names in another table and referenced them.
However, that causes problems for my idea of the bulk insert, which now requires a way to evaluate the ID from the name before calling the bulk insert.
Any recommendations?
Should I simply de-normalize and insert the data as plain strings into the table every time?
Also, what is the limit on the size of the query string? Mine amounts to almost 1.2 MB.
I am using PHP with a MySQL backend.
You haven't given us a lot of info on the database structure or size, but this may be a case where absolute normalization isn't worth the hassle.
However if you want to keep it normalized and the strings are already in your other table (let's call it datanames), you can do something like
INSERT INTO example (Parent, DataNameID) VALUES
(1, (select id from datanames where name='Foo')),
(1, (select id from datanames where name='Bar'))
First you should insert the name into the names table.
Then call LAST_INSERT_ID() to get the id.
Then you can do your normal inserts.
If your table is MyISAM-based you can use INSERT DELAYED to improve performance: http://dev.mysql.com/doc/refman/5.5/en/insert-delayed.html
You might want to read up on load data (local) infile. It works great, I use it all the time.
EDIT: this answer only addresses the sluggishness of individual inserts. As @bemace points out, it says nothing about string IDs.
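A rough sketch of that approach with PDO, assuming local_infile is enabled on both client and server and that $rows already holds the (Parent, DataNameID) pairs:

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
    PDO::MYSQL_ATTR_LOCAL_INFILE => true,   // client-side switch; the server must allow it too
]);

// Dump the rows to a temporary CSV file, then load it in one statement.
$csv = tempnam(sys_get_temp_dir(), 'bulk');
$fh = fopen($csv, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);

$pdo->exec(
    "LOAD DATA LOCAL INFILE " . $pdo->quote($csv) .
    " INTO TABLE example FIELDS TERMINATED BY ',' (Parent, DataNameID)"
);
unlink($csv);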

Best way to INSERT autoincrement field? (PHP/MySQL)

I have to insert data into two tables, Items and Class_Items. (A third table, Classes, is related here but is not being inserted into.)
The primary key of Items is Item_ID, and it's an auto-incrementing integer. Aside from this primary key, there are no unique fields in Items. I need to know what the Item_ID is to match it to Classes in Class_Items.
This is all being done through a PHP interface. I'm wondering what the best way is to insert the Items and then match their Item_IDs in Class_Items. Here are the two main options I see:
INSERT each Item, then use mysql_insert_id() to get its Item_ID for the Class_Items INSERT query. This means one query for every Item (thousands of queries in total).
Get the next Autoincrement ID, then LOCK the Class_Items table so that I can just keep adding to an $item_id variable. This would mean just two queries (one for the Items, one for the Class_Items)
Which way is best and why? Also, if you have an unlisted alternative I'm open to whatever is most efficient.
The most efficient is probably going to be to use parameterized queries. That would require using the mysqli functions, but if you're at the point of needing to optimize this kind of query, you should think about being there anyway.
No matter how you cut it, you've got two inserts to make. Doing the first, grabbing the new ID value as you've described (which imposes insignificant overhead, because the value is already on hand to MySQL), and using it in the second insert is pretty minimal.
I would investigate using stored procedures and/or transactions to make sure nothing bad happens.
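A sketch of option 1 with mysqli prepared statements inside a transaction (the Name and Class_ID columns and the $items structure are assumptions for illustration):

$mysqli = new mysqli('localhost', 'user', 'pass', 'mydb');

$itemStmt  = $mysqli->prepare('INSERT INTO Items (Name) VALUES (?)');
$classStmt = $mysqli->prepare('INSERT INTO Class_Items (Item_ID, Class_ID) VALUES (?, ?)');

$mysqli->begin_transaction();
foreach ($items as $item) {            // $items: arrays with 'name' and 'class_id' keys
    $itemStmt->bind_param('s', $item['name']);
    $itemStmt->execute();

    $itemId = $mysqli->insert_id;      // auto-increment id of the row just inserted

    $classStmt->bind_param('ii', $itemId, $item['class_id']);
    $classStmt->execute();
}
$mysqli->commit();                     // one commit for the whole batch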
I'm working on a project with MySQL and what I did is the following (without using auto-increment fields):
1- I created a table called SEQUENCE with one field of type BIGINT called VALUE, with an initial value of 1. This table stores the id value that will be incremented each time you insert a new record.
2- Create a stored procedure that handles the id increment inside it, within a transaction.
Here is an example.
CREATE PROCEDURE `SP_registerUser`(
    IN _username VARCHAR(40),
    IN _password VARCHAR(40)
)
BEGIN
    DECLARE seq_user BIGINT;
    START TRANSACTION;
    #Validate that user does not exist etc..........
    #Register the user
    SELECT value INTO seq_user FROM SEQUENCE FOR UPDATE; #lock the row so concurrent calls can't grab the same id
    UPDATE SEQUENCE SET value = value + 1;
    INSERT INTO users VALUES(seq_user, _username, SHA1(_password));
    INSERT INTO user_info VALUES(seq_user, UTC_TIMESTAMP());
    COMMIT;
END //
In my case I want to store the user id in two different tables (users and user_info).
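Registering a user from PHP is then a single call, e.g. with PDO:

// The whole sequence bump plus both inserts happens server-side.
$pdo->prepare('CALL SP_registerUser(?, ?)')->execute([$username, $password]);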
