Most optimized/efficient way to update Database - php

I have a data set with more than 10,000 records (this will grow in the future), structured as below:
[
    ['name' => 'name1', 'url' => 'url1', 'visit' => 120],
    ['name' => 'name2', 'url' => 'url2', 'visit' => 250],
    ..........
]
It is possible to have duplicate values for the key combination name, url. In such cases I need to sum the visit values of all records that share the same name, url. Finally I want to insert these values into a database. I can see two methods of doing this:
Method 1: Create another array keyed by the unique combination (name, url) with the summed visit values, then insert that.
Method 2: Update/insert the db row for each record in a loop.
What is the optimal solution, or is there a better way to do this altogether?
I know there will be memory issues with a large data set in the first method. In the second method there are many db hits, and I would like to know the disadvantage(s) of going that way.
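For reference, a rough sketch of what I mean by method 1 (the $records variable and the composite string key are just for illustration):

    // Method 1 sketch: aggregate in PHP before touching the database.
    $totals = [];
    foreach ($records as $r) {
        $key = $r['name'] . '|' . $r['url'];   // composite key for the (name, url) combination
        if (!isset($totals[$key])) {
            $totals[$key] = ['name' => $r['name'], 'url' => $r['url'], 'visit' => 0];
        }
        $totals[$key]['visit'] += $r['visit'];
    }
    // $totals now holds one entry per unique (name, url) with the summed visits.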
Any help or insight would be appreciated.

I do some big database updates like this myself and have spent ages trying different solutions.
Instead of:
Check if the record exists, e.g. SELECT COUNT(id) FROM data WHERE name='name' AND url='url'
Not found: insert the record
Found: add to the existing sum
I would try this:
Set a unique key on your data table covering the url and name fields.
Try to do a normal insert and see if you get a successful result.
On an unsuccessful result (there is already a row with that name and url, because the combination must be unique), add the new visit count to the existing row.
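A rough sketch of that pattern with PDO (the table/column names are assumptions, and PDO must be configured to throw exceptions):

    // Assumes a UNIQUE key on (name, url) and PDO::ERRMODE_EXCEPTION.
    $insert = $pdo->prepare('INSERT INTO data (name, url, visit) VALUES (?, ?, ?)');
    $update = $pdo->prepare('UPDATE data SET visit = visit + ? WHERE name = ? AND url = ?');

    foreach ($records as $r) {
        try {
            $insert->execute([$r['name'], $r['url'], $r['visit']]);
        } catch (PDOException $e) {
            if ($e->getCode() != '23000') {   // 23000 = integrity constraint violation (duplicate key)
                throw $e;
            }
            $update->execute([$r['visit'], $r['name'], $r['url']]);
        }
    }

MySQL can also collapse the two steps into one statement with INSERT ... ON DUPLICATE KEY UPDATE visit = visit + VALUES(visit), which avoids the round trip for the failed insert.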

Related

Is there a way to add multiple values with same ID in two different tables?

Using PHP and MySQL, I need to add multiple values to 2 MySQL tables: the first table would have the more important information and the second one would have the less important information about each item.
To be clearer, ONE element would have its information split across the two tables.
(I need this for several reasons, but two of them are: keeping the first table as light as possible, and having the second table store data that will be erased after a short time, while the first table keeps everything it stored.)
In the best scenario, I'd like to add a row to each table for one item/element, with the same id in both tables. Something like this:
Table 1 id|data_1_a|data_1_b|...
Table 2 id|data_2_a|data_2_b|...
So if I add an element which gets the ID "12345" in table 1, it adds the data to table 2 with the same ID "12345".
To achieve this, I can think of two solutions:
Create the ID myself for each element (instead of having an auto_increment on table 1). The con is that I would probably have to check that the ID doesn't already exist in the tables every time I generate one...
Add the element to table 1, get its ID with $db->lastInsertId(), and use it to add the element's data to table 2. The con is that I have to add elements one by one to get all the IDs, while most of the time I want to add a lot of elements (one, two or three hundred!) at once.
Maybe there's a better way to achieve this?
lastInsertId() reports the first value generated by the last INSERT statement executed. It's reliable to assume that when you insert many rows, they are given consecutive id values following that first value. For example, the MySQL JDBC driver relies on this assumption, so it can report the set of id values generated.
This assumption breaks only if you deliberately set innodb_autoinc_lock_mode=2 (interleaved). See https://dev.mysql.com/doc/refman/8.0/en/innodb-auto-increment-handling.html for details about that.
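Under that assumption, the second option from the question could look roughly like this (PDO; the table names, column names, and the $rows array with its keys are placeholders):

    // Multi-row insert into table 1, then reuse the consecutive ids for table 2.
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO table1 (data_1_a, data_1_b) VALUES $placeholders");

    $params = [];
    foreach ($rows as $r) {
        $params[] = $r['a'];
        $params[] = $r['b'];
    }
    $stmt->execute($params);

    $firstId = (int) $pdo->lastInsertId();   // id of the FIRST row of this multi-row insert

    $stmt2 = $pdo->prepare('INSERT INTO table2 (id, data_2_a) VALUES (?, ?)');
    foreach (array_values($rows) as $i => $r) {
        $stmt2->execute([$firstId + $i, $r['c']]);   // row $i received id $firstId + $i
    }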
But if it were my task, I would still choose to use a single table. When you find you don't need some of the columns anymore, use UPDATE to set them to NULL. This eliminates the problem of ensuring the same id is used across two tables.

Create a Unique MySQL Primary Key Where Only Part of It Is Incremental

In one of my MySQL tables I need to generate a primary key field which is 15 digits in length.
The structure is 2+2+2+2+2+5, e.g. 010101010100001.
The first 10 digits come from five 2-digit form/input fields, and the last 5 characters are unique and incremental. So whenever a form is submitted, that value is increased.
Now how can I achieve that?
I can think of the following method, but I see 2 issues with it:
First use a SELECT query to get the last used id/number, then add +1 to it.
Issue 1:
For this I have to make 2 queries, 1 to select and 1 to update, but I think there must be a better way than this?
Issue 2:
What if hundreds or thousands of users submit the form at the very same time? How can I make sure the key will be unique and won't cause a db error?
Any suggestion/idea would be highly appreciated.
Having a UNIQUE/PRIMARY KEY ensures that you cannot have any duplicate entries in the database. You will get an error from MySQL if you try to create a duplicate entry, so you don't need to check for duplicates yourself; MySQL does that for you. However, you do have to check for the error and react accordingly, e.g. by adding +1 to the value and retrying, depending on your requirements.
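A rough sketch of that insert-and-retry approach (assuming PDO with exceptions enabled; the table name records, the payload column, and the way the key is assembled are illustrative only):

    $prefix = $f1 . $f2 . $f3 . $f4 . $f5;   // five 2-digit fields, 10 digits total

    // Start from the highest suffix already stored for this prefix.
    $stmt = $pdo->prepare(
        "SELECT COALESCE(MAX(CAST(RIGHT(id, 5) AS UNSIGNED)), 0)
           FROM records WHERE id LIKE CONCAT(?, '%')"
    );
    $stmt->execute([$prefix]);
    $suffix = (int) $stmt->fetchColumn() + 1;

    $insert = $pdo->prepare('INSERT INTO records (id, payload) VALUES (?, ?)');
    do {
        $id = $prefix . str_pad($suffix, 5, '0', STR_PAD_LEFT);
        try {
            $insert->execute([$id, $payload]);
            break;                       // success, the key was unique
        } catch (PDOException $e) {
            if ($e->getCode() != '23000') {
                throw $e;                // not a duplicate-key error
            }
            $suffix++;                   // another request took this value; try the next one
        }
    } while (true);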

PHP MySQL inserting/updating multiple values

Let's say I have dynamic numbers with unique IDs attached to them.
I'd like to insert them into the database, but if a certain ID (UNIQUE) already exists, I need to add to the value that is already stored.
I've already tried using ON DUPLICATE KEY UPDATE, but it's not really working out. And selecting the old data, adding to it, and then updating it is not efficient.
Is there any query that could do that?
Incrementing your value in your application does not guarantee you'll always have accurate results in your database because of concurrency issues. For instance, if two web requests need to increment the number with the same ID, depending on when the computer switches the processes on the CPU, you could have the requests overwriting each other.
Instead do an update similar to:
UPDATE `table` SET `number` = `number` + 1 WHERE `ID` = YOUR_ID
Check the return value from the statement. An update should return the number of rows affected, so if the value is 1, you can move on happy to know that you were as efficient as possible. On the other hand, if your return value is 0, then you'll have to run a subsequent insert statement to add your new ID/Value.
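A rough sketch of that update-then-insert pattern with PDO (the table and column names follow the query above; everything else is an assumption):

    $update = $pdo->prepare('UPDATE `table` SET `number` = `number` + :inc WHERE `ID` = :id');
    $update->execute([':inc' => $value, ':id' => $id]);

    if ($update->rowCount() === 0) {
        // No existing row was updated, so insert a new one.
        $insert = $pdo->prepare('INSERT INTO `table` (`ID`, `number`) VALUES (:id, :inc)');
        $insert->execute([':id' => $id, ':inc' => $value]);
    }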
This is also the safest way to ensure concurrency.
Hope this helps and good luck!
Did something different. Instead of updating the old values, I'm inserting new rows and leaving the old ones, but using certain unique keys so I don't get duplicates. To display the data I then use a simple SELECT query with SUM() grouped by the id. Works great, I just don't know if it's the most efficient way of doing it.
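The query behind that would be something along these lines (table and column names are guesses):

    SELECT `ID`, SUM(`number`) AS total
    FROM `table`
    GROUP BY `ID`;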

INSERT IGNORE mysql insert id gaps when writing to unique index workaround

I use a while ($i <= N) loop to get a database row from table TTT, run a certain operation on it, rinse and repeat. I use $i to retrieve the next consecutive id row from the database on each pass of the loop.
I used to run a SELECT query to find the duplicates in table TTT, but with the current size of TTT that is too slow, so I decided to go with a unique index.
Everything works fine and there are no more duplicates; however, there are now gaps in my id values because I use INSERT IGNORE, which breaks my script since it depends on the ids being consecutive.
So how do I adjust my code to get the same functionality with the unique index?
I was thinking of creating a temp table with ordered ids, but that was a bad idea. I also tried getting the next id in the previous pass of the loop with (id > $i), but that would only work for the first gap...
Ok so for my situation where only one script inserts into TTT, my solution is to enable
innodb_autoinc_lock_mode=0
to get the desired functionality back.
More info on different ways to handle the issue - http://dev.mysql.com/doc/refman/5.1/en/innodb-auto-increment-handling.html
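For reference, innodb_autoinc_lock_mode is a startup option rather than a dynamic variable, so it would go in the MySQL configuration file (the section name below assumes a standard my.cnf/my.ini) and requires a server restart:

    [mysqld]
    innodb_autoinc_lock_mode = 0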

mysql query from subquery keeps hanging

Hello, I have a MySQL database and all I want is basically to get a value from the second table based on a query against the first table.
I have come up with something like this, but it is not working:
select src, dst_number, state, duration
from cdrs, area_code_infos
where SUBSTRING(cdrs.src,2,3) = area_code_infos.`npa`;
Please help me figure this out. I have tried in PHP to run multiple queries one after the other, but when the page still hadn't loaded after 45 minutes of waiting I gave up.
Thanks,
I assume the tables are fairly big, and you are also doing an unindexed query: basically the substring has to be calculated for every row.
Whenever you do a join, you want to make sure both of the joined fields are indexed.
One option would be to create another column containing the substring calculation and then create an index on that.
However, a better option would be to have an areaCodeInfosID column and set it as a foreign key to the area_code_infos table.
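A rough sketch of the first option (the new column and index names here are made up):

    -- Store the area code from src once, instead of computing it for every row.
    ALTER TABLE cdrs ADD COLUMN src_npa CHAR(3);
    UPDATE cdrs SET src_npa = SUBSTRING(src, 2, 3);
    CREATE INDEX idx_cdrs_src_npa ON cdrs (src_npa);
    CREATE INDEX idx_area_code_infos_npa ON area_code_infos (npa);

    SELECT c.src, c.dst_number, c.state, c.duration
    FROM cdrs AS c
    JOIN area_code_infos AS a ON a.npa = c.src_npa;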
