PHP MySQL inserting/updating multiple values

Let's say I have dynamic numbers, each with a unique ID.
I'd like to insert them into the database, but if a given ID (UNIQUE) already exists, I need to add the new number to the value that is already stored.
I've already tried ON DUPLICATE KEY UPDATE, but it's not really working out, and selecting the old data, adding to it, and then updating it is not efficient.
Is there any query that could do that?

Incrementing the value in your application does not guarantee you'll always have accurate results in your database, because of concurrency issues. For instance, if two web requests need to increment the number with the same ID, then depending on how the processes are scheduled on the CPU, the requests could end up overwriting each other.
Instead do an update similar to:
UPDATE `table` SET `number` = `number` + 1 WHERE `ID` = YOUR_ID
Check the return value from the statement. An update should return the number of rows affected, so if the value is 1, you can move on, happy in the knowledge that you were as efficient as possible. On the other hand, if the return value is 0, you'll have to run a subsequent INSERT statement to add your new ID/value.
This is also the safest way to ensure concurrency.
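For illustration, a rough PDO sketch of that flow; $pdo, the counters table, and the id/number columns are placeholder names, not taken from the question:
$delta = 5;   // amount to add (placeholder value)
$id    = 42;  // the unique ID (placeholder value)
$update = $pdo->prepare("UPDATE `counters` SET `number` = `number` + :delta WHERE `id` = :id");
$update->execute([':delta' => $delta, ':id' => $id]);
if ($update->rowCount() === 0) {
    // No row with this ID yet, so insert one instead.
    $insert = $pdo->prepare("INSERT INTO `counters` (`id`, `number`) VALUES (:id, :delta)");
    $insert->execute([':id' => $id, ':delta' => $delta]);
}
Note that two concurrent requests could still both see 0 affected rows here, so it is worth catching a duplicate-key error on the INSERT (or wrapping the pair in a transaction) as well.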
Hope this helps and good luck!

I ended up doing something different. Instead of updating the old values, I'm inserting new rows and keeping the old ones, but with the right unique keys so I don't end up with duplicates. To display the data I then use a simple SELECT with SUM(), grouped by the ID. Works great, I just don't know if it's the most efficient way of doing it.
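For reference, a sketch of that kind of sum-and-group query (entries, item_id, and value are placeholder names):
SELECT item_id, SUM(value) AS total
FROM entries
GROUP BY item_id;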

Related

Resetting MySQL auto-increment field value after delete from PHP?

I have a form from which I am inserting data into MySQL, and it works fine. But when I delete some rows from MySQL and then insert values into the database again, the auto-increment value continues from the previous row value.
For example:
If I have 1, 2, 3, 4, 5 as IDs in my database and I delete IDs 4 and 5,
then when I start inserting the next data from PHP, the IDs continue from 6. But I need the next ID to be 4. Can anyone give suggestions? Thanks in advance.
I'm afraid MySQL does not allow you to "reset" AUTO_INCREMENT fields like that. If you need that behavior, you have to stop using AUTO_INCREMENT and generate your IDs manually.
Auto increment does not (and cannot) guarantee an unbroken sequence.
You can implement this yourself as "SELECT MAX(ID) + 1 FROM MYTABLE;"
But be warned: You will take a slight but noticeable performance hit.
If you are running these inserts concurrently, you risk deadlocks,
and (again, if concurrent) you risk two inserts ending up with the same key.
You can also implement this by keeping your own counter in a separate table. You need program logic to decrement it correctly on a deletion, and again you will take a performance hit and risk deadlocks, as the counter row becomes an object of contention.
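For illustration, a rough sketch of the SELECT MAX(ID) + 1 approach wrapped in a transaction (the name column is just a placeholder); the locking involved is exactly where the contention and deadlock risk mentioned above comes from:
START TRANSACTION;
SELECT COALESCE(MAX(ID), 0) + 1 INTO @next_id FROM MYTABLE FOR UPDATE;
INSERT INTO MYTABLE (ID, name) VALUES (@next_id, 'example');
COMMIT;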
You should not play with the AUTO_INCREMENT value in a production environment; let MySQL take care of it for you.
If you need to know how many rows you have, you can use
SELECT COUNT(id) FROM tbl;
Anyway, if you really want to change its value, the syntax is:
ALTER TABLE tbl AUTO_INCREMENT=101;

Auto Increment skipping numbers?

Note: I'm new to databases and PHP
I have an order column that is set to auto increment and unique.
In my PHP script I am using AJAX to get new data, but the problem is that the order value skips numbers and ends up substantially higher, forcing me to manually update the numbers after the data is inserted. In this case I would end up changing 782 to 38.
$SQL = "INSERT IGNORE INTO `read`(`title`,`url`) VALUES\n ".implode( "\n,",array_reverse( $sql_values ) );
How can I get it to increment +1?
The default auto_increment behavior in MySQL 5.1 and later will "lose" auto-increment values if the INSERT fails. That is, it increments by 1 each time, but doesn't undo an increment if the INSERT fails. It's uncommon to lose ~750 values but not impossible (I consulted for a site that was skipping 1500 for every INSERT that succeeded).
You can change innodb_autoinc_lock_mode=0 to use MySQL 5.0 behavior and avoid losing values in some cases. See http://dev.mysql.com/doc/refman/5.1/en/innodb-auto-increment-handling.html for more details.
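For reference, a sketch of how that setting would look in the MySQL configuration file (my.cnf / my.ini); it is read only at startup, so the server needs a restart afterwards:
[mysqld]
innodb_autoinc_lock_mode = 0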
Another thing to check is the value of the auto_increment_increment config variable. It's 1 by default, but you may have changed this. Again, very uncommon to set it to something higher than 1 or 2, but possible.
I agree with other commenters, autoinc columns are intended to be unique, but not necessarily consecutive. You probably shouldn't worry about it so much unless you're advancing the autoinc value so rapidly that you could run out of the range of an INT (this has happened to me).
How exactly did you fix it skipping 1500 for every insert?
The cause of the INSERT failing was that there was another column with a UNIQUE constraint on it, and the INSERT was trying to insert duplicate values in that column. Read the manual page I linked to for details on why this matters.
The fix was to do a SELECT first to check for existence of the value before attempting to INSERT it. This goes against common wisdom, which is to just try the INSERT and handle any duplicate key exception. But in this case, the side-effect of the failed INSERT caused an auto-inc value to be lost. Doing a SELECT first eliminated almost all such exceptions.
But you also have to handle a possible exception, even if you SELECT first. You still have a race condition.
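A rough PDO sketch of that SELECT-first pattern, including a guard for the race condition just mentioned; it assumes a UNIQUE index on url, PDO::ERRMODE_EXCEPTION, and placeholder variables:
$check = $pdo->prepare("SELECT 1 FROM `read` WHERE `url` = :url");
$check->execute([':url' => $url]);
if ($check->fetchColumn() === false) {
    try {
        $insert = $pdo->prepare("INSERT INTO `read` (`title`, `url`) VALUES (:title, :url)");
        $insert->execute([':title' => $title, ':url' => $url]);
    } catch (PDOException $e) {
        // 1062 = duplicate key: another request inserted the same url between our SELECT and INSERT.
        if ((int) $e->errorInfo[1] !== 1062) {
            throw $e;
        }
    }
}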
You're right! innodb_autoinc_lock_mode=0 worked like a charm.
In your case, I would want to know why so many inserts are failing. I suspect that like many SQL developers, you aren't checking for success status after you do your INSERTs in your AJAX handler, so you never know that so many of them are failing.
They're probably still failing; you just aren't losing auto-inc IDs as a side effect. You should really diagnose why so many failures occur. You could be either generating incomplete data, or running many more transactions than necessary.
After you change 782 to 38 you can reset the auto-increment with ALTER TABLE mytable AUTO_INCREMENT = 39. That way you continue at 39.
However, you should check why your gap is so high and change your design accordingly. Changing the auto-increment should not be "default" behaviour.
I know the question has been answered already, but if you have deleted rows from the table before, MySQL remembers the IDs/numbers already used, because the auto-increment column is typically unique, so it will not create duplicate values. To re-seed the auto-increment from the current maximum ID/integer you could do it in two steps:
SELECT MAX(`order`) + 1 FROM TableName;      -- MySQL will not accept a subquery in the next statement, so find the value first
ALTER TABLE TableName AUTO_INCREMENT = 39;   -- then plug in the value returned above (39 here as an example)
Auto-increment doesn't care if you delete some rows; every time you insert a row, the value is incremented.
If you want numbering without gaps, don't use auto-increment and do it yourself. You could use something like this to achieve it when inserting:
INSERT INTO tablename SET
`order` = (SELECT max(`order`) + 1 FROM (SELECT * from tablename) t),
...
and if you delete a row, you have to rearrange the order column manually.

Optimized ways to update every record in a table after running some calculations on each row

There is a large table that holds millions of records; phpMyAdmin reports a size of 1.2 GB for it.
There is a calculation that needs to be done for every row. The calculation is not simple (it cannot be expressed as a plain SET col = <expression>); it uses a stored function to get the values, so currently we run a single UPDATE per row.
This is extremely slow and we want to optimize it.
Stored function:
https://gist.github.com/a9c2f9275644409dd19d
And this is called by this method for every row:
https://gist.github.com/82adfd97b9e5797feea6
This is performed on an offline server, and usually it is updated once per week.
What options do we have here?
Why not set up a separate table to hold the computed values and take the load off your current table? It can have two columns: the primary key of each row in your main table and a column for the computed value.
Then your process can be:
a) Truncate computedValues table - This is faster than trying to identify new rows
b) Compute the values and insert into the computed values table
c) Whenever you need your computed values, you join to the computedValues table using a primary-key join, which is fast; and if you need more computations, you just add new columns.
d) You can also update the main table using the computed values if you have to
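A rough sketch of steps a) to c); the computedValues table layout, the id primary key on usage_bill, and the stored-function name are only placeholders here:
CREATE TABLE computedValues (
    usage_bill_id INT UNSIGNED NOT NULL PRIMARY KEY,
    gpcd DECIMAL(12,4)
);
TRUNCATE TABLE computedValues;                      -- step a)
INSERT INTO computedValues (usage_bill_id, gpcd)    -- step b)
SELECT id, your_stored_function(id) FROM usage_bill;
SELECT b.*, c.gpcd                                  -- step c)
FROM usage_bill b
JOIN computedValues c ON c.usage_bill_id = b.id;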
Well, the problem doesn't seem to be the UPDATE query, because no calculations are performed in the query itself. It seems the calculations are performed first and then the UPDATE query is run, so the UPDATE itself should be quick enough.
When you say "this is extremely slow", I assume you are not referring to the UPDATE query but the complete process. Here are some quick thoughts:
As you said, there are millions of records, and updating that many entries is always time consuming. And if there are many columns and indexes defined on the table, that adds to the overhead.
I see that there are many REPLACE INTO queries in the function getNumberOfPeople(). These might well be a reason for the slow process. Have you checked how efficient these REPLACE INTO queries are? Can you try removing them and then see whether that has any impact on the UPDATE process?
There are a couple of SELECT queries too in getNumberOfPeople(). Check if they might be impacting the process and if so, try optimizing them.
In procedure updateGPCD(), you may try replacing SELECT COUNT(*) INTO _has_breakdown with SELECT COUNT(1) INTO _has_breakdown. In the same query, the WHERE condition is reading _ACCOUNT but this will fail when _ACCOUNT = 0, no?
As another suggestion, if it is the UPDATE that you think is slow because of point 1, it might make sense to move the column being updated, gpcd, out of usage_bill and into another table. The only other column in that table should be the unique ID from usage_bill.
Hope the above makes sense.

Is it possible to UPDATE then INSERT in mysql?

Is it possible to UPDATE and then INSERT where a row exists in MySQL? I have this query,
$q = $dbc -> prepare("UPDATE accounts SET lifeforce = maxLifeforce, inHospital = 0 WHERE hospitalTime <= NOW() AND inHospital = 1");
$q -> execute();
How can I either get the primary key into an associative array to then do an insert for each item in the array, or do an UPDATE AND INSERT?
Or does it involve doing a SELECT to get everything that matches the criteria, then the UPDATE, then the INSERT using the array from the SELECT? That seems rather a long-winded way to do it.
Basically I need to INSERT onto another table using the same primary keys that get updated.
Or does it involve doing a SELECT to get everything that matches the criteria, then the UPDATE, then the INSERT using the array from the SELECT?
Yes, sorry, that's the main way.
Another approach is to add a column called (say) last_updated that you set whenever you update the row. You can then use that column in a query that drives your insert. That would have other advantages (I find last_updated columns useful for many things), but it's overkill if this is the only thing you'd ever use it for.
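A rough sketch of that idea against the query above; account_log, its columns, and the id primary key are assumed names:
ALTER TABLE accounts ADD COLUMN last_updated DATETIME NULL;
SET @batch_start = NOW();
UPDATE accounts SET lifeforce = maxLifeforce, inHospital = 0, last_updated = NOW()
    WHERE hospitalTime <= NOW() AND inHospital = 1;
INSERT INTO account_log (account_id)
    SELECT id FROM accounts WHERE last_updated >= @batch_start;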
Edited to add: Another option, which just occurred to me, is to add a trigger to your accounts table that performs the insert you need. That's qualitatively different: it makes the insertion a property of accounts rather than a matter of application logic, but maybe that's what you want? Even the most extreme partisans of the "put-all-constraints-in-the-database-so-application-logic-never-introduces-inconsistency" camp are usually cautious about triggers; they're really not a good way to implement application logic, because they hide that logic somewhere no one will think to look for it. But if the table you're inserting into is some sort of account_history table that keeps track of all changes to accounts, then it might be the way to go.
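If you did go the trigger route, a sketch might look like this (account_history, its columns, and the id primary key are assumed names):
DELIMITER //
CREATE TRIGGER accounts_after_update
AFTER UPDATE ON accounts
FOR EACH ROW
BEGIN
    -- Log the change whenever an account leaves hospital.
    IF OLD.inHospital = 1 AND NEW.inHospital = 0 THEN
        INSERT INTO account_history (account_id, lifeforce, changed_at)
        VALUES (NEW.id, NEW.lifeforce, NOW());
    END IF;
END//
DELIMITER ;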
You can use a multiple table update as written in the manual: http://dev.mysql.com/doc/refman/5.0/en/update.html
If the second table needs an insert, you probably would have to do it manually.
You can use the mysqli_insert_id() function:
http://php.net/manual/en/mysqli.insert-id.php
Also, when running consecutive queries like that, I'd recommend using transactions:
http://www.techrepublic.com/article/implement-mysql-based-transactions-with-a-new-set-of-php-extensions/6085922

Check for existing entries in Database or recreate table?

I've got a PHP script pulling a file from a server and plugging the values in it into a Database every 4 hours.
This file can, and most likely will, change within those 4 hours (or whatever timeframe I finally choose). It's a list of properties and their owners.
Would it be better to go through the file and compare it to each DB entry, updating any that need it, or to create a temp table and then compare the two using an SQL query?
Neither.
What I'd personally do is run the INSERT using ON DUPLICATE KEY UPDATE (assuming your table is properly designed and that you are using at least one piece of information from your file as a UNIQUE key, which, based on your comment, you should be).
Reasons
Creating a temp table is a hassle.
Comparing is a hassle too. You need to select a record, compare it, update it if they're not equal, and so on; it's a giant waste of time when there's a better way to do it.
It's much easier to just insert everything you find; if a clash occurs, that means the record already exists and most likely needs updating.
That way you take care of everything with one query, your data integrity is preserved, and you can just keep filling the table or updating it with new records.
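For illustration, that single query could look something like this; the properties table, the property_ref UNIQUE key, and the owner column are placeholder names standing in for whatever is in your file:
INSERT INTO properties (property_ref, owner)
VALUES ('A-1001', 'Jane Doe')
ON DUPLICATE KEY UPDATE owner = VALUES(owner);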
I think it would be best to download the file and update the existing table, maybe using REPLACE or REPLACE INTO. "REPLACE works exactly like INSERT, except that if an old row in the table has the same value as a new row for a PRIMARY KEY or a UNIQUE index, the old row is deleted before the new row is inserted." http://dev.mysql.com/doc/refman/5.0/en/replace.html
Presumably you have a list of columns that will have to match in order for you to decide that the two things match.
If you create a UNIQUE index over those columns, then you can use either INSERT ... ON DUPLICATE KEY UPDATE (manual) or REPLACE INTO ... (manual).
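For instance (again with properties, property_ref, and owner as placeholder names):
ALTER TABLE properties ADD UNIQUE KEY uq_property_ref (property_ref);
REPLACE INTO properties (property_ref, owner) VALUES ('A-1001', 'Jane Doe');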
