SQL insert - one big query, or many single queries - PHP

I am using PHP with an ODBC connection to an MSSQL database.
Currently, I have around 1900 insert statements in one string, separated by semicolons, and I run that in one odbc_execute call.
Firstly, is this a bad method? Should I be processing every insert statement separately in a for loop?
Also, the way I am currently doing it, with one big statement, only a maximum of 483 rows are inserted, with no errors. If I copy the statement that is run and run it through SQL Server Management Studio, all rows insert; yet through PHP, every single time, only a maximum of 483 rows insert.
Any ideas why this could be?

One network round trip per INSERT will mean a lot of latency. It'll be very slow.
There's probably a limit on the buffer size of all those concatenated SQL statements.
I think you want to use a prepared statement and bound variables:
http://msdn.microsoft.com/en-us/library/windows/desktop/ms712553(v=vs.85).aspx
You can still process it in a loop - two, actually. You'll want an inner loop to add INSERTs to a batch that's executed as a single transaction, and an outer loop to process all the necessary batches.
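As a rough sketch of that two-loop pattern with PHP's ODBC functions (the table, columns, $conn handle, and $rows array below are made up for illustration):
$stmt = odbc_prepare($conn, "INSERT INTO mytable (col1, col2) VALUES (?, ?)");

odbc_autocommit($conn, false);          // manage transactions manually
$batchSize = 100;
$count = 0;
foreach ($rows as $row) {               // inner work: one execute per row
    if (!odbc_execute($stmt, array($row['col1'], $row['col2']))) {
        odbc_rollback($conn);
        die('Insert failed: ' . odbc_errormsg($conn));
    }
    if (++$count % $batchSize === 0) {  // outer rhythm: commit each batch
        odbc_commit($conn);
    }
}
odbc_commit($conn);                     // flush the final partial batch
odbc_autocommit($conn, true);
The statement is parsed once by odbc_prepare and reused for every row, and each commit closes out one batch.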

As long as you aren't running a separate transaction for each insert, there's nothing wrong with inserting one at a time.
For this sort of 'batch insert' I would typically run a transaction for every 100 rows or so.
I would avoid trying to cram them all in at once; there's really nothing to be gained by doing that.

You could also put the data in a file (XML?) on the SQL server and call a stored procedure from PHP that processes it.

Related

Does mysqli_multi_query improve performance?

I am updating MySQL data through a PHP script. I am looking to use mysqli_multi_query() instead of mysql_query(). I have N update queries to execute. Please suggest whether multi query will give better execution time.
A. Updating data using mysql_query(), firing single queries N times.
B. Concatenating all update queries with ";" and firing them once using multi query.
Please suggest whether technique "B" will help performance.
Thanks.
C. Use prepared statements and, if possible (InnoDB), run them within a transaction.
A multi query will save you some client-server round trips, but the queries are still executed one by one.
If you use prepared statements together with a transaction, the query is parsed once; after that, only the values are sent to the server. The transaction keeps the indexes from being rebuilt after each update.
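A minimal sketch of option C with mysqli (the connection credentials and the $updates array are stand-ins):
$mysqli = new mysqli('localhost', 'user', 'pass', 'db');  // hypothetical credentials

$mysqli->begin_transaction();               // one transaction around all N updates
$stmt = $mysqli->prepare('UPDATE customers SET last_name = ? WHERE id = ?');
foreach ($updates as $id => $lastName) {    // parsed once, executed N times
    $stmt->bind_param('si', $lastName, $id);
    $stmt->execute();
}
$stmt->close();
$mysqli->commit();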
Multiple insert statements can be rewritten as a single bulk insert statement:
INSERT INTO t1 (c1,c2,c3) VALUES (1,2,3), (3,4,5) --etc

pg_prepare: cannot insert multiple commands into a prepared statement

I have 2 tables, TableA and TableB. TableB has a fk field pointing to TableA.
I need to use a DELETE statement for TableA, and when a record in it is deleted I need to delete all records in TableB related to that record in TableA. Pretty basic.
begin;
DELETE FROM TableB
WHERE nu_fornecedor = $1;
DELETE FROM TableA
WHERE nu_fornecedor = $1;
commit;
This string is passed to pg_prepare(), but then I get the error:
ERROR: cannot insert multiple commands into a prepared statement
OK, but I need to run both commands in the same transaction; I can't execute 2 separate statements. I tried it without the begin-commit and got the same error.
Any idea how to do it?
To understand what is going on and your options, let me explain what a prepared statement is and what it is not. You can use pg_prepare, but only for the statements individually, not for the transaction as a whole.
A prepared statement is a statement handed to PostgreSQL which is then parsed and stored as a parse tree for future use. On first execution, the parse tree is planned with the inputs provided and executed, and the plan is cached for future use. Usually it makes little sense to use prepared statements unless you want to reuse the query plan (i.e. you are executing a bunch of otherwise identical update statements hitting roughly the same number of rows), all in the same session.
If you want something that gives you the benefit of separating parameters from the parse tree but does not cache plans, see pg_query_params() in the PHP documentation. That is probably what you want.
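As a sketch, the two deletes can then go out as separate parameterized queries inside one explicit transaction ($conn and $id are assumed to come from your surrounding code):
pg_query($conn, 'BEGIN');

// Each statement is sent on its own, but both sit in the same transaction.
$ok1 = pg_query_params($conn, 'DELETE FROM TableB WHERE nu_fornecedor = $1', array($id));
$ok2 = pg_query_params($conn, 'DELETE FROM TableA WHERE nu_fornecedor = $1', array($id));

if ($ok1 && $ok2) {
    pg_query($conn, 'COMMIT');
} else {
    pg_query($conn, 'ROLLBACK');
}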

Stepping through PHP changes MySQL result

Under plain PHP 5.3, I have some code which uses MySQL to first delete some old records, record a log entry, perform a few tiny operations, and then add new replacement records.
The delete command looks like this:
DELETE FROM `rtable` WHERE `UserName`='%s';
And the add commands looks like this:
INSERT INTO `table` (`UserName`,`Attribute`,`op`,`Value`) VALUES ('%s','%s','%s','%s');
Oddly, though, the insert commands appear not to execute when running normally; however, if I enable my debugger and step through one line at a time, it appears to work. Likewise, if I insert a sleep command of two seconds after the delete commands, it appears to work. I am therefore assuming that the insert commands are running before the delete commands, and thus the delete commands are also erasing the new records.
How can I get PHP to wait for the delete operation to finish before continuing on to the insert commands?
That sounds really odd.
Do you happen to have a replicated database cluster?
Also, do you check the return value of the mysql_query or whatever command and print the error message (which of course is not recommended for scripts in production)?
I am not totally certain how PHP deals with processes and how consecutive queries are run, but if you want to make certain to encapsulate the delete in a transaction, you can do so with PDO like this:
$dbh->beginTransaction();
// $userName etc. stand in for the values previously sprintf'd into the strings.
$sth = $dbh->prepare("DELETE FROM `rtable` WHERE `UserName` = ?");
$sth->execute(array($userName));
$dbh->commit();
// You could also put a transaction around the inserts
// in case another page tries to do the same
$sth = $dbh->prepare("INSERT INTO `table`
    (`UserName`,`Attribute`,`op`,`Value`)
    VALUES (?, ?, ?, ?)");
$sth->execute(array($userName, $attribute, $op, $value));
BTW: I took the liberty of correcting the single quotes to backticks in your queries.

Is there any way to send more than one postgresql sql prepared statement in one call in php?

Is there any way to execute multiple SQL prepared statements at once? Or can this result at least be emulated, for example with transactions?
pg_send_query can execute multiple statements (from the PHP docs: "The SQL statement or statements to be executed."),
but
pg_send_execute and pg_send_prepare can work only with one statement.
The query parameter has the following description
"The parameterized SQL statement. Must contain only a single statement. (multiple statements separated by semi-colons are not allowed.) If any parameters are used, they are referred to as $1, $2, etc."
from http://www.php.net/manual/en/function.pg-send-prepare.php
Is there any way to send more statements at once, to make fewer round trips between PHP and PostgreSQL, the way pg_send_query does?
I don't want to use pg_send_query because, without parameter binding, I can end up with SQL injection vulnerabilities in my code.
The round trips to the DB server shouldn't be your bottleneck as long as you are (a) using persistent connections (either directly or via a pool) and (b) aren't suffering from the "n+1 selects" problem.
New connections carry an order of magnitude of overhead, which slows things down if it is incurred on every query. The n+1 problem generates far more trips than are really needed; it goes away if the application retrieves (or acts upon) sets of related rows rather than doing all operations one at a time.
See: What is the n+1 selects problem?
Separate your queries by semicolon:
UPDATE customers SET last_name = 'foo' WHERE id = 1;UPDATE customers SET last_name = 'bar' WHERE id = 2;
Edit:
Okay, you cannot do this on the calling side:
The parameterized SQL statement. Must contain only a single statement. (multiple statements separated by semi-colons are not allowed.)
Another way would be to call a stored procedure with this method and have that stored procedure issue the multiple statements.
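For example, a minimal sketch of that route, with both statements wrapped in a single PL/pgSQL function (the function name delete_fornecedor and the $conn handle are invented for the example):
// One-time setup: a function that runs both statements server-side.
pg_query($conn, <<<'SQL'
CREATE OR REPLACE FUNCTION delete_fornecedor(p_id integer) RETURNS void AS $$
BEGIN
    DELETE FROM TableB WHERE nu_fornecedor = p_id;
    DELETE FROM TableA WHERE nu_fornecedor = p_id;
END;
$$ LANGUAGE plpgsql;
SQL
);

// At run time: one round trip, one parameterized statement.
pg_query_params($conn, 'SELECT delete_fornecedor($1)', array($id));
Since the function call executes as a single statement, both deletes commit or roll back together.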

mysql continue on errors

I have a PHP foreach loop with a MySQL insert statement inside of it. The loop inserts data into my database. I've never run into this issue before, but what I think is happening is that the insert dies (I do not have an "or die" statement after the insert) when it reaches a duplicate record. Even though there may be duplicate records in the table, I need this to just continue. Is there something that I need to specify to do this?
I'm transferring some records from one table to another. Right now, I have 20 records in table #1 and only 17 in table #2. I'm missing 3 records, but only one of those is a duplicate that violates the constraint on the table. The other two records should have been added. Can someone give me some advice here?
What's happening is that PHP is throwing a warning when the mysql insert fails and stopping on that warning. The best way to accomplish your goal is:
Create a custom exception handler
Set PHP to use the exception handler for warnings.
Wrap the insert attempt in a try / catch
When you catch the exception / warning, either log or output the mysql error but continue script execution.
This will allow your script to continue without stopping while at the same time explaining to you the problem.
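A minimal sketch of that approach (using mysqli rather than the old mysql_* functions; $link, $rows, and the table and column names are stand-ins for your connection and data):
// Turn PHP warnings into catchable exceptions.
set_error_handler(function ($severity, $message, $file, $line) {
    throw new ErrorException($message, 0, $severity, $file, $line);
});

foreach ($rows as $row) {
    try {
        $sql = sprintf(
            "INSERT INTO table2 (`UserName`, `Value`) VALUES ('%s', '%s')",
            mysqli_real_escape_string($link, $row['UserName']),
            mysqli_real_escape_string($link, $row['Value'])
        );
        // mysqli_query() returns false on a duplicate-key error.
        if (!mysqli_query($link, $sql)) {
            throw new ErrorException(mysqli_error($link));
        }
    } catch (ErrorException $e) {
        error_log($e->getMessage());   // log the failure, keep looping
        continue;
    }
}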
One way around this would be to simply query the database for the record that you're about to insert. This way, your series of queries will not die when attempting to insert a duplicate record.
A slightly more efficient solution would be to query for all of the records you're about to insert in one query, remove the duplicates, then insert the new ones.
Do you insert multiple rows with one INSERT statement?
INSERT INTO xyz (x,y,z) VALUES
(1,2,3),
(2,3,5),
(3,4,5),
(4,5,6)
Then you might want to consider prepared statements
...or adding the IGNORE keyword to your INSERT statement
INSERT IGNORE INTO xyz (x,y,z) VALUES
(1,2,3),
(2,3,5),
(3,4,5),
(4,5,6)
http://dev.mysql.com/doc/refman/5.0/en/insert.html says:
If you use the IGNORE keyword, errors that occur while executing the INSERT statement are treated as warnings instead
You can still fetch the warnings but the insertion will not be aborted.
Not a good way, because you should figure out what's wrong, but to just keep it from dying, try putting the error suppression operator @ in front of the function call:
$result = @mysql_query(...);
Alternatively, INSERT ... ON DUPLICATE KEY UPDATE turns a duplicate-key error into an update:
INSERT INTO FOO
(ID, BAR)
VALUES(1,2),(3,4)
ON DUPLICATE KEY UPDATE BAR=VALUES(BAR)
