How to make my PHP code faster? - php

I'm using PHP on the server side to manage data with MySQL.
I have to call an API that gives me a list of users. For each user I need to check whether he is already in the database.
If yes, I update his information.
If not, I insert him into the database.
The issue is that there are more than 2,000 users each time, and my PHP code is really slow (sometimes I get a 504 Gateway Time-out).
We will have even more users very soon.
How can I make my code faster? Is PHP OK for this?
EDIT: my code (v3) after improvement:
$userList = getFromAPI();

foreach ($userList as $user) {
    $db = dbConnect();
    $tagList = implode(",", $user["tagid_list"]);
    $query = $db->prepare(
        "INSERT INTO USERS(id, name, `group`) VALUES(:id, :name, :group)
         ON DUPLICATE KEY UPDATE name = VALUES(name), `group` = VALUES(`group`)"
    );
    $query->execute([
        "id"    => $user["id"],
        "name"  => $user["name"],
        "group" => $user["group"],
    ]);
}

Maybe try putting $db = dbConnect(); outside of your foreach?
There is no need to open the connection on each iteration, and doing so may be time-consuming as well.
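For instance, a minimal sketch with the connection and the prepared statement both hoisted out of the loop (assuming dbConnect() returns the PDO instance used in the question):

$userList = getFromAPI();

$db = dbConnect();                      // connect once
$query = $db->prepare(                  // prepare once, execute many times
    "INSERT INTO USERS(id, name, `group`) VALUES(:id, :name, :group)
     ON DUPLICATE KEY UPDATE name = VALUES(name), `group` = VALUES(`group`)"
);

foreach ($userList as $user) {
    $query->execute([
        "id"    => $user["id"],
        "name"  => $user["name"],
        "group" => $user["group"],
    ]);
}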

You can use a single query for that:
INSERT INTO users (id, name)
VALUES (1, 'Alice'), (2, 'Bob'), (3, 'Cecil')
ON DUPLICATE KEY UPDATE name = VALUES(name);
In a nutshell: you insert new rows, but if one already exists (the key is duplicated), it is updated instead. You can build your insert values in a loop so you end up with a single query instead of 4000+.
Read more in the MySQL manual on INSERT ... ON DUPLICATE KEY UPDATE.
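For instance, a minimal sketch of building that single statement with PDO placeholders (assuming each API user carries id, name and group, as in the question):

// Sketch: one multi-row INSERT ... ON DUPLICATE KEY UPDATE for the whole list.
$placeholders = [];
$params = [];
foreach ($userList as $user) {
    $placeholders[] = "(?, ?, ?)";
    $params[] = $user["id"];
    $params[] = $user["name"];
    $params[] = $user["group"];
}

$sql = "INSERT INTO USERS (id, name, `group`) VALUES "
     . implode(", ", $placeholders)
     . " ON DUPLICATE KEY UPDATE name = VALUES(name), `group` = VALUES(`group`)";

$db = dbConnect();
$db->prepare($sql)->execute($params);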

First of all, move the fetching of all user ids from the database out of the foreach loop and buffer the result in a variable. That alone should help.
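A minimal sketch of that idea, assuming the ids live in the USERS table from the question:

// Sketch: fetch all existing ids once and keep them in a lookup map.
$db = dbConnect();
$existingIds = $db->query("SELECT id FROM USERS")->fetchAll(PDO::FETCH_COLUMN);
$existingIds = array_flip($existingIds); // O(1) lookups with isset()

foreach ($userList as $user) {
    if (isset($existingIds[$user["id"]])) {
        // UPDATE ...
    } else {
        // INSERT ...
    }
}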

Related

INSERT IGNORE INTO and UPDATE in one statement

I'm running a script which serves downloads to my users. I would like to monitor their traffic at a per-byte level, and I hold how many bytes they've downloaded in $bytes. I want to log it to my database, and I'm using the following function:
register_shutdown_function(function() {
    global $bytes;
    /* Save the traffic to the database */
    $db = new PDO('mysql:host=localhost;dbname=test', 'root', '');
    $st = $db->prepare('INSERT IGNORE INTO `stats`
        SET `date` = CURDATE(), `bytes` = :bytes');
    $st->bindParam(':bytes', $bytes);
    $st->execute();
    $st = null;
    $db = null;
});
This query seems to work once, when the download is complete there's a new record in the table with the following data:
date | bytes
------------------------
2013-02-03 | 2799469
However, on every other download the bytes field doesn't change: no new records, and no change to the record that's already in the table. It's pretty obvious what the problem is: the query tries to insert a record, but if it already exists it aborts. I need an update statement like this:
UPDATE `stats`
SET `bytes` = `bytes` + :bytes
WHERE `date` = CURDATE()
but I would like to do the entire operation in one query. A query that will create the record if it doesn't exist and if it does exist, update the existing record.
Can this be done or am I going to have to run two queries on every download?
You might want to look into ON DUPLICATE KEY UPDATE. You can read about it in the MySQL manual.
Your query would look something like this.
$st = $db->prepare("INSERT INTO `stats` (`date`, `bytes`)
    VALUES (CURDATE(), :bytes)
    ON DUPLICATE KEY UPDATE `bytes` = `bytes` + :bytes");
// Note: CURDATE() must not be quoted, or it is inserted as a literal string.
// Reusing :bytes twice relies on PDO's emulated prepares (the default);
// otherwise give the two occurrences distinct names.
You should also avoid INSERT IGNORE INTO, because on duplicate rows it generates no error, only a warning, so failures pass silently.
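For completeness, a hedged sketch of the whole shutdown handler with that query, assuming `date` is the table's primary (or otherwise unique) key so the duplicate-key path can fire; the two placeholders get distinct names so the statement also works without emulated prepares:

register_shutdown_function(function () {
    global $bytes;
    $db = new PDO('mysql:host=localhost;dbname=test', 'root', '');
    $st = $db->prepare('INSERT INTO `stats` (`date`, `bytes`)
        VALUES (CURDATE(), :ins)
        ON DUPLICATE KEY UPDATE `bytes` = `bytes` + :inc');
    $st->execute([':ins' => $bytes, ':inc' => $bytes]);
});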

PDO two similar queries - only the first inserts

I have a form where the user can insert up to five line items for an invoice. The easiest way for me to do this is to just do five inserts, with an isset check before each query. However, the problem is that if I run the two queries one after another, only the first one inserts the data. I know I can combine them into one PDO query (and that does in fact work), but it does not suit my needs. The second query does not insert.
// Connect to the database
$conn = new PDO("mysql:host=$DB_HOST;dbname=$DB_DATABASE", $DB_USER, $DB_PASSWORD);

// Set all the data here
$receiptid = $_POST['receiptid'];
// .. the rest of the POST data gets set here.

// Insert first line item
$sql = "INSERT INTO lineitems (receiptid, service, description, quantity, unitprice, linetotal)
        VALUES (:receiptid, :service, :description, :quantity, :unitprice, :linetotal)";
$q = $conn->prepare($sql);
$q->execute(array(':receiptid'   => $receiptid,
                  ':service'     => $service,
                  ':description' => $description,
                  ':quantity'    => $quantity,
                  ':unitprice'   => $unitprice,
                  ':linetotal'   => $linetotal));

// Insert second line item
$sql = "INSERT INTO lineitems (receiptid, service2, description2, quantity2, unitprice2, linetotal2)
        VALUES (:receiptid, :service2, :description2, :quantity2, :unitprice2, :linetotal2)";
$q = $conn->prepare($sql);
$q->execute(array(':receiptid'    => $receiptid,
                  ':service2'     => $service2,
                  ':description2' => $description2,
                  ':quantity2'    => $quantity2,
                  ':unitprice2'   => $unitprice2,
                  ':linetotal2'   => $linetotal2));
Does your table really have different columns for each entered lineitem number (i.e. service2, descriptions2, etc.)?
Perhaps you need to change the field names in your second insert to match those in the first.
If you handled unexpected query results properly (i.e. checked the result of each execute() and looked at the errors when one fails), you would be able to get to the source of the problem in a hurry.
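A hedged sketch combining both suggestions: matching column names, PDO set to throw on errors, and one reusable statement ($lineItems is a hypothetical array built from the POST data):

$conn = new PDO("mysql:host=$DB_HOST;dbname=$DB_DATABASE", $DB_USER, $DB_PASSWORD);
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION); // failed queries now throw

$sql = "INSERT INTO lineitems (receiptid, service, description, quantity, unitprice, linetotal)
        VALUES (:receiptid, :service, :description, :quantity, :unitprice, :linetotal)";
$q = $conn->prepare($sql);

foreach ($lineItems as $item) {              // $lineItems: hypothetical array built from $_POST
    $q->execute(array(
        ':receiptid'   => $receiptid,
        ':service'     => $item['service'],
        ':description' => $item['description'],
        ':quantity'    => $item['quantity'],
        ':unitprice'   => $item['unitprice'],
        ':linetotal'   => $item['linetotal'],
    ));
}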

Ensuring Unique Rows Using PHP/MySQL

I have the following code that should, when run, update a table of "victims" of Her Royal Majesty Penelope the Queen of Sheep (it's work for someone, honest). However, every time the code is executed it adds all the rows all over again. I was pretty sure I had safeguarded against that, but I guess not. What am I doing wrong here?
require_once 'victims.php';

foreach ($victims as $vic)
{
    $vic = mysql_real_escape_string($vic);
    if (!(mysql_query("
        SELECT * FROM victims
        WHERE " . $vic
    )))
    {
        mysql_query("
            INSERT INTO victims
            (victim, amount)
            VALUES ('" . $vic . "', 0)
        ");
    }
}
You need to change the where clause of your first query to the following:
WHERE victim = '$vic'
Also, please consider using bind variables as this will protect your code from SQL injection attacks.
You could use an "INSERT ... ON DUPLICATE KEY" query instead, which will guarantee that existing rows won't be duplicated, but only updated. Assuming victim is the table's primary key (or has a unique index), you'd do:
INSERT INTO victims (victim, amount)
VALUES ('$vic', $amount)
ON DUPLICATE KEY UPDATE amount = VALUES(amount)
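Putting both answers together, a minimal sketch with bind variables (assuming victim has a unique index and a PDO connection $pdo instead of the legacy mysql_* API):

$st = $pdo->prepare("INSERT INTO victims (victim, amount)
    VALUES (:victim, :amount)
    ON DUPLICATE KEY UPDATE amount = VALUES(amount)");

foreach ($victims as $vic) {
    $st->execute([':victim' => $vic, ':amount' => 0]); // 0 as in the original insert
}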

Batch insertion of data to MySQL database using php

I have thousands of records parsed from a huge XML file to be inserted into a database table using PHP and MySQL. My problem is that it takes too long to insert all the data into the table. Is there a way to split my data into smaller groups so that the insertion is done group by group? How can I set up a script that will process the data 100 rows at a time, for example? Here's my code:
foreach ($itemList as $key => $item) {
    $download_records = new DownloadRecords();
    // check first if the content exists
    if (!$download_records->selectRecordsFromCondition("WHERE Guid=" . $guid . "")) {
        /* do an insert here */
    } else {
        /* do an update */
    }
}
*note: $itemList is around 62,000 items and still growing.
Using a for loop?
But the quickest option to load data into MySQL is to use the LOAD DATA INFILE command; you can create the file to load via PHP and then feed it to MySQL via a different process (or as a final step in the original process).
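For illustration, a hedged sketch of that approach: write the parsed rows to a CSV file with PHP, then bulk-load it in one statement. Table and column names are placeholders taken from the example below, and this assumes a PDO connection $db created with PDO::MYSQL_ATTR_LOCAL_INFILE enabled and LOCAL INFILE allowed on the server.

// Sketch: dump the parsed rows to a CSV file, then bulk-load it.
$csv = fopen('/tmp/items.csv', 'w');
foreach ($itemList as $item) {
    fputcsv($csv, [$item->col1, $item->col2]); // col1/col2 as in the example below
}
fclose($csv);

$db->exec("LOAD DATA LOCAL INFILE '/tmp/items.csv'
    INTO TABLE your_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    (col1, col2)");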
If you cannot use a file, use the following syntax:
insert into table(col1, col2) VALUES (val1,val2), (val3,val4), (val5, val6)
so you reduce the total number of statements to run.
EDIT: Given your snippet, it seems you can benefit from the INSERT ... ON DUPLICATE KEY UPDATE syntax of MySQL, letting the database do the work and reducing the amount of queries. This assumes your table has a primary key or unique index.
To hit the DB every 100 rows you can do something like this (PLEASE REVIEW IT AND ADAPT IT TO YOUR ENVIRONMENT):
$insertOrUpdateStatement1 = "INSERT INTO table (col1, col2) VALUES ";
$insertOrUpdateStatement2 = " ON DUPLICATE KEY UPDATE ";
$counter = 0;
$queries = array();

foreach ($itemList as $key => $item) {
    $val1 = escape($item->col1); // escape is a function that will make the input
                                 // safe from SQL injection; it depends on how
                                 // you are accessing the DB.
    $val2 = escape($item->col2);
    $queries[] = $insertOrUpdateStatement1 .
        "('$val1','$val2')" . $insertOrUpdateStatement2 .
        "col1 = '$val1', col2 = '$val2'";
    $counter++;
    if ($counter % 100 == 0) {
        executeQueries($queries);
        $queries = array();
        $counter = 0;
    }
}
// don't forget the last partial batch
if (!empty($queries)) {
    executeQueries($queries);
}
And executeQueries would grab the array and send a single multiple query:
function executeQueries($queries) {
    $data = "";
    foreach ($queries as $query) {
        $data .= $query . ";\n";
    }
    // executeQuery() must use an API that allows several statements per call
    // (e.g. mysqli::multi_query()).
    executeQuery($data);
}
Yes, just do what you'd expect to do.
You should not try to do a bulk insertion from a web application if you think you might hit a timeout, etc. Instead, drop the file somewhere and have a daemon or cron job pick it up and run a batch job (if running from cron, be sure that only one instance runs at once).
As said before, you should put the file in a temp directory and have a cron job process it, in order to avoid timeouts (or the user losing their network connection).
Use the web request only for the upload.
If you really want to import into the DB during a web request, you can either do a bulk insert or at least use a transaction, which should be faster.
Then limit the inserts to batches of 100 (committing your transaction whenever a counter hits count % 100 == 0) and repeat until all your rows are inserted.
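A minimal sketch of that transaction batching with PDO (table and column names are placeholders):

$db->beginTransaction();
$stmt = $db->prepare("INSERT INTO items (col1, col2) VALUES (:col1, :col2)");

$count = 0;
foreach ($itemList as $item) {
    $stmt->execute([':col1' => $item->col1, ':col2' => $item->col2]);
    if (++$count % 100 == 0) {  // commit every 100 rows
        $db->commit();
        $db->beginTransaction();
    }
}
$db->commit();                  // commit the final partial batch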

Is it bad to put a MySQL query in a PHP loop?

I often have large arrays, or large amounts of dynamic data in PHP that I need to run MySQL queries to handle.
Is there a better way to run many processes like INSERT or UPDATE without looping through the information to be INSERT-ed or UPDATE-ed?
Example (I didn't use a prepared statement for brevity's sake):
$myArray = array('apple', 'orange', 'grape');

foreach ($myArray as $arrayFruit) {
    $query = "INSERT INTO `Fruits` (`FruitName`) VALUES ('" . $arrayFruit . "')";
    mysql_query($query, $connection);
}
OPTION 1
You can send multiple statements at once, but note that the legacy mysql_query() cannot run more than one statement per call; this needs mysqli::multi_query() (or splitting the string back into single queries).
$queries = '';
foreach ($myArray as $arrayFruit) {
    $queries .= "INSERT INTO `Fruits` (`FruitName`) VALUES ('" . $arrayFruit . "');"; // notice the semicolon
}
// $connection must be a mysqli connection for multi_query() to work
$connection->multi_query($queries);
This would save on your processing.
OPTION 2
If your insert is that simple for the same table, you can do multiple inserts in ONE query
$fruits = "('".implode("'), ('", $fruitsArray)."')";
mysql_query("INSERT INTO Fruits (Fruit) VALUES $fruits", $connection);
The query ends up looking something like this:
$query = "INSERT INTO Fruits (Fruit)
VALUES
('Apple'),
('Pear'),
('Banana')";
This is probably the way you want to go.
If you have the mysqli class, you can iterate over the values to insert using a prepared statement.
$sth = $dbh->prepare("INSERT INTO Fruits (Fruit) VALUES (?)");

foreach ($fruits as $fruit)
{
    $sth->reset();                 // make sure we are fresh from the previous iteration
    $sth->bind_param('s', $fruit); // bind one or more variables to the query
    $sth->execute();               // execute the query
}
One thing to note about your original solution, compared to the implosion method of jerebear (which I have used before, and love): it is easier to read. The implosion takes more programmer brain cycles to understand, which can be more expensive than processor cycles. Premature optimisation, blah, blah, blah... :)
One thing to note about jerebear's answer with multiple VALUE-blocks in one INSERT:
It can be rather dangerous for really large amounts of data, because most DBMSs have an upper limit on the size of the commands they can handle. If you exceed that with too many VALUE blocks, your insert will fail. On MySQL, for example, the limit is governed by max_allowed_packet, which has historically defaulted to 1MB.
So you should figure out what the maximum size is (ideally at runtime, might be available from the database metadata), and make sure you don't exceed it by spreading your lists of values over several INSERTs.
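For example, a hedged sketch that splits the rows into chunks so each generated INSERT stays well below max_allowed_packet (the chunk size of 500 is an arbitrary assumption, and the mysql_* calls match the rest of this thread):

foreach (array_chunk($myArray, 500) as $chunk) {
    // escape each value, then build one multi-row VALUES list per chunk
    $values = "('" . implode("'), ('", array_map('mysql_real_escape_string', $chunk)) . "')";
    mysql_query("INSERT INTO Fruits (Fruit) VALUES $values", $connection);
}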
I was inspired by jerebear's answer to build something like his second option for one of my current projects. Because of the sheer volume of records I couldn't save all the data at once, so I built this to do imports. You add your data, then call a method when each record is done; after a certain, configurable number of records, the data in memory is saved with a mass insert like jerebear's second option.
// CREATE TABLE example ( Id INT, Field1 INT, Field2 INT, Field3 INT );
$import = new DataImport($dbh, 'example', 'Id, Field1, Field2, Field3');

foreach ($whatever as $row) {
    // add data in the order of your column definition
    $import->addValue($Id);
    $import->addValue($Field1);
    $import->addValue($Field2);
    $import->addValue($Field3);
    $import->nextRow();
}
$import->lastRow();
