Condition-driven INSERT and UPDATE in a single query - PHP

I have a MySQL table with the following fields:
ID
PHONE
NAME
CITY
COUNTRY
Using PHP, I am reading a comma-separated dump of values from a text document, parsing the values, and inserting records into the table. For reference, here's the code:
<?php
// Includes
require_once 'PROJdbconn.php';
// Read comma-separated text file
$arrindx = 0;
$i = 0;
$filehandle = fopen(PROJCDUMPPATH.PROJCDUMPNAME,"rb");
while (!feof($filehandle)){
$parts = explode(',', fgets($filehandle));
$contnames[$arrindx] = $parts[0];
$contnumbers[$arrindx] = preg_replace('/[^0-9]/', '', $parts[1]);
$arrindx += 1;
}
fclose($filehandle);
$arrindx -= 1;
$filehandle = NULL;
$parts = NULL;
// Build SQL query
$sql = "INSERT INTO Contact_table (PHONE, NAME) VALUES ";
for ($i = 0; $i < $arrindx; ++$i){
$sql .= "('".$contnumbers[$i]."', '".$contnames[$i]."'),";
}
$i = NULL;
$arrindx = NULL;
$contnames = NULL;
$contnumbers = NULL;
$sql = substr($sql,0,strlen($sql)-1).";";
// Connect to MySQL database
$connect = dbconn(PROJHOST,PROJDB,PROJDBUSER,PROJDBPWD);
// Execute SQL query
$query = $connect->query($sql);
$sql = NULL;
$query = NULL;
// Close connection to MySQL database
$connect = NULL;
?>
Now, this code, as you can see, blindly dumps all records into the table. However, I need to modify the code logic as such:
Read text file and parse records into arrays (already doing)
For each record in text file
Check if PHONE exists in the table
If yes,
For each field in the text file record
If text file field != NULL
Update corresponding field in table
Else
Skip
If no,
INSERT record (already doing)
I apologize if the logic isn't terribly clear; feel free to ask me if any aspect confuses you. So, I understand this logic would involve an insane number of SELECT, UPDATE, and INSERT queries, depending on the number of fields (I intend to add more fields in future) and records. Is there any way to either somehow merge them into a single query or at least optimize the code by minimizing the number of queries?

What you're trying to do is called an "upsert" (update/insert).
MySQL INSERT else if exists UPDATE
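Assuming Contact_table has a UNIQUE index on PHONE, the whole per-record logic above can collapse into one multi-row statement using INSERT ... ON DUPLICATE KEY UPDATE, where IFNULL(VALUES(col), col) keeps the existing value whenever the incoming field is NULL. A rough sketch (table and column names taken from the question; the sample rows are invented, and real code should use prepared statements rather than string building):

```php
// Hypothetical sample rows parsed from the dump; a NULL field means "don't overwrite".
$rows = array(
    array('phone' => '15551234567', 'name' => 'Alice'),
    array('phone' => '15559876543', 'name' => null),
);

$tuples = array();
foreach ($rows as $r) {
    // NULL becomes the SQL literal NULL so IFNULL() can fall back to the stored value
    $name = ($r['name'] === null) ? 'NULL' : "'" . addslashes($r['name']) . "'";
    $tuples[] = "('" . $r['phone'] . "', " . $name . ")";
}

$sql = "INSERT INTO Contact_table (PHONE, NAME) VALUES "
     . implode(', ', $tuples)
     . " ON DUPLICATE KEY UPDATE NAME = IFNULL(VALUES(NAME), NAME)";
```

With this shape, a new PHONE inserts a fresh row, an existing PHONE updates only the columns whose incoming value is non-NULL, and the whole batch still travels as one query.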

Related

Multiple row inserts as fast as possible

I've seen multiple threads discussing this, but the answers always reach totally different conclusions. In particular, I wonder whether it is really necessary to build my own prepared statement (with the right number of placeholders) in order to insert everything as a single query. I expected that when I use beginTransaction and endTransaction before and after my for loop, PDO/PHP would hold off on sending the data until it is all collected, and submit it as a single query once the server hits the endTransaction line.
How would I need to rewrite such a for loop insert with multiple inserts in order to reach the best performance (it has between 1 and 300 rows usually but it also could reach 2000 rows).
for($i=0; $i<$baseCount; $i++)
{
$thLevel = $bases[$i]["ThLevel"];
$gold = $bases[$i]["Gold"];
$elixir = $bases[$i]["Elixir"];
$darkElixir = $bases[$i]["DarkElixir"];
$dateFound = $elixir = $bases[$i]["TimeFound"];
$query = $db->prepare("INSERT INTO bot_attacks_searchresults (attack_id, available_gold, available_elixir, available_dark_elixir, date_found, opponent_townhall_level)
VALUES (:attack_id, :available_gold, :available_elixir, :available_dark_elixir, :date_found, :opponent_townhall_level)");
$query->bindValue(':attack_id', $attackId);
$query->bindValue(':available_gold', $gold);
$query->bindValue(':available_elixir', $elixir);
$query->bindValue(':available_dark_elixir', $darkElixir);
$query->bindValue(':date_found', $dateFound);
$query->bindValue(':opponent_townhall_level', $thLevel);
$query->execute();
}
Prepare the statement once. MySQL parses it once, so any subsequent call is quick since the statement is already parsed and just needs parameters.
Start the transaction before the loop. This lets the disk commit all the rows in one I/O operation; by default, each insert query is its own I/O.
Create the loop, bind your parameters there and call the $query->execute();
Exit the loop and commit() the transaction.
Full code:
$db->beginTransaction();
$query = $db->prepare("INSERT INTO bot_attacks_searchresults (attack_id, available_gold, available_elixir, available_dark_elixir, date_found, opponent_townhall_level)
VALUES (:attack_id, :available_gold, :available_elixir, :available_dark_elixir, :date_found, :opponent_townhall_level)");
for($i = 0; $i < $baseCount; $i++)
{
$thLevel = $bases[$i]["ThLevel"];
$gold = $bases[$i]["Gold"];
$elixir = $bases[$i]["Elixir"];
$darkElixir = $bases[$i]["DarkElixir"];
$dateFound = $bases[$i]["TimeFound"];
$query->bindValue(':attack_id', $attackId);
$query->bindValue(':available_gold', $gold);
$query->bindValue(':available_elixir', $elixir);
$query->bindValue(':available_dark_elixir', $darkElixir);
$query->bindValue(':date_found', $dateFound);
$query->bindValue(':opponent_townhall_level', $thLevel);
$query->execute();
}
$db->commit();
Here's a very crude proof of concept:
<?php
$values = array();
for($i=0;$i<10;$i++)
{
$values[] = "($i)";
}
$values = implode(',', $values);
$query = "INSERT INTO my_table VALUES $values";
echo $query;
?>
outputs INSERT INTO my_table VALUES (0),(1),(2),(3),(4),(5),(6),(7),(8),(9)
You would need to restructure this slightly to work with prepare (PHP is not my forte), but the principle is the same; i.e. you build the query inside the loop, but execute it only once.
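One way to marry that idea with prepared statements is to generate one placeholder group per row and flatten the values into a single flat array. A sketch, assuming $db is an open PDO connection and reusing the table/column names from the question (trimmed to three columns for brevity):

```php
// Hypothetical rows: (attack_id, available_gold, available_elixir)
$rows = array(
    array(1, 1000, 500),
    array(1, 2000, 750),
);

// One "(?,?,?)" group per row
$placeholders = implode(',', array_fill(0, count($rows), '(?,?,?)'));
$sql = "INSERT INTO bot_attacks_searchresults (attack_id, available_gold, available_elixir) VALUES $placeholders";

// Flatten the rows into one flat parameter list, in column order
$params = array();
foreach ($rows as $row) {
    foreach ($row as $value) {
        $params[] = $value;
    }
}

// $stmt = $db->prepare($sql);   // one prepare ...
// $stmt->execute($params);      // ... one execute for all rows
```

The query is built inside the loop but prepared and executed exactly once, so you get both the single round trip and the parameter binding.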

Transform MySQL table and rows

I have one problem here, and I don't even have a clue what to Google or how to solve this.
I am making PHP application to export and import data from one MySQL table into another. And I have problem with these tables.
In source table it looks like this:
And my destination table has ID, and pr0, pr1, pr2 as rows. So it looks like this:
Now the problem is the following: if I just copy (insert every value of the first table as a new row in the second), it will have around 20,000 rows instead of, say, 1,000.
Even if I copy every record as a new row in the second database, is there any way I can fuse rows? Basically I need to check whether a value exists in the last row with that ID_: if that row already has a value in the column (pr2, for example), insert a new row with it; but if the last row with the same ID_ has no value in the pr2 column, just update that row with the value for pr2.
I need idea how to do it in PHP or MySQL.
So you've got a few problems:
1) Copying the table from SQL into PHP: pay attention to memory usage. Check it with PHP's memory_get_usage(); importing SQL data can be expensive. Look this up. Another thing is that PHP doesn't release memory when you assign new values to an array; that will be useful later on.
2) I didn't understand whether the values are unique at the source or should be unique in the destination table, so I will assume that everything in the source needs to end up in the destination as-is.
I will also assume that pr = pr0 and quant = pr1.
3) You have mismatched names; that can also be an issue and should be taken care of.
4) I will use mysqli as the SQL connector, and assume $mysqli is connected.
SCRIPT:
<?php
// NOTE: the original mixed mysql_* and mysqli calls; this uses mysqli throughout,
// and values are assumed numeric (string columns would need quoting/escaping).
$select_sql = "SELECT * FROM Table_source";
$result = $mysqli->query($select_sql);
$data_source = array();
while ($array_data = $result->fetch_row()) {
    $data_source[] = $array_data;
}
$insert_data = array();
$bulk = 2000;
$start_query = "REPLACE INTO DEST_TABLE (`ID_`,`pr0`,`pr1`,`pr2`)";
foreach ($data_source as $data) {
    $insert_data[] = "(" . implode(',', $data) . ",0)"; // pad the missing last column with 0
    if (count($insert_data) >= $bulk) {
        $insert_query = $start_query . ' VALUES ' . implode(',', $insert_data);
        $mysqli->query($insert_query);
        $insert_data = array();
    }
}
if (count($insert_data) > 0) { // flush any remaining rows
    $insert_query = $start_query . ' VALUES ' . implode(',', $insert_data);
    $mysqli->query($insert_query);
    $insert_data = array();
}
?>
It's off the top of my head, but check this idea and tell me if it works. The bugs might be in the small things I forgot in the query structure; print the query and paste it into phpMyAdmin or your DB console to see that it's all good. This concept will save a lot of problems.

Unable to pass large array to MySQL database with PHP

I currently have a relatively large HTML form (100+ fields). I want to take the data from that form and upload it to a MySQL database when the user hits submit. I have created the PHP code below and have been slowly adding fields and testing whether the connection is successful. Everything was working through $skilled_nursing, but when I added the next set of values I am no longer successfully creating database entries. All of my echo commands are displayed and I am not getting failures in my error log, but the data is not being received in the database.
Can anyone see what is going wrong? I have checked multiple times for spelling errors, but I haven't seen any. I am wondering if I am somehow timing out with the connection or if I am trying to stick too many values into the execute command.
<?php
echo 'started ok';
// configuration
$dbtype = "mysql";
$dbhost = "localhost";
$dbname = "dbname";
$dbuser = "dbuser";
$dbpass = "userpass";
echo 'variables assigned ok';
// database connection
$conn = new PDO("mysql:host=$dbhost;dbname=$dbname",$dbuser,$dbpass);
echo 'connection established';
// new data
$facility_name = $_POST['facility_name'];
$facility_street = $_POST['facility_street'];
$facility_county = $_POST['facility_county'];
$facility_city = $_POST['facility_city'];
$facility_state = $_POST['facility_state'];
$facility_zipcode = $_POST['facility_zipcode'];
$facility_phone = $_POST['facility_phone'];
$facility_fax = $_POST['facility_fax'];
$facility_licensetype = $_POST['facility_licensetype'];
$facility_licensenumber = $_POST['facility_licensenumber'];
$facility_email = $_POST['facility_email'];
$facility_administrator = $_POST['facility_administrator'];
$skilled_nursing = $_POST['skilled_nursing'];
$independent_living = $_POST['independent_living'];
$assisted_living = $_POST['assisted_living'];
$memory_care = $_POST['memory_care'];
$facility_type_other = $_POST['facility_type_other'];
$care_ratio = $_POST['care_ratio'];
$nurse_ratio = $_POST['nurse_ratio'];
// query
$sql = "INSERT INTO Facilities (facility_name, facility_street, facility_county, facility_city, facility_state, facility_zipcode, facility_phone, facility_fax, facility_licensetype, facility_licensenumber, facility_email, facility_administrator, skilled_nursing, independent_living, assisted_living, memory_care, facility_type_other, care_ratio, nurse_ratio) VALUES (:facility_name, :facility_street, :facility_county, :facility_city, :facility_state, :facility_zipcode, :facility_phone, :facility_fax, :facility_licensetype, :facility_licensenumber, :facility_email, :facility_administrator, :skilled_nursing, :independent_living, :assisted_living, :memory_care, :facility_type_other, :care_ratio, :nurse_ratio)";
$q = $conn->prepare($sql);
$q->execute(array(':facility_state'=>$facility_name,
':facility_street'=>$facility_street,
':facility_county'=>$facility_county,
':facility_city'=>$facility_city,
':facility_state'=>$facility_state,
':facility_name'=>$facility_name,
':facility_zipcode'=>$facility_zipcode,
':facility_phone'=>$facility_phone,
':facility_fax'=>$facility_fax,
':facility_licensetype'=>$facility_licensetype,
':facility_licensenumber'=>$facility_licensenumber,
':facility_email'=>$facility_email,
':facility_administrator'=>$facility_administrator,
':skilled_nursing'=>$skilled_nursing,
':independent_living'=>$independent_living,
':assisted_living'=>$assisted_living,
':memory_care'=>$memory_care,
':facility_type_other'=>$facility_type_other,
':care_ratio'=>$care_ratio,
':nurse_ratio'=>$nurse_ratio));
echo 'query parsed';
?>
This doesn't exactly answer what's going wrong with your code, but it might help solve it.
I would do this a bit differently. You say that you have a lot of fields. Your code is likely to get very long and repetitive. Since it looks like your form field names already correspond with your table columns, I would do something more like this (not tested):
// get a list of column names that exist in the table
$sql = "SELECT column_name FROM information_schema.columns WHERE table_name = 'Facilities'";
$q = $conn->prepare($sql);
$q->execute();
$columns = $q->fetchAll(PDO::FETCH_COLUMN, 0);
$cols = array();
foreach ($_POST as $key=>$value)
{
// if a field is passed in that doesn't exist in the table, remove it
if (!in_array($key, $columns)) {
unset($_POST[$key]);
}
}
$cols = array_keys($_POST);
$sql = "INSERT INTO Facilities(". implode(", ", $cols) .") VALUES (:". implode(", :", $cols) .")";
$q = $conn->prepare($sql);
// PDO matches named parameters with or without the leading colon,
// so the filtered $_POST array can be passed to execute() directly
$q->execute($_POST);
This way, you could have 10, 100, or 1000 fields and this code won't have to change at all. You also reduce your chance for typo errors because there's only one place where the column name is specified. You don't have to worry about SQL injection on the column names because you check to make sure that the column exists before allowing it to be used in your query.
This does, of course, assume that all fields passed in via $_POST correspond with column names in your table. If this isn't the case, it may be easiest to just store those particular field values that aren't columns in separate variables and unset() them from the $_POST array.

Copy a large table to a different table efficiently

There is a large table and I need to move its contents to a table with a different structure. The tables are in different databases. For this I am using a PHP script, but the script does not work the way I wanted: it over-copies and never stops. Maybe it is a noobish and simple question, but right now my head is spinning from trying, and I cannot put my finger on the problem. And this job needs to be done immediately. I will be glad if you help. Here is the code snippet:
function copy_table()
{
$this->load->database();
$num_rows = $this->db->get('orj_table')->num_rows();
$offset = 0;
$limit = 500;
while ($offset <= $num_rows)
{
$this->load->database();
//Query for original table
//......
$this->db->limit($limit, $offset);
$records = $this->db->get('orj_table')->result_array();
$this->db->close();
//Open a connection to new database.
$this->db_new = $this->load->database('new', TRUE);
foreach($records as $record)
{
$data1 = $record['data1'];
$data2 = $record['data2'];
$datas[] = array('data1' => $record['data1'],
'data2' => $record['data2']
);
}
//Insert 500 records at one time with "insert_batch"
$sorgu = $this->db_new->insert_batch('new_table', $datas);
$this->db->close();
$offset += 500;
}
}
Try using this simple MySQL query instead:
INSERT INTO different_table SELECT col1, col3, col4 FROM initial_table;
Why use PHP, or more importantly, CodeIgniter?
You should just do something along the lines of:
CREATE TABLE `newtable`
SELECT * FROM `othertable`;
Of course, with this you can select what you want, join, define new column names, etc., and whatever you select will be placed into the new table.
If your destination table already exists, change the CREATE to INSERT.

Problems copying Oracle result set to MySQL

What I'm trying to do is copy the result of a query in Oracle into a table in a MySQL database. The reasoning behind this is not important to the question, and I can't really take a different approach.
What I am doing is running the query through PHP, then copying the result into a newly created table in MySQL. I know this is not very efficient, but my table sizes are pretty small, even though the queries run very long.
Everything works except that I'm having trouble bringing the date over from Oracle; when I run it as-is, my date fields are set to 0. What I'm doing is checking the result set from Oracle to see if the type of a column is "DATE(7)", and if it is, I create a column in MySQL with type "DATE". But for some reason this doesn't work.
I'm using the following code, sorry it's quite long, but it seemed best to provide it all.
function hmis_query_transfer ($query_name, $parameters = NULL) {
//Create Connections to both servers
$dbConn_Oracle = hmis_oci_connect();
hmis_mysql_connect("hmis_temp");
//Retrieve the text for the query and run it on the Oracle server
$query_Oracle = hmis_query_by_name($query_name, $parameters);
$stmt_Oracle = hmis_oci_query($dbConn_Oracle , $query_Oracle);
$ncols = oci_num_fields($stmt_Oracle);
//Test if table is already created in MySQL
$mysql_check = "DESC ".$query_name;
hmis_mysql_query($mysql_check);
if (mysql_errno()==1146){
$mysql_create = "CREATE TABLE ".$query_name." ( ";
//Transform and append column names to query string
for ($j = 1; $j <= $ncols; $j++) {
$name = oci_field_name($stmt_Oracle, $j);
$type = oci_field_type($stmt_Oracle, $j);
if ($type == "NUMBER") {
$type = "INT";
} else if ($type == "VARCHAR2"){
$type = "VARCHAR";
}
$type .= "(".oci_field_size($stmt_Oracle, $j).")";
if ($type == "DATE(7)") {
$type = "DATE";
}
$mysql_create .= $name." ".$type.",";
}
$mysql_create = substr_replace( $mysql_create, "", -1 );
$mysql_create .= ')';
//Create Table
$result = hmis_mysql_query($mysql_create);
if ( !$result ){
die('<strong>Failed Create:</strong>'.mysql_error());
}
}
elseif (!mysql_errno()) {
//If the table already exists, empty it
$mysql_truncate = "TRUNCATE TABLE ".$query_name;
$result = hmis_mysql_query($mysql_truncate);
if ( !$result ){
die('<strong>Failed Truncate:</strong>'.mysql_error());
}
}
//Copy over row by row the data from the result set to MySQL
while ($results_row = oci_fetch_array($stmt_Oracle, OCI_ASSOC )) {
$mysql_insert = "INSERT INTO ".$query_name." VALUES (";
for ($i = 1; $i <= $ncols; $i++) {
$mysql_insert .= "'".$results_row[oci_field_name($stmt_Oracle, $i)]."',";
}
$mysql_insert = substr_replace( $mysql_insert, "", -1 );
$mysql_insert .= ")";
$result = hmis_mysql_query($mysql_insert);
if ( !$result ){
die('<strong>Failed Insert:</strong>'.mysql_error());
}
}
}
Can anyone see any flaws in my code? I'm open to suggestions on how I could do this a different way, though I would prefer to be able to keep my code. Thanks for any help.
The reason I'm copying data from Oracle to MySQL is that most of the queries take very long to run (15-20 min) and deal with huge datasets (200 million rows), but my result sets are very small (a few thousand rows at most). The way my application was built, it has to run repeat queries on the same dataset to accomplish its task. I would have liked to create views in Oracle to accomplish this, but I don't have the authority to do so. Therefore I save a mid-layer result set and perform my analysis on it, which happens to be on a much faster machine.
I do not know the format of the date from Oracle, but I would use the strtotime() function to convert it to a Unix timestamp and then use the date() function to put it into MySQL (YYYY-MM-DD) format. This should work as long as the date field in Oracle is not supposed to be a birthdate or otherwise out of the range of a Unix timestamp.
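For example, a minimal sketch, assuming the Oracle session renders dates in its common default DD-MON-YY display format (adjust if your NLS settings differ; the sample value is invented):

```php
// Hypothetical date string as Oracle might render it (DD-MON-YY)
$oracleDate = '21-AUG-14';

// Parse into a Unix timestamp, then reformat as YYYY-MM-DD for MySQL
$timestamp = strtotime($oracleDate);
$mysqlDate = ($timestamp !== false)
    ? date('Y-m-d', $timestamp)
    : null;   // strtotime() returns false when it can't parse the string

// $mysqlDate is now '2014-08-21'
```

Note that strtotime() interprets two-digit years 00-69 as 2000-2069, so genuinely old dates (a birthdate, say) would need the century spelled out or a different parsing strategy.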
