PHP/MySQL: Copy Table and Data from one Database to another

Another question: what is the best way to copy a whole table from one database to another without using something like this:
CREATE TABLE DB2.USER SELECT * FROM DB1.USER;
This does not work because I can't access data from two databases at the same time, so I decided to do it with PHP: cache all the data, then create the table in the other DB.
But now, what would be the fastest way to cache the data? I'd guess there are almost always fewer than 1000 records per table.
Thanks for your input

Export it to a .sql file and import it into the new database.
In phpMyAdmin, click the database you want to export, click Export, check "Save as file", then click Go.
Then import the file into the new database that you want to duplicate the data in.
In a strict PHP sense, the only way I could think to do it would be to use SHOW TABLES and DESCRIBE {$table} to return all the field names and structures, parse that out, build your CREATE TABLE statements, then loop through each table and build INSERT statements.
I'm afraid the best I can do for you is a sort of prototype code that I would imagine would be incredibly server intensive, which is why I recommend you take an alternate route.
Something like:
<?php
// connect to the first (source) DB -- hosts and credentials are placeholders
$src = mysql_connect("source_host", "user", "pass") or die(mysql_error());
mysql_select_db("source_db", $src);

// collect the table names
$aTables = array();
$tables = mysql_query("SHOW TABLES", $src) or die(mysql_error());
while ($row = mysql_fetch_assoc($tables)) {
    foreach ($row as $value) {
        $aTables[] = $value;
    }
}

// collect the column definitions of every table
$aFields = array();
foreach ($aTables as $i => $table) {
    $desc = mysql_query("DESCRIBE `" . $table . "`", $src) or die(mysql_error());
    while ($row = mysql_fetch_assoc($desc)) {
        $aFields[$i][] = array($row["Field"], $row["Type"], $row["Null"], $row["Key"], $row["Default"], $row["Extra"]);
    }
}

// connect to the second (target) DB on its own link
$dst = mysql_connect("target_host", "user", "pass", true) or die(mysql_error());
mysql_select_db("target_db", $dst);

for ($i = 0; $i < count($aTables); $i++) {
    // build the CREATE TABLE statement from the collected definitions
    // (note: indexes other than the primary key are not recreated here)
    $defs = array();
    $pks = array();
    foreach ($aFields[$i] as $f) {
        $defs[] = "`{$f[0]}` {$f[1]}"
                . ($f[2] == "NO" ? " NOT NULL" : "")
                . ($f[4] !== null ? " DEFAULT '" . mysql_real_escape_string($f[4], $dst) . "'" : "")
                . ($f[5] != "" ? " {$f[5]}" : "");
        if ($f[3] == "PRI") {
            $pks[] = "`{$f[0]}`";
        }
    }
    if (count($pks)) {
        $defs[] = "PRIMARY KEY (" . implode(",", $pks) . ")";
    }
    mysql_query("CREATE TABLE IF NOT EXISTS `{$aTables[$i]}` (" . implode(", ", $defs) . ")", $dst) or die(mysql_error($dst));

    // copy the data: read from the source link, write to the target link
    $result = mysql_query("SELECT * FROM `" . $aTables[$i] . "`", $src) or die(mysql_error($src));
    while ($row = mysql_fetch_assoc($result)) {
        $values = array();
        foreach ($row as $value) {
            $values[] = ($value === null) ? "NULL" : "'" . mysql_real_escape_string($value, $dst) . "'";
        }
        mysql_query("INSERT INTO `{$aTables[$i]}` VALUES (" . implode(",", $values) . ")", $dst) or die(mysql_error($dst));
    }
}
?>
This is based on this script, which may come in handy for reference.
Hopefully that's something to get you started, but I would suggest you look for a non-PHP alternative.

This is the way I do it, not original with me:
http://homepage.mac.com/kelleherk/iblog/C711669388/E2080464668/index.html

// connection details for the target database (placeholders)
$servername = "localhost";
$username = "root";
$password = "*******";
$dbname = "dbname";
$sqlfile = "/path/backupsqlfile.sql";

// pipe the dump file into the mysql command-line client to import it
$command = 'mysql -h' . $servername . ' -u' . $username . ' -p' . $password . ' ' . $dbname . ' < ' . $sqlfile;
exec($command);
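For completeness, the export half can be scripted the same way. A sketch on my part, assuming the mysqldump client is installed and the credentials are valid on the source server; $sourcehost and $sourcedb are placeholder variables:
// create the .sql dump file from the source database
$dumpcommand = 'mysqldump -h' . $sourcehost . ' -u' . $username . ' -p' . $password . ' ' . $sourcedb . ' > ' . $sqlfile;
exec($dumpcommand);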


How to compare tables from different mySQL databases with PHP?

I'm new to PHP as well as the field, and have been tasked with finding inconsistencies amongst tables from different MySQL databases. Every table should be identical between databases (column names, column count), with the exception of the actual data held in them, but they are not. I will need to identify the offending columns, the table they are in, and the database they are in. Below is my code thus far:
<?php
chdir(<path>);
include(<file>);
include(<file>);
$db = new db($hostname, $user, $pass, $commondatabase);

// Get baseline DB columns
//$db->select_db('<baseDB>');
//$btq = "SHOW TABLES";
//$btr = $db->query($btq);
//while($trows = $db->fetch_assoc($btr)) {
//    $bcq = "SHOW COLUMNS FROM ".$btr[<key>];
//    $bcr = $db->query($bcq);
//    $basecolumns[] = $bcr;
//}

$dbq = "SELECT * FROM <commonDB>.<DBnames> ORDER BY <DBnamesID>";
$dbr = $db->query($dbq);
while ($dbrow = $db->fetch_assoc($dbr)) {
    echo "\n";
    print_r($dbrow['dbname']);
    echo "\n";
    $db->select_db($dbrow['dbname']);
    $tq = "SHOW TABLES";
    $tr = $db->query($tq);
    while ($table = $db->fetch_assoc($tr)) {
        /*print_r($dbtables);*/
        $cq = "SHOW COLUMNS FROM " . $table['Tables_in_' . $dbrow['dbname']];
        $cr = $db->query($cq);
        $dbcols = [];
        while ($col = $db->fetch_assoc($cr)) {
            /*print_r($col);*/ $dbcols = $col;
            // Do check against baseline here
            //if($dbcols != $basecolumns) {
            //    $badcolumns[] = $dbcols;
            //}
        }
    }
}
$db->close();
?>
Currently it will loop down to the columns, but the output is neither visually pleasing nor very manageable. My first concern is to get this working so that I can loop down to the columns and actually collect them in their own arrays to be checked against the baseline, and I've hit a wall, so any direction would be much appreciated. Currently each column is being assigned to its own array rather than to an array of all the column names for the database and table the loop is on. TIA!
Here is what I came up with. Not sure if it's the most DRY way to go about it, but it works and achieves the results I was looking for. Thank you to Barmar for sending me down the right path.
<?php
chdir(<path>);
include(<file>);
include(<file>);
$db = new db($hostname, $user, $pass, $commondatabase);
$db->select_db(<database>);

// Get tables from baseline database
$q = "select distinct <column-name> from <table-name> where <where-param> ORDER BY <order-by-param>";
$r = $db->query($q);
while ($tables = $db->fetch_assoc($r)) {
    $table = $tables[<key>];
    $x = array();
    $x["$table"] = array();

    // Get table columns from baseline database
    $q = "select <column-name> from <table-name> where <where-param> and <where-param> ORDER BY <order-by-param>";
    $r2 = $db->query($q);
    while ($columns = $db->fetch_assoc($r2)) {
        $x["$table"][] = $columns[<key>];
    }

    // Check other databases for this table
    $q = "select * from $commondatabase.<table-with-database-names> ";
    $q .= "where <where-param> order by <order-by-param>";
    $r2 = $db->query($q);
    while ($databases = $db->fetch_assoc($r2)) {
        $y = array();
        $y["$table"] = array();

        // Get table columns in $databases[<key>]
        $q = "select <column-name> from <table-name> where <where-param> and <where-param> ORDER BY <order-by-param>";
        $r3 = $db->query($q);
        while ($columns = $db->fetch_assoc($r3)) {
            $y["$table"][] = $columns[<key>];
        }

        // Check against baseline, in both directions
        $z1 = array_diff($x["$table"], $y["$table"]);
        $z2 = array_diff($y["$table"], $x["$table"]);

        // Echo out comparison results
        if (empty($z1) && empty($z2)) {
            //echo "all good.\n";
        } else {
            echo "Difference found in {$databases[<key>]}.{$tables[<key>]}:";
            if (!empty($z1)) print_r($z1);
            if (!empty($z2)) print_r($z2);
        }
    }
}
$db->close();
?>
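As an aside (my own sketch, not part of the answer above): when all the databases live on the same MySQL server, a single query against information_schema can surface column differences without the nested loops. The PDO connection details and database names below are placeholders:
<?php
// Sketch: list columns that exist in only one of two databases on the
// same server, using information_schema instead of SHOW COLUMNS loops.
$pdo = new PDO('mysql:host=localhost;charset=utf8', 'user', 'pass');
$sql = "SELECT table_name, column_name,
               SUM(table_schema = :base)  AS in_base,
               SUM(table_schema = :other) AS in_other
        FROM information_schema.columns
        WHERE table_schema IN (:base2, :other2)
        GROUP BY table_name, column_name
        HAVING in_base = 0 OR in_other = 0";
$stmt = $pdo->prepare($sql);
$stmt->execute(array(
    ':base'  => 'baseline_db', ':other'  => 'other_db',
    ':base2' => 'baseline_db', ':other2' => 'other_db',
));
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    // each hit is a column missing from one side or the other
    echo "Difference: {$row['table_name']}.{$row['column_name']}\n";
}
?>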

PHP + MySQL: PHP hangs/freezes when inserting many rows in a loop

I have a strange problem with inserting rows in a loop into a MySQL table.
Let me show you the PHP code I use first, then I'll describe some statistics.
I tend to think that it is some MySQL issue, but I have absolutely no idea what kind. A maximum number of inserts per table per minute? (It can't be a maximum row count reached; there's plenty of space on disk.)
echo " For character=" . $row[1];
$xml = simplexml_load_file($api_url);
$i=0;
foreach ($xml->result->rowset->row as $value) {
$newQuery = 'INSERT INTO '.$tableName.' (transactionDateTime, quantity, typeName, price, clientName, station, transactionType, seller) VALUES ("'.$value['transactionDateTime'].'",'.$value['quantity'].',"'.$value['typeName'].'","'.$value['price'].'","'.$value['clientName'].'","'.$value['stationName'].'","'.$value['transactionType'].'","'.$row[1].'")';
$i++;
if (!mysqli_query($conn, $newQuery)) {
die('Error while adding transaction record: ' . mysqli_error($conn));
} // if END
} // foreach END
echo " added records=" . $i;
I have the same data in the XML each time; it doesn't change. (The XML has something like 1400+ rows that I would insert.)
It always inserts a different number of rows. The most it ever inserted was around 800+.
If I insert a ~10 second delay into the foreach loop at $i == 400, it will add even fewer rows. More delays, fewer rows.
It never reaches the part of the code with mysqli_error($conn).
It never reaches the echo " added records=" . $i; part of the code.
Since it always stops on a different record, I have to assume nothing is wrong with the INSERT query.
Since it never reaches the line after the foreach loop (echo " added records=" . $i;), I also assume the XML data wasn't processed to the end.
If I use another source of data (another character) where there are fewer records in the XML, this code works just fine.
What could possibly be my problem?
It could be that you're firing multiple queries at your SQL server. It's better to build a single SQL query in your foreach, then fire it once.
Something like this, basically:
$db = new mysqli($hostname, $username, $password, $database);
if ($db->connect_errno > 0)
{
    $error[] = "Couldn't establish connection to the database.";
}
$commaIncrement = 1;
$commaCount = count($result);
$SQL[] = "INSERT INTO $table $columns VALUES";
foreach ($result as $value)
{
    // no comma after the last row of values
    $comma = $commaCount == $commaIncrement ? "" : ",";
    $SQL[] = "(";
    $SQL[] = "'$value[0]','$value[1]','$value[2]','$value[3]'";
    $SQL[] = ")$comma";
    $commaIncrement++;
}
$SQL[] = ";";
$completedSQL = implode(' ', $SQL);
// fire the whole batch in one query
if (!$db->query($completedSQL))
{
    $error[] = "Insert failed: " . $db->error;
}
$db->close();
Scrowler is right, your PHP script is timing out. As a test, you can add
set_time_limit(0);
to the start of your PHP script.
WARNING - Don't use this in production or anywhere else. Always set a reasonable time limit for the script.
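Beyond the time limit, a sketch of another option (mine, not from the answers above): prepare the INSERT once and reuse it inside a transaction, which avoids both a giant query string and most of the per-row commit overhead. The column names come from the question's query; everything else is a placeholder:
$db = new mysqli($hostname, $username, $password, $database);
// prepare once; the ? placeholders also remove the need for manual quoting
$stmt = $db->prepare("INSERT INTO $tableName (transactionDateTime, quantity, typeName, price, clientName, station, transactionType, seller) VALUES (?,?,?,?,?,?,?,?)");
// bind_param binds by reference, so the variables are filled in per row
$stmt->bind_param("sissssss", $dt, $qty, $type, $price, $client, $station, $txType, $seller);
$db->begin_transaction();
foreach ($xml->result->rowset->row as $value) {
    // copy attributes into the bound variables (casts force plain strings)
    $dt      = (string)$value['transactionDateTime'];
    $qty     = (int)$value['quantity'];
    $type    = (string)$value['typeName'];
    $price   = (string)$value['price'];
    $client  = (string)$value['clientName'];
    $station = (string)$value['stationName'];
    $txType  = (string)$value['transactionType'];
    $seller  = $row[1];
    $stmt->execute();
}
$db->commit(); // one commit instead of one per row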

Yii insert series of data

So, I have a series of data that I need to insert into a table. Right now I am using a for loop to iterate through each entry and save the models one by one, but that doesn't seem like a good way to do it, and using a transaction would be an issue. What's a better way to do it to improve performance, and so that I can use a transaction? Here's the code I am currently using.
foreach ($sheetData as $data)
{
    $newRecord = new Main;
    $newRecord->id = $data['A'];
    $newRecord->name = $data['B'];
    $newRecord->unit = $data['C'];
    $newRecord->save();
}
If you can skip validation, you can generate a single SQL INSERT and execute it once, like:
$count = 0;
$sql = '';
foreach ($sheetData as $data) {
    if (!$count)
        $sql .= 'INSERT INTO tbl_main (id, name, unit) VALUES (' . $data['A'] . ',"' . $data['B'] . '","' . $data['C'] . '")';
    else
        $sql .= ', (' . $data['A'] . ',"' . $data['B'] . '","' . $data['C'] . '")';
    $count++;
}
Yii::app()->db->createCommand($sql)->execute();
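To cover the transaction part of the question as well, a sketch wrapping that single command in Yii 1.x's standard transaction API:
$transaction = Yii::app()->db->beginTransaction();
try {
    Yii::app()->db->createCommand($sql)->execute();
    $transaction->commit();
} catch (Exception $e) {
    // roll back the whole batch if anything fails
    $transaction->rollback();
    throw $e;
}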

Separate merged SQL rows with a comma

I'm having a bit of trouble getting my retrieved values from an SQL query into the correct format.
I've managed to join multiple rows into one value; however, I am not sure how to separate each of the values with a comma. Essentially I need all the IDs of a product to be retrieved with a comma between each. For example, if the database had values of '5,6,9,1', '1,3,4' and '2,1', I want it to produce '5,6,9,1,1,3,4,2,1', instead of what it is doing at the moment, which is more like '5,6,911,3,42,1'.
The code I'm using is below. Any help would be greatly appreciated.
$hist = "SELECT ORDITEMS FROM cust_orderc WHERE ORDDATE >
to_date('".$olddate."','dd/mm/yyyy')";
$histitem = OCIParse($db, $hist);
OCIExecute($histitem);
while($row = oci_fetch_array($histitem)){
$pastitem .= $row['ORDITEMS'];
}
echo "$pastitem";
You can do the same in Oracle using LISTAGG; note it needs a delimiter and a WITHIN GROUP clause:
$hist = "SELECT LISTAGG(ORDITEMS, ',') WITHIN GROUP (ORDER BY ORDITEMS) AS ORDITEMS FROM cust_orderc WHERE ORDDATE > to_date('" . $olddate . "','dd/mm/yyyy')";
Edit: or the PHP way:
$pastitem = '';
while ($row = oci_fetch_array($histitem)) {
    $pastitem .= $row['ORDITEMS'] . ',';
}
$pastitem = trim($pastitem, ",");
echo $pastitem;
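A slightly tidier variant of that PHP approach (my sketch, same OCI API): collect the values into an array and let implode() place the commas, so there is no trailing one to trim:
$items = array();
while ($row = oci_fetch_array($histitem, OCI_ASSOC)) {
    $items[] = $row['ORDITEMS'];
}
// implode() puts a comma between each value, never a trailing one
echo implode(',', $items);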

How to remove htmlentities() values from the database?

Long before I knew anything (not that I know much even now) I designed a web app in PHP which inserted data into my MySQL database after running the values through htmlentities(). I eventually came to my senses, removed this step, stuck it in the output rather than the input, and went on my merry way.
However, I've since had to revisit some of this old data, and unfortunately I have an issue: when it's displayed on screen, I'm getting values which have effectively been htmlentitied twice.
So, is there a MySQL or phpMyAdmin way of changing all the older, affected rows back into their relevant characters, or will I have to write a script to read each row, decode and update all 17 million rows in 12 tables?
EDIT:
Thanks for the help everyone. I wrote my own answer down below with some code in it; it's not pretty, but it worked on the test data earlier, so barring someone pointing out a glaring error in my code while I'm in bed, I'll be running it on a backup DB tomorrow, and then on the live one if that works out alright.
I ended up using this, not pretty, but I'm tired, it's 2am and it did its job! (Edit: on test data)
$tables = array('users', 'users_more', 'users_extra', 'forum_posts', 'posts_edits', 'forum_threads', 'orders', 'product_comments', 'products', 'favourites', 'blocked', 'notes');
foreach ($tables as $table)
{
    $sql = "SELECT * FROM {$table} WHERE data_date_ts < '{$encode_cutoff}'";
    $rows = $database->query($sql);
    while ($row = mysql_fetch_assoc($rows))
    {
        $new = array();
        foreach ($row as $key => $data)
        {
            $new[$key] = $database->escape_value(html_entity_decode($data, ENT_QUOTES, 'UTF-8'));
        }
        // drop the id column so it isn't included in the SET clause
        array_shift($new);
        $new_string = "";
        $i = 0;
        foreach ($new as $new_key => $new_data)
        {
            if ($i > 0) { $new_string .= ", "; }
            $new_string .= $new_key . "='" . $new_data . "'";
            $i++;
        }
        $sql = "UPDATE {$table} SET " . $new_string . " WHERE id='" . $row['id'] . "'";
        $database->query($sql);
        // plus some code to check that all out
    }
}
Since PHP was the method of encoding, you'll want to use it to decode. You can use html_entity_decode to convert them back to their original characters. Gotta loop!
Just be careful not to decode rows that don't need it. Not sure how you'll determine that.
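One possible way to determine that (a sketch, not from the answer): decode a copy of each value and only write it back when the result differs, so already-clean rows stay untouched:
// only rows whose value changes under decoding need an UPDATE
$decoded = html_entity_decode($value, ENT_QUOTES, 'UTF-8');
if ($decoded !== $value) {
    // value contained entities; write $decoded back to the row
}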
I think writing a PHP script is a good thing to do in this situation. As Dave said, you can use the html_entity_decode() function to convert your texts back.
Try your script on a table with few entries first; this will save you a lot of testing time. And of course, remember to back up your table(s) before running the script.
I'm afraid there is no shorter possibility. The computation for millions of rows remains quite expensive, no matter how you convert the datasets back, so go for a PHP script... it's the easiest way.
This is my bullet-proof version. It iterates over all tables and string columns in a database, determines the primary key(s), and performs the updates.
It is intended to be run from the command line so you can see the progress information.
<?php
$DBC = new mysqli("localhost", "user", "dbpass", "dbname");
$DBC->set_charset("utf8");

$tables = $DBC->query("SHOW FULL TABLES WHERE Table_type='BASE TABLE'");
while ($table = $tables->fetch_array()) {
    $table = $table[0];
    $columns = $DBC->query("DESCRIBE `{$table}`");
    $textFields = array();
    $primaryKeys = array();
    while ($column = $columns->fetch_assoc()) {
        if ($column["Key"] == "PRI") {
            $primaryKeys[] = $column['Field'];
        } else if (strpos($column["Type"], "char") !== false || strpos($column["Type"], "text") !== false) {
            // matches char, varchar, text, mediumtext and so on
            $textFields[] = $column['Field'];
        }
    }
    if (!count($primaryKeys)) {
        echo "Cannot convert table without primary key: '$table'\n";
        continue;
    }
    foreach ($textFields as $textField) {
        // only rows that contain an ampersand can hold entities
        $sql = "SELECT `" . implode("`,`", $primaryKeys) . "`,`$textField` FROM `$table` WHERE `$textField` LIKE '%&%'";
        $candidates = $DBC->query($sql);
        $rowCount = $candidates->num_rows;
        echo "Updating $rowCount in $table.$textField\n";
        $count = 0;
        while ($candidate = $candidates->fetch_assoc()) {
            $oldValue = $candidate[$textField];
            $newValue = html_entity_decode($candidate[$textField], ENT_QUOTES | ENT_XML1, 'UTF-8');
            if ($oldValue != $newValue) {
                $sql = "UPDATE `$table` SET `$textField` = '"
                     . $DBC->real_escape_string($newValue)
                     . "' WHERE ";
                foreach ($primaryKeys as $pk) {
                    $sql .= "`$pk` = '" . $DBC->real_escape_string($candidate[$pk]) . "' AND ";
                }
                $sql .= "1"; // closes the trailing AND
                $DBC->query($sql);
            }
            $count++;
            echo "$count / $rowCount\r";
        }
    }
}
?>
cheers
Roland
It's a bit kludgy, but I think the mass update is the only way to go...
$Query = "SELECT row_id, html_entitied_column FROM table";
$result = mysql_query($Query, $connection);
while ($row = mysql_fetch_array($result)) {
    $updatedValue = mysql_real_escape_string(html_entity_decode($row['html_entitied_column']), $connection);
    $Query = "UPDATE table SET html_entitied_column = '" . $updatedValue . "' ";
    $Query .= "WHERE row_id = " . $row['row_id'];
    mysql_query($Query, $connection);
}
This is simplified, with no error handling etc.
Not sure what the processing time would be on millions of rows, so you might need to break it up into chunks to avoid script timeouts.
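A sketch of that chunking idea, reusing the same hypothetical table and column names: process a fixed number of rows per pass, keyed on row_id so each pass picks up where the last one stopped:
$chunk = 10000;
$lastId = 0;
do {
    // fetch the next batch of rows after the last processed id
    $result = mysql_query("SELECT row_id, html_entitied_column FROM table WHERE row_id > $lastId ORDER BY row_id LIMIT $chunk", $connection);
    $n = 0;
    while ($row = mysql_fetch_array($result)) {
        $updatedValue = mysql_real_escape_string(html_entity_decode($row['html_entitied_column']), $connection);
        mysql_query("UPDATE table SET html_entitied_column = '" . $updatedValue . "' WHERE row_id = " . $row['row_id'], $connection);
        $lastId = $row['row_id'];
        $n++;
    }
} while ($n == $chunk); // a short final batch means we're done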
I had the exact same problem. Since I had multiple clients running the application in production, I wanted to avoid running a PHP script to clean the database for every one of them.
I came up with a solution that is far from perfect, but does the job painlessly.
Track down all the spots in your code where you use htmlentities() before inserting data, and remove those calls.
Change your "display data as HTML" method to something like this:
return html_entity_decode(htmlentities($chaine, ENT_NOQUOTES), ENT_NOQUOTES);
The undo-redo process is kind of ridiculous, but it does the job, and your database will slowly clean itself every time users update the incorrect data.
