Copy/duplicate/backup database tables effectively - mysql - php

Reason: I was assigned to run a script that advances a website. It's a fantasy football site, and there are several instances of the site hosted on different domains. Some have more than 80k users, and each user is supposed to have a team consisting of 15 players. Hence some tables have (number of users) x (number of players) rows.
However, sometimes the script fails and the result gets corrupted, so I must back up the 10 tables in question before I execute the script. In any case, I still need to back up the tables to keep a historical record of users' actions, because a football season may last 50+ game weeks.
Task: To duplicate db tables using a PHP script. When I started, I used to back up the tables using SQLyog. It works, but it's time consuming since I have to wait for each table to be duplicated. Besides, SQLyog crashes while duplicating large tables, which is very annoying.
Current solution: I have created a simple application with an interface that does the job, and it works great. It consists of three files: one for the db connection, a second for db manipulation, and a third for the user interface that uses the second file's code.
The thing is, sometimes it gets stuck in the middle of the table-duplication process.
Objective: To create an application to be used by an admin to facilitate database backups using MySQL + PHP.
My question: How do I ensure that the duplicating script backs up each table completely, without hanging the server or interrupting the script?
Below I will include my code for the duplicating function, but these are the two crucial lines where I think the problem lies:
//duplicate tables structure
$query = "CREATE TABLE $this->dbName.`$newTableName` LIKE $this->dbName.`$oldTable`";
//duplicate tables data
$query = "INSERT INTO $this->dbName.`$newTableName` SELECT * FROM $this->dbName.`$oldTable`";
The rest of the code is solely for validation in case an error occurs. If you wish to take a look at the whole code, be my guest. Here's the function:
private function duplicateTable($oldTable, $newTableName) {
    if ($this->isExistingTable($oldTable))
    {
        $this->printLogger("Original table is valid -table exists- : $oldTable ");
    }
    else
    {
        $this->printrR("Original table is invalid -table does not exist- : $oldTable ");
        return false;
    }
    if (!$this->isExistingTable($newTableName)) // make sure the new table does not already exist
    {
        $this->printLogger("Destination table name is valid -no table with this name- : $newTableName");
        $query = "CREATE TABLE $this->dbName.`$newTableName` LIKE $this->dbName.`$oldTable`";
        $result = mysql_query($query) or $this->printrR("Error in query. Query:\n $query\n Error: " . mysql_error());
    }
    else
    {
        $this->printrR("Destination table is invalid. -table already exists- $newTableName");
        $this->printr("Now checking if the tables actually match: $oldTable => $newTableName \n");
        $varifyStatus = $this->varifyDuplicatedTables($oldTable, $newTableName);
        if ($varifyStatus >= 0)
        {
            $this->printrG("Tables match; it seems they were duplicated before $oldTable => $newTableName");
        }
        else
        {
            $this->printrR("The duplicate table exists, yet it doesn't match the original! $oldTable => $newTableName");
        }
        return false;
    }
    if ($result)
    {
        $this->printLogger("Query executed 1/2");
    }
    else
    {
        $this->printrR("Something went wrong in duplicateTable\nQuery: $query\n\n\nMySQL error: " . mysql_error());
        return false;
    }
    if (!$this->isExistingTable($newTableName)) // validate that the table has been created
    {
        $this->printrR("Attempt to duplicate table structure failed: $newTableName was not found after creating!");
        return false;
    }
    else
    {
        $this->printLogger("Table created successfully: $newTableName");
        // Now checking table structure
        $this->printLogger("Now comparing indexes ... ");
        $autoInc = $this->checkAutoInc($oldTable, $newTableName);
        if ($autoInc == 1)
        {
            $this->printLogger("Auto increment seems OK");
        }
        elseif ($autoInc == 0)
        {
            $this->printLogger("No auto-increment key for either table. Continuing anyway");
        }
        elseif ($autoInc == -1)
        {
            $this->printLogger("Auto-increment keys do not match!");
        }
        $time = $oldTable == 'team_details' ? 5 : 2;
        $msg = $oldTable == 'team_details' ? "This may take a while for team_details. Please wait." : "Please wait.";
        $this->printLogger("Sleep for $time ...\n");
        sleep($time);
        $this->printLogger("Preparing to copy data ...\n");
        $query = "INSERT INTO $this->dbName.`$newTableName` SELECT * FROM $this->dbName.`$oldTable`";
        $this->printLogger("Processing data-copy query. $msg...\n\n\n");
        $result = mysql_query($query) or $this->printrR("Error in query. Query:\n $query\n Error: " . mysql_error());
        // ERROR usually happens here with large tables
        sleep($time); // to let the db process the current request
        $this->printLogger("Query executed 2/2");
        sleep($time); // to let the db process the current request
        if ($result)
        {
            $this->printLogger("Table created ($newTableName) and data has been copied!");
            $this->printLogger("Confirming number of rows ... ");
            /////////////////////////////////
            // start checking count
            $numRows = $this->checkCountRows($oldTable, $newTableName);
            if ($numRows)
            {
                $this->printLogger("Table duplicated successfully ");
                return true;
            }
            else
            {
                $this->printLogger("Table duplicated, but please check num rows for $newTableName");
                return -3;
            }
            // end of checking count
            /////////////////////////////////
        } // end of if ($result) query 2/2
        else
        {
            $this->printrR("Something went wrong in duplicateTable\nINSERT INTO $oldTable -> $newTableName\n\n$query\n mysql_error() \n " . mysql_error());
            return false;
        }
    }
}
As you noticed, the function only duplicates one table; that's why there is another function that takes an array of table names from the user and passes them one by one to duplicateTable().
If any other function should be included for this question, please let me know.
One solution that pops into my mind: would duplicating tables part by part add any improvement? I'm not sure how INSERT INTO ... SELECT works internally, but maybe if I could insert, say, 25% of the rows at a time it might help?
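For reference, here is a rough, untested sketch of what that part-by-part idea could look like. It assumes the table has an auto-increment integer primary key named id, which the question does not confirm, and it uses the same mysql_* API as the code above; $dbName, $oldTable and $newTableName are the same variables as in the function.

$batchSize = 10000;
$maxId = 0;
do {
    // Copy the next slice of rows, paging on the (assumed) `id` primary key.
    $query = "INSERT INTO $dbName.`$newTableName`
              SELECT * FROM $dbName.`$oldTable`
              WHERE id > $maxId
              ORDER BY id
              LIMIT $batchSize";
    $result = mysql_query($query) or die("Batch copy failed: " . mysql_error());
    $copied = mysql_affected_rows();
    // Find the highest id copied so far, so the next batch starts after it.
    $row = mysql_fetch_row(mysql_query("SELECT MAX(id) FROM $dbName.`$newTableName`"));
    $maxId = (int) $row[0];
} while ($copied == $batchSize);

Each batch is a shorter-running statement, so the web server and the script are less likely to hit a timeout mid-copy, at the cost of more round trips.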

However, sometimes the script fails and the result gets corrupted, therefore I must back up the 10 tables in question before I execute the script.
You probably need another solution here: transactions. Wrap all the queries the failing script runs into a transaction. If the transaction fails, all data stays the same as at the beginning of the operation; if the queries execute correctly, you are OK.
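A minimal sketch of that idea with PDO follows. It assumes the tables use InnoDB (MyISAM ignores transactions), and the connection details and the UPDATE inside are placeholders, not the asker's real queries.

$pdo = new PDO('mysql:host=localhost;dbname=fantasy', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $pdo->beginTransaction();

    // ... run the risky script's UPDATE / INSERT statements here ...
    $pdo->exec("UPDATE team_details SET points = points + 1 WHERE team_id = 42"); // example only

    $pdo->commit();          // everything succeeded
} catch (Exception $e) {
    $pdo->rollBack();        // any failure: data is back to its initial state
    echo "Script failed, nothing was changed: " . $e->getMessage();
}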

Why are you duplicating the table every time?
Clusters are a good option: they can keep duplicate copies of your tables in a distributed manner and are much more reliable and secure.

Related

To delete() or forceDelete(). A better way to determine

I have a database with 95 tables. A users table exists for the system users. Many other tables (45 out of the 95) have a "created_by" column that refers to the user who created/added the row, through users.id.
Now, if I want to delete a user, I cannot just do $user->delete(); I need to keep the user around (soft-delete it) in case this user has created rows in other tables. But if this user didn't add any content, then I should just go ahead and $user->forceDelete() it.
My question is: is there a good way to go about doing this? That is, to check whether a user should be deleted or force-deleted when we have this big a number of tables.
I figured I could just loop through the tables and check if the id of the user (to be deleted) exists; if found, it's ->delete(), else it's ->forceDelete(). Here is the code:
// Get all tables
$allTables = \DB::connection()->getDoctrineSchemaManager()->listTableNames();
$tablesWithCreatedBy = [];
foreach ($allTables as $tableName) {
    $tableColumns = \DB::getSchemaBuilder()->getColumnListing($tableName);
    if (in_array('created_by', $tableColumns)) {
        $tablesWithCreatedBy[] = $tableName;
    }
}
foreach ($tablesWithCreatedBy as $tableName) {
    $result = \DB::select(" SELECT created_by FROM `$tableName`
                            WHERE `created_by` = {$this->user->id} LIMIT 0, 1 ");
    if (isset($result[0])) {
        $this->user->delete();
        break;
    }
}
// If wasn't trashed from the code above, then force delete the user!
if (!$this->user->trashed()) {
    $this->user->forceDelete();
}
I feel there must be a better way to do it! Is there?
You have to add a records_count column to the users table that is incremented every time the user adds content to other tables; after that change the solution is as simple as:
$result = ($this->user->records_count > 0)
? $this->user->delete()
: $this->user->forceDelete();
Or write a Laravel console command and run it in the background that walks through the db and does the cleanup operations.
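A rough sketch of what such a console command could look like. The command name, class name, and User namespace are made up for illustration; the table discovery reuses the same calls as the question's code, and the user model is assumed to use SoftDeletes (as implied by trashed()).

<?php
// app/Console/Commands/CleanupUsers.php -- hypothetical path and command name.

namespace App\Console\Commands;

use App\User; // adjust to App\Models\User on newer Laravel versions
use Illuminate\Console\Command;

class CleanupUsers extends Command
{
    protected $signature = 'users:cleanup';
    protected $description = 'Force-delete soft-deleted users who never created any content';

    public function handle()
    {
        // Same table discovery as in the question, done once.
        $allTables = \DB::connection()->getDoctrineSchemaManager()->listTableNames();
        $tablesWithCreatedBy = [];
        foreach ($allTables as $tableName) {
            if (\Schema::hasColumn($tableName, 'created_by')) {
                $tablesWithCreatedBy[] = $tableName;
            }
        }

        // Walk over soft-deleted users and purge the ones with no content anywhere.
        foreach (User::onlyTrashed()->cursor() as $user) {
            $hasContent = false;
            foreach ($tablesWithCreatedBy as $tableName) {
                if (\DB::table($tableName)->where('created_by', $user->id)->exists()) {
                    $hasContent = true;
                    break;
                }
            }
            if (!$hasContent) {
                $user->forceDelete();
            }
        }
    }
}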

Splitting a string of values like 1030:0,1031:1,1032:2 and storing data in database

I have a bunch of photos on a page and using jQuery UI's Sortable plugin, to allow for them to be reordered.
When my sortable function fires, it writes a new order sequence:
1030:0,1031:1,1032:2,1040:3,1033:4
Each item of the comma delimited string, consists of the photo ID and the order position, separated by a colon. When the user has completely finished their reordering, I'm posting this order sequence to a PHP page via AJAX, to store the changes in the database. Here's where I get into trouble.
I have no problem getting my script to work, but I'm pretty sure it's the incorrect way to achieve what I want, and will suffer hugely in performance and resources - I'm hoping somebody could advise me as to what would be the best approach.
This is my PHP script that deals with the sequence:
if ($sorted_order) {
    $exploded_order = explode(',', $sorted_order);
    foreach ($exploded_order as $order_part) {
        $exploded_part = explode(':', $order_part);
        $part_count = 0;
        foreach ($exploded_part as $part) {
            $part_count++;
            if ($part_count == 1) {
                $photo_id = $part;
            } elseif ($part_count == 2) {
                $order = $part;
            }
            $SQL = "UPDATE article_photos ";
            $SQL .= "SET order_pos = :order_pos ";
            $SQL .= "WHERE photo_id = :photo_id;";
            ... rest of PDO stuff ...
        }
    }
}
My concerns arise from the nested foreach functions and also running so many database updates. If a given sequence contained 150 items, would this script cry for help? If it will, how could I improve it?
** This is for an admin page, so it won't be heavily abused **
You can use one UPDATE, with some clever code like so:
Create the array $data['order'] in the loop, then:
$q = "UPDATE article_photos SET order_pos = (CASE photo_id ";
foreach($data['order'] as $sort => $id){
$q .= " WHEN {$id} THEN {$sort}";
}
$q .= " END ) WHERE photo_id IN (".implode(",",$data['order']).")";
A little clearer, perhaps:
UPDATE article_photos SET order_pos = (CASE photo_id
    WHEN 1 THEN 999
    WHEN 2 THEN 1000
    WHEN 3 THEN 1001
END)
WHERE photo_id IN (1,2,3)
I use this approach for exactly what you're doing: updating sort orders.
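Since the question builds its statements with PDO, here is a hedged sketch of the same single-UPDATE idea with bound parameters; the $pdo connection and the $data['order'] array (position => photo id, as above) are assumed to exist.

// Build "CASE photo_id WHEN ? THEN ? ... END" with placeholders,
// then bind ids and positions, plus the ids again for the IN (...) list.
$when = '';
$params = [];
foreach ($data['order'] as $sort => $id) {
    $when .= ' WHEN ? THEN ?';
    $params[] = $id;
    $params[] = $sort;
}
$in = implode(',', array_fill(0, count($data['order']), '?'));
$sql = "UPDATE article_photos
        SET order_pos = (CASE photo_id{$when} END)
        WHERE photo_id IN ({$in})";
$stmt = $pdo->prepare($sql);
$stmt->execute(array_merge($params, array_values($data['order'])));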
No need for the second foreach: you know it's going to be two parts if your data passes validation (I'm assuming you validated this. If not: you should =) so just do:
if (count($exploded_part) == 2) {
    $id = $exploded_part[0];
    $seq = $exploded_part[1];
    /* rest of code */
} else {
    /* error - data does not conform despite validation */
}
As for update hammering: do your DB updates in a transaction. Your db will queue the ops, but not commit them to the main DB until you commit the transaction, at which point it'll happily do the update "for real" at lightning speed.
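A minimal sketch of that with PDO, reusing the prepared statement from the question ($pdo and $sorted_order are assumed to exist as above):

$pdo->beginTransaction();
try {
    $stmt = $pdo->prepare(
        "UPDATE article_photos SET order_pos = :order_pos WHERE photo_id = :photo_id"
    );
    foreach (explode(',', $sorted_order) as $order_part) {
        list($photo_id, $order) = explode(':', $order_part);
        $stmt->execute([':photo_id' => $photo_id, ':order_pos' => $order]);
    }
    $pdo->commit();   // all 150 updates become visible at once
} catch (Exception $e) {
    $pdo->rollBack(); // nothing is half-applied if one update fails
    throw $e;
}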
I suggest making your script even simpler and changing the variable names, so the code is more readable.
$parts = explode(',', $sorted_order);
foreach ($parts as $part) {
    list($id, $position) = explode(':', $part);
    // Now you can work with $id and $position
}
More info about list: http://php.net/manual/en/function.list.php
Also, about performance and your data structure:
The way you store your data is not perfect, but you will not suffer any performance issues from it; you send less data that way, so there is less overhead overall.
However, the drawback of your data structure is that you will most probably be unable to establish relationships between tables, make joins, or alter the table structure in a clean way.

Transform MySQL table and rows

I have a problem here, and I don't even have a clue what to Google or how to solve it.
I am making a PHP application to export and import data from one MySQL table into another, and I have a problem with these tables.
In the source table it looks like this:
And my destination table has ID, and pr0, pr1, pr2 as rows, so it looks like this:
Now the problem is the following: if I just copy (insert every value of the first table as a new row in the second), it will have around 20,000 rows instead of, for example, 1,000.
Even if I copy every record as a new row into the second database, is there any way I can fuse rows? Basically I need to check whether the last row with that ID_ already has a value in the column in question (pr2, for example): if it does, insert a new row; but if the last row with the same ID_ does not have a value in the pr2 column, just update that row with the value for pr2.
I need an idea of how to do it in PHP or MySQL.
So you have a few problems:
1) Copying the table from SQL into PHP: pay attention to memory usage; check your script with PHP's memory_get_usage(). It will show you that importing SQL data can be expensive. Look this up. Another thing is that PHP doesn't release memory when you set new values into an array; that will be useful later on.
2) I didn't understand whether the values are unique at the source or should be unique at the destination table, so I will assume that everything at the source needs to be at the destination as is. I will also assume that pr = pr0 and quant = pr1.
3) You have mismatched names; that can also be an issue and should be taken care of.
4) I will use mysqli as the SQL connector, and assume $db is already connected.
SCRIPT:
<?PHP
$select_sql = "SELECT * FROM Table_source";
$data_source = array();
while($array_data= mysql_fetch_array($select_sql)) {
$data_source[] = $array_data;
$insert_data=array();
}
$bulk =2000;
foreach($data_source as $data){
if(isset($start_query) == false)
{
$start_query = 'REPLACE INTO DEST_TABLE ('ID_','pr0','pr1','pr2')';
}
$insert_data[]=implode(',',$data).',0)';// will set 0 to the
if(count($insert_data) >=$bulk){
$values = implode('),(',$insert_data);
$values = substr(1,2,$values);
$values = ' VALUES '.$values;
$insert_query = $start_query.' '.$values;
$mysqli->query($insert_query);
$insert_data = array();
} //CHECK THE SYNTAX IM NOT SURE OF ALL OF IT MOSTLY THE SQL PART>> SEE THAT THE QUERY IS OK
}
if(count($insert_data) >=$bulk) // IF THERE ARE ANY EXTRA PIECES..
{
$values = implode('),(',$insert_data);
$values = substr(1,2,$values);
$values = ' VALUES '.$values;
$insert_query = $start_query.' '.$values;
$mysqli->query($insert_query);
$insert_data = null;
}
?>
It's off the top of my head, but check this idea and tell me if it works. The bugs might be in small things I forgot in the query structure; print the query and paste it into phpMyAdmin or your DB query tool to confirm it's all good, but this concept will save a lot of problems.

Insert multiple data as well as update in database using php?

I get the location names within 50 km of the current location using the Google API, and that works fine.
I need to insert all these locations into my database. If a location is already there in the database, I need to update it.
For example, if I get 10 locations from the Google API and 5 of them are already in my database, I need to update those 5 locations and insert the remaining 5.
Here is my code:
<?php
require 'dbconnect.php';
$LocaName=$_REQUEST['locname'];
$address=$_REQUEST['address'];
$latt=$_REQUEST['Latt'];
$long=$_REQUEST['Long'];
if($latt && $long)
{
$LocaNamearray = explode("|||", $LocaName);
$addressarray = explode("|||", $address);
$lattarray=explode("|||",$latt);
$longarray=explode("|||",$long);
for($i=0;$i<count($lattarray);$i++)
{
$query1="select * from tbl_MapDetails where Latitude='".$lattarray[$i]."'and Longitude='".$longarray[$i]."'";
$result1=mysql_query($query1);
$now=mysql_num_rows($result1);
}
if($now >=1)
{
for($k=0;$k<count($lattarray);$k++)
{
$query="update tbl_MapDetails set LocationName='".$LocaNamearray[$k]."', Address='".$addressarray[$k]."',Latitude='".$lattarray[$k]."', Longitude='".$longarray[$k]."' where Latitude='".$lattarray[$k]."'and Longitude='".$longarray[$k]."'";
}
$nav="update";
}
else
{
$query ="INSERT INTO tbl_MapDetails(LocationName,Address,Latitude,Longitude) VALUES";
$strDelimiter = "";
for($j=0;$j<count($LocaNamearray);$j++)
{
$name =$LocaNamearray[$j];
$address =$addressarray[$j];
$lat = $lattarray[$j];
$long = $longarray[$j];
$query .= $strDelimiter."('$name', '$address','$lat','$long')";
$strDelimiter = ',';
}
$nav="Add";
}
$result= mysql_query($query);
if($result)
{
echo mysql_error();
$message=array("message"=>"sucessfully".$nav);
}
else
{
echo mysql_error();
$message=array("message"=>"fail".$nav);
}
}
else
{
$message=array("message"=>"require latt and long");
}
echo json_encode($message);
?>
Here, insert and update are working, but I need to check every location against the database: locations that are not yet in the database need to be inserted, and the ones that are need to be updated. How do I check both conditions so that matched locations are updated and unmatched locations are inserted? Please guide me.
Your logic is wrong in the code. What you are doing is looping through the provided data and for each set of data checking if a location with that lat/long exists and storing it in the $now variable. Once you've finished that loop, you're then checking $now and looping through the provided data again and either INSERTing or UPDATEing each set of data. So if the last set of data exists, your script will try and UPDATE each set of data. If it doesn't, your script will try to INSERT each set of data. Your code should be something like this (mixture of your code and pseudo-code):
for($i=0;$i<count($lattarray);$i++)
{
$query1="select * from tbl_MapDetails where Latitude='".$lattarray[$i]."'and Longitude='".$longarray[$i]."'";
$result1=mysql_query($query1);
$now=mysql_num_rows($result1);
if($now >=1)
{
// update table with location details
}
else
{
// insert location details into table
}
}
If this becomes a performance issue you could look at retrieving all the SELECT data first but if you're only dealing with 10 rows at a time you should be OK.
Note: depending on where your $_REQUEST data is coming from you might want to do some validation, i.e. to check you have matching sets of lat/long/name/address details.
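For example, a minimal sanity check before the loop could look like this (a sketch; the array names come from the question's code):

// Make sure the four exploded arrays line up before touching the database.
if (count($lattarray) !== count($longarray)
    || count($lattarray) !== count($LocaNamearray)
    || count($lattarray) !== count($addressarray)) {
    echo json_encode(array("message" => "mismatched location data"));
    exit;
}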
Take a look at MySQL's INSERT ... ON DUPLICATE KEY UPDATE. But you must be careful, because it is a fairly slow operation.
That said, I think it would be better if you just combined all your SELECT requests into one using OR conditions.
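A small sketch of the ON DUPLICATE KEY UPDATE route, assuming a unique key exists on (Latitude, Longitude); that index is an assumption, since the question doesn't show the table definition:

-- Requires: ALTER TABLE tbl_MapDetails ADD UNIQUE KEY uq_lat_long (Latitude, Longitude);
INSERT INTO tbl_MapDetails (LocationName, Address, Latitude, Longitude)
VALUES ('name1', 'addr1', '17.4', '78.5'),
       ('name2', 'addr2', '17.5', '78.6')
ON DUPLICATE KEY UPDATE
    LocationName = VALUES(LocationName),
    Address      = VALUES(Address);

With that in place, new coordinates are inserted and existing ones are updated in a single statement, so the PHP loop only has to build the VALUES list.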

Need help INSERT record(s) MySQL DB

I have an online form which collects member(s) information and stores it into a very long MySQL database. We allow up to 16 members to enroll at a single time and originally structured the DB to allow such.
For example:
If 1 Member enrolls, his personal information (first name, last name, address, phone, email) are stored on a single row.
If 15 Members enroll (all at once), their personal information are stored in the same single row.
The row has columns for all 'possible' inputs. I am trying to consolidate this code and have every nth member that enrolls put onto a new record within the database.
I have seen suggestions before for inserting multiple records, such as:
INSERT INTO tablename VALUES
('$f1name', '$f1address', '$f1phone'), ('$f2name', '$f2address', '$f2phone'), ...
The issue with this is twofold:
1. I do not know how many records are being enrolled from person to person, so the only way to build the statement above is to use a loop.
2. The information collected from the forms is NOT a single array, so I can't loop through one array and have it parse out. My information is collected as individual input fields such as: Member1FirstName, Member1LastName, Member1Phone, Member2FirstName, Member2LastName, Member2Phone... and so on.
Is it possible to store the information in separate rows WITHOUT using a loop (and therefore without having to go back and completely restructure my form field names and such, which can't happen due to the way the validation rules are built)?
If your form's structured so that all the fields are numbered properly, so that "firstname #1" is matched up with all the other "#1"-numbered fields, then a loop is the simplest solution.
start_transaction();
$errors = false;
for ($i = 1; $i <= 16; $i++) {
    if (... all $i fields are properly filled in ...) {
        $field = $_POST["field$i"];
        $otherfield = $_POST["otherfield$i"];
        etc...
        ... insert into database ...
    } else {
        ... handle error condition here
        $errors = true;
    }
}
if (!$errors) {
    commit_transaction();
} else {
    rollback();
}
If they're numbered randomly, so that firstname1 is matched with lastname42 and address3.1415927, then you'd have to build a lookup table to map all the random namings together, and loop over that
followup per comment:
well, if you absolutely insist on maintaining this database structure, where each row contains 16 sets of repeated firstname/lastname/etc.. records, then you'd do something like this:
$first = true;
for ($i = 1; $i <= 16; $i++) {
    if (fields at position $i are valid) {
        $firstname = mysql_real_escape_string($_POST["F{$i}name"]);
        $lastname = mysql_real_escape_string($_POST["F{$i}lastname"]);
        if ($first) {
            $dbh->query("INSERT INTO table (f{$i}name, f{$i}lastname) VALUES ('$firstname', '$lastname')");
            $recordID = $dbh->query("SELECT last_insert_id();");
            $first = false;
        } else {
            $dbh->query("UPDATE table SET f{$i}name='$firstname', f{$i}lastname='$lastname' WHERE idfield=$recordID");
        }
    }
}
It's ugly, but basically:
loop through the form field sets until you find a valid set (all required fields filled in, valid data entered, etc.)
Insert that data set into the database to create the new record
retrieve ID of that new record
continue looping over the rest of the fields
for every subsequent set of valid records, do an update of the previously created record and add in the new fieldset data.
Though, honestly, unless you've got some highly offbeat design need to maintain a single table with 16 sets of repeated columns, you'd be better off normalizing a bit and maintaining two separate tables: a parent "enrollment" table and a child "members" table. That way you can create the parent enrollment record, then just insert new children as you encounter them in the form.
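A rough sketch of that insert pattern, using the simplified signups / signup_members layout shown in the update below. The Member{$i}FirstName field names come from the question; $dbh and $signupName are placeholders, and the last_insert_id() call is sketched the same way as in the snippet above, so adapt it to your DB wrapper.

// One parent row per signup, then one child row per valid member set.
$dbh->query("INSERT INTO signups (name) VALUES ('$signupName')");
$signupId = $dbh->query("SELECT last_insert_id();"); // adapt to however your wrapper returns a scalar

for ($i = 1; $i <= 16; $i++) {
    if (!empty($_POST["Member{$i}FirstName"])) {
        $first = mysql_real_escape_string($_POST["Member{$i}FirstName"]);
        $last  = mysql_real_escape_string($_POST["Member{$i}LastName"]);
        $dbh->query("INSERT INTO signup_members (signup_id, firstname, lastname)
                     VALUES ($signupId, '$first', '$last')");
    }
}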
update #2:
well, a simplified form of a normalized layout would be:
signups (id, name, etc...)
signup_members (id, signup_id, firstname, lastname)
and you'd pull the full signup record set with the following query:
SELECT signups.id, signups.name, signup_members.id, firstname, lastname
FROM signups
LEFT JOIN signup_members ON signups.id = signup_members.signup_id
ORDER BY ...
That would give you a series of rows, one for each 'member' signup. To build the CSV, a simple loop with some state checking to see if you've reached a new signup yet:
$oldid = null;
$csv = ... put column headers here if you want ...
while ($signup = $result->fetchrow()) {
    if ($signup['signups.id'] != $oldid) {
        // current signup doesn't match previously seen id, so we've got a new signup record
        $csv .= "\n"; // start new line in CSV
        $csv .= ... add first few columns to new csv row ...
        $oldid = $signup['signups.id']; // store new record id
    } else {
        $csv .= ... add extra member columns to current csv row ...
    }
}
What you're trying to do could be done more simply, but to solve the problem you can join each user's information into one variable, separated by a character of your choice, and send it to the MySQL DB...
$user1 = $f1name . ';' . $f1address . ';' . $f1phone;
$user2 = $f2name . ';' . $f2address . ';' . $f2phone;
$user3 = $f3name . ';' . $f3address . ';' . $f3phone;
INSERT INTO table-name VALUES('$user1','$user2','$user3')
To extract, just "explode" the value by the ";".
If you use the same order for all users' data, and if you send a verification string in case a user leaves a field blank, it works just fine :)
Hmm... this works just fine only if the user isn't allowed to use ";" in their "personal data" :)
Hope it helps you!
I think you might want to look at "variable variables":
http://php.net/manual/en/language.variables.variable.php
Then you could conceivably loop through from 1 to 15, without having to rename your form fields.
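A tiny sketch of that idea with the question's field naming (only first/last name shown; extending to the other fields and to all 16 slots works the same way):

// Assuming each field has been pulled into a variable like $Member1FirstName
// (e.g. via extract($_POST)), variable variables let you loop without renaming fields.
for ($i = 1; $i <= 16; $i++) {
    $firstVar = "Member{$i}FirstName";
    $lastVar  = "Member{$i}LastName";
    if (!empty($$firstVar)) {
        $firstname = $$firstVar;   // variable variable: reads $Member1FirstName, $Member2FirstName, ...
        $lastname  = $$lastVar;
        // ... validate and insert/update as in the answers above ...
    }
}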
