I have a PHP project (Angular, PHP, MySQL) running on my local machine, and the same copy of the project runs online.
My aim is to sync (copy the local DB to the server DB) every hour by running a PHP script from Angular's setInterval function.
What is the idea behind this functionality, and how can I achieve it?
Any suggestions would be a great help. Thanks in advance.
If your database table structure is not going to change, what you can do is create a function that selects all the data from your local database and passes it to an online function that updates your online database with the new or updated records.
For example:
If you have a table called users, from AJAX you select all the local data, build a JSON object, and pass the data to the script function.
From that JSON object you pass the data to the online PHP file and update your online database from it.
Note: you have to be careful and add enough conditions to check that data does not get lost or overwritten.
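A minimal sketch of that idea (the table, columns, credentials and endpoint URL are placeholders): a local PHP script, which Angular's setInterval can call via AJAX, selects the rows, JSON-encodes them and POSTs them to a script on the live server that inserts or updates them.
<?php
// sync_users.php (runs on the local machine; call it from Angular's setInterval via AJAX)
$local = new PDO('mysql:host=localhost;dbname=local_db;charset=utf8', 'user', 'pass');
$rows  = $local->query('SELECT id, name, email, updated_at FROM users')
               ->fetchAll(PDO::FETCH_ASSOC);

// POST the snapshot as JSON to the online copy of the project.
$ch = curl_init('https://your-live-site.example/receive_users.php');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode($rows),
    CURLOPT_HTTPHEADER     => array('Content-Type: application/json'),
    CURLOPT_RETURNTRANSFER => true,
));
echo curl_exec($ch);
curl_close($ch);

<?php
// receive_users.php (runs on the live server; inserts new rows, updates existing ones)
$rows   = json_decode(file_get_contents('php://input'), true) ?: array();
$online = new PDO('mysql:host=localhost;dbname=online_db;charset=utf8', 'user', 'pass');
$stmt   = $online->prepare(
    'INSERT INTO users (id, name, email, updated_at) VALUES (?, ?, ?, ?)
     ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email), updated_at = VALUES(updated_at)'
);
foreach ($rows as $r) {
    $stmt->execute(array($r['id'], $r['name'], $r['email'], $r['updated_at']));
}
echo 'synced ' . count($rows) . ' rows';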
You'll have to write a service and some code to dump your database (if you want to sync the complete database every time); follow this answer.
After dumping your SQL, you next have to upload the file to your server via the service. Upon receiving it, you can load the data again with mysql -u username -p database_name < file.sql
However, I won't recommend this; try exploring the master-slave database approach instead, where your local server's database is the master and your remote server's database is the slave. Your data will be synchronized automatically. Please see this tutorial.
You can implement an interface to select the tables you want to import into the live site. Use the code below to generate CSV files of the selected tables and prepare the array.
<?php
$file_name_flat = 'TABLE-NAME.csv'; // Replace TABLE-NAME with your selected table name.
// Open the CSV file. Replace FOLDER_LOCATION with the local folder path where you want to save the CSV files.
$fpointer = fopen(FOLDER_LOCATION . $file_name_flat, 'w+');

// Execute a query to get all column data.
// Replace TABLE-NAME with your selected table name. You can add other conditions based on your requirements.
$query = "SELECT * FROM TABLE-NAME WHERE 1";

// Execute the query as per your CMS / framework coding standards and write the CSV file.
$result_flat = $DB_Connection->query($query)->fetchAll('assoc');
foreach ($result_flat as $fields) {
    fputcsv($fpointer, $fields);
}

// Prepare the array of CSVs to create the ZIP file.
$files = array($file_name_flat);
fclose($fpointer); // Close the CSV file after a successful write.
?>
Create a ZIP of the CSVs:
// Create the ZIP.
$zipname = 'tables_' . date('Y-m-d-H-i-s') . '.zip';
// Replace FOLDER_LOCATION with the local folder path where you saved the CSV files.
createZipFile($files, $zipname, FOLDER_LOCATION);

/* createZipFile function to create the zip file. The first two params are mandatory. */
function createZipFile($files_names, $zipfileName, $files_path = "")
{
    $zip = new \ZipArchive;
    $zip->open(TMP . $zipfileName, \ZipArchive::CREATE); // TMP is your temporary folder path.
    foreach ($files_names as $file) {
        $zip->addFile($files_path . $file, $file);
    }
    $zip->close();
    foreach ($files_names as $file) {
        unlink($files_path . $file); // Remove the CSVs once they are zipped.
    }

    // Then download the zipped file.
    header('Content-Type: application/zip');
    header('Content-disposition: attachment; filename=' . $zipfileName);
    header('Content-Length: ' . filesize(TMP . $zipfileName));
    readfile(TMP . $zipfileName);
    unlink(TMP . $zipfileName);
    die;
}
Now implement a form to upload this ZIP file to the live server. In the POST action of this form, add code to receive the ZIP file.
$filename = $_FILES['filename']['name'];
$source = $_FILES["filename"]["tmp_name"];
// Upload the zip file to the server location. Replace SERVER_FOLDER_PATH with the server path where you want to save the uploaded zip.
$target_path = SERVER_FOLDER_PATH;
if (move_uploaded_file($source, $target_path)) {
    // Extract the ZIP file.
    $zip = new \ZipArchive();
    $x = $zip->open($target_path);
    if ($x === true) {
        $zip->extractTo(PATH_TO_SAVE_EXTRACTED_ZIP); // Change this to the correct site path.
        $zip->close();
        $cdir = scandir(PATH_TO_SAVE_EXTRACTED_ZIP); // Read the directory.
        $fieldSeparator = ",";
        $lineSeparator = '\n';
        foreach ($cdir as $key => $value) {
            if (!in_array($value, array(".", ".."))) {
                // Replace PATH_TO_SAVE_EXTRACTED_ZIP with your server path.
                $fileName = PATH_TO_SAVE_EXTRACTED_ZIP . $value;
                $tableName = SET_TABLE_NAME; // You have to set the logic to get the table name.
                if (is_file($fileName)) {
                    // Use MySQL's "LOAD DATA LOCAL INFILE" to import the CSVs into the particular tables.
                    // It imports the whole CSV into the table, so there is no need to loop and insert rows one by one.
                    $q = 'LOAD DATA LOCAL INFILE "' . $fileName . '" REPLACE INTO TABLE ' . $tableName
                        . ' FIELDS TERMINATED BY "' . $fieldSeparator . '" ENCLOSED BY ' . '\'"\''
                        . ' LINES TERMINATED BY "' . $lineSeparator . '"';
                    $DB_Connection->query($q);
                }
            }
        }
    }
}
You can check MySQL's LOAD DATA documentation for the available options.
You can run a cron job on your local computer that exports the MySQL data using mysqldump and then uploads it to the server using rsync and sshpass.
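For example, a rough sketch of such a script (all paths, credentials and the host are placeholders), which you could schedule with cron, e.g. 0 * * * * php /path/to/push_db.php:
<?php
// push_db.php -- dump the local database and push the dump to the live server.
$dumpFile = '/tmp/local_db.sql';

// 1. Export the local database with mysqldump.
exec(sprintf(
    'mysqldump -u%s -p%s local_db > %s',
    escapeshellarg('db_user'),
    escapeshellarg('db_pass'),
    escapeshellarg($dumpFile)
));

// 2. Upload the dump with rsync over SSH, supplying the password via sshpass.
exec(sprintf(
    'sshpass -p %s rsync -avz %s user@your-server.example:/var/backups/local_db.sql',
    escapeshellarg('ssh_password'),
    escapeshellarg($dumpFile)
));

// 3. On the server, reload it (another cron job or an ssh command):
//    mysql -u username -p database_name < /var/backups/local_db.sql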
I am saving a file as a LONGBLOB in PHP by first saving it to a folder and then into the DB; the problem is the server's write permissions, so I would like to save it directly.
This is what I have tried (this works perfectly):
if (isset($_POST['image'])) {
    $id = 0;
    $image = $_POST['image'];
    $tmp_image = date('YmdHisu') . '.jpg';
    file_put_contents($tmp_image, base64_decode($image));
    $sql = "INSERT INTO fingerprint(template)
            VALUES ('" . addslashes(file_get_contents($tmp_image)) . "')";
    try {
        $connection = Yii::app()->db;
        $command = $connection->createCommand($sql);
        $rowCount = $command->execute(); // execute the non-query SQL
        echo "saved successfully";
        unlink($tmp_image);
    } catch (Exception $ex) {
        echo 'Query failed: ' . $ex->getMessage();
        unlink($tmp_image);
    }
}
How can I save this in a BLOB field in MySQL without first having to save it to a folder and then into the DB?
Let's assume you've sent the file from Android using the POST method to your server running Yii 1.
First of all, the physical file is contained in the $_FILES variable, and the $_POST variable, as you said, contains a string of the file encoded in base64 format (I write this for a clear answer).
$_FILES documentation
Now this is how you can try to upload the file the standard Yii MVC way:
It's true that you're uploading your file from an external device, but Yii comes to your help with the CUploadedFile class:
Call getInstance to retrieve the instance of an uploaded file, and then use saveAs to save it on the server. You may also query other information about the file, including name, tempName, type, size and error.
In particular, since the field here is a single "image" input, you can use the function getInstanceByName, which returns the uploaded file instance for the specified name.
$temp = CUploadedFile::getInstanceByName("image"); // $_FILES['image']
and with this you can access the file data:
$temp->name; // Name of the file
$temp->type; // Type of the file
$temp->size; // Size of the file
// etc etc..
$temp->saveAs("/your/path/" . $tmp_image . $temp->type); // Save your file
Finally, you can check if the file was saved and execute your query:
if($temp->saveAs("/your/path/" . $tmp_image . $temp->type)) {
// File is saved you can execute the query for save the record in your db
} else {
// Something went wrong.
}
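Alternatively, to avoid writing the image to disk at all (which is what the question asks), one rough sketch (assuming Yii 1's CDbCommand and the fingerprint table from the question) is to bind the decoded POST data straight to the BLOB column:
<?php
// Insert the base64-encoded image directly into the BLOB column, no temp file needed.
if (isset($_POST['image'])) {
    $binary  = base64_decode($_POST['image']);
    $command = Yii::app()->db->createCommand(
        'INSERT INTO fingerprint (template) VALUES (:template)'
    );
    $command->bindValue(':template', $binary, PDO::PARAM_LOB);
    $command->execute();
    echo 'saved successfully';
}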
You can also move this logic into a model.
Check this link for more info.
Hope this will help you.
I have this method that uploads a file in CSV format, but now I want to know how to get its contents into the columns of my database.
My method:
public function postUpload()
{
    if (Input::hasFile('file')) {
        $file = Input::file('file');
        $name = time() . '-' . $file->getClientOriginalName();
        // Moves file to folder on server
        $file->move(public_path() . '/uploads/CSV', $name);
        return 'Ok'; // return for testing
    }
}
So my question is: within this method, how can I put the data into the database?
This one should work for you; it uses PDO plus MySQL's LOAD DATA approach.
private function _import_csv($path, $filename)
{
    $csv = rtrim($path, '/') . '/' . $filename;
    // Of course you have to modify this with the proper table and field names.
    $query = sprintf(
        "LOAD DATA local INFILE '%s' INTO TABLE your_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' LINES TERMINATED BY '\\n' IGNORE 0 LINES (`field_one`, `field_two`, `field_three`)",
        addslashes($csv)
    );
    return DB::connection()->getPdo()->exec($query);
}
So combined with your code, it could be something like the below
public function postUpload()
{
    if (Input::hasFile('file')) {
        $file = Input::file('file');
        $name = time() . '-' . $file->getClientOriginalName();
        // Check out the edit at the bottom of my answer for details on $storage.
        $storage = '/some/world/readable/dir';
        $path = $storage . '/uploads/CSV';
        // Moves file to folder on server
        $file->move($path, $name);
        // Import the moved file to the DB and return OK if there were rows affected.
        return ($this->_import_csv($path, $name) ? 'OK' : 'No rows affected');
    }
}
EDIT
One thing to note, as per the error you reported in the comments, which is probably a permissions issue (OS error code 13: Permission denied).
Please see: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
"For security reasons, when reading text files located on the server,
the files must either reside in the database directory or be readable
by all. Also, to use LOAD DATA INFILE on server files, you must have
the FILE privilege. See Section 5.7.3, “Privileges Provided by
MySQL”."
As reported on the MySQL bug tracker (http://bugs.mysql.com/bug.php?id=31670), it seems that you need particular permissions for all the folders in the CSV file path:
All parent directories of the infile need to be world-readable, I think, as well as just the directory and infile...
So for an infile here: /tmp/imports/site1/data.file
you would need (I think, 755 worked) r+x for 'other' on these directories: /tmp and /tmp/imports
as well as the main two: /tmp/imports/site1 and /tmp/imports/site1/data.file
To sum up:
To solve the "sqlstate hy000 general error 13 can't get stat of..." issue, you have to move the uploaded file to a location with proper permissions (so not necessarily the current one you are using); try something like "/tmp/import".
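Continuing with the $file and $name variables from the code above, that adjustment could look roughly like this (the /tmp/import location is just the example path):
// Move the uploaded CSV to a world-readable location before running LOAD DATA INFILE.
$storage = '/tmp/import';
if (!is_dir($storage)) {
    mkdir($storage, 0755, true); // parent directories need to be readable too
}
$file->move($storage, $name);
chmod($storage . '/' . $name, 0644);
// then: $this->_import_csv($storage, $name);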
While LOAD DATA INFILE is the quickest way, I prefer to use a lib like https://github.com/ddeboer/data-import or https://github.com/goodby/csv for two reasons.
It is extensible: what if your data source changes to Excel files, a Mongo DB or some other method?
It is malleable: if you need to convert dates, strings or numbers, you can do it conditionally, which cannot be done with a batch command.
My 2c.
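For instance, a plain-PHP sketch of that row-by-row idea (the file path, table and column names are placeholders), where each value can be converted conditionally before insertion, which LOAD DATA cannot do:
<?php
$pdo  = DB::connection()->getPdo(); // reuse Laravel's PDO connection
$stmt = $pdo->prepare('INSERT INTO your_table (field_one, field_two, field_three) VALUES (?, ?, ?)');

$fp = fopen('/path/to/uploads/CSV/file.csv', 'r');
while (($row = fgetcsv($fp)) !== false) {
    list($one, $two, $three) = $row;
    // Conditional conversion, e.g. normalising a date column.
    $three = date('Y-m-d', strtotime($three));
    $stmt->execute(array($one, $two, $three));
}
fclose($fp);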
I want to import a text file that contains data separated by commas. I read in several sources that most people use LOAD DATA INFILE, so I figured it would work for me too.
However, I get a permissions error when I do so. I ran this command and here is what I got:
LOAD DATA INFILE '/public_html/nyccrash.txt' INTO TABLE nyccrash;
But it gives me this error:
ERROR 1045 (28000): Access denied for user 'username'@'%' (using password: YES)
I read in some other threads that all I had to do was include the full file path, and I did, but it still didn't work.
Is there another way to import a text file into a table in my database, using SQL or PHP?
EDIT:
I found this command I can use:
<?php
$row = 1;
$handle = fopen("nyccrash.txt", "r");
echo("<table>");
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    echo("<tr>\r\n");
    foreach ($data as $index => $val) {
        echo("\t<td>$val</td>\r\n");
    }
    echo("</tr>\r\n");
}
echo("</table>");
fclose($handle);
?>
That allows me to read the data, build a table and print it. I can also use the INSERT INTO SQL command after using the above to collect the data, but I'm not sure how to insert the values into the table, that is, how to loop through the values for insertion. The data in the txt file does not contain the attributes or headers of what's contained, so I'm a little confused about how to sort the data into the right columns.
In order to load data via LOAD DATA LOCAL INFILE you need two things:
The FILE privilege. Have a superuser run GRANT FILE ON *.* TO 'username'@'%';.
local_infile set to 1 in my.cnf. To avoid having to restart MySQL, have a superuser run SET GLOBAL local_infile = 1;. (For the client side, see the PDO sketch below.)
CAVEAT: Both of these things would be deemed a security breach.
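On the PHP side, if you connect with PDO, you also have to enable LOCAL INFILE on the connection itself, e.g. (credentials are placeholders, table and file taken from your question):
<?php
// Allow LOAD DATA LOCAL INFILE on this PDO connection (client side).
$pdo = new PDO(
    'mysql:host=localhost;dbname=your_database;charset=utf8',
    'username',
    'password',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true)
);
$pdo->exec("LOAD DATA LOCAL INFILE '/public_html/nyccrash.txt' INTO TABLE nyccrash FIELDS TERMINATED BY ','");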
I made sure I gave my txt file permissions: chmod 711
Then I used LOAD DATA LOCAL INFILE 'nyccrash.txt' INTO TABLE nyccrash FIELDS TERMINATED BY ','; and it worked.
If you're only going to do this a few times, you can build the SQL by importing the data into Excel and using CONCAT to assemble the SQL lines, then copy/paste them into the client and execute. It's not very useful for lots of tables, though; in that case it's not hard to use PHP to upload a CSV file and populate the table.
Try something along these lines:
// Connect to Database
$db_Host = "";
$db_Username = "";
$db_Password = "";
$db_Database = "";
mysql_connect($db_Host, $db_Username, $db_Password) or die("MySQL - Connection Error");
mysql_select_db($db_Database) or die("MySQL - Cannot Select Database");
mysql_query("LOAD DATA LOCAL INFILE '/home/username/public_html/Database.txt' INTO TABLE yourtablename") or die("MySQL - Query Error - " . mysql_error());
// MySQL is automatically disconnected from when PHP ends.
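Note that the mysql_* extension is deprecated and removed in PHP 7; a rough mysqli equivalent of the same snippet (reusing the variables above, and enabling LOCAL INFILE explicitly, which mysqli requires):
<?php
// Roughly the same with mysqli instead of the removed mysql_* extension.
$link = mysqli_init();
mysqli_options($link, MYSQLI_OPT_LOCAL_INFILE, true); // allow LOAD DATA LOCAL
mysqli_real_connect($link, $db_Host, $db_Username, $db_Password, $db_Database)
    or die("MySQL - Connection Error");
mysqli_query($link, "LOAD DATA LOCAL INFILE '/home/username/public_html/Database.txt' INTO TABLE yourtablename")
    or die("MySQL - Query Error - " . mysqli_error($link));
mysqli_close($link);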
I want to allow the user to select a file from which data is read and saved into a database. I found the following code to import the data:
$fp = fopen("people.csv", "r");
while ($line = fgets($fp)) {
    // Split the line by the tab delimiter and store the parts in our list.
    $char = explode("\t", $line, 4);
    list($name, $age, $height, $date) = $char;
    // Generate our SQL string and execute it.
    $sql = "insert into people(name,age,height,date) values('$name','$age','$height','$date')";
    mysql_query($sql) or die(mysql_error());
}
fclose($fp);
I'm not sure if it works, but I'm yet to get that far. My question is: how do I get the full path name of the selected file so I can feed it into the following call rather than hard-coding the filename?
$fp = fopen("people.csv", "r");
I have spent a lot of time researching this, but to no avail.
If you want to let users upload this file, you should implement a file upload. Please check the W3Schools tutorial for instructions.
The main idea is to get the file path from $_FILES["file"]["tmp_name"]:
$fp = fopen($_FILES["file"]["tmp_name"], "r");
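A minimal, self-contained sketch of that (assuming the form field is named file and the tab-separated columns from your question):
<!-- upload form -->
<form action="import.php" method="post" enctype="multipart/form-data">
    <input type="file" name="file">
    <input type="submit" value="Import">
</form>

<?php
// import.php -- read the uploaded file straight from its temporary location.
if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $fp = fopen($_FILES['file']['tmp_name'], 'r');
    while (($char = fgetcsv($fp, 1000, "\t")) !== false) {
        list($name, $age, $height, $date) = $char;
        // ... run your INSERT query here, as in your original loop ...
    }
    fclose($fp);
}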
If you store your file statically in your web project folder, you can build the path from DOCUMENT_ROOT:
$path = $_SERVER["DOCUMENT_ROOT"] . '/relative/path/to/your/file/' . 'people.csv';
You can obtain the directory of the current file via the __DIR__ magic constant;
Magic constants
In your case:
echo __DIR__ . '/people.csv';
Also, you may consider using the built-in CSV functions that PHP offers;
http://php.net/manual/en/function.fgetcsv.php
If you want to skip processing the file via PHP altogether, MySQL offers ways to directly import data from a CSV file;
https://dev.mysql.com/doc/refman/5.5/en/load-data.html
I want to create a utility in PHP like phpMyAdmin's import option, which should allow database updates on the remote server via a .sql file without creating a new database.
Since it's a client-side utility, access to cPanel is not allowed.
The app has two kinds of working environments, offline and online.
If the client works offline, they need to take a backup of the database and then update the remote server's database with it; the same applies when working online.
Solution 1:
If you are running your PHP on a Linux system, you can try using the mysql command-line client itself. However, please note that your PHP installation must have permission to run "system" commands, like system(), exec(), etc.
So here is what I mean to say:
system("mysql -u{db_user_name} -h{db_host} -p{db_password} {db_name} < {full_path_to_your_sql_file}");
Please replace
{db_user_name} with the DB username,
{db_host} with the DB host,
{db_password} with the DB password,
{db_name} with the database name,
{full_path_to_your_sql_file} with the path to your SQL file.
And this of course requires the SQL file to be uploaded.
Solution 2:
Read the SQL file line by line and while reading execute each statement using PHP's standard MySQL library. Something like:
$arrFile = file("full_path_to_sql_file.sql", FILE_IGNORE_NEW_LINES);
foreach ($arrFile as $q) {
mysql_query($q);
}
However, this might not be as simple as it seems. If your SQL file has comments and other .sql-specific statements, you might need to put checks in place to ignore them; it is better if the SQL file contains nothing but SQL statements. Also note that a single statement can span multiple lines, so you may need to accumulate lines until a terminating semicolon, as the accepted answer below does.
You can use a regular upload script to obtain the .sql file; make sure you sanitize the input appropriately so you only accept .sql / text files,
move_uploaded_file($_FILES["file"]["tmp_name"], "tmpdb/" . $_FILES["file"]["name"]);
Once you have that, you can preset their DB settings, selecting the DB with
mysql_select_db('dbname');
Then just open the SQL file with fopen() and slap that sucker in a variable:
$file = fopen("userdb.sql", "r");
$usersql = fread($file, filesize("userdb.sql"));
fclose($file);
then just throw it in a mysql_query():
$uploaddb = mysql_query($usersql) or die(mysql_error());
Those are the concepts I would suggest; alternatively you can use shell_exec(), but that just opens up other security concerns.
You may want to consider using BigDump?
Eventually, I got the answer to my question myself. I've just pasted the PHP code without the config and other stuff.
$filename = "test.sql";
$mysql_database = "test";
mysql_select_db($mysql_database);

// Truncate all tables in the database first.
$result_t = mysql_query("SHOW TABLES");
while ($row = mysql_fetch_assoc($result_t)) {
    mysql_query("TRUNCATE " . $row['Tables_in_' . $mysql_database]);
}

// Temporary variable, used to store the current query
$templine = '';
// Read in the entire file
$lines = file($filename);
// Loop through each line
foreach ($lines as $line) {
    // Skip it if it's a comment
    if (substr($line, 0, 2) == '--' || $line == '')
        continue;

    // Add this line to the current segment
    $templine .= $line;
    // If it has a semicolon at the end, it's the end of the query
    if (substr(trim($line), -1, 1) == ';') {
        // Perform the query
        mysql_query($templine) or print('Error performing query \'<strong>' . $templine . '\': ' . mysql_error() . '<br /><br />');
        // Reset temp variable to empty
        $templine = '';
    }
}