I have an example.sql file of almost 100MB (actually a data export from another database).
I want to give users the ability to run this file through some interface in my application.
Can you please guide me on how to do that? Should I treat it as a simple text file, or is there a quicker way?
In other words, I want to add the same functionality as the import feature in phpMyAdmin.
If you can refer me to a class on PHPclass.org, that would be great.
function import_file($filename) {
    // Note: mysql_query() belongs to the legacy mysql extension (removed in
    // PHP 7); on modern PHP use mysqli_query($connection, $query) instead.
    if ($file = file_get_contents($filename)) {
        // Naive split: breaks if a quoted value contains a semicolon.
        foreach (explode(";", $file) as $query) {
            $query = trim($query);
            if (!empty($query) && $query != ";") {
                mysql_query($query);
            }
        }
    }
}
can be called with
import_file("files/files.sql");
However, this function will not work properly if the file contains a semicolon (;) anywhere other than at the end of a query, for example inside a quoted string value.
Create a file upload form that allows trusted users to upload the file to your server. Then call the mysql command-line client from PHP to import the data into your database. Splitting the script with explode(";", ...) will fail if there are any quoted semicolons within the imported data.
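The command-line approach described above can be sketched like this. A minimal sketch: the function names, credentials, and file path are placeholder assumptions, and the command is built separately so it can be inspected before running.

```php
<?php
// Build the shell command for the mysql CLI client.
// All arguments are shell-escaped before interpolation.
function build_import_command($path, $db, $user, $pass) {
    return sprintf(
        'mysql --user=%s --password=%s %s < %s 2>&1',
        escapeshellarg($user),
        escapeshellarg($pass),
        escapeshellarg($db),
        escapeshellarg($path)
    );
}

// Run the import; returns true when mysql exits with status 0.
function import_sql_file($path, $db, $user, $pass) {
    exec(build_import_command($path, $db, $user, $pass), $output, $status);
    return $status === 0;
}
```

This avoids parsing the SQL in PHP entirely, so quoted semicolons, comments, and DELIMITER changes are all handled by the mysql client itself.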
Related
I have a PHP project (Angular, PHP, MySQL) running on my local machine, and the same copy of the project running online.
My aim is to sync (copy the local DB to the server DB) every hour by running a PHP script triggered from Angular's setInterval function.
What is the idea behind this functionality that I should use,
and how will I achieve it?
Any suggestions would be a great help,
and thanks in advance.
If your database tables are not going to change, what you can do is create a function that selects all the data from your local database and passes that data to an online endpoint that updates your online database with the new or updated records.
For example:
If you have a table called users, select all the local data via AJAX and build a JSON object from it.
Then post that JSON object to a PHP file on the online server and update your online database from it.
Note: you have to be careful to add enough conditions to check that no data goes missing or gets overridden.
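The receiving side of that idea can be sketched as below. A minimal sketch assuming a users table with an id primary key; the column names and function names are hypothetical.

```php
<?php
// Decode the JSON payload posted by the local site. Returns an empty
// array when the payload is missing or malformed.
function decode_sync_payload($json) {
    $rows = json_decode($json, true);
    return is_array($rows) ? $rows : array();
}

// Insert new rows and update existing ones, one statement per row.
// $pdo is an already-open PDO connection to the online database.
function upsert_users($pdo, array $rows) {
    $stmt = $pdo->prepare(
        'INSERT INTO users (id, name) VALUES (?, ?)
         ON DUPLICATE KEY UPDATE name = VALUES(name)'
    );
    foreach ($rows as $row) {
        $stmt->execute(array($row['id'], $row['name']));
    }
}

// Typical entry point for the online endpoint:
// $rows = decode_sync_payload(file_get_contents('php://input'));
```

ON DUPLICATE KEY UPDATE handles the "new or updated records" case in one statement, which is the main check the note above warns about.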
You'll have to write a service and some code to dump your database (if you want to sync the complete database every time).
After dumping your SQL, you next have to upload the file to your server via the service. Upon receiving it, you can load the data again with mysql -u username -p database_name < file.sql.
However, I won't recommend this. Instead, try exploring the master-slave database approach, where your local server's database is the master and your remote server's is the slave; your data will then be synchronized automatically.
You can implement an interface to select the tables you want to import into the live site. Use the code below to generate CSV files of the selected tables and prepare the array.
<?php
// Replace TABLE-NAME with your selected table name, and FOLDER_LOCATION with
// the local folder path where you want to save the CSV files.
$file_name_flat = 'TABLE-NAME.csv';
$fpointer = fopen(FOLDER_LOCATION . $file_name_flat, 'w+'); // Open the CSV file.

// Query to get all column data. You can add other conditions based on your
// requirements.
$query = "SELECT * FROM TABLE-NAME WHERE 1";

// Execute the query as per your CMS/framework coding standards and write the CSV file.
$result_flat = $DB_Connection->query($query)->fetchAll('assoc');
foreach ($result_flat as $fields) {
    fputcsv($fpointer, $fields);
}

// Prepare the array of CSVs to create the ZIP file from.
$files = array($file_name_flat);
fclose($fpointer); // Close the CSV file after a successful write.
?>
Create a ZIP of the CSVs:
// Create the ZIP. Replace FOLDER_LOCATION with the local folder path where
// you saved the CSV files; TMP is the temp folder where the ZIP is built.
$zipname = 'tables_' . date('Y-m-d-H-i-s') . '.zip';
createZipFile($files, $zipname, FOLDER_LOCATION);

/* createZipFile function to create the zip file. Both the file list and the zip name are mandatory. */
function createZipFile($files_names = array(), $zipfileName, $files_path = "") {
    $zip = new \ZipArchive;
    $zip->open(TMP . $zipfileName, \ZipArchive::CREATE);
    foreach ($files_names as $file) {
        $zip->addFile($files_path . $file, $file);
    }
    $zip->close();
    foreach ($files_names as $file) {
        unlink($files_path . $file); // Remove the CSVs once they are zipped.
    }
    // Then download the zipped file.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename=' . $zipfileName);
    header('Content-Length: ' . filesize(TMP . $zipfileName));
    readfile(TMP . $zipfileName);
    unlink(TMP . $zipfileName);
    die;
}
Now implement a form to upload this zip file to the live server. In the POST action of this form, add code to get the zip file:
$filename = $_FILES['filename']['name'];
$source = $_FILES['filename']['tmp_name'];

// Move the zip file to a server location. Replace SERVER_FOLDER_PATH with the
// server location where you want to save the uploaded zip.
if (move_uploaded_file($source, SERVER_FOLDER_PATH)) {
    // Extract the ZIP file.
    $zip = new \ZipArchive();
    $x = $zip->open($target_path);
    if ($x === true) {
        $zip->extractTo(PATH_TO_SAVE_EXTRACTED_ZIP); // Change this to the correct site path.
        $zip->close();
        $cdir = scandir(PATH_TO_SAVE_EXTRACTED_ZIP); // Read the directory.
        $fieldSeparator = ",";
        $lineSeparator = '\n'; // Kept single-quoted: MySQL itself interprets the \n escape.
        foreach ($cdir as $key => $value) {
            if (!in_array($value, array(".", ".."))) {
                $fileName = PATH_TO_SAVE_EXTRACTED_ZIP . $value; // Replace PATH_TO_SAVE_EXTRACTED_ZIP with your server path.
                $tableName = SET_TABLE_NAME; // You have to set the logic to get the table name.
                if (is_file($fileName)) {
                    // Use MySQL's LOAD DATA LOCAL INFILE to import each CSV into its
                    // table. It imports the whole CSV at once, so there is no need to
                    // loop and insert the rows one by one.
                    $q = 'LOAD DATA LOCAL INFILE "' . $fileName . '" REPLACE INTO TABLE ' . $tableName . ' FIELDS TERMINATED BY "' . $fieldSeparator . '" ENCLOSED BY ' . '\'"\'' . ' LINES TERMINATED BY "' . $lineSeparator . '"';
                    $DB_Connection->query($q);
                }
            }
        }
    }
}
You can check the LOAD DATA syntax in the MySQL reference manual.
You can run a cron job on your local computer that exports the MySQL data using mysqldump and then uploads it to the server using rsync and sshpass.
What would be the best way to take an .sql file that includes schema and table creation statements and use it to create new databases from within CodeIgniter? What I want to be able to do is use this .sql file as a blueprint for several databases with the same schema, but which may be created at any time.
I imagine I just need to be able to take this file, extract the contents and echo it out into a database query. Is there a better way to do it?
I also need to be able to inject a custom database name into the statements before submitting the query. How would I go about this? Just have a placeholder keyword and do a preg replace with the database name?
Just to ensure all databases are maintained synchronously, I thought perhaps this blueprint schema should be added to CodeIgniter as a module. That way if I need to alter the schema, I can upload a new version of the module with the updated .sql file and perform the necessary migrations across all the databases. Sound reasonable? If so, how would I go about this?
I have done this (run a .sql file) before, and I used this:
$sql = read_file('path/to/file.sql');
$final = '';

// Drop comment lines (those starting with #) and rebuild the script.
foreach (explode("\n", $sql) as $line)
{
    if (isset($line[0]) && $line[0] != '#')
    {
        $final .= $line . "\n";
    }
}

// Split on ";\n" and run each statement.
foreach (explode(";\n", $final) as $sql)
{
    if (trim($sql))
    {
        $this->db->query($sql);
    }
}
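For the database-name injection asked about in the question, a simple placeholder substitution is enough. A minimal sketch, assuming the blueprint uses a hypothetical %DB_NAME% token (the token name is an assumption, not part of any real blueprint):

```php
<?php
// Substitute the placeholder database name into a blueprint SQL string
// (load the string first with read_file()/file_get_contents()). The name
// is validated so only safe identifier characters reach the query.
function render_blueprint($sql, $db_name) {
    if (!preg_match('/^[A-Za-z0-9_]+$/', $db_name)) {
        throw new InvalidArgumentException('Unsafe database name: ' . $db_name);
    }
    return str_replace('%DB_NAME%', $db_name, $sql);
}
```

Validating the name instead of escaping it sidesteps the fact that identifiers cannot be bound as query parameters.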
Is there any way I can save the data of a specific table of the SugarCRM database into a DOC file?
I have a custom module with a username, some notes, and a date. I want to write this into the database and into a file as well.
It's not just a PHP file: I want to use logic hooks and write the code. I want to use the logic hooks to access the database and then write the data into the file.
Thanks in advance.
Saving as a DOC file probably isn't the best idea, since that format is primarily used for formatting information. A standard .txt file is usually what you would use for such a process.
With that said, there aren't any methods built into Sugar that will let you do this. You will need to build the capability into the module.
What exactly are you attempting to accomplish? There is a very powerful auditing tool set, which is good for seeing revisions to a module object. If you just want to monitor changes to the table, you can set up logging for that table/database inside of SQL.
Update: OK, if you are just looking to write to a file after saves, follow the instructions at http://cheleguanaco.blogspot.com/2009/06/simple-sugarcrm-logic-hook-example.html for a quick how-to on getting the logic hooks working. You will want to make a PHP file that simply uses the data passed to it via the bean class, and either writes to the file directly from the data within the bean, or uses the bean->id parameter to do a SQL query and write to the file from that data.
Also, is this a DOC file that is going to be immediately generated and then destroyed at the end of the transaction? Or is it more of a log file that will be persistent?
Update: that is simple enough then.
Where you have the query right now, replace it with:
$file = fopen($pathAndNameOfFile, 'a+') or die('Could not open file.');
$query = "SELECT * FROM data_base.table";
$result = $bean->db->query($query, true);
$dbRowData = $bean->db->fetchByAssoc($result);
// The second argument makes print_r() return the string instead of echoing it.
$printedArray = print_r($dbRowData, true);
fwrite($file, $printedArray) or die('Could not write to file.');
fclose($file);
*A quick note: you might need to set permissions in order to be able to read/write the file, but those are specific to the machine type, so if you encounter errors with either, search for how to set permissions for your particular server type.
**Also, 'SELECT * FROM database.table' is going to return ALL of the rows in the entire table. This will generate a very large file and be a performance hindrance as the table grows. You should use the bean class to fetch only the last saved tuple:
$file = fopen($pathAndNameOfFile, 'a+') or die('Could not open file.');
$query = "SELECT * FROM data_base.table WHERE id = '".$focus->id."';";
$result = $bean->db->query($query, true);
$dbRowData = $bean->db->fetchByAssoc($result);
$printedArray = print_r($dbRowData, true);
fwrite($file, $printedArray) or die('Could not write to file.');
fclose($file);
You can export/dump mysql databases into SQL files using mysqldump
mysqldump -u userName -p databaseName tableName > fileName.sql
I'm trying to use one include file for both Perl and PHP.
Is there a nice way to import a myphp.inc file within Perl?
$ cat myphp.inc
<?php
$some_var="hello world";
?>
Using the above in my test.php works fine:
include "myphp.inc";
If I remove the <?php tags then test.php will just print out the contents of the myphp.inc file... if I leave them in, then my Perl program complains with:
Unterminated <> operator
I've seen the Perl module PHP::Include, but I would like to stay away from external modules if possible.
Does anyone have ideas on doing this?
Don't try to write code that is both PHP and Perl; they are different languages, even if they have some shared ancestry. If you want to share data between the two, then use a structured data format. JSON is a popular flavour. PHP has json_decode and Perl has the JSON module.
I would like to stay away from external modules if possible
Code reuse is a virtue … although there is nothing stopping you from reimplementing the modules from scratch.
Ideally you would store the shared/configuration data in a format easily readable in both PHP and Perl. XML, JSON, or a simple text file with key-value pairs (as in the .ini file that simbabque suggests) would work great.
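A minimal sketch of the JSON approach on the PHP side (the file name config.json is an assumption; the Perl side would read the same file with the JSON module's decode_json):

```php
<?php
// Write the shared value out as JSON once...
$config = array('some_var' => 'hello world');
file_put_contents('config.json', json_encode($config));

// ...and any consumer (PHP shown here, Perl via the JSON module)
// reads it back without having to parse PHP source.
$loaded = json_decode(file_get_contents('config.json'), true);
echo $loaded['some_var']; // hello world
```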
If you are determined to read the PHP file in Perl but you do not want to use a module such as PHP::Include, then you are left with writing something like this:
use IO::File;

sub require_php {
    my $source_filename = $_[0];
    my $dest_filename   = 'temp.inc.pl';
    open my $source, '<', $source_filename or die "Could not open $source_filename: $!";
    open my $destination, '>', $dest_filename or die "Could not open $dest_filename for writing: $!";
    while (my $line = <$source>) {
        # Copy everything except the PHP open/close tags.
        if (index($line, '<?php') == -1 && index($line, '?>') == -1) {
            print $destination $line;
        }
    }
    close $destination;
    close $source;
    require $dest_filename;
    unlink $dest_filename;
}

our $some_var = '';
require_php('myphp.inc');

which will end with $some_var having the value "hello world".
I want the following functionality using PHP:
I have a CSV file; each line corresponds to a row in my database.
There is an HTML form that will allow me to choose the CSV file.
Then, once the form is submitted, I must parse the CSV file and insert the data into the DB accordingly.
How do I go about doing this?
Reading a CSV file can generally be done using the fgetcsv function (depending on your kind of CSV file, you might have to specify the delimiter, enclosure, ... as parameters).
This means that going through your file line by line would not be much harder than something like this:
$f = fopen('/path/to/file', 'r');
if ($f) {
while (($line = fgetcsv($f)) !== false) { // You might need to specify more parameters
// deal with $line.
// $line[0] is the first column of the file
// $line[1] is the second column
// ...
}
fclose($f);
} else {
// error
}
(Not tested, but the example given on the manual page of fgetcsv should help you get started.)
Of course, you'll have to get the correct path to the uploaded file -- see the $_FILES superglobal, and the section on Handling file uploads, for more information about that.
And, to save the data into your database, you'll have to use the API that suits your DB engine -- if using MySQL, you should use either:
mysqli
or PDO
Note that you should prefer mysqli to the old mysql extension (which doesn't support the features added in MySQL >= 4.1).
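Putting the pieces together, here is a minimal sketch: fgetcsv drives the loop, and the actual insert is delegated to a callback so the same loop works with either mysqli or PDO. The prepared-statement wiring in the trailing comment is a hypothetical example (table and column names are assumptions, not tested against a real database).

```php
<?php
// Read every CSV row from an open stream and hand it to $insert_row.
// Returns the number of rows processed.
function import_csv($stream, $insert_row) {
    $count = 0;
    while (($line = fgetcsv($stream)) !== false) {
        $insert_row($line);
        $count++;
    }
    return $count;
}

// Example wiring with PDO (hypothetical table/columns):
// $stmt = $pdo->prepare('INSERT INTO items (name, qty) VALUES (?, ?)');
// $f = fopen($_FILES['csvfile']['tmp_name'], 'r');
// import_csv($f, function ($row) use ($stmt) { $stmt->execute($row); });
// fclose($f);
```

Separating parsing from inserting also makes the loop easy to test with an in-memory stream instead of a real upload.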