Save file from mysql database to file system - php

I am storing some doc files in a MySQL database that need to be saved to the file system and manipulated; however, some of the documents will not open after being saved to the file system.
I am pulling the file from the database as follows:
$load_doc = "SELECT doc_id, `doc_ext`, doc_size, `doc_conten`
FROM `li_webs_trans`.`li_docs` WHERE `doc_id` = '$doc_id'";
$doc = mysql_query($load_doc) or die(mysql_error());
$fp = fopen("/var/www/OCR/temp_doc/preview__".$doc_id .".docx", "w+");
while($row = mysql_fetch_array($doc)){
fwrite($fp, $row['doc_conten']);
}
The doc_content field is a LONGBLOB. The files are all docx files.
Any idea why some files open fine but others don't?
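One thing worth checking is whether the write is binary-safe and the handle is closed before the file is opened elsewhere; a minimal sketch of that variant, reusing the table and column names from the question (taking the extension from doc_ext instead of hard-coding .docx is an assumption):
<?php
// Sketch only: write the BLOB in binary mode and close the handle explicitly.
$load_doc = "SELECT `doc_ext`, `doc_conten` FROM `li_webs_trans`.`li_docs`
             WHERE `doc_id` = '$doc_id'";
$doc = mysql_query($load_doc) or die(mysql_error());
if ($row = mysql_fetch_assoc($doc)) {
    $path = "/var/www/OCR/temp_doc/preview__".$doc_id.".".$row['doc_ext'];
    $fp = fopen($path, "wb");        // "wb" forces binary-safe writing
    fwrite($fp, $row['doc_conten']);
    fclose($fp);                     // flush everything to disk before the file is read back
}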


how to copy mysql DB from localhost to server using php?

I have a PHP project (Angular, PHP, MySQL) running on my local machine, and the same copy of the project is running online.
My aim is to sync (copy the local DB to the server DB) every hour by running a PHP script via Angular's setInterval function.
What is the idea behind this functionality that I should use, or how can I achieve this?
Any suggestions would be a great help. Thanks in advance.
If your database tables are not going to change, what you can do is create a function that selects all the data from your local database and passes it to an online endpoint that updates your online database with the new or updated records.
For example:
If you have a table called users, select all the local data via AJAX, build a JSON object, and pass that data to your script.
From that JSON object you pass the data to an online PHP file and update your online database from it, as in the sketch below.
Note: you have to be careful to add enough conditions to check that data does not get lost or overwritten.
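A rough sketch of that idea, assuming a users table with an id primary key and a hypothetical sync.php endpoint on the live server (all host names, credentials and column names below are placeholders):
<?php
// local_push.php - sketch: read the local rows and POST them as JSON to the live server.
$local = new mysqli('localhost', 'local_user', 'local_pass', 'mydb');
$rows  = $local->query('SELECT id, name, email FROM users')->fetch_all(MYSQLI_ASSOC);

$ch = curl_init('https://example.com/sync.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode($rows),
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
]);
curl_exec($ch);
curl_close($ch);
And on the live server:
<?php
// sync.php - sketch: upsert the incoming rows by primary key.
$remote = new mysqli('localhost', 'remote_user', 'remote_pass', 'mydb');
$rows   = json_decode(file_get_contents('php://input'), true);

$stmt = $remote->prepare(
    'INSERT INTO users (id, name, email) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email)'
);
foreach ($rows as $r) {
    $stmt->bind_param('iss', $r['id'], $r['name'], $r['email']);
    $stmt->execute();
}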
You'll have to write a service and some code to dump your database (if you want to sync the complete database every time); follow this answer.
After dumping your SQL, you then have to upload the file to your server via the service. Upon receiving it you can load the data again with mysql -u username -p database_name < file.sql
However, I wouldn't recommend this; try exploring the Master-Slave database approach instead, where your local server's database is the master and your remote server's is the slave. Your data will be synchronized automatically. Please see this tutorial.
You can implement an interface to select the tables you want to import on the live site. Use the code below to generate CSV files of the selected tables and prepare an array of them.
<?php
$file_name_flat = 'TABLE-NAME.csv'; // Replace TABLE-NAME with your selected table name.
// Open the CSV file. Replace FOLDER_LOCATION with the local folder path where you want to save the CSV files.
$fpointer = fopen(FOLDER_LOCATION.$file_name_flat, 'w+');
// Execute a query to get all columns' data. Replace TABLE-NAME with your selected table name.
// You can add other conditions based on your requirements.
$query = "SELECT * FROM TABLE-NAME WHERE 1";
// Execute the query as per your CMS / framework coding standards and write the CSV file.
$result_flat = $DB_Connection->query($query)->fetchAll('assoc');
foreach ($result_flat as $fields) {
    fputcsv($fpointer, $fields);
}
// Prepare an array of CSVs to create the ZIP file.
$files = array($file_name_flat);
fclose($fpointer); // Close the CSV file after it has been written successfully.
?>
Create a ZIP of the CSVs:
// Create ZIP
$zipname = 'tables_'.date('Y-m-d-H-i-s').'.zip';
// Replace FOLDER_LOCATION with the local folder path where you saved the CSV files.
createZipFile($files, $zipname, FOLDER_LOCATION);

/* createZipFile function to create the zip file. Both params are mandatory. */
function createZipFile($files_names = array(), $zipfileName, $files_path = "") {
    $zip = new \ZipArchive;
    $zip->open(TMP.$zipfileName, \ZipArchive::CREATE);
    foreach ($files_names as $file) {
        $zip->addFile($files_path.$file, $file);
    }
    $zip->close();
    // Remove the CSV files once they are inside the archive.
    foreach ($files_names as $file) {
        unlink($files_path.$file);
    }
    // Then download the zipped file.
    header('Content-Type: application/zip');
    header('Content-disposition: attachment; filename='.$zipfileName);
    header('Content-Length: ' . filesize(TMP.$zipfileName));
    readfile(TMP.$zipfileName);
    unlink(TMP.$zipfileName);
    die;
}
Now implement a form to upload this zip file to the live server. In the POST action of that form, add the following code to receive the zip file.
$filename = $_FILES['filename']['name'];
$source   = $_FILES['filename']['tmp_name'];
// Upload the zip file to the server location. Replace SERVER_FOLDER_PATH with the
// server path where you want to save the uploaded zip.
$target_path = SERVER_FOLDER_PATH.$filename;
if (move_uploaded_file($source, $target_path)) {
    // Extract the ZIP file.
    $zip = new \ZipArchive();
    $x = $zip->open($target_path);
    if ($x === true) {
        $zip->extractTo(PATH_TO_SAVE_EXTRACTED_ZIP); // Change this to the correct site path.
        $zip->close();
        $cdir = scandir(PATH_TO_SAVE_EXTRACTED_ZIP); // Read the extracted directory.
        $fieldSeparator = ",";
        $lineSeparator  = '\n';
        foreach ($cdir as $key => $value) {
            if (!in_array($value, array(".", ".."))) {
                // Replace PATH_TO_SAVE_EXTRACTED_ZIP with your server path.
                $fileName  = PATH_TO_SAVE_EXTRACTED_ZIP.$value;
                $tableName = SET_TABLE_NAME; // You have to set the logic to get the table name.
                if (is_file($fileName)) {
                    // Use MySQL "LOAD DATA LOCAL INFILE" to import each CSV into its table.
                    // It imports the whole file at once; there is no need to loop and insert rows one by one.
                    $q = 'LOAD DATA LOCAL INFILE "'.$fileName.'" REPLACE INTO TABLE '.$tableName.' FIELDS TERMINATED BY "'.$fieldSeparator.'" ENCLOSED BY \'"\' LINES TERMINATED BY "'.$lineSeparator.'"';
                    $DB_Connection->query($q);
                }
            }
        }
    }
}
You can read more about LOAD DATA in the MySQL documentation.
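If you want the hourly sync to run unattended rather than through a manual download/upload form, the generated zip could also be pushed from the local machine with cURL; a rough sketch, assuming the receiving script above lives at a hypothetical https://example.com/import.php and that createZipFile() is changed to keep the archive instead of streaming and unlinking it:
<?php
// Sketch: POST the zip created above to the live server.
// The 'filename' field name matches the $_FILES['filename'] used by the upload handler.
$zipPath = TMP.$zipname; // where createZipFile() wrote the archive
$ch = curl_init('https://example.com/import.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => ['filename' => new CURLFile($zipPath, 'application/zip', $zipname)],
]);
$response = curl_exec($ch);
curl_close($ch);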
You can run a cron job on your local computer that exports the MySQL data using mysqldump and then uploads it to the server using rsync and sshpass.
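If it has to stay a PHP script (so it can be triggered from the Angular setInterval call), a rough sketch of the same idea shelling out from PHP, assuming mysqldump, sshpass and rsync are installed and that the credentials and paths below are placeholders:
<?php
// Sketch only: dump the local database and push the dump to the remote server.
$dump = '/tmp/mydb.sql';
shell_exec('mysqldump -u local_user -plocal_pass mydb > ' . escapeshellarg($dump));
shell_exec('sshpass -p "remote_pass" rsync -az ' . escapeshellarg($dump) . ' user@example.com:/tmp/mydb.sql');
// On the server, a cron job (or another small script) can then reload it:
//   mysql -u username -p database_name < /tmp/mydb.sql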

PHP. Transferring downloaded csv to mysql database - how can I store downloaded csv file in php memory

I would like to store data downloaded from a website in my MySQL database.
I use my function CallAPI("GET", $url, $data = false) to fetch the data using a URL such as "http://www.xflow1.com/xGlobalHist.csv/"...
So my call $results = CallAPI($method, $url, $data = false); returns a comma-delimited array that is saved in the variable $results. I can echo $results in a web page and it shows me the data, all comma-delimited. All good 'til here.
To load the CSV into my MySQL database I want to use the "LOAD DATA INFILE" statement, like so:
$upload = <<<eof
LOAD DATA INFILE $results
INTO TABLE X_Adjusted_All
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(Cusip, Date, Price)
eof;
The snag is that "LOAD DATA INFILE $results" does not work, since LOAD DATA INFILE only accepts a filename, so I would like to store $results as a CSV file in memory to avoid creating and deleting files all the time. I thought this might work:
$fp = fopen('php://memory', 'w');
fputcsv($fp, $results);
Alas, no. Does anyone have any idea how to take the downloaded CSV and save it to PHP's memory as a CSV file for use with LOAD DATA INFILE?
MySQL cannot read directly from PHP's memory. You need to create an external file using tempnam(). That file can then be read by MySQL. You could also create a temporary file and find out the filename as mentioned in this question.
The example of tempnam() shows how to write to that file:
<?php
$tmpfname = tempnam("/tmp", "FOO");
$handle = fopen($tmpfname, "w");
fwrite($handle, $results);
fclose($handle);
// do something
unlink($tmpfname);
This will store the contents of $results in a file named $tmpfname, which you can then load into MySQL using LOAD DATA INFILE $tmpfname.
As mentioned in the comments, you could create a ram drive to store that file, if performance is an issue.
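Putting the pieces together, a sketch of the whole flow, assuming a mysqli connection in $mysqli (a placeholder) and that local_infile is enabled on both the client and the server:
<?php
// Sketch: write the downloaded CSV to a temp file, import it, then clean up.
$tmpfname = tempnam(sys_get_temp_dir(), "csv");
file_put_contents($tmpfname, $results);

$sql = "LOAD DATA LOCAL INFILE '" . $mysqli->real_escape_string($tmpfname) . "'
        INTO TABLE X_Adjusted_All
        FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES
        (Cusip, Date, Price)";
$mysqli->query($sql);

unlink($tmpfname); // remove the temp file once the import has run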

PHP: retrieving full path from a selected file to import data to database

I want to allow the user to select a file from which data is imported and saved into a database. I found the following code to import the data:
$fp = fopen("people.csv", "r");
while ($line = fgets($fp)) {
$char = split("\t",$line,4);
list($name, $age,$height,$date) = $char;
//Split the line by the tab delimiter and store it in our list
$sql = "insert into people(name,age,height,date) values('$name','$age','$height','$date')"; // Generate our sql string
mysql_query($sql) or die(mysql_error()); // Execute the sql
//fclose($line);
I'm not sure if it works but i'm yet to get that far. My question is how do I get the full path name of the selected file so I can feed it into the following command rather than hard coding the filename:
$fp = fopen("people.csv", "r");
I have spent a lot of time researching this but to no avail.
If you want to let users upload this file, you should implement file upload. Please check W3Schools Tutorial for instructions.
The main idea is to get the file path from $_FILES["file"]["tmp_name"]:
$fp = fopen($_FILES["file"]["tmp_name"], "r");
If you store your file statically in your web project folder, you can build the path from DOCUMENT_ROOT:
$path = $_SERVER["DOCUMENT_ROOT"] . '/relative/path/to/your/file/' . 'people.csv';
You can obtain the directory of the current file via the __DIR__ magic constant:
Magic constants
In your case:
echo __DIR__ . '/people.csv';
Also, you may consider using the built-in CSV functions that PHP offers;
http://php.net/manual/en/function.fgetcsv.php
If you want to skip processing the file via PHP altogether, MySQL offers ways to directly import data from a CSV file;
https://dev.mysql.com/doc/refman/5.5/en/load-data.html
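As an illustration of the fgetcsv suggestion, a sketch of the question's import loop rewritten with it, assuming the same tab-delimited people.csv arrives as an upload and that $conn is a placeholder mysqli connection:
<?php
// Sketch: read an uploaded, tab-delimited CSV with fgetcsv and insert each row via a prepared statement.
$fp   = fopen($_FILES["file"]["tmp_name"], "r");
$stmt = $conn->prepare("INSERT INTO people (name, age, height, date) VALUES (?, ?, ?, ?)");

while (($row = fgetcsv($fp, 0, "\t")) !== false) {
    list($name, $age, $height, $date) = $row;
    $stmt->bind_param("ssss", $name, $age, $height, $date);
    $stmt->execute();
}
fclose($fp);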

php and CSV file upload,what if the file has a virus?

I'm using this code to upload a CSV file to my server folder and extract the data to be inserted into the database. What if the user accidentally uploads a CSV file with a virus? Will the system/server be infected with the virus, given that the system does not execute the file?
$name = $_FILES['fileuploaded']['name'];
$tmp_name = $_FILES['fileuploaded']['tmp_name'];
$_SESSION['username'] = $username;
if ($name) {
    $location = "files/$name";
    move_uploaded_file($tmp_name, $location);
    $file_handle = fopen("files/".$name, "r");
}
"what if the user accidentally uploaded a CSV file with virus?". I would be more concerned with the question "what if the user INTENTIONALLY uploaded a CSV file with virus". Since you are not checking for the file type, someone could upload a file called bad.php and guess that you might put it in a folder called files and then they could execute it and do all sorts of damage.

Download file which was uploaded to mysql to server directory

I've searched and searched but have not found a script capable of downloading files that were uploaded to my MySQL database down into a directory on my server.
This is only temporary and once the file has been used it'll need to be disposed of.
Got any ideas?
I suppose you mean that the file is saved in the MySQL database as a BLOB (binary data):
$result = mysql_query("select * from tablename where id=XXXX");
if (mysql_num_rows($result) > 0) {
    $r = mysql_fetch_array($result, MYSQL_ASSOC);
    file_put_contents($filename, $r['column_blob']);
}
I used the file_put_contents function because it is binary-safe, but fwrite would also work.
