Upload CSV file and import it to database using Laravel - php

I have this method that uploads a CSV file, but now I want to know how to import its contents into columns in my database.
My method:
public function postUpload ()
{
    if (Input::hasFile('file')) {
        $file = Input::file('file');
        $name = time() . '-' . $file->getClientOriginalName();
        // Moves file to folder on server
        $file->move(public_path() . '/uploads/CSV', $name);
        return 'Ok'; // return for testing
    }
}
So my question is: within this method, how can I put the data into the database?

This one should work for you; it uses PDO plus MySQL's LOAD DATA approach:
private function _import_csv($path, $filename)
{
    $csv = $path . '/' . $filename;
    // of course you have to modify this with your proper table and field names
    $query = sprintf("LOAD DATA LOCAL INFILE '%s' INTO TABLE your_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' LINES TERMINATED BY '\\n' IGNORE 0 LINES (`field_one`, `field_two`, `field_three`)", addslashes($csv));
    return DB::connection()->getPdo()->exec($query);
}
Combined with your code, it could be something like the below:
public function postUpload ()
{
    if (Input::hasFile('file')) {
        $file = Input::file('file');
        $name = time() . '-' . $file->getClientOriginalName();
        // check out the edit at the bottom of my answer for details on $storage
        $storage = '/some/world/readable/dir';
        $path = $storage . '/uploads/CSV';
        // Moves file to folder on server
        $file->move($path, $name);
        // Import the moved file to DB and return OK if there were rows affected
        return ( $this->_import_csv($path, $name) ? 'OK' : 'No rows affected' );
    }
}
EDIT
One thing to note: the error you report in the comments is probably a permissions issue (OS error code 13: Permission denied).
Please see: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
"For security reasons, when reading text files located on the server,
the files must either reside in the database directory or be readable
by all. Also, to use LOAD DATA INFILE on server files, you must have
the FILE privilege. See Section 5.7.3, “Privileges Provided by
MySQL”."
As reported on the MySQL bug tracker (http://bugs.mysql.com/bug.php?id=31670), it seems that you need particular permissions for all the folders in the CSV file's path:
All parent directories of the infile need to be world-readable, I think, as well as the directory and infile themselves... So for an infile here: /tmp/imports/site1/data.file you would need (I think; 755 worked) r+x for 'other' on these directories: /tmp and /tmp/imports, as well as the main two: /tmp/imports/site1 and /tmp/imports/site1/data.file
To sum up: to solve the "sqlstate hy000 general error 13 can't get stat of..." issue, you have to move the uploaded file to a location with proper permissions (so not necessarily the one you are currently using); try something like "/tmp/import", as sketched below.
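For instance, a minimal sketch of that move, reusing $file and $name from the code above (the /tmp/import location and the 0644 mode are assumptions; adjust to your server):

$path = '/tmp/import';
if (!is_dir($path)) {
    mkdir($path, 0755, true); // parent dirs need r+x for 'other', see above
}
$file->move($path, $name);
chmod($path . '/' . $name, 0644); // world-readable so mysqld can stat() and read it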

While LOAD DATA INFILE is the quickest way, I prefer to use a lib like https://github.com/ddeboer/data-import or https://github.com/goodby/csv, for two reasons (see the sketch below).
It is extensible: what if your data source changes to Excel files, a Mongo DB, or some other source?
It is malleable: if you need to convert dates, strings, or numbers, you can do it conditionally, which cannot be done with a batch command.
my 2c
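As a rough sketch of the goodby/csv route inside the upload method above (the table name, column names, and row-to-column mapping are placeholders):

use Goodby\CSV\Import\Standard\Lexer;
use Goodby\CSV\Import\Standard\LexerConfig;
use Goodby\CSV\Import\Standard\Interpreter;

$config = new LexerConfig();
$config->setDelimiter(',');

$interpreter = new Interpreter();
$interpreter->addObserver(function (array $row) {
    // Convert or validate fields here before inserting.
    DB::table('your_table')->insert([
        'field_one'   => $row[0],
        'field_two'   => $row[1],
        'field_three' => $row[2],
    ]);
});

(new Lexer($config))->parse($path . '/' . $name, $interpreter);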

Related

How to copy a MySQL DB from localhost to a server using PHP?

I have a PHP project running on my local machine (Angular, PHP, MySQL). The same copy of the project runs online.
My aim is to sync (copy the local DB to the server DB) every hour by running a PHP script via Angular's setInterval function.
What is the idea behind this functionality that I should use, or how can I achieve this?
Any suggestions would be a great help. Thanks in advance.
If your database tables are not going to change, what you can do is create a function that selects all the data from your local database and passes it to an online function, which updates your online database with new or updated records.
For example:
Say you have a table called users. From AJAX you select all the local data, create a JSON object, and pass the data to a script function.
From that JSON object you pass the data to an online PHP file and update your online database from it.
Note: You have to be careful and add plenty of checks so that data doesn't get missed or overwritten. A sketch of the idea follows.
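A minimal sketch of both sides, assuming PDO, a users table with an id primary key, and a hypothetical sync.php endpoint on the live server (a real version also needs authentication):

// Local side: read the table and POST it as JSON to the live server.
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
$rows = $pdo->query('SELECT * FROM users')->fetchAll(PDO::FETCH_ASSOC);

$ch = curl_init('https://example.com/sync.php'); // placeholder URL
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode($rows),
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
curl_close($ch);

// Live side (sync.php): insert new rows, update existing ones.
$rows = json_decode(file_get_contents('php://input'), true);
$stmt = $pdo->prepare(
    'INSERT INTO users (id, name, email) VALUES (:id, :name, :email)
     ON DUPLICATE KEY UPDATE name = VALUES(name), email = VALUES(email)'
);
foreach ($rows as $row) {
    $stmt->execute(['id' => $row['id'], 'name' => $row['name'], 'email' => $row['email']]);
}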
You'll have to write a service and some code to dump your database (if you want to sync the complete database every time); follow this answer.
After dumping your SQL, you next have to upload the file to your server via the service. Upon receiving it, you can load the data again with mysql -u username -p database_name < file.sql
However, I won't recommend this; try exploring the Master-Slave database approach instead, where your local server's database is the master and your remote server's database is the slave. Your data will automatically be synchronized. Please see this tutorial.
You can implement an interface to select the tables you want to import to live. Use the code below to generate CSV files of the selected tables and prepare the array.
<?php
$file_name_flat = 'TABLE-NAME.csv'; // Replace TABLE-NAME with your selected table name.
// Open the CSV file. Replace FOLDER_LOCATION with the local folder path where
// you want to save the CSV files.
$fpointer = fopen(FOLDER_LOCATION . $file_name_flat, 'w+');
// Execute a query to get all columns' data. Replace TABLE-NAME with your selected
// table name; you can add other conditions based on your requirements.
$query = "SELECT * FROM TABLE-NAME WHERE 1";
// Execute the query as per your CMS / framework coding standards and write the CSV file.
$result_flat = $DB_Connection->query($query)->fetchAll('assoc');
foreach ($result_flat as $fields) {
    fputcsv($fpointer, $fields);
}
// Prepare the array of CSVs to create the ZIP file.
$files = array($file_name_flat);
fclose($fpointer); // close the CSV file after a successful write
?>
Create a ZIP of the CSVs:
// Create the ZIP. Replace FOLDER_LOCATION with the local folder path where you
// saved the CSV files; TMP is the folder where the zip itself is written.
$zipname = 'tables_' . date('Y-m-d-H-i-s') . '.zip';
createZipFile($files, $zipname, FOLDER_LOCATION);

/* createZipFile function to create the zip file. Both params are mandatory. */
function createZipFile($files_names = array(), $zipfileName, $files_path = "")
{
    $zip = new \ZipArchive;
    $zip->open(TMP . $zipfileName, \ZipArchive::CREATE);
    foreach ($files_names as $file) {
        $zip->addFile($files_path . $file, $file);
    }
    $zip->close();
    foreach ($files_names as $file) {
        unlink($files_path . $file); // remove the CSVs once they are zipped
    }
    // Then download the zipped file.
    header('Content-Type: application/zip');
    header('Content-disposition: attachment; filename=' . $zipfileName);
    header('Content-Length: ' . filesize(TMP . $zipfileName));
    readfile(TMP . $zipfileName);
    unlink(TMP . $zipfileName);
    die;
}
Now implement a form to upload this zip file to the live server. In the POST action of this form, add code to get the zip file.
$filename = $_FILES['filename']['name'];
$source = $_FILES["filename"]["tmp_name"];
// Upload the zip file to the server location. Replace SERVER_FOLDER_PATH with
// the server location where you want to save the uploaded zip.
$target_path = SERVER_FOLDER_PATH . $filename;
if (move_uploaded_file($source, $target_path)) {
    // Extract the ZIP file.
    $zip = new \ZipArchive();
    $x = $zip->open($target_path);
    if ($x === true) {
        $zip->extractTo(PATH_TO_SAVE_EXTRACTED_ZIP); // change this to the correct site path
        $zip->close();
        $cdir = scandir(PATH_TO_SAVE_EXTRACTED_ZIP); // read the directory
        $fieldSeparator = ",";
        $lineSeparator = '\n';
        foreach ($cdir as $key => $value) {
            if (!in_array($value, array(".", ".."))) {
                // Replace PATH_TO_SAVE_EXTRACTED_ZIP with your server path.
                $fileName = PATH_TO_SAVE_EXTRACTED_ZIP . $value;
                $tableName = SET_TABLE_NAME; // you have to set the logic to get the table name
                if (is_file($fileName)) {
                    // Use MySQL's LOAD DATA LOCAL INFILE to import each CSV into its
                    // table. It imports the whole CSV at once; there is no need to
                    // loop and insert the rows one by one.
                    $q = 'LOAD DATA LOCAL INFILE "' . $fileName . '" REPLACE INTO TABLE ' . $tableName . ' FIELDS TERMINATED BY "' . $fieldSeparator . '" ENCLOSED BY ' . '\'"\'' . ' LINES TERMINATED BY "' . $lineSeparator . '"';
                    $DB_Connection->query($q);
                }
            }
        }
    }
}
You can read more about MySQL's LOAD DATA in the MySQL documentation.
You can run a cron job on your local computer that exports the MySQL data using mysqldump and then uploads it to the server using rsync and sshpass.
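A rough sketch of that script as PHP that cron could invoke (credentials, host, and paths are placeholders; this variant assumes key-based SSH instead of sshpass):

// Dump the database to a temporary file, then push it to the server.
$dump = '/tmp/database_name.sql';
exec('mysqldump -u username -psecret database_name > ' . escapeshellarg($dump));
exec('rsync -avz ' . escapeshellarg($dump) . ' user@example.com:/var/backups/');
unlink($dump); // clean up the local dump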

File not found during import of CSV into MySQL DB table

The folder structure of my site is the following: www.mysite.it/site/scripts.
At this path there are two files: import.php and tabella.csv.
tabella.csv is a CSV file like this:
"2016-09-02", 100.01, 4005.09, 5000, 1.09, 120.09, 100.5, 200.77
"2016-09-03", 150.01, 4205.09, 5600, 1.10, 150.09, 300.5, 300.77
import.php is the PHP script to execute the import, and it's a file like this:
<?php
$csvFile = "../scripts/tabella.csv";
$db = @mysql_connect('****', '****', '****');
@mysql_select_db('******');
$query = 'LOAD DATA LOCAL INFILE \' ' . $csvFile . ' \' INTO TABLE rame
    FIELDS TERMINATED BY \',\'
    LINES TERMINATED BY \'\r\n\'
    IGNORE 1 LINES
    (
        giorno,
        lmedollton,
        changedolleuro,
        euroton,
        lmesterton,
        delnotiz,
        girm,
        sgm
    )';
if (!mysql_query($query)) {
    die(mysql_error());
}
mysql_close($db);
?>
The error is 'file not found'. I tried to use 'tabella.csv' and also the real path (using the realpath PHP function), but the error is always the same. What is the correct string I have to assign to the $csvFile variable? Can you help me, please?
You should provide an absolute path to your CSV file, e.g.
$csvFile = "/var/www/htdocs/site/scripts/tabella.csv";
because MySQL will not be able to resolve the path relative to the Apache httpdoc-root directory. See also the MySQL docs:
If LOCAL is specified, the file is read by the client program on the client host and sent to the server. The file can be given as a full path name to specify its exact location. If given as a relative path name, the name is interpreted relative to the directory in which the client program was started.
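In practice you can build that absolute path from the script's own location instead of hard-coding it; a small sketch, assuming tabella.csv sits next to import.php:

// __DIR__ is the directory of the running script, so this always yields an
// absolute path regardless of the working directory.
$csvFile = __DIR__ . '/tabella.csv';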

Export MySQL table rows as individual files (specifically JSON) in one go

I have over 750 JSON files I need to create from a MySQL database table.
It is the WordPress "wp_options" table, but this is a MySQL question.
The wp_options table has the following columns:
option_id, option_name, option_value, autoload
The "option_name" is to become the JSON file name.
I am fine if I have to rename each file manually.
The "option_value" is to become the JSON data.
Is there a way I can do this more efficiently, instead of creating an empty JSON file for each row and then copying the option_value into the JSON file?
My main concern is that with 750 files to make, I am a little wary I will miss something or double up on something, and this information has to be exact.
NOTE: I've read this Stack article (which is the closest I could find): http://goo.gl/RnV5cf. But it doesn't seem to work as expected given the WordPress wp_options values, I think.
If I needed to do this, and only needed to do it once, I'd probably just run a little PHP script locally.
Assuming you have grabbed this table as an array (here I've called it $wp_options), you could just iterate over it using fopen, fwrite, and fclose to make your files. I've also assumed you want the files to have '.json' extensions, but obviously you can strip that out.
foreach ($wp_options as $wpo) {
    $newFile = fopen($wpo['option_name'] . '.json', 'w'); // w = write mode
    fwrite($newFile, json_encode($wpo['option_value']));
    fclose($newFile);
}
The above is untested, but I think that would work.
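If it helps, a hedged sketch of the "grab this table as an array" step with mysqli (host, credentials, and database name are placeholders):

// ~750 rows is small enough to fetch into memory in one go.
$mysqli = new mysqli('localhost', 'user', 'pass', 'wordpress');
$result = $mysqli->query('SELECT option_name, option_value FROM wp_options');
$wp_options = $result->fetch_all(MYSQLI_ASSOC);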
Sounds like you just need a local script:
<?php
// ...
foreach ($wp_options as $wp_option)
{
    $fileName = __DIR__ . '/' . $wp_option['option_name'] . '.json';
    file_put_contents($fileName, json_encode($wp_option['option_value']));
}

PHP: retrieving full path from a selected file to import data to database

I want to allow the user to select a file from which data is read and saved into a database. I found the following code to import the data:
$fp = fopen("people.csv", "r");
while ($line = fgets($fp)) {
$char = split("\t",$line,4);
list($name, $age,$height,$date) = $char;
//Split the line by the tab delimiter and store it in our list
$sql = "insert into people(name,age,height,date) values('$name','$age','$height','$date')"; // Generate our sql string
mysql_query($sql) or die(mysql_error()); // Execute the sql
//fclose($line);
I'm not sure if it works, but I've yet to get that far. My question is: how do I get the full path name of the selected file so I can feed it into the following command, rather than hard-coding the filename:
$fp = fopen("people.csv", "r");
I have spent a lot of time researching this, but to no avail.
If you want to let users upload this file, you should implement file upload. Please check W3Schools Tutorial for instructions.
The main idea is to get file path from $_FILES["file"]["tmp_name"]:
$fp = fopen($_FILES["file"]["tmp_name"], "r");
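For completeness, a minimal form that makes $_FILES["file"] available to the handler (the import.php target name is an assumption):

<!-- enctype="multipart/form-data" is required for file uploads -->
<form action="import.php" method="post" enctype="multipart/form-data">
    <input type="file" name="file">
    <input type="submit" value="Upload">
</form>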
If you store your file statically in your web project folder, you can build the path from DOCUMENT_ROOT (note that PHP concatenates strings with . rather than +):
$path = $_SERVER["DOCUMENT_ROOT"] . '/relative/path/to/your/file/' . 'people.csv';
You can obtain the directory of the current file via the __DIR__ magic constant;
Magic constants
In your case:
echo __DIR__ . '/people.csv';
Also, you may consider using the built-in CSV functions that PHP offers;
http://php.net/manual/en/function.fgetcsv.php
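A minimal sketch with fgetcsv() on the uploaded file, carrying over the tab delimiter and field order from the question's code:

$fp = fopen($_FILES["file"]["tmp_name"], "r");
while (($row = fgetcsv($fp, 0, "\t")) !== false) {
    list($name, $age, $height, $date) = $row;
    // insert into the database here, ideally with a prepared statement
}
fclose($fp);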
If you want to skip processing the file via PHP altogether, MySQL offers ways to directly import data from a CSV file;
https://dev.mysql.com/doc/refman/5.5/en/load-data.html

PHP MySQL OUTFILE command

If I use the following in a mysql_query command:
SELECT *
FROM mytable
INTO OUTFILE '/tmp/mytable.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
Where is the tmp file relative to: the MySQL database somehow, or the PHP file?
If it does not exist, will it be created?
If I would like it to appear one folder up from the PHP file which runs the query, how would I do that?
According to The Documentation On Select, it's stored on the server and not on the client:
The SELECT ... INTO OUTFILE 'file_name' form of SELECT writes the selected rows to a file. The file is created on the server host, so you must have the FILE privilege to use this syntax. file_name cannot be an existing file, which among other things prevents files such as /etc/passwd and database tables from being destroyed. As of MySQL 5.0.19, the character_set_filesystem system variable controls the interpretation of the file name.
And, more to the point:
The SELECT ... INTO OUTFILE statement is intended primarily to let you very quickly dump a table to a text file on the server machine. If you want to create the resulting file on some other host than the server host, you normally cannot use SELECT ... INTO OUTFILE since there is no way to write a path to the file relative to the server host's file system.
So, don't use it in production to generate CSV files. Instead, build the CSV in PHP using fputcsv:
$result = $mysqli->query($sql);
if (!$result) {
    // SQL error
}
$f = fopen('mycsv.csv', 'w');
if (!$f) {
    // Could not open file!
}
while ($row = $result->fetch_assoc()) {
    fputcsv($f, $row);
}
fclose($f);
Where is the tmp file relative to?
A: The file will contain the result of SELECT * FROM mytable.
If it does not exist, will it be created?
A: Yes.
If I would like it to appear one folder up from the PHP file which does it, how would I do that?
A: If you want one folder up from fileYouAreRunning.php, then make the path like this: "../mytable.csv"
Your current query has an absolute path, so the outfile will not be relative to anything; it will be saved to /tmp/mytable.csv.
I'd say the safest bet is to keep using absolute paths, so check in your PHP what the absolute path to the parent folder is, and then add it to your query using a variable, as sketched below.
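A minimal sketch of that (it assumes the MySQL server runs on the same host as the PHP script, since the server is what writes the outfile, and the target file must not already exist):

// Resolve the parent directory of this script to an absolute path.
$outfile = realpath(__DIR__ . '/..') . '/mytable.csv';
$sql = "SELECT * FROM mytable
        INTO OUTFILE '" . $outfile . "'
        FIELDS TERMINATED BY ','
        ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'";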
