I'm trying to import data from my students.csv file into MySQL using PHP. The entries in the CSV file are arranged so that the columns (student_number, fname, lname, level) are inserted into the biodata table.
I'm also uploading the students.csv file from my computer.
When I run the page I don't get anything out on the screen.
session_start();
require('includes/dbconnect.php');
require 'includes/header.inc.php';

//check for file upload
if (isset($_FILES['csv_file']) && is_uploaded_file($_FILES['csv_file']['tmp_name'])) {
    //upload directory
    $upload_dir = "C:\Users\DOTMAN\Documents\students.csv";
    //create file name
    $file_path = $upload_dir . $_FILES['csv_file']['name'];
    //move uploaded file to upload dir
    if (!move_uploaded_file($_FILES['csv_file']['tmp_name'], $file_path)) {
        //error moving upload file
        echo "Error moving file upload";
    }
    //open the csv file for reading
    $handle = fopen($file_path, 'r');
    //turn off autocommit and delete the existing bio data
    mysql_query("SET AUTOCOMMIT=0");
    mysql_query("BEGIN");
    mysql_query("TRUNCATE TABLE biodata") or die(mysql_error());
    while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
        //Access field data in $data array ex.
        $student_number = $data[0];
        $fname = $data[1];
        $lname = $data[2];
        $level = $data[3];
        //Use data to insert into db
        $query = "INSERT INTO biodata (student_number, fname, lname, level)
                  VALUES ('$student_number', '$fname', '$lname', '$level')";
        mysql_query($query) or die(mysql_error());
    }
}
I'd suggest importing the CSV file with the LOAD DATA INFILE command. This is a fast method.
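For example, a rough sketch of that approach might look like this (it reuses the table and column names from the question; LOCAL INFILE must be enabled on both the client and the server):
// Sketch only: load the uploaded CSV straight into the biodata table
$path = mysql_real_escape_string($_FILES['csv_file']['tmp_name']);
$sql = "LOAD DATA LOCAL INFILE '$path'
        INTO TABLE biodata
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\\n'
        (student_number, fname, lname, level)";
mysql_query($sql) or die(mysql_error());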
If you only need to do this once, I would consider using something like http://csv2sql.com/
One immediate issue I can see is here:
$upload_dir = "C:\Users\DOTMAN\Documents\students.csv";
//create file name
$file_path = $upload_dir . $_FILES['csv_file']['name'];
You are already assigning the entire path, including the file name, to the $upload_dir variable - and then you're appending the uploaded file name again.
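For illustration, a corrected version might look like this ("uploads" below is just an example directory; any writable directory will do):
// Point $upload_dir at a directory, then append the file name
$upload_dir = "C:\\Users\\DOTMAN\\Documents\\uploads\\";
$file_path = $upload_dir . basename($_FILES['csv_file']['name']);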
If you think there are errors in your code, start by adding
ini_set('display_errors', 1);
error_reporting(E_ALL);
to the beginning of your PHP code and fix any warnings/errors displayed. You can then turn off printing error messages by changing the second parameter to 0 in the first call.
Have you debugged $_FILES before doing anything else?
print_r($_FILES);
Solution using PHP:
$file = 'path/to.csv';
$lines = file($file);
$firstLine = $lines[0];
foreach ($lines as $line_num => $line) {
    if ($line_num == 0) { continue; } //skip the header row
    $arr = explode(",", $line);
    $column1 = $arr[0];
    $column2 = $arr[1];
    echo $column1 . $column2 . "<br />";
    //put the mysql insert statement here
}
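For instance, the insert mentioned in that comment might look roughly like this (the table and column names are placeholders):
// Hypothetical insert for the loop above; trim() removes the trailing newline that file() keeps,
// and the values are escaped before building the query
$column1 = mysql_real_escape_string(trim($column1));
$column2 = mysql_real_escape_string(trim($column2));
mysql_query("INSERT INTO your_table (col1, col2) VALUES ('$column1', '$column2')") or die(mysql_error());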
Here is my code to import CSV data into my database:
if (isset($_POST["submit"])) {
    $file = $_FILES['file']['tmp_name'];
    $handle = fopen($file, "r");
    $c = 0;
    while (($filesop = fgetcsv($handle, 1000, ",")) !== false)
    {
        $name = $filesop[0];
        $email = $filesop[1];
        $sql = mysql_query("INSERT INTO selleruser (emaili) VALUES ('$name')");
        $c = $c + 1;
    }
    if ($sql) {
        echo "Your database has imported successfully. You have inserted " . $c . " records";
    } else {
        echo "Sorry! There is some problem.";
    }
}
?>
I have a CSV file where there is a column which contains emails.
The import is done successfully, but the issue is that the values are imported in another format, as if in another language, encrypted, or otherwise unreadable.
Try this query to import the CSV from phpMyAdmin or any other MySQL client.
Before running the query, please make sure that a table matching the fields has already been created in the database.
LOAD DATA LOCAL INFILE 'local machine path to csv file' INTO TABLE `selleruser`
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(emaili);
So I have searched high and low for a relevant answer, and I have yet to find one, so I'm just going to ask the question myself. In my MySQL table I have 5 columns: id, customer__id, file_data, file_name, and mime_type. The relevant one here is file_data, which is of type LONGBLOB. From what I understand, the size of files that a LONGBLOB can handle is pretty substantial, but when I try to upload a file that is around 978 KB it fails. Could it be that the dimensions of the image are too large (2048 x 1536)?
Here is the code for my uploader. It works very elegantly for things like Excel sheets, PDFs, Word documents, and other files, but when it comes to images it fails:
<?php
require_once '../../inc/config.php';

$response = array();
$response['errors'] = false;

$id = $_REQUEST['id'];

if (!empty($_FILES)) {
    //set default data arrays
    $names = array();      //stores file names
    $files = array();      //stores the file data
    $mime_types = array(); //store the file type as a mime type

    //force each file name to the names array
    foreach ($_FILES['file']['name'] as $name) {
        array_push($names, $name);
    }
    //force the file data into its own array spot in the files array
    foreach ($_FILES['file']['tmp_name'] as $temp) {
        array_push($files, prepareImageDBString($temp));
    }
    //force the mimetypes into the mime_types array
    foreach ($_FILES['file']['type'] as $type) {
        array_push($mime_types, $type);
    }

    //process all three of the file arrays simultaneously so that no data is left out
    for ($i = 0; $i < count($names); $i++) {
        $file_name = $names[$i];
        $file_data = $files[$i];
        $mime_type = $mime_types[$i];

        //set the query for the data to go into the note_file table in the database
        $q = "INSERT INTO brb.files (customer__id, file_name, file_data, mime_type)
              VALUES('$id', '$file_name', '$file_data', '$mime_type')";

        //run the query
        if ($stmt = $CONN->prepare($q)) {
            //process any errors that may occur
            if (!$stmt->execute()) {
                printf("Error Message: %s\n", $CONN->error);
            }
        }
    }
}

echo json_encode($response);

function prepareImageDBString($filepath) {
    $out = 'null';
    $handle = fopen($filepath, 'r');
    if ($handle) {
        $content = fread($handle, filesize($filepath));
        $content = bin2hex($content);
        fclose($handle);
        $out = $content;
    }
    return $out;
}
?>
If someone could point me in the right direction, that would be fantastic. Please do not give me an answer about why the practice is bad; I'm aware it's bad practice and know what proper practice is. This is a learning tool for myself and nothing more, so just bear with me on it for a minute.
Thank you to everyone who provides help!
I have created a website which requires me to upload a CSV file into the MySQL database (WAMP server). Since I have never done this before, a detailed answer with steps would be really helpful. I need the user to upload a file using an HTML file input, and then PHP code to import that file into the MySQL database. I am using this code:
<?php
$con = mysql_connect("localhost", "", "");
mysql_select_db("sg", $con);

define('CSV_PATH', 'C:/Users/mkutbudd/Desktop/');
$csv_file = CSV_PATH . "dum.csv";

if (($getfile = fopen($csv_file, "r")) !== FALSE) {
    $data = fgetcsv($getfile, 1000, ",");
    while (($data = fgetcsv($getfile, 1000, ",")) !== FALSE) {
        $num = count($data);
        for ($c = 0; $c < $num; $c++) {
            $result = $data;
            $str = implode(",", $result);
            $slice = explode(",", $str);
            $col1 = $slice[0];
            $col2 = $slice[1];
            $col3 = $slice[2];
            $query = "INSERT INTO dummy(dum1,dum2,dum3)
                      VALUES('" . $col1 . "','" . $col2 . "','" . $col3 . "')";
            $s = mysql_query($query, $con);
        }
    }
}
echo "File data successfully imported to database!!";
mysql_close($con);
?>
Your first step is to allow the user to upload a file; then you need to handle any errors, validate the file, and, if everything is correct, move the uploaded file into place: http://php.net/manual/en/features.file-upload.php
Then you need to parse the file with the PHP function for it: http://hu1.php.net/manual/en/function.fgetcsv.php
After this, you need to create a connection to the database: http://hu1.php.net/manual/en/book.mysqli.php
And then you need to insert your data into your table with a loop, such as foreach, over the array you created from fgetcsv: http://hu1.php.net/manual/en/control-structures.foreach.php
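Putting those steps together, a minimal sketch might look like this (the database credentials, table name, and column names are placeholders, not from the question):
// Minimal sketch: handle the upload, parse with fgetcsv, insert with a mysqli prepared statement
if (isset($_FILES['csv_file']) && is_uploaded_file($_FILES['csv_file']['tmp_name'])) {
    $mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'your_database');
    $stmt = $mysqli->prepare("INSERT INTO your_table (col1, col2) VALUES (?, ?)");
    $handle = fopen($_FILES['csv_file']['tmp_name'], 'r');
    while (($row = fgetcsv($handle, 1000, ',')) !== false) {
        $col1 = $row[0];
        $col2 = $row[1];
        $stmt->bind_param('ss', $col1, $col2);
        $stmt->execute();
    }
    fclose($handle);
    $stmt->close();
    $mysqli->close();
}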
There is an alternative way: if you have the right to run system or exec functions and you are allowed to run the mysql command, then you can use the MySQL LOAD DATA command: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Done.
I am using the following script to import data into my MySQL database from CSV files. The CSV is set up like this:
Name Postcode
fred hd435hg
bob dh345fj
Above is what it looks like in Excel; in raw CSV format viewed in Notepad it looks like this:
name,postcode
frank,ng435tj
The problem I am having is that for some reason the postcode column isn't getting imported at all. Also, the header row is getting imported as a record; is it possible to make it skip the first row? I have been through the code and can't see why the postcode is not being pulled in, it is very odd.
<?php
//database connect info here

//check for file upload
if (isset($_FILES['csv_file']) && is_uploaded_file($_FILES['csv_file']['tmp_name'])) {
    //upload directory
    $upload_dir = "./csv";
    //create file name
    $file_path = $upload_dir . $_FILES['csv_file']['name'];
    //move uploaded file to upload dir
    if (!move_uploaded_file($_FILES['csv_file']['tmp_name'], $file_path)) {
        //error moving upload file
        echo "Error moving file upload";
    }
    //open the csv file for reading
    $handle = fopen($file_path, 'r');
    while (($data = fgetcsv($handle, 1000, ',')) !== FALSE) {
        //Access field data in $data array ex.
        $name = $data[0];
        $postcode = $data[1];
        //Use data to insert into db
        $sql = sprintf("INSERT INTO test (name, postcode) VALUES ('%s',%d)",
            mysql_real_escape_string($name),
            $postcode
        );
        mysql_query($sql) or (mysql_query("ROLLBACK") and die(mysql_error() . " - $sql"));
    }
    //delete csv file
    unlink($file_path);
}
?>
Your CSV file actually seems to be a TSV file. It doesn't use commas but tabs to separate the fields.
Therefore you need to change the fgetcsv call. Instead of ',' use the tab:
while (($data = fgetcsv($handle, 1000, "\t") ...
And to also skip the header row, add another faux fgetcsv before the while block:
fgetcsv($handle);
while (...) {
That will skip the first line. (A simple fgets would also do.)
Oh, just noticed: the postcode might also get dropped because you concatenate it into the string as a decimal with the sprintf placeholder %d for $postcode. If that field actually contains letters, as in your example, then that won't work. (Though I presume that's just a wrong example printout.)
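For illustration, treating the postcode as a string and escaping it as well could look like this (a sketch based on the query in the question):
// Sketch: use %s for the postcode and escape it, instead of %d
$sql = sprintf("INSERT INTO test (name, postcode) VALUES ('%s','%s')",
    mysql_real_escape_string($name),
    mysql_real_escape_string($postcode)
);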
Try this one.
$file = $_FILES['file']['tmp_name'];
$handle = fopen($file, "r");
while (($fileop = fgetcsv($handle, 1000, ",")) !== false)
{
    $username = $fileop[0];
    $name = $fileop[1];
    $address = $fileop[2];
    $sql = mysql_query("INSERT INTO ...... ");
}
The following is my script to populate the employee table from a CSV file. The file is uploading and updating the employee table perfectly, BUT the problem is that I included a row with headings in the CSV file, and that heading row is also getting inserted into the table. I want only the data to be uploaded, not the heading row from the CSV file. Any help or ideas?
<?php
require_once '../config.php';

if (isset($_POST['upload']))
{
    $fname = $_FILES['sel_file']['name'];
    $chk_file = explode(".", $fname);

    if (strtolower($chk_file[1]) == 'csv')
    {
        $filename = $_FILES['sel_file']['tmp_name'];
        $handle = fopen($filename, "r");
        while (($data = fgetcsv($handle, 1000, ",")) != false)
        {
            $sql = "INSERT into employee(employee_code,employee_name,employee_address,emp_dateofjoin,emp_designation,emp_hq,pf_num,esic_num,emp_state,month,tot_work_days,lop_days,arrear_amt,leave_encash) values('$data[0]','$data[1]','$data[2]','$data[3]','$data[4]','$data[5]','$data[6]','$data[7]','$data[8]','$data[9]','$data[10]','$data[11]','$data[12]','$data[13]')";
            //$upd = "UPDATE student SET month='',tot_work_days='',lop_days='',arrear_amt='',leave_encash='' where employee_code=''";
            mysql_query($sql) or die(mysql_error());
        }
        fclose($handle);
        echo "Successfully Imported";
    }
    else
    {
        echo "Invalid File";
    }
}
?>
Skip the first line by calling fgetcsv once before your loop:
fgetcsv($handle,1000,",");
while(($data = fgetcsv($handle,1000,",")) != false)
You could also use the MySQL LOAD DATA INFILE statement with the IGNORE 1 LINES clause.
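A rough sketch of that statement, using the employee table and columns from the question (the file path is a placeholder, and whether you need LOCAL depends on your setup):
LOAD DATA LOCAL INFILE 'path/to/your.csv' INTO TABLE employee
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(employee_code, employee_name, employee_address, emp_dateofjoin, emp_designation, emp_hq, pf_num, esic_num, emp_state, month, tot_work_days, lop_days, arrear_amt, leave_encash);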