I insert data into a table as a bulk upload:
$handle = fopen($_FILES['file_clg']['tmp_name'], "r");
fgetcsv($handle); // skip the header row
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $collegename = trim(str_replace(array("’", "'"), "'", $data[0]));
    $description = trim(str_replace(array("’", "'"), "'", $data[1]));
    $sql1 = $db->selectquery("insert into $tbl(name,details)values('" . $collegename . "','" . $description . "')");
}
fclose($handle);
Only two fields are shown here; my bulk-upload CSV actually has more than 25 columns.
The problem is that the CSV delimiter is the comma (','), but in some cases the 'details' field itself contains commas, and those records are not inserted properly.
How can I solve this?
There is also a problem in the insertion step. For example, the college name:
Alva’s Institute of Engineering & Technology (AIET)
is saved in the table in this format:
Alva�s Institute of Engineering & Technology (AIET)
I tried the code below:
$collegename = htmlentities(iconv("cp1252", "utf-8", trim(str_replace(array("’","'"),"'",$data[0]))), ENT_IGNORE, "UTF-8");
but it's not working. How can I solve the issue with single quotes?
I also placed:
header("Content-Type: text/html; charset=ISO-8859-1");
in the header section.
I'd need to see some samples to say anything with confidence, but there are specs on quoting values that contain commas so that the field count is preserved:
https://stackoverflow.com/a/769675/2943403
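In fact, fgetcsv() already honors double-quote enclosures, so a details field containing commas survives as a single field as long as the file quotes it. A minimal sketch, assuming $handle is open on the uploaded file as in your question (the sample row below is invented for illustration):

// sample row (hypothetical): "Alva's Institute","A long description, with, commas"
while (($data = fgetcsv($handle, 1000, ",", '"')) !== FALSE) {
    // $data[1] keeps its embedded commas because the field was quoted
    var_dump($data);
}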
Alternatively, you could create a new fputcsv() code block that generates a CSV file delimited by a semicolon (or another non-conflicting character). There are many available resources on SO covering this, including:
php fputcsv use semicolon separator in CSV
Export to CSV via PHP
PHP How to convert array into csv using fputcsv function
Then your while loop could use ; (or whatever you chose) as its delimiter. A minimal sketch of the idea follows.
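This is an untested sketch, assuming the CSV is generated by your own PHP code from an array of rows; $rows and the filename are placeholders:

// write the file with a semicolon delimiter instead of a comma
$out = fopen('colleges.csv', 'w'); // placeholder filename
foreach ($rows as $row) {          // $rows: hypothetical array of array(name, details)
    fputcsv($out, $row, ';');      // fputcsv also quotes fields for you
}
fclose($out);

// ...and read it back with the matching delimiter
while (($data = fgetcsv($handle, 1000, ';')) !== FALSE) {
    // ...
}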
As for safely placing your values (which may contain single quotes) into your query, use prepared statements:
// I realize your query will be much larger than the sample...
// declare $collegename and $description values
$stmt = $db->prepare("INSERT INTO `$tbl` (`name`,`details`) VALUES (?,?)");
$stmt->bind_param("ss", $collegename, $description);
if ($stmt->execute()) {
    echo "success";
} else {
    echo "Error: ", $db->error;
}
Related
Multiple CSV files have consistent 26-column headers. I am using plain PHP to insert the data into MySQL. The data may contain more than 100k rows.
Here is my sample CSV file.
Following is my code:
while (($line = fgetcsv($csvFile)) !== FALSE) {
    // get the data from the csv
    $account_code  = $line[0];
    $caller_number = $line[1];
    $callee_number = $line[2];
    .
    .
    .
    $action_type       = $line[23];
    $source_trunk_name = $line[24];
    $dest_trunk_name   = $line[25];
    $query = $db->query("INSERT INTO `cdrnew`
        (`account_code`, `caller_number`, `callee_number`, ..........., `action_type`, `source_trunk_name`, `dest_trunk_name`)
        VALUES ('" . $account_code . "','" . $caller_number . "','" . $callee_number . "', ............,'" . $action_type . "','" . $source_trunk_name . "','" . $dest_trunk_name . "')");
}
Is this the right approach to insert data from CSV into MySQL when I have consistent 26-column headers across all the CSV files, or is there a better approach?
I would suggest using a parameterised and bound query. The benefit, beyond the obvious one of removing the possibility of SQL injection, is that you compile the query once but run it many times with new parameters each time.
This saves the database from doing thousands of unnecessary query compilations when you insert thousands of rows.
First, move the query outside the loop and prepare it there:
$stmt = $db->prepare("INSERT INTO `cdrnew`
    (`account_code`, `caller_number`, `callee_number`,
    .,.,.,.,.,.,
    `action_type`, `source_trunk_name`, `dest_trunk_name`)
    VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)");
while (($line = fgetcsv($csvFile)) !== FALSE) {
    $stmt->bind_param('ssssssssssssssssssssssssss',
        $line[0], $line[1], $line[2], $line[3], $line[4],
        $line[5], $line[6], $line[7], $line[8], $line[9],
        $line[10], $line[11], $line[12], $line[13], $line[14],
        $line[15], $line[16], $line[17], $line[18], $line[19],
        $line[20], $line[21], $line[22], $line[23], $line[24],
        $line[25]
    );
    $stmt->execute();
}
NOTE: You may want to add some error checking to this base layout
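For instance (a sketch, assuming the mysqli driver that the snippet above uses), you can put mysqli into exception mode so failed prepares and executes no longer fail silently:

// make mysqli throw mysqli_sql_exception on any error
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

try {
    $stmt->execute();
} catch (mysqli_sql_exception $e) {
    // log the bad row and decide whether to skip or abort
    error_log('Row insert failed: ' . $e->getMessage());
}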
I have a problem: I'm trying to export some data from a database into a .csv file.
$fn = fopen($path . $filename, "w");
$addstring = file_get_contents($path . $filename);
$addstring = 'Azonosito;Datum;Ido;Leiras;IP-cim;allomasnév;MAC-cim;Felhasznalonev;Tranzakcioazonosito;Lekerdezes eredmenye;Vizsgalat ideje;Korrelacios azonosito;DHCID;';
/*$addstring .= "\n";*/
$sql = "select * from dhcpertekeles.dhcpk";
$result = mysqli_query($conn, $sql);
if ($result = mysqli_query($conn, $sql)) {
    while ($row = mysqli_fetch_row($result)) {
        $addstring .= "\n" . $row[0] . ";" . $row[1] . ";" . $row[2] . ";" . $row[3] . ";" . $row[4] . ";" . $row[5] . ";" . $row[6] . ";" . $row[7] . ";" . $row[8] . ";" . $row[9] . ";" . $row[10] . ";" . $row[11] . ";" . $row[12] . ";";
    }
}
/*file_put_contents($path.$filename, $addstring);*/
fwrite($fn, $addstring);
fclose($fn);
The data is in the following format:
The first $addstring contains the column names and has no issues.
The second ($addstring .=) contains the data:
ID ($row[0]), Date ($row[1]), Time ($row[2]), Description ($row[3]), IP ($row[4]), Computer name ($row[5]), MAC ($row[6]), User ($row[7], empty), Transaction ID ($row[8]), Query result ($row[9]), Query time ($row[10]), Correlation ID ($row[11], empty), DHCID ($row[12], empty)
It is basically daily DHCP server data uploaded to a database. The code works and writes everything I want to the CSV, but there are two problems.
1. For some inexplicable reason, the code inserts an empty row between the data rows in the CSV. Removing $row[12] fixes this. I tried removing special characters, converting spaces into something visible, and even converting empty strings into something visible, yet nothing worked. I also tried file_put_contents (same for the second problem) instead of fwrite, but the same thing keeps happening. If I remove the \n it works, but from the second row onwards everything is shifted one column to the right.
2. For some reason, the last two characters are missing from the CSV. The string to be written still contains those two characters just before it is written to the file. I tried both fwrite and file_put_contents.
As for the .csv format, the data columns are separated by ; and the rows by \n.
I also tried opening the file with both LibreOffice and Excel, thinking Excel might be the one misbehaving, but no.
Try using the fputcsv() function. I didn't test the following code, but I think it should work.
$file = fopen($path . $filename, 'w');

$header = array(
    'Azonosito',
    'Datum',
    'Ido',
    'Leiras',
    'IP-cim',
    'allomasnév',
    'MAC-cim',
    'Felhasznalonev',
    'Tranzakcioazonosito',
    'Lekerdezes eredmenye',
    'Vizsgalat ideje',
    'Korrelacios azonosito',
    'DHCID'
);
fputcsv($file, $header, ';');

$sql = "select * from dhcpertekeles.dhcpk";
if ($result = mysqli_query($conn, $sql)) {
    while ($row = mysqli_fetch_row($result)) {
        fputcsv($file, $row, ';');
    }
}
fclose($file);
The $addstring = file_get_contents($path.$filename) line does nothing, because you overwrite that variable on the next line.
To remove the extra row caused by $row[12], did you try stripping both the \n AND the \r, with something like:
$row[12] = strtr($row[12], array("\n"=>'', "\r"=>''));
You can also check exactly which ASCII characters you are receiving in $row[12] with this function, taken from the PHP site:
function AsciiToInt($char) {
    $success = "";
    if (strlen($char) == 1) {
        return "char(" . ord($char) . ")";
    } else {
        for ($i = 0; $i < strlen($char); $i++) {
            if ($i == strlen($char) - 1) {
                $success = $success . ord($char[$i]);
            } else {
                $success = $success . ord($char[$i]) . ",";
            }
        }
        return "char(" . $success . ")";
    }
}
Another possibility is that the database is returning UTF-8 or UTF-16 and you're losing some characters in the text file.
Try checking that with the mb_detect_encoding() function.
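For example (a quick sketch; the candidate encoding list is just a guess):

// returns the first encoding from the list that matches, or false
$enc = mb_detect_encoding($row[12], array('UTF-8', 'ISO-8859-1', 'UTF-16'), true);
var_dump($enc);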
This is a very strange issue, and I don't understand what's causing it.
Basically, I have a simple upload function in my PHP that uploads a CSV file and then imports each row into the MySQL database.
The issue is that my CSV file has around 200+ rows, but when I upload and import it using my PHP page, only around 158 of them are imported, and I don't get any errors at all, so I don't understand what's causing this.
I have another CSV file with around 300+ rows, and when I upload/import it, around 270 rows end up in MySQL.
It's as if the import is always short a few rows, and I don't understand it at all.
This is my PHP import code:
error_reporting(-1);
ini_set('display_errors', 'On');

if (isset($_POST['UP'])) {
    include "config/connect.php";
    $imp = $_FILES["csv"]["name"];
    move_uploaded_file($_FILES["csv"]["tmp_name"], "imports/$imp");

    // path where your CSV file is located
    define('CSV_PATH', '');

    // name of your CSV file
    $csv_file = CSV_PATH . "imports/" . $imp;

    $i = 0;
    set_time_limit(10000);

    $fp = fopen("imports/" . $imp, "r");
    while (!feof($fp)) {
        if (!$line = fgetcsv($fp, 1000, ',', '"')) {
            continue;
        }
        $sql0 = "INSERT INTO `myTable`(`column1`) VALUES('" . $line[0] . "')";
        $query0 = mysqli_query($db_conx, $sql0);
    }
    fclose($fp);

    printf("<script>location.href='mypage.php'</script>");
    exit();
}
Using a direct import into MySQL is out of the question due to security issues.
Could someone please advise on this issue?
Any help would be appreciated.
To get around quoting issues, you want to use prepared statements with bind_param.
Procedural style:
$stmt = mysqli_prepare($db_conx, "INSERT INTO `myTable`(`column1`) VALUES(?)");
mysqli_stmt_bind_param($stmt, 's', $line[0] );
mysqli_stmt_execute($stmt);
Object-oriented style:
$stmt = $mysqli->prepare("INSERT INTO `myTable`(`column1`) VALUES(?)");
$stmt->bind_param('s', $line[0]);
$stmt->execute();
Per the docs, use s for strings, i for integers, and d for doubles/decimals.
So I'm trying to make it so that I can update a MySQL database by importing a CSV file. The only problem is that some of my data contains commas, which causes values to land in the wrong columns. Here's my existing import code.
if ($_FILES['csv']['size'] > 0) {
    // get the csv file
    $file = $_FILES['csv']['tmp_name'];
    $handle = fopen($file, "r");

    // loop through the csv file and insert into database
    do {
        if ($data[0]) {
            mysql_query("INSERT INTO songdb (artist, title) VALUES
                (
                    '" . addslashes($data[0]) . "',
                    '" . addslashes($data[1]) . "'
                )
            ") or die(mysql_error());
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));

    // redirect
    header('Location: import.php?success=1');
    die;
}
Is there a way I can set it to ignore the commas, quotes, and apostrophes in the CSV file?
I would also like to set it to ignore the first line of the CSV, since it's just column information, if that is at all possible.
** EDIT **
For example, the CSV contains data such as "last name, first name" or "User's Data"; these are literally just examples of the kind of data that's actually in there. The data is imported each month, and we've only just noticed this issue.
Sample Data:
Column 1, Column 2
Item 1, Description
Item 2, Description
Item, 3, Description
Item, 4, Description
"Item 5", Description
"Item, 6", Description
Above is the sample data that was requested.
You might want to use MySQL's built-in LOAD DATA INFILE statement, which will not only work faster but will also let you use the clause FIELDS OPTIONALLY ENCLOSED BY '"' to handle that kind of file.
So your query would be something like this:
mysql_query(<<<SQL
LOAD DATA LOCAL INFILE '{$_FILES['csv']['tmp_name']}'
INTO TABLE songdb
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES
(artist, title)
SQL
) or die(mysql_error());
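Note that LOAD DATA LOCAL INFILE only works when local_infile is enabled on both the client and the server; if it is disabled, the statement fails with an error.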
If your data is dirty, the easiest way to handle this will be to clean it up manually, and either use data-entry forms that strip out bad characters and/or escape the input data, or tell the users who generate this data to stop putting commas in fields.
Your example has an inconsistent column count and inconsistent fields due to a lack of escaping in whatever was used to generate this data.
That said, you could apply some advanced logic that ignores any comma after Item but before a space or digit, using regular expressions (a rough sketch follows), but that is getting kind of ridiculous, and depending on the number of rows it may be easier to clean the data up manually before importing.
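An untested sketch of that regex idea, tailored to the sample rows above (real data would need a more careful pattern):

// turn "Item, 3, Description" into "Item 3, Description"
// so the line splits back into exactly two fields
$line = preg_replace('/^(Item),\s*(\d)/', '$1 $2', $line);
$data = str_getcsv($line);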
In terms of skipping the header row, you can do this:
if ($_FILES['csv']['size'] > 0) {
    // get the csv file
    $file = $_FILES['csv']['tmp_name'];
    $handle = fopen($file, "r");
    $firstRow = true;

    // loop through the csv file and insert into database
    do {
        if ($data[0]) {
            // skip header row
            if ($firstRow) {
                $firstRow = false;
                continue;
            }
            mysql_query("INSERT INTO songdb (artist, title) VALUES
                (
                    '" . addslashes($data[0]) . "',
                    '" . addslashes($data[1]) . "'
                )
            ") or die(mysql_error());
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));

    // redirect
    header('Location: import.php?success=1');
    die;
}
Oh, I just read your comment: 5 GB. Wow. Manual cleanup is not an option. You need to look at the range of ways the data is screwed up and really assess what logic you need to use to capture the right columns.
Is your example above a representative sample, or could other fields without enclosures also contain commas?
Try this; it is working fine for me.
ini_set('auto_detect_line_endings', TRUE);
$csv_data = array();
$file_handle = fopen($_FILES['file_name']['tmp_name'], 'r');
while (($data = fgetcsv($file_handle)) !== FALSE) {
    $update_data = array(
        'first'  => $data[0],
        'second' => $data[1],
        'third'  => $data[2],
        'fourth' => $data[3]
    );
    // save this array in your database
}
I would like to create an upload page in PHP and import the uploaded CSV file's data into multiple tables. I tried searching here, but it looks like I can't find anything that imports from one CSV into multiple tables. Any help here is greatly appreciated. Thank you.
As another variant to those proposed above, you can read your CSV line by line and explode each line into fields; each field then corresponds to one variable.
$handle = fopen("/my/file.csv", "r"); // opening CSV file for reading
if ($handle) { // if file successfully opened
while (($CSVrecord = fgets($handle, 4096)) !== false) { // iterating through each line of our CSV
list($field1, $field2, $field3, $field4) = explode(',', $CSVrecord); // exploding CSV record (line) to the variables (fields)
// and here you can easily compose SQL queries and map you data to the tables you need using simple variables
}
fclose($handle); // closing file handler
}
If you have access to phpMyAdmin, you can upload the CSV there, then copy it over to each desired table.
In response to your comment that some data goes to one table and other data goes to another table, here is a simple example.
Table1 has three fields: name, age and sex. Table2 has two fields: haircolour and shoesize. So your CSV could be laid out like:
john smith,32,m,blonde,11
jane doe,29,f,red,4
anders anderson,56,m,grey,9
For the next step you will use the function fgetcsv. It breaks each line of the CSV into an array that you can then use to build your SQL statements:
if (($handle = fopen($mycsvfile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // this loops through each line of your csv, putting the values into array elements
        $sql1 = "INSERT INTO table1 (`name`, `age`, `sex`) VALUES ('" . $data[0] . "', '" . $data[1] . "', '" . $data[2] . "')";
        $sql2 = "INSERT INTO table2 (`haircolour`, `shoesize`) VALUES ('" . $data[3] . "', '" . $data[4] . "')";
    }
    fclose($handle);
}
Please note that this does not take any SQL security, such as validation, into account, but that is basically how it will work.
The problem, it seems to me, is differentiating which field is for which table.
If you send a header line like
table.field, table.field, table.field
and then split the header, you'll get all the tables and fields.
Could that be a way to go?
All the best.
PS: Regarding your comment...
A CSV file has (or can have) a first line with field names in it. When there is a need to copy CSV data into more than one table, you can use a workaround to find out which field is for which table:
user.username, user.lastname, blog.comment, blog.title
"sam" , "Manson" , "this is a comment", "and I am a title"
Now, when reading the CSV data, you can process the first line and split each title at the dot to find out which tables and fields are used.
With this method you can copy CSV data into more than one table.
But it means you have to code it first :(
To split the field names:
// only the first line holds the field names
$topfields = preg_split('/,|;|\t/', $firstLine); // $firstLine: the CSV's header line
foreach ($topfields as $t => $f) {
    list($table, $field) = explode('.', trim($f)); // e.g. "user.username" -> table "user", field "username"
}
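Putting it together, a minimal untested sketch of the whole workaround (table and field names follow the example header above; $mycsvfile comes from the earlier answer, and the final query building is left as a comment since you would want escaping or prepared statements there):

$handle = fopen($mycsvfile, "r");

// read the header line and note which table each column belongs to
$headerFields = fgetcsv($handle);
$columns = array(); // e.g. [0 => array('user', 'username'), 1 => array('user', 'lastname'), ...]
foreach ($headerFields as $i => $name) {
    $columns[$i] = explode('.', trim($name));
}

// group each data row's values per table
while (($data = fgetcsv($handle)) !== FALSE) {
    $perTable = array();
    foreach ($data as $i => $value) {
        list($table, $field) = $columns[$i];
        $perTable[$table][$field] = trim($value);
    }
    // $perTable now holds e.g. array('user' => array(...), 'blog' => array(...))
    // build and run one INSERT per table from it (with escaping / prepared statements)
}
fclose($handle);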
In the fgetcsv example above, you build two insert queries ($sql1 and $sql2); how are you going to run these queries?