populating database from csv file using php

How would I go about populating a database from the info in a CSV file using PHP? I need to practice using PHP to make database calls, but at the moment all I have access to is this CSV file...

Design Considerations:
You probably don't want to load the entire file into memory at once using a function like file_get_contents. With large files this will eat up all of your available memory and cause problems. Instead, do as Adam suggested and read one line at a time.
fgetcsv in the PHP manual
// Here's how you would start your database connection
mysql_connect($serverName, $username, $password);
mysql_select_db('yourDBName');

// Open the file as read-only
$file = fopen("file.csv", "r");

// A line length of 0 means "unlimited"; fields are comma-delimited
while ($data = fgetcsv($file, 0, ",")) {
    // You should sanitize your inputs first, using a function like addslashes
    $success = mysql_query("INSERT INTO fileTable VALUES(" . $data[0] . "," . $data[1] . ")");
    if (!$success) {
        throw new Exception('failed to insert!');
    }
}
fclose($file);
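As a side note, the old mysql_* extension used above was removed in PHP 7. Purely as a rough sketch (same placeholder table, file name, and credentials), the equivalent loop with mysqli prepared statements might look like this:
// Rough sketch only: the same read-and-insert loop using mysqli prepared statements.
// $serverName, $username, $password and the fileTable layout are placeholders.
$db = new mysqli($serverName, $username, $password, 'yourDBName');

// Prepare the statement once, then execute it for every CSV row
$stmt = $db->prepare("INSERT INTO fileTable VALUES (?, ?)");

$file = fopen("file.csv", "r");
while (($data = fgetcsv($file, 0, ",")) !== false) {
    $stmt->bind_param("ss", $data[0], $data[1]);
    if (!$stmt->execute()) {
        throw new Exception('failed to insert!');
    }
}
fclose($file);
$stmt->close();
$db->close();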

Just do it through phpMyAdmin: http://vegdave.wordpress.com/2007/05/19/import-a-csv-file-to-mysql-via-phpmyadmin/

Use the built-in PHP functions to read the CSV and write an output file. Then you can import the SQL into your database. This should work with any type of database.
Don't forget to escape any strings you are using. I used sqlite_escape_string() for that purpose in this example.
$fd = fopen("mydata.csv", "r");
$fdout = fopen("importscript.sql", "w");

while (!feof($fd)) {
    $line = fgetcsv($fd, 1024); // Read a line of CSV
    if ($line === false) {
        continue; // Skip the empty read fgetcsv returns at end of file
    }
    fwrite($fdout, 'INSERT INTO mytable (id, name) '
        . 'VALUES (' . intval($line[0]) . ", '" . sqlite_escape_string($line[1]) . "');\r\n");
}

fclose($fdout);
fclose($fd);


csv file with delimiter other than comma in php

I insert data into a table as a bulk upload:
$handle = fopen($_FILES['file_clg']['tmp_name'], "r");
fgetcsv($handle); // read and discard the first (header) row
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $collegename = trim(str_replace(array("’", "'"), "'", $data[0]));
    $description = trim(str_replace(array("’", "'"), "'", $data[1]));
    $sql1 = $db->selectquery("insert into $tbl(name,details) values('" . $collegename . "','" . $description . "')");
}
fclose($handle);
Only two fields are shown here; my bulk-upload CSV has more than 25 columns.
The problem is that the CSV delimiter is the comma (','), but in some cases the 'details' field contents include commas, and in those cases the record is not inserted properly.
How can I solve this?
There is also a problem in the insertion section.
College name: Alva’s Institute of Engineering & Technology (AIET)
and it is saved in the table in the format below:
Alva�s Institute of Engineering & Technology (AIET)
I tried the code below:
$collegename = htmlentities(iconv("cp1252", "utf-8", trim(str_replace(array("’","'"),"'",$data[0]))), ENT_IGNORE, "UTF-8");
but it's not working. How can I solve the issue with the single quotes?
I also placed header("Content-Type: text/html; charset=ISO-8859-1"); in the header section.
I'd need to see some samples to say anything with confidence, but there are specs on quoting values with commas to preserve the value count.
https://stackoverflow.com/a/769675/2943403
Alternatively, you could create a new fputcsv() code block that will generate a semi-colon (or other non-conflicting character) delimited csv file. I am not providing the actual snippet for this. There are many available resources on SO, including:
php fputcsv use semicolon separator in CSV
Export to CSV via PHP
PHP How to convert array into csv using fputcsv function
Then your while loop could use ; (or whatever) as a delimiter.
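Purely for illustration (the file names here are made up, not from your code), such a conversion pass could look something like this:
// Illustration only: rewrite the comma-delimited upload as a semicolon-delimited copy.
$in  = fopen("input.csv", "r");
$out = fopen("input_semicolon.csv", "w");
while (($row = fgetcsv($in, 0, ",")) !== false) {
    fputcsv($out, $row, ";");
}
fclose($in);
fclose($out);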
As for safely placing your values (which may have single quotes) into your query, use prepared statements
// I realize your query will be much larger than the sample...
// declare the $collegename and $description values first
$stmt = $db->prepare("INSERT INTO `$tbl` (`name`, `details`) VALUES (?, ?)");
$stmt->bind_param("ss", $collegename, $description);
if ($stmt->execute()) {
    echo "success";
} else {
    echo "Error: ", $db->error;
}

PHP: CSV import into MYSQL is always less than the actual amount of rows in the CSV file?

This is a very strange issue and I don't understand what's causing it.
Basically, I have a simple upload function in my PHP that uploads a CSV file and then imports each row into the MySQL database.
The issue is that I have around 200+ rows in my CSV file, but when I upload and import it using my PHP page, only around 158 of them are imported, and I don't get any errors at all, so I don't understand what's causing this.
I have another CSV file with around 300+ rows, and when I upload/import that one, I get around 270 rows imported into MySQL.
It's as if the import is always short a few rows, and I don't understand it at all.
This is my PHP import code:
error_reporting(-1);
ini_set('display_errors', 'On');

if (isset($_POST['UP'])) {
    include "config/connect.php";
    $imp = $_FILES["csv"]["name"];
    move_uploaded_file($_FILES["csv"]["tmp_name"], "imports/$imp");

    // path where your CSV file is located
    define('CSV_PATH', '');
    // Name of your CSV file
    $csv_file = CSV_PATH . "imports/" . $imp;

    $i = 0;
    set_time_limit(10000);

    $fp = fopen("imports/" . $imp, "r");
    while (!feof($fp)) {
        if (!$line = fgetcsv($fp, 1000, ',', '"')) {
            continue;
        }
        $sql0 = "INSERT INTO `myTable`(`column1`) VALUES('" . $line[0] . "')";
        $query0 = mysqli_query($db_conx, $sql0);
    }
    fclose($fp);

    printf("<script>location.href='mypage.php'</script>");
    exit();
}
Using a direct import into MySQL is out of the question due to security issues.
Could someone please advise on this issue?
Any help would be appreciated.
To get around quoting issues, you want to use prepared statements with bind_param.
Procedural style:
$stmt = mysqli_prepare($db_conx, "INSERT INTO `myTable`(`column1`) VALUES(?)");
mysqli_stmt_bind_param($stmt, 's', $line[0] );
mysqli_stmt_execute($stmt);
Object-oriented style:
$stmt = $mysqli->prepare("INSERT INTO `myTable`(`column1`) VALUES(?)");
$stmt->bind_param('s', $line[0]);
$stmt->execute();
Per the docs, use s for strings, i for integers, and d for doubles/decimals.
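Tying that into the CSV loop from your code (table, column, and variable names taken from your question), you would typically prepare the statement once and execute it per row, roughly like this:
// Sketch only: prepare once, then bind/execute for each CSV row.
$stmt = mysqli_prepare($db_conx, "INSERT INTO `myTable`(`column1`) VALUES(?)");

$fp = fopen("imports/" . $imp, "r");
while (($line = fgetcsv($fp, 1000, ',', '"')) !== false) {
    mysqli_stmt_bind_param($stmt, 's', $line[0]);
    mysqli_stmt_execute($stmt);
}
fclose($fp);
mysqli_stmt_close($stmt);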

csv file upload in php inserts only first row in database

I want to upload a CSV file into a MySQL database using PHP. I am using:
$handle = fopen($_FILES['filename']['tmp_name'], "r");
while (($data = fgetcsv($handle, 1000, ',', '"')) !== FALSE) {
    // insert into database...
}
fclose($handle);
But it inserts only the first row of the file.
EDIT:
When I print_r($data) within the while loop, it also gives me only one row.

small glitch while generating csv with newline character in php

I am simply generating a CSV file based on data stored in a MySQL table. The generated CSV, when opened in Excel, seems mostly OK, but whenever the data contains a newline character, Excel puts the rest of it on a new row. Any idea how to prevent that?
Sample data
line 1 some data
another data
CSV generation code:
header("Content-Type: text/csv; charset=UTF-8");
header("Content-Disposition: attachment; filename=\"".$MyFileName."\"");
$filename = $MyFileName;
$handle = fopen("temp_files/".$filename, "r");
$contents = fread($handle, filesize("temp_files/".$filename));
fclose($handle);
echo $contents;
exit;
The snippet I used to try to get rid of the newlines (it didn't work):
$pack_inst = str_replace(',',' ',$get_data->fields['pack_instruction']);
$pack_inst = str_replace('\n',' ',$pack_inst);
$pack_inst = str_replace('\r',' ',$pack_inst);
$pack_inst = str_replace('\r\n',' ',$pack_inst);
$pack_inst = str_replace('<br>',' ',$pack_inst);
$pack_inst = str_replace('<br/>',' ',$pack_inst);
$pack_inst = str_replace(PHP_EOL, '', $pack_inst);
$pattern = '(?:[ \t\n\r\x0B\x00\x{A0}\x{AD}\x{2000}-\x{200F}\x{201F}\x{202F}\x{3000}\x{FEFF}]| |<br\s*\/?>)+';
$pack_inst = preg_replace('/^' . $pattern . '|' . $pattern . '$/u', ' ', $pack_inst);
$content .=','.$pack_inst;
According to RFC 4180, if a column's content contains the row delimiter (\r\n), the column delimiter (,) or the string delimiter (") then you must enclose the content inside double quotes ". When you do that, you must escape all " characters inside the content by preceding them with another ". So the following CSV content:
1: OK,2: this "might" work but not recommended,"3: new
line","4: comma, and text","5: new
line and ""double"" double quotes"
1: Line 2
Will produce 2 rows of CSV data, first one containing 5 columns.
Having said that, have a look at the fputcsv() function. It will handle most of the gory details for you.
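As a quick illustration (made-up values, written to a temporary stream), fputcsv() applies the quoting rules described above automatically:
// Illustration only: fputcsv() quotes fields containing commas, newlines or
// double quotes, and doubles any embedded quote characters.
$fh = fopen('php://temp', 'w+');
fputcsv($fh, array('plain', 'comma, inside', "new\nline", 'has "quotes"'));
rewind($fh);
echo stream_get_contents($fh);
// plain,"comma, inside","new
// line","has ""quotes"""
fclose($fh);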
What you show is not the CSV generation code, it is simply the code that you have used to force a download to the browser. Regardless, the function that you need to sort this out is fputcsv(), which will automatically consider all sorts of edge cases that any code you write to convert tabular data to CSV format will likely not consider.
You say you are basing this on data in MySQL table, here is a basic framework for creating the CSV file, assuming the MySQLi extension used in a procedural manner:
<?php
// Connect to database and generate file name here
$fileName = 'file.csv';

// Get the data from the database
$query = "
    SELECT *
    FROM table_name
    WHERE some_column = 'Some Value'
    ORDER BY column_name
";
if (!$result = mysqli_query($db, $query)) {
    // The query failed
    // You may want to handle this with a more meaningful error message
    header('HTTP/1.1 500 Internal Server Error');
    exit;
} else if (!mysqli_num_rows($result)) {
    // The query returned no results
    // You may want to handle this with a more meaningful error message
    header('HTTP/1.1 404 Not Found');
    exit;
}

// Create a temporary file pointer for storing the CSV file
$tmpFP = fopen('php://temp', 'w+');

// We'll keep track of how much data we write to the file
$fileLength = 0;

// Create a column heading row and write the first data row to the file
$firstRow = mysqli_fetch_assoc($result);
$fileLength += fputcsv($tmpFP, array_keys($firstRow));
$fileLength += fputcsv($tmpFP, array_values($firstRow));

// Write the rest of the rows to the file
while ($row = mysqli_fetch_row($result)) {
    $fileLength += fputcsv($tmpFP, $row);
}

// Send the download headers
header('Content-Type: text/csv; charset=UTF-8');
header('Content-Disposition: attachment; filename="' . $fileName . '"');
header('Content-Length: ' . $fileLength);

// Free some unnecessary memory we are using
// The data might take a while to transfer to the client
mysqli_free_result($result);
unset($query, $result, $firstRow, $row, $fileName, $fileLength);

// Prevent timeouts on slow networks/large files
set_time_limit(0);

// Place the file pointer back at the beginning
rewind($tmpFP);

// Serve the file download
fpassthru($tmpFP);

// Close the file pointer
fclose($tmpFP);

// ...and we're done
exit;

Outputting Query to Textfile

I have this code (which, thanks to the users of Stack Overflow, has the markup I needed :) ). However, I have come to a road that I have no knowledge of whatsoever. I need to output this formatted table of the query to a text file on the server.
<?php
// Make a MySQL Connection
mysql_connect("hostname.net", "user", "pass") or die(mysql_error());
mysql_select_db("database") or die(mysql_error());

// Get all the data from the "example" table
$result = mysql_query("SELECT * FROM cards ORDER BY card_id")
    or die(mysql_error());

echo "<table>";
echo "<tr><th>Name</th><th>Age</th><th>Title</th><th>bar</th></tr>";

// keeps getting the next row until there are no more to get
while ($row = mysql_fetch_array($result)) {
    // Print out the contents of each row into a table
    echo "<tr><td>";
    echo $row['card_id'];
    echo "</td><td>";
    echo $row['title'];
    echo "</td><td>";
    echo $row['item_bar'];
    echo "</td></tr>";
}
echo "</table>";
?>
I know I could use something similar to
<?php
$myFile = "test.txt";
$fh = fopen($myFile, 'w') or die("can't open file");
$stringData = "Bobby Bopper\n";
fwrite($fh, $stringData);
fclose($fh);
?>
but I am sure that it can't be the best solution. So I guess my question is: does anyone know how to achieve this?
The nicest solution, particularly if you are short on memory, would be to put the writing into the loop:
$fh = fopen('cards.csv', 'w');
// keeps getting the next row until there are no more to get
while ($row = mysql_fetch_array($result)) {
    fputcsv($fh, array($row['card_id'], $row['title'], $row['item_bar']), "\t");
}
fclose($fh);
Note that I have used fputcsv to output the data in CSV format (using a tab as the delimiter). This should be easy to read by hand, and will also be easily understood by, for instance, a spreadsheet program. If you preferred a custom format, you should use fwrite as in your question.
Have a look at:
http://dev.mysql.com/doc/refman/5.0/en/select.html
especially:
[INTO OUTFILE 'file_name' export_options
| INTO DUMPFILE 'file_name'
| INTO var_name [, var_name]]
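As a rough sketch against the cards table from the question (the output path here is an assumption; the file is written on the database server, the target path must not already exist, and the MySQL account needs the FILE privilege):
// Sketch only: have MySQL write the CSV itself with SELECT ... INTO OUTFILE.
$sql = "SELECT card_id, title, item_bar
        INTO OUTFILE '/tmp/cards.csv'
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        FROM cards";
mysql_query($sql) or die(mysql_error());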
Something like this?
// Make sure the file exists, do some checks.
// Loop through the result set and append each row to the file.
while (false !== ($aRow = mysql_fetch_assoc($rResult))) {
    $sCreateString = $aRow['field1'] . ';' . $aRow['field2'];
    file_put_contents('example.txt', $sCreateString . PHP_EOL, FILE_APPEND);
}
// Done
If you need an exact dump of the database table, there are better options (much better, actually), such as mysqldump or the SELECT ... INTO OUTFILE approach shown above.
If you want to write a text file, then there's nothing wrong with what you've suggested.
There's lots of information in the PHP manual: http://www.php.net/manual/en/ref.filesystem.php
It depends entirely on how you want to store the data inside the file. You can create your own flat-file database format if you wish, and extract the data using PHP once you've read in the file.
