I have to import a CSV file into a MySQL database.
I can't use LOAD DATA INFILE because it's disabled on the webserver.
Are there any other ways to do this?
If you have a scripting language available, you can loop through the CSV lines and have it generate SQL code:
PHP example:
<?php
$lines = file('file.csv');
foreach ($lines as $line) {
    // separates each cell by the delimiter "," (watch for delimiters inside cells, escaped or not)
    $cell = explode(",", $line);
    $sql = "INSERT INTO table (col1,col2,col3) VALUES (";
    $sql .= "'".$cell[0]."','".$cell[1]."','".$cell[2]."');";
    echo $sql;
}
?>
Loop through the file and insert using a prepared query. Prepared queries should be quicker too, since the DB doesn't have to recompile every SQL string you send it. That will be more noticeable when you have thousands and thousands of lines.
<?php
// assume $db is a PDO connection
$stmt = $db->prepare('INSERT INTO table (col1, col2, col3) VALUES (?, ?, ?)');
// read file contents to an array
$lines = file('file.csv');
// insert each line
foreach ($lines as $line) {
    // see the manual to specify $delimiter, $enclosure, or $escape
    $cols = str_getcsv($line);
    $stmt->execute($cols);
}
That'll work. Since we're using file(), the script can consume a lot of memory if your CSV file is HUGE. To make better use of resources, do the following to keep only one line in memory at a time:
<?php
// assume $db is a PDO connection
$stmt = $db->prepare('INSERT INTO table (col1, col2, col3) VALUES (?, ?, ?)');
$handle = fopen('test.csv', 'r');
while (($cols = fgetcsv($handle)) !== false) {
    $stmt->execute($cols);
}
fclose($handle);
I am currently trying to use the multi-valued INSERT queries with SQLite3 and PDO.
I did some research and found that before SQLite version 3.7.11 the multi-valued INSERT syntax was not supported. This has since (2012) changed.
My phpinfo() is informing me that:
PDO Driver for SQLite 3.x enabled
SQLite Library 3.7.7.1
Regardless of that, PDO doesn't seem to support these kinds of INSERT queries with SQLite3.
My question is whether there is any workaround for this issue. I am working on an application that is compatible with both SQLite3 and MySQL. Seeing that both of them support multi-valued inserts, I would hate to use two kinds of query and INSERT logic only because PDO is not up to date.
Some edits - adding code specifics:
Opening the DB connection:
public function useSQLite3($file)
{
    $dsn = "sqlite:$file";
    $this->dbService = new PDO($dsn);
    $this->dbService->query('PRAGMA journal_mode=WAL;');
    $this->dbService->setAttribute(PDO::ATTR_DEFAULT_FETCH_MODE, PDO::FETCH_ASSOC);
}
Method that handles the bulk insert to the DB:
public function bulkInsertLink(array $links)
{
    $insertRows = array();
    $placeholders = array();
    $j = 0;
    $i = 0;
    foreach ($links as $linkData) {
        $placeholders[$j] = '(';
        foreach ($linkData as $columnData) {
            $placeholders[$j] .= '?,';
            $insertRows[$i] = $columnData;
            $i++;
        }
        $placeholders[$j] = rtrim($placeholders[$j], ',');
        $placeholders[$j] .= ')';
        $j++;
    }
    $query = 'INSERT INTO links (status, date, lang, group_ID, group_link_ID, link, sitemap_link) VALUES ';
    $query .= implode(',', $placeholders);
    $preparedQuery = $this->dbService->prepare($query);
    $preparedQuery->execute($insertRows);
}
$links is an array, where each element represents the information for one row to be inserted.
Using PDO you can do multi values inserts like this:
$statement = $pdo->prepare('INSERT INTO t VALUES (?, ?), (?, ?)');
$statement->execute([1, 2, 3, 4]);
But I'd personally prepare a single insert statement and execute it multiple times with different parameters.
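A minimal sketch of that prepare-once, execute-many pattern (using an in-memory SQLite database as a stand-in for the real connection; table and column names are illustrative):

```php
<?php
// Demo setup: in-memory SQLite stands in for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE t (a INT, b INT)');

// Prepare the single-row statement once...
$stmt = $pdo->prepare('INSERT INTO t VALUES (?, ?)');

// ...then execute it many times with different parameters.
foreach ([[1, 2], [3, 4], [5, 6]] as $row) {
    $stmt->execute($row);
}

echo $pdo->query('SELECT COUNT(*) FROM t')->fetchColumn(); // prints 3
```

Since the statement is compiled only once, this also keeps the query logic identical for SQLite3 and MySQL.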
Hi, I work with a small XML script that filters all entries in a database. My problem is that the names in some of the XML strings contain apostrophes, and I need to get them into a MySQL database. When I run the script, all the data is there except the rows with apostrophes. Here's my code:
include 'new.php'; //include xml file
$haus = new SimpleXMLElement($xmlstr);
foreach ($haus->features as $features) {
    foreach ($features->properties as $properties) {
        $name = $properties->name;
        $insert = $mysqli->query("INSERT INTO locations (name) VALUES ('$name')");
        echo $mysqli->affected_rows;
    }
}
Is there a way to get the apostrophes into the database with PHP?
Use a prepared statement which means you don't have to worry about quotes. This will also protect you from SQL Injection. There will also be a slight performance benefit because you can create the prepared statement once, then execute it many times without having to send the whole query to the MySQL engine over and over.
$count = 0; // if you want to check how many rows were inserted
if ($stmt = $mysqli->prepare('INSERT INTO locations (name) VALUES (?)')) {
    foreach ($haus->features as $features) {
        foreach ($features->properties as $properties) {
            $name = $properties->name;
            $stmt->bind_param('s', $name);
            if ($stmt->execute()) {
                $count++;
            }
        }
    }
}
echo 'total inserted: ' . $count;
I have recently asked how to insert a CSV into a MySQL database. It was suggested that I use LOAD DATA LOCAL INFILE, but it turns out that this is disabled on my host, so it's no longer an option. Back to PHP loops...
I'm having an issue looping through the results of a temp upload, since I'm mapping the values to an array on insert. On multiple lines therefore, this causes the same entry to be entered twice (the first line's values), as the array values are explicitly defined.
It's inserting 1, 2, 3, 4 and then 1, 2, 3, 4. I want to insert 1, 2, 3, 4 then 5, 6, 7, 8 of the array.
What's the solution (aside from hacky for loops and row++)?
Thanks in advance.
$handle = fopen($_FILES['csv']['tmp_name'], "r");
$sql = "INSERT INTO tbl (col1, col2, col3, col4) VALUES (?, ?, ?, ?)";
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    $query = $db->prepare($sql);
    if ($query->execute(array($data[0], $data[1], $data[2], $data[3]))) return true;
    else return false;
}
The only thing I can think of is that your loop is only executing once, but you run the loop twice. (You have a "return" statement in your loop.)
The following should work:
function loadCsv($db, $filename){
    $fp = fopen($filename, 'rb');
    $sql = 'INSERT INTO tbl (col1, col2, col3, col4) VALUES (?,?,?,?)';
    $pstmt = $db->prepare($sql);
    while (FALSE !== ($data = fgetcsv($fp, 1000))) {
        $cols = array_slice($data, 0, 4);
        $pstmt->execute($cols);
    }
    $pstmt->closeCursor();
    fclose($fp);
}
For maximum compatibility and performance, I recommend connecting to the database through PDO.
First of all, you only need to prepare the query once (that's one of the two main advantages of using prepared statements, with injection prevention being the other one), so put the call to prepare() before the while loop, not inside it.
Outside of that, I see no reason why the code you've posted would behave the way you claim it does, unless your data is duplicated in your CSV file.
The issue was with the return statement. Removing the return instantly fixed the issue.
Unfortunately the user who posted this answer has since removed it.
Thanks everyone for your suggestion and help with this!
I need to import my text file into MySQL; the table has only one field. It is like this:
I could also do this with PHP.
You can use MySQL's LOAD DATA LOCAL INFILE syntax:
LOAD DATA LOCAL INFILE '/path/to/file.txt'
INTO TABLE `table1`
LINES TERMINATED BY '\n'
For this, make sure MySQL has access to /path/to/file.txt. Also, the user executing the query must have the FILE privilege.
With pure PHP it's easy: read the file, build the query, execute it.
You want to build a single query so that you don't end up running a query inside a loop, which is slow.
$data = file("/path/to/file.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
// make sure you have a valid database connection prior to this point,
// otherwise mysql_real_escape_string won't work
$values = "('" . implode("'), ('", array_map('mysql_real_escape_string', $data)) . "')";
$query = "INSERT INTO `TABLE1` (`COLUMN1`) VALUES $values";
// Now just execute the query once.
mysql_query($query);
Why not use a regex to split it up by new lines, e.g.
$array = preg_split("/[\r\n]+/", file_get_contents("path/to/file.txt"));
and then do a foreach loop:
foreach ($array as $line) {
    // insert into database
}
Then all you need to do is fill in the loop above to insert each line into the correct field in the database, line by line; but please sanitize each line so you don't inject anything bad into the database!
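One way to fill in that loop with the sanitization built in is a prepared statement. A sketch (an in-memory SQLite database and a temp file stand in for the real MySQL connection and path/to/file.txt; the table1/column1 names are illustrative):

```php
<?php
// Demo setup: in-memory SQLite and a temp file stand in for the real ones.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE table1 (column1 TEXT)');

$path = tempnam(sys_get_temp_dir(), 'txt');
file_put_contents($path, "it's a line\nanother line\n");

// Split the file contents by new lines, as above.
$array = preg_split("/[\r\n]+/", file_get_contents($path));

$stmt = $db->prepare('INSERT INTO table1 (column1) VALUES (?)');
foreach ($array as $line) {
    if ($line === '') {
        continue; // skip the empty trailing element
    }
    $stmt->execute([$line]); // bound parameter: quotes and injection are handled
}

echo $db->query('SELECT COUNT(*) FROM table1')->fetchColumn(); // prints 2
unlink($path);
```

The bound parameter means lines containing apostrophes or other special characters are inserted safely without any manual escaping.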
Using csv import in phpMyAdmin
phpMyAdmin supports importing CSV files; see its documentation for details (search for: phpMyAdmin csv).
Using mysqlimport
MySQL provides a CLI application, mysqlimport. Example usage (again CSV):
mysqlimport --fields-optionally-enclosed-by='"' --fields-terminated-by=, \
--lines-terminated-by="\r\n" --user=YOUR_USERNAME --password \
YOUR_DATABASE YOUR_TABLE.csv
MySQL LOAD DATA
MySQL itself (client, query) supports the LOAD DATA INFILE command. Example syntax:
LOAD DATA INFILE 'data.txt' INTO TABLE db2.my_table;
Build the SQL query with PHP (and insert manually)
You parse the text file in PHP and the output is one large INSERT statement (this is useful when/if you cannot connect to MySQL from the server where you're running the script):
// This will escape values correctly
function escapeValues(&$val){
    // mysql_real_escape_string is not an option (remote server not accessible
    // in this case)
    $val = "'" . addslashes($val) . "'";
}
$fp = fopen($filename, 'r') or die('Cannot open');
$results = array();
while ($row = fgets($fp)) {
    $values = explode('/', trim($row));
    array_walk($values, 'escapeValues');
    $results[] = '(' . implode(', ', $values) . ')';
}
fclose($fp);
if (!count($results)) {
    die();
}
echo 'INSERT INTO tableName (col1, col2, ... colN) VALUES ';
echo implode(",\n", $results);
echo "\n";
Direct connection and direct import
This should be the best approach for you.
$conn = mysql_connect(...) or die(...);
mysql_select_db(...);
// Now we need to build a small class which will allow us to escape values properly.
// mysql_escape_string is DEPRECATED and mysql_real_escape_string
// requires the connection parameter.
// Ps: choose a better name
class CallbackHack {
    public $connection = null;
    public function escapeValue(&$val){
        $val = "'" . mysql_real_escape_string($val, $this->connection) . "'";
    }
}
$hack = new CallbackHack();
$hack->connection = $conn;
$fp = fopen($filename, 'r') or die('Cannot open');
mysql_query('START TRANSACTION;'); // Increases insert performance for InnoDB
while ($row = fgets($fp)) {
    $values = explode('/', trim($row));
    array_walk($values, array($hack, 'escapeValue'));
    $sql = 'INSERT INTO tableName (col1, col2, ... colN) VALUES (' .
        implode(', ', $values) . ');';
    mysql_query($sql);
}
mysql_query('COMMIT;'); // Make sure this actually runs
fclose($fp);
<?php
$con = mysql_connect("localhost", "peter", "abc123");
if (!$con) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db("my_db", $con);
$file = fopen("welcome.txt", "r") or exit("Unable to open file!");
// Insert each line of the file until the end is reached
while (($line = fgets($file)) !== false) {
    $data = mysql_real_escape_string(trim($line));
    mysql_query("INSERT INTO Persons (filedata) VALUES ('$data')");
    // make sure the above query is right according to your table structure
}
fclose($file);
mysql_close($con);
?>
<?php
$array = preg_split("/[\r\n]+/", file_get_contents("path/to/file.txt"));
foreach ($array as $line) {
    mysql_query("INSERT INTO `dbname` (column1) VALUES ('$line')");
}
?>
Always sanitize, too!
How can I implement recursive MySQL queries? I am trying to look for resources, but they are not very helpful.
I am trying to implement logic similar to this:
public function initiateInserts()
{
    //Open Large CSV File(min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Parsing Large CSV file to get data and initiate insertion into schema.
    $query = "";
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = $query + "INSERT INTO dt_table (id, code, connectid, connectcode)
            VALUES (" + $data[0] + ", " + $data[1] + ", " + $data[2] + ", " + $data[3] + ")";
    }
    $stmt = $this->prepare($query);
    // Execute the statement
    $stmt->execute();
    $this->checkForErrors($stmt);
}
#Author: Numenor
Error Message: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '0' at line 1
This approach inspired me to look for a MySQL recursive-query approach.
Here is the Approach I was using Earlier:
Current Code:
public function initiateInserts()
{
    //Open Large CSV File(min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Parsing Large CSV file to get data and initiate insertion into schema.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
            VALUES (:id, :code, :connectid, :connectcode)";
        $stmt = $this->prepare($query);
        // Then, for each line: bind the parameters
        $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
        $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
        $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
        $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
        // Execute the statement
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
Updated Code
public function initiateInserts()
{
    //Open Large CSV File(min 100K rows) for parsing.
    $this->fin = fopen($file,'r') or die('Cannot open file');
    //Prepare insertion query to insert data into schema.
    $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
        VALUES (:id, :code, :connectid, :connectcode)";
    $stmt = $this->prepare($query);
    // Then, for each line: bind the parameters
    $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
    $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
    $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
    $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);
    //Loop through CSV file and execute inserts prepared, but this is not working
    //and no data is being populated into the database.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        // Execute the statement
        list($id, $code, $connid, $conncode) = $data;
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
This was my Main Question for which I am looking for suggestions !!!
There's nothing recursive in that code snippet.
The wrong operator is used to concatenate the strings; it's . (dot), not +.
You'd have to use something like mysqli::multi_query() to execute more than one statement with a single function call and the statements would have to be separated by a delimiter character (by default a semicolon)
Since you're already using prepare() and execute() why not simply make it a parametrized prepared statement and then assign the values in each iteration of the loop and execute the statement? (Exactly what is $this and what type of object does $this->prepare() return?)
edit and btw: $this->prepare() indicates that your class extends a database class. And it also holds a file descriptor $this->fin. This has a certain code smell. My guess is that your class uses/has a database/datasink object and a file/datasource, but not is a database+readfile class. Only extend a class if your derived class is something.
edit: a simple example
class Foo {
    protected $pdo;
    public function __construct(PDO $pdo) {
        $this->pdo = $pdo;
    }
    public function initiateInserts($file)
    {
        $query = '
            INSERT INTO
                dt_table_tmp
                (id, code, connectid, connectcode)
            VALUES
                (:id, :code, :connid, :conncode)
        ';
        $stmt = $this->pdo->prepare($query);
        $stmt->bindParam(':id', $id);
        $stmt->bindParam(':code', $code);
        $stmt->bindParam(':connid', $connid);
        $stmt->bindParam(':conncode', $conncode);
        $fin = fopen($file, 'r') or die('Cannot open file');
        while (false !== ($data = fgetcsv($fin, 5000, ";"))) {
            list($id, $code, $connid, $conncode) = $data;
            $stmt->execute();
        }
    }
}
$pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// set up a demo table and some test data
$pdo->exec('CREATE TEMPORARY TABLE dt_table_tmp (id int, code int, connectid int, connectcode int)');
$sourcepath = 'sample.data.tmp';
$fh = fopen($sourcepath, 'wb') or die('!fopen(w)');
for($i=0; $i<10000; $i++) {
fputcsv($fh, array($i, $i%4, $i%100, $i%3), ';');
}
fclose($fh); unset($fh);
// test script
$foo = new Foo($pdo);
$foo->initiateInserts($sourcepath);
A few tips about speeding up MySQL data import:
- check if your data really requires parsing; sometimes LOAD DATA works just fine for CSV
- if possible, create an SQL file first via PHP and then execute it with the mysql command-line client
- use multi-value inserts
- disable keys before inserting
A multi-value insert statement looks something like:
INSERT INTO users(name, age) VALUES
("Sam", 13),
("Joe", 14),
("Bill", 33);
this is much faster than three distinct insert statements.
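To build such a statement safely from PHP data, one approach (a sketch; the buildMultiInsert helper and the users/name/age names are illustrative, not an existing API) is to generate the placeholder groups and flatten the row values for a single prepared execute():

```php
<?php
// Sketch: build "(?, ?), (?, ?), ..." placeholders plus a flat parameter
// list for one multi-value prepared INSERT.
function buildMultiInsert(string $table, array $columns, array $rows): array {
    // One "(?, ?, ...)" group per row, sized by the column count.
    $group = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
    $sql = sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(', ', $columns),
        implode(', ', array_fill(0, count($rows), $group))
    );
    // Flatten the row arrays into a single parameter list.
    $params = array_merge(...$rows);
    return [$sql, $params];
}

[$sql, $params] = buildMultiInsert('users', ['name', 'age'], [
    ['Sam', 13],
    ['Joe', 14],
    ['Bill', 33],
]);
echo $sql; // INSERT INTO users (name, age) VALUES (?, ?), (?, ?), (?, ?)
// then: $pdo->prepare($sql)->execute($params);
```

Using placeholders instead of interpolating the values keeps the speed advantage of the multi-value form without giving up injection safety.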
Disabling keys is important to prevent indexing each time you're executing an INSERT:
ALTER TABLE whatever DISABLE KEYS;
INSERT INTO whatever .....
INSERT INTO whatever .....
INSERT INTO whatever .....
ALTER TABLE whatever ENABLE KEYS;
further reading http://dev.mysql.com/doc/refman/5.1/en/insert-speed.html
Inspired by this question, I would say you should do something similar. If you really have that much data, then a bulk import is the most appropriate approach, and you already have the data in a file.
Have a look at the LOAD DATA INFILE command.
The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed. The file name must be given as a literal string.
If you are interested in the speed differences then read Speed of INSERT Statements.
E.g. you can do this:
$query = "LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
    FIELDS TERMINATED BY ';'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES";
This will also ignore the first line assuming that it only indicates the columns.
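A sketch of running that statement from PHP via PDO (the DSN, credentials, and table/file names are placeholders for your own setup):

```php
<?php
// Build the LOAD DATA statement; the file and table names are placeholders.
$query = "LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
    FIELDS TERMINATED BY ';'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES";

// With PDO, exec() returns the number of affected (imported) rows,
// or false on failure. Uncomment with real credentials to run it:
// $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// $rows = $pdo->exec($query);
// echo "imported $rows rows\n";
```

Note that LOAD DATA INFILE requires the FILE privilege and must be enabled on the server; if it is disabled (as for the original asker), fall back to the PHP loop approaches above.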