I have a .txt file with details of all countries in this format:
Country,City,AccentCity,Region,Population,Latitude,Longitude
ad,aixas,Aixàs,06,,42.4833333,1.4666667
ad,aixirivali,Aixirivali,06,,42.4666667,1.5
ad,aixirivall,Aixirivall,06,,42.4666667,1.5
ad,aixirvall,Aixirvall,06,,42.4666667,1.5
ad,aixovall,Aixovall,06,,42.4666667,1.4833333
ad,andorra,Andorra,07,,42.5,1.5166667
ad,andorra la vella,Andorra la Vella,07,20430,42.5,1.5166667
ad,andorra-vieille,Andorra-Vieille,07,,42.5,1.5166667
ad,andorre,Andorre,07,,42.5,1.5166667
ad,andorre-la-vieille,Andorre-la-Vieille,07,,42.5,1.5166667
ad,andorre-vieille,Andorre-Vieille,07,,42.5,1.5166667
ad,ansalonga,Ansalonga,04,,42.5666667,1.5166667
I have to insert this data into 3 tables (cities, states and countries) without duplication.
Is there a way to read the data from the .txt file and insert it into the database?
How can I get the state and city data?
Read the file as a CSV and then insert:
if (($handle = fopen("cities.txt", "r")) !== FALSE) {
    while (($data = fgetcsv($handle)) !== FALSE) {
        // Craft your SQL insert statement such as:
        $sql = "INSERT INTO cities (country, city, accent_city, etc.) VALUES ('{$data[0]}','{$data[1]}','{$data[2]}', etc.)";
        // Use the appropriate backend functions depending on your DB: mysql, postgres, etc.
    }
    fclose($handle);
}
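On the without-duplication part of the question: a minimal sketch of one way to do it, assuming a PDO connection and hypothetical countries/regions tables with UNIQUE keys, so that INSERT IGNORE skips rows that already exist (table and column names here are illustrative, not from the question):

// A sketch, not a drop-in solution: assumes a PDO connection in $pdo and
// hypothetical tables countries(code) and regions(country_code, code)
// with UNIQUE keys, so INSERT IGNORE silently skips duplicates.
$pdo = new PDO('mysql:host=localhost;dbname=geo;charset=utf8', 'user', 'pass');
$country = $pdo->prepare("INSERT IGNORE INTO countries (code) VALUES (?)");
$region  = $pdo->prepare("INSERT IGNORE INTO regions (country_code, code) VALUES (?, ?)");
$city    = $pdo->prepare("INSERT INTO cities (country_code, region_code, city, accent_city, population, latitude, longitude) VALUES (?, ?, ?, ?, ?, ?, ?)");

if (($handle = fopen("cities.txt", "r")) !== FALSE) {
    fgetcsv($handle); // skip the header row
    while (($data = fgetcsv($handle)) !== FALSE) {
        $country->execute(array($data[0]));
        $region->execute(array($data[0], $data[3]));
        // empty Population fields arrive as '' - make that column nullable or cast as needed
        $city->execute(array($data[0], $data[3], $data[1], $data[2], $data[4], $data[5], $data[6]));
    }
    fclose($handle);
}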
If your database is MySQL, there is a utility for bulk inserts: see the LOAD DATA INFILE documentation.
If not, your database probably has a similar one, but if you want to do this in PHP, @davidethell's example is good for the task.
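For reference, the bulk-insert statement looks like this; a sketch against the cities.txt above, where the column names are assumed, not taken from the question:

LOAD DATA LOCAL INFILE 'cities.txt'
INTO TABLE cities
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(country, city, accent_city, region, population, latitude, longitude);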
Multiple CSV files have consistent 26-column headers. I am using plain PHP to insert the data into MySQL. The data may contain more than 100k rows.
Following is my code:
while (($line = fgetcsv($csvFile)) !== FALSE) {
    // get the data from the csv
    $account_code  = $line[0];
    $caller_number = $line[1];
    $callee_number = $line[2];
    .
    .
    .
    $action_type       = $line[23];
    $source_trunk_name = $line[24];
    $dest_trunk_name   = $line[25];
    $query = $db->query("INSERT INTO `cdrnew`
        (`account_code`, `caller_number`, `callee_number`, ..........., `action_type`, `source_trunk_name`, `dest_trunk_name`)
        VALUES ('".$account_code."', '".$caller_number."', '".$callee_number."', ............, '".$action_type."', '".$source_trunk_name."', '".$dest_trunk_name."')");
}
Is this the right approach for inserting the data from CSV into MySQL, given that I have consistent 26-column headers across all the CSV files? Or is there a better approach?
I would suggest using a parameterised and bound query. The benefit, above and beyond the obvious one of removing the possibility of SQL injection, is that you can compile the query once but run it many times with new parameters each time.
This stops the database having to do thousands of query compilations, and therefore thousands of unnecessary round trips to the server and back.
First, move the query outside the loop and prepare it there:
$stmt = $db->prepare("INSERT INTO `cdrnew`
    (`account_code`, `caller_number`, `callee_number`,
    .,.,.,.,.,.,
    `action_type`, `source_trunk_name`, `dest_trunk_name`)
    VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)");
while(($line = fgetcsv($csvFile)) !== FALSE){
$stmt->bind_param('ssssssssssssssssssssssssss',
$line[0], $line[1], $line[2], $line[3], $line[4],
$line[5], $line[6], $line[7], $line[8], $line[9],
$line[10], $line[11], $line[12], $line[13], $line[14],
$line[15], $line[16], $line[17], $line[18], $line[19],
$line[20], $line[21], $line[22], $line[23], $line[24],
$line[25]
);
$stmt->execute();
}
NOTE: You may want to add some error checking to this base layout.
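For example, a minimal sketch of that error checking, assuming the same mysqli-style $db connection as above:

$stmt = $db->prepare(/* the INSERT statement shown above */);
if ($stmt === false) {
    die('Prepare failed: ' . $db->error);
}
// ... inside the loop, after bind_param(), check each execute():
if (!$stmt->execute()) {
    error_log('Insert failed: ' . $stmt->error);
}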
I'm going to explain, to the best of my ability, what my goal is here. Everything I've searched for online hasn't been relevant enough for me to get an idea.
First off, this is a PHP assignment where we have to load CSV files into a MySQL database.
Now, each table (4 in total) has the exact same field values. What I am trying to accomplish is using a foreach loop that populates each table with the information from its CSV file. I know I can do this by having a while loop for each table and CSV file, but I'm trying to go above the requirements and learn more about PHP. Here is my code for what I'm trying to accomplish:
$files = glob('*.txt'); // .txt required extension
foreach ($files as $file) {
    if (($handle = fopen($file, "r")) !== FALSE) {
        while (($data = fgetcsv($handle, 4048, ",")) !== FALSE) {
            echo $data[0]; // making sure data is correct
            // table is named after the file, minus the .txt extension;
            // a mysqli connection is assumed in $mysqli
            $import = "INSERT INTO `" . basename($file, '.txt') . "` (id,itemName,price) VALUES ('$data[0]','$data[1]','$data[2]')";
            $mysqli->query($import) or die($mysqli->error);
        }
        fclose($handle);
    } else {
        echo "Could not open file: " . $file;
    }
}
Each CSV file contains the id, itemName and price. Hopefully this is understandable enough. Thank you
The way you are importing data into MySQL is OK for a small volume of data. However, if you are importing huge volumes (thousands of rows), the best way is to import it directly into MySQL using LOAD DATA INFILE. For example:
LOAD DATA LOCAL INFILE '/path/to/your_file.csv'
INTO TABLE your_table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, itemName, price)
That's a smarter way to import your CSV data :)
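Since the question loops over several files, you could combine this with the glob() loop and issue one LOAD DATA statement per file instead of row-by-row inserts. A sketch, assuming a mysqli connection in $db with local_infile enabled, and table names matching the file names as in the question:

foreach (glob('*.txt') as $file) {
    $table = basename($file, '.txt'); // table named after the file, as in the question
    // NOTE: sanitize/whitelist $table, since identifiers cannot be bound as parameters
    $sql = "LOAD DATA LOCAL INFILE '" . $db->real_escape_string($file) . "'
            INTO TABLE `$table`
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\\n'
            (id, itemName, price)";
    $db->query($sql) or die($db->error);
}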
So I'm trying to make it so that I can update a MySQL database by importing a CSV file. The only problem is that some of my data has commas, which causes the data to be imported into the wrong columns. Here's my existing import code.
if ($_FILES['csv']['size'] > 0) {
    // get the csv file
    $file = $_FILES['csv']['tmp_name'];
    $handle = fopen($file, "r");
    // loop through the csv file and insert into database
    do {
        if ($data[0]) {
            mysql_query("INSERT INTO songdb (artist, title) VALUES
                (
                '" . addslashes($data[0]) . "',
                '" . addslashes($data[1]) . "'
                )
            ") or die(mysql_error());
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));
    // redirect
    header('Location: import.php?success=1');
    die;
}
Is there a way I can set it to ignore the commas, quotes and apostrophes in the CSV file?
I would also like to set it to ignore the first line in the CSV, seeing as how it's just column information, if that is at all possible.
** EDIT **
For example, the CSV contains data such as "last name, first name" or "User's Data"; these are literally just examples of the data that's actually in there. The data is imported each month and we've only just noticed this issue.
Sample Data:
Column 1, Column 2
Item 1, Description
Item 2, Description
Item, 3, Description
Item, 4, Description
"Item 5", Description
"Item, 6", Description
Above is the sample data that was requested.
You might want to use MySQL's built-in LOAD DATA INFILE statement, which will not only work faster but will also let you use the clause FIELDS OPTIONALLY ENCLOSED BY '"' to work with that kind of file.
So your query will be something like this:
mysql_query(<<<SQL
LOAD DATA LOCAL INFILE '{$_FILES['csv']['tmp_name']}'
INTO TABLE songdb
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES
(artist, title)
SQL
) or die(mysql_error());
If your data is dirty, the easiest way to handle this will be to clean it up manually, and either use data-entry forms that strip out bad characters and/or escape the input data, or tell the users who are generating this data to stop putting commas in fields.
Your example has an inconsistent column count and inconsistent fields, due to a lack of escaping in whatever was used to generate this data.
That said, you could do some advanced logic to ignore any comma after "Item" but before a space or digit, using regular expressions (see the sketch below), but that is getting kind of ridiculous, and depending on the number of rows it may be easier to clean the data up manually before importing.
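For what it's worth, a sketch of that regular-expression cleanup, based only on the sample rows shown above; it rejoins "Item, 3" into "Item 3" before the line is parsed as CSV:

// Hypothetical cleanup for rows like "Item, 3, Description":
// drop the stray comma between the word "Item" and a following digit.
$line = preg_replace('/^(Item),\s*(?=\d)/', '$1 ', $line);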
In terms of skipping the header row, you can do this:
if ($_FILES['csv']['size'] > 0) {
    // get the csv file
    $file = $_FILES['csv']['tmp_name'];
    $handle = fopen($file, "r");
    $firstRow = true;
    // loop through the csv file and insert into database
    do {
        if ($data[0]) {
            // skip header row
            if ($firstRow) {
                $firstRow = false;
                continue;
            }
            mysql_query("INSERT INTO songdb (artist, title) VALUES
                (
                '" . addslashes($data[0]) . "',
                '" . addslashes($data[1]) . "'
                )
            ") or die(mysql_error());
        }
    } while ($data = fgetcsv($handle, 1000, ",", "'"));
    // redirect
    header('Location: import.php?success=1');
    die;
}
Oh, I just read your comment: 5GB. Wow. Manual cleanup is not an option. You need to look at the range of possible ways the data is screwed up and really assess what logic you need to use to capture the right columns.
Is your example above a representative sample, or could other fields without enclosures have commas?
Try this; it is working fine for me.
ini_set('auto_detect_line_endings', TRUE);
$csv_data = array();
$file_handle = fopen($_FILES['file_name']['tmp_name'], 'r');
while (($data = fgetcsv($file_handle)) !== FALSE) {
    $update_data = array(
        'first'  => $data[0],
        'second' => $data[1],
        'third'  => $data[2],
        'fourth' => $data[3]  // was $data['34'], presumably a typo
    );
    // save this array in your database
}
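To actually save each row, you could prepare one insert before the loop and execute it per row; a sketch, assuming a PDO connection in $db and a hypothetical table my_table:

$stmt = $db->prepare("INSERT INTO my_table (first, second, third, fourth) VALUES (?, ?, ?, ?)"); // prepare once, before the loop
$stmt->execute(array_values($update_data)); // then run this per row, inside the loop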
I would like to create an upload page in PHP and import the uploaded CSV file data into multiple tables. I tried searching here, but it looks like I can't find anything that imports from a CSV to multiple tables. Any help here is greatly appreciated. Thank you.
As another variant to the one proposed above, you can read your CSV line by line and explode each line into fields. Each field will correspond to one variable. (Note that, unlike fgetcsv(), explode() will not handle commas inside quoted fields.)
$handle = fopen("/my/file.csv", "r"); // open the CSV file for reading
if ($handle) { // if the file was successfully opened
    while (($CSVrecord = fgets($handle, 4096)) !== false) { // iterate through each line of the CSV
        list($field1, $field2, $field3, $field4) = explode(',', trim($CSVrecord)); // trim the trailing newline, then explode the CSV record (line) into variables (fields)
        // and here you can easily compose SQL queries and map your data to the tables you need using simple variables
    }
    fclose($handle); // close the file handle
}
If you have access to phpMyAdmin, you can upload the CSV there, then copy it over to each desired table.
In response to your comment that some data goes to one table and other data goes to another table, here is a simple example.
Table1 has 3 fields: name, age and sex. Table2 has 2 fields: haircolour and shoesize. So your CSV could be laid out like:
john smith,32,m,blonde,11
jane doe,29,f,red,4
anders anderson,56,m,grey,9
For the next step you will be using the function fgetcsv. This will break each line of the CSV into an array that you can then use to build your SQL statements:
if (($handle = fopen($mycsvfile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // this loops through each line of your csv, putting the values into array elements
        $sql1 = "INSERT INTO table1 (`name`, `age`, `sex`) values ('" . $data[0] . "', '" . $data[1] . "', '" . $data[2] . "')";
        $sql2 = "INSERT INTO table2 (`haircolour`, `shoesize`) values ('" . $data[3] . "', '" . $data[4] . "')";
        // execute $sql1 and $sql2 with your database layer here (see the sketch below)
    }
    fclose($handle);
}
Please note that this does not take any SQL security, such as validation or escaping, into account, but that is basically how it will work.
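To actually run those two statements, and to address the security caveat, you could prepare them once and execute them per row; a sketch, assuming a PDO connection in $pdo:

$ins1 = $pdo->prepare("INSERT INTO table1 (`name`, `age`, `sex`) VALUES (?, ?, ?)");
$ins2 = $pdo->prepare("INSERT INTO table2 (`haircolour`, `shoesize`) VALUES (?, ?)");
if (($handle = fopen($mycsvfile, "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $ins1->execute(array($data[0], $data[1], $data[2]));
        $ins2->execute(array($data[3], $data[4]));
    }
    fclose($handle);
}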
The problem, it seems to me, is differentiating which field is for which table.
If you send a header like
table.field, table.field, table.field
and then split the header, you'll get all the tables and fields.
Could that be a way to go?
All the best
PS: Because of your comment ...
A CSV file can have a first line with field names in it. When there is a need to copy CSV data into more than one table, you can use a workaround to find out which field is for which table.
user.username, user.lastname, blog.comment, blog.title
"sam" , "Manson" , "this is a comment", "and I am a title"
Now, when reading the CSV data, you can process the first line and split each title at the dot to find out which tables are used and also the fields.
With this method you are able to copy CSV data to more than one table.
But it means you have to code it first :(
To split the field names:
// only the first line holds the field names; assume it was read into $firstLine (e.g. with fgets)
$topfields = preg_split('/,|;|\t/', $firstLine);
foreach ($topfields as $tablefield) {
    list($t, $f) = explode('.', trim($tablefield)); // $t = table name, $f = field name
}
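A sketch of how that could continue: remember each column's position per table, then split every data row into one insert per table. The table and field names come from the header; the rest is illustrative:

$map = array(); // table name => array(column position => field name)
foreach ($topfields as $pos => $tablefield) {
    list($t, $f) = explode('.', trim($tablefield));
    $map[$t][$pos] = $f;
}
// then, for each data row $data (e.g. from fgetcsv):
foreach ($map as $t => $cols) {
    $fields = '`' . implode('`, `', $cols) . '`';
    $values = "'" . implode("', '", array_map('addslashes', array_intersect_key($data, $cols))) . "'";
    $sql = "INSERT INTO `$t` ($fields) VALUES ($values)";
    // run $sql with your DB layer; prepared statements would be safer
}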
In the earlier answer's code you build two insert queries; how are you going to run those queries?
Automatically build a MySQL table upon a CSV file upload.
I have an admin section where an admin can upload CSV files with different column counts and different column names.
It should then build a MySQL table in the DB, reading the first line to create the columns, and then import the data accordingly.
I am aware of a similar issue, but this is different because of the following specs:
The name of the table should be the name of the file (minus the extension [.csv])
Each CSV file can be different
It should build a table with the number of columns and the names from the CSV file
It should add the data from the second line onward
Here is a design sketch
Maybe there are known frameworks that make this easy.
Thanks.
$file  = 'filename.csv';
$table = 'table_name';

// get the structure from the csv and insert it into the db
ini_set('auto_detect_line_endings', TRUE);
$handle = fopen($file, 'r');

// first row: the structure
if (($data = fgetcsv($handle)) === FALSE) {
    echo "Cannot read from csv $file";
    die();
}

$fields = array();
$field_count = 0;
for ($i = 0; $i < count($data); $i++) {
    $f = strtolower(trim($data[$i]));
    if ($f) {
        // normalize the field name, strip to 20 chars if too long
        $f = substr(preg_replace('/[^0-9a-z]/', '_', $f), 0, 20);
        $field_count++;
        $fields[] = $f . ' VARCHAR(50)';
    }
}
$sql = "CREATE TABLE $table (" . implode(', ', $fields) . ')';
echo $sql . "<br /><br />";
// $db->query($sql);

while (($data = fgetcsv($handle)) !== FALSE) {
    $fields = array();
    for ($i = 0; $i < $field_count; $i++) {
        $fields[] = "'" . addslashes($data[$i]) . "'";
    }
    $sql = "INSERT INTO $table VALUES (" . implode(', ', $fields) . ')';
    echo $sql;
    // $db->query($sql);
}

fclose($handle);
ini_set('auto_detect_line_endings', FALSE);
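Since the specs say the table should be named after the uploaded file, you could derive and sanitize $table from the upload instead of hard-coding it; a sketch, assuming a standard upload field named 'csv' (the field name is an assumption):

// The table name is interpolated into SQL and cannot be bound as a
// parameter, so strip it down to safe characters first.
$base  = basename($_FILES['csv']['name'], '.csv');
$table = preg_replace('/[^0-9a-zA-Z_]/', '_', $base);
$file  = $_FILES['csv']['tmp_name'];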
Maybe this function will help you:
fgetcsv (PHP 4, PHP 5): gets a line from a file pointer and parses it for CSV fields.
http://php.net/manual/en/function.fgetcsv.php
http://bytes.com/topic/mysql/answers/746696-create-mysql-table-field-headings-line-csv-file has a good example of how to do this.
The second example should put you on the right track. There isn't an automatic way to do it, so you're going to need to do a little programming, but it shouldn't be too hard once you use that code as a starting point.
Building a table is a query like any other, and theoretically you could get the names of your columns from the first row of a CSV file.
However, there are some practical problems:
How would you know what data type a certain column is?
How would you know what the indexes are?
How would you get data out of the table, i.e. how would you know which column represents what?
As you can't relate your new table to anything else, you are kind of defeating the purpose of a relational database, so you might as well just keep and use the CSV file.
What you are describing sounds like an ETL tool. Perhaps Google for MySQL ETL tools...You are going to have to decide what OS and style you want.
Or just write your own...