dumping csv file data into db through sql query - php

I have a .csv file containing table data. I want to load the whole CSV file into the database with a single MySQL query, mapping the CSV columns to the table columns by name,
rather than doing it manually with individual "insert into" queries.
If another language such as PHP or Python can do this job, that is also fine.
Can someone suggest an approach, please?

You can do this fairly easily with the Navicat software.
More info here

Use LOAD DATA LOCAL INFILE ...
LOAD DATA LOCAL INFILE '/tmp/your_file.csv' INTO TABLE your_table;
You need to ensure the MySQL user has enough privileges to perform the LOAD.
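If the file has a header row and you want an explicit column mapping rather than relying on column order, here is a slightly fuller sketch; the table and column names are placeholders for your schema:
-- Sketch only: your_table, name, email and company are placeholder names.
LOAD DATA LOCAL INFILE '/tmp/your_file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES           -- skip the CSV header row
(name, email, company);  -- map CSV columns to table columns in order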

Related

Concatenate my php variable to csv data when importing LOAD DATA INFILE

I need to import some CSV files into MySQL.
I recently found out about LOAD DATA INFILE and checked the manual; as far as I understand, it imports each line of the .csv file into the database.
I was wondering if there is an option to append an extra value to what the CSV file contains.
For example: my table has 3 columns.
MYTABLE(#name,#email,company_adrs);
The csv file has data for "name,email" only, and I have the address in a php variable.
With a normal loop, I would achieve this as follows:
// open the CSV file ('file.csv' is a placeholder path)
$fh = fopen('file.csv', 'r');
while (($line = fgetcsv($fh)) !== false)
{
    // $line holds the current CSV row, e.g. ["John", "john@john.com"], without the 3rd column
    $name       = mysqli_real_escape_string($db, $line[0]);
    $email      = mysqli_real_escape_string($db, $line[1]);
    $myAddrsVar = mysqli_real_escape_string($db, $_POST['adrs']); // "44 company st"
    $insert = $db->query("INSERT INTO MYTABLE (name, email, company_adrs) VALUES ('$name', '$email', '$myAddrsVar')");
}
fclose($fh);
With this method I have to loop through each line of the CSV and run a query on each iteration.
I was wondering whether LOAD DATA INFILE has a way to append my data to what the CSV file contains? If so, how can I achieve it?
Thanks in advance
Given your situation, I would recommend handling it all in PHP and inserting the data into MySQL that way.
https://www.php.net/manual/en/function.str-getcsv.php can help you parse the CSV into an array. Add the address to that array, and you have the full dataset in an array.
Use prepared statements as shown here https://websitebeaver.com/prepared-statements-in-php-mysqli-to-prevent-sql-injection and insert the rows one after the other.
Use transactions so that either all of the data is inserted or none of it is (see How to start and end transaction in mysqli?), as in the sketch below.
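A minimal sketch of that approach; the file path, connection details and column names are assumptions based on the question:
<?php
// Sketch only: 'data.csv' and the connection details are placeholders.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$address = $_POST['adrs']; // e.g. "44 company st"

$stmt = $db->prepare('INSERT INTO MYTABLE (name, email, company_adrs) VALUES (?, ?, ?)');
$db->begin_transaction();
try {
    foreach (file('data.csv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        [$name, $email] = str_getcsv($line);       // parse one CSV row
        $stmt->bind_param('sss', $name, $email, $address);
        $stmt->execute();
    }
    $db->commit();        // all rows inserted ...
} catch (Throwable $e) {
    $db->rollback();      // ... or none at all
    throw $e;
}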
I'd avoid using LOAD DATA INFILE here because the volume of data is not that big. If you really want to use it and add another column, you can use the SET clause. There's an example of that here: Add extra column of data when using LOAD DATA LOCAL INFILE
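For reference, a hedged sketch of that LOAD DATA ... SET variant; the file path is a placeholder, and the address shown is the literal from the question (in PHP you would interpolate the escaped variable instead):
-- Sketch only: '/path/to/file.csv' is a placeholder path.
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE MYTABLE
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(name, email)
SET company_adrs = '44 company st';  -- extra value appended to every imported row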

Load csv into MySQL, using a new table

I am using phpMyAdmin to load a CSV file into a MySQL table, but does the table have to be created before the load?
I have read that if MySQL Workbench is used (Import a CSV file into MySQL workbench into a new table dynamically), the latest version creates the table on the fly, but is this possible with phpMyAdmin?
Here is the code that doesn't work:
load data local infile '/Users/...'
into table skin_cutaneous_melanoma
fields terminated by ','
lines terminated by '\n'
Error is:
#1146 - Table 'hospital.skin_cutaneous_melanoma' doesn't exist
Sure, phpMyAdmin can do this.
From the main page, look for the Import tab. That is how you would import any file when the database and table already exist; if they don't, phpMyAdmin creates a temporary database and table for you, using generic names and best guesses for the column data types. Note that this is probably not the ideal structure, just a best guess based on the existing data. For best results, put the desired column names in the first row of your CSV file and select the corresponding option during import, or rename the columns afterwards (likewise the database and table names). But it is absolutely possible to import into a new table.
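If you would rather stick with the LOAD DATA statement from the question, the table does have to exist first. A minimal sketch, where the column names and types are placeholders since the real CSV layout isn't shown:
-- Sketch only: replace the placeholder columns with the actual CSV layout.
CREATE TABLE skin_cutaneous_melanoma (
  patient_id VARCHAR(32),
  diagnosis  VARCHAR(255),
  age        INT
);

load data local infile '/Users/...'
into table skin_cutaneous_melanoma
fields terminated by ','
lines terminated by '\n';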

check if row is in table in php

I have to write a script that updates some MySQL tables. For this purpose I am provided with a .dbf file that contains the up-to-date data. So what I am doing is:
1. Convert the .dbf file to a .sql file (using this script by xtranophilist)
2. Extract the MySQL statements from the .sql file and execute them (creating a MySQL table "temp" and filling it with data)
3. Get the data from the freshly created table ("temp") where column tablenr = '1' and check for each row whether it exists in another table ("data_side1")
Steps 1 and 2 are working so far; now I am wondering how to do step 3:
How do you check via PHP whether a row exists in a MySQL table? And what is the best way to do so?
You can use this:
SELECT EXISTS(SELECT 1 FROM data_side1 WHERE ...)
For more details please check http://dev.mysql.com/doc/refman/5.7/en/exists-and-not-exists-subqueries.html
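A minimal PHP sketch of that check; the connection details and the id column used for matching are assumptions, since the real key column isn't given:
<?php
// Sketch only: 'id' stands in for whatever column identifies a row.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$id = 42; // value taken from the current "temp" row
$stmt = $db->prepare('SELECT EXISTS(SELECT 1 FROM data_side1 WHERE id = ?)');
$stmt->bind_param('i', $id);
$stmt->execute();
$stmt->bind_result($exists);
$stmt->fetch();

if ($exists) {
    // the row is already present in data_side1
}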

On the fly anonymisation of a MySQL dump

I am using mysqldump to create DB dumps of the live application to be used by developers.
This data contains customer data. I want to anonymize this data, i.e. remove customer names / credit card data.
An option would be:
create a copy of the database (create a dump and import it)
fire SQL queries that anonymize the data
dump the new database
But this has too much overhead.
A better solution would be to do the anonymization during dump creation.
I guess I would end up parsing all the mysqldump output? Are there any smarter solutions?
You can try Myanon: https://myanon.io
Anonymization is done on the fly during dump:
mysqldump | myanon -f db.conf | gzip > anon.sql.gz
Why are you selecting from your tables if you want to randomize the data?
Do a mysqldump of the tables that are safe to dump (configuration tables, etc) with data, and a mysqldump of your sensitive tables with structure only.
Then, in your application, you can construct the INSERT statements for the sensitive tables based on your randomly created data.
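A rough PHP sketch of that idea, with a hypothetical customers table and made-up placeholder values:
<?php
// Sketch only: the table name, columns and row count are hypothetical.
$out = fopen('customers_anonymized.sql', 'w');

for ($i = 1; $i <= 1000; $i++) {
    $name  = "Customer $i";
    $email = "customer$i@example.com";
    $card  = '0000-0000-0000-0000';
    fwrite($out, "INSERT INTO customers (id, name, email, credit_card) " .
                 "VALUES ($i, '$name', '$email', '$card');\n");
}
fclose($out);
// Load the structure-only dump first, then this file, into the developer database.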
I had to develop something similar a few days ago. I couldn't use INTO OUTFILE because the DB is on AWS RDS. I ended up with this approach:
Dump data in tabular text form from some table:
mysql -B -e 'SELECT `address`.`id`, "address1", "address2", "address3", "town", "00000000000" as `contact_number`, "example@example.com" as `email` FROM `address`' some_db > addresses.txt
And then to import it:
mysql --local-infile=1 -e "LOAD DATA LOCAL INFILE 'addresses.txt' INTO TABLE \`address\` FIELDS TERMINATED BY '\t' ENCLOSED BY '\"' IGNORE 1 LINES" some_db
Only the mysql command is required to do this.
While the export is pretty quick (a couple of seconds for ~30,000 rows), the import process is a bit slower, but still fine. I had to join a few tables on the way and there were some foreign keys, so it will surely be faster if you don't need that. Also, disabling foreign key checks while importing will speed things up.
You could do a SELECT on each table (and not a SELECT *), specify the columns you want to keep and omit or blank out those you don't, and then use the export option of phpMyAdmin for each query.
You can also use the SELECT ... INTO OUTFILE syntax from a SELECT query to make a dump with a column filter.
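For example, a small sketch of such a column-filtered export, using a hypothetical customers table and blanking the sensitive columns:
-- Sketch only: the customers table and its columns are placeholders.
SELECT id,
       'anonymous' AS name,
       'user@example.com' AS email,
       created_at
INTO OUTFILE '/tmp/customers_anonymized.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM customers;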
I found two similar questions, but it looks like there is no easy solution for what you want. You will have to write a custom export yourself:
MySQL dump by query
MySQL: Dump a database from a SQL query
phpMyAdmin provides an export option to the SQL format based on SQL queries. It might be an option to extract this code from phpMyAdmin (which is probably well tested) and use it in this application.
Refer to the phpMyAdmin export plugin - exportData method for the code.

Update MySQL Table using CSV file

My current MySQL table employee_data has 13k rows with 17 columns. The data in the table came from a CSV file, Employees.csv. After importing the CSV data I added a new column, 'password' (so it is not in the CSV file); the password is edited and accessed via a web portal. I now have an updated CSV file and I want to update my main table with that data, but I don't want to lose my password info.
Should I import the new CSV file into a temp table in my database and somehow compare them? I am not sure where to start and I am open to recommendations.
I am now realizing I should have kept my password info in a separate table. Doh!
I guess I could create a PHP file that compares each row based on the employee_id field, but with 13k rows I am afraid it might time out.
I would do it like this :
Create a temp table using the CREATE TABLE new_tbl LIKE orig_tbl; syntax
Use LOAD DATA INFILE to import the data from the CSV into that table
Use UPDATE to update the primary table via a primary key / unique column (perhaps employee_id), as sketched below
I have worked with tables containing 120 million rows and imported CSV files containing 30 million rows into them; this is the method I use all of the time, and it is much more efficient than anything in PHP (and that's my server-side language of choice).
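A hedged sketch of those three steps, assuming the CSV has a header row; employee_id is taken from the question, the file path and the other column names are placeholders:
-- Step 1: temp table with the same structure as the main table
CREATE TABLE employee_data_tmp LIKE employee_data;
ALTER TABLE employee_data_tmp DROP COLUMN password; -- the CSV has no password column

-- Step 2: load the updated CSV into the temp table
LOAD DATA LOCAL INFILE '/path/to/Employees.csv'
INTO TABLE employee_data_tmp
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- Step 3: update the main table via the unique key; the password column is left untouched
UPDATE employee_data e
JOIN employee_data_tmp t ON t.employee_id = e.employee_id
SET e.first_name = t.first_name,
    e.last_name  = t.last_name,
    e.department = t.department;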
Try tools other than PHP-based ones like phpMyAdmin.
MySQL Workbench is a great tool.
Depending on your connection, it will take a while to query the database with your data.
There are workarounds for the PHP timeout limit:
set_time_limit();
