Update MySQL Table using CSV file - php

My current MySQL table employee_data has 13k rows with 17 columns. The data in the table came from a CSV file, Employees.csv. After importing the CSV data I added a new column, 'password' (so it's not in the CSV file); passwords are edited and accessed via a web portal. I now have an updated CSV file and I want to update my main table with that data, but I don't want to lose my password info.
Should I import my new CSV file into a temp table in my database and somehow compare them? I am not sure where to start and I am open to recommendations.
I am now realizing I should have kept my password info in a separate table. Doh!
I guess I could create a PHP file that compares each row based on the employee_id field, but with 13k rows I am afraid it would time out.

I would do it like this:
Create a temp table using the CREATE TABLE new_tbl LIKE orig_tbl; syntax
Use LOAD DATA INFILE to import the data from the CSV into that table
Use UPDATE with a join on a primary key / unique column (perhaps employee_id) to update the primary table
I have worked with tables containing 120 million rows and imported CSV files containing 30 million rows into them - this is the method I use all of the time - much more efficient than anything in PHP (and that's my server-side language of choice).
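A minimal sketch of those three steps, assuming employee_id is unique and the CSV columns match employee_data minus the password column (the file path and the first_name/last_name columns are placeholders for your real ones):

CREATE TABLE employee_staging LIKE employee_data;
ALTER TABLE employee_staging DROP COLUMN password;  -- the CSV has no password column

LOAD DATA INFILE '/path/to/Employees.csv'
INTO TABLE employee_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row, if the file has one

UPDATE employee_data e
JOIN employee_staging s ON s.employee_id = e.employee_id
SET e.first_name = s.first_name,
    e.last_name  = s.last_name;  -- list every CSV-sourced column; password is never touched

DROP TABLE employee_staging;

One joined UPDATE like this handles all 13k rows in a single statement, so there is no per-row PHP loop to time out.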

Try tools other than PHP-based ones like phpMyAdmin.
MySQL Workbench is a great tool;
depending on your connection, it may take a while to query the database with your data.
There are also workarounds for the PHP timeout limit, e.g.
set_time_limit(0); // 0 removes the execution time limit for the current script

Related

PHP times out on importing data into mysql type InnoDB database

I am working on a PHP/MySQL program that needs its own MySQL table, and I want to include a sample database for testing/learning purposes.
I want to use a PHP installation script to automate creating the MySQL table and inserting the sample data.
The latest versions of MySQL default the engine type to InnoDB, and I can successfully create the MySQL table using PHP - which defaults to the InnoDB type.
The problem comes when I try to import the sample database (from a CSV file) - out of 1800 records only 500 are imported before PHP times out.
I have come up with a possible solution.
Create a MySQL table with the MyISAM engine - using CREATE TABLE $table_name ...... ENGINE=MyISAM
Import the records from the CSV file into the MyISAM table - using INSERT INTO $table_name .......
Finally, change the table's engine from MyISAM to InnoDB - using ALTER TABLE $table_name ENGINE = InnoDB
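As a sketch, with a placeholder table and columns (the ...... elisions above stand for your real column definitions and rows):

CREATE TABLE sample_data (id INT, name VARCHAR(100)) ENGINE=MyISAM;
INSERT INTO sample_data (id, name) VALUES (1, 'Alice'), (2, 'Bob');  -- repeated for each batch of records
ALTER TABLE sample_data ENGINE=InnoDB;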
This three-step process works MUCH faster and completes well before the PHP script times out.
I have checked the InnoDB table and data using phpMyAdmin and all appears to be OK.
Can anyone find fault with this method, and if so, can you offer an easy solution?
The processing would be even faster if you did not do so much work.
LOAD DATA INFILE ...
will load the entire CSV file in one step, without your having to open + read + parse + INSERT each row one by one.
If you need to manipulate any of the columns, then these steps are more general, yet still much faster than either of your methods:
CREATE TABLE tmp ... ENGINE=CSV; -- and point to your file
INSERT INTO real_table
SELECT ... the columns, suitably manipulated in SQL
FROM tmp;
No loop, no OPEN, no parsing.
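A hedged sketch of that CSV-engine route, with placeholder table and column names. Note the CSV storage engine requires NOT NULL columns and no indexes, and "pointing" at your file means placing it where the engine keeps the table's data file:

CREATE TABLE tmp (
  id   INT NOT NULL,
  name VARCHAR(100) NOT NULL
) ENGINE=CSV;
-- Copy your CSV over tmp.CSV in the database's data directory, then:
FLUSH TABLE tmp;
INSERT INTO real_table (id, name)
SELECT id, UPPER(name)  -- any per-column manipulation happens here in SQL
FROM tmp;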
This happens on typical Apache/PHP/MySQL installations. You need to raise PHP's max execution time (and upload limits) so PHP can load large files into MySQL.
I would recommend you carefully study the php.ini file and understand how these limits are controlled in the backend.
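For reference, these are the usual php.ini directives involved (the values shown are illustrative, not recommendations):

; php.ini
max_execution_time = 300   ; seconds a script may run
upload_max_filesize = 64M  ; largest single uploaded file
post_max_size = 64M        ; must be >= upload_max_filesize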

Load csv into MySQL, using a new table

I am using phpMyAdmin to load a CSV file into a MySQL table, but does the table have to be created before the load?
I have read that if Workbench is used (Import a CSV file into MySQL workbench into a new table dynamically) the latest version creates the table on the fly, but is this possible with phpMyAdmin?
Here is the code that doesn't work:
load data local infile '/Users/...'
into table skin_cutaneous_melanoma
fields terminated by ','
lines terminated by '\n'
Error is:
#1146 - Table 'hospital.skin_cutaneous_melanoma' doesn't exist
Sure, phpMyAdmin can do this.
From the main page, look for the Import tab. This is, of course, how you would import any file when the database and table already exist - but if they don't, phpMyAdmin creates a temporary database and table for you, using generic names and best guesses as to the data types for your columns.
Note that this is probably not the ideal structure; it is only a best guess based on the data in your file. For best results, put the desired column names at the top of your CSV file and select the corresponding option during import, or rename the columns after import (likewise with the database and table names). But it is absolutely possible to import to a new table.
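Alternatively, create the table yourself first and the original statement works. The column definitions below are placeholder guesses; match them to your CSV:

CREATE TABLE hospital.skin_cutaneous_melanoma (
  patient_id INT,
  diagnosis  VARCHAR(255)
);

load data local infile '/Users/...'
into table skin_cutaneous_melanoma
fields terminated by ','
lines terminated by '\n';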

Swap Column of Data in phpMyAdmin

I have a database table with 6 columns and 365 rows of data. I need to swap out the 3rd column (named 'Date_line') with new data while leaving the other 5 columns in place, without exporting the whole table, but I can't get phpMyAdmin to work with me.
Normally I'd just truncate the table and upload a revised CSV file for the whole table, but here's the catch: I have to update 232 data tables with this same exact column of data (the column data is common to all 232 tables). To do all 232 individually would mean exporting each table, opening it in Excel, swapping the old column for the new one, converting to CSV, then re-uploading. It would be a lot easier if I could just import a single-column CSV to overwrite the old one. But I don't know how.
I'd like to do this using the phpMyAdmin interface... I'm not very experienced with scripts. Is there a way?
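One possible route, borrowing the staging-table pattern used in the answers above (the table names, row key, and file path are all placeholders): load the new column into a one-off table once, then update each of the 232 tables with a join. The statements can be run from phpMyAdmin's SQL tab:

CREATE TABLE date_line_new (
  row_id    INT PRIMARY KEY,  -- whatever uniquely identifies each of the 365 rows
  Date_line DATE
);
LOAD DATA INFILE '/path/to/new_column.csv'
INTO TABLE date_line_new
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

UPDATE table_001 t
JOIN date_line_new n ON n.row_id = t.row_id
SET t.Date_line = n.Date_line;
-- repeat (or generate) this UPDATE for each of the 232 tables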

Big Data: Handling SQL Insert/Update or Merge best line by line or by CSV?

So basically I have a bunch of 1 GB data files (compressed) - just text files containing JSON data with timestamps and other stuff.
I will be using PHP code to insert this data into a MySQL database.
I will not be able to store these text files in memory! Therefore I have to process each data file line by line. To do this I am using stream_get_line().
Some of the data contained will be updates, some will be inserts.
Question
Would it be faster to use INSERT / SELECT / UPDATE statements, or to create a CSV file and import it that way?
That is, create a file that's a bulk operation and then execute it from SQL?
I basically need to insert data whose primary key doesn't exist, and update fields on data whose primary key does exist. But I will be doing this in LARGE quantities.
Performance is always an issue.
Update
The table has 22,000 columns, and only say 10-20 of them do not contain 0.
I would load all of the data into a temporary table and let MySQL do the heavy lifting.
Create the temporary table by doing create table temp_table as select * from live_table where 1=0;
Read the file and create a data product that is compatible with loading via load data infile.
Load the data into the temporary table and add an index for your primary key.
Next, isolate your updates with an inner join between the live table and the temporary table, and walk through them.
Remove all of the processed updates from the temporary table (again using an inner join between it and the live table).
Process all of the inserts with a simple insert into live_table select * from temp_table;
Drop the temporary table, go home, and have a frosty beverage.
This may be oversimplified for your use case, but with a little tweaking it should work a treat.
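Putting the steps together as a sketch (id and the col_* names are placeholders for your real key and columns):

CREATE TABLE temp_table AS SELECT * FROM live_table WHERE 1=0;

LOAD DATA INFILE '/path/to/batch.csv'
INTO TABLE temp_table
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

ALTER TABLE temp_table ADD PRIMARY KEY (id);

-- rows whose key already exists in the live table are updates
UPDATE live_table l
JOIN temp_table t ON t.id = l.id
SET l.col_a = t.col_a, l.col_b = t.col_b;

-- remove the processed updates from staging
DELETE t FROM temp_table t JOIN live_table l ON l.id = t.id;

-- what remains are pure inserts
INSERT INTO live_table SELECT * FROM temp_table;

DROP TABLE temp_table;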

Importing a specific column into phpmyadmin database

I have been given an Excel document which contains information about different embassies, their contact numbers, etc. Most of this information is in the database already, but I need to extract one column from this file (emails) and insert it into a specific column (emails) in the database.
Is this possible, or can you only import an exact copy of what you want in the database?
Thanks for any help
Export the table from phpMyAdmin as CSV for Excel.
Open it in Excel and add the desired column.
Save.
Export the table again from phpMyAdmin as .sql.
Copy the table creation statement (not the data).
Delete the table in phpMyAdmin.
Run the table creation query so you now have a blank table.
Import the Excel CSV.
IMPORTANT: back up your database first.
You may want to write a script with php-excel-reader which goes through every row in the file and updates the corresponding database row. I don't think you can import a file "selectively" in phpMyAdmin.
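If a PHP script feels heavy, the staging-table pattern from earlier on this page works here too (all names and the file path are placeholders): save just the embassy id and email columns from Excel as a two-column CSV, then:

CREATE TABLE email_staging (
  embassy_id INT PRIMARY KEY,
  email      VARCHAR(255)
);
LOAD DATA INFILE '/path/to/emails.csv'
INTO TABLE email_staging
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
IGNORE 1 LINES;

UPDATE embassies e
JOIN email_staging s ON s.embassy_id = e.id
SET e.emails = s.email;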
