I asked a question yesterday that was unclear, and I've now expanded it slightly. In short, this project calls for a simple web interface where the user can upload a CSV file (the web page is already built). I've modified my PHP for a test file, but my situation calls for something different. Every day, the user will upload 1 to 5 different CSV reports. These reports have about 110 fields/columns, though not every field will be filled in every report. I've created a database with 5 tables, each covering a different subset of the 110 fields. For instance, one table holds info on the water meters (25 fields) and another holds info on the tests done on the meters (45 fields). I'm having a hard time finding a way to take the uploaded CSV and split its data into the different tables. I've heard of putting the whole CSV into one staging table and splitting from there with INSERT statements, but I have questions about that:
Is there a way to put a CSV with 110 fields into one table without creating all the fields by hand? Or would I have to create 110 columns in MySQL Workbench and then create a variable for each in PHP?
If not, would I be able to declare variables from the table dump so that the right data then goes into its correct table?
I'm not as familiar with CSVs uploaded like this (usually I'm just pulling a CSV from a folder with a known file name), so that's where my confusion is coming from. Here is the PHP I've used as a simple test with only 10 columns. This was done to make sure the CSV upload works, which it does.
<?php
$server = "localhost";
$user = "root";
$pw = "root";
$db = "uwstest";

$connect = mysqli_connect($server, $user, $pw, $db);
if ($connect->connect_error) {
    die("Connection failed: " . $connect->connect_error);
}

if (isset($_POST['submit'])) {
    $file = $_FILES['file']['tmp_name'];
    $handle = fopen($file, "r");

    // Read the CSV one row at a time and insert each row.
    // Note: the query must run inside the loop; otherwise only
    // the last row ever reaches the database.
    while (($filesop = fgetcsv($handle, 1000, ",")) !== false) {
        $one   = $filesop[0];
        $two   = $filesop[1];
        $three = $filesop[2];
        $four  = $filesop[3];
        $five  = $filesop[4];
        $six   = $filesop[5];
        $seven = $filesop[6];
        $eight = $filesop[7];
        $nine  = $filesop[8];
        $ten   = $filesop[9];

        $sql = "INSERT INTO staging (One, Two, Three, Four, Five, Six, Seven, Eight, Nine, Ten)
                VALUES ('$one','$two','$three','$four','$five','$six','$seven','$eight','$nine','$ten')";

        if ($connect->query($sql) !== TRUE) {
            echo "Error: " . $sql . "<br>" . $connect->error;
        }
    }
    fclose($handle);
    echo "Your database has imported successfully";
}
?>
Depending on CSV size, you might want to consider using MySQL's native CSV import (LOAD DATA INFILE), since it typically runs 10x-100x faster.
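For reference, a minimal sketch of the native approach, assuming a staging table whose column order matches the CSV and that local_infile is enabled on both the server and the client:

<?php
// Sketch: bulk-load an uploaded CSV with LOAD DATA LOCAL INFILE.
$mysqli = mysqli_init();
mysqli_options($mysqli, MYSQLI_OPT_LOCAL_INFILE, true);
mysqli_real_connect($mysqli, "localhost", "root", "root", "uwstest");

$path = $mysqli->real_escape_string($_FILES['file']['tmp_name']);
$sql = "LOAD DATA LOCAL INFILE '$path'
        INTO TABLE staging
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES"; // skip the header row

if (!$mysqli->query($sql)) {
    echo "Import failed: " . $mysqli->error;
}
?>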
If you do insist on importing row by row, then you can do something like this with PDO (or adapt it to mysqli).
If you want to match columns, then either store your CSV rows as associative arrays, or parse the first row and store it in an array like $cols. In the code below, $result is an associative array that holds one row of the CSV as column_name => column_value:
$cols = implode(',', array_keys($result));      // e.g. "name,serial,reading"
$vals = ':' . str_replace(",", ",:", $cols);    // e.g. ":name,:serial,:reading"
$inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");

// Re-key the row so each column becomes a named placeholder.
foreach ($result as $k => $v) {
    $result[':' . $k] = is_null($v) ? null : utf8_encode($v);
    unset($result[$k]);
}
$inserter->execute($result);
Hope this helps. I suggest going with PDO, just to avoid all kinds of weirdness that you may encounter in CSV data. This is how I would create the columns/values:
$is_first = true;
$cols = '';
$vals = '';
$cols_array = array();
while (($csv = fgetcsv($handle)) !== false) {
    if ($is_first) {
        // First row holds the column names; build the statement from it.
        $cols_array = $csv;
        $cols = implode(',', $csv);
        $vals = ':' . str_replace(",", ",:", $cols);
        $inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");
        $is_first = false;
        continue;
    }
    // Map each value to its named placeholder. (The original looped over
    // an undefined $result here; it has to be built from $csv.)
    $result = array();
    foreach ($csv as $k => $v) {
        $result[':' . $cols_array[$k]] = is_null($v) ? null : utf8_encode($v);
    }
    $inserter->execute($result);
}
Here is the code that I use for CSV imports:
$file = 'data/data.csv';
$path = realpath(dirname(__FILE__));
$full_path = $path . "/../../$file";

// Read the header row so the column list can be passed to LOAD DATA.
$handle = fopen($file, "r");
$headers = fgetcsv($handle, 10000, ",");
fclose($handle);

$alt_query = 'LOAD DATA LOCAL INFILE \'' . $full_path . '\' INTO TABLE mytable
    FIELDS TERMINATED BY \',\'
    ENCLOSED BY \'"\'
    LINES TERMINATED BY \'\r\n\'
    IGNORE 1 LINES
    (' . implode(',', $headers) . ')';

// Hand the statement to the mysql CLI; LOCAL INFILE must be enabled.
echo exec("mysql -e \"USE mydb;$alt_query;\"", $output, $code);
Assuming the relation between the tables and the CSV is arbitrary but uniform from now on, you just need to establish that correspondence (array index -> table column) once.
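To make that concrete, here is a minimal sketch of the splitting step, assuming a PDO connection in $pdo and hypothetical table names and column positions (the maps would be written once to match the real report layout):

<?php
// One map per destination table: table column => CSV column index.
// Table names and indexes here are placeholders.
$maps = array(
    'meters' => array('serial_no' => 0, 'size' => 1, 'install_date' => 2),
    'tests'  => array('serial_no' => 0, 'test_date' => 40, 'accuracy' => 41),
);

// Prepare one INSERT per table, outside the row loop.
$stmts = array();
foreach ($maps as $table => $map) {
    $cols = implode(',', array_keys($map));
    $vals = ':' . implode(',:', array_keys($map));
    $stmts[$table] = $pdo->prepare("INSERT INTO $table ($cols) VALUES ($vals)");
}

$handle = fopen($_FILES['file']['tmp_name'], 'r');
fgetcsv($handle); // skip the header row

while (($row = fgetcsv($handle)) !== false) {
    foreach ($maps as $table => $map) {
        $params = array();
        foreach ($map as $col => $idx) {
            $params[':' . $col] = isset($row[$idx]) ? $row[$idx] : null;
        }
        $stmts[$table]->execute($params);
    }
}
fclose($handle);
?>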
Related
I'm embarrassed because this should be a pretty simple task, but I can't figure out why it is not working. I'm using a tab-separated file to get values I need to populate a MySQL table. I have 2 MySQL tables, clients and data. The clients table has an ID I need to fetch and use in the insert into the data table.
<?php
// MySQL settings
define('DB_SERVER', 'localhost');
define('DB_USERNAME', 'USER');
define('DB_PASSWORD', 'pass');
define('DB_DATABASE', 'DB');

// connect to DB
if (!($db = mysqli_connect(DB_SERVER, DB_USERNAME, DB_PASSWORD, DB_DATABASE))) {
    echo 'Connection to DB failed';
    die();
}

// load tab-delimited file (TSV, despite the .csv extension)
$file = "file.csv";
$read = file_get_contents($file);
$lines = explode("\n", $read);
$i = 0;

// loop over the file, one line at a time
foreach ($lines as $key => $value) {
    $cols[$i] = explode("\t", $value);

    // get the client ID for this URL
    // $cols[$i][6] stores website URLs that match `salesurl` in the clients table
    $getidsql = 'SELECT `id` FROM DB.clients WHERE `salesurl` = \'' . $cols[$i][6] . '\'';
    if ($result = mysqli_query($db, $getidsql)) {
        $totalcnt = mysqli_num_rows($result);
        $idrow = mysqli_fetch_array($result);
        echo '<h1>:' . $idrow['id'] . ': ' . $totalcnt . '</h1>'; // prints ':: 0'
    } else {
        echo mysqli_error($db);
        echo 'OOPS<hr>' . $getidsql . '<hr>';
    }

    // if $idrow['id'] actually had a value, then
    $insertqry = 'INSERT INTO `data` ......';
    $i++;
} // end foreach, file line loop
?>
The $getidsql query does work when copy-pasted into phpMyAdmin (I get the id result), but within this script mysqli_num_rows is ALWAYS zero and $idrow is never populated; no errors, but no result (well, an empty result).
Turns out my code was working fine. My problem was with the data file I was working with. All the data had non-printable characters in it; in fact, each character was followed by a non-ASCII character. Running this preg_replace prior to using the data in my query solved the problem.
$data[$c] = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F\x80-\x9F]/u', '', $data[$c]);
I'm looping through a CSV to insert/update the name field of some records in a table. The script is meant to insert the record and, if it exists, only update the name field.
It's taking quite some time for larger CSV files, so I was wondering if this code could be modified into a multiple-row INSERT query with an ON DUPLICATE KEY UPDATE clause that only updates the name field of the record.
The CSV does NOT contain all the fields for the table, only the ones for the primary key and the name. For that reason, REPLACE will not work in this case.
if (($handle = fopen($_FILES['csv']['tmp_name'], "r")) !== FALSE) {
    $row = 0; // was used uninitialized in the original
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // split "areacode-exchange-number" into its parts
        $explode = explode('-', $data[0]);
        $areacode = $explode[0];
        $exchange = $explode[1];
        $number = $explode[2];

        $update = "INSERT INTO ".TBLPREFIX."numbers SET
            area_code = ".$areacode.",
            exchange = ".$exchange.",
            number = ".$number.",
            status = 1,
            name = '".escape($data[1])."'
            ON DUPLICATE KEY UPDATE name = '".escape($data[1])."'";
        mysql_query($update) or die(mysql_error());
        $row++;
    }
    fclose($handle);
    $content .= success($row.' numbers have been imported.');
}
Open a transaction before you start inserting and commit it after you are done. This way the database can optimize the write operations on disk, because they take place in a separate memory space. Without a transaction, every single query is auto-committed immediately and becomes effective for every other query.
At least I hope you are using InnoDB as the storage engine; MyISAM does not support transactions and has other significant drawbacks, so you should avoid it if possible.
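Applied to the loop above, a minimal sketch looks like this (keeping the legacy mysql_* API from the question; with PDO you would use beginTransaction()/commit() instead):

mysql_query("SET autocommit = 0");
mysql_query("START TRANSACTION");

while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
    // ... build and run $update exactly as before ...
    mysql_query($update) or die(mysql_error());
    $row++;
}

// One commit for the whole batch instead of one per row.
mysql_query("COMMIT");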
So I have a flatfile db in the format of
username:$SHA$1010101010101010$010110010101010010101010100101010101001010:255.255.255.255:1342078265214
Each record on a new line... about 5000+ lines. I want to import it into a MySQL table. Normally I'd do this using phpMyAdmin and "file import", but now I want to automate the process by using PHP to download the db via FTP, clean up the existing table data, and upload the updated db.
id (AUTO_INCREMENT) | username | password | ip | lastlogin
The script I've got below for the most part works, although PHP will generate an error:
"PHP Fatal error: Maximum execution time of 30 seconds exceeded". I believe I could just increase this limit, but on a remote server I doubt I'll be allowed to, so I need to find a better way of doing this.
Only about 1000 records will get inserted into the database before that timeout...
The code I'm using is below. I will say right now that I'm not a pro in PHP and this was mainly gathered up and cobbled together. I'm looking for some help to make this more efficient, as I've heard that doing an insert like this is just bad. And it really sounds bad as well: there's a lot of disk scratching when I run this script on my local PC. I mean, why does it want to kill the HDD for such a seemingly simple task?
<?php
require ('Connections/local.php');
$wx = array_map('trim',file("auths.db"));
$username = array();
$password = array();
$ip = array();
$lastlogin = array();
foreach($wx as $i => $line) {
$tmp = array_filter(explode(':',$line));
$username[$i] = $tmp[0];
$password[$i] = $tmp[1];
$ip[$i] = $tmp[2];
$lastlogin[$i] = $tmp[3];
mysql_query("INSERT INTO authdb (username,password,ip,lastlogin) VALUES('$username[$i]', '$password[$i]', '$ip[$i]', '$lastlogin[$i]') ") or die(mysql_error());
}
?>
Try this, with bound parameters and PDO.
<?php
require ('Connections/local.php');

$wx = array_map('trim', file("auths.db"));

try {
    // Note: don't reuse $ip for the DSN host; the original code also
    // initialized $ip as an array, which clobbered the hostname.
    // $dbHost, $database, $dbUsername and $dbPassword are assumed to
    // come from Connections/local.php.
    $dbh = new PDO("mysql:host=$dbHost;dbname=$database", $dbUsername, $dbPassword);
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
} catch (PDOException $e) {
    die('ERROR: ' . $e->getMessage());
}

set_time_limit(0); // once before the loop is enough

// Prepare once, execute per row.
$statement = $dbh->prepare("INSERT INTO authdb (username,password,ip,lastlogin)
                            VALUES(:username, :password, :ip, :lastlogin)");

foreach ($wx as $line) {
    $tmp = explode(':', $line);
    $statement->execute(array(":username"  => $tmp[0],
                              ":password"  => $tmp[1],
                              ":ip"        => $tmp[2],
                              ":lastlogin" => $tmp[3]));
}
?>
Instead of sending queries to the server one by one in the form
insert into table (x,y,z) values (1,2,3)
you should use the extended insert syntax, as in:
insert into table (x,y,z) values (1,2,3),(4,5,6),(7,8,9),...
This will increase insert performance by miles. However, you need to be careful about how many rows you insert in one statement, since there is a limit to how large a single SQL statement can be (max_allowed_packet). So I'd say start with packs of 100 rows and see how it goes, then adjust the pack size accordingly. Chances are your insert time will drop to something like 5 seconds, putting it well under the max_execution_time limit.
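As a rough sketch of that batching idea (assuming the PDO connection $dbh and the $wx array from the answer above; the pack size of 100 is just a starting point):

$batchSize = 100;
$batch = array();

foreach ($wx as $line) {
    $batch[] = explode(':', $line);
    if (count($batch) >= $batchSize) {
        insertBatch($dbh, $batch);
        $batch = array();
    }
}
if ($batch) {
    insertBatch($dbh, $batch); // flush the remainder
}

// Build one extended INSERT with a placeholder group per row.
function insertBatch(PDO $dbh, array $rows)
{
    $groups = array();
    $params = array();
    foreach ($rows as $row) {
        $groups[] = '(?,?,?,?)';
        $params[] = $row[0]; // username
        $params[] = $row[1]; // password
        $params[] = $row[2]; // ip
        $params[] = $row[3]; // lastlogin
    }
    $sql = "INSERT INTO authdb (username,password,ip,lastlogin) VALUES "
         . implode(',', $groups);
    $dbh->prepare($sql)->execute($params);
}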
I currently have a relatively large HTML form (100+ fields). I want to take the data from that form and upload it to a MySQL database when the user hits submit. I have created the PHP code below and have been slowly adding fields, testing to see if the connection is successful. Everything was working through $skilled_nursing, but when I added the next set of values I am no longer successfully creating database entries. All of my echo commands are displayed and I am not getting failures in my error log, but the data is not reaching the database.
Can anyone see what is going wrong? I have checked multiple times for spelling errors but haven't seen any. I am wondering if I am somehow timing out the connection or trying to stick too many values into the execute command.
<?php
echo 'started ok';
// configuration
$dbtype = "mysql";
$dbhost = "localhost";
$dbname = "dbname";
$dbuser = "dbuser";
$dbpass = "userpass";
echo 'variables assigned ok';
// database connection
$conn = new PDO("mysql:host=$dbhost;dbname=$dbname",$dbuser,$dbpass);
echo 'connection established';
// new data
$facility_name = $_POST['facility_name'];
$facility_street = $_POST['facility_street'];
$facility_county = $_POST['facility_county'];
$facility_city = $_POST['facility_city'];
$facility_state = $_POST['facility_state'];
$facility_zipcode = $_POST['facility_zipcode'];
$facility_phone = $_POST['facility_phone'];
$facility_fax = $_POST['facility_fax'];
$facility_licensetype = $_POST['facility_licensetype'];
$facility_licensenumber = $_POST['facility_licensenumber'];
$facility_email = $_POST['facility_email'];
$facility_administrator = $_POST['facility_administrator'];
$skilled_nursing = $_POST['skilled_nursing'];
$independent_living = $_POST['independent_living'];
$assisted_living = $_POST['assisted_living'];
$memory_care = $_POST['memory_care'];
$facility_type_other = $_POST['facility_type_other'];
$care_ratio = $_POST['care_ratio'];
$nurse_ratio = $_POST['nurse_ratio'];
// query
$sql = "INSERT INTO Facilities (facility_name, facility_street, facility_county, facility_city, facility_state, facility_zipcode, facility_phone, facility_fax, facility_licensetype, facility_licensenumber, facility_email, facility_administrator, skilled_nursing, independent_living, assisted_living, memory_care, facility_type_other, care_ratio, nurse_ratio) VALUES (:facility_name, :facility_street, :facility_county, :facility_city, :facility_state, :facility_zipcode, :facility_phone, :facility_fax, :facility_licensetype, :facility_licensenumber, :facility_email, :facility_administrator, :skilled_nursing, :independent_living, :assisted_living, :memory_care, :facility_type_other, :care_ratio, :nurse_ratio)";
$q = $conn->prepare($sql);
$q->execute(array(':facility_name'=>$facility_name,
':facility_street'=>$facility_street,
':facility_county'=>$facility_county,
':facility_city'=>$facility_city,
':facility_state'=>$facility_state,
':facility_zipcode'=>$facility_zipcode,
':facility_phone'=>$facility_phone,
':facility_fax'=>$facility_fax,
':facility_licensetype'=>$facility_licensetype,
':facility_licensenumber'=>$facility_licensenumber,
':facility_email'=>$facility_email,
':facility_administrator'=>$facility_administrator,
':skilled_nursing'=>$skilled_nursing,
':independent_living'=>$independent_living,
':assisted_living'=>$assisted_living,
':memory_care'=>$memory_care,
':facility_type_other'=>$facility_type_other,
':care_ratio'=>$care_ratio,
':nurse_ratio'=>$nurse_ratio));
echo 'query parsed';
?>
This doesn't exactly answer what's going wrong with your code, but it might help solve it.
I would do this a bit differently. You say that you have a lot of fields, so your code is likely to get very long and repetitive. Since it looks like your form field names already correspond with your table columns, I would do something more like this (not tested):
// get a list of column names that exist in the table
$sql = "SELECT column_name FROM information_schema.columns WHERE table_name = 'Facilities'";
$q = $conn->prepare($sql);
$q->execute();
$columns = $q->fetchAll(PDO::FETCH_COLUMN, 0);

// drop any posted field that doesn't exist as a column in the table
foreach ($_POST as $key => $value) {
    if (!in_array($key, $columns)) {
        unset($_POST[$key]);
    }
}

$cols = array_keys($_POST);
$sql = "INSERT INTO Facilities (" . implode(", ", $cols) . ") VALUES (:" . implode(", :", $cols) . ")";
$q = $conn->prepare($sql);

// prefix each key with a colon to match the named placeholders
// (array_walk can't rename keys, so build a new array instead)
$params = array();
foreach ($_POST as $key => $value) {
    $params[':' . $key] = $value;
}
$q->execute($params);
This way, you could have 10, 100, or 1000 fields and this code won't have to change at all. You also reduce the chance of typos because there's only one place where each column name is specified. And you don't have to worry about SQL injection through the column names, because you check that each column exists before allowing it into your query.
This does, of course, assume that all fields passed in via $_POST correspond with column names in your table. If that isn't the case, it may be easiest to store the particular field values that aren't columns in separate variables and unset() them from the $_POST array.
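For example, with a hypothetical submit-button field that isn't a table column:

// 'submit' comes from the form but isn't a column in Facilities
$submitted = isset($_POST['submit']);
unset($_POST['submit']);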
Automatically build a MySQL table upon a CSV file upload.
I have an admin section where the admin can upload CSV files with different column counts and different column names.
The upload should then build a MySQL table in the db by reading the first line to create the columns, and then import the data accordingly.
I am aware of a similar question, but this is different because of the following specs:
The name of the table should be the name of the file (minus the .csv extension)
Each CSV file can be different
It should build a table with the number of columns and the names taken from the CSV file
It should add the data from the second line onward
Maybe there are known frameworks that make this easy.
Thanks.
$file = 'filename.csv';
$table = 'table_name';

// get the structure from the csv and create the table
ini_set('auto_detect_line_endings', TRUE);
$handle = fopen($file, 'r');

// first row: column structure
if (($data = fgetcsv($handle)) === FALSE) {
    echo "Cannot read from csv $file";
    die();
}

$fields = array();
$field_count = 0;
for ($i = 0; $i < count($data); $i++) {
    $f = strtolower(trim($data[$i]));
    if ($f) {
        // normalize the field name, strip to 20 chars if too long
        $f = substr(preg_replace('/[^0-9a-z]/', '_', $f), 0, 20);
        $field_count++;
        $fields[] = $f . ' VARCHAR(50)';
    }
}
$sql = "CREATE TABLE $table (" . implode(', ', $fields) . ')';
echo $sql . "<br /><br />";
// $db->query($sql);

// remaining rows: data
while (($data = fgetcsv($handle)) !== FALSE) {
    $values = array();
    for ($i = 0; $i < $field_count; $i++) {
        // guard against short rows; addslashes is a stopgap, not real escaping
        $values[] = '\'' . addslashes(isset($data[$i]) ? $data[$i] : '') . '\'';
    }
    $sql = "INSERT INTO $table VALUES (" . implode(', ', $values) . ')';
    echo $sql;
    // $db->query($sql);
}
fclose($handle);
ini_set('auto_detect_line_endings', FALSE);
Maybe the fgetcsv function will help you; it gets a line from a file pointer and parses it for CSV fields:
http://php.net/manual/en/function.fgetcsv.php
http://bytes.com/topic/mysql/answers/746696-create-mysql-table-field-headings-line-csv-file has a good example of how to do this.
The second example should put you on the right track. There isn't an automatic way to do it, so you're going to need to do a little programming, but it shouldn't be too hard once you use that code as a starting point.
Building a table is a query like any other, and theoretically you could get the names of your columns from the first row of a CSV file.
However, there are some practical problems:
How would you know what data type a certain column is?
How would you know what the indexes are?
How would you get data out of the table / how would you know what column represents what?
As you can't relate your new table to anything else, you are kind of defeating the purpose of a relational database, so you might as well just keep and use the CSV file.
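That said, if you did want to guess column types instead of defaulting everything to VARCHAR (as the CREATE TABLE example above does), a naive sketch might look like this; it only distinguishes integers, decimals, and text, and is an illustration rather than a complete solution:

// Naive type inference over sample values from one column:
// INT if every value is an integer, DECIMAL if every value is
// numeric, VARCHAR otherwise.
function guessColumnType(array $samples)
{
    $allInt = true;
    $allNumeric = true;
    foreach ($samples as $v) {
        if ($v === '' || $v === null) continue; // ignore blanks
        if (!preg_match('/^-?\d+$/', $v)) $allInt = false;
        if (!is_numeric($v)) $allNumeric = false;
    }
    if ($allInt) return 'INT';
    if ($allNumeric) return 'DECIMAL(12,4)';
    return 'VARCHAR(255)';
}

You would collect a sample of values per column while reading the file, then call this when building the CREATE TABLE statement.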
What you are describing sounds like an ETL tool. Perhaps Google for MySQL ETL tools... You are going to have to decide what OS and style you want.
Or just write your own...