I'm embarrassed because this should be a pretty simple task, but I can't figure out why it is not working. I'm using a tab-separated file to get the values I need to populate a MySQL table. I have two MySQL tables, clients and data. The clients table has an ID I need to fetch and use in the insert into the data table.
<?php
// MySQL settings
define('DB_SERVER', 'localhost');
define('DB_USERNAME', 'USER');
define('DB_PASSWORD', 'pass');
define('DB_DATABASE', 'DB');

// connect to DB
if (!($db = mysqli_connect(DB_SERVER, DB_USERNAME, DB_PASSWORD, DB_DATABASE))) {
    echo 'Connection to DB failed';
    die();
}

// load tab-delimited file
$file = "file.csv"; // TSV actually
$read = file_get_contents($file);
$lines = explode("\n", $read);
$i = 0;

// loop over file, one line at a time
foreach ($lines as $key => $value) {
    $cols[$i] = explode("\t", $value);

    // get order ID for this URL
    // $cols[$i]['6'] stores website URLs that match `salesurl` in the clients table
    $getidsql = 'select `id` FROM DB.clients WHERE `salesurl` = \'' . $cols[$i]['6'] . '\'';
    if ($result = mysqli_query($db, $getidsql)) {
        $totalcnt = mysqli_num_rows($result);
        $idrow = mysqli_fetch_array($result);
        echo '<h1>:' . $idrow['id'] . ': ' . $totalcnt . '</h1>'; // prints ':: 0'
    } else {
        echo mysqli_error($db);
        echo 'OOPS<hr>' . $getidsql . '<hr>';
    }

    // if $idrow['id'] actually had a value, then
    $insertqry = 'INSERT INTO `data` ......';
    $i++;
} // end foreach, file line loop
?>
The $getidsql query does work when copied and pasted into phpMyAdmin (I get the id result), but within this script mysqli_num_rows is ALWAYS zero and $idrow is never populated; no errors, but no result (well, an empty result).
Turns out my code was working fine. My problem was with the data file I was working with: all the data had non-printable characters in it; in fact, each character was followed by a non-ASCII character. Running this preg_replace prior to using the values in my query solved the problem.
$data[$c] = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F\x80-\x9F]/u', '', $data[$c]);
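For reference, here is a minimal sketch of that cleanup applied to every column before the lookup (an assumption about how it slots into the loop above, reusing the same $db connection); a prepared statement also sidesteps the quoting issues of the string-built query:
<?php
// Assumes $db is the mysqli connection from above and file.csv is tab-separated.
$stmt = mysqli_prepare($db, 'SELECT `id` FROM clients WHERE `salesurl` = ?');

foreach (file('file.csv') as $line) {
    $cols = explode("\t", rtrim($line, "\r\n"));

    // strip the non-printable bytes that broke the string comparison
    foreach ($cols as $c => $val) {
        $cols[$c] = preg_replace('/[\x00-\x08\x0B\x0C\x0E-\x1F\x80-\x9F]/u', '', $val);
    }

    mysqli_stmt_bind_param($stmt, 's', $cols[6]);
    mysqli_stmt_execute($stmt);
    $result = mysqli_stmt_get_result($stmt); // requires the mysqlnd driver
    if ($idrow = mysqli_fetch_assoc($result)) {
        echo $idrow['id'];
    }
}
?>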
I have a problem: I'm trying to get some data from a database into a .csv file.
$fn = fopen($path.$filename, "w");
$addstring = file_get_contents($path.$filename);
$addstring = 'Azonosito;Datum;Ido;Leiras;IP-cim;allomasnév;MAC-cim;Felhasznalonev;Tranzakcioazonosito;Lekerdezes eredmenye;Vizsgalat ideje;Korrelacios azonosito;DHCID;';
/*$addstring .= "\n";*/
$sql = "select * from dhcpertekeles.dhcpk";
if ($result = mysqli_query($conn, $sql))
{
    while ($row = mysqli_fetch_row($result))
    {
        $addstring .= "\n".$row[0].";".$row[1].";".$row[2].";".$row[3].";".$row[4].";".$row[5].";".$row[6].";".$row[7].";".$row[8].";".$row[9].";".$row[10].";".$row[11].";".$row[12].";";
    }
}
/*file_put_contents($path.$filename, $addstring);*/
fwrite($fn, $addstring);
fclose($fn);
The data is in the following format:
The first $addstring contains the column names and has no issues.
The second ($addstring .=) contains the data:
ID($row[0]), Date($row[1]), Time($row[2]), Description($row[3]), IP($row[4]), Computer name($row[5]), MAC($row[6]), User($row[7])(empty), Transactionid($row[8]), query result($row[9]), query time($row[10]), correlation id($row[11])(empty), DHCID($row[12])(empty)
It is basically daily DHCP server data uploaded to a database. Now, the code works; it writes everything I want to the csv, but there are 2 problems.
1. For some inexplicable reason, the code inserts an empty row into the csv between the rows that contain the data. Removing $row[12] fixes this. I tried removing special characters, converting spaces into something visible, and even converting the empty string into something visible, yet nothing worked. I even tried file_put_contents (same for the second problem) instead of fwrite, but the same thing keeps happening. If I remove \n it works, but from the second row onward everything is shifted right by one column.
2. For some reason, the last 2 characters are removed from the csv. The string to be inserted into the csv still contains those 2 characters before it is written to the file. Tried both fwrite and file_put_contents.
As for the .csv format, the data columns are divided by ; and the rows by \n.
I also tried reading the file with both LibreOffice and Excel, thinking it might be Excel that was misbehaving, but no.
Try using the fputcsv() function. I didn't test the following code, but I think it should work.
$file = fopen($path . $filename, 'w');

$header = array(
    'Azonosito',
    'Datum',
    'Ido',
    'Leiras',
    'IP-cim',
    'allomasnév',
    'MAC-cim',
    'Felhasznalonev',
    'Tranzakcioazonosito',
    'Lekerdezes eredmenye',
    'Vizsgalat ideje',
    'Korrelacios azonosito',
    'DHCID'
);
fputcsv($file, $header, ';');

$sql = "select * from dhcpertekeles.dhcpk";
if ($result = mysqli_query($conn, $sql)) {
    while ($row = mysqli_fetch_row($result)) {
        fputcsv($file, $row, ';');
    }
}
fclose($file);
The $addstring = file_get_contents($path.$filename) line does nothing, because you're overwriting that variable on the next line.
To remove the extra row caused by $row[12], did you try removing the \n AND the \r with something like:
$row[12] = strtr($row[12], array("\n"=>'', "\r"=>''));
You can also check which ASCII characters you are receiving in $row[12] with this function, taken from the PHP site:
function AsciiToInt($char) {
    $success = "";
    if (strlen($char) == 1) {
        return "char(" . ord($char) . ")";
    } else {
        for ($i = 0; $i < strlen($char); $i++) {
            if ($i == strlen($char) - 1) {
                $success = $success . ord($char[$i]);
            } else {
                $success = $success . ord($char[$i]) . ",";
            }
        }
        return "char(" . $success . ")";
    }
}
Another possibility is that the database is returning UTF-8 or UTF-16 and you're losing some characters in the text file.
Try checking that with the mb_detect_encoding function.
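A quick sketch of that check (assuming $row[12] is the suspect value):
// Detect the encoding of the suspect column; strict mode avoids false positives.
$enc = mb_detect_encoding($row[12], array('UTF-8', 'UTF-16', 'ISO-8859-1'), true);
echo ($enc !== false) ? $enc : 'unknown encoding';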
I asked a question yesterday that was unclear and I've now expanded it slightly. In short, this current project calls for a simple web interface where the user can upload a csv file (this web page is created already). I've modified my PHP for a test file, but my situation calls for something different. Every day, the user will upload 1 to 5 different CSV reports. These reports have about 110 fields/columns, though not all fields will be filled in every report. I've created a database with 5 tables, each table covering different fields out of the 110. For instance, one table holds info on the water meters (25 fields) and another table holds info for the tests done on the meters (45 fields). I'm having a hard time finding a way to take the CSV, once uploaded, and split the data into the different tables. I've heard of putting the whole CSV into one table and splitting it from there with INSERT statements (see the sketch after the code below), but I have questions about that:
Is there a way to put a CSV with 110 fields into one table without having the fields created beforehand? Or would I have to create 110 fields in MySQL Workbench and then create a variable for each in PHP?
If not, would I be able to declare variables from the table dump so that the right data then goes into its correct table?
I'm not as familiar with CSVs uploaded like this (usually I just pull a csv from a folder with a known file name), so that's where my confusion comes from. Here is the PHP I've used as a simple test with only 10 columns. This was done to make sure the CSV upload works, which it does.
<?php
$server = "localhost";
$user = "root";
$pw = "root";
$db = "uwstest";

$connect = mysqli_connect($server, $user, $pw, $db);
if ($connect->connect_error) {
    die("Connection failed: " . $connect->connect_error);
}

if (isset($_POST['submit'])) {
    $file = $_FILES['file']['tmp_name'];
    $handle = fopen($file, "r");
    while (($filesop = fgetcsv($handle, 1000, ",")) !== false) {
        $one = $filesop[0];
        $two = $filesop[1];
        $three = $filesop[2];
        $four = $filesop[3];
        $five = $filesop[4];
        $six = $filesop[5];
        $seven = $filesop[6];
        $eight = $filesop[7];
        $nine = $filesop[8];
        $ten = $filesop[9];
        $sql = "INSERT INTO staging (One, Two, Three, Four, Five, Six, Seven, Eight, Nine, Ten) VALUES ('$one','$two','$three','$four','$five','$six','$seven','$eight','$nine','$ten')";
        // run the insert for every row, not just the last one
        if ($connect->query($sql) !== TRUE) {
            echo "Error: " . $sql . "<br>" . $connect->error;
        }
    }
    echo "Your database has imported successfully";
}
?>
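As for the staging-then-split idea mentioned above: once full rows are in a staging table, the split can be done in SQL alone with INSERT ... SELECT. A minimal sketch, with hypothetical target tables and column mappings (the staging columns follow the test script above):
// Hypothetical split: meter fields go to one table, test fields to another.
$connect->query("INSERT INTO meters (meter_id, serial_no, install_date)
                 SELECT One, Two, Three FROM staging");
$connect->query("INSERT INTO tests (meter_id, test_date, result)
                 SELECT One, Four, Five FROM staging");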
Depending on CSV size, you might want to consider using MySQL's native CSV import (LOAD DATA INFILE), since it runs 10x-100x faster; there's an example at the end of this answer.
If you do insist on importing row by row, then you can do something like this with PDO (or adapt it to mysqli).
If you want to match columns, then either store your csv as an associative array, or parse the first row and store it in an array like $cols.
In this case, $result is an associative array that stores one row of the csv as column_name => column_value.
$cols = implode(',', array_keys($result));
$vals = ':' . str_replace(',', ',:', $cols);
$inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");

// prefix each key with a colon so it matches the named placeholders
foreach ($result as $k => $v) {
    $result[':' . $k] = is_null($v) ? null : utf8_encode($v);
    unset($result[$k]);
}
$inserter->execute($result);
Hope this helps.
I suggest going with PDO just to avoid all kinds of weirdness that you may encounter in CSV data.
This is how I would create columns/vals.
$is_first = true;
$cols = '';
$vals = '';
$cols_array = array();
while (($csv = fgetcsv($handle)) !== false) {
    if ($is_first) {
        // first row: column names; build the INSERT and prepare it once
        $cols_array = $csv;
        $cols = implode(',', $csv);
        $vals = ':' . str_replace(',', ',:', $cols);
        $inserter = $pdo->prepare("INSERT INTO `mydb`.`mytable` ($cols) VALUES ($vals);");
        $is_first = false;
        continue;
    }
    // data row: map each value onto its named placeholder
    $result = array();
    foreach ($csv as $k => $v) {
        $result[':' . $cols_array[$k]] = is_null($v) ? null : utf8_encode($v);
    }
    $inserter->execute($result);
}
Here is the code that I use for CSV imports.
$file = 'data/data.csv';
$handle = fopen($file, "r");
$path = realpath(dirname(__FILE__));
$full_path = $path . "/../../$file";

// read the header line to get the column list
$headers = fgetcsv($handle, 10000, ",");

$alt_query = 'LOAD DATA LOCAL INFILE \'' . $full_path . '\' INTO TABLE mytable
    FIELDS TERMINATED BY \',\'
    ENCLOSED BY \'\"\'
    LINES TERMINATED BY \'\r\n\'
    IGNORE 1 LINES
    (' . implode(',', $headers) . ')';
echo exec("mysql -e \"USE mydb;$alt_query;\"", $output, $code);
Assuming the relation between the tables and the CSV is arbitrary but uniform from now on, you just need to establish that correspondence (array index -> table column) once.
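A minimal sketch of that one-time mapping (the column names here are hypothetical, reusing a prepared statement like the $inserter above):
// Hypothetical mapping: CSV column index => table column name.
$map = array(0 => 'meter_id', 1 => 'test_date', 2 => 'result');

// Reused for every data row $row read with fgetcsv().
$params = array();
foreach ($map as $index => $column) {
    $params[':' . $column] = $row[$index];
}
$inserter->execute($params);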
I have a function in PHP that reads a table over ODBC (from an IBM AS400) and writes it to a text file on a daily basis. It works fine until the file reaches 1GB or so; then it just stops at some row and doesn't write the rest.
function write_data_to_txt($table_new, $query)
{
    global $path_data;
    global $odbc_db, $date2;

    if (!($odbc_rs = odbc_exec($odbc_db, $query))) die("Error executing query $query");
    $num_cols = odbc_num_fields($odbc_rs);

    $path_folder = $path_data . $table_new . "/";
    if (!file_exists($path_folder)) mkdir($path_folder, 0777);
    $filename1 = $path_folder . $table_new . "_" . $date2 . ".txt";

    $comma = "|";
    $newline = chr(13) . chr(10);
    $handle = fopen($filename1, "w+");
    if (is_writable($filename1)) {
        $ctr = 0;
        while (odbc_fetch_row($odbc_rs)) {
            // function for writing all fields
            // for ($i = 1; $i <= $num_cols; $i++) {
            //     $data = odbc_result($odbc_rs, $i);
            //     if (!fwrite($handle, $data) || !fwrite($handle, $comma)) {
            //         print "Cannot write to file ($filename1)";
            //         exit;
            //     }
            // }
            // end of function writing all fields

            $data = odbc_result($odbc_rs, 1);
            fwrite($handle, $ctr . $comma . $data . $newline);
            $ctr++;
        }
        echo "Write Success. Row = $ctr <br><br>";
    } else {
        echo "Write Failed<br><br>";
    }
    fclose($handle);
}
No errors, just the success message, but there should be 3,690,498 rows (and still increasing); I only got roughly 3,670,009 rows.
My query is an ordinary select like:
select field1 , field2, field3 , field4, fieldetc from table1
What I tried and what I assume:
I thought it was an fwrite limitation, so I tried not writing all fields (just $ctr and the first column), but it still stops at the same row, so I assume it's not about exceeding an fwrite limit.
I tried reducing the fields I select, and then it completes! So I assumed there is some limitation in ODBC.
I tried the same ODBC data source with SQL Server, selected all fields, and got the complete rows. So I assume it's not an ODBC limitation.
I even tried on a 64-bit machine, but it was even worse: it returned only roughly 3,145,812 rows. So I assume it's not about 32/64-bit infrastructure.
I tried increasing memory_limit in php.ini to 1024MB, but that didn't work either.
Does anyone know if I need to set something in my PHP ODBC connection?
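Purely as a hedged suggestion (an assumption, not a confirmed fix): PHP's ODBC layer has its own long-field limits, separate from memory_limit, and a long export can also hit the script runtime limit; both are worth ruling out:
// Assumption: the cutoff happens in PHP's ODBC layer or the runtime limit, not in fwrite.
set_time_limit(0);                     // long exports can exceed max_execution_time
ini_set('odbc.defaultlrl', '1048576'); // bytes returned for LONG columns (php.ini directive)
odbc_longreadlen($odbc_rs, 1048576);   // per-result alternative, called after odbc_exec()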
I currently have a relatively large HTML form (100+ fields). I want to take the data from that form and upload it to a MySQL database when the user hits submit. I have created the PHP code below and have been slowly adding fields and testing whether the connection is successful. Everything was working through $skilled_nursing, but when I added the next set of values I am no longer successfully creating database entries. All of my echo commands are displayed and I am not getting failures in my error log, but the data is not arriving in the database.
Can anyone see what is going wrong? I have checked multiple times for spelling errors, but I haven't seen any. I am wondering if I am somehow timing out the connection or trying to stick too many values into the execute command.
<?php
echo 'started ok';
// configuration
$dbtype = "mysql";
$dbhost = "localhost";
$dbname = "dbname";
$dbuser = "dbuser";
$dbpass = "userpass";
echo 'variables assigned ok';
// database connection
$conn = new PDO("mysql:host=$dbhost;dbname=$dbname",$dbuser,$dbpass);
echo 'connection established';
// new data
$facility_name = $_POST['facility_name'];
$facility_street = $_POST['facility_street'];
$facility_county = $_POST['facility_county'];
$facility_city = $_POST['facility_city'];
$facility_state = $_POST['facility_state'];
$facility_zipcode = $_POST['facility_zipcode'];
$facility_phone = $_POST['facility_phone'];
$facility_fax = $_POST['facility_fax'];
$facility_licensetype = $_POST['facility_licensetype'];
$facility_licensenumber = $_POST['facility_licensenumber'];
$facility_email = $_POST['facility_email'];
$facility_administrator = $_POST['facility_administrator'];
$skilled_nursing = $_POST['skilled_nursing'];
$independent_living = $_POST['independent_living'];
$assisted_living = $_POST['assisted_living'];
$memory_care = $_POST['memory_care'];
$facility_type_other = $_POST['facility_type_other'];
$care_ratio = $_POST['care_ratio'];
$nurse_ratio = $_POST['nurse_ratio'];
// query
$sql = "INSERT INTO Facilities (facility_name, facility_street, facility_county, facility_city, facility_state, facility_zipcode, facility_phone, facility_fax, facility_licensetype, facility_licensenumber, facility_email, facility_administrator, skilled_nursing, independent_living, assisted_living, memory_care, facility_type_other, care_ratio, nurse_ratio) VALUES (:facility_name, :facility_street, :facility_county, :facility_city, :facility_state, :facility_zipcode, :facility_phone, :facility_fax, :facility_licensetype, :facility_licensenumber, :facility_email, :facility_administrator, :skilled_nursing, :independent_living, :assisted_living, :memory_care, :facility_type_other, :care_ratio, :nurse_ratio)";
$q = $conn->prepare($sql);
$q->execute(array(':facility_state'=>$facility_name,
':facility_street'=>$facility_street,
':facility_county'=>$facility_county,
':facility_city'=>$facility_city,
':facility_state'=>$facility_state,
':facility_name'=>$facility_name,
':facility_zipcode'=>$facility_zipcode,
':facility_phone'=>$facility_phone,
':facility_fax'=>$facility_fax,
':facility_licensetype'=>$facility_licensetype,
':facility_licensenumber'=>$facility_licensenumber,
':facility_email'=>$facility_email,
':facility_administrator'=>$facility_administrator,
':skilled_nursing'=>$skilled_nursing,
':independent_living'=>$independent_living,
':assisted_living'=>$assisted_living,
':memory_care'=>$memory_care,
':facility_type_other'=>$facility_type_other,
':care_ratio'=>$care_ratio,
':nurse_ratio'=>$nurse_ratio));
echo 'query parsed';
?>
This doesn't exactly answer what's going wrong with your code, but it might help solve it.
I would do this a bit differently. You say that you have a lot of fields. Your code is likely to get very long and repetitive. Since it looks like your form field names already correspond with your table columns, I would do something more like this (not tested):
// get a list of column names that exist in the table
$sql = "SELECT column_name FROM information_schema.columns WHERE table_name = 'Facilities'";
$q = $conn->prepare($sql);
$q->execute();
$columns = $q->fetchAll(PDO::FETCH_COLUMN, 0);
foreach ($_POST as $key => $value)
{
    // if a field is passed in that doesn't exist in the table, remove it
    if (!in_array($key, $columns)) {
        unset($_POST[$key]);
    }
}
$cols = array_keys($_POST);
$sql = "INSERT INTO Facilities (" . implode(", ", $cols) . ") VALUES (:" . implode(", :", $cols) . ")";
$q = $conn->prepare($sql);

// prefix each key with a colon to match the named placeholders
// (array_walk cannot change array keys, so build a new array instead)
$params = array();
foreach ($_POST as $key => $value) {
    $params[":{$key}"] = $value;
}
$q->execute($params);
This way, you could have 10, 100, or 1000 fields and this code won't have to change at all. You also reduce your chance for typo errors because there's only one place where the column name is specified. You don't have to worry about SQL injection on the column names because you check to make sure that the column exists before allowing it to be used in your query.
This does, of course, assume that all fields passed in via $_POST correspond with column names in your table. If this isn't the case, it may be easiest to just store those particular field values that aren't columns in separate variables and unset() them from the $_POST array.
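One more hedged suggestion, given that you see no failures in the error log: PDO is silent by default, so enabling exceptions right after connecting makes failed INSERTs (a mistyped or duplicated placeholder, for example) impossible to miss:
// Throw on any SQL error instead of failing silently.
$conn->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);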
Automatically build a MySQL table upon a CSV file upload.
I have an admin section where an admin can upload CSV files with different column counts and different column names.
It should then build a MySQL table in the db, reading the first line to create the columns, and then import the data accordingly.
I am aware of a similar issue, but this is different because of the following specs.
The name of the table should be the name of the file (minus the extension [.csv])
Each csv file can be different
Should build a table with the number of columns and names from the CSV file
Add the data from the second line onward
Here is a design sketch
Maybe there are known frameworks that make this easy.
Thanks.
$file = 'filename.csv';
$table = 'table_name';

// get structure from csv and insert into db
ini_set('auto_detect_line_endings', TRUE);
$handle = fopen($file, 'r');

// first row: structure
if (($data = fgetcsv($handle)) === FALSE) {
    echo "Cannot read from csv $file";
    die();
}

$fields = array();
$field_count = 0;
for ($i = 0; $i < count($data); $i++) {
    $f = strtolower(trim($data[$i]));
    if ($f) {
        // normalize the field name, strip to 20 chars if too long
        $f = substr(preg_replace('/[^0-9a-z]/', '_', $f), 0, 20);
        $field_count++;
        $fields[] = $f . ' VARCHAR(50)';
    }
}
$sql = "CREATE TABLE $table (" . implode(', ', $fields) . ')';
echo $sql . "<br /><br />";
// $db->query($sql);

while (($data = fgetcsv($handle)) !== FALSE) {
    $fields = array();
    for ($i = 0; $i < $field_count; $i++) {
        $fields[] = '\'' . addslashes($data[$i]) . '\'';
    }
    $sql = "INSERT INTO $table VALUES (" . implode(', ', $fields) . ')';
    echo $sql;
    // $db->query($sql);
}
fclose($handle);
ini_set('auto_detect_line_endings', FALSE);
Maybe this function will help you.
fgetcsv (PHP 4, PHP 5) — gets a line from a file pointer and parses it for CSV fields.
http://php.net/manual/en/function.fgetcsv.php
http://bytes.com/topic/mysql/answers/746696-create-mysql-table-field-headings-line-csv-file has a good example of how to do this.
The second example should put you on the right track; there isn't an automatic way to do it, so you're going to need to do a little programming, but it shouldn't be too hard once you use that code as a starting point.
Building a table is a query like any other, and theoretically you could get the names of your columns from the first row of a csv file.
However, there are some practical problems:
How would you know what data type a certain column is?
How would you know what the indexes are?
How would you get data out of the table / how would you know what column represents what?
As you can't relate your new table to anything else, you are kind of defeating the purpose of a relational database, so you might as well just keep and use the csv file.
What you are describing sounds like an ETL tool. Perhaps Google for MySQL ETL tools...You are going to have to decide what OS and style you want.
Or just write your own...