I have a roughly 350-line CSV file with all sorts of vendors that fall into Clothes, Tools, Entertainment, etc. categories. Using the following code I have been able to print out my CSV file.
<?php
$fp = fopen('promo_catalog_expanded.csv', 'r');

// header row
echo '<tr><td>';
echo implode('</td><td>', fgetcsv($fp, 4096, ','));
echo '</td></tr>';

// data rows
while (!feof($fp)) {
    list($cat, $var, $name, $var2, $web, $var3, $phone, $var4, $kw, $var5, $desc) = fgetcsv($fp, 4096);
    echo '<tr><td>';
    echo $cat . '</td><td>' . $name . '</td><td>' . $web . '</td><td>' . $phone . '</td><td>' . $kw . '</td><td>' . $desc;
    echo '</td></tr>';
}
fclose($fp);
show_source(__FILE__);
?>
The first thing you will probably notice is the extraneous vars within the list(). This is because of how the Excel spreadsheet/CSV file is laid out:
Category,,Company Name,,Website,,Phone,,Keywords,,Description
,,,,,,,,,,
Clothes,,4imprint,,4imprint.com,,877-466-7746,,"polos, jackets, coats, workwear, sweatshirts, hoodies, long sleeve, pullovers, t-shirts, tees, tshirts,",,An embroidery and apparel company based in Wisconsin.
,,Apollo Embroidery,,apolloemb.com,,1-800-982-2146,,"hats, caps, headwear, bags, totes, backpacks, blankets, embroidery",,An embroidery sales company based in California.
One thing to note is that the last line starts with two commas, as it is also listed within the "Clothes" category.
My concern is that I am going about the CSV output wrong.
Should I be using a foreach loop instead of this list way?
Should I first get rid of any unnecessary blank columns?
Please advise on any flaws you may find and improvements I can make, so I am ready to import this data into a MySQL DB.
I'm not sure of the overall structure of your CSV - it's hard to make assumptions based on two lines... but something like the following should work:
$fp = fopen('promo_catalog_expanded.csv', 'r');

// normalize the column names to lower case
$columns = array_map('strtolower', fgetcsv($fp, 0, ','));

$lastCategory = null;
while (false !== ($data = fgetcsv($fp, 0, ','))) {
    $data = array_combine($columns, $data); // make it an assoc array

    // test if category has a value - if it doesn't, use the last category
    if (empty($data['category']) && null !== $lastCategory) {
        $data['category'] = $lastCategory;
    }

    // if we have started a new set of entries for a category, make it the $lastCategory
    if ($lastCategory !== $data['category'] && !empty($data['category'])) {
        $lastCategory = $data['category'];
    }

    // your sql to do the insert
}
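For the "// your sql to do the insert" step, a minimal sketch of what it could look like with a prepared statement (the $pdo connection and the vendors table/column names are assumptions for illustration, not part of the original post; prepare once before the loop and execute inside it):

// Hypothetical target table: vendors(category, company_name, website, phone, keywords, description)
$stmt = $pdo->prepare(
    'INSERT INTO vendors (category, company_name, website, phone, keywords, description)
     VALUES (?, ?, ?, ?, ?, ?)'
);

// inside the while loop, after $data has been normalized:
$stmt->execute([
    $data['category'],
    $data['company name'],   // keys come from the lower-cased CSV header row
    $data['website'],
    $data['phone'],
    $data['keywords'],
    $data['description'],
]);

A prepared statement also sidesteps quoting problems with the comma-heavy keywords and description fields.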
I need to insert my CSV data into MySQL with an insert query. My CSV currently has 9976 rows of data, but after running the query, 9-10 rows are skipped seemingly at random after the first 1000 rows, and it is very hard to find out which rows were skipped.
First I access the folder, then the file, and take the CSV's first row as the header for the table. Then I run the query to insert the data into MySQL, but it skips some random rows.
$dir = "C:\Users\\".strtolower($username)."\Downloads";
$fp = opendir($dir);
$dates = array();
$latest_file = glob($dir."\\Filter_ Tempo-jql-AP*");
closedir($fp);
$filepath=$latest_file[0];
$the_big_array = [];
$tablename="aht_tracker";
$dbname="ford_resource_capacity";
$conn =mysqli_connect("localhost","root","","$dbname") or die(mysqli_connect_error());
$fields="";
$fields1="";
$fieldsinsert="";
if (($h = fopen("{$filepath}", "r")) !== FALSE)
{
    if (($data = fgetcsv($h, 100000, ",")) !== FALSE)
    {
        $issuekey = array_search("Issue key", $data);
        $hours = array_search("Hours", $data);
        $username = array_search("Username", $data);
        $issuetype = array_search("Issue Type", $data);
        $workdescription = array_search("Work Description", $data);
        $new_array = array($issuekey, $hours, $username, $issuetype, $workdescription);
        $arr_count = count($new_array);
        $c = 0;
        $fieldsinsert .= '(';
        foreach ($new_array as $key => $value)
        {
            $fieldsinsert .= ($key == 0) ? '' : ', ';
            $fieldsinsert .= "`" . str_replace(" ", "_", $data[$value]) . "`";
            $fields .= "`" . str_replace(" ", "_", $data[$value]) . "` varchar(250) DEFAULT NULL,";
        }
        $fieldsinsert .= ')';
    }
    while (($data = fgetcsv($h, 100000, ",")) !== FALSE)
    {
        $fieldsInsertvalues = "";
        $c = 0;
        foreach ($new_array as $key => $value)
        {
            $fieldsInsertvalues .= ($key == 0) ? '(' : ', ';
            $fieldsInsertvalues .= "'" . $data[$value] . "'";
        }
        $fieldsInsertvalues .= ')';
        $sql1 = "INSERT INTO " . $tablename . " " . $fieldsinsert . "VALUES" . $fieldsInsertvalues;
        mysqli_query($conn, $sql1);
    }
    fclose($h);
    //unlink($filepath);
}
Please show me some code which will help me insert all rows from my CSV data, or give me some idea whether it is possible to insert the CSV data in batches of 500 rows.
https://dev.mysql.com/doc/refman/8.0/en/load-data.html
It is more effective to use the MySQL LOAD DATA command to import the data from the CSV. You can specify the field enclosure, the separator, and so on.
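For example, a minimal sketch of that approach (it reuses $conn, $filepath and the aht_tracker table from the question; local_infile must be enabled on the server and mysqli.allow_local_infile on the client, and the CSV columns are assumed to line up with the table columns - otherwise add an explicit column list, with @dummy user variables for the columns to skip):

// MySQL is happier with forward slashes in the path, even on Windows.
$path = str_replace('\\', '/', $filepath);

$sql = "LOAD DATA LOCAL INFILE '" . mysqli_real_escape_string($conn, $path) . "'
        INTO TABLE aht_tracker
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES";

mysqli_query($conn, $sql) or die(mysqli_error($conn));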
Now I understand why my program skips some random lines.
It is not just skipping random lines. It is skipping the whole row when that row contains a single quote in any of the column data.
So, again I am asking all of you: does anyone have a solution for how I can skip that particular column which contains a single quote? I do not want to skip the row, I just want to skip that cell from the CSV.
Please update my code so that the program can skip those particular cells which contain a single quote.
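For what it's worth, the usual fix is not to skip the cell but to escape the value, so a single quote can no longer terminate the SQL string early. A hedged, minimal change to the loop above, reusing its $conn, $data, $key and $value variables:

// Inside the existing foreach that builds $fieldsInsertvalues:
$fieldsInsertvalues .= ($key == 0) ? '(' : ', ';
$fieldsInsertvalues .= "'" . mysqli_real_escape_string($conn, $data[$value]) . "'";

A prepared statement with bound parameters would be the more robust long-term fix, but the escaping call above is the smallest change to the posted code, and it keeps every row and every cell.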
What I am trying to do is upload a CSV file with PHP. The first line is the column names and below that the data (of course). Each column name can change depending on what the end user uploads, so the main column names we need can change spots (A1 or B1 etc...). So let's say the column I need is B1 and I need to get all the data in B. Not sure how to go about it. So far this is what I have. Any ideas?
ini_set("allow_url_fopen", 1);
$handle = fopen($_FILES['fileToUpload']['tmp_name'], 'r') or die ('cannot open the file');
$data = array();
while (($row = fgetcsv($handle)) !== false) {
    $data[] = $row;
}
var_dump($data);
fclose($handle);
UPDATE:
I am importing this file from .CSV to PHP
I need to search for column header that starts with “SKU” and then “COST”
From there, once those are found, I want the whole column… B, E. But those column letters can change, depending on how it is exported by the end user. I do not need the rows, just the columns.
Once the file is uploaded to the server, use something like the following code to parse it and actually use it as an array:
Code:
$filename = "upload/sample.csv";
if (($handle = fopen($filename, 'r')) !== FALSE){
while (($row = fgetcsv($handle, 1000, ",")) !== FALSE){
print_r($row);
}
}
That's one way of doing it, you could also read more about it here.
If you want the value of a specific column for each row then you need to loop through the results and pick it out. It looks like you are getting an array of arrays so...(EDITED to get the column based on the header name):
$header = $data[0];
unset($data[0]); // delete the header row so those values don't show in results
$sku_index = '';
$cost_index = '';
// get the index of the desired columns by name
for ($i = 0; $i < count($header); $i++) {
    if ($header[$i] == 'SKU') {
        $sku_index = $i;
    }
    if ($header[$i] == 'COST') {
        $cost_index = $i;
    }
}
// loop through each row and grab the values for the desired columns
foreach ($data as $row) {
    echo $row[$sku_index];
    echo $row[$cost_index];
}
Should get what you want.
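If the headers only start with "SKU" / "COST" rather than matching exactly (as the update suggests), a stripos() check is a small tweak to the loop above - a sketch:

// match headers that merely begin with the wanted text, case-insensitively
foreach ($header as $i => $name) {
    if (stripos(trim($name), 'SKU') === 0) {
        $sku_index = $i;
    }
    if (stripos(trim($name), 'COST') === 0) {
        $cost_index = $i;
    }
}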
I am trying to read certain data in my CSV file and transfer it to an array. What I want is to get all the data of a certain column, but I want to start on a certain row (let's say, for example, row 5). Is there a possible way to do it? What I have now only gets all the data in a specific column; I want to start at row 5 but can't think of a way to do it. Hope you guys can help me out. Thanks!
<?php
//this is column C
$col = 2;
// open file
$file = fopen("example.csv","r");
while (!feof($file))
{
    echo fgetcsv($file)[$col];
}
// close connection
fclose($file);
?>
Yes, you can define a counter flag to track the row. Have a look at the solution below. It will start printing from the 5th row; you can also access a column by its index, e.g. for the second column you can use $row[1].
$start_row = 5; //define start row
$i = 1; //define row count flag
$file = fopen("myfile.csv", "r");
while (($row = fgetcsv($file)) !== FALSE) {
    if ($i >= $start_row) {
        print_r($row);
        //do your stuff
    }
    $i++;
}
// close file
fclose($file);
You have no guarantee that your file exists, or that you can read it, and so on.
Similar to fgets() except that fgetcsv() parses the line it reads for fields in CSV format and returns an array containing the fields read. PHP Manual
//this is column C
$col = 2;
// open file
$file = fopen("example.csv","r");
if (!$file) {
    // log your error ....
}
else {
    while (($row = fgetcsv($file)) !== FALSE) {
        if (isset($row[$col])) {
            print_r($row[$col]);
        }
        // else: the field doesn't exist in this row ...
    }
    // close file
    fclose($file);
}
Depending on the quality and volume of your incoming data, you may wish to use iterated conditions to build your output array or you may prefer to dump all of the csv data into a master array and then filter it to the desired structure.
To clarify the numeracy in my snippets, the 5th row of data will be located at index [4]. The same indexing is used for column targeting -- the 4th column is at index [3].
A functional approach (assumes no newlines in values and is not set up with any extra csv parsing flags):
$starting_index = 4;
$target_column = 3;
var_export(
    array_column(
        array_slice(
            array_map(
                'str_getcsv',
                file('example.csv')
            ),
            $starting_index
        ),
        $target_column
    )
);
A language construct approach with leading row exclusions based on a decrementing counter.
$disregard_rows = 4;
$target_column = 3;
$file = fopen("example.csv", "r");
while (($row = fgetcsv($file)) !== false) {
if ($disregard_rows) {
--$disregard_rows;
} else {
$column_data[] = $row[$target_column];
}
}
var_export($column_data);
I have a CSV file in a format resembling the following. There are no column heads in the actual file - they are shown here for clarity.
id|user|date|description
0123456789|115|2011-10-12:14:29|bar rafael
0123456789|110|2012-01-10:01:34|bar rafael
0123456902|120|2011-01-10:14:55|foo fighter
0123456902|152|2012-01-05:07:17|foo fighter
0123456902|131|2011-11-21:19:48|foo fighter
For each ID, I need to keep the most recent record only, and write the results back to the file.
The result should be:
0123456789|110|2012-01-10:01:34|bar rafael
0123456902|152|2012-01-05:07:17|foo fighter
I have looked at the array functions and don't see anything that will do this without some kind of nested loop.
Is there a better way?
const F_ID = 0;
const F_USER = 1;
const F_DATE = 2;
const F_DESCRIPTION = 3;
$array = array();
if (($handle = fopen('test.csv', 'r')) !== FALSE) {
    while (($data = fgetcsv($handle, 1000, '|')) !== FALSE) {
        if (count($data) != 4)
            continue; //skip lines that have a different number of cells
        if (!array_key_exists($data[F_ID], $array)
            || strtotime($data[F_DATE]) > strtotime($array[$data[F_ID]][F_DATE]))
            $array[$data[F_ID]] = $data;
    }
}
You'll have, in $array, what you want. You can write it using fputcsv.
NOTE. I didn't test this code, it's meant to provide a basic idea of how this would work.
The idea is to store the rows you want into $array, using the first value (ID) as the key. This way, on each line you read, you can check if you already have a record with that ID, and only replace it if the date is more recent.
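To finish the job, a minimal sketch of the write-back step with fputcsv, keeping the '|' delimiter (this overwrites test.csv; adjust the filename as needed):

$out = fopen('test.csv', 'w');
foreach ($array as $row) {
    fputcsv($out, $row, '|');   // same pipe delimiter as the input
}
fclose($out);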
Each time you encounter a new id, put it in your $out array. If the id already exists, overwrite it if the value is newer. Something like:
$in_array = file('myfile.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$out_array = array();
$fields = array('id', 'user', 'date', 'description');

foreach ($in_array as $line) {
    $row = array_combine($fields, explode('|', $line));
    //New id? Just add it.
    if (!isset($out_array[ $row['id'] ])) {
        $out_array[ $row['id'] ] = $row;
    }
    //Existing id? Overwrite if newer.
    else if (strcmp($row['date'], $out_array[ $row['id'] ]['date']) > 0) {
        $out_array[ $row['id'] ] = $row;
    }
    //Otherwise ignore
}
//$out_array now has the newest row for each id, keyed by id.
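And since the goal is to write the results back to the file, a short sketch of that last step (reusing myfile.txt from above):

$lines = array();
foreach ($out_array as $row) {
    $lines[] = implode('|', $row);   // back to the original pipe-delimited format
}
file_put_contents('myfile.txt', implode("\n", $lines) . "\n");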
I'm trying to convert some MySQL queries to MySQLi, but I'm having an issue. Below is part of the script I am having issues with; the script turns a query into CSV:
$columns = (($___mysqli_tmp = mysqli_num_fields($result)) ? $___mysqli_tmp : false);
// Build a header row using the mysql field names
$rowe = mysqli_fetch_assoc($result);
$acolumns = array_keys($rowe);
$csvstring = '"=""' . implode('""","=""', $acolumns) . '"""';
$header_row = $csvstring;
// Below was used for MySQL, Above was added for MySQLi
//$header_row = '';
//for ($i = 0; $i < $columns; $i++) {
// $column_title = $file["csv_contain"] . stripslashes(mysql_field_name($result, $i)) . $file["csv_contain"];
// $column_title .= ($i < $columns-1) ? $file["csv_separate"] : '';
// $header_row .= $column_title;
// }
$csv_file .= $header_row . $file["csv_end_row"]; // add header row to CSV file
// Build the data rows by walking through the results array one row at a time
$data_rows = '';
while ($row = mysqli_fetch_array($result)) {
    for ($i = 0; $i < $columns; $i++) {
        // clean up the data; strip slashes; replace double quotes with two single quotes
        $data_rows .= $file["csv_contain"] . $file["csv_equ"] . $file["csv_contain"] . $file["csv_contain"] . preg_replace('/'.$file["csv_contain"].'/', $file["csv_contain"].$file["csv_contain"], stripslashes($row[$i])) . $file["csv_contain"] . $file["csv_contain"] . $file["csv_contain"];
        $data_rows .= ($i < $columns-1) ? $file["csv_separate"] : '';
    }
    $data_rows .= $this->csv_end_row; // add data row to CSV file
}
$csv_file .= $data_rows; // add the data rows to CSV file
if ($this->debugFlag) {
    echo "Step 4 (repeats for each attachment): CSV file built. \n\n";
}
// Return the completed file
return $csv_file;
The problem I am having is that, when building a header row for the column titles, MySQLi doesn't have the mysql_field_name() approach I was using, so I am fetching the column titles with mysqli_fetch_assoc() and then implode()-ing the array keys, adding the commas etc. for the CSV.
This works, but when I produce the CSV the first data row is missing while the header is active; when I remove the header part of the script and leave the header as null, I get all the data rows and a blank header (as expected).
So I must be missing something when joining my header array to $csv_file.
Can anyone point me in the right direction?
Many Thanks
Ben
A third alternative is to refactor the loop body as a function, then also call this function on the first row before entering the loop. You can use fputcsv as this function.
$csv_stream = fopen('php://temp', 'r+');
if ($row = $result->fetch_assoc()) {
    fputcsv($csv_stream, array_keys($row));
    fputcsv($csv_stream, $row);
    while ($row = $result->fetch_row()) {
        fputcsv($csv_stream, $row);
    }
    fseek($csv_stream, 0);
}
$csv_data = stream_get_contents($csv_stream);
if ($this->debugFlag) {
    echo "Step 4 (repeats for each attachment): CSV file built. \n\n";
}
// Return the completed file
return $csv_data;
This basically does the same thing as a do...while loop, which would arguably make more sense to use here. I bring up this alternative to present the loop-body refactoring technique, which can be used when a different kind of loop doesn't make sense.
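For illustration, a sketch of that do...while variant, under the same assumptions as above (a mysqli result in $result and an in-memory stream):

$csv_stream = fopen('php://temp', 'r+');
if ($row = $result->fetch_assoc()) {
    fputcsv($csv_stream, array_keys($row));   // header from the first row's keys
    do {
        fputcsv($csv_stream, $row);           // writes the row already fetched, then the rest
    } while ($row = $result->fetch_assoc());
    fseek($csv_stream, 0);
}
$csv_data = stream_get_contents($csv_stream);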
Best of all would be to use both mysqli_result::fetch_fields and fputcsv
$csv_stream = fopen('php://temp', 'r+');
$fields = $result->fetch_fields();
foreach ($fields as &$field) {
    $field = $field->name;
}
fputcsv($csv_stream, $fields);
while ($row = $result->fetch_row()) {
    fputcsv($csv_stream, $row);
}
fseek($csv_stream, 0);
$csv_data = stream_get_contents($csv_stream);
if ($this->debugFlag) {
    echo "Step 4 (repeats for each attachment): CSV file built. \n\n";
}
// Return the completed file
return $csv_data;
If you can require that PHP be at least version 5.3, you can replace the foreach that generates the header line with a call to array_map. There admittedly isn't much advantage to this, I just find the functional approach more interesting.
fputcsv($csv_stream,
        array_map(function ($field) { return $field->name; },
                  $result->fetch_fields()));
As you observe, you're using the first row to obtain the field names but then not using the data from the row. Evidently, you need to change your code so that you get both of those things.
There are a number of ways you might do this. The most appropriate one is to use mysqli_fetch_fields() instead to get the field metadata from the result object.
http://www.php.net/manual/en/mysqli-result.fetch-fields.php
Alternatively, you could make the loop lower down in the code a do... while instead of a while.