PHP CSV from S3 as string - want to convert to array

I have some code which downloads a CSV file from an S3 bucket into a PHP variable.
I want to get this CSV content (a string) from this variable and convert it into an array.
I'm having some issues reading from my php://memory resource.
When I print the $csv_fread or $rows I see nothing.
Dumping $csv_fstat shows the correct length for the file that I uploaded, downloaded into the $file_contents string, and wrote to the $csv resource.
Any help appreciated.
// use Aws\S3\S3Client;
$result = $this->client->getObject([
    'Bucket' => $this->bucket,
    'Key'    => $id
]);
$file_contents = $result['Body'];

$csv = fopen('php://memory', 'r+');
if ($csv === false) {
    throw new Exception('Could not create file wrapper');
}
if (fwrite($csv, $file_contents) === false) {
    throw new Exception('Could not write contents to file wrapper');
}

$csv_fstat = fstat($csv);
$csv_fread = fread($csv, $csv_fstat['size']);
if ($csv_fread === false) {
    throw new Exception('There was an error reading the stream');
}
$header = null;
$rows = [];
while (($row = fgetcsv($csv, 0, ',')) !== false) {
    if (!$header) {
        $header = [];
        foreach ($row as $v) {
            $header_raw[] = $v;
            $hcounts = array_count_values($header_raw);
            // de-duplicate repeated header names by appending a counter
            $header[] = $hcounts[$v] > 1 ? $v . $hcounts[$v] : $v;
        }
    } else {
        foreach ($row as &$l) {
            $l = trim($l);
        }
        $rows[] = array_combine($header, $row);
    }
}
fclose($csv);

I figured out the issue. I needed to call:
rewind($csv);
after the fwrite(), so that subsequent reads start from the beginning of the stream.
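For reference, a minimal sketch of the corrected flow, assuming the same $this->client and $this->bucket as above; the (string) cast of the response body is an addition here, not part of the original code:

$result = $this->client->getObject([
    'Bucket' => $this->bucket,
    'Key'    => $id,
]);
$file_contents = (string) $result['Body']; // cast the stream body to a plain string

$csv = fopen('php://memory', 'r+');
fwrite($csv, $file_contents);
rewind($csv);                              // move the pointer back before reading

$header = null;
$rows   = [];
while (($row = fgetcsv($csv, 0, ',')) !== false) {
    if ($header === null) {
        $header = $row;                    // first line becomes the keys
    } else {
        $rows[] = array_combine($header, array_map('trim', $row));
    }
}
fclose($csv);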

Related

How to write and save CSV file to server in PHP?

I am trying to create a new CSV file using PHP and upload (or move) it to a new part of the server, but the spreadsheet it returns contains only a single value, either 404 or 1, in the first cell of the first row. What am I doing wrong?
My code is attached below.
// generate new general spreadsheet
$filepath = substr($file_path, 1);
$data = load_csv_file($filepath);

header('Content-type: text/csv');
header('Content-Disposition: attachment; filename="file-saved.csv"');

$fp = fopen('php://output', 'wb');
foreach ($data as $row) {
    $output = fputcsv($fp, $row);
}
$filename = "file-saved.csv";
file_put_contents($filename, $output);
fclose($fp);
The $data variable is an array of values from another CSV file.
$output = [];
foreach ($data as $row) {
    $output[] = ..
}
...
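Note that fputcsv() returns the length of the line it wrote (or false), not the CSV text itself, so file_put_contents($filename, $output) ends up saving that number, which is likely why only one cell appears. A minimal sketch of writing the rows straight to a file on the server, reusing load_csv_file() and $data from the question:

$data = load_csv_file($filepath);   // rows from the source CSV, as in the question
$filename = 'file-saved.csv';

$fp = fopen($filename, 'w');        // write directly to the server path
foreach ($data as $row) {
    fputcsv($fp, $row);             // fputcsv writes the line; no need to collect its return value
}
fclose($fp);

// Optionally send the saved file to the browser afterwards.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="file-saved.csv"');
readfile($filename);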
Another approach reads a pipe-delimited CSV with fgetcsv() and inserts each row into the database:
error_reporting(0);
$file_n = public_path('/csv_file/product_details.csv');
$infoPath = pathinfo($file_n);
if ($infoPath['extension'] == 'csv') {
    $file = fopen($file_n, "r");
    $i = 0;
    $all_data = array();
    while (($filedata = fgetcsv($file, null, "|")) !== FALSE) {
        $num = count($filedata);
        for ($c = 0; $c < $num; $c++) {
            $all_data[$i][] = $filedata[$c];
        }
        $i++;
    }
    fclose($file);
    foreach ($all_data as $importData) {
        $insertData = array(
            "article_number" => $importData[0],
            "article_name" => $importData[1],
            "article_description" => $importData[2],
            "article_price" => $importData[3],
            "article_manufacturer" => $importData[6],
            "article_productgroupkey" => $importData[7],
            "article_productgroup" => $importData[8],
            "article_ean" => $importData[9],
            "article_hbnr" => $importData[10],
            "article_shippingcosttext" => $importData[11],
            "article_amount" => $importData[12],
            "article_paymentinadvance" => $importData[13],
            "article_maxdeliveryamount" => $importData[14],
            "article_energyefficiencyclass" => $importData[15]
        );
        insertData($insertData);
    }
} else {
    echo "Invalid file extension.";
}

function insertData($data) {
    // $article_number is assumed to hold the result of a lookup query for $data['article_number']
    if ($article_number->count() == 0) {
        // write your insert query here for $data
    } elseif ($article_number->count() > 0) {
        // article_number already present, so update the table (UPDATE query)
    }
}
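A minimal sketch of what insertData() could look like with PDO; the articles table, its column names, and the $pdo connection are assumptions here, not part of the original answer:

function insertData(array $data) {
    global $pdo; // assumed PDO connection set up elsewhere

    // Does this article_number already exist?
    $stmt = $pdo->prepare('SELECT COUNT(*) FROM articles WHERE article_number = ?');
    $stmt->execute(array($data['article_number']));

    if ((int) $stmt->fetchColumn() === 0) {
        // Not present yet: insert the full row (keys come from the fixed $insertData array above).
        $columns      = implode(', ', array_keys($data));
        $placeholders = implode(', ', array_fill(0, count($data), '?'));
        $pdo->prepare("INSERT INTO articles ($columns) VALUES ($placeholders)")
            ->execute(array_values($data));
    } else {
        // Already present: update the existing row.
        $assignments = array();
        foreach (array_keys($data) as $column) {
            $assignments[] = "$column = ?";
        }
        $values   = array_values($data);
        $values[] = $data['article_number'];
        $pdo->prepare('UPDATE articles SET ' . implode(', ', $assignments) . ' WHERE article_number = ?')
            ->execute($values);
    }
}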

skip columns while converting tab delimited text file to csv php

I am trying to convert a tab-delimited file to CSV. The problem is that it's a huge file, 100,000-plus records, and I want only specific columns from it. The file is generated by Amazon, not by me, so I can't really control the format.
The code I wrote works fine, but I need to ignore/remove some columns, or rather I only want a few columns from the file. How do I do that without affecting the performance of the txt-to-CSV conversion?
$file = fopen($file_name.'.txt', 'w+');
fwrite($file, $report);
fclose($file);

$handle = fopen($file_name.".txt", "r");
$lines = [];
$row_count = 0;
$array_count = 0;
$uid = array($user_id);
if (($handle = fopen($file_name.".txt", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 100000, "\t")) !== FALSE)
    {
        if ($row_count > 0)
        {
            $lines[] = str_replace(",", "<c>", $data);
            array_push($lines[$array_count], $user_id);
            $array_count++;
        }
        $row_count++;
    }
    fclose($handle);
}

$fp = fopen($file_name.'.csv', 'w');
foreach ($lines as $line)
{
    fputcsv($fp, $line);
}
fclose($fp);
I am using unset() to remove a column, but is there a better way to do this for multiple columns?
I would do that by checking keys. For example:
// column keys you want to keep
$keys = array(0, 1, 3, 4, 7, 9);

$lines = file($file_name);
$result_lines = array();
foreach ($lines as $line) {
    $tmp = array();
    $tabs = explode("\t", $line);
    foreach ($tabs as $key => $value) {
        if (in_array($key, $keys)) {
            $tmp[] = $value;
        }
    }
    $result_lines[] = implode(",", $tmp);
}
$finalString = implode("\n", $result_lines);
// Then write the string to a file
Hope it helps.
Cheers,
Siniša
In its simplest form, i.e. without worrying about removing columns from the output, this does a simple read-line/write-line loop, so there is no need to maintain any memory-hungry arrays.
$file_name = 'tst';

if (($f_in = fopen($file_name.".txt", "r")) === FALSE) {
    echo 'Cannot find input file';
    exit;
}
if (($f_out = fopen($file_name.'.csv', 'w')) === FALSE) {
    echo 'Cannot open output file';
    exit;
}

while ($data = fgetcsv($f_in, 8000, "\t")) {
    fputcsv($f_out, $data, ',', '"');
}
fclose($f_in);
fclose($f_out);
This is one way of removing the unwanted columns:
$file_name = 'tst';

if (($f_in = fopen("tst.txt", "r")) === FALSE) {
    echo 'Cannot find input file';
    exit;
}
if (($f_out = fopen($file_name.'.csv', 'w')) === FALSE) {
    echo 'Cannot open output file';
    exit;
}

$unwanted = [26, 27]; // indexes of the unwanted columns

while ($data = fgetcsv($f_in, 8000, "\t")) {
    // remove unwanted columns
    foreach ($unwanted as $i) {
        unset($data[$i]);
    }
    fputcsv($f_out, $data, ',', '"');
}
fclose($f_in);
fclose($f_out);
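If you prefer a keep-list over unsetting indexes one by one, here is a minimal sketch of the same loop using array_intersect_key(), reusing $f_in and $f_out from the answer above; the column indexes are only an example:

$wanted = array_flip([0, 1, 3, 4, 7, 9]); // indexes of the columns to keep

while ($data = fgetcsv($f_in, 8000, "\t")) {
    // keep only the wanted columns, preserving their order, and reindex before writing
    fputcsv($f_out, array_values(array_intersect_key($data, $wanted)), ',', '"');
}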

Getting special characters while reading a csv file in php

I have a space-separated CSV file. I am trying to read it using the following method:
$file = fopen('/var/www/html/my.csv', 'r');
while (($line = fgetcsv($file, 0, "\t")) !== FALSE) {
    $header_count = 0;
    // $final_data = array();
    if ($count == 0)
    {
        foreach ($line as $data) {
            $header[$header_count] = $data;
            $header_count++;
        }
    }
    else
    {
        // print_r($line);
        foreach ($line as $data) {
            $final_data[$count - 1][$header[$header_count]] = $data;
            $header_count++;
        }
    }
    $count++;
}
The file looks like:
id created ad
410699345585 11:12:29+05:30 ag:6061734588280
...
But when reading, the output gives:
["��id"]=>
as id index. What am I doing wrong?
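No answer is shown here, but a �� prefix on the first header usually points to a byte-order mark (BOM) or a non-UTF-8 encoding (such as UTF-16) at the start of the file. A minimal sketch, assuming that is the cause, that normalizes the contents before parsing:

$contents = file_get_contents('/var/www/html/my.csv');

// If the file starts with a UTF-16 BOM, convert it to UTF-8 first.
$bom2 = substr($contents, 0, 2);
if ($bom2 === "\xFF\xFE" || $bom2 === "\xFE\xFF") {
    $contents = mb_convert_encoding($contents, 'UTF-8', 'UTF-16');
}

// Strip a UTF-8 BOM if one is present.
if (substr($contents, 0, 3) === "\xEF\xBB\xBF") {
    $contents = substr($contents, 3);
}

// Parse the cleaned text from a memory stream with the same loop as above.
$file = fopen('php://memory', 'r+');
fwrite($file, $contents);
rewind($file);
while (($line = fgetcsv($file, 0, "\t")) !== FALSE) {
    // ... header/row handling as in the question
}
fclose($file);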

Convert csv in php and get unique value

I would like to convert a CSV file that has duplicate rows: I want to sum the quantities but keep the price without summing it.
file.csv :
code,qty,price
001,2,199
001,1,199
002,2,159
002,2,159
Current PHP that sums the quantity and produces a result with unique codes and total qty:
<?php
$tsvFile = new SplFileObject('file.csv');
$tsvFile->setFlags(SplFileObject::READ_CSV);
$tsvFile->setCsvControl("\t");

$file = fopen('file.csv', 'w');
$header = array('sku', 'qty');
fputcsv($file, $header, ',', '"');

foreach ($tsvFile as $line => $row) {
    if ($line > 0) {
        if (isset($newData[$row[0]])) {
            $newData[$row[0]] += $row[1];
        } else {
            $newData[$row[0]] = $row[1];
        }
    }
}
foreach ($newData as $key => $value) {
    fputcsv($file, array($key, $value), ',', '"');
}
fclose($file);
?>
the result for this is:
code,qty
001,3
002,4
and I would like to add the price, but without summing it.
The result i need is:
code,qty,price
001,3,199
002,4,159
I haven't tested this yet, but I think this is what you are looking for:
<?php
$tsvFile = new SplFileObject('file.csv');
$tsvFile->setFlags(SplFileObject::READ_CSV);
$tsvFile->setCsvControl("\t");

$file = fopen('file.csv', 'w');
$header = array('sku', 'qty', 'price'); // include price in the header as well
fputcsv($file, $header, ',', '"');

foreach ($tsvFile as $line => $row) {
    if ($line > 0) {
        if (!isset($newData[$row[0]])) {
            // remember the first price seen for this code; only the qty accumulates
            $newData[$row[0]] = array('qty' => 0, 'price' => $row[2]);
        }
        $newData[$row[0]]['qty'] += $row[1];
    }
}
foreach ($newData as $key => $arr) {
    fputcsv($file, array($key, $arr['qty'], $arr['price']), ',', '"');
}
fclose($file);
?>
To start with, there's a nice function posted on the PHP manual page for str_getcsv which will help you end up with a more legible array to work with:
function csv_to_array($filename = '', $delimiter = ',') {
    if (!file_exists($filename) || !is_readable($filename))
        return FALSE;

    $header = NULL;
    $data = array();
    if (($handle = fopen($filename, 'r')) !== FALSE) {
        while (($row = fgetcsv($handle, 1000, $delimiter)) !== FALSE) {
            if (!$header)
                $header = $row;
            else
                $data[] = array_combine($header, $row);
        }
        fclose($handle);
    }
    return $data;
}
That is purely for legibility's sake; now comes the code that lets you work over the array:
$aryInput = csv_to_array('file.csv', ',');
$aryTemp = array();
foreach ($aryInput as $aryRow) {
    if (isset($aryTemp[$aryRow['code']])) {
        $aryTemp[$aryRow['code']]['qty'] += $aryRow['qty'];
    } else {
        $aryTemp[$aryRow['code']] = $aryRow;
    }
}
In the above code, it simply:
Loops through the input
Checks whether the key exists in a temporary array
If it does, it just adds the new quantity
If it doesn't, it adds the entire row
Now you can write out your expected csv file :)
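A minimal sketch of that final write-out step, using the $aryTemp array built above; the output filename is just an example:

$fp = fopen('file-unique.csv', 'w');
fputcsv($fp, array('code', 'qty', 'price'), ',', '"');
foreach ($aryTemp as $aryRow) {
    fputcsv($fp, array($aryRow['code'], $aryRow['qty'], $aryRow['price']), ',', '"');
}
fclose($fp);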

CSV to Json with header row as key

I would like to convert a CSV to JSON, using the header row as keys and each line as an object. How do I go about doing this?
CSV:
InvKey,DocNum,CardCode
11704,1611704,BENV1072
11703,1611703,BENV1073
PHP:
if (($handle = fopen('upload/BEN-new.csv'. '', "r")) !== FALSE) {
    while (($row_array = fgetcsv($handle, 1024, ","))) {
        while ($val != '') {
            foreach ($row_array as $key => $val) {
                $row_array[] = $val;
            }
        }
        $complete[] = $row_array;
    }
    fclose($handle);
}
echo json_encode($complete);
Just read the first line separately and merge it into every row:
if (($handle = fopen('upload/BEN-new.csv', 'r')) === false) {
    die('Error opening file');
}
$headers = fgetcsv($handle, 1024, ',');
$complete = array();
while ($row = fgetcsv($handle, 1024, ',')) {
    $complete[] = array_combine($headers, $row);
}
fclose($handle);
echo json_encode($complete);
I find myself converting csv strings to arrays or objects every few months.
I created a class because I'm lazy and don't like copy/pasting code.
This class will convert a csv string to custom class objects:
Convert csv string to arrays or objects in PHP
$feed="https://gist.githubusercontent.com/devfaysal/9143ca22afcbf252d521f5bf2bdc6194/raw/ec46f6c2017325345e7df2483d8829231049bce8/data.csv";
//Read the csv and return as array
$data = array_map('str_getcsv', file($feed));
//Get the first raw as the key
$keys = array_shift($data);
//Add label to each value
$newArray = array_map(function($values) use ($keys){
return array_combine($keys, $values);
}, $data);
// Print it out as JSON
header('Content-Type: application/json');
echo json_encode($newArray);
Main gist:
https://gist.github.com/devfaysal/9143ca22afcbf252d521f5bf2bdc6194
For those who'd like things spelled out a little more, plus some room to further parse any row or column without additional loops:
function csv_to_json_byheader($filename) {
    $json = array();
    if (($handle = fopen($filename, "r")) !== FALSE) {
        $rownum = 0;
        $header = array();
        while (($row = fgetcsv($handle, 1024, ",")) !== FALSE) {
            if ($rownum === 0) {
                for ($i = 0; $i < count($row); $i++) {
                    // maybe you want to strip special characters or merge duplicate columns here?
                    $header[$i] = trim($row[$i]);
                }
            } else {
                if (count($row) === count($header)) {
                    $rowJson = array();
                    foreach ($header as $i => $head) {
                        // maybe handle special row/cell parsing here, per column header
                        $rowJson[$head] = $row[$i];
                    }
                    array_push($json, $rowJson);
                }
            }
            $rownum++;
        }
        fclose($handle);
    }
    return $json;
}
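A short usage example for the function above; the file path is just a placeholder:

$json = csv_to_json_byheader('upload/BEN-new.csv');
header('Content-Type: application/json');
echo json_encode($json, JSON_PRETTY_PRINT);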
