Multidimensional Array + CSV Export - PHP

I have an issue with an export script I'm trying to write.
First I create a multidimensional array:
while($row = $insert_row->fetch_assoc()) {
foreach ($selectArray as $value) {
$userData = $row[$value];
$userDataArray[] = $userData;
}
$userArray[] = $userDataArray;
unset($userDataArray);
}
Now I want to create the CSV file:
$sendfilename = "export" . ".csv";
$filename = "file" . ".csv";
$delimiter = ';';
$enclosure = '"';
$encloseAll = true;
$nullToMysqlNull = false;
$delimiter_esc = preg_quote($delimiter, '/');
$enclosure_esc = preg_quote($enclosure, '/');
$fp = fopen($filename, 'wb');
if ($fp)
{
foreach ($userArray as $users) {
foreach ($users as $fields) {
fputcsv($fp, $fields,";",'"');
}
}
}
fclose($fp);
readfile($filename);
I'm getting the error "fputcsv() expects parameter 2 to be array, string given".
Any solution?

Already fixed, I went one level too deep on fput...
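For reference, the fix is to drop the inner loop, since each element of $userArray is already the row array that fputcsv() expects:
foreach ($userArray as $fields) {
    fputcsv($fp, $fields, ';', '"');
}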

Related

I want to write a function that opens a series of files and copies some of the data into an array

I want to write a function as a lot of this code is repeated but I am having trouble passing the name of the file and the mode as a parameter to the function.
$name = array();
$dob = array();
$address = array();
$data = array();
#get name data
$handle = fopen('data/name.txt', 'r');
while (!feof($handle)) {
$data = explode(':',fgets($handle, 1024));
$name[] = $data[1];
}
fclose($handle);
#get dob data
$handle = fopen('data/dob.txt', 'r');
while (!feof($handle)) {
$data = explode(':',fgets($handle, 1024));
$dob[] = $data[1];
}
fclose($handle);
#get address data
$handle = fopen('data/address.txt', 'r');
while (!feof($handle)) {
$data = explode(':',fgets($handle, 1024));
$address[] = $data[1];
}
fclose($handle);
This is what I've written so far as a function.
function get_data($file, $mode, $array) {
$handle = fopen("'" . $file . "'", "'" . $mode . "'");
while (!feof($handle)) {
$data = explode(':',fgets($handle, 1024));
$array[] = $data[0];
}
So I want to be able to call the function on each file, such as;
get_data ('data/name.txt' , 'r', $name);
Your function is almost correct :) Just two mistakes:
You didn't call fclose() inside your function.
You enclosed the arguments in quotes twice, where once is enough:
function get_data($file, $mode, $array) {
$handle = fopen("'" . $file . "'", "'" . $mode . "'");
# /\ these /\ and these are unnecessary
while (!feof($handle)) {
$data = explode(':',fgets($handle, 1024));
$array[] = $data[0];
}
Just calling fopen($file, $mode) is ok :)
If you would like to use your $array variable outside of your function, please remember to return it. If you add return $array; to the end of your function, you will be able to go:
$result = get_data('file.txt', 'r');
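Putting both fixes together, plus the return, the function could look like this (a sketch; note that the standalone loops above read $data[1] rather than $data[0], so use whichever index you actually need):
function get_data($file, $mode) {
    $array = array();
    $handle = fopen($file, $mode);            // no extra quotes around the arguments
    while (!feof($handle)) {
        $data = explode(':', fgets($handle, 1024));
        $array[] = $data[1];                  // the original per-file loops used index 1
    }
    fclose($handle);                          // close the handle inside the function
    return $array;                            // return the result so the caller can use it
}
$name = get_data('data/name.txt', 'r');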
PS: There is already a similar function in PHP, file_get_contents(), perhaps you could use it? :)

How to export to the next file after 2000 rows of data

I am using PHP to export a CSV file and it is working, but when I export 2000 or more rows, how do I automatically create the next CSV file?
How do I move on to another file after 2000 rows?
<?php
header('Content-type: application/csv');
header('Content-Disposition: attachment; filename = records.csv');
echo $header = "Name";
echo "\r\n";
$sql = mysql_query("Select * from table");
while ($getData = mysql_fetch_assoc($sql)) {
echo '"'.$name.'"';
echo "\r\n";
}
exit;
?>
You can use the array_chunk function to split the records into groups of 2000 and export them to CSV.
For example:
$rowData = [
[1,2,3],
[11,21,31],
[21,22,32],
[31,42,73],
[111,222,333]
];
foreach(array_chunk($rowData, 2) as $key => $records){
$file_name = 'export_data_'.$key.'.csv';
writeToCsv($file_name,$records);
}
// function to export data to csv
function writeToCsv($fileName, $rowData){
$fp = fopen($fileName, 'w');
foreach ($rowData as $fields) {
fputcsv($fp, $fields);
}
fclose($fp);
}
In your case, use array_chunk($rowData, 2000).
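Applied to the question's query, that could look roughly like this (a sketch that keeps the question's legacy mysql_* calls and assumes all rows fit in memory at once):
$rowData = array();
$sql = mysql_query("Select * from table");
while ($getData = mysql_fetch_assoc($sql)) {
    $rowData[] = $getData;                                   // collect every row first
}
foreach (array_chunk($rowData, 2000) as $key => $records) {
    writeToCsv('export_data_' . $key . '.csv', $records);    // 2000 rows per file
}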
<?php
$rowLimit = 2000;
$fileIndex = 0;
$i = 1;
$handle = null;
$fileList = array();
$timestampFolder = strtotime("now").'/';
mkdir($timestampFolder);
$sql = mysql_query("Select * from table");
while ($getData = mysql_fetch_assoc($sql)) {
if( ($i%$rowLimit) == 1) {
$fileIndex+=1;
if(!is_null($handle)) {
fclose($handle);
}
$fileName = "records".$fileIndex.".csv";
$handle = fopen($timestampFolder.$fileName, "a");
$fileList[] = $fileName;
}
fputcsv($handle, $getData);
$i++;
}
foreach($fileList as $file) {
echo ''.$file.'<br>';
}

fputcsv() putting all data in one row

I'm working on reading some JSON, flattening it into a single array, and then writing it to a CSV. I've gotten pretty far but I am currently stuck on the last component. I'm able to pull the data, flatten it, and write it to the CSV, but for some reason it writes all of it in one row, while it should be creating a new line for each new object. I think it's happening because my flattened array is taking multiple objects and combining them into one, but I can't figure out how to fix it. Any help would be greatly appreciated.
Here's what I have:
<?php
$csvArray = array();
function array_flatten($array, $prefix = 'new')
{
if (!is_array($array)) {
return false;
}
$result = array();
foreach ($array as $key => $value) {
if (is_array($value)) {
$result = array_merge($result, array_flatten($value, $prefix . '_' . $key));
} else {
$result[$prefix . '_' . $key] = $value;
}
}
return $result;
}
for ($x = 1; $x <= 1; $x++) {
$response = file_get_contents('https://seeclickfix.com/api/v2/issues?page=' . $x . '&per_page=2');
//Decode the JSON and convert it into an associative array.
$jsonDecoded = json_decode($response, true);
$flat = array_flatten($jsonDecoded['issues']);
//Give our CSV file a name.
$csvFileName = '/Applications/MAMP/htdocs/SeeClickFix/all_exports_3/export' . $x . '.csv';
//Open file pointer.
$fp = fopen($csvFileName, 'w');
foreach ($flat as $item) {
// fputcsv($fp, $item);
if (is_null($item) || !isset($item) || $item == "Point") {
} else {
array_push($csvArray, $item);
}
fputcsv($fp, $csvArray);
}
fclose($fp);
}
?>
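One likely fix, sketched below under the assumption that each element of $jsonDecoded['issues'] should become its own CSV row: flatten each issue separately and call fputcsv() once per issue, rather than flattening the whole issues array and writing the accumulated $csvArray on every iteration.
$jsonDecoded = json_decode($response, true);
$fp = fopen($csvFileName, 'w');
foreach ($jsonDecoded['issues'] as $issue) {
    $flatIssue = array_flatten($issue);                    // flatten one issue at a time
    $flatIssue = array_filter($flatIssue, function ($item) {
        return !is_null($item) && $item !== "Point";       // same values the original loop skipped
    });
    fputcsv($fp, $flatIssue);                              // one CSV row per issue
}
fclose($fp);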

PHP Excel file without PHPExcel

I have to create an Excel file with about 350 columns and 1000 rows. I developed code for this task using PHPExcel, but it takes 42 seconds to create the file, so I want to create the Excel file without PHPExcel. I wrote a script in which the data for the different columns is separated by "\t"; it takes just 2 seconds to create the file. The problem is that when I open the file created with "\t", a corrupted-file message is displayed, and after repairing the file it works fine. I can't see where the mistake is in my script. If anyone can solve this problem, either by using PHPExcel (with less execution time) or by fixing the corrupt-file error, the answer will be appreciated. Here is my code (CakePHP 3).
Input Array like
// Don't be confused by { instead of [. It's okay.
// $export_data and $data are the same (the input array).
{
"0": {
"customer_id": 1,
"name": "John Stevens",
"Date of 1 Purchase": "2014-08-05T00:00:00+0000",
"Date of 2 Purchase": "2014-09-05T00:00:00+0000",
"Date of 3 Purchase": "2014-10-05T00:00:00+0000",
...
...
... 350 Cols ...
}
"1": {
...
}
...
...
"999 Rows"
}
Using PHPExcel
$r = 1;
$filename = FILE_PATH . 'galliyan.xlsx';
$header = array_keys($export_data[0]);
$objPHPExcel = new \PHPExcel();
$col = 0;
foreach ($header as $field) {
$objPHPExcel->getActiveSheet()->setCellValueByColumnAndRow($col, 1, $field);
$col++;
}
$objWriter = \PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save($filename);
chmod($filename, 0777);
$r++;
$objPHPExcel = \PHPExcel_IOFactory::load($filename);
foreach($export_data as $row) {
$col = 0;
foreach ($row as $ro) {
$objPHPExcel->getActiveSheet()->setCellValueByColumnAndRow($col, $r, $ro);
$col++;
}
$r++;
}
$objWriter = \PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->save($filename);
chmod($filename, 0777);
EDIT (UPDATED CODE)
$filename = FILE_PATH . 'galliyan.xlsx';
$header = array_keys($export_data[0]);
$objPHPExcel = new \PHPExcel();
$sheet = $objPHPExcel->getActiveSheet();
$col = 0;
foreach ($header as $field) {
$sheet->setCellValueByColumnAndRow($col, 1, $field);
$col++;
}
$objPHPExcel->getActiveSheet()->fromArray($export_data, null, 'A2');
$writer = \PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$writer->save($filename);
chmod($filename, 0777);
Using "\t"
foreach ($data as $row) {
if (!$flag) {
$header = array();
// display field/column names as first row
$header = array_keys($row);
$header = $this->setExcelHeaders($header);
$this->createExcelFile($dir_name, $filename, $header, $export_type);
$flag = TRUE;
}
array_walk($row, array($this, 'cleanData'));
array_push($values, $row);
}
$values = $this->setValues($values);
$this->writeExcelFile($dir_name, $filename, $values);
function setExcelHeaders($hdrs) {
$header = '';
foreach ($hdrs as $title_val) {
$header .= $title_val . "\t";
}
return $header;
}
function createExcelFile($dir_name, $filename, $header, $export_type = '') {
$fp = fopen($dir_name . "/" . $filename, 'w');
fwrite($fp, "$header\n");
fclose($fp);
$permission = Configure::read('Config.PERMISSION');
if ($export_type == "") {
chmod($dir_name, $permission);
}
chmod($dir_name . "/" . $filename, $permission);
}
public function cleanData(&$str) {
$str = preg_replace("/\t/", "\\t", $str);
$str = preg_replace("/\r?\n/", "\\n", $str);
if (strstr($str, '"'))
$str = '"' . str_replace('"', '""', $str) . '"';
}
private function setValues($all_vals) {
$data = '';
for ($i = 0; $i < count($all_vals); $i++) {
$line = '';
foreach ($all_vals[$i] as $value) {
if ((!isset($value)) || ( $value == "")) {
$value = "\t";
}
else {
$value = str_replace('"', '""', $value);
$value = '"' . $value . '"' . "\t";
}
$line .= $value;
}
$data .= trim($line) . "\n";
}
return str_replace("\r", "", $data);
}
function writeExcelFile($dir_name, $filename, $data) {
$fp = fopen($dir_name . "/" . $filename, 'a');
fwrite($fp, "$data");
fclose($fp);
}
As you mentioned "without PHPExcel", have you tried looking at alternatives that can be faster than PHPExcel? For instance, you can generate a 350x1000 XLSX spreadsheet with Spout (https://github.com/box/spout) in just a few seconds.
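A minimal sketch of what that could look like with Spout's 2.x API (the writer factory and row classes changed in later releases, so check the docs for the version you install):
require_once 'vendor/autoload.php';

use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

$writer = WriterFactory::create(Type::XLSX);
$writer->openToFile(FILE_PATH . 'galliyan.xlsx');
$writer->addRow(array_keys($export_data[0]));   // header row
$writer->addRows($export_data);                 // one spreadsheet row per record
$writer->close();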
Generating a CSV file is also a good alternative, and I'd recommend going this route if you can. The export process will be way faster! But don't try to reinvent the wheel; there are already a lot of CSV writers out there, ready to use (Spout and PHPExcel both have one, for instance).

Convert CSV in PHP and get unique values

I would like to convert a CSV file that has duplicate rows: I want to sum the quantity but keep the price without summing it.
file.csv :
code,qty,price
001,2,199
001,1,199
002,2,159
002,2,159
Here is the current PHP, which sums the quantity and produces a result with unique codes and the total qty:
<?php
$tsvFile = new SplFileObject('file.csv');
$tsvFile->setFlags(SplFileObject::READ_CSV);
$tsvFile->setCsvControl("\t");
$file = fopen('file.csv', 'w');
$header = array('sku', 'qty');
fputcsv($file, $header, ',', '"');
foreach ($tsvFile as $line => $row) {
if ($line > 0) {
if (isset($newData[$row[0]])) {
$newData[$row[0]]+= $row[1];
} else {
$newData[$row[0]] = $row[1];
}
}
}
foreach ($newData as $key => $value) {
fputcsv($file, array($key, $value), ',', '"');
}
fclose($file);
?>
The result of this is:
code,qty
001,3
002,4
and I would like to add the price, but without summing it.
The result I need is:
code,qty,price
001,3,199
002,4,159
I haven't tested this yet, but I think this is what you are looking for:
<?php
$tsvFile = new SplFileObject('file.csv');
$tsvFile->setFlags(SplFileObject::READ_CSV);
$tsvFile->setCsvControl("\t");
$file = fopen('file.csv', 'w');
$header = array('sku', 'qty');
fputcsv($file, $header, ',', '"');
foreach ($tsvFile as $line => $row) {
if ($line > 0) {
if(!isset($newData[$row[0]])) {
$newData[$row[0]] = array('qty'=>0, 'price'=>$row[2]);
}
$newData[$row[0]]['qty'] += $row[1];
}
}
foreach ($newData as $key => $arr) {
fputcsv($file, array($key, $arr['qty'], $arr['price']), ',', '"');
}
fclose($file);
?>
To start with, there's a nice function from the PHP manual page for str_getcsv which will help you end up with a more legible array to work with:
function csv_to_array($filename='', $delimiter=',') {
if(!file_exists($filename) || !is_readable($filename))
return FALSE;
$header = NULL;
$data = array();
if (($handle = fopen($filename, 'r')) !== FALSE) {
while (($row = fgetcsv($handle, 1000, $delimiter)) !== FALSE) {
if(!$header)
$header = $row;
else
$data[] = array_combine($header, $row);
}
fclose($handle);
}
return $data;
}
This is purely for legibility's sake, but now comes the code that lets you work over the array:
$aryInput = csv_to_array('file.csv', ',');
$aryTemp = array();
foreach($aryInput as $aryRow) {
if (isset($aryTemp[$aryRow['code']])) {
$aryTemp[$aryRow['code']]['qty'] += $aryRow['qty'];
} else {
$aryTemp[$aryRow['code']] = $aryRow;
}
}
In the above code, it simply:
Loops through the input
Checks whether the key exists in a temporary array
If it does, it just adds the new quantity
If it doesn't, it adds the entire row
Now you can write out your expected CSV file :)
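Writing $aryTemp back out could look something like this (a sketch; 'result.csv' is a placeholder filename):
$fp = fopen('result.csv', 'w');                          // hypothetical output file
fputcsv($fp, array('code', 'qty', 'price'), ',', '"');   // header row
foreach ($aryTemp as $aryRow) {
    fputcsv($fp, array($aryRow['code'], $aryRow['qty'], $aryRow['price']), ',', '"');
}
fclose($fp);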
