I have 5000 rows in the database and am trying to fetch the data in chunks and create one JSON file that contains all 5000 rows. My code:
ManagerFacade::findAllAsChunks(function ($timeSignals) {
    foreach ($timeSignals as $timeSignalIndex => $timeSignal) {
        $InputFile = '{';
        $InputFile .= '"v": ' . $timeSignal->v . ",";
        $InputFile .= '"t": ' . $timeSignal->t;
        $InputFile .= '}';
        if ($isLastChunk && !isset($timeSignals[$timeSignalIndex + 1])) {
            $InputFile .= '';
        } else {
            $InputFile .= ',';
        }
        file_put_contents($filePath, $InputFile, FILE_APPEND);
    }
});
I'm looking for a better way to create the JSON file without all the concatenation. Is there a way to just json_encode each $timeSignals chunk and write the chunks to the file one after another, without looping?
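One possible approach (a minimal sketch, not from the original post; it assumes $filePath is in scope and that each $timeSignal exposes public v and t properties): json_encode each chunk, strip the surrounding brackets, and append the pieces so they join into a single JSON array:

// Minimal sketch: encode each chunk with json_encode() and append it,
// stripping the [ and ] so consecutive chunks join into one JSON array.
// Assumes $filePath is defined and $timeSignals is (castable to) an array.
file_put_contents($filePath, '[');
$first = true;
ManagerFacade::findAllAsChunks(function ($timeSignals) use ($filePath, &$first) {
    $items = array_map(function ($s) {
        return ['v' => $s->v, 't' => $s->t];
    }, (array) $timeSignals);
    $json = substr(json_encode($items), 1, -1); // drop the outer brackets
    file_put_contents($filePath, ($first ? '' : ',') . $json, FILE_APPEND);
    $first = false;
});
file_put_contents($filePath, ']', FILE_APPEND);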
I was recently tasked with updating some older sites from MySQL to MySQLi.
Though slow and steady, the update was going fine until I ran into an issue when exporting some data to an Excel document.
This code was written by a previous developer. There's a lot going on in the file, and I hope I'm grabbing the part that is supposed to create the Excel document:
<?php
$export = mysqli_query($session->connDB(), $sql) or die("Sql error : " . mysqli_error());
$fields = mysqli_num_fields($export);
$num_rows = mysqli_num_rows($export);
$pattern = '/[A-Z0-9][A-Z0-9._-]+#[A-Z0-9][A-Z0-9.-]{0,61}[A-Z0-9]\.[A-Z.]{2,6}/i'; //email
$phonpat = '/(\(?([0-9]{3})+\)?[\-\s\.]?([0-9]{3})+[\-\s\.]?([0-9]{4})(( |-|\.)?[ext\.]+ ?\d+)|\(?([0-9]{3})+\)?[\-\s\.]?([0-9]{3})+[\-\s\.]?([0-9]{4}))/i'; //telephone
$phPat = '/([0-9]{3})([0-9]{3})([0-9]{4})/';
$vippatt = '/VIP/i';
for ($f = 0; $f < $fields; $f++) {
    $header .= '"' . mysqli_fetch_fields($export, $f) . '"' . "\t";
}
for ($i = 0; $i < $num_rows; $i++) {
    for ($x = 0; $x < $fields; $x++) {
        $email = mysqli_fetch_assoc($export, $i, "EMAIL");
        $phone = mysqli_fetch_assoc($export, $i, "PHONE");
        $viprm = mysqli_fetch_assoc($export, $i, "VIP");
        preg_match($pattern, $email, $matches);
        preg_match($phonpat, $phone, $phoneno);
        preg_match($vippatt, $viprm, $vpmatch);
        if (isset($matches[0])) { $emal = strtolower($matches[0]); } else { $emal = ""; }
        if (isset($vpmatch[0])) { $vips = strtoupper($vpmatch[0]); } else { $vips = ""; }
        if (isset($phoneno[0])) { $phne = preg_replace($phPat, '($1) $2-$3 ', formatPhone($phoneno[0], false, false)); } else { $phne = ""; }
        if (mysqli_fetch_fields($export, $x) == 'EMAIL') {
            $fld = $emal;
        } else {
            if (mysqli_fetch_fields($export, $x) == 'PHONE') {
                $fld = $phne;
            } else {
                if (mysqli_fetch_fields($export, $x) == 'VIP') {
                    $fld = $vips;
                } else {
                    if (mysqli_fetch_fields($export, $x) == 'UNITS') {
                        $fld = 1;
                    } else {
                        $fld = mysqli_fetch_assoc($export, $i, mysqli_fetch_fields($export, $x));
                    }
                }
            }
        }
        $data .= '"' . $fld . '"' . "\t";
    }
    $data .= "\n";
}
?>
Here is where the code checks if the data is blank or not, and then exports the spreadsheet:
<?php
if ($data == "") {
    $data = "\nNo records found for your search parameters.\n\n" . $sql;
} else {
    echo "should show data";
}
global $time;
$time = time();
header("Content-Disposition: attachment; filename=CargoManagementCustomReport-$time.xls");
header("Pragma: no-cache");
header("Expires: 0");
print "$header\n$data";
?>
When the spreadsheet gets exported, I see "should show data". This tells me the $data variable obviously has data. It's just not getting into the spreadsheet.
If you'll notice in the above, I'm using mysqli_fetch_fields. This was used to replace mysql_field_name (in my attempt to update to MySQLi).
I also tried mysqli_fetch_field, but got the same results.
I am getting no errors, but the spreadsheet is still blank.
I can echo $sql to get the query, and I can run the query in the database and it returns data.
What am I doing wrong and how can I fix it?
That whole code is gibberish, so I hope I understood what it was meant to do.
Here are the main problems:
mysqli_fetch_fields() takes only one argument and returns an array of objects; you can't cast that array to a string. I assume you wanted to get the field name (see the short sketch below).
mysqli_fetch_assoc() also takes only one argument and returns one row of data as an associative array, as the name suggests. It advances the internal pointer to the next row every time it is called. You are trying to use it as if it were mysql_result().
Your nested loops are very messy. I replaced them with simple foreach loops and replaced the nested if statements with a switch. While I would normally stay away from such constructs, this is the easiest way to migrate this code.
After removing all the mysqli nonsense, the code is now readable. It iterates over every field of every row, applying some transformations to some fields and concatenating the result into a string.
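For reference, a tiny sketch of the correct call shapes (illustrative only, not part of the fixed code below):

// Illustrative call shapes for the two functions:
$fields = mysqli_fetch_fields($export); // one argument; returns an array of objects
echo $fields[0]->name;                  // the field name lives on ->name
$row = mysqli_fetch_assoc($export);     // one argument; returns the next row as an assoc array
echo $row['EMAIL'];                     // columns are accessed by name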
Fixed code:
$conn = $session->connDB();
$export = mysqli_query($conn, $sql);

$pattern = '/[A-Z0-9][A-Z0-9._-]+#[A-Z0-9][A-Z0-9.-]{0,61}[A-Z0-9]\.[A-Z.]{2,6}/i'; //email
$phonpat = '/(\(?([0-9]{3})+\)?[\-\s\.]?([0-9]{3})+[\-\s\.]?([0-9]{4})(( |-|\.)?[ext\.]+ ?\d+)|\(?([0-9]{3})+\)?[\-\s\.]?([0-9]{3})+[\-\s\.]?([0-9]{4}))/i'; //telephone
$phPat = '/([0-9]{3})([0-9]{3})([0-9]{4})/';
$vippatt = '/VIP/i';

$header = '';
foreach (mysqli_fetch_fields($export) as $field) {
    $header .= '"' . $field->name . '"' . "\t";
}

$data = '';
foreach ($export as $row) {
    foreach ($row as $fieldName => $value) {
        switch ($fieldName) {
            case 'EMAIL':
                preg_match($pattern, $value, $matches);
                $data .= '"' . (isset($matches[0]) ? strtolower($matches[0]) : '') . '"' . "\t";
                break;
            case 'PHONE':
                preg_match($phonpat, $value, $phoneno);
                $phne = "";
                if (isset($phoneno[0])) {
                    $phne = preg_replace($phPat, '($1) $2-$3 ', formatPhone($phoneno[0], false, false));
                }
                $data .= '"' . $phne . '"' . "\t";
                break;
            case 'VIP':
                preg_match($vippatt, $value, $vpmatch);
                $data .= '"' . (isset($vpmatch[0]) ? strtoupper($vpmatch[0]) : '') . '"' . "\t";
                break;
            case 'UNITS':
                $data .= '"1"' . "\t";
                break;
            default:
                $data .= '"' . $value . '"' . "\t";
                break;
        }
    }
    $data .= "\n";
}
I am trying to export the data from a table in a SQL Server database to a CSV file.
The data is formatted correctly and placed in its own cell in the file, but the header is not: it is printed all in the first cell as one continuous stream.
Say you have a, b, c, d as headers:
The header is printed as abcd in the first cell instead of being split across individual cells. How do we separate them out?
Here is the code:
$flag = false;
if ($query) {
    while ($data = sqlsrv_fetch_array($query, SQLSRV_FETCH_ASSOC)) {
        foreach ($data as $key => $value) {
            if (!$flag) {
                // display field/column names as first row
                $out .= implode("\t", array_keys($data)) . "\n";
                //$out .= '"'.$head.'",';
                $flag = true;
            }
            // If the character " exists, then escape it, otherwise the csv file will be invalid.
            $pos = strpos($value, '"');
            if ($pos !== false) {
                $value = str_replace('"', '\"', $value);
            }
            $out .= '"' . $value . '",';
        }
        $out .= "\n";
    }
}
$out .= implode("\t", array_keys($data)) . "\n";
This builds a tab-separated line, but elsewhere you are using comma-separated values.
You probably want to use commas here as well:
$out .= implode(",", array_keys($data)) . "\n";
I need to update a file using php
Sample file:
#Start#
No. of records: 2
Name: My name,
Age: 18,
Date: 2013-07-11||
Name: 2nd name,
Age: 28,
Date: 2013-07-11||
#End#
I need to update 'No. of records' each time I add another record to the file, and the new record needs to go before '#End#'.
I'm using
$Handle = fopen($File, 'a');
$Data = .......
fwrite($Handle, $Data);
to add records
How can I edit 'No. of records' & add data before '#End#'?
Instead of modifying the file I would parse it, change the data in PHP and rewrite the file after that.
To achieve this, I would first create a function that parses the input into PHP arrays:
function parse($file) {
    $records = array();
    foreach (file($file) as $line) {
        if (preg_match('~^Name: (.*),~', $line, $matches)) {
            $record = array('name' => $matches[1]);
        }
        if (preg_match('~^Age: (.*),~', $line, $matches)) {
            $record['age'] = $matches[1];
        }
        if (preg_match('~^Date: (.*)\|\|~', $line, $matches)) {
            $record['date'] = $matches[1];
            $records[] = $record;
        }
    }
    return $records;
}
Secondly, I would create a function that flattens the arrays back into the same file format again:
function flatten($records, $file) {
    $str = '#Start#';
    $str .= "\n\n";
    $str .= 'No. of records: ' . count($records) . "\n\n";
    foreach ($records as $record) {
        $str .= 'Name: ' . $record['name'] . ",\n";
        $str .= 'Age: ' . $record['age'] . ",\n";
        $str .= 'Date: ' . $record['date'] . "||\n\n";
    }
    file_put_contents($file, $str . '#End#');
}
Then use it like this:
$records = parse('your.file');
var_dump($records);
$records[] = array(
    'name' => 'hek2mgl',
    'age'  => '36',
    'date' => '07/11/2013'
);
flatten($records, 'your.file');
If the file is relatively small (it easily fits in memory), you can use the file() function. It returns an array of lines, which you can iterate over, modify, and write back.
If the file is larger, you'll need to read it in a loop using fgets(), writing the data to a new temporary file and replacing the original file with it once you're done.
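A minimal sketch of that streaming approach (the function name and record format are illustrative): read the original line by line, bump the counter, and inject the new record just before the #End# marker:

// Sketch of the fgets()/temp-file approach; names are illustrative.
function appendRecord($file, $newRecord) {
    $in  = fopen($file, 'r');
    $tmp = tempnam(sys_get_temp_dir(), 'rec');
    $out = fopen($tmp, 'w');
    while (($line = fgets($in)) !== false) {
        if (preg_match('~^No\. of records: (\d+)~', $line, $m)) {
            $line = 'No. of records: ' . ($m[1] + 1) . "\n"; // bump the counter
        }
        if (trim($line) === '#End#') {
            fwrite($out, $newRecord . "\n"); // insert before the end marker
        }
        fwrite($out, $line);
    }
    fclose($in);
    fclose($out);
    rename($tmp, $file); // replace the original with the rewritten file
}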
I've run into this issue a few times.
I have a piece of code that exports data to a CSV. The method I'm using passes the result set to the template; the template cycles through the results, echoing the fields.
In the action:
$this->result = ObjectPeer::doSelect($criteria);
In the template:
foreach ($result as $row)
{
    echo $row->getValue1().','.$row->getValue2().','.$row->getValue3()...
}
However, if the result set is large, I'll run out of memory:
[error] PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 1556481 bytes) in exportSuccess.php on line 13, referer: https://mysite.com/module
The process is using over 250 MB, but the file it creates is only ~2 MB.
I can increase the amount of memory php.ini gives the process, but I'd rather not. If the export is large enough, I doubt I'll be able to give it enough memory anyway.
I've read a few other cases similar to this, which suggested unsetting $row after each echo. That didn't work in my case.
I assume there is a way to chunk this query and still build the whole file - can anyone recommend or point me to a clear tutorial?
Depending on the number of elements you want to retrieve (i.e. getValue1, getValue2, etc.), you can hydrate the result in a different way:
$stmt = ObjectPeer::doSelectStmt($criteria);
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    echo $row[0];
}
Also, you can check another solution using doSelectRS:
$rs = ObjectPeer::doSelectRS($criteria);
$rs->setFetchMode(ResultSet::FETCHMODE_ASSOC);
while ($rs->next()) {
    $records = $rs->getRow();
    // then use $records['key'] to retrieve information
}
A few other things you can check: this French article about memory leaks in Propel.
Here is a basic idea for writing to a CSV file. This doesn't show the connection to the database, but it covers everything afterwards; make sure to close the connection at the end.
$filename = 'c:\whatever.csv';
$fp = fopen($filename, "w") or die(mysql_error());
echo "Connected to " . $filename . " <br>";

$res = mysql_query("SELECT *
                    FROM $table1
                    WHERE Something
                   ");

// fetch a row and write the column names out to the file
echo mysql_error();
$row = mysql_fetch_assoc($res);
$line = "";
$comma = "";
foreach ($row as $name => $value) {
    $line .= $comma . '"' . str_replace('"', '""', $name) . '"';
    $comma = ",";
}
$line .= "\n";
fputs($fp, $line);

// move the result pointer back to the start
mysql_data_seek($res, 0);

// and loop through the actual data
echo mysql_error();
while ($row = mysql_fetch_assoc($res)) {
    $line = "";
    $comma = "";
    foreach ($row as $value) {
        $line .= $comma . '"' . str_replace('"', '""', $value) . '"';
        $comma = ",";
    }
    $line .= "\n";
    fputs($fp, $line);
}
I need to convert a CSV file to JSON on the server using PHP. I am using this script which works:
function csvToJSON($csv) {
    $rows = explode("\n", $csv);
    $i = 0;
    $len = count($rows);
    $json = "{\n" . ' "data" : [';
    foreach ($rows as $row) {
        $cols = explode(',', $row);
        $json .= "\n {\n";
        $json .= ' "var0" : "' . $cols[0] . "\",\n";
        $json .= ' "var1" : "' . $cols[1] . "\",\n";
        $json .= ' "var2" : "' . $cols[2] . "\",\n";
        $json .= ' "var3" : "' . $cols[3] . "\",\n";
        $json .= ' "var4" : "' . $cols[4] . "\",\n";
        $json .= ' "var5" : "' . $cols[5] . "\",\n";
        $json .= ' "var6" : "' . $cols[6] . "\",\n";
        $json .= ' "var7" : "' . $cols[7] . "\",\n";
        $json .= ' "var8" : "' . $cols[8] . "\",\n";
        $json .= ' "var9" : "' . $cols[9] . "\",\n";
        $json .= ' "var10" : "' . $cols[10] . '"';
        $json .= "\n }";
        if ($i !== $len - 1) {
            $json .= ',';
        }
        $i++;
    }
    $json .= "\n ]\n}";
    return $json;
}
$json = csvToJSON($csv);
$json = preg_replace('/[ \n]/', '', $json);
header('Content-Type: text/plain');
header('Cache-Control: no-cache');
echo $json;
The $csv variable is a string resulting from a cURL request which returns the CSV content.
I am sure this is not the most efficient PHP code for the job, as I am a beginner developer and my knowledge of PHP is limited. Is there a better, more efficient way to convert CSV to JSON using PHP?
Thanks in advance.
Note: I am aware that I am adding whitespace and then removing it. I do this so I can have the option of returning "readable" JSON for testing purposes, by removing the line $json = preg_replace('/[ \n]/', '', $json);.
Edit: Thanks for your replies; based on them, the new code looks like this:
function csvToJson($csv) {
    $rows = explode("\n", trim($csv));
    $csvarr = array_map(function ($row) {
        $keys = array('var0','var1','var2','var3','var4','var5','var6','var7','var8','var9','var10');
        return array_combine($keys, str_getcsv($row));
    }, $rows);
    $json = json_encode($csvarr);
    return $json;
}
$json = csvToJson($csv);
header('Content-Type: application/json');
header('Cache-Control: no-cache');
echo $json;
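For the "readable" JSON mentioned in my note, json_encode() also accepts a JSON_PRETTY_PRINT flag (PHP 5.4+), which is simpler than the preg_replace toggle:

// Pretty-printed variant; JSON_PRETTY_PRINT requires PHP 5.4+.
$json = json_encode($csvarr, JSON_PRETTY_PRINT);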
Well there is the json_encode() function, which you should use rather than building up the JSON output yourself. And there is also a function str_getcsv() for parsing CSV:
$array = array_map("str_getcsv", explode("\n", $csv));
print json_encode($array);
You must, however, adapt the $array if you want the JSON output to hold named fields.
I modified the answer in the question to use the first line of the CSV for the array keys. This has the advantage of not having to hard-code the keys in the function, allowing it to work for any CSV with column headers and any number of columns.
Here is my modified version:
function csvToJson($csv) {
    $rows = explode("\n", trim($csv));
    $data = array_slice($rows, 1);
    $keys = array_fill(0, count($data), $rows[0]);
    $json = array_map(function ($row, $key) {
        return array_combine(str_getcsv($key), str_getcsv($row));
    }, $data, $keys);
    return json_encode($json);
}
None of these answers work with multiline cells, because they all assume a row ends with '\n'. The built-in fgetcsv() function understands that multiline cells are enclosed in ", so it doesn't run into the same problem. The code below, instead of relying on '\n' to find each row of a CSV, lets fgetcsv() go row by row and prepare our output.
function csv_to_json($file) {
    $columns = fgetcsv($file); // first let's get the keys
    $output = array();         // we will build out an array of arrays here
    while (!feof($file)) {     // until the end of file, we'll pull in a new line
        $line = fgetcsv($file); // gets the next line
        if ($line === false) {  // guard against blank/trailing lines
            continue;
        }
        $lineObject = array();  // we build out each line with our $columns keys
        foreach ($columns as $key => $value) {
            $lineObject[$value] = $line[$key];
        }
        array_push($output, $lineObject);
    }
    return json_encode($output); // encode it as json before sending it back
}
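Note the function expects an open file handle rather than a path; a usage sketch (the filename is a placeholder):

// Usage sketch: pass an open handle, not a path.
$handle = fopen('data.csv', 'r');
echo csv_to_json($handle);
fclose($handle);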
Some tips...
If you have URL opening enabled for fopen() and its wrappers, you can use fgetcsv() (see the sketch after these tips).
You can build an array from the CSV, and then convert it with PHP's native json_encode().
The correct MIME type for JSON is application/json.
You could probably reduce the overhead by removing all the spaces and \n's. But that's in your note.
You could increase performance by skipping the preg_replace and passing a boolean that turns it on and off.
Other than that, the variable unrolling of your var[1-10] is actually fine, as long as the number of variables is always the same.
The explode and foreach approach are just fine.
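A short sketch pulling a few of these tips together (the URL is a placeholder; assumes allow_url_fopen is enabled and the CSV has a header row):

// Sketch combining the tips: fgetcsv() over a URL stream, then json_encode().
$handle = fopen('http://example.com/data.csv', 'r');
$columns = fgetcsv($handle); // header row becomes the keys
$rows = array();
while (($line = fgetcsv($handle)) !== false) {
    $rows[] = array_combine($columns, $line);
}
fclose($handle);

header('Content-Type: application/json'); // the correct MIME type for JSON
echo json_encode($rows);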
I recommend using Coseva (a CSV parsing library) and its built-in toJSON() method.
<?php
// load
require('../src/CSV.php');
// read
$csv = new Coseva\CSV('path/to/my_csv.csv');
// parse
$csv->parse();
// disco
echo $csv->toJSON();