PHP fputcsv not appending all data in csv - php

I have this simple function to write data to a CSV file, but it does not write all the data that I fetch.
Consider an example where I have 500 records to write, which should correspond to 500 rows in the CSV. When I run this operation it sometimes writes only 120 rows, or 150, or even just 10. Every time I run the function I get a different number of rows, never all 500.
I have not yet found the root cause of why it is not writing all 500 records that I fetch.
Following is the PHP code:
$arrHead = array('Assign to', 'Group assigned to', 'Customer name', 'Email', 'Phone', 'created Date', 'Ticket type',
    'Ticket Status', 'internal note', 'Ticket number', 'Subject', 'Origin', 'Travel Start Date', 'Destination',
    'Package Type', 'Lead Received', 'Mail box ID');
$csvfname = time() . '.csv';
if (Yii::app()->params['env'] == "LOCAL")
{
    $basePath = '/var/www/c360/uploads/1/1/';
    $basePath = '/var/www/html/c360/uploads/1/1/'; // Mangesh
}
else
{
    $basePath = '/data/cview/run/nginx/htdocs/c360/uploads/1/1/';
}
$completeFilePath = $basePath . $csvfname;
if (!empty($arrRecords) && count($arrRecords) > 0)
{
    $fp = fopen($completeFilePath, 'w+');
    fputcsv($fp, $arrHead, ",", '"');
    chmod($completeFilePath, 0777);
    foreach ($arrRecords as $keyTicket => $arrTicket)
    {
        // preparing columns data into an array
        $columnValue = array();
        $columnValue['col1'] = trim('some value');
        // ... more columns prepared here ...
        // after the data is prepared in the array I write it to the CSV
        fputcsv($fp, $columnValue, ",", '"');
    }
    fclose($fp);
}
After the function runs, when I check the CSV it has fewer rows appended, not all 500.
I don't know where I'm going wrong in this case.
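A minimal debugging sketch (not from the original post, assuming the same $arrHead, $arrRecords and $completeFilePath variables as above): check every fputcsv() return value and flush the stream before closing, so it becomes visible whether rows are being silently dropped or never written at all.
// Debugging sketch, illustrative only: count successful writes and flush before closing.
$fp = fopen($completeFilePath, 'w');
if ($fp === false) {
    die('Could not open ' . $completeFilePath);
}
if (fputcsv($fp, $arrHead, ",", '"') === false) {
    error_log('Failed to write CSV header row');
}
$written = 0;
foreach ($arrRecords as $keyTicket => $arrTicket) {
    $columnValue = array();
    $columnValue['col1'] = trim('some value'); // build the row as before
    if (fputcsv($fp, $columnValue, ",", '"') === false) {
        error_log('Failed to write row for ticket ' . $keyTicket);
    } else {
        $written++;
    }
}
fflush($fp);  // make sure buffered output reaches the file before closing
fclose($fp);
error_log('Rows written: ' . $written . ' of ' . count($arrRecords));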

Related

PHP memory exhausted while using array_combine in foreach loop

I'm having trouble when trying to use array_combine in a foreach loop. It ends up with an error:
PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 85 bytes) in
Here is my code:
$data = array();
$csvData = $this->getData($file);
if ($columnNames) {
    $columns = array_shift($csvData);
    foreach ($csvData as $keyIndex => $rowData) {
        $data[$keyIndex] = array_combine($columns, array_values($rowData));
    }
}
return $data;
The source CSV file I've used has approximately 1,000,000 rows. In this row:
$csvData = $this->getData($file)
I was using a while loop to read the CSV and assign it to an array, and it works without any problem. The trouble comes from array_combine and the foreach loop.
Do you have any idea how to resolve this, or simply a better solution?
UPDATED
Here is the code to read the CSV file (using a while loop):
$data = array();
if (!file_exists($file)) {
    throw new Exception('File "' . $file . '" do not exists');
}
$fh = fopen($file, 'r');
while ($rowData = fgetcsv($fh, $this->_lineLength, $this->_delimiter, $this->_enclosure)) {
    $data[] = $rowData;
}
fclose($fh);
return $data;
UPDATED 2
The code above works without any problem if you are playing around with a CSV file of roughly 20,000-30,000 rows or fewer. From 50,000 rows and up, the memory gets exhausted.
You're in fact keeping (or trying to keep) two distinct copies of the whole dataset in memory. First you load the whole CSV data into memory using getData(), and then you copy the data into the $data array by looping over the data in memory and creating a new array.
You should use stream-based reading when loading the CSV data, so that you keep just one copy of the data set in memory. If you're on PHP 5.5+ (which you definitely should be, by the way), this is as simple as changing your getData method to look like this:
protected function getData($file) {
    if (!file_exists($file)) {
        throw new Exception('File "' . $file . '" do not exists');
    }
    $fh = fopen($file, 'r');
    while ($rowData = fgetcsv($fh, $this->_lineLength, $this->_delimiter, $this->_enclosure)) {
        yield $rowData;
    }
    fclose($fh);
}
This makes use of a so-called generator, which is a PHP >= 5.5 feature. The rest of your code should continue to work, as the inner workings of getData should be transparent to the calling code (only half of the truth).
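To illustrate that caveat (this sketch is not from the original answer): a generator is not an array, so array functions such as the array_shift($csvData) call in the original code no longer work on the return value; the rows can only be iterated.
// Illustrative sketch only: generators can be iterated but are not arrays.
function rows() {
    yield array('id', 'name');   // header row
    yield array(1, 'Alice');     // data row
}

$gen = rows();
// array_shift($gen);            // fails: array_shift() expects an array, not a Generator
foreach ($gen as $keyIndex => $rowData) {
    print_r($rowData);           // generators are consumed with foreach (or current()/next())
}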
UPDATE to explain how extracting the column headers will work now:
$data = array();
$csvData = $this->getData($file);
if ($columnNames) { // don't know what this one does exactly
    $columns = null;
    foreach ($csvData as $keyIndex => $rowData) {
        if ($keyIndex === 0) {
            $columns = $rowData;
        } else {
            $data[$keyIndex /* -1 if you need 0-index */] = array_combine(
                $columns,
                array_values($rowData)
            );
        }
    }
}
return $data;

Incrementally read a file and put in DB. Doesn't give errors, but does not insert data completely or correctly

I am trying to write a file to a database 500 lines at a time, so that I do not run low on memory by avoiding very large arrays. For some reason I am not getting any errors, but I am seeing only a very, very small fraction of the file entered into my table.
$ln = intval(shell_exec("wc -l $text_filename_with_path"));
echo "FILENAME WITH PATH: " . $text_filename_with_path . "\n\n";
echo "ARRAY LENGTH: " . $ln . "\n\n";
//pointer is initialized at zero
$fp = fopen($text_filename_with_path, "r");
$offset = 0;
$c = 0;
while ($offset < $ln) {
    $row_limit = 500;
    //get a 500 row section of the file
    $chunk = fgets($fp, $row_limit);
    //prepare for `pg_copy_from` by exploding to array
    $chunk = explode("\n", $chunk);
    //each record from the file being read is just one element
    //prepare for three column DB table by adding columns (one
    //unique PK built from UNIX time concat with counter, the
    //other from a non-unique batch ID)
    array_walk($chunk,
        function (&$item, $key) use ($datetime, $c) {
            $item = time() . $c . $key . "\t" . $datetime . "\t" . $item;
        }
    );
    //increase offset in order to move the pointer forward
    $offset += $row_limit;
    //set pointer ahead to new position
    fseek($fp, $offset);
    echo "CURRENT POINTER: " . ftell($fp) . "\n"; //prints out 500, 1000, 1500 as expected
    //insert array directly into DB from array
    pg_copy_from($con, "ops.log_cache_test", $chunk, "\t", "\\NULL");
    //increment to keep PK column unique
    $c++;
}
As I say, I am getting only a fraction of the contents of the file, and a lot of the data looks a bit messed up, e.g. about half the entries are blank in the part of the array element that gets assigned by $item within my array_walk() callback. Further, it seems that exploding on \n is not working properly, as lines seem to be exploded at non-uniform positions (i.e. the log records don't look symmetrical). Have I just made a total mess of this?
You are not using fgets properly (the 2nd parameter isn't the number of rows; it is a maximum length in bytes).
There are two ways I can think of at the moment to solve it:
1. A loop getting one line at a time, until you've reached your row limit.
The code should look something like this (not tested, assuming the end-of-line char is "\n" and there is no "\r"):
<?php
/** Your code and initialization here */
while (!feof($file)) {
    $counter = 0;
    $buffer = array();
    while (($line = fgets($file)) !== false && $counter < $row_limit) {
        $line = str_replace("\n", "", $line); // fgets returns the line with the newline char at the end
        $buffer[] = $line;
        $counter++;
    }
    insertRows($buffer); // insert the buffer we just filled (the original called insertRows($rows), which is undefined here)
}

function insertRows($rows) {
    /** your code here */
}
?>
2. Assuming the file isn't too big: using file_get_contents().
The code should look something like this (same assumptions):
<?php
/** Your code and initialization here */
$data = file_get_contents($filename);
if ($data === FALSE) {
    echo "Could not get content for file $filename\n";
}
$data = explode("\n", $data);
for ($offset = 0; $offset < count($data); $offset += $row_limit) {
    insertRows(array_slice($data, $offset, $row_limit)); // slice $data (the original sliced an undefined $rows)
}

function insertRows($rows) {
    /** your code here */
}
I didn't test it, so I hope it's ok.
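For completeness, here is a sketch (not part of the original answer) of what insertRows() could look like if it keeps using pg_copy_from() the way the question does, with the same tab-delimited three-column rows; $con, $datetime, $c and the table name are taken from the question.
// Illustrative sketch of insertRows(), reusing the question's variables.
function insertRows($rows)
{
    global $con, $datetime, $c; // the question's connection, batch date and counter
    // Prefix each line with the PK / batch-ID columns, exactly as the
    // question's array_walk() callback did.
    array_walk($rows, function (&$item, $key) use ($datetime, $c) {
        $item = time() . $c . $key . "\t" . $datetime . "\t" . $item;
    });
    // Bulk-copy the whole chunk into the table in one call.
    if (!pg_copy_from($con, "ops.log_cache_test", $rows, "\t", "\\NULL")) {
        echo "pg_copy_from failed: " . pg_last_error($con) . "\n";
    }
    $c++; // keep the PK column unique across chunks, as in the question
}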

Edit specific record from a line in text file and PHP

I'm trying to make a simple news hit counter with PHP and a text file. I wrote some simple code to check and read the file:
Text File:
//Data in Source File
//Info: News-ID|Hits|Date
1|32|2013-9-25
2|241|2013-9-26
3|57|2013-9-27
PHP File:
//Get Source
$Source = ENGINE_DIR . '/data/top.txt';
$Read = file($Source);
//Add New Record
foreach ($Read as $News) {
    //Match News ID
    if ($News[0] == "2") {
        //Add New Record and Update the Text File
    }
}
The problem is I can't change the news hits! For example, I need to change the hits on the second line from 241 to 242 and write it back into the txt file.
I searched this site and Google and tried some approaches, but I couldn't fix it.
At the least, you're forgetting to write the increment back to the file. Also, you're going to want to parse each row into columns you can work with (delimited by a pipe |).
Untested code, but the idea is:
$Source = ENGINE_DIR . '/data/top.txt'; // you already have this line
$Read = file($Source, FILE_IGNORE_NEW_LINES); // and this one (strip trailing newlines so the implode below doesn't double them)
foreach ($Read as $LineNum => $News) { // iterate through each line
    $NewsParts = explode('|', $News); // expand the line into pieces to work with
    if ($NewsParts[0] == 2) { // if the first column is 2
        $NewsParts[1]++; // increment the second column
        $Read[$LineNum] = implode('|', $NewsParts); // glue the line back together; we're updating the $Read array directly, rather than the copied variable $News
        break; // we're done, so exit the loop, saving cycles
    }
}
$UpdatedContents = implode(PHP_EOL, $Read); // put the read lines back together (remember $Read has been updated) using "\n" or "\r\n", whichever is best for the OS you're running on
file_put_contents($Source, $UpdatedContents); // overwrite the file
You could read the file and do something like this:
//Get Source
$Source = ENGINE_DIR . '/data/top.txt';
$Read = file($Source);
$News = array();
foreach ($Read as $line) {
    list($id, $views, $date) = explode('|', trim($line)); // trim the trailing newline
    $News[$id] = array(
        'id' => $id,
        'views' => $views,
        'date' => $date,
    );
}
At this point you have the array $News which contains every news item and you can change them as you wish (example: $News[2]['views'] = 242;).
The only thing you're missing now is the writing back to the file part, which is also easy.
$fh = fopen(ENGINE_DIR . '/data/top.txt', 'w'); //'w' mode opens the file for write and truncates it
foreach ($News as $item) {
    fwrite($fh, $item['id'] . '|' . $item['views'] . '|' . $item['date'] . "\n");
}
fclose($fh);
And that's it! :)

Export large rows to Excel document, in small memory footprint

I am using PHPExcel to create an Excel document, using data from a MySQL database. My script must execute in under 512MB of RAM, and I am running into trouble as my export reaches 200k records:
PHP Fatal error: Allowed memory size of...
How can I use PHPExcel to create large documents in as little amount of RAM as possible?
My current code:
// Autoload classes
ProjectConfiguration::registerPHPExcel();
$xls = new PHPExcel();
$xls->setActiveSheetIndex(0);
$i = 0;
$j = 2;
// Write the col names
foreach ($columnas_excel as $columna) {
    $xls->getActiveSheet()->setCellValueByColumnAndRow($i, 1, $columna);
    $xls->getActiveSheet()->getColumnDimensionByColumn($i)->setAutoSize(true);
    $i++;
}
// paginate the result from database
$pager = new sfPropelPager('Antecedentes', 50);
$pager->setCriteria($query_personas);
$pager->init();
$last_page = $pager->getLastPage();
// write the data to the excel object
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        $i = 0;
        foreach ($columnas_excel as $key_col => $columnas) {
            $xls->getActiveSheet()->setCellValueByColumnAndRow($i, $j, $persona->getByName($key_col, BasePeer::TYPE_PHPNAME));
            $i++;
        }
        $j++;
    }
}
// write the file to the disk
$writer = new PHPExcel_Writer_Excel2007($xls);
$filename = sfConfig::get('sf_upload_dir') . DIRECTORY_SEPARATOR . "$cache.listado_personas.xlsx";
if (file_exists($filename)) {
    unlink($filename);
}
$writer->save($filename);
CSV version:
// Write the col names to the file
$columnas_key = array_keys($columnas_excel);
file_put_contents($filename, implode(",", $columnas_excel) . "\n");
// write data to the file
for ($pagina = 1; $pagina <= $last_page; $pagina++) {
    $pager->setPage($pagina);
    $pager->init();
    foreach ($pager->getResults() as $persona) {
        $persona_arr = array();
        // make an array
        foreach ($columnas_excel as $key_col => $columnas) {
            $persona_arr[] = $persona->getByName($key_col, BasePeer::TYPE_PHPNAME);
        }
        // append to the file
        file_put_contents($filename, implode(",", $persona_arr) . "\n", FILE_APPEND | LOCK_EX);
    }
}
I still have the RAM problem when Propel makes requests to the database; it is as if Propel does not release the RAM every time it makes a new request. I even tried creating and deleting the Pager object in each iteration.
Propel has formatters in the Query API; you'll be able to write this kind of code:
<?php
$query = AntecedentesQuery::create()
// Some ->filter()
;
$csv = $query->toCSV();
$csv contains CSV content that you'll be able to render by setting the correct mime-type.
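For instance (this sketch is not part of the original answer, and the filename is just an example), sending that $csv string to the browser could look like this:
// Illustrative only: send the CSV string built above as a download.
// Note: header() only works if nothing has been output yet.
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="antecedentes.csv"');
header('Content-Length: ' . strlen($csv));
echo $csv;
exit; // stop so nothing else is appended to the response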
Since it appears you can use a CSV, try pulling 1 record at a time and appending it to your CSV. Don't try to get all 200k records at the same time.
$cursor = mysql_query( $sqlToFetchData ); // get a MySql resource for your query
$fileHandle = fopen( 'data.csv', 'a'); // use 'a' for Append mode
while( $row = mysql_fetch_row( $cursor ) ){ // pull your data 1 record at a time
    fputcsv( $fileHandle, $row ); // append the record to the CSV file
}
fclose( $fileHandle ); // clean up
mysql_free_result( $cursor ); // free the result resource (mysql_close() expects a connection, not a result)
I'm not sure how to transform the CSV into an XLS file, but hopefully this will get you on your way.

Issue creating CSV in PHP MySQL

I have to run a pairing algorithm for a game, and when the pairing is done I display the pairing in HTML and create a CSV file as well. Right now, once I am done with pairing, I create a multidimensional array to store the specific values and then pass it to a function in the same PHP file to generate the CSV file. However, doing this outputs the entire page code, i.e. the HTML and PHP code, to the .csv file. Here is the code:
function performPairing()
{
    // ....
    $count = 0;
    $resultArray[][] = array();
    while ($currrow = @mysql_fetch_row($result))
    {
        $playerone = $currrow;
        $playertwo = @mysql_fetch_row($result);
        $resultArray[$count][] = $playerone[1];
        $resultArray[$count][] = $playerone[0];
        $resultArray[$count][] = $playertwo[1];
        $resultArray[$count][] = $playertwo[0];
        $count++;
        updateforeachrow($playerone, $playertwo);
    }
    generateDocument($resultArray, $count);
}

function generateDocument($resultArray, $count)
{
    $output = fopen('php://temp/maxmemory'.(5*1024*1024), 'r+');
    $columns = array('Player One Col1', 'Player One Col2', 'Player Two Col1', 'Player Two Col2');
    fputcsv($output, $columns);
    for ($index = 0; $index <= $count; $index++)
    {
        fputcsv($output, $resultArray[$index]);
    }
    rewind($output);
    $export = stream_get_contents($output);
    fclose($output);
    header('Content-type: application/octet-stream');
    header('Content-Disposition: attachment; filename = "export.csv"');
    echo $export;
}
However, doing this outputs the entire HTML code to the CSV rather than the specific rows. Can anyone please help me with this?
1. make the string
2. output the string to a file
3. send the header (is this allowed this way?)
Use file_put_contents(filename, str), and then send it with headers.
Make the code simpler :)
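A sketch of that idea (not from the original answer; the file name and the exit call are additions): build the CSV into a string first, persist it with file_put_contents(), then send the headers and the string, and stop before the rest of the page's HTML is emitted.
// Illustrative sketch: build the CSV in memory, save it, then send it.
function generateDocument($resultArray, $count)
{
    $fh = fopen('php://temp', 'r+');
    fputcsv($fh, array('Player One Col1', 'Player One Col2', 'Player Two Col1', 'Player Two Col2'));
    for ($index = 0; $index < $count; $index++) {
        fputcsv($fh, $resultArray[$index]);
    }
    rewind($fh);
    $csv = stream_get_contents($fh);
    fclose($fh);

    file_put_contents('export.csv', $csv); // output the string to a file

    // Note: header() only works if nothing has been output yet.
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="export.csv"');
    echo $csv;
    exit; // stop here so the page's HTML is not appended to the download
}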
