How to optimize PHPExcel when it's very slow? - php

I am exporting an Excel file using PHPExcel. Outputting 2,000 rows takes about 10 seconds. The problem is that when I output 20,000 rows I get a timeout error. Initially max_execution_time = 30; I set it to max_execution_time = 60 (the same value as on the main server). Is there a way to minimize the export time, since the data is sometimes very large, up to a few hundred thousand rows? Please help me. Thanks. Sorry, my English is not good.
Here is my code:
// array example
// 20,000 rows
$data = [
    [
        'name'     => 'hello',
        'address'  => 'usa',
        'birthday' => '2021-04-30'
    ],
    [
        'name'     => 'hello',
        'address'  => 'usa',
        'birthday' => '2021-04-30'
    ],
];
// Sample processing code
$exRow = 2;
foreach ($data as $val) {
    $i = 0;
    foreach ($val as $cellValue) { // write each field into its own column
        $excel->getActiveSheet()->setCellValueByColumnAndRow($i, $exRow, $cellValue);
        $i++;
    }
    $exRow++;
}
header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
header('Content-Disposition: attachment; filename="data.xlsx"');
PHPExcel_IOFactory::createWriter($excel, 'Excel2007')->save('php://output');

You just need to increase max_execution_time in the php.ini file. If you don't have access to it, you can create a php.ini or .user.ini file and put this line in it:
max_execution_time=500
You can also use the yield keyword to add the Excel rows. This lets you build the rows lazily, as sketched below.
Visit this link.
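
A minimal sketch of that generator idea, assuming $excel is the PHPExcel instance from the question and fetchRows() is a hypothetical stand-in for wherever the data actually comes from:

<?php
// Sketch only: feed rows to PHPExcel through a generator instead of a
// pre-built 20,000-element array. fetchRows() is a hypothetical data
// source - replace it with however the rows are actually read.
function rowGenerator(): Generator
{
    foreach (fetchRows() as $record) {
        // yield one row at a time; nothing is accumulated here
        yield [$record['name'], $record['address'], $record['birthday']];
    }
}

$sheet = $excel->getActiveSheet();
$exRow = 2;
foreach (rowGenerator() as $row) {
    $col = 0;
    foreach ($row as $value) {
        $sheet->setCellValueByColumnAndRow($col, $exRow, $value);
        $col++;
    }
    $exRow++;
}

Keep in mind this only avoids holding the source data in one big array; PHPExcel itself still keeps every written cell in memory, so for hundreds of thousands of rows its cell-caching options (PHPExcel_Settings::setCacheStorageMethod()) or a plain CSV export usually help more than the loop style.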

Related

read text file and insert into mysql php

I have a huge text file and I am trying to read it and insert it into MySQL line by line.
This is the txt file data:
'REG','KOIL','Kohinoor Industries Ltd.','READY',4.82,2.82,3.82
'REG','EPQL','Engro Powergen Qadirpur Ltd.','READY',36.9495,33.4305,35.19
Function for inserting the data:
$file_path = FCPATH.'uploads/text/'.$file_name;
$psx_date  = $this->input->post('file_date');

$open = fopen($file_path, "r");
$i = 1;
while (!feof($open)) {
    $line = fgets($open);
    if ($i > 2) { // skip the first two lines of the file
        $values = explode(",", $line);
        $psx_symbol = str_replace("'", '', $values[1]);

        // only insert if this symbol/date combination does not already exist
        $no_of_rows = read_psx_where($psx_symbol, $psx_date);
        if ($no_of_rows <= 0) {
            $psx_data = array(
                'PSX_SYMBOL'   => $psx_symbol,
                'PSX_DATE'     => $psx_date,
                'PSX_HIGH'     => $values[4],
                'PSX_LOW'      => $values[5],
                'PSX_CLOSE'    => $values[6],
                'PSX_DATETIME' => date('Y-m-d H:i:s'),
                'PSX_SATUS'    => 1
            );
            insert_psx_data($psx_data);
        }
    }
    $i++;
}
fclose($open);
I first skip the first two lines of the text file, and then I check whether the same symbol already exists; if it does, I skip that line.
This method works, but it is much too slow and exceeds the max execution time.
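
For reference, a minimal sketch of how the same work could be batched, assuming a plain PDO connection $pdo, a hypothetical table name psx_data, and a UNIQUE index on (PSX_SYMBOL, PSX_DATE) so that INSERT IGNORE replaces the per-line existence check ($file_path and $psx_date are the same values as in the code above; this is not the asker's framework code):

<?php
// Sketch only: one prepared statement plus chunked transactions instead
// of an existence check and a separate insert for every line.
$stmt = $pdo->prepare(
    'INSERT IGNORE INTO psx_data
        (PSX_SYMBOL, PSX_DATE, PSX_HIGH, PSX_LOW, PSX_CLOSE, PSX_DATETIME, PSX_SATUS)
     VALUES (?, ?, ?, ?, ?, ?, 1)'
);

$open = fopen($file_path, 'r');
$i = 1;

$pdo->beginTransaction();
while (($line = fgets($open)) !== false) {
    if ($i++ <= 2) {
        continue; // skip the first two lines of the file
    }
    $values = explode(',', $line);
    $symbol = str_replace("'", '', $values[1]);

    $stmt->execute([$symbol, $psx_date, $values[4], $values[5], $values[6], date('Y-m-d H:i:s')]);

    if ($i % 1000 === 0) { // commit in chunks so no single transaction grows too large
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();
fclose($open);

The single prepared statement avoids re-parsing the SQL for every line, and the duplicate check is pushed into the database via the unique index, which removes one round trip per row.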

Proper way to use PHP fputcsv function to export CSV file

What I want to do is export a dataset for Excel without using extra, heavy third-party libraries.
The problem is that when I export the file, the first row looks fine, but starting from the second row all of the $row data ends up in the first field of that row.
So I get a file whose first row is split correctly across the columns, while from the second row onward the whole text sits in the first field, separated by the delimiter (comma).
Here is the code snippet I use:
$counter = 0;
$file = fopen("sample.csv", "w");
foreach ($registrants as $registrant) {
    $row = [
        'Fullname' => $registrant->fullname,
        'Phone'    => $registrant->phone,
        'Email'    => $registrant->email,
    ];
    if ($counter == 0) {
        fputcsv($file, array_keys($row)); // header row
    }
    fputcsv($file, $row);
    $counter++;
}
fclose($file);
I also tried
fputcsv($file, array_values($row), ';', ' ');
with no success. What am I doing wrong? What is the proper way to get a correct result in all Excel versions, regardless of OS, Excel locale, etc.?
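
For what it's worth, the snippet above already produces a valid CSV, so the problem is usually on the side that opens the file. Two tweaks that commonly make CSVs open more predictably in Excel are a UTF-8 BOM and an explicit sep= hint; a sketch, with the caveat that some Excel versions ignore the sep= line when a BOM is present, so test against your target Excel:

<?php
// Sketch only: the same export as above, made more Excel-friendly.
$file = fopen('sample.csv', 'w');

fwrite($file, "\xEF\xBB\xBF");  // UTF-8 BOM, keeps non-ASCII names intact
fwrite($file, "sep=,\r\n");     // delimiter hint that Excel understands

$counter = 0;
foreach ($registrants as $registrant) {
    $row = [
        'Fullname' => $registrant->fullname,
        'Phone'    => $registrant->phone,
        'Email'    => $registrant->email,
    ];
    if ($counter === 0) {
        fputcsv($file, array_keys($row)); // header row
    }
    fputcsv($file, $row);
    $counter++;
}
fclose($file);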

fputcsv - running out of memory during creation of larger files

I sometimes create large CSV files from DB data for users to download - 100k rows or more. It appears I am running into a memory issue during CSV creation on some of the larger files. Here is an example of how I currently handle creating the CSV.
Is there any way around this? The memory limit was originally 32 MB and I changed it to 64 MB, but I am still having the issue.
//columns array
$log_columns = array(
    '1', '2', '3', '4', '5', '6', '7', '8', '9'
);

//results from the db
$results = $log_stmt->fetchAll(PDO::FETCH_ASSOC);

$log_file = 'test.csv';
$log_path = $_SERVER['DOCUMENT_ROOT'].'/../user-data/'.$_SESSION['user']['account_id'].'/downloads/';

// if location does not exist create it
if (!file_exists($log_path)) {
    mkdir($log_path, 0755, true);
}

// open file handler
$fp = fopen($log_path.$log_file, 'wb');

// write the csv column titles / labels
fputcsv($fp, $log_columns);

//are there any logs?
if ($results) {
    //write the rows
    foreach ($results as $row) {
        //rows array
        $log_rows = array(
            $row['1'], $row['2'], $row['3'],
            $row['4'], $row['5'], $row['6'],
            $row['7'], $row['8'], $row['9']
        );
        //write the row
        $newcsv = fputcsv($fp, $log_rows);
    } //end foreach
} else {
    // there were no results so just return an empty log
    $newcsv = fputcsv($fp, array('No results found.'));
}

//close handler
fclose($fp);

// if csv was created return true
if ($newcsv) {
    return true;
}
UPDATE:
Using a while loop and fetch instead of foreach and fetchAll still produces a memory error.
while($result = $log_stmt->fetch(PDO::FETCH_ASSOC))
How is that possible if I am only loading one row at a time?
UPDATE 2:
I have further tracked this down to the while loop using memory_get_usage():
echo (floor( memory_get_usage() / 1024) ).' kb<br />';
Before the while loop starts the result is 4658 kB, and then it increases by about 1 kB every 2-3 iterations of the loop until it reaches the 32748 kB memory limit.
What can I do to solve this issue?
UPDATE 3:
I played around with this more today... the way it works just does not make much sense to me - I can only assume it is strange behaviour in PHP's garbage collector.
Scenario 1: my query gets all 80k rows and a while loop outputs them. Memory used is around 4500 kB after the query is executed, then it increases by 1 kB for every two to three rows output in the loop. Memory is not released whatsoever, and at some point it crashes with an out-of-memory error.
while ($results = $log_stmt->fetch(PDO::FETCH_ASSOC)) {
    echo $results['timestamp'].'<br/>';
}
Scenario 2: my query is now run in a loop that gets 1000 rows at a time, with an inner loop outputting each row. Memory maxes out around 400 kB as it loops, and it completes the entire output with no memory issues.
For this example I just used a counter of 80, as I know there are more than 80k rows to retrieve. In reality I would obviously have to do this differently.
$t_counter = 0;
while ($t_counter < 80) {
    // set bindings
    $binding = array(
        'cw_start' => $t_counter * 1000,
        // some other bindings...
    );
    $log_stmt->execute($binding);

    echo $t_counter.' after statement '.floor(memory_get_usage() / 1024).' kb<br />';

    while ($results = $log_stmt->fetch(PDO::FETCH_ASSOC)) {
        echo $results['capture_timestamp'].'<br/>';
    }

    echo $t_counter.' after while '.floor(memory_get_usage() / 1024).' kb<br />';

    $t_counter++;
}
So I guess my question is: why does the first scenario keep increasing memory usage with nothing being released? There are no new variables in that while loop and everything is 'reused'. The exact same situation happens in the second scenario, just inside another loop.
fetchAll fetches all records into memory at once. Why not just run the query and loop with fetch? Then it does not need to load the whole result set into memory, as sketched below the link.
http://php.net/manual/en/pdostatement.fetch.php
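
To illustrate that suggestion with the question's own variables, a minimal sketch that streams each fetched row straight into fputcsv ($pdo stands for whatever PDO connection produced $log_stmt; the unbuffered-query attribute only applies to the MySQL driver):

<?php
// Sketch only: write each row as it is fetched instead of fetchAll().
// With MySQL, turning off buffered queries also keeps the full result
// set out of PHP's memory on the client side.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$log_stmt->execute(); // execute with whatever bindings you already use

$fp = fopen($log_path.$log_file, 'wb');
fputcsv($fp, $log_columns); // header row

while ($row = $log_stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($fp, array_values($row)); // one row at a time, nothing accumulated
}

fclose($fp);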
I also think you should try processing the data in chunks: read a batch, append it to one CSV file, and that way you free memory during the process.
You could do a count(*), but try to get the total count before collecting the data in multiple passes.
I have been using PHP's CSV functions myself; I even use them as a database system (NoSQL).
Try this.
CSV code for reading:
<?php
$CSVfp = fopen("filename.csv", "r");
if ($CSVfp !== FALSE) {
    $con = 1;
    while (!feof($CSVfp)) {
        // do something with each line
    }
}
?>
CSV code for writing:
<?php
$list = array(
    "edmond,dog,cat,redonton",
    "Glenn,Quagmire,Oslo,Norway",
);

$file = fopen("filename.csv", "w");
foreach ($list as $line) {
    fputcsv($file, explode(',', $line));
}
fclose($file);
?>

Cache multiple pages/images from Instagram

I'm working on a small project where users can see images tagged with, in this case, "kitties". Instagram only allows 5000 requests/hour; I don't think I will reach that, but I'm choosing to cache anyway, also because I can't figure out how to get the back-link to work.
I can only get the link for the next page; the link for the most recent page then becomes the current page, i.e. a link to itself.
Also, the API can return a strange number of images, sometimes 14, sometimes 20, and so on. I want it to always show 20 images per page and to have only 5 pages (100 images), and then update this file every 5-10 minutes or so.
So my plan is to store about 100 images in a file. I got it working, but it's incredibly slow.
The code looks like this:
$cachefile = "instagram_cache/".TAG.".cache";
$num_requests = 0; // just for development, to check how many requests it makes

// if the file does not exist or is older than UPDATE_CACHE_TIME seconds
if (!file_exists($cachefile) || time() - filemtime($cachefile) > UPDATE_CACHE_TIME)
{
    $images = array();
    $current_file = "https://api.instagram.com/v1/tags/".TAG."/media/recent?client_id=".INSTAGRAM_CLIENT_ID;
    $current_image_index = 0;

    for ($i = 0; $i >= 0; $i++)
    {
        // get data from the API
        $contents = file_get_contents($current_file);
        $num_requests++;

        // decode it!
        $json = json_decode($contents, true);

        // get what we want!
        foreach ($json["data"] as $x => $value)
        {
            array_push($images, array(
                'img_nr'       => $current_image_index,
                'thumb'        => $value["images"]["thumbnail"]["url"],
                'fullsize'     => $value["images"]["standard_resolution"]["url"],
                'link'         => $value["link"],
                'time'         => date("d M", $value["created_time"]),
                'nick'         => $value["user"]["username"],
                'avatar'       => $value["user"]["profile_picture"],
                'text'         => $value['caption']['text'],
                'likes'        => $value['likes']['count'],
                'comments'     => $value['comments']['data'],
                'num_comments' => $value['comments']['count'],
            ));

            // check if the requested amount of images has been reached...
            if ($current_image_index > MAXIMUM_IMAGES_TO_GET)
                break;

            $current_image_index++;
        }

        // check again here, to break out of the outer loop as well...
        if ($current_image_index > MAXIMUM_IMAGES_TO_GET)
            break;

        if ($json['pagination']['next_url'])
            $current_file = $json['pagination']['next_url'];
        else
            break; // no more pages to get!
    }

    file_put_contents($cachefile, json_encode($images));
}
This feels like a very ugly hack - any ideas on how to make this work better?
Or can someone tell me how to make that "back-link" work like it should? (Yes, I could use JS and go -1 in history, but no!)
Any ideas, suggestions, help, comments, etc. are appreciated.
Why not subscribe to the real-time API and store the images in the DB? Then, when they are rendered, you can check whether the image URL is still valid (i.e. whether the photo has been deleted). Getting the data from your own DB will be much faster than getting it from Instagram.
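
As a rough illustration of the validity check mentioned above (nothing Instagram-specific, just a plain HTTP HEAD request against a stored URL using PHP's built-in get_headers()):

<?php
// Sketch only: check whether a stored image URL still resolves.
// Returns true when the first response line carries a 2xx status;
// redirects would need extra handling.
function imageStillExists(string $url): bool
{
    // ask for headers only so the image body is not downloaded
    $context = stream_context_create(['http' => ['method' => 'HEAD', 'timeout' => 5]]);
    $headers = @get_headers($url, false, $context);

    if ($headers === false) {
        return false; // DNS failure, timeout, etc.
    }

    // $headers[0] looks like "HTTP/1.1 200 OK"
    return (bool) preg_match('#^HTTP/\S+\s+2\d\d#', $headers[0]);
}

// usage when rendering: skip images that have been deleted
// if (!imageStillExists($row['thumb'])) { continue; }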

Errors while exporting to .xls using PHP

I've asked a similar question previously, but I've been told my question is just me being lazy, so let me rephrase.
I've been using a PHP class to export my SQL data to a .xls file, but the resulting Excel file doesn't display any values, and no error is shown on the web page itself.
The class file I'm using is documented at the link below:
http://web.burza.hr/blog/php-class-for-exporting-data-in-multiple-worksheets-excel-xml/
And I've incorporated it into my site as follows:
$dbase->loadextraClass('excel.xml');
$excel = new excel_xml();

$header_style = array(
    'bold'    => 1,
    'size'    => '14',
    'color'   => '#000000',
    'bgcolor' => '#ffffff'
);
$excel->add_style('header', $header_style);

if (isset($_POST['fec_save']))
{
    if ($_POST['reporttype'] == 'films')
    {
        $films = $dbase->runquery("SELECT datetime,title,country_of_origin,language,runningtime,(SELECT name FROM fec_client WHERE filmid = fec_film.filmid) AS client, (SELECT rating_decision FROM fec_rating_report WHERE filmid = fec_film.filmid) AS rating FROM fec_film WHERE datetime >= '".strtotime($_POST['fromdate'])."' AND datetime <= '".strtotime($_POST['todate'])."'", 'multiple');
        $filmcount = $dbase->getcount($films);

        // continue with excel buildup
        $columns = array('Date','Title','Origin','Language','Minutes','Client','Rating');
        $excel->add_row($columns, 'header');

        for ($i = 1; $i <= $filmcount; $i++)
        {
            $film = $dbase->fetcharray($films);
            $excel->add_row($film['datetime'], $film['title'], $film['country_of_origin'], $film['language'], $film['runningtime'], $film['client'], $film['rating']);
        }

        $excel->create_worksheet('From_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
        $xml = $excel->generate();
        $excel->download('FilmsClassified_from_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
    }
}
I would like some assistance as to what I may be doing wrong.
Thanks
Too long to post as a comment:
$xml = $excel->generate();
creates all the xml file content as a string and stores it in $xml
while
$excel->create_worksheet('From_'.str_replace(' ','',$_POST['fromdate']).'_to'.str_replace(' ','',$_POST['todate']));
creates the xml and directs it to output, with appropriate headers for downloading.
So you're not using the class correctly, as that's unnecessary duplication of work.
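
Putting that into the question's code, and guessing from the header call above that add_row() expects an array of cell values (an assumption about this third-party class, so verify it against the class documentation): drop the generate() call, which just builds the same XML a second time, and pass each film row as an array.

<?php
// Sketch only, under the assumptions stated above.
for ($i = 1; $i <= $filmcount; $i++) {
    $film = $dbase->fetcharray($films);
    $excel->add_row(array(
        $film['datetime'],
        $film['title'],
        $film['country_of_origin'],
        $film['language'],
        $film['runningtime'],
        $film['client'],
        $film['rating'],
    ));
}

$excel->create_worksheet('From_'.str_replace(' ', '', $_POST['fromdate']).'_to'.str_replace(' ', '', $_POST['todate']));
$excel->download('FilmsClassified_from_'.str_replace(' ', '', $_POST['fromdate']).'_to'.str_replace(' ', '', $_POST['todate']));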
