private function convert_to_csv($input_array, $output_file_name, $delimiter) {
    $temp_memory = fopen('php://memory', 'w');
    foreach ($input_array as $line) {
        fputcsv($temp_memory, $line, $delimiter);
    }
    fseek($temp_memory, 0);
    header('Content-Type: application/csv');
    header('Content-Disposition: attachment; filename="' . $output_file_name . '";');
    fpassthru($temp_memory);
}
I use the above function to take an array of data, convert it to CSV, and output it to the browser. Two questions:
Is the file removed from memory after being downloaded via HTTP?
How would this same function be rewritten such that the file could be used (for example, to use as an email attachment sent with PHPMailer) and then removed from memory immediately afterwards?
EDIT: Working Code - But writes to file, not memory
public function emailCSVTest() {
    $test_array = array(array('Stuff', 'Yep'), array('More Stuff', 'Yep yep'));
    $temp_file = '/tmp/temptest.csv';
    $this->convertToCSV($test_array, $temp_file);
    $this->sendUserEmail('Test Subject', 'Test Message', 'nowhere@bigfurryblackhole.com', $temp_file);
    unlink($temp_file);
}
private function convertToCSV($input_array, $output_file) {
    $temp_file = fopen($output_file, 'w');
    foreach ($input_array as $line) {
        fputcsv($temp_file, $line, ',');
    }
    fclose($temp_file);
}
Still unanswered: does the original function remove the file from memory, or no?
I would use PHP's temp fopen wrapper together with a memory threshold, like this:
// we use a threshold of 1 MB (1024 * 1024), it's just an example
$fd = fopen('php://temp/maxmemory:1048576', 'w');
if ($fd === false) {
    die('Failed to open temporary file');
}

$headers = array('id', 'name', 'age', 'species');
$records = array(
    array('1', 'gise', '4', 'cat'),
    array('2', 'hek2mgl', '36', 'human')
);

fputcsv($fd, $headers);
foreach ($records as $record) {
    fputcsv($fd, $record);
}

rewind($fd);
$csv = stream_get_contents($fd);
fclose($fd); // releases the memory (or the temp file)
The memory threshold here is 1 MB. If the CSV data gets larger than that, PHP transparently creates a temporary file; otherwise everything happens in memory. The advantage is that large CSV files won't exhaust the memory.
About your second question: fclose() will release the memory.
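For the second question (using the CSV as a PHPMailer attachment), you don't need a file at all: read the stream back into a string and pass it to PHPMailer's addStringAttachment(). A rough sketch, assuming a configured PHPMailer instance in $mail and the $input_array from the question:
// build the CSV in memory (spills to a temp file above the 1 MB threshold)
$fd = fopen('php://temp/maxmemory:1048576', 'w');
foreach ($input_array as $line) { // $input_array as in the question
    fputcsv($fd, $line, ',');
}
rewind($fd);
$csv = stream_get_contents($fd);
fclose($fd); // the memory (or temp file) is released here

// attach the string directly -- no file on disk, nothing to unlink
// $mail is assumed to be a configured PHPMailer instance
$mail->addStringAttachment($csv, 'report.csv');
$mail->send();
unset($csv); // drop our last reference to the CSV data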
I once wrote a blog article about this; you may find it interesting: http://www.metashock.de/2014/02/create-csv-file-in-memory-php/
Related
I have a query that returns an array which is then processed by this function to ultimately write the output to a CSV file. My question is about the path of the output file: it should not be accessible via the web root, hence the idea to use /tmp/ as the path instead. Or should I create a new folder specifically for the uploads outside of the web root? Basically, I just want to make sure that no one browsing our website (which is pretty much locked down to specific IPs anyway) can access the CSV files generated by this function.
$date = date("Y-m-d H:i:s");
$file = $date . ".csv";
function array_to_csv_download($array, $filename, $delimiter = ",")
{
    header("Content-Type: application/csv");
    header('Content-Disposition: attachment; filename="' . $filename . '";');
    $u = "/tmp/" . $filename;
    // open the "output" stream
    $f = fopen($u, "a");
    $csv = array_map("str_getcsv", file($u));
    if (isset($csv[0])) {
        // do nothing
    } else {
        // header
        $fields = ["Name", "Contact Number"];
        fputcsv($f, $fields, $delimiter);
    }
    // content
    foreach ($array as $line) {
        //foreach ($array as $header => $line)
        fputcsv($f, $line, $delimiter);
    }
}
array_to_csv_download($customer_id_result, $file);
P.S. I hope I can post this type of question here; if not, please advise which site in the group it belongs on.
This exports one file with 2 rows in it. How can I export 2 files, each with one row in it?
<?php
$list = array(
    array("Peter", "Griffin", "Oslo", "Norway"),
    array("Glenn", "Quagmire", "Oslo", "Norway")
);
$file = fopen("contacts.csv", "w");
foreach ($list as $line) {
    fputcsv($file, $line);
}
fclose($file);
?>
I tried this:
$list = array(
    array("Peter", "Griffin", "Oslo", "Norway"),
    array("Glenn", "Quagmire", "Oslo", "Norway")
);
$file = fopen('php://output', 'w');
$i = 1;
foreach ($list as $line) {
    header('Content-Disposition: attachment; filename="' . $i . 'wp.csv"');
    fputcsv($file, $line);
    fclose($file);
    $i++;
}
But it only downloads one file. At least it only has the one row in it though.
I found this example. Even though it creates separate CSV files inside a zip archive, the data is identical in each file instead of each file containing an individual record.
// some data to be used in the csv files
$headers = array('id', 'name', 'age', 'species');
$records = array(
    array('1', 'gise', '4', 'cat'),
    array('2', 'hek2mgl', '36', 'human')
);

// create your zip file
$zipname = 'file.zip';
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE);

// loop to create 3 csv files
for ($i = 0; $i < 3; $i++) {
    // create a temporary file
    $fd = fopen('php://temp/maxmemory:1048576', 'w');
    if (false === $fd) {
        die('Failed to create temporary file');
    }

    // write the data to csv
    fputcsv($fd, $headers);
    foreach ($records as $record) {
        fputcsv($fd, $record);
    }

    // return to the start of the stream
    rewind($fd);

    // add the in-memory file to the archive, giving it a name
    $zip->addFromString('file-' . $i . '.csv', stream_get_contents($fd));

    // close the file
    fclose($fd);
}

// close the archive
$zip->close();

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . $zipname);
header('Content-Length: ' . filesize($zipname));
readfile($zipname);

// remove the zip archive
// you could also use the temp file method above for this.
unlink($zipname);
You can only put one "file" into an HTTP response.
If you want to generate multiple CSV files then you'll need to get more exotic. You could generate a HTML document of links to URLs that each generate a CSV file, or you could generate a zip file containing the CSV files.
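A rough sketch of the zip route for this exact case, reusing the data from the question and writing one record per CSV file (the archive and entry names are arbitrary):
$records = array(
    array('Peter', 'Griffin', 'Oslo', 'Norway'),
    array('Glenn', 'Quagmire', 'Oslo', 'Norway')
);

$zipname = 'contacts.zip';
$zip = new ZipArchive;
$zip->open($zipname, ZipArchive::CREATE | ZipArchive::OVERWRITE);

// one CSV file per record
foreach ($records as $i => $record) {
    $fd = fopen('php://temp/maxmemory:1048576', 'w');
    fputcsv($fd, $record);
    rewind($fd);
    $zip->addFromString(($i + 1) . 'wp.csv', stream_get_contents($fd));
    fclose($fd);
}
$zip->close();

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=' . $zipname);
header('Content-Length: ' . filesize($zipname));
readfile($zipname);
unlink($zipname);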
Just create a new file handle and use it like this:
$file2 = fopen("contacts2.csv", "w");
fputcsv($file2, $list[1]);
You always write to the same file here, because you are referring to $file:
fputcsv($file, $list[1]);
I am a novice programmer and I have searched a lot about my question but couldn't find a helpful solution or tutorial about this.
My goal: I have a PHP array, and the array elements are shown in a list on the page.
I want to add an option so that, if a user wants, he/she can create a CSV file with the array elements and download it.
I don't know how to do this. I have searched a lot but have yet to find any helpful resource.
Please point me to some tutorial, solution, or advice so I can implement it myself. As I'm a novice, please suggest easy-to-implement solutions.
My array looks like:
Array
(
    [0] => Array
        (
            [fs_id] => 4c524d8abfc6ef3b201f489c
            [name] => restaurant
            [lat] => 40.702692
            [lng] => -74.012869
            [address] => new york
            [postalCode] =>
            [city] => NEW YORK
            [state] => ny
            [business_type] => BBQ Joint
            [url] =>
        )
)
You can use the built-in fputcsv() to generate correct CSV lines from your array; loop over the array and write one line per row, like this:
$f = fopen("tmp.csv", "w");
foreach ($array as $line) {
    fputcsv($f, $line);
}
To make the browser offer the "Save as" dialog, you will have to send HTTP headers like this (see more about this header in the RFC):
header('Content-Disposition: attachment; filename="filename.csv";');
Putting it all together:
function array_to_csv_download($array, $filename = "export.csv", $delimiter = ";") {
    // open raw memory as file so no temp files needed, you might run out of memory though
    $f = fopen('php://memory', 'w');
    // loop over the input array
    foreach ($array as $line) {
        // generate csv lines from the inner arrays
        fputcsv($f, $line, $delimiter);
    }
    // reset the file pointer to the start of the file
    fseek($f, 0);
    // tell the browser it's going to be a csv file
    header('Content-Type: text/csv');
    // tell the browser we want to save it instead of displaying it
    header('Content-Disposition: attachment; filename="' . $filename . '";');
    // make php send the generated csv lines to the browser
    fpassthru($f);
}
And you can use it like this:
array_to_csv_download(array(
    array(1, 2, 3, 4),  // this array is going to be the first row
    array(1, 2, 3, 4)), // this array is going to be the second row
    "numbers.csv"
);
Update:
Instead of php://memory you can also use php://output for the file descriptor and do away with the seeking and such:
function array_to_csv_download($array, $filename = "export.csv", $delimiter = ";") {
    header('Content-Type: application/csv');
    header('Content-Disposition: attachment; filename="' . $filename . '";');

    // open the "output" stream
    // see http://www.php.net/manual/en/wrappers.php.php#refsect2-wrappers.php-unknown-unknown-unknown-descriptioq
    $f = fopen('php://output', 'w');

    foreach ($array as $line) {
        fputcsv($f, $line, $delimiter);
    }
}
I don't have enough reputation to reply to @complex857's solution. It works great, but I had to add a ; at the end of the Content-Disposition header. Without it the browser appends two dashes to the filename (e.g. instead of "export.csv" the file gets saved as "export.csv--"). It probably tries to sanitize the \r\n at the end of the header line.
The correct line should look like this:
header('Content-Disposition: attachment;filename="'.$filename.'";');
If the CSV has UTF-8 characters in it, you have to change the encoding to UTF-8 by changing the Content-Type line:
header('Content-Type: application/csv; charset=UTF-8');
Also, I find it more elegant to use rewind() instead of fseek():
rewind($f);
Thanks for your solution!
Try this for a CSV download:
<?php
// note: the mysql_* extension is deprecated and was removed in PHP 7
mysql_connect('hostname', 'username', 'password');
mysql_select_db('dbname');

$qry = mysql_query("SELECT * FROM tablename");

$data = "";
while ($row = mysql_fetch_array($qry)) {
    $data .= $row['field1'] . "," . $row['field2'] . "," . $row['field3'] . "," . $row['field4'] . "\n";
}

header('Content-Type: application/csv');
header('Content-Disposition: attachment; filename="filename.csv"');
echo $data;
exit();
?>
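Since the mysql_* extension used above was removed in PHP 7, a rough modern equivalent uses mysqli with fputcsv(), which, unlike the string concatenation above, also escapes commas and quotes inside fields (the credentials and table name are placeholders):
<?php
// hostname/credentials are placeholders -- substitute your own
$db = new mysqli('hostname', 'username', 'password', 'dbname');
$result = $db->query("SELECT field1, field2, field3, field4 FROM tablename");

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="filename.csv"');

$out = fopen('php://output', 'w');
while ($row = $result->fetch_row()) {
    fputcsv($out, $row); // handles quoting and escaping for you
}
fclose($out);
exit();
?>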
That is the function that I used for my project, and it works as expected.
function array_csv_download( $array, $filename = "export.csv", $delimiter = ";" )
{
    header( 'Content-Type: application/csv' );
    header( 'Content-Disposition: attachment; filename="' . $filename . '";' );

    // clean output buffer
    ob_end_clean();

    $handle = fopen( 'php://output', 'w' );

    // use keys as column titles
    fputcsv( $handle, array_keys( $array[0] ), $delimiter );

    foreach ( $array as $value ) {
        fputcsv( $handle, $value, $delimiter );
    }

    fclose( $handle );

    // flush buffer
    ob_flush();

    // use exit to get rid of unexpected output afterward
    exit();
}
Use the code below to convert a PHP array to CSV:
<?php
$ROW = db_export_data(); // will return a PHP array

header("Content-Type: application/csv");
header("Content-Disposition: attachment; filename=test.csv");

$fp = fopen('php://output', 'w');
foreach ($ROW as $row) {
    fputcsv($fp, $row);
}
fclose($fp);
If your array structure will always be multi-dimensional in that exact fashion, then we can iterate through the elements like so:
$fh = fopen('somefile.csv', 'w') or die('Cannot open the file');
for ($i = 0; $i < count($arr); $i++) {
    $str = implode(',', $arr[$i]);
    fwrite($fh, $str);
    fwrite($fh, "\n");
}
fclose($fh);
That's one way to do it ... you could do it manually, but this way is quicker and easier to understand and read.
Then you would manage your headers, similar to what complex857 is doing, to spit out the file. You could then delete the file using unlink() if you no longer need it, or you could leave it on the server if you wish.
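For instance, a minimal send-then-delete sketch (reusing somefile.csv from above) could look like:
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="somefile.csv";');
readfile('somefile.csv'); // stream the file to the browser
unlink('somefile.csv');   // then remove it from the server
exit();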
Update for UTF-8 Encoding
Updating @complex857's answer:
function array_to_csv_download($array, $filename = "export.csv", $delimiter = ",") {
    header('Content-Disposition: attachment; filename="' . $filename . '";');
    header('Content-Type: application/csv; charset=UTF-8');

    // open the "output" stream
    $f = fopen('php://output', 'w');

    // write the UTF-8 BOM to the file
    fputs($f, chr(0xEF) . chr(0xBB) . chr(0xBF));

    foreach ($array as $line) {
        fputcsv($f, $line, $delimiter);
    }
}
The code may be a bit ugly, but it works!
This function generates a downloadable CSV file; you can set the name and the delimiter without tricky functions (the UTF-8 encoding is set in the header() call).
/**
 * Array2CSVDownload
 *
 */
function Array2CSVDownload($array, $filename = "export.csv", $delimiter = ";") {
    // force object to be array, sorry, I was working with object items
    $keys = array_keys((array) $array[0]);

    // use keys as column titles
    $data = [];
    array_push($data, implode($delimiter, $keys));

    // working with items
    foreach ($array as $item) {
        $values = array_values((array) $item);
        array_push($data, implode($delimiter, $values));
    }

    // flush buffer
    ob_flush();

    // join the rows into csv text
    $csvData = join("\n", $data);

    // set up headers to download the file
    header('Content-Disposition: attachment; filename="' . $filename . '";');

    // set up utf8 encoding
    header('Content-Type: application/csv; charset=UTF-8');

    // send the results
    die($csvData);
}
I need to run a query that will return multiple rows and export it to a CSV. I have to put the cells in a certain order, though.
So let's say my table is laid out as id, name, address, wife. I need to build a CSV in the order id, address, wife, name. I figured I could just make an array in the correct order and then make a CSV from that, but after an hour of googling I can't find out how to make a CSV from an array.
There is fputcsv, but that requires an already-opened file handle. Also, I was hoping there was a CodeIgniter way of doing it.
public function export() {
    $this->load->helper('download');
    $data[1] = 'i like pie';
    $data[2] = 'i like cake';
    force_download('result.csv', $data);
}
I tried that, but the error said the download helper was expecting a string, not an array.
Here's some code I use... you could adjust the columns you need for the export...
Note: this CSV is sent directly to php://output, which writes straight to the output buffer. This means you're not saving any .csv files on the server, and it can handle a much larger file size than building a giant array and then trying to loop through it.
header("Content-Type: application/csv");
header("Content-Disposition: attachment; filename=\"Jobs_" . date('M.j.y', $from) . "-" . date('M.j.y', $to) . ".csv\"");
header("Pragma: no-cache");
header("Expires: 0");

$handle = fopen('php://output', 'w');

fputcsv($handle, array(
    'JobId',
    'Template',
    'Customer',
    'Status',
    'Error',
    'PDF',
    'Run Time',
    'Wait Time',
    'Server'
));

foreach ($jobs as $jobData) {
    // $job, $status and $error come from surrounding code not shown here
    fputcsv($handle, array(
        $job->getId(),
        $job->getTemplate(),
        $jobData['customers_firstname'] . ' ' . $jobData['customers_lastname'],
        $status,
        $error,
        $jobData['products_pdfupload'],
        $job->getRunTime(),
        $job->getWaitTime(),
        $jobData['server']
    ));
}

fclose($handle);
exit;
This should give you a good mental picture of how a CSV export works. I don't use CodeIgniter's file download helper, so I can't help you on that front.
This short script builds the CSV and also lets you download it:
function convert_to_csv($input_array, $output_file_name, $delimiter)
{
    /** open raw memory as file, no need for temp files */
    $temp_memory = fopen('php://memory', 'w');

    /** loop through the array */
    foreach ($input_array as $line) {
        /** default php csv handler */
        fputcsv($temp_memory, $line, $delimiter);
    }

    /** rewind the "file" with the csv lines */
    fseek($temp_memory, 0);

    /** modify headers to make this a downloadable csv file */
    header('Content-Type: application/csv');
    header('Content-Disposition: attachment; filename="' . $output_file_name . '";');

    /** send the file to the browser for download */
    fpassthru($temp_memory);
}
/** Array to convert to csv */
$array_to_csv = array(
    array(12566, 'Enmanuel', 'Corvo'),
    array(56544, 'John', 'Doe'),
    array(78550, 'Mark', 'Smith')
);

convert_to_csv($array_to_csv, 'report.csv', ',');
You can read the full post here:
PHP Array to CSV - Download
<?php
$list = array(
    array('aaa', 'bbb', 'ccc', 'dddd'),
    array('123', '456', '789'),
    array('"aaa"', '"bbb"')
);

$fp = fopen('file.csv', 'w');
foreach ($list as $fields) {
    fputcsv($fp, $fields);
}
fclose($fp);
?>
OK, I'm looking for the fastest possible way to read the entire contents of a file via PHP, given a filepath on the server; these files can also be huge, so it's very important that the read (and it is read-only) is as fast as possible.
Is reading line by line faster than reading the entire contents at once? Though I remember reading that slurping the entire contents can produce errors for huge files. Is this true?
If you want to load the full content of a file into a PHP variable, the easiest (and probably fastest) way is file_get_contents.
But if you are working with big files, loading the whole file into memory might not be such a good idea: you'll probably end up with a memory_limit error, as PHP will not allow your script to use more memory than the configured limit.
So, even if it's not the fastest solution, reading the file line by line (fopen + fgets + fclose) and working with those lines on the fly, without loading the whole file into memory, might be necessary...
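A minimal sketch of that line-by-line approach (the path is a placeholder):
$handle = fopen('/path/to/big-file.log', 'r'); // path is a placeholder
if ($handle === false) {
    die('Cannot open file');
}
while (($line = fgets($handle)) !== false) {
    // work with $line here; only one line is held in memory at a time
}
fclose($handle);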
file_get_contents() is the most optimized way to read files in PHP; however, since you're reading the file into memory, you're always limited by the amount of memory available.
You can issue ini_set('memory_limit', -1) if you have the right permissions, but you'll still be limited by the amount of memory available on your system; this is common to all programming languages.
The only solution is to read the file in chunks. For that, you can use file_get_contents() with the fourth and fifth arguments ($offset and $maxlen, specified in bytes):
string file_get_contents(string $filename[, bool $use_include_path = false[, resource $context[, int $offset = -1[, int $maxlen = -1]]]])
Here is an example where I use this technique to serve large download files:
public function Download($path, $speed = null)
{
    if (is_file($path) === true)
    {
        set_time_limit(0);

        while (ob_get_level() > 0)
        {
            ob_end_clean();
        }

        $size = sprintf('%u', filesize($path));
        $speed = (is_int($speed) === true) ? $size : intval($speed) * 1024;

        header('Expires: 0');
        header('Pragma: public');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . $size);
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        header('Content-Transfer-Encoding: binary');

        for ($i = 0; $i <= $size; $i = $i + $speed)
        {
            // ph()->HTTP->Flush() / Sleep() are this framework's own helpers:
            // flush a chunk to the client, then throttle
            ph()->HTTP->Flush(file_get_contents($path, false, null, $i, $speed));
            ph()->HTTP->Sleep(1);
        }

        exit();
    }

    return false;
}
Another option is to use the less optimized fopen(), feof(), fgets() and fclose() functions, especially if you care about getting whole lines at once. Here is another example I provided in another Stack Overflow question, for importing large SQL queries into the database:
function SplitSQL($file, $delimiter = ';')
{
    set_time_limit(0);

    if (is_file($file) === true)
    {
        $file = fopen($file, 'r');

        if (is_resource($file) === true)
        {
            $query = array();

            while (feof($file) === false)
            {
                $query[] = fgets($file);

                if (preg_match('~' . preg_quote($delimiter, '~') . '\s*$~iS', end($query)) === 1)
                {
                    $query = trim(implode('', $query));

                    if (mysql_query($query) === false)
                    {
                        echo '<h3>ERROR: ' . $query . '</h3>' . "\n";
                    }
                    else
                    {
                        echo '<h3>SUCCESS: ' . $query . '</h3>' . "\n";
                    }

                    while (ob_get_level() > 0)
                    {
                        ob_end_flush();
                    }

                    flush();
                }

                if (is_string($query) === true)
                {
                    $query = array();
                }
            }

            return fclose($file);
        }
    }

    return false;
}
Which technique you use will really depend on what you're trying to do (as you can see with the SQL import function and the download function), but you'll always have to read the data in chunks.
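For the general case, a plain fread() loop keeps memory usage flat no matter how big the file gets; a minimal sketch (the path and chunk size are arbitrary):
$handle = fopen('/path/to/huge-file.bin', 'rb'); // path is a placeholder
while (!feof($handle)) {
    $chunk = fread($handle, 8192); // 8 KB per iteration, tune as needed
    // process $chunk here
}
fclose($handle);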
$file_handle = fopen("myfile", "r");
while (!feof($file_handle)) {
    $line = fgets($file_handle);
    echo $line;
}
fclose($file_handle);
Open the file and store the handle in $file_handle as a reference to the file itself.
Check whether you are already at the end of the file.
Keep reading the file until you are at the end, printing each line as you read it.
Close the file.
You could use file_get_contents
Example:
$homepage = file_get_contents('http://www.example.com/');
echo $homepage;
Use fpassthru or readfile.
Both use constant memory regardless of file size.
http://raditha.com/wiki/Readfile_vs_include
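For example, a download endpoint that never holds the file in a PHP variable might look like this (the path is a placeholder):
$path = '/path/to/file'; // placeholder
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path); // streams the file in chunks, constant memory
exit();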
foreach (new SplFileObject($filepath) as $lineNumber => $lineContent) {
    echo $lineNumber . "==>" . $lineContent;
    // process your operations here
}
Reading the whole file in one go is faster.
But huge files may eat up all your memory and cause problems. Then your safest bet is to read line by line.
If you're not worried about memory and file size,
$lines = file($path);
$lines is then the array of the file.
You could try cURL (http://php.net/manual/en/book.curl.php), although you might want to check first; it has its limits as well.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch); // whole page as a string
curl_close($ch);