Write to file and send AJAX response at once - php

UPDATE:
I used another solution to write my data into a file. It seems that I can't echo extra data while AJAX is waiting for a JSON response, so I now use fwrite instead.
$fileHandle = fopen("export.txt", "w");
fwrite($fileHandle, $export);
fclose($fileHandle);
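For context, a minimal sketch of how the two steps can be combined so the file is written with fwrite while the AJAX response stays pure JSON (the downloadUrl field and the export.txt path are assumptions, not from the original code):
// Step 1: get data from the other URL (as in the original code)
$dataAjax = $handler->getData($_POST['data']);
// Step 2: build the export string and write it to a file instead of echoing it
$export = "Name,Quantity,Model,Price,Weight,Status\n"; // header row taken from writeToText()
$fileHandle = fopen("export.txt", "w");
fwrite($fileHandle, $export);
fclose($fileHandle);
// Step 3: the AJAX response stays pure JSON; the client can link to export.txt for the download
echo json_encode(array(
'data' => $dataAjax,
'downloadUrl' => 'export.txt' // assumed field name
));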
Original:
Hi there,
maybe my logic is wrong.
I make an AJAX call to get data from another URL.
That worked so far.
But now I want to add a file export as well.
$handler = new MyHandler();
// Step 1: get data from URL
$dataAjax = $handler->getData($_POST['data']);
// Step 2: write the data into a text file to provide a download
$handler->writeToText($dataAjax);
echo json_encode($dataAjax);
Now the console shows me a "parsererror" because my JSON response also contains the string I wanted to write into the file. That's bad and unwanted.
The code below is just a test of how I want to write my data into a txt file:
function writeToText($data)
{
header("Content-type: text/plain");
header("Content-Disposition: attachment; filename=export.txt");
header("Pragma: no-cache");
header("Expires: 0");
$title = "";
$title .= "Name,Quantity,Model,Price,Weight,Status"."\n";
echo $title;
}
This is what the error looks like:
{
"readyState": 4,
"responseText": "Name,Quantity,Model,Price,Weight,Status\n[{\"domain\":\"Text\",\"name\":\"Banana\}]",
"status": 200,
"statusText": "OK"
}
parsererror

Related

Echoing to screen when using headers to output to csv

I have a PHP application which generates a set of codes, saves them to a MySQL DB, and then outputs them to the user as a downloadable CSV file. I also have an echo statement after the code block that converts the PHP array to CSV. Instead of outputting to the browser, the echo statement after the convert_to_csv function call writes into the file and overwrites the first line. How do I get the echo statement to output to the browser instead? The code block is below:
convert_to_csv($newCodesArray,$fileName,',');
echo "Your file was successfully generated";
function convert_to_csv($input_array, $fileName, $delimiter)
{
header('Content-Type: text/csv');
header("Content-Disposition: attachment; filename=\"$fileName\"");
$f = fopen('php://output', 'w');
/* loop through array */
foreach ($input_array as $line) {
/* default php csv handler */
fputcsv($f, $line, $delimiter);
}
fclose($f) or die("Can't close php://output");
}
You have already defined the header as text/csv, so it won't print in the browser, which requires text/html.
Alternatively, you can do the following: copy your function to a different file (e.g. csv.php).
<?php
echo "Your file was successfully generated <script> window.location = 'csv.php' </script>";
Now it will print your echo string and start downloading your csv file.
As Magnus Eriksson commented, the code above does not check whether the file was really generated successfully. We can extend the code with AJAX.
<script>
$.ajax('csv.php', {
success: function(data) {
document.write('Your file was successfully generated.');
window.location = 'csv.php';
},
error: function() {
document.write('Your file generation failed.');
}
});
</script>
Note: the AJAX call will generate the file two times.
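One possible way around the double generation (a sketch, not from the answer above; the script name, the exports/ folder and the JSON fields are made up): let the AJAX call hit a generator that writes the CSV to disk once and returns JSON, then point window.location at the finished file.
<?php
// generate_csv.php (sketch): create the file once on disk, answer with JSON
$fileName = 'codes-' . date('Ymd-His') . '.csv'; // assumed naming scheme
$f = @fopen(__DIR__ . '/exports/' . $fileName, 'w'); // assumed writable folder
if ($f === false) {
header('Content-Type: application/json');
echo json_encode(array('ok' => false));
exit;
}
foreach ($newCodesArray as $line) { // $newCodesArray as in the question
fputcsv($f, $line, ',');
}
fclose($f);
header('Content-Type: application/json');
echo json_encode(array('ok' => true, 'downloadUrl' => 'exports/' . $fileName));
The success handler can then check the returned ok flag and set window.location to downloadUrl, so the CSV is only built once.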

creating multiple csv files from php loop

I'm trying to create a loop that, when executed, creates multiple csv files and downloads them. This is my code:
session_start();
require '../connect.php'; //connect.php has connection info for my database
// and uses the variable $connect
$sqldept = "SELECT department_name from department;";
$departments = mysqli_query($connect, $sqldept);
while ($department = mysqli_fetch_array($departments)) {
$department = $department[0];
header('Content-Type: text/csv; charset=utf-8');
header("Content-Transfer-Encoding: UTF-8");
header('Content-Disposition: attachment; filename=summary-' . $department . '.csv');
header("Cache-Control: no-cache, no-store, must-revalidate"); // HTTP 1.1
header("Pragma: no-cache"); // HTTP 1.0
header("Expires: 0"); // Proxies
$date = date("Y-m-d", strtotime("-28 days" . date("Y-m-d")));
$edate = date("Y-m-d");
$startdate = "(time.dateadded BETWEEN '$date' AND '$edate') AND";
$department = " and department_name = '$department'";
// create a file pointer connected to the output stream
$output = fopen('php://output', 'w');
// output the column headings
$sql2 = "SELECT time.id as timeid, time.staff_id, SUM(time.timein), COUNT(NULLIF(time.reasonforabsence,'')) AS count_reasonforabsence, GROUP_CONCAT(CONCAT(NULLIF(time.reasonforabsence,''),' ', date_format(time.dateadded, '%d-%m-%Y'),' ')) AS reasonforabsence, time.dateadded, staff.id AS staffid, department.id AS departmentid, department.department_name, staff.staff_name, staff.department_id, SUM(staff.workhoursperday), staff.payrollnum FROM time, staff, department WHERE $startdate staff.id = time.staff_id AND staff.department_id = department.id $department $staffsearch GROUP BY staff.id ORDER BY `time`.`dateadded` ASC;";
// output headers so that the file is downloaded rather than displayed
fputcsv($output, array(
'Payroll Number',
'Name',
'Department',
'Hours Worked',
'Days Absent',
'Overtime',
'Reasons for Absence'
));
$rows = mysqli_query($connect, $sql2);
while ($rowcsv = mysqli_fetch_assoc($rows)) {
$reasonforabsence = $rowcsv['reasonforabsence'];
//$reasonforabsence = explode( ',', $rowcsv['reasonforabsence'] );
$overtime = 0;
if (empty($rowcsv['SUM(time.timein)']) == true) {
$rowcsv['SUM(time.timein)'] = 0;
}
;
if ($rowcsv['SUM(time.timein)'] > $rowcsv['SUM(staff.workhoursperday)']) {
$overtime = $rowcsv['SUM(time.timein)'] - $rowcsv['SUM(staff.workhoursperday)'];
}
;
fputcsv($output, array(
$rowcsv['payrollnum'],
$rowcsv['staff_name'],
$rowcsv['department_name'],
$rowcsv['SUM(time.timein)'],
$rowcsv['count_reasonforabsence'],
$overtime,
$reasonforabsence
));
};
readfile("php://output");
fclose($output);
};
Currently the loop creates one CSV with a new header row and the department details below it for each department. I want the loop to create a new CSV for each department, but it's just not working for me. Any help is appreciated.
Thanks
Unfortunately you can't: one PHP request results in one file, and there isn't really a way around this. You can, however, download them all as a ZIP file; take a look at this question, for example.
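For illustration, a minimal sketch of the ZIP approach with PHP's ZipArchive; the column headings are taken from the question, while the temp-file handling and the per-department query are only indicated:
<?php
// Build one CSV per department in a temp stream, collect them in a single ZIP and send that
$zipPath = tempnam(sys_get_temp_dir(), 'zip');
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
while ($department = mysqli_fetch_array($departments)) {
$name = $department[0];
$csv = fopen('php://temp', 'r+'); // temp stream instead of php://output
fputcsv($csv, array('Payroll Number', 'Name', 'Department', 'Hours Worked', 'Days Absent', 'Overtime', 'Reasons for Absence'));
// ... run the per-department query and fputcsv() each row here, as in the question ...
rewind($csv);
$zip->addFromString('summary-' . $name . '.csv', stream_get_contents($csv));
fclose($csv);
}
$zip->close();
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=summaries.zip');
header('Content-Length: ' . filesize($zipPath));
readfile($zipPath);
unlink($zipPath);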
The below are some workaround ideas, which might be useful in certain scenarios (and might be dangerous in others). Use at your own risk!
Workaround A: Loop by redirect
Output a single file normally
Do a redirect to the same URL that creates the CSV file in step #1, but append a GET flag to it, like http://www.example.net/output_csv?i=1
Make sure to add a loop-breaker in step #1, like if($i==10) { exit; }
Workaround B: Loop by cronjob
Output a single file normally
Have the 2nd file's output handled by a separate cronjob call.
Make sure to add a loop-breaker in step #1, like if($mycron==10) { exit; }
You cannot do this with a for loop. However, you can make a PHP file which serves your purpose:
<a onclick="getcsv()" href="php_file_location.php?table_name=test"> Download </a>
<script>
function getcsv() {
window.open(php_file_location);
}
</script>
I had the same problem as mentioned, but in my case I was not trying to download multiple CSVs; I was uploading them to an sFTP server. While creating the file, instead of using
$output = fopen('php://output', 'w');
I used
$output = fopen($path_and_name, 'w');
where $path_and_name = $path_to_sftp_folder.'/'.$file_name;
After the execution the correct files were uploaded to their respective folders, exactly the way I wanted them to be. But yes, the wrong file was also downloaded, with the same issue as described above.
So if you are looking to upload files to a server, it can be done (even if they all have the same name).
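Building on that idea, a sketch of the question's loop writing one file per department to a server directory instead of to php://output (the exports/ folder is an assumption):
<?php
// Sketch: one CSV file per department, written to disk instead of streamed to the browser
while ($department = mysqli_fetch_array($departments)) {
$name = $department[0];
$output = fopen(__DIR__ . '/exports/summary-' . $name . '.csv', 'w'); // assumed writable folder
fputcsv($output, array('Payroll Number', 'Name', 'Department', 'Hours Worked', 'Days Absent', 'Overtime', 'Reasons for Absence'));
// ... run the per-department query and fputcsv() each row here, as in the question ...
fclose($output);
}
// The files can then be zipped or linked for download individually, as suggested in the first answer.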

Why PHP continue to output to CSV file after fclose()?

On a web page, I am writing some data into a CSV file using the below code and finally closing with fclose();
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename='.$filename);
$out = fopen('php://output', 'w');
fputcsv($out, $cvs_cols);
fclose($out);
echo "HELLO WORLD"; // sneaks into CSV!?
Why is it that "HELLO WORLD" gets into the CSV download file when it has already fclose()? I want to output the rest of the HTML for the page to be displayed in the browser. How can I do that?
One HTTP request is followed by one response. You cannot send content type text/csv and content type text/html at the same time (maybe with SPDY, but not with plain HTTP).
fclose closes your file descriptor but not the output to the browser.
You should also set a Content-Length header and put in the filesize.
Mark Baker already gave the most important point in the comments:
echo and writing to php://output put content into the same stream: STDOUT. Other options would be to write the CSV to memory (but that's pointless if you don't use it afterwards) or to a file. Read more about those streams: http://www.php.net/manual/en/features.commandline.io-streams.php
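For illustration, a minimal sketch of the "write to memory" option, which keeps the CSV out of the response until you decide what to do with it:
<?php
// Build the CSV in a memory stream first; nothing is sent to the browser yet
$mem = fopen('php://memory', 'r+');
fputcsv($mem, $cvs_cols);
rewind($mem);
$csvString = stream_get_contents($mem); // the CSV as a string: send it, save it, mail it, ...
fclose($mem);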
Possible solution:
You need 2 HTTP requests: one for the download, the other for your HTML. The most popular way is to first send the HTML response and put something in it like
<meta http-equiv="refresh"
content="3; URL=http://yourserver.com/download.php?id=pdf&id=123" />
This starts the download after 3 seconds.
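A sketch of the two-request pattern, assuming a separate download.php that does nothing but stream the CSV (the script names are made up):
<?php
// page.php (sketch): normal HTML response, the download is triggered afterwards
echo '<html><head>';
echo '<meta http-equiv="refresh" content="3; URL=download.php?id=123" />';
echo '</head><body>Your file was generated, the download starts in 3 seconds.</body></html>';
and download.php then sends nothing but the CSV:
<?php
// download.php (sketch): the second request only streams the CSV
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=export.csv');
$out = fopen('php://output', 'w');
fputcsv($out, $cvs_cols); // $cvs_cols as in the question
fclose($out);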
There is no 'CSV File' (yet).
What you are doing is sending a data stream to the client, and telling the client that this stream has a Content-Type of text/csv and a filename of $filename. The client can then choose to save this as a CSV file or just display it in the browser.
This code:
$out = fopen('php://output', 'w');
fputcsv($out, $cvs_cols);
fclose($out);
Is effectively doing the same thing that echo $cvs_cols would do (with a little extra stuff to format a csv output).
So when there is a call to echo "HELLO WORLD"; it gets sent in the same data stream as the contents of the $cvs_cols variable.
When you call fopen('php://output', 'w') you are creating a second file handle to php://output, as one is created by default for output from calls to echo etc. So when you call fclose($out) you're only closing that second file handle.
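A minimal way to see this: all of the lines below end up in the same response body, in the order they are executed.
<?php
// echo and fwrite() to php://output interleave in one and the same response stream
$out = fopen('php://output', 'w');
fwrite($out, "written via php://output\n");
echo "written via echo\n";
fwrite($out, "written via php://output again\n");
fclose($out); // closes this handle only; echo still works afterwards
echo "still echoing after fclose()\n";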
A very old thread here, but to fix this I just added a simple exit(); command. A button calls the same page with a query string of 'action=export_csv', then that action is run with exit(); on the last line. Hope that helps.
<a href="?action=export_csv">Export CSV</a>
Then the 'action' on the page is:
if(isset($_GET['action']) && $_GET['action']=='export_csv'){
// output headers so that the file is downloaded rather than displayed
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename=email-responses.csv');
// create a file pointer connected to the output stream
$output = fopen('php://output', 'w');
// output the column headings
fputcsv($output, array('Email address'));
$db = new PDO('mysql:host=hostname_mysql;dbname=database_mysql;charset=UTF8', 'username_mysql', 'password_mysql');
$query = "SELECT XXX FROM XXXX";
$result = $db->query($query);
$data = $result->fetchAll(PDO::FETCH_ASSOC);
// loop over the rows, outputting them
foreach($data as $row){
fputcsv($output, $row);
}
fclose($output);
exit();
}
Use output buffering:
ob_start(): turn on output buffering, so the CSV output below is captured instead of sent
ob_end_clean(): clean (erase) the output buffer and turn off output buffering
ob_start();
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename='.$filename);
$out = fopen('php://output', 'w');
fputcsv($out, $cvs_cols);
fclose($out);
ob_end_clean(); // erases the buffered output and turns off buffering; nothing is printed or returned
echo "HELLO WORLD"; // now reaches the browser instead of the CSV

Is content length necessary on inline?

I am asking this because my requests keep getting "Content Length Mismatch" on files larger than 4 MB, even though the content length and the file size are an exact match. If I remove the Content-Length header, everything works fine.
Is there an advantage or necessity of using Content Length in the headers for inline content (not attachment)?
[EDIT] Code:
public function actionViewFile($id) {
$model = File::model()->findByPk($id);
if(!$model) {
throw new CHttpException(404, "File not found");
}
$data = file_get_contents($this->storageDir.'/'.$model->fileName);
header('Content-type: '.$model->mime_type);
header('Content-length: '.$model->size); //I have tried calling `mb_strlen($data);`
echo $data;
Yii::app()->end(); //I have tried calling `die;`
}
The framework I am using is Yii, which doesn't really matter for the context of this problem.
The Content-Length entity-header field indicates the size of the entity-body, in decimal number of OCTETs, sent to the recipient or, in the case of the HEAD method, the size of the entity-body that would have been sent had the request been a GET.
Try using filesize() as per the following:
public function actionViewFile($id) {
$model = File::model()->findByPk($id);
if(!$model) {
throw new CHttpException(404, "File not found");
}
$filepath= $this->storageDir.'/'.$model->fileName;
$data = file_get_contents($filepath);
header('Content-type: '.$model->mime_type);
header("Cache-Control: no-cache, must-revalidate");
header('Content-Length: ' . sprintf('%u', filesize($filepath)));
echo $data;
return Yii::app()->end(); //I have tried calling `die;`
}

reading a large file and forcing download

$name = 'mybigfile.csv';
$fp = fopen(...);
while($row = mysql_fetch_assoc($sql_result)) {
fputcsv($fp, $row);
}
fclose($fp);
// send the correct headers
header("Content-Type: application/csv etc ....");
header("Content-Length: " . filesize($name));
// dump the file and stop the script
readfile($name);
exit;
This method works fine, but some of the files are quite big, which makes it quite a slow process... I was thinking: maybe I could avoid creating a file first, writing the data, and THEN reading the data back and outputting it. I.e. what if I send the headers before the while loop and echo each line inside the while loop (instead of writing it to the file)? Would this be a more efficient process? What would you suggest to improve this process? Thanks
Write directly to the output:
header("Content-Type: application/csv etc ....");
$out = fopen('php://output', 'w'); // STDOUT is only defined for the CLI, so use the output stream in a web request
while ($row = mysql_fetch_assoc($sql_result)) {
fputcsv($out, $row);
}
fclose($out);
See here for reference: http://www.php.net/manual/en/wrappers.php.php
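For very large exports, a further sketch (assuming the old mysql_* API from the question is still available and $query holds the SQL): an unbuffered query plus an occasional flush() keeps memory usage flat while the rows stream out.
<?php
// Stream the export row by row without building the whole result set or a file first
header("Content-Type: text/csv");
header("Content-Disposition: attachment; filename=mybigfile.csv");
$sql_result = mysql_unbuffered_query($query); // rows are fetched from MySQL as they are read
$out = fopen('php://output', 'w');
$i = 0;
while ($row = mysql_fetch_assoc($sql_result)) {
fputcsv($out, $row);
if (++$i % 1000 === 0) {
flush(); // push what has been written so far to the client
}
}
fclose($out);
exit;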
