I have to create a CSV export. I found this great service and tried to adapt it to my case. It works very well on my own computer (Windows 7, unmodified WAMP 2.5), but when I put it on the server (Ubuntu 14.04.4 LTS, PHP 5.5.9) it no longer works: it seems to export an endless stream of data and can crash the server. I am working with Symfony 3.1.1. Here is the code of my class:
public function exportCsv($affairesQuery, $em)
{
    $response = new StreamedResponse(function () use ($affairesQuery, $em) {
        $results = $affairesQuery->iterate();
        $handle = fopen('php://output', 'r+');

        // UTF-8 BOM so spreadsheet software detects the encoding
        fputs($handle, chr(0xEF) . chr(0xBB) . chr(0xBF));

        fputcsv($handle, [
            // My long list of headers
        ], ';');

        while (false !== ($row = $results->next())) {
            fputcsv($handle, $row[0]->toArray(), ';');
            $em->detach($row[0]);
        }

        fclose($handle);
    });

    $response->setStatusCode(200);
    $response->headers->set('Content-Type', 'application/force-download');
    $response->headers->set('Content-Disposition', 'attachment; filename="Planning du '.date('Y-m-d h:i:s').'.csv"');

    return $response;
}
UPDATE:
I tried to make goodby/csv work in my case, as suggested in the comment below, but I got exactly the same issue, which is very strange because it takes a rather different approach.
public function exportCsv($affairesQuery, $em)
{
    $stmt = $this->dbConnection->prepare($affairesQuery->getQuery()->getSQL());

    $n = 1;
    foreach ($affairesQuery->getQuery()->getParameters()->toArray() as $value) {
        $stmt->bindValue($n, $value->getValue());
        $n++;
    }

    $response = new StreamedResponse();
    $response->setStatusCode(200);
    $response->headers->set('Content-Type', 'text/csv');
    $response->headers->set('Content-Disposition', 'attachment; filename="Planning du '.date('Y-m-d h:i:s').'.csv"');

    $response->setCallback(function () use ($stmt) {
        $config = new ExporterConfig();
        $exporter = new Exporter($config);
        $exporter->export('php://output', new PdoCollection($stmt->getIterator()));
    });

    $response->sendContent();

    return $response;
}
Related
I'm generating a large PDF with 2000 pages in the Symfony (4.2) framework. What I'm doing is simply saving the HTML content, rendered from Twig, to an .html file.
Then I use headless Chrome to generate the PDF from that file's URL using the command below.
/usr/bin/google-chrome --headless --disable-gpu --run-all-compositor-stages-before-draw --print-to-pdf [URL of HTML file] --virtual-time-budget=10000
Now, the requirement is that while the above command is running I have to display a loader with a progress bar on the front end.
What I did to get the streamed response and display it in the browser is shown below.
Controller
public function streamAction()
{
    $process = new Process(["pwd"]);
    $process->run();

    $output = new StreamedOutputService(fopen('php://stdout', 'w'));

    $response = new StreamedResponse(function () use ($output, $process) {
        // $process->isRunning() always returns false.
        while ($process->isRunning()) {
            $output->writeln($process->getOutput());
        }
    });
    $response->headers->set('X-Accel-Buffering', 'no');

    return $response;
}
Streamed Response Class
protected function doWrite($message, $newline)
{
    if (
        false === @fwrite($this->getStream(), $message)
        || ($newline && (false === @fwrite($this->getStream(), PHP_EOL)))
    ) {
        throw new RuntimeException('Unable to write output.');
    }

    echo $message;
    ob_flush();
    flush();
}
What is buggy in the above code? I'm not able to get the output of the command, and hence cannot write it to the browser.
The code below works fine and sends a response to the browser every 2 seconds:
public function streamAction()
{
    $output = new StreamedOutputService(fopen('php://stdout', 'w'));

    $response = new StreamedResponse(function () use ($output) {
        for ($i = 0; $i <= 5; $i++) {
            $output->writeln($i);
            sleep(2);
        }
    });
    $response->headers->set('X-Accel-Buffering', 'no');

    return $response;
}
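For comparison, a minimal sketch of how the process could be started non-blocking with Process::start(), so that isRunning() can still be true inside the streamed callback; the command here is a placeholder and the StreamedOutputService wrapper from above is assumed unchanged:
public function streamAction()
{
    // start() is non-blocking, unlike run(), so the process may still be
    // running when the streamed callback executes.
    $process = new Process(['pwd']); // placeholder for the headless Chrome command
    $process->start();

    $output = new StreamedOutputService(fopen('php://stdout', 'w'));

    $response = new StreamedResponse(function () use ($output, $process) {
        while ($process->isRunning()) {
            // forward only the output produced since the last read
            $output->writeln($process->getIncrementalOutput());
            usleep(500000);
        }
        // pick up anything written after the final loop iteration
        $output->writeln($process->getIncrementalOutput());
    });
    $response->headers->set('X-Accel-Buffering', 'no');

    return $response;
}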
I want to upload files in chunks to a URL endpoint using Guzzle.
I should be able to provide the Content-Range and Content-Length headers.
Using PHP, I know I can split the file using:
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of chunk

function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
How do I achieve sending the file in chunks using Guzzle, if possible using Guzzle streams?
This method allows you to transfer large files using Guzzle streams:
use GuzzleHttp\Psr7;
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;

$resource = fopen($pathname, 'r');
$stream = Psr7\stream_for($resource);

$client = new Client();
$request = new Request(
    'POST',
    $api,
    [],
    new Psr7\MultipartStream([
        [
            'name'     => 'bigfile',
            'contents' => $stream,
        ],
    ])
);

$response = $client->send($request);
Just use the multipart body type, as described in the documentation. cURL then handles the file reading internally, so you don't need to implement a chunked read yourself. All required headers will also be configured by Guzzle.
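If the endpoint genuinely requires separate chunk uploads with a Content-Range header, a rough sketch under that assumption could look like this (the endpoint behaviour, $api and the 1 MiB chunk size are assumptions, not something Guzzle prescribes):
use GuzzleHttp\Client;

$client = new Client();
$chunkSize = 1024 * 1024;          // 1 MiB per chunk (assumption)
$fileSize = filesize($pathname);
$handle = fopen($pathname, 'rb');
$offset = 0;

while (!feof($handle)) {
    $chunk = fread($handle, $chunkSize);
    if ($chunk === false || $chunk === '') {
        break;
    }
    $end = $offset + strlen($chunk) - 1;

    // POST one chunk; Guzzle/cURL set Content-Length from the body automatically
    $client->request('POST', $api, [
        'headers' => [
            'Content-Range' => sprintf('bytes %d-%d/%d', $offset, $end, $fileSize),
        ],
        'body' => $chunk,
    ]);

    $offset = $end + 1;
}

fclose($handle);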
I am trying to zip a bunch of PDFs together and download the archive in the browser. At the moment the zip file is created in the folder the PDFs are stored in but never sent to the user's browser and their downloads folder. I have a similar (and much simpler) function which downloads a single PDF, so I feel like I'm missing something fairly obvious here.
$id is a comma-separated list of filenames, which is split into an array for looping through and adding to the zip file. This bit works, so I'm thinking it may be an issue with the headers or the response.
Any help much appreciated.
public function downloadMultiple($id) {
    $id_array = explode(',', $id);
    $public_dir = storage_path();
    $zipFileName = time().'.zip';
    $zip = new ZipArchive;

    if ($zip->open($public_dir . '/' . $zipFileName, ZipArchive::CREATE) === TRUE) {
        foreach ($id_array as $file) {
            $file_path = storage_path($file).".pdf";
            if (file_exists($file_path)) {
                $zip->addFile($file_path, $file.".pdf");
            }
        }

        if ($zip->close()) {
            $filetopath = $public_dir.'/'.$zipFileName;
            $headers = [
                'Cache-control: maxage=1',
                'Pragma: no-cache',
                'Expires: 0',
                'Content-Type: application/octet-stream',
                'Content-Transfer-Encoding: binary',
                'Content-Type: application/force-download',
                'Content-Disposition: attachment; filename='.time().'.zip',
                'Content-length: ' . filesize($filetopath)
            ];

            if (file_exists($filetopath)) {
                $response = response()->download($filetopath, $zipFileName, $headers);
                //$response->deleteFileAfterSend(true);
            } else {
                return ['status' => 'zip file does not exist'];
            }
        } else {
            return ['status' => 'zip file could not close'];
        }
    } else {
        return ['status' => 'Could not create new zip'];
    }
}
Update:
It definitely gets to the return and does create the file; it just doesn't seem to download for the user. What comes back in the inspector shows that something is clearly not working as expected.
It may be worthwhile mentioning the code which calls the controller:
let xhr = new XMLHttpRequest(), self = this;
xhr.open('GET', window.location.origin+'/download-multiple/' + this.selected);
xhr.onload = function () {
};
xhr.send();
Assuming you're getting to that part of the controller method, I believe the problem is that you're not returning your response:
if (file_exists($filetopath)) {
    // $response = response()->download($filetopath, $zipFileName, $headers);
    // $response->deleteFileAfterSend(true);
    return response()->download($filetopath, $zipFileName, $headers)->deleteFileAfterSend(true);
} else {
    return ['status' => 'zip file does not exist'];
}
EDIT: The problem is that you're trying to load the file via AJAX, which you can't do the way you're trying to do it (see here for examples of how to do it). Change your JavaScript to:
let xhr = new XMLHttpRequest(), self = this;
window.location = window.location.origin+'/download-multiple/' + this.selected
I have a lengthy import process and I am trying to use the new StreamedResponse available in Symfony 2.1 to report some feedback about the task to the user, but the response is not being streamed (I get all the content at once at the end of processing). This is the code in my controller:
$em = $this->getDoctrine()->getEntityManager();

$response = new StreamedResponse();
$response->setCallback(function () use ($em) {
    $file = fopen(sys_get_temp_dir().'/categories.txt', 'r');
    $lineNum = 0;
    while ($line = fgets($file)) {
        $category = new Category();
        $fields = explode("\t", $line);
        $category->setFullId($fields[0]);
        $category->setName($fields[2]);
        $category->setFullName($fields[4]);
        $em->persist($category);
        if ($lineNum % 100 == 0) {
            echo 'Processing Line: '.$lineNum.'<br>';
            flush();
            $em->flush();
        }
        $lineNum++;
    }
    fclose($file);
});

return $response;
Any idea what may be wrong?
OK, I found it: you need to call both ob_flush() and flush().
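For illustration, here is the relevant part of the callback with both calls in place (a minimal sketch; only the flushing lines change, the rest of the import loop stays as above):
if ($lineNum % 100 == 0) {
    echo 'Processing Line: '.$lineNum.'<br>';
    ob_flush(); // flush PHP's output buffer...
    flush();    // ...then the system-level buffer, so the chunk actually reaches the client
    $em->flush();
}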
I am having a problem transferring files using NuSOAP. I understand that you can read a file and transfer it as a string, but it's not working. Here is an example:
Client:
require('libraries/nusoap/nusoap.php');

$url = "http://www.example.com";
$client = new nusoap_client($url);

$args = array('file_name' => 'myfile.zip');
$return = $client->call('getFile', array($args));

if (empty($return)) {
    echo "WHY IN THE WORLD IS THIS EMPTY!!!!!";
}
Server:
require('libraries/nusoap/nusoap.php');

$server = new nusoap_server;
$server->configureWSDL('server', 'urn:server');
$server->wsdl->schemaTargetNamespace = 'urn:server';

$server->register('getFile',
    array('value' => 'xsd:string'),
    array('return' => 'xsd:string'),
    'urn:server',
    'urn:server#getFile');

function getFile($value) {
    $returnData = "";
    $filePath = $value['file_path'];
    $mode = "r";
    if (floatval(phpversion()) >= 4.3) {
        $returnData = file_get_contents($filePath);
    } else {
        if (!file_exists($filePath)) {
            return -3;
        }
        $handler = fopen($filePath, $mode);
        if (!$handler) {
            return -2;
        }
        $returnData = "";
        while (!feof($handler)) {
            $returnData .= fread($handler, filesize($filePath));
        } //end while
        fclose($handler);
    } //end else
    return $returnData;
}
Here is the really strange part. If I return the file name or file size or something like that, it will work. It will just not return the file itself. Help please.
In the server-side getFile function you should return base64_encode($returnData);
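For illustration, a minimal sketch of both sides under that suggestion; binary data is not safe inside the XML payload, so it travels as base64 text, and the client-side decode shown here is an assumption about how the result is consumed:
// Server side: encode the binary file contents before returning them
// so they survive the SOAP/XML transport.
function getFile($value) {
    $returnData = file_get_contents($value['file_path']);

    return base64_encode($returnData);
}

// Client side (assumed usage): decode the response before writing it to disk.
$return = $client->call('getFile', array($args));
file_put_contents('myfile.zip', base64_decode($return));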