I have a lengthy import process and I am trying to use the new StreamedResponse available in Symfony 2.1 to report some feedback about the task to the user, but the response is not being streamed (I get all the content at once, at the end of processing). This is the code in my controller:
$em = $this->getDoctrine()->getEntityManager();

$response = new StreamedResponse();
$response->setCallback(function () use ($em) {
    $file = fopen(sys_get_temp_dir().'/categories.txt', 'r');
    $lineNum = 0;
    while ($line = fgets($file)) {
        $category = new Category();
        $fields = explode("\t", $line);
        $category->setFullId($fields[0]);
        $category->setName($fields[2]);
        $category->setFullName($fields[4]);
        $em->persist($category);
        if ($lineNum % 100 == 0) {
            echo 'Processing Line: '.$lineNum.'<br>';
            flush();
            $em->flush();
        }
        $lineNum++;
    }
    fclose($file);
});

return $response;
Any idea what may be wrong?
OK, I found it: you need to call both ob_flush() and flush().
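For reference, a minimal sketch of the working callback (the file name and loop mirror the code above; the ob_get_level() guard is an extra precaution, not part of the original):

```php
use Symfony\Component\HttpFoundation\StreamedResponse;

$response = new StreamedResponse(function () use ($em) {
    // ob_flush() warns if no output buffer is active, so make sure one exists
    if (ob_get_level() === 0) {
        ob_start();
    }

    $file = fopen(sys_get_temp_dir().'/categories.txt', 'r');
    $lineNum = 0;
    while ($line = fgets($file)) {
        // ... build and persist the Category as above ...
        if ($lineNum % 100 == 0) {
            echo 'Processing Line: '.$lineNum.'<br>';
            ob_flush(); // move PHP's output buffer to the SAPI layer
            flush();    // push the SAPI buffer out to the client
            $em->flush();
        }
        $lineNum++;
    }
    fclose($file);
});

return $response;
```

Note that a reverse proxy or FastCGI buffering in front of PHP can still hold the output back even when both flush calls are in place.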
I am writing a web scraper in PHP using Gitpod. After a while I managed to solve all the reported problems, but even though no problems are left, the code neither opens the browser nor produces any output.
Does anybody have an idea why that could be the case?
<?php
if (file_exists('vendor/autoload.php')) {
    require 'vendor/autoload.php';
}

use Goutte\Client;

$client = new Goutte\Client();

// Create a new array to store the scraped data
$data = array();

// Loop through the pages
if ($client->getResponse()->getStatus() != 200) {
    echo 'Failed to access website. Exiting script.';
    exit();
}
for ($i = 0; $i < 3; $i++) {
    // Make a request to the website
    $crawler = $client->request('GET', 'https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives_de?page=' . $i);
    // Find all the initiatives on the page
    $crawler->filter('.initiative')->each(function ($node) use (&$data) {
        // Extract the information for each initiative
        $title = $node->filter('h3')->text();
        $link = $node->filter('a')->attr('href');
        $description = $node->filter('p')->text();
        $deadline = $node->filter('time')->attr('datetime');
        // Append the data for the initiative to the data array
        $data[] = array($title, $link, $description, $deadline);
    });
    // Sleep for a random amount of time between 5 and 10 seconds
    $sleep = rand(5, 10);
    sleep($sleep);
}

// Open the output file
$fp = fopen('initiatives.csv', 'w');
// Write the header row
fputcsv($fp, array('Title', 'Link', 'Description', 'Deadline'));
// Write the data rows
foreach ($data as $row) {
    fputcsv($fp, $row);
}
// Close the output file
fclose($fp);
?>
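No answer was recorded, but one likely culprit is visible in the code itself: $client->getResponse() is called before any request has been made, so it returns null and the script dies with a fatal error before producing any output. A hedged sketch of the reordering (the exact status method depends on your Goutte/BrowserKit version: older releases expose getStatus(), newer ones getStatusCode()):

```php
require 'vendor/autoload.php';

use Goutte\Client;

$client = new Client();
$data = array();

for ($i = 0; $i < 3; $i++) {
    // Make the request first ...
    $crawler = $client->request('GET', 'https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives_de?page=' . $i);

    // ... then check the status of the request we just made.
    if ($client->getResponse()->getStatusCode() != 200) {
        echo 'Failed to access website. Exiting script.';
        exit();
    }

    // ... filter('.initiative') and collect $data as in the original ...
}
```

Running the script from a terminal in Gitpod will also never "open the browser" by itself; the script only writes initiatives.csv.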
I'm generating a large PDF with 2000 pages in the Symfony (4.2) framework. I simply save the HTML content to an .html file, getting the content from a Twig template.
Then I use headless Chrome to generate the PDF from the URL of that file with the command below:
/usr/bin/google-chrome --headless --disable-gpu --run-all-compositor-stages-before-draw --print-to-pdf [URL of HTML file] --virtual-time-budget=10000
Now, the requirement is that while the above command is running, I have to display a loader with a progress bar on the front end.
To do that, I tried the following to get a streamed response and display it in the browser.
Controller
public function streamAction()
{
    $process = new Process(["pwd"]);
    $process->run();

    $output = new StreamedOutputService(fopen('php://stdout', 'w'));
    $response = new StreamedResponse(function () use ($output, $process) {
        // $process->isRunning() always returns false.
        while ($process->isRunning()) {
            $output->writeln($process->getOutput());
        }
    });
    $response->headers->set('X-Accel-Buffering', 'no');

    return $response;
}
Streamed Output Class (StreamedOutputService)
protected function doWrite($message, $newline)
{
    if (
        false === @fwrite($this->getStream(), $message) ||
        (
            $newline &&
            (false === @fwrite($this->getStream(), PHP_EOL))
        )
    ) {
        throw new RuntimeException('Unable to write output.');
    }

    echo $message;
    ob_flush();
    flush();
}
What is buggy in the above code? I'm not able to get the output of the command, and hence cannot write it to the browser.
The code below works fine and sends a response to the browser every 2 seconds:
public function streamAction()
{
    $output = new StreamedOutputService(fopen('php://stdout', 'w'));
    $response = new StreamedResponse(function () use ($output) {
        for ($i = 0; $i <= 5; $i++) {
            $output->writeln($i);
            sleep(2);
        }
    });
    $response->headers->set('X-Accel-Buffering', 'no');

    return $response;
}
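One likely explanation, sketched below: Process::run() is synchronous and only returns once the command has exited ("pwd" finishes almost instantly anyway), so isRunning() is already false by the time the StreamedResponse callback executes. Process::start() runs the command asynchronously, and getIncrementalOutput() returns whatever new output has appeared since the last read. This is a sketch under those assumptions, not a drop-in fix:

```php
use Symfony\Component\HttpFoundation\StreamedResponse;
use Symfony\Component\Process\Process;

public function streamAction()
{
    $process = new Process(['/usr/bin/google-chrome', '--headless', /* ... */]);
    $process->start(); // non-blocking, unlike run()

    $response = new StreamedResponse(function () use ($process) {
        while ($process->isRunning()) {
            // Only the output produced since the previous call
            echo $process->getIncrementalOutput();
            ob_flush();
            flush();
            usleep(200000); // avoid a busy loop
        }
        // Drain anything written between the last poll and exit
        echo $process->getIncrementalOutput();
    });
    $response->headers->set('X-Accel-Buffering', 'no');

    return $response;
}
```

Note that headless Chrome prints little to stdout by default, so real progress reporting may need to be derived from something else (for example, polling the size of the output PDF).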
I am going through the codebase of Slim PHP for educational purposes, and I understand a lot of it from reading. However, I am finding it really difficult to understand the purpose of the output buffer used in the 'main' run() method of the App class.
public function run($silent = false)
{
    $response = $this->container->get('response');

    try {
        ob_start();
        $response = $this->process($this->container->get('request'), $response);
    } catch (InvalidMethodException $e) {
        $response = $this->processInvalidMethod($e->getRequest(), $response);
    } finally {
        $output = ob_get_clean();
    }

    if (!empty($output) && $response->getBody()->isWritable()) {
        $outputBuffering = $this->container->get('settings')['outputBuffering'];
        if ($outputBuffering === 'prepend') {
            // prepend output buffer content
            $body = new Http\Body(fopen('php://temp', 'r+'));
            $body->write($output . $response->getBody());
            $response = $response->withBody($body);
        } elseif ($outputBuffering === 'append') {
            // append output buffer content
            $response->getBody()->write($output);
        }
    }

    $response = $this->finalize($response);

    if (!$silent) {
        $this->respond($response);
    }

    return $response;
}
I have tried to dump the value of ob_get_clean(), but it is always empty.
This is done in order to always return a PSR-7 Response. If you echo or print_r() inside your routes/middleware, that output is captured by the buffer and is prepended to the response body if the outputBuffering setting is set to prepend, or appended to it if the setting is append. That also explains why ob_get_clean() dumps as empty: unless a route actually produces stray output, there is nothing in the buffer.
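A minimal plain-PHP sketch of the mechanism the answer describes (no Slim required):

```php
// Simulate what App::run() does around route dispatch.
ob_start();
echo 'stray output from a route handler'; // e.g. a debug echo inside a route
$routeBody = 'actual response body';
$stray = ob_get_clean();                  // captures the stray echo, leaves nothing printed

// outputBuffering === 'prepend': stray output goes in front of the body
$prepended = $stray . $routeBody;

// outputBuffering === 'append': stray output goes after the body
$appended = $routeBody . $stray;
```

Without the buffer, the stray echo would bypass the PSR-7 response entirely and be sent to the client before any headers Slim wants to set.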
I want to upload files in chunks to a URL endpoint using guzzle.
I should be able to provide the Content-Range and Content-Length headers.
Using PHP, I know I can read and send a file in chunks with:
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of chunk

function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }
    return $status;
}
How do I achieve sending the file in chunks using Guzzle, if possible using Guzzle streams?
This method allows you to transfer large files using Guzzle streams:
use GuzzleHttp\Psr7;
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;

$resource = fopen($pathname, 'r');
// Guzzle 6 API; in Guzzle 7 use Psr7\Utils::streamFor($resource) instead.
$stream = Psr7\stream_for($resource);

$client = new Client();
$request = new Request(
    'POST',
    $api,
    [],
    new Psr7\MultipartStream(
        [
            [
                'name' => 'bigfile',
                'contents' => $stream,
            ],
        ]
    )
);

$response = $client->send($request);
Just use the multipart body type as described in the documentation. cURL then handles the file reading internally; you don't need to implement chunked reads yourself. All the required headers will also be set by Guzzle.
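The multipart approach streams the whole file in a single request. If the endpoint really does require explicit Content-Range chunks, as the question suggests, each chunk can be sent as its own request; a hedged sketch (the $api endpoint and how the server reassembles the chunks are assumptions about your API):

```php
use GuzzleHttp\Client;

define('CHUNK_SIZE', 1024 * 1024); // 1 MiB per request, as in the question

$client = new Client();
$handle = fopen($pathname, 'rb');
$total  = filesize($pathname);
$start  = 0;

while (!feof($handle)) {
    $chunk = fread($handle, CHUNK_SIZE);
    $end   = $start + strlen($chunk) - 1;

    $client->request('POST', $api, [
        'headers' => [
            // e.g. "bytes 0-1048575/5242880"
            'Content-Range' => sprintf('bytes %d-%d/%d', $start, $end, $total),
            // Content-Length is derived from the body automatically
        ],
        'body' => $chunk,
    ]);

    $start = $end + 1;
}
fclose($handle);
```

Each iteration uploads one slice, so memory usage stays bounded by CHUNK_SIZE regardless of the file size.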
I have to create a CSV export. I found this awesome service and tried to adapt it to my case. It works very well on my own computer (Windows 7, unmodified WAMP 2.5), but when I put it on the server (Ubuntu 14.04.4 LTS, PHP 5.5.9) it doesn't work anymore: it seems to export an endless stream of data, and it can crash the server. I work with Symfony 3.1.1. Here is the code of my class:
public function exportCsv($affairesQuery, $em)
{
    $response = new StreamedResponse(function () use ($affairesQuery, $em) {
        $results = $affairesQuery->iterate();
        $handle = fopen('php://output', 'r+');
        // UTF-8 BOM so spreadsheet software detects the encoding
        fputs($handle, $bom = (chr(0xEF).chr(0xBB).chr(0xBF)));
        fputcsv($handle, [
            // My long list of headers
        ],
        ";");
        while (false !== ($row = $results->next())) {
            fputcsv($handle, $row[0]->toArray(), ";");
            $em->detach($row[0]);
        }
        fclose($handle);
    });
    $response->setStatusCode(200);
    $response->headers->set('Content-Type', 'application/force-download');
    $response->headers->set('Content-Disposition', 'attachment; filename="Planning du '.date("Y-m-d h:i:s").'.csv"');

    return $response;
}
UPDATE:
I tried to make goodby/csv work in my case, as suggested in a comment below, but I got exactly the same issue, which is very strange because it doesn't work the same way at all.
public function exportCsv($affairesQuery, $em)
{
    $stmt = $this->dbConnection->prepare($affairesQuery->getQuery()->getSQL());
    $n = 1;
    foreach ($affairesQuery->getQuery()->getParameters()->toArray() as $value) {
        $stmt->bindValue($n, $value->getValue());
        $n++;
    }

    $response = new StreamedResponse();
    $response->setStatusCode(200);
    $response->headers->set('Content-Type', 'text/csv');
    $response->headers->set('Content-Disposition', 'attachment; filename="Planning du '.date("Y-m-d h:i:s").'.csv"');
    $response->setCallback(function () use ($stmt) {
        $config = new ExporterConfig();
        $exporter = new Exporter($config);
        $exporter->export('php://output', new PdoCollection($stmt->getIterator()));
    });
    $response->sendContent();

    return $response;
}