I want to download my large DB data as an Excel file.
But due to the huge amount of data, it seems impossible to download it in one go: it puts a lot of load on the server, takes a lot of processing time, and keeps crashing.
Now I want to create multiple temporary Excel files 'on the go', each holding a limited amount of data, e.g. 50000 rows, and at the end download all these temporary files in a zip.
That way it doesn't load up the server and keeps it from crashing.
Is this achievable via PHP-CodeIgniter? Can anybody guide me?
You can do it in several ways:
the one you have already thought of: generating the files into a folder and zipping them;
increasing the execution time limit and the memory limit, in CI or on the server;
implementing a queue service.
For zipping, you can do it like this:
<?php
// Directory holding the files to be zipped
$pathdir = "DirectoryName/";
$zipcreated = "NameOfZip.zip";

$zip = new ZipArchive;
if ($zip->open($zipcreated, ZipArchive::CREATE) === TRUE) {
    // Add every regular file in the directory to the archive
    $dir = opendir($pathdir);
    while (($file = readdir($dir)) !== false) {
        if (is_file($pathdir . $file)) {
            $zip->addFile($pathdir . $file, $file);
        }
    }
    closedir($dir);
    $zip->close();
}
?>
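To produce the temporary Excel files themselves, you can page through the table and write one file per chunk before zipping. A minimal sketch, assuming PHPExcel and a hypothetical CodeIgniter model method get_rows($limit, $offset) that returns an array of rows:

<?php
$chunkSize = 50000;   // rows per temporary file (fits within Excel5's 65536-row limit)
$offset    = 0;
$fileNo    = 1;

while (true) {
    // get_rows() is a placeholder for your own paged query
    $rows = $this->export_model->get_rows($chunkSize, $offset);
    if (empty($rows)) {
        break;
    }

    $objPHPExcel = new PHPExcel();
    $objPHPExcel->getActiveSheet()->fromArray($rows, null, 'A1');

    $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
    $objWriter->save("DirectoryName/export_part_{$fileNo}.xls");

    // Free this chunk's memory before loading the next one
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel, $objWriter);

    $offset += $chunkSize;
    $fileNo++;
}
// Afterwards, zip DirectoryName/ as shown above.
?>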
For the limit-increasing option:
You can use the Excel5 writer in PHPExcel; that format can handle up to 65536 rows per sheet. For that:
ini_set('max_execution_time', '600'); // execution time in seconds, adjust as needed; see the manual
ini_set('memory_limit', '1024M');     // your memory limit, as a string
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
After that, follow the PHPExcel documentation.
And for the queue option:
This is a slightly more complex process.
You can implement a queue service and hand it the responsibility of generating the Excel files. After generation, you can notify the user or return the download URL of the file.
For the notification, you need to implement a notification service as well.
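A minimal sketch of the queue idea, as two CodeIgniter controller methods backed by a jobs table; the table name export_jobs and the method generate_chunked_export() are made up for illustration:

// Web request: don't generate the file, just enqueue a job
public function request_export()
{
    $this->db->insert('export_jobs', array(
        'user_id' => $this->session->userdata('user_id'),
        'status'  => 'pending',
    ));
    echo "Your export is being prepared; you will be notified when it is ready.";
}

// Worker, run from cron: php index.php export worker
public function worker()
{
    $job = $this->db->where('status', 'pending')->get('export_jobs')->row();
    if (!$job) {
        return; // nothing to do
    }
    $this->db->update('export_jobs', array('status' => 'running'), array('id' => $job->id));

    // Your chunk-and-zip logic from above goes here
    $zipPath = $this->generate_chunked_export($job);

    $this->db->update('export_jobs',
        array('status' => 'done', 'file' => $zipPath),
        array('id' => $job->id));
    // Notify the user here (mail, in-app notification, ...)
}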
I'm programming a tool that gathers images uploaded by a user into a zip archive. For this I came across the ZipArchiveAdapter from Flysystem, which seems to do a good job.
However, I'm hitting the memory limit when the number of files in the zip archive goes into the thousands.
When the number of images for a user goes beyond about 1000, it usually fails because the available memory is exhausted. To get to the point where it handles most users with fewer than 1000 images, I've already increased the memory limit to 4 GB, but increasing it beyond that is not really an option.
Simplified code at this point:
<?php
use League\Flysystem\Filesystem;
use League\Flysystem\ZipArchive\ZipArchiveAdapter;
use League\Flysystem\Memory\MemoryAdapter;

class User {
    // ... Other user code

    public function createZipFile()
    {
        $tmpFile = tempnam('/tmp', "zippedimages_");
        $download = new Filesystem(new ZipArchiveAdapter($tmpFile));

        if ($this->getImageCount()) {
            foreach ($this->getImages() as $image) {
                $path_in_zip = "My Images/{$image->category->title}/{$image->id}_{$image->image->filename}";
                $download->write($path_in_zip, $image->image->getData());
            }
        }

        $download->getAdapter()->getArchive()->close();
        return $tmpFile;
        // Upload zip to s3-storage
    }
}
So my questions:
a) Is there a way to have Flysystem write the zip file to disk "on the go"? Currently it stores the entire zip in memory and only writes to disk when the object is destroyed.
b) Should I use another library that is better suited for this?
c) Should I take another approach here? For example, having the user download multiple smaller zips instead of one large zip. (Ideally I want them to download just one file regardless.)
I want to create a big zip file with thousands (10000+) of PDF files and upload it to an external SFTP server.
Currently the process always results in a timeout (I have already modified php.ini). The zip file gets created nevertheless, but we can never be sure whether everything went well or when it is finished.
I looked around but found no satisfactory answer. Is there a way to prevent the timeouts, keep the process running, and know when it is finished? Maybe via AJAX?
Thanks in advance
For building the archive without holding it all in memory, you can try the ZipStream class from the PHPZip project (https://github.com/Grandt/PHPZip); it streams the zip to the client while it is being built:
$zip = new \ZipStream("invoices.zip");

foreach ($invoices as $invoice) {
    // InvoicePdf stands for your own PDF class; render() returns the raw PDF data
    $pdf = new InvoicePdf($invoice);
    $zip->addFile($pdf->render(), $pdf->getFilename());
}

$zip->finalize();
exit;
For uploading large files from the browser, you can try this (https://www.sitepoint.com/upload-large-files-in-php/), or google for "jquery file upload".
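For the upload to the external SFTP server itself, a library such as phpseclib can send the finished archive without loading it into memory. A minimal sketch, assuming phpseclib 2.x is installed via Composer and that the host, credentials, and paths are placeholders:

<?php
use phpseclib\Net\SFTP;

$sftp = new SFTP('sftp.example.com');
if (!$sftp->login('username', 'password')) {
    exit('SFTP login failed');
}

// SOURCE_LOCAL_FILE streams the file from disk instead of reading it into memory
$sftp->put('/remote/path/invoices.zip', '/local/path/invoices.zip', SFTP::SOURCE_LOCAL_FILE);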
I have a PHP function that generates a zip file "on the fly". The files are on one server (AWS S3), but the PHP function that generates the zip file is on another server/web host. I have noticed that generating the zip file takes a long time, and that I get a corrupt zip file when there are many files. I want to troubleshoot/debug where it "stops", and what the missing link is when there are many files (the limit seems to be 20 files, which is not many).
How can I find out where my function fails when there are more than 20 files to build the zip file from?
Can I add a timer to every row?
Can I find out whether it is memory or something else on my shared hosting (where the PHP function runs)? Or whether it is something with S3?
My PHP function to generate the zip file from files on AWS S3:
<?php
$imageQueryResult = $this->getUserImages('download', array('i.image_name'));
if (!empty($imageQueryResult)) {
    $imageUrl = $this->getFromAmazon('download', $imageQueryResult);
    $imageNumber = 1;
    $zipName = 'tt-' . date("Y-m-d") . '.zip';

    // Create a temp file and open it as a zip archive;
    // OVERWRITE is needed because tempnam() already created an (empty, non-zip) file
    $zip = new ZipArchive();
    $tmp_file = tempnam('.', '');
    if ($zip->open($tmp_file, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE) {
        exit('Could not open temp zip file');
    }

    // Loop through each file
    foreach ($imageUrl as $image) {
        // Get the extension
        $ext = pathinfo(parse_url($image, PHP_URL_PATH), PATHINFO_EXTENSION);
        // Download the file; skip it on failure instead of zipping a boolean false
        $download_file = file_get_contents($image);
        if ($download_file === false) {
            continue; // log $image here to see which URL failed
        }
        // Add it to the zip
        $zip->addFromString($imageNumber . '-tt.' . $ext, $download_file);
        $imageNumber++;
    }
    // Close the zip
    $zip->close();

    // Send the file to the browser as a download; keep running if the
    // user aborts, so the temp file still gets deleted
    ignore_user_abort(true);
    header('Content-Disposition: attachment; filename=' . $zipName);
    header('Content-Length: ' . filesize($tmp_file));
    header('Content-Type: application/zip');
    readfile($tmp_file);
    unlink($tmp_file);
}
?>
You can try
ZipArchive::getStatusString (http://php.net/manual/en/ziparchive.getstatusstring.php),
which returns the status error message (system and/or zip messages).
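For example, you can check the result of open() and log the status after each operation; a minimal sketch in which $tmp_file and $data are placeholders:

$zip = new ZipArchive();
$res = $zip->open($tmp_file, ZipArchive::CREATE | ZipArchive::OVERWRITE);
if ($res !== TRUE) {
    exit('open failed, error code ' . $res); // open() returns an error code, not false
}
$zip->addFromString('1-tt.jpg', $data);
error_log($zip->getStatusString()); // e.g. "No error" on success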
You don't know in advance how long compressing all your files will take, so you'll have to check both the maximum execution time and the memory the script consumes.
If the problem is the time, the solution might be to do this in chunks:
open the archive for writing
add N files in this iteration
close the archive
Repeat until no files are left (see the sketch below).
Keep in mind that with this approach, the more files you have, the more space you need to store the intermediate results.
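A minimal sketch of that chunked loop; the chunk size of 20 is an arbitrary example and $imageUrl is the array of file URLs from the question:

<?php
// The target path must not exist yet; the first open() creates it
$tmp_file = sys_get_temp_dir() . '/export-' . date('Ymd-His') . '.zip';

foreach (array_chunk($imageUrl, 20) as $chunk) {
    // Re-open the archive for each chunk; CREATE keeps the entries added so far
    $zip = new ZipArchive();
    if ($zip->open($tmp_file, ZipArchive::CREATE) !== TRUE) {
        exit('could not open archive');
    }
    foreach ($chunk as $image) {
        $data = file_get_contents($image);
        if ($data !== false) {
            $zip->addFromString(basename(parse_url($image, PHP_URL_PATH)), $data);
        }
    }
    // close() flushes this chunk to disk and frees the memory it used
    $zip->close();
}
?>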
In my project I use the Symfony2 PHPExcel wrapper https://github.com/liuggio/ExcelBundle
With the example from the link above I can create new Excel files. However, such a file has no styling or markup at all, so I created an Excel template into which I want to insert some data.
I know how to load an Excel file:
$excelObj = $this->get('xls.load_xls2007')
->load($this->get('kernel')
->getRootDir() . '/../web/excel-template.xlsx');
//custom modifications on excel file
Now I need to create a response. But the ExcelBundle doc has no information on how to do that; it only shows how the response works for an Excel file that is created in code.
I tried:
$excelService->setExcelObj($excelObj);
$response = $excelService->getResponse();
//the rest is equal to the code in the doc
but it gives me a blank Excel document.
Any ideas how to create a response from a loaded Excel file?
You can do this as follows:
// Read the file (e.g. $fileType = 'Excel2007', $fileName = 'excel-template.xlsx')
$objReader = PHPExcel_IOFactory::createReader($fileType);
$objPHPExcel = $objReader->load($fileName);

// Change the file
$objPHPExcel->setActiveSheetIndex(0)
    ->setCellValue('A1', 'Hello')
    ->setCellValue('B1', 'World!');

// Write the file
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, $fileType);
$objWriter->save($fileName);
If anything is unclear, please comment.
Personally, I would save the file to disk and redirect the user to the on-disk version. This allows several things:
Your web server serves the file instead of PHP, which is good from a performance and memory-usage standpoint.
It decouples your architecture a bit, allowing future changes such as moving the creation and loading of Excel files to asynchronous operations.
You can use http://wiki.nginx.org/XSendfile (there is an Apache module as well).
The user can re-download the file, or pause and resume the download, without it being recreated.
To do this you will want to (see the sketch below):
Save the file to a web-accessible temp directory after it is created
Redirect the user to that file's location
Create a cron job (or similar) that deletes older files from the temp directory
The tempfile API (http://us2.php.net/tmpfile) might be useful here.
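A minimal sketch of the save-and-redirect part inside a Symfony2 controller action; the web/tmp/ directory and the file naming are assumptions, not part of ExcelBundle:

// $excelObj is the loaded/modified PHPExcel object from above
$fileName = 'export-' . uniqid() . '.xlsx';
$webDir   = $this->get('kernel')->getRootDir() . '/../web';

// Save to a web-accessible temp directory instead of streaming through PHP
$objWriter = PHPExcel_IOFactory::createWriter($excelObj, 'Excel2007');
$objWriter->save($webDir . '/tmp/' . $fileName);

// Redirect the user; the web server now handles the actual download
return $this->redirect('/tmp/' . $fileName);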
The new version 2.* of the PHPExcelBundle could help you.
The following is now possible:
//read
$phpExcelObject = $this->get('phpexcel')->createPHPExcelObject('file.xls');

$phpExcelObject->setActiveSheetIndex(0)
    ->setCellValue('C6', 'some text')
    ->setCellValue('D6', 'some text2');

//write to disk
$writer = $this->get('phpexcel')->createWriter($phpExcelObject, 'Excel5');
$writer->save('file.xls');
// or stream it back to the browser
$response = $this->get('phpexcel')->createStreamedResponse($writer);
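On top of that, you will usually want download headers on the streamed response; these are standard Symfony HttpFoundation calls, with the filename as an example:

$response->headers->set('Content-Type', 'application/vnd.ms-excel');
$response->headers->set('Content-Disposition', 'attachment; filename=file.xls');
return $response;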
Does someone know a good PHP solution to delete, or better, wipe a file from a Linux system?
Scenario:
The file is stored encrypted; when a download is requested, the file is copied to a temporary folder and decrypted. This part already works.
But how do I remove the file from the temporary location after sending it to the user?
In my mind I have the following options:
Open the file via fopen and write 0s and 1s into it (probably very slow)
Save the file to Memcache instead of the hard disk (could be a problem with my hoster)
Use some 3rd-party tool on the command line or as a cronjob (could be a problem to install)
Goal: delete the file from the hard disk, without the possibility of recovery (wipe/overwrite).
Call "shred" via exec/system/passthru.
Arguably the best approach is to never save the file in its decrypted state in the first place.
Rather, use stream filters to decrypt it on-the-fly and send it directly to the end user.
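A minimal sketch of that idea with a user-defined stream filter; the XOR "cipher" here is only a stand-in for your real decryption, and the key byte and path are placeholders:

<?php
// Toy decryption filter: XORs every byte with a fixed key byte.
// Replace the per-byte transformation with your real cipher.
class XorDecryptFilter extends php_user_filter
{
    public function filter($in, $out, &$consumed, $closing)
    {
        while ($bucket = stream_bucket_make_writeable($in)) {
            for ($i = 0; $i < $bucket->datalen; $i++) {
                $bucket->data[$i] = $bucket->data[$i] ^ "\x5A"; // placeholder key byte
            }
            $consumed += $bucket->datalen;
            stream_bucket_append($out, $bucket);
        }
        return PSFS_PASS_ON;
    }
}
stream_filter_register('xor.decrypt', 'XorDecryptFilter');

// Stream the encrypted file through the filter straight to the client;
// no decrypted copy ever touches the disk.
$fp = fopen('/path/to/encrypted-file', 'rb');
stream_filter_append($fp, 'xor.decrypt', STREAM_FILTER_READ);
header('Content-Type: application/octet-stream');
fpassthru($fp);
fclose($fp);
?>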
Update
Your option 1 is actually not too bad if you consider this code:
$filename = '/path/to/file';
$size = filesize($filename);

// Overwrite the file's contents with zeros; 'r+b' overwrites in place
// instead of truncating the file first
$src  = fopen('/dev/zero', 'rb');
$dest = fopen($filename, 'r+b');
stream_copy_to_stream($src, $dest, $size);
fclose($src);
fclose($dest);

// Finally remove the wiped file
unlink($filename);
You could use /dev/urandom as well, but that will be slow.