TCPDF does not create a PDF when the amount of data is huge - php

I am working with TCPDF to generate a report as a PDF. When I generate the report for a full month it gives me the error below, and the PDF only ever contains 15 pages:
Fatal error: Out of memory (allocated 85721088) (tried to allocate 262142 bytes) in C:\xampp\htdocs\Apps\tcpdf\tcpdf.php on line 20732
How can I resolve this? TCPDF will not create more than 15 pages, but I need it to generate enough pages to show all the data. I am stuck on this, so please help me resolve it.
Thanks in advance.

There are a number of issues that you need to address when dealing with TCPDF.
The first is setting the maximum amount of memory you feel is appropriate, using something like this:
ini_set('memory_limit', 'VALUE HERE');
Then you need to ensure that you have sufficient processing time:
ini_set('max_execution_time', VALUE HERE);
It is then a very good idea to check all your code, looking for ways to limit the amount of resources required to create the PDF. In my case, I had lots of DB results and multiple arrays storing huge volumes of data which remained in memory after they had been used. Once I unset these after use, my memory usage was reduced considerably.
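Putting it together, here is a minimal sketch of all three steps; the limit values, the fetchReportRows() and buildRowHtml() helpers, and the variable names are illustrative placeholders, not parts of TCPDF:

ini_set('memory_limit', '256M');      // 1. raise the memory ceiling
ini_set('max_execution_time', 300);   // 2. allow enough processing time

require_once 'tcpdf/tcpdf.php';

$rows = fetchReportRows();            // hypothetical: pull the month's data from the DB
$pdf = new TCPDF();
$pdf->AddPage();
foreach ($rows as $row) {
    $pdf->writeHTML(buildRowHtml($row));  // hypothetical HTML builder; TCPDF adds pages automatically
}
unset($rows);                         // 3. release large structures once they have been used
$pdf->Output('report.pdf', 'I');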

Related

PHPExcel multiple workbooks memory performance

I have a report that's generated using PHPExcel. It's quite a large report, 8 worksheets/tabs in total. As you can imagine it uses up quite a lot of memory, and I'm starting to run into problems with it.
I was wondering which options would be best suited to reducing the memory load it takes to generate this report.
The idea I had was to create the spreadsheet with 8 empty worksheets, save the file, then load the file with:
$objReader->setLoadSheetsOnly($sheetname);
I'm not sure if I'm heading in the right direction with this, though?
The best option to start with would be to look at the cell caching options, which were written explicitly to reduce memory usage. These are described in section 4.2.1 of the developer documentation.
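As a minimal sketch, enabling one of those caching methods takes a couple of lines run before the workbook is created; cache_in_memory_gzip is just one example from the documentation:

require_once 'PHPExcel.php';

$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip;
if (!PHPExcel_Settings::setCacheStorageMethod($cacheMethod)) {
    die($cacheMethod . ' caching is not available on this server');
}

$objPHPExcel = new PHPExcel();  // worksheets created from here on use the cache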

Merging PDFs - memory

I'm stuck on this problem. I'm merging multiple PDF files into one, depending on which PDFs the client chooses.
If I choose the smallest PDFs and merge them it works fine, but as soon as they get a little bigger, around 1 MB, I get Fatal error: Allowed memory size of xxxxxx bytes exhausted (tried to allocate xxxx).
I know it's a php.ini problem and I could just set the limit higher, but I can't change it unless I pay for a business account...
Is there any workaround, like lowering the PDF quality/size and then raising it again? I really don't know what to do :S
You can try it yourself here: pdf.devharis.com
Choose the two cheapest items and order them, then try some bigger ones; it crashes...
I'm not going to try that site, because I'm not going to give you my information and I don't read whatever language that is, but I will try to answer your question.
Altering PDFs is an intensive task; if you are using cheap hosting, then it's time to upgrade. It would also be beneficial to know how much memory you're actually allowed.
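As a starting point, here is a small sketch for finding out how much memory you are allowed and how much the merge actually uses; these are all core PHP functions, so they work even on restricted shared hosting:

echo 'memory_limit:  ' . ini_get('memory_limit') . PHP_EOL;
echo 'current usage: ' . round(memory_get_usage(true) / 1048576, 1) . ' MB' . PHP_EOL;
echo 'peak usage:    ' . round(memory_get_peak_usage(true) / 1048576, 1) . ' MB' . PHP_EOL;

// Some hosts permit raising the limit at runtime even when php.ini is locked down:
if (@ini_set('memory_limit', '256M') === false) {
    echo 'memory_limit cannot be raised on this host' . PHP_EOL;
}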

PHPExcel: Read a spreadsheet -> audit and apply formats -> save audited spreadsheet. MEMORY problems

Much has been documented regarding PHPExcel memory issues, so my question presumes that reading in "chunks", as described by Mark Baker, has been applied. My specific situation is this:
- read a "raw" spreadsheet uploaded to a database record (working fine)
- retrieve said spreadsheet, open it with PHPExcel, and audit the cells
- read rows and columns and look for conditions
- if a cell has an [informational|warning|error] condition, apply conditional formatting and insert a cell note (working fine)
- do this for all rows and columns; it runs out of memory before it completes
- save the formatted spreadsheet into a server directory so the user can download it and look at all the embedded errors, warnings, or informational items; this also works just fine if I limit the number of rows read
The spreadsheet is not especially big: 130 or so rows and 60 columns.
This is what I have tried:
- Reading in "chunks" ala Mark Baker. Problem is the filter only returns the subset of rows. Those associated with the chunk rule. So when I go to do a save at the end of a chunk read and processing, only those rows are saved, not the whole spreadsheet.
- Using cell caching:
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_discISAM;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);
This allows me to read and update every row and the associated cells, but when I go to do the $objWriter->save(str_replace('.php', '.xlsx', $theDIRandFileName)) I get a memory problem in XMLWriter.php:
Fatal error: Allowed memory size of 18874368 bytes exhausted (tried to allocate 238592 bytes) in ... /PHPExcel/Shared/XMLWriter.php on line 100
Argh!
Any assistance would be greatly appreciated. Right now it appears the only thing left to try is to open two copies, use the chunk approach, and figure out how to read from the unedited version while updating the chunked version. There has to be a better way.
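For reference, this is the shape of the chunked read filter mentioned above (the pattern Mark Baker describes); the class name and chunk size are illustrative:

class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    // keep the heading row plus only the rows of the current chunk
    public function readCell($column, $row, $worksheetName = '')
    {
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$chunkFilter = new ChunkReadFilter();
$objReader = PHPExcel_IOFactory::createReader('Excel2007');
$objReader->setReadFilter($chunkFilter);

$chunkFilter->setRows(2, 100);                    // load rows 2..101 only
$objPHPExcel = $objReader->load($inputFileName);  // $inputFileName is a placeholder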

PHPExcel - Write/append in chunks

I am using the PHPExcel framework to try to write out a very large Excel document from a MySQL query.
Everything works fine until I hit the 5000-row mark (or thereabouts), where the page gets flagged with the error:
Fatal error: Allowed memory size of xxx bytes exhausted (tried to allocate yyy bytes) in zzz on line aaa
I know this is documented, and I have adjusted the memory allocation on the server, but even so I am still hitting the ceiling. I have also tried turning off formatting, but in reality I will need it on.
So is there a way to write in small chunks, or to append to the Excel document, so that I don't exhaust the memory allocation? I am thinking along the lines of the page writing, say, 1000 lines, then redirecting to itself and processing the next 1000, using a GET parameter to keep track. For example:
index.php?p=0
then redirect to
index.php?p=1000
But I can't find a way to append to an existing document without opening up the whole thing.
There is no way of writing in chunks, although a common mistake is to load the MySQL data into an array and then loop through the array setting the Excel cell data. It is more memory-efficient to set the cell data as you loop through the MySQL resultset.
If you need to keep memory usage to a minimum, what cell caching method are you using? Cell caching is slower, but can save significant amounts of memory.
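To illustrate the first point, here is a minimal sketch of writing cell values straight from the resultset; it assumes a mysqli connection in $db and your query in $sql, both of which are placeholders:

$objPHPExcel = new PHPExcel();
$sheet  = $objPHPExcel->getActiveSheet();
$rowNum = 1;

$result = $db->query($sql);
while ($row = $result->fetch_assoc()) {
    $colNum = 0;
    foreach ($row as $value) {
        // each value goes straight into the sheet; no intermediate array builds up
        $sheet->setCellValueByColumnAndRow($colNum++, $rowNum, $value);
    }
    $rowNum++;
}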

Reading multiple files with PHPExcel

I just recently started using this library (the one from CodePlex), but I ran into some issues. My goal is to process data from multiple Excel files and send that data to a database, one file at a time. I'm doing something like:
foreach ($file_list as $file) {
    $book = PHPExcel_IOFactory::load($path . $file);
}
So, inside the foreach I'm (for now) just showing the data to the user, but after five files I get a memory error:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 50688 bytes) in /var/www/test/classes/PHPExcel/Shared/OLERead.php on line 76
Is there a way to __destruct the object after each file is loaded, so that memory is freed for the next file instead of accumulating, or do you know of a reason for this behaviour and a workaround?
Please let me know any suggestions you have.
Thanks in advance.
The latest SVN code for PHPExcel (only checked in today) introduces cell caching to reduce memory usage... it's such a new feature that I haven't even had time to document it yet.
While the default method is identical to the present behaviour, with the worksheet <--> cell relationship containing a cyclic reference, I believe that using any of the memory-reducing cache mechanisms should eliminate this cyclic reference. If not, let me know and I should be able to break the reference when unsetting a workbook/worksheet, using some of the caching logic that already disables this connection when serializing the cells for caching.
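In the meantime, a sketch of the per-file clean-up this implies: switch on one of the cache mechanisms, then unset each workbook once its data has been handled (cache_in_memory_serialized is just one example):

$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_in_memory_serialized;
PHPExcel_Settings::setCacheStorageMethod($cacheMethod);

foreach ($file_list as $file) {
    $book = PHPExcel_IOFactory::load($path . $file);
    // ... push $book's data to the database ...
    unset($book);  // with caching active, no cyclic reference should pin it in memory
}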
This has been an issue for a while, and it doesn't look like there's a way around it -- that is, unless someone has come up with something clever since the release of PHP 5.3...
"...it seems that PHP 5.3 will fix this. However, I would like to see a confirmation of this somewhere." [Oct 21 2008] (source)