PHP - Read Excel file

I want to read an Excel file with PHP row by row, because reading the entire file at once causes a memory overflow.
I have searched a lot, but no luck until now.
I think the PHPExcel library can read chunks of an Excel file when you implement the filter class, but each time it fetches a chunk it still parses the entire file, which is impractical for huge .xls files because of the time it takes.
Any help?

This may be totally out of the question, but from the information in your question the following seems like an obvious option, or at least something to consider...
I get the impression that this is a really big file that needs to be accessed often, so I would just try to import its data into a database.
There is probably no need to explain that databases are masters of performance and caching.
And it is still possible to export the contents of the database to an excel file afterwards.
MySQL works great with PHP and is certainly easier to query than an Excel file. Most PHP hosting providers offer a MySQL database by default, along with a phpMyAdmin management tool.
How to do it:
If you have phpMyAdmin installed, you can follow these simple steps.
If you have command-line access to the server, you can even import the file from the command line directly into a MySQL database.
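For the import step itself, a minimal sketch with PDO and LOAD DATA LOCAL INFILE might look like the following. This assumes the spreadsheet has first been converted to CSV; the credentials, table name, column list and CSV path are all placeholders, and LOCAL INFILE has to be enabled on both the client and the server.

<?php
// Hedged sketch: bulk-load a CSV (e.g. exported from the spreadsheet) into
// MySQL. Table name, column names, credentials and the CSV path are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8mb4',
    'user',
    'password',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // LOCAL INFILE must also be allowed server-side
);

$pdo->exec(
    "LOAD DATA LOCAL INFILE '/path/to/export.csv'
     INTO TABLE my_table
     FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     IGNORE 1 LINES
     (col_a, col_b, col_c)"
);

After that, all further processing happens in SQL rather than by re-reading the spreadsheet.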

If the only thing you need from the Excel file is its data, here is my way to read huge Excel files:
I install Gnumeric on my server, e.g. on Debian/Ubuntu:
apt-get install gnumeric
Then the PHP calls to read my Excel file and store it in a two-dimensional array (rows and columns) are incredibly simple:
system("ssconvert \"$excel_file_name\" \"temp.csv\"");
$array = array_map("str_getcsv", file("temp.csv"));
Then I can do what I want with my array. This takes less than 10 seconds for a 10 MB .xls file, about the same time it takes to open the file in my favorite spreadsheet software!
For really huge files, you should use the fopen() and fgetcsv() functions and process each row as you go, instead of building a huge array, because file() loads the whole CSV into memory. This will be slower, but it will not eat all your server's memory!
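For example, a streaming version of the same idea might look like this (a minimal sketch; the per-row processing is a placeholder):

<?php
// Stream the converted CSV row by row instead of slurping it with file().
$handle = fopen('temp.csv', 'r');
if ($handle === false) {
    die('Unable to open temp.csv');
}

while (($row = fgetcsv($handle)) !== false) {
    // $row is a numerically indexed array of the current line's columns.
    // Process it here (insert into a database, aggregate, etc.) so that
    // only one row is held in memory at a time.
}

fclose($handle);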

Related

Reading large .xls file with PHP

At the moment I am doing a mass interface of files/data, and some files are in XLS format, which I need to normalize into CSV (so basically, convert XLS files to CSV files).
The problem is that PHPExcel (and similar libraries) load the entire sheet data at once thus exhausting memory.
So far I have tried various libraries (while in the meantime negotiating to get the data delivered as CSV, though no luck so far).
I am running my tests on various large file sizes, and my memory allocation is set properly before and after my script runs, using ini_set() etc.
Is there a way that I can read an XLS file line by line or in chunks (like fgetcsv or fread), please?
I am programming this so it can work with any file size (even if it takes ages to run), as this is a fully automated system.
PS: I have already checked this post and various others:
Reading an Excel file in PHP
Possible ways...
Get help from other languages, e.g. find a Python Excel library and use it, then call Python from PHP.
Modify the source code of those Excel readers.
Use a command-line tool to convert Excel to CSV, e.g. Gnumeric's ssconvert (as in an answer above), and use the CSV in PHP.
Since an .xlsx file is just a zip archive of XML, maybe it can be unzipped and the values read directly (this does not apply to the older binary .xls format); a rough sketch follows after this list.
First decompose the XLS into many small XLS files via a non-PHP solution, e.g. VBA in Excel, then read each of them.
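For the "unzip it" option, here is a rough, hedged sketch of streaming the first worksheet of an .xlsx with XMLReader. It only works for the zip-based .xlsx format, requires the zip and xmlreader extensions, the worksheet path and file path are placeholders, and it does not resolve the shared-strings table that most real files use.

<?php
// Rough sketch for .xlsx only: stream xl/worksheets/sheet1.xml so the whole
// sheet never sits in memory. Shared strings (xl/sharedStrings.xml) are not
// resolved here, so text cells will show up as indexes rather than strings.
$reader = new XMLReader();
$reader->open('zip:///path/to/file.xlsx#xl/worksheets/sheet1.xml');

$cells = [];
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'row') {
        $cells = []; // a new spreadsheet row starts here
    } elseif ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'v') {
        $cells[] = $reader->expand()->textContent; // raw value (or shared-string index)
    } elseif ($reader->nodeType === XMLReader::END_ELEMENT && $reader->localName === 'row') {
        // ... process $cells for this row, then move on ...
    }
}

$reader->close();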

Reading very large (more than 100MB) Excel files in PHP

I'm trying to read an Excel file larger than 100 MB using PHPExcel, but it crashes while loading the file. I don't need any styling. I tried using:
$objReader->setReadDataOnly(true);
but it still crashes.
Is there any efficient way to read this size of Excel file in PHP?
Try Spout: https://github.com/box/spout.
This is a PHP library that was created to solve your problem (reading/writing large files). Here is why it works:
Other libraries keep a representation of the spreadsheet in memory, which makes them subject to out-of-memory errors. Caching strategies will help with these kinds of errors, but will hurt performance pretty badly.
Spout, on the other hand, uses streams to read or write data. This means that only one row is kept in memory at a time, and every row that has been read or written is freed from memory. This allows fast reads and writes of datasets of any size! Give it a try :)
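For reference, a minimal read loop with Spout looks roughly like this. This is a sketch based on my understanding of the Spout 3.x reader API; the file path and per-row processing are placeholders, so check the project's README for the exact class names in your version.

<?php
require 'vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

// Stream a large XLSX row by row; only the current row is held in memory.
$reader = ReaderEntityFactory::createReaderFromFile('/path/to/big-file.xlsx');
$reader->open('/path/to/big-file.xlsx');

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        $cells = $row->toArray(); // plain array of this row's values
        // ... process $cells (write to CSV, insert into a database, etc.) ...
    }
}

$reader->close();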
Spout just saved my day! I could not read a large file with PhpOffice/PhpSpreadsheet without hitting fatal "allowed memory size exhausted" errors, and with Spout it works like a charm.

how to load just a few rows of .xls file with phpexcel?

I have trouble converting a .xls (Excel) file to CSV with PHPExcel.
Everything works fine until a big file comes along; my PHP script then exceeds the memory limit and blows up. I cannot use more than 64 MB because of the constraints of the machine, which runs Apache.
We need to find a solution.
I think I have to tell PHPExcel to load just a few rows, convert them to a small CSV, save it, free the memory used, and so on with the rest of the file until it's done...
What do you think? Can we find a more precise way of doing it?
You have a few options for saving memory with PHPExcel. The main two are:
Cell caching, described in section 4.2.1 of the developer documentation. This allows you to reduce the memory overhead for each cell that is read from the file.
Chunking, described in section 4.3 of the User Documentation for Readers. This allows you to read small ranges of rows and columns from a file, rather than the whole worksheet; a sketch of this approach follows below.
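A minimal sketch of that chunked-read pattern, roughly following the PHPExcel reader documentation, looks like this. The input file name, chunk size and row-loop bound are placeholders to tune to your 64 MB limit.

<?php
require_once 'PHPExcel/IOFactory.php'; // adjust to your PHPExcel install path

// Read filter that only accepts the header row plus the rows of the current chunk.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFileName = 'big.xls'; // placeholder
$chunkSize     = 1000;      // rows per chunk; tune to your memory limit

$reader = PHPExcel_IOFactory::createReaderForFile($inputFileName);
$reader->setReadDataOnly(true);
$chunkFilter = new ChunkReadFilter();
$reader->setReadFilter($chunkFilter);

for ($startRow = 2; $startRow <= 65536; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load($inputFileName); // only this chunk's cells are loaded

    // ... append this chunk's rows to the output CSV here ...

    // Free the memory used by this chunk before loading the next one.
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
}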

PHP including large files - memory problems

I am having issues including really large files, so I have used ini_set('memory_limit', '-1');
but I still cannot include a file that is just over 1 GB. What should I do?
You will have to reconsider your architecture; PHP's include function was not designed to handle such large files; it was designed to include and evaluate a PHP code file. Without knowing what data the file actually holds, it's hard to say more; but it seems very unlikely that this file actually only holds PHP code; it sounds like you're trying to read a lot of information which is encoded in a PHP-like format.
You should, for example, try to read the file in chunks or lines instead of using include, as sketched below.
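A minimal sketch, assuming the file is line-oriented data (the path and the per-line processing are placeholders):

<?php
// Read the large file line by line instead of include()-ing it, so memory
// usage stays roughly constant regardless of file size.
$handle = fopen('/path/to/huge-file.txt', 'r');
if ($handle === false) {
    die('Unable to open file');
}

while (($line = fgets($handle)) !== false) {
    // ... parse/process $line here ...
}

fclose($handle);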
The only possible reason you would be trying to load such a file is probably to import it into a database. Each database has a file loader meant for this purpose; MySQL has LOAD DATA INFILE specifically to deal with this issue. I'd recommend getting the data into your database first, then using PHP or MySQL to transform the data to your needs. Using PHP to parse through a 1 GB file is probably not the best use of PHP or your resources.

Import large external MySQL XML dump into my own databases?

We need to import a MySQL dump from another website into ours. We've been trying SimpleXML, XML DOM, etc., but the file is so huge it's crashing our server. We looked into BigDump, but that doesn't handle XML imports. Every tag in our XML file is called <field name="something">, inside <table_something> tags, which I haven't seen before; usually the tags are descriptive custom names. This is probably because I haven't done much database importing before now.
What we would like is some way to make PHP import this huge file. It needs to be refreshed every night, so I'm thinking of dropping the tables and importing fresh, unless there is a way to detect the differences, but I wouldn't know how to do that.
Can anyone help with this? What would be the standard procedure for achieving these results?
Moving a database is usually done outside of PHP, via a command-line script:
Dump the database to a file
tar the file
FTP it to the new server
Then, on the new server:
untar the file
Import it into MySQL
Do you have shell access?
If you have to do it via PHP, you'll need to split up the dump into lots of tiny files and import them one at a time. Depending on the size of your database, this could take a really long time.
When doing the dump, minimize the file size by not using XML, stripping comments, etc.
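If the import really has to happen in PHP, a hedged sketch using XMLReader to stream the dump could look like the following. It assumes <row> elements containing <field name="...">value</field> children, as mysqldump --xml produces; the file path is a placeholder, the element names may need adjusting to match the actual dump, and the INSERT logic is left as a comment.

<?php
// Sketch: stream a huge XML dump with XMLReader so it is never loaded whole.
$reader = new XMLReader();
$reader->open('/path/to/dump.xml');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'row') {
        // Expand only this <row> into DOM and collect its fields.
        $node = $reader->expand();
        $fields = [];
        foreach ($node->getElementsByTagName('field') as $field) {
            $fields[$field->getAttribute('name')] = $field->textContent;
        }
        // ... INSERT $fields into the appropriate table via a prepared statement ...
    }
}

$reader->close();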
