Welcome,
Does anyone have any resources about creating large zip files (one file, not a folder) in PHP?
Without using shell access to the "zip" application.
Low memory usage (I can't use gzcompress) because the file is too large to do this in RAM.
I have an unzip function that works perfectly on a low-memory system.
Here is the unzip:
http://paste-it.net/public/wdb61dc/
Regards
If the deflate algorithm is too memory-intensive for you, then there isn't really a good way to do what you want. You can try turning down the compression.
Can you give us an idea of how large the file is and how much memory you want to work with?
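In the meantime, one low-memory angle worth testing is skipping deflate entirely and storing the entry uncompressed with ZipArchive. This is only a sketch, assuming the zip extension (libzip) is installed and PHP >= 7.0 for setCompressionName(); actual memory behaviour depends on your libzip build, so verify it on your system:

<?php
// Sketch: build a zip with a stored (uncompressed) entry so deflate never runs in PHP.
// The file paths here are placeholders.
$zip = new ZipArchive();
if ($zip->open('backup.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Cannot create archive');
}
// The source file is read from disk by libzip, not loaded into a PHP string by us.
$zip->addFile('/path/to/huge-file.sql', 'huge-file.sql');
// Store the entry without compression (PHP >= 7.0), skipping the costly deflate step.
$zip->setCompressionName('huge-file.sql', ZipArchive::CM_STORE);
$zip->close(); // libzip writes the archive out when closing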
I've got to get some (potentially) very large files uploaded to my S3 bucket in a Laravel Job I am building out. I am getting the dreaded "Allowed memory size of ### bytes exhausted" error, and I have no interest in increasing the memory limit in php.ini (simply because I don't know how large some of these files will get, and at some point I need to quit running away from these large files by increasing memory_limit to ridiculous levels).
The question is: Does Laravel make chunking this thing easy? Is there a function I am not seeing that I can use?
I know the answer is probably no, but Laravel makes SO many things easy for me, I figured I might ask to see if I was missing something in my Googling.
If this does not exist in Laravel, what should I do? I know that I need to take the file into memory a chunk at a time, but I have no idea where to start on that.
Thanks!
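One thing worth checking before rolling your own chunking: Flysystem, which backs Laravel's Storage facade, accepts a stream resource instead of a string, so only a buffer sits in memory rather than the whole file. A minimal sketch, assuming the 's3' disk is configured in config/filesystems.php and the file is already on local disk (the path below is hypothetical):

<?php
// Sketch: stream a local file to S3 instead of loading it into memory first.
use Illuminate\Support\Facades\Storage;

$localPath = storage_path('app/uploads/big-video.mp4'); // hypothetical path

$stream = fopen($localPath, 'r');
Storage::disk('s3')->put('videos/big-video.mp4', $stream); // Flysystem streams the resource

if (is_resource($stream)) {
    fclose($stream);
}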
I'm trying to read an Excel file larger than 100MB using PHPExcel, but it crashes while loading the file. I don't need any styling. I tried using:
$objReader->setReadDataOnly(true);
but it still crashes.
Is there any efficient way to read this size of Excel file in PHP?
Try Spout: https://github.com/box/spout.
This is a PHP library that was created to solve your problem (reading/writing large files). Here is why it works:
Other libraries keep a representation of the spreadsheet in memory, which makes them subject to out-of-memory errors. Using caching strategies will help with these kinds of errors but will hurt performance pretty badly.
On the other hand, Spout uses streams to read or write data. This means that only one row is kept in memory at any time, and every read/written row is then freed from memory. This allows fast reads/writes of datasets of any size! Give it a try :)
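As a rough illustration, reading looks like this with Spout 3's reader API (a sketch, assuming an .xlsx input file and the box/spout package installed via Composer; check the docs for the version you actually install, since older versions use a different factory):

<?php
// Sketch: iterate rows one at a time so memory use stays flat regardless of file size.
require 'vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

$reader = ReaderEntityFactory::createXLSXReader();
$reader->open('/path/to/big-file.xlsx'); // hypothetical path

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        $cells = $row->toArray(); // only this one row is in memory
        // ... process $cells ...
    }
}

$reader->close();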
Spout just saved my day! I couldn't read a large file with PhpOffice/PhpSpreadsheet without hitting fatal "Allowed memory size" errors, but with Spout it works like a charm.
Hi, I wanted to know if uploading large files like videos (over 200 MB - 1 GB) with PHP is a good option after setting up the server configuration (post_max_size, execution time, etc.). The reason I ask is that I read somewhere that when a large file is uploaded, best practice is to break that file into chunks and upload it piece by piece (I think YouTube does that). Do I need to use another language like Python or C++ for uploading large files, or is PHP enough? If I do need to use another language, can anyone please point me to reading material for it?
Thank you.
PHP will hold the entire file in memory while the upload is happening. That means that if you are uploading five 1 GB files in parallel, at the very most you will need 5 GB+ of memory.
This can be done in PHP, and I have done this using a chunking method. There are several SO questions on this topic:
File uploads; How to utilize “chunking”?
Upload 1GB files using chunking in PHP
But my personal preference is to use plupload. It is a very complete cross-platform (JS, Flash, Silverlight) upload script with a nice PHP code sample to handle chunking.
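For context, a bare-bones server-side handler for chunked uploads might look roughly like this. This is only a sketch, assuming the client (e.g. plupload) posts a "file" part along with "chunk" and "chunks" parameters; the sample that ships with plupload is more robust:

<?php
// Sketch: append each uploaded chunk to a partial file until the last chunk arrives.
$targetDir  = __DIR__ . '/uploads'; // hypothetical destination directory
$fileName   = basename($_REQUEST['name'] ?? $_FILES['file']['name']);
$chunk      = isset($_REQUEST['chunk'])  ? (int) $_REQUEST['chunk']  : 0;
$chunks     = isset($_REQUEST['chunks']) ? (int) $_REQUEST['chunks'] : 1;
$targetPath = $targetDir . '/' . $fileName . '.part';

// Open the partial file for appending (truncate on the first chunk).
$out = fopen($targetPath, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');

// Copy this chunk without ever holding the whole upload in memory.
stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);

// When the final chunk has been written, move the file into place.
if ($chunk === $chunks - 1) {
    rename($targetPath, $targetDir . '/' . $fileName);
}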
It's not only PHP that needs to be considered for large file uploads; your web server also needs to support them (in nginx, at least, the request body size limit, client_max_body_size, has to be raised). I don't know how httpd handles that, but as you said, splitting into chunks is a viable solution. FTP is another option.
I'm developing a webapp in PHP, and the core library is 94kb in size at this point. While I think I'm safe for now, how big is too big? Is there a point where the script's size becomes an issue, and if so, can this be ameliorated by splitting the script into multiple libraries?
I'm using PHP 5.3 and Ubuntu 10.04 32bit in my server environment, if that makes any difference.
I've googled the issue, and everything I can find pertains to PHP upload size only.
Thanks!
Edit: To clarify, the 94kb file is a single file that contains all my data access and business logic, and a small amount of UI code that I have yet to extract to its own file.
Do you mean you have one file that is 94KB in size, or that your whole library is 94KB in total?
Regardless, as long as you aren't piling everything into one file and you're organizing your library into different files, your file sizes should remain manageable.
If a single PHP file is starting to hit a few hundred KB, you have to think about why that file is getting so big and refactor the code to make sure that everything is logically organized.
I've used PHP applications that probably included several megabytes' worth of code; the main thing, if you have big programs, is to use a code caching tool such as APC on your production server. That will cache the compiled (byte code) PHP so it doesn't have to parse every file on every page request, and it will dramatically speed up your code.
I need to download a very large file via PHP; the last time I did it manually via HTTP it was 2.2 GB in size and took a few hours to download. I would like to automate the download somehow.
Previously I have used:
file_put_contents($filename, file_get_contents($url));
Will this be OK for such a large file? I will want to untar the file after downloading and then perform analysis of the various files inside the tarball.
file_get_contents() is handy for small files, but it's totally unsuitable for large files. Since it loads the entire file into memory, you would need something like 2GB of RAM for each script instance!
You should resort to good old fopen() + fread() instead.
Also, don't rule out using a third-party download tool like wget (installed by default on many Linux systems) and creating a cron task to run it. It's possibly the best way to automate a daily download.
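As a rough sketch of the fopen() + fread() approach (the URL, destination path, and chunk size are placeholders, and allow_url_fopen must be enabled for http:// streams), the loop below copies the remote file to disk without ever holding more than one chunk in memory:

<?php
// Sketch: stream a remote file to disk in 1 MB chunks so memory use stays flat.
$url      = 'http://example.com/huge-archive.tar'; // placeholder
$filename = '/tmp/huge-archive.tar';               // placeholder

$src = fopen($url, 'rb');
$dst = fopen($filename, 'wb');

if ($src === false || $dst === false) {
    die('Could not open source or destination');
}

while (!feof($src)) {
    fwrite($dst, fread($src, 1024 * 1024)); // 1 MB at a time
}

fclose($src);
fclose($dst);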
You will have to adapt your php.ini to accept larger file uploads, and adjust your memory limit.