Is there an Apache equivalent of nginx mod_zip? - php

I have been working on a web app where I need to let the user select a number of files and download them as a zip file. I am working with lots of data, so storing the zip file in memory or on disk isn't an option.
I am currently using Apache and haven't been able to find any solution for dynamically creating and streaming zip files to a client. One thing I did find is nginx's mod_zip, which seems to do exactly what I want.
What would be an Apache equivalent of mod_zip, or another way to create and stream zip files on the fly (without using disk space or loading the whole archive into memory)?
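One approach that keeps the archive off the disk and out of memory is to pipe the output of the zip command-line tool straight to the client, chunk by chunk. A minimal sketch, assuming the zip binary is installed and the selected paths have already been validated (the file names here are placeholders):

<?php
// Files the user selected; validate them against a whitelist before this point.
$files = ['/data/export/report-1.csv', '/data/export/report-2.csv'];

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="download.zip"');

// "-" tells zip to write the archive to stdout; -j stores files without their paths.
$cmd = 'zip -q -j - ' . implode(' ', array_map('escapeshellarg', $files));

$proc = popen($cmd, 'r');
if ($proc === false) {
    http_response_code(500);
    exit;
}

// Turn off PHP output buffering so the archive is relayed as it is produced.
while (ob_get_level() > 0) {
    ob_end_flush();
}

while (!feof($proc)) {
    echo fread($proc, 8192);
    flush();
}
pclose($proc);

If shelling out is not an option, libraries such as ZipStream-PHP implement the same streaming idea in pure PHP.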

Related

Serving ZIP file contents using PHP without unzipping to a temporary location

I have an application that needs to serve the contents of a series of relatively small ZIP files at URLs within the application, but I do not want to unzip the files to some temporary location and serve them from there. If I have a file called test123.zip which contains two files, hello.png and world.png, I want the following URLs to work:
https://www.tsugi.org/livezip/test123/hello.png
https://www.tsugi.org/livezip/test123/world.png
For each request, I will read the ZIP file, extract the requested entry, and serve it.
There are lots of constraints that make extracting once and serving from disk less than ideal for my application, which is a homework grading and annotation system. Users will be uploading thousands of limited-size ZIPs per day, each ZIP might be read 3 times at most, and each ZIP will be discarded after 3-4 days. There is strict authorization on who can see each ZIP file and its contents. That is a lot of churn in some temporary space where I unzip and temporarily hold the files. I would rather not build a bunch of infrastructure to do cleanup if I don't have to, and I don't want to fill up my temporary space if possible. This also runs in a multi-server, load-balanced environment, and I don't want to go through the effort of making my temporary space an S3 bucket or networked drive...
So, let's leave the assertion that I should not do this out of the answers. I am looking for a library that helps me do this in PHP, or even a practice that could handle it. I have already started to write it - I just wish I could use a library instead.
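For what it's worth, PHP can read a single entry out of a ZIP without extracting the whole archive, either through the zip:// stream wrapper or ZipArchive::getStream(). A rough sketch of the per-request handler, assuming routing and authorization have already resolved the archive and entry names (the paths here are placeholders):

<?php
$archive = '/data/zips/test123.zip';    // resolved from the URL after auth checks
$entry   = 'hello.png';

$zip = new ZipArchive();
if ($zip->open($archive) !== true) {
    http_response_code(404);
    exit;
}

$stream = $zip->getStream($entry);      // read the entry without extracting it
if ($stream === false) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/png');      // or derive the type from the entry name
fpassthru($stream);                     // stream the entry straight to the client
fclose($stream);
$zip->close();

Since each entry is decompressed on every request, this trades a little CPU per hit for having no temporary files to clean up, which seems to match the read-a-few-times-then-discard access pattern described above.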

PHP FPM download speed on a backup system

I'm making a backup system for a company and I need to understand why I can't get better download speeds using PHP.
The files are on the web server and I need to bring them to the backup server. The problem is that, using wget, I can download them to the backup server at 50 Mbps (the network limit), but using PHP's file_put_contents I only get about 2 Mbps for a single file, and when I try to download around 50 files at the same time they drop to about 50 kbps each...
Since I'm downloading about 50 TB of content, and each file is about 800 MB-1.2 GB, it would take months this way.
I'm using nginx with PHP-FPM and, as far as I can tell, the configs are fine everywhere: no limits, no timeouts, etc.
The code I'm using is basically this example, except that I'm updating the bytes downloaded in MySQL:
https://www.php.net/manual/en/function.stream-notification-callback.php
Could this problem be related to file_put_contents performance? Is there a way to get better download speeds?
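One thing that may be worth testing is whether the bottleneck is the stream-notification approach itself rather than the network. As a comparison point, cURL can write the response straight to a file handle and report progress through a callback, which avoids holding the file in memory and lets you throttle how often the MySQL counter is updated. A rough sketch (the URL, path, and progress handling are placeholders):

<?php
$url  = 'https://webserver.example.com/backups/file-0001.tar';   // placeholder
$dest = '/backups/file-0001.tar';                                // placeholder

$fp = fopen($dest, 'wb');
$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_FILE           => $fp,            // write the body straight to disk
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_BUFFERSIZE     => 512 * 1024,     // larger read buffer
    CURLOPT_NOPROGRESS     => false,
    CURLOPT_PROGRESSFUNCTION => function ($ch, $dlTotal, $dlNow, $ulTotal, $ulNow) {
        // Update the bytes-downloaded row in MySQL here, but throttle it
        // (e.g. once per second); a query for every small chunk slows the transfer.
        return 0;                              // non-zero aborts the transfer
    },
]);
curl_exec($ch);
curl_close($ch);
fclose($fp);

It may also help to run the copy job from the CLI (php-cli or cron) rather than through nginx and PHP-FPM, so the transfer is not tied to an HTTP request at all.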

Uploading common required file via FTP without affecting website?

Let's say I have a file common.php used by many pages on my website. Now I want to update the file via FTP, so there will be around 1-2 seconds where the file is not available or is still only partially uploaded.
During that time, require('common.php') reports an error, so the website does not load properly.
How can I solve cases like this?
Thanks!
You can upload the file with a different name and rename it only after the upload completes. That minimizes the downtime.
Some clients can even do this automatically, which further minimizes the downtime.
For example, the WinSCP SFTP/FTP client supports this, but only over the SFTP protocol, if that's an option for you.
In WinSCP preferences, enable Transfer to temporary filename for All files.
WinSCP will then upload all files with a temporary .filepart extension, overwriting the target file only after the upload finishes.
(I'm the author of WinSCP)
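If the upload is scripted rather than done through a GUI client, the same pattern (upload under a temporary name, then rename once complete) can be done with PHP's FTP functions. A minimal sketch with placeholder host, credentials, and paths:

<?php
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');
ftp_pasv($conn, true);

// Upload under a temporary name so the live common.php is never half-written.
ftp_put($conn, '/site/common.php.tmp', 'common.php', FTP_BINARY);

// Swap it in only after the upload has finished.
ftp_rename($conn, '/site/common.php.tmp', '/site/common.php');

ftp_close($conn);

Note that some FTP servers refuse to rename over an existing file; if so, delete the old file just before the rename, which still shrinks the unavailable window to a fraction of a second.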

FTP Issues Downloading Large Directory

When I use FireFTP (or other FTP clients, for that matter) to download large directories, the download gets messed up. It seems to run endlessly while showing a nearly complete percentage, and then the status changes to a percentage much further from completion. So what I usually have to do with large directories is SSH into the host, zip or tar the directory, and then download the archive. Is there a reason for and/or a solution to this?
Configure the client to use passive mode. Then try again.

Big XML files over FTP

On my FTP server there is a 4 GB XML file, and I want to put the data from that file into a database using PHP.
I know how to connect to FTP and do basic operations with PHP, but my question is: is it possible to do this without having to download the file first?
Unfortunately no, you cannot "stream" a file over FTP the way you could on, say, a network drive. It's not possible to open the file in place without downloading it locally first.
That is assuming you can only access the file via FTP.
If your FTP server and PHP server are one and the same, you just need to change the path to reference the FTP location directly rather than wherever you are downloading to.
If they are on the same local network, you may be able to use a network path to reach the file.
Otherwise, you will indeed need to download the entire file first.
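Whichever way the file ends up on a locally readable path, a 4 GB XML file should then be parsed incrementally rather than loaded whole. A minimal sketch with XMLReader, where the element name and the database call are placeholders:

<?php
$reader = new XMLReader();
$reader->open('/tmp/import.xml');        // local copy of the 4 GB file

while ($reader->read()) {
    // Handle one record element at a time so memory use stays flat.
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'record') {
        $record = simplexml_load_string($reader->readOuterXML());
        // save_record_to_database($record);   // hypothetical helper
    }
}

$reader->close();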
