PHP ftp_fget simultaneous downloading - php

I need to download a bunch of files using FTP.
I am allowed up to 5 connections. I can use FileZilla to download the files pretty quick but I would like this to be done using PHP for various reasons.
Is it possible to download files simultaneously this way instead of going from file to file? Is there a difference, download-speed-wise, in creating multiple connections? I need them downloaded as quickly as possible.

To download more than one file at a time, use ftp_nb_fget() to start each transfer (note: fget, not fput, since you are downloading), then run a loop that calls ftp_nb_continue() for each handle, alternating between them.
You will still be limited to the maximum bandwidth available to you, so simultaneous downloads may not be any faster.
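A minimal sketch of that pattern, assuming placeholder host, credentials, and file names; it opens one connection per file (up to the limit of 5) and round-robins ftp_nb_continue() across the connections until every transfer reports FTP_FINISHED:

    <?php
    // Hypothetical host, credentials, and file list.
    $host  = 'ftp.example.com';
    $user  = 'user';
    $pass  = 'secret';
    $files = ['a.zip', 'b.zip', 'c.zip', 'd.zip', 'e.zip']; // at most 5 at once

    $transfers = [];
    foreach ($files as $remote) {
        $ftp = ftp_connect($host);
        ftp_login($ftp, $user, $pass);
        ftp_pasv($ftp, true);

        $local = fopen(basename($remote), 'wb');
        // Start a non-blocking download; FTP_MOREDATA means "still in progress".
        $state = ftp_nb_fget($ftp, $local, $remote, FTP_BINARY);
        $transfers[] = ['ftp' => $ftp, 'fh' => $local, 'state' => $state];
    }

    // Round-robin: give each connection a slice of work until all are done.
    do {
        $pending = false;
        foreach ($transfers as &$t) {
            if ($t['state'] === FTP_MOREDATA) {
                $t['state'] = ftp_nb_continue($t['ftp']);
                $pending = $pending || ($t['state'] === FTP_MOREDATA);
            }
        }
        unset($t);
    } while ($pending);

    foreach ($transfers as $t) {
        fclose($t['fh']);
        ftp_close($t['ftp']);
    }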

I think you can do this with cURL: http://php.net/manual/en/book.curl.php
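For example, the curl extension can drive several FTP downloads in parallel through a curl_multi handle; the URLs below are placeholders:

    <?php
    // Hypothetical ftp:// URLs; curl speaks FTP natively.
    $urls = [
        'ftp://user:secret@ftp.example.com/files/a.zip',
        'ftp://user:secret@ftp.example.com/files/b.zip',
    ];

    $multi   = curl_multi_init();
    $handles = [];

    foreach ($urls as $url) {
        $fh = fopen(basename(parse_url($url, PHP_URL_PATH)), 'wb');
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_FILE, $fh); // write straight to disk
        curl_multi_add_handle($multi, $ch);
        $handles[] = ['ch' => $ch, 'fh' => $fh];
    }

    // Drive all transfers until none are still running.
    do {
        $status = curl_multi_exec($multi, $running);
        if ($running) {
            curl_multi_select($multi); // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    foreach ($handles as $h) {
        curl_multi_remove_handle($multi, $h['ch']);
        curl_close($h['ch']);
        fclose($h['fh']);
    }
    curl_multi_close($multi);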

Related

PHP FPM download speed on a backup system

I'm building a backup system for a company and I need to understand why I can't get better download speeds using PHP.
The files are on the web server and I need to bring them over to the backup server. The problem is that using wget I can download them to the backup server at 50 Mbps (the network limit), but using PHP's file_put_contents I only get about 2 Mbps for a single file, and when I try to download around 50 files at the same time they drop to about 50 kbps each...
Since I'm downloading about 50 TB of content, with each file around 800 MB to 1.2 GB, it would take months this way.
I'm using NGINX with PHP-FPM and the configs are correct everywhere: no limits, no timeouts, etc.
The code I'm using is basically this example, except that I also record the bytes downloaded in MySQL:
https://www.php.net/manual/en/function.stream-notification-callback.php
Could this problem be related to file_put_contents performance? Is there a way to get better download speeds?
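For context, the pattern from that manual page boils down to something like the following; the URL and destination path are placeholders, and the MySQL update would go inside the notification callback:

    <?php
    // Hypothetical source and destination.
    $src = 'http://webserver.example.com/backups/file0001.bin';
    $dst = '/backup/file0001.bin';

    // Progress callback; this is where the bytes-downloaded counter
    // would be written to MySQL.
    function progress($code, $severity, $msg, $msgCode, $bytes, $bytesMax) {
        if ($code === STREAM_NOTIFY_PROGRESS) {
            // e.g. UPDATE downloads SET bytes = $bytes WHERE ...
        }
    }

    $ctx = stream_context_create([], ['notification' => 'progress']);

    // The whole body is pulled through a single PHP stream copy into the file.
    file_put_contents($dst, fopen($src, 'rb', false, $ctx));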

Trying to decrease file download time in PHP

Currently I have a backend application that downloads one file per request. The file is not delivered to an end user in a web browser; I'm trying to find ways of improving performance when downloading from an FTP server.
I connect to the FTP server in question using ftp_ssl_connect, and when I fetch the file I do so with ftp_fget in FTP_BINARY mode.
For files of 8 MB or less (they are zip archives, by the way) the performance is fine, but I now have files that range from 30 MB to 300 MB. I understand that "it takes time" to obtain files, and that the larger the file, the more time required, etc.
To increase performance (or maybe just to reduce strain on the system) I'd like to do the FTP download of large files in chunks. So far I haven't seen much in the PHP FTP function examples for this. I've read about the non-blocking FTP getter functions, but I'm not trying to solve the problem of "forcing a user to wait for the file to finish", so I'm OK with the server doing whatever it needs to do to get that file.
Should I instead consider using cURL, as shown in this SO example? Using CURL/PHP to download from FTP in chunks to save memory - Stack Overflow
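(For reference, the chunked approach from that linked answer looks roughly like this with the curl extension; the host, credentials, file size, and chunk size below are assumptions, and the FTP server has to honor ranged/REST transfers.)

    <?php
    // Hypothetical connection details, file size, and chunk size.
    $url       = 'ftp://user:secret@ftp.example.com/archives/big.zip';
    $dest      = 'big.zip';
    $chunkSize = 8 * 1024 * 1024;   // fetch 8 MB per request
    $totalSize = 300 * 1024 * 1024; // in practice, get this from ftp_size() first

    $out = fopen($dest, 'wb');

    for ($start = 0; $start < $totalSize; $start += $chunkSize) {
        $end = min($start + $chunkSize - 1, $totalSize - 1);

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_FILE, $out);              // append to the same open handle
        curl_setopt($ch, CURLOPT_RANGE, "$start-$end");    // only this byte range
        curl_setopt($ch, CURLOPT_USE_SSL, CURLUSESSL_ALL); // explicit FTPS, like ftp_ssl_connect
        curl_exec($ch);
        curl_close($ch);
    }

    fclose($out);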
Alternatively, can anyone recommend an existing package from https://packagist.org?
Thanks,
Adam

Serving large file downloads from remote server

We have files that are hosted on RapidShare which we would like to serve through our own website. Basically, when a user requests http://site.com/download.php?file=whatever.txt, the script should stream the file from RapidShare to the user.
The only thing I'm having trouble getting my head around is how to properly stream it. I'd like to use cURL, but I'm not sure if I can read the download from RapidShare in chunks and then echo them to the user. The best way I've thought of so far is to use a combination of fopen, fread, echoing each chunk of the file to the user, flushing, and repeating that process until the entire file is transferred.
I'm aware of the PHP readfile() function as well, but would that be the best option? Bear in mind that these files can be several GB in size, and although we have servers with 16 GB RAM I want to keep the memory usage as low as possible.
Thank you for any advice.
HTTP has a header called "Range" which basically allows you to fetch any chunk of a file (assuming you already know the file size), but since PHP isn't multi-threaded, I don't see much benefit in using it here.
As far as I know, if you don't want to consume all your RAM, the only way to go is a two-step approach.
First, stream the remote file using fopen()/fread() (or any PHP functions that let you work with streams), reading it in small chunks (2048 bytes may be enough) and appending each chunk to a tempfile(); then echo it back to your user by reading the temporary file.
That way, even a 2 TB file basically consumes only about 2048 bytes of memory, since only the current chunk and the file handles are held at any time.
You could also write some kind of proxy manager that caches already-downloaded files locally for a given time, so a heavily downloaded file doesn't have to be fetched from the remote server every time.
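A minimal sketch of that two-step approach, with a placeholder remote URL; only one small chunk is held in memory at any time:

    <?php
    // Hypothetical remote file requested via download.php?file=...
    $remote = 'http://rapidshare.example.com/files/whatever.txt';

    // Step 1: stream the remote file into a temporary file in small chunks.
    $in  = fopen($remote, 'rb');
    $tmp = tmpfile();
    while (!feof($in)) {
        fwrite($tmp, fread($in, 8192)); // 8 KB per iteration
    }
    fclose($in);

    // Step 2: send the temporary copy to the user, again without loading it whole.
    $size = fstat($tmp)['size'];
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $size);
    header('Content-Disposition: attachment; filename="whatever.txt"');

    rewind($tmp);
    fpassthru($tmp); // streams the handle directly to the client
    fclose($tmp);    // tmpfile() is deleted automatically on close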

Upload big files with PHP and FTP

I need to write a script to upload big files (~2 GB+) to a server.
I don't think HTTP is the right way to do this so I want to use (S)FTP.
There are several tutorials about this (using cURL or ftp_connect) and I understand that I have to set several things in php.ini.
But all these tutorials upload the file to a remote server; what I want to do is upload it to the server the script is running on, without having to upload the file to that server over HTTP first.
Is this possible? If so, how would I do that?
HTTP can be the right way to upload large files. You can use resumable.js or similar library to split the file in "chunks" and then reassemble the file on the server.
If you decide not to go with HTTP and have shell access, I recommend you use rsync (with the --partial flag), which will do the heavy lifting for you.
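The server side of the chunked approach can be as small as an endpoint that appends each incoming chunk to the target file. A rough sketch follows; the parameter names and upload path are made up, and resumable.js itself posts chunks as multipart form data with its own parameter names, so treat this only as the general shape:

    <?php
    // Hypothetical chunk-receiving endpoint: POST /upload.php?name=big.iso&chunk=7
    $name  = basename($_GET['name']); // never trust raw client paths
    $chunk = (int) $_GET['chunk'];
    $dest  = '/data/uploads/' . $name;

    // Append this chunk's raw body to the file being assembled.
    $in  = fopen('php://input', 'rb');
    $out = fopen($dest, $chunk === 0 ? 'wb' : 'ab');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    http_response_code(200);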

Upload 1GB files using chunking in PHP

I have a web application that accepts file uploads of up to 4 MB. The server-side script is PHP and the web server is NGINX. Many users have requested that this limit be increased drastically to allow uploads of video etc.
However, there seems to be no easy solution for this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I can stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file) but I can not find any equivalent of request.getInputStream in PHP.
Increasing the browser/client post limits, the php.ini upload limits, or the max execution time is not really a solution for very large files (~1 GB): the browser may time out, and think of all those blobs sitting in memory.
Is there any way to solve this problem using PHP on server side? I would appreciate your replies.
plupload is a JavaScript/PHP library; it's quite easy to use and allows chunking.
It uses HTML5, though.
Take a look at the tus protocol, which is an HTTP-based protocol for resumable file uploads, so you can carry on where you left off without re-uploading the whole file after an interruption. The protocol has also been adopted by Vimeo since May 2017.
You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client called uppy and a Go- or PHP-based implementation on the server.
"but I can not find any equivalent of request.getInputStream in PHP. "
fopen('php://input'); perhaps?
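Right: php://input exposes the raw request body as a stream, so it can be read incrementally much like getInputStream. A small sketch, with an assumed target path:

    <?php
    // Read the raw POST body in 8 KB pieces instead of loading it all at once.
    $in  = fopen('php://input', 'rb');
    $out = fopen('/data/uploads/upload.part', 'ab'); // hypothetical target file

    while (!feof($in)) {
        fwrite($out, fread($in, 8192));
    }

    fclose($in);
    fclose($out);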
I have created a JavaFX client to send large files in chunks of the max post size (I am using 2 MB) and a PHP receiver script to assemble the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/
Feel free to use/modify/distribute.
Try using the bigupload script. It is very easy to integrate and can upload up to 2 GB in chunks. The chunk size is customizable.
How about using a Java applet for the upload and PHP for the processing?
You can find an example for JUpload here:
http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example
You can use this package; it supports resumable chunked uploads.
In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume uploads that did not complete.
You can definitely write a web app that will accept a block of data (even via a POST) and then append that block of data to a file. It seems to me that you need some kind of client-side app that will take a file, break it up into chunks, and send them to your web service one chunk at a time. However, it seems a lot easier to create an SFTP directory and let clients just SFTP the files up using some pre-existing client app.
