I'm building a backup system for a company and I need to understand why I can't get better download speeds using PHP.
The files are on the web server and I need to bring them over to the backup server. The problem is: using wget I can pull them to the backup server at 50 Mbps (the network limit), but with PHP's file_put_contents I only get around 2 Mbps for a single file, and when I try to download around 50 files at the same time they drop to about 50 kbps each...
Since I'm downloading about 50 TB of content, with each file around 800 MB to 1.2 GB, it would take months at this rate.
I'm using NGINX with PHP-FPM, and as far as I can tell the configs are fine everywhere: no limits, no timeouts, etc.
The code I'm using is basically this example, except that I also update the number of bytes downloaded in MySQL:
https://www.php.net/manual/en/function.stream-notification-callback.php
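Stripped down, it has roughly this shape (the URL, destination path and MySQL helper here are placeholders, not my exact code):

```php
<?php
// Progress callback wired into the stream context, as in the manual example.
function stream_notification_callback($code, $severity, $message, $message_code, $bytes_transferred, $bytes_max)
{
    if ($code === STREAM_NOTIFY_PROGRESS) {
        // e.g. UPDATE downloads SET bytes = ? WHERE id = ?
        update_progress_in_mysql($bytes_transferred); // hypothetical helper
    }
}

$ctx = stream_context_create(null, ['notification' => 'stream_notification_callback']);

$src = 'https://webserver.example.com/files/archive-0001.bin'; // placeholder
$dst = '/backup/archive-0001.bin';                             // placeholder

// fopen() returns a stream, so file_put_contents() copies it to disk as it arrives.
file_put_contents($dst, fopen($src, 'r', false, $ctx));
```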
Could this problem be related to file_put_contents performance? Is there a solution to get better download speeds?
Related
Currently I have a backend application that downloads one file per request. The file is not downloaded to the end user in a web browser; I'm trying to find ways of improving performance when downloading from an FTP server.
I connect to the FTP server in question using ftp_ssl_connect, and when I fetch a file for download I do so with ftp_fget in FTP_BINARY mode.
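In condensed form, the current code looks something like this (host, credentials and file names are placeholders):

```php
<?php
// Current blocking approach: one ftp_fget() call per file.
$conn = ftp_ssl_connect('ftp.example.com');          // explicit FTPS
ftp_login($conn, 'user', 'secret');
ftp_pasv($conn, true);                               // passive mode (assumed)

$local = fopen('/tmp/archive.zip', 'wb');
ftp_fget($conn, $local, 'archive.zip', FTP_BINARY);  // blocks until the whole file has arrived

fclose($local);
ftp_close($conn);
```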
For files of 8 MB or less (they are ZIP archives, by the way) the performance is fine, but I now have files ranging from 30 MB to 300 MB. I understand that it takes time to obtain files and that the larger the file, the more time required, etc.
To increase performance (or maybe just to reduce the strain on the system), I'd like to download large files over FTP in chunks. So far I haven't seen much in the PHP FTP function examples for this. I've read about the non-blocking FTP getter functions, but I'm not really trying to solve the problem of "forcing a user to wait for the file to finish", so I'm fine with the server doing whatever it needs to do to get the file.
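From my reading so far, the non-blocking variant would look roughly like this (untested sketch, same placeholders as above):

```php
<?php
$conn = ftp_ssl_connect('ftp.example.com');
ftp_login($conn, 'user', 'secret');
ftp_pasv($conn, true);

$local = fopen('/tmp/archive.zip', 'wb');

// ftp_nb_fget() starts the transfer and returns immediately;
// each ftp_nb_continue() call pulls the next chunk, so other work
// (logging, progress updates, throttling) can happen in between.
$ret = ftp_nb_fget($conn, $local, 'archive.zip', FTP_BINARY);
while ($ret === FTP_MOREDATA) {
    // ... do something between chunks ...
    $ret = ftp_nb_continue($conn);
}
if ($ret !== FTP_FINISHED) {
    echo "Download failed\n";
}

fclose($local);
ftp_close($conn);
```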
Should I instead consider using cURL, as in this SO example? Using CURL/PHP to download from FTP in chunks to save memory - Stack Overflow
Alternatively, can anyone recommend an existing package from https://packagist.org?
Thanks,
Adam
I'm serving downloads on Apache using X-Sendfile from PHP. It works great, but the problem I'm anticipating is that when users download large (>18 GB) files, their PC's power-saving options might interrupt the download. Is there any solution to this besides splitting the file into pieces and making users download them one by one? Is there download-manager software out there that I could implement on our server?
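For reference, the handler is roughly of this shape (path and filename are placeholders):

```php
<?php
// mod_xsendfile intercepts the X-Sendfile header and streams the file itself,
// so PHP never has to read the 18 GB file into memory.
$file = '/data/downloads/huge-archive.bin';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="huge-archive.bin"');
header('X-Sendfile: ' . $file);
exit;
```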
When I use FireFTP (or other FTP clients, for that matter) to download large directories, the download gets messed up. It seems to run endlessly while showing a nearly completed percentage; then the status jumps back to a percentage much further from completion. So what I usually have to do with large directories is SSH into the host, zip or tar the directory, and download the archive instead. Is there a reason for this, and/or a solution?
Configure the client to use passive mode. Then try again.
I'm currently wondering how I can back up a folder containing 8,000+ images without the script timing out. The folder holds around 1.5 GB of data in all, which we need to back up ourselves every so often.
I have tried the ZIP functionality provided by PHP, but the request simply times out because of the huge number of files to back up; it does work for smaller batches.
I'm currently running this script via an HTTP request; would running it from a cron job avoid the timeout?
Does anyone have any recommendations?
I would not use PHP for that.
If you are on Linux, I would set up a cron job to run a program like rsync periodically.
A nice introduction to rsync.
Edit: If you do want/need to go the PHP route, you could also consider plain copying instead of using ZIP. ZIP compression normally doesn't gain much on images, and if you already have a database, you can check your current directory against it and do a differential backup (copy only the new files). That way only your initial backup would take a long time.
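A rough sketch of that idea (table and column names are made up, adjust to your schema):

```php
<?php
// Copy only the files that have not been backed up yet, according to the database.
$pdo   = new PDO('mysql:host=localhost;dbname=site', 'user', 'secret');
$known = array_flip($pdo->query('SELECT filename FROM backed_up_images')->fetchAll(PDO::FETCH_COLUMN));

$src = '/var/www/uploads';   // placeholder paths
$dst = '/backup/uploads';

foreach (new DirectoryIterator($src) as $file) {
    if ($file->isFile() && !isset($known[$file->getFilename()])) {
        // New file: copy it and record it so the next run skips it.
        copy($file->getPathname(), $dst . '/' . $file->getFilename());
        $pdo->prepare('INSERT INTO backed_up_images (filename) VALUES (?)')
            ->execute([$file->getFilename()]);
    }
}
```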
You could post the code so we can optimize it. Other than that, you should change your php.ini (the configuration file) and remove or increase the timeout (the longest time your script is allowed to run on the server).
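The per-script equivalent looks like this (and, as far as I know, the CLI SAPI that cron uses defaults max_execution_time to 0, so a cron-run script normally isn't subject to this timeout anyway):

```php
<?php
// Same effect as raising max_execution_time in php.ini, but only for this script.
set_time_limit(0);   // 0 = no execution time limit for this request
```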
I need to download a bunch of files using FTP.
I am allowed up to 5 connections. I can use FileZilla to download the files pretty quickly, but I would like to do this in PHP for various reasons.
Is it possible to download files simultaneously this way instead of going file by file? Is there a difference, download-speed-wise, in creating multiple connections? I need them downloaded as quickly as possible.
To download more than one file at a time, use ftp_nb_fget() (the non-blocking counterpart of ftp_fget(); ftp_nb_fput() is the upload equivalent) and then a loop that calls ftp_nb_continue() for each handle, alternating between them.
You will still be limited by the maximum bandwidth available to you, so simultaneous downloads may not be any faster.
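In sketch form, the alternating loop could look like this. One connection is opened per file, since a single FTP connection can only carry one transfer at a time (host, login and file names are placeholders):

```php
<?php
$files     = ['a.zip', 'b.zip', 'c.zip', 'd.zip', 'e.zip'];  // up to the 5 allowed connections
$transfers = [];

// Start a non-blocking download on its own connection for each file.
foreach ($files as $name) {
    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'secret');
    ftp_pasv($conn, true);

    $fh  = fopen('/backup/' . $name, 'wb');
    $ret = ftp_nb_fget($conn, $fh, $name, FTP_BINARY);
    $transfers[] = ['conn' => $conn, 'fh' => $fh, 'status' => $ret];
}

// Round-robin: give each connection a turn to pull its next chunk.
do {
    $pending = false;
    foreach ($transfers as &$t) {
        if ($t['status'] === FTP_MOREDATA) {
            $t['status'] = ftp_nb_continue($t['conn']);
            $pending = true;
        }
    }
    unset($t);
} while ($pending);

foreach ($transfers as $t) {
    fclose($t['fh']);
    ftp_close($t['conn']);
}
```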
I think you can also do this with cURL: http://php.net/manual/en/book.curl.php
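For example, curl_multi runs several FTP downloads in parallel; a rough sketch (credentials and paths are placeholders):

```php
<?php
$files   = ['a.zip', 'b.zip', 'c.zip'];
$mh      = curl_multi_init();
$handles = [];

foreach ($files as $name) {
    $fp = fopen('/backup/' . $name, 'wb');
    $ch = curl_init('ftp://user:secret@ftp.example.com/' . $name);
    curl_setopt($ch, CURLOPT_FILE, $fp);     // write straight to disk instead of into memory
    curl_multi_add_handle($mh, $ch);
    $handles[] = ['ch' => $ch, 'fp' => $fp];
}

// Drive all transfers until none are still running.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);              // wait for socket activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $h) {
    curl_multi_remove_handle($mh, $h['ch']);
    curl_close($h['ch']);
    fclose($h['fp']);
}
curl_multi_close($mh);
```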