Nextcloud (Apache/PHP-FPM) Slow Upload Speed

I'm new to Linux/Apache/PHP installations.
I have a Nextcloud installation. If I upload a large file using the browser, the upload speed is about 2-3 MB/s (HTTP/2). When I tried HTTP/1.1, the upload speed was about 10 MB/s. If I upload the same file using WinSCP, the upload speed reaches 50 MB/s.
So there is a huge difference in upload speed. Any idea how I can improve the upload speed from the browser?
Phpinfo as image: https://drive.google.com/file/d/1njwVwY8x6TxXWp5-9yVRmxio2I766nv4/view?usp=sharing

Nextcloud chunks the file before sending it to the server. There are some common issues with this:
If you have an antivirus, it scans every chunk rather than the whole file, which takes more time.
If you use object storage, the chunks are saved there, fetched back to reconstruct the real file, and the result is then saved to object storage again. This takes a very long time.
There is also the AJAX limit for sending the chunks: a browser will only keep about 6 XHR connections open to the same host at a time.
You can try increasing the chunk size: https://docs.nextcloud.com/server/latest/admin_manual/configuration_files/big_file_upload_configuration.html#configuring-nextcloud
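For reference, the linked admin manual documents an occ command along these lines for raising the chunk size (the value is in bytes; 100 MiB here is only an example, and the web server user and path depend on your install):

    # Raise Nextcloud's upload chunk size from the 10 MiB default to 100 MiB.
    # Run from the Nextcloud installation directory as the web server user.
    sudo -u www-data php occ config:app:set files max_chunk_size --value 104857600

If you raise it, make sure upload_max_filesize and post_max_size in php.ini are at least as large, otherwise PHP will reject the bigger chunk requests.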
Good luck

Related

Is there a way to monitor the progress of an upload to an Azure cloud server via PHP?

I'm working on a project that processes some files on the server at the user's request, then uploads the resulting ZIP file as a blob to Azure for storage. A few cases involve extremely large files which take about an hour to upload. It would be helpful if I could at any random moment run a separate process that queries Azure for the upload progress while the main process on the local server is still preoccupied with uploading the file.
Is there a way to do this in PHP? (If it's of any help, this project is running on Phalcon.)
I don't think it's possible, because uploading a file is a single task: even though the file is internally split into multiple chunks and those chunks get uploaded, the code waits for the entire task to finish.
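That said, if you control the upload code, you can get a progress signal without asking Azure: upload the blob in blocks yourself and write a progress marker somewhere the separate process can read. A rough sketch, assuming the microsoft/azure-storage-blob SDK (the container, blob, and file names are made up, and the method names should be checked against the SDK version you actually use):

    <?php
    // Sketch only: push the blob up block by block and record progress in a
    // file that a second process can poll. Verify the SDK class/method names
    // (BlobRestProxy, BlockList, createBlobBlock, commitBlobBlocks) against
    // the version you have installed.
    require 'vendor/autoload.php';

    use MicrosoftAzure\Storage\Blob\BlobRestProxy;
    use MicrosoftAzure\Storage\Blob\Models\BlockList;

    $blobClient = BlobRestProxy::createBlobService(getenv('AZURE_STORAGE_CONNECTION_STRING'));

    $container    = 'backups';              // hypothetical container name
    $blobName     = 'archive.zip';          // hypothetical blob name
    $localFile    = '/tmp/archive.zip';     // the ZIP produced on the server
    $progressFile = '/tmp/archive.zip.progress';
    $blockSize    = 4 * 1024 * 1024;        // 4 MB per block

    $total    = filesize($localFile);
    $sent     = 0;
    $blockIds = [];
    $handle   = fopen($localFile, 'rb');

    $i = 0;
    while (!feof($handle)) {
        $content = fread($handle, $blockSize);
        if ($content === false || $content === '') {
            break; // nothing left to send
        }
        $blockId = base64_encode(sprintf('block-%08d', $i++)); // IDs must be base64 and same length
        $blobClient->createBlobBlock($container, $blobName, $blockId, $content);

        $blockIds[] = $blockId;
        $sent      += strlen($content);
        // Any other process can read this file to see how far along we are.
        file_put_contents($progressFile, json_encode(['sent' => $sent, 'total' => $total]));
    }
    fclose($handle);

    // Stitch the uploaded blocks together into the final blob.
    $blockList = new BlockList();
    foreach ($blockIds as $blockId) {
        $blockList->addUncommittedEntry($blockId);
    }
    $blobClient->commitBlobBlocks($container, $blobName, $blockList);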

How do I send a file upload from an HTML form to S3 via PHP without local storage?

I'm trying to convert a website to use S3 storage instead of local (expensive) disk storage. I solved the download problem using a stream wrapper interface on the S3Client. The upload problem is harder.
It seems to me that when I post to a PHP endpoint, the $_FILES object is already populated and copied to /tmp/ before I can even intercept it!
On top of that, the S3Client->upload() expects a file on the disk already!
Seems like a double-whammy against what I'm trying to do, and most advice I've found uses NodeJS or Java streaming so I don't know how to translate.
It would be better if I could intercept the code that populates $_FILES and then send up 5MB chunks from memory with the S3\ObjectUploader, but how do you crack open the PHP multipart handler?
Thoughts?
EDIT: It is a very low quantity of files, 0-20 per day, mostly 1-5 MB, sometimes hitting 40-70 MB. Periodically (once every few weeks) a 1-2 GB file will be uploaded. Hence the desire to move off an EC2 instance and onto a Heroku/Beanstalk type PaaS where I won't have much /tmp/ space.
It's hard to comment on your specific situation without knowing the performance requirements of the application and the volume of users needing to access it, so I'll try to answer assuming a basic web app uploading profile avatars.
There are some good reasons for this: the file is streamed to disk for multiple purposes, one of which is to conserve memory. If your file is not on disk then it is in memory (think disk usage is expensive? bump up your memory usage and see how expensive that gets), which is fine for a single user uploading a small file, but not so great for a bunch of users uploading small files or, worse, large files. You'll likely see the best performance if you use the defaults in these libraries and let them stream to and from the disk.
But again I don't know your use case and you may actually need to avoid the disk at all costs for some unknown reason.
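To make that concrete: the ObjectUploader mentioned in the question will happily take a stream opened on the temp file PHP already wrote, and it reads the parts from disk rather than holding the whole file in memory. A minimal sketch, assuming default AWS credentials and with the bucket name and form field name ('userfile') as placeholders:

    <?php
    // Sketch: hand the already-staged /tmp upload to S3 as a stream.
    // ObjectUploader switches to a multipart upload for large files, so the
    // occasional 1-2 GB file never has to fit in memory.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use Aws\S3\ObjectUploader;

    $s3 = new S3Client([
        'region'  => 'us-east-1',   // placeholder region
        'version' => 'latest',
    ]);

    $tmpPath = $_FILES['userfile']['tmp_name'];
    $key     = basename($_FILES['userfile']['name']);

    // Open the temp file as a stream; the uploader reads it piece by piece.
    $source = fopen($tmpPath, 'rb');

    $uploader = new ObjectUploader($s3, 'my-upload-bucket', $key, $source);
    $uploader->upload();

    echo 'Uploaded ' . $key;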

Trying to decrease file download time in PHP

Currently I have a backend application that downloads one file per request. The file is not downloaded to end user in a web browser, and I'm trying to find ways of improving performance when downloading from an FTP server.
I connect to the FTP server in question using ftp_ssl_connect and when I get the file for download, I do so with ftp_fget with mode FTP_BINARY.
For files of 8 MB or less (they are zip archives, b.t.w) the performance is fine, but I now have files that range from 30 MB to 300 MB. I understand that "it takes time" to obtain files and the larger the file, the more time required... etc.
To increase performance (or maybe I'm just trying to reduce strain on the system) I'd like to download large files over FTP in chunks. So far I haven't seen much in the PHP FTP function examples for this. I read about the non-blocking FTP file getter functions (rough sketch below), but I'm not really trying to solve the problem of "forcing a user to wait for the file to finish" - hence I'm OK with the server doing what it needs to do to get that file.
Should I instead consider using cURL, as given in this SO example: Using CURL/PHP to download from FTP in chunks to save memory - Stack Overflow?
Alternately can anyone recommend an existing package from https://packagist.org ?
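For context, the non-blocking getters I read about would look roughly like this (untested sketch; the host, credentials, and paths are placeholders):

    <?php
    // Sketch: ftp_nb_fget() pulls the file a piece at a time, so the script
    // can log progress, throttle, or check a cancellation flag between pieces
    // instead of blocking in one big ftp_fget() call.
    $ftp = ftp_ssl_connect('ftp.example.com');
    ftp_login($ftp, 'user', 'secret');
    ftp_pasv($ftp, true);

    $local = fopen('/tmp/archive.zip', 'wb');

    $ret = ftp_nb_fget($ftp, $local, '/remote/archive.zip', FTP_BINARY);
    while ($ret === FTP_MOREDATA) {
        // Do something between chunks: update a progress record, usleep()
        // to reduce load, etc.
        $ret = ftp_nb_continue($ftp);
    }

    if ($ret !== FTP_FINISHED) {
        throw new RuntimeException('FTP download failed');
    }

    fclose($local);
    ftp_close($ftp);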
Thanks,
Adam

Usage of bandwidth when uploading from a form

I need to know how much bandwidth is used when uploading a file through a form.
Let me explain a bit more. I have a file containing an upload form that is hosted on a web host. When the user uploads a file, it is uploaded through this form and sent to another server through FTP, so basically I'm creating an FTP connection inside the PHP file that is stored on the web host.
How much bandwidth is used if I upload a 100 MB file? And which side uses the bandwidth needed to upload that 100 MB file: the receiving server (the server we upload to through FTP in the PHP file), the web host (where we are hosting the PHP file that opens the FTP connection), or both?
When you use 100 MB of bandwidth (transfer to the first server) and another 100 MB of bandwidth (transfer to the other server), that's 200 MB of bandwidth: 100 MB download (from the user), 100 MB upload (to the other server). Sometimes your provider will bill those separately.
100 + 100 = 200. It really is that simple.
(Note that there is overhead in all cases, but not a ton.)

Amazon Elastic Transcoder Fails With Files Larger than 100mb

I have a social media website, and I think it will attract a lot of visitors. It allows users to upload images and video. Video is uploaded with PHP and then converted to the right format using Elastic Transcoder; the files are then stored in S3 buckets. When I upload small videos (less than around 80 MB) it works fine, but if I upload a 150 MB or 300 MB file, the job fails. Can anyone tell me why this happens and how to fix it? I have been stuck on this for the past 2 months.
Could be a few issues:
Your PHP script is timing out; do you need to set a higher timeout limit for that script?
The max POST size is too small; you may need to edit the PHP config to increase the POST or memory limits (see the example settings below).
Also you should check the PHP and Apache logs to see if anything odd is occurring.
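As a rough example, the relevant php.ini directives would be along these lines (values are only illustrative; upload_max_filesize and post_max_size cannot be changed with ini_set(), so set them in php.ini or the vhost/pool config and reload Apache/PHP-FPM):

    ; Illustrative values only -- size them to your largest expected video.
    upload_max_filesize = 512M
    post_max_size       = 512M
    memory_limit        = 256M
    max_execution_time  = 600    ; seconds the script may run
    max_input_time      = 600    ; seconds allowed for receiving the upload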
