Upload big files in PHP

I need to upload big files (music albums) through my website. The problem is that I can't upgrade my server's speed, and with files this big the connection usually drops (the response takes too long and times out). I could change the php.ini settings, but a music album still needs 5-10 minutes to upload. What simple solution can you suggest?
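For reference, a sketch of the php.ini directives that usually govern large uploads (the values below are placeholders, not recommendations, and whether you can change them depends on your host):

; Maximum allowed size for a single uploaded file
upload_max_filesize = 512M
; Must be at least as large as upload_max_filesize
post_max_size = 512M
; Give slow uploads time to finish
max_execution_time = 900
; Time allowed to read and parse the incoming request data
max_input_time = 900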

Related

Trying to decrease file download time in PHP

Currently I have a backend application that downloads one file per request. The file is not downloaded to an end user in a web browser; I'm trying to find ways of improving performance when downloading from an FTP server.
I connect to the FTP server in question using ftp_ssl_connect and when I get the file for download, I do so with ftp_fget with mode FTP_BINARY.
For files of 8 MB or less (they are zip archives, by the way) the performance is fine, but I now have files that range from 30 MB to 300 MB. I understand that it takes time to obtain files, and that the larger the file, the more time is required, etc.
To increase performance (or maybe to reduce strain on the system) I'd like to download large FTP files in chunks. So far I haven't seen much in the PHP FTP function examples for this. I read about the non-blocking FTP getter functions, but I'm not really trying to solve the problem of forcing a user to wait for the file to finish; I'm fine with the server doing whatever it needs to do to get that file.
Should I instead consider using cURL, as in this SO example? Using CURL/PHP to download from FTP in chunks to save memory - Stack Overflow
Alternatively, can anyone recommend an existing package from https://packagist.org?
Thanks,
Adam
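For reference, a minimal sketch of the non-blocking download mentioned in the question, using ftp_nb_fget and ftp_nb_continue (the host, credentials and file names are placeholders):

<?php
// Placeholder connection details
$ftp = ftp_ssl_connect('ftp.example.com');
ftp_login($ftp, 'user', 'password');
ftp_pasv($ftp, true);

// Local stream the chunks are written into
$local = fopen('/tmp/archive.zip', 'wb');

// Start the transfer without blocking the script
$ret = ftp_nb_fget($ftp, $local, 'remote/archive.zip', FTP_BINARY);

while ($ret === FTP_MOREDATA) {
    // Other work (logging, progress reporting) could go here
    $ret = ftp_nb_continue($ftp);
}

if ($ret !== FTP_FINISHED) {
    echo "Download failed\n";
}

fclose($local);
ftp_close($ftp);

This doesn't reduce the total transfer time, but it does let the script do other work between chunks instead of blocking inside a single ftp_fget call.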

Optimize Wordpress image upload, slow and only 1 at a time on Dedicated

I have been at this for weeks with different configurations, and I can usually figure things out by searching, but I am not as savvy with server configuration, so I need to ask. Hopefully someone can point me in the right direction.
I have a dedicated server with 5 different WordPress installations. Everything is blazing fast in terms of site load speed, posting, etc., except when I start uploading images through the WordPress image uploader. It's slow.
When I drag photos over (size doesn't matter) it takes forever and seems to process one image at a time; when I was on shared hosting, image uploads were a lot faster. This happens on all my WordPress installations with different themes and plugins, so I assume it's a server configuration issue somewhere.
The progress bar goes to 99% for the first image and sits there for a few minutes, then moves on to the next. My admin area also stalls, so I cannot do anything else in it until all the images are done uploading. The site itself doesn't stall and is still functional if I go to a different browser that isn't logged in as the admin.
If I go to my process manager, I see that async-upload.php is running and is only taking up 0.3% of the CPU and 0.6% of the memory.
It always finishes, but it seems like I am only allowed one connection or process (sorry, I don't know the correct terms) at a time before I can request another. Does anyone know what server configuration I am missing, or have, that is causing this? I am on WHM/cPanel with SSH access. I have tried a few PHP, MySQL, and Apache optimizations I found, but that hasn't resolved the problem. I am of course doing something wrong with my configuration; can anyone shed some light?
Near the end of its image upload operation, WordPress attempts to resize the incoming images to create thumbnails and medium-sized images. To do this it has to load the images, decompressed, into memory (a 12-megapixel photo decompresses to roughly 48 MB of raw pixel data).
That can take a lot of RAM. Try increasing the memory available to your PHP instance. Look in php.ini for a stanza like this:
; Maximum amount of memory a script may consume
; http://php.net/memory-limit
memory_limit = 32M
and increase it.
You may also have issues with these settings.
; Maximum allowed size for uploaded files.
; http://php.net/upload-max-filesize
upload_max_filesize = 8M
; Maximum number of files that can be uploaded via a single request
max_file_uploads = 20
Also, take a look in your php_error.log and apache_error.log files and see if you can spot any problems. And if you're using Google Chrome, open the JavaScript console and see whether any errors show up.
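If it helps, a quick way to confirm which limits the running PHP instance actually has is a throwaway script like this (the directive names are standard PHP settings):

<?php
// Print the limits that matter for image uploads
$directives = ['memory_limit', 'upload_max_filesize', 'post_max_size', 'max_file_uploads', 'max_execution_time'];
foreach ($directives as $directive) {
    echo $directive . ' = ' . ini_get($directive) . "\n";
}

Run it through the web server rather than the command line, since the CLI often reads a different php.ini.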

Amazon Elastic Transcoder Fails With Files Larger than 100mb

I have a social media website, and I think it will attract a lot of visitors. It allows users to upload images and video. For video, the file is uploaded with PHP, converted to the right format using Elastic Transcoder, and then stored in S3 buckets. When I upload small videos (less than around 80 MB) it works fine, but if I upload a 150 MB or 300 MB file, the job fails. Can anyone tell me why this happens and how to fix it? I have been stuck on this for the past 2 months.
It could be a few issues:
Your PHP script is timing out; do you need to set a higher timeout limit for that script?
The max POST size is too small; you may need to edit the PHP config to increase the POST or memory limits.
Also, you should check the PHP and Apache logs to see if anything odd is occurring.
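As a rough illustration of the first point, the script that hands the video off could raise its own limits at runtime, assuming your host allows it (the values are placeholders):

<?php
// Let this request run past the default 30-second limit
set_time_limit(0);

// Raise the memory ceiling for this script only (host permitting)
ini_set('memory_limit', '512M');

// Note: post_max_size and upload_max_filesize cannot be changed at
// runtime with ini_set(); they have to be raised in php.ini or the
// virtual host configuration.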

upload large files 100MB+ progress

I am having a lot of trouble handling large file uploading on the client side. I am looking for a way to show the progress of an upload while that person uploads. Then when it is finished, I need to have the file handled by a php script.
I am currently using SWFUpload (http://code.google.com/p/swfupload/), but it is giving me trouble. It works 100% of the time for small files around 5 MB, but for larger files over 100 MB I get weird behavior. For example, when the upload finishes, the upload script sometimes does not receive some of the posted variables. It seems to break for reasons I cannot diagnose, and I am quite frankly completely sick of it. (P.S. All my PHP settings are fine.)
I am just looking for a simple solution for upload progress that does not have too many bells and whistles. I just want the ability to upload large files 100MB-500MB and then have the form posted to an upload script without the client side solution hanging or causing problems.
Has anyone worked on a project that required uploading large files and displaying progress? If so, what was your solution?
Did it involve flash?
Does anyone have any recommendations or a reliable solution?
Thanks in advance.
PHP has a size restriction on uploaded files; you can change it in php.ini, but if you don't have access to php.ini (some web hosts don't provide it) you can try uploading the file via FTP instead.
You can try this tutorial (it's in Spanish) or another similar one.
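If SWFUpload keeps misbehaving, chunked uploaders (Plupload, Resumable.js and the like) sidestep the single-request limits by sending many small POSTs. A minimal sketch of the receiving PHP side, assuming the client posts chunk, chunks and name fields along with a file part named file (those field names are assumptions, not a fixed API):

<?php
// Hypothetical endpoint for a chunked uploader; assumes uploads/ exists
// and is writable, and that the client sends: chunk (0-based index),
// chunks (total count), name (original file name).
$chunk  = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
$chunks = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;
$name   = isset($_POST['name'])   ? basename($_POST['name']) : 'upload.bin';

$partial = __DIR__ . '/uploads/' . $name . '.part';

// Append this chunk to the partial file (truncate on the first chunk)
$out = fopen($partial, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// After the last chunk, rename the assembled file into place
if ($chunk === $chunks - 1) {
    rename($partial, __DIR__ . '/uploads/' . $name);
}

Because each POST stays small, upload_max_filesize and post_max_size no longer need to cover the whole file, and the client library can report progress per chunk.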

How to upload a huge file (above 50MB) to server

I am developing a website that involves uploading videos above 50 MB.
Which is the better (faster) way of uploading the files to the server:
uploading the video files via ftp
or
uploading the files via a form
Thanks
The best way would be with FTP.
FTP is much faster for larger file sizes; for files below 1 MB it won't matter as much.
P.S. If you are not the one uploading, think about which is easier for your users. A form is easier, but FTP is still faster.
For user experience you should go with the form file upload. The speed of both depends on the internet connection and the load on the server and client, and won't differ that much. It might be a bit much for your web server if it's handling a lot of users, but you can use nginx, for example, to make that less of a problem.
Edit: here is a comparison: http://daniel.haxx.se/docs/ftp-vs-http.html
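For completeness, the form route needs very little code; a bare-bones sketch (the field name video and the videos/ directory are placeholders):

<!-- enctype="multipart/form-data" is required, otherwise $_FILES stays empty -->
<form action="upload.php" method="post" enctype="multipart/form-data">
    <input type="file" name="video">
    <input type="submit" value="Upload">
</form>

<?php
// upload.php: move the uploaded file out of PHP's temp directory
if (isset($_FILES['video']) && $_FILES['video']['error'] === UPLOAD_ERR_OK) {
    move_uploaded_file(
        $_FILES['video']['tmp_name'],
        __DIR__ . '/videos/' . basename($_FILES['video']['name'])
    );
}

Keep in mind that the upload_max_filesize and post_max_size limits discussed in the other threads still apply to form uploads.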
I use JUpload.
It splits the files and uploads them via HTTP. It's also good because you don't need to worry about file upload limits in the server config. Speed depends mostly on the client's connection, for both HTTP and FTP; of course there are some differences, but they are not that big.
Why not offer both? (Seriously - I wrote an app about ten years ago that did this.) Look up "MOVEit DMZ" or research various FTP servers with web portal integration to see how it's being done today.
There's also a third way you should consider, which was touched on by the JUpload comment: a local control (Flash, Java, ActiveX, Firefox plug-in, etc.) that optimizes the upload experience. If people are uploading multiple large files to your site, they may appreciate the speed/reliability boost.
