Comparison in HTML5 huge vs small chunk file size - php

What would be the bottleneck for file upload performance, aside from the internet connection, of course?
HTML5 has a File API for slicing a file into chunks.
What would give better performance when uploading a file from the client side to the server side? For example, a 4 GB file sliced into:
4000 chunks of 1 MB each vs
400 chunks of 10 MB each vs
40 chunks of 100 MB each vs
4 chunks of 1 GB each, if that's even possible.
I'm not talking about the internet connection here. That will be the main issue, but I'm sure there are other things that affect performance,
such as chunk size, HTTP overhead, or whether the slices are uploaded sequentially or simultaneously, or something else...
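For concreteness, here is a minimal sketch of the server side that such a chunked upload could talk to, assuming each slice is POSTed as a form field named chunk along with name and index metadata fields (hypothetical names, not from any particular library):

<?php
// Minimal sketch of a chunk receiver (hypothetical field names: "chunk" for the
// uploaded slice, "name" and "index" for metadata sent by the client).
$name  = basename($_POST['name']);                    // original filename, sanitised
$index = (int) $_POST['index'];                       // chunk number, starting at 0
$part  = sys_get_temp_dir() . '/' . $name . '.part';  // partial file being assembled

$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
$out = fopen($part, $index === 0 ? 'wb' : 'ab');      // truncate on the first chunk, append afterwards
stream_copy_to_stream($in, $out);                     // stream the slice; no large memory use
fclose($in);
fclose($out);

Whether the slices are sent sequentially or in parallel is then a client-side decision; for parallel uploads you would write each chunk to its own file and concatenate them at the end, since appends can arrive out of order.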

Related

How bad are huge php files?

I have a web site with none of the necessary functions on it, and a big PHP file (over 13,000 lines) on another server (it is used by the web and mobile apps), where all the logic functions are placed, separated into classes, but most of the functions are in one class (11,000 lines).
The whole file is 1282 KB.
How bad is that for the execution time?
On the server the sign-in process takes less than a second, but on my much weaker PC (i5 3470 3.2 GHz; 8 GB RAM) it takes 10-15 seconds, independent of the size of the data stream. Anomalously slow, in my opinion.
Is this file the biggest problem, slowing the processes down so much, or should I look for the problem somewhere else?

Uploading big files in PHP

I need to upload large files through a web page in PHP. As far as I can tell, there are three configuration variables related to uploads:
post_max_size=100MB
upload_max_filesize=100MB
memory_limit
If I want to upload files under 100 MB, is it right to just set these to 100 MB? How much should I give memory_limit for a 100 MB file? Is there any other issue that may prevent the upload from finishing?
The memory limit is harder to pin down because it depends on how you wish to process the file. If you're just going to write it to disk without much processing, 100 MB should be fine. If you're going to do any extensive processing on it, you may very well need to increase it, depending on how your algorithm handles memory.
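As a rough illustration (values, field name, and target directory are examples, not from the question): PHP's ini shorthand is M rather than MB, and writing the upload straight to disk keeps memory_limit out of the picture, because PHP buffers the incoming file in a temporary file on disk rather than in memory.

; php.ini - example values only
upload_max_filesize = 100M
post_max_size = 110M        ; a bit larger than upload_max_filesize, to leave room for other POST fields
max_execution_time = 300    ; give slow connections time to finish

<?php
// Moving the upload to its destination never loads the file into PHP memory.
if (isset($_FILES['upload']) && $_FILES['upload']['error'] === UPLOAD_ERR_OK) {
    move_uploaded_file(
        $_FILES['upload']['tmp_name'],
        '/var/uploads/' . basename($_FILES['upload']['name'])
    );
}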

Concurrent AJAX file upload LIMIT

I'm building a multiple-file uploader using XMLHttpRequest Level 2. If I upload the files one by one (sending a file through an AJAX request and waiting for it to complete before sending the next, and so on), the uploader works well. But when I upload multiple files concurrently, using parallel AJAX requests, the browser hangs. Here is the performance comparison.
So is there any maximum limit on the upload file size or the number of AJAX requests that the browser can handle concurrently?
Note: the red numbers show the total upload time Firefox (measured with Firebug) took to upload all the files. For the parallel upload, since all uploads happen concurrently, I took the time consumed by the largest file, which finished last.
There's no theoretical maximum for the number of concurrent uploads (unless the browser vendors put one in explicitly).
However, in practice, upload performance drops significantly beyond two or three concurrent uploads due to bandwidth contention, the exception being very low-latency connections where the TCP window limits the maximum speed of a single upload.
I would recommend setting a concurrency limit of 2, especially if you're providing this to external users whose bandwidth may vary. Alternatively, you could benchmark upload speed and adapt the concurrency level based on the measured performance.

Increasing the max concurrent file uploads on a LAMP server

I have an issue where my client wants to be able to upload more than two large files at a time, and I was wondering if any of you have a solution to this. I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB, it works fine; they then open a new tab and upload another large file. But when they open up a third tab of that form and try to upload, I get an IO error from the form, so I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a different external IP), I can upload there as well, so it seems to be a maximum number of concurrent uploads per client/IP.
I think increasing the memory assigned to the PHP script can help you. I don't think this is the most appropriate solution, but when I had problems handling lots of big files, I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but as far as I know it's worth trying to increase the memory:
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to check whether the problem is caused by memory.
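A small sketch along those lines; memory_get_peak_usage() is standard PHP, but the error_log placement is just one illustrative way to see the real usage:

<?php
// Lift the limit only while diagnosing, then log the real peak usage so you
// know what memory_limit actually needs to be afterwards.
ini_set('memory_limit', '-1');

// ... run the upload / file-handling code here ...

error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');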

PHP File uploading stops

For a client I have built a simple script that uploads multiple files (images), resizes them, stores them in a temporary folder, and later moves them to their destination.
Resizing is done using PHP's GD, as Imagick is not available.
These images are about 2-4 MB apiece, and the client uploads about 30 images in one shot.
I used HTML5's multiple attribute, which works fine.
In tests everything worked fine, because I used the standard Windows wallpaper images.
I can't find the source of the problem.
When uploading more than one image, the script fails; debugging tells me it does upload the second image but won't resize it.
I checked the memory usage for the images, which is approximately 105724352 bytes each.
My PHP ini settings:
max_execution_time = 300
max_input_time = 600
memory_limit = 200M
So you see, at the second image the memory reached its limit, making my script stop. Is that correct?
If so, how wise is it to increase the memory limit?
Thanks in advance!
EDIT:
It now seems the GD function imagecreatefromjpeg can't handle files with a resolution bigger than 3500 px wide; my files are bigger than 5000 px wide.
Does anyone have a workaround for this?
At this point I am wondering if it is wise to have the client on a shared host at all, if he needs this much memory for these images.
So you see, at the second image the memory reached its limit, making my script stop. Is that correct?
Check your Apache error log (on *nix systems, /var/log/apache2/error.log) to see if that is really the problem.
If so, how wise is it to upgrade the memory limit?
You should not handle multiple image operations in one script. Make an AJAX request for each image and handle them in separate instances.
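A sketch of what each of those per-image requests might run, assuming plain GD resizing is all that's needed (the function name and the quality value are made up for illustration). A decoded JPEG costs roughly width x height x 4-5 bytes inside GD, so a 5000 px wide photo can easily land around the 100 MB the question reports.

<?php
// Resize one JPEG per request and free the decoded bitmaps immediately,
// so each request stays within memory_limit on its own.
function resizeJpeg(string $src, string $dst, int $newWidth): void
{
    [$width, $height] = getimagesize($src);
    $newHeight = (int) round($height * $newWidth / $width);

    $source = imagecreatefromjpeg($src);                  // this is the memory-hungry step
    $target = imagecreatetruecolor($newWidth, $newHeight);
    imagecopyresampled($target, $source, 0, 0, 0, 0, $newWidth, $newHeight, $width, $height);
    imagejpeg($target, $dst, 85);

    imagedestroy($source);                                // release memory before the script ends
    imagedestroy($target);
}

Calling this once per AJAX request, as suggested above, means only one source image and one target image are ever decoded at the same time.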
