I'm building a multiple-file uploader using XMLHttpRequest Level 2. When I upload the files one by one (sending a file through an AJAX request and waiting for it to complete before sending the next, and so on), the uploader works well. But when I use concurrent AJAX requests to upload multiple files at once, the browser hangs. Here is the performance comparison:
So is there a maximum limit on upload file size, or on the number of AJAX requests, that the browser can handle concurrently?
Note: the red numbers show the total upload time that Firefox (measured with Firebug) took to upload all the files. For the parallel upload, since all uploads happen concurrently, I took the time consumed by the largest file, which finished last.
There's no theoretical maximum for the number of concurrent uploads, although in practice browsers do cap the number of parallel connections to a single host (typically around six in modern browsers).
However, in practice, upload performance drops significantly beyond two or three concurrent uploads because the transfers compete for the same bandwidth. The exception is high-latency links, where the TCP window caps the speed of a single upload (throughput is bounded by window size / RTT), so extra parallel connections can actually help there.
I would recommend setting a concurrency limit of 2, especially if you're providing this to external users whose bandwidth may vary. Alternatively, you could benchmark upload speed and adapt the concurrency level to the measured performance.
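A minimal sketch of such a limit: a generic promise pool that keeps at most `limit` tasks in flight. The `uploadFile` wrapper in the usage comment is hypothetical (any function that starts one XHR2 upload and returns a promise will do); the pool itself is plain JavaScript.

```javascript
// Generic promise pool: run async task functions with at most
// `limit` of them in flight at any moment.
function runWithConcurrency(tasks, limit) {
  return new Promise(function (resolve, reject) {
    var results = [];
    var next = 0;    // index of the next task to start
    var active = 0;  // tasks currently in flight
    var done = 0;    // tasks finished

    function launch() {
      while (active < limit && next < tasks.length) {
        (function (i) {
          active++;
          tasks[i]().then(function (value) {
            results[i] = value;
            active--;
            done++;
            if (done === tasks.length) resolve(results);
            else launch();
          }, reject);
        })(next++);
      }
    }

    if (tasks.length === 0) resolve(results);
    else launch();
  });
}

// Usage sketch (uploadFile is a hypothetical XHR2 wrapper that
// resolves when one file has finished uploading):
//   var tasks = files.map(function (f) {
//     return function () { return uploadFile(f); };
//   });
//   runWithConcurrency(tasks, 2).then(function () { /* all done */ });
```

Wrapping each file in a function (rather than starting the XHR immediately) is what lets the pool defer uploads until a slot is free.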
Related
I want to upload files to a remote server. Multiple users upload files at the same time, so is there any limit on the number of curl connections?
There is no built-in limit on the number of curl_easy handles you can have running concurrently, nor any maximum number of curl handles in a curl_multi handle; curl could easily upload over 1000 files concurrently without problems. curl will be the least of your problems: most (probably all) operating systems have a maximum number of open sockets/file handles per process. On Windows Server 2008, for example, the limit seems to be somewhere around 16,000 handles per session, and if those limits don't bite you first, you'll probably run out of RAM long before you hit any limit of curl itself.
When cropping and resizing images with PHP functions such as imagecreatefromjpeg(), imagecrop(), imagecopyresampled(), and the like, you quickly run into high memory usage.
If I implement this crop module in a CMS and, say, 50 people call the crop function at the same time: will the memory usage be multiplied by 50 and thus overload the server's memory resources?
If so, what would be a good way to queue requests or prevent the overload?
Although this may not be a direct answer to your question, in my experience it is always better to handle image manipulation on a separate server, as you don't want a single user to kill the entire web server with a few images.
If you handle your images on a different server and it dies, the worst thing that happens is that the image returns a 404, but the web server itself keeps running just fine.
In any case, I would definitely recommend using a library such as ImageMagick. It is easier to use, and I believe you can set a memory limit that it cannot exceed.
The best approach I have found for image manipulation so far is AWS Lambda. You can easily write a serverless Node.js script that downloads your image, performs any modifications to it, and uploads it back to your server. Unless you need to process thousands of images every day, you can run it almost for free, and it can handle up to 100 simultaneous invocations (it essentially spins up to 100 servers at once), so in theory it can be almost 100x faster than queuing your requests.
I have an issue where my client wants to be able to upload more than 2 large files at a time, and I was wondering if any of you have a solution to this. I'm fairly sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB, it works fine, and they can open a new tab and upload another large file. But when they open a third tab of that form and try to upload, I get an I/O error from the form, so I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP), I can upload there as well, so it seems like a maximum number of concurrent uploads per client/IP.
I think increasing the memory assigned to a PHP script can help you. I don't think this is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but it is worth trying:
ini_set('memory_limit','128M');
If you use -1 instead of 128M, PHP gives the script unlimited memory; you can use this while testing to check whether the problem is caused by the memory limit.
I would like to allow users to upload photos of any size to the server quickly. Afterwards, my program will resize the original image into a thumbnail and a scaled version, probably with a max width of 1020px. Because of bandwidth issues (I'm on a shared server currently), I need a way to avoid uploads taking too long or hitting the max upload time limit.
I understand I can do these:
1. extend the max upload time
2. set a max file upload size (which I'm trying not to)
Please advise =)
There is no secret: the upload time depends on the user's bandwidth. If they have a slow connection, the upload will take time and may hit your server's limit.
There is no optimisation for that on your side. Moreover, a shared host has a lot of bandwidth available (several Gb), so it's practically impossible for your users to saturate that limit, even more so on upload.
Same thing with the memory limit: with an 8 MB memory limit, trying to work on an 18 MP photo will blow past it.
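The arithmetic behind that is easy to check: a decoded bitmap needs roughly width × height × bytes-per-pixel in memory, no matter how small the JPEG is on disk. A quick sketch (JavaScript here only for illustration; 4 bytes per RGBA pixel is an assumption, and GD's per-pixel overhead can push the real figure higher):

```javascript
// Rough lower bound on the memory a decoded image needs,
// given its size in megapixels and bytes per pixel.
function decodedBytes(megapixels, bytesPerPixel) {
  return megapixels * 1000000 * bytesPerPixel;
}

// An 18 MP photo at 4 bytes/pixel needs ~72,000,000 bytes decoded,
// roughly 69 MiB -- far past an 8 MB memory_limit.
var needed = decodedBytes(18, 4);
```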
Nevertheless, you can separate the two actions:
1- Upload the photo
2- Redirect with header() when the upload is done
3- Resize the image, or put it in a queue for later processing
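For step 3, the scaling math is worth pinning down. A small sketch (JavaScript only for illustration; the 1020px max width comes from the question, and never upscaling is an assumption):

```javascript
// Compute output dimensions that fit within maxWidth while
// preserving the aspect ratio; images already small enough
// are left untouched (no upscaling).
function scaledSize(width, height, maxWidth) {
  if (width <= maxWidth) {
    return { width: width, height: height };
  }
  return {
    width: maxWidth,
    height: Math.round(height * (maxWidth / width))
  };
}

// scaledSize(2040, 1360, 1020) -> { width: 1020, height: 680 }
// scaledSize(800, 600, 1020)   -> { width: 800,  height: 600 }
```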
We have a LAMP server that is fairly busy; CPU usage hovers around 90% at peak times. We are having an intermittent problem where file uploads from web forms fail. It only seems to happen with larger files (over a MB) and it seems to affect some users more than others. We've gone through and checked the obvious things like the PHP ini max upload sizes, execution times, and folder write permissions. Also, the site worked for a year without trouble like this before it suddenly began (we don't think any of our application PHP would cause this).
We've watched what happens in Charles Proxy, and it shows the upload happening (the sent file size increases steadily) until it just stops sending data. The browser shows its spinning progress indicator as if the upload is proceeding, but you can wait 20 minutes and nothing will happen, or it reports a timeout.
Does anyone know why an upload might fail intermittently? My only guess is that it has to do with server traffic, i.e. Apache is closing that connection prematurely.
If the load on the server is high, then your scripts may be timing out while trying to upload the file. Can you give any more specifics on your problem? PHP scripts have a 30-second timeout (max_execution_time) by default, meaning that if the script has not finished, i.e. uploaded the file, within that time frame, it will time out and the upload will fail.
If the site worked for over a year and traffic has now grown to the point where it is starting to strain the server, then it is quite possible that scripts are timing out under the increased traffic and load.