I have an issue where my client wants to be able to upload more than two large files at a time, and I was wondering if any of you have a solution. I'm fairly sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file of 1 GB or less, it works fine, and they can open a new tab and upload another large file. But when they open a third tab of that form and try to upload, I get an IO error from the form, so I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP), I can upload there as well, so it seems to be a limit on concurrent uploads per client/IP.
I think increasing the memory assigned to a PHP script may help you. I don't think this is the most appropriate solution, but when I was having problems handling a lot of big files, increasing the memory for the script is what worked. If you are developing this for a very busy site I don't recommend this method, but it is worth trying:
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to check whether the problem is caused by memory.
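If you want to confirm that memory really is the bottleneck before raising any limits, a quick diagnostic (a sketch for testing only, not production code) is to lift the cap and log the peak usage:

ini_set('memory_limit', '-1');   // testing only: no memory cap
// ... run the upload/processing code here ...
error_log('Peak memory: ' . memory_get_peak_usage(true) . ' bytes');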
For a client I have built a simple script that uploads multiple files (images), resizes them, stores them in a temporary folder, and later moves them to their destination.
Resizing is done using PHP's GD, as Imagick is not available.
These images are about 2-4 MB apiece, and the client uploads about 30 images in one shot.
I used HTML5's multiple attribute, which works fine.
In tests everything worked fine, because I used standard Windows wallpaper images.
I can't find the source of the problem. When uploading more than one image, the script fails; debugging tells me it does upload the second image but won't resize it.
I checked the memory usage for the images, which is approximately 105,724,352 bytes each.
My PHP ini settings:
max_execution_time = 300
max_input_time = 600
memory_limit = 200M
So you can see that at the second image the memory reached its limit, making my script stop. Is that correct?
If so, how wise is it to raise the memory limit?
Thanks in advance!
EDIT:
It now seems the GD function imagecreatefromjpeg can't handle files with a resolution bigger than 3500px wide; my files are bigger than 5000px wide.
Does anyone have a workaround for this?
At this point I am wondering whether it is wise to keep the client on a shared host at all if he needs this much memory for these images.
So you can see that at the second image the memory reached its limit, making my script stop. Is that correct?
Check your Apache error logs (on a *nix system, /var/log/apache2/error.log) to see if that is really the problem.
If so, how wise is it to raise the memory limit?
You should not handle multiple image operations in one script. Make an AJAX request for each image and handle them in separate instances, as in the sketch below.
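As a rule of thumb, GD needs roughly width × height × 5 bytes per decoded image, so a 5000px-wide JPEG can need well over 100 MB of memory regardless of its file size on disk. Here is a minimal sketch of the one-request-per-image idea (the endpoint name, paths, and target width are made up): each resize gets a fresh memory budget, and freeing the GD resources promptly keeps the peak down.

<?php
// resize_one.php - handles exactly one uploaded image per request (hypothetical endpoint)
$src = imagecreatefromjpeg($_FILES['image']['tmp_name']);
if ($src === false) {
    http_response_code(400);
    exit('Could not decode image');
}
$w = imagesx($src);
$h = imagesy($src);
$newW = 1020;                          // target width, keeping aspect ratio
$newH = (int) round($h * $newW / $w);
$dst = imagecreatetruecolor($newW, $newH);
imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
imagejpeg($dst, '/tmp/resized.jpg', 85);
// free the GD images immediately so the next request starts clean
imagedestroy($src);
imagedestroy($dst);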
I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, that the file being fetched is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?
I am just trying to make sure that I know what my options and limitations are before I go much further.
Thank you for your time.
Yes, you can use set_time_limit(0) and the max_execution_time directive to cancel the time limit imposed by PHP.
You can open a stream of the file and transfer it to the user seamlessly.
Read about fopen().
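A rough sketch of that streaming approach (the URL, filename, and chunk size are placeholders): lift PHP's execution time limit, then relay the remote stream to the client in small chunks so the 200 MB file never has to fit in memory. Note that reads from a remote stream are also subject to the default_socket_timeout setting, and the http:// wrapper requires allow_url_fopen to be enabled.

<?php
set_time_limit(0);                          // remove PHP's execution time limit
ini_set('default_socket_timeout', '300');   // tolerate slow remote reads

$in = fopen('http://example.com/big-file.zip', 'rb');   // placeholder URL
if ($in === false) {
    http_response_code(502);
    exit('Could not open remote file');
}
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.zip"');
while (!feof($in)) {
    echo fread($in, 8192);   // relay 8 KB at a time
    flush();                 // push each chunk to the client instead of buffering
}
fclose($in);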
If it is not a timeout, you may well run into memory issues depending on how your PHP is configured. You can adjust a lot of these settings manually through code without much difficulty:
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');
I would like to allow users to upload photos of any size to the server quickly, after which my program will resize the original image to a thumbnail and a scaled version, probably with a max width of 1020px. Because of bandwidth issues (I'm on a shared server currently), I need to find a way to keep the upload from taking too long or hitting the max upload time limit.
I understand I can do these:
1. extend the max upload time
2. set a max file upload size (which I'm trying not to)
Please advise =)
There is no secret: the upload time depends on the user's bandwidth. If he has little bandwidth, the upload will take time, and he may hit the limit of your server.
There is no optimisation for that on your side. Moreover, a shared host has a lot of bandwidth available (several Gbps), so it's probably impossible for your user to saturate that, even more so on upload.
The same applies to the memory limit: if you have an 8 MB memory limit, trying to work on an 18 MP photo will exceed it, since an 18-megapixel image decompresses to roughly 18,000,000 × 4 bytes ≈ 72 MB in memory.
Nevertheless, you can separate the two actions (a minimal sketch follows the list):
1- Upload the photo
2- Redirect with header() when the upload is done
3- Resize the image, or put it in a queue for later processing
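A bare-bones sketch of that flow (the file names and the queue mechanism are assumptions): the upload request does nothing but store the file and record a job, then redirects immediately, so the user never waits on the resize.

<?php
// upload.php - steps 1 and 2: store the file, queue it, redirect
$dest = '/var/uploads/' . basename($_FILES['photo']['name']);   // sanitize properly in real code
if (move_uploaded_file($_FILES['photo']['tmp_name'], $dest)) {
    // step 3 happens later: a cron job or worker reads this queue and resizes
    file_put_contents('/var/uploads/queue.txt', $dest . PHP_EOL, FILE_APPEND);
    header('Location: done.php');
    exit;
}
http_response_code(500);
exit('Upload failed');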
I have made a program wherein I am able to upload a file. Everything is working fine.
But when I tried uploading an 11 MB file, it seems like it is loading forever, sending the file to the server without ever finishing.
I have already tried setting upload_max_filesize to 20M.
Any ideas what could be the cause and how to resolve this?
The why is almost certainly related to your connection speed. Unless you're connecting via a LAN to the machine in question you'll in all probability be connecting via a consumer grade broadband connection. These are pretty much always configured so that download speed is a lot higher than upload speed. As a consequence, that 20 meg file that takes a minute or so to download will take 10 minutes or more to upload over the same connection.
What can you do about this, other than switching to an enterprise-grade broadband connection? Not a great deal, those bits will only transfer so fast over the connection you've got and no faster. What you can do, however, is at least keep the user informed as to how the upload is progressing. PHP from version 5.2 onwards provides hooks that you can use to monitor the progress of a file upload. You can use javascript to monitor these hooks and display a progress bar to the user.
http://www.phpriot.com/articles/php-ajax-file-uploads
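The linked article covers the APC-based hooks from the PHP 5.2 era; on PHP 5.4 and later the same idea is built in via session.upload_progress. A minimal sketch of the polling endpoint (the 'myupload' key is illustrative):

<?php
// progress.php - polled by JavaScript while the form posts the file elsewhere
session_start();
$key = ini_get('session.upload_progress.prefix') . 'myupload';
if (isset($_SESSION[$key]) && $_SESSION[$key]['content_length'] > 0) {
    $p = $_SESSION[$key];
    echo round(100 * $p['bytes_processed'] / $p['content_length']) . '%';
} else {
    echo 'no upload in progress';
}

For PHP to populate that session entry, the upload form must include a hidden field named after ini_get('session.upload_progress.name') (PHP_SESSION_UPLOAD_PROGRESS by default) with the value 'myupload'.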
How long is "forever"? A typical upload speed on consumer broadband is 256 kilobits per second; at that speed an 11-megabyte file (roughly 90,000 kilobits) will take over five minutes to upload.
If you are using the Google Chrome web browser, you get an upload progress bar so you can tell if it is working or not.
We have a LAMP server that is fairly busy; CPU usage hovers around 90% at peak times. We are having an intermittent problem where file uploads from web forms fail. It only seems to happen with larger files (over a MB), and it seems to affect some users more than others. We've gone through and checked the obvious things like the PHP ini max upload sizes, execution times, and folder write permissions. Also, the site worked for a year without trouble like this before it suddenly began (we don't think any of our application PHP would cause this).
We've watched what happens in Charles Proxy, and it shows the upload proceeding (the sent filesize increases regularly) until it just stops sending data. The browser keeps showing its spinning progress indicator as if the upload is continuing, but you can wait 20 minutes and nothing will happen, or it reports a timeout.
Does anyone know why an upload might fail intermittently? My only guess is that it has to do with server traffic, e.g. Apache closing the connection prematurely.
If the load on the server is high, then your scripts may be timing out while trying to receive the file. Can you give any more specifics on your problem? PHP scripts have a 30-second execution timeout by default, meaning that if the script has not finished (i.e. receiving the upload) within that time frame, it will time out and the upload will fail.
It seems that if the site worked for over a year and traffic has now grown to the point where it strains the server, scripts may well be timing out under the increased load.
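If timeouts under load turn out to be the cause, the relevant php.ini directives can be raised; the values below are only examples. Note that max_input_time, not max_execution_time, is what covers the time spent receiving the POST data, and post_max_size must be larger than upload_max_filesize or large uploads will fail with an empty $_FILES array.

max_execution_time = 300
max_input_time = 600
upload_max_filesize = 25M
post_max_size = 30M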