For a client I have built a simple script that uploads multiple files (images), resizes them, stores them in a temporary folder, and then later moves them to their destination.
Resizing is done using PHP's GD, as Imagick is not available.
These images are about 2-4 MB apiece, and the client uploads about 30 images in one shot.
I used HTML5's multiple attribute, which works fine.
In my tests everything worked fine because I used the standard Windows wallpaper images.
I can't find the source of the problem.
When uploading more than one image, the script fails; debugging tells me it does upload the second image but won't resize it.
I checked the memory usage for the images, which is approximately 105724352 bytes (about 100 MB) each.
My php.ini settings:
max_execution_time = 300
max_input_time = 600
memory_limit = 200M
So you see, at the second image the memory reached its limit, making my script stop. Is that correct?
If so, how wise is it to raise the memory limit?
Thanks in advance!
EDIT:
It now seems the GD function imagecreatefromjpeg can't handle files with a resolution wider than 3500px; my files are more than 5000px wide.
Does anyone have a workaround for this?
At this point I am wondering whether it is wise to keep the client on a shared host at all if he needs so much memory for these images.
So you see, at the second image the memory reached its limit, making my script stop. Is that correct?
Check your Apache error log (on a *nix system, /var/log/apache2/error.log) to see if that is really the problem.
If so, how wise is it to raise the memory limit?
You should not handle multiple image operations in one script. Make an AJAX request for each image and handle them in separate instances.
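Purely as an illustration of that approach (the resize.php endpoint, the field name, the folder, and the 800px target width are assumptions, not from the question), a per-image handler might look roughly like this; freeing each GD resource with imagedestroy() keeps the peak memory of one request down to roughly one decoded image:

<?php
// resize.php - hypothetical endpoint, called once per image by an AJAX request
$tmpDir = '/path/to/tmp/';                               // assumed temporary folder
$src    = basename(isset($_POST['file']) ? $_POST['file'] : '');

$image = imagecreatefromjpeg($tmpDir . $src);            // decodes the full image into memory
if ($image === false) {
    exit('Could not decode ' . $src);
}

$width     = imagesx($image);
$height    = imagesy($image);
$newWidth  = 800;                                        // example target width
$newHeight = (int) round($height * $newWidth / $width);

$resized = imagecreatetruecolor($newWidth, $newHeight);
imagecopyresampled($resized, $image, 0, 0, 0, 0, $newWidth, $newHeight, $width, $height);
imagejpeg($resized, $tmpDir . 'thumb_' . $src, 85);

// Free both GD resources immediately so each request starts from a clean slate
imagedestroy($image);
imagedestroy($resized);
echo 'ok';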
Related
I need to upload a large file through a web page in PHP. As far as I can tell, there are three variables related to uploads:
post_max_size=100MB
upload_max_filesize=100MB
memory_limit
If I want to upload files under 100 MB, is it right to just set both sizes to 100 MB? How much should I give memory_limit for a 100 MB file? Is there any other issue that may prevent the file from finishing its upload?
The memory limit will be difficult because it depends on how you wish to process the file. If you're just going to write it to the disk without much processing, 100MB should be fine. If you're going to do any extensive processing on it, you may very well need to increase it higher depending on how your algorithm is handling memory usage.
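As a rough sketch of the "just write it to disk" case (the userfile field name and the target directory are assumptions): PHP has already streamed the upload to a temporary file, so moving it with move_uploaded_file() never loads the whole file into the script's memory:

<?php
// upload.php - store the uploaded file without reading it into memory
$target = '/var/www/uploads/' . basename($_FILES['userfile']['name']);   // assumed paths

if (is_uploaded_file($_FILES['userfile']['tmp_name'])
        && move_uploaded_file($_FILES['userfile']['tmp_name'], $target)) {
    echo 'Upload stored at ' . $target;
} else {
    echo 'Upload failed, error code ' . $_FILES['userfile']['error'];
}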
I have an issue where my client wants to be able to upload more than two large files at a time, and I was wondering if any of you have a solution to this; I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB it works fine, and they can open a new tab and upload another large file, but when they open a third tab of that form and try to upload, I get an I/O error from the form. So I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP) I can upload there as well, so it seems like a limit on concurrent uploads per client/IP.
I think increasing the memory assigned to a PHP script can help you. I don't think this is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but as far as I know it is worth trying to increase the memory:
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to check whether the problem is caused by the memory limit.
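If it helps, here is a small diagnostic sketch (illustrative only; the file path is an assumption) that lifts the limit and then reports how much memory decoding one image actually took, so you can pick a realistic memory_limit afterwards:

<?php
// Diagnostic only: remove the limit, decode one problem image, report peak memory
ini_set('memory_limit', '-1');

$image = imagecreatefromjpeg('/path/to/one/big/upload.jpg');   // assumed path

echo 'Peak memory used: ' . round(memory_get_peak_usage(true) / 1048576) . " MB\n";

imagedestroy($image);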
I would like to allow users to upload photos of any size to the server quickly. After that, my program will resize the original image to a thumbnail and a scaled version, probably with a max width of 1020px. Because of bandwidth issues (I'm on a shared server currently), I need to find a way to avoid the page loading for too long or hitting the max upload time limit.
I understand I can do these:
1. Extend the max upload time
2. Set a max file upload size (which I'm trying not to do)
Please advise =)
There is no secret. The upload time depends on the user's bandwidth. If they have little bandwidth the upload will take time, and they may hit your server's limit.
There is no optimisation for that on your side. Moreover, a shared host has a lot of bandwidth available (several Gb), so it's probably impossible for your user to saturate it, especially on upload.
The same goes for the memory limit: if you have an 8 MB memory limit, trying to work on an 18 MP photo will exceed it.
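As a rough illustration of that point (the 1.65 overhead factor is a common rule of thumb, not an exact figure, and the path is an assumption), you can estimate before decoding whether GD will fit inside the current limit:

<?php
// Heuristic estimate of the memory GD needs to decode an image
$info     = getimagesize('/path/to/photo.jpg');        // assumed path
$bits     = isset($info['bits']) ? $info['bits'] : 8;
$channels = isset($info['channels']) ? $info['channels'] : 3;

$needed = ($info[0] * $info[1] * $bits * $channels / 8 + 65536) * 1.65;
$limit  = 8 * 1024 * 1024;                             // the 8 MB limit from the example

if ($needed > $limit) {
    echo 'Decoding would need roughly ' . round($needed / 1048576) . " MB, over the limit.\n";
}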
Nevertheless, you can separate the two actions (sketched after this list):
1- Upload the photo
2- Redirect with header() when upload is done
3- Resize the image, or put it in a queue for later processing
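A minimal sketch of that flow, purely for illustration (the photo field name, the queue directory, and done.php are assumptions):

<?php
// upload.php - steps 1 and 2: store the file, queue it, redirect immediately
$name   = basename($_FILES['photo']['name']);          // assumed field name
$stored = '/var/www/uploads/' . $name;

move_uploaded_file($_FILES['photo']['tmp_name'], $stored);

// Step 3 happens later: a cron job or worker picks jobs up from this directory
file_put_contents('/var/www/queue/' . $name . '.job', $stored);

header('Location: done.php');   // the browser is released as soon as the upload is stored
exit;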
I have a LAMP setup running PHP 5.2.6-1 with the Suhosin Patch (0.9.6.2) and Zend (2.2.0), with APC enabled for use with a file upload script that uses an AJAX call to get the status and generate a progress bar.
Everything appears to be working: the file uploads perfectly and is displayed correctly on the website or if you download it, but it never gets marked as "complete" by APC, nor does the reported file size reach the actual size (in the APC call; the uploaded file itself is just fine).
What could be the reason APC never sees the file as completely uploaded, and how can I solve this? I'm currently using a rather hack'n'slash workaround: since the reported size always reaches at least 90%, my AJAX call checks the size, and if it's at 90% and stays there for 3 updates, it waits 5 more seconds and then assumes the upload is complete (not ideal if it's a large file that really isn't done yet).
Try setting apc.rfc1867_freq=0; this should make APC update the size all the way, whereas before it may have been updating it in 10k increments and stopping near the end.
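In php.ini that would look roughly like this (apc.rfc1867 must be on for upload tracking; the values are illustrative):

apc.rfc1867 = 1
apc.rfc1867_freq = 0

And a sketch of the status script the AJAX call could poll (the key comes from the form's APC_UPLOAD_PROGRESS hidden field, and upload_ is APC's default prefix; treat the exact names as assumptions for your setup):

<?php
// progress.php - polled by the AJAX call during the upload
$status = apc_fetch('upload_' . $_GET['key']);   // array with 'current', 'total', 'done'
echo json_encode($status);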
Check upload_max_filesize. If you are trying to upload a file that is bigger than upload_max_filesize, then you will have this problem. Increase upload_max_filesize to fix it.
I have a site that enables users to upload images which are then re-sized into 4 different sizes.
I'm moving to a new host and I wondered what makes a good spec for handling this task, or whether any server spec should be able to handle it. Should I look at more RAM, a better CPU, etc.?
Images are currently restricted to 2 MB, but I'd like to increase that.
Is there anything to choose between these (for this task)?
Option 1.
* Processor: Pentium 4 3GHZ Hyperthreaded
* Memory: 2GB DDR SDRAM
* Hd1: 120GB 7200RPM SATA / 8MB Cache
* Hd2: 120GB 7200RPM SATA / 8MB Cache
* OS: Linux - CentOS 5 (+32 Bit)
Option 2.
* Processors: Dual Core Intel Core 2 Duo 2.2GHz
* Memory: 1GB RAM
* Hard Disk: 1x 160GB 7,200rpm
* OS: Linux - CentOS 5.2
Edit:
I'm using http://pear.php.net/package/Image_Transform with GD2.
Volume is very low, but certain JPG files fail even when they are < 2 MB.
Current hosting is a VPS with 768 MB dedicated RAM (need to find out about the processor).
You don't say how many you are doing per time period, what you are using (GD? ImageMagick? Something else), or the spec and performance of your current server.
However, unless you are doing a lot, both of those servers should be way more than fine.
Definitely stick with a VPS (vs. shared hosting) because working with images in PHP is all about tuning your php.ini file.
There are a ton of reasons why a PHP script would fail to process an upload (an example configuration follows the list):
Upload is too big. The upload size is controlled by several directives: post_max_size, upload_max_filesize, memory_limit. If all of the above directives are not configured properly, the defaults will cap you at around 2 MB.
Ran out of memory working with the image. The memory_limit directive affects this. Also make sure your code is releasing resources as soon as possible instead of waiting for script termination.
Operations took too long. max_input_time and max_execution_time control how long the script gets to execute (max_input_time controls HTTP I/O, max_execution_time controls actual script execution). Bigger images take longer to process.
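For illustration, a set of directives consistent with the points above might look like this; the actual values are assumptions to adapt to your own upload sizes, not fixed recommendations:

; php.ini - example values only
post_max_size = 64M          ; must be at least upload_max_filesize
upload_max_filesize = 32M
memory_limit = 256M          ; headroom for GD to decode large images
max_input_time = 300         ; time allowed to receive the upload
max_execution_time = 120     ; time allowed for the resize work itself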
Figure out which conditions are failing, and then scale your server up to resolve those conditions. If you are switching hosts based on performance issues, you might want to do this first. You might find that the switch is unneeded.
If you're just doing development/testing, and maybe a soft launch, option 1 is fine. If you expect to go live you're going to need to keep tabs on your server load and how many processes you are spawning, as well as how long your actual resize time is for images.
If you expect to handle serious volume in the near future, you'll definitely want the dual-core system. Resizing images is very intensive. Further down the road, you may need additional machines just to handle image processing and one to handle the site.