PHP Uploading Multiple files always gives me timeout error - php

Ok, I don't quite understand why, but no matter what I seem to set in PHP's ini_set function, I am unable to upload multiple files or the files that I'm uploading are too big... I don't understand it. Here's what I am setting so far within the script that handles the $_FILES[] array that is being uploaded when posting the form:
// Try and allow for 3 hours, just in case...
set_time_limit(10800);
// Try and set some ini settings to buy us some time...
ini_set('memory_limit', '512M');
ini_set('upload_max_filesize', '512M');
ini_set('post_max_size', '550M');
ini_set('session.gc_maxlifetime', '10800'); // Allows for up to 3 hours
ini_set('max_input_time', '10800'); // Allows for up to 3 hours
ini_set('max_execution_time', '10800'); // Allows for up to 3 hours
I am using move_uploaded_file to place it into the specified directory, not sure if that matters or not?
Honestly, how can I allow multiple files to be uploaded without a TIMED OUT ERROR?? If I upload 1 file, it seems fine. I'm not sure if it's a quantity problem with multiple files or if the combined filesize is just too big. I've tried with around 15 MB total filesize, and even that produces a TIMED OUT ERROR! arggggg!!!!
What must I do to make this work??

If you are using an Apache server, the server itself has a time limit too.
Look at the TimeOut directive in the Apache configuration (the default is 300 seconds).
You can look at the Apache error log too.
If you upload only 2 files of 1 KB each, does it work fine?
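Before digging into Apache, it can also help to rule out the PHP side by dumping the limits PHP is actually running with for the request. Note that upload_max_filesize and post_max_size are per-directory (PHP_INI_PERDIR) settings, so ini_set() calls for them are silently ignored; they have to be set in php.ini, .htaccess, or the server config. A minimal diagnostic sketch:
// Print the limits PHP is actually enforcing for this request.
// upload_max_filesize and post_max_size are PHP_INI_PERDIR and cannot
// be changed with ini_set() at runtime.
foreach (array('upload_max_filesize', 'post_max_size', 'max_execution_time',
               'max_input_time', 'memory_limit', 'max_file_uploads') as $key) {
    echo $key, ' = ', ini_get($key), PHP_EOL;
}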

For uploading large and/or multiple files reliably you should use a dedicated file uploader.
I prefer http://www.uploadify.com/ for this.

Related

Website Uploading - Error "NaN" uploaded

I've been trying to set up a file hosting script. I've tried multiple scripts (7), and no matter which script I've used, I always seem to get the same error. I've tried it on multiple devices and on different HTTP(S) protocols...
(I've searched for a few days for a way to resolve this, but without finding a working solution...)
Uploading File: (Computer upload)
"Speed: NaN Bps. Remaining: NaN seconds | Progress: NaN% (NaN B / 2.45 KB)"
However, if I use something such as "Remote Upload" everything seems to work completely fine, any advice?
I've changed the php.ini file to set the max upload file size to 2048 MB (just to test it).
I uploaded a simple gif, and it didn't seem to want to upload...
(Also the upload script was working completely fine before, then one day it simply stopped working...)
Apache2 Website Conf: here
(Hiding domain using example.net)
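(A side note on that php.ini change: the effective upload ceiling is governed by a pair of directives that have to agree. A minimal sketch with example values only - post_max_size covers the whole request body, so it should be at least as large as upload_max_filesize:)
upload_max_filesize = 2048M
post_max_size = 2100M    ; whole POST body, so keep this >= upload_max_filesize
max_file_uploads = 20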

Possible to prevent server from timing out when uploading bulk images?

We have a site on WordPress and use WooCommerce for our commerce site. In short, we have a front-end form where logged-in users can upload multiple images at once. The problem is that these images are typically straight off a digital camera and we need to keep them at the highest resolution possible.
When users try uploading even 50+ images it takes FOREVER and sometimes doesn't even complete; it returns a 504 error.
We talked to the host and have done all we can with them to decrease server timeouts, and they suggested making edits to the attached script. I also went into the wp-config.php file and set the max upload size to something like 256M.
This problem is still happening and I was just wondering if anyone had any recommendations on how to prevent server timeouts or speed up image uploads without totally reworking the code?
The attached code is here: http://pastebin.com/AHTDNaDL
Just to save some time while browsing that file lines 3 -175 handle the product creation for each image uploaded; line 253 - 340 is the upload form and line 447 starts the binding functions.
I have been at this for days and have googled everything from plugins to ajax uploaders, but I'm still not having much luck, so I'm thinking some outside input would help.
Edit:
Since it doesn't look like I will be able to configure my server to what I need, is it possible to break the upload/product creation up into a few different steps? In other words, the user would be able to upload all their images and then, in the background, I could run my create_var_product function to hopefully prevent timeout issues?
You need to edit your php.ini file. Look at editing at least the following:
max_input_time
upload_max_filesize
Check the documentation:
http://us3.php.net/ini.core
Try adding the following lines to the top of the script:
ini_set('max_execution_time', 10000);
set_time_limit(0);
ini_set('memory_limit', '-1');
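Regarding the edit in the question: one way to split the work is to let the form handle only the file uploads and queue the product creation for a background WP-Cron run. A minimal sketch under a couple of assumptions - the hook name is made up for illustration, the upload field is called 'product_image' here, and create_var_product() (from the linked pastebin) is assumed to accept an attachment ID:
// In the upload handler: store the file as an attachment, then queue the heavy work.
// media_handle_upload() is a core WP helper (needs the admin file/media includes on the front end).
$attachment_id = media_handle_upload('product_image', 0);

if (!is_wp_error($attachment_id)) {
    // Schedule a one-off background event roughly 10 seconds from now.
    wp_schedule_single_event(time() + 10, 'my_create_var_product_event', array($attachment_id));
}

// Elsewhere (e.g. functions.php): hook the background worker to that event.
add_action('my_create_var_product_event', 'my_create_var_product_worker');
function my_create_var_product_worker($attachment_id) {
    // create_var_product() is the function from the pastebin - its signature is assumed here.
    create_var_product($attachment_id);
}
The upload request then returns as soon as the files are stored, and the slow product creation runs on the next WP-Cron tick instead of inside the user's request.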

big size multi image upload

I built a website where I upload 10 large (10 MB) images. When the upload starts, it runs for some time and then a blank page appears. I tried to change php_value settings in the .htaccess file, because I don't have permission to change the settings in the php.ini file (it's a shared server). I have some doubts about this:
1) What happens while the files are being posted in the request? I want the files to upload faster.
2) Is the time spent posting the request or processing the upload? I am cropping the images in a loop using PHP GD functions.
It is because of the limits your web hosting provider set. Which values did you try to change in the .htaccess?
You could try using some flash uploader, it should work despite the limits imposed by the server. A good one is SWFUpload.
That is because of the execution time of the script. You can edit your php.ini file. If that is not permitted, you can set max_execution_time for the script using your .htaccess file.
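For reference, the .htaccess overrides being discussed usually look roughly like this (values are examples only, and this works only when PHP runs as mod_php and the host allows these directives - under CGI/FastCGI they are ignored):
php_value upload_max_filesize 64M
php_value post_max_size 70M
php_value max_execution_time 300
php_value max_input_time 300
php_value memory_limit 256M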

PHP Aborting when creating large .zip file

My PHP script, running on CentOS 5.6 and PHP 5.2.12 and using ZipArchive(), successfully creates .zip files over 1.6 GB, but not archives of 2 GB or larger - PHP aborts with no apparent error. Nothing in the PHP error log or stderr. The script is being executed at the command line, not interactively.
The script runs for about 8 minutes and the temp archive grows; while checking the filesize, the last listing showed the tmp file at 2120011776 bytes, then the tmp file disappears and the PHP script falls through the logic and executes the code after the archive creation.
For some reason top shows the CPU still at 95% and a new tmp archive file being created - it does this for say another 5+ minutes, then silently stops and leaves the uncompleted tmp archive file. In this test there were fewer than 4000 expected files.
The script as noted works just fine creating smaller archive files.
Tested on several different sets of large source data - same result for large files.
This issue sounds similar to this question:
Size limit on PHP's zipArchive class?
I thought maybe the ls -l command was returning a count of 2K blocks and thus 2120011776 would be close to 4GB but that size is in bytes - the size of the xxxx.zip.tmpxx file.
Thanks!
It could be many things. I'm assuming that you have enough free disk space to handle the process. As others have mentioned, there could be some problems fixed either by editing your php.ini file or using the ini_set() function in the code itself.
How much memory does your machine have? If it exhausts your actual memory, then it makes sense that it would abort regularly after a certain size. So, check the free memory usage before the script and monitor it as the script executes.
A third option could be based on the file system itself. I don't have much experience with CentOS, but some file systems do not allow files over 2 gb. Although, from the product page, it seems like most systems on CentOS can handle it.
A fourth option, which seems the most promising: if you look at the product page linked above, another possible culprit is the "Maximum x86 per-process virtual address space," which is approximately 3 GB. x86_64 is about 2 TB, so check the type of processor.
Again, it seems like the fourth option is the culprit.
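To help narrow down the memory and disk possibilities, it may be worth logging memory and free disk space as the archive is being built. A minimal diagnostic sketch, with placeholder paths and file list (not taken from the original script):
// Log memory and disk usage while building a large archive.
$zipPath = '/tmp/archive.zip';             // placeholder destination
$files   = glob('/data/source/*');         // placeholder list of files to add

$zip = new ZipArchive();
if ($zip->open($zipPath, ZipArchive::CREATE) !== true) {
    die("Cannot open $zipPath\n");
}

foreach ($files as $i => $file) {
    $zip->addFile($file, basename($file));

    if ($i % 500 === 0) {
        // Peak memory in bytes and free space on the target filesystem.
        printf("added %d files, peak mem %d MB, free disk %d MB\n",
            $i,
            memory_get_peak_usage(true) / 1048576,
            disk_free_space(dirname($zipPath)) / 1048576
        );
    }
}

// The .tmp file is only turned into the final archive on a successful close().
if (!$zip->close()) {
    echo "close() failed - archive was not finalized\n";
}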
Have you set the limit variables in PHP?
You can do it in .htaccess or within the PHP script.
Inside the script: set_time_limit(0);
Inside the .htaccess: php_value memory_limit 214572800
When your file size is big it will take time to create the ZIP archive, but PHP (php.ini) has a maximum execution time, so you should try increasing that value.
There is a max_execution_time setting in php.ini - perhaps that limit is being hit!
Try increasing the value!
The OS also has its own file size limits, so check that too!

PHP file uploads being "hijacked" by partial uploads

I have a site that is receiving 30-40k photo uploads a day and I've been seeing an issue pop up with more frequency now. This issue is this:
Our upload script receives (via $_FILES['name']['tmp_name']) a file (photo) that was NOT uploaded by the user, and the majority of the time the file received is a "partial" upload.
Of course at first I thought it was my PHP code making a simple mistake, and I've spent days looking over it to make sure, but after placing checks in the code I've found that the file received via an HTTP POST upload to PHP is actually the wrong file. So the issue is happening before it reaches my code. The tmp file (phpxxxx) received by the script is sometimes incorrect, as if it were somehow being overwritten by another process, and it's usually overwritten by a file that was partially uploaded.
Has anyone ever seen an issue like this? Any help is greatly appreciated. I'm turning to this as a last resort after days of searching/asking other PHP devs.
So to recap:
User uploads a photo
PHP script receives a file that was not uploaded by the user (pre code, via $_FILES in /var/tmp)
Usually the incorrect file received is a partial upload or a broken upload
It seems to happen randomly and not all the time
First off, check your PHP version.
Second, check your file upload limits and post_max_size in php.ini.
It might just be that someone tries to upload a file that's too large :-)
Can you try different names for the temp file to avoid its being overwritten? Can you identify the origin of the new, incorrect and incomplete file?
Is this a development environment? Is it possible that more than one user is uploading files at the same time?
Try your program with very small images to check if SchizoDuckie is correct about filesize problems.
Try with different browsers to eliminate the admittedly remote possibility that it is a local problem.
Check permissions on the directory where the temp file is stored.
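Along the same lines, it may help to log exactly what PHP hands over for each upload before anything else touches it. A minimal sketch of that kind of check - the field name 'photo' and the paths are placeholders:
// Log what PHP actually received for this request before any processing.
$f = isset($_FILES['photo']) ? $_FILES['photo'] : null;   // 'photo' is a placeholder field name

if ($f === null || $f['error'] !== UPLOAD_ERR_OK) {
    error_log('upload error code: ' . ($f ? $f['error'] : 'no file field'));
    exit;
}

// Make sure the temp file really came from this POST and the sizes agree.
if (!is_uploaded_file($f['tmp_name']) || filesize($f['tmp_name']) !== (int) $f['size']) {
    error_log(sprintf('mismatch: tmp=%s reported=%d actual=%d',
        $f['tmp_name'], $f['size'], filesize($f['tmp_name'])));
    exit;
}

// Move it under a name that cannot collide with another request.
$dest = '/var/uploads/' . uniqid('photo_', true) . '.jpg'; // placeholder destination
if (!move_uploaded_file($f['tmp_name'], $dest)) {
    error_log('move_uploaded_file failed for ' . $f['tmp_name']);
}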
PHP's built-in file handling does not support partial uploads.
Turn off KeepAlives and/or send a 'Connection: close' header after each upload.
Configure your webserver to send the header 'Accept-Ranges: none'.
