We run a WordPress site and use WooCommerce for our shop. In short, we have a front-end form that lets logged-in users upload multiple images at once. The problem is that these images typically come straight off a digital camera, and we need to keep them at the highest possible resolution.
When users try uploading 50+ images it takes forever, and sometimes the upload doesn't even complete and returns a 504 error.
We talked to the host and have done all we can with them to stop the server timeouts, and they suggested making edits to the attached script. I also went into the wp-config.php file and set the max upload size to something like 256M.
The problem is still happening, so I was wondering whether anyone has recommendations on how to prevent the server timeouts or speed up the image uploads without totally reworking the code.
The attached code is here: http://pastebin.com/AHTDNaDL
Just to save some time while browsing that file: lines 3-175 handle the product creation for each uploaded image, lines 253-340 are the upload form, and line 447 starts the binding functions.
I have been at this for days and have googled everything from plugins to AJAX uploaders, but I'm still not having much luck, so I'm hoping some outside input will help.
Edit:
Since it doesn't look like I will be able to configure my server the way I need, is it possible to break the upload/product creation up into a few different stages? In other words, the user would upload all their images, and then in the background I could run my create_var_product function, hopefully preventing the timeout issues.
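Following up on that edit: yes, that split is a common pattern. Let the request do nothing but store the files, then hand product creation to WP-Cron. A minimal sketch, assuming the upload handler can pass along the new attachment IDs (the hook name and wrapper function are placeholders; create_var_product is the function from the pastebin):

// 1) In the upload handler: only store the files, then queue the heavy work.
function queue_product_creation( array $attachment_ids ) {
    // WP-Cron runs the hook on a later request, so this response returns fast
    wp_schedule_single_event( time() + 10, 'my_create_products', array( $attachment_ids ) );
}

// 2) Background worker: walk the queued IDs and build a product for each.
add_action( 'my_create_products', function ( $attachment_ids ) {
    foreach ( (array) $attachment_ids as $id ) {
        create_var_product( $id ); // the poster's existing function
    }
}, 10, 1 );

Keep in mind that WP-Cron only fires when the site gets traffic; for anything critical, a real system cron hitting wp-cron.php (or the Action Scheduler library that ships with WooCommerce) is more dependable.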
You need to edit your php.ini file. Look at editing at least the following:
max_input_time
upload_max_filesize
Check the documentation:
http://us3.php.net/ini.core
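For illustration, the relevant php.ini lines might look like this (the values are placeholders, not recommendations; note that post_max_size must exceed upload_max_filesize, otherwise oversized requests arrive with empty $_POST and $_FILES):

; php.ini -- illustrative values
max_input_time      = 600     ; seconds PHP may spend just reading the upload
upload_max_filesize = 256M    ; cap per individual file
post_max_size       = 300M    ; cap for the whole request body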
Try adding the following lines to the top of the script:
ini_set('max_execution_time', 10000); // raise the per-run execution cap (seconds)
set_time_limit(0); // 0 = no time limit at all; this supersedes the line above
ini_set('memory_limit', '-1'); // -1 = unlimited memory; prefer a real cap in production
This is not a quick failure; I have spent a total of five full days trying to figure this out. Initially I was limited by file size and then by file type, so I removed the WordPress restrictions and am now "capable" of uploading my 177MB .glb file to WordPress.
However when doing so, I receive the following error:
retriever.glb
Unexpected response from the server. The file may have been uploaded successfully. Check in the Media Library or reload the page.
I was on the phone with GoDaddy specialists for 2.5 hours yesterday making sure this was not a server issue or a restriction on their side; they confirmed it was not. We pretty much ended the conversation agreeing that it is something I must figure out with me, myself, and I.
I went ahead and uploaded my .glb to the server through the hosting panel, and everything worked fine. In fact, it now lives here: https://www.tattiniboots.com/wp-content/uploads/2020/07/retriever.glb
However, this does not make the file discoverable through the Media Library to the 3D-viewer plugins I have installed on the site.
I truly don't know where to go from here
I changed the name of the file to .png and attempted an upload and received the following error:
Post-processing of the image failed likely because the server is busy or does not have enough resources. Uploading a smaller image may help. Suggested maximum size is 2500 pixels.
I just tried to upload a normal .mov file that is 150MB and received the following error, which really makes me think this has something to do with file size:
Unexpected response from the server. The file may have been uploaded successfully. Check in the Media Library or reload the page.
Yes, normal images are uploading just fine (2MB-ish)
I just tried deactivating all plugins, thinking that maybe "Smush" or another one was causing the problem; I then got an error that the file type is not supported (even with the allow-all-file-types code in my wp-config).
Is it simply the case that .glb is not allowed at all?
This must be a server thing, probably a runtime error, and whoever supported you just doesn't know it. GoGoDaddy. ;)
Nevertheless, you can use a plugin called Media Sync. Check it out, and best of luck.
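On the file-type error specifically: WordPress only accepts extensions it knows about, and a small snippet in functions.php can register .glb. A sketch (model/gltf-binary is the usual MIME type for binary glTF, but confirm what your 3D-viewer plugin expects):

// functions.php -- let WordPress uploads accept .glb files
add_filter( 'upload_mimes', function ( $mimes ) {
    $mimes['glb'] = 'model/gltf-binary'; // standard type for binary glTF
    return $mimes;
} );

Recent WordPress versions also sniff the actual file contents during upload, so on some hosts this filter alone may still not be enough.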
I am using the Storage:: facade to upload files in Laravel. It works; I can successfully upload, store, delete, and edit files on the server. But the issue I am having is the size of the files. Images upload easily, but if I try to upload a .zip of around 20MB, it drops. I can see the numbers at the bottom of the browser going all the way to 23, then back to 0, up to around 10%, and then this page comes up:
What is going on? I updated my php.ini to allow really large uploads... does Laravel have restrictions of its own? Does Nginx have its own set of limits? What am I missing?
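On the Nginx question: yes, Nginx enforces its own cap independently of php.ini. client_max_body_size defaults to 1m, and anything larger is rejected with a 413 before PHP ever sees the request. An illustrative line (100m is just an example; reload Nginx after changing it, and keep upload_max_filesize and post_max_size in php.ini at least as large):

# nginx.conf, in the http, server, or location block
client_max_body_size 100m;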
OK, I don't quite understand why, but no matter what I seem to set via PHP's ini_set function, I am unable to upload multiple files, or the files I'm uploading are too big... I don't understand it. Here's what I am setting so far within the script that handles the $_FILES[] array posted by the form:
// Try and allow for 3 hours, just in case...
#set_time_limit(10800);
// Try and set some ini settings to buy us some time...
#ini_set('memory_limit', '512M');
#ini_set('upload_max_filesize', '512M'); // PHP_INI_PERDIR: ini_set() cannot change this at runtime
#ini_set('post_max_size', '550M'); // likewise PERDIR: only php.ini/.htaccess can set it
#ini_set('session.gc_maxlifetime', '10800'); // Allows for up to 3 hours
#ini_set('max_input_time', '10800'); // Allows for up to 3 hours, but also PERDIR
#ini_set('max_execution_time', '10800'); // Allows for up to 3 hours
I am using move_uploaded_file to place each file into the specified directory; not sure if that matters or not.
Honestly, how can I allow multiple files to be uploaded without a TIMED OUT error?? If I upload 1 file, it seems fine. I'm not sure if it's a quantity problem with multiple files or if the combined files are just too big in filesize. I have tried with about 15MB of total filesize, and even that produces a TIMED OUT error! arggggg!!!!
What must I do to make this work??
If you are using an Apache server, the server itself has a time limit too.
Look at the TimeOut directive in the Apache configuration (the default is 300 seconds).
You can look at the Apache error log too.
If you upload only 2 files of 1KB each, does it work fine?
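For reference, raising the TimeOut mentioned above is a single directive (600 is just an example; reload Apache afterwards):

# httpd.conf or the relevant vhost -- how long Apache waits on
# network I/O before aborting the request
TimeOut 600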
For uploading large and/or multiple files reliably, you should use a dedicated file uploader.
I prefer http://www.uploadify.com/ for this.
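Before reaching for an uploader, it can also help to confirm which limits are actually in effect and which error code PHP reports per file; a small diagnostic sketch in plain PHP:

<?php
// Print the limits actually in effect -- ini_set() silently fails for
// per-directory settings such as upload_max_filesize
foreach (array('upload_max_filesize', 'post_max_size', 'max_execution_time', 'max_input_time') as $key) {
    echo $key . ' = ' . ini_get($key) . "\n";
}

// Per-file result codes from the current upload:
// 1 = UPLOAD_ERR_INI_SIZE (upload_max_filesize was hit),
// 3 = UPLOAD_ERR_PARTIAL (often a timeout mid-transfer)
foreach ($_FILES as $field => $file) {
    print_r($file['error']); // an array itself when the input is name="files[]"
}
?>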
I built a website where I upload 10 fairly big (10MB) images. When the upload starts, it runs for some time and then a blank page comes up. I tried changing php_value settings in the .htaccess file, because I don't have permission to change settings in php.ini (it's a shared server). I have some doubts regarding this:
1) What happens to the files while the POST request is in flight? I want the files to upload quickly.
2) Is the time going into posting the request or into processing the upload? While uploading, I am cropping the images in a loop using PHP GD functions.
It is because of the limits your web hosting provider set. Which values did you try to change in the .htaccess?
You could try using some flash uploader, it should work despite the limits imposed by the server. A good one is SWFUpload.
That is because of the execution time of the script. You can edit your php.ini file; if that is not permitted, you can set max_execution_time for the script in your .htaccess file.
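For illustration, the .htaccess variant might look like the lines below (these php_value directives are only honored when PHP runs as an Apache module; under CGI/FastCGI they instead trigger a 500 error, in which case a local php.ini is the usual route):

# .htaccess -- illustrative values
php_value max_execution_time 300
php_value max_input_time 300
php_value upload_max_filesize 20M
php_value post_max_size 64M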
I'm with a fairly mediocre low-cost (shared) host at the moment (as it's all I can afford just now) and want to implement a very basic file upload feature on a page. I want to allow files of up to 100MB to be uploaded to the server but my free host limits the PHP_MAX_FILESIZE to 32MB and the POST_FILESIZE to 64MB.
Is there any way of overcoming this without requiring the user to split the larger files into smaller chunks? Either a JavaScript or Flash-based solution which could perhaps somehow route parts of the file through different HTTP requests or some other way to get around the 32MB limit?
Or are there any calls I can attempt which might override the host's limits? I've already tried using .htaccess without success. EDIT: also tried ini_set. I'm thinking the only way is some kind of chunking/splitting/multi-streaming, or some other way around the inability to set higher PHP values.
Any suggestions are hugely appreciated. Thanks.
You can use Flash. Start with this: http://soenkerohde.com/2010/01/chunk-file-upload/
OR
use https://github.com/moxiecode/plupload
You might also possibly be able to use ini_set('upload_max_filesize', '100M');
But I have a sneaking suspicion that your host might not be happy with you trying to circumvent their limit...
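If you do go the plupload route, the server side just appends the chunks back together. A rough sketch (the chunk, chunks, and name fields follow plupload's documented chunking convention; the uploads/ directory and finalize step are illustrative):

<?php
// Hypothetical receiver for a chunking uploader such as plupload
$chunk  = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
$chunks = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;
$name   = basename(isset($_POST['name']) ? $_POST['name'] : $_FILES['file']['name']);

$out = fopen("uploads/$name.part", $chunk === 0 ? 'wb' : 'ab'); // first chunk truncates, later chunks append
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

if ($chunk === $chunks - 1) {
    rename("uploads/$name.part", "uploads/$name"); // last chunk: finalize
}
?>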
If your 'free host' already limits you, there is probably nothing you can do about it at the PHP level. Try reading:
http://www.php.net/manual/en/ini.core.php#ini.post-max-size
and
http://www.php.net/manual/en/ini.core.php#ini.upload-max-filesize
This page shows which settings can be changed at runtime (via ini_set) and which cannot:
http://www.php.net/manual/en/configuration.changes.modes.php
I suggest you just do multiple smaller file uploads. 100MB, right? Are you planning to host videos and movies? Try looking for a decent paid host rather than a free one :)
If your host allows Java applets, there is already a package for this on SourceForge. It lets you move files from the user's machine to your host in small packages: the applet handles the file-upload code on the user's machine, and on the server side you receive small chunks that you can stitch together later, so a file of any size can be uploaded.
Here is the link:
http://sourceforge.net/projects/juploader/
Some hosts allow using a local copy of php.ini (mine does, for example), so you could change the parameters at will.
Keep in mind that at 100MB per file your storage can fill up rapidly, especially if the host isn't top of the category, so be careful.
Since ini_set doesn't seem to be working, you could try setting it via the .htaccess file. I'm not sure about the exact syntax, but it involves php_value.
I wouldn't be surprised if this doesn't work either.
In my experience, choosing a good host based on their advertising is impossible. I know of no other way than to simply try out a bunch and hope you run across one that isn't super restrictive.
Upload limits are a common problem. If it's common for your customers to upload very large files, then it may be wise to look into another hosting plan. If it's rare, you could just have them send you the file so you can FTP it yourself. Sometimes the simplest solution is the best solution.
You could have your users upload to an outside site and then give you the URL. If you have enough space, you can work around how long the outside site keeps the file by downloading it to a directory on your own site.
It's certainly not the best option, but your users will get a bigger upload quota and probably faster upload speeds (shared servers and speed mix like oil and water).
Abstract:
1) Read the file before sending it (catch it in the form's onsubmit event); split it and send the chunks as textarea fields.
2) On the server side, recover the chunks and reassemble them into a single file.
Proposal:
Depending on the environment in which your script runs and where the file resides, your options include the following:
1) the XMLHttpRequest object (for reading files available via URLs on your website);
2) the FileSystemObject (if you use Windows Scripting Host or Internet Explorer in a trusted environment);
3) a "helper" Java applet that reads a file or URL for your script.
(Extract from http://www.javascripter.net/faq/reading2.htm)
If that works, remove the file input element from the form.
Then, split the string into many chunks.
// String.split() takes a separator, not a chunk length, so slice with a regex instead:
mySplitResult = myReadedDocument.match(/[\s\S]{1,1048576}/g); // pieces of ~1M each
That makes an array of pieces of your document:
Add the values into the form (remember to give all the new controls the same name, ending in []). Assuming the form id is 'myForm':
formName = document.getElementById('myForm');
mySplitResult.forEach(function(item) {
ta = document.createElement('textarea');
ta.appendChild(document.createTextNode(item))
ta.setAttribute("name", "chunk[]" );
formName.appendChild(ta);
});
In the server side, you can reconstruct the chunks and save as a file.
<?php
$chunks = $_POST['chunk'];
$fileContent = implode( '', $chunks );
file_put_contents( 'dirname/filename.foo', $fileContent );
?>
The crux is whether you can read the file on the client side at all.
Good luck!
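One caveat on the scheme above: raw binary pushed through textarea fields will be corrupted by form encoding, so in practice each chunk should be Base64-encoded in the browser (e.g. with btoa) and decoded on the server before reassembly. A sketch of the adjusted server half:

<?php
// Assumes the client Base64-encoded every chunk before submitting
$chunks = $_POST['chunk'];
$fileContent = '';
foreach ($chunks as $chunk) {
    $fileContent .= base64_decode($chunk);
}
file_put_contents('dirname/filename.foo', $fileContent);
?>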