Request Entity Too Large with small files - php

I know there are many questions about Request Entity Too Large on the internet, but I could not find the right answer for my problem ;)
I'm using an HTML file input tag to let users upload their images.
<input type='file' class='upload-pic' accept='image/*' id='Fuploader' name='pro-pic'><br>
There is nothing wrong with files smaller than 2 MB, which is the limit allowed by my site.
But if someone decides to upload a larger file, say 5.10 MB, I can handle it in PHP and warn the user that this file is too large:
if ($_FILES['pro-pic']['size'] > 2000000) {
    die("TOO LARGE");
}
But my problem is that when a 5.10 MB file is uploaded, a Request Entity Too Large error is raised and the rest of my PHP code won't run.
I have checked post_max_size and upload_max_filesize; they are both set to 8 MB,
but I get the error on a 5.10 MB file!
I also need a way to handle files even larger than 8 MB, because there is no way to guess what a user may try to upload ;) and I don't want them to get a dirty, broken page because of the REQUEST ENTITY TOO LARGE error.
Is there any way to fully disable this error, or set upload_max_filesize and post_max_size to unlimited?

You need to set SecRequestBodyAccess Off.
Check the link below; it should help you:
https://serverfault.com/questions/402630/http-error-413-request-entity-too-large
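For reference, here is a rough sketch of the relevant ModSecurity directives (this assumes mod_security is the module returning the 413; the limit values are only examples):
# Either disable request body inspection entirely (the blunt option)...
SecRequestBodyAccess Off
# ...or keep inspection and raise the body limits instead (values are in bytes)
SecRequestBodyLimit 20971520
SecRequestBodyNoFilesLimit 1048576
Remember to reload Apache after changing the configuration.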

Related

Trying to upload a video file

I use Laravel 5.4 and I am trying to upload a video file. Image files upload successfully:
if ($video = Request::file('video_file')) {
    $fullName = 'videos/'.uniqid().time().'.'.$video->getClientOriginalExtension();
    Storage::disk()->put($fullName, $video);
}
But it doesn't work. When I try to get information about the file, its size is 0.
What am I doing wrong?
There’s a limit on the amount of data you can send in a POST request. If you exceed that limit, PHP will return zero as the size of the file.
Instead, you’ll need to upload the file in chunks. If you’re using something like Amazon Web Services, they have a JavaScript SDK that will handle multi-part uploads for you: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property
Check out http://php.net/manual/en/ini.core.php#ini.post-max-size
PHP has a default post_max_size of 8 MB I believe; anything larger than that will not go through. There is also an upload_max_filesize setting that you may want to check out. I'm not sure if increasing the limit is the correct answer here, as I'm not sure what complications that would bring. When we upload large files we use BlueImp's file uploader: https://github.com/blueimp/jQuery-File-Upload
It's pretty straightforward and automatically chunks the uploads for you.

Delay after uploading large files with Jquery BlueImp uploader

I'm able to successfully upload large files (tested up to 8GB) via the BlueImp Jquery Uploader plugin.
My PHP settings are:
upload_max_filesize 8192M
post_max_size 8192M
max_execution_time 200
max_input_time 200
memory_limit 8192M
The problem I'm having is that when large files (2gb or larger) finish uploading, the progress bar hangs on 100% and takes considerable time to "finish".
I'm not sure what post processing is happening, other than piecing the file chunks together.
I've tried adjusting the chunk size in the UploadHandler.php file, and it seems to improve things slightly when increased (e.g. from 10 MB to 500 MB chunks), but the delay is still there. There doesn't seem to be any improvement when disabling chunked uploads altogether (0), and I'm not sure of the potential ramifications of doing that, either.
On a 2gb file, the delay is around 20 seconds, but on a 4gb file, it's around 2 minutes. A 7gb file takes around 3-4 minutes, and sometimes times out. This leaves the user waiting, unaware of what is happening as the progress bar has finished at 100% by this point.
Does anyone have any insight as to what this might be, or how to go about troubleshooting it? I suspected that the file in /tmp might be being copied rather than moved, but there's no sign of this in the php file as far as I can tell.
Boosting the CPU and RAM on my server VM improves things a little, though running the "top" command during this process reveals that CPU and RAM don't appear to be extensively exhausted (0-2%, compared to 90-100% during actual upload).
Many thanks in advance.
I don't know very much about it; I searched on Google and found some information that might help you.
Normally the “onError” event is fired when an exception occurs on the server or the client. In the situation described, no exception is thrown unless the server itself times out.
One possibility is to monitor the upload status and set a JavaScript timeout or counter in the uploading event that cancels the upload after a time limit is reached.
All upload statuses and error codes are shown here: http://help.infragistics.com/NetAdvantage/jQuery/2013.1/CLR4.0?page=igUpload_Using_Client_Side_Events.html
You can monitor these with the “fileStatus” argument in the uploading event.
Links which may help you:
Link1
Link2
Link3
What this might be:
jQuery File Upload registers the request progress event. This fires while the browser is sending data to the server, not when the server confirms reception, and there is a delay between the browser finishing sending and the server confirming reception. See "Can onprogress functionality be added to jQuery.ajax() by using xhrFields?" for the two events. However, I could not resolve this myself and used a work-around, so I cannot fully confirm this assumption.
How to go about troubleshooting it:
Use your browser's debugging tools and set a breakpoint around here in the code to get an understanding of the variables available:
https://github.com/blueimp/jQuery-File-Upload/blob/master/js/jquery.fileupload.js#L372
Once you are sure what you want to see, it is probably best to print information with console.log and check your browser's console output.
Work-arounds:
If you only want to see whether all uploads have finished and display to the user that something is still going on, you may want to check out the stop / done events, or fileupload('active'); see here: Finish of Multiple file upload
After much troubleshooting and debugging of this issue, I’ve found what I believe to be the cause, and a better workaround/solution. It’s a bit “hacky” (I’m an amateur dev!), so I’m open to any suggestions to improve this, though it does seem to work for me.
The files uploaded from the client are actually uploaded to this temp location on the server:
/tmp/systemd-private-random-number-httpd.service-random-number/tmp/sess_php-session-ID
Once the uploads have completed on the client side, the UI progress bar reaches 100% and remains so whilst the server is processing the files.
The processing on the server side involves moving (copying, then deleting) the file from the above /tmp location to the relevant blueimp location. (Assuming “user_dirs” is enabled/true in the blueimp options, the default destination is: /var/www/html/server/php/files/php-session-id/ ).
This copying process can take a significant amount of time, particularly on files larger than 2gb or thereabouts. Once the server processing has completed, blueimp triggers the “fileuploaddone” callback, and the UI updates to a completed state.
My aim was to provide some interactive UI feedback at this point (rather than hanging at 90% like the other workaround). My environment is capable of very large uploads (10gb+), so I didn’t find it acceptable not to provide any user feedback for what could be several minutes of file processing (with the user thinking the site has crashed, and closing the browser etc.).
My Workaround:
The first problem I encountered was that there doesn’t appear to be a blueimp callback for the point when the file uploads have completed on the client side, and the server processing begins. So I worked around this by creating a function (in main.js) to display my custom “processing div” after data.loaded matches data.total:
$('#fileupload').bind('fileuploadprogressall', function (e, data) {
    console.log(data);
    if (data.loaded == data.total) {
        setTimeout(function () {
            $("#processwarn").fadeTo(300, 1);
        }, 3000);
    }
});
My custom div that sits below the progress bar info is another PHP page that's refreshed (with Ajax .load()) every couple of seconds and calculates the total file size of all the files in the current session upload folder (the directory total size didn't seem to provide accurate results):
// Calculate size of files being processed
$filePath = "/var/www/html/server/php/files/*php-session-id*/";
$total = 0;
$d = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($filePath),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($d as $file) {
    $total += $file->getSize();
}
// Convert to readable format
if ($total >= 1073741824) {
    $total = number_format($total / 1073741824, 2) . ' GB';
} elseif ($total >= 1048576) {
    $total = number_format($total / 1048576, 2) . ' MB';
} elseif ($total >= 1024) {
    $total = number_format($total / 1024, 2) . ' KB';
} elseif ($total > 1) {
    $total = $total . ' bytes';
} elseif ($total == 1) {
    $total = $total . ' byte';
} else {
    $total = '0 bytes';
}
// Display spinner gif and size of file currently being processed
echo "<img src=\"img/spinner.gif\" height=\"20px\" width=\"20px\"> Please wait, processing files: $total";
The result looks something like this:
Edit: (after some more work) - With bootstrap progress bar added instead:
Additional point noticed from testing:
With chunking enabled (for example set to 1 GB chunks, maxChunkSize: 1000000000), the problem almost disappears, as the processing time appears drastically reduced to the user, because the "copying" processing occurs at each chunk boundary (every 1 GB in this example).
When the final processing then occurs, the server only has to "rechunk"/copy the last remaining 1 GB over.
I also experienced quicker overall upload times due to this.
In hindsight, this may well be an easier/more effective solution for many.

TYPO3: Data is not posted when a file larger than 1.5 KB is selected for upload

I need to upload files from the frontend in my plugin. I got it working, but now I have an issue when uploading files larger than 1.5 KB. Whenever I select a file larger than 1.5 KB, I get this error:
1298012500: Required argument "newRockupload" is not set for Rock\RockUpload\Controller\RockuploadController->create.
So I have put this code in the controller's initializeCreateAction() to debug:
$arguments = $this->request->getArguments();
DebuggerUtility::var_dump($arguments);
exit;
Whenever I select a file smaller than 1.5 KB, I get the posted data successfully in the controller.
And whenever I select a file larger than 1.5 KB, I get nothing.
I have tried and searched a lot. Need help.
As mentioned in the comments, this is probably an issue with the URL you are trying to send.
The parameter does not arrive in TYPO3 (maybe not even on the server side). I guess something is wrong with your Fluid form.
You should inspect your request.
You can check it in your browser, in the Apache access log, or even with a debugger in the TYPO3 code.
You are probably sending the file data in the request URL (i.e. as GET parameters) instead of the request body.
Your URL probably already contains some characters, so a file larger than about 1.5 KB exceeds the limit of roughly 2,000 characters in the URL.
See also here about the limit of the url:
What is the maximum length of a URL in different browsers?
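If the form turns out to be the culprit, here is a rough Fluid sketch of what a file-upload form usually needs (the action, name, and property values below are only guesses based on the error message; adjust them to your extension):
<f:form action="create" name="newRockupload" method="post" enctype="multipart/form-data">
    <f:form.upload property="file" />
    <f:form.submit value="Upload" />
</f:form>
With method="post" and a multipart enctype, the file travels in the request body rather than in the URL.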

Empty $_POST and $_FILES variables when uploading large files

I was trying to upload a file which is 20 MB in size. The default form upload size is 8 MB. When I upload such a file, the $_POST and $_FILES variables are empty. Now I want to put a check on the file size. If I get both these variables empty, how can I implement such a check? Please give me suggestions.
Barring any code errors, it's most likely that your 20 MB file exceeds your upload limit.
Change this permanently in your php.ini file.
Use
ini_set("upload_max_filesize", "30M");
to set your max upload size for the current request only. And for POST, use this:
ini_set("post_max_size", "30M");
(Note: both directives are PHP_INI_PERDIR settings, so ini_set() usually has no effect at runtime; changing them in php.ini, .htaccess, or .user.ini is the reliable way.)
To check the sizes:
echo ini_get("post_max_size") . "\n";
echo ini_get("upload_max_filesize");
I'm not sure exactly what you want, but you can probe the received content size using:
$_SERVER["CONTENT_LENGTH"]
This should tell you how big the POST request body would have been. (The number might be higher than the actual received content in the case of an aborted upload.)
Check out php://input; the allowed 8 MB portion of the body should be there.
For example: echo file_get_contents('php://input');
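Building on the CONTENT_LENGTH hint, here is a minimal sketch of such a check (the helper iniToBytes() is made up for this example): when post_max_size is exceeded, PHP discards the body and leaves $_POST and $_FILES empty, but the Content-Length header is still available.
<?php
// Convert a php.ini shorthand value such as "8M" or "2G" into bytes.
function iniToBytes($value)
{
    $value = trim($value);
    $unit  = strtoupper(substr($value, -1));
    $bytes = (int) $value;
    switch ($unit) {
        case 'G': $bytes *= 1024; // fall through
        case 'M': $bytes *= 1024; // fall through
        case 'K': $bytes *= 1024;
    }
    return $bytes;
}

$limit = iniToBytes(ini_get('post_max_size'));
$sent  = isset($_SERVER['CONTENT_LENGTH']) ? (int) $_SERVER['CONTENT_LENGTH'] : 0;

// The request body was discarded, so warn the user instead of
// silently continuing with empty $_POST / $_FILES.
if ($_SERVER['REQUEST_METHOD'] === 'POST' && $sent > $limit) {
    die("Your upload of $sent bytes exceeds the limit of $limit bytes.");
}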
You can dynamically set your max file size for uploads.
Write the statement below in your upload function where you are trying to upload the file;
this will raise the limit to 50 MB:
ini_set("upload_max_filesize", "50M");
If you want to check the file variables, you can use the alternative $HTTP_POST_FILES array:
$theFileSize = $HTTP_POST_FILES['file']['size'];
(Note: $HTTP_POST_FILES is long deprecated and was removed in PHP 5.4; prefer $_FILES.)
Hope this may help you.
Thanks.
Use MAX_FILE_SIZE as a hidden input field. This stops the user from waiting if the file is larger than the limit, and your code won't be executed with empty variables...
The MAX_FILE_SIZE hidden field (measured in bytes) must precede the file input field, and its value is the maximum filesize accepted by PHP. This form element should always be used as it saves users the trouble of waiting for a big file being transferred only to find that it was too large and the transfer failed. Keep in mind: fooling this setting on the browser side is quite easy, so never rely on files with a greater size being blocked by this feature. It is merely a convenience feature for users on the client side of the application. The PHP settings (on the server side) for maximum-size, however, cannot be fooled.
http://www.php.net/manual/en/features.file-upload.post-method.php
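A minimal form sketch based on the manual's example (the field names and the upload.php handler are placeholders; the value is in bytes, here roughly 2 MB):
<form action="upload.php" method="post" enctype="multipart/form-data">
    <!-- Must come before the file input -->
    <input type="hidden" name="MAX_FILE_SIZE" value="2000000">
    <input type="file" name="userfile">
    <input type="submit" value="Upload">
</form>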

Zend File upload: File exceeds the defined ini size

Inside my form I define this file upload field:
$this->setEnctype(Zend_Form::ENCTYPE_MULTIPART);
$logo = $this->createElement('file', 'logo');
$logo->setLabel('Group logo')
->setMaxFileSize(5242880) // 5mb
->addValidator('IsImage')
->addValidator('Count', false, 1)
->addValidator('Size', false, 5242880)
->addValidator('Extension', false, array('jpg', 'jpeg', 'png', 'gif'));
However, no matter how small the files I upload are, I get this error: File 'logo' exceeds the defined ini size.
The error message seemed pretty straightforward, so I checked the PHP config (phpinfo() on the exact same page that handles the form):
file_uploads: On
upload_max_filesize: 2000M
memory_limit: 128M
post_max_size: 8M
While those values don't exactly make sense, they absolutely should allow me to upload files up to 8 MB, but the upload always fails with the message above. Even files smaller than 1 KB fail. I also tried removing all setters/validators, but it still fails.
While searching for an answer I came across some posts that said it was Ajax's fault, but this is a regular form, so now I'm stuck.
Update: I'm terribly sorry to have wasted your time; there was another unclosed form on the page which voided the multipart declaration. I could have found that out sooner if I had tested with larger files rather than small ones :/
Add enctype="multipart/form-data" to your form. It should solve your problem.
Add
enctype="multipart/form-data"
to your <form> element. Solved my problem.
If you are using a view script to render your form, you need to retrieve the enctype you specified in the form class from your script: <form enctype="<?php echo $this->element->getAttrib('enctype'); ?>">
Chances are that the php extension fileinfo is not activated.
Please check your php.ini file and increase upload_max_filesize. By default, it is 2M (2 megabytes). Also, in order to be able to post a file larger than 2M, you need to update the value of post_max_size.
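For example, in php.ini (example values; post_max_size should be at least as large as upload_max_filesize, because the whole POST body counts against it):
; php.ini
upload_max_filesize = 10M
post_max_size = 12M
Restart the web server (or PHP-FPM) afterwards so the new values are picked up.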
It looks like you're missing the destination:
$logo->setLabel('Group logo')
->setDestination('/var/www/upload')
...
You might want to make sure that the folder is writeable by your web server.
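A quick way to verify that (a rough sketch; /var/www/upload is the path from the snippet above):
<?php
// Fail fast if the upload destination is missing or not writable.
$dest = '/var/www/upload';
if (!is_dir($dest) || !is_writable($dest)) {
    throw new RuntimeException("Upload destination $dest is missing or not writable.");
}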
When I commented out the following I got the same error:
->setDestination($this->_config->folder->ugc);
->addValidator(Kvadrat_Form_Element_File::VALIDATE_COUNT, true, 1);
->addValidator(Kvadrat_Form_Element_File::VALIDATE_SIZE, true, 5 * 102400);
(I commented it out as I was doing the file uploads separately with FormData.)
So I uncommented it and it all worked again.
Your size validator is incorrect. You should use this format:
->addValidator('Size', false, array('max' => '5242880'))
Your validator checks whether the file's size == 5242880, NOT <= 5242880.
