Delay after uploading large files with the jQuery blueimp uploader - PHP

I'm able to successfully upload large files (tested up to 8 GB) via the blueimp jQuery File Upload plugin.
My PHP settings are:
upload_max_filesize 8192M
post_max_size 8192M
max_execution_time 200
max_input_time 200
memory_limit 8192M
The problem I'm having is that when large files (2 GB or larger) finish uploading, the progress bar hangs at 100% and takes considerable time to "finish".
I'm not sure what post processing is happening, other than piecing the file chunks together.
I've tried adjusting the chunk size in the UploadHandler.php file, and increasing it seems to improve things slightly (e.g. from 10 MB to 500 MB chunks), but the delay is still there. There doesn't seem to be any improvement when disabling chunked uploads altogether (setting the chunk size to 0), and I'm not sure of the potential ramifications of doing that either.
On a 2 GB file the delay is around 20 seconds, but on a 4 GB file it's around 2 minutes. A 7 GB file takes around 3-4 minutes, and sometimes times out. This leaves the user waiting, unaware of what is happening, as the progress bar has already reached 100% by this point.
Does anyone have any insight into what this might be, or how to go about troubleshooting it? I suspected that the file in /tmp might be copied rather than moved, but there's no sign of this in the PHP file as far as I can tell.
Boosting the CPU and RAM on my server VM improves things a little, though running the "top" command during this phase shows that CPU and RAM don't appear to be heavily used (0-2%, compared to 90-100% during the actual upload).
Many thanks in advance.

I don't know very much about this, but I searched on Google and found some information that might help you.
Normally the "onError" event is fired when an exception occurs on the server or the client. In the situation described, no exception is thrown until the server itself times out.
One possibility is to monitor the upload status and set a JavaScript timeout or counter in the uploading event that cancels the upload once a time limit is reached.
All upload status and error codes are shown here - http://help.infragistics.com/NetAdvantage/jQuery/2013.1/CLR4.0?page=igUpload_Using_Client_Side_Events.html
You can monitor these with the "fileStatus" argument in the uploading event.

What this might be:
jQuery File Upload registers the request progress event. This event fires while the browser is sending data to the server; it is not a confirmation event from the server, and there is a delay between the browser finishing sending and the server confirming reception. See "Can onprogress functionality be added to jQuery.ajax() by using xhrFields?" for the two events. However, I could not resolve this myself and used a work-around, so I cannot fully confirm this assumption.
How to go about troubleshooting it:
Use your browser's debugging tools and set a breakpoint around here in the code to get an understanding of the variables available:
https://github.com/blueimp/jQuery-File-Upload/blob/master/js/jquery.fileupload.js#L372
Once you are sure what you want to see, it is probably best to print information with console.log and check your browser's console output.
Work-arounds:
If you are only interested in seeing whether all uploads have finished, and in showing the user that something is still going on, you may want to check out the stop / done events, or fileupload('active'); see here: Finish of Multiple file upload

After much troubleshooting and debugging of this issue, I’ve found what I believe to be the cause, and a better workaround/solution. It’s a bit “hacky” (I’m an amateur dev!), so I’m open to any suggestions to improve this, though it does seem to work for me.
The files uploaded from the client are actually uploaded to this temp location on the server:
/tmp/systemd-private-random-number-httpd.service-random-number/tmp/sess_php-session-ID
Once the uploads have completed on the client side, the UI progress bar reaches 100% and remains so whilst the server is processing the files.
The processing on the server side involves moving (copying, then deleting) the file from the above /tmp location to the relevant blueimp location. (Assuming "user_dirs" is enabled/true in the blueimp options, the default destination is /var/www/html/server/php/files/php-session-id/.)
This copying process can take a significant amount of time, particularly for files larger than about 2 GB. Once the server-side processing has completed, blueimp triggers the "fileuploaddone" callback and the UI updates to a completed state.
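To see why this step can be so slow, note that a move within one filesystem is a near-instant metadata operation, while a move between filesystems means every byte gets copied and the source deleted. Here is a minimal sketch to test which case applies on your server (the paths are placeholders, not blueimp's actual internals):

<?php
// Hypothetical paths - substitute the real temp file and blueimp destination.
$src = '/tmp/uploaded-file.bin';
$dst = '/var/www/html/server/php/files/php-session-id/uploaded-file.bin';

$start = microtime(true);

// On the same filesystem rename() is an instant metadata update;
// across filesystems PHP falls back to copying the data and deleting the source.
if (!rename($src, $dst)) {
    copy($src, $dst);   // explicit fallback, should rename() fail outright
    unlink($src);
}

printf("Move took %.2f seconds\n", microtime(true) - $start);

If /tmp and the web root live on different filesystems (or /tmp is a tmpfs held in RAM), every "move" is really a full copy, which would match the delays described above.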
My aim was to provide some interactive UI feedback at this point (rather than hanging at 90% like the other workaround). My environment is capable of very large uploads (10 GB+), so I didn't find it acceptable to give no user feedback during what could be several minutes of file processing (with the user thinking the site has crashed, closing the browser, etc.).
My Workaround:
The first problem I encountered was that there doesn't appear to be a blueimp callback for the point at which the file uploads have completed on the client side and the server processing begins. So I worked around this by creating a function (in main.js) to display my custom "processing div" once data.loaded matches data.total:
$('#fileupload').bind('fileuploadprogressall', function (e, data) {
    console.log(data);
    if (data.loaded == data.total) {
        setTimeout(function () {
            $("#processwarn").fadeTo(300, 1);
        }, 3000);
    }
});
My custom div that sits below the progress bar info is another PHP page, refreshed (with ajax.load) every couple of seconds, which calculates the total size of all the files in the current session's upload folder (the directory's reported size alone didn't seem to give accurate results):
// Calculate the size of the files being processed
$filePath = "/var/www/html/server/php/files/*php-session-id*/";
$total = 0;
$d = new RecursiveIteratorIterator(
    // SKIP_DOTS avoids counting the "." and ".." directory entries
    new RecursiveDirectoryIterator($filePath, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);
foreach ($d as $file) {
    $total += $file->getSize();
}

// Convert to a human-readable format
if ($total >= 1073741824) {
    $total = number_format($total / 1073741824, 2) . ' GB';
} elseif ($total >= 1048576) {
    $total = number_format($total / 1048576, 2) . ' MB';
} elseif ($total >= 1024) {
    $total = number_format($total / 1024, 2) . ' KB';
} elseif ($total > 1) {
    $total = $total . ' bytes';
} elseif ($total == 1) {
    $total = $total . ' byte';
} else {
    $total = '0 bytes';
}

// Display a spinner gif and the size of the files currently being processed
echo "<img src=\"img/spinner.gif\" height=\"20\" width=\"20\"> Please wait, processing files: $total";
The result looks something like this: (screenshot omitted)
Edit (after some more work): with a Bootstrap progress bar added instead: (screenshot omitted)
Additional point noticed from testing:
With chunking enabled (for example, set to 1 GB (maxChunkSize: 1000000000)), the problem almost disappears: the processing time appears drastically reduced to the user, because the "copying" processing happens at each chunk boundary (every 1 GB in this example).
When the final processing then occurs, the server only has to "rechunk"/copy the last remaining 1 GB over.
I also experienced quicker overall upload times due to this.
In hindsight, this may well be an easier/more effective solution for many.

Related

Request Entity Too Large with small files

I know there are many questions about "Request Entity Too Large" on the internet, but I could not find the right answer for my problem ;)
I'm using an HTML file input tag to let users upload their images:
<input type="file" class="upload-pic" accept="image/*" id="Fuploader" name="pro-pic"><br>
There is nothing wrong with files under 2 MB, which is the limit my site allows.
The problem is that if someone decides to upload a larger file, say 5.10 MB, I can handle it in PHP and warn the user that the file is too large:
if ($_FILES['pro-pic']['size'] > 2000000) {
    die("TOO LARGE");
}
But when a 5.10 MB file is uploaded, the "Request Entity Too Large" error is raised and the rest of my PHP code won't run.
I have checked post_max_size and upload_max_filesize; they are both set to 8 MB, but I get the error at 5.10 MB!
I also need a way to handle files even larger than 8 MB, because there is no way to guess what a user may try to upload ;) and I don't want them to get a dirty, broken page because of a REQUEST ENTITY TOO LARGE error.
Is there any way to fully disable this error, or to set upload_max_filesize and post_max_size to infinity?
You need to set SecRequestBodyAccess Off.
Check the link I have given; it should help you:
https://serverfault.com/questions/402630/http-error-413-request-entity-too-large
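That directive belongs to Apache's mod_security module, which enforces its own request-body limits independently of PHP's post_max_size and upload_max_filesize. As a sketch (the limit values below are illustrative assumptions, not recommendations), the relevant part of the Apache/mod_security configuration might look like:

<IfModule mod_security2.c>
    # Disable request body inspection entirely...
    SecRequestBodyAccess Off
    # ...or leave it on and raise the limits instead (example values):
    # SecRequestBodyLimit 16777216
    # SecRequestBodyNoFilesLimit 1048576
</IfModule>

Note that switching SecRequestBodyAccess off disables body inspection for every request, so weigh the security trade-off before doing this site-wide.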

Display 30 video frames per second in PHP

I've written a C++ program for an embedded system. My program processes images taken by a camera (350 fps or even more) and applies some machine vision algorithms to them. I write the result of my program, which is a JPG image, to a specific location on my hard drive (e.g. localhost/output/result.jpg).
Now I want to write PHP code to display this image every 30 ms or faster (it depends on the output of my C++ program: sometimes it is 30 fps and sometimes 10 or 40, but never more than 40!). The video frames are not related to each other, so I cannot use streaming video, since streaming assumes the frames are sequential.
It is possible for the PHP code to read a corrupted image (since one program wants to write while the other one wants to read).
I thought of using a mutex concept here: create a flag (a text file) which is accessible to both programs. Whenever my C++ program wants to write to the location, it sets the flag (writes something into the text file), and when it has finished writing it clears the flag. So when the PHP code sees the flag, it waits until the image has been written to the hard drive and then displays it.
I'm familiar with C++ but completely new to PHP. I'm able to display images in PHP, but I don't know how to use timers in a way that would solve the above problem.
What should I do? I've included code below that works, but I don't want to use it, since it makes only this part of the webpage update (because of the while(1)). Is there an alternative solution? If I'm not able to display more than 20 frames per second, what frame rate is possible in this scripting language, and what factors play a role in this?
Thanks
My Code:
<?php
while (1)
{
    $handle = @fopen("/var/www/CTX_ITX/Flag_File.txt", "r");
    if ($handle) {
        if (($buffer = fgets($handle, 10)) !== false)
        {
            if (trim($buffer) == "Yes") // trim the trailing newline fgets() keeps
            {
                echo "<img src=\"result.jpg\" alt=\"test\" />";
            }
        }
        if (!feof($handle)) {
            echo "Error: unexpected fgets() fail\n";
        }
        fclose($handle);
    }
    usleep(10000); // 10 ms pause before checking again (sleep() only accepts whole seconds)
}
?>
PHP is not suitable for your task: it responds to a request and then finishes. Your JavaScript would have to load the image every 30 ms instead.
I would suggest you take a look at Node.js and HTML5 WebSockets. That technology should, in theory, enable you to push pictures fast enough without reconnecting for every frame (but only if your connection is fast enough to handle the traffic).
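If you do stay with plain PHP polling, the server side of that approach could be a small endpoint that honours the question's flag file and serves the newest frame with caching disabled, so each poll fetches a fresh image. A rough sketch (the file paths and the "Yes" flag value come from the question; the endpoint name and back-off values are assumptions):

<?php
// serve_frame.php - return the latest frame, avoiding frames mid-write (sketch)
$flag  = "/var/www/CTX_ITX/Flag_File.txt";
$frame = "/var/www/CTX_ITX/result.jpg";

// Wait briefly until the C++ writer signals the frame is complete ("Yes").
for ($i = 0; $i < 10; $i++) {
    if (trim((string)@file_get_contents($flag)) === "Yes") {
        break;          // frame is complete, safe to read
    }
    usleep(5000);       // 5 ms back-off while the writer is busy
}

// Tell the browser never to cache the frame, so every request gets a new one.
header("Content-Type: image/jpeg");
header("Cache-Control: no-store, no-cache, must-revalidate");
readfile($frame);

The client would then point an <img> at serve_frame.php and swap it every 30 ms or so from JavaScript, as suggested above.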

gzip JSON files sent to the browser

I have a ton of data to send to the browser, maybe 100 MB or so. I've chunked it up into smaller files so I can simulate streaming. Let's say I have 200 files of 500 KB each. I build an array of the 200 files in JavaScript, then loop over it and make an AJAX call for each. It works fine. Then I wanted to improve it, so I gzipped everything on the server, which brought the files down to about 20% of the original chunk size. My AJAX call requests the following file:
fileserver.php?file=/temp/media_C46_20110719_113332_ori-0.js.gz
In fileserver.php, I have, very simply:
$filepath = isset($_GET['file']) ? $_GET['file'] : '';
if ($filepath != '') {
    if (substr($filepath, -2, 2) == 'gz') {
        header("Content-Type: text/plain");
        header("Content-Length: " . (string)(filesize($filepath)));
        header("Content-Encoding: gzip");
        readfile($filepath); // note: the original was missing the $ here
    } else {
        header("Location: " . $filepath);
    }
}
Again, this totally works. The problem is that it takes forever! Looking at the network tab in Chrome, it's taking 15 seconds or so to get a 100 KB chunk. I can download that file directly in less than a second, and the PHP script above should take virtually no time to run. I know the client (browser) needs to spend a bit of time inflating the content, but that's got to be less than a second. So what's taking 15 seconds? Are there any other tools I can use to check this out?
I know I could set the headers in Apache instead, but I don't have access to that, and doing it in PHP is functionally equivalent, right? Are those the correct headers to set?
I just figured out the problem: the filesize() call wasn't getting the correct path, so the Content-Length header was being printed as blank. I fixed it to send the correct value and it works much, much faster now.
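For reference, a blank or wrong Content-Length is easy to guard against: only emit the header once the file actually resolves. A defensive sketch of the same handler (restricting the path with basename() is my addition, since passing raw $_GET paths to readfile() also allows arbitrary file disclosure):

// Resolve the requested file inside a known directory (assumed layout).
$filepath = isset($_GET['file']) ? basename($_GET['file']) : '';
$fullpath = '/var/www/temp/' . $filepath;

if ($filepath !== '' && substr($fullpath, -2) === 'gz' && is_file($fullpath)) {
    header("Content-Type: text/plain");
    header("Content-Length: " . filesize($fullpath)); // guaranteed non-blank now
    header("Content-Encoding: gzip");
    readfile($fullpath);
}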

Keep track of download progress in PHP

All,
I have built a form; when a user fills it in, my code can determine which (large) files need to be downloaded by that user. I display each file with a download button, and I want to show the status of the download next to the button (in progress, cancelled/aborted or completed). Because the files are large (200 MB+), I use a well-known function to read the file to be downloaded in chunks.
My idea was to log the reading of each (100 KB) chunk in a database, so I can keep track of the progress of the download.
My code is below:
function readfile_chunked($filename, $retbytes = TRUE)
{
    // Read the PersonID from the cookie
    $PersonID = $_COOKIE['PERSONID'];
    // Set up the database connection for logging download progress per file
    include "config.php";
    $SqlConnectionInfo = array("UID" => $SqlServerUser, "PWD" => $SqlServerPass, "Database" => $SqlServerDatabase);
    $SqlConnection = sqlsrv_connect($SqlServer, $SqlConnectionInfo);
    $ChunckSize = 100 * 1024; // 100 KB buffer (the original line was missing the $)
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false)
    {
        return false;
    }
    $BytesSent = 0;
    while (!feof($handle))
    {
        $buffer = fread($handle, $ChunckSize);
        $BytesSent = $BytesSent + strlen($buffer); // count actual bytes read, not the buffer size
        // Update the database; parameterized to avoid SQL injection
        $Query = "UPDATE [SoftwareDownload].[dbo].[DownloadProgress] SET LastTimeStamp = ? WHERE PersonID = ? AND FileName = ?";
        $QueryResult = sqlsrv_query($SqlConnection, $Query, array(time(), $PersonID, $filename));
        echo $buffer;
        ob_flush();
        flush();
        if ($retbytes)
        {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status)
    {
        return $cnt; // return the number of bytes delivered, like readfile() does
    }
    return $status;
}
The weird thing is that my download runs smoothly at a constant rate, but my database gets updated in spikes: sometimes I see a couple of dozen database updates (so a couple of dozen 100 KB blocks) within a second, and the next moment I see no database updates (so no 100 KB blocks being fed by PHP) for up to a minute (this gap depends on the bandwidth available for the download). The fact that no progress is written for a minute makes my code believe the download was aborted.
In the meantime I see the memory usage of IIS increasing, so I get the idea that IIS is buffering the chunks of data that PHP delivers.
My web server runs Windows 2008 R2 Datacenter (64-bit), IIS 7.5 and PHP 5.3.8 in FastCGI mode. Is there anything I can do (in my code, in my PHP config or in the IIS config) to prevent IIS from buffering the data that PHP delivers, or is there any way to see from PHP whether the data was actually delivered to the downloading client?
Thanks in advance!
As no useful answers came up, I've created a really dodgy workaround by using Apache strictly for the download of the file(s). The whole form, CSS etc. is still delivered through IIS; just the download links are served by Apache (a reverse proxy sits in between, so users don't have to fiddle with port numbers; the proxy takes care of this).
Apache doesn't cache PHP output, so the timestamps I get in my database are reliable.
I asked a similar question once and got very good results from the answers there.
Basically, you use the APC cache to cache a variable, then retrieve it on a different page with an AJAX call from the client.
Another possibility is to do the same with a database: write the progress to a database and have the client test for it (say, every second) using an AJAX call.
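A minimal sketch of the APC variant (the key name, TTL and JSON shape are my assumptions): the download loop stores the byte count under a per-user key, and a tiny endpoint returns it for the client to poll.

<?php
// In the download loop (e.g. readfile_chunked), after each chunk:
//     apc_store('dl_progress_' . $PersonID, $BytesSent, 300); // 5-minute TTL

// progress.php - polled by the client (say, every second) via AJAX:
$PersonID = $_COOKIE['PERSONID'];
$bytes = apc_fetch('dl_progress_' . $PersonID);

header('Content-Type: application/json');
echo json_encode(array('bytes' => $bytes === false ? 0 : $bytes));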

Estimate required memory for libGD operation

Before attempting to resize an image in PHP using libGD, I'd like to check whether there's enough memory available for the operation, because an "out of memory" error completely kills the PHP process and can't be caught.
My idea was that I'd need 4 bytes of memory for each pixel (RGBA) in the original and in the new image:
// check available memory
if (!is_mem_available(($from_w * $from_h * 4) + ($to_w * $to_h * 4))) {
    return false;
}
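(For context, is_mem_available() isn't defined here; a plausible sketch, assuming memory_limit uses shorthand like "128M", compares the request against the remaining headroom:)

// Sketch: true if $bytes more memory would still fit under memory_limit.
function is_mem_available($bytes)
{
    $limit = ini_get('memory_limit');
    if ($limit == -1) {
        return true; // no limit configured
    }
    // Convert shorthand like "128M" or "1G" to bytes.
    $value = (int)$limit;
    switch (strtoupper(substr($limit, -1))) {
        case 'G': $value *= 1024; // fall through
        case 'M': $value *= 1024; // fall through
        case 'K': $value *= 1024;
    }
    return (memory_get_usage(true) + $bytes) <= $value;
}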
Tests showed that this is much more memory than the library really seems to use. Can anyone suggest a better method?
You should check this comment out, and also this one.
I imagine it must be possible to find out GD's peak memory usage by analyzing imagecopyresampled's source code, but this may be hard, require extended profiling, vary from version to version, and be generally unreliable.
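For what it's worth, a widely circulated estimate from the php.net user comments (empirical, not an official figure, with a ~1.65 fudge factor) derives the requirement from the source image's header rather than from a flat 4 bytes per pixel:

// Commonly cited estimate of the memory GD needs to load an image.
$info = getimagesize($filename);
$bits = isset($info['bits']) ? $info['bits'] : 8;
$channels = isset($info['channels']) ? $info['channels'] : 3;
$needed = round(($info[0] * $info[1] * $bits * $channels / 8 + 65536) * 1.65);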
Depending on your situation, a different approach comes to mind: when resizing an image, call another PHP script on the same server, but over HTTP:
$file = urlencode("/path/to/file");
$result = file_get_contents("http://example.com/dir/canary.php?file=$file&width=1000&height=2000");
(sanitizing the file parameter, obviously)
If that script fails with an "out of memory" error, you'll know the image is too large.
If it successfully manages to resize the image, it could return the path to a temporary file containing the resize result. Things would go ahead normally from there.
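A rough sketch of what such a canary script might look like (the name canary.php, its parameters and its output convention are assumptions for illustration, and the file parameter must be sanitized as noted above):

<?php
// canary.php - attempt the resize in an isolated request (sketch).
$file = $_GET['file'];              // sanitize in real code!
$width = (int)$_GET['width'];
$height = (int)$_GET['height'];

$src = imagecreatefromjpeg($file);  // an out-of-memory error kills only this request
$dst = imagecreatetruecolor($width, $height);
imagecopyresampled($dst, $src, 0, 0, 0, 0,
                   $width, $height, imagesx($src), imagesy($src));

$tmp = tempnam(sys_get_temp_dir(), 'resize_');
imagejpeg($dst, $tmp);
echo $tmp;                          // the caller reads this path on success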
