HTTP PUT Request - Progress Bar Implementation - PHP

I have discovered that an HTTP PUT request is the most suitable for uploading very large files (1 GB or more).
The solution works well and I can upload any file of my choice to the server. However, I have difficulties monitoring upload progress.
I have implemented an onprogress callback, but it gets called only once, after the file has been uploaded via PUT.
My JavaScript Code:
var req = createRequest();
req.open("PUT", "PHP/upload_file.php?folder=" + aUploadedFile.upload_folder + "&master_folder=" + settings.strServerSideMasterFolder + "&file_name=" + aUploadedFile.file_name);
req.setRequestHeader("Content-type", "text/plain");
req.onload = function (event)
{
    console.log("OnLoad Called: " + aUploadedFile.file_name);
};
req.onprogress = function (event)
{
    console.log("OnProgress Called: " + aUploadedFile.file_name);
};
req.send(aUploadedFile.file_object);
What are my options when I wish to monitor the upload progress via PUT, please?
Should I establish another JavaScript AJAX call that will monitor the size of the uploaded file on the server?
Is there any other working solution available?
I use:
HTML5
JavaScript
PHP
Apache
I do not use:
jQuery
Thank you in advance.

Have you tried xhr.upload.onprogress instead of xhr.onprogress?
If that doesn't work either, you could establish another JavaScript AJAX call, as you said. Recently, I made an upload system that reads the file line by line and needed to show some extra information about the upload, not just the percentage, so I did something like this:
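For reference, a minimal sketch of what that could look like (the URL, file variable, and helper names here are placeholders, not from the question):

```javascript
// Pure helper: percentage of an upload, rounded to a whole percent.
function uploadPercent(loaded, total) {
  return total > 0 ? Math.round((loaded / total) * 100) : 0;
}

// Sketch: attach the progress handler to xhr.upload, not xhr itself.
// For uploads, progress events fire on the XMLHttpRequestUpload object;
// xhr.onprogress reports *download* (response) progress instead, which is
// why it fires only once the response arrives.
function putWithProgress(url, file) {
  var req = new XMLHttpRequest();
  req.open("PUT", url);
  req.setRequestHeader("Content-type", "text/plain");
  req.upload.onprogress = function (event) {
    if (event.lengthComputable) {
      console.log("Uploaded: " + uploadPercent(event.loaded, event.total) + "%");
    }
  };
  req.onload = function () {
    console.log("Upload finished");
  };
  req.send(file);
}
```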
The main page makes an AJAX request to a script responsible for processing the file.
In that script, I had to close the connection so the AJAX request could complete while the PHP script kept running, using this:
ob_start();
$file = tempnam('/tmp', uniqid()); // create a temp file to show status of the action
echo json_encode(array('file' => $file));
header('Content-length: '.ob_get_length());
header('Connection: close');
ob_end_flush();
flush(); // The AJAX request is completed here, but the script is still running...
function writeToFile($handle, $arr) {
    ftruncate($handle, 0); // empty the file
    rewind($handle);       // reset the file pointer, otherwise fwrite resumes at the old offset
    fwrite($handle, json_encode($arr));
}

$handle = fopen($file, 'w');
while (readLine($uploadedFile)) {
    // code to process line
    writeToFile($handle, array('progress' => $current / $total, 'action' => 'Reading...'));
}
// insert into database
writeToFile($handle, array('progress' => '100', 'action' => 'Inserting into database...'));
fclose($handle);
At the main page, the AJAX request would return the name of the file that contains the information, so I would issue repeated GET requests to that file using the setInterval method.
Note: in my case, I created another PHP file to show the contents of the progress file (with file_get_contents), so I could manually delete that file when the operation completed.
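On the client side, that polling might be sketched roughly like this (the 1-second interval, URL, and helper names are illustrative; formatProgress simply echoes whatever the PHP script wrote to the progress file):

```javascript
// Pure helper: turn the JSON written by writeToFile() into a display string.
function formatProgress(json) {
  var info = JSON.parse(json);
  return info.action + " (" + info.progress + ")";
}

// Sketch: poll the progress file whose name was returned by the first
// AJAX call, updating the UI until the server reports completion.
function pollProgress(progressUrl, onUpdate) {
  var timer = setInterval(function () {
    var req = new XMLHttpRequest();
    req.open("GET", progressUrl);
    req.onload = function () {
      var info = JSON.parse(req.responseText);
      onUpdate(formatProgress(req.responseText));
      // stop polling once the server reports 100
      if (Number(info.progress) >= 100) clearInterval(timer);
    };
    req.send();
  }, 1000);
}
```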

Related

How to improve speed of uploading 2000 images at 4MB each using Vue.JS and XHR?

So I am working on an uploader for one of our websites, and one of the things I am trying to achieve is a way of uploading thousands of images via the website, instead of our clients using an FTP/SFTP solution.
One of the issues I am running into is upload speeds, so here is the current user flow:
The client clicks the 'Add Images' button and selects the images they wish to upload.
There is a #change set for the input, which processes the images for Vue, by taking the data from the event.target.files array and adding them into the Vue data so that we can display the content.
There is a tick loop running that checks the first 10 images and preloads them so the client can see the first 10 image previews (any more would exhaust the browser's memory). As files are uploaded and removed from the array, the previews update so that the first 10 preview images are always displayed.
Once they are happy with this, our client would then click 'Upload Files' and this would then start the upload, which also is part of the tick loop, and what it does is check if anything is uploading, if not, it will start on the first file in the array.
So now it will set the status of the file as uploading, so it shows on the UI; then it creates an XMLHttpRequest(), sets the URL, creates a new FormData object, and appends the image handle (the File(ID) object) along with any other data that needs to be sent.
I set the request to use POST, and set an onreadystatechange handler so that I can catch when it finishes, which basically sets the file state as uploaded and then removes it from the array, unless there is an issue, in which case it moves it to the issues array.
Now I send the request to the server, this will then receive the file in the $_FILES variable, and will resize the image 3 times for various sizes and then save them to the correct place, and then return with a success or failure message.
The main problem stems from the upload; the resize code is fairly quick, around 200-500ms, so the issue doesn't come from there, but from the actual upload.
Our internet is around 4MB per second, and using FTP it takes around 300-400ms for a 4MB file, but in the browser it takes about 2.2s, and I am not sure why.
I understand that FTP/SFTP is a direct upload, using chunks (I think), whereas we are making many Ajax requests, so that in itself explains why it is slower, but is there no way to make this upload quicker at all?
Another thing to note, is this is running within Joomla also.
I am using the below code (amended for me to post):
// Create new request
var http = new XMLHttpRequest();
// Set URL
var url = 'POST_API_URL';
// Create form data object
var params = new FormData();
params.append('name', this.input.files[i].name);
params.append('size', this.input.files[i].size);
params.append('type', this.input.files[i].type);
// Append file to form data object
params.append('images[]', this.input.files[i].handle, this.input.files[i].name);
// Open post request
http.open("POST", url);
// On return
http.onreadystatechange = function () {
    // Check http codes
    if (http.readyState == 4 && http.status == 200) {
        // Write data to page
        window.vm.$data.app.response = http.responseText;
        // Get response array
        var response = JSON.parse(http.responseText);
        // Check response status
        if (response.status) {
            console.log('Completed');
        } else {
            console.log('Failed');
        }
    }
};
// Send request
http.send(params);
The PHP code to receive the file is here:
// Main upload function
public function save()
{
    // Wrap everything in a try/catch
    try {
        // Import system libraries
        jimport('joomla.filesystem.file');
        jimport('joomla.filesystem.folder');
        // Include PHP image resizing library
        require_once JPATH_ROOT . '/lib/php-image-resize/lib/ImageResize.php';
        // Decode request data from client
        $request = (object) $_POST;
        // Define the different sizes we need
        $sizes = array('small' => '200', 'medium' => '320', 'xlarge' => '2000');
        // Define auction number
        $auction = $request->auction;
        // Set path for save
        $path = $_FILES['images']['tmp_name'][0];
        // Create image object
        $image = new \Eventviva\ImageResize($path);
        // Loop the sizes so we can generate an image for each size
        foreach ($sizes as $key => $size) {
            // Resize image
            $image->resizeToWidth($size);
            // Set folder path
            $folder = JPATH_ROOT . '/catalogue_images/' . $auction . '/' . $key . '/';
            // Check if folder exists; if not, create it
            if (!JFolder::exists($folder)) {
                JFolder::create($folder);
            }
            // Set file path
            $filepath = $folder . $request->name;
            // Save updated file
            $image->save($filepath);
        }
        // Return to the client
        echo json_encode(array('status' => true));
    } catch (Exception $e) {
        // Return error, with message
        echo json_encode(array('status' => false, 'message' => $e->getMessage()));
    }
}
I am open to any ideas on how we can use chunked uploads or anything else, but do keep in mind that our clients can upload anywhere from 20 up to 5000 images, and we have some clients that upload around 4000-5000 quite often, so it needs to be robust enough to support this.
My last test was:
Time taken: 51 minutes, and 15 seconds
Files: 1000 images (jpg)
Sizes: 1.5MB and 6.5MB
Also noticed that it does get slower as time progresses, by an extra 500ms to 1s at most, in addition to the 2.2s upload time.
Thanks in advance.

AJAX: Detect connection closed, PHP continues

Situation: User uploads a number of photos via AJAX, then continues interacting with the website whilst a PHP script continues running in the background and generates a variety of thumbnails based on the uploaded photos.
Site config:
jQuery AJAX (v1.9.1)
PHP 5.4.7, FastCGI mode
IIS 7.5, with gzip
Previous posts that I've referred to and tried to implement (but to no avail):
Closing a connection early with PHP
PHP: continue after output is complete
Send AJAX results but continue processing in PHP
close a connection early
Disable Gzip compression for single php file with IIS
http://www.php.net/manual/en/features.connection-handling.php#71172
I have tried a huge number of script options based on the previous posts, however none seem to tell the AJAX script to let the user continue, whilst the PHP continues to process...
Example PHP code:
<?php
// Save images to db, etc
// Now tell AJAX to let the user continue, before generating thumbnails
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off'); // turn off IIS gzip for this file
}
ob_end_clean();
header("Connection: close");
header("Content-Encoding: none"); // ensures gzip is not sent through
ob_start();
echo '123'; // 3 digit number will be sent back to AJAX
$size = ob_get_length(); // should mean Content-Length = 3
header("Content-Length: $size");
ob_end_flush();
flush();
ob_end_clean();
// Generate thumbnails, etc
?>
Example jQuery AJAX code:
$.ajax({
    type: 'POST',
    url: ajax_url,
    data: { foo: bar },
    beforeSend: function () {
        //
    },
    success: function (data) {
        alert(data); // Only seems to be firing once the thumbnails have been generated.
    }
});
The response headers seem okay...
Question: How do I get AJAX to allow the user to continue once it has received the code from the middle of the PHP script, whilst the PHP script continues to generate the thumbnails?
If you run a request, it will always wait until the PHP script finishes executing, or it will time out. So you cannot stop the AJAX call in the middle and keep PHP running. If you want to upload files and then create thumbnails, but be notified as soon as the files are uploaded, do it in two steps:
upload files with AJAX -> return success
run another AJAX request on success to get uploaded images (or thumbs in fact).
Thanks to that, thumbs can also be rendered later, when they are first requested (even without AJAX). If you don't want to request and wait for thumbs, use a cron job on the server, which will create thumbs for awaiting images.
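A rough sketch of that two-step flow in jQuery (upload.php, thumbs.php, and the response shape are hypothetical names, not from the question):

```javascript
// Step 1: upload the photos; the server responds as soon as the files
// are saved, without generating thumbnails yet.
// Step 2: on success, issue a second request that returns the thumbnails
// (the server can generate them lazily on this request).
function uploadThenFetchThumbs(formData, renderThumbs) {
  $.post('upload.php', formData, function (uploaded) {
    $.post('thumbs.php', { ids: uploaded.ids }, renderThumbs);
  });
}
```

The user regains control after step 1; the thumbnail request only blocks the small part of the UI that displays the thumbs.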

HTTP PUT Request: Passing Parameters with File

After numerous tests uploading files through HTTP POST requests, it appears that HTTP PUT requests are the most suitable for uploading very large files (1 GB or more).
The simple code listed below, which I have tested for an HTTP PUT file upload request, works well:
JavaScript:
var req = createRequest();
req.open("PUT", "PHP/filePutLoad.php");
req.setRequestHeader("Content-type", "text/plain");
req.onload = function (event)
{
    console.log(event.target.responseText);
};
req.send(aUploadedFile.file_object);
PHP:
include 'ChromePhp.php';
require_once 'mysqlConnect.php';
ini_set('max_execution_time', 0);
ChromePHP::log('$_PUT :' . print_r($_PUT));
/* PUT data comes in on the stdin stream */
$putdata = fopen("php://input", "r");
/* Open a file for writing */
$fp = fopen("myputfile.ext", "w");
/* Read the data 1 KB at a time and write to the file */
while ($data = fread($putdata, 1024)) {
    fwrite($fp, $data);
}
/* Close the streams */
fclose($fp);
fclose($putdata);
However, I have difficulties delivering arguments and variables along with the file being uploaded from JavaScript to PHP. For example, I need to deliver the upload target folder where the new data is to be stored, the ID of the uploader, etc.
Is there a way to combine an HTTP PUT request with HTTP POST to submit arguments?
What are my options if I wish to deliver parameters from JavaScript to PHP along with an HTTP PUT file upload?
Thank you.
Using PUT as well, it works when you append the parameters in the query string. I'm also looking for another way to do this, but this is the workaround I'm using currently:
curl -X PUT "http://www.my-service.com/myservice?param1=val1" --data @file.txt
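In the browser, the same query-string approach can be sketched with URLSearchParams, which takes care of the encoding (the parameter names here are just examples):

```javascript
// Build a PUT URL carrying metadata in the query string; the file
// itself still travels in the request body.
function buildPutUrl(base, params) {
  return base + "?" + new URLSearchParams(params).toString();
}

// Usage sketch (folder/uploader_id are example parameter names):
// var url = buildPutUrl("PHP/filePutLoad.php", { folder: "2019", uploader_id: "42" });
// req.open("PUT", url);
// req.send(file);
```

On the PHP side the parameters then arrive in $_GET as usual, independent of the PUT body read from php://input.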

Offer a generated file for download from jQuery post

I've got a large form where the user is allowed to input many different fields, and when they're done I need to send the contents of the form to the server, process it, and then spit out a .txt file containing the results of the processing for them to download. Now, I'm all set except for the download part. Setting the headers on the response to the jQuery .post() doesn't seem to work. Is there any other way than doing some sort of iframe trick to make this work (a la JavaScript/jQuery to download file via POST with JSON data)?
Again, I'm sending data to the server, processing it, and then would like to just echo out the result with headers to prompt a download dialog. I don't want to write the result to disk, offer that for download, and then delete the file from the server.
Don't use AJAX. There is no cross-browser way to force the browser to show a save-as dialog in JavaScript for some arbitrary blob of data received from the server via AJAX. If you want the browser to interpret the results of an HTTP POST request (in this case, offering a download dialog), then don't issue the request via AJAX.
If you need to perform some kind of validation via AJAX, you'll have to do a two step process where your validation occurs via AJAX, and then the download is started by redirecting the browser to the URL where the .txt file can be found.
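A minimal sketch of that two-step process in jQuery (validate.php, the token field, and download.php are hypothetical names):

```javascript
// Step 1: validate the form data via AJAX.
// Step 2: on success, navigate the browser (a plain navigation, not AJAX)
// to the download URL, so the Content-Disposition header can trigger the
// save-as dialog without leaving the current page.
function validateThenDownload(formData) {
  $.post('validate.php', formData, function (result) {
    if (result.ok) {
      window.location.href = 'download.php?token=' + result.token;
    }
  });
}
```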
Found this thread while struggling with similar issue. Here's the workaround I ended up using:
$.post('genFile.php', { data: data }, function (url) {
    $("body").append("<iframe src='download.php?url=" + url + "' style='display: none;'></iframe>");
});
genFile.php creates the file in staging location using a randomly generated string for filename.
download.php reads the generated file, sets the MIME type and disposition (allowing the browser to prompt with a predefined name instead of the random string in the actual filename), returns the file content, and cleans up by deleting the source file.
[edit] might as well share the PHP code...
download.php:
<?php
$fname = "/tmp/".$_GET['url'];
header('Content-Type: text/xml');
header('Content-Disposition: attachment; filename="plan.xml"');
echo file_get_contents($fname);
unlink ($fname);
?>
genFile.php:
<?php
$length = 12;
$chars = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
$str = substr( str_shuffle( $chars ), 0, $length ).'.xml';
$fh = fopen(('tmp/'.$str), 'w') or die("can't open file");
fwrite($fh,$_POST["data"]);
fclose($fh);
echo $str;
?>
Rather than using jQuery's .post(), you should just do a normal POST by submitting the form, and have the server respond with appropriate Content-Encoding and MIME-type headers. You can't trigger a download through post() because jQuery encapsulates the returned data.
One thing I see in use rather frequently, though, is this:
$.post('generateFile.php', function (data) {
    // generateFile builds data and stores it in a
    // temporary location on the server, and returns
    // the URL to the requester.
    // For example, http://mysite.com/getFile.php?id=12345

    // Open a new window to the returned URL which
    // should prompt a download, assuming the server
    // is sending the correct headers:
    window.open(data);
});

PHP echo file contents

I have a PDF file which is located outside my webpage's root. I want to serve a file in ../cvs to my users using PHP.
Here is the code I have so far:
header('Content-type: application/pdf');
$file = file_get_contents('/home/eamorr/sites/eios.com/www/cvs/'.$cv);
echo $file;
But when I call this PHP page, nothing gets printed! I'd like to simply serve the stored PDF file whose name is in $cv (e.g. $cv = 'xyz.pdf').
The AJAX response to this PHP page returns the text of the PDF (gobbledy-gook!), but I want the file, not the gobbledy-gook!
I hope this makes sense.
Many thanks in advance,
Here's the AJAX I'm using
$('#getCurrentCV').click(function () {
    var params = {
        type: "POST",
        url: "./ajax/getCV.php",
        data: "",
        success: function (msg) {
            // msg is gobbledy-gook!
        },
        error: function () {
        }
    };
    var result = $.ajax(params).responseText;
});
I'd like the user to be prompted to download the file.
Don't use XHR (Ajax), just link to a script like the one below. The HTTP headers the script outputs will instruct the browser to download the file, so the user will not navigate away from the current page.
<?php
// "sendfile.php"
// Remove after testing - in particular, I'm concerned that the file is too large,
// and there's a memory_limit error happening that you're not seeing messages about.
error_reporting(E_ALL);
ini_set('display_errors', 1);
$file = '/home/eamorr/sites/eios.com/www/cvs/' . $cv;
// Check sanity and give meaningful error messages
// (also, handle errors more gracefully here; you don't want to emit details about your
// filesystem in production code)
if (!file_exists($file)) die("$file does not exist!");
if (!is_readable($file)) die("$file is unreadable!");
// Dump the file
header('Cache-Control: public');
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="some-file.pdf"');
header('Content-Length: ' . filesize($file));
readfile($file);
?>
Then, simplify your javascript:
$('#getCurrentCV').click(function () {
    document.location.href = "sendfile.php";
});
How about using readfile instead? Provided that the file exists, that should work. Make sure your web process has permission to read the directory and the file. There is an example on the readfile page that sets some headers as well.
I'm trying to prompt the user to download the pdf file.
You can't (and don't need to) send a binary download to the user's browser using Ajax. You need to send the user to an actual URL where the PDF is located.
Use #timdev's code, and point the user there using e.g.
location.href = "scriptname.php";
It sounds like you're trying to serve the PDF to the user for download via AJAX.
What you want to do is use AJAX to confirm the file exists (and passes any security checks), then simply use JS to redirect the browser to that file's URL, or in this case the URL of the PHP script delivering the PDF. When the browser gets the PDF headers it won't try to redirect the page itself, but will prompt for download, or do whatever the user's browser settings dictate.
Something like:
(js)
window.location.href = "http://example.com/getApdf.php?which=xyz";
(php)
if (!isset($_GET['which'])) die('no file specified');
$which = basename($_GET['which']); // strip path components to prevent directory traversal
if (!file_exists($which . '.pdf')) die('file doesnt exist');
header('Content-type: application/pdf');
readfile($which . '.pdf');
