Send raw data to PHP via XMLHttpRequest - php

I am selecting a file and sending it through XMLHttpRequest like this:
var upload_form = $('#upload_form'),
    file_input = $('#file_input'),
    file_list = $('#file_list'),
    submit_btn = $('#submit_btn'),
    uploaders = [];

file_input.on('change', onFilesSelected);
upload_form.on('submit', onFormSubmit);

/**
 * Loops through the selected files, displays their file name and size
 * in the file list, and enables the submit button for uploading.
 */
function onFilesSelected(e) {
    var files = e.target.files,
        file,
        list_item,
        uploader;

    for (var i = 0; i < files.length; i++) {
        file = files[i];
        uploader = new ChunkedUploader(file);
        uploaders.push(uploader);

        list_item = $('<li>' + file.name + '(' + file.size.formatBytes() + ') <button>Pause</button></li>').data('uploader', uploader);
        file_list.append(list_item);
    }

    file_list.show();
    submit_btn.attr('disabled', false);
}
So for each file that I add, I create a new ChunkedUploader object, which splits the file into small chunks (100 KB each, per the code below).
The code for the ChunkedUploader object is as follows:
function ChunkedUploader(file, options) {
    // Parentheses are needed here; "!this instanceof" negates "this" first
    if (!(this instanceof ChunkedUploader)) {
        return new ChunkedUploader(file, options);
    }

    this.file = file;
    this.options = $.extend({
        url: 'index/upload'
    }, options);

    this.file_size = this.file.size;
    this.chunk_size = (1024 * 100); // 100KB
    this.range_start = 0;
    this.range_end = this.chunk_size;

    if ('mozSlice' in this.file) {
        this.slice_method = 'mozSlice';
    } else if ('webkitSlice' in this.file) {
        this.slice_method = 'webkitSlice';
    } else {
        this.slice_method = 'slice';
    }

    this.upload_request = new XMLHttpRequest();
    // Assign the handler (bound to this instance) instead of calling it immediately
    this.upload_request.onload = this._onChunkComplete.bind(this);
}
_upload: function() {
    var self = this,
        chunk;

    // Slight timeout needed here (File read / AJAX readystate conflict?)
    setTimeout(function() {
        // Prevent range overflow
        if (self.range_end > self.file_size) {
            self.range_end = self.file_size;
        }

        chunk = self.file[self.slice_method](self.range_start, self.range_end);

        self.upload_request.open('POST', self.options.url, true);
        self.upload_request.overrideMimeType('application/octet-stream');

        if (self.range_start !== 0) {
            self.upload_request.setRequestHeader('Content-Range', 'bytes ' + self.range_start + '-' + self.range_end + '/' + self.file_size);
        }

        self.upload_request.send(chunk);
    }, 200);
},
It all works okay, but on the PHP end I receive nothing through $_GET, $_POST, or $_FILES. I can see in Firebug that raw data is being sent through POST; there is some gibberish data in the request body, which I presume is the small chunk I just cut from the original file. I have looked everywhere and I can't find anything related to this case.
Can you point out what I am doing wrong, because I have no clue.

You may want to use file_get_contents('php://input') instead: this is the raw request body, whereas $_POST is already a parsed representation.
See http://php.net/manual/en/wrappers.php.php#wrappers.php.input
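For illustration, a minimal sketch of what the server side of index/upload could look like with this approach; the target path and the JSON response below are assumptions, not part of the question's code:

<?php
// Minimal sketch of a chunk-receiving endpoint (not the asker's actual
// index/upload controller). Assumes chunks arrive in order and that the
// target file name is known on the server; the path below is a placeholder.
$target = __DIR__ . '/uploads/upload.part';

// The chunk sent with xhr.send(blob) is the raw request body. It never shows
// up in $_POST or $_FILES because it is not form-encoded.
$chunk = file_get_contents('php://input');

// Append this chunk to the partial file; FILE_APPEND keeps the earlier chunks.
file_put_contents($target, $chunk, FILE_APPEND | LOCK_EX);

// The Content-Range header set for the later chunks is visible here.
$range = isset($_SERVER['HTTP_CONTENT_RANGE']) ? $_SERVER['HTTP_CONTENT_RANGE'] : null;

header('Content-Type: application/json');
echo json_encode(array('received' => strlen($chunk), 'range' => $range));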

Related

JSON response for jQuery/Ajax in WordPress fails for larger files

I am implementing my own custom function for a historical extract in CSV format from a MySQL database, using jQuery/Ajax in a WordPress environment. I have an HTML page where the user selects a start date and an end date and clicks a button, and then the process runs.
When the JSON response is in the range of 900 KB to 1 MB, the extraction works. But when the response size grows beyond this, the AJAX call hits the error callback and returns nothing.
Below is the JavaScript file:
jQuery(document).ready(function(jQuery) {
    jQuery('#extract_btn').click(function(){
        var startdate = jQuery( '#from-date' ).val();
        var enddate = jQuery( '#to-date' ).val();
        var data1 = {
            action: 'hist_extract',
            fromdate: startdate,
            todate: enddate
        };
        // since 2.8 ajaxurl is always defined in the admin header and points to admin-ajax.php
        jQuery.ajax({
            type: "post",
            url: MyAjax1.ajaxurl,
            data: data1,
            success: function(response) {
                if (response == '')
                    return;
                alert('Got this from the server: ' + JSON.parse(response));
                JSONToCSVConvertor(response, "Historic Price", true);
            },
            error: function(xhr, status, error) {
                alert('Error in response');
                var err = JSON.parse(xhr.responseText);
                alert(err.Message);
            }
        });
    });
});
function JSONToCSVConvertor(JSONData, ReportTitle, ShowLabel) {
    alert('Start of Json Convertor');
    // If JSONData is not an object then JSON.parse will parse the JSON string into an object
    var arrData = typeof JSONData != 'object' ? JSON.parse(JSONData) : JSONData;
    var CSV = '';

    // Set report title in first row or line
    //CSV += ReportTitle + '\r\n\n';

    // This condition will generate the label/header row
    if (ShowLabel) {
        var row = "";
        // This loop extracts the labels from the first object in the array
        for (var index in arrData[0]) {
            // Now convert each value to a string, comma-separated
            row += index + ',';
        }
        row = row.slice(0, -1);
        // Append the label row with a line break
        CSV += row + '\r\n';
    }

    // First loop extracts each row
    for (var i = 0; i < arrData.length; i++) {
        var row = "";
        // Second loop extracts each column and converts it into a comma-separated string
        for (var index in arrData[i]) {
            row += '"' + arrData[i][index] + '",';
        }
        // slice() returns a new string, so the result must be assigned back
        row = row.slice(0, row.length - 1);
        // Add a line break after each row
        CSV += row + '\r\n';
    }

    if (CSV == '') {
        alert("Invalid data");
        return;
    }
    alert(CSV);

    // Generate a file name
    var fileName = "Edding_";
    // Remove blank spaces from the title and replace them with underscores
    fileName += ReportTitle.replace(/ /g, "_");

    // Generate a temporary "a" tag
    var link = document.createElement("a");
    link.id = "lnkDwnldLnk";
    // Append the anchor tag and remove it after the automatic click
    document.body.appendChild(link);

    var blob = new Blob([CSV], { type: "text/plain" });
    if (window.navigator.msSaveOrOpenBlob) { // IE hack
        window.navigator.msSaveBlob(blob, fileName + ".csv");
    } else {
        var a = window.document.createElement("a");
        a.href = window.URL.createObjectURL(blob);
        a.download = fileName + ".csv";
        document.body.appendChild(a);
        a.click(); // IE: "Access is denied"
        document.body.removeChild(a);
    }
}
Below is functions.php with the custom hook:
//----------------------------------------------------------------------------------
// Below is the custom Javascript hook for Historic Extract
//----------------------------------------------------------------------------------
function price_history() {
    $handle = 'hist_extract';
    $list = 'enqueued';
    if (wp_script_is($handle, $list)) {
        return;
    } else {
        // registering and enqueueing the Javascript/Jquery
        wp_enqueue_script('jquery');
        wp_register_script('hist_extract', get_template_directory_uri() . '/js/Historic_Price.js', array('jquery'), NULL, false);
        wp_enqueue_script('hist_extract');
        wp_localize_script('hist_extract', 'MyAjax1', array(
            // URL to wp-admin/admin-ajax.php to process the request
            'ajaxurl' => admin_url('admin-ajax.php'),
            // generate a nonce with a unique ID "myajax-post-comment-nonce"
            // so that you can check it later when an AJAX request is sent
            'security' => wp_create_nonce('my-special-string')
        ));
        error_log('Js for Historic Price loaded successfully');
        error_log(admin_url('admin-ajax.php'));
    }
}
add_action('wp_enqueue_scripts', 'price_history');
//----------------------------------------------------------------------------------
// Custom function that handles the AJAX hook for Historic Extract
//----------------------------------------------------------------------------------
function historic_data_extract() {
    error_log('Start of report data function on ajax callback');
    // check_ajax_referer( 'my-special-string', 'security' );
    $from_date = $_POST['fromdate'];
    $to_date = $_POST['todate'];
    $convert_from_date = date("Y-m-d", strtotime($from_date));
    $convert_to_date = date("Y-m-d", strtotime($to_date));
    error_log($from_date);
    error_log($to_date);
    error_log($convert_from_date);
    error_log($convert_to_date);
    //******************************************
    // Custom code for fetching data from the server database
    //******************************************
    //header("Content-Type: application/json; charset=UTF-8");
    define("dbhost", "localhost");
    define("dbuser", "xxxxxxxxx");
    define("dbpass", "xxxxxxxxx");
    define("db", "xxxxxxxx");
    $emparray = array();
    $conn = mysqli_connect(dbhost, dbuser, dbpass, db);
    // Change character set to utf8
    mysqli_set_charset($conn, "utf8");
    if ($conn) {
$query = "SELECT PR_PRICE_HIST_TBL.PR_PRODUCT_ID,PR_PRICE_HIST_TBL.PR_URL_ID,PR_PRICE_HIST_TBL.PR_SHOP_NAME,PR_PRICE_HIST_TBL.PR_LAST_CHECKED,PR_PRICE_HIST_TBL.PR_CUST_PROD_CODE,PR_PRICE_HIST_TBL.PR_PRODUCT_NAME,PR_PRICE_HIST_TBL.PR_LAST_PRICE,PR_PRICE_HIST_TBL.PR_CONV_PRICE,PR_PRICE_HIST_TBL.PR_DOMAIN,PR_PRICE_HIST_TBL.PR_COUNTRY_CODE,PR_PRICE_HIST_TBL.PR_AVAILABLE,PR_PRICE_HIST_TBL.PR_AVAIL_DESCR,PR_PRICE_HIST_TBL.PR_PRICE_TIME,PR_PRICE_HIST_TBL.PR_FAULT_FLAG,PR_PRICE_HIST_TBL.PR_FAULT_TIME,PR_PRICE_HIST_TBL.PR_FAULT_MSG,TABLE_72.MIN_PRICE,TABLE_72.MAX_PRICE,TABLE_72.AVG_PRICE,TABLE_72.DEV_PRICE
FROM PR_PRICE_HIST_TBL
INNER JOIN TABLE_72 ON PR_PRICE_HIST_TBL.PR_URL_ID=TABLE_72.PR_URL_ID AND
PR_PRICE_HIST_TBL.PR_SHOP_NAME=TABLE_72.PR_SHOP_NAME AND
PR_PRICE_HIST_TBL.PR_PRODUCT_NAME=TABLE_72.PR_PRODUCT_NAME
AND PR_PRICE_HIST_TBL.PR_LAST_CHECKED BETWEEN '$convert_from_date' AND '$convert_to_date';";
        error_log($query);
        $result_select = mysqli_query($conn, $query);
        error_log(mysqli_num_rows($result_select));
        error_log(mysqli_error($conn));
        while ($row = mysqli_fetch_assoc($result_select)) {
            error_log(json_encode($row));
            $emparray[] = $row;
        }
        //error_log(json_encode($emparray));
        echo json_encode($emparray);
        die();
    }
}
add_action('wp_ajax_hist_extract', 'historic_data_extract');
add_action('wp_ajax_nopriv_hist_extract', 'historic_data_extract');
From the code above you can see that I have tried to implement many things I found on different forums, but I am stuck here; I am not able to work out where the potential problem could be. FYI, I am hosting this on a GoDaddy server. I tried the following things:
Tried to make the query execution faster by removing views from the join. The query now seems to fetch results in around 15 seconds.
Formatted the data as JSON and tried async: false, but that did not work.
Tried to modify values in php.ini, but to no avail (a quick way to verify which values are actually in effect is sketched after this list):
upload_max_filesize = 64M
post_max_size = 64M
memory_limit = 400M
file_uploads = On
max_execution_time = 300
Implemented the error:function callback for the AJAX response. Only the first alert('Error in response'); fires, but I cannot see the XHR response text.
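For reference, a small diagnostic sketch like the one below (not part of the original code) prints the values PHP is actually applying, which is worth checking on shared hosting where a local php.ini is sometimes ignored:

<?php
// Diagnostic sketch: print the effective values of the directives listed above.
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit',
               'file_uploads', 'max_execution_time') as $directive) {
    echo $directive, ' = ', ini_get($directive), PHP_EOL;
}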
Any help is appreciated. Please let me know if I missed something or if you need more information.
The best way to solve the issue is to have a good night's sleep.
Thanks for your clues.
The issue was: in the network panel, my AJAX request was going into a cancelled status.
Solution: I was missing preventDefault() in my function.
Now I can see MBs of JSON response. Thanks again for the clues provided.
Currently preventDefault() has solved my issue. If you feel anything else in my code also needs to be taken care of as a best practice, please do not hesitate to comment.
Thanks.

PHP/Curl progress while downloading

I'm using a system curl command via PHP to download a file. That all works fine, but I'm now trying to show some progress as the file is downloaded.
The curl command and the PHP that calls it are:
$a = popen("cd user; curl -O -# remote.server.com/filename.tar.gz 2>&1", 'r');
ob_flush(); flush();
while ($b = fgets($a, 64)) {
    ob_flush(); flush();
    if (preg_match('~\b(error|fail|unknown)\b~i', $b)) {
        echo "error^$b";
        exit;
    }
    echo str_replace('#', '', $b);
    ob_flush(); flush();
}
pclose($a);
This is called using ajax and the output is displayed in a div:
var last_response_len = false;
$.ajax(url, {
    xhrFields: {
        onprogress: function(e) {
            var this_response, response = e.currentTarget.response;
            if (last_response_len === false) {
                this_response = response;
                last_response_len = response.length;
            } else {
                this_response = response.substring(last_response_len);
                last_response_len = response.length;
            }
            $(upg).show();
            console.log(this_response);
            var count = this_response.match(/%/g);
            if (count !== null && count.length != 2) $(msg).html(this_response);
        }
    }
});
This works, but the results shown in the msg div are not consistent.
I may get:
1%
5%
12.1%
12.5% 13.2%
14.2%
5.3%
16.7%
I get partial results, e.g. 5.3% instead of 15.3%, and I get multiple results within the same output, e.g. 12.5% and 13.2%.
Is there any way to standardise this so I only get 0% through to 100%?
Thanks
Change the PHP str_replace to
str_replace(['#', ' '], '', $b);
Then you get only the percentage, without the preceding blanks, which you can insert into the container without further editing.
Example:
https://3v4l.org/OGWqU
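For context, this is the question's read loop with the suggested str_replace applied (a trimmed sketch of the original code, nothing else changed):

<?php
// Sketch: the read loop from the question with the suggested change.
// curl's -# progress meter prints '#' characters plus a percentage;
// stripping '#' and spaces leaves only the percentage, e.g. "15.3%".
$a = popen("cd user; curl -O -# remote.server.com/filename.tar.gz 2>&1", 'r');
while ($b = fgets($a, 64)) {
    echo str_replace(['#', ' '], '', $b);
    ob_flush();
    flush();
}
pclose($a);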

force-download xlsx from ajax response not working

I have a little problem with downloading my xlsx file.
I am sending my request for the file via jQuery Ajax, and on the backend the data is correctly collected and assembled into an xlsx file. On the way back to the frontend I set all the headers in preparation to force-download the file, but the download never starts.
These are the response headers of my request:
Connection Keep-Alive
Content-Disposition attachment; filename="export.xlsx"
Content-Length 346420
Content-Type application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Date Mon, 23 Nov 2015 13:23:30 GMT
Keep-Alive timeout=5, max=91
Server Apache/2.4.16 (Win32) OpenSSL/1.0.1p PHP/5.6.12
Set-Cookie <cookiesettings>
content-transfer-encoding binary
x-powered-by PHP/5.6.12
IMHO the download should start immediately, but nothing happens.
EDIT:
Until now I used a form submit, but the amount of data is really big, so the time needed to assemble the file is also really long, sometimes a couple of minutes or even an hour, so this was no longer possible.
So I built a Java job to build the file and started an AJAX snippet which asks for completion every second or so.
So here is my code.
Frontend:
This is called on button-click
download: function (type, maximum) {
    var
        self = this,
        oParams = this.oTable.oApi._fnAjaxParameters(this.oTable.fnSettings()),
        aoPost = [
            { 'name': 'exportType', 'value': type },
            { 'name': 'exportMax', 'value': maximum },
            { 'name': 'handleId', 'value': self.options.handleId }
        ],
        nIFrame, nContentWindow, nForm, nInput, i
    ;

    // Call a self-made function to get extra search parameters
    // without triggering a data-update AJAX call.
    self.oTable.fnSettings().addAdditionalSearchData(oParams);

    // Create an IFrame to do the request
    nIFrame = document.createElement('iframe');
    nIFrame.setAttribute('id', 'RemotingIFrame');
    nIFrame.style.border = '0px';
    nIFrame.style.width = '0px';
    nIFrame.style.height = '0px';
    document.body.appendChild(nIFrame);

    nContentWindow = nIFrame.contentWindow;
    nContentWindow.document.open();
    nContentWindow.document.close();
    nForm = nContentWindow.document.createElement('form');
    nForm.className = 'export-table';
    nForm.setAttribute('method', 'post');

    // Add POST data.
    var formData = {};
    for (i = 0; i < aoPost.length; i++) {
        nInput = nContentWindow.document.createElement('input');
        nInput.setAttribute('name', aoPost[i].name);
        nInput.setAttribute('type', 'text');
        nInput.value = aoPost[i].value;
        nForm.appendChild(nInput);
        formData[aoPost[i].name] = aoPost[i].value;
    }

    // Add dataTables POST.
    for (i = 0; i < oParams.length; i++) {
        nInput = nContentWindow.document.createElement('input');
        nInput.setAttribute('name', oParams[i].name);
        nInput.setAttribute('type', 'text');
        nInput.value = oParams[i].value;
        nForm.appendChild(nInput);
        formData[oParams[i].name] = oParams[i].value;
    }

    nForm.setAttribute('action', '/service/exportTableData');

    // Add the form to the iFrame.
    nContentWindow.document.body.appendChild(nForm);

    // Send the request.
    //nForm.submit();
    var form = $(nContentWindow.document.body).find('form.export-table');
    var jobId = 0;
    form.ajaxForm(
        {
            'showMessagesOnSuccess': false
        },
        {
            'getData': function () {
                return formData;
            }
        }
    ).data('ajaxForm').submit();
}
The Ajax request on submit:
$.ajax({
    type: 'POST',
    url: self.handler.getServiceUrl(),
    timeout: GLOBALS.AJAX_REQUEST_TIMEOUT,
    cache: false,
    data: (<get the Data>),
    success: function (response) {
        if (response.success === true) {
            // Check if we have to wait for a result.
            if (response.jobId !== undefined && response.jobId !== 0) {
                self.checkJobStatus(response.jobId);
            } else {
                <success - show some messages>
            }
        } else {
            self.handler.error(response);
        }
    },
    error: function () {
        <Show error Message>
    }
});
The CheckJobStatus:
checkJobStatus: function (jobId) {
    var self = this;
    $.ajax({
        type: 'POST',
        timeout: GLOBALS.AJAX_REQUEST_TIMEOUT,
        cache: false,
        data: { 'jobId': jobId },
        url: self.handler.getServiceUrl(),
        success: function (response) {
            if (response !== null && response.data !== undefined) {
                if (response.data.isFinished === true) {
                    if (response.success === true) {
                        // Check if we have to wait for a result.
                        self.handler.success(response);
                    } else {
                        self.handler.error(response);
                    }
                } else if (response.success === true && response.data !== null) {
                    setTimeout(
                        function () {
                            self.checkJobStatus(jobId);
                        },
                        500
                    );
                } else {
                    Helper.logFrontendError();
                }
            } else if (response !== null && response.success === true) {
                setTimeout(
                    function () {
                        self.checkJobStatus(jobId);
                    },
                    1000
                );
            } else {
                Helper.logFrontendError();
            }
        },
        error: function (response) {
            Helper.logFrontendError();
        }
    });
}
Backend - php:
(...)
if ($action == 'exportTableData' || $action == 'exportChartData') {
    $responseData = $service->execute();
    if (isset($responseData->data['contentType']) && $responseData->data['contentType'] != null && isset($responseData->data['data'])) {
        $this->sendTextData($responseData->data['contentType'], $responseData->data['data']);
    } else {
        $this->sendJsonData($responseData);
    }
} else {
    $this->sendJsonData($service->execute());
}
(...)
private function sendTextData($contentType, $data) {
    $this->set('filename', 'export.xlsx');
    $this->set('data', $data);
    $this->response->type($contentType);
    $this->render('/Layouts/excel', 'excel');
}
(...)
$handlerResult = new HandlerResult();
if ($dataServiceResult == null) {
    $service = new DataService();
    $dataServiceResult = $service->exportTableData(
        $controller->Auth->User('id'),
        json_encode($request->data),
        null
    );
} else {
    if ($dataServiceResult->header->resultKey == 0) {
        $handlerResult->wsData['data'] = $dataServiceResult->data;
        $handlerResult->wsData['contentType'] = $dataServiceResult->contentType;
    }
}
$handlerResult->wsResultHeader = $dataServiceResult->header;
return $handlerResult; // ++++ this result returns to the first codeblock in this section ++++
Backend - java - This is where the File is assembled:
(...)
if (jobId > 0) {
    FrontendJobStatus status = FrontendJobQueue.getJobStatus(context.userId, jobId);
    this.result = (WSExportTableDataResult) status.getResult();
    logger.info((this.result.data == null) ? "ByteArray is EMPTY" : "ByteArray is NOT EMPTY");
} else {
    this.jobId = FrontendJobQueue.addJob(this.context.userId, new ExportTableDataJob(this.context, this.postData));
    this.result.header.jobId = this.jobId;
}
(...)
The job:
<Workbook assembly>
ByteArrayOutputStream out = new ByteArrayOutputStream();
wb.write(out);
this.result.data = out.toByteArray();
this.result.contentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
// this.result.contentType = "application/vnd.ms-excel";
this.result.setResultHeader(APIConstants.RESULT_SUCCESS);
Layout/excel:
<?php
header('Content-Disposition: attachment; filename="'.$filename.'"');
header('Content-Transfer-Encoding: binary');
ob_clean();
echo $data;
EDIT 2:
So I tried to open a new window on success with the data, and I could start the download, but the file is no longer a valid xlsx file.
var reader = new FileReader();
var blob = new Blob([response.responseText], { type: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet" });
reader.readAsDataURL(blob);
reader.onloadend = function (e) {
    window.open(reader.result, 'Excel', 'width=20,height=10,toolbar=0,menubar=0,scrollbars=no', '_blank');
};
Any Ideas?
After a lot of research I found this site; the essence of its statement is that jQuery Ajax does not support receiving binary data, but it provides a solution using a plain XHR request, which does support blob transfers.
The site:
http://www.henryalgus.com/reading-binary-files-using-jquery-ajax/
To expand on my comment: instead of trying to send back binary data via Ajax, simply save it to a temp file and send the file reference back to the JS. On receiving the file reference, set window.location.href to point to a file-reading endpoint, passing the file reference along. I have done this a few times and it works fine even on ancient browsers:
$('#start').click(function(){
    $.post('/createfile.php', {some: data}, function(response){
        if (response.started) {
            pollFile(response.fileId);
        }
    });
});

function pollFile(fileId){
    $.get('/filestatus.php?fileid=' + fileId, function(response){
        if (response.fileCompleted) {
            window.location.href = '/downloadfile.php?fileid=' + fileId;
        } else {
            setTimeout(pollFile, 5000, fileId);
        }
    });
}
//createfile.php
$fileId = uniqid();
SomePersistentStorage::addJob($fileId);
//start the file job here; the code should run in a separate process/thread, so
//either use a job queue system, use shell_exec or make an http request,
//then once the job is queued/started:
header('Content-Type: application/json');
echo json_encode(['started' => true, 'fileId' => $fileId]);

//processjob.php - the file that does the work, could be your java
//program for example, just using php here for consistency
//after the file is done
file_put_contents($filepath, $data);
SomePersistentStorage::updateJob($fileId, true);

//filestatus.php
$fileId = $_GET['fileid'];
header('Content-Type: application/json');
echo json_encode(['fileCompleted' => SomePersistentStorage::isJobCompleted($fileId)]);

//downloadfile.php
$fileId = $_GET['fileid'];
$filepath = 'tmp/' . $fileId . '.tmp';
//correct headers here, then
readfile($filepath);
unlink($filepath);
If you don't want to immediately delete the file, you could just run a cron job to delete files in that specific folder that are older than x.
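Along those lines, a minimal cleanup sketch (the tmp/ path and the one-hour threshold are assumptions, chosen to match the snippets above):

<?php
// cleanup.php - run from cron, e.g. hourly.
// Deletes generated temp files older than one hour (the threshold is an assumption).
$dir = __DIR__ . '/tmp';
$maxAge = 3600; // seconds

foreach (glob($dir . '/*.tmp') as $file) {
    // filemtime() returns the file's last-modified timestamp
    if (is_file($file) && (time() - filemtime($file)) > $maxAge) {
        unlink($file);
    }
}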

drag-and-drop image upload not working on server

I am trying to implement a drag-and-drop image upload.
I found a rather simple script online and adapted it to my use.
On my local installation the file uploads perfectly fine, but not on the server.
From my debugging attempts, $_SERVER['HTTP_X_FILENAME'] does not even get set by PHP.
I tried the following:
- Making sure that the upload folder is set to 755
- Changing the php temporary upload path and increasing the maximum allowed file size
No PHP or JS errors of any kind occur.
Since I have the die(print_r($_SERVER)); in the PHP, I get the $_SERVER dump in the Chrome inspector; it does not contain the HTTP_X_FILENAME index.
My PHP is:
<?php
$fn = (isset($_SERVER['HTTP_X_FILENAME']) ? $_SERVER['HTTP_X_FILENAME'] : false);
if ($fn) {
    // AJAX call
    file_put_contents(
        '../usr/photos/' . $fn,
        file_get_contents('php://input')
    );
    echo "$fn uploaded";
    exit();
} else {
    // form submit
    if (!$_FILES['fileselect']) die(print_r($_SERVER));
    else $files = $_FILES['fileselect'];
    foreach ($files['error'] as $id => $err) {
        if ($err == UPLOAD_ERR_OK) {
            $fn = $files['name'][$id];
            move_uploaded_file(
                $files['tmp_name'][$id],
                '../usr/photos/' . $fn
            );
            echo "<p>File $fn uploaded.</p>";
        }
    }
}
The js is as follows:
//Drag and drop photo upload
(function() {
    // getElementById
    function $id(id) {
        return document.getElementById(id);
    }

    // output information
    function Output(msg) {
        var m = $id("messages");
        m.innerHTML = msg + m.innerHTML;
    }

    // file drag hover
    function FileDragHover(e) {
        e.stopPropagation();
        e.preventDefault();
        e.target.className = (e.type == "dragover" ? "hover" : "");
    }

    // file selection
    function FileSelectHandler(e) {
        // cancel event and hover styling
        FileDragHover(e);
        // fetch FileList object
        var files = e.target.files || e.dataTransfer.files;
        // process all File objects
        for (var i = 0, f; f = files[i]; i++) {
            ParseFile(f);
            UploadFile(f);
        }
    }

    // output file information
    function ParseFile(file) {
        /*Debug*/
        Output(
            "<p>File information: <strong>" + file.name +
            "</strong> type: <strong>" + file.type +
            "</strong> size: <strong>" + file.size +
            "</strong> bytes</p>"
        );
        // display an image
        if (file.type.indexOf("image") == 0) {
            var reader = new FileReader();
            reader.onload = function(e) {
                Output(
                    "<p>" +
                    //"<strong>" + file.name + ":</strong><br />" +
                    '<img width="130" height="100" src="' + e.target.result + '" />' +
                    '<br />' +
                    '<input type="text" name="photo_name" value="' + file.name + '" />' +
                    '<br />' +
                    '<input type="text" name="photo_caption" value="Caption" /></p>'
                );
            }
            reader.readAsDataURL(file);
        }
        // display text
        if (file.type.indexOf("text") == 0) {
            var reader = new FileReader();
            reader.onload = function(e) {
                Output(
                    "<p><strong>" + file.name + ":</strong></p><pre>" +
                    // escape angle brackets so the text renders literally inside the <pre>
                    e.target.result.replace(/</g, "&lt;").replace(/>/g, "&gt;") +
                    "</pre>"
                );
            }
            reader.readAsText(file);
        }
    }

    // upload JPEG files
    function UploadFile(file) {
        // following line is not necessary: prevents running on SitePoint servers
        if (location.host.indexOf("sitepointstatic") >= 0) return;
        var xhr = new XMLHttpRequest();
        if (xhr.upload && (file.type == "image/jpeg" || file.type == "image/png") && file.size <= $id("MAX_FILE_SIZE").value) {
            // create progress bar
            var o = $id("progress");
            var progress = o.appendChild(document.createElement("p"));
            progress.appendChild(document.createTextNode("upload " + file.name));
            // progress bar
            xhr.upload.addEventListener("progress", function(e) {
                var pc = parseInt(100 - (e.loaded / e.total * 100));
                progress.style.backgroundPosition = pc + "% 0";
            }, false);
            // file received/failed
            xhr.onreadystatechange = function(e) {
                if (xhr.readyState == 4) {
                    progress.className = (xhr.status == 200 ? "success" : "failure");
                }
            };
            // start upload
            xhr.open("POST", $id("upload").action, true);
            xhr.setRequestHeader("X_FILENAME", file.name);
            xhr.send(file);
        }
    }

    // initialize
    function Init() {
        var fileselect = $id("fileselect"),
            filedrag = $id("filedrag"),
            submitbutton = $id("submitbutton");
        // file select
        fileselect.addEventListener("change", FileSelectHandler, false);
        // is XHR2 available?
        var xhr = new XMLHttpRequest();
        if (xhr.upload) {
            // file drop
            filedrag.addEventListener("dragover", FileDragHover, false);
            filedrag.addEventListener("dragleave", FileDragHover, false);
            filedrag.addEventListener("drop", FileSelectHandler, false);
            filedrag.style.display = "block";
            // remove submit button
            submitbutton.style.display = "none";
        }
    }

    // call initialization
    if (window.File && window.FileList && window.FileReader) {
        Init();
    }
})();
Thank you in advance.
You will probably have solved your problem by now, but I'm posting this solution here to help others who come here with the same problem. In your JS there is a line that reads
xhr.setRequestHeader("X_FILENAME", file.name);
but it should read
xhr.setRequestHeader("X-FILENAME", file.name);
since header names containing underscores are dropped by later Apache releases (see also Header names with underscores ignored in php 5.5.1 / apache 2.4.6).
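Once the header is sent with a dash, PHP still exposes it under the underscore form of the key, so the existing check keeps working; a small sketch of the server-side read (the apache_request_headers() fallback is just an optional extra, not something the original script requires):

<?php
// With the JS sending "X-FILENAME", Apache passes the header through and PHP
// exposes it as $_SERVER['HTTP_X_FILENAME'] (dashes become underscores in the key).
$fn = isset($_SERVER['HTTP_X_FILENAME']) ? $_SERVER['HTTP_X_FILENAME'] : false;

// Optional fallback: read the raw request headers directly (Apache SAPI only).
if ($fn === false && function_exists('apache_request_headers')) {
    $headers = apache_request_headers();
    if (isset($headers['X-FILENAME'])) {
        $fn = $headers['X-FILENAME'];
    }
}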
I had this problem on one of my Ubuntu WAMP installations. Your upload URL (the POST URL specified on the JavaScript side of things) needs to be a fully qualified path, not a relative path. I can't see the value here, but it appears to be whatever $id("upload").action is in your code. You can confirm this is the cause by looking at the Apache logs, if you have access to them. If you see 404 errors when trying to send a file, then this is your problem. That's assuming the request even hits your server at all.

jQuery post size limitation using canvas.toDataURL()

I'm trying to upload a canvas.toDataURL() image (from getUserMedia) to the server using jQuery post and PHP to handle the data, but I'm having some problems. All the images I upload end up corrupted; half of the image is missing. I also have a MySQL database where I store data related to the image (title, text, date and the like). It seems that the more related data I have, the more the image gets corrupted.
Therefore, I'm wondering: is this a browser limitation or does it have something to do with jQuery post? I've also checked the PHP post_max_size and it's 16 MB, so that shouldn't be a problem. I don't have access to the server settings. I'm quite puzzled by this; what can I do? Is it possible to divide the canvas.toDataURL() output into multiple parts and then post them?
JavaScript
window.addEventListener('DOMContentLoaded', function() {
    var video = document.getElementById('videoStream');
    var canvas = document.getElementById('canvasImage');
    var status = document.getElementById('status');
    var button = document.getElementById('button');
    //var others = document.getElementById('others');
    var imageHolder;
    document.getElementById('form').style.display = 'none';
    var image = null; // the image data URI that gets sent to PHP

    window.URL || (window.URL = window.webkitURL || window.mozURL || window.msURL);
    navigator.getUserMedia || (navigator.getUserMedia = navigator.mozGetUserMedia || navigator.webkitGetUserMedia || navigator.msGetUserMedia);

    // toString : function() {return "video,audio";} is for Canary
    if (navigator.getUserMedia) {
        navigator.getUserMedia({video: true, audio: false, toString : function() {return "video,audio";}}, onSuccess, onError);
    } else {
        status.innerText = "getUserMedia is not supported in your browser, sorry :(";
    }

    function onSuccess(stream) {
        var source;
        if (window.webkitURL) {
            source = window.webkitURL.createObjectURL(stream);
        } else {
            source = stream; // Opera and Firefox
        }
        video.width = 500;
        video.height = 375;
        video.autoplay = true;
        video.src = source;
    }

    function onError() {
        status.innerText = "Please allow access to your webcam.";
    }

    button.addEventListener('mousedown', function() {
        // Remove the previous image if there is one
        //document.body.removeChild(imageHolder);
        // recreate the image holder
        imageHolder = document.createElement('figure');
        imageHolder.id = 'imageHolder';
        document.body.appendChild(imageHolder);
        img = document.createElement('img');
        imageHolder.appendChild(img);
        // the captured image is the same size as the video
        canvas.width = video.width;
        canvas.height = video.height;
        img.width = 350;
        img.height = 225;
        // draw the current video frame onto the canvas
        var context = canvas.getContext('2d');
        context.drawImage(video, 0, 0, canvas.width, canvas.height);
    }, false);

    button.addEventListener('mouseup', function() {
        // Turn the canvas into an image so it can be saved to disk
        canvas.style.display = 'none';
        video.style.display = 'none';
        button.style.display = 'none';
        others.style.display = 'none';
        document.getElementById('form').style.display = 'block';
        image = canvas.toDataURL('image/png');
        img.src = image;
    }, false);

    // jQuery post
    $('#send').click(function(){
        var image2 = image.replace('data:image/png;base64,', '');
        $.post('upload.php',
            {
                title: $('#title').val(),
                blog: $('#blog').val(),
                category: $('#category').val(),
                author: $('#author').val(),
                imagename: image2
            });
    });
}, false);
PHP upload.php
define('UPLOAD_DIR', 'images/');
$img = $_POST['imagename'];
$img = str_replace(' ','+', $img);
$data = base64_decode($img);
$file = UPLOAD_DIR . uniqid() . '.png';
$success = file_put_contents($file, $data);
print $success ? $file : 'Saving the file just would not work at all...';
$imagename = $file; // this is the file name for the MySQL database
My problem is (I think) image = canvas.toDataURL('image/png'); and the jQuery post.
The canvas.toDataURL() string is about 700,000 characters long.
You might want to try this:
<?php
$decoded = "";
for ($i = 0; $i < ceil(strlen($encoded) / 256); $i++) {
    $decoded = $decoded . base64_decode(substr($encoded, $i * 256, 256));
}
?>
I got it from here: http://www.php.net/manual/en/function.base64-decode.php#92980
The code basically decodes the base64 string in parts, 256 bytes at a time. I haven't tested this though; I've never dealt with base64 images as big as what you're working with.
Split it, use two variables, and merge them in PHP; works fine. ;-)
var resourcelength_all = resource.length;
var resourcelength_split = resourcelength_all / 2;
var resource_part1 = resource.substr(0, resourcelength_split);
var resource_part2 = resource.substr(resourcelength_split, resourcelength_all);
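On the PHP side, the merge step would look something like the sketch below; the POST field names resource_part1 and resource_part2 are hypothetical, chosen to match the JavaScript variables above:

<?php
// Sketch of the merge step, assuming the two halves arrive as the POST fields
// 'resource_part1' and 'resource_part2' (hypothetical names).
$part1 = isset($_POST['resource_part1']) ? $_POST['resource_part1'] : '';
$part2 = isset($_POST['resource_part2']) ? $_POST['resource_part2'] : '';

// Re-join the halves, strip the data-URI prefix and undo the '+' -> ' ' mangling
// that URL-encoded POST bodies introduce, then decode.
$dataUri = $part1 . $part2;
$base64  = str_replace(' ', '+', preg_replace('#^data:image/\w+;base64,#', '', $dataUri));
$binary  = base64_decode($base64);

file_put_contents('images/' . uniqid() . '.png', $binary);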
