I'm trying to read the file size of a file while it is being transferred, this way:
1. Start an $.ajax() request that kicks off a server-to-server download of the file (I'm using PHP's fopen + stream_copy_to_stream).
2. Start a second $.ajax() request at regular intervals and try to read the current file size until the download process ends.
Unfortunately this doesn't work as expected.
Both requests are made correctly, but the file size is only read after the file has been fully transferred, so I get all the alert messages at the end of the process instead of during it.
I guess I'm doing something that can't be done? Or am I just missing something?
pseudo code:
var uploadStart = 0;
function startUploadProgressBar(file) {
    uploadStart = setInterval(function() {
        $.ajax({
            data: 'uploadProgress=1&file=' + file,
            dataType: 'json',
            success: function(json) {
                alert(json.filesize);
            }
        });
    }, 1000);
}
$('#submit').click(function() {
    var file = somevar;
    startUploadProgressBar(file);
    $.ajax({
        success: function(json) {
            clearInterval(uploadStart);
        }
    });
    return false;
});
Just for uploading files via ajax, you can consider using the jQuery forms plugin. It supports progress bars, as far as I know.
If you want to stick to your method, read through the comments on stream_copy_to_stream()'s manual page in the PHP manual. There are alternatives there that can be modified to output the state of the transfer.
Why not just have the PHP page return the size of the file itself using filesize()? That way you don't need to poll repeatedly.
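If you do keep polling the growing file's size, two PHP behaviours are worth ruling out: the session lock (if the copying script and the polling script share a session, the polling request blocks until the copy finishes unless the lock is released with session_write_close()) and the stat cache (relevant if filesize() is called repeatedly within one request; clear it with clearstatcache()). A minimal sketch of such a progress endpoint follows; the 'file' parameter and the downloads directory are assumptions, adjust them to your setup.
<?php
// progress.php - minimal sketch; the 'file' parameter and the downloads
// directory are assumptions.
session_start();
session_write_close();            // release the session lock so this request
                                  // is not blocked by the long-running copy
$file = basename($_GET['file']);  // never trust raw paths from the client
$path = __DIR__ . '/downloads/' . $file;

clearstatcache(true, $path);      // make sure filesize() is not served from the stat cache

header('Content-Type: application/json');
echo json_encode(array(
    'filesize' => file_exists($path) ? filesize($path) : 0
));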
I think I have this narrowed down to being a problem with trying to download a file through the AJAX request. I need to push the array to the PHP script, but the zip download that the script pushes back has to be redirected to a new window. I'm not sure how to fit that into my jQuery.
I'm not sure if it counts as a duplicate of other questions involving downloading from PHP, because the ones I looked at didn't pass data through the AJAX call.
I am passing an array from jQuery to PHP through AJAX and getting a streaming zip download of files from my database. The script works when I navigate to it directly and downloads the zip of files as expected. It does not, however, work when I try to trigger it through my jQuery. I believe this has to do with the download needing to be redirected to a new window?
jQuery:
if (confirm("Download data for the following samples: " + download_list.toString())) {
    alert("Download Beginning - Will take several minutes per sample selected - DO NOT NAVIGATE AWAY FROM PAGE.");
    $.ajax({
        type: "POST",
        url: myURL + "zip_download.php",
        data: {download_listArray: download_list},
        success: function(){
            alert("OK");
        }
    });
}
PHP:
<?php
$sample_name_list = $_POST['download_listArray'];
foreach ($sample_name_list as $i => $sample_name){
//do stuff
}
//stream zip
?>
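Worth knowing: a zip streamed back into an $.ajax() success callback never reaches the browser's save dialog, which is why navigating to the script directly works while the AJAX call does not. The usual workaround is to submit the array through a plain (possibly hidden) form, or to set window.location with the data in the query string, so the browser itself handles the response. However the request arrives, the streaming part of the PHP needs download headers before any zip bytes are sent. A rough sketch follows; the question's real streaming code is not shown, so this uses ZipArchive with a temporary file purely for illustration.
<?php
// zip_download.php - illustrative sketch only; the real script streams the
// zip differently. Builds the archive in a temp file, then sends it.
$sample_name_list = $_POST['download_listArray'];

$tmp = tempnam(sys_get_temp_dir(), 'zip');
$zip = new ZipArchive();
$zip->open($tmp, ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($sample_name_list as $i => $sample_name) {
    // "do stuff": each sample's data is added here as a placeholder entry
    $zip->addFromString($sample_name . '.txt', 'data for ' . $sample_name);
}
$zip->close();

// download headers must go out before any zip bytes
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="samples.zip"');
header('Content-Length: ' . filesize($tmp));
readfile($tmp);
unlink($tmp);
exit;
?>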
I have a large CSV file which I am uploading to the WordPress dashboard to import taxonomy terms. I wrote a small plugin which uses the wp_insert_term() function to insert each term; however, the function caches a lot of its data in order to check slug uniqueness and parent term dependencies, so the process runs out of memory at around 1000 terms, despite increasing the memory allocation to 0.5 GB.
I have been wanting to split the file into manageable chunks so as to batch process the data and run sessions limited to 1000 or so lines of data; that way each process would terminate cleanly.
I have been looking around for such a solution and found an interesting article about a similar issue faced by bulk image imports, which outlines how the developers used JavaScript to control the batch process by sending AJAX requests to the server in manageable chunks.
It gave me the idea of reading the CSV file on upload, line by line, and sending AJAX requests to the server to process a manageable number of lines at a time.
Is there a better way to achieve this?
I developed the following solution based on the links in the question and some additional tinkering.
On the WordPress server side, when enqueuing the JavaScript file, I determine the number of lines the server can handle based on its memory allocation:
$limit = ini_get('memory_limit');
$limit = wp_convert_hr_to_bytes($limit) / MB_IN_BYTES; //in MBs.
switch (true) {
    case $limit >= 512:
        $limit = 1000;
        break;
    default:
        $limit = 500;
        break;
}
wp_enqueue_script( 'my-javascript-file');
wp_localize_script( 'my-javascript-file', 'cirData', array(
    'limit' => $limit
));
You should determine and set your own limit to suit your own process.
In the JavaScript file, using jQuery:
var reader, formData, lineMarker = 0, csvLines, isEOF = false, $file, $form;
$(document).ready(function(){
    $file = $(':file'); //file input field
    $form = $('form');  //form
    //when the file field changes....
    $file.on('change', function(){
        //check if the file field has a value.
        if($file.val()){
            //setup file reader.
            reader = new FileReader();
            //now listen for when the file is ready to be read.
            reader.addEventListener('load', function (e) {
                csvLines = e.target.result.split("\n");
                batchProcess(); //launch process.
            });
            //when the form is being submitted, start reading the file.
            $(document).on('click', ':submit', function(e){
                e.preventDefault(); //disable normal submit.
                //setup data for the ajax.
                formData = new FormData($form.get(0));
                //read the file and batch request to server.
                reader.readAsBinaryString($file.get(0).files[0]);
            });
        }
    });
});
// Methods
//posting
function postCSVdata(csvdata){
    formData.set('csvlines', csvdata); //set the current data to send.
    $.ajax({
        type: 'POST',
        url: $form.attr('action'),
        data: formData,
        contentType: false,
        processData: false,
        cache: false,
        success: function(data){
            if(isEOF){ //is this the end of the file?
                console.log("success!");
            }else{ //continue reading the file.
                console.log("uploaded: " + Math.round(lineMarker/csvLines.length*100) + "%");
                batchProcess(); //process the next part of the file.
            }
        }
    });
}
//batch process.
function batchProcess(){
    //csvLines is the array containing all the lines read from the file.
    //lineMarker is the index of the next line to read.
    //never read more lines than the limit the server reported via cirData.
    var stop = Math.min(csvLines.length - lineMarker, cirData.limit),
        parsedata = '', i;
    for(i = 0; i < stop; i++) {
        parsedata += csvLines[i + lineMarker] + "\n"; //add a new line char for server to process.
    }
    lineMarker += stop;
    if(lineMarker >= csvLines.length) isEOF = true; //the whole file has been read.
    postCSVdata(parsedata); //send to server.
}
This sends multiple AJAX requests sequentially, in chunks of lines that the server is able to handle without hitting a fatal memory error.
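The PHP side that receives these chunks is not shown above. A rough sketch of what a WordPress handler behind the form's action could look like, assuming it is wired through admin-post.php; the 'csvlines' field name matches the JavaScript, while the hook name, taxonomy, and CSV layout are placeholders:
<?php
// Rough sketch of the chunk handler; 'csvlines' matches the field set by
// postCSVdata(), 'my_taxonomy' and the hook name are placeholders.
function my_handle_csv_chunk() {
    if ( empty( $_POST['csvlines'] ) ) {
        wp_send_json_error( 'no data' );
    }
    $lines = array_filter( explode( "\n", wp_unslash( $_POST['csvlines'] ) ) );
    foreach ( $lines as $line ) {
        $cols   = str_getcsv( $line );                 // assumes the term name is in the first column
        $result = wp_insert_term( $cols[0], 'my_taxonomy' );
        if ( is_wp_error( $result ) ) {
            error_log( $result->get_error_message() ); // e.g. term already exists; continue with next line
        }
    }
    wp_send_json_success( array( 'processed' => count( $lines ) ) );
}
add_action( 'admin_post_my_csv_chunk', 'my_handle_csv_chunk' );
Since each request only ever inserts one chunk, the memory used by wp_insert_term()'s caching is released when that request ends.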
I have a PHP script which processes an XML file uploaded by the user. When the file is big (around 50 MB) it can take several minutes. On small files it works as expected, but I ran into a problem with large ones.
So, my script looks like:
if(file_exists($filename)) {
    return array(false, "File with this name already exists");
} else {
    /*
    Processing of file
    */
    return array(true, "");
}
When the user uploads a file with the JavaScript uploader, I make an AJAX request to this script. The thing is that for a large file it executes twice: the first time it goes into the "processing" section, and the second time it returns the error that a file with this name already exists (which is true, actually).
I added a logging call at the beginning of the script:
$logger = Zend_Registry::get('logger');
$logger->log('assign file function called', 7);
and I can see that the script has been called twice.
But in Firebug I see only one AJAX request to the script. In the Apache access.log I see only one request. The Apache error.log is empty.
Any idea what this could be? Perhaps some configuration for long-running scripts?
UPD: JavaScript calling the script:
$("#save_file_btn").click(function(){
var filename = $("input#selected_file").val();
if(!filename.length) {
customAlert("Choose file at first");
return false;
}
$.ajax({
url: '/otms/publisher/assign-file-to-publisher',
beforeSend: function(xhr){
$("#add_form").html('<div class="loader"></div>');
},
dataType: 'json',
data: {filename: filename},
success: function(data) {
if(data.success) {
//Process added ONIX file
processOnixFile(data.fileId, function(){
getFileList();
$("#add_form").html('<div class="text-center">File has been successfully uploaded</div>');
setTimeout('$("#add_file_modal").modal("hide")', 2500);
});
} else {
var output = '';
$.each(data.error, function(index,value){
output += '<div class="bold">'+value.title+'</div>';
output += '<ul>';
$.each(value.errorList, function(i, errorMsg){
output += '<li>'+errorMsg+'</li>';
});
output += '</ul>';
});
$("#add_form").html('<div class="red">'+output+'</div>');
}
}
});
});
I checked, and this is the only place where any request is made to the script 'assign-file-to-publisher' (where I have the problem).
UPD 2.
What I found out is that the script is called twice only in Firefox. In Safari the request doesn't return anything. In Chrome it returns the following status:
(failed) net::ERR_EMPTY_RESPONSE
I still do not see any error messages in my application logs or the Apache logs.
As I mentioned, the script works with a big file and it can take up to several minutes to process it.
In my configuration:
max_execution_time = 300
while Chrome returns the error at around 1:40.
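ERR_EMPTY_RESPONSE means the connection was closed before PHP sent anything back, so a timeout in front of PHP (Apache's Timeout directive, a proxy, or FastCGI limits) may be cutting the request off well before the 300-second max_execution_time; some browsers then silently retry the POST, which could explain a second execution even though only one request shows up in Firebug. Independent of tuning those timeouts, the action can be made safe against a retry by locking on the file name. A rough sketch in the style of the snippet above; the .lock file path is an assumption:
<?php
// Rough sketch: guard the processing with an exclusive lock so a retried
// request for the same file cannot process it twice.
$lock = fopen($filename . '.lock', 'c');

if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
    return array(false, "This file is already being processed");
}

if (file_exists($filename)) {
    flock($lock, LOCK_UN);
    return array(false, "File with this name already exists");
}

/*
Processing of file
*/

flock($lock, LOCK_UN);
fclose($lock);
return array(true, "");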
I have a PHP function
function ExportExcel()
{
// code
}
and a link on the page, "Download in Excel":
<a>Download in Excel</a>
So what I want is: when a user clicks on that link, the PHP function is called and the data is downloaded as an Excel file.
I may need to use Ajax for that. How do I go about doing it?
You could possibly just use a GET parameter, so it would look something like this...
HTML
<a href="?init=1">Download in Excel</a>
PHP
function ExportExcel()
{
// code
}
if(isset($_GET['init']))
{
ExportExcel();
}
Here is the function I implemented recently:
$('#toexcel').live("click", function() {
    $.ajax({
        url: "toExcel.php",
        data: "sql=" + encodeURIComponent(sql),
        beforeSend: function(){
            $("#wait").show();
        },
        complete: function(){
            $("#wait").hide();
        },
        success: function(response){
            window.location.href = response.url;
        }
    });
});
where the sql variable actually stores the SQL query for the server,
and toExcel.php then takes the passed SQL, submits it to the server and outputs the result using a PHPExcel() object.
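Since the success handler redirects to response.url, toExcel.php presumably writes the spreadsheet to disk and returns its location. A rough sketch of that server side with PHPExcel; the query execution is replaced by placeholder rows, and the file names and paths are assumptions:
<?php
// toExcel.php - rough sketch; assumes the PHPExcel library is available.
// Running the client-supplied SQL is omitted; $rows stands in for the result.
require_once 'PHPExcel.php';

$rows = array(
    array('id', 'name'),
    array(1, 'first'),
    array(2, 'second'),
);

$excel = new PHPExcel();
$sheet = $excel->getActiveSheet();
foreach ($rows as $r => $row) {
    foreach ($row as $c => $value) {
        $sheet->setCellValueByColumnAndRow($c, $r + 1, $value); // columns are 0-based, rows 1-based
    }
}

// write the file to disk and hand its URL back to the ajax success callback
$file = 'exports/export_' . time() . '.xls';
$writer = PHPExcel_IOFactory::createWriter($excel, 'Excel5');
$writer->save(__DIR__ . '/' . $file);

header('Content-Type: application/json');
echo json_encode(array('url' => $file));
Note that for response.url to be usable as an object property, the $.ajax() call would also need dataType: 'json'.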
EDIT
I think I understood what you're trying to achieve: your ExportExcel() function already outputs the results you need, right? If so, then you can do it as follows:
$('#toexcel').click(function() {
    $.ajax({
        url: "toExcel.php", // should contain and _call_ your ExportExcel() function
        beforeSend: function(){
            $("#wait").show(); // this is the loading img to show
        },
        complete: function(){
            $("#wait").hide(); // this is the loading img to hide once complete
        },
        success: function(response){
            window.location.href = response.url;
        }
    });
});
First, let me make sure you know that PHP is only parsed when the page is first served. If you click a link on the page, it has no idea the PHP function on the same page exists, because the function only existed server-side while the code was being parsed. That being said, you can easily make a separate page called download.php and call your function on that page. Then your link can just point to that page.
If you want your custom download page to be returned to the user as an Excel file, you can use custom PHP headers to convince the browser that it is downloading an Excel file (you'd have to specify the MIME type for Excel files).
edit:
This would cause a download of an Excel file, created by your function and triggered by your link click, to start. You don't need any JS or jQuery for this.
edit2:
Here's example code for the download file to get you started:
<?php
header("Content-type: application/excel");
print($data); /* print out the contents of the excel file here */
exit();
?>
If you do it like this, your PHP page will not redirect away from your original page, but will bring up a download box in the browser instead. If you're using CSV files instead of XLS files, you'll need to change the MIME type.
You can handle the request in your JS script file:
$("a").click(function(){
jQuery.ajax({
url: "path/to/controller",
type: "POST",
dataType: 'json',
data: {'mentod':'ExportExcel'},
success: successCallback,
error:failureCallback
});
});
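On the PHP side, the controller would then dispatch on that parameter. A minimal sketch; the parameter name matches the AJAX data above, everything else is a placeholder:
<?php
// path/to/controller - minimal dispatch sketch for the AJAX call above.
if (isset($_POST['method']) && $_POST['method'] === 'ExportExcel') {
    ExportExcel(); // should send the Excel headers and output the data
}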
Just provide the link to that Excel file in the href of the anchor; the browser will download it automatically.
If your file comes from the DB, then provide a link to excel.php, and in excel.php do the processing of fetching the data and creating the Excel file.
Read an article on how to do that and follow the same approach.
I am trying to make use of the Ajax file uploader:
http://valums.com/ajax-upload/
The doc says:
var uploader = new qq.FileUploader({
// pass the dom node (ex. $(selector)[0] for jQuery users)
element: document.getElementById('file-uploader'),
// path to server-side upload script
action: '/server/upload'
// WHAT IS action:?
});
The element property specifies which element (by ID) is used as the upload button.
What is action? It must be some sort of handler for the uploaded files?
How can I handle the uploaded files, and where are they located?
The doc says
// events
// you can return false to abort submit
onSubmit: function(id, fileName){},
onProgress: function(id, fileName, loaded, total){},
onComplete: function(id, fileName, responseJSON){},
onCancel: function(id, fileName){},
When the upload completes, I want to display a list of the files somewhere, say in a div with ID=list.
A short snippet would be highly appreciated and rewarded.
I've used File Uploader quite a lot and I think it is the best file uploader out there.
The action is the method (URL) which receives the call from your Ajax client script.
You have to define a DIV in your HTML:
<div id="uploaderFile"></div>
I've used a javascript function to build my uploader around the DIV:
function CreateImageUploader() {
    var uploader = new qq.FileUploader({
        element: $('#uploaderFile')[0],
        template: '<div class="qq-uploader">' +
                  '<div class="qq-upload-drop-area"><span>Drop files here to upload</span></div>' +
                  '<div class="qq-upload-button ui-button ui-widget ui-corner-all ui-button-text-only ui-state-default">Seleziona il Listino Excel</div>' +
                  '<ul class="qq-upload-list"></ul>' +
                  '</div>',
        hoverClass: 'ui-state-hover',
        focusClass: 'ui-state-focus',
        action: 'Home/UploadImage',
        allowedExtensions: ['jpg', 'gif'],
        params: { },
        onSubmit: function(file, ext) {
        },
        onComplete: function(id, fileName, responseJSON) {
            $("#PopupImageUploader").dialog('close');
        }
    });
}
What happens here is that you're creating an uploader around this element: $('#uploaderFile')[0]. I've used the standard template, but you can change the appearance.
When you've done that, everything is pretty much set up on the client side.
On the server side (it depends what you're using) you have to intercept and read the file and persist it.
I use ASP.NET MVC. You can find my action here and here
Your server-side code will have to persist the file where you want and return info to the client script.
Personally, I return JSON data which I handle in the onComplete event to close a dialog (like in the example), show a message, etc.
If you want to pass parameters from the server side back to the client, you can return a JSON object. I would do something like this:
return ("{success:true, newfilename: '" + newFileName + "'}");
I reckon that in PHP it could be something like this:
echo json_encode(array('success' => true, 'newfilename' => $newFileName));
Forgive me if there are mistakes in that but I've never written a single PHP line in my whole life ;-)
The client side can now check the JSON object like this:
onComplete: function(id, fileName, responseJSON) {
alert(responseJSON.newfilename);
}
As you can see I pass back the result of the action: success:true or success:false so I can show a warning to the user:
onComplete: function(id, fileName, responseJSON) {
if (responseJSON.success)
{
alert(responseJSON.newfilename);
}
else {
alert("something wrong happened");
}
}
How can I handle the uploaded files, and where are they located?
This depends on your web server and backend language. In PHP, have a look at the $_FILES array.
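For a normal multipart upload that boils down to something like the following; the 'qqfile' field name is what the plugin's bundled PHP example uses for its iframe fallback, so treat it as an assumption for your version:
<?php
// Minimal sketch for the multipart (iframe fallback) case; the 'qqfile'
// field name is taken from the plugin's bundled example.
$target = __DIR__ . '/uploads/' . basename($_FILES['qqfile']['name']);
$ok = move_uploaded_file($_FILES['qqfile']['tmp_name'], $target);
echo json_encode(array('success' => $ok));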
What is action? It must be some sort of handler for the uploaded files?
The URL that the form used to upload the file is submitted to.
What is action? It must be some sort of handler for the uploaded files?
Yes, the same as the HTML <form> element's action attribute.
How can I handle the uploaded files?
With a server-side script written in your server-side language of preference.
And where are they located?
Probably as a stream on STDIN. The forms library you use with the aforementioned server-side language will have methods to extract it automatically.
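In PHP that raw-stream case looks roughly like the following; again, the 'qqfile' query parameter follows the plugin's bundled example and may differ in your version:
<?php
// Minimal sketch for the XHR case: the file arrives as the raw request body,
// with the original name passed as a query parameter.
$name   = basename($_GET['qqfile']);
$input  = fopen('php://input', 'rb');
$output = fopen(__DIR__ . '/uploads/' . $name, 'wb');
stream_copy_to_stream($input, $output);
fclose($input);
fclose($output);

echo json_encode(array('success' => true, 'newfilename' => $name));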