AJAX file upload with progress + FFMPEG progress - php

The task is to implement the following:
1. The page has a form for uploading a file; the file should be uploaded asynchronously via AJAX, without reloading the page, while displaying the upload progress
2. If the file is uploaded successfully, send a command to the server, which should check the queue; if the queue is empty, inform the client that processing of the file has begun, then start the processing itself
3. If processing has begun, show the progress of that processing
What I have managed to implement
The file upload works: after a successful upload, the server returns a JSON string with the status success and the full path of the uploaded file
Upload JS snippet
$.ajax({
    url: 'api/upload/load',
    type: 'post',
    data: fd,
    contentType: false,
    processData: false,
    xhr: function() {
        var xhr = new window.XMLHttpRequest();
        xhr.upload.addEventListener("progress", function(evt) {
            if (evt.lengthComputable) {
                var percent = Math.ceil((evt.loaded / evt.total) * 100);
                console.log(percent); // Upload progress
            }
        }, false);
        return xhr;
    },
    success: function(data) {
        var response = JSON.parse(data);
        if (response.status == 'success') {
            // If the upload succeeded, start the conversion
            startConversion(response.path);
        }
    }
});
Response from api/upload/load
{"status":"success","path":"<path>"}
In the AJAX success handler I check: if response.status == 'success', I call the startConversion function, which takes the full path of the newly uploaded file as an argument. This function sends another AJAX request to the server, which checks the queue.
startConversion JS Snippet
function startConversion(path) {
    $.ajax({
        url: '/api/upload/conversion',
        type: 'POST',
        data: { path: path },
        success: function(data) {
            var response = JSON.parse(data);
            // If response.status == 'start', run the checkConversion function
        }
    });
}
Code /api/upload/conversion
$queue = $this->model->getQueue();
if ($queue < 5) {
    // If the queue has room, notify the browser
    $output = array('progress' => 'start');
    echo json_encode($output);
    // Then start shell_exec
    $cmd = 'some_command';
    shell_exec('ffmpeg.exe -i ' . $path . ' ' . $cmd . ' 1>log.txt 2>&1 ');
} else {
    $output = array('progress' => 'waiting');
    echo json_encode($output);
}
Here, for some reason, the message saying that processing has begun is never delivered to the client (the echo before $cmd).
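One likely reason the echo never reaches the browser: jQuery's success handler only fires once the HTTP response is complete, and PHP does not normally complete the response until the script ends, so the echoed JSON sits in a buffer while shell_exec() blocks. A hedged sketch of one workaround, which finishes the response before the blocking call (it assumes headers have not been sent yet and that no output compression is active):

```php
<?php
$output = json_encode(array('progress' => 'start'));

ignore_user_abort(true);                       // keep running after the client has its answer
header('Content-Type: application/json');
header('Content-Length: ' . strlen($output)); // tells the browser the response is complete
header('Connection: close');
echo $output;
while (ob_get_level()) { ob_end_flush(); }     // drain PHP's output buffers
flush();
// On PHP-FPM, fastcgi_finish_request() would end the response more reliably here.

// Only now start the blocking conversion; the client already has its JSON
shell_exec('ffmpeg.exe -i ' . escapeshellarg($path) . ' ' . $cmd . ' 1>log.txt 2>&1');
```

Whether Connection: close is honored depends on the web server, so treat this as a sketch to adapt, not a guaranteed fix.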
And here the most interesting part begins. It would seem possible to check the processing status (start, queue, or an error) in the success handler and, if processing has begun, call a third function that polls the server via AJAX and returns the processing progress.
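For that third, polling function, the progress number has to come from somewhere. A common trick is to have the status endpoint return the tail of ffmpeg's log.txt and derive a percentage from the last time=HH:MM:SS entry it contains. A sketch of that client-side helper (the function name, and the idea that the server supplies the clip's total duration, e.g. via ffprobe, are my assumptions, not part of the code above):

```javascript
// Derive a percentage from ffmpeg's log output. logTail is the last chunk of
// log.txt as returned by a (hypothetical) status endpoint; totalSeconds is the
// clip's duration, which the server would need to determine up front.
function parseFfmpegProgress(logTail, totalSeconds) {
    var matches = logTail.match(/time=(\d+):(\d+):(\d+(?:\.\d+)?)/g);
    if (!matches || totalSeconds <= 0) return 0;
    var last = matches[matches.length - 1]
        .match(/time=(\d+):(\d+):(\d+(?:\.\d+)?)/);
    var seconds = Number(last[1]) * 3600 + Number(last[2]) * 60 + Number(last[3]);
    return Math.min(100, Math.round((seconds / totalSeconds) * 100));
}

console.log(parseFfmpegProgress("frame=750 fps=25 time=00:00:30.00 bitrate=...", 120)); // 25
```

The checkConversion function would then poll the server on an interval and feed each response through this helper.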
But in reality, the following happens: as soon as the second function sends its request, nothing comes back in response, even though processing is running, and the console shows the request status as PENDING. Only when the server finishes processing does it respond that the queue is empty and progress tracking can begin, although the processing has already been completed.
My suspicion is that further execution of the script is blocked by the shell_exec() call until the command finishes.
It is also unclear to me why the server does not send the response when the echo clearly comes before the command, and why the browser waits for shell_exec() to finish completely, since there is no code after it. By the logic of the code, the server should return the JSON string, the browser should see the request complete with status 200, and the server should meanwhile start the conversion, but for some reason this does not happen...
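The suspicion matches PHP's documented behavior: shell_exec() waits for the launched command to finish. One way to keep it from blocking at all is to launch ffmpeg detached, so the call returns immediately and the polling endpoint can read log.txt for progress. A hedged sketch ("start /B" is the Windows form, since ffmpeg.exe is used above; the escapeshellarg() call is a precaution I've added, not in the original code):

```php
<?php
// Detach ffmpeg so the launching call returns at once; progress can later
// be read from log.txt by the polling endpoint.
$command = 'ffmpeg.exe -i ' . escapeshellarg($path) . ' ' . $cmd . ' 1>log.txt 2>&1';
if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
    // "start /B" runs the command in the background on Windows
    pclose(popen('start /B ' . $command, 'r'));
} else {
    // On Linux, "&" with redirected output detaches the process
    shell_exec($command . ' &');
}
```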

Related

ajax/php on click submit form redirect but execute send.php

(Sorry for my English, it's not my native language.)
A simple question:
Can I execute a PHP file to handle form data (including files) but redirect the user without waiting for a response from my send.php file?
There is a lot of file uploading in my PHP process, which means a long wait to complete. I don't want the user to wait that long...
At the moment I'm using AJAX to prevent the user from being redirected to send.php, which lets the PHP execute, but the user still waits until the success function is invoked after the file upload and the PHP have finished, which again means the user waits forever... and redirecting before the success function prevents the PHP from finishing.
What I want is that the second the user clicks submit, send.php is launched in the background and the user is redirected to another page, without even waiting for a response from the send.php file.
In my PHP I check for errors and handle them, so the user doesn't even need to know about any errors.
(By the way, everything is working, I'm just not happy with the process.)
my ajax:
$(document).on('submit', '#myForm', function(e) {
    e.preventDefault();
    $.ajax({
        url: $('#myForm').prop('action'),
        type: $(this).attr('method'),
        data: new FormData($('#myForm')[0]),
        contentType: false,
        processData: false,
        success: function(data) {
            alert('Form submitted');
            window.location = "http://www.somesite.com/";
        },
        error: function() {
            // The request failed - do something here
            alert('Form NOT submitted');
            console.log('form not submitted ERROR');
        }
    });
});
You must wait until the file is uploaded.
After the upload is finished you could start a background process (thread), for example if you have to manipulate the upload.
But thread handling is not possible with PHP.
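PHP has no built-in threads, but a close approximation is to write the job to disk and launch a separate PHP process for it, so the HTTP request can return right away. A rough sketch (send_worker.php and the job-file hand-off are hypothetical names I'm inventing for illustration, not an existing API; note that, as the answer says, the upload itself must still finish within the original request — only the post-processing can be deferred):

```php
<?php
// In send.php, after the upload itself has completed:
$jobFile = tempnam(sys_get_temp_dir(), 'job');   // hypothetical job hand-off file
file_put_contents($jobFile, serialize($_POST));   // uploaded files would need move_uploaded_file() first

// Launch a worker process and return immediately
if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
    pclose(popen('start /B php send_worker.php ' . escapeshellarg($jobFile), 'r'));
} else {
    exec('php send_worker.php ' . escapeshellarg($jobFile) . ' > /dev/null 2>&1 &');
}
echo json_encode(array('status' => 'accepted'));  // the AJAX success fires now, not minutes later
```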

PHP: script executing twice

I have a PHP script which processes an XML file uploaded by the user. When the file is big (around 50 MB) it can take several minutes. On small files it works as expected. But I faced a problem with large ones.
So, my file looks like:
if (file_exists($filename)) {
    return array(false, "File with this name already exists");
} else {
    /*
    Processing of file
    */
    return array(true, "");
}
When the user uploads a file with the JavaScript uploader, I make an AJAX request to this script. The thing is that for large files it executes twice. The first time it goes to the "processing" section; the second time it returns an error that a file with this name already exists (which is true, actually).
I added a logging function at the beginning of the script
$logger = Zend_Registry::get('logger');
$logger->log('assign file function called', 7);
and I see that the script has been called twice.
But in Firebug I see only one AJAX request to the script. In the Apache access.log I see only one request. The Apache error.log is empty.
Any idea what it could be? Perhaps some configuration for long-running scripts?
UPD: JavaScript for calling the script
$("#save_file_btn").click(function() {
    var filename = $("input#selected_file").val();
    if (!filename.length) {
        customAlert("Choose file at first");
        return false;
    }
    $.ajax({
        url: '/otms/publisher/assign-file-to-publisher',
        beforeSend: function(xhr) {
            $("#add_form").html('<div class="loader"></div>');
        },
        dataType: 'json',
        data: {filename: filename},
        success: function(data) {
            if (data.success) {
                // Process added ONIX file
                processOnixFile(data.fileId, function() {
                    getFileList();
                    $("#add_form").html('<div class="text-center">File has been successfully uploaded</div>');
                    setTimeout('$("#add_file_modal").modal("hide")', 2500);
                });
            } else {
                var output = '';
                $.each(data.error, function(index, value) {
                    output += '<div class="bold">' + value.title + '</div>';
                    output += '<ul>';
                    $.each(value.errorList, function(i, errorMsg) {
                        output += '<li>' + errorMsg + '</li>';
                    });
                    output += '</ul>';
                });
                $("#add_form").html('<div class="red">' + output + '</div>');
            }
        }
    });
});
I checked, and this is the only place where any request is made to the script 'assign-file-to-publisher' (where I have the problem).
UPD 2.
What I found out is that the script is called twice only in Firefox. In Safari the request doesn't return anything. In Chrome it returns the following status:
(failed) net::ERR_EMPTY_RESPONSE
I still do not see any error messages in my application logs or the Apache logs.
As I mentioned, the script works with a big file and it can take up to several minutes to process it.
In my configuration:
max_execution_time = 300
while Chrome returns the error at around the 1:40 mark.
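The ~1:40 mark is suggestive: many servers and proxies drop a connection that has been silent for on the order of 100 seconds, which would explain Chrome's ERR_EMPTY_RESPONSE, and a retry of the dropped request could explain the double execution in Firefox. This is a hypothesis, not a diagnosis, but one hedged workaround is to emit a heartbeat byte during the long processing so the connection never goes idle. Leading whitespace is ignored by JSON.parse, so jQuery's dataType: 'json' still works. A sketch, with $chunks and processChunk() as hypothetical stand-ins for slicing up the real XML work:

```php
<?php
ignore_user_abort(true);                    // a dropped client should not kill the job
set_time_limit(300);
while (ob_get_level()) { ob_end_flush(); }  // disable output buffering

foreach ($chunks as $chunk) {               // $chunks: hypothetical slices of the XML file
    processChunk($chunk);                   // hypothetical worker for one slice
    echo ' ';                               // one byte of padding keeps the connection alive
    flush();
}
echo json_encode(array('success' => true));
```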

Ajax Call Hangs Browser - I don't care about the response

How do I abort an Ajax call if I don't really care for the response as I don't want to hang the browser?
The situation is that I have an AJAX call that, in some cases, can trigger the server to send over 1,000 emails. 99% of the time it is only a few, or tens of, emails.
So with the 1,000-email AJAX call, the browser sometimes waits 5 minutes before it gets the success message, so the user has to wait.
I have tried setting a timeout, but this still hangs. I'd like to wait about 20 seconds and then stop waiting for the response.
var request = jQuery.ajax({
    type: "post",
    url: "admin-ajax.php",
    data: {
        action: 'send_email',
        emailHTMLMessage: tinyMCE.activeEditor.getContent(),
        _ajax_nonce: '<?php echo $nonce; ?>'
    },
    timeout: 20000, // Set your timeout value
    success: function(html) { // If data is retrieved, store it in html
        window.scrollTo(0, 0);
        sendResults(html);
    },
    error: function(jqXHR, textStatus, errorThrown) {
        if (textStatus === "timeout") {
            // <... I'd redirect to another html page here....>
        } else {
            alert("Another error was returned"); // Handle other error types
        }
    }
}); // close jQuery.ajax
I have tried request.abort(), but this kills it immediately and the server never gets the send_email message.
How can I quietly ignore the response after 20 seconds while the server carries on doing its thing?
In this post there are a few ways to keep the script running after the HTTP request ends:
Can a PHP script start another PHP script and exit?
So you can leave your email script running on the server.
If you want to know the status, you could have your email script record in a MySQL table how many emails have been sent, and check the count from that table with an AJAX request.
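The MySQL-table idea could look roughly like this (the table, column, and variable names are made up for the sketch, and PDO is one of several ways to talk to MySQL from PHP):

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// In the mail loop: bump a counter after each message
$update = $pdo->prepare('UPDATE email_progress SET sent = sent + 1 WHERE job_id = ?');
foreach ($recipients as $to) {
    mail($to, $subject, $message);
    $update->execute(array($jobId));
}

// In the status script polled by AJAX:
$stmt = $pdo->prepare('SELECT sent, total FROM email_progress WHERE job_id = ?');
$stmt->execute(array($jobId));
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));
```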
If you're sending 1,000 emails, that call is going to carry some overhead no matter what you do. You also have to wait until all the information has been sent to the server before allowing the user to leave the page.
My suggestion would be to change the server code to respond as soon as it receives the request. That way the client isn't waiting for the server to finish sending the entire batch, just for its own request to be received.
Just post without using the success, timeout or error callbacks.
You might want to try creating a new get request for each email:
for (i = 0; i < 1000; i++) {
    var request = $.ajax({
        type: "post",
        url: "admin-ajax.php",
        data: {'i': 'nada'}
    });
}
In this case, I used a for loop, but you could also use an array of emails:
var emails = new Array(1000);
$.each(emails, function() {
    var request = $.ajax({
        type: "post",
        url: "admin-ajax.php",
        data: {'i': 'nada'}
    });
});

JQuery Ajax using post - is it blocking?

Is there a way to make jQuery's post method wait for the server-side code to complete?
In my example below, post does not wait for my PHP script to finish. Although PHP calls sleep(10), post returns right away, resulting in the JavaScript clearing out the value in #temsg and changing the text of #sendmsg too early.
$('#sendmsg').click(function() {
    $("#sendmsg").html("sending....");
    var msg = $("#temsg").val();
    var to_id = 123;
    $.post("http://localhost:8888/ci/index.php/members/addMessage",
        {message: msg, tomember: to_id},
        function(data) {
            $("#temsg").val('');
            $("#sendmsg").html("Leave Message");
        }, 'json');
    $("#infomsg").show();
    $("#infomsg").html("Message Sent!");
    setTimeout(function() { $("#infomsg").hide('slow'); }, 3000);
});
Ajax is (supposed to be) asynchronous. That means the $.post() method is non-blocking and returns immediately, the rest of your function continues executing, and eventually, when a response comes back, the success handler is called.
You can make the JS code pause until the Ajax request returns by doing a synchronous (blocking) request, but given that (most) browsers run JavaScript on the same thread as the UI that means the browser will not respond to anything until the response comes back which is horrible for the user - essentially the browser would be locked up for the ten seconds that your server-side code is sleeping.
The solution is to stick with the default asynchronous request but move the code from after your $.post() call into the success handler:
$('#sendmsg').click(function() {
    $("#sendmsg").html("sending....");
    var msg = $("#temsg").val();
    var to_id = 123;
    $.post("http://localhost:8888/ci/index.php/members/addMessage",
        {message: msg, tomember: to_id},
        function(data) {
            $("#temsg").val('');
            $("#sendmsg").html("Leave Message");
            $("#infomsg").show();
            $("#infomsg").html("Message Sent!");
            setTimeout(function() { $("#infomsg").hide('slow'); }, 3000);
        }, 'json');
});
Ajax is asynchronous. The fact that the code keeps running doesn't mean the sleep didn't occur.
That thread on the server "sleeps" while the JavaScript continues executing the next lines.
Example of how to use async: false:
$.ajax({
    url: "http://localhost:8888/ci/index.php/members/addMessage",
    async: false,
    data: {message: msg, tomember: to_id},
    dataType: 'json',
    success: function(data) {
        $("#temsg").val('');
        $("#sendmsg").html("Leave Message");
    }
});
ajax docs

inform user during php calculation w/jquery

I'm writing code in PHP that analyzes user input.
I'm hoping to analyze it through an AJAX request using jQuery.
I'd like to provide real-time feedback to the user while performing the calculations.
For example:
"Uploading your input", "Analyzing", "Preparing final result" and so forth.
How can I go about doing this?
You will have to have a different back-end script do the processing than the one you send your request to. Your original AJAX request can store the user input to be analyzed, and another process can regularly check for new data to work on, starting when it finds some. That background process can then record its progress, e.g. in a file or a database.
The subsequent AJAX requests will check that progress file or database entry and display the progress to the user.
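A minimal sketch of the file-based variant (the file path, the reportStage() helper, and the stage labels reuse the asker's examples but are otherwise my inventions):

```php
<?php
$progressFile = sys_get_temp_dir() . '/progress_' . $jobId . '.json';

// Background worker: record the current stage between steps
function reportStage($file, $stage) {
    file_put_contents($file, json_encode(array('stage' => $stage)), LOCK_EX);
}
reportStage($progressFile, 'Analyzing');
// ... run the analysis ...
reportStage($progressFile, 'Preparing final result');

// Status endpoint polled by the browser:
if (file_exists($progressFile)) {
    echo file_get_contents($progressFile);
} else {
    echo json_encode(array('stage' => 'Uploading your input'));
}
```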
Another (more complicated) solution would be to use Comet to push information on the status from the server to the browser. There is a comet plugin for JQuery in the works, as described in StackOverflow question 136012.
Assuming you have a service located at /service-status.php that checks the status of the job and returns a string, you could do something like this in an interval.
var intervalId;
intervalId = setInterval(function() {
    $.ajax({
        type: "POST",
        url: "/service-status.php",
        data: "jobid=" + id,
        success: function(msg) {
            if (msg === 'Finished') {
                clearInterval(intervalId);
            }
            alert("Status: " + msg);
        },
        error: function(XMLHttpRequest, textStatus, errorThrown) {
            alert("He's dead Jim");
            clearInterval(intervalId);
        }
    });
}, 500);
This would poll your service every 500 ms. It also assumes that you return 'Finished' when done; adjust accordingly. I might put a counter in there too, to clear the interval just in case, so you don't DDoS your own server.
