AJAX: Detect connection closed, PHP continues - php

Situation: User uploads a number of photos via AJAX, then continues interacting with the website whilst a PHP script continues running in the background and generates a variety of thumbnails based on the uploaded photos.
Site config:
jQuery AJAX (v1.9.1)
PHP 5.4.7, FastCGI mode
IIS 7.5, with gzip
Previous posts that I've referred to and tried to implement (but to no avail):
Closing a connection early with PHP
PHP: continue after output is complete
Send AJAX results but continue processing in PHP
close a connection early
Disable Gzip compression for single php file with IIS
http://www.php.net/manual/en/features.connection-handling.php#71172
I have tried a huge number of script variations based on the previous posts, however none seem to tell the AJAX call to let the user continue while the PHP continues to process...
Example PHP code:
<?php
// Save images to db, etc
// Now tell AJAX to let the user continue, before generating thumbnails
if(ini_get('zlib.output_compression')) {
ini_set('zlib.output_compression', 'Off'); // turn off IIS/zlib gzip for this response
}
while(ob_get_level()) ob_end_clean(); // discard any buffers that are already open
header("Connection: close");
header("Content-Encoding: none"); //ensures gzip is not sent through
ob_start();
echo '123'; // 3 digit number will be sent back to AJAX
$size = ob_get_length(); // should mean Content-Length = 3
header("Content-Length: $size");
ob_end_flush();
flush();
// (a second ob_end_clean() here would fail - no buffer remains after ob_end_flush())
// Generate thumbnails, etc
?>
Example jQuery AJAX code:
$.ajax({
type: 'POST',
url: ajax_url,
data: { foo: bar },
beforeSend:function(){
//
},
success:function(data){
alert(data); // Only seems to be firing once the thumbnails have been generated.
}
});
The response headers seem okay...
Question: How do I get AJAX to allow the user to continue once it has received the code from the middle of the PHP script, whilst the PHP script continues to generate the thumbnails?

If you run a request, it will always wait until the PHP script finishes executing, or there will be a timeout. So you cannot stop AJAX in the middle and keep PHP running. If you want to upload files and then create thumbnails, but be notified as soon as the files are uploaded, do it in two steps:
upload files with AJAX -> return success
run another AJAX request on success to get uploaded images (or thumbs in fact).
Thanks to that, thumbs can also be rendered lazily, when they are requested for the first time (even without AJAX). If you don't want to request and wait for thumbs, use a cron job on the server to create thumbs for the awaiting images.
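For the lazy-rendering variant, the resize math is the only non-obvious part. A minimal sketch, with `thumb_dimensions` as an illustrative helper (not from the original posts); the real endpoint would feed these numbers into GD's imagecopyresampled and cache the result:

```php
<?php
// Compute thumbnail dimensions that preserve the aspect ratio.
// $maxSide caps the longer edge; images already small enough pass through.
function thumb_dimensions($width, $height, $maxSide)
{
    $scale = $maxSide / max($width, $height);
    if ($scale >= 1) {
        return [$width, $height]; // no upscaling
    }
    return [(int) round($width * $scale), (int) round($height * $scale)];
}

var_dump(thumb_dimensions(1600, 1200, 200)); // 200 x 150
```

A thumbs.php endpoint would check whether the cached thumbnail exists, generate it with these dimensions on the first request, and serve the cached file on every request after that.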


Would using header("Location: target"); avoid the execution of the rest of the code in the document

By using this:
$size = ob_get_length();
header("Content-Length: $size");
header('Connection: close');
ob_end_flush();
ob_flush();
flush();
Along with ignore_user_abort(true);, I get to receive a complete status in the AJAX call and let the server keep handling the file, without the user having to wait for a response while the contents of the file are parsed.
I'd like to achieve the same but with header("Location: target"); instead of header('Connection: close'); - so it looks as if everything is finished, but the file continues to parse after triggering header("Location: target");
What would be the right approach to letting the file continue working but redirecting the user with PHP?
By the way, before five downvotes in a row: this question is not a duplicate of PHP Redirect User and Continue Process Script, although the title seems to resemble that question.
what would be the right approach into letting the file to continue
working but redirect the user with PHP?
You're describing parallel processing with an immediate exit, which you'll have to fudge in PHP:
open the parent PHP page
spawn a new thread to another php file's functionality (in PHP, threading doesn't exist, so you'd have to use curl, or some other means of executing it)
redirect
exit()
Think of step two in the same way you would an AJAX request whose response you don't care about; it would run in parallel to your parent page.
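Step 2 is usually done by writing the request to a raw socket and closing it without reading the response, so the parent script is not blocked. A sketch under those assumptions (`build_async_request`, `fire_and_forget`, and `/worker.php` are illustrative names, not from the original answer):

```php
<?php
// Build a minimal HTTP request for the background worker.
function build_async_request($host, $path)
{
    return "GET $path HTTP/1.1\r\n"
         . "Host: $host\r\n"
         . "Connection: Close\r\n\r\n";
}

// Open a socket, write the request, and close immediately -
// the worker script keeps running on the server.
function fire_and_forget($host, $path, $port = 80)
{
    $fp = @fsockopen($host, $port, $errno, $errstr, 1);
    if ($fp) {
        fwrite($fp, build_async_request($host, $path));
        fclose($fp); // deliberately don't read the response
    }
}

// Usage in the parent page (steps 2-4):
// fire_and_forget('example.com', '/worker.php');
// header('Location: target');
// exit;
```

The worker itself should call ignore_user_abort(true) so the early fclose() doesn't kill it.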
No, it does not stop the execution, but you can stop it manually with an exit call.

Background php file processing without AJAX

I'm working with a file compiler that puts a bit of a strain on the FIRST homepage hit until the CSS files are compiled, thus fooling the visitor into thinking the site is slow.
I have successfully routed the function through AJAX, which works very well, but I can't leave it at that since it is bound to JS, and I would love to do this with PHP.
The AJAX (MooTools):
new Request({
url: linktofile... and basic responses
does nothing else but access a PHP file which runs:
// sitename.com/?view=custom
file_get_contents('OF_THE_website_but_different_view');
So how would I do this from PHP? I was trying exec(), but can't get it to work (local WAMP).
cURL would be the same as doing file_get_contents() within the homepage and would slow everything down again. I was trying to see how proc_open() works, but I can't figure it out.
So if anyone could point me to any PHP function to use as a fallback for AJAX and call my file_get_contents() file in the backend without slowing the homepage down.
Please do not suggest SHELL, CRON, or any language other than PHP.
Thank you!
Found an awesome reference:
PHP Background Process Still Affecting Page Load
http://php.net/features.connection-handling#71172
header("Connection: close");
ob_start();
phpinfo();// or my page buffer previously saved in variable
$size=ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
file_get_contents('OF_THE_website_but_different_view');
Works like a charm.

Offer a generated file for download from jQuery post

I've got a large form where the user is allowed to input many different fields, and when they're done I need to send the contents of the form to the server, process it, and then spit out a .txt file containing the results of the processing for them to download. Now, I'm all set except for the download part. Setting the headers on the response to the jQuery .post() doesn't seem to work. Is there any other way than doing some sort of iframe trick to make this work (a la JavaScript/jQuery to download file via POST with JSON data)?
Again, I'm sending data to the server, processing it, and then would like to just echo out the result with headers to prompt a download dialog. I don't want to write the result to disk, offer that for download, and then delete the file from the server.
Don't use AJAX. There is no cross-browser way to force the browser to show a save-as dialog in JavaScript for some arbitrary blob of data received from the server via AJAX. If you want the browser to interpret the results of an HTTP POST request (in this case, offering a download dialog), then don't issue the request via AJAX.
If you need to perform some kind of validation via AJAX, you'll have to do a two step process where your validation occurs via AJAX, and then the download is started by redirecting the browser to the URL where the .txt file can be found.
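The second step can be an ordinary PHP endpoint that the browser is redirected to; the attachment disposition is what triggers the save-as dialog. A sketch (`download_headers` and `results.txt` are illustrative names, not from the original answer):

```php
<?php
// Header lines a plain-text download endpoint would send.
function download_headers($filename, $body)
{
    return [
        'Content-Type: text/plain',
        'Content-Disposition: attachment; filename="' . $filename . '"',
        'Content-Length: ' . strlen($body),
    ];
}

$body = "processed output\n"; // would come from the processing step
foreach (download_headers('results.txt', $body) as $h) {
    header($h); // a no-op under the CLI, real headers under a web SAPI
}
echo $body;
```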
Found this thread while struggling with similar issue. Here's the workaround I ended up using:
$.post('genFile.php', {data : data}, function(url) {
$("body").append("<iframe src='download.php?url="+url+"' style='display: none;'></iframe>");
});
genFile.php creates the file in a staging location, using a randomly generated string for the filename.
download.php reads the generated file, sets the MIME type and disposition (allowing the prompt to use a predefined name instead of the random string in the actual filename), returns the file content, and cleans up by deleting the source file.
[edit] might as well share the PHP code...
download.php:
<?php
$fname = "/tmp/".basename($_GET['url']); // basename() blocks path traversal
header('Content-Type: text/xml');
header('Content-Disposition: attachment; filename="plan.xml"');
echo file_get_contents($fname);
unlink ($fname);
?>
genFile.php:
<?php
$length = 12;
$chars = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
$str = substr( str_shuffle( $chars ), 0, $length ).'.xml';
$fh = fopen('/tmp/'.$str, 'w') or die("can't open file"); // same staging dir that download.php reads from
fwrite($fh,$_POST["data"]);
fclose($fh);
echo $str;
?>
Rather than using jQuery's .post(), you should just do a normal POST by submitting the form, and have the server respond with appropriate Content-Encoding and MIME-type headers. You can't trigger a download through post() because jQuery encapsulates the returned data.
One thing I see in use rather frequently, though, is this:
$.post('generateFile.php', function(data) {
// generateFile builds data and stores it in a
// temporary location on the server, and returns
// the URL to the requester.
// For example, http://mysite.com/getFile.php?id=12345
// Open a new window to the returned URL which
// should prompt a download, assuming the server
// is sending the correct headers:
window.open(data);
});

file upload and move to another server

I have an application which has one input=file.
Now I need to upload the file to my server, and then move it to another server. How can I avoid a timeout?
Also, any good suggestions for an AJAX uploader? Thanks.
Flash uploader: undoubtedly SWFUpload or Uploadify (the latter is based on the former).
File Transfer: Use PHP CURL to do an HTTP POST form transfer (http://www.php.net/manual/en/function.curl-setopt.php see the 2nd example).
Before doing the transfer do the following:
set_time_limit(-1); // PHP won't timeout
ignore_user_abort(true); // PHP won't quit if the user aborts
Edit: I don't see a valid reason why you would need a CRON job unless the file in question changes at some time (which is the real definition of sync-ing). On the other hand, if what you want is to just copy the file to a remote server, there's no reason you can't do it with plain PHP.
Also, one thing you should be aware of is file size. If the file size is anything less than 20 MB, you're safe.
Edit 2: By the way, under the right conditions (output buffering off and implicit output on), you can show the user the current remote transfer progress. I've done it; it isn't hard, really. You just need a hidden iframe which sends progress requests to update the parent window.
It works kind of like AJAX, but using an iframe in place of XHR (since XHR returns as a bulk, not in blocks, unlike an iframe).
If interested, I can help you out with this, just ask.
Edit3: Dynamic remote upload example/explanation:
To make things short, I'll assume that your file has already been uploaded to the server by the user, but not the target remote server. I'll also assume the user landed on handle.php after uploading the file.
handle.php would look like:
// This current script is only cosmetic - though you might want to
// handle the user upload here (as I did)
$name = 'userfile'; // name of uploaded file (input box) YOU MUST CHANGE THIS
$new_name = time().'.'.pathinfo($_FILES[$name]['name'],PATHINFO_EXTENSION); // the (temporary) filename
move_uploaded_file($_FILES[$name]['tmp_name'],'uploads/'.$new_name);
$url = 'remote.php?file='.$new_name; ?>
<iframe src="<?php echo $url; ?>" width="1" height="1" frameborder="0" scrolling="no"></iframe>
<div id="progress">0%</div>
<script type="text/javascript">
function progress(percent){
document.getElementById('progress').innerHTML=percent+'%';
}
</script>
Doesn't look difficult so far, no?
The next part is a little more complex. The file remote.php would look like:
set_time_limit(0); // PHP won't timeout
// if you want the user to be able to cancel the upload, simply comment out the following line
ignore_user_abort(true); // PHP won't quit if the user aborts
// to make this system work, we need to tweak output buffering
while(ob_get_level())ob_end_clean(); // remove all buffers
ob_implicit_flush(true); // ensures everything we output is sent to browser directly
function progress($percent){
// since we're in an iframe, we need "parent" to be able to call the js
// function "progress" which we defined in the other file.
echo '<script type="text/javascript">parent.progress('.$percent.');</script>';
}
function curlPostFile($url,$file=null,$onprogress=null){
$ch=curl_init(); // was missing: the handle must be created first
curl_setopt($ch,CURLOPT_URL,$url);
if(substr($url,0,8)=='https://'){
curl_setopt($ch,CURLOPT_HTTPAUTH,CURLAUTH_ANY);
curl_setopt($ch,CURLOPT_SSL_VERIFYPEER,false);
}
if($onprogress){
curl_setopt($ch,CURLOPT_NOPROGRESS,false);
curl_setopt($ch,CURLOPT_PROGRESSFUNCTION,$onprogress);
}
curl_setopt($ch,CURLOPT_HEADER,false);
curl_setopt($ch,CURLOPT_USERAGENT,'Mozilla/5.0 (compatible)'); // K2FUAGENT was an undefined constant
curl_setopt($ch,CURLOPT_RETURNTRANSFER,true);
curl_setopt($ch,CURLOPT_FOLLOWLOCATION,true);
curl_setopt($ch,CURLOPT_MAXREDIRS,50);
$fh=null;
if($file){
$fh=fopen($file,'rb'); // fopen() requires a mode
curl_setopt($ch,CURLOPT_UPLOAD,true); // required for CURLOPT_INFILE to take effect (streams as an HTTP PUT)
curl_setopt($ch,CURLOPT_INFILE,$fh);
curl_setopt($ch,CURLOPT_INFILESIZE,filesize($file));
}
$data=curl_exec($ch);
curl_close($ch);
if($fh)fclose($fh); // only close the handle if a file was opened
return $data;
}
$file = 'uploads/'.basename($_REQUEST['file']);
function onprogress($download_size,$downloaded,$upload_size,$uploaded){
if($upload_size>0)progress($uploaded/$upload_size*100); // guard against division by zero before the size is known
}
curlPostFile('http://someremoteserver.com/handle-uploads.php',$file,'onprogress');
progress(100); // finished!
Use e.g. scp or rsync to transfer the file to the other server. Do that with a cron job every couple of minutes, not from your PHP script - that will prevent any timeouts occurring if the server-to-server transfer takes too long.

PHP Output complete notification

I'm trying to find a way to have PHP indicate to the browser that all page output is complete. After the page is done, we're running some statistics code that usually doesn't take too long, but in case it does I don't want to have the user's browser waiting for more data. This can't be done via JavaScript because it needs to work with mobile phones.
I'm already starting output buffering using
mb_http_output("UTF8");
ob_start("mb_output_handler");
to ensure I don't have issues with my site's multibyte text (Japanese). I was hoping that ob_end_flush() would do the trick, but if I place sleep(10); after the ob_end_flush(), the browser waits an additional 10 seconds. Does anyone have any ideas about this?
UPDATE:
Using jitter's approach below with "ob_gzhandler" added, I got it working with gzip - does anyone see any possible issues here?
//may be also add headers for cache-control and expires (IE)
header("Connection: close"); //tells browser that connection will be closed
ob_start();
ob_start("ob_gzhandler");
//page content
ob_end_flush();
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
UPDATE AGAIN:
Please take another look at the code above. You need to call ob_start(); before ob_start("ob_gzhandler");, and then call ob_end_flush(); prior to calling ob_get_length(), so that you get the correct gzip-compressed size.
Use something along these lines
//may be also add headers for cache-control and expires (IE)
header("Connection: close"); //tells browser that connection will be closed
ob_start();
//generate your output
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
//continue statistic processing
I don't think there is a way to notify the browser that the output is complete, at least not from the script that sends the output. If you use some other script that monitors the output of your first script, perhaps with an iframe, you might be able to do it.
The browser knows when the output is complete when the page is considered loaded. That is what the browser knows.
You could fork a new php process in the background and let that take care of the stats. Something like:
shell_exec('php stats.php &');
The & at the end makes sure that it's run in the background, so even if the stats.php takes 20 seconds, the visitor won't notice it.
You would probably need to pass data to the stats script, which you can do by passing in parameters, like this:
shell_exec('php stats.php -b '. escapeshellarg($_SERVER['HTTP_USER_AGENT']) .' &');
In stats.php, you'd use the $argv variable to get that data.
But I wouldn't do this if the statistics code doesn't take that long to run, since forking a new process for every page load has some overhead. I don't know what makes the stats code take so long to process, but another solution might be to insert the raw data into a database and let a background job work on that data to create usable statistics. That could be done either by a cron job or by a screen session running in an infinite loop that processes the queue.
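The argv hand-off above can be sketched like this (`build_stats_cmd` is an illustrative helper; the redirect to /dev/null plus the trailing & is what detaches the process on a Unix host):

```php
<?php
// Build the background command; escapeshellarg() guards against
// shell injection from user-controlled values like the user agent.
function build_stats_cmd($userAgent)
{
    return 'php stats.php -b ' . escapeshellarg($userAgent)
         . ' > /dev/null 2>&1 &';
}

// In the page:
// shell_exec(build_stats_cmd($_SERVER['HTTP_USER_AGENT']));
//
// In stats.php:
// $opts = getopt('b:');   // $opts['b'] holds the user agent string
```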
Try to move your statistics code to a separate function and call this function with an AJAX call in the dom.ready or onload event in the JavaScript code on your rendered page, as in this meta code:
<html>
<script type="text/javascript">
dom.onready = Ajax.call(location.href + '?do_stats');
</script>
<body>...
</html>
The dom.ready event can be provided by the jQuery or Prototype libraries. The downside is that it will only work with JS enabled.
Alternatively, you could just record all the needed information for the stats to a database and dispatch a script that collects the queued data from there and works on it in the background - e.g. by using cron.
