How to run a php file in background [duplicate] - php

This question already has answers here:
Possible Duplicate: Best way to manage long-running php script?
Closed 10 years ago.
I have to build a big email list. Everything works perfectly, but when I submit the form, the page keeps loading until every email is sent. So I want this email-sending script to run in the background, and to notify the user that the script is running in the background.
I can't use Ajax.
I want something like proc_open, exec, shell_exec...

You can have a cron job run a PHP script that reads the queue from the database and sends the email.
In the main script you just need to add the emails to the queue.
I would use Ajax only if you need a progress bar; with an Ajax solution you would need to keep the window open until it's finished.
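The queue approach described above can be sketched like this; the `mail_queue` table, its columns, and the connection details are illustrative assumptions, not anything from the original answer:

```php
<?php
// worker.php -- run from cron, e.g.:  */5 * * * * php /path/to/worker.php
// Assumes a mail_queue table (id, recipient, subject, body, sent); all names are illustrative.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$batch = $pdo->query('SELECT id, recipient, subject, body FROM mail_queue WHERE sent = 0 LIMIT 50');
$mark  = $pdo->prepare('UPDATE mail_queue SET sent = 1 WHERE id = ?');

foreach ($batch as $row) {
    if (mail($row['recipient'], $row['subject'], $row['body'])) {
        $mark->execute([$row['id']]);   // only mark messages that were handed off to the MTA
    }
}
```

The form handler then only INSERTs rows into `mail_queue` and can respond to the user immediately.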

You could make an AJAX call that invokes the PHP script. That way your site stays usable while the request is fulfilled, and when it's finished the AJAX call returns and you can show a message box to the user.
For more information, check at least this, and once you understand what AJAX is and what it does, use it with this

An Ajax request would be the best choice for this. You can send a request using JavaScript and even report progress to the user (which might require some additional work).
If you find Ajax too difficult, run the script in an iframe. This is not the most elegant method, but it is the simplest.

Submit the form with AJAX and update the progress in a div.

For example: write the current state of your script's run ("complete"/"incomplete") to some place "A" (a database or a file). After starting the script in the background, send the user a waiting page that uses AJAX to watch for changes at "A".

This Ajax script will execute a PHP file in the background. It can also write the response into an HTML element if you want.
<script type="text/javascript" language="javascript">
function execute(filename,var1,var2,var3)
{
var xmlhttp;
if(window.XMLHttpRequest)
{
//Code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp = new XMLHttpRequest();
}
else if(window.ActiveXObject)
{
//Code for IE6, IE5
xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
}
else
{
alert("Your browser does not support AJAX!");
return; //bail out, otherwise the undefined xmlhttp is used below
}
var url = filename; //POST parameters go in the request body, not the query string
var params = "var1="+var1+"&var2="+var2+"&var3="+var3;
xmlhttp.open("POST", url, true);
xmlhttp.onreadystatechange=function()
{
if(xmlhttp.readyState==4)
{
//Below line will fill a DIV with ID 'response'
//with the reply from the server. You can use this to troubleshoot
//document.getElementById('response').innerHTML=xmlhttp.responseText;
}
}
//Send the proper header information along with the request
xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
//Content-Length and Connection are set automatically; browsers forbid overriding them
xmlhttp.send(params);
}
</script>

You can also try running the script through an Ajax call if you don't want to set up a cron script.

PHP has a function that can keep a process running even if the user who requested the page leaves it: ignore_user_abort. If you check the comments there, you can see this example:
<?php
ignore_user_abort(1); // run script in background
set_time_limit(0); // run script forever
$interval=60*15; // do every 15 minutes...
do{
// add the script that has to be ran every 15 minutes here
// ...
sleep($interval); // wait 15 minutes
}while(true);
?>
It IS a pure-PHP cron job, BUT the risk with this script is that it continues indefinitely, or at least until you reset/kill PHP.
Changing set_time_limit(0); to set_time_limit(86400); would kill the script after a day.
This should point you in the right direction.
IMPORTANT
After the problem reported by the OP: it is advisable to run this script only if you have SSH access to the server, so you can kill/restart PHP or Apache in case the server hangs.
Also, do not run the script on a live server.
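A related trick, sketched under the assumption that you are running PHP-FPM (fastcgi_finish_request() does not exist under mod_php, hence the fallback branch): reply to the user first, then keep the same request alive to do the slow work.

```php
<?php
ignore_user_abort(true);   // keep running even if the client disconnects
set_time_limit(0);         // no time limit for the mail loop
ob_start();

echo "Your emails are being sent in the background.";

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();            // PHP-FPM: flush output and close the connection
} else {
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();
}

// From here on the user already has their page; do the slow work.
// $recipients, $subject and $body are assumed to come from the submitted form.
foreach ($recipients as $address) {
    mail($address, $subject, $body);
}
```

Unlike the endless loop above, this variant ends on its own when the work is done.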

Related

how can i execute php instructions when closing the browser?

I want to execute instructions at the moment the browser closes; for example, the instruction is a request sent to update the database.
PHP has nothing to do with the browser. PHP can't interact with it in any way directly, but you can use JavaScript to communicate with your PHP. Doing something when the browser is closed is not always reliable because the browser could close unexpectedly, but you can attempt to gather the info using an onunload event listener to send an HTTP request to run your PHP script.
window.onunload = function(){
var xmlhttp = (window.XMLHttpRequest)? new XMLHttpRequest() : new ActiveXObject("Microsoft.XMLHTTP");
var url = "MyPHPScript.php";
xmlhttp.open("GET",url,false); // synchronous on purpose: an async request may be killed when the page unloads
xmlhttp.send();
}

How to detect php server timeout on ajax request?

Suppose you make an Ajax request to a page that runs PHP code. The page outputs some data (using the flush() method or otherwise), but because of a 30-second timeout or some other error, the PHP request ends.
When the request ends, I want to find out about it on the client side and restart the request.
For example, suppose I have something like this:
xmlhttp.onreadystatechange=function(){
if(xmlhttp.readyState==3 && xmlhttp.status==200){
document.getElementById("A").innerHTML=xmlhttp.responseText;
}
else if(xmlhttp.readyState==4 && xmlhttp.status==200){
document.getElementById("A").innerHTML=xmlhttp.responseText;
//plus some additional code to end the process
}
else if(xmlhttp.status== SOMETHING /*WHAT DO I ADD HERE TO KNOW THAT THE SERVER HAS TIMED OUT OR SOMETHING SO I CAN RESTART THE XMLHTTP CYCLE */){
//code to resend xmlhttp request.
}
}
One strategy I used was to set a timer in JS, and then clear it if the call was successful.
var myTimer = setTimeout(stuffToDoOnFailure, 6000); // 6 secs
ajaxCall(function callBack() {
clearTimeout(myTimer);
});
EDIT:
Of course, if the ajax call succeeds after 6 secs, you might end up with both stuffToDoOnFailure and callBack being executed, so you want to handle that case somehow. That depends on your app, though.

php monitor process

I'm sure similar questions have been answered over and over again. If so, then I googled in the wrong direction and apologize for that.
My problem:
I'm writing a web page with a process running in the background. The process I'm talking about is an R script which runs quite long, maybe several days. When the process is started, it's started like this:
exec(sprintf("%s > %s 2>&1 & echo $! >> %s", $cmd, $outputfile, $pidfile));
I'd like to track whether the process is still running, so that when the user checks, he gets a message saying either that it is finished or that it is not. Tracking starts and stops when the user chooses the input file he uploaded to the server, for example after he logs in again or does something else on the page. I also want the page to update itself when the process finishes, so it changes in case he is looking at it right then.
I have two possibilities. I can either check it via the process id or whether an output file is generated or not.
I tried
while(is_process_running($ps)){
ob_flush();
flush();
sleep(1);
}
This kind of works, except that all other functionality on the page freezes.
That's my is_process_running($ps) function:
function is_process_running($PID){
    exec("ps " . (int) $PID, $ProcessState); // cast to int so nothing else can be injected into the shell command
    return (count($ProcessState) >= 2); // "ps PID" prints a header line plus one line per live process
}
What I really need is another process running in the background checking whether the first process is still running and if not, refreshes the page when the first process finishes.
How would you do that? Please let me know if you need additional information. All help is much appreciated.
With symcbean's answer I was able to solve it. My JavaScript code is below; maybe it's of use to someone facing the same problem. I'm also open to improvements; I still consider myself a beginner.
function loadXMLDoc(){
var xmlhttp;
if (window.XMLHttpRequest){// code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp=new XMLHttpRequest();
}else{// code for IE6, IE5
xmlhttp=new ActiveXObject("Microsoft.XMLHTTP");
}
return xmlhttp;
}
var timer;      // interval handle
var lines = 0;  // 0 until polling has started
function start(file,time){
    var xmlhttp = loadXMLDoc();
    if(lines == 0){
        lines = 1; // block a second start() from creating a duplicate interval
        timer = setInterval(function(){sendRequest(xmlhttp,file)},time);
    }
}
function sendRequest(xmlhttp,file){
    xmlhttp.open("POST",file,true);
    xmlhttp.onreadystatechange=function() {
        if (xmlhttp.readyState==4 && xmlhttp.status==200 && xmlhttp.responseText != ""){
            var text = xmlhttp.responseText;
            var splitted = text.split("\n");
            var length = splitted.length;
            if(splitted[length-2]=="finished"){
                refresh_page();
                clearInterval(timer);
                lines = 0;
            }
        }
    };
    xmlhttp.send(); // assign the handler before send() so no state change is missed
}
The start method is called with the file path and the time interval at which it is supposed to check for changes. Note that I did not post the refresh_page method; that is just whatever you want to refresh on your page. The page is refreshed when the last line in the file says "finished".
I included the lines variable to check whether the process has already started. I have not fully tested the code, but so far it is doing what I want it to do.
First off, there are a number of issues with the way you are starting the process - it really needs to be in a separate process group from the PHP that launches it.
As to checking its status: again, you don't want PHP stuff hanging around for a long time at the end of an HTTP connection. Also, getting the webserver to flush content to the browser on demand AND getting the browser to render it progressively is very difficult and fragile. Even if you get it working on one browser/webserver combination, it's unlikely to work on another.
Use Ajax to poll a simple script which reports back on the process status / progress.
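Such a status script could look like the sketch below. The pidfile path and the "running"/"finished" strings are assumptions; it relies on the launcher having written the PID to a pidfile, as the question's exec() line already does.

```php
<?php
// status.php - polled by Ajax; reports whether the background process is still alive.
// The pidfile path and the "running"/"finished" strings are assumptions.
function job_status($pidfile) {
    $pid = (int) trim((string) @file_get_contents($pidfile));
    // posix_kill() with signal 0 only tests for process existence; it sends no signal.
    if ($pid > 0 && function_exists('posix_kill') && @posix_kill($pid, 0)) {
        return 'running';
    }
    return 'finished';
}

echo job_status('/tmp/job.pid');
```

The JavaScript interval from the accepted answer can poll this instead of reading the output file, which avoids the blocking while loop entirely.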

How to upload a file and display its information

I want to build a web service that will process some files.
Here is what I want to do:
User uploads a file to the server using "upload form", the file is saved as a temporary file on the server-side
Server-side python script processes the temporary file and produces some statistics (for example, number of lines and words in the file)
The statistics are displayed near the "upload form"
The question here is: I would like the file to be processed in the background just after it is uploaded, and, once it is done, to .append() the produced results to the current view. I do not want to assign the script via <form action="processing_script.php">... because the user would be redirected to processing_script.php after clicking the Upload button.
Any clues? Maybe some neat ajax call?
function ajaxRequest(){
var activexmodes=["Msxml2.XMLHTTP", "Microsoft.XMLHTTP"] //activeX versions to check for in IE
if (window.ActiveXObject){ //Test for support for ActiveXObject in IE first (as XMLHttpRequest in IE7 is broken)
for (var i=0; i<activexmodes.length; i++){
try{
return new ActiveXObject(activexmodes[i])
}
catch(e){
//suppress error
}
}
}
else if (window.XMLHttpRequest) // if Mozilla, Safari etc
return new XMLHttpRequest()
else
return false
}
function postFile(){
var mypostrequest=new ajaxRequest()
mypostrequest.onreadystatechange=function(){
if (mypostrequest.readyState==4){
if (mypostrequest.status==200 || window.location.href.indexOf("http")==-1){
document.getElementById("my_Result_tag").innerHTML=mypostrequest.responseText //this is where the results will be put!
}
else{
alert("An error has occured making the request")
}
}
}
var file = document.getElementById("my_file");
//NOTE: "file="+file would just send "[object HTMLInputElement]". A file input
//cannot be posted as a url-encoded string at all; use FormData (modern browsers)
//or a hidden-iframe upload instead.
var parameters="file="+encodeURIComponent(file.value) //sends only the file name, not the contents
mypostrequest.open("POST", "basicform.php", true)
mypostrequest.setRequestHeader("Content-type", "application/x-www-form-urlencoded")
mypostrequest.send(parameters)
}
Yeah, you'll need ajax for that. Create the form as usual, then submit it using Ajax. Form handling can be done as usual.
If you google 'file upload Ajax' I'm sure you can find everything you need :)
Yeah, I'd make a second Ajax request and run it on a schedule (e.g. every 10 seconds); it queries the server to see whether the uploaded file has been processed. The server may even do the file processing in an external program. The PHP script that accepts the second Ajax request checks some READY status and gives the client the answer YES/NO/FAILED. When the client gets a YES answer, it redirects the user to the RESULTS PAGE; if it gets FAILED, it alerts the user to the problem.

Causing two things to load in parallel?

I'm writing some PHP that does a fair amount of processing and then generates reports of the results. Previously it would do a periodic flush() but we're moving to Zend Framework and can't do that anymore.
Instead, I would like to have some kind of status display that updates while the report is generated. So I made a progress bar that loads in an iframe, added shared memory to the progress bar update action and the report generation action, and caused the output to load via xmlhttprequest. This all works fine.
My issue is that the browser wants to do the two requests serially instead of in parallel, so it will request the progress bar and then BLOCK until the progress bar completes BEFORE it requests the actual output. This means that the process will never end since the real work never starts.
I've searched all morning for some way around this and came up empty-handed.
Is there some way to cause two connections, or am I just screwed?
My next action will be to break the processing apart some more and make the status updating action do the actual work, save the result, and then use the other action to dump it. This will be really painful and I'd like to avoid it.
Edit: Here is the javascript, as requested:
function startProgress()
{
var iFrame = document.createElement('iframe');
document.getElementsByTagName('body')[0].appendChild(iFrame);
iFrame.id = 'progressframe';
iFrame.src = '/report/progress';
}
function Zend_ProgressBar_Update(data)
{
document.getElementById('pg-percent').style.width = data.percent + '%';
document.getElementById('pg-text-1').innerHTML = data.text;
document.getElementById('pg-text-2').innerHTML = data.text;
}
function Zend_ProgressBar_Finish()
{
document.getElementById('pg-percent').style.width = '100%';
document.getElementById('pg-text-1').innerHTML = 'Report Completed';
document.getElementById('pg-text-2').innerHTML = 'Report Completed';
document.getElementById('progressbar').style.display = 'none'; // Hide it
}
function ajaxTimeout(){
xmlhttp.abort();
alert('Request timed out');
}
var xmlhttp;
var xmlhttpTimeout;
function loadResults(){
if (window.XMLHttpRequest){
// code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp=new XMLHttpRequest();
}else{
// code for IE6, IE5
xmlhttp=new ActiveXObject("Microsoft.XMLHTTP");
}
xmlhttp.open("POST","/report/output",true);
xmlhttp.onreadystatechange=function(){
if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
clearTimeout(xmlhttpTimeout);
document.getElementById('report-output').innerHTML=xmlhttp.responseText;
}
}
xmlhttpTimeout=setTimeout(ajaxTimeout,600000); // Ten minutes; no 'var' here, or the global would be shadowed and never cleared
xmlhttp.setRequestHeader('Content-Type','application/x-www-form-urlencoded');
xmlhttp.send('".file_get_contents("php://input")."'); // this JS is emitted from a PHP string, hence the embedded PHP concatenation
}
This gets called from the following onload script:
onload="startProgress(); setTimeout(loadResults,1000);"
The issue is not in Javascript. If you put an alert() in there, the alert will be triggered at the right time, but the browser is delaying the second http transaction until the first completes.
Thank you everyone for your input.
I didn't come up with a satisfactory answer for this within the timeframe permitted by our development schedule. It appears that every common browser wants to re-use an existing connection to a site when doing multiple transactions with that site. Nothing I could come up with would cause the browser to initiate a parallel connection on demand. Any time there are two requests from the same server the client wants to do them in a serial fashion.
I ended up breaking the processing into parts and moving it into the status bar update action, saving the report output into a temporary file on the server, then causing the status bar finish function to initiate the xmlhttprequest to load the results. The output action simply spits out the contents of the temporary file and then deletes it.
Using two async Ajax calls could do the trick. With the first Ajax request you start the process by calling the PHP CLI to do the actual work deep in the background (so it doesn't expire or get cancelled) and return the id of the process (task). Once you have the process id, you can start the periodic Ajax to display the progress made.
Making a db table containing process_id, state, user would not be a bad thing. In that case, even if the user closed the browser while the process is running, the process would continue until done. The user could revisit the page and see the percentage done, because the process running in the CLI would save its progress into the db table.
Make a system call to the php file and detach it?
ex:
exec('nohup php test.php > test.out 2> test.err < /dev/null &');
echo 'I am totally printing here';
test.php contains a two-second sleep and then prints, but the echo returns immediately.
Have the background script store its results in a file/database/whatever. It will act like a very dirty fork.
You could also do something similar with a cURL call, I bet, if you have trouble with exec.
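The cURL idea can be sketched like this: fire a request at the worker script with a very short timeout and ignore the expected timeout error, so the caller returns almost immediately. The URL is an assumption, and the worker must call ignore_user_abort(true) so it survives the dropped connection.

```php
<?php
// Assumption: test.php lives at this URL and calls ignore_user_abort(true).
$ch = curl_init('http://localhost/test.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);     // needed on some builds for sub-second timeouts
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200); // hang up after 200 ms; the timeout "error" is expected
curl_exec($ch);                            // returns almost immediately
curl_close($ch);
echo 'I am totally printing here';
```

This is useful on hosts where exec() is disabled, at the cost of tying the worker's lifetime to the web server rather than to a detached process.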
Credit here for the code example from bmellink (mine was way worse than his).
If you are able to load the report in the iFrame, you can kind of reverse your logic (I have done this to track file uploads to PHP).
Load Report in iFrame (can be hidden or whatever you like).
Make ajax call to get progress (step 1 will have to log progress as others have mentioned).
When the progress reports loading complete, you may show the iframe or whatever is needed to complete.
Hope that helps. I just did a whole lot with iframes, CORS, and Ajax calls to APIs.
