AJAX script doesn't respond until background program run with exec() ends - PHP

I run a background PHP program with exec() like this:
exec('/usr/bin/php bgScript.php "arg1" "arg2" > /dev/null 2>&1 &');
It works and the program does run in the background.
Problem
I have output buffering enabled and would like to keep it that way.
My whole script is this:
exec('/usr/bin/php bgScript.php "arg1" "arg2" > /dev/null 2>&1 &');
echo json_encode(array(
"status" => "started"
));
When an AJAX request is made to the file above, the process is started and runs in the background. I assume this because further requests to the server return data and don't wait for the previous AJAX script to finish.
But the problem is that the JSON data is not output until the background process completes.
Since the program is made to run in the background, shouldn't the JSON data be output without waiting for exec() to end? I don't know how to put this technically (forgive me): why does the output buffer continue until exec() ends?
How can I make the script output the JSON data right after the program is started in the background, and close the connection between the AJAX script and the browser?

The command does not run in the background if its stdio is not redirected. From the official documentation:
Note:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
There are several methods to alter this behavior. On Unix-like systems you can run the command with command > /dev/null &; on Windows you can use the start command.
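Putting both pieces together for the script in the question, a minimal sketch might look like this (the Connection: close / Content-Length technique is shown in more detail in the related answers below):
<?php
// start the worker fully detached: output redirected, & sends it to the background
exec('/usr/bin/php bgScript.php "arg1" "arg2" > /dev/null 2>&1 &');
// then hand the JSON back immediately and close the connection;
// the headers can still be set here because the echo is buffered
ignore_user_abort(true);
ob_start();
echo json_encode(array("status" => "started"));
header("Connection: close");
header("Content-Length: " . ob_get_length());
ob_end_flush();
flush();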

Related

php - exec() run in background [duplicate]

I have a process intensive task that I would like to run in the background.
The user clicks on a page, the PHP script runs, and finally, based on some conditions, if required, it has to run a shell script, e.g.:
shell_exec('php measurePerformance.php 47 844 email#yahoo.com');
Currently I use shell_exec, but this requires the script to wait for the output. Is there any way to execute the command I want without waiting for it to complete?
How about adding:
"> /dev/null 2>/dev/null &"
shell_exec('php measurePerformance.php 47 844 email#yahoo.com > /dev/null 2>/dev/null &');
Note this also discards stdout and stderr.
This will execute a command and disconnect from the running process. Of course, it can be any command you want. But for a test, you can create a PHP file with a sleep(20) command in it:
exec("nohup /usr/bin/php -f sleep.php > /dev/null 2>&1 &");
You can also give your output back to the client instantly and continue processing your PHP code afterwards.
This is the method I use for long-running Ajax calls whose processing has no effect on the client side:
ob_end_clean();          // discard anything already buffered
ignore_user_abort();     // keep running even if the client disconnects
ob_start();              // buffer the response we are about to send
header("Connection: close");
echo json_encode($out);
header("Content-Length: " . ob_get_length()); // exact length lets the browser stop waiting
ob_end_flush();
flush();                 // push the buffered response out to the client
// execute your command here. client will not wait for response, it already has one above.
You can find the detailed explanation here: http://oytun.co/response-now-process-later
On Windows 2003, to call another script without waiting, I used this:
$commandString = "start /b c:\\php\\php.EXE C:\\Inetpub\\wwwroot\\mysite.com\\phpforktest.php --passmsg=$testmsg";
pclose(popen($commandString, 'r'));
This only works AFTER changing permissions on cmd.exe: add Read and Execute for IUSR_YOURMACHINE (I also set Write to Deny).
Use PHP's popen command, e.g.:
pclose(popen("start c:\wamp\bin\php.exe c:\wamp\www\script.php","r"));
This will create a child process and the script will execute in the background without waiting for output.
Sure, for Windows you can use:
$WshShell = new COM("WScript.Shell");
$oExec = $WshShell->Run("C:/path/to/php-win.exe -f C:/path/to/script.php", 0, false);
Note:
If you get a COM error, add the extension to your php.ini and restart Apache:
[COM_DOT_NET]
extension=php_com_dotnet.dll
If it's off of a web page, I recommend generating a signal of some kind (dropping a file in a directory, perhaps) and having a cron job pick up the work that needs to be done. Otherwise, we're likely to get into the territory of using pcntl_fork() and exec() from inside an Apache process, and that's just bad mojo.
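A minimal sketch of that signal-file approach (the spool directory, filenames, and job format are illustrative):
<?php
// web request side: drop a signal file describing the job, then return immediately
$job = array('task' => 'measurePerformance', 'args' => array(47, 844));
file_put_contents('/var/spool/myapp/job_' . uniqid() . '.json', json_encode($job));

<?php
// worker.php, run by cron (e.g. * * * * * php /path/to/worker.php)
foreach (glob('/var/spool/myapp/job_*.json') as $file) {
    $job = json_decode(file_get_contents($file), true);
    // ... do the actual work for $job here ...
    unlink($file); // remove the signal so the job is not picked up again
}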
The exec()/background-process approach will work, but be careful not to overload your server: a new background process is created every time you call this function. If there is only one concurrent call at a time, this workaround will do the job.
If not, I would advise running a message queue, for instance beanstalkd, Gearman, or Amazon SQS.
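As one example, with the PECL gearman extension the hand-off could look roughly like this (server address and task name are illustrative):
<?php
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730); // default gearmand host/port
// queue the job and return immediately; a separate worker process picks it up
$client->doBackground('measure_performance', json_encode(array(47, 844)));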

Execute PHP and let Apache do the rest in the background

This might be a very stupid question so please bear with me.
I have a PHP script that makes API calls to Shopify.
The entire point of this php script is to print out statements for each customer.
Now it has to run through about 200 customers.
This entire process takes about 15 minutes.
Ordinarily this runs on a monthly basis with a cron job.
But I need to be able to run it manually as well. I just want the page to start executing and then do everything in the background, with my browser or internet connection playing NO role in whether the execution completes.
The cron job runs header_php.php?run=monthly
Is there any way I can run it manually, make sure it gets a 200 response from the page, and then close my browser tab and be sure that Apache does the rest?
I would be executing it via an AJAX call as well.
Another thing, once each statement is done being processed, the script outputs it to pdf and emails it to the customer. So there's no feedback required from the page when it runs.
Easily doable with simple HTTP headers:
1. Start output buffering
2. Output the response, if any
3. Send Content-Length and Connection: close headers
4. Flush and end output buffers
5. The browser receives the HTTP response
6. Continue the time-consuming processing
This SO answer nails it (the comments are helpful as well).
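Following those steps, a minimal sketch (the same technique as the snippet quoted earlier; run_monthly_statements() is a hypothetical stand-in for the real work):
<?php
ignore_user_abort(true);                      // keep going if the client disconnects
ob_start();                                   // 1. start output buffering
echo json_encode(array('status' => 'ok'));    // 2. output the response, if any
header('Content-Length: ' . ob_get_length()); // 3. send Content-Length...
header('Connection: close');                  //    ...and Connection: close
ob_end_flush();                               // 4. flush and end output buffers
flush();                                      // 5. the browser now has its 200 response
run_monthly_statements();                     // 6. continue the time-consuming processing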
You can call your script from another script that runs the job in the background.
shell_exec('nohup /usr/bin/php /dir/to/your/script.php > /dev/null 2>/dev/null &');
Then you don't need to wait for the job to finish.
On Linux:
<?php
exec(''.$command.' > /dev/null 2>&1 &');
?>
On Windows:
<?php
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
?>
where $command would be something like:
php -f $path/to/filename
If you put that in a page, you can call it whenever you want; it will spawn a background process and will not require the browser to wait for any response.
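Putting the two variants together, a small platform check can pick the right one (the script path is illustrative):
<?php
$command = 'php -f /path/to/filename.php';
if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
    // Windows: 0 = hidden window, false = don't wait for the process to finish
    $shell = new COM('WScript.Shell');
    $shell->run($command, 0, false);
} else {
    // Linux: discard output and background the process
    exec($command . ' > /dev/null 2>&1 &');
}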

PHP - Parallel processing

I'm working on a web application that needs to send a lot of HTTP requests and update a table; this blocks PHP from executing anything else. So I thought I might have to write a separate PHP script and run it from my main application. I tried exec(), but the program still waits until the script is executed.
exec('php do_job.php');
I even tried redirecting the output to a file as PHP.Net suggests:
Note: If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
$result = exec('php do_job.php > output.txt &',$output);
But still no success... Further down the same page I came across this:
$command = 'php do_job.php';
$shell = new COM("WScript.Shell");
$shell->run($command, 0, false);
Still no success... Lastly I tried:
pclose(popen("start /B ". $command, "r"));
What am I doing wrong here?
I'm developing my app on localhost (XAMPP - Windows), later I'll be releasing it on a Linux host. My last resort would be to run the script via CRON jobs. Is this the only way?
By just adding & at the end of the command you can run any command in the background. Here I'm redirecting the output to /dev/null to avoid the hang (for Linux):
exec('php do_job.php > /dev/null &');
If you want to see the output of the command, you can redirect it to a file:
exec('php do_job.php > full_path_to_file &');
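If you also want a handle on the background process, one common trick is to have the shell echo $! (the PID of the last background job); this assumes an sh/bash-compatible shell:
$pid = (int) exec('php do_job.php > /dev/null 2>&1 & echo $!');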

long running php script hangs

I have this set in my PHP script to make it (supposedly) run as long as it needs to, parsing and running MySQL queries and fetching images for over 100,000 rows.
ignore_user_abort(true);
set_time_limit(0);
#begin logging output
error_reporting(E_ALL);
ini_set('memory_limit', '512M');
I run the command like this in shell:
nohup php myscript.php > output.txt
After running for about 8 to 10 hours, this script will still be running, but then execution just stops: no more output. It's not a zombie process (I checked top), and it hasn't hit the memory limit either; if it had, wouldn't it exit?
What is going on? It's a real pain to babysit this script and write custom code to nudge it along. I read up on Unix zombie cleanup, but it's not a zombie. I know it's not PHP settings, and it's not running through a webserver; it runs from the command line only. So what gives?
It looks like you haven't detached your process correctly. Currently, if your process's parent dies, your process will die too. If you place your process in the background (creating a real daemon), you won't run into such trouble.
You can execute your PHP this way to really detach it :
php myscript.php > output.txt 2>&1 &
For your information:
> output.txt
redirects standard output (i.e. your echo, print, etc.) to the output.txt file;
2>&1
redirects error output to standard output, writing it into the same output.txt file;
&
is the most important part in your case: it detaches the process to create a real daemon.
Edit: if you're having trouble when disconnecting from your shell, the simplest fix is to put your command in a bash script, for example run.sh:
#!/bin/bash
php myscript.php > output.txt 2>&1 &
And you'll run your script this way :
bash run.sh &
In that case, your shell will "think" your program has ended when the shell script ends, not when the PHP daemon ends.
Long-running PHP scripts shouldn't die or hang without reason. I've had scripts that run continuously for 6 months +. There must be something else going on inside of your script body.
I know I should use a comment to answer this, but I don't have enough reputation to do it...
Maybe your process is consuming 100% of the CPU; I had a similar issue with a while loop that never called sleep() or usleep() at the end of the loop.
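For illustration, even a short usleep() per iteration keeps such a loop from pinning a core ($keepRunning is a hypothetical loop condition):
<?php
while ($keepRunning) {
    // ... process the next batch of rows ...
    usleep(100000); // sleep 100 ms each pass so the loop doesn't sit at 100% CPU
}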

Prevent waiting for command line output

I thought that this would run without waiting for an output:
php /scripts/htdocs/summaries.live/app/scripts/generate-pdfs.php live 1 > /dev/null 2>&1
But it's not happening: PHP's exec() function is waiting for the output. How can I work around this?
You're missing & at the end of the command:
php /scripts/htdocs/summaries.live/app/scripts/generate-pdfs.php live 1 > /dev/null 2>&1 &
If running something with exec(), the documentation states:
Note:
If a program is started with this function, in order for it to
continue running in the background, the output of the program must be
redirected to a file or another output stream. Failing to do so will
cause PHP to hang until the execution of the program ends.
