I am trying to execute a script from my PHP code, which lives in a page called blah.php:
<?php
// ....
// just basic web site that allows upload of file...
?>
Inside it I use a system() call:
if (system("blah.pl arg1") != 0)
{
print "error\n";
}
else
{
print "working on it..you will receive e-mail at completion\n";
}
This works great, but it waits until the script completes before printing "working on it".
I am aware that I am calling a Perl script from PHP; that is not a typo.
How can I just start the program and let it complete in the background?
The blah.pl script handles the e-mail notification.
Anyone?
Thanks, I appreciate it.
From the system() function documentation:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Simply redirect the output of your Perl script and PHP should be able to run without having to wait for the script to finish.
system("blah.pl arg1 > /dev/null 2>&1 &");
Can you add an ampersand to spawn the process in the background?
if (system("blah.pl arg1 &") != 0)
You could probably use the popen function
http://www.php.net/manual/en/function.popen.php
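A minimal sketch of that idea, assuming blah.pl prints status lines on stdout (the command string is taken from the question above). Note that pclose() still waits for the process to end, so popen() is mostly useful for reading progress as it happens rather than for fully detaching:
$app = popen('blah.pl arg1 2>&1', 'r');
if ($app === false) {
    print "error\n";
} else {
    print "working on it..you will receive e-mail at completion\n";
    while (($line = fgets($app)) !== false) {
        // log or discard each line of the script's output here
    }
    pclose($app); // blocks until blah.pl exits
}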
Since all the PHP command-line functions wait for a result, I recommend that you have a separate page call the emailer script.
In other words, send a request to the email.php page (or whatever) using cURL or the file wrappers, which lets you close the connection before receiving a result. A sketch of the cURL side follows the diagram. The process will look something like this:
page_running.php
    |
    +--> cURL call to email.php?arg1=0
    |          |
    |          +--> email.php calls system("blah.pl arg1")
    |
    +--> final output (returned immediately)
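A minimal sketch of the calling side, assuming email.php lives on the same host. The URL, the 1-second timeout, and the ignore_user_abort() note are assumptions, not part of the original answer:
$ch = curl_init('http://localhost/email.php?arg1=0');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Stop waiting after one second; email.php should call ignore_user_abort(true)
// so it keeps running after this connection is dropped.
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
curl_exec($ch);   // the timeout "error" is expected and harmless here
curl_close($ch);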
Related
Here is my flow in detail.
I am making an AJAX request on the same host.
The AJAX call goes to saveUser.php.
In saveUser.php, I include Common.php.
Common.php has the createFile1() and createFile2() functions.
createFile1() just creates a sample1.php for another purpose.
In createFile2(), I create a file sample2.php and run exec('php /home/public/sample2.php > /dev/null &'); to execute that file.
I call createFile1() and createFile2() from saveUser.php, which is invoked by the AJAX request.
Because the exec command is set to run in the background (the '&' at the end of the command), it returns without waiting for a response, and everything looks fine on the front end.
But when I check the server, the files sample1.php and sample2.php are being created again and again. It seems the whole saveUser.php action keeps executing until I stop the sample2.php process from SSH.
I checked the process list; each time 'php /home/public/sample2.php' has a new process id, which confirms it is being executed over and over. If I remove the exec() call and run sample2.php from SSH, it works as expected and there is no such problem.
Please suggest what is going wrong. Is there a problem with the server configuration? I am using a HostGator shared account.
I am also including the same Common.php file in sample2.php, in case that helps.
Thanks for your help in advance.
saveUser.php code
include_once dirname(__FILE__).'/Common.php';
createFile1();
createFile2();
echo 'saved successfully!';
Common.php code
function createFile1()
{
$template = file_get_contents('sendDmTemplate.php');
$serviceFp = fopen('sendDmFiles/sample1.php',"w");
fwrite($serviceFp , $template);
fclose($serviceFp);
}
function createFile2()
{
$fileContent = file_get_contents(dirname(__FILE__).'/sampleFileTemplate.php');
$serviceFp = fopen('sample2.php',"w");
fwrite($serviceFp , $fileContent);
fclose($serviceFp);
exec('php '.dirname(__FILE__).'/sample2.php > /dev/null &');
}
sample2.php code
include_once dirname(__FILE__).'/Common.php';
echo 'hi';
This exact same thing happened to me. I was trying to have one PHP script call exec() on another PHP script, but for some strange reason it would just exec() itself again, creating an infinite loop. The currently accepted answer didn't get me anywhere, but I was able to get it to work by specifying an absolute path to php.
So, instead of
exec('php /home/public/sample2.php > /dev/null &');
it would be
exec('/usr/local/bin/php /home/public/sample2.php > /dev/null &');
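A minimal sketch of that fix applied to the question's code. The /usr/local/bin/php path is an assumption; check what `which php` reports over SSH on your own host and hard-code that, so exec() does not depend on the web server's PATH:
$php = '/usr/local/bin/php'; // adjust to the path reported by `which php`
exec($php . ' ' . escapeshellarg(dirname(__FILE__) . '/sample2.php') . ' > /dev/null 2>&1 &');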
Check the following:
1) Are you sure you are NOT calling your script saveUser.php multiple times? Maybe a coding error somewhere in the JavaScript XHR? Check this by looking in the (Apache?) log; a small logging sketch follows below.
2) Are you sure your PHP executes all right without the -q flag? I use php -q pathtoscript.php
3) If neither 1 nor 2: post the code of saveUser.php here (or somewhere).
EDIT: I see your problem. The file you create includes Common.php again, and executes that again, etc. (Wrong; oops, I wrote that too early. Looking into it again now.)
4) Is it possible you use some error handling that redirects to your saveUser.php?
Edit:
5) A problem might arise from the fact that you are including the file that executes the command itself, in combination with include_once, but I am not sure. You could simply test this by changing the content of sample2.php via sampleFileTemplate.php: create a Common2.php (with content identical to Common.php) and use that one. Could you test-drive that setup and report back?
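For check 1), a minimal sketch of a one-line logger that could be dropped at the top of saveUser.php (the log filename is an assumption). php_sapi_name() also shows whether a hit came through the web server or the CLI, which helps separate the AJAX calls from the background sample2.php runs:
file_put_contents(
    dirname(__FILE__) . '/saveUser.log',
    date('c') . ' sapi=' . php_sapi_name() . "\n",
    FILE_APPEND
);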
I have a long-running Perl script that outputs the percentage complete. How do I show the completion status in real time on a PHP page?
Example:
perl:
my $i = 0;
for $i (1 .. 6) {
print "$i\n";
sleep 1;
}
print "script end\n";
exit;
php:
echo passthru('perl testprint.pl');
This displays the output, but not in real time.
Modify your Perl script to pipe its output to a flat file. Use a PHP script to parse this file and output it in whatever format you want.
You could call the PHP script from your HTML page using AJAX, or if you save the Perl script in the server's CGI directory you can call it directly without the need for the extra PHP file.
Lastly, create a nifty progress bar and profit.
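A minimal sketch of the parsing side, assuming the Perl script appends its percentage lines to /tmp/progress.txt (the filename and the polling idea are assumptions). An AJAX call can hit this script every second or so and update the progress bar with whatever it returns:
header('Content-Type: text/plain');
$file = '/tmp/progress.txt';
if (is_readable($file)) {
    $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    echo $lines ? end($lines) : '0';   // report only the latest progress value
} else {
    echo '0';
}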
For that you need to utilize AJAX.
I would have a PHP script that looks at the status, and then have AJAX update the status for the user to see.
I need to build a system where a user sends a file to the server,
then PHP runs a command-line tool using system() (for example, tool.exe userfile).
I need a way to see the PID of the process, so I know which user started the tool,
and a way to know when the tool has stopped.
Is this possible on a Windows Vista machine? I can't move to a Linux server.
Besides that, the code must continue to run when the user closes the browser window.
Rather than trying to obtain the ID of a process and monitor how long it runs, I think that what you want to do is have a "wrapper" process that handles pre/post-processing, such as logging or database manipulation.
The first step is to create an asynchronous process that will run independently of the parent and can be started by a call to a web page.
To do this on Windows, we use WshShell:
$cmdToExecute = "tool.exe \"$userfile\"";
$WshShell = new COM("WScript.Shell");
$result = $WshShell->Run($cmdToExecute, 0, FALSE);
...and (for completeness) if we want to do it on *nix, we append > /dev/null 2>&1 & to the command:
$cmdToExecute = "/usr/bin/tool \"$userfile\"";
exec("$cmdToExecute > /dev/null 2>&1 &");
So, now you know how to start an external process that will not block your script, and will continue execution after your script has finished. But this doesn't complete the picture - because you want to track the start and end times of the external process. This is quite simple - we just wrap it in a little PHP script, which we shall call...
wrapper.php
<?php
// Fetch the arguments we need to pass on to the external tool
$userfile = $argv[1];
// Do any necessary pre-processing of the file here
$startTime = microtime(TRUE);
// Execute the external program
exec("C:/path/to/tool.exe \"$userfile\"");
// By the time we get here, the external tool has finished - because
// we know that a standard call to exec() will block until the called
// process finishes
$endTime = microtime(TRUE);
// Log the times etc and do any post processing here
So instead of executing the tool directly, we make our command in the main script:
$cmdToExecute = "php wrapper.php \"$userfile\"";
...and we should have a finely controllable solution for what you want to do.
N.B. Don't forget to escapeshellarg() where necessary!
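A minimal sketch of how the pieces fit together in the main script, with escapeshellarg() applied as the note above suggests (how $userfile is obtained is an assumption):
$userfile = $_FILES['upload']['tmp_name'];           // assumption: wherever the upload landed
$cmdToExecute = 'php wrapper.php ' . escapeshellarg($userfile);
$WshShell = new COM("WScript.Shell");
$WshShell->Run($cmdToExecute, 0, FALSE);             // 0 = hidden window, FALSE = don't wait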
Hi, please help me execute more than one method at a time in PHP.
Below is an example:
<?php
function writeName()
{
sleep(3);
echo "Kai Jim Refsnes";
}
function b(){
sleep(3);
echo"b";
}
b();
writeName();
?>
The above program takes 6 seconds to execute, but I want to run both methods simultaneously so that it finishes within 3 seconds (multi-threading).
With plain PHP it is not possible, because PHP executes sequentially. You could look at a job server like Gearman, or try forks (pcntl_fork()). That is not multi-threading, because there is no shared memory.
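A minimal sketch of the fork approach, using the two functions from the question. This assumes the pcntl extension and a CLI run on a Unix-like system; it will not work under mod_php or on Windows:
function writeName() { sleep(3); echo "Kai Jim Refsnes\n"; }
function b()         { sleep(3); echo "b\n"; }

$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    b();          // child process
    exit(0);
} else {
    writeName();  // parent runs at the same time as the child
    pcntl_waitpid($pid, $status);
}
// Total wall-clock time is roughly 3 seconds instead of 6.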
Sorry, but multithreading is not supported in PHP.
But you could start a PHP script which can run in the background using exec(). Just make sure you redirect its output elsewhere.
That should be the closest you can get to "multithreading" without additional tools. Here's what the manual says:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
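A minimal sketch of that approach, under the assumption that each slow method is moved into its own small script (worker_writeName.php and worker_b.php are made-up names). With the output redirected, both exec() calls return immediately and the two scripts run side by side:
exec('php worker_writeName.php > /dev/null 2>&1 &');
exec('php worker_b.php > /dev/null 2>&1 &');
echo "both workers started\n"; // returns right away; each worker finishes ~3 seconds later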
I'll show some code, first:
echo exec("compile\\save.exe Untitled.c tmpUntitled.c");
I have a program named save.exe, and I want to know whether it has already stopped.
If it has stopped, OK, do something...
If not, there may be an error, or a loop...
Now, I want a way to control how much time the program uses, and to put a limit on it ("time limit exceeded", something like that).
Does anyone have a suggestion?
Edit [1]: save.exe is a program written in C, and it takes two parameters: source and destination.
popen just doesn't execute save.exe; I say this because it no longer generates the destination file (with exec it does).
exec() will suspend your script while the exec'd program is running. Control will be returned only when the external program terminates. If you want to see if the program's hung up or waiting for input or somesuch, you use popen(), which returns a filehandle from which you can read the program's output. At that point you can do a polling loop to see if there's been any output:
$app = popen("your shell command here", "r");
while (($output = fgets($app)) !== false) {
    // handle output
    sleep(5); // sleep 5 seconds between reads
}
pclose($app);
If you want to send something to the app as input, then you have to use proc_open(), which allows bi-directional communication between your script and the external program.
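A minimal sketch that covers both parts of the question, checking whether save.exe has finished and enforcing a time limit, using proc_open() together with proc_get_status() and proc_terminate(). The 30-second limit is an assumption; the command string comes from the question:
$descriptors = array(
    0 => array('pipe', 'r'),  // child's stdin
    1 => array('pipe', 'w'),  // child's stdout
    2 => array('pipe', 'w'),  // child's stderr
);
$proc = proc_open('compile\\save.exe Untitled.c tmpUntitled.c', $descriptors, $pipes);
if (is_resource($proc)) {
    $start = time();
    while (true) {
        $status = proc_get_status($proc);
        if (!$status['running']) {
            echo "finished with exit code {$status['exitcode']}\n";
            break;
        }
        if (time() - $start > 30) {
            proc_terminate($proc);      // time limit exceeded
            echo "killed: time limit exceeded\n";
            break;
        }
        // If the tool prints a lot, drain $pipes[1] here so it does not
        // block on a full pipe buffer.
        sleep(1);
    }
    fclose($pipes[0]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
}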