I have multiple PHP scripts that are currently working. I created a cron job to execute all the scripts at a given time, but now the client wants a trigger/event so he can execute those scripts on demand. So I thought of using the exec function.
Here is the problem: the scripts have to be executed in order. E.g. I have two scripts, step1.php and step2.php. How do I run the two PHP scripts in order, and as a background process?
I read that the third parameter of the exec function can return a result, but it only ever gives me: string(0) ""
This is what I want to achieve:
$step1 = exec("php step1.php > /dev/null &", $output, $returnVal);
if($step1 === TRUE) exec("php step2.php > /dev/null &", $output, $returnVal);
Or maybe there is another PHP function that is more suitable than exec? I really don't know. Please help.
Thanks a lot, guys.
You don't have to run exec() from a php script to handle code in other php scripts. There are probably a dozen ways to do what you want, but one I would probably use:
Functionalize the code in step1.php and step2.php:
Old (Pseudo):
<?php
$var = true;
return $var;
?>
New:
<?php
function foo() {
$var = true;
return $var;
}
?>
Include those scripts (because the code is now functionalized, it doesn't get executed until you call the functions). So, in your script that calls the step scripts:
<?php
include('step1.php');
include('step2.php');
?>
Now call the functions you need with whatever logic you require:
<?php
include('step1.php');
include('step2.php');
if(foo() == true) {
bar(); //bar() is found in step2.php
}
?>
Again, there are several ways to accomplish this, much depending on your requirements and on what the code in the step PHP scripts is doing. I'm making assumptions about that, given the lack of detail about what step1 and step2 are trying to execute.
Call the second script at the end of the first one:
if (... == TRUE) {
include('second_script.php');
}
And then you only need to run the first script on cron.
Assuming that you have step1.php outputting true on success
$step1 = exec("php step1.php > /dev/null &", $output, $returnVal);
$step1 is now the last line from the result of the command.
[EDIT: I misread the manual. Depending on what you want, you probably do need to check $step1 for the outputted result, and you are getting an empty string because you're not outputting anything in step1.php. Everything else appears to be correct, however.]
What you should be checking is $returnVal for the return status. This variable is passed by reference (according to the manual) and holds the command's exit status, which is 0 on success, so your code should be:
exec("php step1.php > /dev/null &", $output, $returnVal);
if($returnVal === 0) exec("php step2.php > /dev/null &", $output, $returnVal);
if($returnVal === 0) exec("php step3.php > /dev/null &", $output, $returnVal);
You could even use a while loop:
$num_steps = 4; //Arbitrary number for example
$step = 1;
$returnVal = 0;
while($returnVal === 0 && $step <= $num_steps) {
exec('php step'.$step.'.php > /dev/null &', $output, $returnVal);
$step++;
}
[Careful, I haven't tested the above and it may not be suitable for what you want to do]
EDIT: islanddave's answer below is 'better'. I assumed that your current process was set, and you cannot change it (e.g. amount of time you have, can't refactor existing code for legacy reasons etc.).
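For what the OP actually asked (steps strictly in order, whole job in the background), one shell-level sketch is to chain the steps with `&&` inside a single backgrounded subshell, so the calling exec() returns immediately while the steps still run in sequence. The `touch` commands below are stand-ins for the real `php step1.php` / `php step2.php` invocations:

```shell
#!/bin/sh
# One backgrounded subshell runs the steps strictly in order:
# step2 only starts if step1 exits 0. From PHP you would call
#   exec('(php step1.php && php step2.php) > /dev/null 2>&1 &');
# The touch commands below stand in for the real step scripts.
rm -f /tmp/step1.done /tmp/step2.done
( touch /tmp/step1.done && touch /tmp/step2.done ) > /dev/null 2>&1 &
wait        # demo only: let the background chain finish so we can inspect it
[ -f /tmp/step1.done ] && echo "step1 ran"
[ -f /tmp/step2.done ] && echo "step2 ran"
```

This keeps the caller non-blocking without any polling: the ordering guarantee lives entirely in the one backgrounded command line.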
I saw a couple of other questions on the issue, but no clear answer.
I have a PHP file (it must be PHP; I cannot use cron or other tools) running from the CLI, in which I must call the same function multiple times with different arguments:
doWork($param1);
doWork($param2);
doWork($param3);
function doWork($data)
{
//do stuff, write result to db
}
Each call makes HTTPS requests and parses the response. The operation can take up to a minute to complete. I must prevent the "convoy effect": each call must be executed without waiting for the previous one to complete.
PECL pthread is not an option due to server constraints.
Any ideas?
As far as I know, you cannot do exactly what you are looking for.
Instead of calling a function with its parameters, you have to call another CLI PHP script in a non-blocking manner and put your function in that script.
This is your main script:
callDoWork($param1);
callDoWork($param2);
callDoWork($param3);
function callDoWork($param){
// 'start "" /b' is Windows-specific; on Linux, append ' > /dev/null &'
// to the command instead to detach it.
// escapeshellarg() handles spaces and other characters that are
// special on the command line.
$cmd = 'start "" /b php doWork.php '.escapeshellarg($param);
pclose(popen($cmd, 'r'));
}
doWork.php would look like :
if(is_array($_SERVER['argv'])) $param = $_SERVER['argv'][1];
doWork($param);
function doWork($data)
{
//do stuff, write result to db
}
More information about argv.
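A quick note on the indexing, since it trips people up: $argv[0] is the script's own name and the first real argument is $argv[1], exactly like positional parameters in the shell. A minimal shell illustration (the parameter value is made up):

```shell
#!/bin/sh
# $0 is the script name; "$1" is the first argument, which corresponds
# to PHP's $argv[1] ($argv[0] is likewise the script name).
set -- "param-value"     # simulate being invoked with one argument
echo "first arg: $1"
```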
How about adding "> /dev/null 2>/dev/null &"
exec('php myFile.php > /dev/null 2>/dev/null &');
You can check the documentation for more details.
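The reason those redirections matter: exec() waits until the command's output streams are closed, so a backgrounded command still writing to stdout/stderr keeps the PHP call blocked. A small sketch of the redirection itself, with a stand-in command that writes to both streams:

```shell
#!/bin/sh
# Redirect stdout and stderr, then background the job; the parent
# shell (or PHP's exec()) can return immediately because neither
# stream is attached to it any more.
sh -c 'echo normal-output; echo error-output >&2' \
    > /tmp/demo.out 2> /tmp/demo.err &
wait                      # demo only: let the job finish so we can look
cat /tmp/demo.out         # prints normal-output
cat /tmp/demo.err         # prints error-output
```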
I have the code below to loop through and run an external file. I can't find much info on how to do this properly, and it's not working.
while($row = mssql_fetch_array($run)){
$arg = escapeshellarg($row['shopkeeper']);
// escapeshellarg() already wraps the value in quotes, so don't add more
exec("php /var/www/clients/client1/web/products/shopmania/index_test.php $arg > /dev/null 2>&1 &");
}
How can I debug this? Perhaps it's finding the file but $arg isn't being passed properly; I have no idea how to test and find out.
Thanks in advance.
You should just do it like this and have a look at the output:
exec("php /var/www/clients/client1/web/products/shopmania/index_test.php $arg", $output);
print_r($output);
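When debugging, it also helps to capture the exit status along with the output: a missing `php` binary, a bad path, and a script error all show up differently there. A shell sketch with a stand-in command (the printf plays the role of the index_test.php invocation):

```shell
#!/bin/sh
# Capture both the output and the exit status of the command you are
# debugging. Note the argument arrives intact even though it contains
# a space, because it is passed as a single quoted word.
arg="shop keeper"
out=$(sh -c 'printf "got:[%s]" "$1"' _ "$arg")
status=$?
echo "$out"                  # shows exactly what the script received
echo "exit status: $status"  # non-zero would point at the command itself
```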
I am running the following code. It takes a text file, splits it into parts with a '_part' suffix, and then calls the same script with a flag to process the files, uploading the content to a Drupal system.
The script runs and finishes the work, all invoked scripts finish too, and I can see the results. But each time after I run it, the web server stops responding. Is there anything basic that I am missing or doing wrong?
if(isset($argv[3])){
$isSplit = $argv[3] == 'true' ? true : false;
}
if($isSplit){
$fileSplitter = new CSVFileParts($fileName);
$parts = $fileSplitter->split_file();
echo 'Split file into '.$parts.' parts'.PHP_EOL;
for($part =0; $part < $parts; $part++){
echo shell_exec('php Service.php u ./partial_files/'.basename($fileName).'.part_'.$part.' false > /dev/null 2>/dev/null &');
}
}else{
$log->lwrite('uploading '.$argv[2]);
$drupalUploader = new DrupalUploader($fileName, $log);
$drupalUploader->upload();
}
shell_exec — Execute command via shell and return the complete output as a string
shell_exec expects to read the command's output through a pipe, but you redirect everything to /dev/null and detach the process, so there is nothing for it to return.
Since you plan to detach the process and discard all the output, you should use exec() and escapeshellcmd() instead.
see: http://www.php.net/manual/en/function.exec.php
I have a long running PHP script that has a memory leak which causes it to fail part way through. The script uses a 3rd party library and I haven't been able to find the source of the leak.
What I would like to do is create a bash script that continually runs the PHP script, processing 1000 records at a time, until the script returns an exit code saying it has finished processing all records. I figure this should help me get around the memory leak because the script would run for 1000 records, exit, and then a new process would be started for another 1000 records.
I'm not that familiar with Bash. Is this possible to do? How do I get the output from the PHP script?
In pseudocode, I was thinking of something along the lines of:
do:
code = exec('.../script.php')
# PHP script exits with code 0 if all records are processed, or 1 if there is more to do
while (code != 0)
$? gives you the exit code of a program in bash
You can do something like:
while /bin/true; do
php script.php
if [ $? != 0 ]; then
echo "Error!";
exit 1;
fi
done
You can probably even do:
while php script.php; do
echo "script returned success"
done
Do you have to use bash? You could do this with PHP instead:
do {
$output = exec('php otherscript.php', $out, $ret);
} while ($ret != 0);
The $ret variable will contain the exit code of the script.
Use a simple until loop to automatically test the exit status of the PHP script.
#!/bin/sh
until script.php
do
:
done
The colon is merely a null operator, since you don't actually want to do anything else in the loop.
until will execute the command script.php repeatedly until it returns zero (i.e. success). If the script returned 0 to indicate "not done" instead of 1, you could use while instead of until.
The output of the PHP script would go to standard out and standard error, so you could wrap the invocation of the shell script with some I/O re-direction to stash the output in a file. For example, if the script is called loop.sh, you'd just run:
./loop.sh > output.txt
though of course you can control the output file directly in the PHP script; you just have to remember to open the file for append.
You might want to ask a separate question about how to debug the PHP memory leak though :-)
You can write:
#!/bin/bash
/usr/bin/php prg.php # run the script.
while [ $? != 0 ]; do # if ret val is non-zero => err occurred. So rerun.
/usr/bin/php prg.php
done
I implemented the solution in PHP instead:
do {
$code = 1;
$output = array();
$file = realpath(dirname(__FILE__)) . "/script.php";
exec("/usr/bin/php {$file}", $output, $code);
$error = false;
foreach ($output as $line) {
if (stripos($line, 'error') !== false) {
$error = true;
}
echo $line . "\n";
}
} while ($code != 0 && !$error);
Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A work around is to have a second PHP (or Bash/etc) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php '.escapeshellarg($resultFile).' > /dev/null 2>&1 &');
// do other stuff...
// Sometime later when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And the command_runner.php would look like:
$outputFile = $argv[1]; // $argv[0] is the script name; the first argument is $argv[1]
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
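The same temp-file handshake can be sketched entirely in the shell: background the command inside a subshell that writes its exit status to the result file, then poll the file from the foreground. The path and the failing stand-in command (`exit 3` playing 'badcommand') are made up for the demo:

```shell
#!/bin/sh
# The backgrounded wrapper writes the command's exit status to a file
# that the foreground process can poll, recovering the status that a
# plain 'cmd &' would otherwise discard.
result=/tmp/result001
: > "$result"                                   # create/truncate, still empty
( sh -c 'exit 3'; echo $? > "$result" ) > /dev/null 2>&1 &
# ...foreground work happens here...
while [ ! -s "$result" ]; do sleep 1; done      # wait until a status appears
echo "background exit status: $(cat "$result")"
```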
Not with the exec() method. When you send a process to the background, the shell returns 0 to the exec call immediately and PHP continues execution; there's no way to retrieve the final result.
pcntl_fork() however will fork your application, so you can run exec() in the child process and leave it waiting until it finishes. Then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid()
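The fork-and-wait idea has a direct shell analogue, which shows why the status is recoverable at all: `&` forks, `$!` is the child's PID, and `wait` retrieves the exit status that a fire-and-forget exec() throws away. `exit 7` stands in for the real command:

```shell
#!/bin/sh
# '&' forks the child, $! records its PID, and 'wait' blocks until it
# exits, making its status available in $? -- the same flow as
# pcntl_fork() + pcntl_waitpid() in PHP.
sh -c 'exit 7' &          # stand-in for the real long-running command
pid=$!
wait "$pid"
status=$?
echo "child status: $status"
```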
Just my 2 cents: how about using the || and && bash operators?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
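A minimal sketch of that trick with a command that succeeds (`ls /` here, standing in for the real one). One caveat: `a && b || c` runs `c` if *either* `a` or `b` fails, so keep the success branch trivial, as the `touch` is assumed not to fail:

```shell
#!/bin/sh
# On success touch the ok-marker, otherwise the bad-marker; afterwards
# the caller just tests which file exists to learn the outcome.
rm -f /tmp/res_ok /tmp/res_bad
ls / > /dev/null 2>&1 && touch /tmp/res_ok || touch /tmp/res_bad
if [ -f /tmp/res_ok ]; then echo "command succeeded"; else echo "command failed"; fi
```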