#!/bin/bash
cd /maintenance;
for (( i=1;i<1000;i++)); do
php -q dostuff.php $i
done
I use this shell script to call dostuff.php and pass $i to it as an argv value. The script connects to a web service that returns results 50 items at a time, and the $i value is the page number. I have no way of knowing how many times it needs to be called (how many pages) until I get a response code back from cURL inside that script, which I test for. I need to pass my own response code back to the shell script to make it stop looping; it will never get to 1000 iterations, that was just a quick loop I made.
If I use exec("php -q dostuff.php $i", $output, $return_var) how do I tell the script to keep executing and passing the incremented $i value until my php script exits with a response code of 0?
There has got to be a better way. Maybe a while loop? I'm just not that good with this syntax.
I have to start at page 1 and repeat until page XXX, incrementing by 1 each iteration. When there are no more results I can detect that in dostuff.php and exit(0). What is the best way to implement this in the shell script?
Thanks!
You can check the return value of the script and break the loop if it isn't what you expect.
Usually a script returns 0 when it ran successfully and something else otherwise, so assuming your script respects this convention you could do:
#!/bin/bash
cd /maintenance;
for (( i=1;i<1000;i++)); do
php -q dostuff.php $i
if [ $? -ne 0 ]; then break; fi
done
On the other hand, if you want your script to return 0 if the loop shouldn't continue then you should do:
if [ $? -eq 0 ]; then break; fi
Edit: taking the comment into account, here is a simplified version of the script.
If your script returns 0 when it shouldn't be called again, you instead do:
#!/bin/bash
cd /maintenance;
for (( i=1;i<1000;i++)); do
if php -q dostuff.php $i; then break; fi
done
As already suggested in the comments, you might get much better control if you don't wrap the PHP script inside a bash script but instead use the PHP CLI itself as the shell script (PHP works fine as a scripting shell):
#!/usr/bin/php
<?php
for ($i = 1; $i < 1000; $i++) {
// contents of dostuff.php integrated
}
You might also be interested in using STDOUT, STDIN and STDERR:
http://php.net/manual/en/features.commandline.io-streams.php
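As a rough sketch of that idea (fetchPage() here is a hypothetical placeholder for whatever dostuff.php does with cURL; it is not part of the original code), using STDERR for status messages:
#!/usr/bin/php
<?php
for ($i = 1; $i < 1000; $i++) {
    // fetchPage() is hypothetical: it should wrap the cURL call from
    // dostuff.php and return false when the web service has no more pages.
    $items = fetchPage($i);
    if ($items === false) {
        fwrite(STDERR, "No more results after page " . ($i - 1) . PHP_EOL);
        break;
    }
    // process the 50 (or fewer) items from this page here
}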
$command1 = "interfacename -S ipaddress -N nms -P company ";
$command2 = "list search clientclass hardwareaddress Mac address ";
if ( exec( $command1 . " && " . $command2 ) ) {
    echo "successfully executed";
} else {
    echo "Not successfully executed";
}
If command 1 (a CLI query) executes successfully, I want command 2 (which also contains some CLI queries) to be executed next. In the above script, only command 1 is executed; it doesn't show any result for command 2.
I have wasted two days on this without finding any solution.
You can use either a ; or a && to separate the commands. The ; runs both commands unconditionally: if the first one fails, the second one still runs. Using && makes the second command depend on the first: if the first command fails, the second will NOT run. Reference
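For instance, a quick way to see the difference from PHP on a Linux/Unix shell (these are throwaway commands, not your actual CLI tool):
<?php
// '&&': the second command only runs if the first one exits with 0
exec('false && echo "ran"', $out1);
var_dump($out1);   // empty array, the echo never ran

// ';': the second command runs regardless of the first one's exit code
exec('false ; echo "ran"', $out2);
var_dump($out2);   // array(1) { [0]=> string(3) "ran" }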
You can use the shell_exec() PHP function to run a shell command directly from your script.
Syntax: string shell_exec (string $cmd)
Example:
$output = shell_exec('ls -lart');
var_dump($output); #Showing the outputs
You can use multiple conditions in a single command line.
Example:
$data = "rm a.txt && echo \"Deleted\"";
$output = exec($data);
var_dump($output);
if($output=="Deleted"){
#Successful
}
In the above example the string "Deleted" is assigned to $output when the file is deleted successfully. Otherwise an error/warning or an empty string is assigned to the $output variable, so you should base your check on the $output string.
Here is the documentation of shell_exec()
Note: the output returned by shell_exec() includes a trailing newline character.
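So if you go the shell_exec() route instead of exec(), trim the result before comparing, for example:
<?php
$output = shell_exec('echo "Deleted"');
var_dump($output);               // string(8) "Deleted\n" -- note the trailing newline
if (trim($output) === "Deleted") {
    // Successful
}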
If I understand your question correctly, you want to execute $command1 and then execute $command2 only if $command1 succeeds.
The way you tried, by joining the commands with && is the correct way in a shell script (and it works even with the PHP function exec()). But, because your script is written in PHP, let's do it in the PHP way (in fact, it's the same way but we let PHP do the logical AND operation).
Use the PHP function exec() to run each command and pass three arguments to it. The second argument ($output, passed by reference) is an array variable. exec() appends to it the output of the command. The third argument ($return_var, also passed by reference) is a variable that is set by exec() with the exit code of the executed command.
The convention on Linux/Unix programs is to return 0 exit code for success and a (one byte) positive value (1..255) for errors. Also, the && operator on the Linux shell knows that 0 is success and a non-zero value is an error.
Now, the PHP code:
$command1 = "ipcli -S 192.168.4.2 -N nms -P nmsworldcall ";
$command2 = "list search clientclassentry hardwareaddress 00:0E:09:00:00:01";
// Run the first command
$out1 = array();
$code1 = 0;
exec($command1, $out1, $code1);
// Run the second command only if the first command succeeded
$out2 = array();
$code2 = 0;
if ($code1 == 0) {
exec($command2, $out2, $code2);
}
// Output the outcome
if ($code1 == 0) {
if ($code2 == 0) {
echo("Both commands succeeded.\n");
} else {
echo("The first command succeeded, the second command failed.\n");
}
} else {
echo("The first command failed, the second command was skipped.\n");
}
After the code ends, $code1 and $code2 contain the exit codes of the two commands; if $code1 is not zero, the first command failed and $code2 stays at zero because the second command was never executed.
$out1 and $out2 are arrays that contain the output of the two commands, split on lines.
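If you ever need one of those outputs as a single block of text instead of an array, you can join the lines yourself, e.g. continuing the example above:
echo implode(PHP_EOL, $out1) . PHP_EOL;   // the first command's output printed as plain text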
I'm not sure about simultaneous execution, but I can show one command depending on another command's execution. Here the first command goes back to the root to clear any previously set path, the second changes into my project path, and the third runs npm install (for Angular):
$path = "D:/xampp/htdocs/tests/omni-files-upload/aa-test/src";
$command_one = "cd /";
$command_two = "cd ".$path;
$command_three = "npm install";
#exec($command_one."&& ".$command_two."&& ".$command_three);
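A variation on that (an untested sketch, reusing the variables above) that also captures npm's output and exit code so you can tell whether the install actually worked:
$output = array();
$code = 1;
// 2>&1 folds error output from the command chain into $output as well
exec($command_one . " && " . $command_two . " && " . $command_three . " 2>&1", $output, $code);
if ($code === 0) {
    echo "npm install finished successfully\n";
} else {
    echo "npm install failed:\n" . implode("\n", $output) . "\n";
}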
I have this line of code to execute a background task (converting several PNGs to JPG):
exec("nohup path/to/php path/to/convertToJpg.php >> path/to/convert_to_jpg.log > /dev/null &");
Now I am writing the convertToJpg.php script and I can't figure out how to output information from there so that it will be logged into the convert_to_jpg.log.
When I try to google it, all I come up with is how to call a PHP script from exec or shell_exec, since the words used to describe the two situations are almost the same.
For Clarification
An extra quote got into my code when copying it over to SO. I have fixed it. In my original code the convertToJpg.php is being called as expected, confirmed by error_logs placed in it to check.
A couple answers have pointed to the $output argument in exec(). As I understand it, that totally defeats the purpose of shell redirection using the >> path/to/convert_to_jpg.log.
My question is not how to get the output from the exec() command, but what code I use inside convertToJpg.php to actually output (verb) something so that it ends up in convert_to_jpg.log.
More Clarification
If I call
exec("nohup path/to/php path/to/convertToJpg.php >> path/to/convert_to_jpg.log > /dev/null &");
or
$results = shell_exec("path/to/php path/to/convertToJpg.php > /dev/null");
echo $results;
or
$results = "";
exec("path/to/php path/to/convertToJpg.php > /dev/null", $results);
print_r($results);
It doesn't matter which one.
Here is convertToJpg.php
<?php
echo "Will this be in $results?"; // No, this did not end up in results.
error_log("error logs are being logged, so I know that this php file is being called");
//I have tested echo, but that does not work.
//What php function do I use so that a string will be captured in the $output of an exec() call?
?>
$output = "";
$return_var = "";
exec("nohup path/to/php path/to/convertToJpg.php >> path/to/convert_to_jpg.log > /dev/null &", $output, $return_var);
$output
If the output argument is present, then the specified array will be filled with every line of output from the command. Trailing whitespace, such as \n, is not included in this array. Note that if the array already contains some elements, exec() will append to the end of the array. If you do not want the function to append elements, call unset() on the array before passing it to exec().
$return_var
If the return_var argument is present along with the output argument, then the return status of the executed command will be written to this variable.
http://php.net/manual/en/function.exec.php
Ok, I figured it out!
To answer the question plain and simply, use either echo or print.
This was the first thing I tried, but it didn't work, which made me think it was some other function that I was supposed to call. (I obviously haven't worked with this much)
The real problem I was having was the > /dev/null, which was discarding all output. Once I deleted that, it worked fine. Explained here: What is /dev/null 2>&1?
I had at some point done a copy/paste without really understanding what that did...
And because I was blaming the wrong thing I ended up with this off base question.
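For anyone landing on the same problem: the corrected call simply drops the stray > /dev/null (2>&1 is optional, if you also want errors in the log); the paths are placeholders exactly as in the question:
exec("nohup path/to/php path/to/convertToJpg.php >> path/to/convert_to_jpg.log 2>&1 &");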
I'm calling a php script using exec and I'm trying to make a simple log.
Currently I have this :
exec("php script.php $options > temp/log.txt");
If I execute it once the result is written, but if I execute it multiple times the file is always replaced by the last call.
Is there a way to just append the output at the end of the .txt file, without replacing the whole file?
Thanks
This has nothing to do with php, you are looking for a shell feature:
exec("php script.php $options >> temp/log.txt");
Note the double >> in there. It appends to the redirection target instead of overwriting it.
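If you also want PHP warnings and errors from script.php to land in the same log, you can append stderr as well, e.g.:
exec("php script.php $options >> temp/log.txt 2>&1");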
I want to process my access log in PHP: checking which IPs are leeching content and so on, everything in PHP, running as CLI. I tried the following, but it never gets past the exec of tail -f, so I can't actually process the data. Any help appreciated.
$log = '/var/log/lighttpd/web.org-access.log';
$pipefile = '/www/web.org/tmp/fifo.tmp';
if (file_exists($pipefile))
    if (!unlink($pipefile))
        die('unable to remove stale file');

umask(0);
if (!posix_mkfifo($pipefile, 0777))
    die('unable to create named pipe');

exec("tail -f $log > $pipefile 2>&1 &"); // I tried nohup and so on...
//exec("varnishncsa $log > $pipefile 2>&1 &"); // will go here instead of tail -f
echo "never comes here"; //never shows up
If possible, I want to do it just in PHP, no bash/tcsh scripting (I know how to do it using those).
Thanks.
If you want exec to start a background process, you will have to redirect its output.
Quote from the manual:
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Notice the full syntax description:
string exec ( string $command [, array &$output [, int &$return_var ]] )
Source: http://www.php.net/manual/en/function.exec.php
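If the goal is just to process the log lines in PHP, one way around the blocking problem is to drop the named pipe entirely and read tail's output directly through popen(). This is a different approach from the fifo in the question, sketched here untested:
<?php
$log = '/var/log/lighttpd/web.org-access.log';

// popen() gives us a read handle on tail's stdout; no fifo needed
$fh = popen('tail -f ' . escapeshellarg($log) . ' 2>&1', 'r');
if ($fh === false)
    die('unable to start tail');

// runs until tail exits; with -f that is effectively forever
while (($line = fgets($fh)) !== false) {
    // inspect $line here: check IPs, count hits, and so on
}
pclose($fh);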
Hi, I have a shell script which should loop based on the return code of a PHP script:
x=1
while [[ "$x" != 5 ]]
do
    echo "Welcome $x"
    php test.php
    x=$?
done
And the PHP code:
<?php
echo "Testdfdf test".PHP_EOL;
exit(4);
So whenever I get 5 back from the PHP script, I want to quit the loop.
But sometimes I get:
./myshell: line 7: 20529 Segmentation fault php test.php
Should it loop without problem?
Probably because of this error which affects both Ubuntu and Debian... https://bugs.launchpad.net/ubuntu/+source/php5/+bug/343870
It should and it does, but I have no clue why PHP is ending with a segfault.
Your shell while loop will loop forever, because your PHP script returns 4 to the shell while the loop checks for != 5, which means the exit condition is never met. What is it you actually want to do? Unless it's necessary, I would advise doing everything in PHP (or everything in shell), but try not to intermingle the two.
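If you take that advice and keep the loop in PHP, a rough sketch might look like this (doWork() is a hypothetical placeholder for whatever test.php actually does, assumed to return 5 when there is nothing left to do):
<?php
$x = 1;
while ($x != 5) {
    echo "Welcome $x" . PHP_EOL;
    $x = doWork();   // hypothetical helper replacing the separate test.php call
}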