I have the following code in place. It provides the information I need; however, I would like to assign the output to variables.
$cmd = "ssh machine 'cat /usr/local/reports/file.dat | awk -F'[[:space:]][[:space:]][[:space:]]*' '{print \"<tr><td>\"$2\"</td><td>\"$3\"</td></tr>\"}'";
system($cmd);
This correctly runs and produces a table with the 2nd and 3rd columns from the file. However, I would now like to assign the columns to variables for each line read from the file.
Any ideas?
system() always sends the command's output directly to the browser or console. You could use output buffering to capture it, but you should use shell_exec() instead:
$result = shell_exec( $cmd );
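From there you can split the result yourself. If you would rather keep awk out of it entirely, here is a minimal sketch that pulls the raw file over ssh and does the column splitting in PHP; the \s{2,} pattern is an assumption that mirrors the question's awk field separator (two or more whitespace characters):
// fetch the raw file contents from the remote machine
$raw = (string) shell_exec("ssh machine cat /usr/local/reports/file.dat");
foreach (explode("\n", trim($raw)) as $line) {
    // split on runs of two or more whitespace characters,
    // mirroring awk -F '[[:space:]][[:space:]][[:space:]]*'
    $cols = preg_split('/\s{2,}/', trim($line));
    if (count($cols) < 3) {
        continue; // skip blank or short lines
    }
    $col2 = $cols[1]; // awk's $2
    $col3 = $cols[2]; // awk's $3
    echo "<tr><td>$col2</td><td>$col3</td></tr>\n";
}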
A few suggestions:
Use a heredoc to keep the command readable.
Avoid cat /usr/local/reports/file.dat; awk can read the file directly, so there is no need for the cat command.
If you want to check the return status, use the exec() function.
shell_exec() returns all of the output stream as a string. exec() returns the last line of the output by default, but can provide all of the output as an array specified in the second parameter.
Here is a code snippet:
<?php
$cmd =<<<EOF
ssh user@host "awk -F'[[:space:]][[:space:]][[:space:]]*' '{
print \"<tr><td>\" $2 \"</td><td>\" $3 \"</td></tr>\"
}
' /usr/local/reports/file.dat 2>&1"
EOF;
/*
execute the command in the 1st argument,
save its output in the array in the 2nd argument,
and store the exit status in the 3rd argument
*/
exec($cmd, $out, $return);
if ($return == 0) {
    print_r($out);
    /* in your case you could just do:
       echo implode(PHP_EOL, $out);
    */
} else {
    /* failed to execute the command;
       do some error handling */
    die('Failed to execute command: ' . $cmd);
}
?>
I'm trying to compile a Sass (.scss) file's contents using shell_exec or exec, not proc_open. Don't ask me why I don't just pass the file itself or use proc_open; the goal here is to pass the contents via stdin, piped with echo.
I think there are some characters in the string that break the command but I can't figure out which. I'm on Ubuntu 14.04, running PHP 5.6 and in this case CLI.
You can run this to see for yourself (will need Ruby and Sass installed):
sudo apt-get install ruby && sudo gem install sass
<?php
/** Should work with '$large = true' and '$download = false' **/
// to prove that a small file DOES compile via stdin
$large = true;
// to prove that it's compilable as a file, rather than stdin
$download = false;
$domain = "http://test.fhmp.net";
$file = $large ? 'large' : 'small';
// grab a valid .scss file
$input = file_get_contents("$domain/$file.scss");
if($download){
// create temp file
$temp = tempnam(sys_get_temp_dir(), '');
file_put_contents($temp, $input);
// compile temp file
var_dump(shell_exec("sass --scss '$temp' 2>&1"));
// delete temp file
#unlink($temp);
} else {
// attempt to escape it
$esc = escapeshellarg($input);
// dump the results of the shell call
var_dump(shell_exec("echo $esc | sass --scss --stdin 2>&1"));
}
You should try:
$esc = '$'.escapeshellarg($input);
The escapeshellarg() function wraps the string in single quotes and quotes/escapes any existing single quotes, which means that escapeshellarg("where's the quote") becomes 'where'\''s the quote'. You cannot echo single quotes inside single quotes in the (Bash) shell; see: How to escape a single quote in single quote string in Bash?
I think the problem here was that the command was too long when interpolating the entire input string. The solution is to use proc_open or popen to write to standard input rather than piping with |.
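For reference, a minimal sketch of the proc_open approach, assuming the sass binary is on the PATH:
$descriptors = [
    0 => ['pipe', 'r'], // stdin: we write the .scss source here
    1 => ['pipe', 'w'], // stdout: compiled CSS
    2 => ['pipe', 'w'], // stderr: compiler errors
];
$proc = proc_open('sass --scss --stdin', $descriptors, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], $input); // no shell quoting or argument-length limits involved
    fclose($pipes[0]);         // close stdin so sass knows the input is complete
    $css = stream_get_contents($pipes[1]);
    $err = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
    var_dump($css !== '' ? $css : $err);
}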
I have a PHP script that retrieves 200 lines from a file by executing a command in Bash using backtick operators. Here's what the code looks like:
$endline = `(shell execution that returns a number here)`;
$line = $endline - "200";
$lines = "sed -n '".$line.", ".$endline." p' log.txt";
echo $lines;
$file = `$lines`;
echo $file;
This code returns $lines as sed -n '1800, 2000 p' log.txt, but $file doesn't return any results. When directly using sed -n '1800, 2000 p' log.txt in a Bash terminal, I get the expected results.
What am I doing wrong here? Do the ' characters have to be escaped?
Edit: the shell command was adding a space after the number, which caused the sed range to be misread.
My guess is that it's $eof or that your path (log.txt) is not appropriate.
I copied and pasted your code, and it works with the following tweaks:
syntax error fixed (added ; to echo $lines)
changed $eof to $endline (though you may not need to if $eof is valid)
ensured that log.txt is a valid path (this is most likely your error)
Otherwise, it ran as expected.
The reason it would work in Bash but not in PHP is that their "working directory" is not necessarily the same.
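One quick way to rule that out is to check or pin the working directory before shelling out; the /path/to/logs path below is purely illustrative:
echo getcwd() . PHP_EOL; // see where PHP is actually running from
chdir('/path/to/logs');  // or pin the directory explicitly (illustrative path)
$file = `$lines`;        // now log.txt resolves relative to /path/to/logs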
I have many files containing PHP serialized data in which I have to replace some strings with others. The Linux host doesn't have PHP installed. The problem is adjusting the serialized length prefix of the modified string to the correct size.
I tried something like this, to replace the /share path with /opt:
sed -re 's~s:([0-9]+):"/share([^"]*)~s:int(\1-2):/opt\2~g' file
but the resulting file is bad: the lengths come out as the literal expression int(size - 2), because sed cannot evaluate arithmetic in the replacement.
Any ideas?
This solution isn't ideal, but you could use perl:
my $line;
while ($line = <STDIN>) {
$line =~ s~s:([0-9]+):"/share([^"]*)~"s:".($1-2).":\"/opt$2"~ge;
print $line;
}
Hopefully I've understood your requirements correctly. Here's an example:
php -r 'echo serialize(array("/share/foo")) . "\n";'
a:1:{i:0;s:10:"/share/foo";}
php -r 'echo serialize(array("/share/foo")) . "\n";' | perl replace.pl
a:1:{i:0;s:8:"/opt/foo";}
EDIT: Here's a modified script to edit the file in-place with variable search and replace strings.
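One way such a script might look, as an in-place one-liner with the search and replace strings passed via environment variables (FROM and TO are illustrative names) and the length delta computed instead of hard-coded:
FROM=/share TO=/opt perl -pi -e 's~s:(\d+):"\Q$ENV{FROM}\E~"s:" . ($1 - length($ENV{FROM}) + length($ENV{TO})) . ":\"$ENV{TO}"~ge' file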
Is it possible to pipe data using Unix pipes into a command-line PHP script? I've tried
$> data | php script.php
But the expected data did not show up in $argv. Is there a way to do this?
PHP can read from standard input, and also provides a nice shortcut for it: STDIN.
With it, you can use things like stream_get_contents and others to do things like:
$data = stream_get_contents(STDIN);
This will just dump all the piped data into $data.
If you want to start processing before all data is read, or the input size is too big to fit into a variable, you can use:
while (!feof(STDIN)) {
    $line = fgets(STDIN);
    // process $line here as it arrives
}
STDIN is just a shortcut for $fh = fopen("php://stdin", "r");.
The same methods can be applied to reading and writing files, and TCP streams.
As I understand it, $argv will show the arguments of the program, in other words:
php script.php arg1 arg2 arg3
But if you pipe data into PHP, you will have to read it from standard input. I've never tried this, but I think it's something like this:
$fp = fopen("php://stdin", "r");
// read from $fp as if it were a file handle
If your data is all on one line, you can also use either the -F or -R flag (-F reads and executes the file following it for each line of input; -R executes the given code for each line of input). If you use these flags, the string that has been piped in will appear in the (regular) global variable $argn.
Simple example:
echo "hello world" | php -R 'echo str_replace("world","stackoverflow", $argn);'
You can pipe data in, yes, but it won't appear in $argv; it will go to stdin. You can read it in several ways, including fopen('php://stdin','r').
There are good examples in the manual.
This worked for me:
stream_get_contents(fopen("php://stdin", "r"));
I came upon this post looking to make a script that behaves like a shell script, executing another command for each line of the input, e.g.:
ls -ln | awk '{print $9}'
If you're looking to make a php script that behaves in a similar way, this worked for me:
#!/usr/bin/php
<?php
$input = stream_get_contents(fopen("php://stdin", "r"));
$lines = explode("\n", $input);
foreach ($lines as $line) {
    if ($line === '') {
        continue; // skip the trailing empty element explode() leaves behind
    }
    // escapeshellarg() protects against quotes and spaces in the line
    $command = "php next_script.php " . escapeshellarg($line);
    $output = shell_exec($command);
    echo $output;
}
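Usage then mirrors the awk example above, assuming the script is saved as each_line.php (a hypothetical name) and made executable:
chmod +x each_line.php
ls -ln | ./each_line.php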
If you want it to show up in $argv, try this:
echo "Whatever you want" | xargs php script.php
That will convert whatever goes into standard input into command-line arguments.
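Note that xargs splits its input on whitespace by default, so inside script.php each word arrives as a separate argument; a quick way to see this (the var_dump line is illustrative):
<?php
// script.php: dump everything after the script name
var_dump(array_slice($argv, 1)); // ["Whatever", "you", "want"]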
The best option is to use the -r option and take the data from stdin; for example, I use it to easily decode JSON using PHP.
This way you don't have to create a physical script file.
It goes like this:
docker inspect $1|php -r '$a=json_decode(stream_get_contents(STDIN),true);echo str_replace(["Array",":"],["Shares"," --> "],print_r($a[0]["HostConfig"]["Binds"],true));'
This piece of code will display the folders shared between the host and a container.
Replace $1 with the container name, or put the command in a Bash function, e.g. displayshares() { ... }
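For readability, here is the same one-liner unpacked into a script; the logic is unchanged:
#!/usr/bin/php
<?php
// decode the JSON that `docker inspect` writes to stdout
$a = json_decode(stream_get_contents(STDIN), true);
// pull the bind mounts out of the first inspected object
$binds = print_r($a[0]["HostConfig"]["Binds"], true);
// relabel print_r's "Array" header and ":" separator for display
echo str_replace(["Array", ":"], ["Shares", " --> "], $binds);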
I needed to take a CSV file and convert it to a TSV file. Sure, I could import the file into Excel and then re-export it, but where's the fun in that when piping the data through a converter means I can stay on the command line and get the job done easily!
So, my script (called csv2tsv) is
#!/usr/bin/php
<?php
while (($line = fgets(STDIN)) !== false) {
    // rtrim drops the trailing newline so it doesn't end up in the last field
    echo implode("\t", str_getcsv(rtrim($line, "\r\n"))), PHP_EOL;
}
I chmod +x csv2tsv.
I can then run cat data.csv | csv2tsv > data.tsv and I now have my data as a TSV!
OK. No error checking (is the data an actual CSV file?), etc., but the principle works well.
And of course, you can chain as many commands as you need.
If you want to expand on this idea, then how about the ability to include additional options in your command?
Simple!
#!/usr/bin/php
<?php
$separator = $argv[1] ?? "\t";
while (($line = fgets(STDIN)) !== false) {
    // rtrim drops the trailing newline so it doesn't end up in the last field
    echo implode($separator, str_getcsv(rtrim($line, "\r\n"))), PHP_EOL;
}
Now I can override the default separator, changing it from a tab to something else. A | maybe!
cat data.csv | csv2tsv '|' > data.psv
Hope this helps and allows you to see how much more you can do!