I know this is simple, but I just can't figure it out.
I have a bunch of files output by "svn st" that I want PHP to syntax-check on the command line.
This outputs the list of files: svn st | awk '{print $2}'
And this checks a php script: php -l somefile.php
But this (and variants of it) doesn't work: svn st | php -l '{print $2}'
Any ideas? Thanks!
Use xargs:
svn st | awk '{print $2}' | xargs -L 1 php -l
The xargs -L 1 command reads items from standard input, one per line, and runs the given command for each item separately. See the xargs(1) man page for more info.
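If the working copy also contains non-PHP files, a grep in the middle keeps php -l from complaining about them; a sketch:
svn st | awk '{print $2}' | grep '\.php$' | xargs -L 1 php -l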
Related
I'm trying to wipe sensitive commands from my server's shell history daily via the Laravel scheduler.
When I run this code:
$schedule->exec(`for h in $(history | grep mysql | awk '{print $1-N}' | tac) ; do history -d $h ; done; history -d $(history | tail -1 | awk '{print $1}') ; history`)->{$interval}()->emailOutputTo($email)->environments('prod');
I get this error:
⚡️ app2020 php artisan schedule:run
ErrorException
Undefined variable: h
at app/Console/Kernel.php:44
Note that this is a perfectly working command on its own:
for h in $(history | grep mysql | awk '{print $1-N}' | tac) ; do history -d $h ; done; history -d $(history | tail -1 | awk '{print $1}') ; history
Note:
I know I can achieve this by adding the command above to the crontab, but the goal is to keep all cron activities organized in the Laravel project.
Since bash and PHP both use $ for their variables, how would one go about making a similar bash command work?
I'll take any hints.
You're wrapping your command in backticks, which will both interpolate variables and execute the value as a command. You want neither of these things.
Use single quotes to build the command, and then pass it to exec().
$cmd = 'for h in $(history | grep mysql | awk \'{print $1-N}\' | tac) ;';
$cmd .= ' do history -d $h ;';
$cmd .= 'done;';
$cmd .= 'history -d $(history | tail -1 | awk \'{print $1}\') ;';
$cmd .= 'history';
$schedule
->exec($cmd)
->{$interval}()
->emailOutputTo($email)
->environments('prod');
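Equivalently, a nowdoc (a heredoc with a quoted label) lets you keep the shell line verbatim with no escaping, since nowdocs never interpolate:
$cmd = <<<'CMD'
for h in $(history | grep mysql | awk '{print $1-N}' | tac) ; do history -d $h ; done; history -d $(history | tail -1 | awk '{print $1}') ; history
CMD;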
For a somewhat more efficient approach, you might try just editing the history file directly.
$cmd = 'sed -i /mysql/d "$HOME/.bash_history"';
There's no need to erase the last entry in the file, as it's a history of interactive shell commands.
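Dropped into the same scheduler chain, that might look like:
$schedule
->exec('sed -i /mysql/d "$HOME/.bash_history"')
->{$interval}()
->emailOutputTo($email)
->environments('prod');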
Obviously the best thing to do would be not putting "sensitive" things in the command line. For MySQL, this can be done by adding credentials to the .my.cnf file.
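A typical ~/.my.cnf would look something like this (the values are placeholders; keep the file readable only by its owner, e.g. chmod 600):
[client]
user=dbuser
password=your-secret-password
With that in place, mysql can be invoked without putting credentials on the command line at all.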
I wrote a very convoluted, very hack-job PHP CLI script that receives and parses JSON of varying structure, depth and content. For that reason, at the time, I found it easiest to do the parsing using PHP's shell_exec() and cat | jq | grep.
Sometimes, rarely, on certain input it gives me the message Error: writing output failed: Broken pipe, which is the last message I see in cli output before the script dies. However, even when it does do that, the data is still parsed out correctly, for all the little good it does me.
I isolated the problematic piece of code to:
$jq1='cat '.$randfile.' | jq \'.\' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}"';
$jq1=trim(shell_exec($jq1));
And tried to debug it by seeing what it executes. The first line is the shell_exec argument, echoed before execution, the second line is the result of shell_exec.
Command: cat 5ca15f21.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: standalone
Command: cat 5ca59379.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: season
Error: writing output failed: Broken pipe
Command: cat 5ca7d271.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: extended
Command: cat 5ca7d7a8.json | jq '.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}";
Result: season
(I have seen the error of my lazy ways and will be rewriting that whole section, but back then I was young and inexperienced and impatient. I'd still like to understand what's going wrong, where, and why.)
What would make it do that sometimes? The input is always jq's pretty-printed JSON, of varying structure.
Even if it does get the broken pipe message, the necessary value is still parsed out and stored in the variable. What causes it to die then? I would like to know for the future if there's a way to make PHP disregard the [non-fatal] error and go on executing.
Why does the shell command that produces the broken pipe error message in shell_exec behave no differently when invoked manually in bash? Where is the broken pipe and what makes it so broken?
Edit:
The cat command can be eliminated, since jq can read the file itself, so only the grep stages need to be piped. I would write it like this:
$jq1 = 'jq \'.\' '.$randfile.' | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}"';
$jq1 = trim(shell_exec($jq1));
This drops the redundant cat by handing the file straight to jq and leaves the rest of the pipeline unchanged.
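For reference, the same pipeline run directly in a shell, using one of the file names from the debug output above, would be:
jq '.' 5ca15f21.json | grep "\[" -m 1 | grep -Po --color=none "\w{3,15}"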
I'm trying to get the size of my VG (volume group) disk space.
When I run the command in my shell it returns the correct value without any problem, but when I include it in a PHP file I get a NULL value back.
I have seen a lot of threads discussing this, but none of the answers solved my problem :(
Which command works? This one:
$variable = shell_exec("df -h | grep 'username' | awk '{print $3}'");
I don't think it's a permission problem, because this command works.
For the test, the script has 777 and root permissions.
The code that has the problem:
$test = "VG Size";
$tailletotaleduserveur = shell_exec("vgdisplay vghome | grep ".$test." | awk '{print $3}'");
echo json_encode($tailletotaleduserveur);
The actual result in the network tab: null -> http://prntscr.com/nehsmb
It should return "5.85".
Thanks for the help :)
Your command fails in a shell as well:
bash$ vgdisplay vghome | grep VG Size | awk '{print $3}'
grep: Size: No such file or directory
You have to quote 'VG Size':
$tailletotaleduserveur = shell_exec("vgdisplay vghome | grep '".$test."' | awk '{print $3}'");
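If you would rather not hand-place the quotes, PHP's escapeshellarg() wraps the argument in single quotes for you; a variant of the same line:
$tailletotaleduserveur = shell_exec("vgdisplay vghome | grep ".escapeshellarg($test)." | awk '{print $3}'");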
The final answer to my problem :) :
$test = 'VG Size';
$tailletotaleduserveur = shell_exec("/usr/bin/sudo vgdisplay vghome | grep '".$test."' | awk '{print $3}'");
So that other guy ;) was partly right ;) I will accept your answer, thanks.
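For the record, running sudo from shell_exec() only works non-interactively if the user PHP runs as has a passwordless rule for that one command, e.g. a sudoers entry along these lines (assuming PHP runs as www-data and vgdisplay lives in /usr/sbin):
www-data ALL=(root) NOPASSWD: /usr/sbin/vgdisplay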
I have just built 3 different versions of PHP from source on an Ubuntu server (alongside NGINX and MySQL 5.7). I am looking for a way to run php --ini for the currently running version. I know I have to add the location to PATH in .bashrc so I don't have to type the full path.
I have added this to my .bashrc which allows me to get the currently running PHP version, which then allows me to run the command:
# parallels#ubuntu:~$ ps aux | grep php
# root 6948 0.0 0.2 153724 4620 ? Ss 16:48 0:00 php-fpm: master process (/opt/php-7.0.0/etc/php-fpm.conf)
PHP_VERSION=$(ps aux | grep -o php-[[:digit:]].[[:digit:]].[[:digit:]])
export PATH="/bin:/usr/bin:/opt/$PHP_VERSION/bin:/sbin"
It works, but I am a bash novice and I'm thinking there might be a better way to do it. Would I be correct?
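A lightly tightened version of that grep, as a sketch (dots escaped so they only match literal dots, and head -n 1 in case several php-fpm processes match):
PHP_VERSION=$(ps aux | grep -o 'php-[0-9]\+\.[0-9]\+\.[0-9]\+' | head -n 1)
export PATH="/bin:/usr/bin:/opt/$PHP_VERSION/bin:/sbin"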
PHP_VERSION=$(php -v | tail -r | tail -n 1 | cut -d " " -f 2 | cut -c 1-3)
cd /usr/local/etc/php/$PHP_VERSION/
# cd /usr/local/etc/php/7.1/
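Note that tail -r is BSD/macOS; with GNU coreutils the equivalent reversal is tac, so the same idea on Linux would be:
PHP_VERSION=$(php -v | tac | tail -n 1 | cut -d " " -f 2 | cut -c 1-3)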
This works from within PHP:
<?php
echo PHP_VERSION;
You can get it in bash, like
PHP_VERSION=$(php -r "echo PHP_VERSION;")
Here is a list of all of PHP's predefined constants.
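If only the Major.Minor part is wanted (as in some of the other snippets here), the related constants PHP_MAJOR_VERSION and PHP_MINOR_VERSION can be combined the same way:
PHP_VERSION=$(php -r 'echo PHP_MAJOR_VERSION.".".PHP_MINOR_VERSION;')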
I got it to work with the following commands:
# Full version
php -v | head -n 1 | cut -d " " -f 2
# Major.Minor version
php -v | head -n 1 | cut -d " " -f 2 | cut -f1-2 -d"."
You should be able to get it done with awk:
php -v | awk 'NR<=1{ print $2 }'
This prints the second column from the first row of input.
I have a php script that runs via a cron job.
I have an exec command in the script like so:
exec("ps -u bob -o user:20,%cpu,cmd | awk 'NR>1' | grep vlc | tr -s ' ' | cut -d ' ' -f 2",$cpu,$return)
This gets me the CPU usage of a process run by a specific user, if the process exists. When run from the command line I get, say, 21, or nothing at all, depending on whether the process is running. However, when running via the PHP script, I get the following:
[0] => bob 0.0 /bin/sh -c php /home/bob/restart.php bob
[1] => bob 0.0 php /home/bob/restartStream.php bob
[2] => bob 0.0 sh -c ps -u bob -o user:20,%cpu,cmd | awk NR
It seems to be returning all the recent commands executed as opposed to the result of the command executed.
I have seen some posts which show the use of 2>&1, which I believe redirects stderr to stdout or something similar. However, I have tried this in my command like so:
ps -u bob -o user:20,%cpu,cmd | awk 'NR>1' | grep vlc | tr -s ' ' | cut -d ' ' -f 2 2>&1
But it does not seem to make a difference. Can anyone give me any pointers as to why this is occurring and what can be done to resolve it?
You need to clear out $cpu before you call exec. It appends the new output to the end of the array, it doesn't overwrite it.
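A minimal illustration of that appending behaviour (with stand-in echo commands, not the actual cron script):
$out = [];
exec('echo first', $out);   // $out is now ['first']
exec('echo second', $out);  // $out is now ['first', 'second'], appended rather than replaced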
You can also get rid of grep, tr, and cut, and do all the processing of the output in awk:
$cpu = array();
exec("ps -u bob -o user:20,%cpu,cmd | awk 'NR>1 && /vlc/ && !/awk/ {print $2}'",$cpu,$return);
The !/awk/ keeps it from matching the awk line, since that contains vlc.