Execute Long Multi-Command Bash Lines in the Laravel Scheduler - php

I'm trying to wipe sensitive commands out of my server's shell history daily via the Laravel scheduler.
When I run this code:
$schedule->exec(`for h in $(history | grep mysql | awk '{print $1-N}' | tac) ; do history -d $h ; done; history -d $(history | tail -1 | awk '{print $1}') ; history`)->{$interval}()->emailOutputTo($email)->environments('prod');
I get this error:
⚡️ app2020 php artisan schedule:run
ErrorException
Undefined variable: h
at app/Console/Kernel.php:44
Note that this is a perfectly working command when run directly in the shell:
for h in $(history | grep mysql | awk '{print $1-N}' | tac) ; do history -d $h ; done; history -d $(history | tail -1 | awk '{print $1}') ; history
Note:
I know I could achieve this by adding the command above to the crontab, but the goal is to keep all cron activity organized inside the Laravel project.
Since bash and PHP both use $ for their variables, how would one go about making a bash command like this work?
Any hints are welcome.

You're wrapping your command in backticks, which will both interpolate variables and execute the value as a command. You want neither of these things.
Use single quotes to build the command, so PHP leaves $h, $1 and the rest of the shell syntax alone, and then pass the string to exec().
$cmd = 'for h in $(history | grep mysql | awk \'{print $1-N}\' | tac) ; ';
$cmd .= 'do history -d $h ; ';
$cmd .= 'done ; ';
$cmd .= 'history -d $(history | tail -1 | awk \'{print $1}\') ; ';
$cmd .= 'history';
$schedule
->exec($cmd)
->{$interval}()
->emailOutputTo($email)
->environments('prod');
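If the escaping becomes hard to read, a nowdoc is a stylistic alternative worth knowing (not required for the fix): like single quotes, a nowdoc never interpolates variables, so the working one-liner can be pasted in verbatim.
// Nowdoc (note the quotes around EOT): PHP leaves $h, $1 and the $(...) substitutions untouched.
$cmd = <<<'EOT'
for h in $(history | grep mysql | awk '{print $1-N}' | tac) ; do history -d $h ; done; history -d $(history | tail -1 | awk '{print $1}') ; history
EOT;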
For a somewhat more efficient approach, you might try just editing the history file directly.
$cmd = 'sed -i /mysql/d "$HOME/.bash_history"';
There's no need to erase the last entry in the file, as it's a history of interactive shell commands.
Obviously, the best thing to do would be to avoid putting "sensitive" things on the command line in the first place. For MySQL, that can be done by adding the credentials to a .my.cnf file.
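Tying the sed variant back into the scheduler from the question, a minimal sketch (reusing $interval and $email from the original code; $HOME here typically resolves to the home directory of whichever user the scheduler runs as):
// Sketch: the sed alternative wired into the same scheduler chain.
// Single quotes keep PHP from expanding $HOME before the shell sees it.
$cmd = 'sed -i /mysql/d "$HOME/.bash_history"';

$schedule
    ->exec($cmd)
    ->{$interval}()
    ->emailOutputTo($email)
    ->environments('prod');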

Related

Shell_exec return value NULL on shell command

I'm trying to get the size of my volume group (VG).
When I run the command in my shell it returns the correct value without any problem, but when I include it in a PHP file it returns NULL.
I have seen a lot of threads discussing this, but none of the answers solved my problem :(
Which command works? This one:
$variable = shell_exec("df -h | grep 'username' | awk '{print $3}'");
I don't think it's a permission problem, because this command works.
For the test, the script has 777 permissions and root privileges.
The code that has the problem:
$test = "VG Size";
$tailletotaleduserveur = shell_exec("vgdisplay vghome | grep ".$test." | awk '{print $3}'");
echo json_encode($tailletotaleduserveur);
The actual result in the network tab: null -> http://prntscr.com/nehsmb
It should return "5.85".
Thanks for the help :)
Your command fails in a shell as well:
bash$ vgdisplay vghome | grep VG Size | awk '{print $3}'
grep: Size: No such file or directory
You have to quote 'VG Size':
$tailletotaleduserveur = shell_exec("vgdisplay vghome | grep '".$test."' | awk '{print $3}'");
Final answer for my problem :) :
$test = 'VG Size';
$tailletotaleduserveur = shell_exec("/usr/bin/sudo vgdisplay vghome | grep '".$test."' | awk '{print $3}'");
So that other guy ;) was partly right ;) I will accept your answer, thanks.
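As a side note (not part of the original exchange), PHP's escapeshellarg() can do this quoting for you, which avoids the problem in the first place:
// Sketch: let escapeshellarg() quote the grep pattern instead of hand-writing the quotes.
$test = 'VG Size';
$cmd = "/usr/bin/sudo vgdisplay vghome | grep " . escapeshellarg($test) . " | awk '{print $3}'";
$tailletotaleduserveur = trim(shell_exec($cmd));
echo json_encode($tailletotaleduserveur);   // "5.85", without the trailing newline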

execute bash if else command in php ( bash vs php )

What I want to achieve
I want to execute a script if its process is not already running on the server, so I am preparing the command as a shell one-liner and executing it.
Command with PHP variables
$cmd = "if [[ `ps auxww | grep -v grep | grep ".$process_file." | grep '".$find."'` == '' ]] ; then ".$cmd2." fi";
echo $cmd."\n";
Executed command, once variables are replaced (what will actually run on bash):
if [[ `ps auxww | grep -v grep | grep /home/new_jig.php | grep 'test_51 1714052'` == '' ]] ; then php /home/new_jig.php test_51 1714052 & fi;
Executing the command:
exec($cmd,$out,$res);
Please note that I have also tried splitting the problem into two statements and executing those, but it is time consuming. It causes problems when I have more than 2000 entries in the list and the command is executed for all of them; it takes a minute or more to reach the last one.
I want to achieve this within 10 seconds. Please help me reach optimal throughput.
Thanks
Jignesh
Somehow I am able to make it execute with the following command:
$process_file = the PHP file that performs the functionality
$cmd2 = " php ".$process_file." 1212 >/dev/null 2>/dev/null & ";
$cmd11 ="if ps -auxw | grep -v grep | grep '".$process_file."' | grep '".$find."' &> /dev/null ; then echo 1;".$cmd2."; fi";
shell_exec($cmd11." >/dev/null 2>/dev/null &");
Before this: for 1100 requests the process was taking 60+ seconds.
After this: it completes in 20 to 30 seconds.
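If the bottleneck is launching one shell pipeline per entry, another option (not from this thread, just a sketch) is to list the processes once and do the matching in PHP, so only the scripts that actually need starting spawn a shell:
// Sketch: run ps once, then check each job against that snapshot in PHP.
// $jobs is a hypothetical array of ['file' => ..., 'find' => ...] entries.
exec('ps auxww', $psLines);

foreach ($jobs as $job) {
    $isRunning = false;
    foreach ($psLines as $line) {
        // Same test as the grep | grep pipeline: both strings on one ps line.
        if (strpos($line, $job['file']) !== false && strpos($line, $job['find']) !== false) {
            $isRunning = true;
            break;
        }
    }
    if (!$isRunning) {
        // Launch in the background, output discarded, as in the original command.
        exec('php ' . escapeshellarg($job['file']) . ' ' . $job['find'] . ' >/dev/null 2>&1 &');
    }
}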

Exec returning more than expected

I have a php script that runs via a cron job.
I have an exec command in the script like so:
exec("ps -u bob -o user:20,%cpu,cmd | awk 'NR>1' | grep vlc | tr -s ' ' | cut -d ' ' -f 2",$cpu,$return)
This gets me the cpu form a process run by a specific user, if the process exists. When run via the command line I get say 21 or nothing at all depending on if the process is running or not. However, when running vai the PHP script, I get the following:
[0] => bob 0.0 /bin/sh -c php /home/bob/restart.php bob
[1] => bob 0.0 php /home/bob/restartStream.php bob
[2] => bob 0.0 sh -c ps -u bob -o user:20,%cpu,cmd | awk NR
It seems to be returning all the recent commands executed, as opposed to the result of the command I ran.
I have seen some posts suggesting 2>&1, which I believe redirects stderr to stdout, or something similar. I have tried adding it to my command like so:
ps -u bob -o user:20,%cpu,cmd | awk 'NR>1' | grep vlc | tr -s ' ' | cut -d ' ' -f 2 2>&1
But it does not seem to make a difference. Can anyone give me any pointers as to why this is occurring and what can be done to resolve it?
You need to clear out $cpu before you call exec. It appends the new output to the end of the array, it doesn't overwrite it.
You can also get rid of grep, tr, and cut, and do all of the output processing in awk:
$cpu = array();
exec("ps -u bob -o user:20,%cpu,cmd | awk 'NR>1 && /vlc/ && !/awk/ {print $2}'",$cpu,$return);
The !/awk/ keeps it from matching the awk line, since that contains vlc.
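A quick way to see the appending behaviour for yourself (a standalone sketch, unrelated to ps):
// exec() appends to an existing output array rather than replacing it.
exec('echo first', $out);
exec('echo second', $out);
print_r($out);   // Array ( [0] => first [1] => second )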

How do I kill a running background program with PHP?

I have a button in my program that, when pressed, runs this line of code:
ps -eo pid,command | grep "test-usb20X" | grep -v grep | awk '{print $1}'
It gives me the correct pid output. For example, "3243".
What I want to do now is kill that pid. I would just write
exec("kill -9 3232");
But the pid changes, so how do I save that number to a variable and then use it in the kill line?
You can write the output of your first command to a variable
$pid = exec("ps -eo pid,command | grep \"test-usb20X\" | grep -v grep | awk '{print $1}'");
and use this variable for the next command
exec("kill -9 ".$pid);

Shell: SVN status pipe to php to check syntax

I know this is simple but I just can't figure it out.
I have a bunch of files output by "svn st" that I want PHP to syntax-check on the command line.
This outputs the list of files: svn st | awk '{print $2}'
And this checks a php script: php -l somefile.php
But this, and variants of it, don't work: svn st | php -l '{print $2}'
Any ideas? Thanks!
Use xargs:
svn st | awk '{print $2}' | xargs -L 1 php -l
The xargs -L 1 command reads items from standard input, one per line, and runs the given command for each item separately. See the xargs(1) man page for more info.
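If you would rather drive it from PHP instead of xargs, a rough sketch of the same idea (assuming svn and php are on the PATH):
// Sketch: collect the changed paths once, then lint each one from PHP.
exec("svn st | awk '{print $2}'", $files);
foreach ($files as $file) {
    // escapeshellarg() guards against spaces or shell metacharacters in the path.
    passthru('php -l ' . escapeshellarg($file));
}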
