gpg returns 2 when run from PHP

I have this command, which runs without any problem on my Amazon EC2 instance over SSH:
echo mypassphrase | /usr/bin/gpg --batch --yes --always-trust --output daily_file/EZPNotif_1A_20160401.txt --passphrase-fd 0 daily_file/EZPNotif_1A_20160401.txt.gpg
but when I try to execute it using PHP's exec() like this:
exec("echo mypassphrase | /usr/bin/gpg --batch --yes --always-trust --output ".$decryptedFiles[$i]." --passphrase-fd 0 ".$encryptedFiles[$i], $output, $return_var);
$return_var comes back as 2 and, of course, the files aren't decrypted. What should I do?
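Exit code 2 from gpg is a generic error, and the message explaining it goes to stderr, which exec() does not capture unless you redirect it. A minimal sketch of the debugging pattern, with ls standing in for the gpg invocation (the path is illustrative):

```shell
# Merge stderr into stdout so the error text is captured alongside the
# exit status -- exactly what the string passed to exec() would need.
out=$(ls /nonexistent/file.gpg 2>&1)
status=$?
echo "output: $out"
echo "status: $status"
```

A common cause from PHP is that the relative paths (daily_file/...) resolve against the web server's working directory rather than the one you used over SSH, so passing absolute paths is worth trying first.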

Related

PHP exec output text with ansi2html problem

I use ansi2html to convert ANSI colors to HTML, but when I use exec() in PHP to run the bash file, the output is not correct.
exec("inxi.sh 2>&1", $returnOut, $stdout);
echo $returnOut[0];
<pre style="color:#bbb;white-space:pre-wrap;word-wrap:break-word;overflow-wrap:break-word">␃12System: ␃␃12Host␃ TiTAN ␃12Kernel␃ 5.3.0-59-generic x86_64 ␃12bits␃ 64 ␃12compiler␃ gcc ␃12v␃ 9.2.1 ␃12Console␃ N/A ␃
If I run the bash file from a terminal, it returns:
<pre style="color:#bbb;white-space:pre-wrap;word-wrap:break-word;overflow-wrap:break-word"><span style="color:#55f">System:</span>
inxi.sh
#!/bin/bash
inxi -xxx -C -D -G -I -m -M -n -R -s -S --usb -c 2 | ansi2html -n
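One clue in the broken output: ␃ is the ETX byte (0x03), the mIRC color prefix, so inxi appears to be falling back to IRC-style color codes when stdout is not a terminal, and ansi2html only understands ANSI escapes. For reference, the translation ansi2html performs on a real ANSI escape looks roughly like this sketch (it handles only color code 34, blue, for illustration):

```shell
# Translate one ANSI color escape into an HTML span, the way ansi2html
# does for real ANSI output.
printf '\033[34mSystem:\033[0m\n' \
  | sed -e 's/\x1b\[34m/<span style="color:#55f">/' -e 's|\x1b\[0m|</span>|'
```

If inxi can be convinced it is writing ANSI (e.g. by running it under a pseudo-terminal), ansi2html should get input it can convert.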

How can I export environment variables into a PHP file from gitlab-ci.yml

I set the environment variables in GitLab and I want to export them into a PHP file. I need this to use passwords in acceptance tests without having those passwords readable in the files.
This is my gitlab-ci.yml; the last line is my attempt to export the variable.
before_script:
- mkdir -p ~/.ssh
- eval $(ssh-agent -s)
- '[[ -f /.dockerenv ]] && echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config'
- echo "$DEPLOY_KEY" | tr -d '\r' | ssh-add - > /dev/null
- echo "$FE_PASS" > "$(pwd)/deploy/dev/codeception/tests/acceptance/ExtranetCept.php"

Can't run Linux "awk" command in script from PHP

I have the shell script "test.sh":
#!/system/bin/sh
PID=$(ps | grep logcat | grep root | grep -v grep | awk '{print $2}')
echo "Using awk: $PID"
PID=$(ps | grep logcat | grep root | grep -v grep | cut -d " " -f 7)
echo "Using cut: $PID"
When I run the script from PHP:
exec("su -c sh /path/to/my/script/test.sh");
I get this output:
Using awk:
Using cut: 6512
So the "cut" command works but the "awk" command doesn't when I run the script from PHP. But when I run it from a terminal:
# sh test.sh
both awk and cut work fine! This is what the output of "ps" looks like:
USER PID PPID VSIZE RSS WCHAN PC NAME
root 6512 5115 3044 1108 poll_sched b6e4bb0c S logcat
Did I miss something?
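The awk one-liner itself is sound; fed the sample ps line from the question, it extracts the PID column, which suggests the failure lies in how the script is invoked from PHP rather than in awk:

```shell
# Print the PID column from a sample ps line (copied from the question).
line='root 6512 5115 3044 1108 poll_sched b6e4bb0c S logcat'
echo "$line" | awk '{print $2}'   # prints 6512
```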
You should learn how to debug first
You said
So "cut" command is work but "awk" command doesn't when I run the
script from PHP, but when I run it from terminal:
I wonder how?
It actually throws an error like the one below in the CLI:
$ php -r 'exec("su -c sh /path/to/my/script/test.sh");'
su: user /path/to/my/script/test.sh does not exist
You first need the syntax below while debugging the code:
// basic : stdin (0) stdout (1) stderr (2)
exec('your_command 2>&1', $output, $return_status);
// to see the response from your command
// su: user /path/to/my/script/test.sh does not exist
print_r($output);
Remember: su gives you root permissions, but it does not change the PATH variable or the current working directory.
The operating system assumes that, in the absence of a username, the
user wants to change to a root session, and thus the user is prompted
for the root password
[akshay@localhost Desktop]$ su
Password:
[root@localhost Desktop]# pwd
/home/akshay/Desktop
[root@localhost Desktop]# exit
exit
[akshay@localhost Desktop]$ su -
Password:
[root@localhost ~]# pwd
/root
Solution:
You should allow your script to run without a password prompt (don't use su; use sudo).
To allow the apache user to execute your script and some commands, you can add an entry like the one below in /etc/sudoers:
# which awk => give you awk path
# same use in your script also, or else set path variable
www-data ALL=NOPASSWD: /path/to/my/script/test.sh, /bin/cut, /usr/bin/awk
So it becomes:
// assuming your script is executable
exec("sudo /path/to/my/script/test.sh 2>&1", $output);
print_r($output);

exec issue with netstat and lsof

I want to check if a certain tunnel exists from inside PHP using (any of these commands):
$(which lsof) -i -n | grep ssh
$(which netstat) -a | grep "localhost:ssh"
The issue is that when I run the commands in a shell everything is fine, but running them from PHP like:
$reply = exec(CMD);
always returns nothing.
Any ideas?
Thank you!
You could redirect stderr to stdout and get the $output and $return_var. To do that, change your exec() call like this:
exec('$(which lsof) -i -n 2>&1 | grep ssh', $output, $return_var);
var_dump($return_var);
var_dump($output);
More info about exec here: http://php.net/manual/en/function.exec.php (have a look at $output and $return_var parameters).
I think the issue is more to do with how PHP interprets your command.
In this case (assuming that instead of CMD you write the same command you tried in the shell), it would try to run:
$reply = exec($(which lsof) -i -n | grep ssh);
meaning it would try to substitute the $(which lsof) part as a PHP variable and execute the resultant string. As the output of "-i -n | grep ssh" is null, you get nothing as a result.
I would suggest you instead do:
$lsof = exec('which lsof');
$reply = exec("$lsof -i -n | grep ssh");
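A detail worth checking either way: in a pipeline, 2>&1 only affects the command it is attached to, so it must precede the pipe to capture lsof's stderr. A sketch with ls standing in for lsof:

```shell
# The error message from ls reaches grep through the pipe only because
# 2>&1 comes before the | -- placed after grep, it would be lost.
ls /nonexistent 2>&1 | grep -c "No such file"
```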

Wget download queue script

The idea is that while wget is running and downloading something, I can just add another URL that will be downloaded once the current download is finished. I only want to download one file at a time. I wrote this script:
#!/bin/bash
test=/tmp/wget-download-link.txt
echo -n "$test" | while IFS= read -N 1 a; do
    wget -o /tmp/wget.log -P /mnt/usb -i /tmp/wget-download-link.txt
    if [[ "$a" == $'\n' ]]; then
        wget -nc -o /tmp/wget.log -P /mnt/usb -i /tmp/wget-download-link.txt
    fi
    #printf "$a"
    echo download finished
done
The script checks for any new lines containing URLs; if there are any, it reruns wget. The problem is that this script just keeps looping: wget downloads the same files continuously and renames them when they already exist. How do I make wget rerun when there are new URLs in wget-download-link.txt, but skip files that already exist?
@msturdy I ran your script, but wget redownloads and renames files that already exist. My script:
#!/bin/bash
test=/tmp/wget-download-link.txt
l=$(wc -l $test)
tail -n $l -f $test | while read url; do
    wget -o /tmp/wget.log -P /mnt/usb -i /tmp/wget-download-link.txt
done
my wget-download-link.txt file:
http://media2.giga.de/2014/11/angel-beats-kanade.jpg
http://juanestebanrojas.com/wp-content/uploads/2014/06/angel-beats-wallpapers-4.jpg
http://images5.fanpop.com/image/photos/30100000/Angel-Beats-new-life-angel-beats-30142329-2560-909.jpg
http://kristenhazelkannon.files.wordpress.com/2013/06/angelbeats2.jpg
Downloaded files:
angel-beats-wallpapers-4.jpg
angel-beats-wallpapers-4.jpg.1
Angel-Beats-new-life-angel-beats-30142329-2560-909.jpg.1
Angel-Beats-new-life-angel-beats-30142329-2560-909.jpg
angel-beats-kanade.jpg.2
angel-beats-kanade.jpg.1
angel-beats-kanade.jpg
angelbeats2.jpg
the script keeps running, and will just rename files to .1 .2 .3 etc.
SOLVED WITH THIS
while true; do
    urlfile=$( ls /root/wget/wget-download-link.txt | head -n 1 )
    dir=$( cat /root/wget/wget-dir.txt )
    if [ "$urlfile" = "" ]; then
        sleep 180
        continue
    fi
    url=$( head -n 1 "$urlfile" )
    if [ "$url" = "" ]; then
        mv "$urlfile" "$urlfile.invalid"
        continue
    fi
    mv "$urlfile" "$urlfile.busy"
    wget "$url" -P "$dir" -o /www/wget.log -c -t 100 -nc
    mv "$urlfile.busy" "$urlfile.done"
done
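The shape of that loop can be reduced to a self-contained sketch, with echo standing in for wget and a temporary file as the queue: take the first line, process it, delete it, and stop when the file is empty.

```shell
# Miniature of the queue loop: consume the file one URL at a time.
queue=$(mktemp)
printf '%s\n' http://example.com/a.jpg http://example.com/b.jpg > "$queue"
while [ -s "$queue" ]; do
    url=$(head -n 1 "$queue")
    echo "downloading $url"
    sed -i '1d' "$queue"   # remove the processed line (GNU sed)
done
rm -f "$queue"
```

Because each URL is removed once handled, a file that already exists is never revisited, and new lines appended to the queue are picked up on the next pass.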
