I have a case where I have to start an iperf server as a daemon on a specified port, and once the iperf server is running, I have to send a response to the client. I tried
shell_exec('iperf -s -p {port} -D');
but it doesn't return control; the call blocks as if in an infinite loop.
The server will start but the code below the shell_exec will never be executed.
Does anyone have a solution or a suggestion for how I should approach this?
Your command iperf -s -p {port} -D produces output on stderr, which keeps shell_exec from returning. Try this:
$outfile = "/tmp/erroutperf.out";
$port = 8080;
shell_exec("iperf -s -p $port -D > $outfile 2>&1");
Basically, the additional redirection > /tmp/erroutperf.out 2>&1 tells the shell to save
both the stderr and stdout output of the program (iperf) to the file /tmp/erroutperf.out.
To read the command's output afterwards:
file_get_contents($outfile);
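If you also need to confirm that the daemon actually started before replying to the client, something along these lines may work. This is only a sketch; the pgrep check, the sleep delay, the port number and the file path are assumptions, not part of the answer above.
<?php
$port    = 8080;
$outfile = "/tmp/erroutperf.out";

// Redirecting output lets shell_exec() return once iperf has daemonized
shell_exec("iperf -s -p $port -D > $outfile 2>&1");

// Give the daemon a moment to bind the port, then check it is running
sleep(1);
$running = (int) shell_exec("pgrep -f 'iperf -s -p $port' | wc -l") > 0;

if ($running) {
    echo "iperf server started on port $port\n";   // response to the client
} else {
    echo "iperf failed to start:\n" . file_get_contents($outfile);
}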
Related
I've got a PHP script that needs to run a .sh file using shell_exec:
echo shell_exec('sh /var/www/html/daloradius/start.sh > /dev/null 2>/dev/null &');
I just dump it into the background. This is my start.sh:
sudo tcpdump port 1812 -w testing.pcap
Since tcpdump keeps listening indefinitely, I tried to stop the tcpdump process with a button that triggers another shell_exec, which runs stop.sh:
pid=$(ps aux | grep "sudo tcpdump" | head -1 | cut -d '.' -f 1 | cut -d ' ' -f 7)
sudo kill $pid
stop.sh works fine when I test it in the CLI, but when I click the button that triggers start.sh and then try to stop it with the button that triggers stop.sh, it doesn't work: tcpdump won't stop. When I stop it from the CLI using stop.sh, it works well. Can anybody give me a way to force-stop tcpdump? Thank you.
You are trying to use bash when you should be orchestrating the process from PHP.
Here, we get the PID of the command and kill it from PHP. Replace the sleep statement with whatever code you have.
<?php
# Script must be run with sudo to start tcpdump
# Be security conscious when running ANY code here
$pcap_file = 'testing.pcap';
$filter = 'port 1812';
$command = "tcpdump $filter -w $pcap_file" . ' > /dev/null 2>&1 & echo $!;';
$pid = (int)shell_exec($command);
echo "[INFO] $pid tcpdump: Writing to $pcap_file\n";
# Some important code. Using sleep as a stand-in.
shell_exec("sleep 5");
echo "[INFO] $pid tcpdump: Ending capture\n";
shell_exec("kill -9 $pid");
Please note that tcpdump has the -c option to stop after n packets are received, and you can rotate files with -G. You may want to read tcpdump's manpage to get the most out of it.
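If, as in the question, start and stop are triggered by two separate requests (two buttons), one way to carry the PID between them is to write it to a file. The sketch below is only an illustration; the file paths and the start.php/stop.php names are made up, and the same privilege caveats as above apply.
<?php
# start.php - begins the capture and records the PID
$pid_file  = '/tmp/tcpdump.pid';   // illustrative path
$pcap_file = 'testing.pcap';
$pid = (int) shell_exec("tcpdump port 1812 -w $pcap_file > /dev/null 2>&1 & echo $!;");
file_put_contents($pid_file, $pid);
echo "[INFO] $pid tcpdump: Writing to $pcap_file\n";

<?php
# stop.php - reads the recorded PID and ends the capture
$pid_file = '/tmp/tcpdump.pid';
$pid = (int) file_get_contents($pid_file);
if ($pid > 0) {
    shell_exec("kill $pid");
    unlink($pid_file);
    echo "[INFO] $pid tcpdump: Ending capture\n";
}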
With PHP, I am trying to connect to a remote server and exec an rsync command:
$rsync_cmd = 'rsync -P -azv -e "ssh -p 1234" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 '.$rsync_cmd.' 2>&1', $output);
The remote connection works fine, but when I include my rsync command I get a port error: "ssh: connect to host remoteserver2 port 22: Connection refused".
The correct port is 1234, and this command works fine in the terminal (shell), but PHP's exec function doesn't seem to honor the "ssh -p 1234" part. Any idea?
Did you test it in Terminal by sshing into remoteserver1, and then executing the rsync command, or by executing the entire ssh-to-rsync command on one line? Because I'm pretty sure doing it in two steps will work, but one step won't. This is because when it's done as one step, the command string is parsed by the local shell (including doing quote and escape interpretation and removal) before it's passed over ssh to the remote shell (which then does another pass of quote and escape interpretation and removal). Those double-quotes in "ssh -p 1234" get parsed and removed by the local shell, so they don't have the intended effect of being parsed and applied by the remote shell.
If I'm right about the problem, the solution is pretty simple: escape the double-quotes. That way the local shell will parse and remove the escapes, and pass the double-quotes through unmolested so the remote shell can see and apply them:
$rsync_cmd = 'rsync -P -azv -e \"ssh -p 1234\" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 '.$rsync_cmd.' 2>&1', $output);
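Another option, not from the answer above, is to let PHP do the quoting so the whole remote command survives the local shell untouched; a minimal sketch using escapeshellarg:
$rsync_cmd = 'rsync -P -azv -e "ssh -p 1234" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
// escapeshellarg() single-quotes the whole command, so the local shell passes it
// through verbatim and the remote shell still sees the inner double-quotes
exec('sudo ssh -tt root@remoteserver1 ' . escapeshellarg($rsync_cmd) . ' 2>&1', $output);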
The solution is pretty simple: try skipping the -P flag from the command.
$rsync_cmd = 'rsync -azv -e \"ssh -p 1234\" /dir/file.jpg root@remoteserver2:/dir/file.jpg';
exec('sudo ssh -tt root@remoteserver1 '.$rsync_cmd.' 2>&1', $output);
It will work fine. With that -P flag, it took the default port for rsync.
When I try to run shell_exec("php -S localhost:8000"), it runs the server but it freezes the terminal.
I tried running $result = shell_exec('php -S localhost:8000 -t public/ &> /dev/null 2>&1') but it doesn't store the output in the variable.
The idea is that I can customize the output message once the server successfully boots up.
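One way to approach this, sketched below with made-up paths and timings rather than taken from an answer: start the built-in server in the background with its output going to a file, then poll the port to decide which message to print.
<?php
$log = '/tmp/php_server.log';   // illustrative log path
shell_exec("php -S localhost:8000 -t public/ > $log 2>&1 &");

// Poll the port briefly to see whether the server came up
$up = false;
for ($i = 0; $i < 10; $i++) {
    $sock = @fsockopen('localhost', 8000, $errno, $errstr, 1);
    if ($sock) { fclose($sock); $up = true; break; }
    usleep(200000);
}

echo $up ? "Server is running on http://localhost:8000\n"
         : "Server failed to start:\n" . file_get_contents($log);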
I have a PHP script that starts a detached screen through SSH:
$ssh->exec("screen -m -d -S ".$user);
I now need to execute a command in that screen without being in that screen. I have code that does this, which I have tested through an SSH client, but when I try to use it with the phpseclib exec command, it does not work. This is the code that works:
screen -S ".$user." -X stuff "cd minecraft/servers/".$user."/;sh start.sh $(printf '\r')"
And this is it in the PHP script:
$ssh->exec("screen -S ".$user." -X stuff \"cd minecraft/servers/".$user."/;sh start.sh $(printf '\r')\"");
I attempted to escape the extra double quotes in the code.
Is there anything I can do to make this work through PHP? Thanks
Hmmm...
Please create two bash scripts: the first creates the screen, taking the user as a parameter (named e.g. run_screen); the second runs the test command for the SSH client, also taking the user as a parameter (named e.g. run_test).
Run first script:
$ssh->exec('[full_path]/run_screen ' . $user);
and second:
$ssh->exec('[full_path]/run_test ' . $user);
See a bash syntax reference for the script syntax.
Make sure the server user (e.g. Apache) has permission to run the scripts.
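If you would rather keep everything in PHP instead of wrapper scripts, an alternative sketch (not from the answer above) is to let escapeshellarg handle the quoting and embed the carriage return directly, so the nested double-quotes disappear:
$inner = "cd minecraft/servers/$user/; sh start.sh\r";   // "\r" stands in for $(printf '\r')
$ssh->exec('screen -S ' . escapeshellarg($user) . ' -X stuff ' . escapeshellarg($inner));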
I need to write some scripts for automation work.
I put a PHP file on a local Apache server:
test.php
<?php
system("bash inform.sh");
?>
The content of inform.sh is:
#!/bin/bash
proc_id=`ps -ef|grep "sleep"|grep -v "grep"|awk '{print $2}'`
kill -9 $proc_id
I run a sleep process in a shell and open the PHP page in Firefox: localhost/test.php
but it doesn't kill the sleep process.
If I run the PHP directly from the shell, it works.
What's wrong with this, and how do I deal with it? Thanks
I use the following shell command to start the sleep, and then it is OK, because the PHP page runs as the apache user and a process can only signal processes owned by the same user:
sudo -u apache sleep 2000
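To see why the kill fails when it runs from the web server, it can help to capture the script's stderr in the page output; a small sketch of the same test.php, with the only change being the 2>&1:
<?php
// Capturing stderr shows errors such as "Operation not permitted"
// when the apache user tries to signal a process owned by another user.
system("bash inform.sh 2>&1");
?>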