IIS blocking LFTP command - php

I'm trying to execute an LFTP command using PHP's system() function on an IIS 7.0 site.
$command = 'cmd /c lftp -c "open -u name,password -p 22 sftp://server.mylife.com ; cd test/portal/template ; put /cygdrive/c/inetpub/uploads/cata/exports/tpl_1421946484/cata.csv;"';
system($command);
I put it in a PHP file.
If I run it directly from the command line with php sendFile.php, it works fine.
But if I access the same PHP file through an IIS 7.0 website, I get nothing and no error.
I can't figure out where the problem comes from.
Any help?

Have you checked whether this is a permissions problem? Could it be that the account under which IIS hosts the website does not have access to the /cygdrive/c/inetpub/uploads/cata/exports/tpl_1421946484/cata.csv file?
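If it helps to narrow things down, below is a minimal diagnostic sketch that uses exec() instead of system() so the exit code and any error output become visible in the response; the lftp command line is the same one from the question, only 2>&1 and output capture are added.
<?php
// Diagnostic sketch: capture stderr and the exit code so the failure is
// visible even when the page itself prints nothing under IIS.
$command = 'cmd /c lftp -c "open -u name,password -p 22 sftp://server.mylife.com ; '
         . 'cd test/portal/template ; '
         . 'put /cygdrive/c/inetpub/uploads/cata/exports/tpl_1421946484/cata.csv;" 2>&1';
exec($command, $output, $exitCode);
echo "exit code: $exitCode\n";
print_r($output);   // permission or PATH errors from cmd/lftp show up here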

Related

aws php shell_exec command doesn't work on browser

My PHP code runs a shell script, which opens a tmux session and runs a Node.js bot. When I run it as php phpfile.php from the terminal it works, but when I request phpfile.php from the browser it does not. As far as I understand, the problem is with the permissions of the apache user, but it still fails even though I have given that user all kinds of permissions. When I try the command sudo -u apache tmux new -s node I get the result [exited].
php code:
<?php shell_exec('bash ./tmux.sh'); ?>
shell code:
tmux new -s node
tmux send-keys -t node.0 "node ./js/bot.js" ENTER
Note: I added the apache user to the sudoers file via visudo, like apache ALL=(ALL:ALL) NOPASSWD: ALL
Note 2: I run this on an AWS EC2 server using Amazon Linux.
If you are using an operating system with the SELinux security layer, try turning it off or writing a rule for it.
I always turn it off.
nano /etc/selinux/config
SELINUX=disabled
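If you want to confirm from PHP whether SELinux is what is getting in the way, a small sketch like this can help; it assumes the getenforce utility is present, which it is on Amazon Linux and other Red Hat-style systems.
<?php
// Sketch: report the SELinux mode from the same context the web server runs in.
$status = trim((string) shell_exec('getenforce 2>&1'));
echo "SELinux status: $status\n";   // "Enforcing" suggests SELinux may be blocking tmux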

Automate Letsencrypt SSL Certificate installation with PHP shell_exec command via acme.sh snap package

I install Let's Encrypt certificates through the acme.sh snap package:
https://github.com/acmesh-official/acme.sh
I am wondering if there is any way to automate certificate installation via PHP's shell_exec() function.
This shell command, used to get certificates, works just fine when I am logged in via SSH:
acme.sh --issue --dns dns_gd -d example.com -d *.example.com
However, if I call the same command via PHP's shell_exec(), it always throws an error:
$domainName = 'example.com';
$initCommand = "acme.sh --issue --dns dns_gd -d $domainName -d *.$domainName";
$output = shell_exec("$initCommand 2>&1 | tee -a /var/www/html/sshout.txt 2>/dev/null >/dev/null &");
The output I am getting in sshout.txt is the following:
ssh: 1: /home/ubuntu/.acme.sh: Permission denied
sudo: no tty present and no askpass program specified
Is it possible to get certificates this way?
Or is there any other way to automate it via PHP, for example by setting up a cron job, or by creating a bash script and calling it from PHP?
I am running PHP 7.2 on Ubuntu 18 with an Apache server.
You need the right permissions and the right path to run the bash script. I guess this topic is what you need: PHP script can't run bash script. sh: Permission denied
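As a rough illustration of "the right path", the sketch below calls acme.sh through an absolute path and keeps the error output instead of discarding it and backgrounding the command. The acme.sh location and the log path are assumptions taken from the question; the web server user still needs read and execute access to that directory, and the sudo error in the second line of the log is a separate problem.
<?php
// Sketch: call acme.sh by absolute path and keep stderr so the real error is logged.
// Both paths below are assumptions; adjust them to your setup.
$acme   = '/home/ubuntu/.acme.sh/acme.sh';
$domain = 'example.com';
$cmd    = "$acme --issue --dns dns_gd -d $domain -d '*.$domain' 2>&1";
file_put_contents('/var/www/html/sshout.txt', (string) shell_exec($cmd), FILE_APPEND);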
The other way is to use acmephp.

Meshlabserver: Cannot connect to X server error

I have MeshLab installed on my machine running Ubuntu 14.04. I can access it from the command line using the meshlabserver command. But a problem arises whenever I try to call it from a PHP script using the command
<?php
system('meshlabserver 2>&1');
?>
It shows the error meshlabserver: cannot connect to X server. After going through a few websites I did the following things:
I moved the meshlabserver executable from /usr/bin to /usr/local/bin and gave it executable permissions using
sudo chmod a+x meshlabserver
But when I ran the whoami command from my PHP script (the one calling meshlabserver), it showed www-data. So I gave executable permissions for all users to meshlabserver using
sudo chmod 777 /usr/local/bin/meshlabserver
But it still shows the same meshlabserver: cannot connect to X server error. The meshlabserver command works fine when run from the command line.
I really need to call MeshLab from the PHP script for my website, so any help would be highly appreciated. Thanks in advance.
It seems the PHP script can't access your display variable. If you logged in via SSH, remember to tunnel your X server via 'ssh -X ...'. Your second option is to create a virtual framebuffer using Xvfb and redirect the display variable to it:
export DISPLAY=:100.0
Xvfb :100 &
Note the ampersand for the second command as Xvfb needs to be running in the background.
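On the PHP side, a minimal sketch along these lines should then work; the display number :100 matches the commands above, and putenv() makes the variable visible to the process spawned by system().
<?php
// Sketch: point the meshlabserver child process at the virtual framebuffer.
putenv('DISPLAY=:100.0');           // assumes Xvfb :100 is already running (see above)
system('meshlabserver 2>&1', $rc);  // $rc receives the exit code for debugging
echo "meshlabserver exited with code $rc\n";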
A combo of the prior answers works for me:
ssh -X, as well as export DISPLAY=:0.0 (on the remote machine)

Running wget with PHP and apache (MAMP)

I have trouble running wget through PHP's exec or system functions. The configuration is with MAMP.
I can successfully run basic commands like whoami, pwd, etc.
I even changed the Apache user to the root user and still nothing. The error log shows "wget: command not found".
I can run it from the terminal and it works fine. wget is installed through MacPorts.
What should I do?
Regards!
From a terminal, run which wget, and use the full path in the call to exec/system.
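To make that concrete, here is a small sketch; /opt/local/bin/wget is only a guess at the usual MacPorts location, so substitute whatever which wget prints on your machine.
<?php
// Sketch: call wget by its absolute path so Apache's restricted PATH doesn't matter.
$wget = '/opt/local/bin/wget';   // assumed MacPorts path; use the output of `which wget`
exec("$wget -q https://example.com/file.zip -O /tmp/file.zip 2>&1", $output, $rc);
echo "exit code: $rc\n";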

Executing a bash file from a php page with root-only commands (Ubuntu)

I need to execute a bash file from a PHP page with the exec() function. The problem is that this bash file contains the command "adduser" ... which is a sudo command. I had the idea of modifying the sudoers file so the user that runs the script would have access to it, but who is this user? I know apache2 is executed as the www-data user...
Thanks!
You can find out which user PHP is running as by using system to run the command 'whoami' and display the output.
system('whoami');
That seems like a rather bad plan, giving the www-data user sudo access. But yes, it's www-data (by default, depending on Linux flavor) that Apache runs under.
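If you do go the sudoers route anyway, a rule limited to the one script is a less risky option than blanket access. The sketch below is only illustrative; the script path /usr/local/bin/add_user.sh is a made-up placeholder.
<?php
// Sketch: let www-data run exactly one script via sudo instead of full access.
// Assumed sudoers entry (added with visudo); the path is a placeholder:
//   www-data ALL=(root) NOPASSWD: /usr/local/bin/add_user.sh
exec('sudo /usr/local/bin/add_user.sh newuser 2>&1', $output, $rc);
print_r($output);   // any "sudo: ..." errors will show up here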
