By default, PHP runs under the IUSR account. When executed directly:
$lastline = exec('D:\\MyProgram.exe', $output, $return_var);
It runs, but the program is unable to do its tasks due to insufficient privileges. My question is: how do I run an executable under a specific Windows account from a PHP website?
I've tried:
When executed through Sysinternals PsExec:
$lastline = exec('D:\\PsExec.exe -u UserName -p Password -accepteula "D:\\MyProgram.exe"', $output, $return_var);
MyProgram.exe does not execute at all. PHP returns empty output and return_var is -1073741502. I assume this is some sort of unhandled exception.
When executed through lsrunas:
$lastline = exec('D:\\lsrunas.exe /user:UserName /password:Password /domain:DOMAIN /command:"D:\\MyProgram.exe" /runpath:D:\\', $output, $return_var);
MyProgram.exe does not execute either. PHP returns empty output and return_var is 0.
Microsoft's built-in runas command doesn't work because it doesn't accept a password as a parameter.
Tried using different PHP functions such as shell_exec, system and passthru.
Tried creating a new Application Pool under IIS to run the website under LOCAL SERVICE, SYSTEM, or a specific user.
EDIT: This is where I went wrong. It should work, but it fails when the Load User Profile setting is not enabled (step 3 in my answer).
Does anyone have working suggestions?
Note: all commands work and were tested directly from the command line.
I kept digging and found out that the only thing that works is a dedicated application pool.
Create a new Application Pool under IIS
Set username and password in Advanced Settings > Identity > Custom account
Set Advanced Settings > Load User Profile to true (this one is important)
Choose this pool under Basic Settings of a site,
-or- for better security:
Move all command-related code into one section of your website, convert it to an application, and apply that Application Pool to it. Then you can restrict public access to that part and safely call that functionality from the other parts of your site.
Important note (!):
If you're running PHP via FastCGI, you must set fastcgi.impersonate = 0 in the php.ini file.
Test batch file:
To test which account is running the process, you can save the following code to a *.bat file and call it from PHP.
@echo off
SET now=%date% %time%
SET out=[%now%] %userdomain%\%username%
echo %out%
echo %out% > D:\hello.txt
::msg * "%out%"
if "%username%"=="SpecificUser" (
exit /B 100
) else (
exit /B 200
)
Replace SpecificUser with your desired user name. Sometimes you'll see no output. The exit codes will help you then.
In case you can't see any output or exit code at all, this script will output the username to a text file. Replace D:\hello.txt with your desired path.
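For completeness, here is a minimal sketch of the PHP side that calls the batch file and inspects the exit code (the path D:\whoami-test.bat is an assumption; use whatever path you saved the file under):
<?php
// Run the test batch file and capture its output and exit code.
$lastline = exec('D:\\whoami-test.bat', $output, $return_var);
echo implode("\n", $output) . "\n";
if ($return_var === 100) {
    echo "Running as SpecificUser\n";    // matches exit /B 100 above
} elseif ($return_var === 200) {
    echo "Running as another account\n"; // matches exit /B 200 above
}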
Context:
I have a simple HTML frontend that allows the user to enter some basic details, e.g. colours to use. These then get placed in my strings.xml file so that we can quickly push out preview versions. Windows 10 64-bit running WAMP, building with Gradle 3.6.
The Goal:
To build the updated app from command line with PHP.
The issue:
The project builds correctly when I manually build via the command line, but the build fails when I use PHP's exec function.
The Gradle wrapper has to be executed from the base project folder, hence the need for cd.
First I clean the project, so I run: cd path/to/projectdir && gradlew clean 2>&1
This works as expected both from exec and from the command line.
Next I build with cd path/to/projectdir && gradlew assemble 2>&1.
Running the above via the command line returns success.
However, running the same with exec('cd path/to/projectdir && gradlew assemble 2>&1', $output) and printing output returns:
...[24] => FAILURE: Build failed with an exception.
[25] =>
[26] => * What went wrong:
[27] => Execution failed for task ':app:processDebugResources'.... (full output https://pastebin.com/zdeXuukp)
As far as I can tell, the only difference between the two is the process owner (based on exec('whoami')).
Any idea if this might be the cause for it failing or what else might be causing the PHP exec build to fail?
After much digging, my suspicions were confirmed: it was indeed due to the Apache user having incorrect permissions.
This is the answer: Is it possible to have WAMP run httpd.exe as user [myself] instead of local SYSTEM?
In this instance, I was just playing around on my local machine and would obviously not recommend this approach in a production environment. Anyway, here's how I solved it:
Close WAMP.
Open services.msc (win+r services.msc)
For wampapache64, wampmariadb64 & wampmysqld64: right-click > Properties > Log On tab, specify a user account to start the service as (I used my own account, as I knew it already had the correct permissions) and apply the changes.
Open WAMP.
Accessing a PHP page through the browser with a simple echo exec('whoami') should now output the same as running whoami directly in the command line.
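As a quick sanity check (just a minimal sketch), you can drop this into a page and load it in the browser:
<?php
// Should now print your own account instead of the SYSTEM / service account.
echo exec('whoami');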
My script is now executing correctly and returning the same output as when run through the command line.
I am trying to execute a Linux shell command from PHP, but there is no output on the web page. If I execute the PHP page from the Linux console, it works fine.
PHP Code:
<?php
$result = shell_exec('asterisk -rx "core show channels concise"');
$ccount = shell_exec('asterisk -rx "core show channels count"');
echo $result;
echo $ccount;
?>
The code above does not give any output on the web page, but on the Linux console it works, e.g.:
[abc#host sysadminman]# php myfile.php
Asterisk control socket permissions can also be changed easily in /etc/asterisk.conf:
[files]
astctlpermissions = 0660
astctlowner = root
astctlgroup = apache
astctl = asterisk.ctl
First of all, your question is incomplete as you are not showing what the expected output is. But aside from that, you are making a few common mistakes.
First, you are testing your script as root (# php ...), but your httpd is NOT serving your scripts as root, so your tests are useless. You should switch to the right user (most likely www-data) and then try running your script from the shell. Most likely it will fail for one of two common reasons: insufficient permissions to run the asterisk program, or an incomplete $PATH that does not point to the place where asterisk is installed.
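To see exactly what the web server sees, a small diagnostic sketch like the following can help (the asterisk binary name comes from your commands; the rest is generic diagnostics):
<?php
// Print the effective user, the PATH, and where (if anywhere) asterisk is found,
// so you can compare against the root shell where the command works.
echo shell_exec('id') . "\n";
echo 'PATH=' . getenv('PATH') . "\n";
echo shell_exec('which asterisk 2>&1') . "\n";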
I agree with Marcin.
I would suggest you write a script that executes those commands and puts the result into some storage (such as a text file or a database). Use cron to run it as root. Then you read the data from storage on the web page.
If you want a real-time response, you have to run the cron job very frequently, although that consumes server resources. That is a trade-off you have to consider; it depends on what you want to achieve with the web site.
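The web page side then becomes a plain file read. A minimal sketch, assuming the cron job writes its output to /var/tmp/asterisk_channels.txt (that path is an assumption):
<?php
// Read the snapshot the cron job produced earlier; no shell access needed here.
$snapshot = @file_get_contents('/var/tmp/asterisk_channels.txt');
echo ($snapshot !== false) ? nl2br(htmlspecialchars($snapshot)) : 'No data yet.';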
Use sudo to run those commands as root or as the Asterisk user. You can configure sudo to allow execution without a password for only specific commands.
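With such a sudoers rule in place (see the /etc/sudoers example later in this thread), the PHP call would become something like this sketch:
<?php
// Assumes a NOPASSWD sudoers entry allowing the web user to run asterisk.
$result = shell_exec('sudo asterisk -rx "core show channels concise" 2>&1');
echo $result;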
Check disable_functions in php.ini. Maybe shell_exec is simply disabled for the web server.
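A quick way to check that from a page (just a sketch):
<?php
// If shell_exec appears in this list, the web server's PHP has it disabled.
echo ini_get('disable_functions');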
TL;DR:
I have a PHP page which executes a shell script containing impdp, which imports a dump into a new schema.
PHP file:
echo shell_exec("./DumpCreator.sh 22");
DumpCreator.sh
#!/bin/bash
echo $1
impdp U_$1/Pass DIRECTORY=dmpdir DUMPFILE=MYDMP.DMP remap_schema=PARENT:U_$1
It echoes 22, but impdp doesn't execute, although all permissions are given to a single user (admin).
Full
I have a PHP page which creates a shell script file and overwrites its contents as follows:
$shellFile = fopen("myfile.sh" , "w");
$field = "1";
$command = "#!/bin/bash\n"
."echo $field\n"
."sqlplus system/pass as sysdba << SQLEND\n"
."create user U_$field identified by newpass;\n"
."grant dba to U_$field;\n"
."exit;\n"
."SQLEND\n";
fwrite($shellFile, $command);
$output = shell_exec("bash myfile.sh");
echo $output;
fclose($shellFile);
contents of .sh file
#!/bin/bash
echo 1
sqlplus system/pass sysdba << SQLEND
create user U_1 identified by pass;
grant dba to U_1;
exit;
SQLEND
My problem is that the sqlplus part isn't executing.
So what is wrong with this? Thanks in advance.
UPDATE
When I execute the .sh file itself, everything runs fine (the user is added and granted).
UPDATE 2
I tried doing the above using PHP OCI and it ran successfully.
Now the problem is that once the user has been granted its permissions, I need to import a dump into it using a script, which I will need to execute from PHP.
My new .sh file
#!/bin/bash
echo $1
impdp U_$1/pass DIRECTORY=DATA_PUMP_DIR DUMPFILE=something.DMP remap_schema=something:U_$1
Even if I remove $1, it doesn't execute this part, and I don't think it requires sudo or su to root. So what am I doing wrong? Also, what permissions could be missing in the process?
Update 3
Executing the script directly from the terminal using the 'admin' account (the one Oracle is installed under) works, and getting the current user in PHP shows that it is 'admin'.
So the problem comes down to: how can I execute anything other than basic OS commands (echo, ls, etc.) from my PHP page?
So after reading up on permissions, I found that it's possible to execute anything (root or non-root commands) by editing the sudoers file, which allows PHP to execute any command; as far as I can tell, that is a very poor solution.
Ref : How to call shell script from php that requires SUDO?
Make sure you have the required environment variables set.
In particular you'll probably have to set LD_LIBRARY_PATH to the location of the shared libraries that come with your Oracle installation.
The PHP code is probably hiding the error messages related to this.
Compare the environment where you normally run SQL*Plus or IMP before and after running oraenv; you will need to set at least a few of those variables (and probably most, if not all).
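A minimal sketch of setting those variables before the call (the ORACLE_HOME path is an assumption; take the real values from your oraenv output), with stderr redirected so the error messages are no longer hidden:
<?php
// Recreate the Oracle environment the web server is missing, then run the script.
putenv('ORACLE_HOME=/u01/app/oracle/product/19.0.0/dbhome_1'); // assumed path
putenv('LD_LIBRARY_PATH=' . getenv('ORACLE_HOME') . '/lib');
putenv('PATH=' . getenv('ORACLE_HOME') . '/bin:' . getenv('PATH'));

// 2>&1 makes impdp's error output visible in the returned string.
echo shell_exec('./DumpCreator.sh 22 2>&1');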
I need to run a Linux command from PHP, so I used the ftp_exec() function.
$command='ls -al> /ftp_test/t.log';
if (ftp_exec($ftp_conn,$command))
{
echo "$command executed successfully.";
}
else
{
echo "Execution of $command failed.";
}
But it gives me a warning:
Warning: ftp_exec(): Unknown SITE command
I have googled and found this about ftp_exec: "execution via FTP isn't very widely supported. Check that it works on the servers that you intend to connect to before you start coding something that requires this."
Can anybody give me an idea of how to run a Linux command from PHP?
If you have the appropriate authorization you may do so via SSH:
$file_list = shell_exec('ssh user@site "ls -la"');
You'll need user to have an authorized SSH key for site, and that user must be accessible from whatever user is running PHP. This usually boils down to using the wwwrun user for both.
Or you can use sudo for added security, by placing the command into a script of its own, then sudoing it:
$file_list = shell_exec('sudo /usr/local/bin/ssh-ls-site');
Now the wwwrun user can be allowed to run ssh-ls-site but can't modify its contents, so it can't run arbitrary commands, nor does it have access to the SSH authorization key.
The ssh-ls-site script can log the request as well as update a local marker file, exiting immediately if the file is newer than a certain guard time. This will prevent possible DoS attacks against the site (running lots of allowed commands and exhausting resources) and also improve performance; if, for example, you need to run the command often, you can save the results into a temporary file. Then, if this file exists and is not too old, you just read back its contents instead of asking the remote site again, effectively caching the command locally.
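A minimal sketch of that caching idea on the PHP side (the cache path and guard time are assumptions):
<?php
// Reuse a recent result instead of hitting the remote site on every request.
$cache  = '/tmp/ssh-ls-site.cache'; // assumed location
$maxAge = 60;                       // guard time in seconds; pick what suits you

if (is_file($cache) && (time() - filemtime($cache)) < $maxAge) {
    $file_list = file_get_contents($cache);   // fresh enough: reuse it
} else {
    $file_list = shell_exec('sudo /usr/local/bin/ssh-ls-site');
    file_put_contents($cache, $file_list);    // refresh the cache
}
echo $file_list;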
I have a collection of bash and Perl scripts to
develop the directory structure desired for deployment on a Linux box
(optionally) export code from svn
build a package from this source
This works well from the terminal. Now my client requests a web interface to this process.
e.g "Create New Package" button on certain page will invoke above steps one by one and return the output to user as script echos, not when the whole scripts executes.
Is it possible to send instantaneous output from bash script to webpage or php script which has invoked it through program execution functions (system, exec, passthru ... or any thing else that suite this process flow) ?
What is elegant why to do this ?
What security precautions I should take while doing such thing (if possible)?
Edit
After some searching I have found part of the solution, but it is still not working:
$cmd = 'cat ./password.txt|sudo -S ./setup.sh ';
$descriptorspec = array(
0 => array("pipe", "r"), // stdin is a pipe that the child will read from
1 => array("pipe", "w"), // stdout is a pipe that the child will write to
2 => array("pipe", "w") // stderr is a pipe that the child will read from
);
flush();
$process = proc_open($cmd, $descriptorspec, $pipes, './', array());
echo "<pre>";
if (is_resource($process)) {
while ($s = fgets($pipes[1])) {
print "Message:".$s;
flush();
}
while ($s = fgets($pipes[2])) {
print "Error:".$s;
flush();
}
}
echo "</pre>";
output: (webpage)
Error:WARNING: Improper use of the sudo command could lead to data loss
Error:or the deletion of important system files. Please double-check your
Error:typing when using sudo. Type "man sudo" for more information.
Error:
Error:To proceed, enter your password, or type Ctrl-C to abort.
Error:
Error:Password:
Error:Sorry, try again.
Error:Password:
Error:Sorry, try again.
Error:Password:
Error:Sorry, try again.
Error:sudo: 3 incorrect password attempts
The first issue I am having now is passing the sudo password.
Help please!
I would use a kind of master/slave design. The slave would be your Perl/bash script, just doing a job (packaging, compiling, exporting code, and so on) and feeding a log entry.
The master would be your PHP process. So the principle is the following: the master and the slave share a communication channel, and communicate asynchronously through that channel.
You could imagine a database like:
create table tasks ( id INT primary key, status INT, argument VARCHAR(100));
Your PHP page should switch on the user's choice and filter the input:
switch ($_GET['action']) {
case 'export':
$argument = sanitize($_GET['arg']);
add_task('export', $argument);
break;
case '...':
// ...
}
and the add_task function could be something like:
function add_task($action, $arg)
{
return $db->add('tasks', array($action, NEW_TASK, $arg));
}
The slaves could be run via a cron job; they query the database and feed back the progress of the task.
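A rough sketch of such a slave (the status codes, the PDO connection details, the build-package.sh script name and the log column are assumptions on top of the table above):
<?php
// Cron-driven slave: pick up one pending task, run it, store the result.
define('NEW_TASK', 0);
define('RUNNING', 1);
define('DONE', 2);

$db = new PDO('mysql:host=localhost;dbname=deploy', 'worker', 'secret'); // assumed credentials

$task = $db->query('SELECT id, argument FROM tasks WHERE status = ' . NEW_TASK . ' LIMIT 1')
           ->fetch(PDO::FETCH_ASSOC);
if ($task) {
    $db->exec('UPDATE tasks SET status = ' . RUNNING . ' WHERE id = ' . (int)$task['id']);
    // Run the packaging script and keep its output as the task log.
    $log  = shell_exec('./build-package.sh ' . escapeshellarg($task['argument']) . ' 2>&1');
    $stmt = $db->prepare('UPDATE tasks SET status = ?, log = ? WHERE id = ?');
    $stmt->execute(array(DONE, $log, $task['id']));
}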
The pros are:
independent systems, which eases evolution.
if a client gets disconnected, the job is never lost
easier to secure
The cons are:
a little more complicated at the beginning.
less reactive, because of the polling time of the slaves running (for instance if they run every 5 minutes)
less output than the direct output of the command
Notice that you can then implement XML-RPC-like triggers to run the slaves, rather than using a message-passing system.
The simplest approach is to use shell_exec for your purpose. It executes a command via the shell and returns the complete output as a string, so you can display it on your website.
If this doesn't suit your purpose because, say, you want some output while waiting for the command to finish, check out the other program execution functions available in PHP (hint: there are a few good comments on the manual pages).
Keep in mind that when invoking command-line scripts this way, generated files will have the owner, group and permissions of your webserver user (e.g. wwwrun or similar). If parts of your deployment need a different owner, group and/or file permissions, you have to set them manually, either in your scripts or after invoking shell_exec (chmod, chown and chgrp can deal with this in PHP).
About security:
A lot of web-based applications put that kind of functionality into a separate installation directory and kindly ask you to remove this directory after installing. I even remember some of them nagging admins quite persistently until it is removed. This is an easy way of preventing the script from being invoked by the wrong hands at the wrong time. If your application might need it after installation, you have to put it into an area that only authorized users have access to (e.g. an admin area or something similar).
You can use the Comet web application model to do real time updating of the results page.
For the sudo problem, as a quick and dirty solution I'd use a restricted set of commands that the web server can execute without a password. Example (add to /etc/sudoers):
apache ALL=(ALL) NOPASSWD: /sbin/service myproject *
which would allow apache to run /sbin/service myproject stop, /sbin/service myproject start and so on.
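The PHP side would then just be a sketch like:
<?php
// With the sudoers rule above, no password prompt is needed for these commands.
echo shell_exec('sudo /sbin/service myproject start 2>&1');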
Take a look at Jenkins; it does the web part nicely, and you'll only have to add your scripts to the build.
A better solution would be, as suggested by Aif, to separate the logic. One daemon process waiting for tasks and a web application visualizing the results.
For security, always use escapeshellarg and escapeshellcmd when executing system commands via PHP. It would also be advisable to confine the user to a chroot'd directory that is as limited as possible.
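A minimal sketch of what that looks like in practice ($_GET['package'] and build-package.sh are stand-ins for whatever input and script you actually use):
<?php
// Escape user-supplied values before they ever reach the shell.
$userArg = isset($_GET['package']) ? $_GET['package'] : 'default';
$output  = shell_exec('./build-package.sh ' . escapeshellarg($userArg) . ' 2>&1');
echo '<pre>' . htmlspecialchars($output) . '</pre>';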
You seem to have solved the issue of getting stdout and stderr output onto your webpage by using proc_open. Now the problem looks to be executing the script via sudo.
From a security perspective, having a PHP app run a script as root (via sudo) makes me cringe. Having the root password in password.txt is a pretty huge security hole.
What I would do (if possible) is make whatever changes are necessary so that setup.sh can be run by the Unix user that is running Apache. (I'm assuming you're running PHP with Apache.) If setup.sh is going to be executed by the webserver, the webserver should be able to run it without resorting to sudo.
If you really need to execute the script as a different user, you may check into suPHP, which is designed to run PHP scripts as a non-standard user.
Provide automated sudo rights to a specific user using the NOPASSWD option of /etc/sudoers, then run the command with the prefix sudo -u THE_SUDO_USER to have the command execute as that user. This avoids the security hole of giving the entire Apache user sudo rights, while still allowing sudo to be executed on the script from within Apache.
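As a sketch, assuming THE_SUDO_USER has a NOPASSWD entry for setup.sh:
<?php
// Run the setup script as the dedicated account rather than as the web server user.
echo shell_exec('sudo -u THE_SUDO_USER ./setup.sh 2>&1');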