I have a simple script that creates a folder for every user who signs up. It works perfectly fine locally, but after pushing to the live server it does not work... I discovered that it requires a sudo password. I googled around, and everyone suggested I edit the sudoers file by running visudo, but nothing worked.
$sudoPassword = \Config::get('app.sudoPassword');
$sudo = "echo $sudoPassword | sudo -S ";
$createDir = "cd ../../merchants && mkdir ".$merchant->identity." && cd ".$merchant->identity;
echo exec($createDir);
My question is: how can I make exec() run as root? I'm using CentOS on DigitalOcean.
I don't recommend running the script with sudo. A better alternative is to give the web server user (www-data) permission to write to the ../../merchants directory.
First add yourself to the www-data group by running the command below.
usermod -a -G www-data <your-username>
Then give ownership of the merchants folder to the www-data group using the command below.
chgrp www-data merchants
Lastly give the correct permissions to the merchants folder using the command below.
chmod g+rwxs merchants
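If you want to confirm that the setgid bit from g+rwxs actually took effect, you can inspect the directory's mode. A small sketch using a scratch directory (so it can be tried anywhere; the real target would be the merchants folder):

```shell
# Create a scratch directory standing in for "merchants".
dir=$(mktemp -d)
chmod g+rwxs "$dir"

# An "s" in the group triad means the setgid bit is set, so new files and
# subdirectories will inherit this directory's group.
perms=$(stat -c '%A' "$dir")
echo "$perms"

rmdir "$dir"
```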
Hope this helps you solve your issue.
EDIT
On CentOS, you should use apache instead of www-data.
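For reference, the same three steps on CentOS would look like this, with apache substituted for www-data (run as root):

```shell
# Add yourself to the apache group, hand the folder to that group,
# and set group rwx plus the setgid bit so new entries inherit the group.
usermod -a -G apache <your-username>
chgrp apache merchants
chmod g+rwxs merchants
```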
This is not a direct answer to your question, just a suggested improvement to your code. Instead of using sudo, I would stick with @tamrat's solution. This is meant as a comment but is too long for one.
Instead of using PHP's exec function, I would recommend using Symfony's Process component, which is already available in your Laravel project.
use Symfony\Component\Process\Process;
use Symfony\Component\Process\Exception\ProcessFailedException;
Use:
// Note: on Symfony 4.2+ Process expects an array; there you would use
// Process::fromShellCommandline() for a shell string like this.
$createDir = "mkdir /full/path/to/merchants/".$merchant->identity;

$process = new Process($createDir);
$process->run();

if (!$process->isSuccessful()) {
    throw new ProcessFailedException($process);
}
I also suggest that you use full paths rather than relative paths (if possible for your use case), so you don't perform actions in another directory when you reuse the code from another file. If that's not possible, make sure you are always in the correct directory by validating properly.
Related
I am trying to convert files from doc to pdf on my AWS Linux server. On my previous machine it worked fine, but now I am having issues on my new machine. The only differences are that I have upgraded from PHP 7.0 to PHP 7.2, and the LibreOffice version:
LibreOffice 6.0.1.1 00m0(Build:1)
I have tried giving root permissions to LibreOffice and to the package that executes the command, but with no success.
This is the package I am using: https://github.com/szonov/libreoffice-converter
I am using it with Laravel 5.4. This is the code in the package that performs the action:
$replacement = array(
    'bin' => $this->bin,
    'convert_to' => $convert_to,
    'outdir' => $outdir,
    'source' => $this->source
);

$cmd = $this->makeCommand($replacement);
$cmd = "sudo " . $cmd;
shell_exec($cmd);

$result = false;
if (file_exists($outfile)) {
    $this->mkdir(dirname($this->destination));
    $result = rename($outfile, $this->destination);
}

// remove temporary sub directory
rmdir($outdir);

return $result;
I have tried prepending sudo since, when I dd the command and execute it with sudo, it works on the command line.
So you either need to use something like the following:
Execute root commands via PHP
Or you should enable login for the www-data user:
sudo usermod -s /bin/bash www-data
Then log in as that user and fix all the permission issues so you can run the command:
sudo su www-data
Once you have done this, make sure to disable login again using the command below:
sudo usermod -s /bin/nologin www-data
Once the issue is sorted out in the user's terminal, you can run the commands without issue under Apache as well.
You have to change the ownership of the directory where you're storing the files. Please try the command below on your AWS Ubuntu machine.
sudo chown -R www-data:www-data /var/www/my-project/
This issue has only 2 scenarios:
The user doesn't have permission.
The problem is in the code.
I'm working on the assumption that Laravel is working well for you except for this part.
This means that you have access to the storage files, so why don't you save there?
If you can save there, compare the folders and your problem is solved.
If you cannot save there, then the issue is in the code, which I doubt, since you stated that everything was working well previously.
I have a Python3 Pyro4 server/client app that works great when run from the command line.
server.py
import Pyro4

@Pyro4.expose
class JokeGen(object):
    def __init__(self):
        self.jokevar = "Joke"

    def joke(self, name):
        return "Sorry " + name + ", I don't know any jokes."

def main():
    Pyro4.Daemon.serveSimple(
        {
            JokeGen: "example.jokegen"
        },
        ns=True)

if __name__ == "__main__":
    main()
client.py
#!/usr/bin/env python3
import Pyro4
import sys
person_to_joke = sys.argv[1]
joke_control = Pyro4.Proxy("PYRONAME:example.jokegen")
print (joke_control.joke(person_to_joke))
The problem is I need to run the client from a web app using PHP.
I have created a joke.php
<?php
$command = escapeshellcmd('/full/path/to/client.py SquirrelMaster');
$output = shell_exec($command);
echo $output;
?>
While this code does actually work, I did some non-standard hacking to make it work. I took a copy of my /home/user/.local (where the Pyro4 modules have been installed for user), placed it in /var/www/, and gave ownership to www-data.
sudo chown -R www-data.www-data /var/www/.local
It seems like there must be a better way to do this, and I'm pretty sure there will potentially be issues in the future if I leave things this way. The issue seems to be that the Pyro4 modules need to be available to the www-data user. So my question is: what is the proper way to make Pyro4 modules available to the www-data user on Ubuntu Linux running apache2?
EDIT - ADDITION
I also tried doing the following:
sudo mkdir /var/www/.local
sudo mkdir /var/www/.cache
sudo chown www-data.www-data /var/www/.local
sudo chown www-data.www-data /var/www/.cache
Then run the command:
sudo -H -u www-data pip3 install pyro4 --user www-data
But this results in the error "Could not find a version that satisfies the requirement www-data (from versions: )
No matching distribution found for www-data"
Looks a bit like this question https://superuser.com/questions/646062/granting-write-permissions-to-www-data-group
I wanted to suggest using the PYTHONPATH environment variable to point to a library install location readable by the www-data user, where you'd copy the Python modules it needs to access, but I think this is considered bad form nowadays.
Probably best is to create a Python Virtualenv that is accessible for the www-data user and install all required modules into that, using the pip command from that virtualenv. You may have to use some sudo/chown dance to get this right still.
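A minimal sketch of that virtualenv route, with /var/www/pyro-env as a hypothetical location (any path readable by www-data works):

```shell
# Create a virtualenv owned by www-data and install Pyro4 into it.
sudo python3 -m venv /var/www/pyro-env
sudo /var/www/pyro-env/bin/pip install pyro4
sudo chown -R www-data:www-data /var/www/pyro-env
```

joke.php would then call /var/www/pyro-env/bin/python /full/path/to/client.py so the interpreter sees the virtualenv's packages, with no changes to www-data's home directory.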
Another way perhaps is to not bother with calling a python subprocess at all, but use Pyro's HTTP gateway. That way you can simply do a HTTP request from PHP to a locally running Pyro http gateway process, which will translate it into a proper Pyro call. I don't know PHP but it seems to me that it should be easy to make a custom http request to a server running on some localhost port. This may be faster as well because you're not starting up python processes for every call.
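As a rough sketch of that approach (the gateway module ships with Pyro4; the port and URL here just follow the gateway's defaults and this question's object name, and the gateway's exposed-names setting must permit example.*):

```shell
# Start Pyro4's HTTP gateway (this blocks; run it in the background or under
# a supervisor in practice, with the name server already running).
python3 -m Pyro4.utils.httpgateway &

# What joke.php would do through an HTTP client, shown here with curl:
# the gateway translates /pyro/<object>/<method>?arg=value into a Pyro call.
curl 'http://localhost:8080/pyro/example.jokegen/joke?name=SquirrelMaster'
```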
(edit): another successfully working solution seemed to be the following, where sudo is used to invoke pip as the appropriate user, letting it install the library into www-data's .local library folder:
create /var/www/.local and /var/www/.cache folders, giving www-data permissons to these folders only (and not /var/www to avoid security issues)
invoke sudo -H -u www-data pip3 install pyro4
You may still need to add --user to the pip command if it's an older version, because I think only recent pip versions install to the user's lib folder by default instead of to the global system Python's lib folder. Note that --user takes no argument; passing www-data after it is what caused the "No matching distribution found for www-data" error above.
I've been at this for two days now and haven't been able to find any way (good or bad) of getting this to work.
I have to be able to dynamically mount drives over the network from my website's pages (that part is unavoidable).
I have no problems doing it directly on the console with the following command
mount -t cifs //IP-REMOTE-MACHINE/Folder -o username=username,password=password /mnt/share
Obviously, just doing a shell_exec() of this command wouldn't work without root rights.
I tried to shell_exec() a script in which I would switch to the root user (via su or sudo mycommand), but neither worked; I've never managed to write a script that automatically switches my user to root, even with the root password hard-coded (and even though that feels like an extremely bad idea, I could have accepted it at this point).
After that I tried to use pmount, but never found a way to access a remote shared folder (I don't think it's even possible, but I may have missed something here?)
All that is running on a Debian machine with apache2.
I have a wild idea...
You could set up a cron job running as root that checks for mount commands from your script. The script would simply queue a mount command to be processed; when the cron job gets to it, it runs the mount, marks the command as processed, and writes to a log file which you could then display.
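That queue could be sketched like this; scratch files are used here so the shape is clear (a real setup would use fixed paths, and the processing loop would run from root's crontab):

```shell
# PHP side: append the desired mount command to a queue file.
queue=$(mktemp)
log=$(mktemp)
echo 'mount -t cifs //IP-REMOTE-MACHINE/Folder /mnt/share' > "$queue"

# Cron side (would run as root): process each queued command, log it,
# then clear the queue to mark everything as processed.
while IFS= read -r cmd; do
    echo "processed: $cmd" >> "$log"   # a real cron job would run "$cmd" here
done < "$queue"
: > "$queue"

result=$(cat "$log")
echo "$result"
rm -f "$queue" "$log"
```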
It's not safe to run sudo commands with www-data (the user for web servers in Debian).
But if you want to run sudo [command] in a PHP script, you must add the www-data user to the sudoers file: http://www.pendrivelinux.com/how-to-add-a-user-to-the-sudoers-list/
And then you can exec: sudo mount ...
EDIT: It's safer to add in visudo:
www-data ALL= NOPASSWD: /bin/mount
To allow www-data to use only sudo /bin/mount
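With that entry in place, you can check from a root shell which sudo rules apply to www-data before wiring anything into PHP (a sketch; sudo -l -U lists another user's allowed commands):

```shell
# As root: list the sudo rules that apply to www-data.
# The output should include a NOPASSWD entry for /bin/mount.
sudo -l -U www-data
```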
So I want to execute the following command in my php script:
exec("/path/to/command");
Because it is the www-data user who runs PHP scripts, I currently cannot run this command.
I have read something about suexec being able to run a command as if it was a different user. I find it rather difficult to understand how this works.
I have already installed suexec and edited the /etc/apache2/suexec/www-data file and added:
/home/user_to_run_command/script.php
I have also edited /etc/apache2/sites-enabled/000-default and added:
SuexecUserGroup user_to_run_command user_to_run_command
Am I missing anything?
suEXEC will work only when PHP is executed in CGI mode, not when PHP is running as an apache2 module. I guess you are running it as a module.
An alternative might be to transfer ownership to the desired user and then set the suid bit:
chown desired_user your.program
chmod u+s your.program
Now when executing your.program it has permissions as if it were executed by its owner. Follow the wiki article that I've linked for more information.
Side note: this will work with binaries only (not with shell scripts, as they are executed by the shell binary, which has no suid bit set).
I had the same problem and finally found a solution which, as far as I can see, is both safe and simple. A disadvantage of this method is that you have to take care of security updates when they are published.
What we are going to do is make our own special shell, which we chown and set the SUID bit on, for the user we want to perform the task. To remain safe, this user should be just an ordinary user without extensive system rights, and the script should be placed somewhere others are not allowed. Now we let PHP execute a script which uses this special shell, and all commands within this script will be executed as the chosen user.
In practice:
sudo mkdir /usr/lib/specialshell
sudo chown user_who_may_run_command:root /usr/lib/specialshell
sudo chmod 700 /usr/lib/specialshell
sudo cp /bin/perl specialperl
sudo chown user_to_run_command:usergroup_to_run_command specialperl
sudo chmod u+s specialperl
sudo mv specialperl /usr/lib/specialshell
Now we make a script named command.script containing:
#!/usr/lib/specialshell/specialperl
$ENV{"PATH"} = "/usr/bin";
system("/path/to/command");
and from php code we use:
exec("/path/to/command.script");
et voila, no code change, just the name of the command in PHP.
edit: this works only with Perl as the shell, so I changed bash to perl and put the shell somewhere safe
I have Subversion installed on CentOS 6.4 and want to write a script (from my understanding, a shell script) to run a couple of commands. My issue here is not writing the shell script but providing a parameter to the shell script (so, a function of sorts) to be able to complete the request.
In essence I want to do the following:
Run script with parameter from SSH ("somescript reponame")
Create repo: svnadmin create /var/www/svn/reponame
Change repo owner: chown -R apache:apache /var/www/svn/reponame
Do security changes: chcon -R -t httpd_sys_content_t /var/www/svn/reponame/
And chcon -R -t httpd_sys_rw_content_t /var/www/svn/reponame
Create default directories: svn import -m 'Initial import' /tmp/svn-structure-template/ http://domain.com/svn/reponame/ (localhost is not accepted by stackoverflow)
Can anyone offer some guidance, or perhaps provide an alternative I can use? Would a PHP script work (so as to run it from a browser with a query string of some sort), and would this not cause security issues, since apache is the default owner and some of these commands may require root/sudo access?
Thank you in advance!
As Fausto said in the comment, standard Bash parameters should work fine. At ProjectLocker, we use scripts similar to what you're describing to provision new Subversion repositories, and you should just be able to reference "$1", "$2", and so on in the script.
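The parameter plumbing described above can be sketched like this; the script name is a throwaway stand-in, and it only echoes the commands from the question rather than running them, so it is safe to try anywhere:

```shell
# Write a throwaway script that takes the repo name as $1.
script=$(mktemp)
cat > "$script" <<'EOF'
#!/bin/sh
repo="$1"
echo "would run: svnadmin create /var/www/svn/$repo"
echo "would run: chown -R apache:apache /var/www/svn/$repo"
EOF
chmod +x "$script"

# Invoke it with the repo name as the first argument, as in the question.
out=$("$script" reponame)
echo "$out"
rm -f "$script"
```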
Incidentally, you don't have to import to the http:// location if you're running on the machine with the instance, if that makes things harder. You can do:
svn import -m 'Initial import' /tmp/svn-structure-template/ file:///var/www/svn/reponame
although I'd recommend testing that first to make sure it doesn't cause an undesired permissions change. If it does, you can simply run it before the apache permission flip and the lockdown.