I am trying to convert files from DOC to PDF on my AWS Linux server. On my previous machine it worked fine, but I am having issues on my new one. The only differences are that I upgraded from PHP 7.0 to PHP 7.2 and the LibreOffice version:
LibreOffice 6.0.1.1 00m0(Build:1)
I have tried giving root permissions to LibreOffice and to the package that executes the command, but with no success.
This is the package I am using: https://github.com/szonov/libreoffice-converter
I am using it with Laravel 5.4. This is the code in the package that performs the conversion:
$replacement = array(
    'bin' => $this->bin,
    'convert_to' => $convert_to,
    'outdir' => $outdir,
    'source' => $this->source
);
$cmd = $this->makeCommand($replacement);
$cmd = "sudo " . $cmd;
shell_exec($cmd);
$result = false;
if (file_exists($outfile)) {
    $this->mkdir(dirname($this->destination));
    $result = rename($outfile, $this->destination);
}
// remove temporary sub directory
rmdir($outdir);
return $result;
I have tried prepending sudo, since when I dump the command and execute it with sudo on the command line, it works.
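One thing worth checking while debugging this: shell_exec() only returns stdout, so any error LibreOffice prints to stderr is silently lost. A minimal sketch of capturing both streams (the command below is a hypothetical stand-in for the one the package builds):

```php
<?php
// Append "2>&1" so stderr is captured too -- shell_exec() alone
// drops it, which hides the real LibreOffice error message.
// This command is a hypothetical stand-in, not the package's exact one.
$cmd = 'soffice --headless --convert-to pdf /tmp/source.doc';
$full = $cmd . ' 2>&1';
$output = shell_exec($full);
echo $output === null ? "no output captured\n" : $output;
```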
So you either need to use something like the following:
Execute root commands via PHP
Or you should enable login for the www-data user:
sudo usermod -s /bin/bash www-data
Then log in as that user and fix all the permission issues so you can run the command:
sudo su www-data
Once you have done this, make sure to disable the login again:
sudo usermod -s /bin/nologin www-data
Once the issue is sorted out in that user's terminal, you can run the commands without issue under Apache as well.
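Before debugging as www-data, it can help to confirm which user PHP actually runs shell commands as, since it varies by distro and server setup. A quick sanity check:

```php
<?php
// Confirm which user actually executes shell commands from PHP
// (commonly www-data under Apache on Debian/Ubuntu, apache on
// CentOS, but it depends on the server configuration).
$user = trim((string) shell_exec('whoami'));
echo "PHP shell commands run as: {$user}\n";
```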
You have to change the ownership of the directory where you are storing the files. Please try the command below on your AWS Ubuntu machine:
sudo chown -R www-data:www-data /var/www/my-project/
This issue has only two scenarios:
The user doesn't have permission.
The problem is in the code.
I'm working on the assumption that Laravel is working well for you except for this part.
That means you have access to the storage files, so why don't you save there?
If you can save there, compare the folders and your problem is solved.
If you cannot save there, then the issue is in the code, which I doubt, since you stated that everything worked previously.
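The two scenarios can be separated with a small check: if the web user cannot write to the output directory, the problem is permissions, not the code. A sketch (the directory path here is a hypothetical example, not the converter's actual path):

```php
<?php
// If this prints "not writable" when run as the web server user,
// the problem is permissions rather than the conversion code.
// The path is a hypothetical example.
$outdir = sys_get_temp_dir() . '/converter-output';

if (!is_dir($outdir)) {
    mkdir($outdir, 0775, true);
}

echo is_writable($outdir) ? "writable\n" : "not writable\n";
```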
Related
I have a simple script that creates a folder for every user who signs up. It works perfectly fine locally, but after pushing to the live server it does not work... I discovered that it requires the sudo password. I googled around and all the suggestions said I should edit the sudoers file by running visudo, but nothing worked.
$sudoPassword = \Config::get('app.sudoPassword');
$sudo = "echo $sudoPassword | sudo -S ";
$createDir = "cd ../../merchants && mkdir ".$merchant->identity." && cd ".$merchant->identity;
echo exec($createDir);
My question is: how should I make exec run as root? I'm using CentOS on DigitalOcean.
I don't recommend running the script with sudo. A better alternative is to give the web server user (www-data) permission to write to the ../../merchants directory.
First add yourself to the www-data group by running the command below.
usermod -a -G www-data <your-username>
Then give the ownership of the merchants folder to the www-data group using the command below.
chgrp www-data merchants
Lastly give the correct permissions to the merchants folder using the command below.
chmod g+rwxs merchants
Hope this helps you solve your issue.
EDIT
On CentOS, you should use apache instead of www-data.
This is not a direct answer to your question, just a suggested improvement to your code. Instead of using sudo, I would stick with #tamrat's solution. This is meant as a comment but is too long to be one.
Instead of PHP's exec function, I recommend using Symfony's Process component, which is already available in your Laravel project.
use Symfony\Component\Process\Process;
use Symfony\Component\Process\Exception\ProcessFailedException;
Use:
$createDir = "mkdir /full/path/to/merchants/".$merchant->identity." && cd /full/path/to/merchants/".$merchant->identity;
$process = new Process($createDir);
$process->run();
if (!$process->isSuccessful()) {
    throw new ProcessFailedException($process);
}
I also suggest using full paths rather than relative paths (if possible for your use case), so you don't perform actions in the wrong directory when you reuse the code from another file. If that is not possible, make sure you are always in the correct path by validating it properly.
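The full-path advice can be sketched without shelling out at all, since PHP's native mkdir() handles this directly. The base directory and identity value below are hypothetical examples:

```php
<?php
// Build the target from a fixed base directory instead of relying on
// the process's working directory. Base path and identity are
// hypothetical examples.
$base = sys_get_temp_dir() . '/merchants';
$identity = 'demo-merchant';

// basename() strips any directory components, so an identity like
// "../evil" cannot escape the base directory.
$target = $base . '/' . basename($identity);

if (!is_dir($target)) {
    mkdir($target, 0775, true); // recursive, like mkdir -p
}

echo is_dir($target) ? "created\n" : "failed\n";
```

Using mkdir() also sidesteps the sudo question entirely, provided the web server user can write to the base directory.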
I am running Ubuntu 16.04 and I am programming in PHP. Here is my problem:
I need to use the wget Linux command to download all files from a website, so I am using PHP's shell_exec() function to execute the command:
$wgetCommand = "wget --recursive --page-requisites --convert-links $url 2>&1";
$wgetCommandOutput = shell_exec($wgetCommand);
This command downloads all files and folders from a website, recursively, to the current working directory (unless otherwise specified). The problem is that when these files and folders are downloaded, I do not have permissions to read them programmatically after that.
But if I do something like sudo chmod 777 -R path/to/downloaded/website/directory after they are downloaded with the wget command and before they are read programmatically, they are read just fine and everything works.
So I need a way to download folders and files from a website using the wget command so that they are readable by all users, not just root.
How can I achieve that?
This sounds like a umask issue with the user running the PHP script.
Normally, Ubuntu has a default umask of 0002, which creates files as -rw-rw-r--.
From the console, you can check the umask for the PHP user via:
$ umask
And inside a PHP script, you can read or set it with:
<?php
umask(0002); // sets the umask and returns the previous value
If this is on a running web server, however, it would be better to alter the permissions of the downloaded files afterwards, via:
<?php
chmod('path/to/downloaded/file', 0644);
The reason is that the umask affects the creation of all files, not just those from your script.
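The umask behaviour above can be demonstrated in a few lines: with a umask of 0002, a file created with the default mode 0666 ends up as 0664 (-rw-rw-r--).

```php
<?php
// Demonstrates the umask effect: a newly created file gets
// mode 0666 & ~0002 = 0664 when the umask is 0002.
$previous = umask(0002);

$path = sys_get_temp_dir() . '/umask-demo.txt';
@unlink($path);
touch($path); // new file, created as 0666 & ~umask

clearstatcache();
$perms = fileperms($path) & 0777;
printf("permissions: %o\n", $perms); // permissions: 664

umask($previous); // restore the old umask
unlink($path);
```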
I am trying to build a PHP wrapper for Power BI. I installed PowerBI Cli (https://github.com/Microsoft/PowerBI-Cli) on my local machine, and when I run any PowerBI Cli command in my terminal it works well. It even works when I run the commands as the _www user (sudo -u _www powerbi config).
However, when I run them through PHP using either shell_exec or Symfony's Process component (https://symfony.com/doc/current/components/process.html), I get the following exception:
env: node: No such file or directory.
I am facing this issue on macOS Sierra. The commands work well on Linux using PHP's exec().
Try creating a symlink:
ln -s /path/where/command/is/stored /path/where/you/want/to/exec
Sometimes the program is stored in /usr/local/bin/program, while by default you are executing /usr/bin/program.
Then use the new path in your shell call.
For example, if the command lives at /usr/bin/powerbi, the command above can link it to a new path such as /usr/powerbi; after that, you can use the new path in exec or shell_exec.
Try using the full path rather than the bare command. Without knowing your exact path I can't tell you exactly what to do, but it would be something like this:
$output = shell_exec("sudo -u _www /path/path/powerbi config");
var_dump($output);
Edit:
Or change directories first. Using my example above, it would be:
$output = shell_exec("cd /path/path/powerbi; sudo -u _www powerbi config");
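Another angle on the "env: node" error, offered as an assumption rather than a confirmed diagnosis: the web server's PATH may simply lack the directory containing node (often /usr/local/bin on macOS). Prepending it before the call is one way to test that:

```php
<?php
// Assumption: node lives in /usr/local/bin, which may be missing
// from the web server's PATH. Prepend it before invoking the CLI.
$extraPath = '/usr/local/bin'; // hypothetical node location
putenv('PATH=' . $extraPath . ':' . getenv('PATH'));

// Child processes started by shell_exec() inherit this environment.
$path = trim((string) shell_exec('echo $PATH'));
echo $path . "\n";
```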
I thought I'd throw together a quick and dirty script on our server (Ubuntu 16.04) that gives me some plain-text output from Python.
I want to call the script like this from PHP (I know there should be some escaping done, but it's just a test currently):
<?php
$command = '/usr/local/bin/script.py';
$output = shell_exec($command);
echo $output;
?>
This is script.py, owned by www-data with mode 774:
#!/usr/bin/python
import CoolProp.CoolProp as CP
import argparse
print('Hallo Welt')
If I comment out the CoolProp import, it works. But somehow the package cannot be reached by www-data, and so the script returns nothing.
As you see I want to use the Package CoolProp.
So I tried installing it with pip install CoolProp => that works for my local user, but not when the script is called as user www-data.
After that, I tried installing it with a target (--target=/usr/local/lib/site-packages/), but that did not help.
I also tried to change the ACL on the complete site-packages/ directory to rwx for www-data, but that does not work either.
In the end: What is the simplest way to pip install a package that can be used by all users including www-data?
I recommend that you first try the solution xotihcan posted, as that is the easy way to make most Python modules available to all users, including www-data. However, it doesn't work for every Python module. If it doesn't work for you, or you just want to install modules for the www-data user only, then use the following commands:
sudo mkdir /var/www/.local
sudo mkdir /var/www/.cache
sudo chown www-data.www-data /var/www/.local
sudo chown www-data.www-data /var/www/.cache
sudo -H -u www-data pip install CoolProp
I had this same issue trying to make the Python pyro4 module available for the www-data user. There is another way to do it, but it involves even dirtier hackery. For more details, check out my question and answer at: How do I properly call a Python Pyro client using PHP and Apache web server?
Run pip as the root user.
That should fix the problem.
I'm trying to get an image from the Raspberry Pi camera via a PHP script.
PHP5, Apache2 and all the necessary stuff are installed.
Snippet: /var/www/img.php
if (isset($_GET['pic']))
    system("sudo raspistill -w 512 -h 320 -o /var/www/img/img.jpg");
When I run the command directly in the terminal it works, but the PHP script does not. With sudo php /var/www/img.php?pic I get an error:
Could not read input file: /var/www/img.php
At first I thought it was a problem with the permissions, but it isn't working even with root privileges.
Does anybody have an idea? I'm really frustrated...
Thanks a lot!
Solution
First, it's necessary to change the owner of the Apache directory:
sudo chown www-data:www-data -R /var/www
After that, it's no longer necessary to prepend sudo:
exec('raspistill ...');
It also works with popen, system, ...
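One hedged follow-up to this solution: if the capture command ever takes user input (for example, image dimensions from the query string), it should be escaped before being passed to exec(). raspistill itself only exists on the Pi, so this sketch just builds and prints the command; the "w" parameter name is a hypothetical example.

```php
<?php
// Escape user-supplied values with escapeshellarg() before they
// reach the shell. The "w" query parameter is hypothetical.
$width = isset($_GET['w']) ? $_GET['w'] : '512';
$cmd = 'raspistill -w ' . escapeshellarg($width)
     . ' -h 320 -o /var/www/img/img.jpg';
echo $cmd . "\n"; // raspistill -w '512' -h 320 -o /var/www/img/img.jpg
```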