I have a PHP application that runs in Docker. I would like to send a command from PHP code to the container that should create files (some_dir/certs/cert.crt etc.). I run this command like this (via system/exec/shell_exec or symfony/process):
system("traefik-certs-dumper file --source acme.json --dest some_dir --version v2");
When PHP runs this code, the directory is created but no files appear in it, and I don't get any error.
The command works when I run it from a terminal via docker exec, but not from PHP. This is probably a permission problem between PHP and the Docker container, but I don't know how to fix it.
I tried setting this in my Dockerfile, but it doesn't work:
RUN chmod 777 /go/bin/traefik-certs-dumper
RUN usermod -u 1000 www-data
Also, standard commands like these do work:
system("mkdir -p some_dir_1234");
system("touch some_dir_1234/some_file_1234");
How can I allow an installed library to create files?
I finally found a solution. I had a process supervisor running in a separate container, which could see this file because it was also mounted into the main application directory. What had to be done was to mount the file into both the main container and the supervisor container.
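For illustration, a docker-compose sketch of that fix might look like this (service names, images, and paths are assumptions, not taken from the actual setup):

```yaml
services:
  app:
    image: my-php-app                  # hypothetical main application image
    volumes:
      - ./acme.json:/app/acme.json     # mount the file into the main container
  supervisor:
    image: my-supervisor               # hypothetical supervisor image
    volumes:
      - ./acme.json:/app/acme.json     # ...and into the supervisor container too
```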
Try
exec("traefik-certs-dumper file --source acme.json --dest some_dir --version v2");
instead of system()
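Whichever function you use, it helps to capture stderr and the exit code so a failure is not silent; in PHP, exec("cmd 2>&1", $output, $code) collects both. A minimal shell sketch of the same redirection idea, using a deliberately failing command:

```shell
# Redirect stderr into stdout and keep the exit code; a command that fails
# silently under PHP will usually reveal its error message this way.
out=$(ls /nonexistent-dir-for-demo 2>&1)
rc=$?
echo "exit code: $rc"
echo "output: $out"
```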
Related
Hope someone can shed some light on this one, it's driving me nuts. I have a backup web app written in PHP; it builds an rsync command which is run against a remote server and stores the files in a local folder. This works using PHP's exec function.
We are migrating this into a Laravel 8 app, and the identical call, with the same rsync command and the same local storage directory, fails. The error is a 255.
If I run the same rsync command from the command line, it works, just as it does in our original app.
A bit more...
The Laravel instance uses Horizon to perform this, so I have the call in the handle() method of a job class in the Jobs folder.
Of note, I have a dev version of this Laravel app, and there it works correctly and syncs with the remote folder.
My rsync command uses an id_rsa file for the Apache user, www-data (the app is not exposed to the outside world). The folders have the correct permissions and owners (www-data, 755 for directories and 644 for files).
An example (modified) of the rsync command:
rsync -rLDvcs -e "ssh -p 22 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i /var/www/.ssh/id_rsa" --no-perms --no-group --no-owner --progress --exclude-from '/var/www/backup/exclude.txt' --delete --backup --backup-dir=/hdd/mnt/incs/deleted_files/the-project/ theproject#188.102.210.41:/home/theproject/files/ /hdd/mnt/incs/theproject/files
the code which calls this is:
$aResult['res'] = exec($rsync . ' 2>&1', $aResult['output'], $aResult['return']);
When run, $aResult['return'] has a value of 255 and the process has failed. The docs seem to say this is a failure to connect, yet the identical call in another app on the same machine works (as does rsync from the command line).
Any ideas? The PHP version on this box (for both the original app and the Laravel upgrade) is PHP 8.0, and my dev copy uses PHP 7.4. Both my dev machine and this internal box run Ubuntu 20.04 and Apache 2, so apart from the PHP version they are pretty much identical.
No error is in any logs.
thanks
Craig
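One thing worth checking with a 255 from rsync-over-ssh under a queue worker: the daemon's environment usually differs from an interactive shell, and HOME in particular decides where ssh looks for id_rsa and known_hosts. A sketch of reproducing a stripped-down environment from the shell (the HOME value here is just an example):

```shell
# env -i starts from an empty environment, much like a daemonised worker may
# see; only the variables passed explicitly are set.
env -i HOME=/var/www /bin/sh -c 'echo "HOME=$HOME"'   # prints HOME=/var/www
```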
I have a binary whose path is set in my .bashrc file, and I am able to execute it from the command line. I copied the command that runs the binary into a bash file (test.sh).
I am trying to execute this test.sh file through PHP using:
<?php
shell_exec("test.sh");
?>
This fails with "command not found".
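One likely reason, worth checking alongside the answers below: ~/.bashrc is only read by interactive bash shells, and the shell PHP spawns is non-interactive, so PATH additions made there are invisible to it. A quick sketch showing that command lookup depends entirely on PATH:

```shell
# ~/.bashrc PATH additions do not reach non-interactive shells, so the same
# command can resolve in a terminal but not from PHP.
PATH=/usr/bin:/bin command -v ls                        # resolves
PATH=/nonexistent command -v ls || echo "not found"     # fails to resolve
```

Using an absolute path sidesteps this, e.g. shell_exec('/full/path/to/test.sh'), where the path is a placeholder for your actual location. Note also that the current directory is normally not on PATH, so test.sh must be invoked as ./test.sh or by absolute path.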
Maybe your webserver configuration forbids execution of system commands.
Check the "disable_functions" directive in your php.ini. If you see something like the line below, this is the reason:
disable_functions=exec,passthru,shell_exec,system
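A quick way to check is to grep the loaded php.ini (php --ini prints its location). As a stand-in, here is the pattern applied to a simulated line:

```shell
# Simulated php.ini content; on a real system, grep the file that
# `php --ini` reports as loaded.
echo 'disable_functions=exec,passthru,shell_exec,system' | grep 'disable_functions'
```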
I'm trying to execute a docker command in that runApp.sh file.
Remember, your script executes INSIDE the docker container. It doesn't have access to the host system.
If you want to execute docker commands inside a docker container, you need docker-in-docker.
I am running Ubuntu 16.04 and I am programming in PHP. Here is my problem:
I have a need to use the wget linux command to download all files from a website. So I am using PHP's shell_exec() function to execute the command:
$wgetCommand = "wget --recursive --page-requisites --convert-links $url 2>&1";
$wgetCommandOutput = shell_exec($wgetCommand);
This command downloads all files and folders from a website, recursively, to the current working directory (unless otherwise specified). The problem is that when these files and folders are downloaded, I do not have permissions to read them programmatically after that.
But if I do something like sudo chmod 777 -R path/to/downloaded/website/directory after they are downloaded with the wget command and before they are read programmatically, they are read just fine and everything works.
So I need a way to download folders and files from a website using the wget command such that they are readable by all users, without needing sudo.
How can I achieve that?
This sounds like a umask issue for the user running the PHP script.
Normally, Ubuntu has a default umask of 0002, which creates files as -rw-rw-r--.
From the console you can check (and set) the umask for the PHP user via:
$ umask
And inside the PHP script, you can check it with:
<?php
echo umask();
If you are on a running webserver, however, it would be better to alter the permissions of the downloaded files afterwards, via
<?php
chmod($filename, 0644); // $filename stands for each downloaded file
The reason is that the umask affects creation of all files, not just those from your script.
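A self-contained sketch of how the umask at creation time decides the resulting file mode (each new file gets 666 minus the umask):

```shell
dir=$(mktemp -d)    # scratch directory so nothing existing is touched
cd "$dir"
umask 0002
touch group_writable     # created as -rw-rw-r-- (664)
umask 0022
touch world_readable     # created as -rw-r--r-- (644)
ls -l group_writable world_readable
```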
I am trying to build a PHP wrapper for PowerBI. I installed the PowerBI Cli (https://github.com/Microsoft/PowerBI-Cli) locally, and when I run any PowerBI Cli command from my terminal it works well. It even works when I run the commands as the _www user (sudo -u _www powerbi config).
However, when I run them through PHP using either shell_exec or Symfony's Process component (https://symfony.com/doc/current/components/process.html), I get the following exception:
env: node: No such file or directory.
I am facing this issue on macOS Sierra. The commands work fine on Linux using PHP's exec().
Try linking:
ln -s /path/where/command/is/stored /path/where/you/want/to/execute
Sometimes the program is stored in /usr/local/bin/program, while by default you are executing /usr/bin/program.
Then use the new path in your shell call.
For example, if the command lives at /usr/bin/powerbi, the command above lets you link it to a new path, and you can then use that path in exec() or shell_exec().
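A self-contained sketch of the linking idea, using ls as a stand-in for the powerbi binary (the paths here are created on the fly, not your real install locations):

```shell
dir=$(mktemp -d)
# Link the real binary to a new, predictable location...
ln -s "$(command -v ls)" "$dir/mytool"
# ...then invoke it through the new path, as shell_exec() would.
"$dir/mytool" -d / && echo "link works"
```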
Try using the full path rather than the bare command name. Without knowing your exact path I can't tell you exactly what to do, but it would be something like this:
$output = shell_exec("sudo -u _www /path/path/powerbi config");
var_dump($output);
Edit:
Or, change directories first. Using my example above, it would be:
$output = shell_exec("cd /path/path/powerbi; sudo -u _www powerbi config");
I'm trying to set up Phing to work with travis-ci, but I can't get it to run a setup script that installs all the dependencies.
My .travis.yml file is:
language: php
php:
  - 5.2
script: ./.travis-phing.sh
In travis, I get the error:
/home/travis/build.sh: line 105: ./.travis-phing.sh: Permission denied
What is causing that?
Solved
The script needed to be made executable. I used:
chmod a+x .travis-phing.sh
Then simply commit and push back to GitHub.
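The effect can be reproduced locally with a stand-in script (file name kept the same for illustration):

```shell
dir=$(mktemp -d)
cd "$dir"
printf '#!/bin/sh\necho build step ran\n' > .travis-phing.sh
# Without the execute bit, the shell refuses to run it, as Travis did:
./.travis-phing.sh 2>/dev/null || echo "permission denied"
chmod a+x .travis-phing.sh
./.travis-phing.sh          # prints: build step ran
```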
Run the script using bash
Another option is to run the script using bash; this avoids the need to modify the file's permissions.
bash path/to/file.sh
Alternatively:
sh path/to/file.sh
Note that in this case you're not executing the script itself; you're executing bash or sh, which then reads and runs the script. Therefore the script does not need to be executable.
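A quick illustration that only the interpreter needs execute permission, not the script file:

```shell
dir=$(mktemp -d)
cd "$dir"
printf 'echo ran via sh\n' > file.sh
chmod 600 file.sh   # deliberately no execute bit
sh file.sh          # works anyway; sh reads the file and prints: ran via sh
```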
Make sense?
I've found this solution incredibly useful myself. I mainly run node & npm projects on travis-ci; those builds use the npm test command, which you can configure to run anything.
In order to modify file permissions I need to use sudo chmod ... on my local machine, but you can't always use sudo on travis-ci.
sh file.sh allows me to run my tests both locally and on travis-ci without having to manually update permissions.