I have encapsulated pdftk (PDF toolkit) as a program that can be run inside a Fedora Docker image. This lets me run pdftk on a newer AWS AMI that does not ship pdftk. When I replace pdftk in our PHP program with the docker run command, it works great when running the PHP program from the command line, but when running it as a cron job or as a mailbox command, docker will not run. I have made the cron and mailbox users part of the docker group and checked permissions and paths, but still no joy.
The PHP exec command looks like:
$command = 'docker run -i -t --privileged -v /var/www/mbx/forms/inbox:/workdir -w /workdir/ brnlab/aultman-pdftk ./'.basename($files[$i]).' burst output ./pg_%03d.pdf';
Oddly, the return from $command when running this as a cron or mailbox command is blank, so I am not sure whether there is an error or not. It does not appear to hang. Any ideas on the best way to diagnose the issue further?
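As a first diagnostic step, a minimal sketch, assuming $command is built exactly as above: fold stderr into stdout and capture exec()'s output array and exit code, so the cron/mailbox run leaves something to inspect even without a terminal.
// $command built as shown above; 2>&1 makes Docker's error messages visible to exec().
exec($command . ' 2>&1', $output, $exitCode);
// Write what happened somewhere cron can't lose it.
error_log('docker/pdftk exit code ' . $exitCode . ': ' . implode("\n", $output));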
Hope someone can shed some light on this one, it's driving me nuts. I have a backup web app written in PHP; it runs an rsync against a remote server and stores the files in a local folder. This function works using the PHP exec function.
We are migrating this into a Laravel 8 app, and the identical call to the same rsync and the same local storage directory fails. The error is a 255.
If I use the same rsync command from the command line, it works the same as it does in our original app.
A bit more...
The Laravel instance is using Horizon to perform this, so I have this in the handle() function of a process file in the jobs folder.
Of note, I have a dev version of this Laravel app, and with it the sync to the remote folder works correctly.
My rsync command is using an id_rsa file for the Apache user, www-data (the app is not available to the outside world). The folders have the correct permissions and owners (www-data, with 755 for directories and 644 for files).
An example (modified version) of the rsync command:
rsync -rLDvcs -e "ssh -p 22 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i /var/www/.ssh/id_rsa" --no-perms --no-group --no-owner --progress --exclude-from '/var/www/backup/exclude.txt' --delete --backup --backup-dir=/hdd/mnt/incs/deleted_files/the-project/ theproject@188.102.210.41:/home/theproject/files/ /hdd/mnt/incs/theproject/files
The code which calls this is:
$aResult['res'] = exec($rsync . ' 2>&1', $aResult['output'], $aResult['return']);
When run, $aResult['return'] has a value of 255 and the process has failed. The docs seem to say this is a failure to connect, yet the identical call in another app on the same machine works (as it does if rsync is used on the command line).
Any ideas? The PHP version on this box (for both the original app and the Laravel upgrade) is PHP 8.0, and my dev copy uses PHP 7.4. My dev box and this internal box both use Ubuntu 20.04 and Apache 2, so besides the PHP version they are pretty much identical.
No error appears in any log.
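For reference, a minimal sketch of the kind of Horizon job handle() described above, with the output logged so the 255 comes with some context; the helper that builds the rsync string is hypothetical.
public function handle()
{
    $rsync = $this->buildRsyncCommand(); // hypothetical helper returning the rsync command shown above
    $aResult = [];
    $aResult['res'] = exec($rsync . ' 2>&1', $aResult['output'], $aResult['return']);

    // ssh/rsync usually explain a 255 on stderr, so keep everything in the Laravel log.
    \Log::info('rsync exit code: ' . $aResult['return']);
    \Log::info(implode("\n", $aResult['output']));
}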
thanks
Craig
In PHP running on Ubuntu, I can run exec('npm -v') and the output is good,
but I can't run exec('gitbook xxxx').
gitbook is an npm package I installed with
npm install gitbook -g
I can run gitbook xxxx in the Ubuntu terminal, so how can I run it from my PHP code?
If you run PHP under nginx or Apache (for example, when visiting example.com/index.php), sometimes you need to export the PATH:
exec("export PATH=/usr/local/bin && gitbook build");
After I added the export PATH, everything worked fine.
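A variant that keeps the web server's existing PATH and just appends the npm global bin directory; /usr/local/bin is an assumption, so check the real location with "which gitbook" first.
// Append the npm global bin dir instead of replacing PATH outright.
// Single quotes so the shell, not PHP, expands $PATH; /usr/local/bin is an assumption.
exec('export PATH=$PATH:/usr/local/bin && gitbook build 2>&1', $output, $return);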
I tried it once like this on a UNIX-based OS:
You can run shell commands via the exec() function:
// in a PHP file, execute the Node script as a shell command
exec("node yourscript.js &", $output);
The output here becomes an array of the lines of output; if you also echo the process ID, you can kill the process by its PID later:
exec("kill " . $processid);
This is how I did it. Apart from this, you can use node supervisor. Hope it will help you. Also try yours with the node command.
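A sketch of the background-run pattern above showing where $processid can come from: echo the shell's $! after backgrounding, so the PID lands in $output (the script name is just a placeholder).
// Start the Node script in the background, discard its output,
// and echo $! so the PID of the backgrounded process becomes the first line of $output.
exec('node yourscript.js > /dev/null 2>&1 & echo $!', $output);
$processid = (int) $output[0]; // now usable with the kill call above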
Windows 10 just released the Anniversary Update today. Now you can use Ubuntu-flavored bash commands from the Linux subsystem.
The question is: How to execute Windows10's Bash command from PHP?
I tried
<?php
exec('bash',$out1,$result1);
exec('ls -l',$out2,$result2);
var_dump($out1);
var_dump($result1);
var_dump($out2);
var_dump($result2);
It doesn't work. All $out are empty arrays, and both $results are 1.
Any idea?
Just found out that I can run a web server directly from the subsystem.
e.g.
$ sudo apt-get install apache2
$ sudo service apache2 start
Then put all the web content inside the subsystem's directory located at %localappdata%\Lxss\rootfs
At this point I can execute bash script however I want.
You cannot. Ubuntu on Windows is implemented as a separate subsystem directly below the Windows kernel, so it runs separately from normal Windows processes and cannot interact with them (at least that is how I understand it). But maybe you just have to use C:\Windows\System32\bash.exe as the command.
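A minimal sketch of that last idea from PHP; the path to bash.exe is the one mentioned above and "ls -l" is just a placeholder command.
// Invoke the WSL bash directly with -c, since there is no interactive terminal here.
exec('C:\\Windows\\System32\\bash.exe -c "ls -l" 2>&1', $out, $result);
var_dump($out, $result);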
Bash is not meant to be used to run a server, but you can run PHP-CLI in bash without issue.
Instead of running the PHP script in Windows and then accessing bash it would be easier to create a command to run the PHP script directly in bash.
So you would run: bash.exe -c "php path/to/php-script.php"
https://blogs.msdn.microsoft.com/commandline/2016/10/19/interop-between-windows-and-bash/
I'm working with a multimedia router with Linux Linaro as the OS. I'm having problems when I try to do network monitoring with iptraf. If I use iptraf in the shell, it works perfectly. The problem appears when I try to run the same command through a PHP script.
This is the line:
shell_exec("iptraf -i wlan0 -L ../../../mnt/web0/mantenimientoV2/log2.txt");
As I said, this command works perfectly in the shell and creates the log, but if I execute it with PHP, it doesn't create the log. I need this log; what can I do?
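One way to see what is going wrong, as a sketch: shell_exec() only returns stdout, so switching to exec() with stderr folded in at least exposes iptraf's exit status and any error text.
// exec() (unlike shell_exec) also reports the exit status; 2>&1 catches iptraf's complaints.
exec("iptraf -i wlan0 -L ../../../mnt/web0/mantenimientoV2/log2.txt 2>&1", $output, $status);
error_log("iptraf exited with $status: " . implode("\n", $output));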
I have a PHP script where some system commands run fine and others do not. The commands that are not running can be copied and pasted into the shell and run just fine.
System: OSX 10.9.2 (everything is updated).
I have tried many different approaches, like the following.
backticks, exec(), shell_exec(), system(), passthru()
This command works fine.
exec("drush si -y --db-url=mysql://user:pass#localhost:3306/dbname");
But these commands do not run.
exec("drush sql-sync #remote.staging #dev.anme -y");
exec("git ls-remote --heads git#github.com:blablaname/name.git");
The commands that do not run can be copied and pasted into the shell and run great. I have made sure the script is being run in the proper directory using the getcwd() function.
If you call a PHP program containing exec() from a web browser, it executes as the www user, and the www user may not have the privileges to connect/sync to the remote host. That's why it works on localhost and fails against the remote host.
So one solution is:
1) Save the command as a bash script.
2) Set the setuid bit (the owner can be root or a user with sufficient privileges).
3) Execute that bash script via exec() so that it runs as the privileged user.
4) You should IP-restrict your program, since setuid is dangerous.
setuid
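A rough sketch of those steps with hypothetical paths; the wrapper name, location and owner are assumptions, and this is only an outline of the suggestion above.
// 1) Save the failing command in a wrapper script, e.g. /usr/local/bin/sync-wrapper.sh:
//      #!/bin/bash
//      drush sql-sync @remote.staging @dev.anme -y
// 2) Give the script a privileged owner and set the setuid bit:
//      chown someuser /usr/local/bin/sync-wrapper.sh && chmod u+s /usr/local/bin/sync-wrapper.sh
//    (be aware that many kernels ignore the setuid bit on interpreted scripts, so test this carefully)
// 3) Call the wrapper from PHP and keep the output for debugging:
exec('/usr/local/bin/sync-wrapper.sh 2>&1', $output, $return);
// 4) IP-restrict the page that triggers this, since setuid is dangerous.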