CentOS 5.9, PHP 5.4.21, Tomcat 7.0.42, Safe mode is off.
I need to commit some code to a repository from PHP, but it fails with exit code 128.
The code is below; $this->command_exec is 'cd /data/project && git add .'.
ob_start();
// Note: system()'s second argument receives the command's exit status (128 here), not its output
$this->system_call_detail = system($this->command_exec, $this->output);
$logger->debug('ExecuteCMD system call result : '.$this->system_call_detail);
ob_end_clean();
I can run the git command as the apache user account from the command line, but the program run from PHP fails with exit code 128.
I guessed the cause was PHP itself, so I tried the same git command from the command line as the apache user account with "php -r 'system("cd /data/project && git add .", $test); echo $test;'" and it succeeded.
I solved this problem.
First, I couldn't get any error output when running the command, because of some PHP bug(?):
http://kr1.php.net/manual/en/function.system.php#108713
I fixed the command as below:
$this->system_call_detail = system($this->command_exec.' 2>&1', $this->output);
and then I could get these error messages:
fatal: unable to access '/home/{username}/.config/git/config': Permission denied
But there was no such file or directory.
So I googled the error message and found a blog post:
http://www.jethrocarr.com/2013/08/25/the-apache-that-wanted-to-be-root/
I edited the '/etc/sysconfig/httpd' file as the blog post describes.
And solved. :)
Thanks for helping me, deceze and ojrask.
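For reference, roughly the same fix can be sketched from the PHP side by pointing the spawned git process at a HOME directory the apache user can read (the /var/www path below is only an assumption):
// Sketch only: give the child git process a readable HOME,
// instead of (or in addition to) editing /etc/sysconfig/httpd.
putenv('HOME=/var/www'); // assumption: any directory readable by the apache user
$detail = system('cd /data/project && git add . 2>&1', $exit_code);
// $exit_code receives the command's exit status; $detail holds the last line of output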
Related
I have encountered a strange problem. I have created a custom GitLab webhook. The webhook service is written in PHP (gitlab.php). The PHP script, when called with the proper payload, should start an automated deployment. The deployment script is basically a shell script (deploy.sh).
Code of gitlab.php
<?php
// Read the raw webhook payload
$json = file_get_contents('php://input');
// Convert it into a PHP array
$data = json_decode($json, true);
// Extract the X-Gitlab-Token header sent by GitLab
$secret = "";
foreach (getallheaders() as $name => $value) {
    if ($name == 'X-Gitlab-Token') {
        $secret = $value;
    }
}
// Only deploy on push events carrying the expected token
if ($secret == "mysecretcode" && $data['object_kind'] == "push") {
    file_put_contents("/opt/lampp/htdocs/autodeploy/autodeploy.log", "Webhook auto deployment started at ". date("Y-m-d H:i:s") . "\n". shell_exec("sh /opt/lampp/htdocs/autodeploy/deploy.sh")."\n**********************************************\n\n", FILE_APPEND);
    die("Deployment success");
} else {
    die("Auth token invalid or no triggerable event found");
}
?>
Code of deploy.sh
#!/bin/bash
cd "/opt/lampp/htdocs/nodejs/myapp"
echo "Stopping application..."
forever stop index.js
cd ..
sudo rm -rf myapp
echo "Pulling latest code from gitlab..."
git clone -b "master" https://myusername%40domain.com:MyEncodedPass123@gitlab.com/myusername/myapp.git
cd myapp
echo "Installing dependencies..."
npm install
echo "Starting application..."
forever start index.js
echo "Application started on port 3000."
When I push code to GitLab, it triggers the webhook and initiates the deployment process. I see my webhook file gitlab.php return a success response to GitLab and write the following output to the autodeploy.log file:
Webhook auto deployment started at 2021-10-03 13:32:07
Stopping application...
Pulling latest code from gitlab...
Installing dependencies...
Starting application...
Application started on port 3000.
**********************************************
But the deployment actually never happens. It just runs the echo statements (and maybe cd, rm, etc. as well), and the rest of the shell commands are seemingly ignored or not executed. It definitely doesn't run the git clone command, because I do not see the latest code getting refreshed on my server from GitLab. Not sure, though, what happens to the npm and forever commands.
By the way, this is an Ubuntu VPS server with the latest ApacheFriends XAMPP installed and PHP v8. All three files mentioned above reside at the same path, i.e. /opt/lampp/htdocs/autodeploy, and my deploy.sh script tries to deploy a separate Node.js application in the /opt/lampp/htdocs/nodejs/myapp folder. Strangely enough, when I run deploy.sh directly from the terminal it deploys the latest code successfully, which means all the statements in the script get executed as expected. It only fails when executed from the PHP script using the shell_exec() function.
Any clue what could be the reason?
The documentation for shell_exec doesn't state it clearly, but chances are you only get the stdout of your command, not its stderr. The first comment certainly seems to indicate that, and it would also match the "it's the same thing as backtick" description.
If such is the case, you simply don't have the error messages in your output.
Try running:
shell_exec("sh .../deploy.sh 2>&1")
I also second @phd's suggestion (in the comments to your question): turn set -e on; at least your script will halt when you get an error.
Chances are your current script would fail at the first line : cd "/opt/lampp/htdocs/nodejs/myapp".
I also strongly advise dropping the sudo in sudo rm -rf ....
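A minimal sketch of the stderr-capture part on the PHP side, reusing the paths already in gitlab.php:
// Sketch: merge stderr into stdout so errors from deploy.sh end up in the log
$result = shell_exec("sh /opt/lampp/htdocs/autodeploy/deploy.sh 2>&1");
file_put_contents("/opt/lampp/htdocs/autodeploy/autodeploy.log", $result . "\n", FILE_APPEND);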
I have written a simple PHP script to help me update site contents when a commit is pushed to Bitbucket. I have the following problem with it.
<?php
$repo_dir = '/var/www/vhosts/my_full_path';
$output = shell_exec('cd '.$repo_dir.' && hg --config auth.rc.prefix=https://bitbucket.org/XXXXX --config auth.rc.username=my_username --config auth.rc.password=my_pass pull -u https://bitbucket.org/XXXXXXX &');
echo $output;
?>
When I run it from a web browser it doesn't work. The output of the script is:
pulling from https://bitbucket.org/XXXXXXXXXXXXXX
but when I execute it from the console on the server it works like a charm:
php myscript.php
generates following output:
pulling from https://bitbucket.org/XXXX
searching for changes
adding changesets
adding manifests
adding file changes
added 2 changesets with 2 changes to 1 files
1 files updated, 0 files merged, 0 files removed, 0 files unresolved
See, the output is full and correct! In the console I'm using the root user; in the web browser it's www-data. Is there any difference in this case?
I have found the solution. I hope it helps someone.
There were two problems:
Permissions to my repo dir
Authentication for user www-data for this repo
The problem occurred because the web browser doesn't show the warnings and abort messages produced while shell_exec runs the command. If you want to test your script, you have to log in to the console over SSH (as root, for example) and then execute the script/command as the Apache user:
sudo -u www-data php /path-to-your-script/script.php
In the console you will see all the problems that this user generates.
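To surface those same problems in the browser as well, one option (an addition of mine, following the 2>&1 trick used elsewhere on this page) is to capture stderr in the PHP script:
// Sketch: capture hg's warnings and abort messages along with its stdout
$output = shell_exec('cd ' . escapeshellarg($repo_dir) . ' && hg pull -u 2>&1');
echo '<pre>' . htmlspecialchars((string) $output) . '</pre>';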
I'm currently trying to incorporate Gradle into a web-based Java IDE, so that a user could build their files.
I've tested the command to do it using PuTTY and it yielded the exact results I expected (creating a series of directories and subdirectories).
$command = 'cd /home/neema/sites/javaIDE.local/branches/develop/javaIDE/app/java/User1/test1 2>&1';
$command2 = '/home/neema/sites/javaIDE.local/branches/develop/javaIDE/app/gradle/bin/gradle init --type java-library 2>&1';
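Chained into a single shell_exec call (each shell_exec call gets its own shell, so the cd only affects the gradle command if both are joined in one invocation), roughly:
$response = shell_exec($command . ' && ' . $command2);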
However, when I run the above commands with shell_exec in PHP (console.logging the response), I get this error:
FAILURE: Build failed with an exception.
What went wrong:
Could not create service of type InitScriptHandler using BuildScopeServices.createInitScriptHandler().
Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
I ran the stacktrace and this is the result, http://pastebin.com/YaHyWNR6.
I can't for the life of me understand why it won't work.
I've given the folder 'test1' all permissions (777) and now I've resorted to asking for help :)
Thanks in advance to anyone with insight.
I'm in trouble and quite confused about a PHP shell_exec command.
When the command is executed by PHP I get no error, but the execution fails. If I use exactly the same command from a terminal it works.
Here's the command:
/usr/bin/wkhtmltopdf --lowquality --dpi 300 --encoding utf-8 "/tmp/knplabs_snappyxa9otq.html" "/tmp/knplabs_snappyv3pD7h.pdf"
When I launch this from a terminal:
$ /usr/bin/wkhtmltopdf --lowquality --dpi 300 --encoding utf-8 "/tmp/knplabs_snappyWG9XTd.html" "/tmp/knplabs_snappyv3pD7h.pdf"
Loading page (1/2)
Printing pages (2/2)
Done
But from my PHP script:
// Construct the previous command
$command = $this->buildCommand($url, $path);
../..
shell_exec($command);
../..
$content = file_get_contents($path);
../..
I've tested the output of shell_exec; it's empty.
The log:
Warning: file_get_contents(/tmp/knplabs_snappyv3pD7h.pdf): failed to open stream: No such file or directory in /*****/lib/snappy/SnappyMedia.class.php on line 64
No permission problem in the /tmp directory:
$ ls -la /tmp
total 448
drwxrwxrwt 16 root root 4096 mars 12 21:51 .
../..
I've tried with the PHP exec() function to get error information; I just get a "1" error code in return_var and nothing in output.
For information, this issue appears on my test server and my desktop computer, but not on my notebook. All three have the same PHP, Apache, and MySQL versions.
I don't understand anything...
Thanks for any help, I'm losing my mind.
David.
I've found the solution here: Executing wkhtmltopdf from PHP fails
Thanks to Krzychu.
First, to get information from the shell_exec command, add " 2>&1" at the end of the command. That way you will get the error output back from the command:
$no_output = shell_exec($command);
echo $no_output; // nothing
$output = shell_exec($command . ' 2>&1');
echo $output; // in my case : "cannot connect to X server"
The solution:
Don't use the wkhtmltopdf Ubuntu package (0.9.9-4).
Use the official package from the Wkhtmltopdf download page.
So there is no need to install xvfb! (I've seen that advice many times.)
Looks like a user's permissions issue.
When you run the command from the terminal, it is the currently logged-in user account that has the right permissions to run a command in /usr/bin and execute the specific file.
When you run it from the PHP script, it is the HTTP server account on your system that needs permission to execute the file in /usr/bin. Usually this is the apache user.
How you should set up permissions depends on your system. Just remember that what is allowed for apache is allowed for anyone accessing your HTTP server.
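A quick sketch to confirm which account PHP actually runs as and whether it can reach the binary (the path is the one from the question):
// Sketch: print the web server's user and the binary's permissions, stderr included
echo '<pre>';
echo shell_exec('whoami 2>&1');
echo shell_exec('ls -l /usr/bin/wkhtmltopdf 2>&1');
echo '</pre>';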
I have had this problem for ages and adding . ' 2>&1' after the $command has somehow solved the problem.
this:
$output = shell_exec($command . ' 2>&1');
instead of:
$output = shell_exec($command);
No idea why but it works and I'm grateful.
Is it shared hosting? It seems like shell_exec is a restricted function. Try running error_reporting(E_ALL); ini_set('display_errors', 1); before calling shell_exec.
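A small sketch of that check (the var_dump is my addition; on hosts where shell_exec is disabled, a warning should now be visible):
// Sketch: surface warnings, e.g. shell_exec() being disabled on shared hosting
error_reporting(E_ALL);
ini_set('display_errors', '1');
$output = shell_exec($command . ' 2>&1');
var_dump($output); // NULL suggests the command produced nothing or the call was blocked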
I stumbled upon the same problem; in my case an absolute path in the exec command like /var/www did not work, and I had to use relative paths from the point where I executed the PHP file.
I also wanted to note that it did not work using shell_exec, but it did work using the normal exec command; I'm not sure what the difference is here.
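For comparison, a minimal sketch of the exec() variant with the same 2>&1 redirection; unlike shell_exec(), exec() also hands back the exit code:
// Sketch: exec() fills $lines with the output lines and $code with the exit status
$lines = [];
exec($command . ' 2>&1', $lines, $code);
echo implode("\n", $lines) . "\nexit code: " . $code . "\n";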
I have been going nuts with this.
I have gnupg installed on my CentOS server and I'm trying to encrypt uploaded files (uploaded via a PHP page). On the server via the command line, it works perfectly. But via the PHP script, it fails with this error:
gpg: /path-to-my-file/my-file: encryption failed: file open error
The user apache (which I think is used to run the exec command) has read/write in the directory of the file.
The file is uploaded fine (I can see it afterward as I removed the deletion of the unencrypted file from my code) and can be deleted correctly via the php site.
The command I run is the following:
/path-to-gpg/gpg --homedir=/path-to-my-home-gnupg/.gnupg -e -r therecipient@email the-unencrypted-file
Any idea how I could tackle this?
Thanks
A few things to check:
Run system("ls " . escapeshellarg($file)) and check the result — is it file not found? Permission denied? That will help you debug.
Run system("whoami") to make sure PHP is running as who you think it is.
Run echo "<pre>ls " . escapeshellarg($file) . "</pre>" then copy+paste the command and run it from the shell to make sure that the path to the file is what you expected it to be.
Also, I believe CentOS runs SELinux by default… If you've got it installed, check the logs (in /var/log/) to see if SELinux is preventing Apache from executing GPG.
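Put together, the checks from the list above might look roughly like this ($file stands for the uploaded file's path in your own code):
// Debug sketch: show who PHP runs as and whether it can see the file, stderr included
echo '<pre>';
system('whoami 2>&1');
system('ls -l ' . escapeshellarg($file) . ' 2>&1');
echo '</pre>';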
Try running the command with actual apache user privileges in verbose mode:
su apache -c '/path-to-gpg/gpg -vv ...'