Why does git clone from PHP work with the CLI but not FPM? - php

I want to execute a git command in my PHP script. This is my script, demo.php:
<?php
exec("ssh -v > log1.txt 2>&1");
exec('git clone git@github.com:xxx/xxx.git > log.txt 2>&1', $out, $ret);
?>
When I execute this:
php demo.php
It clones the target project. However, when I execute it through FPM by typing the URL in the browser, the request goes to nginx, and nginx forwards it to FPM.
type url: localhost:port/demo.php
Now it comes into trouble and the output is as below:
log1.txt
usage: ssh [-1246AaCfGgKkMNnqsTtVvXxYy] [-b bind_address] [-c cipher_spec]
[-D [bind_address:]port] [-E log_file] [-e escape_char]
[-F configfile] [-I pkcs11] [-i identity_file]
[-J [user@]host[:port]] [-L address] [-l login_name] [-m mac_spec]
[-O ctl_cmd] [-o option] [-p port] [-Q query_option] [-R address]
[-S ctl_path] [-W host:port] [-w local_tun[:remote_tun]]
[user@]hostname [command]
So the script is being executed by FPM, and ssh is installed.
However log.txt is as below:
Cloning into '/home/geek/xxx/phpStudy/idl'...
error: cannot run ssh: No such file or directory
fatal: unable to fork
It seems ssh cannot be found when git clone runs from this file.
I can do it with the CLI, and I can execute the ssh command from FPM, but I cannot run ssh via git clone under FPM. This question has tortured me for several days. I will appreciate any advice.

I have solved this problem. The solution is below; I hope it helps others who run into the same issue.
FPM can listen on several ports, and each port has its own pool configuration file. In that pool file there is a directive:
;env[PATH] = /usr/local/bin:/usr/bin:/bin
So FPM's PATH environment variable may differ from the OS's; that is, some commands usable in a shell may not be found under FPM.
Finally, I solved the problem by amending env[PATH].
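Concretely, the fix is to uncomment and, if needed, extend that line in the pool file. The file name and path below are illustrative; on many distributions the default pool file is called www.conf:

```ini
; e.g. /etc/php-fpm.d/www.conf (location varies by distribution)
; Uncomment so processes spawned by exec() inherit a usable PATH:
env[PATH] = /usr/local/bin:/usr/bin:/bin
```

After editing, restart PHP-FPM so the pool picks up the new environment.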

Related

Running NginX and PHP (CGI) on Windows 10 - No input file specified

I want to setup a web dev environment on my Windows 10 PC. On my 2nd hard drive (D:\WebDev) I have the following:
This is how my NginX is configured: http://pastebin.com/raw.php?i=JFSX6hfU
So, this is the contents of my start.bat file:
@ECHO OFF
set PATH=D:\WebDev\php-5.6.16;%PATH%
ECHO Starting...
RunHiddenConsole.exe D:\WebDev\php-5.6.16\php-cgi.exe -b 127.0.0.1:9000
RunHiddenConsole.exe D:\WebDev\php-5.6.16\php-cgi.exe -b 127.0.0.1:9001
RunHiddenConsole.exe D:\WebDev\php-5.6.16\php-cgi.exe -b 127.0.0.1:9002
RunHiddenConsole.exe D:\WebDev\php-5.6.16\php-cgi.exe -b 127.0.0.1:9003
RunHiddenConsole.exe D:\WebDev\php-5.6.16\php-cgi.exe -b 127.0.0.1:9004
RunHiddenConsole.exe D:\WebDev\mariadb-10.1.9\bin\mysqld --defaults-file=D:\WebDev\mariadb-10.1.9\my.ini --standalone --console
cd D:\WebDev\nginx-1.9.7 && START /B nginx.exe && cd ..
This was put together based on https://stackoverflow.com/questions/15819351/can-windows-php-fpm-serve-multiple-simultaneous-requests/33032959#=
When I run the start.bat file, NginX and MariaDB start (I've verified by visiting http://localhost, where I can see index.html being served), and I can connect to the MariaDB server (mysql) with Navicat.
The only thing that isn't working is PHP. I cannot see any php-cgi processes in Task Manager, and when I visit a phpinfo() page I get the error:
No input file specified.
Any idea what might be wrong? Sorry if this question is in the wrong site.
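For reference, a typical nginx setup balancing across those five FastCGI ports would look roughly like this (a sketch only; the asker's actual config is behind the pastebin link, and the root path here is an assumption):

```nginx
upstream php_farm {
    server 127.0.0.1:9000;
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
    server 127.0.0.1:9003;
    server 127.0.0.1:9004;
}
server {
    listen 80;
    root D:/WebDev/www;  # assumption: adjust to the real web root

    location ~ \.php$ {
        fastcgi_pass php_farm;
        # "No input file specified" usually means SCRIPT_FILENAME
        # does not resolve to a real file on disk:
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        include fastcgi_params;
    }
}
```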

Execute qpdf in php shell_execute

I have installed Lampp-x64-5.6.3 on my OpenSuse 13.2 OS. I have built a program which requires executing qpdf, which I installed from the OpenSuse repo itself.
When I run the commands below, I get no response and nothing works at all, whereas I am able to execute other binaries within the /usr/bin/ directory.
$execQuery = "/usr/bin/qpdf --decrypt --stream-data=uncompress --force-version=1.4 ".escapeshellarg('/opt/lampp/htdocs/test/test.pdf')." ". escapeshellarg('/opt/lampp/htdocs/test/temptest.pdf');
shell_exec($execQuery);
#OR
$execQuery = "/usr/bin/qpdf '/opt/lampp/htdocs/test/test.pdf' '/opt/lampp/htdocs/test/temptest.pdf'";
shell_exec($execQuery);
PHP safe_mode is off, shell_exec, exec, system etc are enabled. Still I am unable to run this particular binary (/usr/bin/qpdf).
I get output when I run echo, ls -l, dir, or even skype through PHP's shell_exec function.
The permission for the file is: -rwxr-xr-x 1 root root 85248 Jun 18 10:31 /usr/bin/qpdf
However, I am able to execute the qpdf command via the OS terminal, and it creates the file perfectly.
The directory /opt/lampp/htdocs/test/ is writable by both qpdf and apache/lampp.
I have tried almost all the methods mentioned in various forums, but still can't get this executable to process the file.
Thanks in advance.
UPDATE:
As suggested tried out this one:
$command = "/usr/bin/qpdf --decrypt --stream-data=uncompress --force-version=1.4 ".escapeshellarg('/opt/lampp/htdocs/test/test.pdf')." ". escapeshellarg('/opt/lampp/htdocs/test/temptest.pdf');
shell_exec($command. " > /opt/lampp/htdocs/debug.log 2>&1");
The errors are logged!
......
/opt/lampp/lib/libstdc++.so.6: version `GLIBCXX_3.4.9' not found
......
SOLUTION:
I simply had to delete or rename the /usr/lib/libstdc++.so.6 file.
Run in a terminal:
sudo mv /usr/lib/libstdc++.so.6 /usr/lib/libstdc++.so.6___
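An alternative that avoids touching system libraries (my suggestion, not the poster's solution): LAMPP prepends its own /opt/lampp/lib to LD_LIBRARY_PATH, which is why qpdf picks up the old bundled libstdc++. Clearing that variable for the child process lets qpdf link against the system copy instead. A sketch:

```php
<?php
// Sketch: prefix the command with `env -u LD_LIBRARY_PATH` so qpdf
// resolves libstdc++ from the system, not LAMPP's bundled copy.
$in  = '/opt/lampp/htdocs/test/test.pdf';
$out = '/opt/lampp/htdocs/test/temptest.pdf';
$cmd = 'env -u LD_LIBRARY_PATH /usr/bin/qpdf --decrypt '
     . '--stream-data=uncompress --force-version=1.4 '
     . escapeshellarg($in) . ' ' . escapeshellarg($out);
echo $cmd, "\n";
// shell_exec($cmd . ' 2>&1');  // uncomment to actually run qpdf
```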

raspberry pi camera streaming won't start from php

Lately I've been playing with R-Pi. Now I'm trying to stream with the Raspberry pi b+ and camera.
I have a basic website in php from where I give commands to camera (Start - Stop streaming).
The problem is that when I press Start Streaming, the camera's red LED lights up for a second and then goes off.
When I run the stream from the command line (terminal), it works.
Here's my script :
#!/bin/sh
raspivid -o - -t 0 -n -w 600 -h 400 -fps 12 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264
And here's my php file where do I call the shell script:
$trimite = shell_exec('sudo sh streaming.sh');
Any ideas?
Thanks in advance !
Sounds like this could be caused by a permissions error. Run the command sudo chmod +x streaming.sh.
If that does not fix the problem (assuming you have a web server running Apache), run "sudo a2enmod cgi" in a terminal, then sudo chmod +x /usr/lib/cgi-bin, then restart Apache (sudo service apache2 restart).
Note:
If error "could not write permissions, directory does not exist" occurs, you will need to run command 'sudo mkdir /usr/lib/cgi-bin/' in terminal
Create a cgi script to call bash script:
#!/bin/bash
echo "Content-type: text/html"
echo ""
echo "<html><head><title>Light on"
echo "</title>"
echo "</head><body>"
echo "$(bash /home/pi/streaming.sh)" # this calls the shell script
echo "</body></html>"
Then you must save this file as /usr/lib/cgi-bin/first.cgi and assign it permissions with "sudo chmod +x /usr/lib/cgi-bin/first.cgi" Assuming that your apache server is set up correctly (Various guides can be found about enabling the cgi module correctly, if a2enmod did not work properly), you should now be able to go to the web browser on another LAN machine and browse to http://IPofRPI/cgi-bin/first.cgi.
The script should execute. Congratulations!
If the script does not execute, you can read up about CGI and apache modules online and see what is wrong with your scripts.
I had kind of the same problem a while back; that question can be found here: Here
I hope this helps you, or someone else :)
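One more thing worth checking (my assumption, not part of the answer above): the PHP snippet runs the script through sudo, and the web server user normally cannot supply a password, so the sudo call itself may fail silently. A sudoers drop-in granting passwordless access to just that one script would look like:

```
# /etc/sudoers.d/streaming -- edit with visudo
# Assumption: the web server runs as user www-data
www-data ALL=(ALL) NOPASSWD: /bin/sh /home/pi/streaming.sh
```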

Why does git fail to use the proxy when executed from PHP?

We are calling a bash script from PHP that will do a simple git pull.
When we run this script from terminal using root or the apache user it executes fine.
However, when PHP executes the script using exec, it outputs this error:
error: Failed to connect to XX.XX.XX.XX: Permission denied while accessing https://someuser@bitbucket.org/somecompany/testproject.git/info/refs
XX.XX.XX.XX is the IP address our http proxy resolves to
It also prints out the user and proxy config (as you will see in the bash script below)
PHP:
chdir('/var/www/scripts');
$cmd = './gitBranch.sh 2>&1';
exec($cmd,$currentOutput,$err);
print_r($currentOutput);
BASH:
#!/bin/bash
cd /var/www/gitManagedPackages/testproject
whoami #to verify it's the apache user
git config --get http.proxy #to verify it has the proper proxy setting
git pull
When running the script as the apache user [su -c ./gitBranch.sh -s /bin/sh apache]
apache
http://someproxy.somecompany.net:8181
Already up-to-date.
Why does it fail when running from PHP? It's executing as the apache user and has the correct proxy set.
As it turns out, under SELinux httpd is not allowed to make outgoing network connections by default. The error output actually comes from git's use of curl.
running this fixed it:
setsebool -P httpd_can_network_connect 1
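You can inspect the boolean's current state before flipping it, using the standard SELinux tools (output format is the typical one; it may vary by distribution):

```shell
# Inspect the current value; typically prints: httpd_can_network_connect --> off
getsebool httpd_can_network_connect

# Allow httpd and its child processes to open outbound connections;
# -P persists the change across reboots
sudo setsebool -P httpd_can_network_connect 1
```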

PHP server on local machine?

I'm trying to build a PHP site and I'm wanting to test my PHP files without uploading them to my host. Basically testing them on my own machine before I upload them. How do I do that?
PHP 5.4 and later have a built-in web server these days.
You simply run the command from the terminal:
cd path/to/your/app
php -S 127.0.0.1:8000
Then in your browser go to http://127.0.0.1:8000 and boom, your system should be up and running. (There must be an index.php or index.html file for this to work.)
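To verify the server is up, a one-line index.php is enough (the file name matches the built-in server's default lookup; the message itself is arbitrary):

```php
<?php
// index.php -- minimal page to confirm the built-in server works
$msg = "Hello from PHP " . PHP_VERSION;
echo $msg;
```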
You could also add a simple router script:
<?php
// router.php
if (preg_match('/\.(?:png|jpg|jpeg|gif)$/', $_SERVER["REQUEST_URI"])) {
return false; // serve the requested resource as-is.
} else {
require_once('resolver.php');
}
?>
And then run the command
php -S 127.0.0.1:8000 router.php
References:
https://www.php.net/manual/en/features.commandline.webserver.php
https://www.php.net/manual/en/features.commandline.options.php
Install and run XAMPP: http://www.apachefriends.org/en/xampp.html
This is a simple, sure-fire way to run your PHP server locally:
php -S 0.0.0.0:<PORT_NUMBER>
Where PORT_NUMBER is an integer from 1024 to 49151
Example: php -S 0.0.0.0:8000
Notes:
- If you use localhost rather than 0.0.0.0, you may hit a connection refused error.
- If you want to make the web server accessible to any interface, use 0.0.0.0.
- If a URI request does not specify a file, then either index.php or index.html in the given directory is returned.
Given the following file (router.php)
<?php
// router.php
if (preg_match('/\.(?:png|jpg|jpeg|gif)$/', $_SERVER["REQUEST_URI"])) {
return false; // serve the requested resource as-is.
} else {
echo "<p>Welcome to PHP</p>";
}
?>
Run this ...
php -S 0.0.0.0:8000 router.php
... and navigate in your browser to http://localhost:8000/ and the following will be displayed:
Welcome to PHP
Reference:
Built-in web server
I often use the following command to spin up my PHP Laravel app:
$ php artisan serve --port=8080
or
$ php -S localhost:8080 -t public/
In the above commands:
- artisan is the command-line interface included with Laravel; its serve command calls the built-in PHP server.
To run with the built-in web server:
php -S <addr>:<port> -t <docroot>
Here,
-S : switch to run with the built-in web server.
-t : switch to specify the document root for the built-in web server.
I use WAMP. One easy install wizard, tons of modules for Apache and PHP preconfigured, and easy to toggle on and off to match your remote config.
If you want an all-purpose local development stack for any operating system where you can choose from different PHP, MySQL and Web server versions and are also not afraid of using Docker, you could go for the devilbox.
The devilbox is a modern and highly customisable dockerized PHP stack supporting full LAMP and MEAN and running on all major platforms. The main goal is to easily switch and combine any version required for local development. It supports an unlimited number of projects for which vhosts and DNS records are created automatically. Email catch-all and popular development tools will be at your service as well. Configuration is not necessary, as everything is pre-setup with mass virtual hosting.
Getting it up and running is pretty straightforward:
# Get the devilbox
$ git clone https://github.com/cytopia/devilbox
$ cd devilbox
# Create docker-compose environment file
$ cp env-example .env
# Edit your configuration
$ vim .env
# Start all containers
$ docker-compose up
Links:
Github: https://github.com/cytopia/devilbox
Website: http://devilbox.org
Install XAMPP. If you're running MS Windows, WAMP is also an option.
MAMP, if you are on a Mac.
AppServ is a small program in Windows to run:
Apache
PHP
MySQL
phpMyAdmin
It will also give you a startup and stop button for Apache, which I find very useful.
If you are using Windows, then the WPN-XM Server Stack might be a suitable alternative.
Use Apache Friends XAMPP. It will set up Apache HTTP server, PHP 5 and MySQL 5 (as far as I know, there's probably some more than that). You don't need to know how to configure apache (or any of the modules) to use it.
You will have an htdocs directory which Apache will serve (accessible by http://localhost/) and should be able to put your PHP files there. With my installation, it is at C:\xampp\htdocs.
If you have a local machine with the right software (a web server with PHP support), there's no reason you can't do as you describe.
I'm doing it at the moment with XAMPP on a Windows XP machine, and (at home) with Kubuntu and a LAMP stack.
Another option is the Zend Server Community Edition.
You can even write your own minimal server in PHP using the sockets extension!
<?php
set_time_limit(0);
$address = '127.0.0.1';
$port = 4444;
// <-- Starts Server
$sock = socket_create(AF_INET, SOCK_STREAM, SOL_TCP);
socket_bind($sock, $address, $port) or die('Could not bind to address');
socket_listen($sock);
echo "\n Server is running on port $port waiting for connection... \n\n";
while (true)
{
    $client = socket_accept($sock);
    $input = socket_read($client, 4096);
    $incoming = explode("\r\n", $input);
    $fetchArray = explode(" ", $incoming[0]);
    $file = $fetchArray[1];
    if ($file == "/") {
        $file = "src/browser.php"; // this file is served for the root URL
    } else {
        $filearray = explode("/", $file);
        $file = $filearray[1];
    }
    echo $fetchArray[0] . " Request " . $file . "\n";
    // <-- Control Header
    $header = "HTTP/1.1 200 OK\r\n" .
              "Content-Type: text/html\r\n\r\n";
    $content = file_get_contents($file);
    $output = $header . $content;
    socket_write($client, $output, strlen($output));
    socket_close($client);
}
Run this code, then open a browser at localhost:4444 (or whichever port you chose). Note that this toy server returns file contents verbatim; it does not execute the PHP inside them.
A clean way to do this, even if you have existing servers on your machine, is to use Docker. Run from any terminal via docker run with a single line:
docker run --name=php -d -it -p 80:80 --mount type=bind,source='/absolute/path/to/your/php/web/root/folder/',target=/app webdevops/php-nginx-dev
You will now have a running container named php serving requests on your localhost, port 80. You should be able to see your php scripts in any browser using the url http://127.0.0.1
Notes:
If you don't have Docker installed, instructions for Debian/Ubuntu and Windows 10+ are at the end. It can be installed on Windows 7, but it's quite annoying and not worth it. For Windows 7, if you must, I'd just install Uniserver or XAMPP or something like that.
You can confirm that the container is live by running docker ps in a terminal on the host machine.
In order to keep your app/code modifications after the container is terminated/removed, the web root is bound to a host folder. Specify the path to your local folder in the --mount source='[/local/path]' parameter. Note: because the folder is bound into the container, changes you make in the container will also appear in the host folder.
Logs can be viewed using the following command (--follow is optional, ctrl+c to exit):
docker logs php --follow
The web root folder in the container is /app. This may be helpful if you don't need to save anything and don't feel like specifying a bind mount in the docker run command.
The port is specified using the -p [host port]:80 parameter. You may have to explicitly specify -p 80:80 in order to be able to connect to the container from a web browser (at least on Windows 10).
To access the container's bash terminal run this from the host machine (type exit to return to host):
docker exec -it php /bin/bash
You can install packages in the container's bash terminal the same way that you would on a native Debian/Ubuntu box (e.g. apt install -y nano).
Composer is already installed (run composer -v from container's terminal to inspect)
To launch an additional container, specify a different host port and container name using the --name=[new_name] and -p [host port]:80 parameters.
If you need a database or other server, do the same thing with a docker image for MySQL or MariaDB or whatever you need. Just remember to bind the data folder to a host folder so you don't lose it if you accidentally delete your docker image(s).
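For instance, a MariaDB container with its data directory bound to a host folder might be started like this (the image tag, password, and paths are illustrative; adjust to your setup):

```shell
docker run --name=mariadb -d \
  -p 3306:3306 \
  -e MARIADB_ROOT_PASSWORD=secret \
  --mount type=bind,source='/absolute/path/to/host/data',target=/var/lib/mysql \
  mariadb:latest
```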
How to install Docker:
Debian/Ubuntu as root (or add sudo before each of these commands):
apt-get update
apt install -y ca-certificates curl gnupg lsb-release
mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
chmod a+r /etc/apt/keyrings/docker.gpg
apt-get update
apt-get install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin
service docker start
systemctl enable docker
Windows 10+ (tested on 10, should work on >10):
Use Chocolatey, a command-line package manager for Windows. Chocolatey also has a gui if you insist. Once installed, run:
choco install -y docker-desktop
Mac, Chromebook, etc:
You are alone. But we believe in you.
