How to test a PHP file from the command line using spawn-fcgi - php

I have a php script. I am using nginx and spawn-fcgi.
spawn-fcgi -n -s /tmp/nginx9010.socket -u www-data -g www-data -f /usr/bin/php5-cgi -C 6
How can I test from the command line that spawn-fcgi is working with the script?
e.g. I have a script in /home/ubuntu/test.php
I am having issues with nginx executing the PHP script: the browser prompts me to download the file instead.
I have #!/usr/bin/php in the file and did a chmod a+x as well.
Thanks

For testing a FastCGI backend you could try to create a CGI environment and use cgi-fcgi to connect to the backend (see the sketch below).
You could attach with strace to see what the backend does (for example, whether it even receives a request from the web server); attach with -ff to the master process to see syscalls on all workers.
php5-cgi in FastCGI mode needs neither a shebang line nor +x on the files - it doesn't use the kernel to execute them, it just loads them as plain files.
Firefox (and probably other browsers too) often caches the MIME type, so you will see a download prompt even after you have fixed the problem. Use curl for testing!
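A minimal sketch of both suggestions (cgi-fcgi ships with the libfcgi tools; the socket path matches the spawn-fcgi command above, and <master-pid> is a placeholder for your master php5-cgi process ID):
# send one request straight to the FastCGI socket
SCRIPT_FILENAME=/home/ubuntu/test.php \
REQUEST_METHOD=GET \
cgi-fcgi -bind -connect /tmp/nginx9010.socket
# trace the backend while you repeat the request;
# -ff with -o writes one file per worker: /tmp/php-trace.<pid>
strace -ff -o /tmp/php-trace -p <master-pid>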

nginx won't serve the file itself - it passes it to PHP; nginx only serves static files. So if the browser is downloading the PHP file, you might need to check that you are sending PHP files to the correct place. Are you using an IP and port in the PHP location block of the config file?
Only a guess off the top of my head whilst on the train home.

FWIW, problems where nginx offers the file for download are due to nginx serving the file itself without passing it to the FastCGI backend, often because of try_files or a wrong location {} block matching the URI. A working PHP location block looks like the sketch below.
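A hedged sketch of such a location block, wired to the spawn-fcgi socket from the question (the fastcgi_params include and the document root are assumptions about your setup):
location ~ \.php$ {
    include fastcgi_params;
    # without SCRIPT_FILENAME the backend does not know which file to execute
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/tmp/nginx9010.socket;
}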

Related

Linux mount script runs in the PHP interpreter but not under Apache

My website is about copying files. Instead of copying the files, to avoid long waits and server lag, I decided to use overlayfs. The code mounts the folder to the specified location when run from the terminal using the PHP interpreter. But when I run the PHP script from Apache, the script does not mount the overlay. And the worst part is that there is no error output, so I can't debug what's wrong. I checked the PHP error log; there is no output about what happened.
The destination I'm mounting the overlayfs to belongs to another user, so I need root to execute the mount command. To be able to run the code without using root or sudo, I took a look at this question. I created the C code, compiled it, and set the proper permissions (root.root, rwsr,sr...). I ran the code in a PHP file:
<?php
// filename over.php
print shell_exec("whoami")."\n";
print shell_exec('/var/www/vhosts/user/deployment/exec "sudo mount -t overlay overlay -o lowerdir=/var/www/vhosts/user/deployment/template5_dev,upperdir=/var/www/vhosts/user.deve/httpdocs,workdir=/var/www/vhosts/user/deployment/overlay-work /var/www/vhosts/user.deve/httpdocs"');
What the code does is print the actual user name (to make sure I have some output and can see that the code executed), then merge the folders. The file /var/www/vhosts/user/deployment/exec is the C program; I pass the command to execute as an argument.
In the terminal I run: php -f "/var/www/vhosts/user/httpdocs/over.php". I check the merged folder and can see it works. And the output is user.
Then I unmount the overlay with sudo umount /var/www/vhosts/user.deve/httpdocs.
I access the PHP script via the browser: I get the output user, but the folders do not merge. I hit Ctrl-F5 multiple times, but nothing - no error, no error in the log, nothing.
I changed the command to shell_exec('/var/www/vhosts/user/deployment/exec "sudo mkdir /var/www/vhosts/user.deve/httpdocs/nouvo"');, and sudo created the folder from the browser.
I noticed that only the mount command does not run properly.
What could be the reason the sudo mount command does not run under Apache? And even if there was an error, shouldn't it print out the error?
I just took a look at /var/log/kern.log. I can see the mount command got executed from the web browser. But the log is different from the ones produced by executing in the terminal.
From the web-browser:
kernel: [ 149.465459] overlayfs: filesystem on '/var/www/vhosts/user.deve/httpdocs' not supported as upperdir
kernel: [ 151.629192] overlayfs: filesystem on '/var/www/vhosts/user.deve/httpdocs' not supported as upperdir
kernel: [ 153.453612] overlayfs: filesystem on '/var/www/vhosts/user.deve/httpdocs' not supported as upperdir
From the terminal after executing from the browser:
kernel: [ 312.858797] overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
kernel: [ 312.858804] overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
I just don't understand the log from the browser, since the script is the same.
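One note on the missing error text: shell_exec() returns the command's stdout only, and mount writes its diagnostics to stderr, so the browser shows nothing even when the mount fails. A minimal sketch of making the error visible (the paths here are placeholders, not the ones from the question):
<?php
// mount reports errors on stderr; 2>&1 folds them into stdout
// so that shell_exec() can return them
print shell_exec('sudo mount -t overlay overlay'
    . ' -o lowerdir=/lower,upperdir=/upper,workdir=/work'
    . ' /merged 2>&1');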

Execute shell commands with PHP

I am trying to execute this command using the shell_exec function from PHP:
shell_exec("cd /home/ec2-user; ./certbot-auto -n --apache -d mydomain.com");
When I execute it directly from the terminal, the result is this:
Requesting to rerun ./certbot-auto with root privileges...
Saving debug log to /var/log/letsencrypt/letsencrypt.log
Plugins selected: Authenticator apache, Installer apache
Obtaining a new certificate
Performing the following challenges:
http-01 challenge for mydomain.com
Waiting for verification...
Cleaning up challenges
Created an SSL vhost at /etc/httpd/conf.d/vhost-le-ssl.conf
Deploying Certificate to VirtualHost /etc/httpd/conf.d/vhost-le-ssl.conf
But when I execute it in my app, the result is only the first line:
Requesting to rerun ./certbot-auto with root privileges...
How can I fix this?
Notes:
I am trying to install Certbot SSL certificates.
My app is on Amazon AWS.
I do not have much knowledge of servers.
I am using Laravel 5.5 in my app.
Take a look at sudo in php exec() or google for "php sudoers".
Your script is running as the apache user and doesn't have root rights, so you need to make an entry in /etc/sudoers (or put a file in sudoers.d) to be able to run certbot-auto as root from your script.
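A hedged sketch of what that could look like (the apache user name and the certbot-auto path are assumptions based on the question; always edit sudoers files with visudo):
# /etc/sudoers.d/certbot, created with: visudo -f /etc/sudoers.d/certbot
apache ALL=(ALL) NOPASSWD: /home/ec2-user/certbot-auto
Then call it through sudo from PHP, redirecting stderr so you see the full output:
shell_exec("cd /home/ec2-user && sudo ./certbot-auto -n --apache -d mydomain.com 2>&1");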

Copy remote file with rsync in PHP

I'm trying to execute a command (rsync) with PHP to copy folders and files from a remote server to a local folder.
This is the code I wrote in PHP. The command WORKS over SSH (local terminal, and remote with putty.exe), copying the folders and the files correctly.
But it doesn't work in PHP. What can I do? Do you know a better (more secure/optimal) way to do this?
exec("echo superuserpassword | sudo -S sshpass -p 'sshremoteserverpassword' rsync -rvogp --chmod=ugo=rwX --chown=ftpuser:ftpuser -e ssh remoteserveruser#remoteserver.com:/path/files/folder /opt/lampp/htdocs/dowloadedfiles/", $output, $exit_code);
EDIT:
I had read this guide to create a link between my server and my local machine.
Now I can log in to my remote machine with SSH without a password.
I changed my command:
rsync -crahvP --chmod=ugo=rwX --chown=ftpuser:ftpuser remote.com:/path/to/remote/files /path/to/local/files/
This command works in the terminal too, but when I send it with PHP's exec(), it fails again, this time with a different error: 127.
As MarcoS said in his answer, I checked the error_log.
The messages are these:
ssh: relocation error: ssh: symbol EVP_des_cbc, version OPENSSL_1.0.0 not defined in file libcrypto.so.1.0.0 with link time reference
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: remote command not found (code 127) at io.c(226) [Receiver=3.1.1]
Well, after a lot of trial and error, I got to the root of the problem:
I read this guide (like the last one, but better explained), moved the PHP file that executes the rsync command to the remote server (where the files are located), and ran the rsync.php file there - and it worked perfectly.
To execute on the machine with the files (the files to copy and rsync.php):
1.- ssh-keygen generates keys
ssh-keygen
Enter an empty passphrase, and repeat the empty passphrase when asked again.
2.- ssh-copy-id copies public key to remote host
ssh-copy-id -i ~/.ssh/id_rsa.pub remoteserveraddressip(xxx.xxx.xxx.xxx)
The rsync.php file:
exec("rsync -crahvP /path/in/local/files/foldertocopy remoteuser#remoteserveraddress:/path/in/remote/destinationfolder/", $output, $exit_code);
After all of that, navigate to the rsync.php file and everything should work. At least it worked for me...
I suppose you are experiencing identity problems... :-)
On the CLI, you are running the command as the logged-in user.
In PHP, you are running the command as the user your web server runs as (for example, Apache often runs as www-data, or the apache user...).
One possible solution I see (if the above is the real cause of the problem) is to add your user to the web-server group...
I'd also suggest you check the web-server error logs, to be sure about the real cause of the problem... :-)
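A quick way to test the identity theory from the same PHP context (a minimal sketch; request it through the web server, not the CLI):
<?php
// shows which user actually executes your shell commands,
// and demonstrates capturing both output and exit code with exec()
exec('whoami 2>&1', $output, $exit_code);
echo implode("\n", $output), " (exit code: $exit_code)\n";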

Setting environment variables with the built-in PHP web server

PHP 5.4 supports a built-in web server for development purposes. The app we are developing is configured via environment variables.
With Apache you'd do this:
SetEnv FAVORITE_COLOR white
With the normal CLI you can do this:
$ export FAVORITE_COLOR=black
$ php -a
php > echo $_SERVER['FAVORITE_COLOR'];
Is there a way to set these variables for the built-in web server?
It looks like E is excluded from the variables_order setting when running the built-in server. If you add the E to the variables_order setting, it works:
test.php
<?php
var_dump($_ENV['FOO']);
shell:
FOO=BAR php -d variables_order=EGPCS -S localhost:9090 /tmp/test.php
output:
string 'BAR' (length=3)
Tested on PHP 5.4.12
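A related sketch: getenv() reads the process environment directly, so it works even without E in variables_order, whereas $_ENV is only populated when E is present:
<?php
var_dump(getenv('FOO'));                             // works regardless of variables_order
var_dump(isset($_ENV['FOO']) ? $_ENV['FOO'] : null); // NULL unless variables_order contains E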
I use the Windows command prompt (DOS) to start the PHP server. I store my server startup commands in a batch file (.bat) to save me from having to select and copy all of the commands at once and paste them into the DOS terminal. (Note the last blank line, which I copy as well so the PHP server starts automatically when I paste the commands into DOS; otherwise I would need to press Enter manually to start the server.)
Q:
cd Q:\GitLabRepos\myapps\test1
set APPLICATION_TITLE=My testing application with this first env variable
set SOME_OTHER_ENV_VAR2={"myJsonElement":"some value"}
E:\PHP8\php.exe -d variables_order=E -S localhost:8000 -c php.ini
The commands above, explained:
The first line, Q:, changes to the drive where my code resides. The second line, cd Q:\GitLabRepos\myapps\test1, changes directories to my root PHP application code (which is where I want to start the PHP server). Next I set some environment variables on lines 3 and 4. Then finally I start the PHP server with the -d variables_order=E parameter so I can use either $_ENV or getenv() to retrieve the environment variable values in my PHP code (e.g. $_ENV['APPLICATION_TITLE'] or getenv('APPLICATION_TITLE')). If you exclude -d variables_order=E from the server startup command, then you can only use getenv() to access the environment variables. I use the -c php.ini parameter to load additional PHP settings from a php.ini file, but this can be excluded for a simple server setup.
Then if I have a Q:\GitLabRepos\myapps\test1\index.php script with the following code:
<?php
echo getenv('APPLICATION_TITLE').'---'.$_ENV['APPLICATION_TITLE'].'...'.getenv('SOME_OTHER_ENV_VAR2');
?>
When I visit localhost:8000 in a web browser, I should see:
My testing application with this first env variable---My testing application with this first env variable...{"myJsonElement":"some value"}.
On Windows:
SET FOO=BAR
php -S localhost:9090

PHP command not executed by system(), exec() or passthru()

I am trying to run a command-line file conversion using OpenOffice:
openoffice pdf filename.doc 2>&1
When I execute it on the command line as root, it works fine and the file is converted. However, when I pass the above command in a PHP file as the apache user, it does not execute.
I tried all three PHP command-execution functions:
$command_output=system($command_line,$rtnval);
$command_output=exec($command_line,$rtnval);
$command_output=passthru($command_line,$rtnval);
Also,
echo print_r($rtnval);
echo print_r($command_output);
$rtnval returns 1 and $command_output returns 1. I am confused: I have no way to know what the Linux (CentOS) response to the command was. It is very frustrating not to know the system's response when I try to execute the command.
I also added an entry in /etc/sudoers to let apache run the OpenOffice command:
apache ALL=(ALL) NOPASSWD: /path/to/openoffice
Still, the command is not executed in PHP as the apache user.
What am I missing that prevents PHP, running as the apache user, from executing this command?
It could be that openoffice is not in the PATH. Try executing it with the full path.
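A hedged sketch of that suggestion (the binary path and the command syntax are assumptions carried over from the question; note also that exec()'s second parameter receives the output lines and the third receives the exit code, which the calls above mix up):
<?php
// redirect stderr so that error text is captured too
$command_line = '/usr/bin/openoffice pdf filename.doc 2>&1';
exec($command_line, $output, $rtnval);
echo "exit code: $rtnval\n";
echo implode("\n", $output), "\n";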
To run your command as if you were the apache user, just try this in a shell:
# switch to superuser
sudo su -
# then switch to the apache user
su - www-data
You will find yourself in quite a restricted shell, from which it is usually not possible to start OpenOffice. Indeed, it requires a lot of environment that would be unsafe to set up completely for apache anyway.
AFAIK, it is better to create a dedicated user that is allowed to run your command (e.g. a regular "www-runner" user), then "su" to it from PHP. Other security measures include chroot'ing the dedicated user, or using AppArmor to limit what it is allowed to run and from where. In any case, never let www-data run something as root by adding www-data to the sudoers: this is way too dangerous!
You can also have a look at libapache2-mod-suphp (a suid Apache module to run PHP scripts with the owner's permissions). It is easier to use than the dedicated suEXEC Apache beast (http://httpd.apache.org/docs/2.0/suexec.html). The latter really is not for a quick fix ;)
It is possible that your PHP under Apache runs in safe mode (or whatever it's called), in which the system() function and the like are disabled.
This answer, actually, assumes that what you call "running as the apache user" is in fact running in the Apache environment, whatever that is.
