I have a PHP webpage on Webserver1, which is hosted by Host1.
I also have a bash script on Gameserver1, which is hosted by Host2.
Is there any way to send a command from Webserver1 to Gameserver1 to execute the bash file? The webpage and file are on different VPSs. Both are running Debian 7.
The script is literally one line: it runs a java command inside a screen session, so the server can be started if a player notices it's down. The command is already publicly known, so there's no need to hide it.
There are two ways I can think of. Either create a bash script on Webserver1 that connects through SSH and executes the script you need on Gameserver1, then run it from PHP with the exec() command.
Or create a PHP file on Gameserver1 that uses exec() to run the bash script locally, and call it from Webserver1 using file_get_contents(). This second option is not that secure, since anyone who discovers the URL can call that file and run your script.
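A minimal sketch of the first approach, assuming passwordless key-based SSH is already set up from the user PHP runs as (often www-data) to Gameserver1; the hostname, user, and script path below are placeholders:

<?php
// start_server.php on Webserver1 (hypothetical names and paths).
// BatchMode=yes makes ssh fail immediately instead of hanging on a
// password prompt, since the web server user runs non-interactively.
exec('ssh -o BatchMode=yes gameuser@gameserver1.example.com '
    . '/home/gameuser/start.sh 2>&1', $output, $status);
echo $status === 0
    ? 'Start command sent.'
    : 'SSH failed: ' . htmlspecialchars(implode("\n", $output));

Note that the SSH key must belong to the user PHP runs as, not your own login, so authorize that user's public key on Gameserver1.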
I have git bash and I make commits with it. However, I need a .bashrc alias that will also run a PHP script, take a value N (a number) returned by that script, and use that value N to change the name of a file like so:
nameOfFileN
How do I do this? I need this sequence of commands. This is on my local Windows machine; PHP is installed, and I use git bash for git, so I want to run this inside git bash. I need the bash command together with the PHP file, which just returns an echoed variable value.
If your PHP script is served by a web server, use curl:
curl http://localhost/path/to/script.php
If it is a stand-alone script and you have the PHP CLI interpreter installed, run the script directly:
php /path/to/script.php
Both calls will give you the output of the PHP script. Use it however you need (pipe it to another process, redirect it to a file, use it for command substitution, assign it to a shell variable, …)
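For the renaming step in the question above, a bash function in .bashrc is handier than an alias because it can capture the script's output with command substitution. A minimal sketch, assuming a hypothetical script ~/scripts/get_n.php that just echoes the number:

# in ~/.bashrc (the script path and file name are placeholders)
rename_with_n() {
    local n
    n=$(php ~/scripts/get_n.php)    # capture the echoed number N
    mv nameOfFile "nameOfFile$n"    # rename the file using N
}

After source ~/.bashrc, running rename_with_n in git bash performs the rename.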
I'm running a CakePHP project under XAMPP (Apache) on Windows 10 Anniversary Update.
Apache is running under my user account.
The app calls several external processes via shell_exec(); ImageMagick and phantomjs execute as expected.
I also want to call a bash script that in turn calls ImageMagick under Ubuntu bash (installed separately via apt-get). I've had to adjust all paths into a form that bash can resolve.
bash /mnt/e/Projects/project-name/website/bin/crop_to_aspect.sh 1 1
/mnt/e/Projects/project-name/website/webroot/images/agent_photo/tmp_ed564289-a6f6-45b7-b9f3-2aec2b8bb3d1.jpg
/mnt/e/Projects/project-name/website/webroot/images/agent_photo/ed564289-a6f6-45b7-b9f3-2aec2b8bb3d1.jpg
The command fails when called via shell_exec(). The same command, written to CakePHP's log, and then called from a cmd.exe prompt, works perfectly.
Thinking it may have been a path issue, I wrapped the same script in a Windows batch file, including the full path to bash, and called the batch file by its full path, i.e.:
@echo off
SET aw=%1
SET ah=%2
SET in=%3
SET out=%4
C:\Windows\System32\bash.exe /mnt/e/Projects/project-path/website/bin/crop_to_aspect.sh %aw% %ah% %in% %out%
This script is then called:
E:\Projects\project-path\website\bin\crop_to_aspect.bat 4 3
/mnt/e/Projects/project-path/website/webroot/images/listing_photo/tmp_ed564289-a6f6-45b7-b9f3-2aec2b8bb3d1.jpg
/mnt/e/Projects/project-path/website/webroot/images/listing_photo/ed564289-a6f6-45b7-b9f3-2aec2b8bb3d1.jpg
Once again, the command executes correctly in cmd.exe but does nothing when run via shell_exec() from the PHP script.
Check that the function is not in the list of functions disabled by the disable_functions directive in php.ini.
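A quick way to inspect the directive; note that the CLI and the web server often load different php.ini files, so make sure you check the configuration the web server actually uses:

php -r 'var_dump(ini_get("disable_functions"));'   # functions the loaded php.ini disables
php -r 'var_dump(function_exists("shell_exec"));'  # false if shell_exec is disabled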
We connect to our server using PuTTY. I generally run shell scripts, and I want to convert those scripts into PHP scripts. How do I know whether I can run PHP scripts on my server, and how do I run them on the server I'm connected to through PuTTY?
Ensure the PHP command line program is installed.
Run php /path/to/your/script.php
You can use the PHP CLI to run the script, like so: php filename.php.
If the php command doesn't work, check the actual path of the PHP binary; it should be something like /usr/bin/php.
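To locate the binary, and optionally make the script directly executable via a shebang line, something like this works (use whatever path which php reports on your system):

which php            # prints the CLI path, e.g. /usr/bin/php
chmod +x script.php  # after adding "#!/usr/bin/php" as the script's first line
./script.php         # now runs without typing php explicitly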
I need your help here.
I wrote a Perl script for a PHP application, and it needs to run every 5 minutes.
The script calls a PHP program, which fetches data from a MySQL DB, generates an Excel report, and mails the reports to specific users.
Everything seems fine when I run the script manually with the command perl reports.pl.
But when I set this Perl script up in a crontab, nothing works and the reports are not generated.
Details: the Perl script's path is /opt/app/deweb/web/EDI/Microsoft/reports.pl
This script calls the PHP program /opt/app/deweb/web/EDI/Microsoft/reports.php
Content of the script:
#!/usr/local/bin/perl
use Net::FTP;
use File::Copy;
use POSIX;
# capture the PHP program's output via backticks
$errorreport = `php /opt/app/deweb/web/EDI/Microsoft/reports.php`;
print "$errorreport\n";
exit;
It is working perfectly when running Manually using command - perl reports.pl
No results when set in cron:
*/5 7-19 * * * /usr/local/bin/perl /opt/app/deweb/web/EDI/Microsoft/reports.pl
Please note that this crontab belongs to a privileged account named webserv, and my login has access to edit under this account.
I'm editing the crontab using the command: sudo -u webserv crontab -e
I would check the following:
Does it run using sudo -u webserv perl reports.pl? If not, fix the problem for the webserv user (permissions or whatever) and it should work via cron too.
Does which perl using your login give you /usr/local/bin/perl? If not, change the Perl path in the crontab to whatever which perl returned.
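Concretely, those two checks look like this from your login shell (paths taken from the question):

sudo -u webserv perl /opt/app/deweb/web/EDI/Microsoft/reports.pl   # run it as the crontab's owner
which perl                                                         # should print /usr/local/bin/perl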
I found myself in the same situation. After trying to find out the reason, I am almost certain why this happens: cron does not have the same environment variables as your interactive shell, so you must be careful about paths. Try, for example, running your script as /perl-path /path-to-perl-script/script.pl from outside the script's parent directory, and your program will most likely fail to find some files. And since you call a PHP script from the Perl script, the same path problem may affect the PHP script too.
So the solution is to use absolute paths, not relative ones.
Also, in your Perl script don't use a bare php but the full path to PHP, for example:
$errorreport = `/usr/bin/php /opt/app/deweb/web/EDI/Microsoft/reports.php`;
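Another option is to set PATH at the top of the crontab and redirect the job's output to a log file so failures become visible; the log path here is only an example:

PATH=/usr/local/bin:/usr/bin:/bin
*/5 7-19 * * * /usr/local/bin/perl /opt/app/deweb/web/EDI/Microsoft/reports.pl >> /tmp/reports.log 2>&1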
I am trying to run a php script on my remote Virtual Private Server through the command line. The process I follow is:
Log into the server using PuTTY
At the command prompt, type: php myScript.php
The script runs just fine, but the problem is that it stops running as soon as I close the PuTTY console window.
I need the script to keep on running endlessly. How can I do that? I am running Debian on the server.
Thanks in advance.
I believe that Ben has the correct answer, namely to use the nohup command. nohup stands for "no hangup" and means that your program should ignore the hangup signal (SIGHUP), which is generated when your PuTTY session disconnects, either because you logged out or because you were timed out.
You need to be aware that the output of your command will be appended to a file in the current directory named nohup.out (or $HOME/nohup.out if permissions prevent you from creating nohup.out in the current directory). If your program generates a lot of output, this file can get very large; alternatively, you can use shell redirection to send the script's output to another file.
nohup php myscript.php >myscript.output 2>&1 &
This command will run your script and send all output (both standard and error) to the file myscript.output which will be created anew each time you run the program.
The final & causes the script to run in the background so you can do other things whilst it is running or logout.
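After logging back in, you can confirm the script survived the disconnect and watch its output, for example:

tail -f myscript.output               # follow the script's output live
ps aux | grep '[p]hp myscript.php'    # lists the process if it is still running (the [p] excludes grep itself)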
An easy way is to run it though nohup:
nohup php myScript.php &
If you run the php command in a screen session and detach it, the command won't terminate when you close your console.
Screen is a terminal multiplexer that allows you to manage many processes through one physical terminal. Each process gets its own virtual window, and you can bounce between virtual windows interacting with each process. The processes managed by screen continue to run when their window is not active.
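A typical session looks like this (the session name is arbitrary):

screen -S phpjob     # start a new session named "phpjob"
php myScript.php     # run the script inside it
# press Ctrl-a then d to detach; the script keeps running
screen -r phpjob     # reattach later from any new login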