Hi, I am doing a project with a Raspberry Pi. I made a Python program which has an endless loop inside. I also made a PHP website which calls that Python program and makes it run in the background like this:
$call = "sudo python ../python/readver12.py > /dev/null 2>&1 &";
shell_exec($call);
Everything seems okay, but I don't know how to check whether my Python program is still running in the background or not, and how to make that status available on my website with PHP.
I guess there are many ways to do that:
Get the logs from the terminal
You can write logs from your endless Python script while it runs, and capture them with PHP:
https://docs.python.org/3/howto/logging.html
Write to a file and read it with PHP
Straightforward: you write to a file with Python and read it with PHP (see the PHP sketch after this list).
http://www.pythonforbeginners.com/files/reading-and-writing-files-in-python
REST API on the PHP website, with cURL in Python
You can use cURL inside your Python script to communicate with your PHP endpoints and get the needed data.
cURL: http://pycurl.io/docs/latest/index.html
PHP REST API: https://www.codeofaninja.com/2017/02/create-simple-rest-api-in-php.html
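A minimal sketch of the second option above (file written by Python, read by PHP), assuming the Python loop rewrites a small status file on every iteration; the path below is an assumption:

<?php
// consider the script alive if its status file was touched within the last 30 seconds
$statusFile = '/var/www/python/status.txt';   // assumed path, written by readver12.py
clearstatcache();                             // filemtime() results are cached between calls
$alive = file_exists($statusFile) && (time() - filemtime($statusFile)) < 30;
echo $alive ? 'running' : 'not running';

On the Python side, the loop only needs to rewrite that file once per iteration.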
I hope it helps
With shell_exec(), PHP waits for the application you're calling to exit before continuing your script, and returns that application's output as a string. By the time your script picks back up, the child is already done (or you've hit the PHP time limit).
It sounds like you want to start the process and monitor it while it's running. For that, look at proc_open() and proc_get_status().
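A minimal sketch of that approach, reusing the script path from the question; proc_get_status() returns an array that includes the child's pid and a running flag (the redirection and the lack of sudo here are assumptions about your setup):

<?php
// start the Python program without blocking, then ask whether it is still alive
$proc   = proc_open('python ../python/readver12.py > /dev/null 2>&1', [], $pipes);
$status = proc_get_status($proc);   // ['pid' => ..., 'running' => true/false, ...]
echo $status['running'] ? 'still running' : 'stopped';

Note that the handle from proc_open() only lives for the request that created it; to check again on a later page load you would have to store the pid somewhere (file, session, database) and test it, for example with posix_kill($pid, 0).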
My web app is deployed on Azure. When I start an ffmpeg process with exec(), if the file is larger than 50 MB, it stops the web app until it finishes processing the file. Why does that happen?
Funnily enough, I programmed this for a customer myself a few weeks ago.
You can use AJAX in the front end and start each conversion via JS.
You can use a PHP "daemon"
2.0 Start the program independently: php converter.php video1.mp4 >/dev/null 2>&1 &
2.1 You can start the normal script via the website (e.g. when a POST or GET is given), and it can relaunch itself as an independent process in the background with shell_exec('php converter.php daemon'); (for example). You can then check via AJAX, or maybe a WebSocket, whether the process has finished.
Tips:
Use a .lock file or something similar so the program cannot start a second time while a previous run is still unfinished (see the sketch below).
Use this repo (composer require php-ffmpeg/php-ffmpeg): https://github.com/PHP-FFMpeg/PHP-FFMpeg
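A minimal sketch of the lock-file tip, with a hypothetical converter.php standing in for the real worker:

<?php
// converter.php (hypothetical): refuse to run while a previous conversion is still active
$lock = fopen('/tmp/converter.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit("another conversion is still running\n");
}
// ... long-running ffmpeg work goes here ...
flock($lock, LOCK_UN);

The website then only has to fire shell_exec('php converter.php video1.mp4 > /dev/null 2>&1 &'); and poll via AJAX, as described in 2.0 and 2.1.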
I have a Node.js script which opens a torrent stream that I then grab screenshots from. Currently I use a bash script for this, which is not really extensible. I decided to move most of this to PHP by using a handy FFMpeg wrapper for PHP.
The problem is that I can't seem to launch the Node.js script from within PHP without blocking. This is the command I'm running using exec():
node ~/node/peerflix/app.js 'test.torrent' &> /dev/null &
However, whatever I try, the script hangs on exec(). I just want to send it to the background. In other words, how can I asynchronously execute a Node.js script and send it to the background without caring about the output?
It would also be nice to be able to get the PID of the process so I can kill it when I'm done.
I've been doing some research on ReactPHP, and perhaps that would be a solution, but I have no idea how to get that going.
Let me suggest something:
Use forever for manageable running.
You can use it from a JavaScript runner script or from bash (my favorite):
npm install forever -g
forever start ~/node/peerflix/app.js 'test.torrent'   # in bash
This will run app.js forever, and it can be monitored with the forever CLI.
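If you also want the PID yourself (as asked), one common pattern, independent of forever and sketched here as an assumption about your setup, is to background the command and echo the shell's $! back to PHP. Note the plain-sh redirection: &> is a bash-ism, and /bin/sh misreading it may be why exec() appeared to hang.

// launch detached, discard output, and capture the background job's PID via $!
$pid = (int) shell_exec("node ~/node/peerflix/app.js 'test.torrent' > /dev/null 2>&1 & echo \$!");
// later, when the screenshots are done:
// exec('kill ' . $pid);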
Okay, this is going to be a very weird request/question.
There is a very long-running PHP script that needs to be launched by the user (admin), who is not very technically adept. When running the script through Apache, it times out (502 Bad Gateway or 504 Gateway Timeout).
Let's just assume that Apache can't be configured to fix the timeout issues.
I want to create a button in the admin panel that sends an AJAX call to a PHP script on the server; that PHP script will act as a proxy of sorts to launch a shell command. The shell command will then execute the long-running PHP script with certain arguments, but I don't want it to wait for the long-running script to finish. The proxy PHP script can exit and return true/false based on whether the shell command actually started (this part is optional).
Essentially, have PHP launch a shell command which launches a PHP script.
How can I pull something like this off?
Have you tried shell_exec()? It worked for me...
http://php.net/manual/en/function.shell-exec.php
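A minimal sketch of that proxy endpoint, with a hypothetical long_job.php standing in for the real script; the output redirection and the trailing & are what stop shell_exec() from waiting:

<?php
// proxy.php (hypothetical): called from the admin panel's AJAX button, returns immediately
$arg = escapeshellarg($_POST['arg'] ?? '');
shell_exec("php /path/to/long_job.php $arg > /dev/null 2>&1 &");
echo json_encode(['started' => true]);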
I start a Linux console app from my PHP 5 script; it starts OK but then terminates. I've tried using system(), shell_exec(), and starting it as a background process, but to no avail: it starts and then quits.
What I am trying to achieve: from a remote browser, start a console app using a PHP 5 script and have it remain running (just as it would if I started it from a bash shell). I then want to send commands (from a bash shell these would be keystrokes) to the console app from another set of PHP 5 scripts. I hope it's clear what I am trying to do.
If anyone could give some info on the best way to go about this, that would be great, as I think I may have something fundamentally wrong.
I have a Debian Lenny box running Apache. The console app is just a simple program that prints to stdout and reads from stdin.
How do you expect to send input to this app? Where is it listening for input?
It may simply only support interactive use and exit as a result of that. Or, even simpler, it may terminate because it sees that it has no input (nothing piped in and nothing from some file), and since it's not connected to an interactive shell, it has nothing to do. There's no point in waiting for input from a user that doesn't have a way to interact with the application.
On every request, PHP starts up, compiles your script, and executes it. After execution, the script exits. When the script exits, all of the resources it was using, including file handles, database handles, and pipes to other programs, are terminated.
You're going to need to find another way to keep your program open and have PHP communicate with it. Otherwise, every request to your script is going to open a new copy of the program, and then both will exit when the PHP script is complete.
Unfortunately, without knowing what the program is, it's hard to offer suggestions on how to go about this.
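If a per-request exchange is enough (start the app, send it something, read what it prints, let both exit together), a sketch with proc_open() could look like the following; the command path and the line-based protocol are assumptions about your app:

<?php
// talk to the console app over pipes for the lifetime of this one request
$spec = [0 => ['pipe', 'r'], 1 => ['pipe', 'w']];   // stdin, stdout
$proc = proc_open('/usr/local/bin/consoleapp', $spec, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], "some command\n");            // what the app reads on stdin
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]);            // what it printed to stdout
    fclose($pipes[1]);
    proc_close($proc);
}

Keeping the app alive between requests needs something outside PHP itself (a daemon, a screen/tmux session, a named pipe the app reads from), which is the "another way" mentioned above.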
How do I make Python (local) run a PHP script on a remote server?
I don't want to process its output with the Python script or anything, just execute it and then quit Python (while the PHP script keeps working and doing its job).
edit:
What I'm trying to achieve:
the Python script connects to the FTP server and uploads the PHP script (I already have this part of the code)
it runs the PHP script (that's the part of the code I'm asking about)
the Python script continues to do something else
the Python script quits (but the PHP script probably hasn't finished its work yet, so I don't want it to end when Python exits)
the Python script has quit, and the PHP script still continues its task
(I don't plan to do anything with the PHP output in Python; Python just has to upload the PHP script and make it start working)
I hope I'm clearer now. Sorry if my question wasn't specific enough.
another edit:
Also please note that I don't have shell access on the remote server. I only have FTP and a control panel (cPanel); I'm trying to use FTP for this.
os.system("php yourscript.php")
Another alternative would be:
# returns the new process's id without waiting for it to finish
os.spawnlp(os.P_NOWAIT, "php", "php", "yourscript.php")
You can check the full os module documentation at https://docs.python.org/3/library/os.html.
If Python is on a different physical machine than the PHP script, I'd make sure the PHP script is web-accessible and use urllib2 to call that URL:
import urllib2
urllib2.urlopen("http://remotehost.com/myscript.php")
I'll paraphrase the answer to "How do I include a PHP script in Python?".
import subprocess

def php(script_path):
    # launch the PHP script and return immediately without waiting for it
    return subprocess.Popen(['php', script_path])