I'm executing a python script from a php page like so: exec("python ./test.py");
This script runs fine if I don't open a serial port in it. If I do, however (and opening the serial port is the whole point of calling the Python script in the first place), the script doesn't execute properly.
If I call a simple python script that prints a statement -
print "This works!"
Then I get the desired output in my php page.
But, if I open a serial port, I no longer get the output of "This works!", and the serial data is not getting sent to the receiving device -
import serial
ser = serial.Serial("/dev/ttyACM0",9600)
print "This works!"
Both scripts run fine from the command line.
Is this a php limitation? I have tried other methods of execution such as popen and system, but they didn't work for me either.
Perhaps you aren't getting complete error reporting from your Python execution. Try adding raise Exception('Boo!') as the first line of your Python program to find out if you are or not. If you don't get the exception and a traceback, then your program is probably failing on the serial.Serial line, but you aren't hearing about it.
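If the failure really is being swallowed, another way to see it, regardless of what the web server does with stderr, is to log the traceback to a file from inside the Python script. A minimal sketch (the log path /tmp/test_py_error.log and the placeholder exception are illustrative, not part of the original script):

```python
import traceback

def main():
    # The original script's body would go here, e.g.:
    #   import serial
    #   ser = serial.Serial("/dev/ttyACM0", 9600)
    raise Exception('Boo!')  # placeholder failure, as suggested above

try:
    main()
    print("This works!")
except Exception:
    # Illustrative log path; any file the web server user can write to works.
    with open("/tmp/test_py_error.log", "w") as f:
        traceback.print_exc(file=f)
```

If the log file then shows a permissions error on /dev/ttyACM0, the fix is to grant the web server user access to the serial device rather than to change the PHP side.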
Related
I have a problem displaying the results of a Perl script that I am calling from my PHP webpage. The Perl script constantly monitors a socket; when run from the command line it displays its output and also saves it to a file. I know the Perl script is being called and running successfully, because the text file is being updated, but I do not get the output on the webpage as I had hoped.
I have tried system(), exec(), and passthru(); they all let the Perl script run, but still with no output on the webpage, so I am obviously missing something. Am I using the correct functions? Is there a parameter I need to add to one of them to push the output back to the webpage that calls the Perl script?
One example of what I have tried from the PHP manual pages:
<?php
exec('perl sql.pl', $output, $retval);
echo "Returned with status $retval and output:\n";
print_r($output);
?>
Edited to include output example as text instead of image as requested.
# perl sql.pl
Connecting to the PBX 192.168.99.200 on port 1752
04/07 10:04:50 4788 4788 3256739 T912 200 2004788 A2003827 A
I'm no PHP expert, but I guess that exec waits for the external program to finish executing before populating the $output and $return variables and returning.
You say that your sql.pl program "constantly monitors a socket". That sounds like it doesn't actually exit until the user closes it (perhaps with a Ctrl-C or a Ctrl-Z). So, presumably, your PHP code sits there waiting for your Perl program to exit - but it never does.
So I think there are a few approaches I'd investigate.
Does sql.pl have a command-line option that tells it to run once and then quit?
Does PHP have a way to send a Ctrl-C or Ctrl-Z to sql.pl a second or so after you've started it?
Does PHP have a way to deal with external programs that never end? Can you open a pipe to the external process and read output from it a line at a time?
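The last idea, reading a pipe one line at a time, is how long-running monitors are usually consumed. A hedged sketch of the technique in Python, standing in for PHP's popen/proc_open, with a short-lived placeholder in place of the never-ending sql.pl:

```python
import subprocess, sys

# Short-lived placeholder for a monitor like 'perl sql.pl'; it emits a
# few lines and exits, where the real program would run forever.
cmd = [sys.executable, "-c",
       "import time\n"
       "for i in range(3):\n"
       "    print('line', i, flush=True)\n"
       "    time.sleep(0.1)"]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)

lines = []
for line in proc.stdout:        # blocks until the child emits each line
    lines.append(line.rstrip())
    if len(lines) == 3:         # a real monitor never exits, so stop here
        proc.terminate()
        break
proc.wait()
```

In PHP the equivalent shape is proc_open() (or popen()) plus fgets() in a loop, echoing each line to the page as it arrives instead of waiting for the process to exit.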
I'm executing a python script from PHP this way:
$mystring = system('python test.py', $retval);
and it works; I just would like to see error messages from the Python script. Is that possible? It is returning output (for example, I see the "hello" from a print "hello" in the Python script), but I don't see errors from the Python script. It's obviously failing to load a module, because execution stops there, and I don't see that error displayed.
In PHP, it's better to use:
exec('python test.py', $outp, $return);
You will then have in $return the exit status returned by the Python script (nonzero if there was an error) and the error text in $outp.
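One likely reason the errors never show up is that exec() captures only standard output, while Python writes tracebacks to standard error; appending 2>&1 to the command string merges the two so the traceback lands in $outp. The effect can be sketched in Python, with subprocess standing in for PHP's exec() and a deliberately broken import as the failing script:

```python
import subprocess, sys

# A script that fails to load a module, like the one described above.
bad = [sys.executable, "-c", "import no_such_module"]

# Without redirection the traceback goes to stderr and stdout stays empty,
# which is why exec() appears to return nothing useful.
plain = subprocess.run(bad, capture_output=True, text=True)

# Merging stderr into stdout (the analogue of appending 2>&1 to the
# command string passed to exec()) makes the traceback visible.
merged = subprocess.run(bad, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, text=True)
```

The PHP equivalent of the merged case would be exec('python test.py 2>&1', $outp, $return).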
I am connecting to an ssh server and using ssh2_shell to create an interactive bash shell.
The problem I have is how to send a command to the shell and get the response for the command accurately.
This is what I am currently doing:
$stream = ssh2_connect();
$shell = ssh2_shell($stream, 'bash');
fwrite($shell, "whoami\n");
fwrite(STDOUT, fread($shell, 2048) . PHP_EOL);
The problem with the above code is that the output to STDOUT (for command-line viewing) shows something like the "Last login" banner, i.e. the server's welcome message. I have to wait a second or two and then read from the shell again to get the output of the command I sent.
Now I can solve this by doing sleep(1) after any command I send, but this is very hacky: if the command takes longer than a second, I will not get the results, and if it takes less than a second, I waste time.
I tried stream_set_blocking($shell, true) before writing the command and calling fread but the problem still persists unless I sleep after the command.
Basically: how can I use ssh2_shell to send a command and get that command's response, however long it takes, without timing out the script by reading from a blocked stream after the shell has already returned its contents?
What you describe is the intended behavior of ssh2_shell - it is interactive. That means you would need to work with timeouts.
If you want a command or script executed on the remote host and just receive its output you need non-interactive execution - ssh2_exec.
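When the interactive shell really is needed, a common alternative to sleeping is to echo a sentinel string after each command and read until the sentinel appears. A sketch of that technique in Python, using a local bash process as a stand-in for the ssh2_shell stream (the sentinel value is an arbitrary choice):

```python
import subprocess

SENTINEL = "__CMD_DONE__"   # arbitrary marker, assumed absent from real output

# A local bash process stands in for the remote interactive shell stream.
shell = subprocess.Popen(["bash"], stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, text=True, bufsize=1)

def run_command(cmd):
    # Send the command, then echo the sentinel so we know where output ends.
    shell.stdin.write(cmd + "\n")
    shell.stdin.write("echo " + SENTINEL + "\n")
    shell.stdin.flush()
    out = []
    for line in shell.stdout:   # blocks exactly as long as the command runs
        if line.strip() == SENTINEL:
            break
        out.append(line.rstrip("\n"))
    return out

result = run_command("echo hello")
shell.stdin.close()
shell.wait()
```

The same loop works against the ssh2_shell stream in PHP: fwrite the command plus an echoed marker, then fread until the marker shows up, instead of sleeping a fixed amount of time.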
I wrote a simple PHP code to execute a console program:
<?php
$cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');
exec($cmd);
?>
If I run the command directly in the console on the server, it works. However, when I run the PHP script from the browser, it doesn't work. The process progName.exe is running (checked using Task Manager on the server), but it never finishes. This program is supposed to compute some parameters from the arguments and write the result to a binary file, and also produce a .WAV file. Here is the error message I get in the browser:
Error Summary
HTTP Error 500.0 - Internal Server Error
C:\php\php-cgi.exe - The FastCGI process exceeded configured activity timeout
Detailed Error Information
Module FastCgiModule
Notification ExecuteRequestHandler
Handler PHP
Error Code 0x80070102
Then I wrote a simple console program that writes a sentence to a text file (writeTxt.exe hello.txt). Using the same PHP script, I ran it from the browser and it works.
I already tried to increase the timeout on the server, but still have the same error.
What could cause this problem?
When you execute a program in PHP using the exec function (e.g. exec('dir')), PHP waits until it has finished, unless you send it to the background, in which case PHP returns immediately (see the documentation, especially the comments).
According to your posted PHP source ($cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');), the program is not sent to the background by PHP, so what remains is that progName.exe...
...sends itself or a fork to the background (unlikely, but look into the sources of progName.exe)
...is waiting for input (<-- this is my favorite)
I missed something ;-)
As I said I bet it is the second option. Hope that helped a bit.
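The waiting-for-input theory is easy to test: a child that reads stdin can block forever under a web server if the descriptor never delivers data or EOF, and redirecting stdin from the null device (e.g. appending < NUL to the command on Windows) lets the read finish immediately. A minimal sketch in Python:

```python
import subprocess, sys

# Stand-in for progName.exe: a child that reads all of stdin before printing.
child = [sys.executable, "-c", "import sys; sys.stdin.read(); print('done')"]

# If stdin were an open pipe that is never written to or closed, the read
# would block forever; that is the hang described above. Redirecting stdin
# from the null device makes the read return immediately:
r = subprocess.run(child, stdin=subprocess.DEVNULL,
                   capture_output=True, text=True, timeout=10)
```

If progName.exe finishes under this redirection but hangs without it, the input hypothesis is confirmed.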
I spent hours trying to debug a php script which runs a c simulation using exec(). After throwing stderr print messages in everywhere, I finally found that the underlying issue was printing a directory path to a char array that was too small using sprintf().
I'm guessing it was a segmentation fault, but I never actually saw the shell error message saying 'segmentation fault'. When I changed the allocation size, everything worked.
I had been redirecting stderr output to a log file, and that log file got all of the messages from fprintf(stderr,"..."); but it didn't get any shell error messages.
The command is this
exec("$cmd 2>> $logFile & $PROCFILE 1 $ipaddr $! 2>> $logFile", $output, $rv);
$cmd runs a c simulation, and $PROCFILE runs a second c program that takes three arguments (the 1, $ipaddr, and $!). Before I fixed the allocation size problem, the php script would just stop execution of the $cmd simulation and continue on the next line (i.e. after the exec() statement). The next line in the php file is
if ($rv != 0) { /* handle error */ }
but that also didn't catch the problem.
I kept thinking I was running into a permissions error. How can one get exec() to show the shell errors? Or was this a result of running the two programs simultaneously with '&'?
When a program dies, the "Segmentation fault" message comes from the shell, not from the program. Redirection only applies to the program's output, not the shell's. The message should get put into the $output variable. If $rv is 0 and there's nothing in $output, I suspect you're wrong about what happened. I suggest you verify this by writing a simple test program that just does:
kill(0, SIGSEGV);
BTW, why aren't you using snprintf(), so you can't get a buffer overflow in the first place?
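For what it's worth, death by signal is visible in the exit status even when the "Segmentation fault" text is lost: a shell reports 128 + signal number (139 for SIGSEGV), and PHP's exec() should surface that through $rv. A sketch of the suggested test, written in Python instead of C and assuming a POSIX system:

```python
import signal, subprocess, sys

# Stand-in for a C program that calls kill(0, SIGSEGV), as suggested above.
crasher = [sys.executable, "-c",
           "import os, signal; os.kill(os.getpid(), signal.SIGSEGV)"]

r = subprocess.run(crasher, capture_output=True)

# subprocess reports death by signal as a negative return code;
# a shell would report the same death as exit status 128 + 11 = 139.
killed_by_segv = (r.returncode == -signal.SIGSEGV)
```

So if $rv really was 0 after the crash, the segfault most likely happened in a backgrounded part of the pipeline whose status the shell never propagated to exec().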