I'm running a simple wget command in shell_exec()
wget_file.php
<?php
$command = "wget http://mydomain.co/media/bigbigbig.wav";
$output = shell_exec($command);
echo $output;
?>
According to http://www.php.net/shell_exec I can possibly expect an output:
"The output from the executed command or NULL if an error occurred or the command produces no output."
If I run wget_file.php from the command line, I'll see a display of the wget results.
However, if I run it from a browser, no result is given. (but the file does indeed download successfully)
I plan to execute wget_file.php by calling it via cURL, passing the URL + path.
But it would be nice to get a response from shell_exec() after execution completes.
Does anyone know if I'm just missing something to get an output (running in the browser)?
If I run wget_file.php from the command line, I'll see a display of the wget results
wget doesn't output the file contents to console by default, so presumably you're talking about the status information.
I'd imagine this is output to STDERR rather than STDOUT, and it's STDOUT that shell_exec obtains for you in your PHP script:
when you run your script from the command line, you're still seeing the output because both streams are shared by the single terminal window; it's just that you're seeing it directly from wget and not from your PHP script;
in the case of passing it through Apache and to a browser to satisfy a web request, this terminal context is disconnected from the result the user sees.
In your shell command you can redirect the former to the latter:
$command = "wget http://mydomain.co/media/bigbigbig.wav 2>&1";
The comments on the documentation for shell_exec touch on this, particularly the — er — very first one!
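Putting that together, here's a minimal sketch of the corrected wget_file.php (same URL as above; the <pre>/htmlspecialchars wrapping is only there so the multi-line status text stays readable in a browser):
<?php
// wget writes its progress/status information to STDERR, not STDOUT.
// Redirecting STDERR into STDOUT (2>&1) lets shell_exec() capture it.
$command = "wget http://mydomain.co/media/bigbigbig.wav 2>&1";
$output = shell_exec($command);

// Wrap in <pre> so the status output stays readable when viewed in a browser.
echo "<pre>" . htmlspecialchars((string) $output) . "</pre>";
?>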
Related
I'm trying to run a PHP script locally that scrapes Google with wget and dumps the HTML into temp.html.
Running this command from the terminal works fine:
wget -O temp.html http://www.google.ca
Running this command from PHP also works fine (so it's not a permissions issue):
shell_exec('touch temp.html');
But running this from PHP does not work (does not create temp.html):
shell_exec('wget -O temp.html http://www.google.ca');
Any suggestions? Wrapping that last command in a var_dump() outputs null.
Thanks!
According to man wget, wget -O temp.html http://google.com concatenates all retrieved documents and writes them to temp.html, producing nothing on stdout, so PHP's shell_exec() has nothing to return (null).
The content of the scraped page should therefore be present in temp.html, but shell_exec("wget ...") returns nothing, as no output is produced.
If, as you mention, the webpage you are actually trying to scrape does not work, they may have implemented some sort of bot protection that prevents exactly what you are attempting.
Edit: You can pass - as the output file to print everything to stdout instead. shell_exec("wget -O - https://google.com"); should then return the content of the requested page to your PHP script.
The simplest solution is to provide the full path to the wget binary, as it seems the user that runs your script does not have the same $PATH as you.
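A sketch of that suggestion, assuming the binary lives at /usr/bin/wget (check with which wget; the path is a guess, not something stated in the question), with 2>&1 added so wget's status output ends up in the return value:
<?php
// /usr/bin/wget is an assumed location; verify it with `which wget`.
// 2>&1 folds wget's STDERR status output into what shell_exec() returns.
$out = shell_exec('/usr/bin/wget -O temp.html http://www.google.ca 2>&1');
var_dump($out);
?>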
How about using file_put_contents & file_get_contents instead? This should work without having to worry about wget.
<?php
$filename = 'temp.html';
$address = 'http://www.google.ca';
file_put_contents($filename,file_get_contents($address));
?>
My OS is Ubuntu 14.04 and I have LAMPP installed. I want to execute a Perl file from PHP through my browser. I simply use the exec() function (in PHP) to do that and it works. I have seen similar questions on Stack Overflow, but they aren't related to this.
Example Perl File named test.pl:
#!/usr/bin/perl
print "This is a perl file";
Example PHP File named test.php:
<?php
$perl=exec('perl test.pl',$out,$r); //Works successfully
print_r($out); //Outputs Array ( [0]=>This is a perl file )
?>
But I need to execute some other Perl file.
I can execute that successfully from the command line. Let's assume the name of that file is:
test2.pl
When the command is given on the command line as
perl test2.pl -u argument1 -m argument2 -p testresult
it takes a fraction of a second to execute, and I get the output on the command line.
But when I execute the same command from PHP as:
<?php
$perl=exec('perl test2.pl -u argument1 -m argument2 -p testresult',$out,$r);
print_r($out); //Outputs Array ( )
?>
My output is
Array
(
)
Now I am not getting the output. The Perl file is executing, but I am unable to get the output in $out. I can assure you the Perl file ran, because it also creates a file after execution.
I don't understand why it's not giving me the output.
I have also tried the following functions in PHP already:
exec
system
shell_exec
None of them gives me the output; they work fine for test.pl but not for test2.pl (both are shown above).
My objective is to get the output.
edit: Solved. Thanks to hrunting's answer.
Your second Perl script isn't outputting anything to STDOUT. In your first Perl script, the print statement specifies no output destination, so it will default to STDOUT. In the second Perl script, every print statement either goes to a file or goes to STDERR (with the exception of your --help message). As the PHP exec() function only captures output on STDOUT, you get no output in PHP when you run it, even though you see output when you run it manually.
You have a few options. Two are presented below:
Redirect STDERR to STDOUT when calling exec()
exec('perl test.pl 2>&1', $out, $r);
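Applied to the actual command from the question, that might look like the following sketch (the argument values are the placeholders used above):
<?php
// 2>&1 redirects the Perl script's STDERR messages into STDOUT,
// where exec() can collect them line by line in $out.
exec('perl test2.pl -u argument1 -m argument2 -p testresult 2>&1', $out, $r);

print_r($out);           // every line the Perl script printed
echo "exit code: $r\n";  // non-zero usually indicates a failure
?>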
Write output messages to STDOUT in your Perl script
If the output is expected, I would change your print STDERR calls to simple print calls.
There are more options in this Stack Overflow answer:
PHP StdErr after Exec()
I'm trying to execute a linux command in PHP, here is my sample code:
$command = "last -F";
$o = shell_exec($command);
print_r($o);
Most Linux commands give me output, but for the last -F command I get nothing. Why is that?
Try this explanation: your issue may be that last -F is writing nothing to stdout (or failing outright) when run from PHP. shell_exec() returns NULL if an error occurred or the command produces no output, so you get nothing, nada.
As an alternative, try exec(); this will allow you to capture the return value (success or failure of execution) as well as every line of the command's output. Check it out here
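For instance, a minimal sketch of that alternative, with 2>&1 added so any error message from last is captured as well:
<?php
// exec() fills $lines with each line of output and $status with the exit code.
// 2>&1 folds STDERR into STDOUT so error messages are captured too.
exec('last -F 2>&1', $lines, $status);

echo "exit code: $status\n";
print_r($lines);
?>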
You are executing that command as the web user (nobody or www-data), which has limited privileges. You would have to execute the command as root, and unfortunately giving sudo or full permissions to the web user is a really bad idea. So I recommend a cron job or background script that executes last -F and writes its output to a file; you can then read that file from your PHP script.
You can make a script run in the background like this:
#!/bin/bash
# Refresh the captured output periodically; the sleep (interval is arbitrary)
# keeps this from becoming a tight busy loop.
while true; do
    last -F > /tmp/myfile
    sleep 60
done
Save the code as mycron.sh, then make it executable and start it in the background:
chmod +x mycron.sh
./mycron.sh &
Read the file /tmp/myfile from your PHP Program. It will give the exact output of that command.
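On the PHP side, reading that file back could look like this sketch (/tmp/myfile is the path written by the background script above):
<?php
$file = '/tmp/myfile';

// The background script may not have written the file yet.
if (is_readable($file)) {
    echo nl2br(htmlspecialchars(file_get_contents($file)));
} else {
    echo 'No output captured yet.';
}
?>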
You can easily use exec() or shell_exec() to execute a system command, like ls -l /var/www/mysite and then output the result of the command.
How would one execute and display the result of a command that periodically prints information out to the console?
Say you have a simple Python script. If you run it from the console, it prints some information every 10 seconds, forever, until you force-quit it.
How could you use PHP to execute this Python command and somehow capture and stream the output into the browser in real time? Is something like this possible? I'm thinking there would have to be some sort of Ajax involved, but I'm not sure.
I tried doing something like this:
python myscript.py > output.txt
And then I was planning on maybe using Ajax to periodically tail or cat the content of the output.txt and display in the browser. But output.txt doesn't appear to have any content added to it until after the script has been force-quit.
You don't see any output to output.txt because it's being buffered. For python there's an option to make it line-buffered. From the manpage:
-u     Force the binary I/O layers of stdin, stdout and stderr to be unbuffered. The text I/O layer will still be line-buffered.
So your command would then become:
python -u myscript.py > output.txt
For PHP the flush function should help you.
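As a rough sketch (not a drop-in solution), the PHP side could read the command through popen() and flush each line to the browser as it arrives; myscript.py is the hypothetical script name from the question, and whether the browser actually sees it live also depends on web-server and browser buffering:
<?php
// Turn off PHP's own output buffering so each line can leave immediately.
while (ob_get_level() > 0) {
    ob_end_flush();
}

// -u stops Python from block-buffering stdout; 2>&1 captures errors as well.
$handle = popen('python -u myscript.py 2>&1', 'r');
if ($handle === false) {
    die('could not start the script');
}

while (($line = fgets($handle)) !== false) {
    echo htmlspecialchars($line) . "<br>\n";
    flush(); // push this line to the client now instead of at script end
}

pclose($handle);
?>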
I am trying to use exec(), system(), passthru() or anything else to read the output of iscsiadm -m session, but I am not having much luck and am a little lost.
What I (think i) know:
It is not a sudoers or permission problem, as the results are the same in a terminal or a browser (and my sudoers is already successfully set up to use iscsiadm for login/out)
Executing the following command from a terminal, iscsiadm -m session > /tmp/scsi_sess yields an empty scsi_sess file
What I need to know:
Where is the output getting sent, that I can not read it with a bash or php script but can see it in the terminal?
How can I read the output, or get output sent somewhere that I can read it?
With your syntax you're catching only stdout. You should redirect stderr to stdout as well:
iscsiadm -m session > /tmp/scsi_sess 2>&1
Remember: when you redirect with > file and still see output in the terminal, that output is coming from stderr, not stdout.
http://en.wikipedia.org/wiki/Standard_streams
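The same redirection applies on the PHP side; a minimal sketch:
<?php
// Fold STDERR into STDOUT so whatever iscsiadm prints, including any
// error messages, ends up in $output instead of disappearing.
$output = shell_exec('iscsiadm -m session 2>&1');
echo "<pre>" . htmlspecialchars((string) $output) . "</pre>";
?>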