I'm trying to pass a string to a batch file from PHP using proc_open() on Windows. It works fine unless the string I'm passing is multiline, because the line break breaks the command. I tried various escaping methods, but none of them seems to work:
cmd style - prints the escape symbol and breaks the line:
proc_open('script.bat -m "this is ^\n multiline"', $desc, $pipes)
another try - prints the whole string:
proc_open('script.bat -m "this is ^\\n multiline"', $desc, $pipes)
powershell style - prints the whole string:
proc_open('script.bat -m "this is `n multiline"', $desc, $pipes)
No matter what I tried, it either breaks the string anyway, or prints it as is, with no line break.
What am I missing or doing wrong? How to get multiline arguments to work via proc_open()?
Even if you get it escaped properly, it's nearly impossible for your batch file to fetch a multi-line argument.
You should test the batch script on its own, directly from cmd.exe, before trying to fix the escaping in PHP:
c:\> script.bat This_is_^
More?
More? multiline
A simple echo %* or set "arg=%~1" is not able to handle multi-line arguments, but some hacks can, like:
How to receive even the strangest command line parameters?
Im passing a multi line text as argument which will ... has only 1 line
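Given how fragile cmd.exe argument parsing is, a common workaround is to not pass the multiline text as an argument at all and feed it to the child's stdin instead. A minimal sketch (a PHP one-liner stands in for script.bat, which we don't have here):

```php
<?php
// Workaround sketch: send multiline data via stdin instead of argv,
// so no shell escaping is involved at all.
$descriptors = [
    0 => ['pipe', 'r'], // child's stdin  (we write to it)
    1 => ['pipe', 'w'], // child's stdout (we read from it)
];
$proc = proc_open('php -r "echo stream_get_contents(STDIN);"', $descriptors, $pipes);
fwrite($pipes[0], "this is\nmultiline");
fclose($pipes[0]); // signal EOF so the child finishes reading
$out = stream_get_contents($pipes[1]);
fclose($pipes[1]);
proc_close($proc);
echo $out; // the line break arrives intact
```

A batch script would read the piped text with something like `set /p` or a `for /f` loop instead of `%1`.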
I just cannot fathom how to get the PHP exec() or shell_exec() functions to treat a '*' character as a wildcard. Is there some way to properly encode / escape this character so it makes it through to the shell?
This is on Windows (via a CLI shell script, if that matters; Terminal or git-bash yields the same results).
Take the following scenario:
C:\temp\ contains a bunch of png images.
echo exec('ls C:\temp\*');
// output: ls: cannot access 'C:\temp\*': No such file or directory
Permissions is not the problem:
echo exec('ls C:\temp\example.png');
// output: C:\temp\example.png
Therefore the * character is the problem and is being treated as a literal filename rather than a wildcard. The file named * does not exist, so from that point of view, it's not wrong...
It also does not matter if I use double quotes to encase the command:
echo exec("ls C:\temp\*");
// output: ls: cannot access 'C:\temp\*': No such file or directory
I have also tried other things like:
exec(escapeshellcmd('ls C:\temp\*'));
exec('ls C:\temp\\\*');
exec('ls "C:\temp\*"');
exec('ls "C:\temp\"*');
And nothing works...
I'm pretty confused that I can't find any other posts discussing this, but maybe I'm just missing them. At this point I have already worked around the issue by writing a glob() loop and calling the built-in copy() function on each file individually, but it really bugs me that I don't understand how to make the wildcard work via a shell command.
EDIT:
Thanks to @0stone0 - the answer provided did not directly answer my initial question, but I had not tried using forward slashes in the path, and when I do:
exec('ls C:/temp/*')
It works correctly, and as 0stone0 said, it only returns the last line of the output, which is fine since this was just for proof of concept as I was not actually attempting to parse the output.
Also, on a side note, since posting this question my system had been updated to Win11 22H2 and now for some reason the original test code (with the backslashes) no longer returns the "Cannot access / no file" error message. Instead it just returns an empty string and has no output set to the &$output parameter either. That being said, I'm not sure if the forward slashes would have worked on my system prior to the 22H2 update.
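One extra pitfall worth noting with the double-quoted variant tried above: in a double-quoted PHP string, "\t" is an escape sequence for a TAB character, so "ls C:\temp\*" never even sends the path as typed. A quick demonstration:

```php
<?php
// In double quotes, \t becomes a TAB; in single quotes the backslash survives.
var_dump("C:\temp\*"); // 8 chars: "C:", a literal tab, "emp\*"
var_dump('C:\temp\*'); // 9 chars: backslashes preserved as typed
```

This is another reason forward slashes (or single-quoted strings) are the safer choice for Windows paths in PHP.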
exec() only returns the last line of the output by default.
The wildcard probably works, but the output is just truncated.
Pass a variable by reference to exec() and log that:
<?php
$output = [];
exec('ls -lta /tmp/*', $output);
var_dump($output);
Without any additional changes, this returns the same as when I run ls -lta /tmp/* in my Bash terminal.
That said, glob() is still the preferred way of getting data like this, especially since
You shouldn't parse the output of ls
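For completeness, a glob() sketch that avoids the shell entirely:

```php
<?php
// glob() expands the wildcard inside PHP itself, so no shell
// (and no ls, and no escaping) is involved.
foreach (glob('/tmp/*') as $path) { // e.g. 'C:/temp/*.png' on Windows
    echo $path, PHP_EOL;
}
```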
I have a PHP file that runs a node script using exec() to gather the output, like so:
$test = exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/");
echo $test;
It outputs a JSON string of data tied to the website in the --url parameter. It works great, but sometimes the output string is cut short.
When I run the command in the exec() script directly, I get the full output, as expected.
Why would this be? I've also tried running shell_exec() instead, but the same thing happens: the output is cut short.
Is there a setting in php.ini or somewhere else to increase the size of output strings?
It appears the only way to get this working is by redirecting the output of exec() to a temp file, like this:
exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/ > /home/user/www/uploads/json.txt");
$json = file_get_contents('/home/user/www/uploads/json.txt');
echo $json;
I would prefer to have the direct output and tried increasing output_buffering in php.ini with no change (output still gets cut off).
Definitely open to other ideas to avoid the temp file, but could also live with this and just unlink() the file on each run.
exec() only returns the last line of the output of the command you pass to it. Per the section marked Return Value of the following documentation:
The last line from the result of the command. If you need to execute a command and have all the data from the command passed directly back without any interference, use the passthru() function.
To get the output of the executed command, be sure to set and use the output parameter.
https://www.php.net/manual/en/function.exec.php
To do what you are trying to do, you need to pass the function an array to store the output, like so:
exec("/usr/local/bin/node /home/user/www/bin/start.js --url=https://www.example.com/", $output);
echo implode("\n", $output);
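To see the difference, here is a sketch with a stand-in command that prints two lines (printf is used as a POSIX stand-in for the node command from the question):

```php
<?php
// exec() returns only the LAST output line; the by-ref array gets them all.
$last = exec('printf "first\nsecond\n"', $lines);
echo $last, PHP_EOL;        // "second"
echo implode("\n", $lines); // "first\nsecond"
```

This is why a long single-line JSON payload appears "cut short": only its final line survives the return value.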
I am trying to pass some values from a PHP file to a bash script, and I am getting an ERROR CACHE_MISS response.
The variable coreTown holds the value 'Houston, TX'. It must be in that format for the bash script to work.
Results of a test to prove the variables are correct
WorkString531cdf6b8b3451.99781853 OutString531cdf6b8b3451.99781853 Houston, TX
Execute the bash script.
$errorTrap=shell_exec("./Find-Town.sh $workPath $outPath $coreTown");
Bash script:
#!/bin/bash
set -x
InFile="./zipcode.txt"
"$Work1"="$1"
"$OutFile"="$2"
"$InString"="$3"
echo "$1";
echo "$2";
echo "$3";
Returned by the 'echo' in the script:
WorkString531cdf6b8b3451.99781853 OutString531cdf6b8b3451.99781853 Houston,
Notice the state (TX) is missing. If I put echo "$4" in there, it will display the 'TX'.
Is one of these languages handling the content of coreTown ('Houston, TX') as an array? If so, which one, and how do I fix it? My Google searches did not address this problem.
Since $coreTown contains a space, that's being treated as an argument delimiter in the shell; you need to quote it or escape the space. Luckily, PHP has a function that does that for you: escapeshellarg.
$workPathEsc = escapeshellarg($workPath);
$outPathEsc = escapeshellarg($outPath);
$coreTownEsc = escapeshellarg($coreTown);
$errorTrap = shell_exec("./Find-Town.sh $workPathEsc $outPathEsc $coreTownEsc");
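On POSIX systems, escapeshellarg() wraps the value in single quotes so the space no longer splits it into two arguments. A quick check:

```php
<?php
// escapeshellarg() turns the whole value into a single shell word.
echo escapeshellarg('Houston, TX'), PHP_EOL; // 'Houston, TX' (on POSIX shells)
```

(On Windows the quoting style differs, but the effect is the same: the value arrives as one argument.)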
If you create a simple php script with this code:
shell_exec('"');
And run it with:
php myscript.php
It gives the following error in bash on my Mac:
sh: -c: line 0: unexpected EOF while looking for matching `"'
sh: -c: line 1: syntax error: unexpected end of file
I've tried everything I can think of:
ob_start();
@shell_exec('"');
ob_end_clean();
@shell_exec('" 2> /dev/null');
But no matter what I try, I can't suppress the message. The problem is that I'm creating a unit test to stress test $argc, $argv and the getopt() command for a general purpose tool, and I need to call the script hundreds of times with various inputs containing random characters like '"=: etc.
So I need to be able to either suppress the error output, or detect imbalanced quotes and append either a ' or " to the end of the string. I found this regex to detect single and double quoted strings:
PHP: Regex to ignore escaped quotes within quotes
But I'm having trouble visualizing how to do this for a general bash command that has a mix of quoted and unquoted arguments. Is there a built-in command that would tell me if the string is acceptable without throwing an error? Then I could try appending either "'" or '"' and I'm fairly certain that one of them would close the string. I'm only concerned with preventing the error message for now, because I'm just throwing random input at the script at this point.
Thanks!
The child shell process is writing the error message to STDERR, which it inherited from the parent PHP process. You could close the parent's STDERR file handle in the PHP script before running shell_exec().
<?php
fclose (STDERR);
shell_exec('"');
I am trying to make a PDF file with wkhtmltopdf. When I pass the URL www.example.com the PDF is generated, and www.example.com?id=1 works too,
but when I try to add another parameter, the command execution stops working:
www.example.com?id=1&type=u
shell_exec("c:\pdf\wkhtmltopdf.exe
http://localhost/test/index.php?id=1&typee=abc
test.pdf ");
I tried to use it via the command line too, but it's not working there either.
Thanks for the help
The & is causing your command to fail, as it has special meaning in the shell. Use escapeshellarg() to escape those characters first.
Use escapeshellarg() to escape parameters before passing them to the command line.
This is also mandatory when passing external data (e.g. user input) as parameters.
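A sketch of the fixed call, with the path and URL taken from the question (the command is only echoed here rather than executed, since wkhtmltopdf won't exist everywhere):

```php
<?php
// Quote the URL so '&' is passed to wkhtmltopdf instead of being
// interpreted by the shell as a background-job separator.
$url = 'http://localhost/test/index.php?id=1&typee=abc';
$cmd = 'c:\pdf\wkhtmltopdf.exe ' . escapeshellarg($url) . ' test.pdf';
echo $cmd, PHP_EOL; // hand this string to shell_exec() in the real setup
```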