I have a string in the following format:
(str_rot13(base64_decode("my string")))
I am trying to decode it using a single Linux command by piping the output of the Base64 decode into rot13.
I am attempting to use echo 'my string' | base64 --decode then pipe the output to tr 'n-za-mN-ZA-M' ‘a-zA-Z’ which applies the Rot13 decode operation on the output.
Can you guide me to the best possible way to do this using the command line?
Edit
Apologies guys, I was looking at the partial script. I just noticed that the complete script is something like this:
<?php eval(gzinflate(str_rot13(base64_decode('my string')))); ?>
You should be able to pipe those two commands straight through (the only possible issue I can see is the 'curly' quotes in your tr command).
PHP
php > echo str_rot13(base64_decode("c2JiCg=="));
foo
Bash
echo 'c2JiCg==' | base64 --decode | tr 'n-za-mN-ZA-M' 'a-zA-Z'
foo
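If you also need to reverse the gzinflate() step from the edit, one option (a sketch, assuming the php CLI is installed) is to finish the pipeline with a small php -r snippet, since gzinflate() works on raw DEFLATE data that the usual shell decompressors don't handle directly:
echo 'my string' | base64 --decode | tr 'n-za-mN-ZA-M' 'a-zA-Z' | php -r 'echo gzinflate(file_get_contents("php://stdin"));'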
Related
Take the output of gzdeflate(), for example:
$a = gzdeflate('..........');
echo $a . "\n" . strlen($a);
I get output like:
?Ӄ
5
So I've got a 5 byte string that contains characters which cannot be outputted properly, and hence cannot be copy and pasted.
Obviously, echo gzinflate('?Ӄ'); doesn't work, but echo gzinflate($a) does.
Is there any way to get the actual contents of $a onto my clipboard or output it in such a way that I could copy and paste it into gzinflate() to retrieve the original string? The only workaround I've found is something like:
$a = base64_encode(gzdeflate('..........'));
echo $a;
Which gives me:
09ODAQA=
That's friendly enough to do echo gzinflate(base64_decode('09ODAQA=')); and get .........., but I'd like to skip the base64 functions if possible.
The problem is that you're channeling binary data through a text medium. If you require the data to be printed out on your screen where you will select and copy it, there's no way to transport binary data like that.
If this is happening on the command line, you could do it programmatically without displaying the actual contents. Take OS X's pbcopy and pbpaste commands:
$ php test.php | pbcopy
$ pbpaste | someotherprogram
If you do require a visible textual representation, you need to ensure that the output is ASCII-safe (or at least "Unicode-safe") and not raw binary data. For that you will need to base64- or hex-encode your binary data.
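For example, a hex-encoded version of the same workaround (a sketch, equivalent to the base64 approach in the question) stays printable:
$a = bin2hex(gzdeflate('..........'));
echo $a;                       // printable hex, safe to copy and paste
echo gzinflate(hex2bin($a));   // recovers the original string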
I guess your browser or console just eats some special characters, like line breaks and others.
You can write this string to a file (file_put_contents()) and open that file with, for example, Notepad++; you will see these special characters there.
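A minimal sketch of that, with deflated.bin as an arbitrary file name:
file_put_contents('deflated.bin', $a);   // $a = gzdeflate('..........') from the question
// open deflated.bin in Notepad++ (or a hex editor) to inspect the raw bytes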
How can I convert/decode html entities of a file's contents (XML) in PHP.
I tried to run this on a command line:
perl -MHTML::Entities -ne 'print decode_entities($_)' /apps/www/mydir/xmlfiles/p34580600.xml >> /apps/www/mydir/xmlfiles/p34580600_1.xml
It works fine running it on command line but when I try to call it within PHP:
system("perl -MHTML::Entities -ne 'print decode_entities($_)' /apps/www/mydir/xmlfiles/p34580600.xml >> /apps/www/mydir/xmlfiles/p34580600_6.xml");
It creates the file, but it is empty. I tried to use html_entity_decode, but the XML file is just too big: at least 20 MB.
Any help or suggestion is greatly appreciated.
Thanks,
Try escaping the $ in the system call. It is possible PHP is looking for a variable $_, since you are using double quotes for the command string.
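For example (a sketch of that fix, escaping the dollar sign so PHP passes $_ through to Perl literally):
system("perl -MHTML::Entities -ne 'print decode_entities(\$_)' /apps/www/mydir/xmlfiles/p34580600.xml >> /apps/www/mydir/xmlfiles/p34580600_6.xml");
Using single quotes for the PHP string would work as well, since PHP does not interpolate variables inside single-quoted strings.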
I have a PHP script that creates large, complex tables. I am attempting to set up a shell script which would accept an ID as an argument, run the PHP script with that ID, and capture its HTML output for use as part of a cURL post to DocRaptor to make a PDF.
Example shell script looks like this and I want the document_content to be my generated HTML.
myHTML = usr/bin/php mytablemaker.php?id=$1
curl -H "Content-Type:application/json" -d'{"user_credentials":"API_KEY", "doc":{"name":"docraptor_sample.pdf", "document_type":"pdf", "test":"true", "document_content":"myHTML"}}' http://docraptor.com/docs > docraptor_sample.pdf
How do I do this correctly?
If that is bash, something like this should work:
myHTML=$(/usr/bin/php mytablemaker.php?id=$1)
curl -H "Content-Type:application/json" -d'{"user_credentials":"API_KEY", "doc":{"name":"docraptor_sample.pdf", "document_type":"pdf", "test":"true", "document_content":"'"$myHTML"'"}}' http://docraptor.com/docs > docraptor_sample.pdf
However, you don't ask for HTML but for HTML as a JSON string, so make your PHP script encode the string as JSON (see json_encode), or do an addcslashes($output, '"') on the " characters.
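A minimal sketch of the json_encode route, assuming the PHP script collects its HTML in $output (as in the addcslashes example above):
// at the end of mytablemaker.php: print the HTML already encoded as a JSON string
echo json_encode($output);
json_encode() wraps the value in double quotes and escapes it, so the curl command would then embed $myHTML without adding its own quotes around it.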
See as well:
Bash Scripting: Redirect output into variable
The best way is to modify that mytablemaker.php to take the command line use case into account.
e.g. like this:
if(isset($argv[1])) {
$id=$argv[1];
} else {
$id=$_GET["id"];
}
Then from BASH you do:
# Get HTML from PHP script and escape quotes and
# backslashes in string to comply to the JSON specification
myHTML=$(/usr/bin/php -f mytablemaker.php $1 | sed -e 's/[\\"]/\\&/g')
# Put the value of myHTML in a JSON call and send it to the server
curl -H "Content-Type:application/json" -d'{"user_credentials":"API_KEY", "doc":{"name":"docraptor_sample.pdf", "document_type":"pdf", "test":"true", "document_content":"'"$myHTML"'"}}' http://docraptor.com/docs -o docraptor_sample.pdf
Note the string concatenation done at the last line:
'first part'"second part"'third part'
The examples supplied did not mention a document_url parameter but DocRaptor's error message did.
Working code using what I learned from hakre and anttix!
curl -H "Content-Type:application/json" -d'{"user_credentials":"API_KEY", "doc":{"name":"docraptor_sample.pdf", "document_type":"pdf", "test":"false", "document_url":"'"http://foo.com/tablemaker.php?CTN=$1"'"}}' http://docraptor.com/docs -o docraptor_sample.pdf
I want to use PHP's strip_tags in a bash script. I figured I could just cat the HTML file I want to use, pipe that input into php, and then pipe that into something else (sed). Is that possible? I'm not sure exactly how to pipe the contents of file.html into the strip_tags function... maybe put it all in a variable? I want the following to keep just the anchor tags; in the following I put in dummy text for the strip_tags string because I didn't know how to pipe file.html in:
cat file.html | php strip_tags("<p><a href='#'>hi</a></p>",'<a>') > removed_tags.html
You can read from STDIN in PHP using the stream URI php://stdin. As for executing it, you'll also need to quote the PHP code and use the -r option, as well as echoing the result. So here's the fixed script:
cat file.html | php -r "echo strip_tags(file_get_contents('php://stdin'), '<a>');" > removed_tags.html
Reading from stdin in PHP and writing a php script without a file are possible, but it's way more trouble than just writing a file like
<?php echo strip_tags(file_get_contents($argv[1]), '<a>');
...
$ php that-file.php file.html > removed_tags.html
Is it possible to pipe data using Unix pipes into a command-line PHP script? I've tried
$> data | php script.php
But the expected data did not show up in $argv. Is there a way to do this?
PHP can read from standard input, and also provides a nice shortcut for it: STDIN.
With it, you can use things like stream_get_contents and others to do things like:
$data = stream_get_contents(STDIN);
This will just dump all the piped data into $data.
If you want to start processing before all data is read, or the input size is too big to fit into a variable, you can use:
while(!feof(STDIN)){
$line = fgets(STDIN);
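// process $line here, e.g. echo strtoupper($line);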
}
STDIN is just a shortcut for $fh = fopen("php://stdin", "r");
The same methods can be applied to reading and writing files, and TCP streams.
As I understand it, $argv will show the arguments of the program, in other words:
php script.php arg1 arg2 arg3
But if you pipe data into PHP, you will have to read it from standard input. I've never tried this, but I think it's something like this:
$fp = fopen("php://stdin", "r");
// read from $fp as you would from any other file handle
If your data is on one line, you can also use either the -F or -R flag (-F parses and executes the file following it for every input line, -R runs the code following it for every input line). If you use these flags, the string that has been piped in will appear in the (regular) global variable $argn.
Simple example:
echo "hello world" | php -R 'echo str_replace("world","stackoverflow", $argn);'
You can pipe data in, yes. But it won't appear in $argv. It'll go to stdin. You can read this several ways, including fopen('php://stdin','r')
There are good examples in the manual
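A minimal sketch of that fopen() approach:
$stdin = fopen('php://stdin', 'r');
while (($line = fgets($stdin)) !== false) {
    echo $line;   // or process each line as it arrives
}
fclose($stdin);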
This worked for me:
stream_get_contents(fopen("php://stdin", "r"));
Came upon this post looking to make a script that behaves like a shell script, executing another command for each line of the input... ex:
ls -ln | awk '{print $9}'
If you're looking to make a php script that behaves in a similar way, this worked for me:
#!/usr/bin/php
<?php
$input = stream_get_contents(fopen("php://stdin", "r"));
$lines = explode("\n", $input);
foreach($lines as $line) {
$command = "php next_script.php '" . $line . "'";
$output = shell_exec($command);
echo $output;
}
If you want it to show up in $argv, try this:
echo "Whatever you want" | xargs php script.php
That would convert whatever goes into standard input into command line arguments.
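For example, if script.php contains nothing but <?php print_r($argv); (a throwaway script used here only to show where the piped text lands):
echo "Whatever you want" | xargs php script.php
Here $argv ends up as [script.php, Whatever, you, want]; xargs splits its input on whitespace by default, so each word becomes a separate argument.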
The best option is to use the -r option and take the data from stdin. For example, I use it to easily decode JSON using PHP.
This way you don't have to create a physical script file.
It goes like this:
docker inspect $1|php -r '$a=json_decode(stream_get_contents(STDIN),true);echo str_replace(["Array",":"],["Shares"," --> "],print_r($a[0]["HostConfig"]["Binds"],true));'
This piece of code will display shared folders between host & a container.
Please replace $1 with the container name, or wrap the command in a shell function such as displayshares() { ... }.
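A sketch of that wrapper, written as a shell function (the name displayshares comes from the answer; the body is just the one-liner above with $1 as the container name):
displayshares() {
    docker inspect "$1" | php -r '$a=json_decode(stream_get_contents(STDIN),true);echo str_replace(["Array",":"],["Shares"," --> "],print_r($a[0]["HostConfig"]["Binds"],true));'
}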
I needed to take a CSV file and convert it to a TSV file. Sure, I could import the file into Excel and then re-export it, but where's the fun in that when piping the data through a converter means I can stay in the command line and get the job done easily!
So, my script (called csv2tsv) is
#!/usr/bin/php
<?php
while(!feof(STDIN)){
echo implode("\t", str_getcsv(fgets(STDIN))), PHP_EOL;
}
I chmod +x csv2tsv.
I can then run cat data.csv | csv2tsv > data.tsv, and I now have my data as a TSV!
OK, there's no error checking (is the data an actual CSV file?), etc., but the principle works well.
And of course, you can chain as many commands as you need.
If you want to expand on this idea, how about the ability to pass additional options to your command?
Simple!
#!/usr/bin/php
<?php
$separator = $argv[1] ?? "\t";
while(!feof(STDIN)){
echo implode($separator, str_getcsv(fgets(STDIN))), PHP_EOL;
}
Now I can override the default separator, changing it from a tab to something else. A | maybe!
cat data.csv | csv2tsv '|' > data.psv
Hope this helps and allows you to see how much more you can do!