Awk with PHP and MySQL - php

I'm trying to use PHP with awk. The awk command just prints out the password of a database so I can feed it to the PHP code that connects to MySQL and runs the rest of the code.
My awk code looks something like this (in the PHP file):
$pass = system("awk -F'=' '/Mydbpass/ {print \$2}' file.conf");
That code works perfectly, but it prints the password when I open the PHP file in my browser. How can I make the PHP MySQL code read it without having it printed? I would use include, but file.conf doesn't store the password as a PHP variable. If there's any other way to do this, please share.

One way would be to use the exec function, which captures the output of the command being executed instead of sending it to the browser.
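A minimal sketch of that approach, reusing the awk command from the question (illustrative only, not hardened):
<?php
// exec() collects the command's output in an array instead of echoing it.
$output = array();
exec("awk -F'=' '/Mydbpass/ {print \$2}' file.conf", $output);
$pass = isset($output[0]) ? trim($output[0]) : '';
// $pass can now be handed to mysqli_connect() etc. without ever being printed.
?>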
However, it would most likely be much better to read the file directly in your PHP script and parse it there. Something like:
#open the file
$fp = fopen('file.conf', 'r');
if ($fp) {
    while (($line = fgets($fp)) !== false) {
        #split the line using = as delimiter
        $cols = array_map('trim', explode('=', $line));
        #do something with the columns
        print_r($cols);
    }
    fclose($fp);
}

Use parse_ini_file(); it is best suited for this kind of thing. There's no need to use the system function and awk, you can achieve this in PHP itself.
<?php
$content=parse_ini_file("your.conf");
// Your password
$password = $content['Mydbpass'];
?>
For example, if you have an input file like the one below:
Input
$ cat test.conf
Mydbpass=somesecretpass
Mydbuser=user12344
Output
$ php -r '$content=parse_ini_file("test.conf");print_r($content);'
Array
(
    [Mydbpass] => somesecretpass
    [Mydbuser] => user12344
)

Related

PHP strpos() not working

I am trying to get PHP to search a text file for a string. I know the string exists in the text, PHP can display all the text, and yet strpos returns false.
Here is my code:
<?php
$pyscript = "testscript.py";
//$path = "C:\\Users\\eneidhart\\Documents\\Python Scripts\\";
$process_path = "C:\\Users\\eneidhart\\Documents\\ProcessList.txt";
//$processcmd = "WMIC /OUTPUT: $process PROCESS get Caption,Commandline,Processid";
$process_file = fopen($process_path, "r") or die("Unable to open file!");
$processes = fread($process_file);
if (strpos($processes, $pyscript) !== FALSE) {
    echo "$pyscript found";
} elseif (strpos($processes, $pyscript) === FALSE) {
    echo "$pyscript NOT found :(";
} else {
    echo "UHHHHHHHH...";
}
echo "<br />";
while (!feof($process_file)) {
    echo fgets($process_file)."<br />";
}
fclose($processfile);
echo "End";
?>
The while loop will print out every line of the text file, including
python.exe python testscript.py
but strpos still can't seem to find "testscript.py" anywhere in it.
The final goal of this script is not necessarily to read that text file, but to check whether or not a particular python script is currently running. (I'm working on Windows 7, by the way.) The text file was generated using the commented-out $processcmd, and I've tried having PHP return the output of that command like this:
$result = `$processcmd`;
but no value was returned. Something about the format of this output seems to be disagreeing with PHP, which would explain why strpos isn't working, but this is the only command I know of that will show me which python script is running, rather than just showing me that python.exe is running. Is there a way to get this text readable, or even just a different way of getting PHP to recognize that a python script is running?
Thanks in advance!
EDIT:
I think I found the source of the problem. I created my own text file (test.txt) which only contained the string I was searching for, and used file_get_contents as was suggested, and that worked, though it did not work for the original text file. Turns out that the command listed under $processcmd creates a text file with Unicode encoding, not ANSI (which my test.txt was encoded in). Is it possible for that command to create a text file with a different encoding, or even simpler, tell PHP to use Unicode, not ANSI?
You can use the functions preg_grep() and file():
$process_path = "C:\\Users\\eneidhart\\Documents\\ProcessList.txt";
$results = preg_grep('/\btestscript\.py\b/', file($process_path));
if (count($results)) {
    echo "string was found";
}
You should follow the advice given in the first comment and use either:
file_get_contents($process_path);
or
fread($process_file, filesize($process_path));
If that fix is not enough and there is actually a problem on strpos (which shouldn't be the case), you can use:
preg_match("/.*testscript\.py.*/", $processes)
NB: really try to use strpos and not preg_match; the documentation advises against using preg_match when a plain string search is enough.
Well, I found the answer. Thanks to those of you who suggested using file_get_contents(), as I would not have gotten here without that advice. Turns out that WMIC outputs Unicode, and PHP did not like reading that. The solution was another command which converts Unicode to ANSI:
cmd.exe /a /c TYPE unicode_file.txt > ansi_file.txt
I hope this helps, for those of you out there trying to check if a particular python script is working, or if you're just trying to work with WMIC.
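If you would rather stay in PHP than shell out to cmd.exe, a sketch like this should also work, assuming the WMIC output is UTF-16LE (the usual case); the path is the one from the question:
<?php
// Read the raw WMIC output and convert it to UTF-8 so strpos() can search it.
$raw = file_get_contents("C:\\Users\\eneidhart\\Documents\\ProcessList.txt");
$processes = mb_convert_encoding($raw, 'UTF-8', 'UTF-16LE');
if (strpos($processes, 'testscript.py') !== false) {
    echo "testscript.py found";
}
?>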

Perl CGI script pass request to PHP

Trying to make a simple Perl script that looks at a GET parameter to determine which php version to use, and then pass on the request. Here is the whole script:
#!/usr/bin/perl
use FCGI;
$cnt = 0;
local ($buffer, @pairs, $pair, $name, $value);
while (FCGI::accept >= 0) {
    $php = "php";
    $ENV{PHP_FCGI_CHILDREN} = 3;
    $ENV{PHP_FCGI_MAX_REQUESTS} = 5000;
    $buffer = $ENV{'QUERY_STRING'};
    @pairs = split(/&/, $buffer);
    foreach $pair (@pairs) {
        ($name, $value) = split(/=/, $pair);
        if ($name == "php") {
            $php = "php".$value;
        }
    }
    print "Content-Type: text/html\r\n\r\n";
    print `$php $ENV{PATH_TRANSLATED}`;
}
The idea is that the PHP version can be switched with a GET parameter... that part seems to be working fine when I test with phpversion().
So this thing seems to be "working", but a test file with a simple <?php phpinfo(); ?> outputs a pure string, NOT the formatted HTML. It gives the exact output as if phpinfo() were run from the command line, because that's exactly what's going on.
So the two parts of my question are:
Is this actually a problem?
How would I "pass the request" to PHP, instead of invoking the command line?
Nice command injection vulnerability you built there.
QUERY_STRING='0=5;echo "fail_at_Web_security_forever";rm -rf /'
String comparison is eq, not ==. You must validate user input: white-list acceptable input and reject everything else. The lack of a standard CGI parameter parsing library is typical for bad code like this: use CGI.pm or similar.
To forward/proxy a request, call PHP via HTTP: use LWP::UserAgent or similar.

How to pass a file as an argument to php exec?

I would like to know how I can pass the content of a file (CSV in my case) as an argument to a command-line executable (in C or Objective-C) to be called by exec in PHP.
Here is what I have done: the user loads the content of their file from a URL like this:
http://www.myserver.com/model.php?fileName=test.csv
Then the following code allows php to parse and load the csv file:
<?php
$f = $_GET['fileName'];
$handle = fopen("$f", "r");
$data = array();
while (($line = fgetcsv($handle)) !== FALSE) {
    $data[] = $line;
}
?>
Where I'm stuck is how to pass the content of this CSV file as an argument to exec. Even if I can assume the CSV has only two columns, how many rows it has is user-specific, so I cannot pass all the values one by one as parameters, e.g.
exec("/path_to_executable/model -a $data[0][0] -b $data[0][1] .....");
The only alternative solution I guess would be to write something like that:
exec("/path_to_executable/model -fileName test.csv");
and have the command-line executable do the CSV parsing, but in that case I think the CSV file needs to be physically written on the server side. I'm wondering what happens if several people access the webpage at the same time with their own CSV files: would they overwrite each other's files?
I guess there must be a more proper way to do this that I have not figured out. Any ideas? Thanks!
I would recommend having that data on disk, and loading it within the command line utility - it is much less messing about. But if you can't do that, just pass it in 1 (unparsed) line at a time:
$command = "/path_to_executable/model";
foreach ($fileData as $line) {
    $command .= ' ' . escapeshellarg($line);   // escapeshellarg() already adds the surrounding quotes
}
exec($command);
Then you can just fetch the data into your utility by looping over argv, where argv[1] is the first line, argv[2] is the second line, and so on (argv[0] is the program name).
You could use popen() to get a handle on the process to write to. If you need to go both ways (read/write) and might require some more power, have a look at proc_open().
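For example, a minimal popen() sketch (this assumes the executable reads its CSV data from standard input rather than from argv; the path is the one from the question):
<?php
// Open the utility for writing and stream the CSV rows to its stdin.
$proc = popen('/path_to_executable/model', 'w');
if ($proc) {
    foreach ($data as $row) {   // $data as built in the question's fgetcsv loop
        fwrite($proc, implode(',', $row) . "\n");
    }
    pclose($proc);
}
?>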
You could also just write your data to some random file (to avoid multiple users kicking each other's race-conditioned butts). Something along the lines of
<?php
$csv = file_get_contents('http://www.myserver.com/model.php?fileName=test.csv');
$filename = '/tmp/' . uniqid(sha1($csv)) . '.csv';
file_put_contents($filename, $csv);
exec('/your/thing < ' . escapeshellarg($filename));
unlink($filename);
And since you're also in charge of the executable, you might figure out how to get the number of arguments passed (hint: argc) and read them in (hint: argv). Passing them through line by line would look like this:
<?php
$csvRow = fgetcsv($fh);
if ($csvRow) {
    $escaped = array_map('escapeshellarg', $csvRow);
    exec('/your/thing ' . join(' ', $escaped));
}

piping data into command line php?

Is it possible to pipe data using unix pipes into a command-line PHP script? I've tried
$> data | php script.php
But the expected data did not show up in $argv. Is there a way to do this?
PHP can read from standard input, and also provides a nice shortcut for it: STDIN.
With it, you can use things like stream_get_contents and others to do things like:
$data = stream_get_contents(STDIN);
This will just dump all the piped data into $data.
If you want to start processing before all data is read, or the input size is too big to fit into a variable, you can use:
while (!feof(STDIN)) {
    $line = fgets(STDIN);
}
STDIN is just a shortcut for $fh = fopen("php://stdin", "r");.
The same methods can be applied to reading and writing files, and TCP streams.
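For instance, the same fgets() loop works unchanged on a TCP stream (the host and request here are just placeholders):
<?php
// Open a TCP connection and read the response line by line, exactly like STDIN.
$fh = fsockopen('example.com', 80, $errno, $errstr, 5);
if ($fh) {
    fwrite($fh, "GET / HTTP/1.0\r\nHost: example.com\r\n\r\n");
    while (!feof($fh)) {
        echo fgets($fh);
    }
    fclose($fh);
}
?>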
As I understand it, $argv will show the arguments of the program, in other words:
php script.php arg1 arg2 arg3
But if you pipe data into PHP, you will have to read it from standard input. I've never tried this, but I think it's something like this:
$fp = fopen("php://stdin", "r");
// read $fp as if it were a file handle
If your data is all on one line, you can also use either the -F or -R flag (-F parses & executes the file following it for each input line, -R executes the given code for each input line). If you use these flags, the string that has been piped in will appear in the (regular) global variable $argn.
Simple example:
echo "hello world" | php -R 'echo str_replace("world","stackoverflow", $argn);'
You can pipe data in, yes. But it won't appear in $argv. It'll go to stdin. You can read this several ways, including fopen('php://stdin','r')
There are good examples in the manual
This worked for me:
stream_get_contents(fopen("php://stdin", "r"));
I came upon this post looking to make a script that behaves like a shell script, executing another command for each line of the input, e.g.:
ls -ln | awk '{print $9}'
If you're looking to make a php script that behaves in a similar way, this worked for me:
#!/usr/bin/php
<?php
$input = stream_get_contents(fopen("php://stdin", "r"));
$lines = explode("\n", $input);
foreach ($lines as $line) {
    // escapeshellarg() keeps lines containing quotes or spaces intact
    $command = "php next_script.php " . escapeshellarg($line);
    $output = shell_exec($command);
    echo $output;
}
If you want it to show up in $argv, try this:
echo "Whatever you want" | xargs php script.php
That would convert whatever goes into standard input into command-line arguments.
The best option is to use the -r option and take the data from stdin; for example, I use it to easily decode JSON with PHP.
This way you don't have to create a physical script file.
It goes like this:
docker inspect $1|php -r '$a=json_decode(stream_get_contents(STDIN),true);echo str_replace(["Array",":"],["Shares"," --> "],print_r($a[0]["HostConfig"]["Binds"],true));'
This piece of code will display the shared folders between the host & a container.
Please replace $1 with the container name, or put it in a bash function, e.g. displayshares() { ... }
I needed to take a CSV file and convert it to a TSV file. Sure, I could import the file into Excel and then re-export it, but where's the fun in that when piping the data through a converter means I can stay on the command line and get the job done easily!
So, my script (called csv2tsv) is:
#!/usr/bin/php
<?php
while (!feof(STDIN)) {
    echo implode("\t", str_getcsv(fgets(STDIN))), PHP_EOL;
}
I chmod +x csv2tsv.
I can then run it as cat data.csv | csv2tsv > data.tsv and I now have my data as a TSV!
OK, there's no error checking (is the data an actual CSV file?), etc., but the principle works well.
And of course, you can chain as many commands as you need.
If you want to expand on this idea, then how about the ability to include additional options in your command?
Simple!
#!/usr/bin/php
<?php
$separator = $argv[1] ?? "\t";
while (!feof(STDIN)) {
    echo implode($separator, str_getcsv(fgets(STDIN))), PHP_EOL;
}
Now I can override the default separator, replacing the tab with something else. A | maybe!
cat data.csv | csv2tsv '|' > data.psv
Hope this helps and allows you to see how much more you can do!

Efficient flat file searching in PHP

I'd like to store 0 to ~5000 IP addresses in a plain text file, with an unrelated header at the top. Something like this:
Unrelated data
Unrelated data
----SEPARATOR----
1.2.3.4
5.6.7.8
9.1.2.3
Now I'd like to find if '5.6.7.8' is in that text file using PHP. I've only ever loaded an entire file and processed it in memory, but I wondered if there was a more efficient way of searching a text file in PHP. I only need a true/false if it's there.
Could anyone shed any light? Or would I be stuck with loading in the whole file first?
Thanks in advance!
5000 isn't a lot of records. You could easily do this:
$addresses = explode("\n", file_get_contents('filename.txt'));
and search it manually and it'll be quick.
If you were storing a lot more I would suggest storing them in a database, which is designed for that kind of thing. But for 5000 I think the full load plus brute force search is fine.
Don't optimize a problem until you have a problem. There's no point needlessly overcomplicating your solution.
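As a sketch of the full load plus search (the filename and the separator handling are assumptions based on the example file above):
<?php
// Load the file, drop the header above the separator, and check membership.
$lines = array_map('trim', file('filename.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));
$sep = array_search('----SEPARATOR----', $lines, true);
$addresses = ($sep === false) ? $lines : array_slice($lines, $sep + 1);
var_dump(in_array('5.6.7.8', $addresses, true));
?>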
I'm not sure whether Perl needs to load the whole file to handle it, but you could do something similar to this:
<?php
...
$result = system("perl -ne 'print if /5\.6\.7\.8/' yourfile.txt");
if ($result)
    ....
else
    ....
...
?>
Another option would be to store the IPs in separate files based on the first or second group:
# 1.2.txt
1.2.3.4
1.2.3.5
1.2.3.6
...
# 5.6.txt
5.6.7.8
5.6.7.9
5.6.7.10
...
... etc.
That way you wouldn't necessarily have to worry about the files being so large you incur a performance penalty by loading the whole file into memory.
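A sketch of how a lookup could work with that layout (the directory and the ipInList() helper are made up for illustration):
<?php
// Derive the bucket file from the first two octets, then search only that file.
function ipInList($ip, $dir = '/path/to/ip-buckets') {
    list($a, $b) = explode('.', $ip);
    $bucket = "$dir/$a.$b.txt";
    if (!is_file($bucket)) {
        return false;
    }
    return in_array($ip, array_map('trim', file($bucket, FILE_IGNORE_NEW_LINES)), true);
}
var_dump(ipInList('5.6.7.8'));
?>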
You could shell out and grep for it.
You might try fgets()
It reads a file line by line. I'm not sure how much more efficient this is, though; I'm guessing that if the IP is towards the top of the file it would be more efficient, and if the IP is towards the bottom it would be less efficient than just reading in the whole file.
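A sketch of that line-by-line approach (filename assumed):
<?php
// Scan line by line and stop at the first match, so a hit near the top of the
// file means very little of it is actually read.
$needle = '5.6.7.8';
$found = false;
$fp = fopen('filename.txt', 'r');
if ($fp) {
    while (($line = fgets($fp)) !== false) {
        if (trim($line) === $needle) {
            $found = true;
            break;
        }
    }
    fclose($fp);
}
var_dump($found);
?>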
You could use the grep command with backticks in your script on a Linux server. Something like:
$searchFor = '5.6.7.8';
$file = '/path/to/file.txt';
$grepCmd = `grep $searchFor $file`;
echo $grepCmd;
I haven't tested this personally, but there is a snippet of code in the PHP manual that is written for large file parsing:
http://www.php.net/manual/en/function.fgets.php#59393
//File to be opened
$file = "huge.file";
//Open file (DON'T USE a+ pointer will be wrong!)
$fp = fopen($file, 'r');
//Read 16meg chunks
$read = 16777216;
//\n Marker
$part = 0;
while (!feof($fp)) {
    $rbuf = fread($fp, $read);
    for ($i = $read; $i > 0 || $n == chr(10); $i--) {
        $n = substr($rbuf, $i, 1);
        if ($n == chr(10)) break;
        //If we are at the end of the file, just grab the rest and stop loop
        elseif (feof($fp)) {
            $i = $read;
            $buf = substr($rbuf, 0, $i+1);
            break;
        }
    }
    //This is the buffer we want to do stuff with, maybe throw to a function?
    $buf = substr($rbuf, 0, $i+1);
    //Point marker back to last \n point
    $part = ftell($fp)-($read-($i+1));
    fseek($fp, $part);
}
fclose($fp);
The snippet was written by the original author: hackajar yahoo com
Are you trying to compare the current IP with the IPs listed in the text file? The unrelated data wouldn't match anyway,
so just use strpos on the full file contents (file_get_contents).
<?php
$file = file_get_contents('data.txt');
$pos = strpos($file, $_SERVER['REMOTE_ADDR']);
if ($pos === false) {
    echo "no match for $_SERVER[REMOTE_ADDR]";
}
else {
    echo "match for $_SERVER[REMOTE_ADDR]!";
}
?>
