PHP files with 'cmt' extension (?) - php

This might sound a bit silly.
I have to update a website with a new interface, and although I received the old code I am lost. (As far as I know, I received the complete website, phpPgAdmin included)
Most of the files are .cmt, but I'm pretty sure they're PHP; index.cmt, for instance, is:
<?
include_once('redirected.php');
exit();

function get_lang_from_ip() {
    $client_ip = $_SERVER['REMOTE_ADDR'];
    $client_hostname = gethostbyaddr($client_ip);
    if ($client_ip != $client_hostname) {
        $dom_arr = split('\.', $client_hostname);
        $lang_id = ($dom_arr[sizeof($dom_arr)-1] == 'hu') ? 'hu' : 'en';
    } else {
        $lang_id = 'hu';
    }
    return $lang_id;
}

function set_target() {
    $valid_lang_arr = array('hu', 'en');
    if (isset($_GET['lang_id']) && in_array($_GET['lang_id'], $valid_lang_arr)) {
        $lang_id = $_GET['lang_id'];
    } elseif (isset($_COOKIE['lang_id']) && in_array($_COOKIE['lang_id'], $valid_lang_arr)) {
        $lang_id = $_COOKIE['lang_id'];
    } else {
        $lang_id = 'hu'; //get_lang_from_ip();
    }
    setcookie('lang_id', $lang_id, time()+3600*24*365, '/');
    return $lang_id;
}

header('Connection: close');
header('Location: /'.set_target().'/');
?>
The .php files, however, don't begin with the <? short opening tag, but with <?php instead. And most importantly, the .cmt files are not parsed: if I navigate to index.cmt I see the raw code in the browser, so I can't even put the old layout back together.
Any help is appreciated.

You can make a .htaccess file, and include the following:
AddType application/x-httpd-php .cmt
That should allow PHP to be executed on .cmt file extensions.
Please note that this fix can only be used on Apache web servers or others that support .htaccess.
Although, if possible, I'd recommend that you change all .cmt file extensions to their appropriate counterpart, .php.

While @Josh Foskett has a valid answer, I think the real solution is to go through your files and rename the .cmt extensions to .php, to keep things consistent and avoid confusion down the road.
Consider "phasing out" the legacy extensions by bringing them up to the current extension standard. Make this a project, not a quick fix, and I think you'll be much happier with your code base.

Enable short_open_tag in php.ini.
This makes sure the PHP is actually executed when a .cmt file is included from another file. The server still won't execute .cmt files requested directly, but if your entry points are .php files, that's exactly the behaviour you need.

You need to configure your server to parse .cmt files as PHP.
Also, both <? and <?php are valid opening tags for PHP, though the short version may be turned off in the PHP settings (usually defined in php.ini somewhere in the filesystem).
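For reference, the directive names below are real, though whether your host allows per-directory overrides is an assumption. Globally:

```ini
; php.ini
short_open_tag = On
```

Or, on Apache with mod_php, per directory:

```apacheconf
# .htaccess
php_flag short_open_tag on
```

Note that since PHP 5.4 the `<?=` echo tag is always available; only the bare `<?` tag depends on this flag.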

Exploiting file_get_contents() [duplicate]

Is it possible to read any file (not only those with the extension .html) from the server in the following script?
<?php
echo file_get_contents($_GET['display'].'.html');
?>
I know about wrappers (php://, file://, etc.) but didn't achieve much with them.
I'm eager to hear all the possible vectors of attack.
The PHP configuration is default:
allow_url_fopen is On, and let's assume the version is >= 7.0, so the null character %00 doesn't work.
No, that will only ever read files ending in '.html', but that doesn't necessarily mean that it's secure! Generally, the more that you can sanitise and restrict the input, the better.
Also, for anyone planning to use file_get_contents like this, it's always good to remember that when serving from file_get_contents, you can serve files that are not normally accessible - either due to server configuration, e.g. .htaccess, or file permissions.
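One way to sanitise and restrict the input, sketched as a hypothetical helper (the whitelist pattern is an assumption; adjust it to your actual file names):

```php
<?php
// Reduce a user-supplied "display" value to a safe local file name, or null.
// The [a-z0-9_-] whitelist is an assumption -- widen it only as needed.
function safe_display_name(?string $input): ?string {
    if ($input === null) {
        return null;
    }
    $name = basename($input);          // drop any directory components
    if (!preg_match('/^[a-z0-9_-]+$/i', $name)) {
        return null;                   // reject dots, stream wrappers, etc.
    }
    return $name;
}
```

Usage would then be something like `$name = safe_display_name($_GET['display'] ?? null);` followed by reading from a fixed directory. Rejecting dots also blocks `php://` and URL inputs, since nothing with `:` or `.` survives the whitelist.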
As @David said, this will only fetch files ending in '.html', but it's still not good practice. If you have an html folder and want the user to fetch files only from that folder, you shouldn't do it this way: with this method an attacker can read any .html file on your server, not just the ones you want them to see.
My suggestion is that if you have a specific folder that you want user to be able to get files from, scan the directory and check for the file name.
Here's an example:
<?php
$paths = scandir('/html');
$file = isset($_GET['display']) ? $_GET['display'] : null;
if (!$file) {
    die('no display provided');
}
$html = '';
foreach ($paths as $path) {
    if ($path !== '.' && $path !== '..' && $path === $file.'.html') {
        $html = file_get_contents($path);
    }
}
echo $html;
?>
Exploitable as a proxy:
http://example.com/script.php?display=https://hackme.com/passwords%3FExtension%3D
This results in:
echo file_get_contents("https://hackme.com/passwords?Extension=.html")
Your IP will be logged on the hackme.com machine, and it may return some passwords (when lucky).

How can I change the home directory of some code?

I found some PHP online (it's a 1 page file manager with no permissions) that I find is really awesome, it suits my current needs. However, I'm having some issues changing the working (default) directory.
I got the script from a GitHub project that is no longer maintained. The PHP itself is a one-page file manager with no permissions, no databases, etc. I already have a user accounts system and would like to change the working directory based on an existing database variable, but I can't seem to find a way to change the directory.
Currently the script is uploaded to /home/advenacm/public_html/my/ (the file is /home/advenacm/public_html/my/files.php). From what I can tell, the PHP uses a cookie to determine the working directory, but I can't find a way to set a custom directory. I want to use '/home/advenacm/public_html/my/'.$userdomain;, which would resolve to something like /home/advenacm/public_html/my/userdomain.com/.
What I would like to do is set the default (or "home") directory so that the file manager cannot access the root directory, only a specified subfolder.
Something like directory = "/home/advenaio/public_html/directory/" is the best way to explain it. I've tried a number of methods to try and achieve this but nothing seems to work.
I've taken the liberty of uploading my code to pastebin with the PHP syntax highlighting. Here is the snippet of PHP that I believe is choosing the working directory (line 19-29):
$tmp = realpath($_REQUEST['file']);
if ($tmp === false)
    err(404, 'File or Directory Not Found');
if (substr($tmp, 0, strlen(__DIR__)) !== __DIR__)
    err(403, "Forbidden");
if (!$_COOKIE['_sfm_xsrf'])
    setcookie('_sfm_xsrf', bin2hex(openssl_random_pseudo_bytes(16)));
if ($_POST) {
    if ($_COOKIE['_sfm_xsrf'] !== $_POST['xsrf'] || !$_POST['xsrf'])
        err(403, "XSRF Failure");
}
I appreciate any help anyone can offer me and would like to thank anyone in advance for even taking the time to look at my question.
Have you tried the chdir() function?
later edit
Updating my answer based on your edited question.
The main problem is line 30
$file = $_REQUEST['file'] ?: '.';
That needs to be a full real path to the file and has to be compared with your user's 'home'.
And you should use the same path for the checks at line 19.
So you can replace 19-30 with:
$user_home = __DIR__ . "/{$userdomain}";
$file = $_REQUEST['file'] ?: $user_home; // you might have to prepend $userdomain to $_REQUEST['file']; can't tell the format from the HTML
$file = realpath($file);
if ($file === false) {
    err(404, 'File or Directory Not Found');
}
if (strpos($file, $user_home) !== 0) {
    err(403, "Forbidden");
}
if (!$_COOKIE['_sfm_xsrf']) {
    setcookie('_sfm_xsrf', bin2hex(openssl_random_pseudo_bytes(16)));
}
if ($_POST) {
    if ($_COOKIE['_sfm_xsrf'] !== $_POST['xsrf'] || !$_POST['xsrf'])
        err(403, "XSRF Failure");
}
Although this might solve your question I think the entire script is a poorly written solution.
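One caveat with a bare strpos prefix check: it also matches sibling directories whose names share a prefix (/my/user vs /my/user2). A hedged variant that pins the comparison to a directory boundary:

```php
<?php
// True when $path is $base itself or lives underneath it.
// Both arguments should already be canonical (e.g. run through realpath()).
function path_is_inside(string $base, string $path): bool {
    $base = rtrim($base, '/');
    // exact match, or prefix match up to and including a separator
    return $path === $base
        || strpos($path, $base . '/') === 0;
}
```

In the snippet above you would replace `strpos($file, $user_home) !== 0` with `!path_is_inside($user_home, $file)`.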

Serving php as css/js: Is it fast enough? What drawbacks are there?

I've recently started getting into the area of optimizing performance and load times client side: compressing css/js, gzipping, paying attention to YSlow, etc.
I'm wondering, while trying to achieve all these micro-optimizations, what are the pros and cons of serving php files as css or javascript?
I'm not entirely sure where the bottleneck is, if there is one. I would assume that between an identical css and php file, the "pure" css file would be slightly faster simply because it doesn't need to parse php code. However, in a php file you can have more control over headers which may be more important(?).
Currently I'm doing a filemtime() check on a "trigger" file, and with some php voodoo writing a single compressed css file from it, combined with several other files in a defined group. This creates a file like css/groupname/301469778.css, which the php template catches and updates the html tags with the new file name. It seemed like the safest method, but I don't really like the server cache getting filled up with junk css files after several edits. I also don't bother doing this for small "helper" css files that are only loaded for certain pages.
If 99% of my output is generated by php anyways, what's the harm (if any) by using php to directly output css/js content? (assuming there are no php errors)
If using php, is it a good idea to mod_rewrite the files to use the css/js extension for any edge cases of browser misinterpretation? Can't hurt? Not needed?
Are there any separate guidelines/methods for css and javascript? I would assume that they would be equal.
Which is faster: A single css file with several #imports, or a php file with several readfile() calls?
What other ways does using php affect speed?
Once the file is cached in the browser, does it make a difference anymore?
I would prefer to use php with .htaccess because it is much simpler, but in the end I will use whatever method is best.
OK, so here are your direct answers:
1. No harm at all, as long as your code is fine. The browser won't notice any difference.
2. No need for mod_rewrite; browsers usually don't care about the URL (and often not even about the MIME type).
3. CSS files are usually smaller and often one file is enough, so there is less need to combine. Be aware that combining files from different directories affects images referenced in the CSS, as they remain relative to the CSS URL.
4. readfile() will definitely be faster, as #import requires multiple HTTP requests and you want to reduce those as much as possible.
5. Comparing a single HTTP request, PHP may be slightly slower. But you lose the possibility to combine files unless you do that offline.
6. No, but browser caches are unreliable and an improper web server config may cause the browser to unnecessarily re-fetch the URL.
It's impossible to give you a much more concrete answer because it depends a lot on your project details.
We are developing really large DHTML/AJAX web applications with about 2+ MB of JavaScript code, and they still load quickly with some optimizations:
Try to reduce the number of script URLs included. We use a simple PHP script that loads a bunch of .js files and sends them to the browser in one go (all concatenated). This will load your page a lot faster when you have a lot of .js files, as we do, since the overhead of setting up an HTTP connection is usually much higher than that of actually transferring the content itself. Note that the browser needs to download JS files synchronously.
Be cache friendly. Our HTML page is also generated via PHP, and the URL to the scripts contains a hash that depends on the file modification times. The PHP script that combines the .js files then checks the HTTP cache headers and sets a long expiration time, so the browser does not even have to load any external scripts the second time the user visits the page.
GZIP-compress the scripts. This will reduce your code by about 90%. We don't even have to minify the code (which makes debugging easier).
So, yes, using PHP to send the CSS/JS files can improve the loading time of your page a lot - especially for large pages.
EDIT: You may use this code to combine your files:
function combine_files($list, $mime) {
    if (!is_array($list))
        throw new Exception("Invalid list parameter");
    ob_start();
    $lastmod = filemtime(__FILE__);
    foreach ($list as $fname) {
        $fm = @filemtime($fname);
        if ($fm === false) {
            $msg = $_SERVER["SCRIPT_NAME"].": Failed to load file '$fname'";
            if ($mime == "application/x-javascript") {
                echo 'alert("'.addcslashes($msg, "\0..\37\"\\").'");';
                exit(1);
            } else {
                die("*** ERROR: $msg");
            }
        }
        if ($fm > $lastmod)
            $lastmod = $fm;
    }
    //--
    $if_modified_since = isset($_SERVER["HTTP_IF_MODIFIED_SINCE"])
        ? preg_replace('/;.*$/', '', $_SERVER["HTTP_IF_MODIFIED_SINCE"])
        : '';
    $gmdate_mod = gmdate('D, d M Y H:i:s', $lastmod) . ' GMT';
    $etag = '"'.md5($gmdate_mod).'"';
    if (headers_sent())
        die("ABORTING - headers already sent");
    if (($if_modified_since == $gmdate_mod) or
        (isset($_SERVER["HTTP_IF_NONE_MATCH"]) && $etag == $_SERVER["HTTP_IF_NONE_MATCH"])) {
        if (php_sapi_name() == 'CGI') {
            header("Status: 304 Not Modified");
        } else {
            header("HTTP/1.0 304 Not Modified");
        }
        exit();
    }
    header("Last-Modified: $gmdate_mod");
    header("ETag: $etag");
    fc_enable_gzip();
    // Cache-Control
    $maxage = 30*24*60*60; // 30 days (versioning is handled in the HTML code!)
    $expire = gmdate('D, d M Y H:i:s', time() + $maxage) . ' GMT';
    header("Expires: $expire");
    header("Cache-Control: max-age=$maxage, must-revalidate");
    header("Content-Type: $mime");
    echo "/* ".date("r")." */\n";
    foreach ($list as $fname) {
        echo "\n\n/***** $fname *****/\n\n";
        readfile($fname);
    }
}
function files_hash($list, $basedir = "") {
    $temp = array();
    $incomplete = false;
    if (!is_array($list))
        $list = array($list);
    if ($basedir != "")
        $basedir = "$basedir/";
    foreach ($list as $fname) {
        $t = @filemtime($basedir.$fname);
        if ($t === false)
            $incomplete = true;
        else
            $temp[] = $t;
    }
    if (!count($temp))
        return "ERROR";
    return md5(implode(",", $temp)) . ($incomplete ? "-INCOMPLETE" : "");
}
function fc_compress_output_gzip($output) {
    $compressed = gzencode($output);
    $olen = strlen($output);
    $clen = strlen($compressed);
    if ($olen)
        header("X-Compression-Info: original $olen bytes, gzipped $clen bytes ".
               '('.round(100/$olen*$clen).'%)');
    return $compressed;
}

function fc_compress_output_deflate($output) {
    $compressed = gzdeflate($output, 9);
    $olen = strlen($output);
    $clen = strlen($compressed);
    if ($olen)
        header("X-Compression-Info: original $olen bytes, deflated $clen bytes ".
               '('.round(100/$olen*$clen).'%)');
    return $compressed;
}
function fc_enable_gzip($PREFER_DEFLATE = false) {
    if (isset($_SERVER['HTTP_ACCEPT_ENCODING']))
        $AE = $_SERVER['HTTP_ACCEPT_ENCODING'];
    else
        $AE = isset($_SERVER['HTTP_TE']) ? $_SERVER['HTTP_TE'] : '';
    $support_gzip = !(strpos($AE, 'gzip') === FALSE);
    $support_deflate = !(strpos($AE, 'deflate') === FALSE);
    if ($support_gzip && $support_deflate) {
        $support_deflate = $PREFER_DEFLATE;
    }
    if ($support_deflate) {
        header("Content-Encoding: deflate");
        ob_start("fc_compress_output_deflate");
    } else {
        if ($support_gzip) {
            header("Content-Encoding: gzip");
            ob_start("fc_compress_output_gzip");
        } else {
            ob_start();
        }
    }
}
Use files_hash() to generate a unique hash string that changes whenever your source files change, and combine_files() to send the combined files to the browser. So, use files_hash() when generating the HTML code for the <script> tag, and combine_files() in the PHP script that is loaded via that tag. Just place the hash in the query string of the URL.
<script language="JavaScript" src="get_the_code.php?hash=<?=files_hash($list_of_js_files)?>"></script>
Make sure you specify the same $list in both cases.
You're talking about serving static files via PHP; there's really little point in doing that, since it's always going to be slower than Apache serving a normal file. A CSS #import will be quicker than PHP's readfile(), but the best performance will be gained by serving one minified CSS file that combines all the CSS you need to use.
It sounds like you're on the right track though. I'd advise pre-processing your CSS and saving it to disk. If you need to set special headers for things like caching, just do this in your VirtualHost directive or .htaccess file.
To avoid lots of cached files you could use a simple file-naming convention for your minified CSS. For example, if your main CSS file is called main.css and it references reset.css and forms.css via #imports, the minified version could be called main.min.css.
When this file is regenerated it simply replaces the old one. If you include a reference to that file in your HTML, you can route the request to PHP when the file doesn't exist: combine and minify the files (via something like YUI Compressor), save the result to disk, and it will be served via normal HTTP for all future requests.
When you update your CSS just delete the main.min.css version and it will automatically regenerate.
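The regenerate-when-missing-or-stale flow can be sketched as follows. The comment-stripping "minifier" here is a toy stand-in for a real tool like YUI Compressor, and the file names are examples:

```php
<?php
// Serve $min, rebuilding it from $src whenever the source is newer
// (or the minified copy does not exist yet). Returns the path served.
function ensure_minified(string $src, string $min): string {
    if (!file_exists($min) || filemtime($min) < filemtime($src)) {
        $css = file_get_contents($src);
        // toy minifier: strip /* ... */ comments, collapse whitespace
        $css = preg_replace('!/\*.*?\*/!s', '', $css);
        $css = preg_replace('/\s+/', ' ', trim($css));
        file_put_contents($min, $css);
    }
    return $min;
}
```

A request handler would then just `readfile(ensure_minified('main.css', 'main.min.css'));` — deleting main.min.css forces a rebuild on the next hit, exactly as described above.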
You can do the preprocessing with an ANT build. Sorry, the post is in German, but I tried translate.google.com on it and it worked fine :-) So you can use the post as a tutorial to achieve better performance...
I would preprocess the files and save them to disk, just like simonrjones said. Caching etc. should be handled by the dedicated components: the Apache web server, HTTP headers, and the browser.
While slower, one advantage of (or reason for) doing this is that you can put dynamic content into the files on the server, while still having them appear to be js or css from the client's perspective.
Like this for example, passing the environment from php to javascript:
var environment = "<?= getenv('APPLICATION_ENV') ?>";
// More JS code here ...

PHP 'copy' not working

Can anybody tell me why this function isn't copying the file at all?
$pluginfile = get_bloginfo('template_url') . '/wp-content/plugins/supersqueeze/supersqueeze.php';
$urlparts = get_bloginfo('template_url');
$homeurl = home_url();
$urlstrip = str_replace($homeurl, '..', $urlparts);
$urldest = $urlstrip . '/supersqueeze.php';

function copyemz() {
    global $pluginfile, $urldest;
    if (!@copy($pluginfile, $urldest)) {
        $errors = error_get_last();
    }
}
This file is run from /public_html/wp-admin/plugins.php
I need it to copy the file at ($pluginfile) /public_html/wp-content/plugins/supersqueeze/supersqueeze.php
to ($urldest) /public_html/wp-content/themes/[active wordpress theme] - of course replacing [active wordpress theme] with the directory of the theme.
You need to ensure that you have write permissions to /public_html/wp-content/themes/[active wordpress theme] as well as any other files you may be overwriting.
So, the second parameter to copy() must be a local file path, not a URL. Make sure it is also a writable destination (chmod), as webbiedave said.
$desturl = "./supersqueeze.php";
The reason is two-fold. First, PHP's HTTP stream wrappers don't support POSTing or PUTting files, which a write-to action would require. Second, your webserver probably wouldn't support HTTP PUT either. (Though a small request-handler script could handle that.)
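A sketch of the fix: build a filesystem destination by mapping the URL under the site's home URL to a path under the document root. The $docroot value here is an assumption (in WordPress you'd more likely use get_theme_root() and get_template() directly):

```php
<?php
// Map a URL under the site's home URL to a filesystem path under $docroot.
// Returns null when the URL is not under $home_url at all.
function url_to_path(string $url, string $home_url, string $docroot): ?string {
    $home_url = rtrim($home_url, '/');
    if (strpos($url, $home_url . '/') !== 0) {
        return null;  // foreign URL -- refuse to map it
    }
    return rtrim($docroot, '/') . substr($url, strlen($home_url));
}
```

The result of url_to_path() is then a valid second argument for copy(), provided the destination directory is writable.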

Is there any tool that will resolve and hardcode every included file of a PHP script?

I would need a tool, if it exists or if you can write in under 5 mins (don't want to waste anyone's time).
The tool in question would resolve the includes, requires, include_onces and require_onces in a PHP script and actually hardcode the contents of them, recursively.
This would be needed to ship PHP scripts in one big file that actually use code and resources from multiple included files.
I know that PHP is not the best tool for CLI scripts, but as I'm most proficient in it, I use it to write some personal or semi-personal tools. I don't want unhelpful answers or comments telling me to use something other than PHP or to learn something else.
The idea of that approach is to be able to have a single file that would represent everything needed to put it in my personal ~/.bin/ directory and let it live there as a completely functional and self-contained script. I know I could set include paths in the script to something that would honor the XDG data directories standards or anything else, but I wanted to try that approach.
Anyway, I ask there because I don't want to re-invent the wheel and all my searches gave nothing, but if I don't have any insight here, I will continue in the way I was going to and actually write a tool that will resolve the includes and requires.
Thanks for any help!
P.S.: I forgot to include examples and don't want to rephrase the message:
Those two files
mainfile.php
<?php
include('resource.php');
include_once('resource.php');
echo returnBeef();
?>
resource.php
<?php
function returnBeef() {
return "The beef!";
}
?>
Would be "compiled" as (comments added for clarity)
<?php
/* begin of include('resource.php'); */?><?php
function returnBeef() {
return "The beef!";
}
?><?php /* end of include('resource.php); */
/*
NOT INCLUDED BECAUSE resource.php WAS PREVIOUSLY INCLUDED
include_once('resource.php');
*/
echo returnBeef();
?>
The script does not have to output explicit comments, but it could be nice if it did.
Thanks again for any help!
EDIT 1
I made a simple modification to the example. As I have begun writing the tool myself, I noticed a mistake in the original script: to do the least amount of work, the included file's contents have to remain enclosed in their own start and end tags (<?php ?>), with the surrounding script's tags closed around the include point.
The resulting script example has been modified in consequence, but it has not been tested.
EDIT 2
The script does not actually need to do heavy-duty, run-time-accurate parsing of the PHP source. Only simple includes (like include('file.php');) have to be handled.
I started working on my script; I'm reading the files and unintelligently parsing them so that includes are processed only when inside <?php ?> tags, not in comments or strings. A small additional goal is to detect dirname(__FILE__)."" in an include directive and actually honor it.
An interesting problem, but one that's not really solvable without detailed runtime knowledge. Conditional includes would be nearly impossible to determine, but if you make enough simple assumptions, perhaps something like this will suffice:
<?php
# import.php
#
# Usage:
# php import.php basefile.php
if (!isset($argv[1])) die("Invalid usage.\n");

$included_files = array();
echo import_file($argv[1])."\n";

function import_file($filename)
{
    global $included_files;

    # this could fail because the file doesn't exist, or
    # if the include path contains a run time variable
    # like include($foo);
    $file = @file_get_contents($filename);
    if ($file === false) die("Error: Unable to open $filename\n");

    # trimming whitespace so that the str_replace() at the end of
    # this routine works. however, this could cause minor problems if
    # the whitespace is considered significant
    $file = trim($file);

    # look for require/include statements. Note that this looks
    # everywhere, including non-PHP portions and comments!
    if (!preg_match_all('!((require|include)(_once)?)\\s*\\(?\\s*(\'|")(.+)\\4\\s*\\)?\\s*;!U', $file, $matches, PREG_SET_ORDER | PREG_OFFSET_CAPTURE))
    {
        # nothing found, so return file contents as-is
        return $file;
    }

    $new_file = "";
    $i = 0;
    foreach ($matches as $match)
    {
        # append the plain PHP code up to the include statement
        $new_file .= substr($file, $i, $match[0][1] - $i);

        # make sure to honor "include once" files
        if ($match[3][0] != "_once" || !isset($included_files[$match[5][0]]))
        {
            # include this file
            $included_files[$match[5][0]] = true;
            $new_file .= ' ?>'.import_file($match[5][0]).'<?php ';
        }

        # update the index pointer to where the next plain chunk starts
        $i = $match[0][1] + strlen($match[0][0]);
    }

    # append the remainder of the source PHP code
    $new_file .= substr($file, $i);

    return str_replace('?><?php', '', $new_file);
}
?>
There are many caveats to the above code, some of which can be worked around. (I leave that as an exercise for somebody else.) To name a few:
It doesn't honor <?php ?> blocks, so it will match inside HTML
It doesn't know about any PHP rules, so it will match inside PHP comments
It cannot handle variable includes (e.g., include $foo;)
It may introduce scope errors (e.g., if (true) include('foo.php'); should become if (true) { include('foo.php'); })
It doesn't check for infinitely recursive includes
It doesn't know about include paths
etc...
But even in such a primitive state, it may still be useful.
You could use the built in function get_included_files which returns an array of, you guessed it, all the included files.
Here's an example, you'd drop this code at the END of mainfile.php and then run mainfile.php.
$includes = get_included_files();
$all = "";
foreach ($includes as $filename) {
    $all .= file_get_contents($filename);
}
file_put_contents('all.php', $all);
A few things to note:
Any include which is actually not processed (i.e. an include inside a function that never runs) will not be dumped into the final file; only includes which have actually run are captured.
This will also leave a <?php ?> block around each file, but you can have multiple blocks like that inside a single file with no issues.
This WILL include anything included within another include.
Yes, get_included_files will list the script actually running as well.
If this HAD to be a stand-alone tool instead of a drop-in, you could read the initial file in, append this code as text, then eval the entire thing (possibly dangerous).
