I found some PHP online (a one-page file manager with no permissions) that I think is really awesome; it suits my current needs. However, I'm having some issues changing the working (default) directory.
I got the script from a GitHub project that is no longer maintained. The PHP itself is a one-page file manager with no permissions, no databases, etc. I already have a user accounts system and would like to change the working directory based on an existing database variable, but I can't seem to find a way to change the directory.
Currently the script is uploaded to /home/advenacm/public_html/my/ (the file is /home/advenacm/public_html/my/files.php). From what I can tell, the PHP uses a cookie to determine the working directory, but I can't find a way to set a custom directory. I want to use '/home/advenacm/public_html/my/'.$userdomain;, which would result in something like /home/advenacm/public_html/my/userdomain.com/.
What I would like to do is set the default (or "home") directory so that the file manager cannot access the root directory, only a specified subfolder.
Something like directory = "/home/advenaio/public_html/directory/" is the best way to explain it. I've tried a number of methods to achieve this, but nothing seems to work.
I've taken the liberty of uploading my code to Pastebin with PHP syntax highlighting. Here is the snippet of PHP that I believe chooses the working directory (lines 19-29):
$tmp = realpath($_REQUEST['file']);
if ($tmp === false)
    err(404, 'File or Directory Not Found');
if (substr($tmp, 0, strlen(__DIR__)) !== __DIR__)
    err(403, "Forbidden");
if (!$_COOKIE['_sfm_xsrf'])
    setcookie('_sfm_xsrf', bin2hex(openssl_random_pseudo_bytes(16)));
if ($_POST) {
    if ($_COOKIE['_sfm_xsrf'] !== $_POST['xsrf'] || !$_POST['xsrf'])
        err(403, "XSRF Failure");
}
I appreciate any help you can offer, and thanks in advance to anyone for even taking the time to look at my question.
Have you tried the chdir() function?
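For illustration, a minimal sketch (not from the original script; $userdomain is assumed to come from your accounts system): chdir() makes every later relative path resolve under the new directory.

$home = '/home/advenacm/public_html/my/' . $userdomain;
if (!chdir($home)) {
    die('Cannot change directory to ' . htmlspecialchars($home));
}
echo getcwd(); // e.g. /home/advenacm/public_html/my/userdomain.com

Note that chdir() only affects the current request; it does not stop ../ traversal by itself, which is what the path checks below are for.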
later edit
Updating my answer based on your edited question.
The main problem is line 30
$file = $_REQUEST['file'] ?: '.';
That needs to be a full real path to the file and has to be compared with your user's 'home'.
And you should use the same path for the checks at line 19.
So you can replace 19-30 with:
$user_home = __DIR__ . "/{$userdomain}";
$file = $_REQUEST['file'] ?: $user_home; // you might have to prepend $userdomain to $_REQUEST['file']; can't tell the format from the HTML.
$file = realpath($file); // resolve the chosen path, not $_REQUEST['file'] again, or the default above is lost
if ($file === false) {
    err(404, 'File or Directory Not Found');
}
if (strpos($file, $user_home) !== 0) {
    err(403, "Forbidden");
}
if (!$_COOKIE['_sfm_xsrf']) {
    setcookie('_sfm_xsrf', bin2hex(openssl_random_pseudo_bytes(16)));
}
if ($_POST) {
    if ($_COOKIE['_sfm_xsrf'] !== $_POST['xsrf'] || !$_POST['xsrf'])
        err(403, "XSRF Failure");
}
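One caveat worth adding (my note, not part of the original answer): a bare prefix test also accepts a sibling folder such as userdomain.com-evil. Anchoring the comparison on a trailing separator closes that hole:

// Accept the home directory itself, or anything strictly below it.
if ($file !== $user_home && strpos($file, $user_home . DIRECTORY_SEPARATOR) !== 0) {
    err(403, "Forbidden");
}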
Although this might solve your question, I think the entire script is poorly written.
I'm new to using OPcache on PHP 8 and I have some questions. My folder structure looks like this:
https://i.stack.imgur.com/vb93u.png
Each folder contains exactly the same thing: the structure of my website.
Why does OPcache generate multiple folders with the same content?
What is the best way to keep only the most recent folder and delete the others? Is there a check that can be done every so often or a setting that overwrites older files with new ones?
I'm fast approaching the file limit with my hosting and need to clear up some space.
I've read the docs but I don't have a lot of knowledge working with servers so any help is greatly appreciated!
Oh and these are the settings in my php.ini:
zend_extension=opcache.so;
opcache.enable=1;
opcache.memory_consumption=32;
opcache.interned_strings_buffer=8;
opcache.max_accelerated_files=3000;
opcache.revalidate_freq=180;
opcache.fast_shutdown=0;
opcache.enable_cli=0;
opcache.revalidate_path=0;
opcache.validate_timestamps=1;
opcache.max_file_size=0;
opcache.file_cache=/mywebsitepath/.opcache;
opcache.file_cache_only=1;
Just in case anyone else has the same problem, this is what I ended up doing. First I tried the WordPress cron manager, but I had trouble getting a simple function to work. Instead, my hosting lets you create a cron job and point it at a PHP file, so I went that route. Here are the contents of the cron-jobs.php file, which I put in my OPcache folder. Basically it sorts the folders by date modified, then deletes the old ones while keeping the freshest. If anyone has suggestions for improvements, be my guest!
function opcache_clean($dir) {
    $folders = array();
    foreach (scandir($dir) as $file) {
        // we only want the folders, not files ("." and ".." are skipped too)
        if (strpos($file, '.') === false) {
            $folders[$file] = filemtime($dir . '/' . $file);
        }
    }
    // only delete the old folders if there is more than one
    if (count($folders) > 1) {
        arsort($folders);
        $folders = array_keys($folders);
        // keep the first folder (most recent directory at index 0)
        $deletions = array_slice($folders, 1);
        foreach ($deletions as $delete) {
            echo "deleting $delete <br>";
            // use the full path so the cron job's working directory doesn't matter
            system("rm -rf " . escapeshellarg($dir . '/' . $delete));
        }
    } else {
        echo "No folders to delete!";
    }
}

// clean the directory this script lives in
opcache_clean(dirname(__FILE__));
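For anyone who wants to avoid shelling out, here is a hedged alternative sketch that removes a folder recursively in pure PHP using SPL iterators (useful where exec/system are disabled by the host):

function rrmdir($dir) {
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::CHILD_FIRST // visit children before their parent dirs
    );
    foreach ($items as $item) {
        $item->isDir() ? rmdir($item->getPathname()) : unlink($item->getPathname());
    }
    rmdir($dir);
}

The system() line above would then simply become rrmdir($dir . '/' . $delete);.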
This question already has an answer here:
Security vulnerabilities with file_get_contents() using variable location
(1 answer)
Closed 3 years ago.
Is it possible to read any file (not only those with the extension .html) from the server in the following script?
<?php
echo file_get_contents($_GET['display'].'.html');
?>
I know about wrappers (php://, file://, etc.) but didn't achieve much with them.
I'm eager to hear all the possible vectors of attack.
The PHP configuration is the default:
allow_url_fopen is On, and let's assume the version is >= 7.0, so the null byte %00 doesn't work.
No, that will only ever read files ending in '.html', but that doesn't necessarily mean that it's secure! Generally, the more that you can sanitise and restrict the input, the better.
Also, for anyone planning to use file_get_contents like this: remember that when serving with file_get_contents you can expose files that are not normally accessible, whether due to server configuration (e.g. .htaccess) or file permissions.
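As one hedged sketch of that kind of sanitising and restricting (the pages/ folder name is my assumption, not from the question): reduce the input to a bare file name and resolve it inside a single fixed directory.

// Sketch only; requires PHP 7+ for the ?? operator.
$name = basename($_GET['display'] ?? '');      // strips any directory components and wrapper prefixes
$path = __DIR__ . '/pages/' . $name . '.html'; // 'pages' is an assumed folder name
if (is_file($path)) {
    echo file_get_contents($path);
}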
As @David said, this will only fetch files ending in '.html', but it's still not good practice. If you have an html folder and want users to get files only from that folder, you shouldn't do it this way: with this method an attacker can access any .html file on your server, not just the ones you want them to see.
My suggestion: if there is a specific folder users should be able to fetch files from, scan that directory and check the requested file name against it.
Here's an example:
<?php
$paths = scandir('/html');
$file = isset($_GET['display']) ? $_GET['display'] : null;
if (!$file) {
    die('no display provided');
}
$html = '';
foreach ($paths as $path) {
    if ($path !== '.' && $path !== '..' && $path === $file . '.html') {
        // read from the scanned folder, not the current working directory
        $html = file_get_contents('/html/' . $path);
    }
}
echo $html;
?>
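Since scandir() already returns the complete list of allowed names, the loop can be collapsed into a membership test; a sketch under the same /html assumption:

$paths = scandir('/html');
$file  = $_GET['display'] ?? null;
// Entries from scandir() are bare names, so a value containing
// slashes or ../ can never match one of them.
if ($file !== null && in_array($file . '.html', $paths, true)) {
    echo file_get_contents('/html/' . $file . '.html');
}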
Exploitable as a proxy:
http://example.com/script.php?display=https://hackme.com/passwords%3FExtension%3D
which results in
echo file_get_contents("https://hackme.com/passwords?Extension=.html")
Your IP will be logged on the hackme.com machine, and (when lucky) some passwords come back.
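A hedged way to shut that proxy vector down is to refuse anything that is not a plain page name before it ever reaches file_get_contents (the allowed pattern here is my assumption):

$display = $_GET['display'] ?? '';
// Allow only letters, digits, dash and underscore; this refuses
// URLs (://), traversal (../) and wrapper schemes outright.
if (!preg_match('/^[A-Za-z0-9_-]+$/', $display)) {
    http_response_code(400);
    exit('Invalid page name');
}
echo file_get_contents($display . '.html');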
I use a PHP script to include another PHP file. When someone goes to index.php with the wrong string, I want it to show an error message on the screen.
How do I make it show a custom error message like "You have used the wrong link. Please try again."?
Here is what I am doing now...
Someone comes to the URL like this...
http://example.com/?p=14
That would take them to the index.php file, which picks up p. The index.php script then uses include ('p'.$p.'/index.php');, which finds the directory p14 and includes the index.php file in that directory.
I am finding that people, for whatever reason, are changing the p= value to something that is not a directory. I want to guard against that and just show an error if they put anything else in there. I have too many directories, and will be adding more, so I can't just use a simple if ($p != '14'){echo "error";}; I would have to make about 45 of those.
So what is a simple way for me to say: "If the include does not work, then echo 'error';"?
$filename = 'p'.$p.'/index.php';
Solution 1:
if (!@include($filename)) throw new Exception("Failed to include " . $filename);
Solution 2: use file_exists - it checks whether a file or directory exists, so you can check for the directory as well:
if (!file_exists($filename)) {
    echo "The file $filename does not exist";
}
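Putting the two together with the exact message the question asks for (a sketch; $p is assumed to hold the raw value of $_GET['p']):

$filename = 'p' . $p . '/index.php';
if (file_exists($filename)) {
    include $filename;
} else {
    echo 'You have used the wrong link. Please try again.';
}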
You should never use this include solution, because it is vulnerable to code injection.
Even file_exists is not a good solution, because an attacker can probe for files on your server that were not properly secured and gain access to them.
You should use a whitelist: a dictionary that maps an alias to each file the user is allowed to include, like this:
$whiteList = array(
    "page1" => "/dir1/file1.php",
    "page2" => "/dirabc/filexyz.php"
);

if (array_key_exists($p, $whiteList)) {
    include_once($whiteList[$p]);
} else {
    die("wrong file");
}
This way you do not expose the server's file structure to the web, and you guarantee that only files you explicitly allow can be included.
You must sanitize $p before using it:
$p = filter_input(INPUT_GET, "p", FILTER_SANITIZE_STRING);
Depending on the keys you use in the dictionary, other filters may be more appropriate; look at the reference.
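For keys like page1/page2, a validating filter is one option: it rejects bad input outright instead of mangling it (the regexp is my assumption about the key format):

// Returns false when the value fails the pattern, null when it is absent.
$p = filter_input(INPUT_GET, "p", FILTER_VALIDATE_REGEXP, [
    'options' => ['regexp' => '/^page\d+$/']
]);
if ($p === false || $p === null) {
    die("wrong file");
}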
if(!file_exists('p'.$p.'/index.php')) die('error');
require_once('p'.$p.'/index.php');
In Windows, I open a dir, read the files, and for each file run stat to determine the size, etc.
The problem is that when I run stat on a folder SHORTCUT, it comes back as a FOLDER, and I can't see anything in the mode bitmask that would indicate this. This has been true for all of the folder shortcuts in c:\Documents and Settings\myUserName\.
For these shortcuts, is_file returns false, is_dir returns true and is_link isn't supported in XP.
Here's an excerpt from my code (it has been trimmed down, so there may be bugs) :
if (($h = @opendir($root)) !== false) {
    while (false !== ($file = readdir($h))) {
        if (!($file == "." || $file == "..")) {
            if ($stat = @lstat($root . $file)) {
                $ary[0] = $file;
                $ary[1] = $root;
                $ary[2] = Date("m/d/y H:i:s", $stat['mtime']);
                if ($stat['mode'] & 040000) {
                    $ary[3] = "dir";
                    $ary[4] = 0;
                } else {
                    $ary[3] = "file";
                    $ary[4] = $stat['size'];
                }
                echo(json_encode($ary));
            }
        }
    }
}
A workaround for this will be appreciated...
EDIT: Winterblood's solution almost worked
First off - my bad - it's a Win7 machine.
Thanks Winterblood for the quick turnaround - this worked for several of the shortcuts, and the PHP manual says just that. However,
c:\users\myUserName\AppData\Local\Application Data
(and others) are still coming back as directories, while WinSCP correctly sees them as shortcuts. In fact, the 'mode' is 040777, which is exactly the same as many real folders.
Any other suggestions?
PHP's stat() function "follows" shortcuts/symlinks, reporting details about the linked file or folder rather than the link itself.
For getting stat details on the link itself use lstat().
More information in the PHP documentation on lstat.
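For reference, a minimal sketch of reading the type bits out of lstat()'s mode (standard POSIX type bits; as the question's edit shows, Windows junctions may still report as plain directories):

$info = lstat($root . $file);
// The top bits of 'mode' encode the file type:
// 0120000 = symlink, 0040000 = directory, 0100000 = regular file.
$type = $info['mode'] & 0170000;
if ($type === 0120000) {
    echo "$file is a symlink";
} elseif ($type === 0040000) {
    echo "$file is a directory";
}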
I want to check if a file path is in current directory tree.
Suppose the parameter is given as js/script.js. My working directory (WD) is /home/user1/public_html/site
Now, for the current WD, if someone supplies js/script.js I can simply check it by appending it to the WD. That works for a normal path. But if anyone (maybe an attacker) passes ../../../../etc/password, it'd be a problem.
I know it can be suppressed by stripping the .. sequences with some regex, and that would solve it for sure. But I want to know how I can create a sort of chrooted environment so that whatever path/to/script is passed, it is searched under the WD.
Edit:
I am aware of http://php.net/chroot. It requires your app to run with root privileges.
http://php.net/manual/en/function.realpath.php
$chroot = '/var/www/site/userdata/';
$basefolder = '/var/www/site/userdata/mine/';
$param = '../../../../etc/password';
$fullpath = realpath($basefolder . $param);
if (strpos($fullpath, $chroot) !== 0) {
// GOTCHA
}
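Wrapped as a reusable check (a sketch; note that realpath() returns false for paths that do not exist, which the comparison below also treats as outside the jail):

function path_in_jail($jail, $candidate) {
    // $jail must keep its trailing slash so that a sibling folder
    // like /var/www/site/userdata-evil/ cannot pass the prefix test.
    $resolved = realpath($candidate);
    return $resolved !== false && strpos($resolved . '/', $jail) === 0;
}

// Usage with the variables above:
if (!path_in_jail($chroot, $basefolder . $param)) {
    die('Forbidden'); // GOTCHA
}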