There are some scripts that I use only via ajax and I do not want the user to run these scripts directly from the browser. I use jQuery for making all ajax calls and I keep all of my ajax files in a folder named ajax.
So, I was hoping to create an .htaccess file which checks for an ajax request (HTTP_X_REQUESTED_WITH) and denies all other requests in that folder. (I know that the HTTP header can be faked, but I cannot think of a better solution.) I tried this:
ReWriteCond %{HTTP_X_REQUESTED_WITH} ^$
ReWriteCond %{SERVER_URL} ^/ajax/.php$
ReWriteRule ^.*$ - [F]
But it is not working. What am I doing wrong? Is there any other way to achieve similar results? (I do not want to check for the header in every script.)
The Bad: Apache :-(
X-Requested-With is not a standard HTTP header.
You can't read it in Apache at all (neither by
ReWriteCond %{HTTP_X_REQUESTED_WITH}
nor by
%{HTTP:X-Requested-With}), so it's impossible to check it in .htaccess or similar places. :-(
The Ugly: Script :-(
It's only accessible in the script itself (e.g. PHP), but you said you don't want to include a PHP file in every one of your scripts because of the number of files.
The Good: auto_prepend_file :-)
But ... there's a simple trick to solve it :-)
auto_prepend_file specifies the name of a file that is automatically parsed before the main file. You can use it to include a "checker" script automatically.
So create a .htaccess in the ajax folder:
php_value auto_prepend_file check.php
and create check.php as you want:
<?php
// Reject any request that arrives without the X-Requested-With header.
if (empty($_SERVER['HTTP_X_REQUESTED_WITH'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
?>
You can customize it as you want.
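For example, you could also require the exact value jQuery sends, not just the header's presence (a sketch; jQuery sets X-Requested-With: XMLHttpRequest on its XHR calls):
<?php
// Stricter check.php variant: require jQuery's exact header value.
if (!isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    || $_SERVER['HTTP_X_REQUESTED_WITH'] !== 'XMLHttpRequest') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
?>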
I'm assuming you have all your AJAX scripts in a directory ajax, because you refer to ^/ajax/.php$ in your non-working example.
In this folder /ajax/ place a .htaccess file with this content:
SetEnvIfNoCase X-Requested-With XMLHttpRequest ajax
Order Deny,Allow
Deny from all
Allow from env=ajax
What this does is deny any request that does not carry the X-Requested-With: XMLHttpRequest header.
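Note that on Apache 2.4 and later the Order/Deny/Allow directives are deprecated in favor of Require; an equivalent .htaccess using the 2.4-style authorization directives would be:
SetEnvIfNoCase X-Requested-With XMLHttpRequest ajax
Require env ajax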
There are only a few predefined HTTP_* variables mapping to HTTP headers that you can use in a RewriteCond. For any other HTTP headers, you need to use a %{HTTP:header} variable.
Just change
ReWriteCond %{HTTP_X_REQUESTED_WITH} ^$
To:
RewriteCond %{HTTP:X-Requested-With} ^$
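Putting it together, a working .htaccess inside the ajax folder might look like this (a sketch; in a per-directory context the rule only needs to match the file name):
RewriteEngine on
RewriteCond %{HTTP:X-Requested-With} ^$
RewriteRule \.php$ - [F]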
Just check for if ($_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest') at the beginning of the script; if the header is not set, don't return anything.
edit
Here's why: http://github.com/jquery/jquery/blob/master/src/ajax.js#L370
edit 2
My bad, I just read through your post again. Alternatively, you can make a folder inaccessible to the web and then have a standard ajax.php file that does include('./private/scripts.php'); your server will still be able to access it, but no one will be able to view it from their browser.
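For instance, the public ajax.php along those lines could be as small as this (assuming the private folder itself is blocked from the web, e.g. with deny from all or by sitting outside the document root):
<?php
// ajax.php: the only web-reachable entry point; the real logic lives
// in a folder the web server refuses to serve directly.
include('./private/scripts.php');
?>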
An alternative to using .htaccess is to use the $_SERVER['HTTP_REFERER'] variable to test that the script is being accessed from your page, rather than from another site, etc.
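A minimal referer check could look like this (example.com is a placeholder; remember the Referer header is client-supplied, so it can be spoofed or absent, making this a soft check only):
<?php
// Soft check: reject requests whose Referer is not a page on our own host.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (parse_url($referer, PHP_URL_HOST) !== 'example.com') {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
?>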
Related
I am currently running a PHP website, and I was wondering if there is any way to deny access to an image if its URL is entered directly in the browser bar, but still be able to use said image in my page with the <img src=""> tag.
I store said image in a directory called "images", which is on the same level as my main page "home.php". I am familiar with the .htaccess file and the deny from all directive in it; however, as I said, that will also stop the 'forbidden' files from displaying in the other pages. I hope that somebody can help me. Thanks!
Maybe you can try this:
<Files "./your_directory/yourfile.png">
Order Allow,Deny
Deny from all
</Files>
Basically, I believe that the answer would be "no," because in both cases the user's browser is the party making the request. Apache might not know or be able to distinguish between two "reasons" why the browser is making such a request.
On the other hand, programmatic code within your host-side application possibly could. Either directly or using mod_rewrite tricks, you could direct the incoming request to a server-side script, which can examine the entirety of the HTTP request and determine what sort of response it should produce: image content, or 404. In this scenario, Apache accepts the request ... does not Deny it ... but instead of serving the image directly itself, it hands off to a script which makes that decision. (The script can still gulp the data from a file, probably from a directory that Apache will not directly serve at all, so it can be "reasonably fast.")
By the way: you can use directives, at the <Directory> or <Location> level, to force "hand this request off to such-and-such script" behavior, so that, when the user's browser requests such-and-such image, Apache runs that handler script instead, passing a URL that includes the name of the requested file. The user won't be able to know what actually happened.
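As a concrete sketch of that idea (every name here is made up): a rewrite rule hands image requests to a script, and the script decides whether to emit the file from a directory Apache does not serve directly:
RewriteEngine on
# Hand every request for images/*.png to the decision script
RewriteRule ^images/(.+\.png)$ /serve_image.php?file=$1 [L]
The hypothetical serve_image.php might then look like:
<?php
// serve_image.php: decide whether to emit the image or a 404.
$name = isset($_GET['file']) ? basename($_GET['file']) : ''; // strip path parts
$path = '/var/www/private_images/' . $name;  // directory Apache never serves itself

// Example decision: require some Referer header (a soft, spoofable check).
$ok = $name !== '' && is_file($path) && isset($_SERVER['HTTP_REFERER']);

if (!$ok) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: image/png');
header('Content-Length: ' . filesize($path));
readfile($path);
?>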
You can accomplish this in many ways; in internet jargon, what you are trying to prevent is called "hotlinking". You can use this tool http://www.htaccesstools.com/hotlink-protection/ to create your own .htaccess file.
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain\.com [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]
Can I use a PHP script as a handler for loading a document in a directory?
My .htaccess (in "/path/to/") would say:
AddHandler handler .png
Action handler /path/to/security.php
And the security.php would do what it was supposed to do, e.g. check databases etc and then continue on to the original file that was loaded. For example:
User attempts to load '/path/to/awesome_picture.png'. The server checks .htaccess and sees the handler. It executes the handler. Following this, the PHP script would redirect to the original file if it saw fit. So the user would eventually receive the picture but would undergo checks along the way.
Is this possible, and if so, how can I do it?
Thanks in advance
Looking at the Apache documentation it seems that you can, since the second example matches the config you have written.
But it seems a better idea (to me at least) to use mod_rewrite for this, like so:
RewriteEngine on
RewriteCond %{SCRIPT_FILENAME} ^(.*\.png)$
RewriteRule ^(.*)$ /path/to/security.php?img=$1 [NC,L]
Then you can reference the requested file in your PHP script via $_GET['img'].
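For completeness, a minimal security.php along those lines might look like this (a sketch; the actual checks are up to you):
<?php
// security.php: receives the original request path in $_GET['img'].
$img  = isset($_GET['img']) ? $_GET['img'] : '';
$base = realpath($_SERVER['DOCUMENT_ROOT']);
$path = realpath($base . '/' . ltrim($img, '/'));

// Refuse anything that escapes the document root or is not a png.
if ($path === false || strpos($path, $base) !== 0 || substr($path, -4) !== '.png') {
    header('HTTP/1.1 404 Not Found');
    exit;
}

// ... database checks etc. would go here ...

header('Content-Type: image/png');
readfile($path);
?>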
I have some existing PHP code on my server. Now I want to log complete information about the requests that come to my server. I don't want to make any changes to the existing code. I am using apache mod_rewrite for this. I have a sample php script, stats.php, which looks something like this:
<?php
/* NOTE: This is pseudo code!!! */
// open database connection
// add server info, referer info, script_name, arguments info to the database
// change characters in the request from UTF-16 to UTF-8
// call the header function for redirection
$str = "Location: " . $_SERVER["REQUEST_URI"];
header($str);
?>
In the httpd.conf file:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_URI} !/stats\.php
RewriteCond %{REQUEST_URI} !/favicon\.php
RewriteRule ^/(.*)$ /stats.php?$1 [L]
RewriteLog "logs/error_log"
RewriteLogLevel 3
</IfModule>
The problem is, I am afraid this may not be best from an SEO perspective and may also be buggy. Are there any better ways to do this? For example, can I process the access_log file with a script instead?
Say, for example, you go to http://your-domain.com/some-page.html. You'll get a loop:
1. Browser contacts server with request URI /some-page.html
2. mod_rewrite rewrites the URI to /stats.php?some-page.html
3. stats.php does its thing, then redirects the browser to /some-page.html
4. Browser contacts server with request URI /some-page.html
5. Repeat, starting at #2
What you need to do instead of responding with the Location: header is read the contents of the some-page.html file and return that to the browser, essentially "proxying" the request for the browser. The browser therefore doesn't get redirected.
As for how to do that in PHP, there are plenty of Google results and even plenty of answers on Stack Overflow.
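A sketch of that approach, assuming the requested pages live under the document root:
<?php
// stats.php: log the request, then return the page body directly
// instead of redirecting, so the browser never re-requests the URI.
$uri  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$base = realpath($_SERVER['DOCUMENT_ROOT']);
$file = realpath($base . $uri);

// ... record server info, referer, arguments in the database here ...

if ($file !== false && strpos($file, $base) === 0 && is_file($file)) {
    readfile($file);  // "proxy" the content in place of a Location: redirect
} else {
    header('HTTP/1.1 404 Not Found');
}
?>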
I figured out what I should do. I did the following:
1) Added a custom LogFormat to the httpd.conf file.
2) Added a CustomLog directive and piped the output to stats.php (see the example below).
3) stats.php takes care of adding the data to the database.
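For reference, the relevant httpd.conf directives might look something like this (the format string is only an example); with a piped log, stats.php reads complete log lines from STDIN rather than handling HTTP requests:
# Log the fields stats.php needs, then pipe each line to the script.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" stats
CustomLog "|/usr/bin/php /path/to/stats.php" stats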
I'm creating a website with php backend. I have a directory called /inc/ which contains php include files that get included when generating html webpages.
If a user tries to request any files in the /inc/ directory (by url in their browser, for example), I've made it so they get redirected to the home page. I did this in an attempt to ensure that none of these files get called externally.
I need to call one of these files via a jQuery POST request.
Here are my questions:
1) Can I somehow hide the url of the file requested in the POST?
2) Will a POST to a file in the /inc/ folder via jQuery fail, since external requests for files in the /inc/ folder get redirected to the home page? Or does the server make a distinction between POST requests, and other types of requests?
3) (OPTIONAL) How can I ensure that the POST request is done "legitimately", as opposed to a bot trying to crash my server by issuing thousands of simultaneous post requests?
Not without using a link somewhere, somehow. Remind yourself that jQuery / Ajax / XMLHttpRequests... anything pointing outwards has to point outwards. There will always be a URL, and it will always be traceable.
Options to make your URL less traceable:
1. Create a page for your javascript calls (hides it away, but doesn't really do anything)
2. Edit .htaccess options and use it to process javascript requests
3. Edit .htaccess options and use a php page for server-side processing of javascript requests
I'll be going over option 3.
Example (includes option 2!)
RewriteEngine on

#Checks if the request is made within the domain
#Edit these to your domain
RewriteCond %{HTTP_REFERER} !^.*domain\.com [NC]
RewriteCond %{HTTP_REFERER} !^.*domain\.com.*$ [NC]
#Pretends the requested page isn't there
RewriteRule \.(html|php|api.key)$ /error/404 [L]

#Set a key to your 'hidden' url
#Since this is server-based, the client won't be able to get it
#This will set the environment variable when a request is made to
#www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script
SetEnvIf Request_URI "request_php_script" SOMEKINDOFNAME=http://yourlink.com
#Alternatively, set the variable with SetEnv in case your apache doesn't support SetEnvIf
#I use both
SetEnv SOMEKINDOFNAME http://yourlink.com

#This will send the requester to the script you want when they call
#www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script
RewriteCond %{REQUEST_URI} request_php_script$ [NC]
#If you don't want a php script to handle javascript and want the full url obfuscation benefit, write the following instead:
#RewriteRule ^.*$ /adirectscript.php [L]
RewriteRule ^.*$ /aredirectscript.php [L]

#And yes, this can be made shorter, but this works best if folders and keys in your ENV are similar to some extent
In this case you could call a php script that redirects you to the right page, but if everything is internal, then I don't see why you would hide away the URL of your scripts. If you have set up the .htaccess as shown, only your page can access it. Users and external sources aren't able to reach it, as they'll be redirected.
If your scripts refer to an external API key, however, then this might be useful, and you could call the redirect script:
<?php
// Fetch the hidden URL stored in the environment variable set in .htaccess
echo file_get_contents(getenv("SOMEKINDOFNAME"));
?>
Now when this script is called, it'll return your contents. If you want to load in whole pages instead, you can use something like what is described here:
getting a webpage content using Php
To make full use of this, you have to point your jQuery POST method at www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script
If 1 and 2 are done properly, like above, you shouldn't have to worry about bots from the outside trying to reach your .php scripts.
Sidenote:
You can skip the extra php script, but you'll be traceable in the .har files, which means that in the end your URL is still reachable somewhere. Using the extra php script (give it query parameters for convenience) will obfuscate the URL enough to make it dead hard to find. I've used this approach to hide away requests to an external API key.
TLDR:
- set a .htaccess server environment variable for the page
- set a .htaccess rewrite condition so the request is redirected to the page if
  - originally from my page
  - called by script
- set javascript to call the specified page
- Medium 'secure':
  - cannot be found in script
  - cannot be traced back without saving har files
- create a php file to return the page contents
- set .htaccess to point to the php file instead of the page
- Highly 'secure':
  - cannot be found in script
  - cannot be traced back, even when saving har files
1) No.
2) Depends on how you handle redirects, best way is to try and see.
3) Not an easy task in general. A simple approach is to detect the same client and limit the request rate (see the sketch below). There is no way to detect a bot in general from request data alone.
As for your last comment, you can restrict access to those files with .htaccess without the need for redirects. However, you still won't be able to get them with AJAX. The only real reason to hide something is if there is some sensitive information inside: passwords, logins, things like that. Otherwise it doesn't really matter; nobody is interested in some hidden utility files.
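To illustrate the rate-limiting idea from point 3, a crude per-session throttle might look like this (thresholds are arbitrary, and a bot that discards cookies gets a fresh session each time, so this only slows down naive clients):
<?php
// Allow at most 30 requests per rolling 60-second window per session.
session_start();
$now    = time();
$recent = isset($_SESSION['hits']) ? $_SESSION['hits'] : array();
// Keep only timestamps from the last 60 seconds, then record this hit.
$recent = array_values(array_filter($recent, function ($t) use ($now) {
    return $t > $now - 60;
}));
$recent[] = $now;
$_SESSION['hits'] = $recent;

if (count($recent) > 30) {
    header('HTTP/1.1 429 Too Many Requests');
    exit;
}
?>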
My PHP app uses 404 documents to generate HTML files, so that multiple requests for the same HTML file only cause the generation to run once.
I'd like to intercept requests to the HTML files so that the user needs to have an established PHP Session in order to pull up the files.
In the best case, the session ID would be used in the URL so that it could serve as a further authentication factor. For example, logging in would issue you a session ID and make only certain HTML files accessible to you.
I'm aware that by changing my cookies I could spoof a request, but that's fine.
How would I go about doing this?
Something like this could work (I haven't tested it):
RewriteCond %{HTTP_COOKIE} PHPSESSID=([a-zA-Z0-9]+)
RewriteCond %{REQUEST_FILENAME}-%1.html -f
RewriteRule ^ %{REQUEST_FILENAME}-%1.html
It assumes that you append "-$session_id.html" to filenames ($session_id is PHP's session ID).
It should be safe, and the benefit is that files are served by the web server directly without invoking PHP at all.
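On the PHP side, the generator would then save its output with the session ID appended so the rewrite above can find it, e.g. (a sketch; $html is assumed to hold the generated page):
<?php
// Cache the generated page as <original filename>-<session id>.html
session_start();
$cached = $_SERVER['DOCUMENT_ROOT']
        . parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH)
        . '-' . session_id() . '.html';
file_put_contents($cached, $html);  // $html: the markup produced by the 404 generator
?>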
SetEnvIf HTTP_COOKIE "PHPSESSID" let_me_in
<Directory /www/static/htmls>
Order Deny,Allow
Deny from all
Allow from env=let_me_in
</Directory>
Of course, a user can manually create such a cookie in their browser (there are extensions that do that for Firefox, and you can always edit your browser's cookie store).
You could use the Apache module mod_rewrite to redirect requests of .html URLs to a PHP script:
RewriteEngine on
RewriteRule \.html$ script.php [L]
The requested URI path and query is then available in the $_SERVER['REQUEST_URI'] variable.
Put your cached files outside your web root, but still in a place where PHP can access them.
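Putting those two answers together, a minimal script.php might look like this (the paths and the session check are assumptions):
<?php
// script.php: gate the cached HTML behind an established PHP session.
session_start();
if (!isset($_SESSION['user_id'])) {       // whatever marks a logged-in session
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// Map the requested .html URI to the cache directory outside the web root.
$cacheDir = '/var/cache/myapp';           // not reachable by any URL
$uri  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$file = realpath($cacheDir . $uri);

if ($file !== false && strpos($file, $cacheDir) === 0 && is_file($file)) {
    readfile($file);
} else {
    header('HTTP/1.1 404 Not Found');     // or fall through to the generator
}
?>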