I am currently running a PHP website, and I was wondering if there is any way to deny access to an image when its URL is entered directly in the browser's address bar, while still being able to use said image in my page with the <img src=""> tag.
I store said image in a directory called "images", which is at the same level as my main page "home.php". I am familiar with the .htaccess file and the Deny from all directive in it; however, as I said, that also stops the 'forbidden' files from being displayed on my other pages. I hope that somebody can help me. Thanks!
Maybe you can try this (note that <Files> matches file names only, not paths, so place it in a .htaccess file inside the directory containing the image):
<Files "yourfile.png">
Order Allow,Deny
Deny from all
</Files>
Basically, I believe that the answer would be "no," because in both cases the user's browser is the party making the request, and Apache has no reliable way to distinguish between the two "reasons" why the browser might be making it.
On the other hand, programmatic code within your host-side application possibly could. Either directly or using mod_rewrite tricks, you could direct the incoming request to a server-side script, which can examine the entirety of the HTTP request and determine what sort of response it should produce: image content, or a 404. In this scenario, Apache accepts the request ... does not Deny it ... but instead of serving the image directly itself, it hands off to a script which makes that decision. (The script can still gulp the data from a file, probably from a directory that Apache will not directly serve at all, so it can be "reasonably fast.")
By the way: you can use directives, at the <Directory> or <Location> level, to force "hand this request off to such-and-such script" behavior, so that when the user's browser requests such-and-such image, Apache runs that handler script instead, passing a URL that includes the name of the requested file. The user won't be able to tell what actually happened.
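For illustration, a minimal sketch of such a hand-off script might look like this (the file names, rewrite rule, and session check are all assumptions, not part of the original setup; PHP 7+):

<?php
// serve-image.php - hypothetical hand-off target (name assumed).
// Apache rewrites image requests here, e.g. via:
//   RewriteRule ^images/(.*)$ serve-image.php?file=$1 [L]
session_start();

if (empty($_SESSION['logged_in'])) {            // assumed session flag
    header('HTTP/1.1 404 Not Found');
    exit;
}

// Read the file from a directory Apache never serves directly.
$name = basename($_GET['file'] ?? '');          // strips "../" path tricks
$path = __DIR__ . '/../private-images/' . $name;  // assumed location

if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: ' . (mime_content_type($path) ?: 'application/octet-stream'));
header('Content-Length: ' . filesize($path));
readfile($path);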
You can accomplish this in many ways; in internet jargon, what you are blocking is called "hotlinking". You can use this tool http://www.htaccesstools.com/hotlink-protection/ to create your own .htaccess file.
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?yourdomain\.com [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]
My goal is the following: when a user tries to access an image (or any other attachment in /wp-content/uploads/), I need to redirect them to a processor PHP file that will check whether the user is logged in and whether the file was uploaded by that user.
I've come to this solution so far:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_URI} ^.*wp-content/uploads/.*
RewriteRule ^wp-content/uploads/(.*)$ file-processor.php?file=$1 [QSA,L]
</IfModule>
And it just doesn't work. 😒 I've tried a huge number of different combinations and had no luck. Then I decided to try something simpler, and found out that I can't even redirect directly accessed images via the following code (while it works fine for any other page on the website):
Redirect 301 / https://example.com/
And that part is really confusing... Should there be some different treatment for files that are accessed directly?
To make it clear, with that last rule I got the following behaviour:
If I try to access any page on the website (e.g. "siteurl.com" or "siteurl.com/about"), then I get redirected to "https://example.com/" as expected.
If I try to access media like "siteurl.com/wp-content/uploads/2021/10/someimg.jpg", then I can access and see it; I would still expect to be redirected to "https://example.com", but no redirection happens.
The same problem happens with directly accessed files from my theme. I can still access them despite this last redirect rule in .htaccess:
Redirect 301 / https://example.com/
Could you please explain how to redirect directly accessed files then?
Thanks in advance! 😁
It sounds like you may be behind a front-end proxy that is intended to serve your static content. (Nginx is commonly used for this.) The proxy serves the static content itself, completely bypassing your application server (Apache/.htaccess). This provides great performance at the expense of some functionality.
Check the HTTP response headers when requesting one of these images. You may get a clue as to what is happening by checking the Server HTTP response header.
Possible solutions would be to either:
Make an exception in the front-end proxy to exclude the /wp-content/uploads directory.
OR
Reference these images/resources by a different URL to the actual filesystem path.
Since you are wanting to "protect" these resources, you would need to store them outside of the public HTML space (i.e. above the document root) so they are not accessible via the proxy. You can then use the same URL (i.e. /wp-content/uploads/...) if you wish, but the script (file-processor.php) would read the file from this alternative location.
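A minimal sketch of what that file-processor.php might look like, assuming the uploads were moved to a private-uploads directory above the document root and that being logged in is the only check (both the directory name and the session key are assumptions):

<?php
// file-processor.php - hypothetical reader for files stored above
// the document root, so the front-end proxy can never serve them.
session_start();

if (empty($_SESSION['user_id'])) {               // assumed login marker
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$base = realpath(__DIR__ . '/../private-uploads');    // assumed location
$file = realpath($base . '/' . ($_GET['file'] ?? ''));

// realpath() resolves "../" sequences; refuse anything outside $base.
if ($file === false || strpos($file, $base . DIRECTORY_SEPARATOR) !== 0) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: ' . (mime_content_type($file) ?: 'application/octet-stream'));
readfile($file);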
I am trying to write a .htaccess file for my website, which will prevent access to pages and images via direct URL input, but localhost requests will be granted. So far I've found this code after some googling:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http://(www\.)?localhost [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com.*$ [NC]
RewriteRule \.(php|css|js|jpg)$ - [F]
The problem is my website images are protected all right, but when I want to access index.php from a parent directory (the .htaccess is in my subdirectory, not the parent), I am shown a 403 Forbidden error.
Now I am not really clear as to what these lines mean, or how to tweak them, so I can't tell right from wrong. Can someone help me out and tell what this actually does? Thanks!
Either your assets are accessible or they're not; you cannot serve assets to the public without serving them publicly. You probably think "from localhost" means someone who is already "on your website"; that's a wrong understanding of how the web works. Every asset is requested from the server via a URL, and all requests come from clients. Requests do not come from "your local website".
If endusers must be able to see your assets, they must be able to access them via a URL, which means they'll also be able to see them when "inputting the URL directly". There's no technical difference there.
I have a protected folder. I want people who are logged in (via PHP / WordPress) to have access to the folder and the files therein.
Those who are not logged in should be redirected via .htaccess.
Can the .htaccess rewrite condition be based off an environment variable or a server variable which I added or edited from PHP?
UPDATE:
See my answer below.
The .htaccess (hypertext access) file is a directory-level configuration file supported by several web servers. I can't think of any simple way to access runtime variables set in PHP from .htaccess, as .htaccess allows no "execution" of commands; it is just a bunch of config directives.
You could maybe do some VERY VERY strange combination of .htaccess and CGI scripts (and maybe more) to access a webservice at the PHP level, but that would be far beyond my programming skills, and I suppose beyond those of most PHP developers too...
At least this is what I can tell you; I would be interested too if someone knows a hack for this...
The easiest way to do such redirects would, in my opinion, be header("Location: xxx.html"); directly in PHP.
You can't edit the .htaccess file on the fly using PHP to set these variables. I mean, you can, but the .htaccess file applies to the entire server, not per-user. Unless you want to do some ridiculous write-username-environment-variables-to-.htaccess scheme and hope it works somehow, you're much better off just doing this via PHP: if they're not logged in, redirect them away with PHP so they won't be able to see the protected folders either.
If you want to keep them out of an entire folder, but you don't want to put require 'security-check.php'; in every file, you could look into using auto_prepend_file, as sketched below. You could also use your .htaccess to route all file access through one specific PHP file that performs the check; you would need to do this if you were keeping people out of non-PHP files.
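For instance, a prepend script along these lines could do the job (the session key and login URL are assumptions):

<?php
// security-check.php - hypothetical auto_prepend_file target, enabled with:
//   php_value auto_prepend_file security-check.php
// Runs before every PHP file in the folder and bounces anonymous users.
session_start();

if (empty($_SESSION['logged_in'])) {   // assumed session key
    header('Location: /login.php');    // assumed login page
    exit;
}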
After much research, I solved it.
My folder system is setup like this:
/file-share/users-folder-name
My .htaccess file under /file-share is as follows:
# .htaccess /file-share
RewriteEngine On
RewriteBase /
# no cookie set
RewriteCond %{HTTP_COOKIE} !^.*client.*$ [NC]
RewriteRule ^(.*)$ /file-share-redirect.php?q=$1 [NC,L]
# cookie set
RewriteCond %{QUERY_STRING} !verified$ [NC]
RewriteCond %{HTTP_COOKIE} client=([^;]+) [NC]
RewriteRule ^(.*)$ /file-share/%1/$1?verified [NC,L,QSA]
# custom 404
ErrorDocument 404 /file-share-redirect.php?e=404&q=$1
#end
If the cookie is set and the file exists in the client's folder, the client is redirected seamlessly to the requested file. The final file request is also given a URL parameter to avoid a redirection loop.
If a user is logged in but the cookie is not set, my file-share-redirect.php file creates the cookie and then redirects to the requested file. The cookie created in the code below is set to expire in an hour.
<?php setcookie('client', $users_folder_name, time()+3600); ?>
UPDATE
You can keep the cookie secure by using an encrypted cookie name and value. The cookie will only be created on systems where users log in.
PHP's setcookie() will even let you create a cookie that is inaccessible from JavaScript (via its httponly flag). I double-checked this.
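For example (hypothetical flags added to the cookie above; the array-options form of setcookie() requires PHP 7.3+):

<?php
// Same cookie as above, but flagged HttpOnly (and Secure) so that
// client-side JavaScript cannot read it.
setcookie('client', $users_folder_name, [
    'expires'  => time() + 3600,
    'path'     => '/',
    'secure'   => true,    // only sent over HTTPS
    'httponly' => true,    // hidden from JavaScript
]);
?>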
The subfolder names will be quite complex, completely unguessable. No one will ever see the subfolder names except those with ftp access. Even those logged in will only see /_/filename.ext, without the subfolder.
There are some scripts that I use only via AJAX, and I do not want users to be able to run these scripts directly from the browser. I use jQuery for making all AJAX calls, and I keep all of my AJAX files in a folder named ajax.
So I was hoping to create an .htaccess file which checks for the AJAX request header (HTTP_X_REQUESTED_WITH) and denies all other requests in that folder. (I know that this HTTP header can be faked, but I cannot think of a better solution.) I tried this:
ReWriteCond %{HTTP_X_REQUESTED_WITH} ^$
ReWriteCond %{SERVER_URL} ^/ajax/.php$
ReWriteRule ^.*$ - [F]
But it is not working. What am I doing wrong? Is there any other way to achieve similar results? (I do not want to check for the header in every script.)
The Bad: Apache :-(
X-Requested-With is not a standard HTTP header.
You can't read it in Apache at all (neither by
ReWriteCond %{HTTP_X_REQUESTED_WITH}
nor by
%{HTTP:X-Requested-With}), so it's impossible to check it in .htaccess or anywhere similar. :-(
The Ugly: Script :-(
It's only accessible in the script itself (e.g. PHP), but you said you don't want to include a PHP check file in all of your scripts because of the number of files.
The Good: auto_prepend_file :-)
But ... there's a simple trick to solve it :-)
auto_prepend_file specifies the name of a file that is automatically parsed before the main file. You can use it to include a "checker" script automatically.
So create a .htaccess in the ajax folder:
php_value auto_prepend_file check.php
and create check.php as you want:
<?php
// deny any request that lacks the header jQuery adds to AJAX calls
if( !@$_SERVER["HTTP_X_REQUESTED_WITH"] ){
    header('HTTP/1.1 403 Forbidden');
    exit;
}
?>
You can customize it as you want.
I'm assuming you have all your AJAX scripts in a directory ajax, because you refer to ^/ajax/.php$ in your non-working example.
In this folder /ajax/ place a .htaccess file with this content:
SetEnvIfNoCase X-Requested-With XMLHttpRequest ajax
Order Deny,Allow
Deny from all
Allow from env=ajax
What this does is deny any request that does not carry the X-Requested-With: XMLHttpRequest header.
There are only a few predefined HTTP_* variables mapping to HTTP headers that you can use in a RewriteCond. For any other HTTP headers, you need to use a %{HTTP:header} variable.
Just change
ReWriteCond %{HTTP_X_REQUESTED_WITH} ^$
To:
ReWriteCond %{HTTP:X-Requested-With} ^$
Just check for if ($_SERVER['HTTP_X_REQUESTED_WITH'] == 'XMLHttpRequest') at the beginning of the script; if the header is not set, don't return anything.
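Spelled out, such a guard might look like this (a sketch only; remember the header can be faked, and the "??" operator needs PHP 7+):

<?php
// Guard at the top of an AJAX-only script: bail out unless the
// X-Requested-With header jQuery adds is present with the expected value.
if (($_SERVER['HTTP_X_REQUESTED_WITH'] ?? '') !== 'XMLHttpRequest') {
    header('HTTP/1.1 403 Forbidden');
    exit;   // return nothing to direct browser requests
}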
edit
Here's why: http://github.com/jquery/jquery/blob/master/src/ajax.js#L370
edit 2
My bad, I just read through your post again. You can alternatively make a folder inaccessible to the web and then just have a standard ajax.php file that does include('./private/scripts.php'); your server will still be able to access it, but no one will be able to view it from their browser.
An alternative to using .htaccess is to use the $_SERVER['HTTP_REFERER'] variable to test that the script is being accessed from your page, rather than from another site, etc.
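A quick sketch of that referer test (www.example.com is a placeholder domain; the Referer header is client-supplied, so treat this as a deterrent, not real security):

<?php
// Soft check: only respond if the request claims to come from our site.
$host = parse_url($_SERVER['HTTP_REFERER'] ?? '', PHP_URL_HOST);

if (!in_array($host, ['example.com', 'www.example.com'], true)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}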
My PHP app uses 404 documents to generate HTML files, so that multiple requests for the same HTML file only cause the generation to run once.
I'd like to intercept requests to the HTML files so that the user needs to have an established PHP Session in order to pull up the files.
In the best case, the session ID would be used in the URL so that it could serve as a further layer of authentication. For example, logging in would issue you a session ID and make only certain HTML files accessible to you.
I'm aware that by changing my cookies I could spoof a request, but that's fine.
How would I go about doing this?
Something like this could work (I haven't tested it):
RewriteCond %{HTTP_COOKIE} PHPSESSID=([a-zA-Z0-9]+)
RewriteCond %{REQUEST_FILENAME}-%1.html -f
RewriteRule ^ %{REQUEST_FILENAME}-%1.html [L]
It assumes that you append "-$session_id.html" to filenames ($session_id is PHP's session ID).
It should be safe, and the benefit is that files are served by the web server directly without invoking PHP at all.
SetEnvIf Cookie "PHPSESSID" let_me_in
<Directory /www/static/htmls>
Order Deny,Allow
Deny from all
Allow from env=let_me_in
</Directory>
Of course, a user can manually create such a cookie in their browser (there are extensions which do that for Firefox, and you can always edit your browser's cookie store).
You could use the Apache module mod_rewrite to redirect requests of .html URLs to a PHP script:
RewriteEngine on
RewriteRule \.html$ script.php [L]
The requested URI path and query is then available in the $_SERVER['REQUEST_URI'] variable.
Put your cached files outside of your web root, but still in a place where PHP can access them.
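Putting those two suggestions together, a hypothetical script.php might look like this (the cache location and session flag are assumptions):

<?php
// script.php - serves a pre-generated HTML file only to visitors
// with an established session; the cache lives outside the web root.
session_start();

if (empty($_SESSION['authenticated'])) {        // assumed session flag
    header('HTTP/1.1 403 Forbidden');
    exit;
}

$base = realpath(__DIR__ . '/../html-cache');   // assumed cache location
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$file = realpath($base . $path);

// Refuse anything that escapes the cache directory.
if ($file === false || strpos($file, $base . DIRECTORY_SEPARATOR) !== 0) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: text/html; charset=utf-8');
readfile($file);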