I have just written some PHP code to combine all the JS on my website into a single file, hash it, then echo out the <script> tag with the hashed filename as the source. I store all these hashed files in a single folder, and delete old hashes every time a new hash is generated.
I want to be able to turn on caching for this file, as the hashed filename determines when the browser needs to download a new file (i.e. the actual URL that it requests changes). Does anyone know how to hook into this request in PHP? Something like:
if ($_GET['folder'] == "path/to/hashed/folder/or/file")
{
//Do something
}
Any help is greatly appreciated.
Unless you are serving your JavaScript with PHP, you can't hook into the request. However, you don't need to set caching headers with PHP. You can configure your web server to do this directly. How you do this depends on what server you are using.
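For Apache, for instance, you could drop an .htaccess into the folder of hashed files; a sketch, assuming mod_expires is available:
# .htaccess inside the folder of hashed JS files
<IfModule mod_expires.c>
    ExpiresActive On
    # the hash in the filename changes whenever the content does,
    # so the cached copy can safely live for a year
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>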
What you are going to use for this is a file called .htaccess. Here is a simple rule that sends everything through one file, here named index.php.
RewriteEngine on
RewriteBase /
RewriteCond %{REQUEST_URI} !^/some_file_that_you_want_to_exclude_from_this_rule
RewriteCond %{REQUEST_URI} !^/some_other_file_that_you_want_to_exclude_from_this_rule
RewriteRule ^(.*)$ index.php [L]
The requested path and query string will be in $_SERVER['REQUEST_URI'], and other variables in $_SERVER will point you to other information ($_SERVER['REDIRECT_URL'] holds just the originally requested path, without the query string).
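For instance, a front controller could pick the request apart like this (a sketch only; the variable names are mine):
<?php
// index.php - every request lands here thanks to the rule above
$uri   = $_SERVER['REQUEST_URI'];         // e.g. "/some/page?x=1" (path + query string)
$path  = parse_url($uri, PHP_URL_PATH);   // e.g. "/some/page"
$query = $_SERVER['QUERY_STRING'];        // e.g. "x=1"
// route on $path from here
?>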
Related
After many hours messing with .htaccess I've arrived at the conclusion of sending any request to a single PHP script that would handle:
Generation of HTML (whichever way, includes or dynamic)
301 redirections, with a lot more flexibility in the logic (for a dumb .htaccess-eer)
404 errors, finally, if the request makes no sense,
leaving only minimal functionality in .htaccess.
After some tests it seems quite feasible and, from my point of view, preferable. So much so that I wonder: what is wrong with, or could go wrong with, this approach?
Server performance?
In terms of SEO I don't see any issue as the procedure would be "transparent" to the bots.
The redirector.php would expect a query string consisting of the actual request.
What would be the .htaccess code to send everything there?
I prefer to move all the PHP files into another directory and put only one PHP file in your htdocs path, which handles all requests. Other files that you want to serve without PHP can be placed in that folder too, with this .htaccess:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ /index.php/$0 [L]
Existing files (JPGs, JS, or whatever) are still reachable without PHP. That's the most flexible way to set it up.
Example:
- /scripts/ # Your PHP Files
- /htdocs/index.php # HTTP reachable Path
- /htdocs/images/test.jpg # reachable without PHP
- /private_files/images/test.jpg # only reachable over a PHP script
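A sketch of how one of those scripts might hand out a private file (the paths and the access check are placeholders):
<?php
// serve a file that sits outside the web root;
// user_may_download() is a placeholder for whatever access check you need
if (!user_may_download()) {
    http_response_code(403);
    exit;
}
$file = '/private_files/images/test.jpg';
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($file));
readfile($file);
?>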
You can use this code to redirect all requests to one file:
RewriteEngine on
RewriteRule ^.*$ myfile.php
Note that all requests (including stylesheets, images, ...) will be redirected as well. There are of course other possibilities (rules), but this is the one I am using. It keeps the query string intact: RewriteRule patterns never match against the query string, and as long as the substitution contains no ? of its own, mod_rewrite passes the original query string through unchanged. If you don't need it, you can discard it with a trailing question mark:
RewriteEngine on
RewriteRule ^.*$ myfile.php?
This is a common technique, as bots and even users only see their requested URL and not how it is handled internally. Server performance is not a problem at all.
Because you redirect all URLs to one PHP file, there is no 404 page anymore; every request gets caught by your .php file. So make sure you handle invalid URLs correctly.
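For example, a sketch of how myfile.php might catch unknown URLs (the route table is a placeholder):
<?php
// myfile.php - Apache no longer 404s, so the script has to
$path   = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$routes = ['/' => 'home.php', '/about' => 'about.php'];   // placeholder routes

if (isset($routes[$path])) {
    require $routes[$path];
} else {
    http_response_code(404);   // send a real 404 status, not just a "not found" body
    echo 'Page not found';
}
?>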
Can I use a PHP script as a handler for loading a document in a directory?
My .htaccess (in "/path/to/") would say:
AddHandler handler .png
Action handler /path/to/security.php
And the security.php would do what it was supposed to do, e.g. check databases etc and then continue on to the original file that was loaded. For example:
User attempts to load '/path/to/awesome_picture.png'. The server checks .htaccess and sees the handler. It executes the handler. Following this, the PHP script would redirect to the original file if it saw fit. So user would eventually receive the picture but would undergo checks along the way.
Is this possible, and if so, how can I do it?
Thanks in advance
Looking at the Apache documentation it seems that you can, since the second example matches the config you have written.
But it seems a better idea (to me at least) to use mod_rewrite for this, like so:
RewriteEngine on
RewriteCond %{SCRIPT_FILENAME} ^(.*\.png)$
RewriteRule ^(.*)$ /path/to/security.php?img=$1 [NC,L]
Then you can reference the requested file in your PHP script via $_GET['img'].
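security.php itself could then look something like this sketch (is_allowed() is a placeholder for your database/session check, and the path handling is deliberately strict to block directory traversal):
<?php
// security.php - gatekeeper for image requests rewritten here by the rule above
$img  = basename(isset($_GET['img']) ? $_GET['img'] : '');   // basename() blocks "../"
$file = __DIR__ . '/' . $img;

// is_allowed() is a placeholder for your own access check
if (!is_allowed() || !is_file($file)) {
    http_response_code(404);
    exit;
}
header('Content-Type: image/png');
readfile($file);
?>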
I'm creating a website with a PHP backend. I have a directory called /inc/ which contains PHP include files that get included when generating HTML pages.
If a user tries to request any files in the /inc/ directory (by url in their browser, for example), I've made it so they get redirected to the home page. I did this in an attempt to ensure that none of these files get called externally.
I need to call one of these files via a jQuery POST request.
Here are my questions:
1) Can I somehow hide the URL of the file requested in the POST?
2) Will a POST to a file in the /inc/ folder via jQuery fail, since external requests for files in the /inc/ folder get redirected to the home page? Or does the server distinguish between POST requests and other types of requests?
3) (OPTIONAL) How can I ensure that the POST request is done "legitimately", as opposed to a bot trying to crash my server by issuing thousands of simultaneous POST requests?
Not without using a link somewhere, somehow. Remind yourself that jQuery / Ajax / XMLHttpRequests / X-anything... anything pointing outwards has to point outwards. There will always be a URL, and it will always be traceable.
Options to make your URL less traceable:
Create a page for your JavaScript calls (hides it away, but doesn't really do anything)
Edit .htaccess options and use them to process JavaScript requests
Edit .htaccess options and a PHP page for server-side processing of JavaScript
I'll be going over option 3
Example (includes 2.!)
#Checks if the request is made from within the domain
#Edit this to your domain
RewriteCond %{HTTP_REFERER} !^.*domain\.com.*$ [NC]
#Pretends the requested page isn't there
RewriteRule \.(html|php|api\.key)$ /error/404 [L]
#Set a key to your 'hidden' url
#Since this is server-based, the client won't be able to get it
#This will set the environment variable when a request is made to
#www.yourwebsite.com/the folder this .htaccess is in/request_php_script
SetEnvIf Request_URI "request_php_script" SOMEKINDOFNAME=http://yourlink.com
#Alternatively set it with SetEnv in case your Apache doesn't support SetEnvIf
#I use both
SetEnv SOMEKINDOFNAME http://yourlink.com
#This will send the requester to the script you want when they call
#www.yourwebsite.com/the folder this .htaccess is in/request_php_script
RewriteCond %{REQUEST_URI} request_php_script$ [NC]
#If you don't want a PHP script to handle the JavaScript, and want to benefit from the full URL obfuscation, write the following instead:
#RewriteRule ^.*$ /adirectscript.php [L]
RewriteRule ^.*$ /aredirectscript.php [L]
#And yes, this can be made shorter, but it works best if the folders and keys in your ENV are similar to some extent
In this case you could call a PHP script that redirects you to the right page, but if everything is internal, then I don't see why you would hide away the URL to your scripts. If you have set up the .htaccess as shown, only your page can access it. Users and external sources aren't able to reach it, as they'll be redirected.
If your scripts refer to an external API key, however, then this might be useful, and your redirect script could be as simple as
<?php
echo file_get_contents(getenv("SOMEKINDOFNAME"));
?>
Now when this script is called, it'll return your contents. If you want to load in whole pages, you can do something like what is described here instead:
Getting a webpage's content using PHP
To make full use of this, you have to point your jQuery POST method at www.yourwebsite.com/the folder this .htaccess is in/request_php_script
If 1 and 2 are done properly, like above, you shouldn't have to worry about bots from the outside trying to reach your .php scripts.
Sidenote:
You can skip the extra PHP script, but then you'll be traceable in the .har files, which means that in the end your URL is still reachable somewhere. Using the extra PHP script (give it query parameters for convenience) will obfuscate the URL enough to make it very hard to find. I've used this approach to hide away requests to an external API key.
TLDR:
- Set an .htaccess server environment variable for the page
- Set an .htaccess rewrite condition so a request is redirected to the page only if it is:
  - originally from my page
  - called by a script
- Set JavaScript to call the specified page
  Medium 'secure': cannot be found in the script, but can be traced back by saving .har files
- Create a PHP file to return the page contents
- Set .htaccess to point to the PHP file instead of the page
  Highly 'secure': cannot be found in the script, and cannot be traced back even when saving .har files
1) No.
2) Depends on how you handle redirects, best way is to try and see.
3) Not an easy task in general. A simple approach is to detect repeated requests from the same client and limit their rate. There is no way to detect a bot in general from request data alone.
As for your last comment, you can restrict access to those files with .htaccess without any need for redirects. However, you still won't be able to get them with AJAX. The only real reason to hide something is if there is sensitive information inside: passwords, logins, things like that. Otherwise it doesn't really matter; nobody is interested in some hidden utility files.
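For example, a one-line .htaccess inside /inc/ can shut off direct HTTP access entirely, while include/require from PHP (which doesn't go through Apache) keeps working:
# /inc/.htaccess - Apache 2.4 syntax; on Apache 2.2 use "Deny from all"
Require all denied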
I have a protected folder. I want people who are logged in (via PHP / WordPress) to have access to the folder and the files therein.
Those who are not logged in should be redirected via .htaccess.
Can the .htaccess rewrite condition be based off an environment variable or a server variable which I added or edited from PHP?
UPDATE:
See my answer below.
The .htaccess (hypertext access) file is a directory-level configuration file supported by several web servers. I can't think of any simple way you could access runtime variables set in PHP from .htaccess, as .htaccess allows no "execution" of commands; it is just a bunch of config directives.
You could maybe build some VERY, VERY strange combination of .htaccess and CGI scripts, and maybe more, to reach a web service at the PHP level, but that would be far beyond my programming skills, and I suppose beyond those of most PHP developers too...
At least this is what I can tell you; I would be interested too if someone knows a hack for this...
The easiest way to do such redirects would, in my opinion, be header("Location: xxx.html"); directly in PHP.
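In WordPress terms (the question mentions WordPress), that could be as small as this sketch:
<?php
// send visitors who aren't logged in back to the home page;
// is_user_logged_in() is WordPress's built-in check
if (!is_user_logged_in()) {
    header('Location: /');
    exit;
}
?>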
You can't edit the .htaccess file on the fly using PHP to set these variables. I mean, you can, but the .htaccess file is used by the entire server, not per-user. Unless you wanted to do some ridiculous scheme of writing username environment variables into .htaccess and hoping it works somehow, you're much better off just doing this via PHP. If they're not logged in, you can redirect them away with PHP so they won't be able to see the protected folders either.
If you want to keep them out of an entire folder, but you don't want to put something like require 'security-check.php'; at the top of every file, you could look into using auto_prepend_file. You could also use your .htaccess to route all file access through one specific PHP file that does this. You would need to do that if you were keeping people out of non-PHP files.
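A sketch of the auto_prepend_file route, assuming PHP runs as mod_php (with FastCGI/FPM you would set it in php.ini or .user.ini instead):
# .htaccess - runs security-check.php before every PHP file in this directory
php_value auto_prepend_file "/full/path/to/security-check.php"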
After much research, I solved it.
My folder system is setup like this:
/file-share/users-folder-name
My .htaccess file under /file-share is as follows:
# .htaccess /file-share
RewriteEngine On
RewriteBase /
# no cookie set
RewriteCond %{HTTP_COOKIE} !^.*client.*$ [NC]
RewriteRule ^(.*)$ /file-share-redirect.php?q=$1 [NC,L]
# cookie set
RewriteCond %{QUERY_STRING} !verified$ [NC]
RewriteCond %{HTTP_COOKIE} client=([^;]+) [NC]
RewriteRule ^(.*)$ /file-share/%1/$1?verified [NC,L,QSA]
# custom 404 (the originally requested path is available
# to the script in $_SERVER['REDIRECT_URL'])
ErrorDocument 404 /file-share-redirect.php?e=404
#end
If the cookie is set and the file exists in the client's folder, then the client is redirected seamlessly to the requested file. The final file request is also given a URL parameter (?verified) to avoid a redirect loop.
If a user is logged in but the cookie is not set, I have my file-share-redirect.php create the cookie and then redirect to the requested file. The cookie created in the code below is set to expire in an hour.
<?php setcookie('client', $users_folder_name, time()+3600); ?>
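A sketch of what file-share-redirect.php might look like (the session lookup is a placeholder for your own login logic):
<?php
// file-share-redirect.php - ?q= carries the originally requested file
$q = isset($_GET['q']) ? $_GET['q'] : '';

// get_logged_in_users_folder() is a placeholder for your session lookup;
// it should return null when the visitor isn't logged in
$users_folder_name = get_logged_in_users_folder();
if ($users_folder_name === null) {
    header('Location: /');   // not logged in: back to the home page
    exit;
}

setcookie('client', $users_folder_name, time() + 3600);
header('Location: /file-share/' . $q);   // retry; the cookie rule now matches
exit;
?>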
UPDATE
You can keep the cookie secure by using an encrypted cookie name and value. The cookie will only be created on systems where users log in.
PHP's setcookie() will even let you create a cookie that is inaccessible from JavaScript. I double checked this.
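That flag is the last parameter of setcookie(); a sketch:
<?php
// final "true" = HttpOnly: the cookie is sent with requests
// but never exposed to document.cookie in JavaScript
setcookie('client', $users_folder_name, time() + 3600, '/', '', false, true);
?>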
The subfolder names will be quite complex, completely unguessable. No one will ever see the subfolder names except those with ftp access. Even those logged in will only see /_/filename.ext, without the subfolder.
I noticed in Drupal that if you add .php to the URL of any page, it gives you a 404 message (with clean URLs enabled). The page is obviously PHP, but the .htaccess prevents the user from tampering with extensions in the URL bar. How could you do this using .htaccess? I have file extensions omitted at the moment, but I would also like to add that feature. Thank you.
Also, this question does not really pertain to Drupal; I only mentioned it as an example.
Just because a file contains PHP code it doesn't mean it has to have the .php extension; even more so when you're accessing a file over the internet.
When you request http://mysite.com/page and you're using an .htaccess like Drupal's, the request is forwarded on to index.php?q=page, whereupon Drupal will check its database for a path matching page. If it finds one, it will display the content for that page; if not, it will (rightly) give a 404.
If you want all of your pages to be accessible with a PHP extension you could add an extra rule in your .htaccess file to remove .php from any request where the PHP file doesn't physically exist:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)\.php$ $1 [NC,L]
Bear in mind though that this adds zero extra value for your site's visitors (in fact they have to remember a file extension as well as the path to the page), and it exposes exactly what server-side technology you're using so a potential attacker would have some of his work done for him.
Hope that helps.
Could you please explain that in more depth? How can it redirect content into an existing page? Is that common practice / a typical way of doing things?
Yes it is a very common practice, used by most frameworks and CMS.
The principle is simple: you setup your .htaccess so that every request which doesn't match a real file or directory will be redirected to a front controller, usually the index.php in the root directory of the application. That front controller handles the request by analyzing the URL and calling the necessary actions.
In this way you can minimize the rewrite rules to just one, and you can offer customized 404 pages.
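A sketch of that single rule:
RewriteEngine on
# let real files and directories through untouched
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# everything else goes to the front controller
RewriteRule ^ index.php [L]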
I don't know Drupal, but in the usual PHP app every request is routed to the front controller, which performs some validations and throws a 404 on errors.
easy-peasy