PHP 404 response and redirect

I know that I can setup a 404 error page simply enough using .htaccess.
As I understand it, if a page isn't found, an error gets sent back to the browser and the 404 page is displayed. The user may or may not be aware of this 'error'.
What I would like to do is allow the user to type in page1.php, page2.php, page3.php, etc., and be taken to a page that prints page1, page2, page3, etc., respectively.
The reasoning is that the logic for page1, page2, page3, etc. is very similar. I think it might be easier to send the user to the same page, which then decides what to do using the PHP server variables: determine whether this is a valid page, and either throw a 404 error or print the correct message.
Is this possible? Or just a stupid thought?
I would like the url to be unaffected and remain as the user typed.
Thanks.

This is definitely not a stupid thought.
The instance that delivers content, or error pages if no content is found, is the web server.
If you want to handle that logic in PHP, you can tell the web server to pass all requests to the same PHP file. Then you can make use of the $_SERVER variable to determine what the user was requesting and serve the content and send the correct status code (e.g. 404).
In Apache, you can define a FallbackResource if you are using version 2.2.16 or later. This will internally route requests for non-existent files to a specified file, without changing the URL in the browser.
<Directory /var/www/my_blog>
FallbackResource index.php
</Directory>
In older Apache versions, you can use mod_rewrite to route the requests internally instead:
<Directory /var/www/my_blog>
RewriteEngine on
RewriteBase /my_blog
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [PT]
</Directory>
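With either configuration in place, index.php can inspect the requested path via $_SERVER and decide between rendering the page and sending a real 404. A minimal sketch (the route() helper and the pageN naming are assumptions based on the question):

```php
<?php
// Sketch of a front controller (index.php). route() is a hypothetical
// helper: it maps a request URI to a status code and body, so all valid
// pages share one code path and everything else gets a real 404.
function route(string $uri): array
{
    $path = parse_url($uri, PHP_URL_PATH);
    // Assumed scheme from the question: /page1.php, /page2.php, /page3.php
    if (preg_match('#^/page([1-9]\d*)\.php$#', $path, $m) && (int)$m[1] <= 3) {
        return [200, 'page' . $m[1]];
    }
    return [404, 'Page not found'];
}

[$status, $body] = route($_SERVER['REQUEST_URI'] ?? '/');
http_response_code($status);   // sends e.g. the 404 status to the browser
echo $body;                    // the URL the user typed stays untouched
```

Because the rewrite happens server-side, the browser's address bar keeps whatever the user typed.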

Related

Redirect requests to different local file but keep url

So I have a problem understanding redirects.
Let's say I have example.com/id/someuserid (and everything after it). If someone makes a request to this URL path, I want to route them to a handler PHP file, which then reads the incoming request URI and shows content based on the user in the URL.
I've tried ProxyPass and Apache redirects, but they don't seem to be the right tools, or I am just not getting them right.
What I've tried is
ProxyPass /id/ http://example.com/handler.php
ProxyPassReverse /id/ http://example.com/handler.php
Note that the request URI is supposed to stay the same, and the URL should be kept unchanged.
With mod_rewrite you can rewrite the path internally, so the URL in the browser stays the same. The [PT] flag passes the rewritten path back through Apache's normal URL mapping:
RewriteEngine on
RewriteRule "^/foo\.html$" "/bar.html" [PT]
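For the /id/someuserid case specifically, the same internal-rewrite idea could look like this (a sketch; handler.php and the /id/ path are taken from the question, and the handler would read $_SERVER['REQUEST_URI'] to recover the user id):

```apache
RewriteEngine on
# Internally route /id/<anything> to handler.php; no external redirect
# is issued, so the URL in the browser stays exactly as typed.
RewriteRule "^/id/(.*)$" "/handler.php" [PT]
```

Unlike ProxyPass, this never leaves the server: Apache simply serves handler.php for every /id/... request.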

I want HTACCESS to block only direct access to a specific file

I have a file that I don't want users to be able to navigate to on their own accord. However, if they click a link that sends them there, it's okay for the page to work. I currently have my htaccess file set up like so.
<Files "success.php">
Order Allow,Deny
Deny from all
</Files>
success.php is the name of the file. In the directory containing success.php, I have the following in an .htaccess file:
RewriteRule /?\.htaccess$ - [F,L]
RewriteRule ^/?admin/paypal/success\.php$ - [F,L]
Will users still be able to get to success.php if they're directed there? I know you're shown a 403 error if you just try to navigate there directly.
If they will be blocked even when a link directs them there, is there a way I can fix this?
When I type a URL to "success.php" in my browser's location bar and hit enter, my browser sends a request for success.php.
When I go to your website and click on a link that takes me to "success.php", my browser sends a request for success.php.
It's exactly the same: whether I click a link on your site or type the URL into my browser, both requests look identical to the server. So when you deny access, you deny all access. What you need to check is the "Referer" header, which browsers can (optionally) include in a request to let the webserver know what URL the request came from. Referers can be easily forged or sometimes omitted, so checking the referer isn't a guarantee.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^https?://example.com/ [NC]
RewriteRule ^admin/paypal/success\.php$ - [F,L]
So if someone loads any page on the "example.com" site (e.g. your site) and then clicks on a link that goes to the success.php page, they'll be fine. Any other access from anywhere else will be 403.
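The same check can also be done inside success.php itself, which avoids touching .htaccess. A sketch (referer_allowed() is a hypothetical helper, and example.com stands in for your own domain; as noted above, a Referer check only deters casual direct navigation):

```php
<?php
// Hypothetical guard at the top of success.php: allow the request only
// if the Referer header points at our own site.
function referer_allowed(?string $referer, string $ownHost): bool
{
    if ($referer === null) {
        return false;              // no Referer: treat as direct access
    }
    return parse_url($referer, PHP_URL_HOST) === $ownHost;
}

$allowed = referer_allowed($_SERVER['HTTP_REFERER'] ?? null, 'example.com');
if (!$allowed) {
    http_response_code(403);       // same 403 the .htaccess rule sends
    // in success.php you would also exit here
}
```

A more robust alternative is to set a session flag on the page that links to success.php and check for it there, since sessions cannot be forged as trivially as a Referer header.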

apache mod_rewrite to collect request info

I have some existing PHP code on my server. Now I want to log complete information about requests that come to my server, without making any changes to the existing code. I am using Apache mod_rewrite for this. I have a sample PHP script, stats.php, which looks something like this:
<?php
// NOTE: the original was pseudocode; this is a plausible PHP version.
// Open a database connection and record the request details.
$db = new PDO('mysql:host=localhost;dbname=stats', 'user', 'password');
$stmt = $db->prepare('INSERT INTO requests (server, referer, script, args) VALUES (?, ?, ?, ?)');
$stmt->execute([
    $_SERVER['SERVER_NAME'],
    $_SERVER['HTTP_REFERER'] ?? '',
    $_SERVER['SCRIPT_NAME'],
    // original pseudocode also converted the request from UTF-16 to UTF-8
    $_SERVER['QUERY_STRING'] ?? '',
]);
// Redirect the browser back to the URI it originally requested.
header('Location: ' . $_SERVER['REQUEST_URI']);
?>
In httpd.conf file
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_URI} !/stats\.php
RewriteCond %{REQUEST_URI} !/favicon\.php
RewriteRule ^/(.*)$ /stats.php?$1 [L]
RewriteLog "logs/error_log"
RewriteLogLevel 3
</IfModule>
The problem is, I am afraid this may not be best from an SEO perspective and may also be buggy. Are there any better ways to do this? For example, can I use a script to process the access_log file?
Say, for example, you go to http://your-domain.com/some-page.html; you'll get a loop:
Browser contacts server with request URI /some-page.html
mod_rewrite rewrites the URI to /stats.php?some-page.html
The stats.php does its thing, then redirects the browser to /some-page.html
Browser contacts server with request URI /some-page.html
repeat starting at #2
What you need to do instead of responding with the Location: header is read the contents of the some-page.html file and return that to the browser, essentially "proxying" the request for the browser. The browser therefore doesn't get redirected.
As for how to do that in php, there's plenty of google results or even plenty of answers on Stack Overflow.
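A minimal sketch of that "proxying" approach inside stats.php (serve_local() is a hypothetical helper; it resolves the request path under the document root and returns the file contents instead of issuing a redirect):

```php
<?php
// Instead of header('Location: ...'), read the requested file and return
// its contents, so the browser never re-requests the rewritten URL.
function serve_local(string $docRoot, string $uri): array
{
    $path = realpath($docRoot . parse_url($uri, PHP_URL_PATH));
    // Reject missing files and paths that escape the document root.
    if ($path === false || !is_file($path)
        || strncmp($path, $docRoot, strlen($docRoot)) !== 0) {
        return [404, 'Not Found'];
    }
    return [200, file_get_contents($path)];
}

// In stats.php, after logging, you would do something like:
// [$status, $body] = serve_local($_SERVER['DOCUMENT_ROOT'], $_SERVER['REQUEST_URI']);
// http_response_code($status); echo $body;
```

The realpath() prefix check matters: without it, a URI like /../etc/passwd could read files outside the document root.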
I figured out what I should do. I did the following:
1) Added a custom LogFormat to the httpd.conf file.
2) Added a CustomLog directive and piped its output to stats.php.
3) stats.php takes care of adding the data to the database.
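The piped-log setup could look roughly like this in httpd.conf (a sketch; the format string and script path are assumptions). Apache starts the script once and feeds it one formatted log line per request on stdin, so stats.php would read php://stdin in a loop and insert each line into the database:

```apache
# Hypothetical piped access log: Apache writes one line per request
# to a long-running stats.php process instead of (or alongside) a file.
LogFormat "%h %t \"%r\" %>s \"%{Referer}i\" \"%{User-Agent}i\"" statslog
CustomLog "|/usr/bin/php /var/www/stats.php" statslog
```

This keeps logging entirely out of the request path: no rewrite rules, no redirects, and no SEO impact.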

Hide url and redirection implications for AJAX POST?

I'm creating a website with php backend. I have a directory called /inc/ which contains php include files that get included when generating html webpages.
If a user tries to request any files in the /inc/ directory (by url in their browser, for example), I've made it so they get redirected to the home page. I did this in an attempt to ensure that none of these files get called externally.
I need to call one of these files via a jQuery POST request.
Here are my questions:
1) Can I somehow hide the url of the file requested in the POST?
2) Will a POST to a file in the /inc/ folder via jQuery fail, since external requests for files in the /inc/ folder get redirected to the home page? Or does the server make a distinction between POST requests, and other types of requests?
3) (OPTIONAL) How can I ensure that the POST request is done "legitimately", as opposed to a bot trying to crash my server by issuing thousands of simultaneous post requests?
Not without using a link somewhere, somehow. Remind yourself that jQuery / Ajax / XMLHttpRequest... anything pointing outwards has to point outwards. There will always be a URL, and it will always be traceable.
Options to make sure your url is less traceable
Create a page for your javascript calls (hides away, but doesn't really do anything)
Edit .htaccess options and use it to process javascript requests
Edit .htaccess options and a php page for server-side processing of javascript
I'll be going over option 3
Example (includes option 2!)
#Check that the request is made from within the domain
#Edit this to your domain
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^.*domain\.com.*$ [NC]
#Pretend the requested page isn't there
RewriteRule \.(html|php|api.key)$ /error/404 [L]
#Set a key to your 'hidden' url
#Since this is server-based, the client won't be able to get it
#This will set the environment variable when a request is made to
#www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script
SetEnvIf Request_URI "request_php_script" SOMEKINDOFenvironmentNAME=http://yourlink.com
#Alternatively, use SetEnv in case your Apache doesn't support SetEnvIf
#I use both
SetEnv SOMEKINDOFNAME request_php_script
#This will send the requester to the script you want when they call
#www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script
RewriteCond %{REQUEST_URI} request_php_script$ [NC]
#If you don't want a php script to handle javascript and want the full url obfuscation instead, use the following rule:
#RewriteRule ^.*$ /adirectscript.php [L]
RewriteRule ^.*$ /aredirectscript.php [L]
#And yes, this can be made shorter, but this works best if folders and keys in your ENV are similar to some extent
In this case you could call a php script that redirects you to the right page, but if everything is internal, then I don't see a reason to hide away the URL to your scripts. If you have set up the .htaccess as shown, only your page can access it. Users and external sources won't be able to reach it, as they'll be redirected.
If your scripts refer to an external API key however, then this might be useful, and you could call the redirect script
<?php
echo file_get_contents(getEnv("SOMEKINDOFNAME"));
?>
Now when this script is called, it'll return your contents. If you want to load in pages instead, you can use something like the approach described here:
getting a webpage content using Php
To make full use of this, you have to set your jQuery POST method to POST to www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script.php
If 1 and 2 are done properly, as above, you shouldn't have to worry about bots from the outside trying to reach your .php scripts.
Sidenote:
You can skip the extra php script, but you'll be traceable in the .har files. Which means that in the end, your url is still reachable somewhere. Using the extra php script (give it query parameters for convenience) will obfuscate the url enough to make it dead hard to find. I've used this way to hide away requests to an external API key
TLDR:
- set an .htaccess server environment variable for the page
- set an .htaccess rewrite condition so requests are redirected to the page only if they originate from my page or are called by my script
- set javascript to call the specified page
This is medium 'secure': the URL cannot be found in the script, and cannot be traced back without saving .har files.
- create a php file that returns the page contents
- set .htaccess to point to the php file instead of the page
This is highly 'secure': the URL cannot be found in the script, and cannot be traced back even when saving .har files.
1) No.
2) Depends on how you handle redirects; the best way is to try it and see.
3) Not an easy task in general. A simple approach is to detect repeated requests from the same client and limit the request rate. There is no way to detect a bot in general from request data alone.
As for your last comment, you can restrict access to those files with .htaccess without the need for redirects. However, you still won't be able to get them with AJAX. The only real reason to hide something is if there is sensitive information inside, like passwords or logins. Otherwise it doesn't really matter; nobody is interested in some hidden utility files.
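For question 3, the "limit the request rate per client" idea can be sketched as a fixed-window counter (allow_request() and the array-backed store are illustrative; a real deployment would keep the counters somewhere shared across requests, such as APCu or Redis):

```php
<?php
// Fixed-window rate limiter: allow at most $limit requests per $window
// seconds for each client IP. $store maps "ip:window" keys to counts.
function allow_request(array &$store, string $ip, int $now,
                       int $limit = 10, int $window = 60): bool
{
    $key = $ip . ':' . intdiv($now, $window);
    $store[$key] = ($store[$key] ?? 0) + 1;
    return $store[$key] <= $limit;
}

// Example: 2 requests per minute allowed, the third is rejected.
$store = [];
var_dump(allow_request($store, '203.0.113.9', 0, 2));  // bool(true)
var_dump(allow_request($store, '203.0.113.9', 1, 2));  // bool(true)
var_dump(allow_request($store, '203.0.113.9', 2, 2));  // bool(false)
```

This won't stop a determined attacker, but it cheaply absorbs the "thousands of simultaneous POSTs" case the question worries about.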

Will using ModRewrite cause the web page to refresh?

I am having an issue with my website which I have been unsure how to fix, partly because it does not happen consistently. Sometimes it does, and other times it does not.
The issue has to do with sessions and redirecting. When a user completes a task, they are redirected, and a $_SESSION variable is set with a message saying that the task has been done. The page the user is redirected to should display that message, but sometimes it doesn't: the session variable containing the message gets unset because the page refreshes before the message is finally shown to the user. The weird thing is that for one specific redirect (when a user logs in with Facebook Connect) it works, but for all other redirects it does not.
My latest idea as to why this happens is because I am using mod_rewrite. Can mod_rewrite cause the page to refresh before it is displayed in the browser?
Here is the code which I am using for the URL rewriting:
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*) index.php [L]
No. mod_rewrite tells Apache to take an incoming request that matches your rule (before a response has been sent) and route it to a particular destination. It's like mail forwarding: you don't get the letter twice, you just get it at a different location.
UPDATE: ADDING THE SOLUTION
You are likely performing some actions after the header() call for the redirect. Add a call to exit after header(), and your problem should be solved.
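A sketch of that fix (redirect_with_flash() is a hypothetical helper; the $terminate flag exists only so the function can be exercised without killing the process):

```php
<?php
// Set the flash message, send the redirect, and stop the script.
// Without the exit, code after the header() call keeps executing and
// can unset the session message before the next page displays it.
function redirect_with_flash(string $url, string $message,
                             bool $terminate = true): void
{
    $_SESSION['flash'] = $message;
    header('Location: ' . $url);
    if ($terminate) {
        exit; // the crucial part: nothing below this line runs
    }
}
```

header() only queues the Location header; it does not stop execution, which is why the explicit exit is needed.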
