I have a number of URLs stored in a database, so instead of adding a rewrite rule in .htaccess for every URL, I hand control to PHP through the following rewrite rule in Apache:
RewriteRule ^.*$ ./index.php
Each URL in the database has a corresponding original URL. The tricky part is that I have to serve the content of the database URL via its corresponding original URL, for which rewrite rules are already written in .htaccess. One solution would be to re-implement in PHP the same rewrite rules that Apache applies to the original URLs, but the number of such original URLs is huge.
So I would be glad to know whether there is a way to pass execution back through the rewrite rules in Apache after the processing inside PHP is complete.
If you have access to the main httpd.conf you could use a RewriteMap written in PHP.
Other than that, there is no way to give control from PHP back to Apache so that Apache can process the request further, not in the same request anyway. You could issue a 30x redirect from PHP and let Apache work on the next request.
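As a hedged sketch of the RewriteMap approach, assuming you can edit the main configuration (the map name "urlmap" and the script path are invented for illustration):

```apache
# httpd.conf or a vhost block (RewriteMap is not allowed in .htaccess);
# the map name and script path here are made up for this sketch
RewriteEngine on
RewriteMap urlmap "prg:/usr/local/bin/urlmap.php"
# Ask the PHP program for a target; fall back to index.php when it answers NULL
RewriteRule ^/(.+)$ ${urlmap:$1|/index.php} [L]
```

The map program is a long-running script that Apache starts once:

```php
#!/usr/bin/env php
<?php
// RewriteMap "prg:" program: Apache writes one lookup key per line on stdin
// and expects one mapped value (or the literal NULL) per line on stdout.
// The array below is a placeholder for your real database lookup.
$map = ['pretty-page' => '/real/original-page.php'];

while (($key = fgets(STDIN)) !== false) {
    $key = trim($key);
    echo ($map[$key] ?? 'NULL') . "\n";
    fflush(STDOUT); // Apache reads line-by-line, so flush each answer
}
```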
Rather than basic rewriting, create an Apache rule that sends all 404 errors to a PHP file, which becomes your URL handler. Using the requested URL, do a lookup against the list of URLs in your database to get the original URL back in your handler. From there, either do a redirect or fetch the page contents server-side (or via AJAX or an iframe, whichever you prefer). If the requested URL is not in your list, display your custom 404 page. This kills two birds with one stone.
Setting up a 404 page:
http://www.404-error-page.com/404-create-a-custom-404-error-page.shtml
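A minimal sketch of such a handler, assuming it is wired up in .htaccess with `ErrorDocument 404 /handler.php`; the lookup table below is a stand-in for your database query, and all names are made up:

```php
<?php
// Hypothetical 404 handler (handler.php).

// Look up the requested path; returns the original URL or null if unknown.
function resolve_url(string $requested, array $map): ?string
{
    return $map[$requested] ?? null;
}

// Stand-in for something like "SELECT original FROM urls WHERE pretty = ?"
$map = [
    '/pretty/page' => '/real/original-page.php',
];

$requested = parse_url($_SERVER['REQUEST_URI'] ?? '/', PHP_URL_PATH);
$original  = resolve_url($requested, $map);

if ($original !== null) {
    header('Location: ' . $original, true, 302); // or fetch and serve it server-side
} else {
    http_response_code(404);
    echo 'Page not found';                       // your custom 404 page
}
```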
Related
I want to redirect to another URL when a request comes to Apache to download a file.
For example, a client calls https://example.com/download/apps/v1.01.apk
/download/apps/v1.01.apk is a real path.
I want Apache to prevent the download when that URL is called and redirect to another URL instead.
For this, you will need to use a .htaccess file.
Create a .htaccess file in the root of your project and put this into it:
RewriteEngine on
Options -Indexes -Multiviews
RewriteRule ^download/apps/v1\.01\.apk$ /your-new-url.php [R=302,L]
Note that the pattern is matched against the path relative to where the .htaccess file lives, so a pattern like ^(v1\.01\.apk)$ on its own would only match a request for /v1.01.apk in the root. The R=302 flag makes Apache answer with a redirect instead of serving the file.
It's worth keeping in mind that when the web server first receives a request, the URL is just a string, and the server has to decide what to do.
One of the things it can do is look for a file on disk whose name matches the URL. If it finds a file, it can decide what to do with that information, perhaps combined with other information the browser sent in the request, or information it finds about the file.
Eventually, the server will come up with a response - maybe a response with the content of the file it found; maybe the result of running a particular script; maybe a response indicating a redirect to a different URL.
With most web server software, you can configure all of these decisions, in very flexible ways. So you can say "if the URL has a v in it, look for a file in this folder; if it exists, run this PHP script with the file name as an argument; if it doesn't, issue a redirect response to a URL where the v is replaced with an x".
For Apache, you will see a lot of advice to use .htaccess files to do this. These are not the primary configuration for Apache, but they are a convenient place to put extra configuration when you are using a shared server and can't edit the main configuration for security reasons.
The specific configuration line used to trigger a redirect response in Apache looks like this:
RewriteRule pattern-to-match-against-request url-to-redirect-to [R]
The first argument is a "regular expression" which can be as general or specific as you want. Note that . means "any character", so if you want to match a dot specifically, write \.
The second argument can contain variables like $1 and $2 which refer to parts of the requested URL "captured" by putting them in brackets in the pattern.
The [R] at the end can also have a type, like [R=temp] or [R=307], which will change how the browser handles the redirect, caches it, and so on. There are also other flags you can add, like [R,NC] for "Redirect, Not Case-sensitive".
Finally, you can add any number of RewriteCond lines before a rule, such as RewriteCond %{REQUEST_FILENAME} -f, meaning "if a file exists with the same name as the requested URL".
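As a hedged sketch of how these pieces fit together (the paths and flags are invented for illustration), the "redirect v to x when no file exists" idea from above might look like:

```apache
RewriteEngine on
# Only act when no real file matches the request
RewriteCond %{REQUEST_FILENAME} !-f
# Redirect e.g. /report-v to /report-x with a temporary, case-insensitive redirect
RewriteRule ^(.*)v$ /$1x [R=temp,NC,L]
```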
How does instagram.com pass the username variable in a URL like "instagram.com/username", for example
instagram.com/floydmayweather
without using the $_GET function, so that it does not end up looking like this:
instagram/index.php?username=floydmayweather
Use a URL rewrite command in your HTTP server. There are many examples out there for both Apache and nginx.
The rewrite happens at the server level, before the request hits your code. This means the URL the visitor sees never has to change; your code receives the rewritten version.
The way I do it is I configure Apache/nginx to send all URLs that do not match an existing file (so that static files like images, js and css still work) to my index.php file. Then in the index.php file I parse the URL to determine what page type to load and what data.
In your example, they would grab the last token off the URL, know that it would be a user's name in URL format, look up that user in the database and build the page accordingly.
This is where something like a front controller or URL router comes in to play in most frameworks. In index.php I would map each URL, based on its components, to a class that would then handle the actual page building.
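For Apache, the setup described above (static files served as-is, everything else sent to index.php) is typically just a few lines:

```apache
RewriteEngine on
# Serve real files and directories (images, js, css) directly
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Route everything else to the front controller
RewriteRule ^ index.php [L]
```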
Here is more info on the rewrite modules:
http://httpd.apache.org/docs/current/mod/mod_rewrite.html
http://wiki.nginx.org/HttpRewriteModule
Some quick Googling will show you many examples for how to configure this.
Your index.php file can examine the $_SERVER array to determine the URL that has been requested. In this situation, the explode() function is your friend, for parsing the URL and checking its components :)
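As a minimal sketch of that parsing step (the helper name is invented for illustration):

```php
<?php
// Split a request path into segments, e.g. "/floydmayweather" -> ["floydmayweather"]
function path_segments(string $uri): array
{
    $path = parse_url($uri, PHP_URL_PATH);          // drop any query string
    return array_values(array_filter(explode('/', $path), 'strlen'));
}

// In index.php you would then do something like:
$segments = path_segments($_SERVER['REQUEST_URI'] ?? '/');
$username = $segments[0] ?? null; // e.g. "floydmayweather"; look this up in the DB
```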
The rewrite engine will be a perfect solution for this, for example:
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}\.php -f
RewriteRule ^(.*)$ $1.php
Rewrite engine - A rewrite engine is software located in a Web application framework running on a Web server that modifies a web URL's appearance. This modification is called URL rewriting. Rewritten URLs (sometimes known as short, fancy URLs, search engine friendly - SEF URLs, or slugs) are used to provide shorter and more relevant-looking links to web pages. The technique adds a layer of abstraction between the files used to generate a web page and the URL that is presented to the outside world.
Usage
Instead of getting a URL with an extension (.php / .html etc.):
www.stackoverflow.com/index.php
you will get a URL without the extension:
www.stackoverflow.com/index
I am trying to write a content management system and have hit a snag while trying to develop SEO-friendly URLs. I am using PHP to handle URLs, but I have a problem when I try to get the REQUEST_URI for more than one depth level. I am trying to avoid using .htaccess to handle this, because I would like the system to be fairly easy to set up on IIS/nginx/etc. too, and do not want it to be Apache-dependent any more than necessary.
I have in my htaccess file
FallbackResource index.php
and then in my php I have a class that handles the REQUEST_URI slug by checking to see if a record exists in the mysql database. This works fine if the request is something like
http://example.com/foo
however throws an internal server error if the request is
http://example.com/foo/bar
This seems to occur even if I have a completely blank index.php, so I suspect the answer must be at the .htaccess level. How can I get my system to handle multiple REQUEST_URI depth levels? Do I need to use a mod_rewrite regex, or is there a less Apache-dependent solution?
My bad, I needed to change my .htaccess rule from
FallbackResource index.php
to
FallbackResource /index.php
The missing slash was causing the error. -.-
I'm creating a website with php backend. I have a directory called /inc/ which contains php include files that get included when generating html webpages.
If a user tries to request any files in the /inc/ directory (by url in their browser, for example), I've made it so they get redirected to the home page. I did this in an attempt to ensure that none of these files get called externally.
I have need to call one of these files via jQuery POST request.
Here is my question:
1) Can I somehow hide the url of the file requested in the POST?
2) Will a POST to a file in the /inc/ folder via jQuery fail, since external requests for files in the /inc/ folder get redirected to the home page? Or does the server make a distinction between POST requests, and other types of requests?
3) (OPTIONAL) How can I ensure that the POST request is done "legitimately", as opposed to a bot trying to crash my server by issuing thousands of simultaneous post requests?
Not without using a link somewhere, somehow. Remind yourself that jQuery / Ajax / XMLHttpRequest, anything pointing outwards, has to point outwards. There will always be a URL, and it will always be traceable.
Options to make sure your url is less traceable
Create a page for your javascript calls (hides away, but doesn't really do anything)
Edit .htaccess options and use it to process javascript requests
Edit .htaccess options and use a PHP page for server-side processing of javascript requests
I'll be going over option 3
Example (includes 2.!)
#Checks that the request is made from within the domain
#Edit this to your domain
RewriteCond %{HTTP_REFERER} !^.*domain\.com [NC]
#Pretends the requested page isn't there
RewriteRule \.(html|php|api\.key)$ /error/404 [L]
#Set a key to your 'hidden' url
#Since this is server-based, the client won't be able to get it
#This will set the environment variable when a request is made to
#www.yourwebsite.com/the folder this .htaccess is in/request_php_script
SetEnvIf Request_URI "request_php_script" SOMEKINDOFenvironmentNAME=http://yourlink.com
#Alternatively set Env in case your apache doesn't support it
#I use both
SetEnv SOMEKINDOFNAME request_php_script
#This will send the requester to the script you want when they call
#www.yourwebsite.com/the folder this .htaccess is in/request_php_script
RewriteCond %{REQUEST_URI} request_php_script$ [NC]
#If you don't want a php script to handle javascript and want the full url obfuscation benefit, write the following instead
#RewriteRule ^.*$ /adirectscript.php [L]
RewriteRule ^.*$ /aredirectscript.php [L]
#and yes this can be made shorter, but this works best if folders and keys in your ENV are similar to some extent
In this case you could call a php script that redirects you to the right page, but if everything is internal, then I don't see a reason to hide away the url to your scripts. If you have set up the .htaccess as shown, only your page can access it. Users and external sources aren't able to reach it, as they'll be redirected.
If your scripts refer to an external API key however, then this might be useful, and you could call the redirect script
<?php
echo file_get_contents(getEnv("SOMEKINDOFNAME"));
?>
Now when this script is called, it'll return your contents. If you want to load in pages, you can use something like what is described here instead:
getting a webpage content using Php
To make full use of this, point your jQuery POST method at www.yourwebsite.com/<the folder this .htaccess is in>/request_php_script.php
If 1 and 2 are done properly, as above, you shouldn't have to worry about outside bots trying to reach your .php scripts.
Sidenote:
You can skip the extra php script, but then you'll be traceable in the .har files, which means that in the end your url is still reachable somewhere. Using the extra php script (give it query parameters for convenience) will obfuscate the url enough to make it dead hard to find. I've used this approach to hide away requests to an external API key.
TLDR:
- Set a .htaccess server environment variable for the page
- Set a .htaccess rewrite condition so requests are redirected to the page only if they
  - originally come from my page
  - are called by script
- Set javascript to call the specified page
  Medium 'secure':
  - cannot be found in the script
  - cannot be traced back without saving .har files
- Create a php file to return the page contents
- Set .htaccess to point to the php instead of the page
  Highly 'secure':
  - cannot be found in the script
  - cannot be traced back, even when saving .har files
1) No.
2) Depends on how you handle redirects, best way is to try and see.
3) Not an easy task in general. Simple approach is to detect same client and limit request rate. No way to detect a bot in general only by request data.
As for your last comment, you can restrict access to those files with .htaccess without needing redirects. However, you still won't be able to get them with AJAX. The only real reason to hide something is if there is some sensitive information inside, like passwords or logins. Otherwise it doesn't really matter; nobody is interested in some hidden utility files.
I am very new to this URL rewriting; just got a question in my head.
www.example.com/?page_name=home
Here $_GET['page_name'] is actually 'home'.
After the URL rewrite, the URL becomes
www.example.com/home
Can PHP still see that $_GET['page_name'] is 'home'?
Thanks
The URL rewriting is done by the web server, let's say in this case Apache. This is not the same as PHP.
Apache receives a request for the URL www.example.com/home. It now needs to figure out what to do with this request. It will check its configuration for something that matches www.example.com, which will point it to a document root, i.e. some folder on the hard disk. It checks that folder on the hard disk and encounters an .htaccess file. It evaluates the .htaccess file, which tells it to rewrite the URL from /home to ?page_name=home.
Apache now tries to figure out what to do with ?page_name=home. Since there's no filename given, it defaults to index.php (which hopefully exists). It now runs that index.php file in the document root, passing it ?page_name=home as the URL it has received. PHP takes it from there, oblivious of the rewriting that happened. To PHP, it appears that you have received the parameter page_name as query parameter and puts it in $_GET.
Try using rewriting like this (change it as per your need):
RewriteRule ^([A-Za-z0-9_-]+)/?$ index.php?page_name=$1 [L]
The rule above rewrites
http://www.domain.com/string_LiKe-this53/
to the real, existing page
http://www.domain.com/index.php?page_name=string_LiKe-this53
where you can use $_GET['page_name'], which will have the value string_LiKe-this53.
It depends on the rewrite rule, but yes, you can get it to work as intended.
The following rewrite rule:
RewriteRule ^home$ /index.php?page_name=home [L]
Will simply cause requests to /home to execute index.php with $_GET['page_name'] equal to "home".
Depending on the complexity of your site, however, it may be preferable to use a more generic rewrite rule, such as:
RewriteRule ^(.+)$ index.php/$1
Then you would query $_SERVER['PATH_INFO'] to see if it contained "home". This will play nicely with other $_GET parameters that may be passed in.
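A sketch of that PATH_INFO check, with a helper to keep it testable (the default page name here is an assumption for illustration):

```php
<?php
// With RewriteRule ^(.+)$ index.php/$1, Apache puts the original path in
// $_SERVER['PATH_INFO'], leaving the query string free for other $_GET params.
function page_from_path_info(?string $pathInfo): string
{
    // Strip surrounding slashes; fall back to a hypothetical default page
    return trim($pathInfo ?? '', '/') ?: 'home';
}

$page = page_from_path_info($_SERVER['PATH_INFO'] ?? null);
// e.g. "/home" -> "home", "/about/team" -> "about/team"
```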