I want to create an application in PHP that implements a virtual directory feature.
For example, http://mydomain.com/user001 should display the contents of the URL http://mydomain.com/index.php?user=user001. How can I do that?
Note:
I am using an Apache server.
The traditional way to do it is mod_rewrite.
Please read this friendly article regarding rewrite.
Next, parse the $_SERVER['REQUEST_URI'] variable in PHP.
Once you have done that, you have the name of the directory and can fetch its data from the DB.
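As a minimal sketch, assuming the virtual names map straight onto a user lookup in the DB (the URL pattern and the user query-string name are just this question's example):

    # .htaccess — send anything that isn't a real file or directory to index.php
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^([A-Za-z0-9_-]+)/?$ index.php?user=$1 [L,QSA]

    <?php
    // index.php — read the name from the rewritten query string, or fall
    // back to parsing REQUEST_URI directly if mod_rewrite isn't available
    $user = $_GET['user']
        ?? basename(trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/'));
    // look $user up in the DB here (application-specific)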
Intercept the HTTP request using the 'REQUEST_URI' element of $_SERVER. This returns (I believe) only the requested page, not the entire URI/URL - more info here. Once you've grabbed the page request, substitute the address of the actual file that's needed. For example, the user-friendly www.somewebsite.com/page01 becomes a request for the more clunky-sounding www.somewebsite.com/page01.php. This method won't create a virtual directory as such, but it should work okay. I have used a similar method on my own IT website, where each page is loaded via index.php, allowing that file to keep a log of visitors in real time (the site has Webalizer, which runs a day or so in arrears).
Rewriting the filename might work, although it's not to my personal taste. Using PHP to effect a URI/URL-swap would likely carry the benefit of reduced server demand, due to requiring less disk read/write time than filename rewrites.
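For illustration, a minimal sketch of that index.php front-controller idea (the file names, log format and pages/ directory are assumptions, not the actual site's code):

    <?php
    // index.php — log the visit, then include the real page script
    $page = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
    if ($page === '') {
        $page = 'home';
    }

    // append a timestamped line to a simple real-time visitor log
    file_put_contents(
        __DIR__ . '/visitors.log',
        date('c') . ' ' . $_SERVER['REMOTE_ADDR'] . ' ' . $page . PHP_EOL,
        FILE_APPEND | LOCK_EX
    );

    // include the actual page script; a real version must validate $page
    // against a whitelist of known pages before doing this
    include __DIR__ . '/pages/' . $page . '.php';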
I hope that helps.
Related
I have placed a video file (mp4) on an Apache server, which will be accessed from an Android application. I need to know how many times the video has been viewed. The solutions I can think of are:
View the Apache logs. But I have very limited access to them.
Call a PHP file, then redirect to the video file.
Any other better solutions apart from the above two?
The third option is to have a PHP file which will register the download and then deliver the file by reading it and sending it to the client.
(See http://www.gayadesign.com/diy/download-counter-in-php-using-htaccess/)
Performance-wise this is somewhat worse than either the log or redirect method, but it is the most reliable, as the only way a client can access the file is via the PHP script. Furthermore, you can do this without any access to logs (it is Apache-independent). You also have more control (e.g. you can count downloads only once per IP), but then again, the other methods allow that too, with some modifications. I am not sure whether there is any other effective way besides the two you've listed and the one I suggest; maybe there is a way with PHP / Apache extensions, I am just not aware of it.
So either go with the redirect or this.
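A minimal sketch of that third option, assuming a flat one-integer counter file and a hard-coded video path (both are placeholders):

    <?php
    // serve-video.php — register the view, then stream the file to the client
    $file = __DIR__ . '/videos/clip.mp4';            // hypothetical path

    // increment the counter file under an exclusive lock
    $fp = fopen(__DIR__ . '/views.txt', 'c+');
    flock($fp, LOCK_EX);
    $count = (int) stream_get_contents($fp) + 1;
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) $count);
    flock($fp, LOCK_UN);
    fclose($fp);

    // deliver the video
    header('Content-Type: video/mp4');
    header('Content-Length: ' . filesize($file));
    readfile($file);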
We have several different client directories (each its own domain) that include/require the central app from a different location on the server. Basically each domain is an extension of the centralized code, but very lean, because the main code doesn't need to be duplicated.
If we wanted to give clients/resellers access to editing their own PHP codes, how would we prevent them from reading the central code that we wish to protect?
Basically we want to prevent them from creating some code that opens, reads, TARs, or somehow outputs the source code, but we must still allow the include.
The open_basedir restriction does almost this; it prevents the opening of the code, but in doing so it also prevents the include.
Are code encryption solutions (e.g. Zend Guard) our only options, or is there something like open_basedir that still allows includes? I've also thought about disabling all the read functions and writing my own that checks the source.
Thoughts?
The answer is no, you cannot give the "read" permission and prevent them from reading...
If they can "include" the code, they can also write a simple PHP script that reads your central app files and prints the contents to the screen, for example.
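For example (the path is hypothetical): anywhere the PHP process is allowed to include the central code, nothing stops it from doing this instead:

    <?php
    // if include('/central/app/secret.php') works, so does this
    echo htmlspecialchars(file_get_contents('/central/app/secret.php'));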
I believe you cannot restrict reading if you allow reading globally; however, you could filter access to your site in an .htaccess file with %{REMOTE_HOST} or similar. Basically, if you are able to identify your clients from their remote locations, by IP or URL, then I believe you can restrict reading of specific directories based on who is accessing the site. Can you give me an example of your PHP code for the reseller access to your site?
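A sketch of that kind of .htaccess restriction, with a placeholder address (Apache 2.2 syntax; Apache 2.4 uses Require ip instead):

    # .htaccess in the protected directory — allow only one client's address
    Order deny,allow
    Deny from all
    Allow from 203.0.113.7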
I ended up using Smarty to give limited capabilities to clients (templating), while keeping the PHP secure.
Nowadays, developers and professionals tend to use PHP templates for two reasons: they are manageable, and we don't need to advertise our technology, as there are no question marks or .php extensions within the URL.
But how do you keep from advertising your technology while sending a jQuery Ajax request to load a PHP file into a div? I mean, we would have to write $.get('phpfile.php') within the script, and one could say, "Voilà, he is using PHP."
Simply, I want to ask: is there any way of loading PHP through a request without advertising your technology, as described above?
Some example code would be appreciated.
I don't get it. jQuery doesn't know about PHP files. If your website has two "public pages", www.example.com and www.example.com/foo, then you can access the /foo page from the homepage with something like $.get("/foo"). Here I use Ajax and jQuery, and nobody knows whether my page uses PHP or anything else.
Then, you should look at mod_rewrite, as explained by verisimilitude, but rewriting URLs is not the only solution. Have a look at this site http://builtwith.com/ and enter a random URL. Web servers send, by default, a lot of data about themselves, so you should avoid that behavior too if you want to "hide" the technology used. Have a look here http://xianshield.org/guides/apache2.0guide.html. It's "a guide to installing and hardening an Apache 2.0 web server to common security standards." You may find useful information in there.
Edit
And also, "PHP templates" are not related to page URLs at all. For example, you could have multiple URLs which use the same "PHP template".
mod_rewrite is the best answer for all your predicaments. Why not use it? The URL phpfile.php in your above code could be rewritten to achieve the obfuscation...
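For instance, a rule like this (reusing the question's phpfile.php, with /data as a made-up public name) lets the script call $.get('/data') while the server quietly maps it to the PHP file:

    # .htaccess — serve phpfile.php under an extension-less URL
    RewriteEngine On
    RewriteRule ^data$ phpfile.php [L]

The Ajax call then becomes $.get('/data', ...) and no .php extension ever appears in the page source.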
@pomeh, good point.
See, a few things can be done here.
1) Disable the Apache signature. In the default configuration of Apache, any page served through it will contain a full signature of the server. Server signatures contain valuable information about installed software and can be read (and exploited), so it is safer to turn this behavior off. This is how you do it: open Apache's configuration file (httpd.conf or apache2.conf), search for ServerSignature and set it to 'Off', then search for ServerTokens and set it to 'Prod'.
2) Set expose_php in php.ini to Off. When enabled, it exposes to the world that PHP is installed on the server, and includes the PHP version within the HTTP headers. (Both settings are shown in the snippet after this list.)
3) There are some PHP obfuscators available which may also be used. I will not recommend them, since I've not personally tried them.
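Putting 1) and 2) together (the directive names are standard; file locations vary by distribution):

    # httpd.conf / apache2.conf — hide Apache's version details
    ServerSignature Off
    ServerTokens Prod

    ; php.ini — stop PHP from announcing itself in an X-Powered-By header
    expose_php = Off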
There are ways and means beyond these to hide the "technology". By default, a PHP-enabled Apache web server processes and interprets all files with the .php extension, but we can bind any unusual extension to be processed by the server, which hides the technology.
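With mod_php, for example, a made-up extension such as .data could be bound to the PHP handler (a sketch; PHP-FPM setups do this differently):

    # Apache with mod_php — treat .data files as PHP scripts
    AddHandler application/x-httpd-php .data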
I guess verisimilitude and pomeh already answered this question.
All web servers send information about themselves over the internet. You can't hide that.
If you want to hide file extensions, like 'aspx, php, asp, html' then you will need to use mod_rewrite under Apache or something like URL Rewrite under IIS7.
You can also set default documents under IIS7. This really only works once per web folder. For example you can set default.htm as one of the default documents. When a visitor goes to your website they type www.domain.com and they get a web page. That visitor is actually looking at www.domain.com/default.htm
I was wondering if there was a way to basically host a site on your server so you can run PHP, but have the actual code hosted on GitHub. In other words...
If a HTTP request went to:
http://mysite.com/docs.html
It'd request and pull in the content (via file_get_contents() or something):
https://raw.github.com/OscarGodson/Core.js/master/docs.html
Or, if they went to:
http://mysite.com/somedir/another/core.js
It'd pull down:
https://raw.github.com/OscarGodson/Core.js/master/somedir/another/core.js
I know GitHub has its own DNS servers, but I'd rather host it on mine so I can run server-side code. What would the .htaccess code look like for this?
This is beyond the capabilities of .htaccess files, if the requirement is to run the PHP embedded in the HTML stored on github.com at the server on yourserver.com simply by a configuration line like a redirect in the .htaccess file.
A .htaccess file is typically used to provide directives to the Apache web server. These directives can indicate, for example, access permissions, popup password protection, linkages between URLs and the server's file system, handlers for certain types of files when fetched by the server before delivery to the browser, and redirects from one URL to another URL.
An .htaccess file can issue redirects for http://mysite.com/somedir/another/core.js to https://raw.github.com.... but then the browser will be pointed to raw.github.com, not mysite.com. Tricks can be done with frames to make this redirection less transparent to the human at the browser... but these don't affect the fact that the data comes from github.com without ever going through the server at mysite.com.
In particular, PHP tags embedded in the HTML on github.com are never received by mysite.com's server and therefore will not run. Probably not what you want. Unless some big changes have occurred in Apache, .htaccess files will not set up that workflow. It might be possible for some expert to write an Apache module to do it, but I am not sure.
What you can do is put a cron job on mysite.com that runs git pull from github.com every few minutes. Perhaps that is what you want to do instead?
If the server can run PHP code, you can do this.
Basically, in the .htaccess file you use a RewriteRule to send all paths to a PHP script on your server. For example, a request for /somedir/anotherdir/core.js becomes /my-script.php/somedir/anotherdir/core.js. This is how a lot of app frameworks operate. When my-script.php runs, the "real" path is in the PATH_INFO variable.
From that point the script could then fetch the file from GitHub. If it was HTML or JavaScript or an image, it could just pass it along to the client. (To do things properly, though, you'll want to pass along all the right headers, too, like ETag and Last-Modified and then also check those files, so that caching works properly and you don't spend a lot of time transferring files that don't need to be transferred again and again. Otherwise your site will be really slow.)
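To make the shape of this concrete, here is a deliberately naive sketch with none of those caching headers (so it would be slow, as noted); the script name is the made-up one from above:

    # .htaccess — route every request that isn't a real file to the proxy
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.*)$ my-script.php/$1 [L]

    <?php
    // my-script.php — fetch the requested path from GitHub and pass it along
    $path = $_SERVER['PATH_INFO'] ?? '/';
    $url  = 'https://raw.github.com/OscarGodson/Core.js/master' . $path;

    $body = @file_get_contents($url);   // requires allow_url_fopen
    if ($body === false) {
        http_response_code(404);
        exit('Not found');
    }
    echo $body;   // a real version must also send a correct Content-Type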
If the file is a PHP file, you could download it locally, then include it into the script in order to execute it. In this case, though, you need to make sure that every PHP file is self-contained, because you don't know which files have been fetched from GitHub yet; if one file includes another, you need to make sure the files it depends on are downloaded too, and the files those files depend on as well.
So, in short, the .htaccess part of this is really simple, it's just a single RewriteRule. The complexity is in the PHP script that fetches files from GitHub. And if you just do the simplest thing possible, your site might not work, or it will work but really painfully slowly. And if you do a ton of genius level work on that script, you could make it run OK.
Now, what is the goal here? To save yourself the trouble of logging into the server and typing git pull to update the server files? I hope I've convinced you that trying to fetch files on demand from GitHub will be even more trouble than that.
I am using PHP's header() function to send a file to the browser with some small code. It works well, and I have it set up so that if anyone requests the file with a Referer other than my site, they are redirected to a page first.
Unfortunately it's not working with Internet Download Manager.
What I want to know is how sites like RapidShare and 4shared do this.
You could use sessions to make sure the download is being requested by a valid user.
Not all browsers / software that can see web pages will send a Referer to your server. Some sites will make a browser "fingerprint", usually hashed, which might be the Referer, User-Agent and a couple of other headers strung together to make a unique identifier for that user, and thus restrict access as you describe.
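As a sketch of that fingerprint idea (exactly which headers to hash together is a judgment call, and /denied.php is a made-up page):

    <?php
    // hash a few request headers into a per-user "fingerprint"
    $fingerprint = sha1(
        ($_SERVER['HTTP_REFERER'] ?? '') . '|' .
        ($_SERVER['HTTP_USER_AGENT'] ?? '') . '|' .
        ($_SERVER['HTTP_ACCEPT_LANGUAGE'] ?? '')
    );

    session_start();
    if (!isset($_SESSION['fp'])) {
        $_SESSION['fp'] = $fingerprint;      // first request: remember it
    } elseif ($_SESSION['fp'] !== $fingerprint) {
        header('Location: /denied.php');     // headers changed mid-session
        exit;
    }
    // otherwise fall through and serve the download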
Of course, I may have completely missed the point of your post!
A typical design pattern is using a front controller to have a single entry point for all requests. By having a front controller, you can control exactly what the client sees.
You can configure this in Apache so that all requests go through a single file (it's been a while since I've done this, because I now concentrate on Java). I think you would need to look at the PATH_INFO documentation for Apache.
This might require a significant change in the rest of your application code. But, the code will be more secure and maintainable in the long run.
I've served images and other binary files through this pattern. This allowed me to easily verify users were authenticated before actually sending them the file. Obfuscation is not security, so if you rely on obfuscating your URL, an attacker may be delayed in getting in, but it is just a matter of time.
Walter
The problem is probably that sending the file through a PHP script (with the headers you mentioned) doesn't support starting the download at an arbitrary position. Download managers use this feature (HTTP Range requests) to download a file in several simultaneous threads (assuming the server gives each connection only a certain speed).
For a small project I would recommend making a copy of the file under a unique filename just for the duration of the download and redirecting the user to this copy. This way the client gets the server's full download features, and it also doesn't load the processor the way PHP does. Disadvantages: more disk space is required, and the download directory needs cleaning up.
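A minimal sketch of that copy-and-redirect approach (paths are placeholders, and cleaning up stale copies is left to a cron job):

    <?php
    // download.php — copy the real file to a one-off public name and redirect
    $source = __DIR__ . '/protected/bigfile.zip';    // hypothetical source
    $token  = bin2hex(random_bytes(16));             // unique, unguessable name
    $public = __DIR__ . '/downloads/' . $token . '.zip';

    copy($source, $public);          // Apache will serve this copy directly,
                                     // with full resume / multi-thread support
    header('Location: /downloads/' . $token . '.zip');
    exit;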