Routing .htaccess to GitHub

I was wondering if there was a way to basically host a site on your server so you can run PHP, but have the actual code hosted on GitHub. In other words...
If an HTTP request went to:
http://mysite.com/docs.html
It'd request and pull in the content (via file_get_contents() or something):
https://raw.github.com/OscarGodson/Core.js/master/docs.html
Or, if they went to:
http://mysite.com/somedir/another/core.js
It'd pull down:
https://raw.github.com/OscarGodson/Core.js/master/somedir/another/core.js
I know GitHub has its own DNS servers, but I'd rather host it on my own server so I can run server-side code. What would the .htaccess code look like for this?

This is beyond the capabilities of .htaccess files, if the requirement is to run, on yourserver.com, the PHP embedded in the HTML stored on github.com, using nothing more than a configuration line like a redirect in an .htaccess file.
A .htaccess file is typically used to provide directives to the Apache web server. These directives can specify, for example, access permissions, password protection, mappings between URLs and the server's file system, handlers for certain types of files as they are fetched by the server before delivery to the browser, and redirects from one URL to another.
An .htaccess file can issue redirects for http://mysite.com/somedir/another/core.js to https://raw.github.com..., but then the browser will be pointed to raw.github.com, not mysite.com. Tricks can be done with frames to make this redirection less transparent to the human at the browser, but these don't change the fact that the data comes from github.com without ever passing through the server at mysite.com.
In particular, PHP tags embedded in the HTML on github.com are never received by mysite.com's server and therefore will not run. Probably not what you want. Unless some big changes have occurred in Apache, .htaccess files will not set up that workflow. It might be possible for some expert to write an Apache module to do it, but I am not sure.
What you can do is put a cron job on mysite.com that runs git pull from github.com every few minutes. Perhaps that is what you want to do instead?
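As a sketch, the crontab entry could be as simple as the following; the clone path and the five-minute interval are placeholders, not from the original answer:

# Pull the latest code from github.com every five minutes.
*/5 * * * * cd /var/www/mysite && git pull -q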

If the server can run PHP code, you can do this.
Basically, in the .htaccess file you use a RewriteRule to send all paths to a PHP script on your server. For example, a request for /somedir/anotherdir/core.js becomes /my-script.php/somedir/anotherdir/core.js. This is how a lot of app frameworks operate. When my-script.php runs, the "real" path is in the PATH_INFO variable.
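For reference, a minimal .htaccess sketch of that idea might look like this (my-script.php is the name used above; your exact conditions may differ):

RewriteEngine On
# Let files that really exist on disk through; send everything else to the script.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ /my-script.php/$1 [L]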
From that point the script could then fetch the file from GitHub. If it was HTML or JavaScript or an image, it could just pass it along to the client. (To do things properly, though, you'll want to pass along all the right headers, too, like ETag and Last-Modified, and then honor those headers on incoming requests, so that caching works properly and you don't spend a lot of time transferring files that don't need to be transferred again and again. Otherwise your site will be really slow.)
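A rough sketch of what my-script.php could do, assuming the repository from the question; a real version would need far better error handling and caching than this:

<?php
// Map the requested path onto the raw GitHub URL from the question.
$path = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '/';
$url  = 'https://raw.github.com/OscarGodson/Core.js/master' . $path;

$body = file_get_contents($url);
if ($body === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit;
}
// Pass along a few of GitHub's headers so browser caching still works.
foreach ($http_response_header as $h) {
    if (preg_match('/^(Content-Type|ETag|Last-Modified):/i', $h)) {
        header($h);
    }
}
echo $body;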
If the file is a PHP file, you could download it locally, then include it into the script in order to execute it. In this case, though, you need to make sure that every PHP file is self-contained, because you don't know which files have been fetched from GitHub yet; if one file includes another, you need to make sure the included file is downloaded too, and the files it depends on, and so on.
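Continuing the sketch above for the PHP case (this check would go before the echo in the previous sketch); the cache location is made up for illustration, and it ignores the dependency problem just described:

// If the fetched file is PHP, save it locally and include it so it runs.
if (substr($path, -4) === '.php') {
    $cache = sys_get_temp_dir() . '/gh_' . md5($path) . '.php';
    file_put_contents($cache, $body);
    include $cache;
    exit;
}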
So, in short, the .htaccess part of this is really simple: it's just a single RewriteRule. The complexity is in the PHP script that fetches files from GitHub. And if you just do the simplest thing possible, your site might not work, or it will work but really painfully slowly. And if you do a ton of genius-level work on that script, you could make it run OK.
Now, what is the goal here? To save yourself the trouble of logging into the server and typing git pull to update the server files? I hope I've convinced you that trying to fetch files on demand from GitHub will be even more trouble than that.

how to send header:location to out of web root file

I'm making a web application which will only allow registered members to download zip folders from a folders directory.
I really need to know the proper way to secure the folder, as only members stored in my database should be able to access the files; the problem is that if somebody finds the directory and a file name, there's nothing to stop them accessing it.
I've been doing some research and found some approaches but they all have major drawbacks.
1.) Put the files outside of the web root, then use readfile() to send the data.
This is how I have it currently set up. The major drawback is that I'm on a shared server and the max execution time for the script is 30 seconds (can't be changed), so if the file is big or the user's connection is slow, the script will time out before the download is complete.
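For context, option 1 usually looks something like this; is_member() is a hypothetical stand-in for whatever session check you already have:

<?php
// Serve a zip from outside the web root to logged-in members only.
if (!is_member()) { // hypothetical auth check
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// basename() strips any ../ tricks from the requested name.
$name = isset($_GET['f']) ? $_GET['f'] : '';
$file = '/home/user/private/' . basename($name);
if (!is_file($file)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: application/zip');
header('Content-Length: ' . filesize($file));
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
readfile($file);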
2.) .htaccess and .htpasswd inside a web root directory.
The problem with this is I don't want to have to ask the user to enter a password again, unless there's a way for PHP to send the password and then redirect to the actual zip file that needs to be downloaded.
3.) Keeping the files in the web root but obfuscating the file names so they are hard to guess.
This is just totally lame!
What I really would like to do is keep the files outside of the web root and then just send a header('Location: ...') to that document to force a download; obviously, as it's not in the web root, the browser won't see it. Is there a way around this? Is there a way to redirect to an out-of-web-root file with header('Location: /file') to force a download, thus letting Apache serve the file instead of PHP with readfile()?
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across? Has anybody experienced this problem before, and is there an industry-standard way to do this better?
I know this may resemble a repeat question but none of the answers in the other similar question gave any useful information for my needs.
What I really would like to do is keep the files outside of the web root then just send a header('Location: ...') to that document to force a download; obviously, as it's not in the web root, the browser won't see it.
More to the point, it is outside the web root so it doesn't have a URL that the server can send in the Location header.
Is there a way around this? Is there a way to redirect to an out-of-web-root file with header('Location: /file') to force a download?
No. Preventing the server from simply handing over the file is the point of putting it outside the web root. If you could redirect to it, then you would just be back in "hard to guess file name" territory with the added security flaw of every file on the server being public over HTTP.
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across?
Your options (some of which you've expressed already in the form of specific implementations) are:
Use hard to guess URLs
Put the file somewhere that Apache won't serve it by default and write code that will serve it for you
Use Apache's own password protection options
There aren't any other approaches.
Is there some easier way to secure the folders and serve them with Apache that I am just not coming across?
No, there isn't an easier way (but that said, all three implementations you've described are "very easy").
Another approach, which I consider really dirty but might get around your resource constraints:
Keep the files outside the web root
Configure Apache to follow symlinks
On demand: Create a symlink from under the web root to the file you want to serve
Redirect to the URI of that symlink
Have a cron job running every 5 minutes to delete old symlinks (put a timestamp in the symlink filename to help with this)
It's effectively a combination of the first two options in my previously bulleted list.
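A sketch of that approach, with made-up paths; the timestamp in the link name is what the cleanup cron job would key on:

<?php
// Create a short-lived symlink under the web root, then redirect to it.
$secret  = '/home/user/private/report.zip';          // outside the web root
$name    = time() . '_' . md5(uniqid('', true)) . '.zip';
$symlink = $_SERVER['DOCUMENT_ROOT'] . '/tmp/' . $name;

if (symlink($secret, $symlink)) {
    header('Location: /tmp/' . $name); // Apache serves the file itself
    exit;
}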

How to protect PHP from the public?

So I'm a bit confused about what crafty users can and can't see on a site.
If I have a file with a bunch of PHP script, the user can't see it just by clicking "view source." But is there a way they can "download" the entire page including the PHP?
What permission settings should pages be set to, if there is PHP script that must execute on load but that I don't want anyone to see?
Thanks
2 steps.
Step 1: So long as your PHP is being processed properly this is nothing to worry about...do that.
Step 2: As an insurance measure, move the majority of your PHP code outside of the Web server directory and then just include it from the PHP files that are in the directory. PHP will include from the file system and therefore have access to the files, but the Web server will not. On the off chance that the Web server gets messed up and serves your raw PHP code (it happened to Facebook at one point), the user won't see anything but a reference to a file they can't access.
PHP files are processed by the server before being sent to your web browser. That is, the actual PHP code, comments, etc. cannot be seen by the client. For someone to access your php files, they have to hack into your server through FTP or SSH or something similar, and you have bigger problems than just your PHP.
It depends entirely on your web server and its configuration. It's the web server's job to take a url and decide whether to run a script or send back a file. Commonly, the suffix of a filename, file's directory, or the file's permission attributes in the filesystem are used to make this decision.
PHP is a server-side scripting language that is executed on the server. There is no way it can be accessed client-side.
If PHP is enabled, and if the code is properly enclosed in PHP tags, none of it will go past your web server. To make things further secure, disable directory browsing, and put an empty index.php or index.html in all the folders.
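Disabling directory browsing takes a single directive in an .htaccess file, assuming your host allows overrides:

# Stop Apache from listing directory contents when no index file exists.
Options -Indexes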
Ensure that you adhere to secure coding practices too. There are quite a number of articles on the web. Here is one: http://www.ibm.com/developerworks/opensource/library/os-php-secure-apps/index.html

Serve a PHP website with PHP files being remote

This is the situation:
I have a LAMP server, which serves HTML, PHP, etc. Now I have a remote folder, somewhere on the web, which holds a directory full of PHP files, images, an MVC folder structure (CodeIgniter), etc.
Now, what I want to do is, instead of downloading those PHP files and uploading them to my LAMP server every time I want to serve them, use those PHP files directly and serve them from my LAMP server.
Again, I want the PHP files from a folder on another server (to which I only have access via the direct link to each individual file) to be served by my LAMP server, so that if I access my website, for instance www.website.com/page1, it gets the folder structure or all the PHP files from the remote web server and serves them within my server.
I know this sounds a little bit complicated, but I'm not sure what to use... Maybe a reverse proxy? Do you think I should download the files directly and constantly sync them? If anyone comes up with a good solution I may even pay that person...
EDIT(1)
Good answers so far... but I think I did not ask the question well, so here it goes again:
I have access to a "list" of PHP files, and in order to get them I need to authenticate myself using oath via PHP. Once I get authenticated, I can retrieve a list of PHP, html, etc.. files, each one of them having a public URL that anyone can access. So the think is that instead of downloading all files in that repository, and serve those files, I want to be able to reuse that repository's web space and I just serve these files myself. So basically I want to be able to have symbolic links to urls, which I think is not possible, but being able to just read the files and serve the PHP logic, even though the files are elsewhere.
I'm concerned about the security issues involved, but if someone could help me I will be thankful... Also, if you are interested in what I'm doing, I can always use a partner for this project, which I intend to use for charity, and I can still pay that person.
This is not a smart thing to do. You open yourself up to potential security issues, but at a minimum, you will significantly slow your site down.
I would recommend that you simply synchronize the files between the two servers over SSH with a script.
Edit: ManseUK's suggestion of rsync is also a good one.
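For example, a cron-driven rsync along these lines would keep a local mirror; the host and paths are placeholders:

# Mirror the remote directory over SSH; --delete prunes files removed remotely.
rsync -az --delete user@remote.example.com:/path/to/app/ /var/www/site/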
If you have FTP access to the remote server, you could mount the folder using FUSE and serve it as usual with Apache.
Do you have the ability to mount the remote folder as an NFS volume, or perhaps with SSHFS? If those options are available, either could work for you. You'd mount the remote folder locally and tell your local web server to serve files from that path.
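An SSHFS mount would look roughly like this (placeholder host and paths); once mounted, Apache treats the directory like any local one:

# Mount the remote folder locally; -o reconnect survives dropped connections.
sshfs user@remote.example.com:/path/to/app /var/www/site -o reconnect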
Not that it would be the most efficient setup in the world, but I don't know why you have all this split apart in the first place. ;)
You could write a cron job to grab the remote file list every X minutes/hours/days, then store the results locally, then write a simple script to parse those results upon request. Alternatively, you could still use an NFS or SSHFS mount to read the remote paths in real time and build whatever URLs you need.

Securing PHP files

Hello and thanks to everyone for reading my question.
I've been working on a PHP web program for a little while and was wondering what measures should I take to protect the source before putting it on a live server. The source isn't being distributed, it's being accessed through a website (users log into the website to use it).
First I'd like to protect the source PHP files from being found and downloaded. I'm not using any framework, just PHP, and all the files are in the home directory along with index.php. I read around and it seems that robots.txt isn't really effective for hiding. I came across some posts of people recommending .htaccess, but I had thought it was for protecting files within a directory with a password, so I'm not sure if there's a way to make .htaccess suitable for a web app.
Second, I'd like to protect the source files in case someone gets access to them (either someone who finds and downloads them, or a sysadmin who has ready access to the server). I thought of source encryption with something like ionCube. My host also has GnuPG (which I'm not familiar with; any thoughts about it compared to ionCube?).
I'm not familiar with source protection, so any ideas would be nice, and of course thank you muchly :)
Just make sure your web server is set up to handle .php files correctly, and that all files have the correct .php extension (not .php.inc or similar).
As long as your server executes the PHP, no one can download its source code (ignoring any security holes in your code, which is a different topic).
There was a time when it was common to name included files along the lines of mystuff.php.inc. This is a bad idea. Say your site is at example.com and you store your database configuration in config.php.inc; if someone guesses this URL, they can request http://example.com/config.php.inc and get your database login in plain text.
It is a good idea to store configuration and other libraries one directory up, as bisko answered, so you have a directory structure like:
/var/example.com:
    include/
        config.php
        helper_blah.php
    webroot/
        index.php
        view.php
This way, even if your web server config gets screwed up and starts serving .php files as plain text, it'll be bad, but at least you won't be announcing your database details to the world.
As for encrypting the files, I don't think this is a good idea. The files must be unencrypted so that Apache (or whatever server you're using) can access them. If Apache can access them, your sysadmin can too.
I don't think encryption is the solution to an untrustworthy sysadmin.
Well, for your first point, that's web server security, which you should seek help with on Server Fault. Basically you would use a secure/locked directory for this, or access the files in a virtual directory via a web service.
For your second point, you would use an obfuscator, which will protect your source, but remember that if they get the file, you can only do so much to protect it. If they are really interested, they'll get what they want.
The first step you should take is to move all unnecessary files out of the website root and put them somewhere else, leaving only the files that are called from the web.
For example, if you have this setup:
/var/htdocs/mysexydomain.com/root/config.php
/var/htdocs/mysexydomain.com/root/db.class.php
/var/htdocs/mysexydomain.com/root/index.php
/var/htdocs/mysexydomain.com/root/samplepage1.php
Move all the files one level up, so you get:
/var/htdocs/mysexydomain.com/includes/config.php
/var/htdocs/mysexydomain.com/includes/db.class.php #see the includes dir? :)
/var/htdocs/mysexydomain.com/root/index.php
/var/htdocs/mysexydomain.com/root/samplepage1.php
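With that layout, the files left in root/ reach the moved files with relative includes, roughly like this (paths follow the example above):

<?php
// index.php: pull in code that the web server can no longer serve directly.
require dirname(__FILE__) . '/../includes/config.php';
require dirname(__FILE__) . '/../includes/db.class.php';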
