So I'm a bit confused about what crafty users can and can't see on a site.
If I have a file with a bunch of PHP script, the user can't see it just by clicking "view source." But is there a way they can "download" the entire page, including the PHP?
What permission settings should pages be set to if there is PHP script that must execute on load but that I don't want anyone to see?
Thanks
2 steps.
Step 1: As long as your PHP is being processed properly, this is nothing to worry about... so make sure that it is.
Step 2: As an insurance measure, move the majority of your PHP code outside of the web server directory and then just include it from the PHP files that are in the directory. PHP includes via the filesystem and therefore has access to the files, but the web server does not. On the off chance that the web server gets messed up and serves your raw PHP code (this happened to Facebook at one point), the user won't see anything but a reference to a file they can't access.
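A minimal sketch of what step 2 can look like, with made-up paths (assume the web server only serves /var/www/example.com/public and the real code lives one level above it):

<?php
// /var/www/example.com/public/index.php -- the only file inside the web root.
// Even if the server were ever misconfigured and returned this file as plain
// text, all a visitor would see is this require line, not the application code.
require __DIR__ . '/../app/bootstrap.php';

run_app();

and the file it pulls in:

<?php
// /var/www/example.com/app/bootstrap.php -- no URL maps to this file,
// but PHP can still read it from the filesystem.
function run_app()
{
    echo 'Hello from code the web server cannot serve directly.';
}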
PHP files are processed by the server before being sent to your web browser. That is, the actual PHP code, comments, etc. cannot be seen by the client. For someone to access your PHP files, they would have to hack into your server through FTP or SSH or something similar, at which point you have bigger problems than just your PHP.
It depends entirely on your web server and its configuration. It's the web server's job to take a url and decide whether to run a script or send back a file. Commonly, the suffix of a filename, file's directory, or the file's permission attributes in the filesystem are used to make this decision.
PHP is a server-side scripting language that is executed on the server. There is no way it can be accessed client side.
If PHP is enabled, and your code is properly enclosed in PHP tags, none of the PHP code will make it past your web server. To make things more secure, disable directory browsing, and put an empty index.php or index.html in all the folders.
Ensure that you adhere to secure coding practices too. There are quite a number of articles on the web. Here is one: http://www.ibm.com/developerworks/opensource/library/os-php-secure-apps/index.html
I have placed a video file (mp4) on an Apache server which will be accessed from an Android application. I need to know how many times the video has been viewed. The solutions I can think of are:
View the Apache logs. But I have very limited access to them.
Call a PHP file, then redirect to the video file.
Any other better solutions apart from the above two?
The third option is to have a PHP file which will register the download and then deliver the file by reading it and sending it to the client.
(See http://www.gayadesign.com/diy/download-counter-in-php-using-htaccess/)
Performance-wise this is somewhat worse than either the logs or redirect methods, but it is the most reliable, as the only way a client can access the file is via the PHP script. Furthermore, you can do this without any access to the logs (it is Apache-independent). You also have more control (e.g. you can count downloads only once per IP), though the other methods allow that too, with some modifications. I am not sure there is any other way to do it effectively besides the two you've listed and the one I suggest; maybe there is a way with PHP or Apache extensions, but I am not aware of it.
So either go with the redirect or this.
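A rough sketch of that third option, assuming the video lives outside the web root and the count is kept in a plain text file (a database row would work just as well); the paths and file names here are made up:

<?php
// count.php -- the Android app requests this URL instead of the .mp4 directly.

$videoPath   = '/var/data/videos/clip.mp4';   // assumed location, outside the web root
$counterPath = '/var/data/videos/clip.count'; // assumed hit-counter file

// Register the view (with a lock so concurrent requests don't lose counts).
$fp = fopen($counterPath, 'c+');
if ($fp && flock($fp, LOCK_EX)) {
    $count = (int) stream_get_contents($fp) + 1;
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) $count);
    flock($fp, LOCK_UN);
    fclose($fp);
}

// Deliver the video itself.
header('Content-Type: video/mp4');
header('Content-Length: ' . filesize($videoPath));
readfile($videoPath);

For real playback in an Android player you would also want to honour HTTP Range requests so that seeking works, which plain readfile() does not handle.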
I've just had an argument with a colleague.
My index.php contains my MySQL connection and therefore also the host, username, password and database name.
He claims it is a security threat, because the possibility exists that the PHP parser may fail, which would cause the web server to return the entire file as plain text.
I, however, believe that if the PHP parser were to fail, the web server would give an internal server error to the users.
Can anyone confirm whether it is or is not a security risk?
thank you.
The short answer is no.
The long answer is yes, but only if:
your server's been compromised, in which case people reading your php files are the least of your worries
you've misconfigured your server to serve .php files as plain text, which would be very silly indeed.
Also, if you're using some kind of version control software, make sure your .hg or .svn or whatever folders can't be viewed from a web browser. You'd be surprised how often that happens.
EDIT:
I would be inclined to go with some of the suggestions on here already, which is what I do in my day to day development. Have a config.php file outside of your web root folder and include this in your index.php. That way you know for sure it's never going to be viewable. Btw, I've been developing in PHP for a number of years and have never had the parser fail in such a way that it's resulted in raw PHP being displayed to an end user.
EDIT 2:
If your colleague is referring to parse errors when he talks about the PHP parser "failing" then in a live environment you should have error reporting disabled anyway.
Either outcome is a possibility. The normal course of action is to use require to bring in a separate file containing your db credentials. That file should be outside the webserver file tree so it can't be reached via a browser.
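A small sketch of that arrangement, with made-up paths and credentials (assume /var/www/html is the web root and /var/www/config sits next to it, outside the web tree):

<?php
// /var/www/config/db.php -- not reachable through any URL
return array(
    'host' => 'localhost',
    'user' => 'app_user',
    'pass' => 'not-the-real-password',
    'name' => 'app_db',
);

and in the public file:

<?php
// /var/www/html/index.php
$db = require __DIR__ . '/../config/db.php';

$mysqli = new mysqli($db['host'], $db['user'], $db['pass'], $db['name']);
if ($mysqli->connect_error) {
    die('Database connection failed.'); // don't echo the raw error on a live site
}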
I'm of the belief that you can never be too safe. What's easier: replacing thousands, possibly millions, of records if a hacker gets your DB information, and explaining the security breach to your users (and possibly their lawyers, depending on the content and the breach), or putting your DB information in a separate, password-protected folder and including that information on the pages that need the connection?
To me, the choice is simple.
Your co-worker is correct but this is very unlikely to happen. The .php file will only be returned as plain text or as a download if PHP has stopped running on the host.
To be safer, keep the database credentials in a separate folder and include() them from there. In that folder, have a .htaccess file with 'deny from all'.
That way, even if PHP stops running on the server, Apache will still run and refuse to serve any of the files in that folder, including the database credentials. If even Apache stops running, the whole web server will be unreachable and your credentials will still be safe.
:)
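A sketch of that layout, with a made-up folder name: the web root contains index.php plus a credentials/ folder whose .htaccess holds the single line deny from all.

<?php
// index.php (public) -- Apache refuses to serve anything under /credentials/
// over HTTP because of that .htaccess, but include() goes through the
// filesystem, so this still works:
include __DIR__ . '/credentials/db.php';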
Personally, I'd put the options in a config file outside the web tree and, once uploaded, remove FTP access from that directory. It's not just a matter of whether the PHP parser fails and drops the file out as plain text: if the FTP server has a vulnerability that gets compromised, that file could be accessed via FTP as well as HTTP.
As long as Apache/PHP is running as a separate user to FTP you can still require the config file from PHP.
I was wondering if there was a way to basically host a site on your server so you can run PHP, but have the actual code hosted on GitHub. In other words...
If an HTTP request went to:
http://mysite.com/docs.html
It'd request and pull in the content (via file_get_contents() or something):
https://raw.github.com/OscarGodson/Core.js/master/docs.html
Or, if they went to:
http://mysite.com/somedir/another/core.js
It'd pull down:
https://raw.github.com/OscarGodson/Core.js/master/somedir/another/core.js
I know GitHub has their own DNS servers, but I'd rather host it on mine so I can run server-side code. What would the .htaccess code look like for this?
This is beyond the capabilities of .htaccess files, if the requirement is to run the PHP embedded in the HTML stored on github.com at the server on yourserver.com simply by a configuration line like a redirect in the .htaccess file.
A .htaccess file is typically used to provide directives to the Apache web server. These directives can indicate, for example, access permissions, popup password protection, linkages between URLs and the server's file system, handlers for certain types of files when fetched by the server before delivery to the browser, and redirects from one URL to another URL.
An .htaccess file can issue redirects from http://mysite.com/somedir/another/core.js to https://raw.github.com..., but then the browser will be pointed at raw.github.com, not mysite.com. Tricks can be done with frames to make this redirection less transparent to the human at the browser, but these don't change the fact that the data comes from github.com without ever going through the server at mysite.com.
In particular, PHP tags embedded in the HTML on github.com are never received by mysite.com's server and therefore will not run. Probably not what you want. Unless some big changes have occurred in Apache, .htaccess files will not set up that workflow. It might be possible for some expert to write an Apache module to do it, but I am not sure.
What you can do is put a cron job on mysite.com that does a git pull from github.com every few minutes. Perhaps that is what you want to do instead?
If the server can run PHP code, you can do this.
Basically, in the .htaccess file you use a RewriteRule to send all paths to a PHP script on your server. For example, a request for /somedir/anotherdir/core.js becomes /my-script.php/somedir/anotherdir/core.js. This is how a lot of app frameworks operate. When my-script.php runs, the "real" path is in the PATH_INFO variable.
From that point the script could then fetch the file from GitHub. If it was HTML or JavaScript or an image, it could just pass it along to the client. (To do things properly, though, you'll want to pass along all the right headers, too, like ETag and Last-Modified and then also check those files, so that caching works properly and you don't spend a lot of time transferring files that don't need to be transferred again and again. Otherwise your site will be really slow.)
If the file is a PHP file, you could download it locally, then include it into the script in order to execute it. In this case, though, you need to make sure that every PHP file is self-contained, because you don't know which files have been fetched from GitHub yet; if one file includes another, you need to make sure that file is downloaded too, and the files it depends on, and so on.
So, in short, the .htaccess part of this is really simple, it's just a single RewriteRule. The complexity is in the PHP script that fetches files from GitHub. And if you just do the simplest thing possible, your site might not work, or it will work but really painfully slowly. And if you do a ton of genius level work on that script, you could make it run OK.
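To make the non-PHP case concrete, here is a bare-bones sketch of what my-script.php might look like; the rewrite rule shown in the comment, the GitHub URL and the header handling are simplified assumptions, not a drop-in solution:

<?php
// my-script.php -- hypothetical on-demand proxy for files hosted on GitHub.
// Assumes a rewrite rule in .htaccess along these lines:
//   RewriteEngine On
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^(.*)$ my-script.php/$1 [L]
// (also assumes allow_url_fopen is enabled so file_get_contents can fetch URLs)

$path = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '/docs.html';

// Crude sanity check so "../" can't be smuggled into the remote URL.
if (strpos($path, '..') !== false) {
    header('HTTP/1.1 400 Bad Request');
    exit;
}

$remote = 'https://raw.github.com/OscarGodson/Core.js/master' . $path;

$body = @file_get_contents($remote);
if ($body === false) {
    header('HTTP/1.1 404 Not Found');
    exit('Not found');
}

// Guess a content type from the extension. A real version would also forward
// ETag / Last-Modified headers and cache locally, as discussed above.
$types = array(
    'js'   => 'application/javascript',
    'css'  => 'text/css',
    'html' => 'text/html',
    'png'  => 'image/png',
);
$ext = strtolower(pathinfo($path, PATHINFO_EXTENSION));
header('Content-Type: ' . (isset($types[$ext]) ? $types[$ext] : 'application/octet-stream'));
echo $body;

Executing PHP files fetched this way is the part that gets complicated, for the dependency reasons described above.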
Now, what is the goal here? To save yourself the trouble of logging into the server and typing git pull to update the server files? I hope I've convinced you that trying to fetch files on demand from GitHub will be even more trouble than that.
I am curious about the security of PHP, both when the PHP code is embedded in an HTML webpage (a page that exists on the server as "webpage.php") and when a PHP script is only referenced by an HTML page (that is, a PHP script that is not itself a webpage, exists on the server as "something.php", and is referenced by "webpage.html"). Getting to the point: let us say that if the source code of my PHP script were known by anyone, it would be a very big problem. I know that when you view the source of a PHP page in a browser the PHP script is not shown, but what if the PHP server failed and the HTML still loaded (is this even possible), would a user be able to see the PHP script? To be more general, is there ANY possible way that a user could access the source of a PHP script from a web browser, and if so, how do I prevent it?
what if the PHP server failed and the HTML still loaded (is this even possible), would a user be able to see the PHP script?
Security holes aside, this typically happens when someone's messing with the server or migrating the site across servers and the PHP files have been dumped into a folder that's not set up to execute PHP. This is the price you pay for PHP deployment being as simple as dropping files into a folder.
Whilst it's never ideal to leak PHP source, you can mitigate the situation by putting all your sensitive deployment information (like database passwords) in a PHP include file that lives outside the web root (the folder mapped to the / URL, often known as htdocs). It's much harder to screw up the configuration to leak that.
(For larger, more modular projects you will typically be doing the bulk of your processing work in includes anyway.)
One simple thing you can do to guard against a simple server misconfiguration is to have the HTML file include a PHP file which is outside of the document root (at or above the level of the document root, usually "htdocs"). That way, if there were a brief misconfiguration, all the user would get is the path to the included file, but they would not be able to load that included file directly in their browser.
Is there really a way to do this? Retrieving a raw .php file from the server (other than getting into the server's FTP account)? Is this the reason why there are tools/scripts to encrypt PHP source code?
If it's true, then how do you protect against it (without using PHP source code encryption)?
edit: the server mentioned has PHP running, e.g. Apache-PHP-MySQL, your standard hosting server configuration.
If you are talking about someone else's server, then the short answer is no. If third parties could read your PHP source code, that would be quite a security hole, since PHP files tend to contain database passwords, hash keys, proprietary algorithms and other goodies that you don't want falling in the wrong hands.
If you are talking about your own server (ie. that you yourself have access to), then there are simple scripts that you can put on the server, that allow you to specify a path to any file on the server and have it returned as plaintext.
However, you NEVER EVER want to place such a script on a production server, for the reasons mentioned above.
Generally speaking, you can't access remote source code. The PHP module would have to be disabled for this to occur.
But as a thought experiment, how might this happen?
Leaving aside wholesale exploits which get access to the entire filesystem, imagine if there were a security hole in an application which allowed you to insert a line into an .htaccess file. Given that an .htaccess writable by the httpd process is useful for apps like WordPress, it's not too outlandish a possibility.
If you added this:
php_flag engine off
The source files now become downloadable!
It is possible, if the server is not well configured, that PHP files are not handled as such.
Some examples:
Some servers are configured to show the highlighted source code of a PHP file when requested as .phps instead.
Some developers use .inc for files that are intended to be included using include or require. If the server is not configured to handle these as PHP as well, they will be delivered as plain text when they are requested directly.
But the developer can also be the source of a vulnerability, for example when a script for downloading files from the server accepts nearly any input without validation.
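For instance, a download script along these lines (a deliberately simplified, hypothetical example) will happily hand out PHP source:

<?php
// download.php?file=report.pdf -- naive version, vulnerable:
// a request for download.php?file=../config.php returns raw PHP source.
readfile('uploads/' . $_GET['file']);

A slightly safer version pins the request to one directory and refuses PHP files:

<?php
// download.php -- same idea with basic validation (still only a sketch)
$name = basename($_GET['file']);                 // strips any directory components
$path = __DIR__ . '/uploads/' . $name;

if (!is_file($path) || strtolower(pathinfo($path, PATHINFO_EXTENSION)) === 'php') {
    header('HTTP/1.1 404 Not Found');
    exit;
}
readfile($path);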
If the file is served from a web server that has PHP interpretation enabled (via HTTP), then it will be processed. The only way you'd receive the code unprocessed is if PHP was disabled somehow.
I have encountered a misconfigured web server in the past that had one virtual host properly set up to serve PHP files via the PHP interpreter. There was a second virtual host pointing at the same directory, but it didn't have PHP enabled. This meant things like the config.php for several apps were visible as plain text. As everyone knows, a typical config.php has database auth credentials and other things that shouldn't be known.
So, it is very important to understand your web server setup, and make sure you aren't doing something silly.