I just wanted to keep all my code libraries (PHP classes, e.g. http://libraries.com/form.php) on a single server for easy maintenance and availability. Wherever I need to use a library, I'd just include it in my code. But I know enabling remote URL includes isn't safe at all, so I found a workaround.
I'd just use eval(file_get_contents('http://libraries.com/form.txt')). I use .txt instead of .php so I get the PHP code as it is, not the empty output the server returns after the PHP is processed.
This works: I get my PHP library/class and can use it from a remote location. But I don't know whether it is safe. What are the pros and cons of this approach, and what other way would you suggest to achieve this safely?
This:
Has all the security downsides of including remote files
Is massively inefficient due to all the extra HTTP requests
Means that a new release of a library gets deployed without being tested against the rest of the code in an application
Adds an extra point of failure for the application
Don't do this. It is a terrible idea.
Installation of dependencies should be a feature of your install script, not the application itself.
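For example, a dependency manager such as Composer can do this at install time. A minimal sketch, assuming the library is published as a package (the package and class names below are hypothetical):

<?php
// After running "composer require acme/form-library" as part of deployment,
// the application only ever loads code that is already on the local disk:
require_once __DIR__ . '/vendor/autoload.php';

$form = new \Acme\FormLibrary\Form(); // class shipped by the installed package

New releases of the library are then pulled in by the install/deploy step, where they can be tested against the rest of the application before going live.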
Related
I know that PHP's include/require statements can pull other .php files into the script, either from a local path or a URL.
Today I tried to include (and also to require) a .ddf file (a text file), and it worked with no errors or warnings. PHP then actually executed some code that was in that file!
After that I went to PHP's documentation for include to see whether including non-PHP files is fully supported and safe. It turns out the documentation barely mentions the procedure (include 'file.txt'; // Works.) and that's it.
So I'm asking you: is including non-PHP files safe? And is it bad practice?
I just want to say that it is completely unsafe. Yes, as long as you trust the page, you technically could do this. But the page, when pulled up directly in the browser, isn't parsed as PHP: anyone who goes directly to the file on the web server, whether by guessing, because you used a known framework, or because they simply know some file names, would see the complete source of the file, exposing your site and possibly leaking sensitive information like database credentials.

Another thing to think about: people are usually pretty good about not allowing *.php files to be uploaded to their site, but imagine you are allowing other files to be included and someone uploads a text file named "someImage.jpg" with a PHP script in it, and for some dumb reason you include it. People now have a way to execute scripts on your server, likely including calling shell commands (exec).

It used to be common practice to use *.inc files to mark includes, but that has been considered bad for quite a long time.
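To make the attack concrete, here is a sketch of that dangerous pattern (the file and parameter names are hypothetical):

<?php
// An attacker uploads "someImage.jpg" whose contents are actually PHP, e.g.
//   <?php system($_GET['cmd']);
// (no closing tag needed). If the application then includes files it does
// not control:
$page = $_GET['page'];   // e.g. "uploads/someImage.jpg"
include $page;           // the PHP inside the "image" now runs on your server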
It is not advisable to include .txt files in PHP scripts. If you only need the file's contents, use file_get_contents instead.
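A minimal sketch of the difference (the file name is hypothetical): include executes any PHP embedded in the file, while file_get_contents just returns its bytes as a string.

<?php
include 'notes.txt';                    // any <?php code in notes.txt is executed

$text = file_get_contents('notes.txt'); // read only; nothing is executed
echo htmlspecialchars($text);           // safe to display as plain text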
I have many websites installed on the same web server. What I want to do is be able to include the same file from different websites, like this:
<?php include '/home/site/www/path/to/file.php'; ?>
and at the same time block functions like highlight_file and file, so that the following code won't display my file's contents:
<?php echo highlight_file('/home/site/www/path/to/file.php'); ?>
Any help will be appreciated.
If you want your PHP files to be runnable but safe from being read, your best option is to encode them.
Take a look at ionCube PHP Encoder and Zend Guard; they are both very popular options for protecting source code.
Blocking PHP functions can work, but you'll never be safe, because you can forget functions (can you really list them all? What if there's one you actually need?), and new functions could be added in the future; if you don't block those as well, you'd be exposed.
...so that the following code won't display my file's contents
Does that mean you want to allow other people to deploy code on the server which calls your code without revealing the PHP source? If so, then disabling highlight_file isn't going to help much. You also need to disable include, require, fopen, file_get_contents, the imap extension and several other things - which means they won't be able to access your code at all.
If you're letting other people whom you don't necessarily trust deploy code on your server then there are lots of things you need to do to isolate each account - it's not a trivial exercise and well beyond the scope of an answer here. But it's not really possible to allow access to a shared include file without providing access to the source code. Using encoded PHP solves some problems but introduces others. A better solution is to expose the functionality via a web or socket API (this solves the sharing problem but not the isolation problem).
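As a sketch of the web API idea, assuming a hypothetical endpoint form-api.php and a hypothetical renderForm() helper on the server that owns the shared code:

<?php
// form-api.php, running on the library owner's server:
header('Content-Type: application/json');
$section = $_GET['section'] ?? 'default';
echo json_encode(['html' => renderForm($section)]); // renderForm() is hypothetical

A consuming site then calls the endpoint instead of including the source:

<?php
$json = file_get_contents('https://libraries.example/form-api.php?section=contact');
$data = json_decode($json, true);
echo $data['html'];

The callers get the functionality but never see the PHP source.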
Assume we have Linux + Apache + PHP installed with all default settings. I have a PHP website that uses a large third-party PHP library, say 1 MB of PHP source. The library is used very rarely, say for POST requests only, and there is a reason why I can't move its usage into a separate PHP file. So I have to include the library on every HTTP request but use it only rarely. Should I be concerned about the time spent on PHP parsing in that case? Let me explain. I could do this:
<?php
require_once('heavy_library.php');
// do regular stuff
if ($need_heavy_library) // pseudocode: the rare case that needs the library
{
heavy_library_function();
}
?>
I assume that this solution is bad because in this case heavy_library.php is parsed for each HTTP request. I can move it into the if statement:
<?php
// do regular stuff
if ($need_heavy_library) // pseudocode: the rare case that needs the library
{
require_once('heavy_library.php');
heavy_library_function();
}
?>
Now, as I understand it, the library is parsed only when we actually need it.
Now, back to the question. With default settings for Apache and PHP, should I be concerned about this issue? Should I move the require_once to the place where it is really used, or can I leave it as usual, with Apache/PHP doing some kind of caching that prevents parsing on each HTTP request?
No, Apache will not do the caching. You should keep the require_once inside the if so it is only used when you need it.
If you do want caching of PHP, then look at something like eAccelerator.
When you tell PHP to require() something, it will do it no matter what; the only thing that prevents parsing that file from scratch every time is to use an opcode cache such as APC.
Conditionally loading the file would be preferred in this case. If you're worried about making life more complicated by having these conditions, perform a small benchmark.
You could also use autoloading to load files "on demand" automatically; see spl_autoload.
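A minimal autoloading sketch using spl_autoload_register (the directory layout and class name here are assumptions):

<?php
spl_autoload_register(function ($class) {
    $file = __DIR__ . '/lib/' . $class . '.php'; // e.g. lib/HeavyLibrary.php
    if (is_file($file)) {
        require_once $file; // parsed only when the class is first referenced
    }
});

// The heavy library is never parsed until the class is actually used:
if ($need_heavy_library) {
    $lib = new HeavyLibrary(); // triggers the autoloader on first use
}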
I have a series of web sites all hosted on the same server with different domains. I want to host some common PHP scripts and then be able to call these from the other domains.
I'm a bit fresh with my PHP, so please excuse the code attempts. I have tried iterations of the following, which may help you understand what I am aiming for!
From within PHP tags...
include("http://www.mydomain/common_include.php?show_section=$section");
$show_section = $_GET['show_section'];
include('http://www.mydomain/common_include.php'); // Then $show_section would be available to the included file/code
Finally, I have tried pulling in an include that contains a function and then trying to run that function from the parent script.
I would much prefer to keep this PHP-oriented rather than getting involved with the server (file systems etc.), though I can change permissions etc.
I could, but would prefer not to, just upload the same library to each of the domains separately.
I understand PHP is run on the server, hence it may be problematic to include scripts from another server.
Thanks in advance.
EDIT
OK OK - I get that it's bad practice so I will not do it... THANKS VERY MUCH FOR THE QUICK ANSWERS.
However, are there any other recommendations for how to essentially show this basic PHP app on all of the sites without having to add the files to the root of each site? Just to prevent massive script duplication... (thinking out loud: call the scripts in from a DB, or any other solutions?)
Again thanks for your assistance
It would be a huge security risk if you could just include remote PHP files in your own projects. The PHP gets parsed before the server sends it to you, so cross-domain includes would only contain the output the script generates. The only way to include PHP files so that they can be executed is via the local filesystem.
If you look at PHP.net's documentation about include, you can find this:
If "URL fopen wrappers" are enabled in PHP (which they are in the default configuration), you can specify the file to be included using a URL (via HTTP or other supported wrapper - see List of Supported Protocols/Wrappers for a list of protocols) instead of a local pathname. If the target server interprets the target file as PHP code, variables may be passed to the included file using a URL request string as used with HTTP GET. This is not strictly speaking the same thing as including the file and having it inherit the parent file's variable scope; the script is actually being run on the remote server and the result is then being included into the local script.
Which pretty much explains the whole thing.
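A small sketch of what that means in practice (the URL is hypothetical). Suppose remote.php on the other server contains nothing but <?php echo 'hello';

<?php
include 'http://example.com/remote.php'; // needs allow_url_include enabled
// The remote server runs remote.php and sends back only its output, "hello".
// None of remote.php's functions, classes or variables exist locally; you
// would only receive source code if the remote server served it as plain text.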
The root of the original question seemed to be the poster's concern about using a PHP script or plugin on multiple sites and then facing an onerous task each time it needs to be updated. While trying to include PHP files across sites is a bad idea, a better plan is to structure your script to be as self-contained as possible: keep the entire plugin in one directory, and ensure the function calls that use it are as well formed as possible - clean, well-named functions, uniform naming conventions, and a well-thought-out plan for what parameters each function needs. Avoid using global variables.
Ideally you should then have quite an easy time each time you need to update the plugin/script in all locations. You can even set up an automated process that will upload the new directory containing the plugin to each site replacing the old one. And the function calls within your code should rarely if ever change.
If your script is big enough you might implement an automatic update process like the more recent versions of Wordpress use. Click a button and it updates itself. In the past, updating a dozen sites running Wordpress (as an example) was a massive pain.
That is very bad practice.
Actually, you're not including PHP but just the HTML output it produces.
Include files, not URLs. That is possible for sites on the same server.
Just use absolute paths to these files.
Apart from the fact that it's bad practice, if you really want to do that you should first check whether include allows URLs at all.
If, however, all the sites that need the script are on the same server, you could put the script in a directory accessible by the user that executes PHP and add that directory to the include_path property in php.ini (this can also be done at runtime; see the sketch below).
(Or you could create a PHP extension and load it as an extension.)
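A minimal sketch of the runtime variant (the shared directory path is hypothetical):

<?php
// Add a shared library directory to the include path for this request:
set_include_path(get_include_path() . PATH_SEPARATOR . '/home/shared/php-lib');

// Every site on the server can now use a plain include:
include 'common_include.php'; // resolved via the include_path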
If you have root rights on that server, you could just use an absolute path from the filesystem root, but most hosts won't let you do this.
I am writing a small web server, nothing fancy; I basically just want to be able to serve some files. I would like to use PHP though, and I'm wondering if just putting the PHP code inside the HTML will be fine, or if I need to actually use some type of PHP library?
http://www.adp-gmbh.ch/win/misc/webserver.html
I just downloaded that and I am going to use it to work off of. Basically, I am writing a server-side game plugin that will allow game server owners to access a web control panel for their server. Some features would be possible with PHP, so this is my goal. Any help would be appreciated, thanks!
The PHP won't serve itself. What happens in a web server like Apache is that, before the PHP is served to the user, it is passed through a PHP parser. That PHP parser reads, understands, and executes anything between <?php and ?> tags (or even <? and ?>, depending on configuration). The resultant output, usually still HTML, is served by the web server.
There are a number of ways to achieve this. Modules to process PHP have been written for Apache, but you do not have to use these. php.exe on Windows, installed from windows.php.net, will do this for you: given a PHP file as an argument, it will parse the PHP and spit the result back out on standard output.
So, one option for you is to start php.exe from within your web server with its standard output redirected to your program, and serve the result.
How to create a child process with redirected I/O: http://msdn.microsoft.com/en-us/library/ms682499%28VS.85%29.aspx (however, you won't be writing the child process; that'll be php.exe).
Caveat: I am not sure from a security / in production use perspective if this is the most secure approach, but it would work.
PHP needs to be processed by the PHP runtime. I'm assuming the case you're talking about is that you have a C++ server answering HTTP queries, and you want to write PHP code out with the HTML when you respond to clients.
I'm not aware of any general-purpose PHP library. The most straightforward solution is probably to use PHP as a CGI program.
Here's a link that might be useful for that: http://osdir.com/ml/php-general/2009-06/msg00473.html
This method is nice because you don't need to write the HTML+PHP out to a file first; you can stream it to PHP.
You need to execute the PHP script to serve the page it generates.
The easiest thing for you to do would be to add CGI support to your webserver in some basic form. This is non-trivial, but not too difficult. Basically you need to pass PHP an environment and input, and retrieve the output.
Once you have CGI support you can just use any executable, including PHP, to generate webpages.