Emulating PHP Include Statement Using Code From HTTP Requests - php

I have a primary "driver" script written in PHP, and based on certain criteria, I'd like this script to pull code from one or more supporting servers via HTTP in order to load functions into memory as though I had done so using the PHP include statement.
Is it possible to use HTTP requests in PHP to pull data that is actually interpreted as PHP code when it is returned?
For example, suppose I used cURL or a web service to return the following text that was stored in a variable, say $URLResponse:
function userKeyGet() {
return (isset($_SESSION['user_key']) ? $_SESSION['user_key'] : '-1');
}
If I then used something such as eval($URLResponse), would that create the function for use during the current execution of the calling PHP script? I've used cURL and Buzz to return JSON or similarly structured data that I've converted into an array, but not a function or class. Is this possible? Thanks.

You can load remote PHP code with the include(), include_once(), require() and require_once() functions; this requires allow_url_include to be enabled in php.ini.
require_once("http://www.yourserver.com/function.php");
The included file must be returned as raw PHP source, not interpreted by the remote server as an executable script. If the remote host runs a PHP-enabled web server, you may need to give the remote file another extension so that its source is served verbatim.
eval() will work too. Note that if you eval/include the same function declaration twice, PHP raises a fatal error because the function is already declared. You can use an object method or an anonymous function to get around this and effectively override the function.
$code="function userKeyGet() {
return (isset(\$_SESSION['user_key']) ? \$_SESSION['user_key'] : '-1');
}";
eval($code);
eval($code); # Second time using eval on $code with "Fatal error: Cannot redeclare"
# This will work
echo userKeyGet();
# An example for anonymous function way
$code="\$userKeyGet = function() {
return (isset(\$_SESSION['user_key']) ? \$_SESSION['user_key'] : '-1');
};";
eval($code); # this wont raise redeclare error
echo $userKeyGet();
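For the cURL route mentioned in the question, a minimal sketch might look like the following; the URL, the expectation that the endpoint returns raw PHP source without a <?php tag, and the function_exists() guard are assumptions for illustration, not part of the original answer:
$ch = curl_init("http://www.yourserver.com/function.php.txt");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$URLResponse = curl_exec($ch);
curl_close($ch);
if ($URLResponse !== false && !function_exists('userKeyGet')) {
    eval($URLResponse); # defines userKeyGet() in the current script
}
echo userKeyGet();
Whichever route you take, only do this with code served by a host you fully control, since whatever that endpoint returns is executed as-is.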

Related

PHP clean_all_processes() giving 500 internal error, but still outputs

So my PHP file is meant to check whether a variable is null; if so, it should echo an output and stop there.
Here is that code:
if (is_null($ip)) {
    echo "IP is not valid";
    clean_all_processes();
}
So when I try to test this script using the Insomnia REST client, it outputs "IP is not valid" but also gives a "500 Internal Server Error".
In my error_log file it spits out this every time
Uncaught Error: Call to undefined function clean_all_processes()
Note: I am using PHP 7.3
There is no such function called clean_all_processes() in PHP. The answer you linked to used it as an example name of a function you could call.
If you want a hard stop of your script use die(). This is not recommended! You should structure your code in such a way that you should almost never need to use this approach.
There is no way to break out of an if statement, because such a thing makes no sense. An if statement is already a condition: you either execute its body or you don't.
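A minimal sketch of both options, assuming the check lives inside its own request-handling function (the handleRequest() name is illustrative):
# Option 1: hard stop with exit/die - works, but not recommended.
if (is_null($ip)) {
    echo "IP is not valid";
    exit;
}
# Option 2: structure the code so the check is an early return.
function handleRequest($ip) {
    if (is_null($ip)) {
        echo "IP is not valid";
        return; # nothing below runs for an invalid IP
    }
    # ... normal processing ...
}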

Calling Perl with parameters from PHP: exec(), virtual(), etc

I am trying to retrieve, from a PHP page on Apache, the output of a Perl CGI script that takes parameters. If my PHP is
<? echo exec('../cgi-bin/test.cgi'); ?>
then I get the correct output (but I can't use the parameters). However, if my PHP is
<? echo exec('../cgi-bin/test.cgi?m=1'); ?>
then I get no output. When I use virtual()
<? echo virtual('../cgi-bin/test.cgi?m=1'); ?>
I get a "Call to undefined function virtual()" error.
My Perl script is getting the parameters with
my $co = new CGI;
my $mobile = $co->param('m') || 0;
I can't run the script from the command line because the shared hosting provider set the server that way.
I don't know if this answer will help anyone else, because it isn't technically calling the Perl script via CGI as I understand it, but greg_diesel led me to call the Perl script from PHP with multiple parameters:
exec('../cgi-bin/test.cgi "1" "2"');
and access the parameters in Perl with
$ARGV[0];
$ARGV[1];
etc. instead of CGI.
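If the parameter values come from user input, it is worth escaping them before building the command line; a small sketch, where reading the value from $_GET['m'] in the PHP page is an illustrative assumption:
$mobile = isset($_GET['m']) ? $_GET['m'] : '0';
$cmd = '../cgi-bin/test.cgi ' . escapeshellarg($mobile) . ' ' . escapeshellarg('2');
echo exec($cmd); # the Perl side still reads the values from @ARGV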
One thing to consider, since this is a server-side request, is that the Perl script you submit parameters to may only initiate the work, while the returned data may come from a different Perl script. A connector/link between the two might be some form of unique identifier:
'../cgi-bin/test.cgi'
'../cgi-bin/run_test.cgi'

Calling included remote functions and variables

I'm trying to include a remote file from one of my LAN PCs using include, with allow_url_fopen = On and allow_url_include = On.
On one LAN PC (let's say pc2), I have remote.php, which contains:
<?php
echo $var_on_pc1; // this doesn't output
$remote_var = 'Var on pc2';
function square($num) {
    return $num * $num;
}
?>
In my PC (let's say pc1), I have test.php, which consists of this:
<?php
$var_on_pc1 = 'Var on pc1';
include "http://pc2/path/to/remote.php";
echo $remote_var; // this doesn't output
echo square(4); // this got error
?>
When I run the script test.php, I get this error:
"Fatal error: Call to undefined function: square() in
path/to/test.php on line 7.
What happened? I thought I could call the included functions and variables and vice versa?
If I cannot implement this, what is the best way?
I have no security concern because I use this locally for temporary development.
Type http://pc2/path/to/remote.php into your browser and see what you get. PHP gets exactly the same.
If the PHP file is being processed by the web server on pc2, you likely get next to nothing from that URL, because the code has already been executed there. You'd need to configure the other server not to process the PHP file and to serve its raw source code instead.
This is not a good idea overall.
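If you do go that route for temporary development, a minimal sketch might be to expose the source on pc2 under a non-PHP extension so Apache returns it verbatim (the remote.php.txt name is an assumption) and include that instead; the included code then runs locally, in the calling scope:
<?php
$var_on_pc1 = 'Var on pc1';
include "http://pc2/path/to/remote.php.txt"; // raw source, executed on pc1
echo $remote_var;   // 'Var on pc2'
echo square(4);     // 16
?>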

Calling PHP function with arguments remotely

I have a scenario wherein our PHP file accepts parameters from the command line.
For example we say,
php test.php 'hello'
The above runs at the command prompt. Now suppose we want to invoke it from the client end. Of course, I do not want a system() call, as that would be bad design; I just want to directly call the function that accepts parameters from the client end. The client could be anything, maybe .NET or PHP, so how can I accomplish that?
Thanks!
Put your script on a web server and accept the argument via HTTP GET or POST.
It would look something like http://hostname/test.php?argument=hello. Then, in your test.php script, pass $_GET['argument'] to your function:
myfunction($_GET['argument']);
Don't forget to sanitize the input!
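A minimal sketch of such a test.php, assuming the function it wraps is named myfunction() (both the function name and the output it produces are illustrative):
<?php
// test.php - invoked as http://hostname/test.php?argument=hello
function myfunction($argument) {
    echo "You passed: " . htmlspecialchars($argument);
}

$argument = isset($_GET['argument']) ? $_GET['argument'] : '';
myfunction($argument);
?>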
You may use the command line arguments to dispatch to a function:
$param_1 = isset($argv[1]) ? $argv[1] : null;

if ($param_1 == 'function1') {
    function1();
} elseif ($param_1 == 'function2') {
    function2();
}
// ... and so on.
You can make a wrapper script that puts GET parameters from your client into the command line argument array.
Let's say your client makes a request like:
hostname/testWrapper.php?params[]=hello&params[]=goodbye
Your wrapper script testWrapper.php could then look like
<?php
$params = $_GET['params'];
foreach ($params as $i => $param) {
    $_SERVER['argv'][$i + 1] = $param;
}
$_SERVER['argc'] = count($_GET['params']) + 1;
include('test.php');
?>
This assumes that your test.php uses $_SERVER['argv'] and $_SERVER['argc'] to read command line arguments. It may use $argv and $argc instead (if register_argc_argv is enabled in your php.ini), in which case you just use those in your wrapper script instead of $_SERVER[...].
Notice that we have to insert the parameters with an offset of 1 (i.e. $params[0] becomes $_SERVER['argv'][1]). This is because, when the script is called from the command line, the first element $_SERVER['argv'][0] is the script name.
Lastly, unless you are absolutely sure that your test.php sanitizes the parameters, you have to do it in the wrapper script.
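A small illustration of that last point, assuming the parameters are only meant to be simple alphanumeric tokens (the whitelist regex is an assumption about what test.php expects):
$params = isset($_GET['params']) ? (array) $_GET['params'] : array();
foreach ($params as $param) {
    if (!preg_match('/^[A-Za-z0-9_-]+$/', $param)) {
        http_response_code(400);
        die('Invalid parameter');
    }
}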

Fatal error php

Is there a way to make the code continue (not exit) when you get a fatal error in PHP?
For example, I get a timeout fatal error, and whenever that happens I want to skip that task and continue with the others.
In this case the script exits.
There is a hack using output buffering that will let you log certain fatal errors, but there's no way to continue a script after a fatal error occurs - that's what makes it fatal!
If your script is timing out you can use set_time_limit() to give it more time to execute.
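A quick sketch of that second point; the 300-second value and the $tasks/process() names are placeholders for illustration:
set_time_limit(300); // allow this script up to 300 seconds instead of the default
foreach ($tasks as $task) {
    set_time_limit(300); // optionally reset the timer before each long-running task
    process($task);
}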
"Fatal Error", as it's name indicates, is Fatal : it stop the execution of the script / program.
If you are using PHP to generate web pages and get a fatal error related to max_execution_time, which by default equals 30 seconds, you are almost certainly doing something that really takes too much time: users probably won't wait that long for the page.
If you are using PHP to do some heavy calculations outside a webpage (via the CLI, a cron job, or the like), you can set another (greater) value for max_execution_time.
You have two ways of doing that:
The first is to modify php.ini to set this value (it's already in the file; just edit the property's value). The problem is that this also changes it for the web server, which is bad (this limit is a safety measure, after all).
A better way is to create a copy of php.ini called, for instance, phpcli.ini, and modify that file. Then use it when invoking php:
php -c phpcli.ini myscript.php
This works great if you have many properties to configure for CLI execution (like memory_limit, which often has to be set to a higher value for long-running batches).
The other way is to define a different value for max_execution_time when you invoke php, like this:
php -d max_execution_time=60 myscript.php
This is great if you launch this via the crontab, for instance.
It depends on the exact error type. You can catch errors by creating your own error handler. See the documentation on set_error_handler(), but note that not all types of errors can be caught. Look at the timeout error you get and see what type it is. If it is one of E_ERROR, E_PARSE, E_CORE_ERROR, E_CORE_WARNING, E_COMPILE_ERROR or E_COMPILE_WARNING, then you cannot catch it with an error handler. If it is another type, then you can: catch it with the error handler and simply return.
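A minimal sketch of such a handler, on the assumption that the error turns out to be a catchable type such as E_WARNING (the handler name is illustrative):
function myErrorHandler($errno, $errstr, $errfile, $errline) {
    // Log the error and tell PHP not to run its internal handler,
    // so execution continues after catchable errors.
    error_log("Caught error [$errno] $errstr in $errfile:$errline");
    return true;
}
set_error_handler('myErrorHandler');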
If you have a suitable PHP version (PHP>=5.2 for error_get_last) you can try the technique described here which uses register_shutdown_function and error_get_last.
This won't allow you to "continue" when you get a fatal error, but it at least allows you to log the error (and perhaps send a warning email) before displaying a custom error page to the user.
It works something like this:
function fatalErrorHandler()
{
    $lastError = error_get_last();
    if (isset($lastError["type"]) && $lastError["type"] == E_ERROR) {
        // do something with the fatal error
    }
}
...
register_shutdown_function('fatalErrorHandler');
A few points:
you can use ob_clean() to remove any content that was generated prior to the fatal error (see the sketch after this list).
it's a really bad idea to do anything too intensive in the shutdown handler; this technique is about graceful failure rather than recovery.
whatever you do, don't try to log the error to a database ... what if it was a database timeout that caused the fatal error?
for some reason I've had problems getting this technique to work 100% of the time when developing in Windows using WAMP.
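Putting those points together, a sketch of the output-buffering side might look like this; the error_page.html include is a hypothetical custom error page, not something from the original answer:
ob_start(); // buffer all output so it can be discarded if a fatal error occurs

function fatalErrorHandler()
{
    $lastError = error_get_last();
    if (isset($lastError["type"]) && $lastError["type"] == E_ERROR) {
        ob_clean(); // drop whatever was generated before the fatal error
        error_log($lastError["message"]); // log it somewhere safe (not a database!)
        include 'error_page.html'; // hypothetical custom error page
    }
}

register_shutdown_function('fatalErrorHandler');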
The simplest answer I can give you is this function: http://php.net/manual/en/function.pcntl-fork.php
In more detail, what you can do is:
Fork the part of the process you think might or might not cause a fatal error (i.e. the bulk of your code)
The script that forks the process should be a very simple script.
For example this is something that I would do with a job queue that I have:
<?php
// ... load stuff regarding the job queue
while ($job = $queue->getJob()) {
    $pid = pcntl_fork();
    switch ($pid) {
        case -1:
            echo "Fork failed";
            break;
        case 0:
            // do your stuff here
            echo "Child finished working";
            exit(0); // the child must exit, or it would keep looping and fork again
        default:
            echo "Waiting for child...";
            pcntl_wait($status);
            // check the status using other pcntl* functions if you want
            break;
    }
}
Is there a way, then, to limit the execution time of a function but not the whole script?
For example
function blabla()
{
    return "yes";
}
to make it so that if it hasn't finished within 25 seconds it returns "no"?
