I have PHP code that requires other PHP files using $_SERVER['DOCUMENT_ROOT'].'/subdir/file.php';
First of all: is this the proper way to include things? I obviously don't want to use a relative path like '../../subdir/file.php', because moving the file would break it.
Another interesting issue is that if I run this file from the command line, $_SERVER['DOCUMENT_ROOT'] is not set. I can fake it with $_SERVER['DOCUMENT_ROOT'] = '.'; but I'm curious whether this is best practice. It seems like it isn't.
Edit: Obviously there are many ways to skin this cat, although I think the best practice is to define a variable (or a constant) responsible for the include directory. Such as:
define('INC_DIR', $_SERVER['DOCUMENT_ROOT'].'/../includes');
or
if (PHP_SAPI == 'cli') {
    $_includes = '../includes';
} else {
    $_includes = $_SERVER['DOCUMENT_ROOT'].'/../includes/';
}
And use the aforementioned variable or constant throughout the code.
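A minimal sketch combining the two approaches above, assuming an includes/ directory one level above the web root (adjust the path to your layout):

```php
<?php
// Pick the include directory once, based on how the script was
// started, and use the constant everywhere else.
if (PHP_SAPI === 'cli') {
    // No web server: anchor to this file's own location instead.
    define('INC_DIR', __DIR__ . '/../includes');
} else {
    define('INC_DIR', $_SERVER['DOCUMENT_ROOT'] . '/../includes');
}

// Anywhere else in the code base:
// require INC_DIR . '/file.php';
```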
I prefer to use a folder definition system in my architectures. Something like this:
define('DIR_ROOT', dirname(__FILE__)); // or simply __DIR__ on PHP 5.3+
That works both in command line and web mode. Use that in your application entry point (index.php in most cases) and then load the rest of your framework from that file outward. All inbound calls to your application should be routed via .htaccess or other method so that they call index.php?foo=bar etc.
I also hate typing DIRECTORY_SEPARATOR all the time so I usually make the first definition:
define( 'DS' , DIRECTORY_SEPARATOR );
This enables you later to do something like this:
require_once( DIR_ROOT.DS.'myfolder'.DS.'myfile.class.php' );
Alternatively, if you don't want or need to modify your PHP files and you just need a page to be executed normally, you could use cURL. Most Linux and Unix systems have it installed.
$ curl http://www.example.com/myscript.php &> /dev/null
The &> /dev/null part sends the output into a black hole so you don't have to see the HTML returned by the request.
if (PHP_SAPI == 'cli') {
    $_SERVER['DOCUMENT_ROOT'] = '/path/to/webroot';
}
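Along the same lines, a small bootstrap can stub out every $_SERVER entry the rest of the code expects before anything else runs; the paths and host name below are placeholders:

```php
<?php
// Fill in the web-only superglobal entries when running from cron/CLI,
// so code written for the web SAPI keeps working unchanged.
if (PHP_SAPI === 'cli') {
    $_SERVER['DOCUMENT_ROOT'] = '/var/www/example';  // placeholder path
    $_SERVER['HTTP_HOST']     = 'www.example.com';   // placeholder host
    $_SERVER['REQUEST_URI']   = '/';
}
```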
Related
header("Location: http://...");
The above line does not seem to work when executing PHP scripts from the command line. What is the best way to open links from the command line?
A hastily tested method is to use exec(), passing the path to a known browser with the URL as its argument; it seemed to work OK.
<?php
$url='https://www.google.co.uk';
$cmd=sprintf( '%%userprofile%%\AppData\Local\Google\Chrome\Application\chrome.exe %s', $url );
exec( $cmd );
?>
Thanks to @Álvaro's comment, the above can be simplified further (on Windows at least):
<?php
$url='https://www.google.co.uk';
$cmd=sprintf( 'start %s',$url );
exec( $cmd );
?>
The solution provided above only works on Windows; it would not work on macOS. Here is a more general solution:
public function open(string $url): void
{
    switch (PHP_OS) {
        case 'Darwin':
            $opener = 'open';
            break;
        case 'WINNT':
            $opener = 'start';
            break;
        default:
            $opener = 'xdg-open';
    }
    exec(sprintf('%s %s', $opener, $url));
}
header() is HTTP-related only: it tells the server what headers should be returned to the client browser that performed the request. Location, in particular, simply means "Hey! Check out this place instead: xxxxx".
The client's browser, in turn, decides by itself whether to follow this advice (it usually does), but at no time does the server fetch that location itself to re-serve it to its client.
So the best thing to do is to use your script through a web browser, as it is supposed to be used. If you want to "open links" from a command line, simply type your browser's executable name followed by a URL (e.g. firefox http://www.stackoverflow.com).
If what you want instead is to fetch files or specific pages from a remote web server, use a command-line client such as wget or curl.
I have a PHP script which is typically run as part of a bigger web application.
The script essentially makes some changes to a database and reports back to the web user on the status/outcome.
I have an opening section in my PHP:
require $_SERVER['DOCUMENT_ROOT'].'/security.php';
// Only level <=1 users should be able to access this page:
if ( $_SESSION['MySecurityLevel'] > 1 ) {
echo '<script type="text/javascript" language="JavaScript">window.location = \'/index.php\'</script>';
exit();
}
So, basically, if the authenticated web user's security level is not higher than 1, then they are just redirected to the web app's index.
The script works fine like this via web browsers.
Now to my issue...
I want to also cron-job this script - but I don't know how to bypass the security check if ran from the CLI.
If I simply run it from the CLI/cron with php -f /path/to/report.php and wrap the security check in if ( php_sapi_name() != 'cli' ), it spews out errors because of the many $_SERVER[] variables used in the script (there may be other complications, but this was the first error encountered).
If I run it using cURL, the php_sapi_name() check won't work, as the script is then just being served by Apache.
Please can anyone offer some assistance?
Thank you! :)
If you invoke the script through the CLI, some of the $_SERVER variables will be defined, but their values may not be what you expect: for instance, $_SERVER['DOCUMENT_ROOT'] will be empty, so your require will look for a file called security.php in the filesystem root. Other arrays such as $_SESSION will not be populated, as the CLI has no comparable concept.
You could get around these issues by manually defining the variables (see "Set $_SERVER variable when calling PHP from command line?"), but a cleaner approach is to extract the code that makes the database changes into a separate file that does not depend on any SAPI-specific variables being defined.
For instance your PHP script (let's call it index.php) could be modified like this:
require $_SERVER['DOCUMENT_ROOT'].'/security.php';
require $_SERVER['DOCUMENT_ROOT'].'/db_changes.php';
// Only level <=1 users should be able to access this page:
if ( $_SESSION['MySecurityLevel'] > 1 ) {
echo '<script type="text/javascript" language="JavaScript">window.location = \'/index.php\'</script>';
exit();
} else {
do_db_changes();
}
Then in the SAPI-agnostic db_changes.php you would have:
<?php
function do_db_changes() {
// Do the DB changes here...
}
?>
And finally you would have a file, outside the web root, which you can invoke from cron (say cron.php):
<?php
require("/absolute/path/to/db_changes.php");
do_db_changes();
?>
Like this you can continue using index.php for the web application and invoke cron.php from cron to achieve your desired results.
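For completeness, the cron side is then a single crontab entry pointing at cron.php; the schedule, PHP binary path, and log path below are examples only:

```
# m h dom mon dow  command  -- run the DB changes nightly at 02:30
30 2 * * * /usr/bin/php /absolute/path/to/cron.php >> /var/log/db_changes.log 2>&1
```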
Consider this:
#!/usr/bin/php
<?php
class Foo {
static function bar () {
echo "Foo->bar\n";
}
}
if (PHP_SAPI === 'cli') {
Foo::bar();
}
?>
I can execute this from CLI, but when I include it in, say, a CGI-run PHP script, the shebang ends up in the output.
I like to keep simple scripts compact: I guess I could put the class part in a separate "lib" file and have a simple wrapper for CLI use, but I'd like to keep it all in one place without having to worry about include paths etc.
Is this possible without ob_*-wrapping the include to capture the shebang (if this is even possible), or is it dumb to cram all of this into one file anyway? Alternatives/Thoughts/Best Practices welcome!
Edit: I'd like to put the script in my PATH, so I'd rather not call it via php file.php. See my comment to @misplacedme's answer.
It's actually easy.
Remove the shebang and when you run the script, run it as
php scriptname.php OR /path/to/php scriptname.php
instead of
./scriptname.php
Running php scriptname.php will look only in the current directory or in any directory in your PATH. If you absolutely have to run it that way, add the folder: export PATH=$PATH:/path/to/php/script/folder (in bash).
That will mess up includes unless you're using full paths within the script.
No matter what you do, you'll have to use full paths somewhere.
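One way to satisfy that last point without scattering absolute paths everywhere is to anchor includes to the script's own location; lib/helpers.php below is a hypothetical file used only for illustration:

```php
<?php
// __DIR__ is resolved at compile time to this file's directory, so it
// is stable no matter which working directory the script is run from.
$lib = __DIR__ . '/lib/helpers.php';  // hypothetical include

// require_once $lib;  // uncomment once the file actually exists
echo $lib, "\n";
```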
I'm rather late to this one, but if anyone still cares, you can solve this on Linux by registering a binfmt handler.
As a one-off (resets after reboot):
echo ":PHP:M::<?php::/usr/bin/php:" > /proc/sys/fs/binfmt_misc/register
With this in place, any file that starts with the "magic" string "<?php" will be executed by running it with /usr/bin/php.
You can make this registration permanent by saving the line to a file in /etc/binfmt.d.
You can remove the registration with:
echo -1 > /proc/sys/fs/binfmt_misc/PHP
I am attempting to write a PHP script that allows me to select a few files to download from a predetermined location. I'd like my script to pass an array to a PowerShell script that I'd written earlier and have the PowerShell script handle the downloading (basically the PHP file just needs to tell the PowerShell file what needs to be downloaded).
I've looked at a few options, and it seems that exec() is the command I should use for this (as I don't care about command-line output, I shouldn't need shell_exec()).
So far I've turned off safe mode to allow me to use this command. I should also note that the PHP file is run from a server, while the PowerShell files are located on a local machine.
A snippet of the code so far to handle the param passing looks like this:
if(isset($_POST['formSubmit']))
{
$choosePlugin = $_POST['wpPlugin'];
$chooseTheme = $_POST['wpTheme'];
if(isset($_POST['wpTheme']))
{
echo("<p>You selected: $chooseTheme</p>\n");
exec('powershell.exe C:\Wordpress Setup\setupThemes.ps1 $chooseTheme');
}
else
{
echo("<p>You did not select a theme</p>\n");
}
I am a bit confused as to what I should put inside the exec() call. When I run the above code there are no errors, but nothing happens. I am a bit new to this, so I apologize if more information is required. Any help is appreciated, thank you.
Try to do:
echo exec('powershell.exe C:\\Wordpress Setup\\setupThemes.ps1 $chooseTheme');
to see the results of powershell.exe (remember the double backslashes), and make sure to use the absolute path to the exe file:
echo exec('c:\\PATH_TO_POWERSHELL.EXE\\powershell.exe C:\\Wordpress Setup\\setupThemes.ps1 $chooseTheme');
If you want to pass the contents of the variable, you should use double quotes so it is actually expanded. Furthermore, you should quote the script name, because the path contains spaces:
exec("powershell.exe \"C:\Wordpress Setup\setupThemes.ps1\" $chooseTheme");
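A related hardening step (my own suggestion, not part of the question): build the command with escapeshellarg(), so the space in C:\Wordpress Setup and any user-supplied theme name are quoted safely. PowerShell's -File parameter is the documented way to run a script file; note that escapeshellarg() quotes with double quotes on Windows builds of PHP and single quotes elsewhere. The fallback theme name is a placeholder.

```php
<?php
// Quote both the script path (which contains a space) and the
// user-supplied argument before handing them to the shell.
$script = 'C:\\Wordpress Setup\\setupThemes.ps1';
$theme  = isset($_POST['wpTheme']) ? $_POST['wpTheme'] : 'twentytwelve';

$cmd = 'powershell.exe -File ' . escapeshellarg($script)
     . ' ' . escapeshellarg($theme);

// exec($cmd, $output, $exitCode);
echo $cmd, "\n";
```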
I have a script that inserts into the database e.g. 20,000 users with email addresses, in batches of 1000
(so two tables, emailParent and emailChild; there are 1000 rows in emailChild for every row in emailParent).
I want to run a script that sends these emails which basically says
//check_for_pending_parent_rows() returns the id of the first pending row found, or 0
while($parentId = check_for_pending_parent_row()){//loop over children of parent row}
Now because this is talking to the sendgrid servers this can take some time.
So I want to be able to hit a page and have that page launch a background process which sends the emails to sendgrid.
I thought I could use exec(), but then I realized I am using CodeIgniter, which means the entry point must be index.php; hence I don't think exec() will work.
How can I launch a background process that uses code igniter?
This is not really an answer, just something that is too long to post as a comment.
@Frank Farmer: 70 lines seems a bit excessive; this example from a simple test does it in pretty much half that. What is the difference?
<?php
//---------------------------
//define required constants
//---------------------------
define('ROOT', dirname(__FILE__) . '/');
define('APPLICATION', ROOT . 'application/');
define('APPINDEX', ROOT . 'index.php');
//---------------------------
//check if required paths are valid
//---------------------------
$global_array = array(
"ROOT" => ROOT,
"APPLICATION" => APPLICATION,
"APPINDEX" => APPINDEX);
foreach ($global_array as $global_name => $dir_check):
if (!file_exists($dir_check)) {
echo "Cannot Find " . $global_name . " File / Directory: " . $dir_check;
exit;
}
endforeach;
//---------------------------
//load in code igniter
//---------------------------
//Capture CodeIgniter output, discard and load system into $ci variable
ob_start();
include (APPINDEX);
$ci = &get_instance();
ob_end_clean();
//do stuff here
Use exec() to run a vanilla CLI PHP script that calls the page via cURL.
See http://php.net/manual/en/book.curl.php for info on cURL.
This is what I have had to do with some of my CodeIgniter applications. (Also make sure you set the timeout to 0.)
Doing it this way, you are still able to debug it in the browser.
Petah suggested cURL, but since 2.0, CodeIgniter has permitted calls to your controllers through the CLI:
This should be easier than cURL.