Define variable in command line - php

OK, the question is simple though I can't find a real working solution.
I want to be able to define something while invoking a specific script.
I have tried php -d DEBUG_ON myscript.php, but it's not working (when testing with if (defined("DEBUG_ON")) { } inside the script, it returns false).
I also tried something along the lines of php -r 'define("DEBUG_ON", 1);' myscript.php, which doesn't work either.
So, any ideas? (Or any suggestions on how I could achieve the very same effect?)

Use $argv to pass arguments from command line. Then define them accordingly in the beginning of the script.
if (php_sapi_name() === 'cli' && isset($argv[1])) {
    define('DEBUG_ON', $argv[1]);
}
If you put this code at the beginning of your script, it will define DEBUG_ON to whatever you pass as an argument on the command line: php myscript.php arg1
You can also define($argv[1], $argv[2]); and then use php myscript.php DEBUG_ON 1 to define DEBUG_ON as 1.
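A minimal sketch of this approach (the myscript.php name and the false default are assumptions, not from the original answer): derive DEBUG_ON from the first argument and fall back to false when none is given.

```php
<?php
// myscript.php -- sketch: derive DEBUG_ON from the first CLI argument,
// defaulting to false when the script is run without arguments.
$debug = isset($argv[1]) ? (bool) $argv[1] : false;
define('DEBUG_ON', $debug);

if (DEBUG_ON) {
    echo "Debugging enabled\n";
}
```

Calling php myscript.php 1 turns debugging on; a bare php myscript.php leaves it off.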

Related

Calling Perl script with PHP, passing variables and put result into a file

I am close to losing my mind because of a Perl script I want to call via PHP.
I have a PHP form where I enter a MySQL query, which gets stored in a file, and choose some variables.
If I call any shell command like "top", everything works fine, but as soon as I try to call my Perl script with variables, there are no results at all.
The file where the results should get stored stays empty.
That's the calling part from the PHP File:
if (isset($_POST['submit'])) {
    $file = 'query.sql';
    $query = $_POST['query'];
    file_put_contents($file, $query);
    $command = "perl /home/www/host/html/cgi/remote-board-exec.pl -sqlfileexec query.sql > /home/www/host/html/cgi/passthrutest.txt";
    exec($command, $ausgabe, $return_var);
}
There is no error message, and I have already tried debugging, but nothing helped :(
Are you sure that perl is being executed? Perhaps you ought to run 'which perl' just to make sure, and use the full path to Perl. A few other ideas:
make sure your Perl script is executable (use chmod)
ensure its first line is '#!/usr/bin/perl' (or wherever your path to perl is)
change the command to "/home/www/host/cgi/remote-board-exec.pl..." without the leading perl
dump the contents of your output array ($ausgabe); if the command fails to execute, you may find out what is happening.
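As a debugging sketch of that last point (paths copied from the question; 2>&1 merges stderr into the captured output so a Perl failure is no longer silent):

```php
<?php
// Sketch: drop the file redirection while debugging and merge stderr into
// stdout, so any Perl error ends up in $ausgabe instead of vanishing.
$command = 'perl /home/www/host/html/cgi/remote-board-exec.pl -sqlfileexec query.sql 2>&1';
exec($command, $ausgabe, $return_var);
echo "exit code: $return_var\n";
print_r($ausgabe);
```

A non-zero exit code (e.g. 127 for "command not found") narrows the problem down quickly.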

make a PHP script executable from CLI and include-able?

Consider this:
#!/usr/bin/php
<?php
class Foo {
    static function bar() {
        echo "Foo->bar\n";
    }
}

if (PHP_SAPI === 'cli') {
    Foo::bar();
}
?>
I can execute this from CLI, but when I include it in, say, a CGI-run PHP script, the shebang ends up in the output.
I like to keep simple scripts compact: I guess I could put the class part in a separate "lib" file and have a simple wrapper for CLI use. BUT I'd like to keep it all in one place without having to worry about include paths etc.
Is this possible without ob_*-wrapping the include to capture the shebang (if this is even possible), or is it dumb to cram all of this into one file anyway? Alternatives/Thoughts/Best Practices welcome!
Edit: I'd like to put the script in my PATH, so I'd rather not call it via php file.php. See my comment to @misplacedme's answer
It's actually easy.
Remove the shebang and when you run the script, run it as
php scriptname.php OR /path/to/php scriptname.php
instead of
./scriptname.php
Running php scriptname.php will look for the script only in the current directory. If you absolutely have to run it from elsewhere, add the script's folder to your PATH: export PATH=$PATH:/path/to/php/script/folder (in bash)
That will mess up includes unless you're using full paths within the script.
No matter what you do, you'll have to use full paths somewhere.
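One way to keep those full paths manageable is to anchor includes to the script's own location with __DIR__, so they resolve the same way from any working directory (lib.php is a hypothetical file name):

```php
<?php
// Sketch: build include paths from __DIR__ so they resolve identically
// whether the script is run directly or included from elsewhere.
$libPath = __DIR__ . '/lib.php'; // hypothetical library file
// require $libPath;
echo $libPath, "\n";
```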
I'm rather late to this one, but if anyone still cares, you can solve this on Linux by registering a binfmt handler.
As a one-off (resets after reboot):
echo ":PHP:M::<?php::/usr/bin/php:" > /proc/sys/fs/binfmt_misc/register
With this in place, any file that starts with the "magic" string "<?php" will be executed by running it with /usr/bin/php.
You can make this registration permanent by saving the line to a file in /etc/binfmt.d
You can remove the registration with:
echo -1 > /proc/sys/fs/binfmt_misc/PHP

how to create a php script so that it can take the same argument from another php script and from the command line

I am creating a PHP script which takes arguments from the command line right now, but at some point this may change to simply including my PHP script.
How can I prepare my script for both scenarios?
Can I create a PHP script that can take arguments both from the command line and from another script which simply includes it?
Let's say script A is the first script, which can be called from the command line or included in another script, and script B is the one that includes it.
If you include script A in B, you'll have access to any variable in script B. So why not add a check in script A to see if a param has been passed from the command line? If no params have been passed from the command line, use whatever variables you created in script B.
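A hedged sketch of that check in script A ($param and the 'default' fallback are hypothetical names, not from the question):

```php
<?php
// scriptA.php -- sketch: prefer a command-line argument; otherwise keep a
// variable the including script (script B) may already have set.
if (PHP_SAPI === 'cli' && isset($argv[1])) {
    $param = $argv[1];      // called as: php scriptA.php value
} elseif (!isset($param)) {
    $param = 'default';     // neither CLI argument nor includer provided it
}
echo $param, "\n";
```

Script B simply sets $param before include 'scriptA.php'; run directly, the CLI argument wins.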
If I understand correctly, you'll be forced to use a parameter to tell whether your script should use the command-line parameters or call your other script.
In your script, you can do something like :
if (!strcmp($argv[1], "-s")) // strcmp() returns 0 when the strings match
    shell_exec('php YourOtherScript.php [put_args_here]');
else
    your_current_script($argv);
Then, if you want to execute your other script, you'll make a command like :
php MyScript.php -s [args]
Otherwise, simply put the standard arguments :
php MyScript.php [args]
could be as simple as:
if (isset($argv[1])) {
    // command-line argument
    $foo = $argv[1];
} else {
    // not on the command line: keep the $foo set by the including script
}

Cronjob issue. Doesn't recognize variable

Works:
php -q /home/site/public_html/cron/file.php
Doesn't work:
php -q /home/site/public_html/cron/file.php?variable=1
Any suggestions? I need to send the variable as $_GET (or not)
Do it something like this:
curl http://hostname/cron/file.php?variable=1
In file.php you will then handle the code to read $_GET['variable'].
This behaves like a simple browser call, but only in your shell/terminal.
Hope this helps
Command-line arguments are passed in $argv instead of the normal $_GET/$_POST arrays.
Of course this does not work with URI-style parameters (the ?variable=1 part). So you have to call it like: php -q /path/to/script.php 1.
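If the rest of the script already reads $_GET, one possible workaround (a sketch, not from the original answer) is to pass the query string as a plain argument and parse it with parse_str:

```php
<?php
// Sketch: accept "variable=1" as an ordinary CLI argument and load it into
// $_GET, so code written for web requests keeps working under cron.
// Call it as: php -q /path/to/script.php "variable=1"
if (PHP_SAPI === 'cli' && isset($argv[1])) {
    parse_str($argv[1], $_GET);
}
var_dump($_GET['variable'] ?? null);
```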
As an alternative you could use getopt:
<?php
$shortopts = "v:";       // -v <value>
$longopts = array(
    "variable:",         // --variable=<value>, required value
);
$options = getopt($shortopts, $longopts);
var_dump($options);
And call it like php -q /path/to/script.php --variable=1.
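Reading the parsed option back out of the getopt result could then look like this (the null default is an assumption):

```php
<?php
// Sketch: accept the value from either --variable or -v, defaulting to null
// when neither option was supplied on the command line.
$options  = getopt('v:', array('variable:'));
$variable = $options['variable'] ?? ($options['v'] ?? null);
var_dump($variable);
```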
The easiest way to work around this (assuming public_html is, well, public WWW) is to have cron call wget or curl to access the PHP file, so URL variables are processed as normal.
-q means 'quiet mode' (no HTTP headers), so there is no room for the GET fields, I assume; at least I hope so :D
Greetz

Rails, PHP and parameters

I am working in Rails and I need to call a PHP script.
I can connect to the script like this:
system('php public/myscript.php')
But I need to send some parameters with it.
How do I do that?
Thanks
You can provide command-line arguments to your PHP script:
system('php public/myscript.php arg1 arg2')
They will be available from your PHP code like this:
echo $argv[0]; // public/myscript.php
echo $argv[1]; // arg1
echo $argv[2]; // arg2
You can just specify the parameters on the command line, such as system('php -f public/myscript.php argument1 argument2 [...]') and they will be available in the $argv[] array, starting from $argv[1]. See the doc page here for more info.
Yes, using system('php public/myscript.php arg1 arg2') is the right way, as SirDarius answered,
but system will only return true or false in that case.
system() returns TrueClass or FalseClass and displays the output; try it in a console.
I suggest using open on a URL, so you can call your PHP script over HTTP:
require 'open-uri'
open('YOUR PHP SCRIPT PATH WITH PARAMETER') do |response|
content = response.read
end
