PHP execute external script without waiting, while passing variables - php

I'm making a function for my users where they can upload large XML files to synchronize with my database.
When a user uploads a file to upload.php, I want to start processing the data in the background with process.php, preferably from a shell command, and redirect the user to status.php, which shows the progress of the synchronization.
I need to pass some variables to the process.php script when executing it: either just the user id, with the other variables written to a text file (I would probably prefer this, so I won't have to put too much data into the exec() command), or the user id plus a bunch of $_POST variables.
One solution I had in mind is executing the PHP script like this:
exec("php -f ./process.php > /dev/null 2>/dev/null &");
This allows me to lock process.php away from HTTP access, which is good since it's a resource-heavy script. The only thing I still need is to pass a variable to it somehow, but I don't know how to do that.
So my main question is:
How do I pass a variable in the above solution?
Or do any of you have a better way of doing this? Possibly one where I won't have to go through exec()? Keep in mind that I do not want the user to wait for the script to finish, and I need to pass at least one variable.
Update: For future reference, remember to use escapeshellarg() when passing arguments through exec() or likewise functions.
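For reference, a minimal sketch of that approach (the file names, the session-based user id, and the temp-file path are placeholder assumptions, not a definitive implementation):
<?php
// upload.php (hypothetical) - assumes the uploaded XML has already been saved to $dataFile
$userId   = (int) $_SESSION['user_id'];        // assumed to be available
$dataFile = '/tmp/import_' . $userId . '.xml'; // where the uploaded file was stored

// Build the command with escaped arguments, discard output, and background it
$cmd = sprintf(
    'php -f %s %s %s > /dev/null 2>/dev/null &',
    escapeshellarg(__DIR__ . '/process.php'),
    escapeshellarg((string) $userId),
    escapeshellarg($dataFile)
);
exec($cmd); // returns immediately; process.php keeps running in the background

header('Location: status.php?user=' . $userId); // send the user to the progress page
exit;
process.php would then read the user id from $argv[1] and the data file path from $argv[2], as the answers below describe.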

You can call it like this:
exec("php -f ./process.php var1 var2 > /dev/null 2>/dev/null &");
If you want to get these variable values, you can access them through the global $argv variable. If you print this variable it shows something like:
print_r($argv);
Array
(
[0] => process.php
[1] => var1
[2] => var2
)

You can pass parameters like the following.
// call process.php
exec("php -f ./process.php foo=bar bar=foo > /dev/null 2>/dev/null &");

// process.php
if ($argc > 0) {
    for ($i = 1; $i < $argc; $i++) {
        parse_str($argv[$i], $tmp);
        $_REQUEST = array_merge($_REQUEST, $tmp);
    }
}
var_dump($_REQUEST);

I don't really understand your goal, but passing an argument to a PHP script works the same way as for any other shell script. See: http://www.php.net/manual/en/features.commandline.usage.php (Example #2)
"When a user uploads a file […], I want to start processing the data in the background" - You can't access an upload before it is finished, in PHP using CGI.

Here is my solution.
The advantage is that the script works for command-line usage as well as for regular web usage. When you call it from the command line, $argv is set, so the parse_str() part extracts the variables and puts them into the $_GET array. When you call it from the web, $argv is not set, so the values come from the URL as usual.
// executingScript.php
// You have to percent-encode the argument values yourself
exec("php -f ./executedScript.php foo=bar%20foo bar=foo%20bar > /dev/null 2>/dev/null &");

// executedScript.php
// The if-statement avoids crashing when the script is called from the web
if (isset($argv)) {
    parse_str(implode('&', array_slice($argv, 1)), $_GET);
}
This will make you able to access the variables as usual:
echo $_GET["foo"] // outputs "bar foo"
echo $_GET["bar"] // outputs "foo bar"

Related

PHP: how can I call a function in a non-blocking fashion?

I saw a couple of other questions on the issue but not a clear answer.
I have a PHP file (it must be PHP, I can't use cron or anything else) running from the CLI where I must call the same function multiple times with different arguments:
doWork($param1);
doWork($param2);
doWork($param3);

function doWork($data)
{
    // do stuff, write result to db
}
Each call makes HTTPs requests and parses the response. The operation can require up to a minute to complete. I must prevent the "convoy effect": each call must be executed without waiting for the previous one to complete.
PECL pthreads is not an option due to server constraints.
Any ideas?
As far as I know, you cannot do what you are looking for directly.
Instead of calling a function with its parameters, you have to call another CLI PHP script in a non-blocking manner and put your function in that script.
This is your main script:
callDoWork($param1);
callDoWork($param2);
callDoWork($param3);
function callDoWork($param){
    $cmd = 'start "" /b php doWork.php ' . $param;
    // if $param contains spaces or other characters that are special
    // on the command line, you have to escape them.
    pclose(popen($cmd, 'r'));
}
doWork.php would look like:
if (is_array($_SERVER['argv'])) $param = $_SERVER['argv'][1];
doWork($param);

function doWork($data)
{
    // do stuff, write result to db
}
More information about argv.
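If you are on Linux instead of Windows, the same launcher idea should work with a trailing & instead of start /b; a rough sketch (escaping is still your responsibility, here done with escapeshellarg()):
function callDoWork($param) {
    // Linux variant: background the child process and discard its output
    $cmd = 'php doWork.php ' . escapeshellarg($param) . ' > /dev/null 2>&1 &';
    pclose(popen($cmd, 'r'));
}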
How about adding "> /dev/null 2>/dev/null &"
exec('php myFile.php > /dev/null 2>/dev/null &');
You can check the documentation for more details.

calling exec on a php file and passing parameters?

I am wanting to call a php file using exec.
When I call it I want to be able to pass a variable through (an id).
I can call echo exec("php /var/www/unity/src/emailer.php"); fine, but the moment I add anything like echo exec("php /var/www/unity/src/emailer.php?id=123"); the exec call fails.
How can I do this?
Your call is failing because you're using a web-style syntax (?parameter=value) with a command-line invocation. I understand what you're thinking, but it simply doesn't work.
You'll want to use $argv instead. See the PHP manual.
To see this in action, write this one-liner to a file:
<?php print_r($argv); ?>
Then invoke it from the command-line with arguments:
php -f /path/to/the/file.php firstparam secondparam
You'll see that $argv contains the name of the script itself as element zero, followed by whatever other parameters you passed in.
Try echo exec("php /var/www/unity/src/emailer.php 123"); in your script, then read the command-line parameters in emailer.php.
If you want to pass a GET parameter to it, then it's mandatory to provide a php-cgi binary for invocation:
exec("QUERY_STRING=id=123 php-cgi /var/www/emailer.php");
But this might require more fake CGI environment variables. Hence it is often advisable to rewrite the called script so it takes normal command-line arguments and reads them via $_SERVER["argv"].
(You could likewise just fake the php-cgi behaviour with a normal php interpreter and above example by adding parse_str($_SERVER["QUERY_STRING"], $_GET); on top of your script.)
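A rough sketch of that fake-CGI trick, assuming the emailer.php from the question (the parse_str() line is the only addition to the script; with default settings the CLI SAPI exposes the environment variable in $_SERVER):
// top of /var/www/unity/src/emailer.php
// invoke it like:  QUERY_STRING="id=123" php /var/www/unity/src/emailer.php
if (php_sapi_name() === 'cli' && isset($_SERVER['QUERY_STRING'])) {
    parse_str($_SERVER['QUERY_STRING'], $_GET);
}
echo $_GET['id']; // prints "123" for the call above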
This adapted script shows two ways of passing parameters to a PHP script from a PHP exec() command:
CALLING SCRIPT
<?php
$fileName = '/var/www/ztest/helloworld.php 12';
$options = 'target=13';
exec ("/usr/bin/php -f {$fileName} {$options} > /var/www/ztest/log01.txt 2>&1 &");
echo "ended the calling script";
?>
CALLED SCRIPT
<?php
echo "argv params: ";
print_r($argv);
if ($argv[1]) {echo "got the size right, wilbur! argv element 1: ".$argv[1];}
?>
Don't forget to verify execution permissions and to create a log01.txt file with write permissions (your Apache user will usually be www-data).
RESULT
argv params: Array
(
[0] => /var/www/ztest/helloworld.php
[1] => 12
[2] => target=13
)
got the size right, wilbur! argv element 1: 12
Choose whichever way of passing your parameters you prefer; all you need to do is access the $argv array and retrieve them in the order they were passed (the file name is element 0).
Thanks @hakre.
I know this is an old thread but it helped me solve a problem so I want to offer an expanded solution. I have a php program that is normally called through the web interface and takes a long list of parameters. I wanted to run it in the background with a cron job using shell_exec() and pass a long list of parameters to it. The parameter values change on each run.
Here is my solution: In the calling program I pass a string of parameters that look just like the string a web call would send after the ?. example: sky=blue&roses=red&sun=bright etc. In the called program I check for the existence of $argv[1] and if found I parse the string into the $_GET array. From that point forward the program reads in the parameters just as if they were passed from a web call.
Calling program code:
$pars = escapeshellarg($pars); // must use escapeshellarg()
$output = shell_exec($php_path . ' path/called_program.php ' . $pars); // $pars is the parameter string
Called program code inserted before the $_GET parameters are read:
if (isset($argv[1])) { // if called from the shell, build a $_GET array from the string passed as $argv[1]
    $args = explode('&', $argv[1]); // explode the string into an array of Type=Value elements
    foreach ($args as $arg) {
        $TV = explode('=', $arg); // now explode each Type and Value into a 2-element array
        $_GET[$TV[0]] = $TV[1];   // set the indexes in the $_GET array
    }
}
//------------------------
// from this point on the program processes the $_GET array normally just as if it were passed from a web call.
Works great and requires minimal changes to the called program. Hope someone finds it of value.

PHP passing $_GET in the Linux command prompt

Say we usually access it via
http://localhost/index.php?a=1&b=2&c=3
How do we execute the same on a Linux command prompt?
php -e index.php
But what about passing the $_GET variables? Maybe something like php -e index.php --a 1 --b 2 --c 3? I doubt that'll work.
From this answer on Server Fault:
Use the php-cgi binary instead of just php, and pass the arguments on the command line, like this:
php-cgi -f index.php left=1058 right=1067 class=A language=English
Which puts this in $_GET:
Array
(
[left] => 1058
[right] => 1067
[class] => A
[language] => English
)
You can also set environment variables that would be set by the web server, like this:
REQUEST_URI='/index.php' SCRIPT_NAME='/index.php' php-cgi -f index.php left=1058 right=1067 class=A language=English
Typically, for passing arguments to a command line script, you will use either the argv global variable or getopt:
// Bash command:
// php -e myscript.php hello
echo $argv[1]; // Prints "hello"
// Bash command:
// php -e myscript.php -f=world
$opts = getopt('f:');
echo $opts['f']; // Prints "world"
$_GET refers to the HTTP GET method parameters, which are unavailable on the command line, since they require a web server to populate.
If you really want to populate $_GET anyway, you can do this:
// Bash command:
// export QUERY_STRING="var=value&arg=value" ; php -e myscript.php
parse_str($_SERVER['QUERY_STRING'], $_GET);
print_r($_GET);
/* Outputs:
Array(
[var] => value
[arg] => value
)
*/
You can also execute a given script and populate $_GET from the command line without having to modify said script:
export QUERY_STRING="var=value&arg=value" ; \
php -e -r 'parse_str($_SERVER["QUERY_STRING"], $_GET); include "index.php";'
Note that you can do the same with $_POST and $_COOKIE as well.
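For example, a sketch of the same trick for $_POST (POST_STRING is just a made-up environment variable name; getenv() is used so it works regardless of the variables_order setting):
export POST_STRING="name=John&lastname=Doe" ; \
php -e -r 'parse_str(getenv("POST_STRING"), $_POST); include "index.php";'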
Sometimes you don't have the option of editing the PHP file to set $_GET to the parameters passed in, and sometimes you can't or don't want to install php-cgi.
I found this to be the best solution for that case:
php -r '$_GET["key"]="value"; require_once("script.php");'
This avoids altering your PHP file and lets you use the plain php command. If you have php-cgi installed, by all means use that, but this is the next best thing. I thought this option was worth mentioning.
The -r flag means "run the PHP code in the following string". You set the $_GET value manually there and then reference the file you want to run.
It's worth noting you should run this in the right folder, which is often, but not always, the folder the PHP file is in. require statements resolve relative paths against the directory you run the command from, not the location of the included file.
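One way around that relative-path issue (a sketch, with a made-up directory) is to chdir() to the script's folder inside the -r snippet before requiring it:
php -r 'chdir("/var/www/app"); $_GET["key"] = "value"; require_once("script.php");'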
I don't have a php-cgi binary on Ubuntu, so I did this:
% alias php-cgi="php -r '"'parse_str(implode("&", array_slice($argv, 2)), $_GET); include($argv[1]);'"' --"
% php-cgi test1.php foo=123
<html>
You set foo to 123.
</html>
%cat test1.php
<html>You set foo to <?php print $_GET['foo']?>.</html>
Use:
php file_name.php var1 var2 varN
Then set your $_GET variables on your first line in PHP, although this is not the desired way of setting a $_GET variable and you may experience problems depending on what you do later with that variable.
if (isset($argv[1])) {
    $_GET['variable_name'] = $argv[1];
}
The variables you launch the script with will be accessible from the $argv array in your PHP application. The first entry will be the name of the script they came from, so you may want to do an array_shift($argv) to drop that first entry if you want to process a bunch of variables. Or just load them into local variables.
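A small sketch of that, assuming the script is started as php script.php one two three:
// script.php
$params = $argv;
array_shift($params); // drop element 0, which is the script name itself
print_r($params);     // Array ( [0] => one [1] => two [2] => three )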
Try using wget:
wget 'http://localhost/index.php?a=1&b=2&c=3'
Option 1: php-cgi
Use 'php-cgi' in place of 'php' to run your script. This is the simplest way as you won't need to specially modify your PHP code to work with it:
php-cgi -f /my/script/file.php a=1 b=2 c=3
Option 2: If you have a web server
If the PHP file is on a web server you can use 'wget' on the command line:
wget 'http://localhost/my/script/file.php?a=1&b=2&c=3'
Or:
wget -q -O - "http://localhost/my/script/file.php?a=1&b=2&c=3"
Accessing the variables in PHP
In both option 1 & 2, you access these parameters like this:
$a = $_GET["a"];
$b = $_GET["b"];
$c = $_GET["c"];
If you are able to edit the PHP script, you can artificially populate the $_GET array using the following code at the beginning of the script, and then call the script with the syntax: php -f script.php name1=value1 name2=value2
// When invoking the script via CLI like
// "php -f script.php name1=value1 name2=value2",
// this code will populate $_GET variables called
// "name1" and "name2", so a script designed to
// be called by a web server will work even
// when called by CLI
if (php_sapi_name() == "cli") {
for ($c = 1; $c < $argc; $c++) {
$param = explode("=", $argv[$c], 2);
$_GET[$param[0]] = $param[1]; // $_GET['name1'] = 'value1'
}
}
If you need to pass $_GET, $_REQUEST, $_POST, or anything else you can also use PHP interactive mode:
php -a
Then type:
<?php
$_GET['a'] = 1;
$_POST['b'] = 2;
include("/somefolder/some_file_path.php");
This will manually set any variables you want and then run your PHP file with those variables set.
At the command line, paste the following (both variables need to be exported so PHP can see them in $_SERVER):
export QUERY_STRING="param1=abc&param2=xyz" ; \
export POST_STRING="name=John&lastname=Doe" ; \
php -e -r 'parse_str($_SERVER["QUERY_STRING"], $_GET); parse_str($_SERVER["POST_STRING"], $_POST); include "index.php";'
I just pass them like this:
php5 script.php param1=blabla param2=yadayada
It works just fine. The $_GET array is:
array(3) {
["script_php"]=>
string(0) ""
["param1"]=>
string(6) "blabla"
["param2"]=>
string(8) "yadayada"
}
Use:
php -r 'parse_str($argv[2],$_GET);include $argv[1];' index.php 'a=1&b=2'
You could make the first part as an alias:
alias php-get='php -r '\''parse_str($argv[2],$_GET);include $argv[1];'\'
Then simply use:
php-get some_script.php 'a=1&b=2&c=3'
Or just (if you have Lynx):
lynx 'http://localhost/index.php?a=1&b=2&c=3'

shell_exec php example

Hi, there are multiple specific examples around, but I just wanted a working, generic example of calling PHP into the background from shell_exec().
So my php function runs a large processing job.
At the top of the script (process.php) I put:
#!/usr/bin/php
I think - is there any way to get that specific path, maybe with 'which php'?
Then the actual command is:
shell_exec(sprintf('php process.php %s %s > /dev/null 2>/dev/null &', 'data1', 'data2'));
and I access the data from process.php with $argv[1] and $argv[2]?
Thanks
You can definitely access the arguments from process.php the way you described, but why would you want to kick off process.php like that?
If you're already in a php shell script, why not just include the process.php file?
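If you do want to launch it as a separate background process, the binary path does not have to be hard-coded; a sketch using the PHP_BINARY constant (PHP 5.4+; note it points at whatever binary is currently running PHP, so from a web SAPI it may not be the CLI executable) and escaped arguments:
// hypothetical launcher, following the command shape from the question
$cmd = sprintf(
    '%s -f %s %s %s > /dev/null 2>/dev/null &',
    escapeshellarg(PHP_BINARY),
    escapeshellarg(__DIR__ . '/process.php'),
    escapeshellarg('data1'),
    escapeshellarg('data2')
);
shell_exec($cmd);
// process.php then reads the values from $argv[1] and $argv[2]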
You can also run curl multi exec on your PHP script with a parameter and receive an answer after the run, like: http://localhost/phpjob.php?par1=asdasd&par2=1212....

PHP exec() return value for background process (linux)

Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A work around is to have a second PHP (or Bash/etc) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php '.escapeshellarg($resultFile).' > /dev/null 2>&1 &');

// do other stuff...

// Sometime later, when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
    sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And command_runner.php would look like:
$outputFile = $argv[1]; // the result file path passed in by the main script ($argv[0] is the script name)
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
Not using the exec() method. When you send a process to the background, it returns 0 to the exec() call and PHP continues execution; there's no way to retrieve the final result.
pcntl_fork() however will fork your application, so you can run exec() in the child process and leave it waiting until it finishes. Then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid()
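A rough sketch of that fork-and-wait approach (it assumes the pcntl extension is available, which is normally only the case for the CLI SAPI):
$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    // child: run the command and exit with its return code
    exec('badcommand > /dev/null 2>&1', $output, $result);
    exit($result);
}

// parent: do other work here, then collect the child's exit code when needed
pcntl_waitpid($pid, $status);
echo 'badcommand exited with ' . pcntl_wexitstatus($status) . "\n";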
Just my 2 cents, how about using the || or && bash operator?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
