php exec infinite loop

I have the following code, which calls exec to run the script test.php in the background.
exec("/home/gooffers/test.php?one=one &");
Script test.php contains the following
$test = $_GET['one'];
echo $test;
However, this is creating an infinite loop (an infinite number of processes) which is crashing the server. Why is this happening?

$_GET is not available when you are running a script via the command line (php-cli).
See here on how to pass arguments to a command line script in php: How do I pass parameters into a PHP script through a webpage?
Basically, it's
exec("/home/gooffers/test.php arg1 arg2");
and then fetching them via
$argument1 = $argv[1];
$argument2 = $argv[2];
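Putting the two halves together, a minimal self-contained sketch (the child script is written to a temp file here just so the example runs anywhere; in the question it would be /home/gooffers/test.php):

```php
<?php
// Write a throwaway child script that reads its input from $argv
// instead of $_GET (a stand-in for /home/gooffers/test.php).
$child = tempnam(sys_get_temp_dir(), 'argv') . '.php';
file_put_contents($child, '<?php echo $argv[1];');

// Invoke it through the CLI binary; escapeshellarg() guards the value.
exec(PHP_BINARY . ' ' . escapeshellarg($child) . ' ' . escapeshellarg('one'), $output);
echo $output[0] . PHP_EOL; // prints "one"
unlink($child);
```

Note the command starts with the PHP binary; invoking the .php file directly (as in the question) only works if the file is executable and starts with a shebang line.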

I don't know what is happening, but I think it should be
exec("php /home/gooffers/test.php?one=one &");

Running a program using shell_exec in PHP, how to know if process is done?

Hi, I am running a Python script inside PHP using the shell_exec command.
However, how would I know if the execution of the Python script has already completed?
Here is portion of my code
$py_path = 'app/public/engines/validation/delta';
chdir($py_path);
$py_var = 'python scraped_data.py';
$exec = shell_exec($py_var);
How would I know if the Python script has finished its processing?
I wanted to know because I have a series of processes to run but it needs the first process to be completed first.
Thank You,
If you check php.net, the description of shell_exec() is:
shell_exec — Execute command via shell and return the complete output as a string
So you can edit your Python script to output true or 1 once the script is finished.
Then you can simply write something like this:
$py_path = 'app/public/engines/validation/delta';
chdir($py_path);
$py_var = 'python scraped_data.py';
$exec = shell_exec($py_var);
if ($exec) {
    // Execute another script or do what you wish
}
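If editing the Python script is not an option, a variant (a sketch of my own, not from the answer) is to use exec(), whose third parameter receives the command's exit status; PHP_BINARY stands in here for 'python scraped_data.py' so the example is runnable anywhere:

```php
<?php
// exec() blocks until the command finishes and fills $status with its
// exit code, so completion can be detected without parsing any output.
exec(PHP_BINARY . ' -r "exit(0);"', $output, $status);
if ($status === 0) {
    echo "first process finished, start the next one" . PHP_EOL;
}
```

This also distinguishes success from failure: a non-zero $status means the script exited with an error.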

Calling a PHP script from within PHP

I have a PHP script that I'd like to run both via the browser and from the command line. When I run the script from the command line, it executes without a problem. However, when I call it via a function such as exec or passthru, it doesn't work. I'm not getting any output from the call and I see no errors in the logs. I'm very confused...
echo exec('php /usr/share/nginx/www/function.php arg1');
Any ideas?
When you use exec(), the output is captured in a parameter variable as an array rather than returned. So do something like this:
exec('php /usr/share/nginx/www/function.php arg1', $output);
print_r($output);
@David,
A few things to check.
Create a phpinfo() page and see if exec or passthru is enabled. Hosts often disable them for security.
Try:
<?php
$exe = exec('php /usr/share/nginx/www/function.php arg1');
var_dump($exe);
?>
You could even do:
<?php
if (function_exists('exec')) {
    echo "exec is enabled";
}
?>
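One caveat (my addition, not part of the answer above): on PHP versions before 8.0, function_exists() can still report true for a function blocked via disable_functions, so reading the ini value directly is a more reliable probe:

```php
<?php
// List the functions the host has disabled and check for exec.
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (in_array('exec', $disabled, true)) {
    echo "exec is disabled by the host" . PHP_EOL;
} else {
    echo "exec is enabled" . PHP_EOL;
}
```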

Rails, PHP and parameters

I am working in Rails and I need to call a PHP script.
I can call the script like this:
system('php public/myscript.php')
But I need to send some parameters with it.
How do I do that?
Thanks
You can provide command-line arguments to your PHP script:
system('php public/myscript.php arg1 arg2')
They will be available from your PHP code like this:
echo $argv[0]; // public/myscript.php
echo $argv[1]; // arg1
echo $argv[2]; // arg2
You can just specify the parameters on the command line, such as system('php -f public/myscript.php argument1 argument2 [...]') and they will be available in the $argv[] array, starting from $argv[1]. See the doc page here for more info.
Yes, system('php public/myscript.php arg1 arg2') is the right way, as SirDarius answered.
But note that system will only return true or false in that case:
system() returns TrueClass or FalseClass and displays the output; try it in a console.
Alternatively, you can use the open method on any URL, so you can call your PHP script over HTTP:
require 'open-uri'
open('YOUR PHP SCRIPT PATH WITH PARAMETER') do |response|
  content = response.read
end

running shell_exec in php causes web server to hang

I am running the following code. It takes a text file, splits it into parts whose names end with '_part', and then calls the same script with a flag to process those files, uploading the content to a Drupal system.
What happens is that the script runs and finishes the work; all invoked scripts finish too, and I can see the results. But each time after I run it, the web server stops responding. Is there anything basic that I am missing or doing wrong?
if (isset($argv[3])) {
    $isSplit = $argv[3] == 'true' ? true : false;
}
if ($isSplit) {
    $fileSplitter = new CSVFileParts($fileName);
    $parts = $fileSplitter->split_file();
    echo 'Split file into '.$parts.' parts'.PHP_EOL;
    for ($part = 0; $part < $parts; $part++) {
        echo shell_exec('php Service.php u ./partial_files/'.basename($fileName).'.part_'.$part.' false > /dev/null 2>/dev/null &');
    }
} else {
    $log->lwrite('uploading '.$argv[2]);
    $drupalUploader = new DrupalUploader($fileName, $log);
    $drupalUploader->upload();
}
shell_exec — Execute command via shell and return the complete output as a string
shell_exec needs to read the command's output stream until it closes, but you redirect everything to /dev/null and detach the process.
Since you plan to detach the process and discard all the output, you should use exec() and escapeshellcmd() instead.
see: http://www.php.net/manual/en/function.exec.php
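A sketch of that suggestion applied to the loop from the question (the Service.php command line is taken from the question; the example value for $fileName and the escapeshellarg() call on the part file name are my additions):

```php
<?php
// Fire-and-forget: exec() with output redirected and the process
// detached, so no output handle is left for the web server to wait on.
$fileName = 'input.csv'; // example value for the question's $fileName
for ($part = 0; $part < 3; $part++) {
    $partFile = './partial_files/' . basename($fileName) . '.part_' . $part;
    $cmd = 'php Service.php u ' . escapeshellarg($partFile) . ' false';
    exec($cmd . ' > /dev/null 2>/dev/null &');
}
```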

PHP exec() return value for background process (linux)

Using PHP on Linux, I'd like to determine whether a shell command run using exec() was successfully executed. I'm using the return_var parameter to check for a successful return value of 0. This works fine until I need to do the same thing for a process that has to run in the background. For example, in the following command $result returns 0:
exec('badcommand > /dev/null 2>&1 &', $output, $result);
I have put the redirect in there on purpose, I do not want to capture any output. I just want to know that the command was executed successfully. Is that possible to do?
Thanks, Brian
My guess is that what you are trying to do is not directly possible. By backgrounding the process, you are letting your PHP script continue (and potentially exit) before a result exists.
A work around is to have a second PHP (or Bash/etc) script that just does the command execution and writes the result to a temp file.
The main script would be something like:
$resultFile = '/tmp/result001';
touch($resultFile);
exec('php command_runner.php '.escapeshellarg($resultFile).' > /dev/null 2>&1 &');
// do other stuff...
// Sometime later when you want to check the result...
while (!strlen(file_get_contents($resultFile))) {
    sleep(5);
}
$result = intval(file_get_contents($resultFile));
unlink($resultFile);
And the command_runner.php would look like:
$outputFile = $argv[1]; // $argv[0] is the script name; the result file path is the first argument
exec('badcommand > /dev/null 2>&1', $output, $result);
file_put_contents($outputFile, $result);
It's not pretty, and there is certainly room for adding robustness and handling concurrent executions, but the general idea should work.
Not using the exec() method. When you send a process to the background, it will return 0 to the exec call and php will continue execution, there's no way to retrieve the final result.
pcntl_fork() however will fork your application, so you can run exec() in the child process and leave it waiting until it finishes. Then exit() with the status the exec call returned.
In the parent process you can access that return code with pcntl_waitpid()
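A minimal sketch of that approach (requires the pcntl extension, which is CLI-only; 'badcommand' is the placeholder command from the question):

```php
<?php
if (extension_loaded('pcntl')) {
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: run the command synchronously, then exit with its status.
        exec('badcommand > /dev/null 2>&1', $output, $result);
        exit($result);
    }
    // Parent: free to do other work, then collect the child's exit code.
    pcntl_waitpid($pid, $status);
    $exitCode = pcntl_wexitstatus($status);
} else {
    $exitCode = null; // pcntl is unavailable (e.g. Windows or a web SAPI)
}
```

The parent is not blocked until it calls pcntl_waitpid(), so it can do other work in between the fork and the wait.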
Just my 2 cents: how about using the || and && shell operators?
exec('ls && touch /tmp/res_ok || touch /tmp/res_bad');
And then check for file existence.
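Sketched out, with the marker files placed in the temp directory (these paths are my stand-ins for /tmp/res_ok and /tmp/res_bad; the sleep before the check is just for the example):

```php
<?php
// The shell writes one of two marker files depending on the command's
// outcome; PHP later checks which one exists.
$ok  = sys_get_temp_dir() . '/res_ok';
$bad = sys_get_temp_dir() . '/res_bad';
@unlink($ok);
@unlink($bad);
exec(sprintf('(ls && touch %s || touch %s) > /dev/null 2>&1 &',
    escapeshellarg($ok), escapeshellarg($bad)));

// ... later, once the background job has had time to finish:
usleep(200000);
clearstatcache();
$succeeded = file_exists($ok);
```

Like the temp-file approach above it needs cleanup and does not handle concurrent runs, but it avoids a second wrapper script.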
