shell_exec php example - php

Hi, there are multiple specific examples out there, but I just wanted a working, generic example of calling PHP into the background from shell_exec.
So my PHP function runs a large processing job.
At the top of the script (process.php) I put:
#!/usr/bin/php
I think - is there any way to get that specific path, maybe 'which php'?
Then the actual command is:
shell_exec(sprintf('php process.php %s %s > /dev/null 2>/dev/null &','data1','data2'));
and access the data from process.php with $argv[1] and $argv[2]?
Thanks

You can definitely access the arguments from process.php the way you described, but why would you want to kick off process.php like that?
If you're already in a php shell script, why not just include the process.php file?

You could also call your PHP script over HTTP with curl_multi_exec, passing parameters and receiving an answer after the run, like: http://localhost/phpjob.php?par1=asdasd&par2=1212....
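To pull the pieces of the question together, here is a minimal sketch of that pattern; process.php and the two argument values are the placeholders from the question, and the /usr/bin/php path is an assumption you can confirm with 'which php':
// caller.php - a sketch of firing process.php into the background
$arg1 = escapeshellarg('data1');
$arg2 = escapeshellarg('data2');
// redirect output and append & so shell_exec() returns immediately
shell_exec(sprintf('/usr/bin/php %s/process.php %s %s > /dev/null 2>&1 &', __DIR__, $arg1, $arg2));

// process.php - read the positional arguments
$data1 = isset($argv[1]) ? $argv[1] : null;
$data2 = isset($argv[2]) ? $argv[2] : null;
// ... long-running job using $data1 and $data2 ...
The shebang line only matters if you execute process.php directly (./process.php); when you launch it as 'php process.php', the binary named in the command is what counts.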

Related

Execute multiple PHP scripts simultaneously

Okay, I have 50 PHP scripts, each of which will take 20 days to finish. I want to write a PHP script to run these 50 scripts simultaneously. I did use the exec() function in my script, but the problem is that it runs the first script and waits until it is finished before executing the next one. I want to run all of them in parallel. Is there any way to do that?
Thanks
Okay, thanks, I got my answer after a lot of searching.
Apart from adding an &, you also need to redirect the output somewhere - otherwise your PHP process waits until the other process has finished, because there could be more output:
exec('/path/to/program > /dev/null 2>&1 &')
You put an & between the scripts:
php /var/www/script1.php & php /var/www/script2.php ........
If you have to use only PHP, a more "PHP-ish" way to do this is Robo (https://robo.li); it is used with Codeception, for example.
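As a rough sketch of the redirect-plus-ampersand approach for the 50-script case (the /var/www/script*.php paths are just placeholders):
// launch every script in the background; redirecting output is what lets exec() return immediately
for ($i = 1; $i <= 50; $i++) {
    $script = escapeshellarg("/var/www/script{$i}.php");
    exec("php {$script} > /dev/null 2>&1 &");
}
echo "All 50 scripts started.\n";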

How to connect PHP with bash scripts or pass variables from PHP to a bash script [duplicate]

This question already has answers here:
How to run shell script with live feedback from PHP?
(2 answers)
Closed 9 years ago.
I want to know how PHP can run bash scripts.
I mean, I want to pass some variables from a PHP script to a bash script which will run when the submit button is clicked.
Also, how can I get output from the terminal back to the PHP page?
The purpose of this question is to understand how to build a front end in PHP and a back end with bash; I want to run a terminal command from a PHP page, passing it some variables.
Please point me in the right direction. I haven't slept for 4 days now, just reading up on this, and so far I am confused.
You might want to take a look at the Symfony2 Console Component. It is designed to facilitate executing shell commands and returning their output.
Running a shell script from PHP code can be done as follows:
shell_exec("script_path -with arguments");
shell_exec() returns whatever the script prints out.
Here is an example:
<?php
$dir_listing = shell_exec("/bin/ls -1 /path");
?>
Try this:
$file = "/etc/hosts";
echo `cat $file`;
Added:
If you want to use it like this:
$f = $_POST['f'];
$s = $_POST['s'];
$path = $basepath.'/'.$_POST['path'];
$res = `dvblast -f $f -s $s -c $path`;
be sure that your input is sanitized; there may be a shell injection vulnerability here.
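A sketch of the same call with the input escaped first (dvblast and the POST fields are taken from the snippet above; $basepath is assumed to be defined elsewhere):
// escape every user-supplied value before it reaches the shell
$f    = escapeshellarg($_POST['f']);
$s    = escapeshellarg($_POST['s']);
$path = escapeshellarg($basepath . '/' . basename($_POST['path']));
$res  = shell_exec("dvblast -f $f -s $s -c $path");
basename() is used here so the user-supplied path cannot point outside $basepath.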

PHP exec() re-executing same script again and again

Here I am writing about my flow in detail.
I am doing an Ajax request on the same host.
AJAX call to saveUser.php
In saveUser.php, I have included Common.php.
Common.php has the createFile1() and createFile2() functions.
The createFile1() function just creates sample1.php for another purpose.
In createFile2(), I create a file sample2.php and run exec('php /home/public/sample2.php > /dev/null &'); to execute that file.
I call createFile1() and createFile2() in turn from saveUser.php, which is invoked by the AJAX request.
As I have set the exec command to run in the background with an '&' at the end, it returns without waiting for a response and everything goes smoothly on the front end.
But when I check on my server, the files sample1.php and sample2.php are getting created again and again. It seems the whole saveUser.php action keeps executing until I stop the sample2.php process over SSH.
I checked the process list; each time 'php /home/public/sample2.php' has a new process ID, which confirms that it is being executed again and again. If I remove the exec() call and run sample2.php from SSH, it works as expected and there is no such problem.
Please suggest what is going wrong. Is there any problem with the server configuration? I am using a HostGator shared account.
I am also including the same Common.php file in sample2.php, mentioning it in case it helps.
Thanks for your help in advance.
saveUser.php code
include_once dirname(__FILE__).'/Common.php';
createFile1();
createFile2();
echo 'saved successfully!';
Common.php code
function createFile1()
{
    // read the template and write it out as sample1.php
    $template = file_get_contents('sendDmTemplate.php');
    $serviceFp = fopen('sendDmFiles/sample1.php', "w");
    fwrite($serviceFp, $template);
    fclose($serviceFp);
}
function createFile2()
{
    $fileContent = file_get_contents(dirname(__FILE__).'/sampleFileTemplate.php');
    $serviceFp = fopen('sample2.php', "w");
    fwrite($serviceFp, $fileContent);
    fclose($serviceFp);
    exec('php '.dirname(__FILE__).'/sample2.php > /dev/null &');
}
sample2.php code
include_once dirname(__FILE__).'/Common.php';
echo 'hi';
This exact same thing happened to me. I was trying to have one php script call exec() on another php script but for some strange reason it would just exec() itself again creating an infinite loop. The currently accepted answer didn't get me anywhere, but I was able to get it to work by specifying an absolute path to php.
So, instead of
exec('php /home/public/sample2.php > /dev/null &');
it would be
exec('/usr/local/bin/php /home/public/sample2.php > /dev/null &');
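If you would rather not hard-code /usr/local/bin/php, one option on PHP 5.4+ is the PHP_BINARY constant; a sketch (note that under a web SAPI such as mod_php or FPM it may resolve to that SAPI's binary rather than the CLI one, so verify it on your host):
// build the command from the absolute path of the PHP binary running this script
$cmd = PHP_BINARY . ' ' . escapeshellarg('/home/public/sample2.php') . ' > /dev/null 2>&1 &';
exec($cmd);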
Check the following:
1) Are you sure you are NOT calling your script saveUser.php multiple times? Maybe a coding error somewhere in the JavaScript XHR? Check this by looking in the (Apache?) log.
2) Are you sure your PHP executes all right without the -q? I use php -q pathtoscript.php
3) If not 1 or 2: post the code of saveUser.php here (or somewhere).
EDIT: I see your problem. The file you create includes Common.php again, and executes that again. Etc. <-- wrong. Oops, I wrote that too early; looking into it again now.
4) Is it possible you use some error handling that redirects to your saveUser.php?
Edit:
5) A problem might arise from the fact that you are including the very file that executes the command, in combination with include_once, but I am not sure. You could simply test this by changing the content of sample2.php (by adjusting sampleFileTemplate.php): create a Common2.php (with identical content to Common.php) and use that one. Could you test-drive that setup and report back?

How to create a PHP script that can take the same argument from another PHP script and from the command line

I am creating a PHP script which takes arguments from the command line right now, but after some time this may change to simply including my PHP script.
How can I prepare my script for both scenarios?
Can I create a PHP script that can take an argument from the command line and from another script which simply includes it?
Let's say script A is the script that can be called from the command line and included in another script, and script B is the one that includes it.
If you include script A in B, you'll have access to any variable in script B. So why don't you add a check in script A to see if a param has been passed from the command line; if no params have been passed, use whatever variables you created in script B.
If I understand correctly, you'll be forced to use a parameter to tell whether your script should use the command-line arguments or call your other script.
In your script, you can do something like:
if (!strcmp($argv[1], "-s"))
    shell_exec('php YourOtherScript.php [put_args_here]');
else
    your_current_script($argv);
Then, if you want to execute your other script, you'll run a command like:
php MyScript.php -s [args]
Otherwise, simply pass the standard arguments:
php MyScript.php [args]
Could be as simple as:
if (isset($argv[1])) {
    // command line arguments
    $foo = $argv[1];
} else {
    // not on the command line: keep whatever value the including script set
    $foo = $foo;
}
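A compact sketch of the same idea that also checks the SAPI, so a web request cannot be mistaken for a command-line call ($foo and the default value are placeholders):
// scriptA.php - usable from the command line and via include
if (PHP_SAPI === 'cli' && isset($argv[1])) {
    // called as: php scriptA.php <value>
    $foo = $argv[1];
} elseif (!isset($foo)) {
    // included by a script that did not set $foo beforehand
    $foo = 'default';
}
// ... the rest of script A works with $foo ...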

Run and forget system call in php

So I am trying to execute a script from my PHP code. It lives in the page blah.php.
<?php
// ....
// just basic web site that allows upload of file...
?>
Inside I use system call
if (system("blah.pl arg1") != 0)
{
print "error\n";
}
else
{
print "working on it..you will receive e-mail at completion\n";
}
It works great, but it waits until the script completes before it prints "working on it".
I am aware that I am calling a Perl script from PHP; that is not a typo.
How can I just start the program and let it complete in the background?
The blah.pl script handles e-mail notification.
Anyone?
Thanks I appreciate it
From the system() function documentation:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Simply redirect the output of your Perl script and PHP should be able to run without having to wait for the script to finish.
system("blah.pl arg1 > /dev/null 2>&1 &");
Can you add an ampersand to spawn the process in the background?
if (system("blah.pl arg1 &") != 0)
You could probably use the popen function
http://www.php.net/manual/en/function.popen.php
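For completeness, a minimal popen()-based variant of the same fire-and-forget call (blah.pl and arg1 are from the question):
// open the process and close the handle right away; the redirect plus the
// trailing & let the Perl script keep running after pclose() returns
$handle = popen('blah.pl arg1 > /dev/null 2>&1 &', 'r');
pclose($handle);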
Since all the PHP command line functions wait for a result - I recommend that you have a separate page to call the emailer script.
In other words, send a request to the email.php page (or whatever) using cURL or the file wrappers, which allow you to close the connection before receiving a result. So the process will look something like this:
page_running.php
      |
      cURL call to (email.php?arg1=0)
      |                     |
final output            email.php
                            |
                   calls system("blah.pl arg1")
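A sketch of that cURL call with a very short timeout, so page_running.php does not sit and wait for email.php to finish (the URL and parameter come from the diagram above; CURLOPT_NOSIGNAL is needed on some builds for sub-second timeouts, and email.php would normally call ignore_user_abort(true) so it keeps running after the connection drops):
// page_running.php - kick off email.php and return almost immediately
$ch = curl_init('http://localhost/email.php?arg1=0');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100); // give up after 100 ms; email.php carries on server-side
curl_exec($ch);
curl_close($ch);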
