Run a CMD command from a PHP script and get live feedback (PHP)

I have a LOT (almost 300) of old SVN repositories to migrate to Git using svn2git.
After considering Go and Python, I finally decided that the easiest way was to use PHP. It might be a questionable decision, but it seemed easy.
So after 15 minutes I had a script that runs more or less OK in tests. It's an ugly script, but it's a one-timer.
The problem is that the process takes a lot of time: even a simple, almost empty repo can take 30 seconds to a minute, and big ones up to 10 minutes. So before taking it into production, I would like to have some feedback mechanism so I can actually see what is going on.
As of now, the script outputs the command feedback like so:
$cmd = "cd ".$GITrepoPath." && svn2git svn://127.0.0.1/". $repoName . " --username " .$SVNusername ." --authors authors.txt --notags --nobranches --notrunk";
$output = shell_exec($cmd);
echo "<pre>$output</pre>";
...but only after each repo has finished processing, not like a real CMD session where I can see the steps as they happen.
The only question I found that might be close to what I need was here, but honestly I did not understand much from the answer...
I know it is just a one-timer script, but the use case got me interested in how to actually achieve this (and whether it is possible).
I am on a Win7 local machine, but I would also like to know how to do it on *nix if possible.

shell_exec() waits until the process closes, so you only see the output once the command has finished. To get live feedback you have to create the process yourself and listen to it, the same as in CMD; the simplest way is popen(), reading the output line by line as it is produced:

$cmd = ''; // your command here, with " 2>&1" appended so stderr is captured too
$handle = popen($cmd, 'r');
if ($handle === false) {
    die('Could not start the process');
}
while (($line = fgets($handle)) !== false) {
    // each line arrives here as soon as the command prints it
    echo $line;
    @ob_flush(); // push PHP's output buffer to the web server
    flush();     // and the web server's buffer to the browser
}
$exitCode = pclose($handle);
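Applied to the svn2git command from the question, a minimal sketch could look like this (the variable names are taken from the question's script, and 2>&1 is added so anything the tools write to stderr shows up as well):

// Stream the svn2git output to the browser as it happens; $GITrepoPath,
// $repoName and $SVNusername come from the original script.
$cmd = "cd " . $GITrepoPath . " && svn2git svn://127.0.0.1/" . $repoName
     . " --username " . $SVNusername
     . " --authors authors.txt --notags --nobranches --notrunk 2>&1";

echo "<pre>";
$handle = popen($cmd, 'r');
while (($line = fgets($handle)) !== false) {
    echo htmlspecialchars($line);
    @ob_flush();
    flush();
}
pclose($handle);
echo "</pre>";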

I suggest instead running a script or program in the background that runs the command and then updates a record in a database; you could then use AJAX (or whatever you prefer) to poll the server for changes to that record. This gives the user a much nicer experience.
The table could have a column named something like "finished"; once that boolean is true you know the migration is complete, and the command's output can be stored in the database as well.
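A minimal sketch of the polling side, assuming a hypothetical migrations table with repo_name, finished and output columns (the table, columns and connection details are made up for illustration):

// status.php - returns the current state of one repo's migration as JSON.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=migration', 'user', 'pass');
$stmt = $pdo->prepare('SELECT finished, output FROM migrations WHERE repo_name = ?');
$stmt->execute(array($_GET['repo']));

header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));

The page that started the migrations then polls this endpoint with AJAX every few seconds until finished is true for every repo.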

Related

Interesting thing happened while writing data into redis within a php loop

I wrote a PHP script to pull data from one server (let's call it Server A) to another (Server B). The data in Server A is a Redis list that stores all the operation commands that need to be replayed on Server B, such as:
["setex",["session:xxxx",604800,"xxxx"]]
["set",["uid:xxx","xxxxx"]]
["pipeline",[]]
["set",["uid:xxx","xxxxx"]]
["hIncrBy",["Signin:xxxx","totalTimes",1]]
["pipeline",[]]
....
My PHP code is:

$i = 0;
while ($i < 1000) {
    $line = $redis['server_a']->rpop('sync:op');
    list($op, $params) = json_decode($line, 1);
    $r = call_user_func_array(array($redis['server_b'], $op), $params);
    $i++;
}
The weird thing is that when call_user_func_array() executes a Redis command incorrectly, none of the remaining commands in the queue get written correctly to Server B.
I was stuck on this problem for almost a week looking for answers. After thousands of tests I found that if I remove the "bad commands" that cannot be executed correctly, such as the ["pipeline",[]] rows, all the other commands are inserted properly. That reminded me of Redis transactions: maybe there is some mechanism whereby, once a command executes improperly, all the commands after it are treated as a transaction. So I added an exec() call to the while loop:
$i = 0;
while ($i < 1000) {
    $line = $redis['server_a']->rpop('sync:op');
    list($op, $params) = json_decode($line, 1);
    $r = call_user_func_array(array($redis['server_b'], $op), $params);
    $redis['server_b']->exec(); // this is the significant update
    $i++;
}
Then my problem was solved!
My question is: can anybody explain the Redis mechanism here? Is my assumption correct?
Your library is probably using transactions for pipelining, for whatever reason. pipeline is not an actual Redis command, see http://redis.io/commands
Just strip all pipeline commands with empty arguments, or call ->exec() whenever you have issued a pipeline before.
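A minimal sketch of the stripping approach, reusing the variable names from the question (untested against the actual queue):

$i = 0;
while ($i < 1000) {
    $line = $redis['server_a']->rpop('sync:op');
    list($op, $params) = json_decode($line, 1);
    // "pipeline" is not a real Redis command; replaying it only switches the
    // client into pipeline mode (queuing everything until exec()), so skip it.
    if ($op === 'pipeline' && empty($params)) {
        $i++;
        continue;
    }
    $r = call_user_func_array(array($redis['server_b'], $op), $params);
    $i++;
}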

Symfony2 background Process running after script ends?

Symfomaniacs!
Let's say I have this code in my controller:
for ($i = 0; $i <= 3; $i++)
{
    $command = 'php ' . $kernel->getRootDir() . '/console Educa:ExecuteJob '
        . $i . ' --env=' . $kernel->getEnvironment();
    $logger->addInfo("Executing command: " . $command);

    $process = new Process($command);
    $process->disableOutput();
    $process->start();
}
//sleep(5);
return $this->render(...);
and ExecuteJob takes about 5 seconds to execute. Inside the console command I have a logger->addInfo() call at both the beginning and the end of the script.
If I uncomment the sleep(5) line, both lines are printed. If I don't, only the initial line appears in the log, which must mean that when the controller script ends it stops all child processes. Am I wrong?
Is there any way to work around this? I'm open to changing the design; I just want to execute background stuff without having to wait for it before rendering the page (it's not even relevant to that page).
PS: I'm looking for a way to run background tasks that plays nicely with Symfony2, not weird PHP hacks or cron jobs; I know about those. I also don't want to set up an MQ queue, which is overkill for what I'm trying to do.
In your situation the perfect solution would be to use the kernel.terminate event; it's triggered after the response has been streamed to the client, so you can do some heavy operations there.
You could also use something like https://github.com/schmittjoh/JMSJobQueueBundle, but then you will need to set up cron too.
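A minimal sketch of the kernel.terminate approach, assuming a Symfony 2.x event listener; the namespace, class name and service wiring are made up for illustration:

namespace Acme\EducaBundle\EventListener;

use Symfony\Component\HttpKernel\Event\PostResponseEvent;
use Symfony\Component\Process\Process;

class ExecuteJobListener
{
    // Register in services.yml with:
    //   tags: [{ name: kernel.event_listener, event: kernel.terminate }]
    public function onKernelTerminate(PostResponseEvent $event)
    {
        for ($i = 0; $i <= 3; $i++) {
            // The response has already been sent, so blocking here no longer
            // delays the page.
            $process = new Process('php app/console Educa:ExecuteJob ' . $i);
            $process->run();
        }
    }
}

The jobs still run inside the same PHP process, just after the response has been flushed to the browser.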
Use & in the command to send it to the background, like this:
$command = 'php ' . $kernel->getRootDir() . '/console Educa:ExecuteJob &';

Parallelism in php

I want to optimize part of my code to improve performance. Since my application makes use of a command-line tool, I think it would certainly improve performance to execute some lines of code in parallel rather than sequentially.
<?php
$value = exec("command goes here"); //this takes time
/* Some instructions here that don't depend on $value */
/* Some instructions here that don't depend on $value */
$result = $value * 2; // this is just a dumb example
?>
I want to execute the code that doesn't depend on $value at the same time as the exec() call, so that the whole script executes faster rather than waiting for exec() to complete.
For a quick and dirty way of releasing your PHP process from a blocking exec() call, you can simply append "&" (or "& disown") to the command. In the example below I also redirect all errors and stdout to /dev/null. (I assume a Linux system and just use a simple command that might take some amount of time...)

$command = "mv oldFolder/hugeFile.txt newFolder/hugeFile.txt >> /dev/null 2>&1 &";
$value = exec($command);

If you really need the return value from $command, just remove the >> /dev/null 2>&1 bit.
Unfortunately, PHP has always let you down when it comes to parallelism and concurrent programming, and I have never understood why PHP doesn't support these important things or when it intends to.
But maybe you want to use fork in PHP (if you are aware of the problems and pitfalls of forking):
http://php.net/manual/en/function.pcntl-fork.php
https://codereview.stackexchange.com/questions/22919/how-to-fork-with-php-4-different-approaches
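A minimal sketch of the fork approach applied to the question's example, assuming a *nix host with the pcntl extension; the temporary file path is made up for illustration:

$pid = pcntl_fork();
if ($pid === -1) {
    die("fork failed\n");
} elseif ($pid === 0) {
    // Child process: run the slow command and park its result in a temp file.
    $value = exec("command goes here");
    file_put_contents('/tmp/cmd_result.txt', $value);
    exit(0);
}

// Parent process: the instructions that don't depend on $value run immediately.
/* ... independent work ... */

// Only when $value is finally needed, wait for the child and read the result.
pcntl_waitpid($pid, $status);
$value = file_get_contents('/tmp/cmd_result.txt');
$result = $value * 2; // this is just a dumb example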

Running PHP Process in the background from a Browser on Windows

I'm a little sorry to ask this as I know it has been asked many times before on here, but none of the answers I have found work in my situation. Either I am doing something fundamentally wrong or I am trying to do something that is just not possible.
I need to be able to fork a background process from a PHP file accessed via a web browser (served up by an Apache web server running on Windows). I need the foreground process to finish, and therefore the browser to stop waiting for the server, while the forked process continues in the background. I am using PHP 5.3.
I have tried numerous suggested solutions, all with varying degrees of failure:
shell_exec('D:\php5.3\php.exe sleep.php > out 2>out2' );
Whether run from the command line or through the browser, the foreground process did not complete until the background one did. Adding an "&" at the end didn't seem to make any difference.
pclose(popen("start D:\php5.3\php.exe sleep.php","r"));
This one worked fine from the command line, but when accessed via the browser it waited for both the foreground and the background process to finish.
exec("nohup D:/php5.3/php.exe -f sleep.php > out 2>out2");
This didn't seem to work at all
$commandString = "start /b D:/php5.3/php.exe d:\\webroot\\other\\tests\\sleep.php";
pclose(popen($commandString, 'r'));
Worked from the command line, but waited in the browser.
$WshShell = new COM("WScript.Shell");
$oExec = $WshShell->Run("D:/php5.3/php-win.exe -f d:\\webroot\\other\\tests\\sleep.php", 0, false);
Didn't work at all; it hung when trying to fork the new process.
Can anyone help? Am I missing something really obvious?!
I know I could queue the tasks up in a database and run them in batches, but I need this to operate as close to real time as possible, so I don't want to introduce more delays by queueing things up that can then only run at most once a minute.
You might try the pcntl_fork functions. They can fork a child process (which serves the web page) while the rest of the work continues in the background. It is, however, something that needs to be used with caution; read through the php.net documentation to get a feel for some of the complexities of this approach.
Even though this post is outdated, this may be useful for others.
This worked for me; it took hours to find the solution!
function execInBack($phpscript, $args = array(), $logfile = '', $stdoutfile = '') {
    if (count($args) == 0) {
        $args[] = md5(mt_rand());
    }

    // Quote each argument for the command line.
    $arg = '';
    foreach ($args as $a) {
        $arg .= "\"$a\" ";
    }

    // "start /b" launches the script in the background on Windows.
    $cmd = "start /b c:\php\php5.3.10\php.exe ./$phpscript $arg";

    $p = array();
    $desc = array();
    if ($logfile && $stdoutfile) {
        $desc = array(
            1 => array("file", $stdoutfile, "w"), // stdout
            2 => array("file", $logfile, "a")     // stderr (appended)
        );
    }

    proc_close(proc_open($cmd, $desc, $p, getcwd()));
}
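A hypothetical usage of the helper above (worker.php, its argument and the log file names are made up for illustration):

// Launch worker.php with one argument in the background, writing its stdout to
// worker.out and appending its stderr to worker.log; the request that called
// this returns without waiting for the worker to finish.
execInBack('worker.php', array('job42'), 'worker.log', 'worker.out');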

Error handling for popen

I am using popen in PHP to execute a Tcl file.

$cmd = 'C:/wamp/www/Tcl/bin/tclsh84.exe'; // windows
$ph = popen($cmd, 'w');

But if someone restarts the machine, or the tclsh84.exe process is killed, how do I know this error condition has occurred?
$ph is not returning 0 in these conditions.
Regards,
Mithun
See the documentation of the tasklist utility. This utility is a kind of command-line task manager (with fewer features, since you can only list processes, not act on them). With that utility available on your server, you can use the exec function to get the list of processes and work with that.
Possible code would look something like this:
function isTCLRunning() {
    $output = array();
    // /fi filters on the image name; with a match the CSV output contains a
    // header row plus one row per process, otherwise a single "INFO: ..." line.
    exec('tasklist.exe /fo CSV /fi "IMAGENAME eq tclsh84.exe"', $output);
    return count($output) > 1;
}
Note: this is completely untested; you might want to play a bit with tasklist and make sure it returns the expected output before starting to code your PHP function.
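A hypothetical way to use it around the popen() handle from the question (the check and the restart logic are just illustrative):

// Before writing to the pipe, make sure tclsh84.exe is still alive; if not,
// drop the stale handle and reopen the process.
if (!isTCLRunning()) {
    pclose($ph);
    $ph = popen($cmd, 'w'); // $cmd as in the question
    if ($ph === false) {
        die('Could not restart tclsh84.exe');
    }
}
fwrite($ph, "puts {hello}\n"); // then send Tcl commands as usual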
