PHP - send command line output to dynamically named files

I have an application which is built in CakePHP 3.
It uses Console Commands to execute several intensive processes in the background using cron.
The application consists of 5 individual commands:
src/Command/Stage1Command.php
src/Command/Stage2Command.php
src/Command/Stage3Command.php
src/Command/Stage4Command.php
src/Command/Stage5Command.php
These can be executed manually by running each one individually, e.g. to execute Stage1Command.php:
$ php bin/cake.php stage1
To make them run via Cron, I created a 6th command (src/Command/RunAllCommand.php) which goes through these in order.
// src/Command/RunAllCommand.php
class RunAllCommand extends Command
{
public function execute(Arguments $args, ConsoleIo $io)
{
$stage1 = new Stage1Command();
$this->executeCommand($stage1);
// ...
$stage5 = new Stage5Command();
$this->executeCommand($stage5);
}
}
This works fine so I can now execute everything with 1 command, php bin/cake.php run_all, which will be added as a cron task to automate running the 5 processes.
The problem I'm having is that each of the 5 commands (Stage1Command ... Stage5Command) produces output which appears on standard output in the console.
I need to be able to write the output produced by each of the 5 commands individually into dynamically named files.
So I can't do something like this
$ php bin/cake.php run_all > output.log
Because
output.log would contain everything, i.e. the output from all 5 commands.
output.log isn't a dynamic filename, it has been entered manually on the command line (or as the output destination of the cron task).
I looked at Redirecting PHP output to a text file and tried the following.
Added ob_start(); to RunAllCommand.php:
namespace App\Command;
ob_start();
class RunAllCommand extends Command { ... }
After executing the first task (Stage1Command), I capture ob_get_clean() into a variable called $content:
$stage1 = new Stage1Command();
$this->executeCommand($stage1);
$content = ob_get_clean();
When I var_dump($content); it comes out as an empty string:
string(0) ""
But the output is still produced on the command line when executing php bin/cake.php run_all (RunAllCommand.php).
My plan for the dynamic filename was to generate it with PHP inside RunAllCommand.php, e.g.
// $id is a dynamic ID generated from a database call.
// This $id is being generated inside a foreach() loop so is different on each iteration (hence the dynamic nature of the filename).
$id = 234343;
$filename_stage1 = 'logs/stage1_' . $id . '.txt'; // e.g. "logs/stage1_234343.txt"
Then write $content to the above file, e.g.
file_put_contents($filename_stage1, $content);
So I have 2 problems:
The output is being echoed to the console, and unavailable in $content.
Assuming (1) is fixed, how to "reset" the output buffering such that I can use file_put_contents with 5 different filenames to capture the output for the relevant stage.

On each command file you could use the LogTrait, then log which command is producing output before anything else so the entries are separated per command, or set up the log config with different scopes so that each command writes to a different file. Example of outputting to the cli-debug.log file:
use Cake\Log\LogTrait;
class Stage1Command extends Command
{
use LogTrait;
public function execute(Arguments $args, ConsoleIo $io)
{
$this->log('Stage 1 Output: ', 'debug');
//do stuff
$this->log('output stage 1 stuff', 'debug');
}
}
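If you go the scoped-config route instead, the engine setup could look roughly like this (a minimal sketch; the 'stage1' config name, file name and scope key are placeholders, and the same pattern would be repeated for the other stages):
use Cake\Log\Log;

// in config/bootstrap.php (or config/app.php): one scoped file engine per stage
Log::setConfig('stage1', [
    'className' => 'Cake\Log\Engine\FileLog',
    'path' => LOGS,
    'file' => 'stage1',              // ends up in logs/stage1.log
    'levels' => ['debug', 'info'],
    'scopes' => ['stage1'],
]);
Then in the command, pass the scope as context so only the matching engine picks the message up:
$this->log('output stage 1 stuff', 'debug', ['scope' => ['stage1']]);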

I have two suggestions for solving your issue.
Option 1 - Using shell_exec
shell_exec returns a string of the output, so you can write it to a log file directly.
public function execute(Arguments $args, ConsoleIo $io)
{
$stage1_log = shell_exec('bin/cake stage1 arguments');
file_put_contents('stage1_dynamic_log_file.txt', $stage1_log);
$stage2_log = shell_exec('bin/cake stage2 arguments');
file_put_contents('stage2_dynamic_log_file.txt', $stage2_log);
}
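To make those filenames dynamic, you can build them the same way as in the question before calling shell_exec; a small sketch ($id, the logs/ path and the bin/cake invocation are carried over from the question, and escapeshellarg() keeps the argument shell-safe):
// e.g. fetched from the database inside your foreach loop
$id = 234343;
$stage1_log = shell_exec('bin/cake stage1 ' . escapeshellarg((string)$id));
file_put_contents('logs/stage1_' . $id . '.txt', $stage1_log);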
Option 2 - Override the ConsoleOutput stream
Alternatively, a more CakePHP-style approach is to call the command slightly differently. If you look at the contents of executeCommand(), it does a few checks and then calls $command->run($args, $io).
Also, if you look at how the ConsoleIo is constructed, we can override the output so that instead of php://stdout it writes to a file; the code for ConsoleOutput is just using normal fopen and fwrite.
use Cake\Console\ConsoleIo;
use Cake\Console\ConsoleOutput;
public function execute(Arguments $args, ConsoleIo $io)
{
// File names
$id = 234343;
$filename_stage1 = 'logs/stage1_' . $id . '.txt';
// Create command object
$stage1 = new Stage1Command();
// Define output as this filename
$output = new ConsoleOutput($filename_stage1);
// Create a new ConsoleIo using this new output method
$stage1_io = new ConsoleIo($output);
// Execute passing in the ConsoleIo with text file for output
$this->executeCommand($stage1, ['arguments'], $stage1_io);
}
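If you need this per dynamic ID, the same idea can be wrapped in the foreach from the question; a rough sketch (the Filters table and the argument list passed to each stage are assumptions, adjust to whatever your commands actually expect):
use Cake\Console\ConsoleIo;
use Cake\Console\ConsoleOutput;
use Cake\ORM\TableRegistry;

$FiltersTable = TableRegistry::getTableLocator()->get('Filters');
foreach ($FiltersTable->find() as $filter) {
    // one file per stage and per ID, e.g. logs/stage1_234343.txt
    $filename_stage1 = 'logs/stage1_' . $filter->id . '.txt';
    $stage1_io = new ConsoleIo(new ConsoleOutput($filename_stage1));
    $this->executeCommand(new Stage1Command(), [(string)$filter->id], $stage1_io);
    // ... repeat for stage2 ... stage5, each with its own filename and ConsoleIo
}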

Related

CakePHP 3 - executing multiple Commands from 1 Command and logging errors if they occur

I'm building an application in CakePHP 3.8 which uses Console Commands to execute several processes.
These processes are quite resource intensive so I've written them with Commands because they would easily time-out if executed in a browser.
There are 5 different scripts that do different tasks: src/Command/Stage1Command.php,
... src/Command/Stage5Command.php.
The scripts are being executed in order (Stage 1 ... Stage 5) manually, i.e. src/Command/Stage1Command.php is executed with:
$ php bin/cake.php stage1
All 5 commands accept one parameter - an ID - and then perform some work. This has been set up as follows (the code in buildOptionParser() exists in each command):
class Stage1Command extends Command
{
protected function buildOptionParser(ConsoleOptionParser $parser)
{
$parser->addArgument('filter_id', [
'help' => 'Filter ID must be passed as an argument',
'required' => true
]);
return $parser;
}
}
So I can execute "Stage 1" as follows, assuming 428 is the ID I want to pass.
$ php bin/cake.php stage1 428
Instead of executing these manually, I want to achieve the following:
Create a new Command which loops through a set of Filter ID's and then calls each of the 5 commands, passing the ID.
Update a table to show the outcome (success, error) of each command.
For (1) I have created src/Command/RunAllCommand.php and then used a loop on my table of Filters to generate the IDs, and then execute the 5 commands, passing the ID. The script looks like this:
namespace App\Command;
use Cake\ORM\TableRegistry;
// ...
class RunAllCommand extends Command
{
public function execute(Arguments $args, ConsoleIo $io)
{
$FiltersTable = TableRegistry::getTableLocator()->get('Filters');
$all_filters = $FiltersTable->find()->toArray();
foreach ($all_filters as $k => $filter) {
$io->out($filter['id']);
// execute Stage1Command.php
$command = new Stage1Command(['filter_id' => $filter['id']]);
$this->executeCommand($command);
// ...
// execute Stage5Command.php
$command5 = new Stage5Command(['filter_id' => $filter['id']]);
$this->executeCommand($command5);
}
}
}
This doesn't work. It gives an error:
Filter ID must be passed as an argument
I can tell that the commands are being called because these are my own error messages from buildOptionParser().
This makes no sense because the line $io->out($filter['id']) in RunAllCommand.php is showing that the filter IDs are being read from my database. How do you pass an argument in this way? I'm following the docs on Calling Other Commands (https://book.cakephp.org/3/en/console-and-shells/commands.html#calling-other-commands).
I don't understand how to achieve (2). In each of the Commands I've added code like this wherever an error occurs, which stops execution of the rest of that Command. For example, if this gets executed in Stage1Command it should abort and move on to Stage2Command:
// e.g. this code can be anywhere in execute() in any of the 5 commands where an error occurs.
$io->error('error message');
$this->abort();
If $this->abort() gets called anywhere I need to log this into another table in my database. Do I need to add code before $this->abort() to write this to a database, or is there some other way, e.g. try...catch in RunAllCommand?
Background information: The idea with this is that RunAllCommand.php would be executed via Cron. This means that the processes carried out by each Stage would occur at regular intervals without requiring manual execution of any of the scripts - or passing IDs manually as command parameters.
The arguments sent to the "main" command are not automatically passed to the "sub" commands that you invoke with executeCommand(). The reason is that they might very well be incompatible: the "main" command has no way of knowing which arguments should or shouldn't be passed on, and the last thing you want is a sub command doing something you haven't asked it to do just because of an argument that the main command makes use of.
So you need to pass the arguments that you want your sub commands to receive manually. They go in the second argument of \Cake\Console\BaseCommand::executeCommand(), not the command constructor, which doesn't take any arguments at all (unless you've overridden the base constructor).
$this->executeCommand($stage1, [$filter['id']]);
Note that the arguments array is not associative; the values are passed as single entries, just like PHP would receive them in the $argv variable, i.e.:
['positional argument value', '--named', 'named option value']
With regards to errors, executeCommand() returns the exit code of the command. Calling $this->abort() in your sub command triggers an exception, which is caught in executeCommand() and has its code returned, just like a normal exit code from your sub command's execute() method.
So if you just need to log a failure, then you could simply evaluate the return code, like:
$result = $this->executeCommand($stage1, [$filter['id']]);
// assuming your sub commands do always return a code, and do not
// rely on `null` (ie no return value) being treated as success too
if ($result !== static::CODE_SUCCESS) {
$this->log('Stage 1 failed');
}
If you need additional information to be logged, then you could of course log inside of your sub commands where that information is available, or store error info in the command and expose a method to read it, or throw an exception with error details that your main command could catch and evaluate. However, throwing an exception would not be overly nice when running the commands standalone, so you'll have to figure out what the best option is in your case.
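For point (2), a minimal sketch of recording the outcome per stage based on the return code, assuming a hypothetical CommandLogs table with filter_id, stage and outcome columns:
use Cake\ORM\TableRegistry;

// hypothetical table for storing the outcome of each stage run
$logsTable = TableRegistry::getTableLocator()->get('CommandLogs');

// inside the foreach over the filters in RunAllCommand::execute()
$result = $this->executeCommand(new Stage1Command(), [(string)$filter['id']]);

$entry = $logsTable->newEntity([
    'filter_id' => $filter['id'],
    'stage' => 'stage1',
    // again assuming the stages return a code rather than null
    'outcome' => $result === static::CODE_SUCCESS ? 'success' : 'error',
]);
$logsTable->save($entry);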

Test php file with console input with PHPUnit?

I know how to test classes and functions but I was wondering how to test a file and pass parameters via the console to this file.
For example I have index.php which needs 1 integer num via the console using fgets(STDIN). Can I make a PHPUnit file and test index.php ?
Pretty much any code can be unit tested. In your case, pipe input data to the script with the echo shell command. (How you pass data through the command shell of course depends on the OS.)
use PHPUnit\Framework\TestCase;
class ConsoleAppTest extends TestCase
{
public function testIndexFile()
{
$out = shell_exec("echo 123 | php index.php");
// check standard output or some DB modifications if the script mangles the DB;
// verify_results() is a placeholder for whatever assertion logic fits your script
$is_ok = verify_results($out);
$this->assertSame($is_ok, true);
}
}
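If you would rather assert on the standard output directly instead of going through a helper, a variant inside the same test class could look like this (assertStringContainsString needs PHPUnit 7.5+, older releases use assertContains; the expected substring is only a placeholder for whatever index.php actually prints):
public function testIndexFileOutput()
{
    // pipe the integer to the script exactly as a user would type it
    $out = shell_exec("echo 123 | php index.php");
    $this->assertStringContainsString('123', $out);
}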

Get output of a Symfony command and save it to a file

I'm using Symfony 2.0.
I have created a command in Symfony and I want to take its output and write it to a file.
All I want is to take everything that is written to standard output (the console) and have it in a variable. By "all" I mean things echoed in the command, exceptions caught in other files called by the command, and so on. I want the output both on the screen and in a variable (in order to write the content of the variable to a file). I will do the writing to the file at the end of the execute() method of the command.
Something like this:
protected function execute(InputInterface $input, OutputInterface $output)
{
// some logic and calls to services and functions
echo 'The operation was successful.';
$this->writeLogToFile($file, $output???);
}
And in the file I want to have:
[Output from the calls to other services, if any]
The operation was successful.
Can you please help me?
I tried something like this:
$stream = $output->getStream();
$content = stream_get_contents($stream, 5);
but the command doesn't finish in that way. :(
You could just forward the command output using standard shell methods with php app/console your:command > output.log. Or, if this is not an option, you could introduce a wrapper for the OutputInterface that would write to a stream and then forward calls to the wrapped output.
I needed the same thing. In my case I wanted to email the console output for debugging and auditing, so I made an anonymous PHP class wrapper which stores the line data and then passes it on to the original output instance. This will only work on PHP 7+.
protected function execute(InputInterface $input, OutputInterface $output) {
$loggableOutput = new class {
private $linesData;
public $output;
public function write($data) {
$this->linesData .= $data;
$this->output->write($data);
}
public function writeln($data) {
$this->linesData .= $data . "\n";
$this->output->writeln($data);
}
public function getLinesData() {
return $this->linesData;
}
};
$loggableOutput->output = $output;
//do some work with output
var_dump($loggableOutput->getLinesData());
}
Note this will only store the data written using the write and writeln OutputInterface methods; it will not store any PHP warnings etc.
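Once the command has finished you can do whatever you like with the captured text, e.g. write it to a file before mailing it (the path is just an illustration):
file_put_contents('/tmp/console_output.log', $loggableOutput->getLinesData());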
Sorry for bringing this up again.
I'm in a similar situation and if you browse the code for Symfony versions (2.7 onwards), there already is a solution.
You can easily adapt this to your specific problem:
// use Symfony\Component\Console\Output\BufferedOutput;
// You can use NullOutput() if you don't need the output
$output = new BufferedOutput();
$application->run($input, $output);
// return the output, don't use if you used NullOutput()
$content = $output->fetch();
This should neatly solve the problem.
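To get the output both into a file and on screen, as the question asked, you can reuse the fetched string; a short follow-up sketch (the filename is an example):
// $content already holds everything the command wrote
file_put_contents('command_output.log', $content);
echo $content; // and still show it on the console if you want both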

Is there any way to instantiate CodeIgniter and call a function inside a controller from an outside class?

I am creating a cron job that will run every few minutes; it will read data from the database to see which function needs to be called and act accordingly. Half the crons are written in CodeIgniter and the other half in native PHP.
Is there any way to do this? I have been googling, but the answer I came up with is that it is impossible. I tried changing directory and then including or requiring the index.php from CodeIgniter, in which the function I need to call is defined.
While doing this, if my class is written in native PHP it returns some errors that don't make sense, and if I correct those errors I would say that half the system functions from CodeIgniter would be gone. It would still be questionable whether it would work even then.
If my class is written in CodeIgniter, when I include index.php it just breaks: no errors, no response, or it says that the "ENVIRONMENT" constant is already defined. So I have been looking for a way to undefine those values from the config file or overwrite them to null or an empty string, but nothing works.
If you have any ideas it would be much appreciated.
I saw a question (question link) about some cron jobs in PHP where the user @michal kralik gave an answer about what I am doing in general with the database and one cron class that will call the other crons (I could use help with that too).
BTW, I forgot to mention that using curl and exec will not work, because on our servers they sometimes just stop working for no reason.
UPDATE 1:
This is my class currently, after many tries:
class Unicron extends MY_Controller {
public $config;
function __construct() {
parent::__construct();
}
public function init(){
$config['base_url'] = "http://localhost/test";
define('EXT_CALL', true); // Added EXT_CALL constant to mark external calls
$_GET['controller/method'] = ''; // add pair to $_GET with call route as key
$current = getcwd(); // Save current directory path
chdir('C:/inetpub/wwwroot/test/'); // change directory to CI_INSTALLATION
include_once 'index.php'; // Add index.php (CI) to script
define('BASEPATH', 'C:/inetpub/wwwroot/test/system/');
$this->load->library("../../application/controllers/controller.php");
$job = new $this->controller->Class();
$isDone = $job->exportExcel(somekey);
echo $isDone;
$CI =& get_instance(); // Get instance of CI
$CI->exportExcel('baseparts', 'exportExcel');
// FOR STATIC CALLING!!
//$CI->method('controller','method');
//replace controller and method with call route
// eg: $CI->list('welcome','list'); If calling welcome/list route.
//$OUT->_display(); // to display output. (quick one)
// Or if you need output in variable,
//$output = $CI->load->view('VIEW_NAME',array(),TRUE);
//To call any specific view file (bit slow)
// You can pass variables in array. View file will pick those as it works in CI
chdir($current); // Change back to current directory
echo $current;
}
}
Where I try to define BASEPATH, it does not define it nor override the previous value.
In the index.php of the other CodeIgniter installation I put:
if(!defined('ENVIRONMENT'))
define('ENVIRONMENT', 'development');
This way I resolved my issue with the "ENVIRONMENT already defined" error.
These are a few things I found and combined together, hoping it could work, but when I call it via the command line it still shows nothing (I even tried echoing things everywhere, and nothing).
This may be a long comment rather than an answer, as the code supplied requires a lot of work to make it useful.
The approach: run multiple instances of 'codeigniter', executed from codeigniter, using PHP's 'execute programs via the shell' facilities. Each instance runs in its own environment.
There are some excellent answers already available:
By default the PHP 'shell' commands wait for the command to complete...
732832/php-exec-vs-system-vs-passthru.
However, quite often we want to 'fire and forget', so this answer is also quite useful...
1019867/is-there-a-way-to-use-shell-exec-without-waiting-for-the-command-to-complete
All I did was use this to show an example of how to use 'codeigniter' for it. The example is the 'Hello World' CLI example from the user manual. The version of CI is 2.1.14. I haven't used 'ci' before.
It is tested and works on PHP 5.3.18 on Windows XP.
As well as the usual 'Hello World' example, I included a command that uses 'sleep' for a total of 20 seconds, so that we can easily see that the 'ci' instances are separate from each other while executing.
Examples:
<?php
class Tools extends CI_Controller {
// the usual 'hello world' program
public function message($to = 'World')
{
echo "Hello {$to}!".PHP_EOL;
}
// so you can see that the processes are independant and 'standalone'
// run for 20 seconds and show progress every second.
public function waitMessage($to = 'World')
{
$countDown = 20;
while ($countDown >= 0) {
echo "Hello {$to}! - ending in {$countDown} seconds".PHP_EOL;
sleep(1);
$countDown--;
}
}
}
'ci' code to run 'ci' code...
<?php
class Runtools extends CI_Controller {
/*
* Executing background processes from PHP on Windows
* http://www.somacon.com/p395.php
*/
// spawn a process and do not wait for it to complete
public function runci_nowait($controller, $method, $param)
{
$runit = "php index.php {$controller} {$method} {$param}" ;
pclose(popen("start \"{$controller} {$method}\" {$runit}", "r"));
return;
}
// spawn a process and wait for the output.
public function runci_wait($controller, $method, $param)
{
$runit = "php index.php {$controller} {$method} {$param}";
$output = exec("{$runit}");
echo $output;
}
}
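On a Unix-like host, a rough equivalent of the 'fire and forget' spawn (a sketch, not part of the original answer) would redirect output and background the process:
// spawn a process on Linux/macOS and do not wait for it to complete
public function runci_nowait_unix($controller, $method, $param)
{
    $runit = "php index.php {$controller} {$method} {$param}";
    exec("{$runit} > /dev/null 2>&1 &");
}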
How to run them from the cli...
To run the 'ci' 'nowait' routine, do:
php index.php runtools runci_nowait <controller> <method> <param>
where the parameters are the CI controller, method and parameter you want to run. Change to 'runci_wait' for the other one.
'Hello World' - 'wait for output' - (ci: tools message)
codeigniter>php index.php runtools runci_wait tools message ryan3
Hello ryan3!
The waitMessage example - 'do not wait for output' - (ci: tools waitMessage)
codeigniter>php index.php runtools runci_nowait tools waitMessage ryan1
codeigniter>php index.php runtools runci_nowait tools waitMessage ryan2
These will start and run two separate 'ci' processes.

autoloader when executing php from linux bash

I'm currently working on some sort of upload with automatic video conversion. At the moment I am executing a PHP script via a shell command after the upload is finished, so the user doesn't have to wait until the conversion is completed. Like so:
protected function _runConversionScript() {
if (!exec("php -f '" . $this->_conversionScript . "' > /dev/null &"))
return true;
return false;
}
Now, in my conversion script I am using functions from another class, "UploadFunctions", to update the status in the database (like started, converted, finished...). The problem is that this UploadFunctions class inherits from another class, "Controller", where for example the database connection gets established. Currently I am using spl_autoload to search specific directories for the files needed (for example controller.php), but because the conversion script runs outside the context of the whole autoloader setup, it doesn't recognize the Controller class and throws a fatal PHP error.
Here is some code from the conversion script:
require_once('uploadfunctions.php');
$upload_func = new UploadFunctions();
// we want to make sure we only process videos that haven't already
// been or are being processed
$where = array(
'status' => 'queued'
);
$videos = $upload_func->getVideos($where);
foreach ($videos as $video) {
// update database to show that these videos are being processed
$update = array(
'id' => $video['id'],
'status' => 'started'
);
// execute update
$upload_func->updateVideo($update);
.........
Am I doing this completely wrong, or is there a better way to accomplish this? If you need more code or information please let me know!
Thanks a lot
Here is my spl_autoload code:
<?php
spl_autoload_register('autoloader');
function autoloader($class_name) {
$class_name = strtolower($class_name);
$pos = strpos($class_name ,'twig');
if($pos !== false){
return false;
}
$possibilities = array(
'..'.DIRECTORY_SEPARATOR.'globals'.DIRECTORY_SEPARATOR.$class_name.'.php',
'controller'.DIRECTORY_SEPARATOR.$class_name.'.php',
'..'.DIRECTORY_SEPARATOR.'libs'.DIRECTORY_SEPARATOR.$class_name.'.php',
'local'.DIRECTORY_SEPARATOR.$class_name.'.php'
);
foreach ($possibilities as $file) {
if(class_exists($class_name) != true) {
if (file_exists($file)) {
include_once($file);
}
}
}
}
?>
I have my project divided into subfolders which represent the functionality, for example upload, myaccount and gallery. In every subfolder there are also 2 other folders: controller and local. Controller holds the class controlling this part (upload for example) and local is the folder where I put the local classes which are needed. The controller class gets called from the index.php which is located in the sub-project folder. "libs" and "globals" are just project-wide classes, like database, user and so on.
This is an example of my folder structure:
www/index.php // main site
www/upload/index.php // calls the controller for upload and initializes the spl_autoload
www/upload/controller/indexcontroller.php // functionality for the upload
www/upload/local/processVideo.php // this is the conversion script.
I am fairly new to the spl_autoload function. As far as I can tell, the autoloader is not getting registered when my script is invoked with "php -f processVideo.php", is it?
PHP relative paths are resolved against the current working directory of the process (i.e. where the PHP binary is invoked), not against the file they appear in.
I suggest you use the __DIR__ constant to avoid that behavior:
http://php.net/manual/en/language.constants.predefined.php
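Applied to the autoloader above, that means anchoring the candidate paths on __DIR__ (the directory of the file containing the autoloader) instead of the current working directory; roughly:
$possibilities = array(
    __DIR__ . DIRECTORY_SEPARATOR . '..' . DIRECTORY_SEPARATOR . 'globals' . DIRECTORY_SEPARATOR . $class_name . '.php',
    __DIR__ . DIRECTORY_SEPARATOR . 'controller' . DIRECTORY_SEPARATOR . $class_name . '.php',
    __DIR__ . DIRECTORY_SEPARATOR . '..' . DIRECTORY_SEPARATOR . 'libs' . DIRECTORY_SEPARATOR . $class_name . '.php',
    __DIR__ . DIRECTORY_SEPARATOR . 'local' . DIRECTORY_SEPARATOR . $class_name . '.php',
);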
I was actually able to resolve the issue. I had to include the spl_autoload_register function inside the conversion script so that it was able to locate the files. This was an issue because the conversion script is not built into my framework, and so it isn't able to load the classes via the framework autoloader.
