How to run exec() / Linux commands using the Gearman extension from a CakePHP application? - php

I installed the Gearman extension and the Gearman command-line tool. I tried to reverse a string using Gearman from a simple PHP file.
Example:
$gmclient= new GearmanClient();
$gmclient->addServer();
$result = $gmclient->doNormal("reverse", "Test the reverse string");
echo "Success: $result\n";
output:
Success: gnirts esrever eht tseT
In the same way I tried to run exec('ls -l'). I am able to execute it using simple PHP files placed in the CakePHP webroot directory. File paths: cakephp/app/webroot/worker.php and cakephp/app/webroot/client.php.
worker.php
<?php
$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction("exec", "executeScript");
while ($worker->work());

function executeScript($job)
{
    $param = $job->workload();
    $t = exec($param);
    return $t;
}
?>
client.php
<?php
$client= new GearmanClient();
$client->addServer();
$cmd = 'ls -l';
print $client->do("exec", $cmd);
?>
How do I implement the same type of execution using a View and Controller in CakePHP?
Workflow: post data from the View to the Controller using an AJAX call, execute "exec() from Gearman", and send the output back to the View as the response to the AJAX POST method.

Why are you using exec?! That brings a huge security risk. Use DirectoryIterator instead.
Your client code should be part of the controller.
<?php
class UploadController extends AppController
{
    public function directoryList()
    {
        $directory = '';
        // Get data
        if (!empty($this->data['directory']) && is_string($this->data['directory'])) {
            $directory = $this->data['directory'];
        }
        $client = new GearmanClient();
        $client->addServer("localhost", 4730); // Important!!!
        $result = $client->do("fileList", serialize($directory));
        return $result;
    }
}
Then from the view use requestAction.
$uploads = $this->requestAction(
    array('controller' => 'upload', 'action' => 'directoryList'),
    array('return')
);
Worker could look like this:
<?php
$worker = new GearmanWorker();
$worker->addServer("localhost", 4730); // Important!!!
$worker->addFunction("fileList", "getFileList");
while ($worker->work());

// From Art of Web
// http://www.the-art-of-web.com/php/directory-list-spl/
function getFileList($job)
{
    // array to hold return value
    $retval = array();
    // the client serialized the directory name, so unserialize the workload
    $dir = unserialize($job->workload());
    // add trailing slash if missing
    if (substr($dir, -1) != "/") $dir .= "/";
    // open directory for reading
    $d = new DirectoryIterator($dir) or die("getFileList: Failed opening directory $dir for reading");
    foreach ($d as $fileinfo) {
        // skip hidden files
        if ($fileinfo->isDot()) continue;
        $retval[] = array(
            'name' => "{$dir}{$fileinfo}",
            'type' => ($fileinfo->getType() == "dir") ?
                "dir" : mime_content_type($fileinfo->getRealPath()),
            'size' => $fileinfo->getSize(),
            'lastmod' => $fileinfo->getMTime()
        );
    }
    // a Gearman job must return a string, so serialize the array
    return serialize($retval);
}
This is pseudo code. Do not use it in production!!! See the Gearman documentation for a more advanced worker setup.
To actually take advantage of load distribution, the Gearman server should of course not be on localhost.
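For example, pointing both the client and the worker at a dedicated job server is a one-line change on each side (the host names below are hypothetical):

// hypothetical host names; replace with your real Gearman server(s)
$client = new GearmanClient();
$client->addServer("gearman.example.com", 4730);

$worker = new GearmanWorker();
$worker->addServer("gearman.example.com", 4730);
// several servers can also be listed at once:
// $worker->addServers("gearman1.example.com:4730,gearman2.example.com:4730");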

Your worker.php needs to already be running on a server for this to work. For testing, open up a new terminal window to the server where you want worker.php to run. Start the worker with php worker.php on the command line. (On a production server, you might want to look at supervisor to run your worker without a terminal.)
The code in client.php would go in your controller, but save the result to a variable instead of printing it.
The fact that this would be from an AJAX call is irrelevant; it will work the same as a normal web page. When the controller executes, the Gearman client code will get a response from the worker, and you can output the result to the view.
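As a rough sketch of that (CakePHP 2.x style; the controller name, action name and the cmd request field are made up for illustration, and passing raw shell commands from the browser carries the security risk already mentioned above):

<?php
// app/Controller/GearmanController.php -- hypothetical controller name
class GearmanController extends AppController {

    public function runCommand() {
        $this->autoRender = false; // no view template, return raw JSON

        $result = '';
        if ($this->request->is('post') && !empty($this->request->data['cmd'])) {
            $client = new GearmanClient();
            $client->addServer(); // defaults to 127.0.0.1:4730
            // "exec" must match the function name registered in worker.php
            $result = $client->doNormal('exec', $this->request->data['cmd']);
        }

        // send the worker's output back as the AJAX response
        $this->response->type('json');
        $this->response->body(json_encode(array('output' => $result)));
        return $this->response;
    }
}

The AJAX call would then POST its cmd field to this action and read output from the JSON response.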

Related

Scheduling Windows job to run a PHP script failing

I have a PHP script which exports data from a SQL table into Drupal content. I would like to schedule the job as a Windows task and followed the steps below:
My PHP file contains the code below:
<?php
namespace Drupal\Import\Commands;

use Drush\Commands\DrushCommands;
use Drupal\node\Entity\Node;

class ImportCommands extends DrushCommands
{
    public function loadData()
    {
        $tx = \Drupal::database()->startTransaction();
        try {
            $query_result = \Drupal::entityQuery('node')
                ->condition('type', 'test_mod')
                ->execute();
            entity_delete_multiple('node', $query_result);
            $database = \Drupal::database();
            $result = $database->query("SELECT * FROM my_sql_table");
            $records = $result->fetchAll();
            foreach ($records as $key => $record) {
                $node = Node::create(['type' => 'test_mod']);
                $node->set('title', $key);
                $node->set('field_IDVal', $record->ID);
                $node->set('field_name', $record->Name);
                $node->status = 1;
                $node->enforceIsNew();
                $node->save();
            }
        }
        catch (\Exception $e) {
            $tx->rollBack();
            // log/report failure
        }
        // $tx->commit();
    }
}
Created a .bat file with the below command:
call C:\MyProj\web\Commands\php.exe -f C:\MyProj\web\Commands\ExportCommands.php
Created a Windows task linked to this file.
All I am seeing is 0x1 with the task reported as a success. Any help to make this work?!
Expected behavior: I want to create a Windows task that calls this PHP file, and the command in the file loads data from the SQL table into Drupal.
Right now I execute the file by going to the command prompt and running: drush MyProj:loaddata
Any help?!
php.exe is in the wrong place. Move it back to its original location.

PHP Eval alternative to include a file

I am currently running a queue system with beanstalk + supervisor + PHP.
I would like my workers to automatically die when a new version is available (basically code update).
My current code is as follows:
class Job1Controller extends Controller
{
    public $currentVersion = 5;

    public function actionIndex()
    {
        while (true) {
            // check if a new version of the worker is available
            $file = '/config/params.php';
            $paramsContent = file_get_contents($file);
            $params = eval('?>' . file_get_contents($file));
            if ($params['Job1Version'] != $this->currentVersion) {
                echo "not the same version, exit worker \n";
                sleep(2);
                exit();
            } else {
                echo "same version, continue processing \n";
            }
        }
    }
}
When I update the code, the params file will change with a new version number, which will force the worker to terminate. I cannot use include, as the file would stay loaded in memory inside the while loop. Knowing that the params.php file isn't critical in terms of security, I wanted to know if there is another way of doing this?
Edit: the params.php looks as follows:
<?php
return [
'Job1Version' => 5
];
$params = require($file);
Since your file has a return statement, the returned value will be passed along.
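Applied to the loop in the question, a minimal sketch (same file path and version check as above) would be:

while (true) {
    // require (not require_once) re-executes params.php on every pass,
    // and its return statement hands back the fresh array
    $params = require '/config/params.php';
    if ($params['Job1Version'] != $this->currentVersion) {
        echo "not the same version, exit worker \n";
        exit();
    }
    // ... process jobs ...
}

One caveat: with OPcache enabled the file may be served from cache for a short while, so the change might not be picked up immediately depending on your opcache.revalidate_freq setting.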
After a few tests I finally managed to find a solution which doesn't require versioning anymore.
$reflectionClass = new \ReflectionClass($this);
$lastUpdatedTimeOnStart = filemtime($reflectionClass->getFileName());

while (true) {
    clearstatcache();
    $reflectionClass = new \ReflectionClass($this);
    $lastUpdatedTime = filemtime($reflectionClass->getFileName());
    if ($lastUpdatedTime != $lastUpdatedTimeOnStart) {
        // An update has been made, exit
    } else {
        // worker hasn't been modified since running
    }
}
Whenever the file is updated, the worker will automatically exit.
Thanks to @Rudie who pointed me in the right direction.

Instantiating all classes in directory

I'm using Laravel and creating artisan commands but I need to register each one in start/artisan.php by calling
Artisan::add(new MyCommand);
How can I take all files in a directory (app/commands/*) and instantiate every one of them in an array? I'd like to get something like this (pseudocode):
$my_commands = [new Command1, new Command2, new Command3];
foreach ($my_commands as $command) {
    Artisan::add($command);
}
Here is a way to auto-register artisan commands. (This code was adapted from the Symfony Bundle auto-loader.)
function registerArtisanCommands($namespace = '', $path = 'app/commands')
{
    $finder = new \Symfony\Component\Finder\Finder();
    $finder->files()->name('*Command.php')->in(base_path().'/'.$path);
    foreach ($finder as $file) {
        $ns = $namespace;
        if ($relativePath = $file->getRelativePath()) {
            $ns .= '\\'.strtr($relativePath, '/', '\\');
        }
        $class = $ns.'\\'.$file->getBasename('.php');
        $r = new \ReflectionClass($class);
        if ($r->isSubclassOf('Illuminate\\Console\\Command') && !$r->isAbstract() && !$r->getConstructor()->getNumberOfRequiredParameters()) {
            \Artisan::add($r->newInstance());
        }
    }
}
registerArtisanCommands();
registerArtisanCommands();
If you put that in your start/artisan.php file, all commands found in app/commands will be automatically registered (assuming you follow Laravel's recommendations for command and file names). If you namespace your commands like I do, you can call the function like so:
registerArtisanCommands('App\\Commands');
(This does add a global function, and a better way to do this would probably be creating a package. But this works.)
<?php
$contents = scandir('dir_path');
$files = array();
foreach ($contents as $content) {
    // skip hidden files and the . / .. entries
    if (substr($content, 0, 1) == '.') {
        continue;
    }
    $files[] = 'dir_path' . $content;
}
That reads the contents of a folder, iterates over it and saves each filename, including its path, in the $files array. Hope this is what you're looking for.
PS: I'm not familiar with Laravel or Artisan, so if you have to use specific semantics (like camelCase) to register them, please tell me.
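To tie this back to the question, those filenames could then be mapped to classes and registered with Artisan, assuming each file defines a class named after the file (a rough sketch, untested against any particular Laravel version):

foreach ($files as $file) {
    require_once $file;
    // assumes e.g. app/commands/MyCommand.php defines class MyCommand
    $class = basename($file, '.php');
    if (class_exists($class)) {
        Artisan::add(new $class);
    }
}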

Gearman PHP, sendComplete has no effect

Have successfully connected Gearman to an existing PHP project. Using supervisord to ensure that the workers are running, it has produced pretty good results!
I have a critical issue, however, in that the "setCompleteCallback" is not working at all.
Split up somewhat like this:
Client
$client = new GearmanClient();
$client->addServer();
$client->setCompleteCallback(
    array( 'LDPE_Service_AWSConnect_Transfer_Target', 'transferComplete' ) );

// push core to S3 bucket
$target = new LDPE_Service_AWSConnect_Transfer_Target( $transaction->id,
    "/usr/local/include/LDP/", LDPE_Service_S3::BUCKET_CORE );

// push S3 bucket to instances
foreach( $aws_target_list as $dns )
{
    $target->addChildRequest(
        new LDPE_Service_AWSConnect_Transfer_Target( $transaction->id,
            null, LDPE_Service_S3::BUCKET_CORE, $dns )
    );
}

$client->addTaskBackground( 'transferStart', serialize( $target ) );
$client->runTasks();
Worker
(basically bootstraps a Zend Framework environment, and loads the exec functions)
include 'bootstrap.php';
ini_set('memory_limit', -1);

$worker = new GearmanWorker();
$worker->addServer();
$worker->addFunction( 'transferStart', array(
    'LDPE_Service_AWSConnect_Transfer_Target', 'transferStart' ) );

while ($worker->work())
{
    switch( $worker->returnCode() )
    {
        case GEARMAN_SUCCESS:
            break;
        default:
            echo "ERROR RET: " . $worker->returnCode() . "\n";
            exit;
    }
}
Finally, here's the LDPE_Service_AWSConnect_Transfer_Target class that contains all of the heavy lifting. I've pruned out all of the logic, and it doesn't fire at all.
Implementation Methods
class LDPE_Service_AWSConnect_Transfer_Target {

    public static function transferStart( GearmanJob $job )
    {
        $workload = $job->workload();
        $target = unserialize( $workload );
        echo "transferStart/begin [ " .
            $target->getShortRepresentation() . " ]\n";
        // perform a series of actions
        echo "transferStart/complete [ " .
            $target->getShortRepresentation() . " ]\n";
        return serialize( $target );
    }

    public static function transferComplete( GearmanTask $task )
    {
        echo "transferComplete/begin\n";
        $workload = $task->data();
        $parent_target = unserialize( $workload );
        echo "transferComplete/complete\n";
    }
}
To be clear, then: the "transferStart/begin" and "transferStart/complete" strings are correctly printed to the logs; however, transferComplete/begin is never fired. What's going on?
Thanks!
Alex
It seems as though the callbacks don't fire when run in background mode.
Try setting the callback after your call to the process function
$client->addTaskBackground('my_task', 'payload');
$client->setCompleteCallback('complete');
$client->runTasks();
I had tried that; it really boiled down to having the client run as a Gearman task itself. The client was being invoked as part of a browser-invoked page, and the callback wasn't being honored in that context. The solution was to move the client code that schedules the callbacks into a Gearman-run method: I added a "scheduleXXXX" function to the worker, which pretty much called the flow above. This function received the "normal" function's input, serialized.
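In other words, something along these lines (a rough sketch; the scheduleTransfers function name is made up, the rest mirrors the client code above):

// worker side: register the scheduling function alongside transferStart
$worker->addFunction('scheduleTransfers', function (GearmanJob $job) {
    $client = new GearmanClient();
    $client->addServer();
    $client->setCompleteCallback(
        array( 'LDPE_Service_AWSConnect_Transfer_Target', 'transferComplete' ) );
    // foreground task, so the complete callback fires during runTasks()
    $client->addTask('transferStart', $job->workload());
    $client->runTasks();
});

// the browser-invoked page just queues the scheduler and returns immediately
$front = new GearmanClient();
$front->addServer();
$front->doBackground('scheduleTransfers', serialize($target));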

Running a Zend Framework action from command line

I would like to run a Zend Framework action from the command line to generate some files. Is this possible, and how much change would I need to make to my existing web project that is using ZF?
Thanks!
UPDATE
You can have all this code adapted for ZF 1.12 from https://github.com/akond/zf-cli if you like.
While solution #1 is OK, sometimes you want something more elaborate, especially if you are expecting to have more than just one CLI script.
If you allow me, I would propose another solution.
First of all, have in your Bootstrap.php
protected function _initRouter ()
{
    if (PHP_SAPI == 'cli')
    {
        $this->bootstrap ('frontcontroller');
        $front = $this->getResource('frontcontroller');
        $front->setRouter (new Application_Router_Cli ());
        $front->setRequest (new Zend_Controller_Request_Simple ());
    }
}
This method takes dispatching control away from the default router in favour of our own router, Application_Router_Cli.
Incidentally, if you have defined your own routes in _initRoutes for your web interface, you would probably want to neutralize them when in command-line mode.
protected function _initRoutes ()
{
    $router = Zend_Controller_Front::getInstance ()->getRouter ();
    if ($router instanceof Zend_Controller_Router_Rewrite)
    {
        // put your web-interface routes here, so they do not interfere
    }
}
The Application_Router_Cli class (I assume you have autoloading switched on for the Application prefix) may look like this:
class Application_Router_Cli extends Zend_Controller_Router_Abstract
{
    public function route (Zend_Controller_Request_Abstract $dispatcher)
    {
        $getopt = new Zend_Console_Getopt (array ());
        $arguments = $getopt->getRemainingArgs ();
        if ($arguments)
        {
            $command = array_shift ($arguments);
            if (! preg_match ('~\W~', $command))
            {
                $dispatcher->setControllerName ($command);
                $dispatcher->setActionName ('cli');
                unset ($_SERVER ['argv'] [1]);
                return $dispatcher;
            }
            echo "Invalid command.\n", exit;
        }
        echo "No command given.\n", exit;
    }

    public function assemble ($userParams, $name = null, $reset = false, $encode = true)
    {
        echo "Not implemented\n", exit;
    }
}
Now you can simply run your application by executing
php index.php backup
In this case the cliAction method of the BackupController controller will be called.
class BackupController extends Zend_Controller_Action
{
    function cliAction ()
    {
        print "I'm here.\n";
    }
}
You can even go ahead and modify the Application_Router_Cli class so that it is not the "cli" action that is taken every time, but whatever the user has chosen through an additional parameter.
And one last thing: define a custom error handler for the command-line interface so you won't be seeing any HTML code on your screen.
In Bootstrap.php
protected function _initError ()
{
    $error = $frontcontroller->getPlugin ('Zend_Controller_Plugin_ErrorHandler');
    $error->setErrorHandlerController ('index');
    if (PHP_SAPI == 'cli')
    {
        $error->setErrorHandlerController ('error');
        $error->setErrorHandlerAction ('cli');
    }
}
In ErrorController.php
function cliAction ()
{
    $this->_helper->viewRenderer->setNoRender (true);
    foreach ($this->_getParam ('error_handler') as $error)
    {
        if ($error instanceof Exception)
        {
            print $error->getMessage () . "\n";
        }
    }
}
It's actually much easier than you might think. The bootstrap/application components and your existing configs can be reused with CLI scripts, while avoiding the MVC stack and the unnecessary weight that is invoked in an HTTP request. This is one advantage of not using wget.
Start your script as you would your public index.php:
<?php
// Define path to application directory
defined('APPLICATION_PATH')
|| define('APPLICATION_PATH',
realpath(dirname(__FILE__) . '/../application'));
// Define application environment
defined('APPLICATION_ENV')
|| define('APPLICATION_ENV',
(getenv('APPLICATION_ENV') ? getenv('APPLICATION_ENV')
: 'production'));
require_once 'Zend/Application.php';
$application = new Zend_Application(
APPLICATION_ENV,
APPLICATION_PATH . '/configs/config.php'
);
//only load resources we need for script, in this case db and mail
$application->getBootstrap()->bootstrap(array('db', 'mail'));
You can then proceed to use ZF resources just as you would in an MVC application:
$db = $application->getBootstrap()->getResource('db');
$row = $db->fetchRow('SELECT * FROM something');
If you wish to add configurable arguments to your CLI script, take a look at Zend_Console_Getopt
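For example, a minimal sketch (the option names here are arbitrary) of declaring a couple of options and reading whatever positional arguments are left over:

$getopt = new Zend_Console_Getopt(array(
    'verbose|v' => 'print extra output',
    'env|e=s'   => 'application environment (string)',
));
$getopt->parse();

if ($getopt->getOption('v')) {
    echo "Verbose mode on\n";
}
$env  = $getopt->getOption('e');     // value of --env / -e, or null
$args = $getopt->getRemainingArgs(); // positional arguments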
If you find that you have common code that you also call in MVC applications, look at wrapping it up in an object and calling that object's methods from both the MVC and the command line applications. This is general good practice.
Just saw this one get tagged in my CP. If you stumbled onto this post and are using ZF2, it's gotten MUCH easier. Just edit your module.config.php's routes like so:
/**
 * Router
 */
'router' => array(
    'routes' => array(
        // .. these are your normal web routes, look further down
    ),
),

/**
 * Console Routes
 */
'console' => array(
    'router' => array(
        'routes' => array(
            /* Sample Route */
            'do-cli' => array(
                'options' => array(
                    'route' => 'do cli',
                    'defaults' => array(
                        'controller' => 'Application\Controller\Index',
                        'action' => 'do-cli',
                    ),
                ),
            ),
        ),
    ),
),
Using the config above, you would define doCliAction in your IndexController.php under your Application module. Running it is cake, from the command line:
php index.php do cli
Done!
Way smoother.
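The matching action is just a normal controller action that returns the text to print (a minimal sketch, assuming the stock Application module skeleton):

<?php
// module/Application/src/Application/Controller/IndexController.php
namespace Application\Controller;

use Zend\Mvc\Controller\AbstractActionController;

class IndexController extends AbstractActionController
{
    public function doCliAction()
    {
        // a string returned from a console action is written to stdout
        return "do-cli ran successfully" . PHP_EOL;
    }
}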
akond's solution above is on the right track, but there are some subtleties that may make his script not work in your environment. Consider these tweaks to his answer:
Bootstrap.php
protected function _initRouter()
{
    if( PHP_SAPI == 'cli' )
    {
        $this->bootstrap( 'FrontController' );
        $front = $this->getResource( 'FrontController' );
        $front->setParam('disableOutputBuffering', true);
        $front->setRouter( new Application_Router_Cli() );
        $front->setRequest( new Zend_Controller_Request_Simple() );
    }
}
_initError would probably barf as written above; the error handler plugin is probably not yet instantiated unless you've changed the default config.
protected function _initError ()
{
    $this->bootstrap( 'FrontController' );
    $front = $this->getResource( 'FrontController' );
    $front->registerPlugin( new Zend_Controller_Plugin_ErrorHandler() );
    $error = $front->getPlugin( 'Zend_Controller_Plugin_ErrorHandler' );
    $error->setErrorHandlerController('index');
    if (PHP_SAPI == 'cli')
    {
        $error->setErrorHandlerController ('error');
        $error->setErrorHandlerAction ('cli');
    }
}
You probably also want to handle more than one parameter from the command line; here's a basic example:
class Application_Router_Cli extends Zend_Controller_Router_Abstract
{
    public function route (Zend_Controller_Request_Abstract $dispatcher)
    {
        $getopt = new Zend_Console_Getopt (array ());
        $arguments = $getopt->getRemainingArgs();
        if ($arguments)
        {
            $command = array_shift( $arguments );
            $action = array_shift( $arguments );
            if(!preg_match ('~\W~', $command) )
            {
                $dispatcher->setControllerName( $command );
                $dispatcher->setActionName( $action );
                $dispatcher->setParams( $arguments );
                return $dispatcher;
            }
            echo "Invalid command.\n", exit;
        }
        echo "No command given.\n", exit;
    }

    public function assemble ($userParams, $name = null, $reset = false, $encode = true)
    {
        echo "Not implemented\n", exit;
    }
}
Lastly, in your controller, the action that you invoke can make use of the params that were left over after the CLI router stripped off the controller and action:
public function echoAction()
{
    // disable rendering as required
    $database_name = $this->getRequest()->getParam(0);
    $udata = array();
    if( ($udata = $this->getRequest()->getParam( 1 )) )
        $udata = explode( ",", $udata );
    echo $database_name;
    var_dump( $udata );
}
You could then invoke your CLI command with:
php index.php Controller Action ....
For example, as above:
php index.php Controller echo database123 this,becomes,an,array
You'll want to implement more robust filtering/escaping, but it's a quick building block. Hope this helps!
One option is that you could fudge it by doing a wget on the URL that you use to invoke the desired action.
You can use the -O option of wget to save the output, but wget is clearly NOT the solution. Prefer using the CLI instead.
akond's idea works great, except that the error exception isn't rendered by the error controller.
public function cliAction() {
    $this->_helper->layout->disableLayout();
    $this->_helper->viewRenderer->setNoRender(true);
    foreach ($this->_getParam('error_handler') as $error) {
        if ($error instanceof Exception) {
            print "cli-error: " . $error->getMessage() . "\n";
        }
    }
}
and in Application_Router_Cli, comment out the echo and exit statement:
public function assemble($userParams, $name = null, $reset = false, $encode = true) {
    //echo "Not implemented\n";
}
You can just use PHP as you would normally from the command line. If you call a script with PHP and set the action in your script, you can then run whatever you want.
It would be quite simple, really.
It's not really the intended usage, but this is how it could work if you wanted to.
For example
php script.php
Read here: http://php.net/manual/en/features.commandline.php
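A minimal sketch of such a script (the file name and action names are made up), picking an action from the command line via $argv:

<?php
// script.php -- run as: php script.php generate
$action = isset($argv[1]) ? $argv[1] : 'help';

switch ($action) {
    case 'generate':
        // bootstrap your ZF application here, then generate the files
        echo "generating files...\n";
        break;
    default:
        echo "usage: php script.php generate\n";
}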
You can use the wget command if your OS is Linux. For example:
wget http://example.com/controller/action
See http://linux.about.com/od/commands/l/blcmdl1_wget.htm
UPDATE:
You could write a simple bash script like this:
if wget http://example.com/controller/action; then
    echo "Hello World!" > /home/wasdownloaded.txt
else
    # crap, wget timed out, let's remove the file
    rm /home/wasdownloaded.txt
fi
Then you can do in PHP:
if (true === file_exists('/home/wasdownloaded.txt')) {
    // check that the wget call succeeded
}
Hope this helps.
I have used the wget command:
wget http://example.com/module/controller/action -O /dev/null
Use -O /dev/null if you don't want to save the output.
