HHVM - Running resource-intensive PHP daemons - php

Hi, I'm trying to use HHVM to run all of the background PHP workers currently in my application. I don't want to run HHVM as a server, since Apache is already taking care of that; all I want to do is run my PHP code with HHVM instead of the regular Zend engine.
Here is the code I want to run.
This is the entry point of the computationally intensive modules I want to run:
-------------**RunRenderer.php**--------------
#!/usr/bin/php
<?php
require_once 'Config.php';
require_once 'Renderer.php';
Renderer::getInstance()->run();
?>
Here is just a small portion of the main controller that controls/forks/manages thousands of PHP tasks/processes.
----------------**Renderer.php**----------------
<?php
require 'Workers/BumpMapsCalc.php';
/**
* Main Entry class of the Map rendering module
*
* Create workers for all of the different maps calc sub routines
*
*
*
*/
class Renderer extends \Core_Daemon {
/**
* the interval at which the execute method will run
*
* Interval : 10 min
*
*/
protected $loop_interval = 600;
/**
* Set the chunk size
*/
protected $chunkSize = 500;
/**
* Loop counter
*/
protected $loopCounter;
/**
* Low limit and the high limit
*/
protected $lowLimit;
protected $highLimit;
/**
* set the plugins for lock file and settings ini files
*
*/
protected function setup_plugins() {
$this->plugin('Lock_File');
$this->plugin('settings', new \Core_Plugin_Ini());
$this->settings->filename = BASE_PATH . "/Config/settings.ini";
$this->settings->required_sections = array('geometry');
}
protected function setup() {
$this->log("Computing Bumps Maps");
}
/**
* Create multiple separate tasks that will run in parallel
* Provide the low limit and the high limit which should effectively partition
* the whole table into more manageable chunks , thus making importing and
* storing data much faster and finished within 10 min
*
*/
protected function execute() {
for ($this->loopCounter = 1 ; $this->loopCounter <= $this->settings['geometry']['number'] ; $this->loopCounter += $this->chunkSize) {
$this->lowLimit = $this->loopCounter;
$this->highLimit = $this->loopCounter + $this->chunkSize;
$this->task(new LocalBumpMaps($this->lowLimit, $this->highLimit));
}
}
protected function log_file() {
$dir = BASE_PATH . "/Logs";
if (@file_exists($dir) == false)
@mkdir($dir, 0777, true);
return $dir . '/log_' . date('Y-m-d');
}
}
?>
Normally I would run the program as
php RunRenderer.php -d -p ./pid/pid $1
which invokes the default Zend engine, and Renderer.php forks around a thousand instances of LocalBumpMaps (along with 100 other map rendering classes). With each of these subtasks taking around 20-30 MB, all of the memory in the workstation gets exhausted pretty quickly, causing the system to screech to a halt.
Of course the main rendering engine is written in C++, but due to some weird requirement the whole front end is in PHP. And the PHP modules need to perform billions of calculations per second. So the only option left was to use HHVM in hopes of a significant increase in performance and efficiency.
The problem is I can't get this code to run with HHVM. This is what I'm trying:
hhvm RunRenderer.php -p ./pid $1
This doesn't do anything at all: no processes are forked, there is no output, nothing happens. So can anyone please tell me how to run these PHP scripts with HHVM instead of Zend?
I hope my question makes sense, and I would really appreciate any help.
Thanks,
Maxx

Just run the following line first without forking a process:
hhvm RunRenderer.php
If you see console output and can press Ctrl+C to terminate the process, then you can daemonize it with an Upstart script. Create a file called /etc/init/renderer.conf:
start on startup
stop on shutdown
respawn
script
hhvm RunRenderer.php
end script
Then you can manually start and stop the process by running:
start renderer
and
stop renderer
If you are running Ubuntu 12.04 LTS or above, a log file will be created for you automatically as /var/log/upstart/renderer.log. You can follow the live output by tailing the file:
tail -f /var/log/upstart/renderer.log

Related

Laravel Cron Scheduler job not running as expected

I have a Laravel cron issue. In the console Kernel I have defined a job that hits the Rollovercron.php file every 10 minutes, and each run passes one country. At least 100 countries are defined in an array and are passed one by one to Rollovercron.php by a foreach loop. Rollovercron.php takes a minimum of 2 hours to run for a single country.
I have multiple issues with this cron job:
The 100 elements in the array are not fetched one by one: I can see that the 'GH' country (Ghana) has run 5 times in a row, while many of the countries are skipped.
Whenever I hit the missing-country issue I run composer update and clear the cache.
I want my cron to run smoothly and fetch all countries; not even a single country should be missed, and I should not need to run composer update for this all the time.
Please help me with this; I have been struggling with it for months.
Below is the Kernel.php file:
<?php
namespace App\Console;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;
use DB;
class Kernel
extends ConsoleKernel
{
/**
* The Artisan commands provided by your application.
*
* @var array
*/
protected $commands = [
\App\Console\Commands\preAlert::class,
\App\Console\Commands\blCron::class,
\App\Console\Commands\mainRollover::class,
\App\Console\Commands\refilingSync::class,
\App\Console\Commands\TestCommand::class,
\App\Console\Commands\rollOverCron::class,
\App\Console\Commands\FrontPageRedis::class,
\App\Console\Commands\filingStatusRejectionQueue::class,
\App\Console\Commands\VesselDashboardRedis::class,
\App\Console\Commands\Bookingcountupdate::class,
// \App\Console\Commands\Voyagetwovisit::class,
];
/**
* Define the application's command schedule.
*
* @param \Illuminate\Console\Scheduling\Schedule $schedule
* @return void
*/
protected function schedule(Schedule $schedule)
{
$countrylist=array('NL','AR','CL','EC','DE','PH','ID','TT','JM','KR','BE','VN','US','BR','CM','MG','ZA','MU','RU','DO','GT','HN','SV', 'PR','SN', 'TN', 'SI','CI','CR','GM','GN','GY','HR','LC','LR','MR','UY','KH','BD','TH','JP','MM','AT','IE','CH','LB','PY','KE','YT','TZ','MZ','NA','GQ','ME');
foreach ($countrylist as $country) {
$schedule->command('rollOverCron:send ' . $country)
->everyTenMinutes()
->withoutOverlapping();
}
foreach ($countrylist as $country) {
$schedule->command('mainRollover:send ' . $country)
->daily()
->withoutOverlapping();
}
$schedule->command('filingStatusRejectionQueue')
->hourly()
->withoutOverlapping();
$schedule->command('Bookingcountupdate')
->everyTenMinutes()
->withoutOverlapping();
$schedule->command('preAlert')
->hourly()
->withoutOverlapping();
}
/**
* Register the Closure based commands for the application.
*
* @return void
*/
protected function commands()
{
require base_path('routes/console.php');
}
}
Knowing how Laravel scheduling works helps, so you can debug it when it doesn't work as expected. This does involve diving into the source.
You invoke command() on the scheduler, and this returns an event.
Let's check how Laravel decides what counts as overlapping: we see the mutex expires after 1440 minutes, i.e. 24 hours.
So after one day, if the scheduled items have not run, they simply stop being scheduled.
We see that a mutex is being used here. Let's see where it comes from. It seems it's provided in the constructor.
So let's see which mutex is being provided. In the exec and call functions, the mutex defined in the Scheduler constructor is used.
The mutex used there is an interface, probably used as a Facade, and the real implementation is most likely in CacheSchedulingMutex, which creates a mutex id using the mutexName from the event and the current time in hours and minutes.
Looking at the mutexName we see that the id consists of the expression and command combined.
To summarise: all events called in one Scheduler function share the same mutex class for checking that method calls don't overlap, but the mutex generates a unique identifier for each command, including differing parameters, and based on the time.
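The naming scheme can be illustrated with a simplified sketch (an assumption based on reading the source, not a verbatim copy of Laravel's implementation; the real id also mixes in the current time):

```php
<?php
// Simplified sketch of how a scheduled event's mutex name is derived:
// the cron expression and the command string are hashed together, so
// commands with different parameters get distinct mutexes, and
// withoutOverlapping() only guards the same command + parameters.
function mutexName(string $expression, string $command): string
{
    return 'framework/schedule-' . sha1($expression . $command);
}

$nl = mutexName('*/10 * * * *', 'artisan rollOverCron:send NL');
$ar = mutexName('*/10 * * * *', 'artisan rollOverCron:send AR');
// $nl !== $ar: different countries never block each other via the mutex.
```

This is why withoutOverlapping() does not serialize the per-country commands against each other; what serializes them is the single scheduler process running them one at a time.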
Your scheduled jobs expire after 24 hours, which means that with jobs taking 2 hours to complete, you'll get about 12 jobs completed in a day: more if the jobs are small, fewer if they take longer. This is because PHP is single-threaded by default: first task 1, then task 2, then task 3, and so on. If each task takes 2 hours, then after 12 tasks the remaining queued jobs expire, because they have been waiting for 1440 minutes; then the new jobs are scheduled and it starts again from the top.
Luckily there is a way to make sure they run simultaneously.
I suggest you add ->runInBackground() to your scheduling calls.
$schedule->command('rollOverCron:send ' . $country)
->everyTenMinutes()
->withoutOverlapping()
->runInBackground()
->emailOutputTo(['ext.amourya@cma-cgm.com', 'EXT.KKURANKAR@cma-cgm.com']);
}

Laravel Queue - Pause between jobs

I want to create a queue (Amazon SQS) that only runs jobs every X seconds. So if 50 jobs are suddenly submitted, they end up in the queue. The queue listener then pulls a job, does something, and waits X seconds. After that, the next job is pulled, then another X-second pause, and so on.
For the queue listener, the sleep option only determines how long the worker will "sleep" if there are no new jobs available. So it will only sleep if there is nothing in the queue.
Or should I just put a pause(X) in my PHP code?
[edit] I just tested the sleep method with both a FIFO and a standard AWS SQS queue, and it messes up the whole queue. Suddenly jobs are (successfully) resubmitted 3 times, after which they go into a failed state. Moreover, the delay given in my code (3-4 min) was ignored; a one-minute delay was used instead.
<?php
namespace App\Jobs;
use App\City;
class RetrieveStations extends Job
{
protected $cities;
/**
* Create a new job instance.
*
* @return void
*/
public function __construct ($cities)
{
$this->cities = $cities;
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
// code here
doSomething();
sleep(X); // X = desired pause in seconds
}
}
I have the exact same problem to solve. I'm using Laravel 5.8 and I don't see how I can get the queue worker to wait a fixed period between jobs.
I'm now thinking of using a scheduled task to handle this. I can schedule a task to run, say, every 5 minutes and run the following artisan command:
$schedule->command('queue:work --queue=emails --once')->everyFiveMinutes();
This will take one job from the queue and run it. Unfortunately, there's not much more granular control over how often a job is processed.
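One way to approximate a fixed gap without blocking the worker is to compute a release delay from the last completion time. A minimal sketch in plain PHP (the Laravel wiring via release() is left as a comment; how you store the last-finished timestamp, e.g. in the cache, is up to you):

```php
<?php
// Compute how long the next job should be delayed so that at least
// $minGap seconds separate consecutive jobs. Returns 0 if the gap has
// already elapsed, otherwise the remaining wait in seconds.
function nextDelay(int $lastFinishedAt, int $now, int $minGap): int
{
    $elapsed = $now - $lastFinishedAt;
    return $elapsed >= $minGap ? 0 : $minGap - $elapsed;
}

// In a Laravel job handler you would then do something like:
//   $delay = nextDelay($lastFinishedAt, time(), 30);
//   if ($delay > 0) { $this->release($delay); return; }
//   ... do the work, then record time() as the new $lastFinishedAt ...
```

Unlike sleep(), releasing the job back onto the queue with a delay leaves the worker free, though as noted above SQS caps per-message delays, so test against your queue driver.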
Exactly: you need to make your PHP code sleep; there is no other way.
See PHP's sleep().

Mediawiki: Getting jobs to run on edit on wiki farm/family

I've set up a wiki family consisting of a small number of wikis that have (and are expected to continue having) low to moderate traffic.
When you run a single MediaWiki, it runs a job on every page request, which is nice for keeping links and categories up to date, but I can't get this behaviour to work for a wiki family.
I have a wiki setup with a branching LocalSettings.php (switching on SERVER_NAME) and, despite searching (and asking on MediaWiki), have found no way to keep this job behaviour; instead I get jobs queueing up, presumably because the automatically run maintenance scripts do not know which wiki they originate from.
Is there a way to fix or circumvent this? I have not found any kind of variable supplied when the job queue is run that could be passed into LocalSettings.php so that the correct settings are loaded and the jobs can run properly.
Generally, jobs are run on each page load within the context of the current wiki, which means in your case there should be no problem with the queue, because your LocalSettings file is branched. However, in certain circumstances the job queue may become overloaded; in that case you will need to disable the default queue behaviour (by setting $wgJobRunRate = 0;) and configure a maintenance script runner in crontab. This can be tricky for a branched farm, but I think it will work like this:
* * * * * php /path/to/your/wiki/maintenance/runJobs.php --wiki domainA.com
* * * * * php /path/to/your/wiki/maintenance/runJobs.php --wiki domainB.com
* * * * * php /path/to/your/wiki/maintenance/runJobs.php --wiki domainC.com
In this scenario, during script execution two constants will be available in LocalSettings.php: MW_DB and MW_PREFIX (use only MW_DB), so you will need to modify your LocalSettings.php like this:
...
$activeWiki = 'defaultWiki';
$switchVar = $_SERVER['SERVER_NAME'];
if( defined('DO_MAINTENANCE') && defined('MW_DB') ) {
$switchVar = MW_DB;
}
switch( $switchVar ) {
...
}
...
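To make the branching logic concrete, here is a testable restatement as a plain function (hostnames and wiki IDs are placeholders, not taken from the original post):

```php
<?php
// Pick the active wiki: from SERVER_NAME on web requests, but from the
// MW_DB constant when a maintenance script (runJobs.php --wiki ...) runs,
// so queued jobs load the settings of the wiki they belong to.
function chooseWiki(string $serverName, ?string $mwDb): string
{
    $switchVar = $mwDb ?? $serverName; // maintenance run takes precedence
    switch ($switchVar) {
        case 'domainA.com':
        case 'wikidb_a':   // database name passed via --wiki
            return 'wikidb_a';
        case 'domainB.com':
        case 'wikidb_b':
            return 'wikidb_b';
        default:
            return 'wikidb_default';
    }
}
```

In LocalSettings.php you would call this with $_SERVER['SERVER_NAME'] and (defined('MW_DB') ? MW_DB : null), then branch your per-wiki settings on the result.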
Problem found: the issue was that the wikis were behind a permission gate (just a regular Apache one), and async jobs don't inherit the permissions, so I had to set async jobs to false to solve it.
In case anyone else hits this problem: $wgRunJobsAsync = false; should be added to LocalSettings.php.

Is there a good php git client with http support? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 6 years ago.
For a project I am working on, we want to use Git as a revision tracker for certain data we modify often. We are using PHP for the web frontend and we need a good PHP Git client. I have come across a handful on the internet, and they all tend to have the same limitation:
There is no support for HTTP. We need to be able to push/pull to remote repositories. We also need to clone.
Ideally I am looking for something that does not use the git command (i.e. wrappers around exec()), but I am willing to settle if the class works well. I have seen a C library which appears to do what I want, but the PHP language binding is incomplete and the HTTP functions are labeled experimental.
Does anyone have any insight into using git and http through php?
https://github.com/kbjr/Git.php
Git.php is a wrapper class around git calls that uses proc_open instead of exec to run the commands. While it does not have push/pull methods, it does have a general run method for running custom git commands, so it could be used something like this:
$repo = Git::open('/path/to/repo');
$repo->run('push origin master');
It also does have methods for cloning (clone_to and clone_from which do local cloning and clone_remote for remote cloning).
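The underlying proc_open technique is simple enough to sketch in plain PHP. This is a minimal illustration of the approach, not Git.php's actual implementation:

```php
<?php
// Run an arbitrary command in a given working directory and capture its
// exit code, stdout and stderr - the same pattern Git.php builds on.
function runInRepo(string $repoPath, string $command): array
{
    $descriptors = [
        1 => ['pipe', 'w'],  // stdout
        2 => ['pipe', 'w'],  // stderr
    ];
    $proc = proc_open($command, $descriptors, $pipes, $repoPath);
    if (!is_resource($proc)) {
        throw new RuntimeException("Failed to start: $command");
    }
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    $exitCode = proc_close($proc);
    return [$exitCode, $stdout, $stderr];
}

// e.g. [$code, $out, $err] = runInRepo('/path/to/repo', 'git push origin master');
```

Compared with exec(), proc_open gives you the working directory, separate stdout/stderr streams, and the exit code in one call, which is why wrapper libraries tend to prefer it.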
One possibility is to use PHP's SSH library to perform those actions by connecting back to the web server.
Or I found this set of classes which allow you to clone and read other metadata over HTTP, but not push or pull. It could be a starting point if you're brave enough to extend them to do that, though I imagine it would be a lot of work to replicate these processes and keep them compliant with various server versions, etc.
[UPDATED 23/03/2014, after receiving an upvote - thanks!]
I did get some way trying to implement the above (I was searching for an implementation, drew a blank, so tried to write my own), and it was as hard as I thought! I actually abandoned it because I found a simpler way to achieve this with a different architecture, but here's the class I wrote in the attempt. It essentially works, but it was brittle to the environmental variations I was working with (i.e. it doesn't cope very well with errors or problems).
It uses:
herzult/php-ssh
an ssh config file - a trick to simplify setting up authentication credentials in code
(Note - I had to quickly strip out a few details from the code to post it. So you'll need to fiddle about a bit to integrate it into your app/namespace etc.)
<?php
/**
* @author scipilot
* @since 31/12/2013
*/
/**
* Class GitSSH allows you to perform Git functions over an SSH session.
* i.e. you are using the command-line Git commands after SSHing to a host which has the Git client.
*
* You don't need to know about the SSH to use the class, other than the fact you will need access
* to a server via SSH which has the Git client installed. It's likely this is the local web server.
*
* This was made because PHP has no good native Git client.
*
* Requires herzult/php-ssh
*
* Example php-ssh config file would be
*
* <code>
* Host localhost
* User git
* IdentityFile id_rsa
*
* Host your.host.domain.com
* User whoever
* IdentityFile ~/.ssh/WhoeverGit
*</code>
*/
class GitSSH {
protected $config;
protected $session;
protected $sPath;
/**
* @var string
*/
protected $sConfigPath = '~/.ssh/config';
/**
* Connects to the specified host, ready for further commands.
*
* @param string $sHost Host (entry in the config file) to connect to.
* @param string $sConfigPath Optional; config file path. Defaults to ~/.ssh/config,
* which is probably inaccessible for web apps.
*/
function __construct($sHost, $sConfigPath=null){
\Log::info('New GitSSH '.$sHost.', '.$sConfigPath);
if(isset($sConfigPath)) $this->sConfigPath = $sConfigPath;
$this->config = new \Ssh\SshConfigFileConfiguration($this->sConfigPath, $sHost);
$this->session = new \Ssh\Session($this->config, $this->config->getAuthentication());
}
public function __destruct() {
$this->disconnect();
}
/**
* Thanks to Steve Kamerman, as there isn't a native disconnect.
*/
public function disconnect() {
$this->exec('echo "EXITING" && exit;');
$this->session = null;
}
/**
* Run a command (in the current working directory set by cd)
* @param $sCommand
* @return string
*/
protected function exec($sCommand) {
//echo "\n".$sCommand."\n";
$exec = $this->session->getExec();
$result = $exec->run('cd '.$this->sPath.'; '.$sCommand);
// todo: parse/scrape the result, return a Result object?
return $result;
}
/**
* CD to a folder. (This is not an 'incremental' cd!)
* Devnote: we don't really execute the cd now; it's appended to other commands. Each command seems to re-login.
*
* @param string $sPath Absolute filesystem path, or relative from user home
*/
public function cd($sPath){
$this->sPath = $sPath;
// @todo this is useless! each command seems to run in a separate login?
//$result = $this->exec('cd'); // /; ls');
//return $result;
}
/**
* @return string
*/
public function ls(){
$result = $this->exec('ls ');
return $result;
}
public function gitAdd($sOptions=null, array $aFiles=null){
$result = $this->exec('git add '
.(empty($sOptions) ? '' : ' '.$sOptions)
.(empty($aFiles) ? '' : ' '.implode(' ', $aFiles))
);
return $result;
}
public function gitClone($sRepo, $sBranch=null, $sTarget=null){
\Log::info('GitSSH::clone '.$sRepo.', '.$sBranch.', '.$sTarget);
$result = $this->exec('git clone '
.(empty($sBranch) ? '' : ' --branch '.$sBranch)
.' '.$sRepo
.' '.$sTarget);
return $result;
}
public function gitCommit($sMessage, $sOptions=null, array $aFiles=null){
$result = $this->exec('git commit '
.'-m "'.addcslashes($sMessage, '"').'"'
.(empty($sOptions) ? '' : ' '.$sOptions)
.(empty($aFiles) ? '' : ' '.implode(' ', $aFiles))
);
return $result;
}
public function gitPull($sOptions=null, $sRepo=null, $sRefspec=null){
$result = $this->exec('git pull '
.(empty($sOptions) ? '' : ' '.$sOptions)
.(empty($sRepo) ? '' : ' '.$sRepo)
.(empty($sRefspec) ? '' : ' '.$sRefspec)
);
return $result;
}
public function gitPush($sOptions=null, $sRepo=null, $sRefspec=null){
$result = $this->exec('git push '
.(empty($sOptions) ? '' : ' '.$sOptions)
.(empty($sRepo) ? '' : ' '.$sRepo)
.(empty($sRefspec) ? '' : ' '.$sRefspec)
);
return $result;
}
/**
* @return string the raw result from git status
*/
public function gitStatus(){
$result = $this->exec('git status');
return $result;
}
}
This looks promising: http://gitphp.org (broken link; see an archived version)
I think that will do it for you. Here is the description of it:
GitPHP is a web frontend for git repositories. It emulates the look of standard gitweb, but is written in PHP and makes use of Smarty templates for customization. It has a couple extras, including syntax highlighting through the GeSHi PHP class and project category support. It works with standard git as well as msysgit on Windows.
Setup should be fairly simple – just extract the tarball where you want to install it, copy config/gitphp.conf.php.example to config/gitphp.conf.php, and set the projectroot in the conf to point to your directory where your bare git repositories are, and make the templates_c directory writeable by the webserver if it’s not already.
You can look through all the available options and defaults in config/gitphp.conf.defaults.php, and copy an option into your config file if you want to override the default. You can also copy config/projects.conf.php.example to config/projects.conf.php and edit it if you want more advanced control over your projects, such as defining categories for projects or loading projects from a text file. More detailed instructions are in the included README.
Note: if you’re upgrading your existing gitphp.conf.php will not be overwritten, but I recommend checking gitphp.conf.defaults.php for new configuration options that may have been added.
You can view the live copy running on this site.

Trouble executing the Yii web service demo

I'm fairly new to Yii and am currently trying to use this framework to create some PHP web services. While trying to execute the brief tutorial on web services provided on the Yii web site http://www.yiiframework.com/doc/guide/1.1/en/topics.webservice#declaring-web-service-action I ran into some trouble. Namely, I get a "Maximum execution time of 60 seconds exceeded" fatal error when executing the script. My guess is that the getPrice() method actually never gets called.
I'd appreciate any suggestions as to why this may be happening. The contents of my index.php file are listed below. (Note that the Yii framework is properly installed and I'm running PHP 5.3.0 with the php_soap extension.)
<?php
$yii=dirname(__FILE__).'/../yii/framework/yii.php';
defined('YII_DEBUG') or define('YII_DEBUG',true);
defined('YII_TRACE_LEVEL') or define('YII_TRACE_LEVEL',3);
require_once($yii);
class StockController extends CController{
function __construct(){
parent::__construct($this->id, $this->module);
}
public function actions(){
return array(
'quote'=>array(
'class'=>'CWebServiceAction',
),
);
}
/**
* @param string the symbol of the stock
* @return float the stock price
* @soap
*/
public function getPrice($symbol){
$prices=array('IBM'=>100, 'GOOGLE'=>350);
return isset($prices[$symbol])?$prices[$symbol]:0;
//...return stock price for $symbol
}
}
$client=new SoapClient('http://localhost/SampleWebService/?r=stock/quote');
echo $client->getPrice('GOOGLE');
?>
It seems strange that you are declaring the controller in the index.php entry script. I'm also not sure why you are overriding the constructor.
And I think, if this really is your entry script, you are missing the call that creates the application: either Yii::createWebApplication($config)->run(); or Yii::createConsoleApplication($config)->run();, depending on whether you are running this as a web or console application.
Have you made sure that the application is running as expected without the SOAP/services stuff? I would set up a basic Hello World app (web or console), then try the web services.
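For reference, a conventional Yii 1.1 entry script looks roughly like this (a hedged sketch; the config path is an assumption for illustration), with the SOAP client call moved into a separate script or page:

```php
<?php
// Sketch of a typical Yii 1.1 bootstrap (index.php).
$yii    = dirname(__FILE__) . '/../yii/framework/yii.php';
$config = dirname(__FILE__) . '/protected/config/main.php'; // assumed path

defined('YII_DEBUG') or define('YII_DEBUG', true);
require_once($yii);

// Without this call no controller/action is ever dispatched, so the SOAP
// endpoint never responds - which may be why the SoapClient request in
// the original script waits until the 60-second execution limit is hit.
Yii::createWebApplication($config)->run();
```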
