Phabricator: how to fetch all tasks from the database - php

I use the Conduit API to create a Gantt chart for a Phabricator application. I call the maniphest.search method, but I get an error when I try to fetch more than 100 tasks:
Maximum page size for Conduit API method calls is 100, but this call specified 101.
Why? How can I fetch all tasks (or more than 100)?
Thanks for any reply.
#UPDATE:
First solution: this is possible with the maniphest.search method itself, since it supports pagination.
Second solution: a loop like the one below. Ugly (bad practice), but it lets you fetch all the necessary fields.
for ($i = 0; $i <= 700; $i += 100) {
    try {
        $result = $conduit->searchManiphest([
            'queryKey' => $this->getParameter('phacility_maniphest_query_key'),
            'attachments' => [
                'projects' => TRUE,
                'subscribers' => TRUE,
                'columns' => TRUE,
                'constraints' => TRUE,
            ],
            'order' => "newest",
            'after' => $i,
            'limit' => 100
        ]);
    } catch (\Exception $e) {
        $result = ['data' => []];
    }
    foreach ($result['data'] as $item) {
        $all_result['data'][] = $item;
    }
}
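A sketch of the cursor-based variant of the first solution: Conduit's *.search methods return a cursor object whose after value is the token for the next page (null when there are no more pages), so the loop can follow the cursor instead of guessing offsets. The fetchAllPages helper and its $fetch callback are illustrative names, not part of any API:

```php
<?php
// Generic cursor-paging helper: $fetch is any callable that takes the
// current "after" cursor (null for the first page) and returns an array
// shaped like a Conduit *.search result:
// ['data' => [...], 'cursor' => ['after' => ...]].
function fetchAllPages(callable $fetch): array
{
    $all = [];
    $after = null;
    do {
        $page = $fetch($after);
        foreach ($page['data'] as $item) {
            $all[] = $item;
        }
        // The result carries the cursor for the next page, or null when done.
        $after = $page['cursor']['after'] ?? null;
    } while ($after !== null);
    return $all;
}
```

In the question's setup, the callback would wrap the $conduit->searchManiphest() call and pass $after through, so the loop stops exactly when the server says there are no more pages.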


Yii2: googleChart Widget extension in Carousel

I want to show graphs in a Carousel. The first graph is shown as expected, but the other sliding graphs come out differently.
Note: all graphs are generated from the same database tables. Below is the code from my index.php in the frontend/view directory:
<?php
use backend\models\WaterLevel;
use backend\models\GaugeSite;
use scotthuangzl\googlechart\GoogleChart;
use kv4nt\owlcarousel\OwlCarouselWidget;

$this->title = 'Water Level';
$model = new WaterLevel();
$siteModel = new GaugeSite();
$siteId = $siteModel->getSiteIds();
$counter = count($siteId);

OwlCarouselWidget::begin([
    'container' => 'div',
    'containerOptions' => [
        'id' => 'container-id',
        'class' => 'container-class'
    ],
    'pluginOptions' => [
        'autoplay' => true,
        'autoplayTimeout' => 3000,
        'items' => 1,
        'loop' => true,
        'itemsDesktop' => [1199, 3],
        'itemsDesktopSmall' => [979, 3]
    ]
]);

/** Loop to generate carousel items */
for ($i = 0; $i < $counter; $i++) {
    $id = $siteId[$i];
    $data = $model->getLevels($id);
    $readings = [['Hour', 'Water Level']];
    foreach ($data as $value) {
        array_push($readings, [$value['submition_time'], (int)$value['reading_level']]);
    }
    echo GoogleChart::widget([
        'visualization' => 'LineChart',
        'data' => $readings,
        'options' => ['title' => 'Water Level Reading']
    ]);
}
OwlCarouselWidget::end();
?>
I have also tried using GoogleChart::widget in a normal Bootstrap 4 carousel, but it behaves the same. I would appreciate any ideas to get me out of this.

Getting a list of ALL plugins

I would like to get a list of ALL WordPress plugins.
There is a function called get_plugins(), but it returns only the plugins I have installed. What I need is a list of all plugins, whether or not I have installed them before.
Is there a function I could use? If not, is there a JSON feed, database, or API that I could use?
Edit:
var_dump(plugins_api('query_plugins', array(
    'per_page' => 100,
    'tag' => 'contact form 7',
    'number' => 5,
    'page' => 1,
    'fields' => array(
        'short_description' => false,
        'description' => false,
        'sections' => false,
        'tested' => false,
        'requires' => false,
        'rating' => false,
        'ratings' => false,
        'downloaded' => false,
        'downloadlink' => false,
        'last_updated' => false,
        'added' => false,
        'tags' => false,
        'compatibility' => false,
        'homepage' => false,
        'versions' => false,
        'donate_link' => false,
        'reviews' => false,
        'banners' => false,
        'icons' => false,
        'active_installs' => false,
        'group' => false,
        'contributors' => false
    )
)));
This returns a lot of data that I don't need.
The only data I need are two keys: name and slug.
I know that I could extract them from the array, but that would be very bad for performance.
Even when I try it with a loop, I get 45 plugins but no more. Where is the rest?
// $plugins is the result of the call above, but without 'tag' => 'contact form 7'
foreach ($plugins as $plugin) {
    foreach ($plugin as $p) {
        if ($p != null) {
            echo $p->name . "<br>";
        }
    }
}
Not the best answer, but I tried to solve my own problem as best I could.
Getting a list of plugins
This will not return ALL plugins, but it will return the top-rated ones:
$plugins = plugins_api('query_plugins', array(
    'per_page' => 100,
    'browse' => 'top-rated',
    'fields' => array(
        'short_description' => false,
        'description' => false,
        'sections' => false,
        'tested' => false,
        'requires' => false,
        'rating' => false,
        'ratings' => false,
        'downloaded' => false,
        'downloadlink' => false,
        'last_updated' => false,
        'added' => false,
        'tags' => false,
        'compatibility' => false,
        'homepage' => false,
        'versions' => false,
        'donate_link' => false,
        'reviews' => false,
        'banners' => false,
        'icons' => false,
        'active_installs' => false,
        'group' => false,
        'contributors' => false
    )
));
Save the data as JSON
Since the data we get back is huge and bad for performance, we extract only the name and the slug from the array and write them to a JSON file:
$plugins_json = '{' . PHP_EOL;
// Get only the name and the slug.
foreach ($plugins as $plugin) {
    foreach ($plugin as $key => $p) {
        if ($p->name != null) {
            // Let's beautify the JSON.
            $plugins_json .= ' "' . $p->name . '": {' . PHP_EOL;
            $plugins_json .= ' "slug": "' . $p->slug . '"' . PHP_EOL;
            end($plugin);
            $plugins_json .= ($key !== key($plugin)) ? ' },' . PHP_EOL : ' }' . PHP_EOL;
        }
    }
}
$plugins_json .= '}';
file_put_contents('plugins.json', $plugins_json);
Now we have a slim JSON file with only the data that we need.
To keep the JSON file up to date, we run this script every 24 hours via a cron job.
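A sketch of the same extraction using json_encode instead of concatenating the JSON string by hand, which avoids quoting and escaping mistakes. The sample $plugins object below is a stand-in for the plugins_api() result, kept in the same shape the snippets above iterate over:

```php
<?php
// Sample stand-in for the plugins_api() result: an object whose
// properties include an info array and a list of plugin objects.
$plugins = (object) [
    'info' => ['page' => 1, 'pages' => 1],
    'plugins' => [
        (object) ['name' => 'Plugin A', 'slug' => 'plugin-a'],
        (object) ['name' => 'Plugin B', 'slug' => 'plugin-b'],
    ],
];

// Collect only the name and slug, then let json_encode handle all
// quoting, escaping, and the trailing-comma logic.
$data = [];
foreach ($plugins as $plugin) {
    foreach ((array) $plugin as $p) {
        if (is_object($p) && !empty($p->name)) {
            $data[$p->name] = ['slug' => $p->slug];
        }
    }
}
file_put_contents('plugins.json', json_encode($data, JSON_PRETTY_PRINT));
```

The is_object() guard skips the scalar entries of the info property, which the hand-rolled version tripped over with $p->name on non-objects.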
Because getting all plugins at once would be too heavy for the server, it is better to do it in steps.
You can process as many plugins per run as the server can handle. The example uses a safe 100 plugins at a time.
Every time the script runs, it increments the "page" number by 1, so the next run retrieves the next 100 plugins. The contents of the existing plugins.json are parsed, the new plugins are added (or overwritten, if a plugin is already present) to the existing data, and the result is encoded and saved again.
If the page number is past the last page, no results are returned. That is how the script knows there are no more plugins left; it then resets the page to 1 and starts over.
I use the wp_options table to keep track of the pages, simply because it's the quickest way. It would be better to use some kind of filesystem caching, which would be easier to reset manually if needed.
You can set a cron job to execute the script every x minutes. The plugins.json file will then build up and grow step by step, every time the script runs.
// Get the current "page", or if the option does not exist, start at page 1.
$page = get_option( 'plugins_page' ) ? (int) get_option( 'plugins_page' ) : 1;

// Get the plugin objects.
$plugins = plugins_api( 'query_plugins', [
    'per_page' => 100,
    'page'     => $page,
    'fields'   => [
        //.........
    ]
] );

// Increment the page, or when there are no results, reset to 1.
update_option( 'plugins_page', count( $plugins ) > 0 ? ++$page : 1 );

// Build up the data array.
$newData = [];
foreach ( $plugins as $plugin ) {
    foreach ( $plugin as $key => $p ) {
        if ( $p->name != null ) {
            $newData[ $p->name ] = [ 'slug' => $p->slug ];
        }
    }
}
// Get the plugin data already in the file, falling back to an empty array
// on the first run, when plugins.json does not exist yet.
// The last argument (true) is important: it decodes JSON objects into
// associative arrays so they can be merged with array_merge.
$existingData = file_exists( 'plugins.json' )
    ? json_decode( file_get_contents( 'plugins.json' ), true )
    : [];

// Merge the existing data with the new data.
$pluginData = array_merge( $existingData, $newData );
file_put_contents( 'plugins.json', json_encode( $pluginData ) );

CodeIgniter - accessing model data from a controller

I have the following code in Model:
<?php
class Route_Model extends CI_Model
{
    function __construct()
    {
        parent::__construct();
    }

    public function getRoute($date = array())
    {
        try {
            $data = array(
                'route' => array(
                    'id' => 1,
                    'name' => 'budapest-athens',
                    'price' => 150,
                    'id' => 2,
                    'name' => 'rome-madrid',
                    'pret' => 250,
                    'id' => 3,
                    'name' => 'belgrade-bucharest',
                    'price' => 180,
                    'id' => 4
                )
            );
            return $data;
        } catch (Exception $e) {
            return $e->getMessage();
        }
    }
}
?>
And I want to access the array elements in my controller.
How can I access each field separately?
Something like $price = $this->data['price']?
Thank you!
You are returning an array with two levels. If you want to get the price from the array $data, simply do this in your controller:
$data = $this->route_model->getRoute($date);
$price = $data['route']['price'];
Please note that your array is not well formed: it has repeated keys, and this may cause problems.
This array will never work, since you're overwriting keys. I think you want the following array instead:
$data = [
    'route' => [
        [
            'id' => 1,
            'name' => 'budapest-athens',
            'price' => 150
        ], [
            'id' => 2,
            'name' => 'rome-madrid',
            'price' => 250
        ], [
            'id' => 3,
            'name' => 'belgrade-bucharest',
            'price' => 180
        ]
    ]
];
Next to that, your try/catch seems unnecessary here; there is nothing that can really fail. It's a hard-coded array, so unless this method eventually does some real work, there is no need for the try/catch.
Anyway, to receive this data in your controller you should do:
$this->load->model('Route_model');
$route = $this->Route_model->getRoute();
var_dump($route);
exit;
Now you will have this array. One other question: are you actually trying to grab all the routes in this array, or did you mean to do something with the $date parameter? Right now it doesn't look like it's used, unless you stripped some code away.
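To illustrate why the restructured array fixes the access problem, here is a minimal, self-contained sketch of reading fields out of it. It uses only the sample data, with no CodeIgniter involved:

```php
<?php
// Sample data in the corrected shape: 'route' holds a list of rows,
// one sub-array per route, so no keys overwrite each other.
$data = [
    'route' => [
        ['id' => 1, 'name' => 'budapest-athens', 'price' => 150],
        ['id' => 2, 'name' => 'rome-madrid', 'price' => 250],
        ['id' => 3, 'name' => 'belgrade-bucharest', 'price' => 180],
    ]
];

// Each field is reachable by row index and key...
$firstPrice = $data['route'][0]['price'];

// ...or by looping over the rows, e.g. to index prices by route name.
$prices = [];
foreach ($data['route'] as $route) {
    $prices[$route['name']] = $route['price'];
}
```

With the original flat array, every repeated 'id', 'name', and 'price' key silently overwrote the previous one, which is why individual routes could not be addressed.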

Laravel Nested Relationship on Factory for Testing

I've got a problem and, I have to admit, I can't find a solution.
I'm currently writing tests for some functionality, and factories are blocking me.
First, I'm trying to use factories to add an entity called "Tasklist", which contains one or many "sections", which in turn contain one or many "actions".
So I have a three-level-deep relationship.
Here are my factories:
$factory->define(\App\V2\Models\Tasklist::class, function (\Faker\Generator $faker) {
    return [
        'id_course' => \App\V2\Models\Program::all()->random(1)->id,
        'id_event' => \App\V2\Models\Stage::all()->random(1)->id,
        'id_course_rounds' => \App\V2\Models\ProgramRound::all()->random(1)->id,
        'name' => $faker->word,
        'display_name' => $faker->word,
        'color' => 0,
        'key' => str_random(16),
        'auto_active' => 1,
        'status' => 1,
    ];
});

$factory->define(\App\V2\Models\TasklistSection::class, function (\Faker\Generator $faker) {
    return [
        'id_tasklist' => function () {
            return factory(\App\V2\Models\Tasklist::class)->create()->id;
        },
        'number' => 1,
        'title' => $faker->word,
        'text' => $faker->text(100),
        'status' => 1
    ];
});

$factory->define(\App\V2\Models\TasklistAction::class, function (\Faker\Generator $faker) {
    return [
        'id_tasklists_section' => factory(\App\V2\Models\TasklistSection::class)->create()->id,
        'number' => rand(1, 10),
        'title' => $faker->word,
        'percent' => $faker->numberBetween(0, 100),
        'status' => 1
    ];
});
In my testing class, I'm trying to generate a tasklist with one section containing one action. The only way I have found so far is something like this:
$task = factory(Tasklist::class, 2)->create()
    ->each(function ($t) {
        $t->sections()->save(factory(TasklistSection::class)->create()
            ->each(function ($s) {
                $s->actions()->save(factory(TasklistAction::class)->create());
            })
        );
    });
With this code, if I delete the second each, it works: I get 2 tasklists, each with 1 section. In fact, it's the each that is giving me trouble.
I would like to create only one tasklist, with one or several sections, each with one or several actions.
But each only accepts a Collection as input, and the save method accepts only a model, not a collection.
Does somebody have an idea how to deal with that?
One approach is this: create the tasklist with its sections, store it in a variable, and then loop through each section and add actions to it, like this:
$tasklist = factory(App\Tasklist::class)->create();
$tasklist->sections()->saveMany(factory(App\TasklistSection::class, 3)->make());
foreach ($tasklist->sections as $section) {
    $section->actions()->saveMany(factory(App\TasklistAction::class, 3)->make());
}
This will work as expected.

MongoDB -> DynamoDB Migration

All,
I am attempting to migrate roughly 6 GB of Mongo data, spread across hundreds of collections, to DynamoDB. I have written some scripts using the AWS PHP SDK and can port over very small collections, but when I try ones with more than 20k documents (still a very small collection, all things considered) it either takes an outrageous amount of time or fails quietly.
Does anyone have tips/tricks for taking data from Mongo (or any other NoSQL DB) and migrating it to Dynamo or another NoSQL DB? I feel like this should be relatively easy because the documents are extremely flat/simple.
Any thoughts/suggestions would be much appreciated!
Thanks!
header.php
<?php
require './aws-autoloader.php';
require './MongoGet.php';
set_time_limit(0);

use \Aws\DynamoDb\DynamoDbClient;

$client = \Aws\DynamoDb\DynamoDbClient::factory(array(
    'key' => 'MY_KEY',
    'secret' => 'MY_SECRET',
    'region' => 'MY_REGION',
    'base_url' => 'http://localhost:8000'
));

$collection = "AccumulatorGasPressure4093_raw";

function nEcho($str) {
    echo "{$str}<br>\n";
}

echo "<pre>";
test-store.php
<?php
include('test-header.php');

nEcho("Creating table(s)...");

// Create the test table.
$client->createTable(array(
    'TableName' => $collection,
    'AttributeDefinitions' => array(
        array(
            'AttributeName' => 'id',
            'AttributeType' => 'N'
        ),
        array(
            'AttributeName' => 'count',
            'AttributeType' => 'N'
        )
    ),
    'KeySchema' => array(
        array(
            'AttributeName' => 'id',
            'KeyType' => 'HASH'
        ),
        array(
            'AttributeName' => 'count',
            'KeyType' => 'RANGE' // note: 'RANGE', not 'RANGED'
        )
    ),
    'ProvisionedThroughput' => array(
        'ReadCapacityUnits' => 10,
        'WriteCapacityUnits' => 20
    )
));

$result = $client->describeTable(array(
    'TableName' => $collection
));

nEcho("Done creating table...");
nEcho("Getting data from Mongo...");

// Instantiate the class and fetch the data.
$mGet = new MongoGet();
$results = $mGet->getData($collection);

nEcho("Done retrieving Mongo data...");
nEcho("Inserting data...");

$i = 0;
foreach ($results as $result) {
    $insertResult = $client->putItem(array(
        'TableName' => $collection,
        'Item' => $client->formatAttributes(array(
            'id' => $i,
            'date' => $result['date'],
            'value' => $result['value'],
            'count' => $i
        )),
        'ReturnConsumedCapacity' => 'TOTAL'
    ));
    $i++;
}

nEcho("Done Inserting, script ending...");
I suspect that you are being throttled by DynamoDB, especially if your tables' throughput is low. The SDK retries requests, up to 11 times each, but eventually the requests fail, which should throw an exception.
You should take a look at the WriteRequestBatch object. It is essentially a queue of items that get sent in batches, and any items that fail to transfer are re-queued automatically. It should provide a more robust solution for what you are doing.
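As a sketch of the batching idea only (not the SDK's actual WriteRequestBatch internals): DynamoDB's BatchWriteItem operation accepts at most 25 write requests per call, so the per-item putItem loop above can be replaced by chunked batch payloads. The helper name below is made up for illustration:

```php
<?php
// Split pre-formatted DynamoDB items into BatchWriteItem-sized payloads.
// The service caps each BatchWriteItem call at 25 write requests.
function buildBatchWritePayloads($table, array $items, $max = 25)
{
    $payloads = [];
    foreach (array_chunk($items, $max) as $chunk) {
        $requests = [];
        foreach ($chunk as $item) {
            // Each item becomes one PutRequest entry in the payload.
            $requests[] = ['PutRequest' => ['Item' => $item]];
        }
        $payloads[] = ['RequestItems' => [$table => $requests]];
    }
    return $payloads;
}
```

Each payload would then go to a single batchWriteItem call on the client; WriteRequestBatch additionally re-queues any items the service reports back as unprocessed, which is what makes it the more robust choice for a bulk migration like this.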
