I would like to allow certain GraphQL operations only for certain API users, based on a configuration. Our stack is Symfony 6.2 + overblog/GraphQLBundle.
My current approach is to check, in the authenticate method of a custom authenticator, whether the current operation is cleared in the user's allowed-operations config. For this I would like to parse the GraphQL query into a kind of array that I can interpret easily.
Does anybody know how this can be done? I was scanning the underlying webonyx/graphql-php library, but cannot see how they do it.
As a simple example:
query myLatestPosts($followablesToFilter: [FollowableInput], $limit: Int, $offset: Int) {
  my_latest_posts(followablesToFilter: $followablesToFilter, limit: $limit, offset: $offset) {
    ...PostFragment
    __typename
  }
  my_roles
}
From this I would like to retrieve the operations my_latest_posts and my_roles.
Update 1
It's probably possible to write a simple lexer utilising preg_split - I'm just hesitating, as I'm sure someone has done this already... The spec appears to be well defined.
Alright, so it turned out webonyx/graphql-php does indeed have all the low-level functions needed for this :D. Especially the Visitor is very useful here.
With this code you can drill down into the query to get the operation names (or selections, as they are called in the code).
This works for me:
use GraphQL\Language\AST\NodeKind;
use GraphQL\Language\Parser;
use GraphQL\Language\Visitor;

// whatever comes before
// ...

$graphQlRequest = json_decode($request->getContent(), true, 512, JSON_THROW_ON_ERROR);

$operations = [];
Visitor::visit(Parser::parse($graphQlRequest['query']), [
    NodeKind::OPERATION_DEFINITION => function ($node) use (&$operations) {
        // $node contains the whole GraphQL query in a structured way
        $selections = array_map(function ($selection) {
            return $selection['name']['value'];
        }, $node->toArray()['selectionSet']['selections']);

        foreach ($selections as $selection) {
            $operations[] = $selection;
        }

        return Visitor::stop();
    },
]);

print_r($operations);
For my example above, the result in $operations is then:
Array
(
    [0] => my_latest_posts
    [1] => my_roles
)
And this information is all I need to decide whether the user should have access to the endpoint or not.
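For completeness, the check in the authenticator could then look roughly like the sketch below. The getAllowedOperations() accessor is a hypothetical method on the user (or wherever the per-user config lives), not something the bundle provides, so adapt it to your own setup:

use Symfony\Component\Security\Core\Exception\CustomUserMessageAuthenticationException;

// ... inside authenticate(), after $operations has been collected as shown above
// NOTE: getAllowedOperations() is an assumed accessor for the per-user config.
$allowedOperations = $user->getAllowedOperations();

foreach ($operations as $operation) {
    if (!in_array($operation, $allowedOperations, true)) {
        // Abort authentication if the user is not cleared for this operation
        throw new CustomUserMessageAuthenticationException(
            sprintf('Operation "%s" is not allowed for this API user.', $operation)
        );
    }
}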
Related
I have a function in my API to update the name of a person in an SQLite database. You give it the ID of the name you wish to change and the new name.
How can I build a function in a way that allows me to update a wide range of fields in the database? Even things from different tables?
I started off trying to use parameters to switch which SQL query is executed, but this feels a bit clunky and not scalable. Is there a better way?
Current code:
private function json_update_authors() {
    $input = json_decode(file_get_contents("php://input"));
    $query = "UPDATE authors SET name = :name WHERE authorId = :authorId";
    $params = ["name" => $input->name, "authorId" => $input->authorId];
    $res = $this->recordset->getJSONRecordSet($query, $params);
    return json_encode(array("status" => 200, "message" => "ok"));
}
Prependix
You can achieve what you want, but before reading the details, I recommend thinking about what you want to restrict this to, because if there is input your function blindly trusts, then, should malicious data be inside that input, your database can easily be hacked. So you should have a whitelist of tables/fields that are allowed to be updated, and apply it.
Decoding JSON
json_decode decodes your JSON into an object whose members you do not know in advance. However, according to the documentation, you can iterate such an object like this:
foreach ($obj as $key => $value) {
echo "$key => $value\n";
}
Alternatively, json_decode can decode your JSON into an associative array instead, like:
$input = json_decode(file_get_contents("php://input"), true);
I personally prefer decoding JSON into arrays, but you can work with the first approach as well. In both cases, you can iterate the result in a similar manner as described above.
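For completeness, a minimal sketch of iterating the array variant decoded above:

foreach ($input as $key => $value) {
    // Nested structures come back as plain arrays when the second argument of json_decode is true
    echo "$key => " . (is_array($value) ? json_encode($value) : $value) . "\n";
}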
Recommended format
Your update has the following anatomy:
table
fields
filter
So I would recommend a JSON representation of your input with: a tableName field, which is a string; a fields field, which is an array of key-value pairs, the keys naming the columns to be updated and the values holding the new values; and finally a filters field which, if we want to be very elegant, could also be an array of objects of key-value pairs, the keys naming the columns you filter by and the values holding the values you filter with. A sample JavaScript object complying with this format would look like the following:
[
    { // Your query
        tableName: 'authors',
        fields:
        [
            {
                name: 'somename'
            }
        ],
        filters:
        [
            {
                authorId: 123
            }
        ]
    },
    { // Some other example
        tableName: 'books',
        fields:
        [
            {
                isbn: 'someisbn',
                title: 'sometitle'
            }
        ],
        filters:
        [
            {
                pageNumber: 123,
                weight: '5kg'
            }
        ]
    }
]
I have given an example above of two objects, so you can see that:
several updates can be specified in the JSON
you can update several fields in a single command
you can filter by several fields
I should mention that this is a rabbit hole, because you might want to vary the operator as well, but since this is a mere answer, I will not write out a full, elegant project for it. Let me just say that there is a lot of room for improvement; operator dynamicity springs to mind instantly as something you may need.
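To illustrate that last point (this is only a sketch of a possible extension, not part of the format above), filter entries could carry an explicit operator, which would then also have to be whitelisted before ever being placed into SQL:

// Hypothetical extended filter entries: column, operator and value are explicit.
$filters = [
    ['column' => 'pageNumber', 'operator' => '>', 'value' => 123],
    ['column' => 'weight',     'operator' => '=', 'value' => '5kg'],
];

// Only operators from this list may ever be interpolated into the query string.
$allowedOperators = ['=', '!=', '<', '<=', '>', '>='];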
How to generate an update query:
// assuming that $JSON is a variable holding such values as described in the previous chapter
foreach ($JSON as $obj) {
    $tableName = $obj['tableName'];
    $fields = [];
    $filters = [];
    $params = [];
    $toExecute = isset($whiteList['tables'][$tableName]);

    // each entry of 'fields' is itself a key-value object, so unwrap it first
    foreach ($obj['fields'] as $fieldSet) {
        foreach ($fieldSet as $key => $value) {
            $fields[] = $key . ' = :field_value' . $key;
            $params['field_value' . $key] = $value;
            $toExecute = $toExecute && isset($whiteList['fields'][$key]);
        }
    }

    // same for 'filters'
    foreach ($obj['filters'] as $filterSet) {
        foreach ($filterSet as $key => $value) {
            $filters[] = $key . ' = :filter_value' . $key;
            $params['filter_value' . $key] = $value;
            $toExecute = $toExecute && isset($whiteList['filters'][$key]);
        }
    }

    if ($toExecute) {
        $query = 'UPDATE ' . $tableName
               . ' SET ' . implode(', ', $fields)
               . ' WHERE ' . implode(' AND ', $filters);
        // execute $query with $params via your prepared-statement helper
    }
}
I have used a whitelist above to make sure that the queries will not update tables/fields, or use filters, whose names are badly formatted, malicious or simply unwanted. This code is untested and might well contain typos, but the idea should be a good starting point.
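The $whiteList itself is not defined in the snippet; a possible shape for it (using the table and column names from the question purely as an example) could be:

// Hypothetical whitelist: only names present as keys here pass the isset() checks above.
$whiteList = [
    'tables'  => ['authors' => true, 'books' => true],
    'fields'  => ['name' => true, 'isbn' => true, 'title' => true],
    'filters' => ['authorId' => true, 'pageNumber' => true, 'weight' => true],
];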
Laravel 5.5 + Redis.
I've got the following code in a controller:
$products = Cache::remember('category_'.$category->alias.'_page_'.$page, 1440, function () use ($childrenCategoriesIndexes) {
    return Product::whereIn('category_id', $childrenCategoriesIndexes)
        ->userFilter()
        ->paginate(15);
});
It caches each page. But what if there are too many custom filters? This is scopeUserFilter() from the Product model:
public function scopeUserFilter($query) {
    if (request('price_from')) {
        $query->where('price', '>', request('price_from'));
    }
    if (request('price_to')) {
        $query->where('price', '<', request('price_to'));
    }
    return $query;
}
And those are only 2 variables. But what if there are 10 or more variables? How do I cache this data then? I think keys like this are not good:
'category_'.$category->alias.'_page_'.$page.'_'.request('price_from').'_'.request('price_to')
Hash the params; then you can include as many as you want:
$params = [
    'page'       => 1,
    'price_from' => '',
    'price_to'   => '',
    'param0'     => '',
    ...
];

foreach (array_keys($params) as $param) {
    if (request()->has($param)) {
        $params[$param] = request()->input($param);
    }
}

$prefix = 'category_';
$hashed = md5(json_encode($params));
$cache_key = $prefix . $hashed;
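A minimal usage sketch of that key with the controller code from the question (only the key changes, the query stays as it was):

$products = Cache::remember($cache_key, 1440, function () use ($childrenCategoriesIndexes) {
    return Product::whereIn('category_id', $childrenCategoriesIndexes)
        ->userFilter()
        ->paginate(15);
});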
Instead of:
defining an array of all possible parameters
looping through that array to check whether the request contains any of those parameters and if so, assigning that value to the relevant key in the array
creating a hash of the array to use as the key for your cache
...an alternative, and possibly more flexible, approach (which I picked up from this article) would be to:
create an array of all request parameters and sort them alphabetically
rebuild the url using this sorted array
create a hash of this url to use as the key for your cache
This means your code would look something like this:
$url = request()->url();
$queryParams = request()->query();
ksort($queryParams);
$queryString = http_build_query($queryParams);
$fullUrl = "{$url}?{$queryString}";
$rememberKey = sha1($fullUrl);
return Cache::remember($rememberKey, $minutes, function () use ($data) {
    return $data;
});
While @Ben's answer does address how we can cache multiple params, it's not really good practice to cache all requests.
Caching is typically used for the most popular requests (highly frequent reads, infrequent writes), for example caching the top 10 param combinations. If you start caching the long tail, you're defeating the purpose of caching, as you drift towards more frequent writes and less frequent reads. Eventually you'll run out of memory if you're using an in-memory caching engine.
I would suggest rethinking your caching strategy.
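One way to act on that advice (a sketch of my own, not from the answers above): cache only the plain, unfiltered category page, which is usually the popular variant, and send filtered requests straight to the database:

$hasFilters = request()->has('price_from') || request()->has('price_to');

if ($hasFilters) {
    // long-tail request: query directly instead of filling the cache with one-off keys
    $products = Product::whereIn('category_id', $childrenCategoriesIndexes)
        ->userFilter()
        ->paginate(15);
} else {
    // popular request: cache the unfiltered category page as before
    $products = Cache::remember('category_'.$category->alias.'_page_'.$page, 1440, function () use ($childrenCategoriesIndexes) {
        return Product::whereIn('category_id', $childrenCategoriesIndexes)->paginate(15);
    });
}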
I call an object that returns an array given certain chained methods:
Songs::duration('>', 2)->artist('Unknown')->genre('Metal')->stars(5)->getAllAsArray();
The problem is that every time I want to get this array, for example in another script, I have to chain everything again. Now imagine that in over 10 scripts.
Is there a way to recall the chained methods for later use?
Since you can't cache the result, you could cache the structure of the call chain in an array.
$chain = [
    'duration'      => ['>', 2],
    'artist'        => 'Unknown',
    'genre'         => 'Metal',
    'stars'         => 5,
    'getAllAsArray' => null,
];
You could use that with a function that emulates the chained call using the cached array:
function callChain($object, $chain) {
    foreach ($chain as $method => $params) {
        $params = is_array($params) ? $params : (array) $params;
        $object = call_user_func_array([$object, $method], $params);
    }
    return $object;
}

$result = callChain('Songs', $chain);
If you cannot cache your results as suggested, then, as I commented, here are a couple of ideas. If your application allows mixing functions and classes (as in, you are permitted to by your company's development standards), you can use a function wrapper:
// The function can be as complex as you want.
// You can make '>', 2 arguments too if they are going to differ between calls.
function getArtists($array)
{
    return \Songs::duration('>', 2)->artist($array[0])->genre($array[1])->stars($array[2])->getAllAsArray();
}

print_r(getArtists(array('Unknown', 'Metal', 5)));
If you are only allowed to use classes, and __callStatic() is not forbidden in your development standards and is available in the PHP version you are using, you might try that:
// If you have access to the Songs class
public static function __callStatic($name, $args)
{
    // Explode the method name so you have the two important elements of your chain:
    // Unknown_Metal() produces "Unknown" and "Metal" as keys 0 and 1
    $settings = explode("_", $name);
    // Args arrive in an array, so a single value sits in key 0
    $stars = isset($args[0]) ? $args[0] : 5;
    // Return the contents
    return self::duration('>', 2)->artist($settings[0])->genre($settings[1])->stars($stars)->getAllAsArray();
}
This should return the same as your chain:
print_r(\Songs::Unknown_Metal(5));
It should be noted that overloading is hard to follow, because there is no concrete method called Unknown_Metal, so it's harder to debug. Also note I have not tested this particular set-up locally, but I have annotated what should happen where.
If those are not allowed, I would then make a method to shorten that chain:
public static function getArtists($array)
{
    // Note: '>', 2 can be arguments too, I just didn't add them
    return self::duration('>', 2)->artist($array[0])->genre($array[1])->stars($array[2])->getAllAsArray();
}

print_r(\Songs::getArtists(array('Unknown', 'Metal', 5)));
I wrote a lib doing exactly what you're looking for, implementing the principle suggested by Don't Panic in a high-quality way: https://packagist.org/packages/jclaveau/php-deferred-callchain
In your case you would code
$search = DeferredCallChain::new_(Songs::class) // or shorter: later(Songs::class)
    ->duration('>', 2)  // the static "::" syntax sadly cannot handle chaining
    ->artist('Unknown')
    ->genre('Metal')
    ->stars(5)
    ->getAllAsArray();
print_r( $search($myFirstDBSongs) );
print_r( $search($mySecondDBSongs) );
Hoping it will match your needs!
I have a database table with a list of books. Below is my SQL statement:
SELECT `Book`.`id` , `Book`.`name` , `Book`.`isbn` , `Book`.`quantity_in_stock` , `Book`.`price` , (`Book`.`quantity_in_stock` * `Book`.`price`) AS `sales`, concat(`Author`.`name`, ' ', `Author`.`surname`) AS `author`
FROM `books` AS `Book`
LEFT JOIN authors AS `Author`
ON ( `Book`.`author_id` = `Author`.`id` )
WHERE (`Book`.`quantity_in_stock` * `Book`.`price`) > 5000.00
The query works fine and the workflow works fine too. However, I want to access this through an API and make the 5000.00 value configurable through a variable bar.
The question is: how do I make this possible, so that it works when I call my API with the endpoint below?
https://domain.flowgear.io/5000booklist/{sales_value}
What I want is to be able to re-use my workflow via an API and just pass the sales value I want to query the table against. The sales value can be 2000 or 5000, depending on what I want to achieve.
Add a variable bar and add a property to it called "salesValue"
In the workflow detail pane, provide this url: "/booklist/{salesValue}" - the value in braces must match the name of the property in the variable bar
Add a Formatter, put your SQL template including "WHERE (Book.quantity_in_stock * Book.price) > {salesValue}" in the Expression property then add a custom field called salesValue and pin that from the variable bar salesValue property. Set Escaping to SQL.
Take the output of the Formatter and plug that into the SQL Query property of a SQL Query Connector.
Add another variable bar, and add the special properties FgResponseBody and FgResponseContentType
Pin the SQL result to FgResponseBody and set FgResponseContentType to 'text/xml'
If you want to return JSON, convert the result from the SQL Query to JSON using JSON Convert and then pin that to FgResponseBody and set FgResponseContentType to 'application/json'
@sanjay I will try to give you an overview of what I did back when I was experimenting with Flowgear through PHP, following the instructions from here.
I am not sure whether you are also invoking the Flowgear REST API through PHP or another language, but regardless, I presume the logic should remain the same.
What I did was wrap the PHP cURL sample code in a class so that I could reuse it. Below is code I wrote for a simple select query:
<?php
// Require the FlowgearConnect class
require_once '/path/to/flowgear_class_with_api_call.php';

try {
    $workflow = new FlowgearConnect(include 'endpoints.php');
    $serial   = $_POST['serial'];
    $clientId = $_POST['client_id'];

    // Get the results
    $sql = '';
    if (empty($serial)) {
        $conditions = sprintf(' `a`.`client_id` = %s AND `a`.`serial` > -1 ORDER BY `a`.`serial` ASC', $clientId);
    } else {
        $conditions = ' `a`.`serial` = ' . $serial;
    }

    /**
     * In your workflow you will most probably have a VARIABLE BAR that holds your
     * request parameters, which is what $conditions speaks to.
     */
    $conditions = array('conditions' => $conditions);
    $results = $workflow->getResults('orders', 'orders', $conditions);
} catch (\Exception $e) {
    // Log any exceptions thrown by the API here, or do whatever
}
The listing above should be self-explanatory. Below I will show you the functions I have made use of from my FlowgearConnect class. This is not a standard way, as you may configure your code differently to suit your needs.
// FlowgearConnect constructor
class FlowgearConnect
{
    protected $endpoints = [];
    protected $domain = "https://your-domain.flowgear.io";

    public function __construct(array $endpoints)
    {
        $this->endpoints = $endpoints;
    }

    public function getResults($model, $workflow, $options = array())
    {
        $endpoint = $this->getEndpoint($model, $workflow);
        $results  = array();
        if (!empty($endpoint)) {
            // $authOpts holds the authentication options; it is set up in the parts of the class elided below
            $results = FlowgearInvoke::run($authOpts, $endpoint, $options, array('timeout' => 30));
        }
        return $results;
    }

    ....
}
The endpoints.php file, as mentioned before, just returns an array of configured endpoints and/or workflow names from within the Flowgear console. Below is an excerpt of how mine looked:
return array(
    'orders' => array(
        'shipped_orders' => '/shipped_orders',
        // etc
    ),
    'items' => array(
        'your_model' => '/workflow_name_from_flowgear_console',
    ),
);
This is just a basic select query with Flowgear's REST API using PHP. If you are lucky, you should get your records the way you have configured the response body for your workflow.
Below is a typical test of a workflow and what you should get back from your API.
I advise you to first create your workflows in your Flowgear console and make sure that they produce the desired output, then extract the parts of your query that you want changed, move them to a variable bar for your request, and have them injected at run-time based on what you are looking to achieve. This explanation can be adapted for other operations such as update and/or delete. The best thing is to understand Flowgear first and make sure you can get everything working there before attempting to create a RESTful interactive application.
Caution: it's been over a year since I last worked with this platform, so you might find errors in this, but I am hoping it will lead you to a solution for your problem. If not, then perhaps you can create a repo and have me check it out to see how you are configuring everything.
I'd like to exclude results from a call to a Lithium model's find() method. I need to do this for models with both MongoDB and MySQL data sources; in SQL terms I mean something like WHERE myfield NOT IN (1,2,3).
I'd like to just be able to pass a not clause in the conditions array like below, but that doesn't appear to be possible.
Item::all(array('conditions' => array('not' => array('myfield' => array(1,2,3)))));
So my question is, is this possible in Lithium in a way that I've overlooked? And if not, what would be the most Lithium-ish way to implement it for my models?
Just to clarify, Lithium's MongoDB adapter supports most SQL comparison operators as a convenience, so for either Mongo or MySQL, you could simply write the query as follows:
Item::all(array('conditions' => array(
    'myfield' => array('!=' => array(1, 2, 3))
)));
And it should give you the results you expect. For MySQL, the query should look something like:
SELECT * FROM items WHERE myfield NOT IN (1, 2, 3);
And in Mongo:
db.items.find({ myfield: { $nin: [1, 2, 3] }})
Merely filtering for MongoDB can easily be achieved like this:
Item::all(array('conditions' =>
    array('myfield' => array(
        '$nin' => array(1, 2, 3)
    ))
));
If this is something you do a lot, you could even create a custom finder for it:
class MyModel extends \lithium\data\Model {
    public static function __init()
    {
        parent::__init();
        static::finder('notin', function ($self, $params, $chain) {
            // Take all array keys that are not option keys
            $array = array_diff_key(
                $params['options'],
                array_fill_keys(array('conditions', 'fields', 'order', 'limit', 'page'), 0)
            );
            // Clean up options, leaving only what li3 expects
            $params['options'] = array_diff_key($params['options'], $array);
            $params['options']['conditions'] = array(
                'myfield' => array(
                    '$nin' => $array
                )
            );
            return $chain->next($self, $params, $chain);
        });
    }
}
And call it like this:
MyModel::notin(array(1,2,3));
In the same manner you could create a custom finder for MySQL sources.
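A sketch of what that could look like, leaning on the '!=' convenience syntax shown earlier (only the conditions block changes; the rest of the finder stays the same):

// Inside the same kind of custom finder, but using the adapter-agnostic comparison syntax,
// which translates to NOT IN (...) on MySQL and $nin on MongoDB.
$params['options']['conditions'] = array(
    'myfield' => array(
        '!=' => $array
    )
);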
As you probably can see, this creates some issues if you pass something like array('fields' => $array), as it would overwrite the option.
What happens is that ::notin() (finders in general) has a distinct behavior for the (array, null) signature: if that happens, it thinks the first array is the options and that the finder took no arguments.
Using notin($array, array()) breaks the previous finder, because the first argument ends up in $params['notin'] when a real second argument (the options) is passed.
If you mix data sources on the fly here, I would create a custom model that does not inherit from \lithium\data\Model and have it delegate to the different models, creating the conditions based on the end model's data source.
class MyFacadeModel {
    public static function byNotIn($conditions, $source) {
        return ($source == "mongodb")
            ? $source::find($rewrittenConditions)
            : $source::find($rewrittenConditionsForMysql);
    }
}
(The code might be slightly incorrect, as it's mostly written from the top of my head.)