Can I throw a custom error in laravel-lighthouse? - php

Is there any way to throw an error with GraphQL\Error\Error that includes no additional data except the message?
The current response is:
{
  "errors": [
    {
      "message": "Some Errors",
      "extensions": {
        "reason": "",
        "category": "custom"
      },
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "to the path"
      ],
      "trace": [{}, {}, {}, {}]
    }
  ],
  "data": {
    "testQuery": null
  }
}
It contains unnecessary data. What I want instead is something like this:
{
  "errors": [
    {
      "message": "Some Errors",
      "extensions": {
        "reason": "",
        "category": "custom"
      }
    }
  ],
  "data": {
    "testQuery": null
  }
}

You can throw your own exceptions, which will allow you to do that. Check https://lighthouse-php.com/5/digging-deeper/error-handling.html#registering-error-handlers.

For a simple error you can use this code, in case you don't want to create an error handler class:
use GraphQL\Error\Error;
...
return Error::createLocatedError('Some Errors');
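If you need more control over what ends up in the response, throwing your own exception class (as the linked docs describe) is the other route. Below is a minimal sketch, assuming Lighthouse 5 with graphql-php ^14: an exception implementing GraphQL\Error\ClientAware decides whether its message is shown to the client and which category appears under extensions. The class name is just an example.

use Exception;
use GraphQL\Error\ClientAware;

class CustomException extends Exception implements ClientAware
{
    // Returning true exposes the real message instead of "Internal server error".
    public function isClientSafe()
    {
        return true;
    }

    // Shown as extensions.category in the error response.
    public function getCategory()
    {
        return 'custom';
    }
}

Then throw new CustomException('Some Errors'); from your resolver. Note that locations and path are part of the GraphQL spec's error format, so they normally stay, while the trace block typically only appears when debug output is enabled (APP_DEBUG / the debug setting in config/lighthouse.php).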

Related

Google ads API "PERMISSION_DENIED"

I get the following error:
{
  "message": "The caller does not have permission",
  "code": 7,
  "status": "PERMISSION_DENIED",
  "details": [
    {
      "@type": "type.googleapis.com\/google.ads.googleads.v10.errors.GoogleAdsFailure",
      "errors": [
        {
          "errorCode": {
            "authorizationError": "USER_PERMISSION_DENIED"
          },
          "message": "User doesn't have permission to access customer. Note: If you're accessing a client customer, the manager's customer id must be set in the 'login-customer-id' header. See https:\/\/developers.google.com\/google-ads\/api\/docs\/concepts\/call-structure#cid"
        }
      ],
      "requestId": "asds33ad3sdadad334"
    }
  ]
}
The code I'm using comes from here: https://developers.google.com/google-ads/api/docs/reporting/example
What do I need to do with login-customer-id?
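Assuming you are using the official google-ads-php client library (which that reporting example is built on), the usual fix for this error is to set the manager (MCC) account's customer ID as the login customer ID when building the client. A sketch; the ID below is a placeholder:

use Google\Ads\GoogleAds\Lib\OAuth2TokenBuilder;
use Google\Ads\GoogleAds\Lib\V10\GoogleAdsClientBuilder;

$oAuth2Credential = (new OAuth2TokenBuilder())
    ->fromFile()   // reads credentials from google_ads_php.ini
    ->build();

$googleAdsClient = (new GoogleAdsClientBuilder())
    ->fromFile()                        // reads developerToken etc. from the same file
    ->withOAuth2Credential($oAuth2Credential)
    ->withLoginCustomerId(1234567890)   // the manager account's customer ID, digits only
    ->build();

Equivalently, loginCustomerId can be set in the [GOOGLE_ADS] section of google_ads_php.ini; either way it must be the manager account that has access to the client customer you are querying.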

PHP Quickbooks SDK - Batch requests and handling failures

I'm building out a small app that connects to the QuickBooks API via its SDK. The SDK provides batch operations to help reduce the number of API requests needed.
However, I'm hoping to make a large number of requests (i.e. bulk deletes and uploads in the hundreds or thousands). I've gotten the deletes to work; now I'm hoping to integrate Laravel's queue system so that any items in the $batch that fail (due to business rules or other reasons) are sent to a worker that will reattempt them after waiting a minute.
Below is an example of a delete request.
class QuickBooksAPIController extends Controller
{
    public function batchDelete(Request $request, $category)
    {
        // The batch API accepts at most 30 operations per request.
        $chunks = array_chunk($request->data, 30);
        $batch = [];

        foreach ($chunks as $key => $value) {
            $batch[$key] = $this->dataService()->CreateNewBatch();
            foreach ($value as $id) {
                // One extra API call per item just to fetch the entity to delete.
                $item = $this->dataService()->FindById($category, $id);
                $batch[$key]->AddEntity($item, $id, "delete");
            }
            $batch[$key]->Execute();
        }

        return response()->json(['message' => 'Items Deleted'], 200);
    }
}
The documentation is a bit sparse for my scenario, though. How can I get the failed batch items in order to try them again?
Is using batches even the right choice here? I have to hit the API anyway just to fetch each $item... which doesn't make sense to me (I think I'm doing something wrong there).
EDIT:
I intentionally sent out a request with more than 30 items, and this is the failure message, which doesn't include the values that didn't make the cut.
EDIT#2:
Ended up using array_chunk to split the payload into chunks of 30 items (the API's limit). Doing so helps process many requests. I've adjusted the code above to reflect my current code.
How can I get the failed batch items in order to try again?
If you look at Intuit's documentation, you can see that the HTTP response the API returns contains this information. Here's the example request they show:
{
"BatchItemRequest": [
{
"bId": "bid1",
"Vendor": {
"DisplayName": "Smith Family Store"
},
"operation": "create"
},
{
"bId": "bid2",
"operation": "delete",
"Invoice": {
"SyncToken": "0",
"Id": "129"
}
},
{
"SalesReceipt": {
"PrivateNote": "A private note.",
"SyncToken": "0",
"domain": "QBO",
"Id": "11",
"sparse": true
},
"bId": "bid3",
"operation": "update"
},
{
"Query": "select * from SalesReceipt where TotalAmt > '300.00'",
"bId": "bid4"
}
]
}
And the corresponding response:
{
"BatchItemResponse": [
{
"Fault": {
"type": "ValidationFault",
"Error": [
{
"Message": "Duplicate Name Exists Error",
"code": "6240",
"Detail": "The name supplied already exists. : Another customer, vendor or employee is already using this \nname. Please use a different name.",
"element": ""
}
]
},
"bId": "bid1"
},
{
"Fault": {
"type": "ValidationFault",
"Error": [
{
"Message": "Object Not Found",
"code": "610",
"Detail": "Object Not Found : Something you're trying to use has been made inactive. Check the fields with accounts, customers, items, vendors or employees.",
"element": ""
}
]
},
"bId": "bid2"
},
{
"Fault": {
"type": "ValidationFault",
"Error": [
{
"Message": "Stale Object Error",
"code": "5010",
"Detail": "Stale Object Error : You and root were working on this at the same time. root finished before you did, so your work was not saved.",
"element": ""
}
]
},
"bId": "bid3"
},
{
"bId": "bid4",
"QueryResponse": {
"SalesReceipt": [
{
"TxnDate": "2015-08-25",
"domain": "QBO",
"CurrencyRef": {
"name": "United States Dollar",
"value": "USD"
},
"PrintStatus": "NotSet",
"PaymentRefNum": "10264",
"TotalAmt": 337.5,
"Line": [
{
"Description": "Custom Design",
"DetailType": "SalesItemLineDetail",
"SalesItemLineDetail": {
"TaxCodeRef": {
"value": "NON"
},
"Qty": 4.5,
"UnitPrice": 75,
"ItemRef": {
"name": "Design",
"value": "4"
}
},
"LineNum": 1,
"Amount": 337.5,
"Id": "1"
},
{
"DetailType": "SubTotalLineDetail",
"Amount": 337.5,
"SubTotalLineDetail": {}
}
],
"ApplyTaxAfterDiscount": false,
"DocNumber": "1003",
"PrivateNote": "A private note.",
"sparse": false,
"DepositToAccountRef": {
"name": "Checking",
"value": "35"
},
"CustomerMemo": {
"value": "Thank you for your business and have a great day!"
},
"Balance": 0,
"CustomerRef": {
"name": "Dylan Sollfrank",
"value": "6"
},
"TxnTaxDetail": {
"TotalTax": 0
},
"SyncToken": "1",
"PaymentMethodRef": {
"name": "Check",
"value": "2"
},
"EmailStatus": "NotSet",
"BillAddr": {
"Lat": "INVALID",
"Long": "INVALID",
"Id": "49",
"Line1": "Dylan Sollfrank"
},
"MetaData": {
"CreateTime": "2015-08-27T14:59:48-07:00",
"LastUpdatedTime": "2016-04-15T09:01:10-07:00"
},
"CustomField": [
{
"DefinitionId": "1",
"Type": "StringType",
"Name": "Crew #"
}
],
"Id": "11"
}
],
"startPosition": 1,
"maxResults": 1
}
}
],
"time": "2016-04-15T09:01:18.141-07:00"
}
Notice the separate response object for each request.
The bId value is a unique value you send in the request, which is then echoed back to you in the response, so you can match up the requests you send with the responses you get back.
Here are the docs:
https://developer.intuit.com/app/developer/qbo/docs/api/accounting/all-entities/batch#sample-batch-request
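In PHP, one way to act on that (a sketch only; how you get hold of the raw batch response JSON depends on your SDK version, and the retry job name is hypothetical) is to walk BatchItemResponse and collect every bId that came back with a Fault:

$response = json_decode($rawBatchResponseJson, true);

$failedBatchIds = [];
foreach ($response['BatchItemResponse'] ?? [] as $itemResponse) {
    // A failed item carries a "Fault" block; its bId identifies the original request.
    if (isset($itemResponse['Fault'])) {
        $failedBatchIds[] = $itemResponse['bId'];
    }
}

// Since your code passes the entity ID as the bId, the failed bIds are exactly
// the IDs you still need to delete, e.g. dispatch them to a delayed queue job:
// RetryBatchDelete::dispatch($failedBatchIds)->delay(now()->addMinute());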
Is using batches even the right choice here?
Batches make a lot of sense when you are doing a lot of things all at once.
The way you're trying to use them is... weird. What you should probably be doing is:
Batch 1
- go find all your items
Batch 2
- delete all the items
Your existing code doesn't make sense because you're trying to both find the item and delete the item in the exact same batch HTTP request, which isn't possible via the API.
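A rough sketch of that two-phase shape with the PHP SDK (assuming DataService::Query is available; the entity name and the WHERE condition are only examples):

// Phase 1: one query fetches everything you want to delete.
$items = $this->dataService()->Query(
    "SELECT * FROM Invoice WHERE MetaData.CreateTime < '2020-01-01'"
);

// Phase 2: batch the deletes, 30 per request, with no per-item FindById calls.
foreach (array_chunk($items ?? [], 30) as $chunk) {
    $batch = $this->dataService()->CreateNewBatch();
    foreach ($chunk as $item) {
        // Using the entity's Id as the bId makes failures easy to match up later.
        $batch->AddEntity($item, $item->Id, "delete");
    }
    $batch->Execute();
}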
I intentionally sent out a request with more than 30 items and this is the failure message.
No, it's not. That's a PHP error message - you have an error in your code.
You need to fix the PHP error, and then look at the actual response you're getting back from the API.

Deleting documents older than one hour is not working in Elasticsearch

I am new to Elasticsearch. I have a document that contains thousands of user views, and I now want to delete the views that are older than 3 hours. For this purpose I wrote the following query:
POST {INDEX}/_delete_by_query
{
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "type": "box_views"
          }
        },
        {
          "query": {
            "range": {
              "#created_at": {
                "gte": "now-3h"
              }
            }
          }
        }
      ]
    }
  }
}
When I execute this query I receive the following error:
{ "error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "no [query] registered for [query]",
"line": 1,
"col": 66
}
],
"type": "parsing_exception",
"reason": "no [query] registered for [query]",
"line": 1,
"col": 66 }, "status": 400 }
Your query should look like this:
POST {INDEX}/_delete_by_query
{
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "type": "box_views"
          }
        },
        {
          "range": {
            "#created_at": {
              "gte": "now-3h"
            }
          }
        }
      ]
    }
  }
}
Besides, if you're looking for older documents, I think you should use lte instead of gte.
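For example, for "everything older than three hours" the range clause would then look like this:

{
  "range": {
    "#created_at": {
      "lte": "now-3h"
    }
  }
}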

Elastic search geo point query filter

I am trying to use geo_point for a distance query, but the location field always shows type double, not geo_point. How can I map location to geo_point?
Actually, I have to find all records within 5 km, sorted. My current mapping looks like this:
"pin" : {
"properties" : {
"location" : {
"properties" : {
"lat" : {
"type" : "double"
},
"lon" : {
"type" : "double"
}
}
},
"type" : {
"type" : "string"
}
}
},
And when I search with the query below to find results within 5 km of Delhi's lat/long:
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "pin": {
          "distance": "5",
          "distance_unit": "km",
          "coordinate": {
            "lat": 28.5402707,
            "lon": 77.2289643
          }
        }
      }
    }
  }
}
It shows me the error query_parsing_exception with "No query registered for [pin]". I cannot figure out the problem; it always throws this exception:
{
  "error": {
    "root_cause": [
      {
        "type": "query_parsing_exception",
        "reason": "No query registered for [pin]",
        "index": "find_index",
        "line": 1,
        "col": 58
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "find_index",
        "node": "DtpkEdLCSZCr8r2bTd8p5w",
        "reason": {
          "type": "query_parsing_exception",
          "reason": "No query registered for [pin]",
          "index": "find_index",
          "line": 1,
          "col": 58
        }
      }
    ]
  },
  "status": 400
}
Please help me figure out this problem: how can I map the field as geo_point and resolve this query_parsing_exception / "all shards failed" / status 400 error?
You simply need to map your pin type like this, i.e. using the geo_point data type:
# 1. first delete your index
DELETE shouut_find_index
# 2. create a new one with the proper mapping
PUT shouut_find_index
{
  "mappings": {
    "search_shouut": {
      "properties": {
        "location": {
          "type": "geo_point"
        },
        "type": {
          "type": "string"
        }
      }
    }
  }
}
Then you can index a document like this:
PUT shouut_find_index/search_shouut/1
{
  "location": {
    "lat": 28.5402707,
    "lon": 77.2289643
  },
  "type": "dummy"
}
And finally your query can look like this:
POST shouut_find_index/search_shouut/_search
{
  "query": {
    "filtered": {
      "filter": {
        "geo_distance": {
          "distance": "5km",
          "location": {
            "lat": 28.5402707,
            "lon": 77.2289643
          }
        }
      }
    }
  }
}
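Since you mentioned the results should be sorted, you can also add a _geo_distance sort to that same search; here is a sketch using the same location field and coordinates as above, sorting nearest first:

POST shouut_find_index/search_shouut/_search
{
  "query": {
    "filtered": {
      "filter": {
        "geo_distance": {
          "distance": "5km",
          "location": {
            "lat": 28.5402707,
            "lon": 77.2289643
          }
        }
      }
    }
  },
  "sort": [
    {
      "_geo_distance": {
        "location": {
          "lat": 28.5402707,
          "lon": 77.2289643
        },
        "order": "asc",
        "unit": "km"
      }
    }
  ]
}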

What's the right way to use custom exceptions in Restler?

Let's say instead of just outputting:
{
  "error": {
    "code": 500,
    "message": "Some internal error"
  }
}
I would like to output:
{
  "error": {
    "code": 500,
    "message": "Some internal error",
    "error_code": 1050
  }
}
Also, is there a way to catch all the exceptions, for logging purposes for example?
Use RestException to throw the exception, and use the details parameter (an array) to add additional details:
throw new RestException(400, 'invalid user', array('error_code' => 12002));
gives me the following:
{
  "error": {
    "code": 400,
    "message": "Bad Request: invalid user",
    "error_code": 12002
  },
  "debug": {
    "source": "Say.php:5 at call stage",
    "stages": {
      "success": [
        "get",
        "route",
        "negotiate",
        "validate"
      ],
      "failure": [
        "call",
        "message"
      ]
    }
  }
}
Info: additional debug information is returned when Restler is running in debug mode. It can be turned off with Compose::$includeDebugInfo = false;
Note: make sure you are using Restler 3.0 RC4 or higher.
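For context, here is a minimal sketch of where such a throw might live; the Say class mirrors the Say.php shown in the debug output above, but the validation itself is just an example:

use Luracast\Restler\RestException;

class Say
{
    public function hello($user = '')
    {
        if ($user === '') {
            // Entries in the details array (third argument) are merged into
            // the "error" object of the JSON response, e.g. error_code here.
            throw new RestException(400, 'invalid user', array('error_code' => 12002));
        }
        return "Hello $user!";
    }
}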
