Guzzle PHP POST request response issue - php

I am able to make a POST request with Guzzle PHP using the following code:
$request = $this->request('POST', $this->url, array('form_params' => $params));
Everything works fine. But when I call
$request->getBody()->getContents();
A string "root" is attached to the beginning of the content returned.
I dont seems to understand why this is happening.
Any assistance will be appreciated.
An example of what I get when I var_dump is this
string(4) "root"
{"access_token":"kjVbpzmk3VAWTHn3jyeaM1nal1zkFIPZrI8khmKQ",
"token_type":"Bearer",
"expires_in":604800,
"user_id":3,
"user":{
"id":3,
"name":"Thomas Paul"
}
}
Meanwhile in postman I get this
{
"access_token": "y9Jeovb3EERC4oE13yCS8WfFi3XK1eul4D4luwX3",
"token_type": "Bearer",
"expires_in": 604800,
"user_id": 3,
"user": {
"id": 3,
"name": "Thomas Paul"
}
}

This is a security measure, as there is a known security vulnerability in some browsers.
Also, the JSON API spec requires this top-level element.
Guzzle is a really robust library. Maybe Postman isn't, or maybe Postman removes the root element by itself... I don't know.

I solved the problem by taking a
substr($request->getBody()->getContents(), 17)
of the response, which removed the unwanted string, and then decoded the result as JSON.
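As a more general alternative to a fixed offset, one could cut the body at the first opening brace before decoding. This is only a sketch and assumes the prefix never itself contains a "{":
// A minimal sketch: strip everything before the first "{" and decode.
$body = $request->getBody()->getContents();

$start = strpos($body, '{');                      // locate the start of the JSON
$json  = ($start === false) ? $body : substr($body, $start);

$data = json_decode($json, true);                 // decode to an associative array
var_dump($data['access_token'] ?? null);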

Related

What is the correct way to read a large JSON API response in Guzzle 6?

I currently have the following Guzzle 6 implementation returning a stream of JSON data containing user data:
$client = new GuzzleHttp\Client([
    'base_uri' => 'https://www.apiexample.com',
    'handler'  => $oauthHandler,
    'auth'     => 'oauth',
    'headers'  => [
        'Authorization' => 'Bearer xxxxxxxxxxxxxx',
        'Content-Type'  => 'application/json',
        'Accept'        => 'application/json',
    ],
]);
$res = $client->post('example');
$stream = GuzzleHttp\Psr7\stream_for($res->getBody());
The JSON responses look like:
{
    "name": "Users",
    "record": [
        {
            "title": "Consulting",
            "_name": "Users",
            "end date": "07/03/2020",
            "session number": "1",
            "start date": "09/02/2019",
            "course id": "2900",
            "first name": "John",
            "unique user number": "123456",
            "time": "08 AM",
            "last name": "Doe",
            "year": "19-20",
            "location name": "SD"
        },
        .........
    ],
    "#extensions": "activities,corefields,u_extension,u_stu_x,s_ncea_x,s_stu_crdc_x,c_locator,u_userfields,s_edfi_x"
}
This is being run for a number of clients using different API endpoints. Many of them return too many users for the entire JSON response to be loaded into RAM at once, which is why I am using a stream.
There may be a way to get the API to return chunks incrementally, through multiple calls. But from everything I have gotten from the developers of the API it appears that this is intended to be consumed as one streamed response.
I am new to streaming an API response like this and am wondering what the correct approach would be to iterate through the records. Looking at the Guzzle 6 docs, it appears that iteration happens by picking a character position in the string and grabbing that subsection:
http://docs.guzzlephp.org/en/stable/psr7.html#streams
use GuzzleHttp\Psr7;
$stream = Psr7\stream_for('string data');
echo $stream;
// string data
echo $stream->read(3);
// str
echo $stream->getContents();
// ing data
var_export($stream->eof());
// true
var_export($stream->tell());
// 11
I could potentially write something that parses the string in subsections through pattern matching and incrementally writes the data to disk as I move through the response. But it seems like that would be error prone and something I would expect to be part of Guzzle 6.
Can you provide an example of how something like this should work or point out where I might be missing something?
I appreciate it, thanks!
But it seems like that would be error prone and something I would expect to be part of Guzzle 6.
Nope, Guzzle is an HTTP client; it has nothing to do with parsing different response formats.
What you need is a JSON streaming parser. Please take a look at this SO question, and also at these libraries: https://github.com/salsify/jsonstreamingparser, https://github.com/clue/php-json-stream, https://github.com/halaxa/json-machine.
In Guzzle you have two possibilities:
read the response stream manually (as you do currently), but this probably requires manual integration with a JSON streaming parser
stream the whole response to a temporary file (see the "sink" request option) and read this file later with a JSON streaming parser; this should be supported by all of the libraries (a sketch follows below).
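For the second possibility, a minimal sketch might look like the following; it assumes the halaxa/json-machine library and uses a "/record" pointer matching the sample response above:
use GuzzleHttp\Client;
use JsonMachine\Items;

$client = new Client(['base_uri' => 'https://www.apiexample.com']);

// Stream the body straight to a temporary file instead of holding it in RAM.
$tmpFile = tempnam(sys_get_temp_dir(), 'users_');
$client->post('example', ['sink' => $tmpFile]);

// Lazily iterate the "record" array, one user object at a time.
foreach (Items::fromFile($tmpFile, ['pointer' => '/record']) as $record) {
    echo $record->{'first name'}, ' ', $record->{'last name'}, PHP_EOL;
}

unlink($tmpFile);
The other listed parsers support the same file-based pattern with their own APIs.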

Laravel: checking data from one API through another API

I have some data that is compressed with gzip in an application, available from here:
app.myaddress.com/data/api/1.
The data contains several parameters in JSON format, as follows:
{
"id": 1,
"data": "abcabcabcabcabc" //this is the compressed data
}
I need to check the compressed data against another third-party service via an API request, say at an address like app2.myaddress.com/check_data/abcabc, but it requires header authentication:
{
"content-type": "application/json",
"api-key": 123456
}
app2.myaddress.com will return data in JSON format as follows:
{
"name": "hello",
"address": "australia"
}
What I need to do is check the data by accessing a URL like:
app.myaddress.com/data/api/checked/1
then the controller will process the data, including checking it through app2.myaddress.com, and return the value to app.myaddress.com.
Solved by #rafitio:
You can use cURL or Guzzle to access both URL inside your function. Here is the documentation of Guzzle: http://docs.guzzlephp.org/en/stable/request-options.html
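A minimal sketch of such a controller action, assuming a standard Laravel controller, Guzzle installed, and a route like app.myaddress.com/data/api/checked/{id} pointing at it (the endpoint paths and api-key value are taken from the question and are placeholders):
use GuzzleHttp\Client;

class DataController extends Controller
{
    // Handles GET /data/api/checked/{id}
    public function checked($id)
    {
        $client = new Client();

        // 1. Fetch the record (including the compressed "data" field) from the first API.
        $record = json_decode(
            $client->get("https://app.myaddress.com/data/api/{$id}")->getBody()->getContents()
        );

        // 2. Check it against the third-party service, sending the required headers.
        $check = $client->get("https://app2.myaddress.com/check_data/{$record->data}", [
            'headers' => [
                'content-type' => 'application/json',
                'api-key'      => '123456',
            ],
        ]);

        // 3. Return the checker's JSON response to the caller.
        return response()->json(json_decode($check->getBody()->getContents(), true));
    }
}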

Return a JSON file via GET request

(Ionic 3 app)
Let me walk you through what's going on here.
I have a file, information.json, which contains information about various categories.
{
    "items": [
        {
            "name": "Phones and Tablets",
            "category_id": "20",
            "image": "http://somesite/image/cache/catalog/menu-icons/phone-tablet-icon-500x500.png",
            "children": [
                {
                    "name": "Smartphones",
                    "category_id": "60"
                },
                {
                    "name": "Tablets",
                    "category_id": "62"
                },
                {
                    "name": "Accessories",
                    "category_id": "63"
                },
                {
                    "name": "Basic Gsm Phones",
                    "category_id": "64"
                }
            ]
        },
which is just sitting there looking pretty in the main directory of my site.
When I make a GET request:
var headers = new Headers();
headers.append('accept', 'application/json');
let options = new RequestOptions({ headers: headers });

this.http.get("http://somesite/information.json", options)
    .map(data => data.json().items)
    .subscribe(data => {
        this.information = data;
        console.log(this.information);
    });
I get a CORS error in the console:
Failed to load http://somesite/information.json: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:8100' is therefore not allowed access.
I have the Moesif Origin & CORS Changer Chrome extension; when I enable it, everything works fine in the browser.
When I build/run the app on Android, everything seems to work fine.
But when I run the same app on iOS, it doesn't seem to get any response.
I don't know if iOS blocks all CORS requests by default or something.
I tried fiddling around with the .htaccess file of my site and included
Header set Access-Control-Allow-Origin "*", which seems to work for only one request, so I was able to get the categories information, but all other GET requests from my app became invalid. Currently I am looking at setting up a PHP file with the header information, so that when I make a GET request to that PHP file it echoes out information.json, but the problem is I just don't know how to set it up properly.
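For that PHP-proxy idea, a minimal sketch, assuming the script sits next to information.json on the same server, could be:
<?php
// information.php - sets the CORS header and echoes the JSON file untouched.
header('Access-Control-Allow-Origin: *');
header('Content-Type: application/json');
readfile(__DIR__ . '/information.json');
The app would then request http://somesite/information.php instead of the .json file directly.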
If you guys have any other solution for this problem, feel free to share it.
Thanks.

Getting Instagram subscription JSON data from post in PHP

This whole process of subscriptions for the Instagram API seems to be less than straightforward.
I have some code set up to receive the POST data sent when Instagram hits me with a notification of a post from one of my subscriptions. However, when I try to view the raw JSON data it posts, I can't get at it. If I print_r or var_dump it, I just get the number 1.
See my code for accessing the data:
// Catches realtime updates from Instagram
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Retrieves the POST data from Instagram
    $update = file_get_contents('php://input');
    $data = json_decode($update);
    var_dump($data);   // Outputs 1
    print_r($data[0]); // Outputs 1
}
How can I get at the JSON as an array?
This is what the JSON should look like:
[
    {
        "subscription_id": "1",
        "object": "user",
        "object_id": "1234",
        "changed_aspect": "media",
        "time": 1297286541
    },
    {
        "subscription_id": "2",
        "object": "tag",
        "object_id": "nofilter",
        "changed_aspect": "media",
        "time": 1297286541
    },
    ...
]
Thanks for any help.
Update 1
I've used PHP to print the HTTP headers. There is content, because the headers show its length. I'm still unable to get at it, though. This rules out it being an Instagram issue, I think.
If you are using PHP, I guess the simplest way to access input data is via the $_GET and $_POST superglobals. In this case, try var_dump($_POST) and see what you get.
If you get some content from $_POST, you can use json_decode to decode JSON into an array.
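For example, a tiny sketch (reading the raw body as in the question; the second json_decode argument controls the output type):
$raw  = file_get_contents('php://input'); // or the content found in $_POST
$data = json_decode($raw, true);          // true => associative arrays

echo $data[0]['subscription_id'];         // e.g. "1" for the sample payload above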
You can also try some PHP implementations of the Instagram API, like this one: https://github.com/macuenca/Instagram-PHP-API It will do the work you need.
Woop, found the problem and solved it. It's not easy to debug because all of this happens when Instagram hits your page, so you don't really see the output.
What I needed to do was create a foreach loop to run through the decoded JSON. After a lot of debugging and head scratching I found the JSON isn't empty; it just starts with a JSON array.
Anyway, here's the code that now works:
// Catches realtime updates from Instagram
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Retrieves the POST data from Instagram
    $update = file_get_contents('php://input');
    $data = json_decode($update);
    foreach ($data as $k => $v) { // can be multiple updates per call
        $sub_id = $v->subscription_id; // contains the JSON values
        $user   = $v->object_id;
    }
}
If you want to see the output from $sub_id, for example, I suggest logging it or emailing it to yourself.

Submitting a POST request to Piwik.php

I'm trying to send bulk requests to the Piwik tracking API (/piwik.php) and I'm running into a problem. When I send the request (from a PHP script, via AJAX, with cURL, and from Fiddler2), I receive the following:
Debug enabled - Input parameters:<br/>array ( )
token_auth is authenticated!
Loading plugins: { Provider,Goals,UserCountry }
Current datetime: 2013-05-02 16:02:27
The request is invalid: empty request, or maybe tracking is disabled in the config.ini.php via record_statistics=0
My POST body looks like this:
{"requests":["%3Fidsite%3D1%26url%3Dhttp%3A%2F%2Fexample.org%26action_name%3DTest+bulk+log+Pageview%26rec%3D1"],"token_auth":"mytokenhere"}
This is the example straight from their website. I've made sure to set the Content-Type header to "application/json" and that my configuration has record_statistics = 1 explicitly defined.
According to the documentation, this should all work, but I'm still getting the "empty request" error. The import_logs.py script also works, so I know that bulk importing in general is not broken, but I'm not sure how to get the program to accept my data. Has anyone had any luck with it?
Thanks!
Perhaps the problem with your request is that your query strings are URL encoded, but they don't need to be since they're part of the POST body.
Your POST should be like this instead:
{"requests":["?idsite=1&url=http://example.org&action_name=Test+bulk+log+Pageview&rec=1"],"token_auth":"mytokenhere"}
See the example at the docs for the Bulk Tracking API: http://piwik.org/docs/tracking-api/reference/#toc-advanced-bulk-tracking-requests
Figured out what was wrong. Their documentation was incorrect about how the request needed to be formatted. First, URL-encoding the data was unnecessary. Second, the JSON string needs to look like this:
{
    "requests": [
        {
            "apiv": "1",
            "bots": "1",
            "idsite": "1",
            "download": "",
            "cdt": "",
            "dp": "",
            "url": "",
            "urlref": "",
            "cip": "",
            "ua": "",
            "_cvar": {
                "1": [
                    "Not-Bot",
                    "Mozilla/5.0+(Macintosh;+U;+Intel+Mac+OS+X+10_6_5;+en-US)+AppleWebKit/534.10+(KHTML,+like+Gecko)+Chrome/8.0.552.231+Safari/534.10"
                ]
            },
            "rec": "1"
        }
    ]
}
Not all of those pieces of data need to be sent, but that's the necessary format. After that it's just data cleansing.
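For reference, a minimal sketch of posting a payload in that shape as a plain JSON body (shown here with Guzzle; the host name and token are placeholders):
use GuzzleHttp\Client;

$payload = [
    'requests' => [
        [
            'idsite'      => '1',
            'rec'         => '1',
            'url'         => 'http://example.org',
            'action_name' => 'Test bulk log Pageview',
        ],
    ],
    'token_auth' => 'mytokenhere',
];

$client   = new Client();
$response = $client->post('https://piwik.example.org/piwik.php', [
    'headers' => ['Content-Type' => 'application/json'],
    'body'    => json_encode($payload), // no URL encoding inside the JSON body
]);

echo $response->getStatusCode();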
