I am pretty new to using Guzzle with Laravel. I currently just use it for communication between my front-end and a separate REST API.
I'm trying to download a file from my API, but I'm not sure how to go about it. I could just specify the path to the file and download it directly, but I also want to be able to stream or view it in the browser, so a user can view a Word document instead of only downloading it.
Currently I'm sending a GET request from the front-end project (which does the API calls from its backend) to the API project:
$resp = $client->request('GET', env('API_BASE_URL').'/project/'.$id.'/download-report', [
    'headers' => [
        'Authorization' => 'Bearer '.session()->get('api_token')
    ]
]);
and in my API backend I return the file with the ->download() helper:
return response()->download($report->getPath());
Can someone explain what would be the best way to approach this situation?
Solutions for both would be awesome: just downloading it, or actually viewing the contents of the Word document.
Thanks in advance!
First of all, it's better to serve files with the web server and leave PHP for "smarter" work. The idea behind it is to generate a "secure" link that hides the real file path and has an expiration timeout.
There are a few ways to do that, depending on your server (nginx, Apache or something else). nginx is a good example: you can generate such a link in your API and send it to the end user through your frontend.
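As a sketch of the nginx approach: with the secure_link module, the API only has to compute the same hash that nginx is configured to check. The secret, the URI layout and the exact secure_link_md5 expression below are assumptions; they must match whatever you put in your nginx config.

```php
<?php
// Build an expiring download link for nginx's secure_link module.
// Assumes an nginx location configured roughly like:
//   secure_link $arg_md5,$arg_expires;
//   secure_link_md5 "$secure_link_expires$uri mysecret";
function signedDownloadUrl(string $baseUrl, string $uri, string $secret, int $ttl = 300): string
{
    $expires = time() + $ttl;
    // nginx expects a base64url-encoded *binary* MD5, without '=' padding
    $hash = base64_encode(md5($expires . $uri . ' ' . $secret, true));
    $hash = strtr(rtrim($hash, '='), '+/', '-_');
    return $baseUrl . $uri . '?md5=' . $hash . '&expires=' . $expires;
}

echo signedDownloadUrl('https://files.example.com', '/reports/42.docx', 'mysecret');
```

The API returns this URL to the frontend, and nginx serves the file directly as long as the hash matches and the expiry has not passed.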
If you prefer to do it in PHP for some reason, just download the file to a temporary file and send it using response()->download() with the corresponding headers (Content-Type) from your frontend:
$tmpFile = tempnam(sys_get_temp_dir(), 'reports_');

$resp = $client->request('GET', '...', [
    // ... your options ...
    'sink' => $tmpFile,
]);

return response()->download(
    $tmpFile,
    null,
    ['Content-Type' => $resp->getHeaderLine('Content-Type')]
)->deleteFileAfterSend(true);
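For the "view in the browser" part of the question, it is the Content-Disposition header that decides between viewing and downloading. Note that browsers can only render types they understand (PDF, images, plain text); a .docx will be downloaded regardless unless it is converted first. A minimal sketch (the helper name and the list of viewable types are assumptions):

```php
<?php
// Decide the Content-Disposition for serving a file: 'inline' lets the
// browser render types it understands, 'attachment' forces a download.
// Browsers cannot render .docx natively, so Word files fall through to
// 'attachment' either way.
function contentDisposition(string $mimeType, string $filename): string
{
    $viewable = [
        'application/pdf',
        'image/png',
        'image/jpeg',
        'text/plain',
    ];
    $type = in_array($mimeType, $viewable, true) ? 'inline' : 'attachment';
    return $type . '; filename="' . $filename . '"';
}

// In Laravel, a header built like this can be passed in the headers array
// of response()->file() or response()->download().
echo contentDisposition('application/pdf', 'report.pdf');
```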
I'm trying to update a project to use resumable uploads and have managed to upload to my bucket using client-side code to handle all the PUT requests.
One issue I have though is setting the ACL on the object itself.
Client Side.
I have tried setting the header on the PUTs with both,
'x-goog-acl':'public-read'
and
'acl':'public-read'
The latter works fine with my non-resumable uploads, but I'm not 100% sure which I'm expected to use with resumable uploads, or if it even matters.
Server Side
I'm using the 'beginSignedUploadSession' method from the Google Cloud Storage for PHP library.
I've seen examples like:
$bucket->upload(
    fopen('/data/file.txt', 'r'),
    ['predefinedAcl' => 'publicRead']
);
So I've tried...
$url = $object->beginSignedUploadSession([
    'predefinedAcl' => 'publicRead'
]);
However, looking at the docs, the predefinedAcl parameter does not seem to be supported for this method.
beginSignedUploadSession Parameters
The only thing I can think to try is using the headers directly, like:
$url = $object->beginSignedUploadSession([
    'headers' => ['x-goog-acl' => 'public-read'],
    'contentType' => $filetype
]);
This also seems to fail with both the 'x-goog-acl' and 'acl' headers.
So my question is: does anyone know the correct way to set the ACL on an object using the beginSignedUploadSession method, or is there a workaround if it is not possible directly?
Thanks.
Update
So far the only way I've been able to do this is to edit the source of the library.
I've hard-coded the header in google-cloud-php/Core/src/Upload/SignedUrlUploader.php with:
'x-goog-acl' => 'public-read'
This is obviously horrible, but it works for me for now.
I'm still very much interested in the correct way, or in why the header isn't getting passed through. If I find out at a later date, I'll update this post.
Thanks again.
You can use the getResumableUploader function of the Google Cloud Storage library; it has an option for the ACL.
getResumableUploader
$uploader = $bucket->getResumableUploader(
    fopen('/data/file.txt', 'r'),
    ['predefinedAcl' => 'publicRead']
);
Then to upload:
try {
    $object = $uploader->upload();
} catch (GoogleException $ex) {
    $resumeUri = $uploader->getResumeUri();
    $object = $uploader->resume($resumeUri);
}
Good afternoon all,
Can someone please give me an example of a PUT request? I have seen a couple online but I can't seem to get any to work. I am trying to create an app for my live-streaming channel; below is what I am trying to use PUT for.
Here is the DEV link to the API: https://dev.streamelements.com
So the base URL would be: https://api.streamelements.com/kappa/v2
The PUT I need is the following:
/points/{channel}/{user}/{amount}
Media type: application/json
So I understand the URL in full if it were a GET:
(api.streamelements.com/kappa/v2/points/channel id removed/username removed)
That gives me my points on a selected channel, but to add or remove points it has to be a PUT, and I have no idea how to use one. If anyone could give me an example of the above, I could learn from it and do all the other requests myself.
Many thanks for your time,
Kev (TADS)
You need to be familiar with an HTTP client such as Guzzle, or another client that implements the PSR-7 interface.
In your case the code should look like:
$client = new GuzzleHttp\Client();
$client->put('https://api.streamelements.com/kappa/v2/REST_OF_URL', [
    'headers' => ['X-Foo' => 'Bar'],
    // the endpoint expects application/json, so use the 'json' option,
    // which encodes the array and sets the Content-Type header for you
    'json' => [
        'field1' => 'value1',
        // ...
    ],
    'timeout' => 10
]);
Obviously I'm assuming you know how to include the Guzzle library in your project, using Composer or a standalone include.
I want to use drupal_http_request to upload a file to another REST API and then parse the result as JSON.
I am able to generate a "PHP client" for this in Postman, but it doesn't work in Drupal. Here is what it looks like:
$client = new http\Client;
$request = new http\Client\Request;
$request->setRequestUrl('https://api.cloudmersive.com/virus/scan/file');
$request->setRequestMethod('POST');
$request->setHeaders(array(
    'cache-control' => 'no-cache',
    'apikey' => 'KEY_HERE'
));
$client->enqueue($request)->send();
$response = $client->getResponse();
echo $response->getBody();
There are two issues. First, Postman doesn't actually add the code to upload the file. Second, the code above won't run in Drupal because it references classes that aren't available (they come from the pecl_http extension), so it looks like I need to use the drupal_http_request function. I'm having a hard time figuring out how to do that, since I don't use PHP much.
Any thoughts on how I could post a file to that endpoint using only the built-in Drupal 7 functions, e.g. drupal_http_request?
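One possible approach, as a sketch: drupal_http_request() only accepts a raw string for its 'data' option, so the multipart/form-data body has to be built by hand. The field name 'inputFile' and the exact headers are assumptions; check the Cloudmersive docs for what the endpoint actually expects.

```php
<?php
// Build a multipart/form-data body by hand; drupal_http_request() has no
// file-upload helper, so the encoding is up to us.
function build_multipart_body($field, $filename, $contents, $boundary) {
  $body  = "--$boundary\r\n";
  $body .= "Content-Disposition: form-data; name=\"$field\"; filename=\"$filename\"\r\n";
  $body .= "Content-Type: application/octet-stream\r\n\r\n";
  $body .= $contents . "\r\n";
  $body .= "--$boundary--\r\n";
  return $body;
}

// Usage inside a Drupal 7 module (drupal_http_request and drupal_json_decode
// only exist there, so this part is commented out):
// $boundary = md5(uniqid());
// $body = build_multipart_body('inputFile', 'test.txt', file_get_contents($path), $boundary);
// $response = drupal_http_request('https://api.cloudmersive.com/virus/scan/file', array(
//   'method' => 'POST',
//   'headers' => array(
//     'Content-Type' => 'multipart/form-data; boundary=' . $boundary,
//     'apikey' => 'KEY_HERE',
//   ),
//   'data' => $body,
// ));
// $result = drupal_json_decode($response->data);
```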
I have a service app and I want to download a file from a user's Drive using the msgraph-sdk for PHP.
I am able to upload a file using
$graph->createRequest("PUT", "/users/".$userId."/drive/items/root:/".$folderName."/".$fileName.":/content")
    ->upload($filePath);
But I am not able to download it. Here is the code I am using:
$graph = new Graph();
$graph->setAccessToken($accessToken);
$graph->createRequest("GET", "/users/".$userId."/drive/items/".$docId."/content")
    ->download($filePath);
The error is: PHP Warning: fclose(): 16 is not a valid stream resource in microsoft-graph/src/Http/GraphRequest.php on line 344
I noticed that when making a request to https://graph.microsoft.com/v1.0/users/{USER_ID}/drive/items/{DOC_ID}/content in Postman, passing Authorization: Bearer {ACCESS_TOKEN}, the response is the file's content, which is the expected behavior; for some reason, calling it via PHP does not work.
Am I doing something wrong?
---- UPDATE ----
The problem seems to be related to the response I get after making the request. The documentation says the response code will be 302 and I need to follow the address in the Location header, but it looks like MS Graph (or the Guzzle client) is not redirecting to that address. I tried editing GraphRequest.php to enable redirects when creating the Client, like this:
$clientSettings = [
    'base_uri' => $this->baseUrl,
    'verify' => false,
    'allow_redirects' => true,
    'headers' => $this->headers
];
$client = new Client($clientSettings);
but it didn't work.
I believe this is a bug; I have logged it here. A fix is checked in and we are getting ready to publish a new version of the SDK, so you should be able to pull it in soon. In the meantime, you can incorporate the PR and verify that it solves the problem you are seeing.
I wrote a scheduling app in Laravel 5 that builds some JSON from local storage and sends it to the controller and then to the view. The view is mostly JavaScript that parses the JSON and builds the schedule. I would like to extend this so that a client coder can generate their own JSON, POST it to my app, and get a full view/schedule back.
I'm using PHP 7 and the `php -S` option to bring up two servers: one hosting the main schedule that I have, and one hosting the client test code that posts to the schedule.
require 'vendor/autoload.php';
use GuzzleHttp\Client;
$uri ='/api/clientJSON';
$uri_token ='eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzdWIiOjEsImlzcyI6Imh0dHA6XC9cL2xvY2FsaG9zdDo4MDAwXC9hcGlcL2F1dGhlbnRpY2F0ZSIsImlhdCI6MTQ1Njg0NDI1NiwiZXhwIjoxNDU2ODQ3ODU2LCJuYmYiOjE0NTY4NDQyNTYsImp0aSI6IjI2ODZjZWIwNjI2ZDVmZWE1YmVlZjMwNzM0ZDhkMzZmIn0.hWrIGNGLlIHOLP9FltefsN066WOHpGTm2SmsF6feAsI';
$calendarJson = '{"auth":"test"}';
$client = new Client([
    'base_uri' => 'http://localhost:8000',
    'timeout' => 2.0,
]);

$response = $client->request('POST', $uri, [
    'query' => ['token' => $uri_token],
    'form_params' => [
        'calendarJson' => $calendarJson
    ]
]);
echo $response->getBody();
The issue seems to be that while the body does echo out, it only pulls in the HTML page; the dependent JS/CSS files are not loaded. I'm clearly missing something fundamental about building my web service. Can someone enlighten me, please?
While Guzzle is a PHP HTTP client, it is not a web browser. It will perform HTTP requests, but it will not parse the response to determine whether other files need to be downloaded to view the content properly.
As observed, if a request is actioned and the response contains references to externally stored JavaScript or CSS files, those files will not be requested.
For years the internet has been very good at blurring the lines between content and presentation. As a service provider, you are interested in the content obtained from external sources, not in how those sources present it.
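To make this concrete, one can list the assets a browser would fetch but Guzzle never will. A minimal sketch using PHP's DOM extension (the sample HTML and paths are made up):

```php
<?php
// List the external assets referenced by an HTML page. Guzzle returns the
// page itself, but it never issues requests for these.
function referencedAssets(string $html): array
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings for sloppy real-world HTML
    $assets = [];
    foreach ($doc->getElementsByTagName('script') as $script) {
        if ($script->getAttribute('src') !== '') {
            $assets[] = $script->getAttribute('src');
        }
    }
    foreach ($doc->getElementsByTagName('link') as $link) {
        if ($link->getAttribute('rel') === 'stylesheet') {
            $assets[] = $link->getAttribute('href');
        }
    }
    return $assets;
}

$html = '<html><head><link rel="stylesheet" href="/css/app.css">'
      . '<script src="/js/schedule.js"></script></head><body></body></html>';
print_r(referencedAssets($html));
```

Relative paths like these resolve against whatever origin the browser is currently on, which is why the echoed page on the client server cannot find assets hosted by the schedule server; emitting absolute URLs (e.g. via Laravel's asset() helper with APP_URL configured) is one way around that.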