Need help downloading the zip files listed in a JSON response.
I am using Guzzle to fetch a JSON response from a remote server, which returns filenames and modification dates. I want to download all of these zip files to the storage folder of my local server.
So it would work like this:
the remote server gives a JSON response with filenames;
I take the filenames, build the download links, and have Guzzle download those files to my local server.
Here is my code
ini_set('max_execution_time', '0');
$url = "https://feeds.example.com/zips/publisher/";
try {
    $client = new GuzzleHttp\Client();
    $res = $client->request('GET', $url, [
        'auth' => ['username', 'password'],
        'headers' => [
            'Accept' => 'application/json',
            'Content-Type' => 'application/json'
        ],
    ]);
    $responses = json_decode($res->getBody()->getContents());
    foreach ($responses->incrementalFeed as $feed) {
        $downloadLink = $url . $feed->name;
        $client->get($downloadLink, ['auth' => ['username', 'password']]);
        Storage::put($feed->name, $downloadLink);
    }
} catch (ClientException $e) {
    echo $e->getMessage();
}
It creates all the zip files in my storage folder, but every file is about 1 KB and essentially empty, whereas on the remote site those files range from 10 MB to 600 MB.
What am I doing wrong here?
You're never saving the downloaded contents: you make the GET request but throw the response away, and Storage::put() then writes the URL string itself, which is why every file is about 1 KB.
Capture Guzzle's response and store its body instead, e.g.:
$response = $client->get($downloadLink, ['auth' => ['username', 'password']]);
Storage::put($feed->name, $response->getBody());
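Since the files run from 10 MB to 600 MB, it may also be worth streaming each download straight to disk rather than buffering the whole body in memory. A minimal sketch using Guzzle's sink option (assuming Laravel's local disk, where Storage::path() resolves a path inside the storage folder):
foreach ($responses->incrementalFeed as $feed) {
    // Guzzle writes the response body directly to this file,
    // so the whole zip never has to fit in memory.
    $client->get($url . $feed->name, [
        'auth' => ['username', 'password'],
        'sink' => Storage::path($feed->name),
    ]);
}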
Related
I'm attempting to retrieve a file attachment with Guzzle. The file isn't available directly at a URL; instead, the download is initiated via the endpoint and sent to my browser. Can I retrieve this file with Guzzle?
I successfully log in to the site, but what is saved to my file is the HTML of the site, not the download. The file contents come through when I make the request with the Insomnia REST client, but not with Guzzle.
$client = new GuzzleHttp\Client();
$cookieJar = new \GuzzleHttp\Cookie\CookieJar();
$response = $client->post('https://test.com/login', [
    'form_params' => [
        'username' => $username,
        'password' => $password,
        'action' => 'login'
    ],
    'cookies' => $cookieJar
]);
$resource = fopen(__DIR__.'/../../feeds/test.xls', 'w');
$stream = GuzzleHttp\Psr7\stream_for($resource);
$response = $client->request('GET', 'https://test.com/download', ['sink' => $stream]);
If you want to perform an authentication step and then a download step, you'll need to make sure the cookies are persisted across both requests. Right now you're only passing your $cookieJar variable to the first one.
The explicit way of doing this would be to add it to the options for the second request:
['sink' => $stream, 'cookies' => $cookieJar]
but it might be easier to take advantage of the option in the client constructor itself:
$client = new GuzzleHttp\Client(['cookies' => true]);
That means that every request (with that client) will automatically use a shared cookie jar, and you don't need to worry about passing it into each request separately.
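A short sketch of the whole two-step flow with the shared jar (URLs and form fields are the placeholders from the question):
// Every request made with this client shares one cookie jar automatically.
$client = new GuzzleHttp\Client(['cookies' => true]);

// Step 1: log in; the session cookie lands in the shared jar.
$client->post('https://test.com/login', [
    'form_params' => ['username' => $username, 'password' => $password, 'action' => 'login']
]);

// Step 2: download; the same jar is sent, so the session is recognised.
$client->request('GET', 'https://test.com/download', [
    'sink' => __DIR__.'/../../feeds/test.xls'
]);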
You should send a Content-Disposition header to tell the browser to treat the response as a file download. Your GET request captures the contents into the $stream resource; rewind the stream afterwards and you can echo those contents to the browser.
<?php
// your 3rd party end-point authentication
...
header('Content-Type: application/vnd.ms-excel');
header('Content-Disposition: attachment; filename="test.xls"');
// Open read/write ('w+') so the file can be read back after Guzzle writes it.
$resource = fopen(__DIR__.'/../../feeds/test.xls', 'w+');
$stream = GuzzleHttp\Psr7\stream_for($resource);
$response = $client->request('GET', 'https://test.com/download', ['sink' => $stream]);
// Rewind before reading: Guzzle leaves the pointer at the end of the stream.
$stream->rewind();
echo $stream->getContents();
Need help using Guzzle 6 to download a file from a REST API. I don't want the file saved locally; I want it downloaded through the web browser. Code so far is below, but I believe I am missing something?
<?php
//code for Guzzle etc removed
$responsesfile = $client->request('GET', 'documents/1234/content', [
    'headers' => [
        'Cache-Control' => 'no-cache',
        'Content-Type' => 'application/pdf',
        'Content-Type' => 'Content-Disposition: attachment; filename="test"'
    ]
]);
return $responsesfile;
?>
Just do some research in Guzzle's docs on the sink request option. For example:
Pass a string to specify the path to a file that will store the contents of the response body:
$client->request('GET', '/stream/20', ['sink' => '/path/to/file']);
Pass a resource returned from fopen() to write the response to a PHP stream:
$resource = fopen('/path/to/file', 'w');
$client->request('GET', '/stream/20', ['sink' => $resource]);
Pass a Psr\Http\Message\StreamInterface object to stream the response body to an open PSR-7 stream:
$resource = fopen('/path/to/file', 'w');
$stream = GuzzleHttp\Psr7\stream_for($resource);
$client->request('GET', '/stream/20', ['sink' => $stream]);
stream_for is deprecated in version 7.2. You can use
GuzzleHttp\Psr7\Utils::streamFor($resource) instead.
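For Guzzle 7.2+, the PSR-7 variant would therefore look like this (same placeholder endpoint as above):
$resource = fopen('/path/to/file', 'w');
$stream = GuzzleHttp\Psr7\Utils::streamFor($resource);
$client->request('GET', '/stream/20', ['sink' => $stream]);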
First of all, a Content-Type header only makes sense when you send a body (POST/PUT), not for GET requests.
Secondly, what exactly is your issue? Guzzle by default does not store the response body (the file) anywhere, so you can work with it inside your app, e.g. via $responsesfile->getBody().
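For what the question seems to be after, a minimal sketch (assuming a Laravel controller, and that the endpoint really returns a PDF; the filename is made up):
$responsesfile = $client->request('GET', 'documents/1234/content');

// Relay the fetched bytes to the browser as a download.
return response($responsesfile->getBody()->getContents(), 200, [
    'Content-Type' => 'application/pdf',
    'Content-Disposition' => 'attachment; filename="document-1234.pdf"'
]);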
Download link:-
<a href='"+ downloadlink + '/attachment-download/' + $('#employee_ID').val()+'/'+ res[i].file_name +"'>
Route:-
Route::get('/attachment-download/{id}/{filename}', array(
    'uses' => 'AttachmentsController@getDownloadAttachments'
));
Attachment Controller:-
public function getDownloadAttachments($id, $filename) {
    $file = "./img/user-icon.png";
    $resource = \Employee::WhereCompany()->findOrFail($id);
    $path = $this->attachmentsList($resource);
    foreach ($path as $index => $attachment) {
        if ($filename == $attachment['file_name']) {
            $filePath = $attachment['url'];
        }
    }
    //return \Response::download($file);
    return \Response::download($filePath);
}
File URL Output:-
https://zetpayroll.s3.ap-south-1.amazonaws.com/employees/81/Screenshot%20from%202017-04-26%2015%3A07%3A45.png?X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAI57OFN3HPCBPQZIQ%2F20170612%2Fap-south-1%2Fs3%2Faws4_request&X-Amz-Date=20170612T144818Z&X-Amz-SignedHeaders=host&X-Amz-Expires=3600&X-Amz-Signature=59ecc4d11b7ed71bd336531bd7f4ab7c84da6b7424878d6487679c97a8c52ca7
In this, if I try to download the file using a static path like
$file = "./img/user-icon.png";
return \Response::download($file);
it downloads fine. But it does not work for the AWS URL. Please help me figure out how to download the file automatically using that URL, or how to get a path from a URL in Laravel or PHP.
Thank you.
With the following approach all the files are downloaded, and text-based files (.txt, .csv, .pdf, ...) open without problems, but images don't.
$fileContent = file_get_contents($filePath);
$response = response($fileContent, 200, [
    'Content-Type' => 'application/json',
    'Content-Disposition' => 'attachment; filename="'.$filename.'"',
]);
return $response;
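The hard-coded Content-Type is the likely culprit: an image served as application/json won't open. A sketch that sniffs the real MIME type from the downloaded bytes instead (finfo is part of PHP's bundled fileinfo extension):
$fileContent = file_get_contents($filePath);

// Detect the actual MIME type of the downloaded bytes
// rather than hard-coding application/json.
$finfo = new \finfo(FILEINFO_MIME_TYPE);
$mimeType = $finfo->buffer($fileContent) ?: 'application/octet-stream';

return response($fileContent, 200, [
    'Content-Type' => $mimeType,
    'Content-Disposition' => 'attachment; filename="'.$filename.'"',
]);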
I'm trying to get a remote file (an image) using PHP and then put that image into an S3 bucket.
It mostly works, except that the file that is uploaded to S3 is empty when downloaded back again.
Here is my code so far:
$src = "http://images.neventum.com/2016/83/thumb100x100/56f3fb1b9dbd6-acluvaq_63324840.png";
$name = md5(uniqid());
$ext = pathinfo($src, PATHINFO_EXTENSION);
$file_content = file_get_contents($src);
$filename = "{$name}.{$ext}";
try {
    $file = $this->s3->putObject([
        'Bucket' => getEnv('s3bucket'),
        'Key' => $filename,
        'Body' => $file_content,
        'ACL' => 'private'
    ]);
} catch (S3Exception $e) {
    //Catch
}
Hope you can help. Thank you so much in advance.
Update 1:
The problem is not with the ACL (I have tested using "public"); the saved object on S3 is simply not uploaded correctly. I think it is something to do with the encoding, but I have not been able to figure it out.
Once you upload an image to an S3 bucket it is private by default, so users can't download it directly. You need to grant public read access to make the object available for download.
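Alternatively, assuming the AWS SDK for PHP v3 that the putObject call above suggests, you can keep the object private and hand out a time-limited pre-signed URL instead:
// Keep the object private; generate a download URL that expires after an hour.
$cmd = $this->s3->getCommand('GetObject', [
    'Bucket' => getEnv('s3bucket'),
    'Key'    => $filename,
]);
$request = $this->s3->createPresignedRequest($cmd, '+1 hour');
$presignedUrl = (string) $request->getUri();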
I'm trying to upload a video using Laravel and GuzzleHttp to DailyMotion. Here's my code:
$file = "3.mp4";
$fields["file"] = fopen($file, 'rb');
$res = $client->post($upload_url, [
'headers' => ['Content-Type' => 'multipart/form-data'],
$fields
]);
$data3 = $res->getBody();
$response_upload_video = json_decode($data3,true);
echo "<br>Getting dm upload video response: ";
print_r($response_upload_video);
$upload_url is a dynamically generated URL passed by DailyMotion. Upon executing the code above, I'll always get this error:
Production.ERROR: GuzzleHttp\Exception\ClientException:
Client error:
POST
http://upload-02.sg1.dailymotion.com/upload?uuid=werewkrewrewrwer&seal=pppppppppppppppp resulted
in a 400 Bad Request response:
{"error":"invalid content range","seal":"yyyyyyyyyyyyyyyyyy"}
in /home/vagrant/Code/svc-titus/vendor/guzzlehttp/guzzle/src/Exception/RequestException.php:111
But I can upload the video to the same upload URL using Postman.
I don't think you need to specify the Content-Type header; Guzzle will set it automatically when you provide it a resource. Also, the path of your video seems problematic: if the video is stored in the public directory, you need to use public_path() (or the respective path helper function) to get its physical path.
The following should work in GuzzleHttp 6; see "sending form files" in the docs:
http://docs.guzzlephp.org/en/latest/quickstart.html#uploading-data
$file = "3.mp4";
$res = $client->post($upload_url, [
'multipart' => [
[
'name' => 'file',
'contents' => fopen(base_path($file), 'r') // give absolute path using path helper function
]
]
]);
$data3 = $res->getBody();
$response_upload_video = json_decode($data3,true);
echo "<br>Getting dm upload video response: ";
print_r($response_upload_video);