How can I download a file from S3 in PHP? The following code is not working for me...
Here is my code. The upload works correctly:
try {
    $s3->putObject([
        'Bucket' => $config['s3']['bucket'], // 'eliuserfiles'
        'Key'    => "uploads/{$name}",
        'Body'   => fopen($tmp_name, 'r+'),
        //'SourceFile' => $pathToFile,
        'ACL'    => 'public-read',
    ]);
} catch (Exception $e) {
    // handle the upload failure
}
The download gives me an error:
$objInfo = $s3->get_object_headers($config['s3']['bucket'], "uploads/{$name}");
$obj = $s3->get_object($config['s3']['bucket'], "uploads/{$name}");
header('Content-type: ' . $objInfo->header['_info']['content_type']);
echo $obj->body;
The error is:
PHP Catchable fatal error: Argument 2 passed to Aws\\AwsClient::getCommand() must be of the type array, string given, called in /php_workspace/s3_php/vendor/aws/aws-sdk-php/src/AwsClient.php on line 167 and defined in /php_workspace/s3_php/vendor/aws/aws-sdk-php/src/AwsClient.php on line 211, referer: http://localhost/upload.php
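For what it's worth, get_object() and get_object_headers() are SDK v1 method names; on SDK v3 they fall through to the magic __call(), which passes your first (string) argument where getCommand() expects an array, hence the fatal error above. A minimal v3 sketch of the download, reusing the bucket and key from the upload:
$result = $s3->getObject([
    'Bucket' => $config['s3']['bucket'],
    'Key'    => "uploads/{$name}",
]);
header('Content-Type: ' . $result['ContentType']);
echo $result['Body'];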
A simple way to do this: include the Amazon S3 PHP Class in your project, then instantiate the class:
1. OO method (e.g. $s3->getObject(...)):
$s3 = new S3($awsAccessKey, $awsSecretKey);
2. Statically (e.g. S3::getObject(...)):
S3::setAuth($awsAccessKey, $awsSecretKey);
Then get Objects:
// Return an entire object buffer:
$object = S3::getObject($bucket, $uri);
var_dump($object);
Usually, the most efficient way to do this is to save the object to a file or resource:
<?php
// To save it to a file (unbuffered write stream):
if (($object = S3::getObject($bucket, $uri, "/tmp/savefile.txt")) !== false) {
print_r($object);
}
// To write it to a resource (unbuffered write stream):
$fp = fopen("/tmp/savefile.txt", "wb");
if (($object = S3::getObject($bucket, $uri, $fp)) !== false) {
print_r($object);
}
?>
S3 Class - With Examples
S3 Class - Documentation
You can try this:
$bucket = "bucket-name";
$filetodownload = "name-of-the-file";

if ($s3->doesObjectExist($bucket, $filetodownload)) {
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key'    => $filetodownload,
    ]);
} else {
    echo "file not found";
    die;
}

header("Content-Type: {$result['ContentType']}");
header("Content-Disposition: attachment; filename=\"{$filetodownload}\"");
header('Pragma: public');
echo $result['Body'];
die();
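Alternatively (SDK v3), if you only need to hand the browser a link, a pre-signed URL lets S3 serve the bytes directly instead of proxying them through PHP. A minimal sketch, reusing the same $s3 client and variables:
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => $bucket,
    'Key'    => $filetodownload,
]);
$presigned = $s3->createPresignedRequest($cmd, '+10 minutes');
header('Location: ' . (string) $presigned->getUri());
exit;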
In the code below, Safari does partial downloading, but I keep getting the error:
Failed to load resource: Plug-in handled load
I get inside of the if (isset($_SERVER['HTTP_RANGE'])) branch; error_log verifies this. I checked in Google Chrome to make sure the resource was actually being grabbed, and it is: Chrome downloads the whole file inside the else branch of the aforementioned if statement.
What do I need to do to correct the headers? The stream is always read in chunks of 1024 bytes or less because of the ->read(1024).
$s3 = S3Client::factory([
'version' => 'latest',
'region' => 'A Region',
'credentials' => [
'key' => "A KEY",
'secret' => "A SECRET",
]
]);
//echo "Step 4\n\n";
$bucket = "bucket";
$image_path = $_GET['video_path'];
try {
$result = $s3->getObject([
'Bucket' => $bucket,
'Key' => $image_path
]);
header("Content-Type: {$result['ContentType']}");
$fileSize = $result['ContentLength'];
if (isset($_SERVER['HTTP_RANGE'])){ // do it for any device that supports byte-ranges not only iPhone
// echo "partial";
error_log("Error message We are good on safari in HTTP_RANGE\n", 3, "error_log");
// Read the body off of the underlying stream in chunks
header('HTTP/1.1 206 Partial Content');
header('Accept-Ranges: bytes');
$start = 0;
//$theLengthOfTheNextData = mb_strlen($data, '8bit');
header("Content-Range: bytes 0-1023/$fileSize");
header("Content-Length: 1024");
while ($data = $result['Body']->read(1024))
{
//$theLengthOfTheNextData = mb_strlen($data, '8bit');
//it starts at zero
//$end = $start + $theLengthOfTheNextData - 1;
//$theLengthOfTheNextData = mb_strlen(data);
//echo "Length: $theLengthOfTheNextData\n\n";
//header("Content-Range: bytes $start-$end/$fileSize");
//header("Content-Length: $theLengthOfTheNextData");
//$start = $start + $theLengthOfTheNextData;
// header("Content-Range: bytes $start-$end/$size");
set_time_limit(0);
echo $data;
// Flush the buffer immediately
//#ob_flush();
flush();
}
//header("Content-Length: $fileSize");
//rangeDownload($result['Body'],$fileSize);
}
else {
// echo "Just entire thing";
error_log("Error message ---- NOT good on safari in HTTP_RANGE\n", 4, "error_log");
header("Content-Length: ".$fileSize);
echo $result['Body'];
}
//echo "Step 7\n\n";
//no need to cast as string
// Cast as a string
// $body = (string)$result->get('Body');
//below should actually work too
// $bodyAsString = (string) $result['Body'];
} catch (S3Exception $e) {
echo $e;
//echo "Step 6B\n\n";
//grabbing the file was unsuccessful
$successfulMove = 0;
}
If anyone is looking for the answer to this: you will need to utilize registerStreamWrapper() from the S3 functions.
Reference Link
You will need to scroll down to tedchou12's comment. It works!
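For the archive, here is a minimal sketch of what that looks like with the SDK v3 stream wrapper, serving a single byte range. The bucket name is a placeholder and error handling is omitted; the 'seekable' stream context option is what allows fseek() on the read stream:
$client->registerStreamWrapper();
$path = "s3://my-bucket/{$key}";
$size = filesize($path);

// Parse a single "Range: bytes=start-end" header
preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m);
$start  = (int) $m[1];
$end    = ($m[2] !== '') ? (int) $m[2] : $size - 1;
$length = $end - $start + 1;

header('HTTP/1.1 206 Partial Content');
header('Accept-Ranges: bytes');
header("Content-Range: bytes {$start}-{$end}/{$size}");
header("Content-Length: {$length}");

// 'seekable' => true buffers the stream so fseek() is allowed
$context = stream_context_create(['s3' => ['seekable' => true]]);
if ($stream = fopen($path, 'r', false, $context)) {
    fseek($stream, $start);
    $remaining = $length;
    while ($remaining > 0 && !feof($stream)) {
        $chunk = fread($stream, min(1024, $remaining));
        echo $chunk;
        $remaining -= strlen($chunk);
        flush();
    }
    fclose($stream);
}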
Found a handful of questions on here about this with no answer, so hopefully someone can point me in the right direction...
I'm trying to create and save a CSV file to storage, then update the DB in Laravel. I can create the file successfully, and I can update the DB successfully, but I'm stuck on putting them both together. In my controller, I have this for creating the file (taken from here):
public function updatePaymentConfirm(Request $request) {
$users = User::all();
$fileName = 'test.csv';
$headers = array(
"Content-type" => "text/csv",
"Content-Disposition" => "attachment; filename=$fileName",
"Pragma" => "no-cache",
"Cache-Control" => "must-revalidate, post-check=0, pre-check=0",
"Expires" => "0"
);
$columns = array('First Name', 'Email');
$callback = function() use($users, $columns) {
$file = fopen('php://output', 'w');
fputcsv($file, $columns);
foreach ($users as $user) {
$row['First Name'] = $user->first_name;
$row['Email'] = $user->email;
fputcsv($file, array($row['First Name'], $row['Email']));
}
fclose($file);
};
// return response()->stream($callback, 200, $headers);
}
When the function completes, the last (commented-out) line prompts the user to download the newly created file, which is not the functionality I'm looking for. I tried adding this to my controller in its place to save to storage and also update the database:
$fileModel = new UserDocument;
if($callback) {
$filePath = $callback->storeAs('uploads', $fileName, 'public');
$fileModel->name = $fileName;
$fileModel->file_path = '/storage/' . $filePath;
$fileModel->save();
return back()
->with('success','File has been uploaded.')
->with('file', $fileName);
}
It saves a row to the db, albeit incorrectly, but it doesn't save the file to storage. I've reworked the $filePath line a million times, but I keep getting this error: Call to a member function storeAs() on resource, or something similar. I'm relatively new to working with Laravel, so I'm not sure what I should be looking for. Thoughts?
Removed everything and started over... got it! For anyone else running into the same issue: fopen() in 'w' mode creates the file if it doesn't exist (and truncates it if it does), so you don't have to create a temp file or use $file = fopen('php://output', 'w'); to create it. The newly generated file is automatically "saved" to the path you specified once you fclose() it.
The only thing I'll note is that the directory has to exist (the file doesn't, but the path does). In my instance the path already exists, but if yours doesn't, or you're not sure it does, check whether it exists and make the directory if needed.
public function updatePaymentConfirm(Request $request) {
$user = Auth::user();
$path = storage_path('app/public/docs/user_docs/'.$user->id.'/'); // note the trailing slash
$fileName = $user->ein.'.csv';
$file = fopen($path.$fileName, 'w');
$columns = array('First Name', 'Email Address');
fputcsv($file, $columns);
$data = [
'First Name' => $user->first_name,
'Email Address' => $user->email,
];
fputcsv($file, $data);
fclose($file);
$symlink = 'public/docs/user_docs/'.$user->id.'/';
$fileModel = new UserDocument;
$fileModel->name = 'csv';
$fileModel->file_path = $symlink.$fileName;
$fileModel->save();
return redirect()->route('completed');
}
** UPDATE **
Everything worked perfectly locally, but when I pushed this to production, I received this error 🙄:
fopen(https://..../12-3456789.csv): failed to open stream: HTTP wrapper does not support writeable connections.
I'm saving to an S3 bucket, and I had to rework the entire process: you can't create and/or write to a file in the bucket directly. I had to create a temp file first. Here's where I landed:
$user = Auth::user();
$s3 = Storage::disk('s3');
$path = 'public/docs/user_docs/'.$user->id.'/';
$csvFile = tmpfile();
$csvPath = stream_get_meta_data($csvFile)['uri'];
$fd = fopen($csvPath, 'w');
$columns = array('First Name', 'Email Address');
$data = array(
'First Name' => $user->first_name,
'Email Address' => $user->email,
);
fputcsv($fd, $columns);
fputcsv($fd, $data);
fclose($fd);
$s3->putFileAs('', $csvPath, $path.$user->ein.'.csv');
Today I fixed it with this snippet:
// output up to 5MB is kept in memory, if it becomes bigger it will
// automatically be written to a temporary file
$csv = fopen('php://temp/maxmemory:'. (5*1024*1024), 'r+');
fputcsv($csv, array('blah','blah'));
rewind($csv);
$output = stream_get_contents($csv);
// Put the content directly in file into the disk
Storage::disk('myDisk')->put("report.csv", $output);
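For context, php://temp keeps the buffer in memory up to the maxmemory threshold (5 MB here) and transparently spills over to a temporary file beyond that, which is why the rewind() plus stream_get_contents() pattern works the same way for small and large exports.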
This code is simple and functional; it uses the Laravel Storage class:
https://laravel.com/docs/9.x/filesystem#main-content
use Illuminate\Support\Facades\Storage;
// data array
$results = [
['id' => 0, 'name' => 'David', 'parent' => 1],
['id' => 1, 'name' => 'Ron', 'parent' => 0],
['id' => 2, 'name' => 'Mark', 'parent' => 1]
];
// create a variable to hold the CSV data
$pages = "id,name,parent\n"; // use double quotes: "\n" is not interpolated inside single quotes

// append one CSV row per record
foreach ($results as $where) {
    $pages .= "{$where['id']},{$where['name']},{$where['parent']}\n";
}

// write the file via the Laravel Storage facade
Storage::disk('local')->put('file.csv', $pages);
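If you later need to send that file back to the browser, Laravel's download helper keeps it to a one-liner (the second argument, the client-facing filename, is illustrative):
return Storage::disk('local')->download('file.csv', 'export.csv');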
I have this mostly working but having a tough time finalizing it.
For now I have a simple route:
Route::get('file/{id}/', 'FileController@fileStream')->name('file');
this route connects to an action in the FileController:
public function fileStream($id){
$audio = \App\Audio::where('id', $id)->first();
$client = S3Client::factory([
'credentials' => [
'key' => env('AWS_ACCESS_KEY_ID'),
'secret' => env('AWS_SECRET_ACCESS_KEY'),
],
'region' => env('S3REGION'),
'version' => 'latest',
]);
// Register the stream wrapper from an S3Client object
$client->registerStreamWrapper();
if ($stream = fopen('s3://[bucket_name]/'. $audio->audio_url, 'r')) {
while (!feof($stream)) {
echo fread($stream, 1024);
}
fclose($stream);
}
}
This works in the browser: if I go to a URL like /file/1, it looks up the right file and the raw audio data comes back in a clean browser window.
And then in my view I am trying to output the audio like:
<audio>
<source src="{{ url('file', ['id' => $section->id]) }}" type="{{ $section->audio_mime_type}}"></audio>
</audio>
But no player is getting output to the screen.
TIA
You should use Laravel's streamed response:
return response()->streamDownload(function () use ($audio) {
if ($stream = fopen('s3://[bucket_name]/'. $audio->audio_url, 'r')) {
while (!feof($stream)) {
echo fread($stream, 1024);
flush();
}
fclose($stream);
}
}, 'file-name.ext');
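Note that fopen('s3://...') only resolves because the stream wrapper was registered earlier via $client->registerStreamWrapper(), as in the question; without that call the s3:// scheme is unknown to PHP.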
// Get the URL from S3
// note: fopen() over HTTP requires allow_url_fopen and a readable (public or pre-signed) URL
$fileUrl = \Storage::disk('s3')->url($filePath);
$fileName = 'name_of_file.extension';

// Set headers
header('Content-Description: File Transfer');
header('Content-Disposition: attachment; filename='.$fileName);

if (!($stream = fopen($fileUrl, 'r'))) {
    throw new \Exception('Could not open stream for reading file: ['.$fileName.']');
}

// Read the stream in 1024-byte chunks until it is exhausted
while (!feof($stream)) {
    echo fread($stream, 1024);
}

// Be sure to close the stream resource when you're done with it
fclose($stream);
Use the Laravel/Symfony Response classes; echoing from the controller may not set the right headers.
Even if the headers are set up correctly, you are relying on echo in the controller action, so you would have to call exit(0); at the end of the controller. Bear in mind that this is rather ugly and kills the script; you should always aim to use the Response classes mentioned above.
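A minimal sketch of that idea with Symfony's StreamedResponse (which is what Laravel's streamed responses wrap); $s3, $bucket and $key are assumed to already exist, and the filename is a placeholder:
use Symfony\Component\HttpFoundation\StreamedResponse;

$result = $s3->getObject(['Bucket' => $bucket, 'Key' => $key]);

return new StreamedResponse(function () use ($result) {
    echo $result['Body'];
}, 200, [
    'Content-Type'        => $result['ContentType'],
    'Content-Disposition' => 'attachment; filename="file.ext"',
]);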
My S3 bucket contains .gz objects with JSON inside. I simply want to access this JSON without actually downloading the objects to a file.
$iterator = $client->getIterator('ListObjects', array(
'Bucket' => $bucket
));
foreach ($iterator as $object) {
$object = $object['Key'];
$result = $client->getObject(array(
'Bucket' => $bucket,
'Key' => $object
));
echo $result['Body'] . "\n";
}
When I run the above in the shell, it outputs gibberish on the echo line. What's the correct way to retrieve the contents of the .gz object and save it to a variable?
Thank you
You can use the stream wrapper like this.
$client->registerStreamWrapper();
if ($stream = fopen('s3://bucket/key.gz', 'r')) {
    $compressed = '';
    // While the stream is still open, read 1024 bytes at a time
    while (!feof($stream)) {
        $compressed .= fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
    // Decode the complete gzip payload in one go; calling zlib_decode()
    // on individual 1024-byte chunks would fail past the first read
    echo zlib_decode($compressed);
}
If you are sending it to a browser you don't need to zlib_decode() it; just set a header:
header('Content-Encoding: gzip');
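If you would rather decompress while streaming, so the whole object never has to sit in memory, here is a sketch using PHP's zlib.inflate stream filter (window 15 + 32 auto-detects gzip/zlib headers; bucket and key are placeholders):
$client->registerStreamWrapper();
if ($stream = fopen('s3://bucket/key.gz', 'r')) {
    stream_filter_append($stream, 'zlib.inflate', STREAM_FILTER_READ, ['window' => 15 + 32]);
    // Reads now come out decompressed
    $json = stream_get_contents($stream);
    fclose($stream);
    var_dump(json_decode($json, true));
}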
I'm a little unsure as to how to launch a download of a file from Amazon S3 with Laravel 4. I'm using the AWS SDK.
$result = $s3->getObject(array(
'Bucket' => $bucket,
'Key' => 'data.txt',
));
// temp file
$file = tempnam('../uploads', 'download_');
file_put_contents($file, $result['Body']);
$response = Response::download($file, 'test-file.txt');
//unlink($file);
return $response;
The above works, but I'm stuck with saving the file locally. How can I use the result from S3 correctly with Response::download()?
Thanks!
EDIT: I've found I can use $s3->getObjectUrl($bucket, $file, $expiration) to generate an access URL. This could work, but it still doesn't solve the problem above completely.
EDIT2:
$result = $s3->getObject(array(
'Bucket' => $bucket,
'Key' => 'data.txt',
));
header('Content-type: ' . $result['ContentType']);
header('Content-Disposition: attachment; filename="' . $fileName . '"');
header('Content-Length: ' . $result['ContentLength']);
echo $result['Body'];
Still don't think it's ideal, though?
The S3Client::getObject() method allows you to specify headers that S3 should use when it sends the response. The getObjectUrl() method uses the GetObject operation to generate the URL, and can accept any valid GetObject parameters in its last argument. You should be able to do a direct S3-to-user download with your desired headers using a pre-signed URL by doing something like this:
$downloadUrl = $s3->getObjectUrl($bucket, 'data.txt', '+5 minutes', array(
'ResponseContentDisposition' => 'attachment; filename="' . $fileName . '"',
));
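From a Laravel 4 controller you can then hand that URL straight back as a redirect:
return Redirect::to($downloadUrl);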
If you want to stream an S3 object from your server, then you should check out the Streaming Amazon S3 Objects From a Web Server article in the AWS Developer Guide.
This question is not answered fully. Initially, it asked how to save a file locally on the server itself from S3, to make use of it there.
So: you can use the SaveAs option with the getObject method. You can also specify the version ID if you are using versioning on your bucket and want to make use of it.
$result = $this->client->getObject(array(
'Bucket'=> $bucket_name,
'Key' => $file_name,
'SaveAs' => $to_file,
'VersionId' => $version_id));
That answer is somewhat outdated with the new SDK. The following works with the v3 SDK.
$client->registerStreamWrapper();
$result = $client->headObject([
'Bucket' => $bucket,
'Key' => $key
]);
$headers = $result->toArray();
header('Content-Type: ' . $headers['ContentType']);
header('Content-Disposition: attachment');
// Stop output buffering
if (ob_get_level()) {
ob_end_flush();
}
flush();
// stream the output
readfile("s3://{$bucket}/{$key}");