PHP S3 deleteBucketAsync weird behavior - php

I have a little problem with the AWS S3 service. I'm trying to delete a whole bucket and I would like to use deleteBucketAsync().
My code:
$result = $this->s3->listObjects(array(
    'Bucket' => $bucket_name,
    'Prefix' => ''
));

foreach ($result['Contents'] as $file) {
    $this->s3->deleteObjectAsync(array(
        'Bucket' => $bucket_name,
        'Key'    => $file['Key']
    ));
}

$result = $this->s3->deleteBucketAsync([
    'Bucket' => $bucket_name,
]);
Sometimes this code works and deletes the whole bucket in seconds, but sometimes it doesn't.
Can someone please explain how exactly the S3 async functions work?
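For what it's worth, the `*Async` methods in the AWS SDK for PHP return Guzzle promises: the request is only guaranteed to have completed once the promise is waited on, and an un-waited rejection (for example, a BucketNotEmpty error from `deleteBucketAsync` racing the still-pending object deletes) can vanish silently. A sketch of the usual fix, waiting on every delete before removing the bucket; the helper name is illustrative, and the client is passed in so the pattern stands alone:

```php
<?php
// Delete every object in a bucket, wait for the deletes to finish,
// then delete the bucket itself.
function emptyAndDeleteBucket($s3, string $bucket): void
{
    do {
        // listObjects returns at most 1000 keys per call, hence the loop.
        $result = $s3->listObjects(['Bucket' => $bucket]);

        $promises = [];
        foreach ($result['Contents'] ?? [] as $object) {
            // deleteObjectAsync returns a promise immediately; the HTTP
            // request has not necessarily completed (or even been sent).
            $promises[] = $s3->deleteObjectAsync([
                'Bucket' => $bucket,
                'Key'    => $object['Key'],
            ]);
        }

        // Block until every pending delete has settled.
        foreach ($promises as $promise) {
            $promise->wait();
        }
    } while (!empty($result['IsTruncated']));

    // Only now is the bucket guaranteed to be empty.
    $s3->deleteBucket(['Bucket' => $bucket]);
}
```

With a real client this would be called as `emptyAndDeleteBucket($this->s3, $bucket_name);`.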

Performance of Laravel storage streaming file from Digital ocean spaces

I've obtained the following code from searching about the topic
Route::get('/test', function () {
    // Disable the execution time limit when downloading a big file.
    set_time_limit(0);

    $fs = Storage::disk('local');
    $path = 'uploads/user-1/1653600850867.mp3';
    $stream = $fs->readStream($path);

    if (ob_get_level()) ob_end_clean();

    return response()->stream(function () use ($stream) {
        fpassthru($stream);
    }, 200, [
        'Accept-Ranges'  => 'bytes',
        'Content-Length' => 14098560,
        'Content-Type'   => 'application/octet-stream',
    ]);
});
However, when I click play in the UI it takes a good four seconds to start playing. If I switch the disk to local, though, it plays almost instantly.
Is there a way to improve the performance, or to read the stream by range per request?
Edit
My current DO config is as per below
'driver' => 's3',
'key' => env('DO_ACCESS_KEY_ID'),
'secret' => env('DO_SECRET_ACCESS_KEY'),
'region' => env('DO_DEFAULT_REGION'),
'bucket' => env('DO_BUCKET'),
'url' => env('DO_URL'),
'endpoint' => env('DO_ENDPOINT'),
'use_path_style_endpoint' => env('DO_USE_PATH_STYLE_ENDPOINT', false),
But I find two types of integration online, one specifying the CDN endpoint and one that doesn't. I am not sure which one is relevant, though the one that specifies the CDN is for Laravel 8 and I am on Laravel 9.
I had to change my code such that:
I had to use the PHP SDK client for connecting to AWS, because the Laravel API isn't flexible enough to allow passing additional arguments (at least I haven't found anything while researching).
I changed to streamDownload, as I can't see any description of the stream method in the docs despite it being present in the code.
So the code below achieves what I was aiming for: downloading by chunk, based on the range received in the request.
return response()->streamDownload(function () {
    $client = new Aws\S3\S3Client([
        'version'     => 'latest',
        'region'      => config('filesystems.disks.do.region'),
        'endpoint'    => config('filesystems.disks.do.endpoint'),
        'credentials' => [
            'key'    => config('filesystems.disks.do.key'),
            'secret' => config('filesystems.disks.do.secret'),
        ],
    ]);

    $path = 'uploads/user-1/1653600850867.mp3';
    $range = request()->header('Range');

    $result = $client->getObject([
        'Bucket' => 'wyxos-streaming',
        'Key'    => $path,
        'Range'  => $range
    ]);

    echo $result['Body'];
}, 200, [
    'Accept-Ranges'  => 'bytes',
    'Content-Length' => 14098560,
    'Content-Type'   => 'application/octet-stream',
]);
Note:
In a live scenario, you would need to cater for the case where no Range header is specified; the Content-Length should then be the actual file size.
When a Range is present, however, the Content-Length should be the size of the segment being echoed.
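The two notes above can be folded into a small helper: given the incoming Range header and the known object size, it returns the byte offsets and the Content-Length to send. This is a hypothetical helper (the name `resolveRange` is not part of any SDK), and it ignores suffix ranges such as `bytes=-500`:

```php
<?php
// Resolve a "bytes=start-end" Range header against a known file size.
// Returns [start, end, contentLength]. A missing or unparseable header
// means the whole file is requested. Suffix ranges ("bytes=-N") are
// not handled in this sketch.
function resolveRange(?string $rangeHeader, int $fileSize): array
{
    if ($rangeHeader === null || !preg_match('/bytes=(\d*)-(\d*)/', $rangeHeader, $m)) {
        return [0, $fileSize - 1, $fileSize];
    }

    $start = $m[1] === '' ? 0 : (int) $m[1];
    $end   = $m[2] === '' ? $fileSize - 1 : min((int) $m[2], $fileSize - 1);

    return [$start, $end, $end - $start + 1];
}
```

The resulting length would replace the hard-coded `'Content-Length' => 14098560` header; a ranged response would also normally carry a 206 Partial Content status and a `Content-Range` header.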

How to fix upload image to s3 using Laravel

I'm trying to upload an image to S3 using Laravel, but I receive a runtime error. Using Laravel 5.8, PHP 7 and a REST API, I send the base64 in the request body with Postman.
I receive a base64 image and I must upload it to S3 and get the resulting URL.
public function store(Request $request)
{
    $s3Client = new S3Client([
        'region'      => 'us-east-2',
        'version'     => 'latest',
        'credentials' => [
            'key'    => $key,
            'secret' => $secret
        ]
    ]);

    $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
    $image = base64_decode($base64_str);

    $result = $s3Client->putObject([
        'Bucket'     => 's3-galgun',
        'Key'        => 'saraza.jpg',
        'SourceFile' => $image
    ]);

    return $this->sendResponse($result['ObjectURL'], 'message.', 'ObjectURL');
}
Says:
RuntimeException: Unable to open u�Z�f�{��zڱ��� .......
The SourceFile parameter expects a path to the file to upload to S3, not the binary content.
You can use the Body parameter in place of SourceFile, or save the file to a local temporary path and pass that path as SourceFile.
Like this:
public function store(Request $request)
{
    $s3Client = new S3Client([
        'region'      => 'us-east-2',
        'version'     => 'latest',
        'credentials' => [
            'key'    => $key,
            'secret' => $secret
        ]
    ]);

    $input = $request->all(); // the base64 payload from the request body
    $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
    $image = base64_decode($base64_str);

    Storage::disk('local')->put("/temp/saraza.jpg", $image);

    $result = $s3Client->putObject([
        'Bucket'     => 's3-galgun',
        'Key'        => 'saraza.jpg',
        'SourceFile' => Storage::disk('local')->path('/temp/saraza.jpg')
    ]);

    Storage::delete('/temp/saraza.jpg');

    return $this->sendResponse($result['ObjectURL'], 'message.', 'ObjectURL');
}
And if you're using S3 with Laravel, you should consider the S3 filesystem driver instead of accessing the S3Client manually in your controller.
To do this, add the S3 driver (composer require league/flysystem-aws-s3-v3) and put your S3 IAM settings in .env or config/filesystems.php.
Then update the default filesystem in config/filesystems.php, or indicate the disk driver when using Storage: Storage::disk('s3').
See the documentation for details.
Instead of SourceFile you have to use Body. SourceFile is a path to a file, but you do not have a file; you have a base64-encoded image. That is why you need to use Body, which can be a string. More here: https://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject
Fixed version:
public function store(Request $request)
{
    $s3Client = new S3Client([
        'region'      => 'us-east-2',
        'version'     => 'latest',
        'credentials' => [
            'key'    => $key,
            'secret' => $secret
        ]
    ]);

    $input = $request->all(); // the base64 payload from the request body
    $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
    $image = base64_decode($base64_str);

    $result = $s3Client->putObject([
        'Bucket' => 's3-galgun',
        'Key'    => 'saraza.jpg',
        'Body'   => $image
    ]);

    return $this->sendResponse($result['ObjectURL'], 'message.', 'ObjectURL');
}
A very simple way to upload any file to AWS S3 storage.
First, check your .env settings:
AWS_ACCESS_KEY_ID=your key
AWS_SECRET_ACCESS_KEY= your access key
AWS_DEFAULT_REGION=ap-south-1
AWS_BUCKET=your bucket name
AWS_URL=Your URL
Second, config/filesystems.php:
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url'    => env('AWS_URL'),
    // 'visibility' => 'public', // do not use this line for security purposes; try to keep the bucket private.
],
Now for the main code.
Uploading a binary file from an HTML form:
$imageFile = $req->file('userDoc');
$fileName  = 'sh_' . mt_rand(11111, 99999) . '.' . $imageFile->clientExtension();
$s3path    = "/uploads/" . $this::$SchoolCode . "/";
Storage::disk('s3')->put($s3path . $fileName, file_get_contents($imageFile));
Uploading a base64 file
For a public bucket, or if you want to keep the file public:
$binary_data = base64_decode($file);
Storage::disk('s3')->put($s3Path, $binary_data, 'public');
For a private bucket, or if you want to keep the file private:
$binary_data = base64_decode($file);
Storage::disk('s3')->put($s3Path, $binary_data);
I recommend you keep your files private; that is the more secure and safe way. For this, you have to use a pre-signed URL to access the file.
For pre-signed URLs, check this post: How access image in s3 bucket using pre-signed url

AWS s3Client->copyObject succeeds but the copied object is garbled

Please assist.
I am successfully uploading objects to S3 using the following code snippet:
// Send a PutObject request and get the result object.
$result = $this->s3EncryptionClient->putObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions'     => $this->cipherOptions,
    'Bucket'             => $this->s3BucketName,
    'Key'                => $key,
    'ContentType'        => $mimeType,
    'ContentLength'      => filesize($filePath),
    'ContentDisposition' => "attachment; filename='" . $fileName . "'",
    'Body'               => fopen($filePath, 'r')
]);
And I can successfully download the object using the following snippet:
// Download the contents of the object.
$result = $this->s3EncryptionClient->getObject([
    '@MaterialsProvider' => $this->materialsProvider,
    '@CipherOptions'     => $this->cipherOptions,
    'Bucket'             => $this->s3BucketName,
    'Key'                => $key
]);
The problem comes in when I try to copy the object using the following code snippet:
$result = $this->s3Client->copyObject([
    'Bucket'     => $this->s3BucketName,
    'CopySource' => $CopySource,
    'Key'        => $dstKey,
]);
The object seems to be copied incorrectly, and when I try to download the new object using the download code I pasted earlier in this post, the object is not found and an AwsException with a 404 Not Found response is thrown.
Any idea how I can resolve the issue?
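One thing worth checking (hedged, since the actual `$CopySource` value isn't shown): `CopySource` must be the string `source-bucket/source-key` with the key URL-encoded, and a malformed value is a common cause of NoSuchKey/404 errors around copies. Also, the copy should leave `MetadataDirective` at its default `COPY` so the encryption client's envelope metadata travels with the object; without that metadata, the encrypted copy cannot be decrypted. A hypothetical helper for building the `CopySource` value:

```php
<?php
// Build the CopySource parameter for S3's CopyObject operation:
// "source-bucket/source-key", with each key segment URL-encoded
// but the slashes between segments preserved.
function buildCopySource(string $bucket, string $key): string
{
    $encodedKey = implode('/', array_map('rawurlencode', explode('/', $key)));
    return $bucket . '/' . $encodedKey;
}
```

Usage would look like `'CopySource' => buildCopySource($this->s3BucketName, $srcKey)`, where `$srcKey` stands in for the source object's key.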

Uploading file to S3 using presigned URL in PHP

I am developing a web application using PHP. In my application, I need to upload files to an AWS S3 bucket using a pre-signed URL. Currently, I can read a private file from the S3 bucket using a pre-signed URL like this:
$s3Client = new S3Client([
    'version'     => 'latest',
    'region'      => env('AWS_REGION', ''),
    'credentials' => [
        'key'    => env('AWS_IAM_KEY', ''),
        'secret' => env('AWS_IAM_SECRET', '')
    ]
]);

// GetObject
$cmd = $s3Client->getCommand('GetObject', [
    'Bucket' => env('AWS_BUCKET', ''),
    'Key'    => 'this-is-uploaded-using-presigned-url.png'
]);

$request = $s3Client->createPresignedRequest($cmd, '+20 minutes');

// This is for reading the image. It is working.
$presignedUrl = (string) $request->getUri();
When I access $presignedUrl from the browser, I can get the file from S3; it is working. But now I am uploading a file to S3, not reading a file from S3. Normally, I can upload a file to S3 like this:
$client->putObject(array(
    'Bucket' => $bucket,
    'Key'    => 'data.txt',
    'Body'   => 'Hello!'
));
The above code does not use a pre-signed URL, but I need to upload the file using one. How can I upload the file using a pre-signed URL? For example, what I am thinking of is something like this:
$client->putObject(array(
    'presigned-url' => 'url',
    'Bucket'        => $bucket,
    'Key'           => 'data.txt',
    'Body'          => 'Hello!'
));
How can I upload?
It seems reasonable that you can create a pre-signed PutObject command by running:
$cmd = $s3Client->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key'    => $key
]);

$request = $s3Client->createPresignedRequest($cmd, '+20 minutes')->withMethod('PUT');
Then you might want to perform the PUT call from PHP. Note that PHP's http:// stream wrapper is read-only, so file_put_contents() cannot write to a URL; send the body through the stream context instead:
$context = stream_context_create(['http' => [
    'method'  => 'PUT',
    'content' => 'Hello!',
]]);
file_get_contents((string) $request->getUri(), false, $context);
If you want to create a URL that a browser can submit, then you need to have the browser send the file as a form POST. This AWS documentation explains how to create a pre-signed POST request with the fields that you then need to put into an HTML form and display to the user: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-presigned-post.html
Also, this answer might be useful: https://stackoverflow.com/a/59644117/53538

The last step of an AWS EC2 to S3 file upload

I have this code :
require '/home/ubuntu/vendor/autoload.php';

$sharedConfig = [
    'region'  => 'us-west-2',
    'version' => 'latest'
];

$sdk = new Aws\Sdk($sharedConfig);
$s3Client = $sdk->createS3();

$result = $s3Client->putObject([
    'Bucket' => 'my-bucket',
    'Key'    => $_FILES["fileToUpload"]["name"],
    'Body'   => $_FILES["fileToUpload"]["tmp_name"]
]);
It works, basically: it sends a file to S3. But it apparently sends it badly, since the result always shows as a corrupted file... Can anyone tell me what I am doing wrong?
To be specific, the image I am uploading is a JPG, and when I try to look at it on S3 I am told that it "cannot be displayed because it contains errors".
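The likely culprit is `'Body' => $_FILES[...]['tmp_name']`: `Body` takes the content itself, so this uploads the literal temp path string (a few dozen bytes) rather than the image, which would explain the corrupt result. Passing the temp path as `SourceFile` instead (or an `fopen()` stream as `Body`) uploads the actual file contents. A minimal sketch of the corrected parameters; the helper name is illustrative:

```php
<?php
// Build putObject parameters from a PHP file-upload entry ($_FILES[...]).
// SourceFile makes the SDK read the file from disk; Body would treat
// the given string as the object's content.
function buildPutParams(string $bucket, array $upload): array
{
    return [
        'Bucket'     => $bucket,
        'Key'        => $upload['name'],
        'SourceFile' => $upload['tmp_name'],
    ];
}
```

With the code above, the upload would become `$result = $s3Client->putObject(buildPutParams('my-bucket', $_FILES['fileToUpload']));`.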
