Blob upload to S3 with cropme.js - PHP

I really need some help with this. I have never worked with a blob as a resource before, but I need to use cropme.js, and it only outputs the uploaded file as a blob or a base64 string. Here is the GitHub repo for it: https://github.com/shpontex/cropme. I have it working 100% on my side and it gives me the output, but I do not know where to go from here. I have tried to upload the blob as a file to Amazon S3, but all that does is save the file to S3 without an extension, and when you try to open it, it contains the blob link. That is useless, because the blob is removed once the browser is cleared. The base64 route does the same. I just need some direction here, please.
Here is the function in app.js from cropme that gives the result:
let img = document.getElementById('cropme-result')
this.position = this.cropme.position()
this.cropme.crop({
  type: 'blob', // can be set to either 'blob' or 'base64'
  width: 600
}).then(function(res) {
  img.src = res
  document.getElementById('croppedimage').src = img.src
})
},
Here is part of the code which uploads the file to S3:
try {
    $imagecropped = $_POST['image'];
    $key = 'photos/random';

    // S3 details
    $s3Client = new S3Client([
        'version' => 'latest',
        'region' => 'us-west-2',
        'credentials' => [
            'key' => 'MYKEY',
            'secret' => 'MYSECRET',
        ],
    ]);

    $result = $s3Client->putObject([
        'Bucket' => 'uploads',
        'ACL' => 'public-read',
        'Key' => $key,
        'Body' => $imagecropped,
    ]);
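One way forward (a sketch, not from the original post): have the client send cropme's base64 output, decode it server-side, choose a key with a real extension, and set ContentType so the object opens in the browser instead of downloading as an extensionless blob. The helper name and key prefix below are illustrative assumptions.

```php
<?php
// A sketch, assuming the client posts cropme's base64 output (a data URL)
// as $_POST['image']. The helper name and key prefix are illustrative.
function dataUrlToImage(string $dataUrl): array
{
    // Split "data:image/png;base64,AAAA..." into its MIME type and raw bytes.
    if (!preg_match('#^data:(image/(\w+));base64,#', $dataUrl, $m)) {
        throw new InvalidArgumentException('Not a base64 image data URL');
    }
    $binary = base64_decode(substr($dataUrl, strpos($dataUrl, ',') + 1), true);
    if ($binary === false) {
        throw new InvalidArgumentException('Invalid base64 payload');
    }
    return ['mime' => $m[1], 'extension' => $m[2], 'body' => $binary];
}

// Usage with the S3 client above (not executed here):
// $img = dataUrlToImage($_POST['image']);
// $s3Client->putObject([
//     'Bucket'      => 'uploads',
//     'Key'         => 'photos/random.' . $img['extension'], // real extension
//     'Body'        => $img['body'],
//     'ContentType' => $img['mime'], // lets the browser render the image
//     'ACL'         => 'public-read',
// ]);
```

The decoded bytes are what get stored, so the object no longer depends on the browser-local blob URL.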

Related

How to fix upload image to s3 using Laravel

I am trying to upload an image to S3 using Laravel, but I receive a runtime error. I am using Laravel 5.8 and PHP 7, building a REST API, and I send the base64 string in the request body via Postman.
I receive a base64 image, and I must upload it to S3 and return the resulting URL.
public function store(Request $request)
{
    $s3Client = new S3Client([
        'region' => 'us-east-2',
        'version' => 'latest',
        'credentials' => [
            'key' => $key,
            'secret' => $secret
        ]
    ]);
    $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
    $image = base64_decode($base64_str);
    $result = $s3Client->putObject([
        'Bucket' => 's3-galgun',
        'Key' => 'saraza.jpg',
        'SourceFile' => $image
    ]);
    return $this->sendResponse($result['ObjectURL'], 'message.', 'ObjectURL');
}
It says:
RuntimeException: Unable to open u�Z�f�{��zڱ��� .......
The SourceFile parameter expects the path of a file to upload to S3, not the binary data itself.
You can use the Body parameter instead of SourceFile, or save the file to a temporary local path and pass that path as SourceFile.
Like this:
public function store(Request $request)
{
    $s3Client = new S3Client([
        'region' => 'us-east-2',
        'version' => 'latest',
        'credentials' => [
            'key' => $key,
            'secret' => $secret
        ]
    ]);
    $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
    $image = base64_decode($base64_str);

    Storage::disk('local')->put("/temp/saraza.jpg", $image);

    $result = $s3Client->putObject([
        'Bucket' => 's3-galgun',
        'Key' => 'saraza.jpg',
        'SourceFile' => Storage::disk('local')->path('/temp/saraza.jpg')
    ]);

    Storage::delete('/temp/saraza.jpg');

    return $this->sendResponse($result['ObjectURL'], 'message.', 'ObjectURL');
}
And if you're using S3 with Laravel, you should consider the S3 filesystem driver instead of accessing the S3Client manually in your controller.
To do this, add the S3 driver with composer require league/flysystem-aws-s3-v3, and put your S3 IAM settings in .env or config/filesystems.php.
Then update the default filesystem in config/filesystems.php, or indicate the disk explicitly when using the Storage facade: Storage::disk('s3').
For details, see the documentation.
Instead of SourceFile you have to use Body. SourceFile is a path to a file, but you do not have a file; you have the base64-encoded source of an image. That is why you need to use Body, which can be a string. More here: https://docs.aws.amazon.com/aws-sdk-php/v3/api/api-s3-2006-03-01.html#putobject
Fixed version:
public function store(Request $request)
{
    $s3Client = new S3Client([
        'region' => 'us-east-2',
        'version' => 'latest',
        'credentials' => [
            'key' => $key,
            'secret' => $secret
        ]
    ]);
    $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
    $image = base64_decode($base64_str);
    $result = $s3Client->putObject([
        'Bucket' => 's3-galgun',
        'Key' => 'saraza.jpg',
        'Body' => $image
    ]);
    return $this->sendResponse($result['ObjectURL'], 'message.', 'ObjectURL');
}
A very simple way to upload any file to AWS S3 storage.
First, check your .env settings:
AWS_ACCESS_KEY_ID=your key
AWS_SECRET_ACCESS_KEY=your secret access key
AWS_DEFAULT_REGION=ap-south-1
AWS_BUCKET=your bucket name
AWS_URL=your URL
Second, config/filesystems.php:
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    // 'visibility' => 'public', // avoid this for security; try to keep the bucket private
],
Now for the main code.
Upload a binary file from an HTML form:
$fileName = 'sh_' . mt_rand(11111, 99999) . "." . $req->file('userDoc')->clientExtension();
$s3path = "/uploads/" . $this::$SchoolCode . "/";
Storage::disk('s3')->put($s3path . $fileName, file_get_contents($req->file('userDoc')));
Upload a base64 file.
For a public bucket, or if you want to keep the file public:
$binary_data = base64_decode($file);
Storage::disk('s3')->put($s3Path, $binary_data, 'public');
For a private bucket, or if you want to keep the file private:
$binary_data = base64_decode($file);
Storage::disk('s3')->put($s3Path, $binary_data);
I recommend you keep your file private; that is the more secure and safe way. For this, you have to use a pre-signed URL to access the file.
For pre-signed URLs, check this post: How access image in s3 bucket using pre-signed url

Uploading file to S3 using presigned URL in PHP

I am developing a web application using PHP. In my application, I need to upload files to the AWS S3 bucket using a pre-signed URL. Currently, I can read a private file from the S3 bucket using a pre-signed URL like this:
$s3Client = new S3Client([
    'version' => 'latest',
    'region' => env('AWS_REGION', ''),
    'credentials' => [
        'key' => env('AWS_IAM_KEY', ''),
        'secret' => env('AWS_IAM_SECRET', '')
    ]
]);

// GetObject
$cmd = $s3Client->getCommand('GetObject', [
    'Bucket' => env('AWS_BUCKET', ''),
    'Key' => 'this-is-uploaded-using-presigned-url.png'
]);
$request = $s3Client->createPresignedRequest($cmd, '+20 minutes');

// This is for reading the image. It is working.
$presignedUrl = (string) $request->getUri();
When I access $presignedUrl from the browser, I get the file from S3; that works. But now I want to upload a file to S3, not read one. Normally, I can upload a file to S3 like this:
$client->putObject(array(
    'Bucket' => $bucket,
    'Key' => 'data.txt',
    'Body' => 'Hello!'
));
The above code does not use a pre-signed URL, but I need to upload the file using one. How can I upload the file using a pre-signed URL? For example, I am imagining something like this:
$client->putObject(array(
    'presigned-url' => 'url',
    'Bucket' => $bucket,
    'Key' => 'data.txt',
    'Body' => 'Hello!'
));
How can I upload?
It seems reasonable that you can create a pre-signed PutObject command by running:
$cmd = $s3Client->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key' => $key
]);
$request = $s3Client->createPresignedRequest($cmd, '+20 minutes')->withMethod('PUT');
Then you might want to perform the PUT call from PHP using the stream wrapper. Note that the HTTP wrapper cannot be written to with file_put_contents, so the body goes into the stream context and the request is issued with file_get_contents:
$context = stream_context_create(['http' => [
    'method'  => 'PUT',
    'content' => 'Hello!'
]]);
file_get_contents((string) $request->getUri(), false, $context);
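An equivalent PUT with cURL (a sketch; the function name is illustrative) makes the HTTP method and the status check explicit:

```php
<?php
// Sketch: upload a string body to a pre-signed S3 URL with cURL.
// The function name is illustrative; it is not from the original answer.
function putToPresignedUrl(string $url, string $body): bool
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_CUSTOMREQUEST  => 'PUT',  // the request was signed for PUT
        CURLOPT_POSTFIELDS     => $body,  // raw object bytes
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status === 200; // S3 answers 200 OK on a successful pre-signed PUT
}

// Usage: putToPresignedUrl((string) $request->getUri(), 'Hello!');
```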
If you want to create a URL that a browser can submit, then you need to have the browser send the file as a form POST. This AWS documentation explains how to create a pre-signed POST request with the fields that you then need to put into an HTML form and display to the user: https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/s3-presigned-post.html
Also, this answer might be useful: https://stackoverflow.com/a/59644117/53538

Amazon aws - s3 bucket not uploading image - It creates only the key

I am using Laravel 5.0 with "aws/aws-sdk-php-laravel": "~2.0".
Here is my script to upload the image
$s3 = App::make('aws')->get('s3');
$s3->putObject(array(
    'Bucket' => 'greenhoppingbucket',
    'Key' => 'sups',
    'Body' => Input::file('file'),
));
After execution, only the key is uploaded to the S3 bucket:
i.e., sups is created in the bucket, but not the image.
What mistake am I making, and how can I fix it?
Try this:
$s3 = App::make('aws')->get('s3');
$s3->putObject(array(
    'Bucket' => 'greenhoppingbucket',
    'Key' => 'sups',
    'Body' => File::get((string) Input::file('file')),
));
Don't forget to add use File;
When you do 'Body' => Input::file('file'), you are basically putting the temp path into the body instead of the contents of the file.
File::get simply gets the contents of the file.
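A variant of the same fix avoids loading the whole file into memory: the SDK also accepts an open stream resource for Body and reads the bytes from it. The helper below is a sketch with an illustrative name, not code from the original answer.

```php
<?php
// Sketch: build putObject parameters with a stream resource as Body.
// The AWS SDK for PHP accepts a string, a resource, or a PSR-7 stream there.
function streamParams(string $path, string $bucket, string $key): array
{
    return [
        'Bucket' => $bucket,
        'Key'    => $key,
        // An open resource lets the SDK stream the file's contents,
        // so the whole file never has to sit in memory at once.
        'Body'   => fopen($path, 'r'),
    ];
}

// Usage (not executed here):
// $s3->putObject(streamParams((string) Input::file('file'), 'greenhoppingbucket', 'sups'));
```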

Can't upload image to S3 bucket using direct url of image

Here is my code, which works for form uploads (via $_FILES) (I'm omitting that part of the code because it is irrelevant):
$file = "http://i.imgur.com/QLQjDpT.jpg";
$s3 = S3Client::factory(array(
    'region' => $region,
    'version' => $version
));

try {
    $content_type = "image/" . $ext;
    $to_send = array();
    $to_send["SourceFile"] = $file;
    $to_send["Bucket"] = $bucket;
    $to_send["Key"] = $file_path;
    $to_send["ACL"] = 'public-read';
    $to_send["ContentType"] = $content_type;

    // Upload a file.
    $result = $s3->putObject($to_send);
As I said, this works if $file is a $_FILES["files"]["tmp_name"], but if $file is a valid image URL it fails with:
Uncaught exception 'Aws\Exception\CouldNotCreateChecksumException' with message 'A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.'
Does anyone know why this happens? What might be off? Thanks very much for your help!
For anyone looking for option #3 (CachingStream): you can pass the PutObject command a Body stream instead of a source file.
use GuzzleHttp\Psr7\Stream;
use GuzzleHttp\Psr7\CachingStream;

// ...

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'Body' => new CachingStream(
        new Stream(fopen($file, 'r'))
    ),
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);
Alternatively, you can just request the file using Guzzle:
$client = new GuzzleHttp\Client();
$response = $client->get($file);

$s3->putObject([
    'Bucket' => $bucket,
    'Key' => $file_path,
    'Body' => $response->getBody(),
    'ACL' => 'public-read',
    'ContentType' => $content_type,
]);
You have to download the file to the server where PHP is running first. S3 uploads only work with local files, which is why $_FILES["files"]["tmp_name"] works: it's a file that's local to the PHP server.
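The download-then-upload step above can be sketched as a small helper that fetches the remote bytes into memory and hands S3 a seekable string Body. Names are illustrative; the putObject call is shown but not executed here.

```php
<?php
// Sketch: fetch a remote file, then build putObject parameters whose
// Body is an in-memory string (seekable, so no checksum error).
function buildS3PutParams(string $sourceUrl, string $bucket, string $key, string $contentType): array
{
    $bytes = file_get_contents($sourceUrl); // download to the PHP server first
    if ($bytes === false) {
        throw new RuntimeException("Could not read $sourceUrl");
    }
    return [
        'Bucket'      => $bucket,
        'Key'         => $key,
        'Body'        => $bytes,
        'ACL'         => 'public-read',
        'ContentType' => $contentType,
    ];
}

// Usage (not executed here):
// $result = $s3->putObject(
//     buildS3PutParams("http://i.imgur.com/QLQjDpT.jpg", $bucket, $file_path, "image/jpeg")
// );
```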

The last step of an AWS EC2 to S3 file upload

I have this code :
require '/home/ubuntu/vendor/autoload.php';

$sharedConfig = [
    'region' => 'us-west-2',
    'version' => 'latest'
];
$sdk = new Aws\Sdk($sharedConfig);
$s3Client = $sdk->createS3();

$result = $s3Client->putObject([
    'Bucket' => 'my-bucket',
    'Key' => $_FILES["fileToUpload"]["name"],
    'Body' => $_FILES["fileToUpload"]["tmp_name"]
]);
It works, basically: it sends a file to S3. But it apparently sends it badly, since the result always shows as a corrupted file. Can anyone tell me what I am doing wrong?
To be specific, the image I am uploading is a JPG, and when I try to look at it on S3, I am told that it "cannot be displayed because it contains errors".
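The symptom matches the SourceFile/Body confusion discussed earlier in this page: here Body is given the temporary path string, not the file's bytes, so S3 stores the literal path text, which no image viewer can open. A sketch of the likely fix, using a hypothetical helper:

```php
<?php
// Sketch of the likely fix: the temp path belongs in SourceFile, not Body.
// The helper name is illustrative, not from the original question.
function buildUploadParams(array $file, string $bucket): array
{
    return [
        'Bucket'     => $bucket,
        'Key'        => $file['name'],
        'SourceFile' => $file['tmp_name'], // path goes in SourceFile, not Body
    ];
}

// Usage (not executed here):
// $result = $s3Client->putObject(buildUploadParams($_FILES['fileToUpload'], 'my-bucket'));
```

Alternatively, keep Body but pass file_get_contents($_FILES["fileToUpload"]["tmp_name"]) so the actual bytes are uploaded.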
