I am receiving MMS messages from my users via Twilio. Twilio stores the media files on its servers, and I can access them there through a URL. In my case, however, I need to store those files in S3 and display them in our system from S3. I can already save the files to a local folder on my server, but I have not found a way to store a file in S3 directly from the URL.
This is what I do to save a file into a local directory from the URL:
// URL of the file; it will usually be an image.
$url = 'urlofmyfile';
// Path where I am saving it; keeping jpg for now.
$img = 'file/sms/file.jpg';
// Save the file into the folder.
file_put_contents($img, file_get_contents($url));
And this is how I save files to S3 when someone uploads them directly to my system, for example a user uploading a profile picture:
public function saveToS3Bucket($uploadFileName, $imageTmpName) {
    $s3Client = new \Aws\S3\S3Client([
        'version' => env('S3_BUCKET_VERSION'),
        'region' => env('S3_BUCKET_REGION'),
        'credentials' => [
            'key' => env('S3_BUCKET_KEY'),
            'secret' => env('S3_BUCKET_SECRET'),
        ]
    ]);
    try {
        $s3Client->putObject([
            'Bucket' => env('S3_BUCKET_NAME'),
            'Key' => $uploadFileName,
            'SourceFile' => $imageTmpName,
            'StorageClass' => 'REDUCED_REDUNDANCY',
            'ACL' => 'public-read'
        ]);
        return true;
    } catch (\Aws\S3\Exception\S3Exception $e) {
        echo $e->getMessage() . PHP_EOL;
        return false;
    }
}
The code above works fine, but I have not found a way to store a file in S3 directly from a URL. Please note I am writing this in CakePHP.
Have a look at the Twilio Function below; it should point you in the right direction.
It was derived from this Twilio Blog:
Encrypting and Storing Twilio Flex Recordings Off-site
const axios = require('axios');
let AWS = require('aws-sdk');
const S3UploadStream = require('s3-upload-stream');
exports.handler = async function(context, event, callback) {
    // Set the region
    AWS.config.update({region: 'us-west-2'});
    AWS.config.update({ accessKeyId: context.AWSaccessKeyId, secretAccessKey: context.AWSsecretAccessKey });
    // The name of the bucket that you have created
    const BUCKET_NAME = 'winston';
    const fileUrl = "https://a.b.twil.io/assets/KittehWinston.jpg";
    const fileName = "winston.jpg";
    const s3Stream = S3UploadStream(new AWS.S3());
    // call S3 to retrieve upload file to specified bucket
    let upload = s3Stream.upload({Bucket: BUCKET_NAME, Key: fileName, ContentType: 'image/jpeg', ACL: 'public-read' });
    const fileUpload = await uploadFile(fileUrl, upload)
        .then(result => callback(null, `success: ${JSON.stringify(result)}`))
        .catch(err => callback(err.message));

    async function uploadFile(url, upload) {
        const response = await axios({
            url,
            method: 'GET',
            responseType: 'stream'
        });
        response.data.pipe(upload);
        return new Promise((resolve, reject) => {
            upload.on('uploaded', resolve);
            upload.on('error', reject);
        });
    }
};
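If you would rather stay in PHP/CakePHP instead of a Twilio Function, the same idea works with the aws-sdk-php client you already use: download the media from the URL and hand the bytes to putObject as Body instead of SourceFile, so no local file is needed. A minimal sketch, assuming the same env() configuration as in the question:
// Sketch: store a remote file in S3 directly from its URL.
public function saveUrlToS3Bucket($uploadFileName, $fileUrl) {
    $s3Client = new \Aws\S3\S3Client([
        'version' => env('S3_BUCKET_VERSION'),
        'region' => env('S3_BUCKET_REGION'),
        'credentials' => [
            'key' => env('S3_BUCKET_KEY'),
            'secret' => env('S3_BUCKET_SECRET'),
        ]
    ]);
    try {
        // Fetch the remote file into memory; for large files a stream
        // (e.g. fopen($fileUrl, 'r')) could be passed as 'Body' instead.
        $body = file_get_contents($fileUrl);
        $s3Client->putObject([
            'Bucket' => env('S3_BUCKET_NAME'),
            'Key' => $uploadFileName,
            'Body' => $body,
            'ACL' => 'public-read'
        ]);
        return true;
    } catch (\Aws\S3\Exception\S3Exception $e) {
        return false;
    }
}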
Having a bit of an issue: I am trying to upload to a PUT signed URL, but any time I add 'content-md5' to my upload headers, GCS responds with
<Message>The MD5 you specified in Content-MD5 or x-goog-hash was invalid.</Message>
<Details>Invalid MD5 value: ZWFmMjRlNmE0OTExMjRhMTllYmYxYzU2ODliOWIyZGE=</Details>
I can confirm that eaf24e6a491124a19ebf1c5689b9b2da is the correct MD5, and I am providing it in Base64 form. When I upload the file manually to my bucket and check its MD5, it matches eaf24e6a491124a19ebf1c5689b9b2da.
My signed URL is being made with
$url = $object->signedUrl(
    new \DateTime("60 min"),
    [
        "method" => "PUT",
        "version" => "v4",
        "contentType" => "application/octet-stream",
        "contentMd5" => base64_encode($md5)
    ]
);
On the Cordova side I am using the FileTransfer plugin
var uploadFileTransfer = new FileTransfer();
var options = new FileUploadOptions();
options.fileKey = "file";
options.httpMethod = "PUT";
options.fileName = "my_file.zip";
options.mimeType = "application/octet-stream";
options.headers = {
    "content-md5": btoa(fileMd5),
    'content-type': 'application/octet-stream'
};
uploadFileTransfer.upload(file, url, success, fail, options);
If I remove content-md5, the upload works. Any help is greatly appreciated here, thanks.
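One thing worth double-checking (an observation, not a confirmed fix): Content-MD5 is expected to be the Base64 encoding of the raw 16-byte digest, which is 24 characters ending in "==". The rejected value above is 44 characters and decodes back to the hex string eaf24e6a491124a19ebf1c5689b9b2da, which is what you get when the hex output of md5() is Base64-encoded. A minimal PHP sketch of the difference, assuming $data holds the contents of the file being signed for:
// Hypothetical illustration: $data holds the bytes of the file to upload.
$data = file_get_contents('my_file.zip');

// Base64 of the raw 16-byte digest -- the form Content-MD5 expects (24 chars, ends in '==').
$contentMd5 = base64_encode(md5($data, true));

// Base64 of the 32-character hex string -- 44 chars, rejected as an invalid MD5 value.
$wrongMd5 = base64_encode(md5($data));

// Whichever value is used must match on both sides: the 'contentMd5' option
// passed to signedUrl() and the 'content-md5' header sent by the Cordova client.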
I am building a testing website. On login confirmation, I have to show examinees their pictures, which are already saved in a Google Drive folder.
$optParams = array(
    'pageSize' => 1,
    'fields' => 'nextPageToken, files(contentHints/thumbnail,fileExtension,id,name,size)',
    'q' => "mimeType contains 'image/' AND name contains '".$imageId."' AND '".$folderId."' in parents"
);
$results = $googleDriveService->files->listFiles($optParams);
if (count($results->getFiles()) == 0) {
    print "No files found.\n";
} else {
    print "Files:\n";
    foreach ($results->getFiles() as $file) {
        printf("%s (%s)\n", $file->getName(), $file->getId());
    }
}
This is what I use to get the file ID. Now, in order to preview the image on the page, do I have to download the image (and delete it later), or is there a way to do it without downloading?
Answer:
While Drive is not designed to be a hosting platform, you can use the preview link as a workaround.
More Information:
I really should reiterate here: Drive is not designed to be an image hosting platform. It is a file sharing and cloud storage solution primarily, but does provide methods of viewing images via preview, view and embed links.
You can use the following link, replacing [ID] with your image ID to embed or preview an image in a page:
https://drive.google.com/uc?export=view&id=[ID]
NB: While this works, the image will load slowly as image hosting is not the M.O. of Drive.
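For example, building on the listFiles() loop from the question, a minimal sketch (variable names taken from that snippet) that turns each file ID into a view URL and embeds it directly, with no download to your server:
// Sketch: render each image found by the Drive query above straight from Drive.
foreach ($results->getFiles() as $file) {
    $imageUrl = 'https://drive.google.com/uc?export=view&id=' . $file->getId();
    echo '<img src="' . htmlspecialchars($imageUrl) . '" alt="' . htmlspecialchars($file->getName()) . '">';
}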
There is also an iframe option provided in the form of an embed:
<iframe src="https://drive.google.com/file/d/[ID]/preview" width="640" height="480"></iframe>
This iframe embed is obtainable from the Drive UI:
After double-clicking your image at drive.google.com, and following the ⋮ > Open in new window menu item, in the newly opened tab follow the ⋮ > Embed item... menu option and the iframe code will open in a modal.
This works for me; it follows the Google Drive API docs. The code takes an uploaded file, sends it to Google Drive, and records the URL and name in a database for use in displaying the images later. I installed multer in Express.js to help with multiple image uploads.
app.post('/api/uploadmultiple', uploader.any("files", 12),
    async (req, res, next) => {
        const scopes = [
            'https://www.googleapis.com/auth/drive'
        ];
        const stream = require('stream');
        const { google } = require('googleapis');
        const credentials = require('./google-credentials.json');

        // Wrap the in-memory multer buffer in a readable stream for the Drive upload.
        let fileObject = req.files[0];
        let bufferStream = new stream.PassThrough();
        bufferStream.end(fileObject.buffer);

        // Authenticate with a service account.
        const auth = new google.auth.JWT(
            credentials.client_email, null,
            credentials.private_key, scopes
        );
        const drive = google.drive({ version: "v3", auth });

        const fileName = req.files[0].originalname;
        const fileType = req.files[0].mimetype;
        var fileMetadata = {
            name: fileName, // file name that will be saved in google drive
        };
        var media = {
            mimeType: fileType,
            body: req.files[0].buffer, // Reading the file from our server
        };
        // (fileMetadata and media are not used below; the create() call builds its
        // own resource/media objects from fileName and bufferStream.)
        var petname = req.body.petname;

        // Uploading a single image to Drive
        drive.files.create(
            {
                media: {
                    mimeType: fileType,
                    body: bufferStream
                },
                resource: {
                    name: fileName,
                    // if you want to store the file in the root, remove this parents
                    parents: ['GET THIS ID FROM GOOGLE DRIVE']
                },
                fields: 'id',
            }).then(function (resp) {
                if (resp.data) {
                    res.status(200).end("File uploaded");
                    var fileLocation = "https://drive.google.com/uc?id=" + resp.data.id;
                    console.log("Upload Success", fileLocation);
                    db.addPet(petname, fileLocation);
                    db.addImage(petname, fileName, fileLocation);
                }
                console.log(resp.data.id, 'resp');
            }).catch(function (error) {
                console.log(error);
                res.status(500).end();
            });
    });
I am using the jQuery file uploader to upload a resized image, convert it to a blob, and save it as a blob in a DB.
For the database I also need to save the mimeType. I can see it in the request I receive, but I don't understand how to read the mimeType value.
The code to send the image:
var formData = new FormData();
formData.append("_token", $('meta[name="csrf-token"]').attr('content'));
formData.append("user_id_val", $('.user-general-i').data('userid'));
// HTML file input, chosen by user
formData.append("userfile", data.files[0]);
var request = new XMLHttpRequest();
request.open("POST", "http://localhost.eu/home/create_comment_images");
request.onload = function(oEvent) {
    if (request.status == 200) {
        console.log('success');
    } else {
        console.log(request.status);
    }
};
request.send(formData);
Code on the server:
public function create_comment_images(Request $data) {
    \Log::info($data);
    try {
        $path = $data->userfile;
        $logo = file_get_contents($path);
        $base64 = base64_encode($logo);
        return ['success' => true];
    } catch (\Exception $e) {
        return ['success' => false, 'message' => $e->getMessage()];
    }
    return ['success' => false, 'message' => 'Something went wrong'];
}
The log info shows me this:
array (
  '_token' => 'QxOqetFU2Re6fwe442vksGNnvV0C88v8dcrFpAp',
  'user_id_val' => '568092',
  'userfile' =>
  Illuminate\Http\UploadedFile::__set_state(array(
     'test' => false,
     'originalName' => 'Unbenannt.png',
     'mimeType' => 'image/png',
     'error' => 0,
     'hashName' => NULL,
  )),
)
I am almost there; I just need to get the mimeType information out of that array.
I tried:
$data->userfile->mime_content_type
$data->userfile->mimeType
$data->userfile['mimeType']
$data->userfile[0]['mimeType']
Nothing works. Any ideas how to extract that information?
On Laravel you can use the Intervention Image library. This library is very powerful: you can change the format, resize, and do all kinds of things.
Here is a basic example:
// read image from temporary file
$img = Image::make($_FILES['image']['tmp_name']);
// get the mime type of the image
$mimeType = $img->mime();
// save image
$img->save('foo/bar.jpg');
// Get image as string.
$string = base64_encode($img->encode('jpg'));
Intervention Reference
To get the mime type of the uploaded file, you can call getMimeType() on the \Illuminate\Http\UploadedFile instance:
$uploadedFile = $data->file('userfile');
$mimeType = $uploadedFile->getMimeType();
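Applied to the create_comment_images() method from the question, a minimal sketch might look like this (getClientMimeType() is the variant that returns what the browser reported, e.g. image/png in the log above):
public function create_comment_images(Request $data) {
    $uploadedFile = $data->file('userfile');
    // Guessed from the file contents:
    $mimeType = $uploadedFile->getMimeType();
    // As reported by the client with the upload:
    $clientMimeType = $uploadedFile->getClientMimeType();
    $base64 = base64_encode(file_get_contents($uploadedFile->getRealPath()));
    return ['success' => true, 'mimeType' => $mimeType];
}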
Like the title says, I'm trying to upload an image from Vue.js to a Laravel endpoint. I discovered that the only way (if there is another, please tell me) is to send the base64 of the image in the request. On the Vue side I think everything is covered.
However, the Laravel side is where it gets complicated. I can't decode the base64 string that gets passed, and when I try to store the image in my AWS S3 bucket, it doesn't store properly. Here is the code:
VueJS
<template>
<input type="file" name="image" class="form-control" #change="imagePreview($event)">
</template>
methods: {
    submitForm(){
        axios.defaults.headers.post['Content-Type'] = 'multipart/form-data';
        axios.post(this.$apiUrl + '/article', {
            image: this.image
        }).then(response => {
            flash(response.data.message, 'success');
        }).catch(e => {
            console.log(e);
        })
    },
    imagePreview(event) {
        let input = event.target;
        if (input.files && input.files[0]) {
            var reader = new FileReader();
            let vm = this;
            reader.onload = e => {
                this.previewImageUrl = e.target.result;
                vm.image = e.target.result;
            }
            reader.readAsDataURL(input.files[0]);
        }
    }
}
Laravel:
$this->validate($request, [
    'image' => 'required',
]);
// return response()->json(base64_decode($request->image));
$timestampName = microtime(true) . '.jpg';
$url = env('AWS_URL') . '/article_images/' . $timestampName;
Storage::disk('s3')->put($timestampName, base64_decode($request->image));
If I add the image validation rule, it says it's not an image.
I would also like to retrieve the extension if possible.
You can do it using FormData on the JS side and call getClientOriginalExtension() on the file in Laravel to get the extension.
VueJS
imagePreview(event) {
    let selectedFile = event.target.files[0];
    this.image = selectedFile;
},
submitForm() {
    let formData = new FormData();
    formData.append('image', this.image);
    axios.post(this.$apiUrl + '/article', formData, {
        headers: {
            'Content-Type': 'multipart/form-data'
        }
    })
    .then(response => {
        flash(response.data.message, 'success');
    })
    .catch(e => {
        console.log(e);
    });
}
Laravel
$this->validate($request, [
    'image' => 'required',
]);

$image = $request->file('image');
$extension = $image->getClientOriginalExtension(); // Get the extension
$timestampName = microtime(true) . '.' . $extension;

// Public URL for displaying the image later; the put() key is the path on the disk.
$url = env('AWS_URL') . '/article_images/' . $timestampName;
Storage::disk('s3')->put('article_images/' . $timestampName, file_get_contents($image));
Here is a link that might be useful
Hope that it helps.
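If you do want to keep the base64 approach from the question instead, note that FileReader.readAsDataURL() produces a data URL such as data:image/png;base64,iVBOR..., so the prefix has to be stripped before base64_decode(); decoding the whole string is likely why the stored file comes out corrupted. A minimal sketch on the Laravel side, assuming $request->image holds that data URL:
// Sketch: decode a data URL like "data:image/png;base64,..." and store the bytes in S3.
$dataUrl = $request->image;
if (preg_match('/^data:image\/(\w+);base64,/', $dataUrl, $matches)) {
    $extension = $matches[1]; // e.g. "png" -- this also gives you the extension
    $imageData = base64_decode(substr($dataUrl, strpos($dataUrl, ',') + 1));
    $timestampName = microtime(true) . '.' . $extension;
    Storage::disk('s3')->put('article_images/' . $timestampName, $imageData);
}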
I'm trying to download an Excel file (.xls) using Angular, with PHP for the backend.
My backend already sends the Excel file as the response, and when I check it, it has the correct format.
But my Angular 4 code is saving the file in a corrupted format (it contains some symbols like ��ࡱ�;��).
Below is my Angular code:
Service
private headers = new Headers({
    'Content-Type': 'application/x-www-form-urlencoded'
});

downloadTemplate() {
    this.options = new RequestOptions();
    this.options.headers = this.headers;
    const params: URLSearchParams = new URLSearchParams();
    params.set('token', this.currentUser.token);
    this.options.search = params;
    return this.http.get(environment.api_host + '/template', this.options);
}
Component
template() {
    this._apiService.downloadTemplate().subscribe((response) => {
        const blob = new Blob([(<any>response)._body], {type: 'application/vnd.ms-excel'});
        const link = document.createElement('a');
        link.href = window.URL.createObjectURL(blob);
        link.download = 'template.xls';
        document.body.appendChild(link);
        link.click();
    });
}
Is there something I am missing in my code?
If the subscription is not necessary, then just use this in the downloadTemplate() method in your service:
window.location.href = your_URL;
I'm using the FileSaver library for serving files from Angular apps: http://alferov.github.io/angular-file-saver/
Here is some sample code:
getTemplate() {
    this.apiService.http()
        .get(environment.api_host + '/template', { responseType: ResponseContentType.Blob })
        .map(r => r.blob())
        .subscribe(
            template => {
                FileSaver.saveAs(new Blob([template]), "template.xml");
            },
            error => {
                console.error(error);
            }
        );
}
What is important is that you add the header Content-Type: application/octet-stream to your PHP response. A complete and working set of headers:
$file = fopen("/tmp/examplefile.xls", "r");
array(
'Content-Disposition' => 'attachment; filename="' . basename($filePath) . '"',
'Content-Type' => 'application/octet-stream',
'Content-Length' => filesize($filePath),
'Expires' => '#0',
'Cache-Control' => 'must-revalidate',
'Pragma' => 'public'
);
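For completeness, a minimal plain-PHP sketch of actually sending the file with those headers (no particular framework assumed; adapt to however your backend builds its responses):
// Sketch: send the binary file as an attachment with explicit headers.
$filePath = "/tmp/examplefile.xls";
header('Content-Disposition: attachment; filename="' . basename($filePath) . '"');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filePath));
header('Cache-Control: must-revalidate');
header('Pragma: public');
readfile($filePath); // stream the raw bytes; no encoding, no extra output before or after
exit;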
I think the problem in your code is either the Blob creation (const blob = new Blob([(<any>response)._body], {type: 'application/vnd.ms-excel'});) or the missing response type in the HTTP client options. You should send application/octet-stream (binary data) from PHP, request the response as a Blob in Angular, and return that Blob object to the user. That's all.