Can we update contents of a specific file in Amazon S3? - php

I am using the AWS PHP SDK. I uploaded a JSON file to an S3 bucket. Now I would like to get the contents of that file from the S3 bucket, append some additional data to them, and write the file back to the S3 bucket.
What I want is something like this:
file name: userlist.json
grab the contents of the file using the S3-provided methods
e.g. the existing file contents are {'abc','xyz'}
append additional contents so the file becomes {'abc','xyz'}, {'zxv','opiv','cvpo'}
write the newly added content back to the S3 bucket file (userlist.json)
How can we do this?

You can't append data to, or modify just part of, an existing S3 object. You need to read the object, make your changes, and then write the entire object back to S3.

You can overwrite any file in S3.
When you write a file to the same location with the same name, the old file is removed and replaced with the new one.

You can follow this algorithm:
Step 1: Get the file contents from the bucket:
return $result = $this->s3->getObject(array(
    'Bucket' => $bucket,
    'Key'    => 'file_name'
));
Step 2: Write the returned contents to a local file.
Step 3: Read the local file and modify it however you like; you can even save it with a different extension.
Step 4: Upload the file to the desired bucket. Remember that S3 may not reflect the update immediately when you overwrite a file with the same name.
Step 5: End
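For the modify step in the middle, the work can happen entirely in memory. Assuming userlist.json actually holds a JSON array (the braces in the question look like ad-hoc notation, so the sample data below is hypothetical), a sketch of decoding, appending, and re-encoding might look like this; the getObject and putObject calls around it are the SDK methods shown above:

```php
<?php
// Sketch of the in-memory "modify" step between getObject and putObject.
// Assumption: userlist.json holds a JSON array; the sample data below is
// hypothetical, based on the braces shown in the question.

// What (string) $result['Body'] from getObject might give us:
$existing = '[["abc","xyz"]]';

// Decode into a PHP array so it can be manipulated.
$userlist = json_decode($existing, true);

// Append the additional entries from the question.
$userlist[] = ["zxv", "opiv", "cvpo"];

// Re-encode; this string is what you would pass as the 'Body'
// parameter of putObject to overwrite userlist.json.
$updated = json_encode($userlist);

echo $updated, "\n"; // [["abc","xyz"],["zxv","opiv","cvpo"]]
```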

Related

How to update stripe's legal_entity.verification.document?

How to update Stripe's legal_entity.verification.document with PHP?
Is it a file that needs to be uploaded to Stripe?
The legal_entity[verification][document] attribute should be set to a file upload's ID.
This part of the documentation explains how to upload a file using Stripe's API, and this part explains how to attach the uploaded file to a managed account.
You need to first pass the file to your server in multipart/form-data format, get its path, and pass that path to Stripe's FileUpload function.
You can even test the upload by placing a file manually in a server directory and passing that file's path to the function.
Below is a sample:
$account = \Stripe\Account::retrieve('acct_xxxxxxxxxxx');
$uploadedFile = \Stripe\FileUpload::create(
    array(
        "purpose" => "identity_document",
        "file"    => fopen('file.png', 'r')
    )
);
You will get a successful upload response containing a file ID, which you then need to assign to legal_entity.verification.document like so:
$account->legal_entity->verification->document = $uploadedFile->id;
$account->save();

Laravel Download from S3 To Local

I am trying to download a file that I stored on S3 to my local Laravel installation to manipulate it. Would appreciate some help.
I have the config data set up correctly because I am able to upload without any trouble. I am saving files in S3 with the pattern "user->id/media->id.mp3" -- note that I am not just dumping files in the bucket root, I am saving them in directories.
After successfully uploading the file to S3, I save the path "user->id/media->id.mp3" in my DB, not some long public URL (is that wrong?).
When I later go back to try to download the file, I get a FileNotFoundException from S3. I'm doing this:
$audio = Storage::disk('s3')->get($media->location);
The weird thing is that in the exception it shows the resource that it cannot fetch but when I place that same url in a browser it displays the file without any trouble at all. Why can't the file system get the file?
I have tried a "has" check before the "get", and it comes up false.
Do I need to save the full public URL in the database for this to work? I tried that and it didn't help. I feel like I am missing something very simple and it is making me crazy!
Late answer, but important for others:
$s3_file = Storage::disk('s3')->get(request()->file);
$s3 = Storage::disk('public');
$s3->put('file_name.tif', $s3_file);
$s3_file holds the raw file contents returned from S3; you can write them to a local file using Laravel's put method, and you will find the file in the storage/app/public directory.
You can set Content-Type as desired and Content-Disposition to 'attachment', since your files are coming from S3 and you want them downloaded as attachments:
$event_data = $this->ticket->where('user_id', $user_id)->first();
$data = $event_data->pdf;
$get_ticket = 'tickets/' . $data;
$file_name = 'YOUR_DESIRED_NAME.pdf';
$headers = [
    'Content-Type'        => 'application/pdf',
    'Content-Disposition' => 'attachment; filename="' . $file_name . '"',
];
return \Response::make(Storage::disk('s3')->get($get_ticket), 200, $headers);
Say you have AWS S3 as your default storage, and you want to download my_file.txt from S3 to my_laravel_project\storage\app\my_file.txt as a one-liner:
Storage::disk('local')->put('my_file.txt', Storage::get('my_file.txt'));

PHP Amazon S3 access private files through URL

I'm using the AWS PHP SDK to save images on S3. The files are saved privately. I then show the image thumbnails in my web application using the S3 file URLs, but since the files are private, the images are displayed as broken.
When the user clicks on the name of a file, a modal opens to show the file at a larger size, but the file is displayed as broken there as well, due to the same issue.
Now, I know there are two ways to make this work: 1. Make the files public. 2. Generate pre-signed URLs for the files. But I cannot go with either of these options due to the requirements of my project.
My question is: is there any third way to resolve this issue?
I'd highly advise against this, but you could create a script on your own server that pulls the image via the API, caches it, and serves it. You can then restrict access however you like without making the images public.
Example pass through script:
// $realpath is wherever the file really is, e.g. a pre-signed S3 URL
$headers = get_headers($realpath);
foreach ($headers as $header) {
    header($header);
}

// These lines if it's a forced download you want:
// $filename = $version->getFilename(); // however you derive the name
// header('Content-Description: File Transfer');
// header("Content-Disposition: attachment; filename={$filename}");

$file = fopen($realpath, 'r');
fpassthru($file);
fclose($file);
exit;
This will barely "touch the sides" and shouldn't delay the appearance of your files too much, but it's still going to take some resources and bandwidth.
You will need to access the files through a script on your server. That script should do some kind of authentication to make sure the request is valid and you want the requester to see the file, then fetch the file from S3 using an IAM profile that can access the private files, and finally output the file.
Instead of requesting the file from S3, request it from:
http://www.yourdomain.com/fetchimages.php?key=8498439834
Then here is some pseudocode in fetchimages.php
<?php
// if authorized to get this image
$key = $_GET['key'];
// validate that the key is in the proper format
// look up the S3 URL in a database based on $key
// connect to S3 securely and read the file from S3
// output the file
?>
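The key-format check in the pseudocode above is worth doing strictly before touching the database. A minimal sketch, assuming keys are purely numeric tokens like the one in the example URL (adjust the pattern to whatever format your keys really use):

```php
<?php
// Sketch: validate the ?key= parameter before looking anything up.
// Assumption: keys are 6- to 32-digit numeric tokens, as in the
// example URL; this is not taken from any real application.
function isValidKey(string $key): bool
{
    return preg_match('/^[0-9]{6,32}$/', $key) === 1;
}

var_dump(isValidKey('8498439834'));    // bool(true)
var_dump(isValidKey('../etc/passwd')); // bool(false)
```

Rejecting malformed keys up front also blocks path-traversal style input before it reaches the database or S3 lookup.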
As far as I know, you could try to make your S3 bucket act as a "web server" like this, but then you would probably have to "make the files public". If you have some kind of logic to restrict access, you could create a bucket policy.

PHP Google App Engine permanently delete Image from Cloud Storage

I'm using GAE version 1.9.0 and I want to delete an image from Cloud Storage and upload another image in its place. This is how I'm doing it right now:
unlink("gs://my_storage/images/test.jpg");
move_uploaded_file($_FILES['image']['tmp_name'],'gs://my_storage/images/test.jpg');
And then I want to get the Image serving URL of the latest uploaded image, and I do it like this.
$image_link = CloudStorageTools::getImageServingUrl("gs://my_storage/images/test.jpg");
The issue is that when the deleted image ("test.jpg") and the uploaded image ("test.jpg") have the same name, the old file is served when I request the newly uploaded file (I think it is cached).
Is there any way I can permanently delete this file so the cached version is not served?
You should probably delete the original serving URL before creating another with the same name.
There's a deleteImageServingUrl() method in CloudStorageTools that you can use to do this.
Here is how to do it in PHP Laravel:
$object = $post_media->media_cloud;
$objectname = substr($object,48,100);
$bucket = Storage::disk('gcs')->delete($objectname);
Here $object holds the Google Cloud image URL taken from the DB.
Then we take only the object name from that URL, using substr.
Since your config defines the disk as Storage::disk('gcs'), this calls the delete function with the object name.
Hope it helps someone.
Note: for multiple images, either pass an array of object names or repeat the call in a foreach loop.
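As an aside, substr($object, 48, 100) ties the code to one exact URL prefix length. A less brittle sketch is to parse the URL path and strip the bucket prefix; the bucket name and URL below are hypothetical:

```php
<?php
// Sketch: extract the object name from a GCS public URL without
// relying on fixed character offsets. Bucket name is an assumption.
function objectNameFromUrl(string $url, string $bucket): string
{
    $path = parse_url($url, PHP_URL_PATH); // e.g. /my_bucket/images/test.jpg
    $prefix = '/' . $bucket . '/';
    return substr($path, strlen($prefix)); // e.g. images/test.jpg
}

echo objectNameFromUrl(
    'https://storage.googleapis.com/my_bucket/images/test.jpg',
    'my_bucket'
); // images/test.jpg
```

This keeps working even if the bucket name or domain changes length, since only the prefix string needs updating.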

How to set Content-Disposition Headers as a default on Amazon S3 Bucket

The problem I have is that I need the Content-Disposition: attachment header to be present on EVERY file that hits my bucket.
In Wordpress, I can just use .htaccess to cover the filetypes in question (videos), but those rules do not extend to my S3 downloads, which browsers simply try to open instead of download.
I need an automated/default solution, since I am not the only one that uploads these files (our staff uploads through Wordpress, and the uploads all are stored on our S3 bucket). So using Cloudberry or other browsers is not useful for this situation. I can't adjust the files on a per-file basis (the uploads are too frequent).
Is there a way to do this?
(Other information: I'm using the "Amazon S3 and Cloudfront" plugin on Wordpress that is responsible for linking the two together. Unfortunately, the site is not public, so I cannot link to it.)
Unfortunately there is no way to set this for an entire bucket in S3, and CloudFront can only set cache headers.
But you can set the Content-Disposition parameter when uploading files to S3.
For existing files, you must change the header yourself: loop through every object in the bucket and copy each one onto itself using the new headers.
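The copy-onto-itself approach can be sketched as follows. The parameter assembly is shown as a plain function; the surrounding listObjects loop and the $s3->copyObject($params) call would use the AWS PHP SDK, and the bucket and key names here are made up:

```php
<?php
// Sketch: build the parameters for copying an S3 object onto itself
// with a new Content-Disposition header. In the AWS PHP SDK you would
// loop over listObjects results and call $s3->copyObject($params)
// with an array like this for each key.
function copyParamsWithDisposition(string $bucket, string $key): array
{
    return [
        'Bucket'             => $bucket,
        'Key'                => $key,
        'CopySource'         => $bucket . '/' . $key,
        // REPLACE tells S3 to use the headers supplied here
        // instead of carrying over the old object's metadata.
        'MetadataDirective'  => 'REPLACE',
        'ContentDisposition' => 'attachment',
    ];
}

print_r(copyParamsWithDisposition('my-bucket', 'videos/demo.mp4'));
```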
For now, all I can say is: please post the code that uploads the file to S3.
First, you need to locate the code that puts the object in the bucket.
You can use Notepad++ to search for "putObject" within the PHP files of whatever plugin you are using.
Example code from another WP plugin that stores files on S3 looks like this:
$this->s3->putObject( array(
    'Bucket'     => $bucket,
    'Key'        => $file['name'],
    'SourceFile' => $file['file'],
) );
Now, simply add 'ContentDisposition' => 'attachment' like so:
$this->s3->putObject( array(
    'Bucket'             => $bucket,
    'Key'                => $file['name'],
    'SourceFile'         => $file['file'],
    'ContentDisposition' => 'attachment',
) );
That's it :)
Yes, you can set a default Content-Disposition header for every upcoming upload to your S3 bucket using Bucket Explorer's Bucket Default feature.
For existing files, you can use the Update Metadata option, which updates metadata on every file in your bucket in batch.
You just need to:
Select Key as: Content-Disposition
Add Value as: attachment;filename={$file_name_without_path_$$}
Then update the metadata on the files.
See this page to set Content-Disposition on your file.
More references:
http://www.bucketexplorer.com/documentation/amazon-s3--metadata-http-header-bucket-default-metadata.html
http://www.bucketexplorer.com/documentation/amazon-s3--how-to-manage-http-headers-for-amazon-s3-objects.html
http://www.bucketexplorer.com/documentation/amazon-s3--metadata-http-header-update-custom-metadata.html
Thanks
