Upload to AWS S3 'folder' inside bucket with PHP

Apologies, I'm not an AWS user. I need to upload a file to a 'folder' in an existing S3 bucket. I understand that there is no real concept of 'folders' in S3, so how can I specify the 'folder' the file should go to?
My uploaded image needs to follow an existing structure of
https://s3.amazonaws.com/my.bucket.name/newfolder/test_image.png
The code I have below works, but it puts the image into the root of the bucket, whereas I need to put it into a new folder ($newfolder).
Does anyone know where in the code below I could specify $newfolder to achieve this?
Thank you.
<?php
$newfolder = "newfolder";
$bucket = 'my.bucket.name';
require_once('S3.php');
$awsAccessKey = 'MyAccessKey';
$awsSecretKey = 'MySecretKey';
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);
$actual_image_name = 'test_image.png';
if($s3->putObjectFile('/var/www/html/test/image.png', $bucket , $actual_image_name, S3::ACL_PUBLIC_READ) )
{
$image='http://'.$bucket.'.s3.amazonaws.com/'.$actual_image_name;
}
else
{
echo 'error uploading to S3 Amazon';
}
?>

$actual_image_name = 'newfolder/test_image.png';
S3 calls this the key inside the bucket. A key can contain slashes, which are rendered as folders by clients that choose to display them that way. But the concept of "folders" means nothing to the S3 API or to URLs; there are only keys, some of which happen to contain slashes.
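Put concretely, here is a minimal sketch reusing the variables from the question (the S3 call itself is unchanged; only the key gains the prefix, and the call is left commented since it needs real credentials):

```php
<?php
// The "folder" is nothing more than a prefix on the object key.
$newfolder = 'newfolder';
$actual_image_name = $newfolder . '/' . 'test_image.png';

// Same call as in the question, now with the prefixed key:
// $s3->putObjectFile('/var/www/html/test/image.png', $bucket,
//                    $actual_image_name, S3::ACL_PUBLIC_READ);

echo $actual_image_name; // newfolder/test_image.png
```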

Related

How to allow double forward slashes when upload file using Laravel League s3

There is a directory in my S3 bucket named uploads// and I want to upload my files there, since it is already in use by an existing web app. When I tried to upload to uploads// with the Laravel League S3 driver, one of the two slashes was stripped. I then added /// and that was also collapsed, and the file was uploaded into a new folder named uploads.
Example:
I want the file to be uploaded as
uploads//attachments/filename.jpg
Currently one slash ignored and uploaded as
uploads/attachments/filename.jpg
Here is the relevant code snippet:
// assume $path is '/attachments/filename.jpg'
if ($isPathDifferent == 0) {
    $path = 'uploads//' . $path;
}
$upload = Storage::disk('s3')->store($path, file_get_contents($file));
Storage::disk('s3')->setVisibility($path, 'public');
Please note that I cannot change the name uploads// because it has lot of resources and usage.
Please apply the following solution; it should fix your problem:
$path = Storage::disk('s3')->put('uploads/', $file);

PHP: Uploading public files to a folder inside a Google storage bucket

Sorry for the noob question, but this is a problem I can't seem to crack. I have a Laravel script that uploads a file publicly to a Google Storage bucket, and it works properly. The problem is that the uploaded object lands directly in the bucket, when it should be placed inside a directory within the bucket, e.g. bucket: /dev-bucket/, correct directory: /dev-bucket/media/. Below is my script:
$client = new \Google_Client();
$credential = new \Google_Auth_AssertionCredentials(
'xxxxxx@appspot.gserviceaccount.com',
['https://www.googleapis.com/auth/devstorage.full_control'],
file_get_contents(storage_path().'xxxxx.p12')
);
$client->setAssertionCredentials($credential);
// this access control is for project owners
$ownerAccess = new \Google_Service_Storage_ObjectAccessControl();
$ownerAccess->setEntity('project-owners-' . 'xxxxxxxxxxx');
$ownerAccess->setRole('OWNER');
// this access control is for public access
$readerAccess = new \Google_Service_Storage_ObjectAccessControl();
$readerAccess->setEntity('allUsers');
$readerAccess->setRole('READER');
$storage = new \Google_Service_Storage($client);
$obj = new \Google_Service_Storage_StorageObject();
$obj->setName($stageFileName);
$obj->setAcl([$ownerAccess, $readerAccess]);
$storage->objects->insert(
Config::get('constants.bucket'),
$obj,
[
'name' => $stageFileName,
'data' => file_get_contents($stageFile),
'uploadType' => 'media',
]
);
Is there an option on how to put the object inside the directory inside the bucket?
You should be able to upload files into the directory by prepending the directory path to the name of the object being uploaded. For example, if you have a directory in your bucket called dev, your object name should start with 'dev/', e.g. 'dev/{fileToBeUploaded}.txt'.
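A minimal sketch of that prefixing in PHP, reusing the question's $stageFileName (the media/ directory name and the sample filename are assumptions for illustration):

```php
<?php
// Assumption: $stageFileName currently holds just the bare file name.
$stageFileName = 'photo.png';
$objectName = 'media/' . $stageFileName; // will appear under /dev-bucket/media/

// Then use $objectName in both places in the question's code:
// $obj->setName($objectName);
// ... 'name' => $objectName, ...

echo $objectName; // media/photo.png
```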
To upload a file to a specific folder in a Google bucket, I wrote the code below:
Blob blob = storage.create(
    BlobInfo.newBuilder(bucketName, "folder_name/" + f.getName())
            .setAcl(acls).build(),
    f.getInputStream());
And for downloading from that folder:
BlobId blobId = BlobId.of(bucketName, vendorName+"/"+fileName);
Blob blob = storage.get(blobId);
P.S. This is in Java; you can do similar things in any other language.

How to rename or move a file in Google Cloud Storage (PHP API)

I am currently trying to rename and/or move a Cloud Storage file to another name/position, but I can't get it to work. I am using https://github.com/google/google-api-php-client as the client; uploads work fine with:
...
$storageService = new \Google_Service_Storage( $client );
$file = new \Google_Service_Storage_StorageObject();
$file->setName( 'test.txt' );
$storageService->objects->insert(
$bucketName,
$file,
array(
'name' => $filename,
'data' => file_get_contents( $somefile )
)
);
...
So I have tried to change the filename via the $storageService->objects->update() method, but I cannot find any documentation on it. I used $storageService->objects->get( $bucketName, $fileName ) to get the specific file I wanted to rename (with $file->setName()), but it seems I just cannot pass that file to the objects->update function. Am I doing it wrong?
Ok, it seems I cannot directly rename a file (please correct me if I'm wrong); I can only update the metadata. I managed to get it to work by copying the file to a new filename/destination and then deleting the old file, using $storageService->objects->copy and $storageService->objects->delete. This doesn't feel right, but at least it works.
As this is not very well documented by Google, here is a basic example:
//RENAME FILE ON GOOGLE CLOUD STORAGE (GCS)
//Get client and auth token (may vary depending on how you connect to GCS; here via the Laravel framework facade)
//DOC: https://cloud.google.com/storage/docs/json_api/v1/json-api-php-samples
//DOC: https://developers.google.com/api-client-library/php/auth/service-accounts
//Laravel Client: https://github.com/pulkitjalan/google-apiclient
//Get google client
$gc = \Google::getClient();
//Get auth token if it is not valid/not there yet
if($gc->isAccessTokenExpired())
$gc->getAuth()->refreshTokenWithAssertion();
//Get google cloud storage service with the client
$gcStorageO = new \Google_Service_Storage($gc);
//GET object at old position ($path)
//DOC: https://cloud.google.com/storage/docs/json_api/v1/objects/get
$oldObj = $gcStorageO->objects->get($bucket, $path);
//COPY desired object from old position ($path) to new position ($newpath)
//DOC: https://cloud.google.com/storage/docs/json_api/v1/objects/copy
$gcStorageO->objects->copy(
$bucket, $path,
$bucket, $newpath,
$oldObj
);
//DELETE old object ($path)
//DOC: https://cloud.google.com/storage/docs/json_api/v1/objects/delete
$gcStorageO->objects->delete($bucket, $path);
I found that when using the gs:// stream wrapper in conjunction with PHP on App Engine, you can execute pretty much every PHP file command: copy, delete, check if a file exists.
if (file_exists("gs://$bucket/{$folder}/$old_temp_file")) {
    $old_path = "gs://$bucket/{$folder}/$old_temp_file";
    $new_permanent_path = "gs://$bucket/{$folder}/$new_permanent_file";
    copy($old_path, $new_permanent_path);
    unlink($old_path);
}

How to copy a file from a website to amazon bucket using zend service amazon?

I need to copy a resource from a website to my S3 bucket, for example an image like 'http://upload.wikimedia.org/wikipedia/commons/6/63/Wikipedia-logo.png'. I need to copy it into a folder in my S3 bucket. Is this possible using Zend_Service_Amazon?
You have to use stream wrappers.
I haven't dealt with image files, but I hope it will work.
$s3 = new Zend_Service_Amazon_S3($my_aws_key, $my_aws_secret_key);
$wrapper = "s" . rand(); // unique stream-wrapper scheme name
$s3->registerStreamWrapper($wrapper);
// $bucketname - your bucket name
mkdir($wrapper . "://" . $bucketname);
// $path - where you want to store your file, including the bucket name
$target = $wrapper . "://" . $path;
$filedata = file_get_contents('your image URL');
file_put_contents($target, $filedata);

How to copy an image to Amazon S3?

I have a problem copying an image to Amazon S3.
I am using the PHP copy function to copy the image from one server to another. It works on a GoDaddy host server, but it doesn't work for S3. Here is the code that is not working:
$strSource = 'http://img.youtube.com/vi/got6nXcpLGA/hqdefault.jpg';
copy($strSource, $dest);
$dest is my bucket URL, with the folder present to hold uploaded images.
I am not sure you can copy an image to AWS just like that. I would suggest using a library that talks to the AWS API and then running your commands through it.
Check this - http://undesigned.org.za/2007/10/22/amazon-s3-php-class
It provides a REST implementation for S3.
For example, if you want to copy your image, you can do:
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->copyObject($srcBucket, $srcName, $bucketName, $saveName, $metaHeaders = array(), $requestHeaders = array());
$awsAccessKey and $awsSecretKey are the secret keys for your AWS account.
Check it out and hope it helps.
Not sure if you have used the AWS SDK for PHP, but the AWS SDKs can come in handy in situations like this. The SDK can be used in conjunction with IAM roles to grant access to your S3 bucket. These are the steps:
Modify your code to use the PHP SDK to upload the files (if needed).
Create an IAM Role and grant the role permission to the needed S3 buckets.
When you start your EC2 instance, specify that you want to use the role.
Then your code will automatically use the permissions that you grant that role. IAM gives the instance temporary credentials that the SDK uses. These credentials are automatically rotated for you by IAM and EC2.
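As a rough sketch of what the SDK call looks like under an instance role (assuming AWS SDK for PHP v3 installed via Composer; the bucket and key names here are made up, and buildCopySource is a hypothetical helper), note that no credentials appear in the code at all:

```php
<?php
// Hypothetical helper: build the CopySource string copyObject() expects,
// which has the form "source-bucket/source-key".
function buildCopySource(string $bucket, string $key): string
{
    return $bucket . '/' . ltrim($key, '/');
}

// With an IAM role attached to the instance, the SDK fetches temporary
// credentials from the instance metadata service, so the client needs no keys:
// require 'vendor/autoload.php';
// $s3 = new \Aws\S3\S3Client(['version' => 'latest', 'region' => 'us-east-1']);
// $s3->copyObject([
//     'Bucket'     => 'my.bucket.name',
//     'Key'        => 'newfolder/hqdefault.jpg',
//     'CopySource' => buildCopySource('source-bucket', 'hqdefault.jpg'),
// ]);

echo buildCopySource('source-bucket', 'hqdefault.jpg'); // source-bucket/hqdefault.jpg
```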
Here is my example, based on the documentation, for copying an object within an S3 bucket:
public function copyObject($sSourceKey, $sDestKey)
{
$this->checkKey($sSourceKey);
$this->checkKey($sDestKey);
$bRet = false;
// http://docs.aws.amazon.com/aws-sdk-php-2/latest/class-Aws.S3.S3Client.html#_copyObject
try {
$response = $this->_oS3Client->copyObject(
array(
'Bucket' => $this->getBucketName(),
'Key' => $sDestKey,
'CopySource' => urlencode($this->getBucketName() . '/' . $sSourceKey),
)
);
if (isset($response['LastModified'])) {
$bRet = true;
}
} catch (Exception $e) {
$GLOBALS['error'] = 1;
$GLOBALS["info_msg"][] = __METHOD__ . ' ' . $e->getMessage();
$bRet = false;
}
return $bRet;
}
